Software and Internet Services
Kamil, 2020-06-30 13:21:55

How to bulk check links for 200/301/302 responses?

Hello.
Can anyone recommend a tool that can check hundreds of links for their server response codes (200, 301, 302, and so on)? I have a database of links (regular sites) that needs to be filtered, but there are too many addresses to check them one at a time.


3 answer(s)
Stalker_RED, 2020-06-30
@Stalker_RED

wget -nv --spider -i list.txt
where list.txt contains the links, one per line.
Closer to a production-ready version, with cookie loading, a speed limit, and 0.3-second pauses between requests:

wget -nv --load-cookies cookies.txt --spider -i list.txt --limit-rate=20k -w 0.3
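If you want the results in a file rather than scrolling past on the terminal, the spider run's report can be captured and searched. A minimal sketch, assuming list.txt as above; note that wget writes its log to stderr, and the exact wording of its failure messages can vary between versions:

```shell
# Run the spider and keep the full log for later filtering.
# wget logs to stderr, so redirect stream 2 into a file.
wget -nv --spider -i list.txt 2> spider.log

# Unreachable URLs are summarized at the end as "broken links";
# this grep is a rough filter, not an exact parser of wget's output.
grep -i 'broken' spider.log
```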

Vladimir, 2020-06-30
@MechanID

There are many options; it depends on your skills.
On Linux you can write a bash script that walks through the list of links in a file, forks a curl call for each line, and writes the response to a file. A limiter on the number of simultaneously running curl processes will also come in handy, because with a lot of links you can run out of memory.
You could write much the same thing in Python or Perl.
On Windows, you could likewise take some programming language and implement a similar algorithm.
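The bash-plus-curl approach described above can be sketched in a few lines using xargs as the process limiter. The file names and the parallelism limit of 10 are placeholders; a status of 000 means curl could not complete the request at all (DNS failure, refused connection, timeout):

```shell
#!/bin/sh
# For each URL in list.txt (one per line), print "STATUS URL" to results.txt.
# xargs -P 10 caps the number of concurrent curl processes.
# -s silences progress output, -o /dev/null discards the body,
# -m 10 gives up after 10 seconds, and -w prints the HTTP status code.
xargs -I{} -P 10 \
    curl -s -o /dev/null -m 10 -w '%{http_code} {}\n' '{}' \
    < list.txt > results.txt
```

Afterwards, something like grep '^200 ' results.txt keeps only the links that answered with 200.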

Maxim Kirshin, 2020-06-30
@meowto16

Screaming Frog SEO Spider. You can also use online services; search Google for something like "mass bulk check response code".
