How to bulk check links for 200/301/302 responses?
Hello.
Can you recommend a tool that can check hundreds of links for their server response codes (200, 301, 302 and so on)? There is a database of links (regular sites) that needs to be filtered, but it contains a lot of addresses, so checking them one at a time is not an option.
wget -nv --spider -i list.txt
where list.txt contains the links, one per line.
Closer to a production-ready version, with cookies loaded, a speed limit, and 0.3-second pauses between requests:
wget -nv --load-cookies cookies.txt --spider -i list.txt --limit-rate=20k -w 0.3
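wget writes its log to stderr, so if you want to keep the results for filtering you can redirect it to a file (or use wget's -o logfile option) and grep it afterwards. The exact format of the spider log lines can vary between wget versions, so treat the grep pattern below as a sketch and adjust it to what your wget actually prints:

wget -nv --spider -i list.txt 2> results.txt
grep -E ' (200|301|302) ' results.txt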
There are many options, depending on your skills.
On Linux you can write a bash script that walks through the list of links in a file, launches curl in the background for each line, and writes the response code to a file. A limit on the number of simultaneously running curl processes also comes in handy, because with a lot of links you can run out of memory.
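A minimal sketch of such a script, assuming the links live in list.txt (one per line) and results go to results.txt (both file names are placeholders):

#!/usr/bin/env bash
MAX_JOBS=10   # cap on simultaneously running curl processes

check_url() {
    local url="$1"
    # -s silent, -o /dev/null discards the body, -w prints only the HTTP code,
    # --max-time guards against hosts that hang
    local code
    code=$(curl -s -o /dev/null -w '%{http_code}' --max-time 10 "$url")
    echo "$url $code"
}

: > results.txt
while IFS= read -r url; do
    [ -z "$url" ] && continue
    check_url "$url" >> results.txt &
    # throttle: wait until a slot frees up before starting the next curl
    while [ "$(jobs -rp | wc -l)" -ge "$MAX_JOBS" ]; do
        sleep 0.1
    done
done < list.txt
wait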
Much the same could be written in Python or Perl.
Under Windows you can likewise take some programming language and implement a similar algorithm.
Screaming Frog SEO Spider. You can also use online services; search Google for something like "mass bulk check response code".