How to check website for broken links using wget?
Please go easy on me: as far as I can tell, all the options are specified correctly, but the command still does not do what I want.
Initial data:
- Main domain: home.domain.com
- CDN domain: cdn.domain.com
- File type: png
I need to find broken links to cdn.domain.com while crawling main.domain.com, and write them to the file brokenlinks.txt.
wget -o brokenlinks.txt -nv --spider --referer=main.domain.com -w0.2 -r -l inf -p -A jpg,png -H https://main.domain.com
--referer - include the "Referer: URL" header in the HTTP request
-o brokenlinks.txt - write the log to this file
-nv - turn off verbose output
--spider - do not download anything, only check that each URL exists
-w0.2 - wait time between requests, in seconds
-r - recursive retrieval
-l - recursion depth (inf and 0 mean unlimited)
-p - also fetch page requisites (images and so on)
-A jpg,png - accept only files with these suffixes
-H - span hosts (follow links to other domains) when recursing
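With --spider and -o together, wget saves nothing; it only logs whether each URL resolves, and ends the log with a "Found N broken links." summary followed by the dead URLs. So the broken links still have to be filtered out of the log afterwards. A minimal sketch of that filtering step (the log contents below are a hypothetical sample for illustration, not real output from the command above):

```shell
# Hypothetical sample of the tail of a wget --spider log; in practice
# this file would be the one written by -o brokenlinks.txt.
cat > sample.log <<'EOF'
Found 2 broken links.

https://cdn.domain.com/images/logo.png
https://cdn.domain.com/banners/sale.jpg
EOF

# Keep only the lines that are URLs, i.e. the broken links themselves:
grep -E '^https?://' sample.log
```

If the log never contains that summary block, the crawl itself is not reaching the CDN links, which points at the recursion/host options rather than the log filtering.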