gremlintv2, 2018-10-16 15:24:40

How to check website for broken links using wget?

Please go easy on me: all the options seem to be specified correctly, but the command still does not do what I need.
Initial data:
Main domain:

home.domain.com

Additional domain that the links point to:
cdn.domain.com

Files to check for broken links:
png

Task:
Check for broken links on cdn.domain.com while crawling main.domain.com, and write them to the brokenlinks.txt file.

What I tried:
wget -o brokenlinks.txt -nv --spider --referer=main.domain.com -w0.2 -r -l inf -p -A jpg,png -H https://main.domain.com

--referer - include the "Referer: URL" header in the HTTP request
-o brokenlinks.txt - write the log to this file
-nv - turn off verbose output
--spider - do not download anything, just check
-w 0.2 - wait time between requests, in seconds
-r - recursive retrieval
-l - recursion depth (inf and 0 mean infinite)
-p - download all page requisites (images and so on)
-A - comma-separated list of accepted file suffixes (here jpg,png)
-H - span hosts, i.e. follow links to other domains when recursing
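
Putting those flags together, a minimal sketch of one way the crawl could be run. The extra -D main.domain.com,cdn.domain.com option is an assumption added here, not part of the original command, so that -H only spans to the CDN domain rather than to every external host:

# Crawl main.domain.com without downloading anything (--spider), follow links
# onto cdn.domain.com, only check jpg/png files, and write the full log to
# brokenlinks.txt. The -D domain list is an assumption, not in the original command.
wget -o brokenlinks.txt -nv --spider --referer=main.domain.com \
     -w 0.2 -r -l inf -p -A jpg,png \
     -H -D main.domain.com,cdn.domain.com \
     https://main.domain.com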


1 answer
Dimonchik, 2018-10-20
@dimonchik2013

How do you expect it to work?
Dump the log and parse it for 404s.
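
A minimal sketch of what "parse the log for 404s" could look like, assuming the brokenlinks.txt log file from the question. The exact strings wget prints vary with version and verbosity, so the grep patterns below are assumptions to adjust against the real log:

# Pull failing URLs out of the wget log after the crawl finishes.
# Recursive --spider runs usually end with a "Found N broken links." summary
# followed by the URLs; individual failures show up as 404 responses.
grep -i -B2 '404' brokenlinks.txt
grep -i -A20 'broken link' brokenlinks.txt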
