Why is the site not downloading with wget -r?
I want to download a site (mitsubishi.ilcats.ru) for local use. I try to do it with wget -r, but only the main page and robots.txt are downloaded; wget does not go any further.
The main downloaded page does contain links, of course.
I also tried httrack, with exactly the same result.
What could be wrong? It feels like magic.
Try: wget --mirror
The site may also use absolute links pointing to another domain. By default, wget does not follow links to other hosts, and even an extra "www" is enough to count as a different host.
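A fuller invocation might look like the sketch below. All flags are standard GNU wget options; note that wget honors robots.txt during recursive retrieval by default, which is one plausible reason only the main page and robots.txt were fetched (that is a guess, not something confirmed for this site). The domain list shown is an assumption about which hosts the site links to.

```shell
# Mirror the site for local browsing (a sketch, not a verified recipe):
#   --mirror          : recursion with infinite depth and timestamping
#   --convert-links   : rewrite links so pages work offline
#   --page-requisites : also fetch CSS, JS, and images each page needs
#   -e robots=off     : ignore robots.txt (wget obeys it by default,
#                       which can silently stop recursion)
#   --span-hosts + --domains : follow links to the listed hosts only,
#                       e.g. the bare and the www variant of the domain
wget --mirror \
  --convert-links \
  --page-requisites \
  -e robots=off \
  --span-hosts \
  --domains=mitsubishi.ilcats.ru,www.mitsubishi.ilcats.ru \
  https://mitsubishi.ilcats.ru/
```

If the download still stops, running wget with --debug can show which links are being rejected and why (for example, "not on the accept list" or a robots.txt exclusion).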