Where can I download a good search robot (crawler)?
There is a master server. I want to manually get all IP addresses from this master server and then download every site hosted at those IP addresses. These sites are often simply not indexed by search engines. Each master server lists, on average, about 600 such sites, one per server.
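For the download step, here is a minimal sketch in Python. It assumes the IP list has already been extracted from the master server into a plain-text file (the master server protocol is not specified in the question, so that step is left out); the file name "ips.txt" and the use of plain HTTP on port 80 are assumptions.

import requests

def fetch_homepages(ip_file: str = "ips.txt") -> None:
    # One IP (or ip:port) per line in the input file.
    with open(ip_file, encoding="utf-8") as f:
        ips = [line.strip() for line in f if line.strip()]
    for ip in ips:
        url = f"http://{ip}/"
        try:
            # Short timeout: many of the ~600 hosts may not run a web server.
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            print(f"{ip}: unreachable ({exc})")
            continue
        # Save each homepage under a filename derived from the IP.
        out = f"{ip.replace(':', '_')}.html"
        with open(out, "w", encoding="utf-8") as fh:
            fh.write(resp.text)
        print(f"{ip}: saved {len(resp.text)} chars to {out}")

if __name__ == "__main__":
    fetch_homepages()

This only grabs each homepage; mirroring a whole site needs a crawler, as in the next sketch.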
I'm interested in popular universal search robots/crawlers, methods for complete site parsing, offline browsers, and any other way to download an entire site as HTML (an HTML copy). Why search robots? Because popular search engines can make a cached copy of an entire indexed site, and they don't respect robots.txt. If there is a universal offline browser, mention that too. A snapshot of the site would also work.
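As a rough illustration of what such a crawler does, here is a minimal sketch of a same-host crawler that saves an HTML copy of a site. It deliberately never reads robots.txt, matching the question. The start URL and output directory are assumptions, and real crawls would also need rate limiting, retries, and handling of non-HTML assets, all omitted here.

import os
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def mirror_site(start_url: str, out_dir: str = "mirror") -> None:
    host = urlparse(start_url).netloc
    queue, seen = deque([start_url]), {start_url}
    while queue:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        # Map the URL path to a file under out_dir/host/.
        path = urlparse(url).path
        if path.endswith("/") or not path:
            path += "index.html"
        dest = os.path.join(out_dir, host, path.lstrip("/"))
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        with open(dest, "w", encoding="utf-8") as fh:
            fh.write(resp.text)
        # Enqueue links on the same host only, dropping fragments.
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            nxt = urljoin(url, a["href"]).split("#")[0]
            if urlparse(nxt).netloc == host and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)

if __name__ == "__main__":
    mirror_site("http://example.com/")

A breadth-first queue plus a seen-set is the standard way to avoid re-downloading pages and getting stuck in link loops; existing mirroring tools follow the same basic scheme.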