Search engines
PyNen, 2018-12-26 21:48:01

Where can I download a good search robot (crawler)?

There is a master server. I want to manually get all IP addresses from this master server and download every site hosted at those addresses. These sites are often simply not indexed by search engines. On average there are about 600 such sites per master server, roughly one per listed server.
I'm interested in popular general-purpose search robots/crawlers, methods of fully parsing a site, offline browsers, and any other way to download an entire site as an HTML copy. Why search robots? Because popular search engines are able to keep a cached copy of an entire indexed site. If there is a universal offline browser, mention it too. It does not need to respect robots.txt. Essentially, I want a snapshot of the site.


1 answer
Dmitry Rublev, 2018-12-29
@dmitryrublev

From the old-school tools: Teleport Pro.
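If you prefer a purely scripted route, below is a minimal sketch of a same-host HTML mirroring crawler. It assumes the requests and beautifulsoup4 packages are installed; the start URL is a placeholder for one of the IP addresses taken from the master server list. It fetches pages without consulting robots.txt, saves the raw HTML to disk, and follows links within the same host. A dedicated mirroring tool will handle assets, encodings, and link rewriting far better, so treat this as a starting point, not a finished solution.

```python
# Minimal same-host HTML mirroring sketch.
# Assumptions: requests and beautifulsoup4 are installed; START_URL is a placeholder.
import os
from collections import deque
from urllib.parse import urljoin, urldefrag, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "http://203.0.113.10/"   # placeholder: one IP from the master server list
OUT_DIR = "mirror"

def local_path(url):
    """Map a URL to a file path under OUT_DIR."""
    parsed = urlparse(url)
    path = parsed.path.lstrip("/") or "index.html"
    if path.endswith("/"):
        path += "index.html"
    if not os.path.splitext(path)[1]:
        path += ".html"
    return os.path.join(OUT_DIR, parsed.netloc, path)

def crawl(start_url):
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, deque([start_url])
    while queue:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        # Save the raw HTML of the page.
        path = local_path(url)
        os.makedirs(os.path.dirname(path), exist_ok=True)
        with open(path, "w", encoding=resp.encoding or "utf-8", errors="replace") as f:
            f.write(resp.text)
        # Queue same-host links that have not been seen yet.
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link, _ = urldefrag(urljoin(url, a["href"]))
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)

if __name__ == "__main__":
    crawl(START_URL)
```

Run it once per IP address from the master server, changing START_URL each time, and each site's HTML pages end up under mirror/&lt;host&gt;/.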
