How can you "humanize" a web scraper?
I have some experience writing and operating scrapers, and I keep running into the same problem: they get blocked. Requests from the scraper are detected as automated, access to the site is cut off and a captcha is demanded. It can be even worse: an abuse report gets sent to the hosting provider, and the whole VPS may be blocked.
How can scrapers be made more "human", so that the scraper's activity is practically indistinguishable from a real user's? What exactly can be imitated, and how? Of course, you can scrape through a proxy, but that costs money and by itself doesn't solve the problem.
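Since the question is about what exactly can be imitated, here is a minimal sketch of the two most common starting points: sending browser-like headers instead of the default library fingerprint, and pacing requests with random jitter. This assumes Python with the `requests` library; the target URL, header values, and delay range are illustrative assumptions, not anything prescribed by the question.

```python
import random
import time

import requests

# Present a realistic browser fingerprint instead of the default
# "python-requests" User-Agent, which is an obvious bot signature.
# The exact header values here are illustrative.
HEADERS = {
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) "
        "Chrome/120.0.0.0 Safari/537.36"
    ),
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
    "Accept-Language": "en-US,en;q=0.9",
}


def fetch(session: requests.Session, url: str) -> str:
    """Fetch a page with a randomized, human-like pause beforehand."""
    # Humans don't request pages at a fixed machine cadence;
    # jittered delays make the traffic pattern less regular.
    time.sleep(random.uniform(2.0, 6.0))
    response = session.get(url, headers=HEADERS, timeout=30)
    response.raise_for_status()
    return response.text


if __name__ == "__main__":
    # A Session keeps cookies between requests, like a real browser.
    with requests.Session() as session:
        html = fetch(session, "https://example.com/")  # placeholder URL
        print(len(html))
```

Using a `Session` also preserves cookies across requests, which many sites check for; sites that fingerprint JavaScript execution generally require going further, e.g. driving a real browser with Selenium or Playwright.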