Is it possible to block bots at the nginx level based on the frequency of page and file requests?
I would like to protect a catalog site from bots scraping its product information.
Obviously it's not even worth trying to filter by User-Agent (that only stops complete fools).
I'm more interested in a semi-intelligent solution.
A normal visitor doesn't click through pages at 10 pages per second, and on top of that his browser downloads the files (images, styles, JS) referenced by the current page.
So I wondered: is it possible to somehow rely on these two facts?
Roughly speaking: if an IP address jumps between pages at high speed and fetches files erratically (or not at all), block it.
At the same time, without blocking search crawlers, of course.
Or is the whole idea pointless?
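
For what it's worth, the rate-based half of this idea is exactly what nginx's built-in ngx_http_limit_req_module does. Below is a minimal sketch; the 5r/s rate, the burst of 20, the zone name, the crawler list and the paths are all illustrative assumptions, not values from the question.

    # Sketch: per-IP rate limiting with search crawlers exempted.
    # All concrete values here are illustrative, not recommendations.

    events {}

    http {
        # Exempt well-known search crawlers by User-Agent so they are
        # not rate-limited. NOTE: User-Agent is trivially spoofed;
        # production setups verify e.g. Googlebot via reverse DNS.
        map $http_user_agent $limit_key {
            ~*(googlebot|bingbot|yandex)  "";                   # empty key => request not counted
            default                       $binary_remote_addr;  # limit per client IP
        }

        # Shared-memory zone tracking the request rate per key;
        # 10m of storage holds roughly 160,000 IP states.
        limit_req_zone $limit_key zone=perip:10m rate=5r/s;

        server {
            listen 80;
            root /var/www/catalog;   # illustrative path

            # Static assets are left unlimited: a real browser
            # legitimately fetches many of them per page view.
            location ~* \.(css|js|png|jpe?g|gif|svg|woff2?)$ {
            }

            # Page requests: allow a short burst (a user opening a few
            # tabs), then answer sustained high-rate crawling with 429.
            location / {
                limit_req zone=perip burst=20 nodelay;
                limit_req_status 429;
            }
        }
    }

The second heuristic, checking that a client requesting pages also fetches their assets, cannot be expressed in stock nginx configuration on its own; it is usually implemented by analysing access logs (e.g. with fail2ban-style tooling) and feeding offending IPs back into a deny list.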