Nginx
Vladimir, 2014-09-03 11:07:40

Is it possible to block bots at the nginx level by the frequency of downloading files?

I'd like to protect a catalog site from having its product data scraped by parser bots.
Filtering by User-Agent is clearly not worth doing (it only stops the most naive bots).
I'm more interested in a semi-intelligent solution.
During a normal visit a user doesn't move through pages at 10 pages per second, and on top of that he (more precisely, his browser) downloads the files (images, styles, JS) that belong to the current page.
So I thought: can these two facts be relied on somehow?
Roughly speaking, if an IP address requests pages at high speed while fetching the related files only erratically, block it.
And do it without blocking search robots.
Or is the whole idea pointless?
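The two signals described above (page rate per IP, plus the share of asset requests) can be prototyped offline over parsed access-log entries before wiring anything into nginx. A rough sketch; the function name, thresholds, and asset-extension list are all illustrative assumptions, not a proven detector:

```python
import re
from collections import defaultdict

# Paths that look like static assets (images, styles, JS).
ASSET_RE = re.compile(r"\.(css|js|png|jpe?g|gif|svg|ico)(\?|$)", re.I)

def suspicious_ips(requests, max_page_rate=10.0, min_asset_share=0.2, min_hits=20):
    """Flag IPs whose page-request rate exceeds max_page_rate per second
    AND whose share of asset requests is below min_asset_share.

    requests: iterable of (ip, unix_timestamp, path) tuples,
    e.g. parsed out of the nginx access log.
    """
    by_ip = defaultdict(list)
    for ip, ts, path in requests:
        by_ip[ip].append((ts, path))

    flagged = []
    for ip, hits in by_ip.items():
        if len(hits) < min_hits:          # too little data to judge
            continue
        hits.sort()
        span = max(hits[-1][0] - hits[0][0], 1e-9)
        pages = sum(1 for _, p in hits if not ASSET_RE.search(p))
        asset_share = 1 - pages / len(hits)
        if pages / span > max_page_rate and asset_share < min_asset_share:
            flagged.append(ip)
    return flagged
```

A scraper hammering product pages with no CSS/JS/image requests trips both conditions; a browser session stays under the rate and carries plenty of asset hits. Known search-robot IP ranges would still need an explicit whitelist on top of this.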


2 Answers
Antony Ryabov, 2014-09-03
@ruvasik

google fail2ban
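fail2ban can watch the nginx access log and ban an IP that produces too many matching requests within a time window, which maps directly onto the "10 pages per second" criterion from the question. A minimal sketch; the jail/filter name `nginx-scraper`, the log path, the `/static/` asset prefix, and all the numbers are assumptions to adapt:

```ini
# /etc/fail2ban/filter.d/nginx-scraper.conf  (hypothetical filter)
[Definition]
# Count only page requests; paths under /static/ (assumed asset prefix)
# are ignored so a real browser loading assets is not penalized.
failregex = ^<HOST> -.*"(GET|POST) (?!/static/)\S+ HTTP
ignoreregex =

# /etc/fail2ban/jail.d/nginx-scraper.local  (hypothetical jail)
[nginx-scraper]
enabled  = true
port     = http,https
filter   = nginx-scraper
logpath  = /var/log/nginx/access.log
maxretry = 60     ; more than 60 page hits...
findtime = 10     ; ...within 10 seconds
bantime  = 3600   ; ban for an hour
```

Search robots would still need to be excluded, e.g. via `ignoreip` in the jail with the crawlers' published IP ranges.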

Anton Solomonov, 2014-09-03
@Wendor

ngx_http_limit_conn_module
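`ngx_http_limit_conn_module` caps *concurrent connections* per key; for the per-IP request *rate* the question describes, the companion `ngx_http_limit_req_module` is usually the closer fit. A hedged sketch (zone name, rate, and burst are illustrative, not recommendations):

```nginx
http {
    # Count requests per client IP in a 10 MB shared zone.
    # 5 r/s with a burst of 20 is a guess -- tune against real traffic.
    limit_req_zone $binary_remote_addr zone=perip:10m rate=5r/s;

    server {
        location / {
            # Reject anything beyond the burst immediately with an error.
            limit_req zone=perip burst=20 nodelay;
        }
        # Serve static assets without the limit, so a real browser
        # pulling many images/CSS/JS per page is not throttled.
        location ~* \.(css|js|png|jpe?g|gif|svg|ico)$ {
            # no limit_req here
        }
    }
}
```

Search robots can be exempted by mapping their IP ranges to an empty key (e.g. via the `geo` module) and using that variable as the zone key, since requests with an empty key are not accounted.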
