Nginx
PendalF89, 2022-03-25 10:58:07

How to properly configure NGINX so as not to block search robots?

Hello!
I'm setting up Nginx to fend off primitive DDoS attacks, but I don't want to accidentally block search crawlers. At the moment it is configured like this:

http {
  # Shared zone "one" (10 MB): track clients by IP, sustained rate of 10 requests/second
  limit_req_zone $binary_remote_addr zone=one:10m rate=10r/s;
  # ...
  server {
      location / {
          # Allow a burst of up to 20 extra requests, served without delay
          limit_req zone=one burst=20 nodelay;
          # ...
      }
  }
}


With the current configuration, a single IP address can make up to 10 requests per second, i.e. one request every 100 ms. On top of that, a one-time "burst" of up to 20 extra requests is allowed and, because of nodelay, served immediately; once the burst allowance is exhausted, requests above the base rate are rejected until the allowance refills at one slot per 100 ms.
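One nginx-side option worth noting: requests whose limit_req_zone key is empty are not accounted, so the zone key can be built to be empty for known crawlers. Below is a minimal sketch, assuming user-agent matching is acceptable; the googlebot/yandex patterns are illustrative, and since user agents can be spoofed, matching verified crawler IP ranges with a geo block is more reliable:

http {
  # Key is the client IP by default, but empty for matched crawler user agents.
  # nginx does not account requests with an empty key, so they bypass the limit.
  map $http_user_agent $limit_key {
      default            $binary_remote_addr;
      "~*googlebot"      "";   # illustrative pattern, spoofable
      "~*yandex"         "";   # illustrative pattern, spoofable
  }

  limit_req_zone $limit_key zone=one:10m rate=10r/s;

  server {
      location / {
          limit_req zone=one burst=20 nodelay;
      }
  }
}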

Is this configuration enough to avoid banning the Google and Yandex search robots?

1 answer
ky0, 2022-03-25

In robots.txt you can specify a limit on the request frequency (the Crawl-delay directive); any crawler that ignores it has only itself to blame.
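For illustration, a minimal robots.txt with such a limit might look like the sketch below. Note the directive's support varies: Bing honors Crawl-delay, Googlebot ignores it (Google's crawl rate is managed via Search Console), and Yandex has deprecated it in favor of the crawl-speed setting in Yandex.Webmaster:

User-agent: *
# Ask compliant crawlers to wait at least 2 seconds between requests
Crawl-delay: 2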
