Puma Thailand, 2013-04-16 16:46:57
Nginx

How to use limit_req and not kill search engines

The site is being actively crawled by bots, and I want to throttle them with limit_req, but search engines crawl it just as actively.
How can this be set up cleanly in nginx, so that abusive bots get cut off while search engines keep indexing the site?


2 answers
Marat, 2013-04-16
@Xakki

Start with 10 requests per second as a rough default, and in Google's webmaster settings specify a crawl rate below 3 requests per second (assuming your pages render in under a second, even under heavy load).
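A minimal nginx sketch of the kind of limit the answer suggests (the zone name, memory size, and burst value are illustrative, not from the answer):

```nginx
# Shared zone keyed by client IP, averaging 10 requests/second per IP.
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    location / {
        # Allow short bursts before rejecting excess requests.
        limit_req zone=perip burst=20 nodelay;
    }
}
```

The burst parameter matters in practice: real browsers fetch a page plus its assets in a quick burst, so a bare rate with no burst allowance will hit legitimate visitors too.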

Alexey Sundukov, 2013-04-16
@alekciy

In robots.txt, specify the desired crawl rate for the site.
From your server logs and reverse DNS (PTR) lookups, build a map of search-engine IP addresses and route them into a location with a deliberately high limit.
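One way to express the second step in nginx is with geo and map: requests from known crawler ranges get an empty limit key, and nginx does not rate-limit requests whose key is empty. The IP ranges below are purely illustrative; the real ranges must come from your own log and PTR analysis:

```nginx
# Illustrative crawler ranges; verify each via PTR lookups of your logs.
geo $is_crawler {
    default        0;
    66.249.64.0/19 1;   # hypothetical example of a Googlebot range
}

# Non-crawlers are keyed by IP; crawlers get an empty (unlimited) key.
map $is_crawler $limit_key {
    0 $binary_remote_addr;
    1 "";
}

limit_req_zone $limit_key zone=perip:10m rate=10r/s;

server {
    location / {
        limit_req zone=perip burst=20;
    }
}
```

For the robots.txt half of the answer, the relevant directive is Crawl-delay (e.g. `Crawl-delay: 2`); note that Yandex honors it while Google ignores it in favor of its webmaster-tools crawl-rate setting.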
