WordPress
Airat Kadyrmaev, 2015-10-06 22:31:34

Can an iptables rule on a server block search engine crawlers?

Here's the situation: the load on the server has increased sharply, and to somehow reduce it, the hosting provider suggested the following:
"For our part, we can suggest adding an IPTABLES rule:
iptables -A INPUT -p tcp --syn --dport 80 -m connlimit - -connlimit-above 5 -j REJECT --reject-with tcp-reset "
"This rule will limit the number of connections to your server's port 80. If more than 5 connections come from the same IP at the same time, they will be reset."
Tell me, can this iptables rule somehow prevent search robots from indexing the site?
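
(For illustration: if crawlers are a concern, a connection limit like this is usually combined with a whitelist, so that known crawler networks are accepted before the REJECT rule applies. A minimal sketch, assuming 66.249.64.0/19, a range Googlebot has historically crawled from; current ranges should be verified, e.g. via reverse DNS:

# accept port-80 traffic from the whitelisted crawler range first
iptables -I INPUT -p tcp --dport 80 -s 66.249.64.0/19 -j ACCEPT
# then append the connection limit for everyone else
iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 5 -j REJECT --reject-with tcp-reset

Because -I inserts at the head of the INPUT chain, whitelisted traffic never reaches the connlimit rule.)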


2 answers
Puma Thailand, 2015-10-07
@opium

It will cut off all high-traffic bots, and you'll start seeing crawl errors in webmaster tools.
First figure out why the load increased. If it's your real users, why would you cut them off? That would be idiocy of the highest order.
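
(To see who is actually generating the load before cutting anyone off, something like this helps; a sketch assuming a standard Linux box with netstat available, and a hypothetical nginx log path:

# count established connections per client IP
netstat -ntu | awk '$6 == "ESTABLISHED" {print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn | head
# count requests per client IP in the access log (the log path is an assumption)
awk '{print $1}' /var/log/nginx/access.log | sort | uniq -c | sort -rn | head

The top IPs can then be checked with reverse DNS (host <ip>) to see whether they belong to real search engine crawlers.)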

Konstantin Radkov, 2015-10-09
@shqarok

You shouldn't be cutting off users. Instead, ask
the hoster what exactly is under load:
CPU / disk / RAM / network channel?
Possible sources of the problem (see the diagnostic sketch below):
1. a VDS that was misconfigured from the start
2. a hardware problem
3. the network channel
4. all of the above
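
(A quick way to tell which of those resources is saturated; a sketch assuming standard tools, where iostat comes from the sysstat package and iftop is installed separately:

top          # CPU load and the processes causing it
free -m      # RAM and swap usage
iostat -x 1  # per-disk utilization; a high %util column points at the disk
iftop        # live per-connection bandwidth on the network channel
)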
