Nginx
Svyatoslav Torg, 2018-01-22 11:13:18

How to reduce the load on the server from search robots?

Hello everyone, I have a site hosted on elasticweb, with pay-per-resource billing!
Yii2 + Nginx.
[screenshot]
I made the dates in the sitemap fixed, instead of generating them with date() as before!
[screenshot]
And now it has indexed the new sitemaps, and it still crawls the pages heavily))) Is there any way to make it crawl less??
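A lastmod generated with date() at request time makes every URL look freshly changed on every crawl, so bots keep re-downloading everything; a fixed, real modification date removes that incentive. A minimal sketch of the fixed variant (the URL and date are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/some-page</loc>
        <!-- a fixed, real modification date, not date('c') evaluated per request -->
        <lastmod>2018-01-10</lastmod>
      </url>
    </urlset>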

3 answers

dmitriy, 2018-01-22
@sefkiss

Use caching.
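For example, a whole-page cache can live entirely in Nginx, with no changes to the Yii2 code. A minimal fastcgi_cache sketch (the cache path, zone name, and PHP-FPM socket are placeholders for your setup):

    # cache zone: belongs at the http{} level of nginx.conf
    fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=pagecache:10m
                       max_size=256m inactive=60m;

    server {
        listen 80;
        server_name example.com;

        location ~ \.php$ {
            include fastcgi_params;
            fastcgi_pass unix:/var/run/php-fpm.sock;

            fastcgi_cache pagecache;
            fastcgi_cache_key "$scheme$request_method$host$request_uri";
            fastcgi_cache_valid 200 10m;    # serve cached copies for 10 minutes
            fastcgi_cache_use_stale error timeout updating;
            add_header X-Cache-Status $upstream_cache_status;
        }
    }

A bot hammering the same pages then hits the cache instead of PHP. Note this caches every 200 response; pages with per-user content need fastcgi_cache_bypass rules.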

Alexander Aksentiev, 2018-01-22
@Sanasol

https://yandex.com/support/search/robots/request-r...
But in general, if a robot is creating serious load, you really need to fix the code, because there are a lot of robots, and most of them don't care what you write in robots.txt or the sitemap.
The site should not crash or consume the entire server when crawled by bots.
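One way to enforce that on the server side, whatever the robots choose to honor, is to rate-limit clients whose User-Agent looks like a crawler. A sketch (the zone size, rate, and UA pattern are placeholders to tune so real visitors are never matched):

    # http{} level: an empty key means the request is not rate-limited
    map $http_user_agent $bot_key {
        default                 "";
        ~*(bot|crawler|spider)  $binary_remote_addr;
    }
    limit_req_zone $bot_key zone=bots:10m rate=1r/s;

    server {
        listen 80;
        server_name example.com;

        location / {
            limit_req zone=bots burst=5;   # queue short bursts, reject the excess
            # ... your usual fastcgi_pass/proxy_pass config ...
        }
    }

Well-behaved robots can also be slowed down through the Yandex setting linked above; limit_req covers the ones that ignore you.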

Viktor Taran, 2018-01-22
@shambler81

155,600 pages - what are you selling?
Or do you have a scientific library?
Or is it just all kinds of GET queries in the filter?
If so, then forbid indexing them: Disallow in robots.txt, plus rel="canonical" (sketch below).
Hide the filter links behind AJAX.
Exclude them from the sitemap.
Give me a link.
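A sketch of the robots.txt part (the parameter names are hypothetical, substitute your real filter parameters):

    # keep crawlers out of the GET-filter URL variants
    User-agent: *
    Disallow: /*?filter=
    Disallow: /*?sort=

    Sitemap: https://example.com/sitemap.xml

And on each filtered variant, point the canonical at the clean URL in the page <head>, e.g. <link rel="canonical" href="https://example.com/catalog">.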
