How can I verify that the server can handle the crawl load?
I do SEO, and I was given a project that is quite large in terms of URLs but gets relatively few visits per month (50-70k).
The site has tens of millions of pages, but search bots crawl it reluctantly (30-40k requests per day from a single bot), even though, according to the admins, no rate limits are set anywhere.
During one of the site scans by Bing's tool, "Timed out" errors appeared (Bing's explanation: this occurs when their bots try to access the server and cannot reach it because it is unavailable). These errors are not 5xx, so they don't show up in the webmaster tools, but the bots clearly cannot always get a response from the server, and this can hurt search traffic.
The sysadmins say everything is fine and this cannot happen: the server handles 30 concurrent threads, and that capacity is not exhausted.
What data should I collect to confirm that the servers can take the load without timed-out errors? For a site with this many URLs, a crawl load of 300-700 thousand bot visits per day would be desirable.
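One way to get data rather than assurances is to run a small load test yourself and count how many requests time out at a given concurrency. Below is a minimal sketch using only the Python standard library; the URL list, worker count, and timeout value are assumptions you would replace with real site URLs and a concurrency matching the expected crawl rate.

```python
# Minimal concurrent load-test sketch: fetch a list of URLs with a fixed
# number of worker threads and tally ok / timeout / error outcomes.
# Assumptions: stdlib only; plug in real URLs and tune workers/timeout.
import concurrent.futures
import urllib.request


def probe(url, timeout=5.0):
    """Fetch one URL; classify the outcome."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return "ok" if 200 <= resp.status < 400 else "error"
    except TimeoutError:
        return "timeout"
    except Exception:
        # DNS failures, connection resets, HTTP errors, etc.
        return "error"


def run_load_test(urls, workers=30, timeout=5.0):
    """Hit `urls` with `workers` concurrent threads; return outcome counts."""
    counts = {"ok": 0, "timeout": 0, "error": 0}
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        for outcome in pool.map(lambda u: probe(u, timeout), urls):
            counts[outcome] += 1
    return counts
```

Running this against a sample of real page URLs while gradually raising `workers` shows at what concurrency timeouts start appearing; comparing that threshold with the target of 300-700k bot requests per day (roughly 4-8 requests per second on average) gives a concrete answer instead of "everything is fine".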