How do I block access to a site during testing without ruining its future position in search results?
I'm developing a site, and very soon it will be deployed to a server with its own domain.
But I don't want search engine bots looking at it yet: indexing the site before I've tested and fixed everything, from the design down to individual button clicks, could kill its ranking.
Absolutely everything will be tested: UX/UI, design, functionality, usability, and so on.
How can I publish the site now and only later, on command, open its pages to search engines, without dropping out of the top of the search results?
For example:
I need user A from another city to evaluate the site, help optimize load speeds over 3G, 4G, 5G, Wi-Fi, and LAN, and check compatibility with Firefox, Google Chrome, Safari, and so on, but only that user should be able to get in.
And search engines should not be able to find the site.
How much risk am I taking? And is this actually possible?
Everything is described in Google's documentation at https://developers.google.com/search/docs/advanced... including an example of how to completely disable indexing:
<meta name="robots" content="noindex, nofollow"/>
or, in a robots.txt file at the site's root:
User-agent: *
Disallow: /
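For context, a minimal sketch of where the meta tag goes, assuming an ordinary HTML page (the title and body here are just placeholders): it has to appear in the <head> of every page you want kept out of the index.

<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <!-- Keep this tag on every page while the site is under test;
       remove it when you are ready to let search engines in -->
  <meta name="robots" content="noindex, nofollow"/>
  <title>Test page</title>
</head>
<body>
  ...
</body>
</html>

One caveat worth knowing: Disallow in robots.txt only blocks crawling, not indexing. A page blocked this way can still show up in the results if other sites link to it, and in that case Google never gets to see the noindex tag. So for a reliable "on command" switch, use the meta tag, and to let only your one tester in, consider putting simple HTTP authentication on the server for the test period.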