Search Engine Optimization
Alexander Lamdan, 2021-04-11 18:37:43

How can I block access to a site during testing without hurting its future search engine rankings?

So, I'm developing a site, and very soon it will be deployed to a server with its own domain.

But I don't want search engine bots crawling it yet, because that could hurt the site's ranking before I've tested it and fixed all kinds of bugs, from the design down to every button click.

Absolutely everything will be tested: UX/UI, design, functionality, usability, and so on.

How can I publish the site and only later, on my own command, open its pages to search engines, without the risk of dropping in the search results?

For example:

I need user A from another city to evaluate my site: check page load speed over 3G, 4G, 5G, Wi-Fi, and LAN, and check compatibility with Firefox, Google Chrome, Safari, and so on. But at the same time, he should be the only one able to access it.

And search engines should not be able to find the site at all.

How much risk am I taking? And is it actually possible to do this?


2 answer(s)
antonwx, 2021-04-11
@alexander_lamdan

Everything is described here: https://developers.google.com/search/docs/advanced... There is also an example of how to completely disable indexing.
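In short, Google's documentation covers the noindex rule, which can be placed in a meta tag on each page or sent as the X-Robots-Tag HTTP response header for the whole site. A minimal sketch of the header variant for nginx (the location block is just a placeholder; adjust it to your own config):

location / {
    # Tell crawlers not to index or follow anything served from here
    add_header X-Robots-Tag "noindex, nofollow" always;
}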

Danny Arty, 2021-04-11
@DanArst

<meta name="robots" content="noindex, nofollow"/>
or

User-agent: *
Disallow: /

I don't see what the problem is: simply close the whole site from indexing for the duration of testing or while you optimize the content for promotion. This is quite common practice. And to give access to only a specific user, you can also set up an IP filter or simply put the site behind a password, as in the sketch below.
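A minimal nginx sketch of both options (the domain, the tester's IP, and the file paths are placeholders; swap in your own values, and the .htpasswd file has to be created separately, e.g. with the htpasswd tool):

server {
    listen 80;
    server_name example.com;                     # placeholder domain

    root  /var/www/site;                         # placeholder web root
    index index.html;

    # Option 1: IP filter, only the tester's address gets through
    allow 203.0.113.10;                          # placeholder tester IP
    deny  all;

    # Option 2: HTTP Basic auth, share the login with the tester
    auth_basic           "Staging";
    auth_basic_user_file /etc/nginx/.htpasswd;   # placeholder path

    # With both options enabled, nginx requires both checks to pass by default;
    # uncomment the next line if either one should be enough.
    # satisfy any;
}

Either approach keeps bots and everyone except your tester out, and removing those lines later reopens the site to the public.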
