How to block a site copy from search engines?
A copy of a WordPress site was created to run experiments on.
I disabled indexing in the admin panel.
Is this enough to avoid harming the SEO of the main site,
or is another solution needed: block indexing in some other way, delete the copy, and so on?
Use the robots.txt file or the server settings: password-protect the site, block access from certain IP addresses, or allow access only from specific IPs.
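For illustration, a minimal robots.txt that asks every crawler to stay away from the whole copy (the file goes in the site root; this is a sketch of the standard directives, not something taken from the asker's setup):

```
# robots.txt at the root of the test copy
User-agent: *
Disallow: /
```

Keep in mind that robots.txt is advisory: well-behaved crawlers obey it, but it does not actually restrict access to the site.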
If you disabled indexing through the admin panel, that is enough:
when you close the site this way, WordPress adds a special robots tag to the page head.
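For reference, the tag in question is a robots meta tag; on recent WordPress versions the "Discourage search engines from indexing this site" option outputs roughly the following (the exact content value can differ between versions):

```
<!-- added to the <head> automatically when indexing is disabled in Settings > Reading -->
<meta name='robots' content='noindex, nofollow' />
```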
1. In the settings you set "do not index", and robots.txt will contain "Disallow: /" for all robots. Theoretically this is enough, although some robots may ignore it. The main ones (Google, Yandex, etc.) do not ignore it, so this can be considered a fairly reliable method.
2. Beyond that, you should not worry too much: the addresses will be different, so even if the second copy does get indexed, it will simply be treated as a duplicate. In theory the positions of the original site should not suffer, though there can be nuances.
3. Sites that should not be public (sandboxes, staging sites and the like) are usually closed off from public access entirely. There are many ways to do this, for example password protection or IP restrictions at the web-server level; see the sketch below.
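As a sketch of that server-level approach (assuming nginx; the host name, IP address and file paths are placeholders), a staging host can be closed with HTTP Basic Auth and/or an IP allow-list:

```
# nginx server block for the test copy
server {
    server_name staging.example.com;

    # allow a trusted IP, reject everyone else
    allow 203.0.113.10;
    deny  all;

    # and/or require a password
    # (the file can be created with: htpasswd -c /etc/nginx/.htpasswd user)
    auth_basic           "Staging";
    auth_basic_user_file /etc/nginx/.htpasswd;

    # "satisfy any" admits a request that passes either the IP check or the password
    satisfy any;
}
```

With access closed this way, search engines cannot reach the copy at all, so indexing stops being a concern regardless of robots.txt.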