Zhw, 2020-10-17 23:33:11

How safe is it for SEO to launch a beta of a new version of a site?

We have domain.com and want to test a new site template on beta.domain.com. The beta.domain.com subdomain is needed only temporarily; after testing, it will be removed and the new template deployed on the main domain.com.

What will be the correct algorithm of actions so as not to negatively affect the SEO of the main domain.com domain?

Strangely, I couldn't find many useful guides online, apart from this thread: https://webmasters.stackexchange.com/questions/820...

Where the following tips are given:

1. Put a robots.txt with a noindex directive on beta.domain.com.
2. Add canonical tags pointing from beta.domain.com pages to the corresponding domain.com pages.

The first point seems logical, but why is the second one needed? If noindex is already there, is the canonical really necessary?

Are there any other recommendations or links on how to arrange this whole process more correctly?

By the way, should beta.domain.com be added to Google Search Console (Webmaster Tools)?


4 answer(s)
Daulet, 2020-10-17
@phpneguru

It all depends on how you are going to use the beta version of your site.
If you have already launched the new site and want it to become the main one, add canonical tags, register the move in the webmaster tools, and set up a redirect to beta.domain.com.
If you want domain.com to remain the main site, put a <meta name="robots" content="noindex, nofollow"> tag on the beta.domain.com pages and block the site from crawling in robots.txt.
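For reference, the noindex directive mentioned above is set with a meta tag in each page's <head>, not with a rel attribute. A minimal sketch (the comment describes the standard behavior):

```html
<!-- On every page of beta.domain.com: ask robots not to index the page
     and not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```

The same effect can be applied site-wide with the `X-Robots-Tag: noindex, nofollow` HTTP response header, which is convenient for an entire subdomain.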

Alexey Grachenkov, 2020-10-17
@fantasticlucky

The simplest and most effective way to hide a subdomain from robots:
1) Put the subdomain behind a login and password (HTTP authentication).
2) Not strictly necessary, but just in case, from experience working with programmers, create a robots.txt for the subdomain and put this in it:
User-agent: *
Disallow: /
# Deny all robots access to the site
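The password protection in step 1 is usually done with HTTP Basic Auth. A minimal Apache sketch, assuming a hypothetical password-file path (create the file with `htpasswd -c`):

```apacheconf
# .htaccess at the beta.domain.com document root
AuthType Basic
AuthName "Beta - staff only"
# The password-file path below is an example, not a required location
AuthUserFile /etc/apache2/.htpasswd-beta
Require valid-user
```

Search engine crawlers cannot pass the authentication prompt, so nothing behind it can be crawled or indexed regardless of robots.txt.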

Puma Thailand, 2020-10-18
@opium

all of that is unreliable and not guaranteed to work;
just close the site with a password and voila

Alexander Denisov, 2020-10-28
@Grinvind

If I understand correctly, the site needs to remain available to users so you can send traffic to it and test conversion. In that case, as mentioned above, block it in robots.txt with:

User-agent: *
Disallow: /

This prevents search engines from crawling the site's pages, and that is enough. noindex, by contrast, disables indexing, and it is important not to confuse crawling with indexing: if crawling is forbidden, robots will never even see a noindex directive, because they will not fetch the page's code at all.
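The crawl block described above can be sanity-checked locally with Python's standard-library robots.txt parser (the beta.domain.com URLs are just placeholders from the question):

```python
from urllib import robotparser

# Parse the same two-line robots.txt the answers recommend.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# No crawler that honors robots.txt may fetch any path on the beta host.
print(rp.can_fetch("Googlebot", "https://beta.domain.com/"))       # False
print(rp.can_fetch("*", "https://beta.domain.com/any/page.html"))  # False
```

Note that this only confirms crawling is disallowed; as explained above, it says nothing about whether already-known URLs stay in the index.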
