Search engines
amigo_098, 2015-06-08 10:59:13

How to block subdomains from being indexed by search engines?

We keep running into the same problem: we deploy sites and forget to close them from indexing in robots.txt, and sometimes it's even worse - we move a site that is still closed in robots.txt onto the client's main domain, and then the site drops out of the index entirely. So I'm looking for a solution that rules out the "forgot" case. Since the sites deployed on our host all live at *.sorokarm.ru addresses, perhaps some rule applied to the sorokarm.ru domain could close all of its subdomains at once - maybe something at the level of the A-records for that domain. Does anyone have any thoughts?

4 answers
Max, 2015-06-08
@AloneCoder

The solution that rules out "forgot" is simple: check, and don't forget.
As for the sorokarm.ru domains, you can register a redirect for robots.txt in .htaccess.
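
A rough sketch of that .htaccess idea (my own illustration, not Max's config; the /robots_closed.txt target and the assumption that the dev subdomains are served from a shared document root are hypothetical):

# Serve one "closed" robots.txt for any *.sorokarm.ru subdomain
# instead of whatever file sits in the site's own folder.
RewriteEngine On
RewriteCond %{HTTP_HOST} \.sorokarm\.ru$ [NC]
RewriteRule ^robots\.txt$ /robots_closed.txt [L]

where /robots_closed.txt simply contains:

User-agent: *
Disallow: /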

Puma Thailand, 2015-06-08
@opium

When a project is created, handed over, or transferred, there should always be a standard to-do checklist; not having one is sheer idiocy. Just add an item to check robots.txt (and the other usual things) to that list and live in peace.

Viktor Savchenko, 2015-06-10
@UAPEER

Take NGINX and write:

server {
    server_name SUBDOMAIN.HERE;
    add_header X-Robots-Tag noindex;
}

Done. Works 100% for Google.
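
If the goal is to cover every dev subdomain in one place, a catch-all variant of the same idea might look like this (the wildcard server_name and the "always" flag are my additions, not part of the answer above):

server {
    listen 80;
    # Wildcard name; an exact server_name in another server block
    # would still take priority over this one.
    server_name *.sorokarm.ru;
    # "always" also attaches the header to error responses (nginx >= 1.7.5).
    add_header X-Robots-Tag "noindex, nofollow" always;
    # root/location directives omitted for brevity
}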

kanuhamru, 2016-01-24
@kanuhamru

There is a solution for Apache here: searchengines.guru/showpost.php?p=8776153&postcount=11
The essence of the method is to substitute robots.txt at the server level: for requests to a subdomain, the server returns a robots.txt prepared specifically for subdomains rather than the file from the subdomain's own folder.
I would be grateful for a similar solution for Nginx (Viktor Savchenko's answer doesn't fit here, since a robots.txt is specifically what's needed).
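
For completeness, a minimal nginx sketch of that substitution approach, assuming a prepared file at /etc/nginx/robots_closed.txt and a wildcard dev domain (both are assumptions, not taken from the linked post):

server {
    listen 80;
    server_name *.sorokarm.ru;
    # Answer /robots.txt from one prepared "closed" file instead of the
    # copy (if any) in the subdomain's own folder.
    location = /robots.txt {
        alias /etc/nginx/robots_closed.txt;
    }
    # ...the rest of the subdomain's configuration stays unchanged
}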
