Domain name market
Eugene, 2017-02-23 10:41:46

How to implement a different robots.txt for each subdomain of a WordPress site?

There is a WordPress site (not a multisite) with several subdomains linked to it: site.ru, sub1.site.ru, sub2.site.ru. Is it possible to somehow create a separate robots.txt for each subdomain?


2 answer(s)
Denis Yanchevsky, 2017-02-23
@deniscopro

If I understood the task correctly, I once solved something similar with .htaccess:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www1\.(.*)$
RewriteCond %{REQUEST_URI} =/robots.txt
RewriteRule ^(.*)$ robots_disallow.txt [L,QSA]

When robots.txt is requested on the www1 subdomain, robots_disallow.txt will be returned instead.
Another option:
RewriteCond %{HTTP_HOST} !^www\.(.*)$
RewriteCond %{REQUEST_URI} =/robots.txt
RewriteRule ^(.*)$ robots_disallow.txt [L,QSA]

RewriteCond %{HTTP_HOST} ^www\.(.*)$
RewriteCond %{REQUEST_URI} =/robots.txt
RewriteRule ^(.*)$ robots_allow.txt [L,QSA]
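
The same approach extends to the subdomains from the question. A sketch only, assuming you create the files robots_sub1.txt and robots_sub2.txt (hypothetical names) in the document root next to the regular robots.txt:

```apache
RewriteEngine On

# sub1.site.ru gets its own robots file
RewriteCond %{HTTP_HOST} ^sub1\.site\.ru$ [NC]
RewriteRule ^robots\.txt$ robots_sub1.txt [L]

# sub2.site.ru gets its own robots file
RewriteCond %{HTTP_HOST} ^sub2\.site\.ru$ [NC]
RewriteRule ^robots\.txt$ robots_sub2.txt [L]

# any other host (site.ru itself) falls through to the real robots.txt
```

In a per-directory .htaccess context the leading slash is stripped from the request path, so the pattern is `^robots\.txt$`, not `^/robots\.txt$`. The URL in the browser stays /robots.txt; the rewrite is internal.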

dampiru, 2021-05-05
@dampiru

Once you have sorted out your question, visit this page: https://dampi.ru/pravilnyiy-robots-txt-dlya-sayta-... where I described in detail everything that needs to be done. For those who like to shove anything into robots.txt: if you open uploads to all bots, uploaded PDFs and other text files end up in the index, and in the Yandex.Webmaster "Excluded pages" report you get errors about unsupported content when images are indexed. Before writing and giving advice, did you try analyzing the reports in Google Search Console and Yandex.Webmaster?
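
To illustrate the point about uploads: a minimal robots.txt sketch (not a drop-in config) that keeps documents in the uploads directory out of the index while leaving common image types crawlable. Wildcards in Allow/Disallow are supported by Google and Yandex, but not guaranteed for every crawler:

```
User-agent: *
# Block the uploads directory as a whole (PDFs, DOCs, etc.)
Disallow: /wp-content/uploads/
# Re-allow common image formats so image search still works
Allow: /wp-content/uploads/*.jpg
Allow: /wp-content/uploads/*.png
Allow: /wp-content/uploads/*.gif
```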
