How to implement a different robots.txt for each subdomain of a WordPress site?
There is a WordPress site (not a multisite) with several subdomains linked to it: site.ru, sub1.site.ru, sub2.site.ru. Is it possible to somehow create a separate robots.txt for each subdomain?
If I understood the task correctly, I solved something like this with the help of .htaccess:
RewriteEngine On

# On the www1 subdomain, serve the restrictive file.
RewriteCond %{HTTP_HOST} ^www1\.(.*)$
RewriteCond %{REQUEST_URI} =/robots.txt
RewriteRule ^(.*)$ robots_disallow.txt [L,QSA]

# On any host other than www, also serve the restrictive file.
RewriteCond %{HTTP_HOST} !^www\.(.*)$
RewriteCond %{REQUEST_URI} =/robots.txt
RewriteRule ^(.*)$ robots_disallow.txt [L,QSA]

# Only the main www host gets the permissive file.
RewriteCond %{HTTP_HOST} ^www\.(.*)$
RewriteCond %{REQUEST_URI} =/robots.txt
RewriteRule ^(.*)$ robots_allow.txt [L,QSA]
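Adapted to the hosts from the question, a minimal sketch could look like this. The file names robots-sub1.txt and robots-sub2.txt are placeholders of my own (create those files in the web root first), and it assumes all three subdomains point at the same document root:

RewriteEngine On

# sub1.site.ru gets its own robots file.
RewriteCond %{HTTP_HOST} ^sub1\.site\.ru$ [NC]
RewriteRule ^robots\.txt$ robots-sub1.txt [L]

# sub2.site.ru likewise.
RewriteCond %{HTTP_HOST} ^sub2\.site\.ru$ [NC]
RewriteRule ^robots\.txt$ robots-sub2.txt [L]

# Any other host (site.ru) falls through to the physical robots.txt.

If each subdomain has its own document root instead, no rewriting is needed at all: just place a different robots.txt in each root.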
When you get to solving your question, visit this page: https://dampi.ru/pravilnyiy-robots-txt-dlya-sayta-... I described there in detail everything that needs to be done. For those who like to shove anything into robots.txt: if you open uploads to all bots, the uploaded PDFs and other text files end up in the index, and in Yandex.Webmaster the "Excluded pages" report fills up with indexing errors for pictures saying the content is not supported. Before writing and giving advice, did you try to analyze the messages in Google Search Console and Yandex.Webmaster?
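As an illustration of that point, a hedged sketch of the relevant robots.txt fragment, assuming the default wp-content/uploads path (the * and $ wildcards are supported by Google and Yandex):

User-agent: *
# Keep uploaded documents out of the index.
Disallow: /wp-content/uploads/*.pdf$
Disallow: /wp-content/uploads/*.doc$
Disallow: /wp-content/uploads/*.docx$
# Images under uploads remain crawlable.
Allow: /wp-content/uploads/

With longest-match precedence, the more specific Disallow rules win for documents while everything else under uploads stays allowed.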