Different robots.txt for subdomains?
There is an online shop with regional subdomains, and each subdomain needs its own robots.txt.
When a search engine requests robots.txt on the St. Petersburg subdomain, it should receive the contents of robots_spb.txt, while the requested path stays /robots.txt.
What am I doing wrong?
# Moscow
RewriteCond %{HTTP_HOST} ^www.*$ [NC]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
RewriteRule ^robots\.txt /robots_msk.txt [NC,L]
# Saint-Petersburg
RewriteCond %{HTTP_HOST} ^spb.*$ [NC]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
RewriteRule ^robots\.txt /robots_spb.txt [NC,L]
# Nizhny Novgorod
RewriteCond %{HTTP_HOST} ^nnv.*$ [NC]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
RewriteRule ^robots\.txt /robots_nnv.txt [NC,L]
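For reference, a minimal sketch of rules that work in a typical setup. It assumes the .htaccess sits in the document root, hostnames like www.example.com, spb.example.com and nnv.example.com (hypothetical names), and that RewriteEngine has not been enabled elsewhere. The %{THE_REQUEST} conditions are dropped because the RewriteRule pattern already limits the rewrite to robots.txt, and the dots after the subdomain prefixes are escaped so each condition matches the literal hostname rather than any host that merely starts with those letters:

RewriteEngine On
# Moscow: main site on www.example.com (hypothetical hostname)
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^robots\.txt$ robots_msk.txt [L]
# Saint Petersburg subdomain
RewriteCond %{HTTP_HOST} ^spb\. [NC]
RewriteRule ^robots\.txt$ robots_spb.txt [L]
# Nizhny Novgorod subdomain
RewriteCond %{HTTP_HOST} ^nnv\. [NC]
RewriteRule ^robots\.txt$ robots_nnv.txt [L]

Because these are internal rewrites (no R flag), the crawler still sees the path /robots.txt; only the file actually served changes per subdomain. The substitution paths are relative to the directory holding the .htaccess, so the robots_*.txt files are assumed to live in the document root as well.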