How to change robots.txt for some hosted domains?
Given:
A shared hosting server: Ubuntu 12.04, Apache 2.2.22
The server hosts both "live" and "test" sites.
Live sites are attached to a normal domain name of the form domainname.ru.
Test sites respond to names of the form domainname.test.ru.
Task: when a "live" domain is requested, serve the site's real robots.txt; when a "test" domain is requested, serve the file /var/www/default/robots.txt, so that test sites are closed to indexing.
Note that domainname.ru and domainname.test.ru may share the same DocumentRoot and essentially refer to the same site; the real robots.txt must still be served for the first name, and the default one for the second.
An attempt to write the following in apache2.conf did not work:
<Files "robots.txt">
RewriteEngine on
RewriteCond %{HTTP_HOST} test\.ru$
RewriteCond %{HOST_URI} robots.txt$
RewriteRule ^(.*)$ /var/www/default/robots.txt [L]
</Files>
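Two problems with the attempt above: `%{HOST_URI}` is not a mod_rewrite variable, and mod_rewrite directives are not valid inside a `<Files>` section. A minimal working sketch, assuming the rules are placed at server or virtual-host level in apache2.conf and mod_rewrite is enabled:

```apache
# Server-level or <VirtualHost> context; RewriteRule cannot live in <Files>
RewriteEngine on
# HTTP_HOST carries the requested domain name; match any *.test.ru host
RewriteCond %{HTTP_HOST} \.test\.ru$ [NC]
# In per-server context the substitution is first tried as a filesystem
# path, so the shared default robots.txt is served directly
RewriteRule ^/robots\.txt$ /var/www/default/robots.txt [L]
```

Requests to live domains do not match the RewriteCond, so they fall through to the real robots.txt in each site's DocumentRoot.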
1. Add the .txt extension to the PHP handler:
AddType application/x-httpd-php .php .txt
2. Create robots.txt containing:
<?php
if ($_SERVER['HTTP_HOST'] == "domain1.ru") { readfile("robots-domain1.txt"); }
if ($_SERVER['HTTP_HOST'] == "domain2.ru") { readfile("robots-domain2.txt"); }
?>
3. Create the actual robots-domain1.txt and robots-domain2.txt files with the content intended for the robots.
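Step 2 can also match the test pattern generically instead of listing every domain. A sketch, assuming the shared file lives at /var/www/default/robots.txt as in the question; `robots-real.txt` is a hypothetical name for the per-site file:

```php
<?php
// Return the path of the robots file to serve for a given Host header.
// Any *.test.ru host gets the shared "closed" file; everything else
// gets the site's own file (robots-real.txt is a hypothetical name).
function robots_path(string $host): string {
    if (preg_match('/\.test\.ru$/i', $host)) {
        return '/var/www/default/robots.txt';
    }
    return __DIR__ . '/robots-real.txt';
}

// Inside the PHP-driven robots.txt one would then call:
// readfile(robots_path($_SERVER['HTTP_HOST']));
```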
Not quite what is needed: this solution requires modifying the sites' own files.
The server itself should serve the desired text for any request matching the pattern "*.test.ru/robots.txt", and otherwise serve the real file from the site's directory.