How to make one robots.txt for all subdomains?
The situation is this: the site has a large number of dynamic subdomains, such as username.site.com. They are all handled by the main domain's script via these rules in .htaccess:
RewriteCond %{HTTP_HOST} ^(www\.)?([^\.]+)\.site\.com$ [NC]
RewriteRule (.*) index.php?cn=%2 [NC,QSA]
Is your username.site.com file structure dynamic and shared by all users? Surely there isn't a separate directory for each user?
Then you just need to add a symlink to the main domain's robots.txt in that directory. Nothing complicated: a symlink is like a shortcut in Windows.
If there is no graphical interface, all it takes is:
user@server:~$ cd /path/to/usersite/folder
user@server:~$ ln -s /path/to/site.com/robots.txt robots.txt
Why not write a script that catches requests for robots.txt and serves the appropriate content, while all other requests load the requested script as usual?
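A minimal sketch of such a script (robots.php is a hypothetical name, and the rules it prints are placeholders, not code from this thread):

<?php
// robots.php - builds robots.txt dynamically for whichever subdomain was requested
header('Content-Type: text/plain; charset=utf-8');

$host = isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : 'site.com';

echo "User-agent: *\n";
echo "Disallow: /admin/\n"; // example rule only, adjust to taste
echo "Sitemap: http://" . $host . "/sitemap.xml\n";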
In the Apache config, this is done like this:
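For example, something along these lines in the virtual host (a sketch assuming a robots.php script in the document root; the PT flag passes the rewritten URL back through URL mapping so the PHP handler picks it up):

<VirtualHost *:80>
    ServerName site.com
    ServerAlias *.site.com
    DocumentRoot /path/to/site.com

    RewriteEngine On
    # hand every request for robots.txt to the script
    RewriteRule ^/robots\.txt$ /robots.php [PT,L]
</VirtualHost>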
In the nginx config:
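A sketch of the same idea (the fastcgi socket path is an assumption and depends on how PHP is wired up on the server):

server {
    listen 80;
    server_name site.com *.site.com;
    root /path/to/site.com;

    # serve robots.txt through the script instead of a static file
    location = /robots.txt {
        rewrite ^ /robots.php last;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php-fpm.sock;
    }
}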
It should work in .htaccess too, though the syntax may differ slightly.
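A possible .htaccess variant (a sketch, not the original answer's code): the robots.txt rule must come before the catch-all rule from the question so it is intercepted first.

RewriteEngine On
# intercept robots.txt on any subdomain before the catch-all rule
RewriteRule ^robots\.txt$ robots.php [L]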