How to dynamically create a robots.txt file for each subdomain?
The situation is this: the site is split into regional subdomains. There are many of them, and each subdomain needs to serve search bots its own robots.txt and sitemap. Hence the question: is it possible, and if so how, to generate robots.txt dynamically with a script for whichever subdomain the bot requested, i.e. when it fetches subdomain.site.ru/robots.txt?
Right now robots.txt and the sitemap sit in the root of the www.site.ru directory and are therefore served identically on all the other subdomains; I would like to change that.
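One common approach (a sketch, since the question doesn't state the site's stack) is to route requests for /robots.txt to a script and build the response body from the request's Host header, so the Sitemap line always points at the subdomain that was asked. A minimal Python illustration; the Disallow rule and sitemap path here are hypothetical examples:

```python
# Sketch: build a per-subdomain robots.txt body from the Host header.
# "site.ru" is the example domain from the question; the Disallow rule
# and sitemap.xml path are assumptions for illustration.
def robots_txt_for(host: str) -> str:
    """Return a robots.txt body whose Sitemap URL matches the subdomain."""
    host = host.split(":")[0].lower()  # drop a port suffix if present
    return (
        "User-agent: *\n"
        "Disallow: /admin/\n"  # hypothetical rule, same for every subdomain
        f"Sitemap: https://{host}/sitemap.xml\n"
    )

if __name__ == "__main__":
    # Same script, different Host header -> different robots.txt
    print(robots_txt_for("spb.site.ru"))
    print(robots_txt_for("msk.site.ru:443"))
```

On the server side this would be paired with a rewrite rule (e.g. in nginx or Apache) that maps /robots.txt on every subdomain to this handler, so no physical file is needed in each subdomain's root.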