Robots.txt and sitemap for subdomains?
Hello everyone. The site is built on Yii2, with first- and second-level subdomains and a single entry point, i.e. the subdomains do not live in separate folders. Each subdomain needs its own robots.txt and its own sitemap.xml,
since the content differs depending on which domain the user is on, and they are promoted in search engines as separate sites. The question is: where and how do I place these files for the different subdomains?
The SEO specialist told me straight away that dynamic files won't do; he needs static files so that he can set everything up himself?!
The entry point is index.php in the web folder:
defined('YII_DEBUG') or define('YII_DEBUG', true);
defined('YII_ENV') or define('YII_ENV', 'dev');

require(__DIR__ . '/../vendor/autoload.php');
require(__DIR__ . '/../vendor/yiisoft/yii2/Yii.php');
require(__DIR__ . '/../common/config/bootstrap.php');

// Pick the application by the requested host name.
switch ($_SERVER['HTTP_HOST']) {
    case 'frontend.dev':
    case 'growex.com.ua':
        define('YII_APP', 'frontend');
        break;
    case 'seeds.dev':
    case 'seeds.growex.com.ua':
        define('YII_APP', 'seeds');
        break;
    case 'szr.dev':
    case 'szr.growex.com.ua':
        define('YII_APP', 'szr');
        break;
    case 'backend.dev':
    case 'admin.growex.com.ua':
        define('YII_APP', 'backend');
        break;
    default:
        // Unknown host: redirect to the main site. The Location header
        // needs an absolute URL, and execution should stop here.
        header("HTTP/1.1 301 Moved Permanently");
        header("Location: https://growex.com.ua");
        exit;
}

define('YII_APP_DIR', Yii::getAlias('@apps') . '/' . YII_APP);
apps
-----backend - admin.site.com
-----frontend - site.com
-----seeds - seeds.site.com
-----szr - szr.site.com
common
console
vendor
web
-----assets
-----css
-----js
-----fonts
-----images
-----uploads
-----.htaccess
-----index.php - that very file; the entry point goes through it and it checks the host
.htaccess
requirements.php
init.php
composer
.... and all sorts of other stuff!
What other entry points? What is this, some kind of DLL or what? There is a domain, there are subdomains, and a shitload of URLs. Search engines treat each subdomain as a separate domain (roughly speaking), and they don't care what kind of structure you have behind it. A domain, a subdomain, a sub-subdomain... each has its own so-called root. That's where you put the sitemap and robots.txt (if we're talking default setups). Your SEO specialist simply hasn't hooked up a sitemap generator. Give him not static files, but an interface to manage the generation.
That's all the wisdom there is.
UPD: Whoa-whoa, "business process optimizers", easy there! What exactly are you going to automate in robots.txt?))) For the sitemap there's no alternative, but robots.txt is, in my opinion, the prime example, if not the only (!) file that calls for manual creation! I can't even imagine how it could be fully generated by a script, given the logic and meaning of its content!
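A minimal sketch of what such a generator could look like as a Yii2 action, assuming a hypothetical ActiveRecord model app\models\Page with url and app fields (not from the original post):

<?php
// SitemapController.php - minimal sketch of a dynamic per-subdomain sitemap.
namespace app\controllers;

use Yii;
use yii\web\Controller;
use yii\web\Response;

class SitemapController extends Controller
{
    public function actionIndex()
    {
        $response = Yii::$app->response;
        $response->format = Response::FORMAT_RAW;
        $response->headers->set('Content-Type', 'application/xml; charset=UTF-8');

        // YII_APP is defined in index.php, so each subdomain selects its own pages.
        $pages = \app\models\Page::find()->where(['app' => YII_APP])->all();

        $xml = '<?xml version="1.0" encoding="UTF-8"?>' . "\n"
             . '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
        foreach ($pages as $page) {
            $xml .= '  <url><loc>' . htmlspecialchars($page->url) . '</loc></url>' . "\n";
        }
        return $xml . '</urlset>';
    }
}

With a urlManager rule like 'sitemap.xml' => 'sitemap/index', every subdomain answers on /sitemap.xml with its own content.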
Try solving it all through .htaccess: for robots.txt, an internal rewrite sends the request to robots.php, which in turn, depending on the domain, serves the right file from robots_subdomain.txt (that's if the domains live in a database and everything is fully dynamic). Or, if the list of domains is known up front and doesn't change for a long time, rewrite directly per domain and serve a file like /robots/subdomain/robots.txt. The SEO guy gets static files he can edit, and you get everything wired up dynamically. Same approach for sitemap.xml.
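A minimal sketch of the first variant, assuming Apache with mod_rewrite; the file and directory names (robots.php, robots/) are just examples:

# .htaccess in the web root: internally rewrite robots.txt to a PHP dispatcher
RewriteEngine On
RewriteRule ^robots\.txt$ robots.php [L]

<?php
// robots.php - serves a static robots file picked by host, so the SEO
// specialist can still edit plain .txt files by hand.
$host = preg_replace('/[^a-z0-9.-]/i', '', $_SERVER['HTTP_HOST']);
$file = __DIR__ . '/robots/' . $host . '.txt';   // e.g. robots/seeds.growex.com.ua.txt
if (!is_file($file)) {
    $file = __DIR__ . '/robots/default.txt';     // fallback for unknown hosts
}
header('Content-Type: text/plain; charset=UTF-8');
readfile($file);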
I have exactly the same structure with subdomains. Just add a field in the admin panel for each subdomain, create an action, and set the right headers in that action. Done: the search engines will think it's a proper robots.txt, and the SEO specialist can write whatever he wants in the admin panel, in a perfectly ordinary text box. Everyone is happy.
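In Yii2 terms that could look something like this (the Subdomain model and its robots_txt field are assumptions; in practice it is whatever table the admin panel edits):

<?php
// In a frontend controller: serve robots.txt content stored per host
// in the database, with a text/plain header so crawlers see it as an
// ordinary static file.
public function actionRobots()
{
    $response = Yii::$app->response;
    $response->format = \yii\web\Response::FORMAT_RAW;
    $response->headers->set('Content-Type', 'text/plain; charset=UTF-8');

    // Hypothetical model: one row per subdomain, edited in the admin panel.
    $row = \app\models\Subdomain::findOne(['host' => $_SERVER['HTTP_HOST']]);
    return $row ? $row->robots_txt : "User-agent: *\nDisallow:";
}

Pair it with a urlManager rule 'robots.txt' => 'site/robots' so the request never has to hit a physical file.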