Should robots.txt have separate sections for each user-agent?
Is there any difference, in terms of search engine optimization, between this:

User-agent: *
Disallow:

and this:

User-agent: Yandex
Disallow:

User-agent: Googlebot
Disallow:
In your situation there is most likely no difference in terms of optimization.
Separate User-agent sections are sometimes needed to solve specific problems, though.
For example:
When everything sits under User-agent: *, a section blocked there is also blocked for the AdSense bot (User-agent: Mediapartners-Google), so it cannot read those pages and cannot serve targeted ads in them. In that case it makes sense to give it its own section, as in the sketch below.
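A minimal sketch of that layout (the /articles/ path is purely illustrative): the blanket rule keeps ordinary crawlers out, while the dedicated section re-opens the site for the AdSense bot.

User-agent: *
Disallow: /articles/
# ordinary crawlers are kept out of this section

User-agent: Mediapartners-Google
Disallow:
# an empty Disallow allows everything, so the AdSense bot can still read the pages

Note that a bot follows only the most specific User-agent group that matches it, so Mediapartners-Google obeys its own section and ignores the * one.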
Google and Yandex may also need different directives (Yandex supports some that Google does not), but until you actually run into that, don't worry about it; see the sketch below.
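For illustration, Clean-param is a Yandex-only directive that Google ignores, which is a typical reason to split sections; the parameter name ref and the /catalog/ path are assumptions for the example:

User-agent: *
Disallow:

User-agent: Yandex
Clean-param: ref /catalog/
# tells Yandex to drop the tracking parameter "ref" on /catalog/ URLs, collapsing duplicate pages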
There are general rules that all bots understand, but each bot also supports its own additional directives.
If you only use the general directives, then a single User-agent: * section is enough.
A separate section is justified only when a particular search engine (usually Google) has indexed a section you did not want indexed; then you write an explicit Disallow rule for that bot, as in the example below. Otherwise there is no need.
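A hedged example of that case (the /search/ path is hypothetical): every other bot keeps full access, and only Googlebot is barred from the unwanted section.

User-agent: *
Disallow:

User-agent: Googlebot
Disallow: /search/
# only Googlebot is blocked from the on-site search results

Keep in mind that robots.txt only stops crawling; pages that are already in the index may need a noindex directive or a removal request to actually disappear.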