megaterminator, 2015-03-25 19:33:36

Should robots.txt have separate sections for each user-agent?

Is there any difference, in terms of search engine optimization, between this:

User-agent: *
Disallow:

and this:

User-agent: Yandex
Disallow:

User-agent: Googlebot
Disallow:

?
The site is a simple business-card site. Am I right not to disallow anything from indexing?


4 answers
Vladislav Yanovsky, 2015-03-25
@megaterminator

In your situation there is most likely no difference in terms of optimization.
Separate User-agent sections can be needed to solve specific problems.
For example:
With User-agent: *, the AdSense bot (User-agent: Mediapartners-Google) cannot show ads in sections of the site that are blocked for everyone. In that case it makes sense to use separate sections.
You may also need different directives for Google and Yandex, but until you actually do, don't worry about it.
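A minimal sketch of what such a split might look like (the /private/ path is hypothetical):

# Block a section for all bots...
User-agent: *
Disallow: /private/

# ...but let the AdSense bot crawl it so ads can still be shown there
User-agent: Mediapartners-Google
Disallow:

Since a bot obeys only the group that matches it most specifically, Mediapartners-Google here ignores the * group entirely.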

Ilya Korablev, 2015-03-25
@swipeshot

User-agent: * means any bot.

un1t, 2015-03-25
@un1t

There are general rules that apply to all bots, and each bot also supports its own additional set of directives.
If you use only the general directives, then specifying * is enough.
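For illustration (the paths are hypothetical): Yandex understands the Clean-param directive, which other bots simply ignore, so it can get its own group:

# General rules that every bot falls back to
User-agent: *
Disallow: /tmp/

# Yandex matches this group instead of *, so the general rules are repeated here
User-agent: Yandex
Disallow: /tmp/
Clean-param: ref /articles/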

Viktor Taran, 2015-03-25
@shambler81

Only if a particular search engine (usually Google) has indexed a section you didn't want indexed. In that case you write a separate rule for that bot to explicitly disallow that section; otherwise there's no need.
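For instance, if Googlebot has picked up a /search/ section (hypothetical path) that should stay out of the index, the file might look like this:

# Everything stays open to all other bots
User-agent: *
Disallow:

# Explicitly keep Googlebot out of the unwanted section
User-agent: Googlebot
Disallow: /search/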
