robots.txt
Ruslan Yanborisov, 2018-12-10 12:40:14

Is it possible to first block all pages from indexing and then allow only some of them?

Will this robots.txt be correct?

User-agent: *
Disallow: /
Allow: /price/
Allow: /contacts

Will there be conflicts, or will the later lines take priority for the robot?
And in that case, how do I allow indexing of the main page, which lives at http://domen.ru/?
P.S. I know about the validation services, but I could not find information on this question online. Let your answers serve as the manual.)

1 answer
igorux, 2018-12-10
@Rus_K_o

See the Yandex.Webmaster cheat sheet:
******
The Allow and Disallow directives from the corresponding User-agent block are sorted by URL prefix length (shortest to longest) and applied sequentially. If several directives match a given page of the site, the robot selects the last one in the sorted list. Thus, the order of directives in the robots.txt file does not affect how the robot uses them.
In the event of a conflict between two directives with prefixes of the same length, the Allow directive takes precedence.
******
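
To make the quoted rule concrete, here is a minimal Python sketch of that sorting logic (an illustration only, not Yandex's actual implementation; wildcard characters such as * and $ are ignored for simplicity), applied to the robots.txt from the question:

def is_allowed(url_path, directives):
    # directives: list of (rule, prefix) pairs, e.g. ("Disallow", "/").
    # Sort by prefix length, shortest first; on equal lengths Allow sorts
    # after Disallow, so Allow wins the conflict, as the quote says.
    ordered = sorted(directives, key=lambda d: (len(d[1]), d[0] == "Allow"))
    allowed = True  # a page matched by no directive may be crawled
    for rule, prefix in ordered:
        if url_path.startswith(prefix):
            allowed = (rule == "Allow")  # the last matching directive wins
    return allowed

rules = [("Disallow", "/"), ("Allow", "/price/"), ("Allow", "/contacts")]
print(is_allowed("/price/item1", rules))  # True:  Allow: /price/ is longer than Disallow: /
print(is_allowed("/contacts", rules))     # True:  Allow: /contacts wins the same way
print(is_allowed("/about", rules))        # False: only Disallow: / matches
print(is_allowed("/", rules))             # False: the main page matches only Disallow: /

So your file is correct for /price/ and /contacts, and the order of the lines does not matter. Note, though, that the main page http://domen.ru/ is matched only by Disallow: /, so it stays blocked by this file as written.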
