Search Engine Optimization
Azami, 2018-12-28 13:51:51

How to write a condition in robots.txt?

Good afternoon, I'm asking for help with the following.
The site has pages in the form
/category/view/15
/category/view/15?status=1
/category/view/15?status=1&page=2
How do I prevent robots from indexing /category/view/15?status=1 while keeping /category/view/15?status=1&page=2 and /category/view/15 indexable?
Disallow: /category/view/*?status=1 doesn't help: it also blocks /category/view/15?status=1&page=2.
Thanks


2 answers
igorux, 2018-12-28
@Azami

Disallow: /category/view/*?status=1
Allow: /category/view/*?status=1&page=2
The Allow and Disallow directives from the matching User-agent block are sorted by URL prefix length (from shortest to longest) and applied sequentially. If several directives match a given page, the robot picks the last one in the sorted list, so the order of directives in the robots.txt file does not affect how the robot applies them.
If two directives with prefixes of the same length conflict, the Allow directive takes precedence.
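The sorting rule described above can be sketched in Python: a minimal matcher that treats * as a wildcard, picks the longest matching pattern, and lets Allow win ties (the Yandex/Google semantics quoted in the answer). Note one assumption: the Allow pattern below ends in &page= (the trailing 2 is dropped) so that every page number is permitted, not just page 2. This is an illustrative sketch, not a full robots.txt parser.

```python
import re

def compile_rule(pattern):
    # Translate a robots.txt pattern to a regex: '*' matches anything,
    # a trailing '$' anchors the end of the URL.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.compile(regex)

def is_allowed(rules, path):
    """Longest matching pattern wins; on a length tie, Allow beats Disallow.
    rules: list of (directive, pattern) tuples. Sketch only."""
    verdict = True  # allowed by default when nothing matches
    best_len = -1
    for directive, pattern in rules:
        if compile_rule(pattern).match(path):
            if len(pattern) > best_len or (
                len(pattern) == best_len and directive == "Allow"
            ):
                best_len = len(pattern)
                verdict = (directive == "Allow")
    return verdict

# Rules from the answer, with '&page=' generalized to cover all page numbers.
rules = [
    ("Disallow", "/category/view/*?status=1"),
    ("Allow",    "/category/view/*?status=1&page="),
]

print(is_allowed(rules, "/category/view/15"))                  # True
print(is_allowed(rules, "/category/view/15?status=1"))         # False
print(is_allowed(rules, "/category/view/15?status=1&page=2"))  # True
```

The Allow pattern is longer than the Disallow pattern, so for paginated URLs it sorts later and wins, which is exactly why the two-directive answer works.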

Dmitry Dart, 2018-12-28
@gobananas

Try
Disallow: /category/view/15?status=1&page=*
