How do I hide pseudo-pages with filters from Yandex indexing?
I have a new site with catalog pages, and different sort orders can be applied to those pages.
Example:
domen.ru/page/?order=popularity
domen.ru/page/?order=price_asc
and so on.
Yandex indexes these pages, but I don't want it to.
I added the following entries to robots.txt:
Disallow: /?order=*
Disallow: /?order=?
Disallow: /?order=popularity
Disallow: /?order=price_asc
Disallow: /?order=popularity?
Disallow: /?order=price_asc?
I also listed similar rules in the "Remove URL" section of Yandex.Webmaster, but these pages are still being indexed.
1. Yandex takes a long time to react to such changes.
2. Add a canonical tag on top of that.
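For example, a minimal sketch (the target URL is an assumption based on the examples in the question; point it at whatever the unfiltered catalog page really is): each sorted variant would declare the plain catalog page as canonical inside its <head>:

<link rel="canonical" href="https://domen.ru/page/">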
Well, firstly, your Disallow rules exclude the order parameter only for the main page, while your catalog lives at the /page/ subpath. Try
Disallow: /page/?*order=*
— that should work better (see the sketch below).
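A minimal robots.txt sketch along those lines (assuming the catalog lives under /page/ as in the question; the User-agent line is illustrative, narrow it to Yandex if you prefer):

User-agent: *
Disallow: /page/?*order=*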
Block these filter pages from indexing by adding a robots meta tag with "noindex, follow" to their code, just before </head> (see the example below).
For this to work, you need to remove the blocking directives for those pages from robots.txt; otherwise the robot never crawls them and never sees the "noindex, follow" tag.
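A sketch of how that tag sits in the markup of a sorted catalog page (the surrounding tags are only for illustration):

<head>
  <title>Catalog</title>
  <meta name="robots" content="noindex, follow">
</head>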