Should I close /filter in robots.txt?
The problem: Yandex.Webmaster complains about duplicate pages with identical meta tags. The duplicates caused by GET parameters can be handled by blocking them with Disallow: *?, but the filter pages are trickier. Their URLs look like this: site.ru/catalog/section/filter/filter-parameter
I read somewhere that filter pages can have a positive effect on indexing, because search robots see that the site is optimized for users. Should I add Disallow: *filter to robots.txt?
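For reference, a minimal robots.txt sketch of the two approaches being weighed (the paths are placeholders based on the URL pattern above, not a recommendation for any specific site). Note that Clean-param is a Yandex-specific directive that tells the robot to ignore listed GET parameters instead of blocking the pages outright:

```
User-agent: Yandex
# Option 1: block all URLs with a query string (hides GET-parameter duplicates)
Disallow: *?

# Option 2: block filter pages by path segment (matches /catalog/section/filter/...)
Disallow: *filter

# Alternative to blocking query strings: tell Yandex to ignore specific
# GET parameters so the canonical page is indexed without duplicates
# Clean-param: sort&page /catalog/
```

Option 2 removes filter pages from the index entirely, so any positive indexing effect from them would be lost; Clean-param only deduplicates parameterized URLs and does not affect path-based filter pages.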