Search Engine Optimization
mtclwn, 2021-12-29 10:42:39

Should I close /filter in robots.txt?

The essence of the problem: Yandex.Webmaster complains about duplicate pages with identical meta tags. The duplicates caused by GET parameters can be handled by adding Disallow: *? , but the filter pages are trickier. Their addresses look like this: site.ru/catalog/section/filter/filter-parameter

I read somewhere that filter pages can have a positive effect on indexing, because robots see that the site is optimized for users. Should I add Disallow: *filter to robots.txt?
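For reference, a minimal robots.txt sketch of the two options being weighed here; the exact patterns and the site.ru domain are illustrative, taken from the question rather than a verified setup:

```
User-agent: *
# Option already applied in the question: hide all URLs containing a
# query string, which removes the GET-parameter duplicates.
Disallow: *?

# Option being asked about: additionally hide every filter page.
# This would silence the duplicate-meta-tag warnings, but it would also
# remove the filter pages from the index entirely.
# Disallow: *filter
```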

1 answer
Dmitry, 2021-12-29
@mtclwn

"I read somewhere that filter pages can have a positive effect on indexing"

Right.

"robots see that the site is optimized for users"

It is not so much that robots see the site is optimized for users; rather, filter pages can bring a significant influx of additional traffic.
Ideally, refine the filter pages so that each one outputs its own meta tags. Then Yandex will stop complaining.
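To illustrate that suggestion, here is a minimal Python sketch of generating distinct meta tags per filter combination. The function name, URL structure, and wording are assumptions for illustration, not the asker's actual stack:

```python
# Sketch of "unique meta tags per filter page": build a title and
# description from the catalog section plus the active filter values,
# so each filter URL gets its own metadata.

def build_meta(section: str, filters: dict[str, str]) -> dict[str, str]:
    """Compose a title and description unique to this filter combination."""
    # Join active filter values into a human-readable suffix,
    # e.g. {"color": "red", "size": "XL"} -> "red, XL"
    suffix = ", ".join(filters.values())
    title = f"{section} - {suffix}" if suffix else section
    description = (
        f"Buy {section.lower()}"
        + (f" ({suffix})" if suffix else "")
        + " at site.ru. In stock, fast delivery."
    )
    return {"title": title, "description": description}

# Example: site.ru/catalog/phones/filter/color-red
print(build_meta("Phones", {"color": "red"}))
# {'title': 'Phones - red',
#  'description': 'Buy phones (red) at site.ru. In stock, fast delivery.'}
```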
