Do duplicate pages from filters and trade offers hurt promotion?
Please advise. There is an online store on Bitrix, built on a ready-made solution from Aspro.
The site has its own product filter in each catalog section.
Most of the products also have trade offers.
There is also a product search.
All of this creates a huge number of duplicate pages, because each filter combination, each trade offer, and each search phrase gets its own separate page. This is how Bitrix works.
For example: site.ru/catalog/section/filter/curtain-green/color-is-kiwvgjzm-or-cgj1jy0m/apply/
Totals in Yandex:
Loaded pages: 15,042 (that's a lot!)
Pages in search: 558
Excluded pages: 1,315
How does this affect the SEO of the site as a whole, and its ranking in the search engines, especially given that the search index already contains the relevant pages?
And how do you deal with such a large number of loaded pages?
Yes, it does have an effect — and not just the filters, pagination too.
Because it cannibalizes the query: in effect you get 50 pages with the same title and description.
Visitors land on all of them, so the page's "weight" gets spread across them.
And that is before we even get to the filters.
1. One option is to hide everything behind bxajaxid, but it is a so-so solution: Google doesn't care that it's Ajax — it will find the links anyway and index them. And with Ajax pagination you lose the ability to send a colleague a link to a specific page.
So far there is really only one solution:
1. Disallow all GET-parameter URLs in robots.txt. The site should run 100% on SEF (human-readable) URLs anyway, so pages with GET parameters (search, filter states and the like) don't need to be indexed.
2. Output rel=canonical in the header whenever a GET parameter is present — pointing, of course, to the current page without the GET parameters.
3. Optionally, move whatever you can behind Ajax.
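As a sketch of steps 1 and 2 above — the exact rules and the site.ru domain are illustrative assumptions, not part of the original answer — blocking GET-parameter URLs in robots.txt could look like this:

```
User-agent: *
# Block every URL that contains a query string
# (filter states, search phrases, sorting, etc.)
Disallow: /*?

User-agent: Yandex
Disallow: /*?
# Yandex-specific directive: declare insignificant GET parameters
# (PAGEN_1 is Bitrix's default pagination parameter)
Clean-param: sort&order&PAGEN_1 /catalog/
```

For step 2, the tag emitted in the template header would then have the form `<link rel="canonical" href="https://site.ru/catalog/section/">`, i.e. the current page URL with the query string stripped.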