How to remove links with GET parameters from search results using robots.txt?
Tell me how to remove links with GET parameters that show up as duplicates in Yandex search results via robots.txt.
View links:
https://site.ru/topic/155-otkryvaem-hajdy-s-drugih-resursov/page-419?hl=%20key%20%20collector
https://site.ru/topic/161-otkryvaem-hajdy-s-drugih-resursov/page-459?hl=%D0%BA%D0%B2%D0%B0%D0%B7%D0%B0%D1%80
User-agent: Yandex
Disallow:
Clean-param: hl /page*
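For context: the second field of Clean-param is a path prefix anchored at the start of the URL path, and per Yandex's general robots.txt rule syntax it may contain `*` as a wildcard. A rough sketch of that matching in Python (the exact semantics here are my assumption, not Yandex's implementation) shows why `/page*` does not cover the URLs above:

```python
import re

def clean_param_prefix_matches(path: str, prefix: str) -> bool:
    # Model the Clean-param path field as a prefix anchored at the
    # start of the path, with '*' matching any run of characters
    # (assumption based on Yandex's robots.txt rule syntax).
    pattern = "^" + re.escape(prefix).replace(r"\*", ".*")
    return re.match(pattern, path) is not None

path = "/topic/155-otkryvaem-hajdy-s-drugih-resursov/page-419"
print(clean_param_prefix_matches(path, "/page*"))   # False: the path starts with /topic/, not /page
print(clean_param_prefix_matches(path, "/topic/"))  # True
```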
I suspect that in your case it makes sense to specify
Clean-param: hl /topic/
or
Clean-param: hl /topic/*
I used the first option on a forum with human-readable (SEO-friendly) URLs.
If in doubt, you can always specify simply Clean-param: hl
without a path, and that will definitely work: it is unlikely that this parameter is used anywhere else on the site, let alone changes the content of other pages.
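Putting the advice together, the full file would then look something like this (a sketch: it assumes hl is the only GET parameter that needs cleaning and keeps the empty Disallow from the question, which allows crawling of everything):

```
User-agent: Yandex
Disallow:
Clean-param: hl /topic/
```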