robots.txt
Naked, 2021-09-14 12:15:05

How to remove links with GET parameters from the output using Robots.txt?

Please tell me how to use robots.txt to remove links with GET parameters that appear as duplicates in Yandex search results.

Example links:

https://site.ru/topic/155-otkryvaem-hajdy-s-drugih-resursov/page-419?hl=%20key%20%20collector

https://site.ru/topic/161-otkryvaem-hajdy-s-drugih-resursov/page-459?hl=%D0%BA%D0%B2%D0%B0%D0%B7%D0%B0%D1%80


I wrote the following, but I'm not sure it's correct:
User-agent: Yandex
Disallow:
Clean-param: hl /page*


The manual itself from Yandex https://yandex.ru/support/webmaster/robot-workings...
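For context, Clean-param asks the Yandex robot to ignore the named GET parameter when comparing URLs, so both example links above collapse to the same canonical page address. A minimal Python sketch of that normalization (illustration only, not part of robots.txt itself):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def strip_param(url, param):
    """Remove one GET parameter from a URL, mimicking what
    Clean-param asks Yandex to ignore when deduplicating."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k != param]
    return urlunsplit(parts._replace(query=urlencode(query)))

url = "https://site.ru/topic/155-otkryvaem-hajdy-s-drugih-resursov/page-419?hl=%20key%20%20collector"
print(strip_param(url, "hl"))
# -> https://site.ru/topic/155-otkryvaem-hajdy-s-drugih-resursov/page-419
```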


1 answer(s)
Zettabyte, 2021-09-14
@Zettabyte

I suspect that in your case it makes more sense to specify
Clean-param: hl /topic/
or
Clean-param: hl /topic/*
I used the first option on a forum with human-readable (SEO-friendly) URLs.
If in doubt, you can always specify simply Clean-param: hl with no path at all, and it should definitely work: it is unlikely that the hl parameter is used anywhere else on your site with a different meaning, so other pages should not be affected.
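Putting that together, a sketch of the full robots.txt section (assuming hl is the only parameter you need to drop):

```
User-agent: Yandex
Clean-param: hl /topic/
```

Note that an empty Disallow line is not required here: per Yandex's documentation, Clean-param consolidates the duplicate URLs rather than blocking them from the index.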
