How do I prevent Google from treating pages with GET parameters as standalone pages?
Hello!
I'll get straight to the point: Yandex's documentation clearly states that you can tell it to "glue together" pages with and without GET parameters, e.g. to treat /catalog and /catalog?term= as the same page and not index the second one.
This is what the Clean-param directive is for.
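For reference, a Yandex-specific Clean-param rule for the example above would look something like this in robots.txt (the `term` parameter and `/catalog` path are taken from the example; adapt to your own URLs):

```
# Yandex only: ignore the "term" GET parameter on URLs under /catalog,
# so /catalog?term=foo is treated as /catalog
User-agent: Yandex
Clean-param: term /catalog
```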
But Google does not understand this directive and even flags it as invalid in its robots.txt validator.
Meanwhile, the index now contains a lot of junk pages with parameters that I would like removed, since they produce many duplicate titles and descriptions.
Has anyone dealt with this? Any advice or help would be appreciated.
As far as I know, for Google this is solved with canonical links.
Alternatively, in robots.txt:
Disallow: /*?*dir=
Disallow: /*?*order=
Disallow: /*?*p=
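Note that Disallow lines only take effect inside a User-agent group, so a complete robots.txt along these lines would be (the parameter names dir, order and p are just the examples above; substitute your own):

```
# Applies to all crawlers; blocks crawling of any URL whose query
# string contains dir=, order= or p=
User-agent: *
Disallow: /*?*dir=
Disallow: /*?*order=
Disallow: /*?*p=
```

Keep in mind that Disallow only blocks crawling; URLs Google already knows about may stay in the index, which is why the canonical approach is usually preferred for deduplication.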
Use a canonical link: each page should declare its own fixed canonical URL, e.g. <link rel="canonical" href="//myweb.com/url123" />
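In context, the tag goes in the page's <head>. A minimal sketch for the /catalog example (the domain and markup here are illustrative):

```html
<!-- Hypothetical catalog page: both /catalog and /catalog?term=foo serve
     this markup, so Google folds the parameterized URLs into /catalog -->
<head>
  <title>Catalog</title>
  <link rel="canonical" href="//myweb.com/catalog" />
</head>
```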
I also recommend configuring URL Parameters in GSC (Google Search Console), carried over from the old WMT (Webmaster Tools). There, for each URL parameter, you can tell Google which ones control sorting, which ones are pagination, and so on. Google will then work out for itself which parameterized URLs it should crawl more often to pick up fresh content (e.g. pagination) and which ones only lead to duplicate content (filters, sorting, ...).