How to block all ?get= requests from indexing in robots.txt?
The site is self-written and uses human-readable (clean) URLs, so there are no query-string (?similar=) requests anywhere.
But duplicates of the form site.com/page/1/ == site.com?page=1 have started appearing out of nowhere.
I decided to block them from indexing via robots.txt and found this solution:
Disallow: /*?*
Is this correct?
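
For reference, a minimal robots.txt sketch along those lines. It blocks every URL that contains a query string while leaving clean URLs crawlable; the Sitemap line and its URL are only an illustrative assumption, not something from the question:

User-agent: *
# Disallow any URL containing "?". The trailing "*" in /*?* is implied by crawlers
# that support wildcards (e.g. Googlebot, Yandex), so /*? matches the same URLs.
Disallow: /*?
# Assumed sitemap location, shown here only so clean URLs are still discoverable.
Sitemap: https://site.com/sitemap.xml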