robots.txt: how do I configure the Clean-param directive for GET parameters?
Good afternoon. Two pages of the site have been indexed that are essentially duplicates of each other:
site.ru and site.ru/?
Some of the pages are built so that they take GET parameters.
Is it possible to exclude the address site.ru/? from indexing, or to solve the duplication problem some other way?
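One common approach (a sketch, not a definitive fix) is to use the Yandex-specific Clean-param directive for URLs that carry named GET parameters, and a Disallow pattern for query strings in general. The parameter names below are placeholders, not taken from the question:

```
# Yandex only: declare GET parameters that do not change page content,
# so parameterized URLs are collapsed into the canonical one.
# "utm_source" and "ref" are example names - substitute your own.
User-agent: Yandex
Clean-param: utm_source&ref /

# Broader alternative, understood by most crawlers:
# block any URL containing a query string from being crawled.
User-agent: *
Disallow: /*?
```

Note that Clean-param matches named parameters, so a bare `site.ru/?` (question mark with no parameters) may not be covered by it; for that case the `Disallow: /*?` rule or a `rel="canonical"` tag pointing at `site.ru/` is the usual remedy.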