robots.txt
Nataly_m, 2017-06-19 15:23:13

robots.txt: how to configure the Clean-param directive for GET requests?

Good afternoon. Two pages of the site have been indexed that are essentially duplicates:
site.ru and site.ru/?
Some of the pages are configured so that they use GET parameters.
Is it possible to somehow exclude such an address, site.ru/?, from indexing, or to solve the problem in some other way?
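For reference, a minimal robots.txt sketch for deduplicating URLs that differ only in GET parameters. Clean-param is a Yandex-specific directive, and the parameter names below (utm_source, utm_medium, utm_campaign) are placeholders; substitute the actual parameters your site uses:

```
User-agent: Yandex
# Ask Yandex to treat URLs differing only in these GET parameters
# as one page; the names here are placeholder examples.
Clean-param: utm_source&utm_medium&utm_campaign /
```

Note that Clean-param only collapses URLs that differ by the listed parameters; a bare site.ru/? with an empty query string is usually better handled with a rel="canonical" tag pointing to site.ru/ or a 301 redirect.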
