robots.txt
SvizzZzy, 2016-04-06 13:16:53

How do I block all ?get= query URLs from indexing in robots.txt?

The site is self-written and uses human-readable URLs, so by design there are no query-string (?param=) URLs anywhere.
Yet, for reasons unclear, duplicates of the form site.com/page/1/ == site.com?page=1 and so on started showing up.
I decided to remove them from indexing via robots.txt and found this solution:
Disallow: /*?*
Is it correct?
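For context, a minimal robots.txt applying that rule for all crawlers could look like this (a sketch, not a verified recommendation; whether you want to block every query string site-wide depends on your URLs):

User-agent: *
Disallow: /*?*

Note that for Yandex specifically there is also a Clean-param directive, which tells the crawler to ignore listed GET parameters instead of blocking the URLs outright, e.g. Clean-param: page / (hedged example; check the current Yandex documentation for the exact syntax).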


1 answer(s)
SvizzZzy, 2016-04-06
@SvizzZzy

I never did find any documentation on this...
But I tested the rule in the Google and Yandex webmaster tools, and everything works as expected.
Seems OK.
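To see why Disallow: /*?* catches the duplicate query URLs while leaving the clean paths alone, here is a minimal sketch of the wildcard matching that Google and Yandex document for robots.txt rules ('*' matches any run of characters, '$' anchors the end). This is an illustrative reimplementation, not the matcher the search engines actually run:

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Check whether a robots.txt Disallow rule matches a URL path
    (path plus query string), using documented wildcard semantics:
    '*' matches any sequence of characters, a trailing '$' anchors
    the match at the end of the URL."""
    # Escape regex metacharacters, then restore the robots.txt wildcards.
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    # Rules match from the start of the path.
    return re.match(pattern, path) is not None

# The rule from the question:
rule = "/*?*"

print(rule_matches(rule, "/?page=1"))    # query duplicate -> blocked
print(rule_matches(rule, "/page/1/"))    # clean URL -> still indexable
```

Running it shows that any URL containing a "?" is caught by the rule, while the human-readable paths are untouched, which matches what the webmaster tools reported.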
