robots.txt. Filter by GET parameters?
All requests go through index.php. Addresses look like this:
example.com/?c=controller&a=action&param=value
Is it possible, in robots.txt, to block crawling of addresses based on their GET parameters?
For example, ban all addresses that start with example.com/?c=main&a=test (other parameters may follow).
User-agent: Yandex
Disallow: /add.php?*user=
# disallows all 'add.php?' URLs that contain the 'user' parameter
It is better to read more here: help.yandex.ru/webmaster/?id=996567#996572
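Adapted to the addresses from the question, the same approach might look roughly like this (a sketch only; Disallow works as a prefix match, so this rule catches the parameters only in this exact order):
User-agent: *
Disallow: /?c=main&a=test
# blocks every address beginning with /?c=main&a=test, whatever follows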
It is also possible to block indexing via HTTP headers (but it is better not to do this). There is also a way to disallow it so that the parameters can reach the script in any order.
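If the order-independent method meant here is Yandex's Clean-param directive (the answer does not name it, so this is only a guess; it is one of Yandex's robots.txt extensions), a minimal sketch would look like this, with 'ref' and 'utm_source' as made-up example parameters:
User-agent: Yandex
Clean-param: ref&utm_source /
# Yandex treats URLs that differ only in these parameters as one page, regardless of the order the parameters appear in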