Hint, 2011-05-15 20:04:13

robots.txt. Filter by GET parameters?

All requests go through index.php. Addresses look like this:
example.com/?c=controller&a=action&param=value
Is it possible, in robots.txt, to block crawling of addresses based on their GET parameters?
For example, disallow all addresses starting with example.com/?c=main&a=test (further parameters may follow).
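For the specific pattern in the question, a minimal robots.txt sketch (relying on the standard behavior that Disallow matches by URL prefix) might look like this:

```
User-agent: *
# Disallow matches by prefix, so this blocks example.com/?c=main&a=test
# as well as any address where further parameters follow that prefix
Disallow: /?c=main&a=test
```

Note the leading slash: robots.txt rules are matched against the path-and-query part of the URL, not against the full hostname.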

3 answers
yakubovsky, 2011-05-15
@Hint

User-agent: Yandex
Disallow: /add.php?*user=
# disables all 'add.php?' scripts with the 'user' parameter

It is better to read more here:
help.yandex.ru/webmaster/?id=996567#996572

Sergey Beresnev, 2011-05-17
@sectus

It is possible to block indexing via HTTP headers (but it is better not to do this). You can also block it through — that way the parameters to the script can go in any order.
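The HTTP-header approach mentioned above is usually done with the X-Robots-Tag response header (a general mechanism supported by major crawlers, not something specific to this site). A sketch of the response that index.php could send for a controller/action combination it wants kept out of the index:

```
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
X-Robots-Tag: noindex, nofollow
```

Unlike robots.txt, this does not stop the crawler from requesting the page; it tells the crawler not to index what it fetched, which is one reason the answer advises against relying on it here.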

ildar r. khasanshin, 2017-09-20
@ildarkhasanshin

https://seolib.ru/tools/generate/robots/
