Search Engine Optimization
Seeker, 2020-11-28 16:49:32

How to hide some service pages for search engines?

I want to make sure that search engines do not index the service pages of my site. I am adding Disallow directives to robots.txt, but I have doubts about the syntax of this directive. Pages with parameters must not get into the index, such as:
https://domain.ltd/videos/fav?o=tf&t=m
and such:
https://domain.ltd/videos?o=tf&t=m
I added this line:
Disallow: /videos/*?
But I cannot figure out what directive would exclude URLs of the second type. Please advise.
Also, I read somewhere that expressions with an asterisk and a question mark only work for Googlebot. Is that true? I would like to exclude these pages from YandexBot and bingbot as well.
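For reference, here is the full block I am considering (just a sketch: the second Disallow line is my guess for the second URL type, and I am assuming the * wildcard is honored by all three crawlers):

User-agent: *
Disallow: /videos/*?
Disallow: /videos?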

2 answers
Angry Snowman, 2020-11-29
@good-villain

Have you tried noindex, nofollow in the link attributes and the robots meta tag?
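Something like this (a sketch; the meta tag goes in the <head> of each service page, and rel="nofollow" goes on the links that point to them — the link text here is just an example):

<meta name="robots" content="noindex, nofollow">
<a href="/videos/fav?o=tf&t=m" rel="nofollow">Favorites</a>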

Anton Velichko, 2020-11-30
@ReactorHeart

Also, I read somewhere that expressions with an asterisk and a question mark only work for Googlebot. Is that true? I would like to exclude these pages from YandexBot and bingbot as well.

Each search engine handles robots.txt directives differently.
To avoid guessing and reliably keep service pages out of the index, add a robots meta tag with noindex inside the <head> of every page you want to exclude. The Disallow rules in robots.txt must then be deleted for those pages (wherever the robots meta tag has been added to the page code), otherwise crawlers will be blocked from fetching the pages and will never see the tag.
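A minimal sketch of what that looks like on a service page:

<head>
<meta name="robots" content="noindex, nofollow">
</head>

Googlebot, YandexBot, and bingbot all honor this tag, but only if the page is not blocked in robots.txt, since a blocked page is never fetched and the tag is never read.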
