How do I block part of a URL from indexing in robots.txt?
Hello!
I need to block part of a URL pattern from indexing.
Example:
site.ru/repeated word_some word1.html
site.ru/repeated word_some word2.html
site.ru/repeated word_some word3.html
and so on.
The .html pages must be blocked from indexing only when the URL contains "repeated word_some word"; otherwise those .html pages must stay indexable.
I'd really appreciate any advice. I'm not a programmer myself, and I've been racking my brain over this :)
If I understand your condition correctly, then:
Disallow: /*repeated word*.html$
See "Using the special characters * and $" in the search engines' documentation.
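A minimal sketch of this rule in context, using the asker's full placeholder segment (the spaces come from the placeholder; a real URL segment would not contain them):

User-agent: *
# Block only .html URLs whose path contains the repeated segment
Disallow: /*repeated word_some word*.html$

Here each * matches any run of characters around the segment, and the trailing $ anchors the rule to URLs that end in .html, so all other .html pages stay indexable.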
Use the robots.txt testing tools in the search engines' webmaster consoles. After editing robots.txt, verify that the URLs you wanted to allow are still indexable and that the ones you wanted to block are disallowed.
The path value is used to determine whether a rule applies to a specific URL on the site. Apart from wildcards, the path must match the beginning of a valid URL. Non-7-bit ASCII characters in paths can be included as UTF-8 characters or as percent-escaped UTF-8 values.
Google, Bing, Yahoo, and Ask support a limited set of wildcards in path values, namely these two: * matches zero or more instances of any valid character, and $ matches the end of the URL.
So a condition like yours is hard to express precisely: you would either have to list every possible URL on the site manually, or use these two special characters to set up a simple pattern.
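For illustration, a short example of what these two characters do (the paths here are hypothetical):

User-agent: *
# * matches any sequence of characters within the path
Disallow: /private*/drafts
# $ anchors the rule to the end of the URL, so only URLs ending in .pdf match
Disallow: /*.pdf$

Without the $, the second rule would also block URLs that merely contain ".pdf" somewhere in the middle of the path.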
Hello!
I'm continuing my education :) Now I've run into the problem of disallowing, in robots.txt, a range of URLs that start with capital letters. I've searched the whole Internet and found examples with digit ranges, but not with letters.
Is it possible to specify that directories starting with any capital letter in the range A–Z are disallowed after niviuk_?
Sincerely, Alexander.
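A minimal sketch for this case, assuming the niviuk_ directories sit at the site root: robots.txt has no character-class or range syntax, and path matching is case-sensitive, so each capital letter would need its own rule.

User-agent: *
# No range syntax in robots.txt, so each letter is listed explicitly
Disallow: /niviuk_A
Disallow: /niviuk_B
Disallow: /niviuk_C
# ...continue for D through Y...
Disallow: /niviuk_Z

Each Disallow is a prefix match, so /niviuk_A also covers everything under paths such as /niviuk_Anything/.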