robots.txt
Alex-Broudy, 2019-02-18 13:11:12

How to disable indexing of only a group of pages in robots.txt?

Greetings!
Please tell me how to properly close pages of this type from indexing:

https://example.com/city/название-города/products/product-1

/city-name/ - there are more than 400 cities
city - must be open
city-name - must be open
products - must be open, but everything after products must be closed from indexing
Product pages can have arbitrary names, i.e. product-1 could be portrets or bouquets or whatever.


2 answers
igorux, 2019-02-18
@Alex-Broudy

city - must be open

Allow: /city/$
Make a common substring at the end of the URL for these pages, for example /city/city-name/city-info/
Then you get:
Disallow: /city/*/city-info/$
Allow: /city/*/products/$
Disallow: /city/*/products/*
# /city/*/products/ matches both rules, so the Allow directive is selected
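
Side note (not from igorux, just a way to sanity-check the directives): Google's documented matching uses the * wildcard, the $ end anchor, and picks the longest matching rule, with Allow winning a tie. As far as I know, Python's built-in urllib.robotparser does not handle these wildcards, so below is a minimal hand-rolled sketch of that matching logic with igorux's rules baked in; the /city/moscow/... paths are made-up examples.

import re

# Rules from igorux's answer above.
RULES = [
    ("Allow",    "/city/$"),
    ("Disallow", "/city/*/city-info/$"),
    ("Allow",    "/city/*/products/$"),
    ("Disallow", "/city/*/products/*"),
]

def pattern_to_regex(pattern: str) -> re.Pattern:
    # Translate a robots.txt path pattern to a regex: '*' becomes '.*',
    # a trailing '$' anchors the end, everything else is literal.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + regex + ("$" if anchored else ""))

def allowed(path: str) -> bool:
    # Pick the longest matching rule; Allow beats Disallow on a tie.
    # A path that no rule matches is allowed by default.
    matches = [(len(p), kind == "Allow") for kind, p in RULES
               if pattern_to_regex(p).match(path)]
    return max(matches)[1] if matches else True

for path in ["/city/moscow/",
             "/city/moscow/city-info/",
             "/city/moscow/products/",
             "/city/moscow/products/bouquets"]:
    print(f"{path} -> {'allowed' if allowed(path) else 'disallowed'}")

With these rules the sketch reports /city/moscow/products/ as allowed (the Allow and Disallow patterns tie on length, and the more permissive one wins) and /city/moscow/products/bouquets as disallowed, which is exactly the behavior the comment in the rules describes.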

A
Alex-Broudy, 2019-02-18
@Alex-Broudy

And if I later need to open for indexing everything that comes before products, including products itself, but close everything that comes after products, should it look like this?
Allow: /city/$
Disallow: /city/*/products/*
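
If it helps, running this proposed pair through the same sketch matcher from the answer above suggests the /products/ listing page itself would come out blocked, because the trailing * in Disallow: /city/*/products/* also matches an empty string; that is presumably why igorux's version keeps the extra Allow: /city/*/products/$ line. A quick check (reusing pattern_to_regex() and allowed() from the sketch, with made-up paths):

# Rule pair proposed in this comment.
RULES = [
    ("Allow",    "/city/$"),
    ("Disallow", "/city/*/products/*"),
]
print(allowed("/city/moscow/"))                   # True  - no rule matches, open by default
print(allowed("/city/moscow/products/"))          # False - trailing * matches the empty string
print(allowed("/city/moscow/products/bouquets"))  # False - everything after products is closed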
