robots.txt
Dmitry, 2017-12-04 17:18:30

Robots.txt setting for smart directory filter?

Maybe someone has a solution for configuring a robots.txt file for a directory filter.
To keep all the possible filter result pages out of the index, I added this to the robots file:
Disallow: /catalog/*/filter/*
I also remove all URLs with parameters from the index:
Disallow: /*?
There is a filter page with its own metadata, for example:
/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/
To have it indexed, I add:
Allow: /catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/
But for some reason pages with parameters are still being indexed:
/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?display=block
/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?PAGEN_1=3
How can I get rid of these?
The only thing that works is:
Disallow: /catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?*
But writing that out for every single filter is far too verbose...
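One possible consolidation, assuming the crawlers in question support the `$` end-of-URL anchor in robots.txt (both Google and Yandex document it): anchor the Allow rule to the exact URL, so any variant with a query string falls through to the Disallow rules. A sketch, not a tested configuration:

```txt
User-agent: *
# Block all filter combinations by default
Disallow: /catalog/*/filter/*
# Block every URL that carries GET parameters
Disallow: /*?
# Re-allow only the exact curated filter page; the trailing $
# means ".../apply/?display=block" does NOT match this rule
# and therefore stays blocked by "Disallow: /*?"
Allow: /catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/$
```

With longest-match precedence (as Google applies it), the anchored Allow wins for the clean URL, while parameterized variants match only the Disallow rules.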


2 answer(s)
Vladimir Kiper, 2017-12-05
@Kipeer

On the pages with GET parameters:
/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?display=block
/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?PAGEN_1=3
output:
<meta name="robots" content="noindex, follow">
The robots.txt file is only advisory for search engines, but this meta tag is a strict directive.
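The decision logic behind that advice can be sketched in a few lines: emit the noindex tag only when the requested URL carries a query string. `robots_meta` is a hypothetical helper for illustration, not part of any CMS API:

```python
from urllib.parse import urlsplit

def robots_meta(url):
    """Return a noindex/follow meta tag for URLs that carry GET
    parameters, and None for clean URLs.
    Hypothetical helper, not a real CMS function."""
    if urlsplit(url).query:
        return '<meta name="robots" content="noindex, follow">'
    return None

# Parameterized duplicate -> gets the noindex tag
print(robots_meta("/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?PAGEN_1=3"))
# Canonical filter page -> None, stays indexable
print(robots_meta("/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/"))
```

In a template this check would run server-side on the current request URL, so only the parameterized duplicates receive the tag.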

Dmitry, 2017-12-05
@kazankin

Well, so much for this service — one answer, and even that misses the point!
