How to set up robots.txt for a smart catalog filter?
Maybe someone has a solution for setting up robots.txt for a smart catalog filter.
To keep every possible combination of filter pages out of the index, I added this to robots.txt:
Disallow: /catalog/*/filter/*
I also exclude all URLs with query parameters from indexing:
Disallow: /*?
There is a filter page with its own metadata, for example:
/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/
To allow that page to be indexed, I add:
Allow: /catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/
But for some reason pages with parameters are still being indexed:
/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?display=block
/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?PAGEN_1=3
How can I get rid of these?
The only thing that helps is:
Disallow: /catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?*
But writing that out for every filter is far too verbose...
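A possible generalization, assuming the crawler honors the $ end-of-URL anchor (both Google and Yandex do): a single Allow that matches only "clean" apply pages, so any URL with a query string falls through to the Disallow rules. A sketch:
Disallow: /catalog/*/filter/*
Allow: /catalog/*/filter/*/apply/$
For a clean /apply/ URL the Allow rule is the longer match, so it wins; once ?PAGEN_1=3 is appended, the URL no longer matches the Allow (the $ anchors it to the end of the URL) and stays disallowed.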
On the pages with GET parameters:
/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?display=block
/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?PAGEN_1=3
write:
<meta name="robots" content="noindex, follow">
robots.txt is only advisory for search engines, whereas this meta tag is a strict directive.
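A minimal sketch of that check, in Python for illustration (the site itself presumably runs on Bitrix/PHP, where the equivalent test would live in the page template; the helper name is hypothetical):

from urllib.parse import urlsplit

def robots_meta(url: str) -> str:
    # Hypothetical helper: emit a noindex tag for any URL that carries
    # GET parameters, so pages like /apply/?PAGEN_1=3 drop out of the
    # index while the clean /apply/ page keeps its own metadata.
    if urlsplit(url).query:
        return '<meta name="robots" content="noindex, follow">'
    return ""

For example, robots_meta("/catalog/kirpich-v-stile-loft/filter/tcvet-is-red/apply/?PAGEN_1=3") returns the noindex tag, while the same URL without the query string returns an empty string.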