How to properly configure robots.txt for VirtueMart 2.5?
Good afternoon!
Can you please tell me how to properly configure the robots.txt file for an online store on VirtueMart 2.5, so that unnecessary pages don't end up in the index?
So far I have come to this option:
User-agent: Yandex
Disallow: /administrator/
Disallow: /cache/
Disallow: /cli/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /logs/
Disallow: /modules/
Disallow: /plugins/
Disallow: /tmp/
Disallow: /webalizer/
Disallow: */dirDesc
Disallow: */by*product_name
Disallow: */by*product_price
Disallow: /*print=1
Disallow: */index.php?
Disallow: */*created_on*
Disallow: */*product_in_stock*
Disallow: */askquestion*
Disallow: */notify*
Disallow: /*error=404
Disallow: */search?
Allow: /
Allow: /index.php?option=com_xmap&sitemap=1&view=xml
Looking through Yandex.Webmaster, I noticed that a lot of pages with addresses like these are being indexed:
https://www.biosalon.ru/shop/search/results,631-630
https://www.biosalon.ru/shop/results,14311-14310
Is there any point in indexing them, or is it better to block them? As far as I can tell, they don't contain anything meaningful.
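If those results pages should indeed be excluded, one possible sketch (assuming, based on the two example URLs above, that such pages always contain a `results,` segment in the path) would be an extra pattern rule in the existing block:

```
User-agent: Yandex
# Block paginated search/listing result pages such as
# /shop/results,14311-14310 and /shop/search/results,631-630
# (pattern is an assumption based on the example URLs)
Disallow: *results,*
```

Worth verifying in the Yandex.Webmaster robots.txt checker before deploying, since an over-broad wildcard could also block legitimate pages whose URLs happen to contain that substring.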
Also, what should be done with duplicates like the ones below? Yandex.Webmaster marks these pages as duplicates, even though the paths to the product cards differ: in one case the URL goes through the catalog and the brand, in the other simply through the catalog and the aquarium category.
https://www.biosalon.ru/shop/akvariumy-i-tumby/juw...
https://www.biosalon.ru/shop/akvariumy-i-tumby/juw...
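For duplicates reachable through different paths, robots.txt is usually the wrong tool, since blocking one variant just hides it rather than consolidating ranking signals. A common approach, sketched here on the assumption that the template (or a Joomla canonical plugin) can be edited, is to emit the same canonical link on every variant of the page; the product URL below is purely illustrative, not taken from the site:

```html
<!-- Placed in the <head> of both duplicate pages,
     pointing at whichever URL is chosen as the preferred one -->
<link rel="canonical" href="https://www.biosalon.ru/shop/akvariumy-i-tumby/example-product" />
```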