Search Engine Optimization
Lexxtor, 2015-02-02 02:24:01

Is it good for SEO to block some of the links in robots.txt?

robots.txt:

User-agent: *
Disallow: *sort=*

The main part of my site is a large table split across more than 100 pages, listing product barcodes, names, and prices. Above the columns there are five sorting links, so I blocked them, on the reasoning that what matters is indexing the content, not the number of links to it.
Before I did this, Google's results for some queries (barcodes) were flooded with my site (8 of the 10 links in the results led to it). That seems to be fixed now, but then I wondered: maybe it was actually a good thing? Yandex shows that 453 pages are blocked by the robots.txt file.


1 answer(s)
Michael, 2015-02-04
@Lexxtor

Sorting results, like your site's internal search results, should not be indexed by search engines: they will be treated as duplicates of the main content.
In Yandex, handle this with robots.txt; Google has a dedicated tool for it, URL Parameters (in Search Console).
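As a sketch, one way to do this in a single robots.txt: a wildcard Disallow for all crawlers, plus Yandex's Clean-param directive, which tells Yandex that a given GET parameter does not change page content. The /catalog path and the parameter name "sort" here are assumptions taken from the question; adjust them to the site's actual URLs.

robots.txt:

User-agent: *
Disallow: /*sort=

User-agent: Yandex
# Treat URLs differing only in the "sort" parameter as one page
# (hypothetical path; replace /catalog with the real table path)
Clean-param: sort /catalog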
