Search Engine Optimization
asdz, 2015-09-30 14:03:04

Are links with different GET parameters duplicates?

If a search engine finds a link with an arbitrary GET parameter in the URL, for example somesite.ru?fakeparam=blablabla, will it be considered a duplicate of somesite.ru or not?
That is, the point is that you can append such parameters to any page, and if the page does not process them, they will have no effect on the output. But if the search engine treats such links as duplicates, it turns out that you have to validate the parameters and either redirect, return a 404, or use canonical?
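
For reference, the "validate the parameters and redirect" option could look like this server-side. A minimal sketch, not from the question itself: Flask, the ALLOWED_PARAMS allow-list, and the parameter names "page" and "sort" are all assumptions here.

from flask import Flask, redirect, request, url_for

app = Flask(__name__)
ALLOWED_PARAMS = {"page", "sort"}  # assumption: the only parameters the page actually uses

@app.route("/")
def index():
    # Strip unknown GET parameters and 301-redirect to the clean URL,
    # so somesite.ru?fakeparam=blablabla collapses into somesite.ru
    unknown = set(request.args) - ALLOWED_PARAMS
    if unknown:
        clean = {k: v for k, v in request.args.items() if k in ALLOWED_PARAMS}
        return redirect(url_for("index", **clean), code=301)
    return "page content"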


4 answers
Dmitry, 2015-09-30
@EvilsInterrupt

I wouldn't consider it a duplicate out of hand. A parameter can influence how the result is formed, and I wouldn't draw conclusions about its effect just from the poorly chosen name containing the word 'fake'.
In other words, I would ask myself: does the parameter affect the result? If it does, then the pages are not duplicates.
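
A quick empirical way to answer that question: fetch the page with and without the parameter and compare the bodies. A sketch using the example host and parameter from the question (requests is an assumption; any HTTP client works):

import requests

# Fetch the page with and without the suspect parameter and compare bodies
base = requests.get("https://somesite.ru/").text
variant = requests.get("https://somesite.ru/", params={"fakeparam": "blablabla"}).text
if base == variant:
    print("identical content -> the URLs look like duplicates")
else:
    print("content differs -> the parameter affects the result")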

Aram Aramyan, 2015-09-30
@GreenBee

If the search engine adds both of these URLs to its database and the same content is served on both (and since the parameter is fake, the content will be the same), then yes, it will consider these pages duplicates.
canonical will indeed help you avoid this.
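For reference, that canonical is a single tag in the page's <head>. A minimal sketch, assuming somesite.ru from the question is the page you want to keep:

<link rel="canonical" href="https://somesite.ru/">

Placed on the parameterized variant (and usually on the clean page itself), it tells the search engine which URL should be indexed.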
Yandex also has robots.txt directives for this: https://yandex.ru/support/webmaster/controlling-ro...
but Google does not seem to support them.
In general, use a Sitemap and you will be happy.
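
A minimal sitemap.xml sketch for that last point, listing only the clean URL from the question's example (the sitemap should never list the parameterized variants):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://somesite.ru/</loc>
  </url>
</urlset>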

Dmitry Kashkovsky, 2015-10-02
@Zverushko

Yes, they are duplicates.
You can close them from indexing with Disallow: /*?,
or clean them up via Clean-param - https://yandex.ru/support/webmaster/controlling-ro...
or put a canonical on the page you want to keep in the search results.
Each of these options has drawbacks, so you need to choose individually for each site; a sketch of the two robots.txt options follows below.
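
A minimal robots.txt sketch of the first two options, using fakeparam from the question's example. These are alternatives, not meant to be combined: the first hides every URL that carries a query string (including any legitimate ones), the second is understood by Yandex only.

# Option 1: block all parameterized URLs from crawling, for all robots
User-agent: *
Disallow: /*?

# Option 2: tell Yandex to ignore fakeparam and index the clean URL
User-agent: Yandex
Clean-param: fakeparam /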

Nikita Tarasov, 2015-10-05
@tarasnick1

The example you gave is a duplicate, because the page URLs are different but the content served on them is the same. The simplest solution is to close duplicates generated by GET parameters via robots.txt.
You currently do not have a robots.txt on the site.
To close the duplicate from your example from indexing, place the following robots.txt file in the root of the site:
User-agent: Yandex
Disallow: /?fakeparam=
Host: www.somesite.ru
The robots.txt analyzer will help you - https://webmaster.yandex.ru/robots.xml#results :)
