Is there any harm from partially-duplicate content within the site?
Hello friends, the situation is as follows.
The site has about a hundred pages covering various problems, but the solution is essentially the same for all of them.
Only 10-20% of each page is unique: the H1, the first screen, and a couple of paragraphs that lead smoothly into a non-unique template shared by all the pages.
Everything else on the page (the remaining 80%) is essentially that template.
Is this dangerous for SEO? I really don't want to write unique content for 100 pages; in practice it would just be mindless rewriting of the same text. I want to write that 80% to a very high standard, for people.
PS: I considered turning the unique 20% into standalone articles that link to the page containing the 80%, but that feels fragmentary, as if you were handed the first chapter of a book and told to look for the continuation in another book.
Of course, you need to get rid of duplicate content. Google has been penalizing it since the Fred update, and Yandex introduced similar filters in spring 2020. This partly explains the "storms" in search results over the last six months.
Note that this applies not only to SEO texts but also to shingles, navigation, cross-cutting blocks, and so on. And by uniqueness I mean semantic uniqueness. This can hardly be fixed by rewriting; rather, trim the cross-cutting content or de-index the duplicate pages, which, in principle, are not needed in search anyway.
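To illustrate the de-indexing option mentioned above: in practice this is usually done with a standard robots meta directive, or with a canonical tag when one of the near-duplicate pages should be treated as the original. A minimal sketch (the URL here is hypothetical, not from the question):

```html
<!-- Option 1: keep the page for visitors, but ask search engines
     not to index it (links on the page are still followed) -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: point search engines at the "main" page so the
     duplicates consolidate to it; the href is a hypothetical example -->
<link rel="canonical" href="https://example.com/main-solution-page/">
```

Either tag goes in the `<head>` of each duplicate page; which one fits depends on whether the duplicates should disappear from search entirely (noindex) or merely defer to one primary page (canonical).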
What you describe fits the definition of web spam perfectly. It is a direct violation of the rules of every popular search engine. The root problem is most likely a flawed content structure, and pages like these may have trouble getting indexed at all.