robots.txt
zarabotok, 2015-05-21 16:36:46

Why aren't duplicate WordPress pages dropping out of Google's index?

Good afternoon everyone.
I have a personal blog, strofeyem.ru (still under construction), which I built myself.
The problem is that the site had content before, but through my inexperience there was no protection against duplicates. The site had 20 articles, yet 120 pages in Google's index!
Later, for reasons too long to explain here, all the material was removed from the site, and I started over, this time more wisely :)
Now the site has 6 posts, but Google still shows 111 pages. In a month, only 5 non-existent pages have left the index. Why is it so slow, and how can I speed up the process? These pages really worry me, and I want to get rid of them.
The site has the following redirect rules in .htaccess, and the All in One SEO Pack plugin is installed and configured:

RewriteCond %{HTTP_HOST} ^www\.strofeyem\.ru
RewriteRule ^(.*)$ http://strofeyem.ru/$1 [R=301,L]
RewriteRule (.+)/feed /$1 [R=301,L]
RewriteRule (.+)/tag /$1 [R=301,L]
RewriteRule (.+)/attachment /$1 [R=301,L]
RewriteRule (.+)/category /$1 [R=301,L]
RewriteCond %{QUERY_STRING} ^attachment_id= [NC]
RewriteRule (.*) $1? [R=301,L]
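For reference, here is a sketch of what rules like these do to typical duplicate URLs. The example URLs are hypothetical illustrations, not pages confirmed to exist on the site, and the rules assume `RewriteEngine On` appears earlier in the .htaccess file:

```apache
# Must appear once before any RewriteCond/RewriteRule for them to take effect
RewriteEngine On

# Illustrative effects of the rules above (example URLs are assumptions):
#   http://www.strofeyem.ru/some-post/     -> http://strofeyem.ru/some-post/  (canonical host)
#   http://strofeyem.ru/some-post/feed/    -> http://strofeyem.ru/some-post/  (strip feed duplicate)
#   http://strofeyem.ru/?attachment_id=5   -> http://strofeyem.ru/            (drop attachment query string)
```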


2 answer(s)
dadster, 2015-05-21
@dadster

Google Webmaster Tools has a feature for this: Google Index -> Remove URLs.
Enter the URLs you want removed from the index there.
But it will really take a long time, and it's impossible to predict how long in advance :)

Igor Petrov, 2015-05-21
@Slami6e

Don't add anything in Google Webmaster Tools. Look at your own setup instead: why did you block the robot in robots.txt? Your pages carry the tag `<meta name="robots" content="noindex,follow">`, which tells Google not to index them. But for Google to read that tag, it has to fetch the page, and you yourself have blocked robots from doing so.
That's why nothing drops out of the index: the robot sees that the page exists, but it cannot read it because of the ban. Open everything up to the robot.
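To make this concrete, here is a minimal robots.txt sketch that would let Googlebot crawl the pages and see the `noindex,follow` meta tag. The asker's actual robots.txt was not shown, so which paths it currently blocks is an assumption; the sitemap location is also assumed:

```
# robots.txt — allow crawling so Googlebot can reach pages
# and read their <meta name="robots" content="noindex,follow"> tag
User-agent: *
Disallow:

# Assumed sitemap location (not confirmed in the question)
Sitemap: http://strofeyem.ru/sitemap.xml
```

An empty `Disallow:` directive blocks nothing; once the robot can fetch a page, the per-page `noindex` meta tag (emitted by All in One SEO Pack on duplicate pages) can actually take effect.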
