How to remove thousands of obsolete urls from the search engine index?
Originally, the site had URLs like
site.ru/books/books_id=3
site.ru/article/article_id=5
The URLs have since changed slightly and now look like this:
site.ru/books/3
site.ru/article/5
But the old addresses haven't left the search engines' indexes, and they now produce duplicates, since site.ru/books/books_id=3 and site.ru/books/books_id=45 both resolve to the same page: site.ru/books
The old addresses need to be removed from the index somehow. Doing it manually isn't feasible (at least in Yandex): there are thousands of pages. What are the options?
I have a few ideas:
1. Block all the directories from indexing in robots.txt, then open them again after a while. But I'm not sure this will actually remove the old addresses from the index, and cutting almost all content off from indexing for an unknown period feels wrong...
2. Come up with some rule in .htaccess that redirects the old addresses to the new ones. But again, I'm not sure a redirect removes an address from a search engine's index.
3. Make the old addresses return a 404 error in the site code. That would probably get them dropped from the index, but how long will the robot take to recrawl them all, and will it even bother?
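
For the first idea, robots.txt rules are prefix matches, so it should be possible to block only the old-style addresses without touching the new ones. A sketch, assuming the sections really are /books/ and /article/ as in the examples above:

```
User-agent: *
# Matches /books/books_id=3, /books/books_id=45, etc.,
# but NOT the new /books/3 style URLs
Disallow: /books/books_id=
Disallow: /article/article_id=
```

Note, though, that Disallow only stops crawling; pages already in the index may linger there for a while.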
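
For the second idea, assuming Apache with mod_rewrite enabled, a 301 (permanent) redirect rule could look roughly like this; search engines generally treat a 301 as "the page has moved" and replace the old URL with the new one over time:

```apache
# .htaccess sketch; patterns assume the URL shapes from the question
RewriteEngine On
# site.ru/books/books_id=3 -> site.ru/books/3
RewriteRule ^books/books_id=([0-9]+)$ /books/$1 [R=301,L]
# site.ru/article/article_id=5 -> site.ru/article/5
RewriteRule ^article/article_id=([0-9]+)$ /article/$1 [R=301,L]
```

The [0-9]+ capture carries the numeric id across, so every old address maps to its exact new counterpart instead of the section root.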
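
The third idea doesn't have to live in the site code either: assuming Apache, mod_rewrite's [G] flag answers "410 Gone" directly, which is a stronger removal signal than a generic 404:

```apache
# .htaccess sketch: old-style URLs answer 410 Gone
RewriteEngine On
RewriteRule ^books/books_id=[0-9]+$ - [G,L]
RewriteRule ^article/article_id=[0-9]+$ - [G,L]
```

This loses the visitors arriving via old links, though, so the 301 variant is usually preferable if the new URLs exist.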
Can anyone who knows this area please advise?