Yandex
ESSES1868, 2019-04-26 08:29:23

Blocking crawling in robots.txt vs. 301 redirects: what's the right way?

Friends!
Some time ago I removed a section from the site (~2,000 old pages). For the ~200 pages that had good inbound links, I set up 301 redirects to similar current pages. Can I block search robots from crawling the removed section, or would they then never discover the 301 redirects? I see no point in robots hitting 404 errors on the remaining ~1,800 pages that have no redirects; it would be simpler to disallow the whole section so they stop knocking on it. Apparently I need to wait until the search engines recrawl the pages and see the redirects, and only then block the section? What's the best way to do this?
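To illustrate the asker's concern: a robot that obeys robots.txt checks the rules before requesting a URL, so a disallowed page's 301 redirect is never even fetched. A minimal sketch using Python's standard `urllib.robotparser` (the `/old-news/` path and example URLs are hypothetical, not the asker's real ones):

```python
# Sketch: why a robots.txt Disallow hides a 301 from crawlers.
# A compliant bot calls can_fetch() BEFORE requesting the URL,
# so it never receives the redirect response for blocked paths.
from urllib import robotparser

rules = robotparser.RobotFileParser()
# parse() accepts the robots.txt body as a list of lines
rules.parse([
    "User-agent: *",
    "Disallow: /old-news/",
])

# The bot refuses this request, so any 301 set up on the URL is never seen.
print(rules.can_fetch("*", "https://example.com/old-news/item-42.html"))  # False
# Pages outside the blocked section are still crawlable.
print(rules.can_fetch("*", "https://example.com/articles/item-42.html"))  # True
```

This is exactly the trade-off in the question: the Disallow stops the 404 noise, but it also stops the robot from ever seeing the 301s.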
ADDITION
In fact, I'm leaning towards blocking the old section from indexing and removing the redirects. These pages are of no value: they were brief news items written before 2007. I then switched to a new engine and left the old one behind with these pages. They were not of high quality and also produced duplicates. There have been no real users on them for many years, and there cannot be.
Meanwhile, Yandex still sees links to these pages on other sites and keeps visiting them. As a result, the crawl history is constantly full of pages returning 404.
The only thing is that I'd hate to lose the links from strong sites like news.com.ru or ixbt.com, which probably passed weight to my site as a whole. That's why I set up 301 redirects for those pages.
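As a sketch of how such per-page redirects are often configured (nginx syntax; both paths below are hypothetical examples, not the asker's real URLs):

```nginx
# Permanently redirect an old URL that has good inbound links
# to its closest current equivalent.
location = /old-news/2006/item-42.html {
    return 301 /articles/item-42/;
}
```

An exact-match `location =` block with `return 301` per preserved URL keeps the link weight flowing to the new pages while the other removed URLs simply return 404.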
Also, now that I'm switching to HTTPS, Yandex will see non-existent pages go through the site-wide 301 (HTTP to HTTPS) and then get a 404 from the target, which I don't like either.
So, maybe just block the sections and forget about them?


1 answer(s)
VeryLongAgoDid, 2019-04-26
@VeryLongAgoDid

Let's try to unpack this:
Roughly 2,000 pages in total. About 200 were kept: they moved and got 301 redirects. The other ~1,800 were removed and are no longer relevant to the site.
Now the conclusion. You set up the redirects so the robot wouldn't lose the working links, and now you want to disable them altogether. If you close the entire section in robots.txt, you effectively tell the search engine that all 2,000 pages are gone, which isn't true. The pages won't return 404, but they will become inaccessible to the search engine, which means they too will gradually drop out of the index.
It seems to me you have already done everything needed for correct handling by setting up the redirects. There is no point in banning the crawl of a section that no longer exists; requests to it will die off on their own.
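If the asker does eventually decide to block the dead section once the redirects have been processed, the rule itself is a one-liner. A minimal robots.txt sketch, assuming a hypothetical `/old-news/` path for the removed section:

```
User-agent: *
Disallow: /old-news/
```

Note that the ~200 redirected URLs should be left out of the Disallow (or the rule delayed) until search engines have crawled them and registered the 301s; otherwise, as discussed above, the redirects are never seen.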
