How do I stop Google from indexing the https version of my site?
Greetings, colleagues!
I have a site on shared hosting. It used to have SSL and everything was fine; now the SSL is gone, but the site is still in Google's index as https://site.ru.
What's the problem? When https://site.ru is accessed and there is no certificate, the hosting serves a stub page (a normal response with code 200), and that stub comes with a robots.txt that forbids indexing. So when I search for my site, the https version shows up with a message that indexing is blocked by robots.txt (the stub's one). What have I tried?
- added the Host: site.ru directive to my robots.txt
- added a rel="canonical" link on the home page, hoping Google would understand
Useless.
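For reference, the two attempts above look roughly like this (site.ru as in the question). Note that the Host directive is a Yandex-specific extension; Google is documented to ignore it, which may be part of why it had no effect:

```
# robots.txt on the real site (Host is honored by Yandex only)
User-agent: *
Host: site.ru
```

```html
<!-- in the <head> of every page, pointing at the preferred http:// URL -->
<link rel="canonical" href="http://site.ru/">
```

The canonical tag only helps if Google can actually crawl the https pages and see it; since the hoster's stub replaces your pages entirely, the tag never reaches the crawler on the https side.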
The hosting support replied that they don't care, this is how the hosting works.
I see two workable options: change hosting or buy a new SSL certificate. Is there anything else worth trying, without drastic moves?
https://debian.pro/1628 — I told you so! =) *GIF with Elliot*
In fact, no way. You can try submitting a correct sitemap to the search engines through their webmaster services, but it's better not to use hosting like this at all.
The hoster ought to close https on the shared IP and issue a separate IP to those who want SSL.
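A minimal sitemap for this case might look like the sketch below (site.ru and the URL list are placeholders). The key point is that every <loc> entry uses the http:// scheme, so the search engine is fed only the non-SSL URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- list only http:// URLs so the https stub is never re-discovered -->
  <url>
    <loc>http://site.ru/</loc>
  </url>
</urlset>
```

Upload it to the site root and submit it in Google Search Console; this nudges the index toward the http version, but it does not force removal of the https duplicates.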