PHP
Ewboa, 2014-12-24 22:00:11

How do I prevent third-party sites from extracting information from pages on my site?

For example, suppose I copy a link to a page on my site and paste it into VK. How can I make my site stop VKontakte from generating a preview from that link? Is that even possible?


6 answer(s)
Azim Kurt, 2014-12-24
@Symphony

Delete your site.

m0rd, 2014-12-24
@m0rd

For VKontakte specifically it's probably doable. Look at the User-Agent its crawler sends when it fetches pages and simply block it (that's the first thing that comes to mind). For the general case, no — or only at a serious cost to usability.
Upd. An option for the general case: take the visitor's IP, run whois or nslookup on it, and decide whether it's a user or a parser.
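The User-Agent check above can be sketched as follows. The substring is an assumption: VK's preview fetcher is widely reported to identify itself with "vkShare", but verify against your own access logs before blocking.

```python
# Sketch of the User-Agent ban suggested above (framework-agnostic).
# "vkShare" is an assumption about VK's preview bot -- confirm it in
# your access logs first.
BLOCKED_UA_SUBSTRINGS = ("vkShare",)

def is_blocked_bot(user_agent: str) -> bool:
    """Return True if the request should be refused (e.g. with a 403)."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in BLOCKED_UA_SUBSTRINGS)
```

In PHP (the question's tag) the equivalent check would read `$_SERVER['HTTP_USER_AGENT']` and emit a 403 header. The whois/nslookup variant from the "Upd." is the same idea applied to the IP instead of the header.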

tzlom, 2014-12-25
@tzlom

The trick is simple: set a cookie on the visitor's first request. If a request arrives without the cookie, set it and serve a page that performs the redirect via JS or an HTML meta refresh — not a 403. A real user won't notice anything, but you can serve the robot a whole landing page of its own.
You just have to make sure search engines don't suffer from this, and that part is easier — they don't hide who they are.
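The cookie-gate logic above can be sketched as a single decision function. The cookie name and the bot whitelist are illustrative assumptions, not a fixed scheme.

```python
# Minimal sketch of the cookie gate described above (framework-agnostic).
# The cookie name "seen" and the whitelist entries are assumptions.
SEARCH_BOTS = ("googlebot", "yandexbot", "bingbot")  # they identify themselves

def gate(cookies: dict, user_agent: str) -> str:
    """Decide what to serve: real 'content', or an 'interstitial' page
    that sets the cookie and redirects back via JS / meta refresh."""
    ua = user_agent.lower()
    if any(bot in ua for bot in SEARCH_BOTS):
        return "content"        # never gate search engines
    if cookies.get("seen") == "1":
        return "content"        # cookie present: returning visitor
    return "interstitial"       # first hit: set cookie + JS redirect
```

A preview bot that neither stores cookies nor executes JS never gets past the interstitial, which is exactly the "landing page for the robot" from the answer.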

haiku, 2014-12-24
@haiku

Look at the headers of whoever is hitting you, and cut off the unwanted ones at the root.
You can even do it without any code, with an ordinary firewall à la iptables or the like.
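A hedged sketch of the firewall approach: the subnet below is a documentation placeholder (TEST-NET-3), not VK's real range — find the bot's actual addresses in your access logs or via whois first.

```shell
# Drop all traffic from a specific address range -- no application code.
# 203.0.113.0/24 is a placeholder; substitute the bot's real subnet.
iptables -A INPUT -s 203.0.113.0/24 -j DROP

# Alternatively, match the User-Agent string in the packet payload
# (works only for plain HTTP, not HTTPS, and is less reliable):
iptables -A INPUT -p tcp --dport 80 -m string --string "vkShare" --algo bm -j DROP
```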

Padabum, 2014-12-24
@Padabum

Look up how to write robots.txt and disallow indexing.
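For reference, the most restrictive robots.txt looks like this. Note that the file is purely advisory: well-behaved crawlers honor it, but a preview bot is free to ignore it.

```
# robots.txt at the site root -- asks all crawlers to stay out entirely.
User-agent: *
Disallow: /
```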

SagePtr, 2014-12-24
@SagePtr

A completely paranoid option: render the site's content with JavaScript on the client. Then search engines won't be able to index it either.
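A minimal sketch of client-side rendering, assuming a hypothetical JSON endpoint and element id. A preview bot that doesn't execute JS sees only the empty shell page.

```javascript
// Sketch of the "render everything client-side" idea above. The data
// shape, endpoint, and element id are illustrative assumptions.
function renderArticle(article) {
  // Build the markup on the client instead of serving it as HTML.
  return "<h1>" + article.title + "</h1><p>" + article.body + "</p>";
}

// In the browser you would fetch the data and inject the result:
//   fetch("/api/article/1")
//     .then(function (r) { return r.json(); })
//     .then(function (a) {
//       document.getElementById("app").innerHTML = renderArticle(a);
//     });
```

The trade-off named in the answer is real: anything assembled only in the browser is invisible to non-JS crawlers, search engines included.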
