Can links be made safer?
Imagine the following case: a reputable resource, where a perfectly respectable author posts a link to a "good" page. After posting, the content at that link changes and, at best, no longer contains the information originally referred to; at worst, it points to malicious code. Note that the link sits on a trusted site, so users follow it without a second thought.
One option is to put the burden of link validity on the reputable site itself: its developer writes a script that periodically checks outgoing links for changes.
But in my opinion, link validation should be part of a web standard. The simplest approach is to embed a unique hash of the page into the URL; on navigation, the browser checks it against the current hash of the page and raises a warning if they do not match.
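A minimal sketch of the hash-in-URL idea described above. The `#sha256=` fragment convention, the function names, and the URL are all hypothetical illustrations, not part of any existing standard:

```python
import hashlib
import urllib.parse

def hash_url(url: str, content: bytes) -> str:
    """Append a content hash to a URL as a fragment (hypothetical convention)."""
    digest = hashlib.sha256(content).hexdigest()
    return f"{url}#sha256={digest}"

def verify(hashed_url: str, fetched_content: bytes) -> bool:
    """Check fetched content against the hash embedded in the URL."""
    fragment = urllib.parse.urlparse(hashed_url).fragment
    expected = fragment.removeprefix("sha256=")
    return hashlib.sha256(fetched_content).hexdigest() == expected

page = b"<html>original article</html>"
link = hash_url("https://example.com/article", page)
print(verify(link, page))                       # content unchanged -> True
print(verify(link, b"<html>tampered</html>"))   # content changed -> False
```

A browser or crawler implementing the idea would fetch the page, recompute the hash, and warn the user on mismatch, much as Subresource Integrity does today for scripts and stylesheets.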
When the hash changes legitimately, the page should notify everything that links to it, and those links should be recalculated or removed (if, for example, the topic changes or the information is deleted).
This idea has a number of drawbacks tied to the dynamic nature of the web, but the method is suitable, for example, for links to articles or for validating a page's source code.
Any ideas or thoughts about this?
Are you religiously opposed to using magnet URIs, or do you want us to help you reinvent them?
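For context on the comment above: magnet URIs already identify content by a hash of the content itself rather than by location. A small illustration of building one, assuming the common `urn:sha1` exact-topic form (the file name and content here are made up):

```python
import base64
import hashlib

def magnet_for(content: bytes, name: str) -> str:
    # Magnet URIs carry a content hash in the "xt" (exact topic) parameter;
    # the urn:sha1 form uses a Base32-encoded SHA-1 digest.
    digest = hashlib.sha1(content).digest()
    xt = base64.b32encode(digest).decode("ascii")
    return f"magnet:?xt=urn:sha1:{xt}&dn={name}"

print(magnet_for(b"<html>original article</html>", "article.html"))
```

Because the identifier is derived from the bytes themselves, any change to the content yields a different URI, which is exactly the mismatch-detection property the question asks for.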