Parsing other people's sites - is it good or bad?
Hello!
Once again I thought about this topic.
I would like to hear your opinion on the following points:
- Is it legal, and can you bring any claim against another site for stealing your materials? Is the content protected at all, given that it cannot be patented? And if so, under which legal article can you sue?
- How should the source link be correctly indicated on a site that took the materials?
- How do search engines attribute authorship to content? As I understand it, sites with stolen content often end up credited as the authors.
Like many others, I run into this problem: content ordered from copywriters gets stolen, and the search engine often treats the thieves as the authors.
But I also sometimes have ideas to parse other sites myself, not for articles, but for directories and the like. I wonder what risks that carries and whether there is a legal and proper way to parse?
So I am interested in the opinion of experts.
Thank you!
When parsing, what matters is the number of sources and what you do with the information.
If you make a site my-lenta.ru and parse all the news from lenta.ru into it, you are breaking the law.
News aggregators, on the other hand, do not break the law. By collecting information from several directories and building your own, more complete directory (while also indicating links), you are not breaking the law but creating something new. The links should be shown alongside the displayed content and should point to the page you took it from.
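The two practices the answer recommends, checking what a site allows before parsing it and keeping a visible link to the source page next to every aggregated item, can be sketched in a few lines. This is a minimal illustration using only the Python standard library; the bot name, URLs, and the `Item` structure are made-up examples, not anything from the discussion above.

```python
from dataclasses import dataclass
from urllib import robotparser


@dataclass
class Item:
    """A hypothetical aggregated entry: content never travels without its source URL."""
    title: str
    source_url: str


def allowed_by_robots(robots_txt: str, agent: str, url: str) -> bool:
    """Check a robots.txt body (fetched separately) before parsing a page."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)


def render(item: Item) -> str:
    """Render an entry with the source link shown alongside the content."""
    return f"{item.title} (source: {item.source_url})"


if __name__ == "__main__":
    # Example robots.txt of a site you want to aggregate (hypothetical).
    robots = "User-agent: *\nDisallow: /private/"

    print(allowed_by_robots(robots, "my-bot", "https://example.com/news/1"))
    print(allowed_by_robots(robots, "my-bot", "https://example.com/private/x"))

    item = Item(title="Example headline", source_url="https://example.com/news/1")
    print(render(item))
```

Respecting robots.txt is not by itself a legal defense, but combined with per-item attribution it is the usual baseline for the kind of "honest aggregation" the answer describes.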