Real protection against parsing exists only for data that requires authorization to access, combined with per-user limits on that data (the amount available to a user overall or over a period of time, e.g. per day or per month).
Data that is accessible anonymously cannot, in general, be protected from being downloaded in bulk. Anything the user sees on the screen can simply be copied and analyzed.
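For illustration, here is a minimal sketch (Python, with an assumed limit and an in-memory counter; a real service would keep counters in shared storage such as Redis) of the kind of per-user quota check such a backend could apply:

```python
# Minimal sketch of a per-user daily data quota on the backend.
# DAILY_LIMIT and the in-memory store are assumptions for illustration.
import datetime
from collections import defaultdict

DAILY_LIMIT = 1000                # hypothetical: records a user may fetch per day
_usage = defaultdict(int)         # (user_id, date) -> records already served


def allow_request(user_id: str, n_records: int) -> bool:
    """Return True if the authorized user may still receive n_records today."""
    key = (user_id, datetime.date.today())
    if _usage[key] + n_records > DAILY_LIMIT:
        return False
    _usage[key] += n_records
    return True
```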
In some cases, if you collect a high-quality browser fingerprint, you can assign an identifier to anonymous users and use it to enforce data-access limits on the backend. As always, the difficulties are in the details: if you overdo it, you will interfere with legitimate users.
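A rough sketch of that idea follows. The fingerprint here is deliberately simplistic, built from a few request headers only; real fingerprints combine far more signals, and the window and limit are assumptions chosen for illustration:

```python
# Sketch: assign an identifier to anonymous clients and rate-limit by it.
import hashlib
import time
from collections import defaultdict

WINDOW_SECONDS = 60
MAX_REQUESTS_PER_WINDOW = 30      # keep generous so legitimate users are not hurt

_hits = defaultdict(list)         # fingerprint -> timestamps of recent requests


def fingerprint(headers: dict) -> str:
    """Derive a coarse identifier for an anonymous client from its headers."""
    raw = "|".join(headers.get(h, "")
                   for h in ("User-Agent", "Accept-Language", "Accept-Encoding"))
    return hashlib.sha256(raw.encode()).hexdigest()


def allow(headers: dict) -> bool:
    """Sliding-window limit keyed by the fingerprint."""
    fp = fingerprint(headers)
    now = time.time()
    _hits[fp] = [t for t in _hits[fp] if now - t < WINDOW_SECONDS]
    _hits[fp].append(now)
    return len(_hits[fp]) <= MAX_REQUESTS_PER_WINDOW
```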
You can throw a wrench in the works by making the process harder (and more expensive): mainly by obfuscating/encrypting the data served directly from the backend (via the API) and obfuscating the code that turns it into user-visible content, so that classic (cheap) scraping tools stop working. As always, the cost of the protection (development effort) should be weighed against what it costs the grabbers to obtain the data (and it is usually easier for them).
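One cheap way to raise the cost for naive scrapers is sketched below; the key and scheme are purely illustrative assumptions, and anyone who reads the frontend code can replicate the decoding, which is exactly the cost trade-off mentioned above:

```python
# Sketch: serve API payloads obfuscated (XOR + base64) instead of plain JSON,
# and let the page's own frontend code decode them before rendering.
import base64
import json

_KEY = b"rotate-me-regularly"     # hypothetical key shipped with the frontend bundle


def obfuscate(payload: dict) -> str:
    raw = json.dumps(payload).encode()
    xored = bytes(b ^ _KEY[i % len(_KEY)] for i, b in enumerate(raw))
    return base64.b64encode(xored).decode()


def deobfuscate(blob: str) -> dict:
    xored = base64.b64decode(blob)
    raw = bytes(b ^ _KEY[i % len(_KEY)] for i, b in enumerate(xored))
    return json.loads(raw)
```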
Unfortunately, along with the content grabbers you will also mislead search engine robots, because their main job is likewise to harvest content.
There is no need to assign the task of protecting against parsing to nginx; it was not created for this.
You need to defend yourself on the backend.
I can recommend a talk from 2GIS about their approach of writing a Lua module for nginx (OpenResty):
https://www.youtube.com/watch?v=pYxnW7kYcbU
The talk is useful at least for its information on how to detect parsers and what to do about them.
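The detection itself does not have to live in nginx. As a backend-side illustration (not taken from the talk, thresholds are assumptions), one common heuristic is to flag clients that fetch an unusually large number of distinct pages in a short window:

```python
# Sketch of a simple parser-detection heuristic on the backend.
import time
from collections import defaultdict

WINDOW_SECONDS = 300
MAX_DISTINCT_PATHS = 100          # humans rarely open this many pages in 5 minutes

_seen = defaultdict(list)         # client_id -> [(timestamp, path), ...]


def looks_like_parser(client_id: str, path: str) -> bool:
    """Record the request and report whether the client behaves like a grabber."""
    now = time.time()
    _seen[client_id] = [(t, p) for t, p in _seen[client_id] if now - t < WINDOW_SECONDS]
    _seen[client_id].append((now, path))
    distinct = {p for _, p in _seen[client_id]}
    return len(distinct) > MAX_DISTINCT_PATHS
```

What to do with a flagged client (CAPTCHA, throttling, degraded data) is a separate decision, and known search engine bots should be excluded first, for example by verifying them via reverse DNS, to avoid the problem with search robots mentioned above.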