How do you parse a large amount of data and store it afterwards?
Hello! I tinker in my spare time and build all sorts of little tools. I wanted to understand how huge data sets are parsed and how they are stored.
There is a site that exposes a REST API and returns only a small chunk of data per request, in JSON format. So to collect the full dataset you need to make, say, 3000 consecutive requests, and the combined JSON could end up being quite large. How and where should such data be stored, and what is the best and fastest way to process it?
Don't store it as one giant JSON blob. Flush the collected data into a database in batches, for example after every 100 requests.
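A minimal sketch of that batching approach, assuming a paginated API. The `fetch_page` function here is a hypothetical stand-in for a real HTTP call (e.g. via `requests`), and the table layout is just an illustration; SQLite is used so the example is self-contained:

```python
import sqlite3
import json

BATCH_SIZE = 100  # flush to the database every 100 requests

def fetch_page(page):
    # Hypothetical stand-in for a real REST call, e.g.:
    #   requests.get(f"https://example.com/api/items?page={page}").json()
    # Here we just fabricate 10 records per page for the demo.
    return [{"id": page * 10 + i, "name": f"item-{page}-{i}"} for i in range(10)]

def store_batch(conn, rows):
    # executemany + a single commit per batch is far cheaper
    # than committing after every request
    conn.executemany(
        "INSERT OR REPLACE INTO items (id, payload) VALUES (?, ?)",
        [(r["id"], json.dumps(r)) for r in rows],
    )
    conn.commit()

def scrape(total_pages):
    conn = sqlite3.connect(":memory:")  # a file path in a real scraper
    conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, payload TEXT)")
    buffer = []
    for page in range(total_pages):
        buffer.extend(fetch_page(page))
        if (page + 1) % BATCH_SIZE == 0:
            store_batch(conn, buffer)
            buffer.clear()
    if buffer:  # flush whatever is left over
        store_batch(conn, buffer)
    return conn

conn = scrape(total_pages=300)
count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(count)  # 3000 rows stored, without ever holding all the JSON in memory
```

The key point is that memory usage stays bounded by the batch size, not by the total dataset, and a crash loses at most one unflushed batch rather than everything collected so far.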