Parsing
Leshiy66613, 2019-03-02 12:06:05

How do you parse a large amount of data and then store it?

Hello! I tinker with code in my spare time and build all sorts of small tools. I wanted to understand how huge data sets are parsed and how they are stored.
There is a site that works through a REST API and returns a small amount of data in JSON format per request. To collect the full dataset you end up having to make, say, 3000 consecutive requests. The resulting JSON can probably weigh quite a lot. How and where should such data be stored, and what is the best and fastest way to process it?


2 answers
Dmitry Bay, 2019-03-02
@kawabanga

Don't store it all as one giant JSON. Flush the data from your requests into the database every 100 requests or so, for example.
Learn to parse - where to start?
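
A minimal sketch of that batching approach in Python (my own illustration, not part of the original answer). The endpoint URL, the page parameter and the "id" field are hypothetical, and SQLite is used only to keep the example self-contained; any database would do.

import json
import sqlite3
import requests

# Hypothetical paginated REST endpoint returning a JSON array per request
API_URL = "https://example.com/api/items"

conn = sqlite3.connect("items.db")
conn.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, payload TEXT)")

batch = []
for page in range(1, 3001):                      # ~3000 consecutive requests
    resp = requests.get(API_URL, params={"page": page}, timeout=30)
    resp.raise_for_status()
    for item in resp.json():
        batch.append((item["id"], json.dumps(item)))

    if page % 100 == 0:                          # flush every 100 requests
        conn.executemany("INSERT OR REPLACE INTO items VALUES (?, ?)", batch)
        conn.commit()
        batch.clear()

if batch:                                        # flush the last partial batch
    conn.executemany("INSERT OR REPLACE INTO items VALUES (?, ?)", batch)
    conn.commit()
conn.close()

This way the full dataset never has to sit in memory or in a single huge JSON file; at most 100 requests' worth of rows are buffered at any time.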

Roman, 2019-03-02
@procode

It depends on the data and on who is doing it :)
The general approach is: parse the JSON, load it into MySQL, and then do whatever you like with it.
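
A minimal sketch of that pipeline in Python (again an illustration, not from the original answer), assuming the pymysql client library; the connection credentials, table name and column names are hypothetical, and the inline payload stands in for one API response.

import json
import pymysql

# Stand-in for one JSON response from the API
payload = '[{"id": 1, "name": "foo"}, {"id": 2, "name": "bar"}]'
rows = [(obj["id"], obj["name"]) for obj in json.loads(payload)]

# Hypothetical MySQL credentials and schema
conn = pymysql.connect(host="localhost", user="user", password="secret", database="scraper")
with conn.cursor() as cur:
    cur.execute(
        "CREATE TABLE IF NOT EXISTS parsed_items ("
        "id INT PRIMARY KEY, name VARCHAR(255))"
    )
    # Bulk insert; re-running the script updates existing rows instead of failing
    cur.executemany(
        "INSERT INTO parsed_items (id, name) VALUES (%s, %s) "
        "ON DUPLICATE KEY UPDATE name = VALUES(name)",
        rows,
    )
conn.commit()
conn.close()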
