MySQL
Vitaly, 2016-06-30 10:57:27

How to handle large amounts of data in a stream?

Good day, everyone.
Guys, I'm asking for advice.
There is a large database (about 5 million records). I need to fetch the data row by row and process each row in turn.
I understand that this is done through a stream, but I don't quite understand how to implement it.
How can I fetch, say, 10 records, process them, and repeat until the source table has been read and processed completely?
If there are other modules for connecting Node.js and MySQL, I would be grateful for a link and a usage example.
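
For reference, here is a minimal sketch of the streaming approach the question refers to, assuming the widely used mysql package for Node.js and a hypothetical table items with made-up connection credentials. The 'result' event together with connection.pause()/connection.resume() is that package's documented way to apply backpressure while each row is processed:

const mysql = require('mysql');

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'user',          // assumption: replace with real credentials
  password: 'secret',
  database: 'mydb'
});

connection.connect();

connection.query('SELECT * FROM items')
  .on('error', (err) => {
    console.error(err);
  })
  .on('result', (row) => {
    // Pause the connection so rows stop arriving while this one is processed.
    connection.pause();
    processRow(row, () => connection.resume());
  })
  .on('end', () => {
    connection.end();    // all rows have been read
  });

// Hypothetical per-row handler; calls done() when finished.
function processRow(row, done) {
  // ... process one row ...
  setImmediate(done);
}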


1 answer
ThunderCat, 2016-06-30

Is the database in active use, or can you take it offline for an hour?
How heavy are the rows in the table?
How complex is the processing on the client side?
With batches of 10 records you'll be fiddling with this from morning till night; take at least 200-500 records at a time, and ideally 1000 will do fine. Otherwise you'll spend more time on connections than on the selects and inserts themselves.
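
A minimal sketch of the batched approach suggested above, again assuming the mysql package and a hypothetical table items with an auto-increment id column. It fetches about 1000 rows per query and uses keyset pagination (WHERE id > ?) rather than OFFSET, so each batch starts from an index seek instead of rescanning everything already read:

const mysql = require('mysql');

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'user',          // assumption: replace with real credentials
  password: 'secret',
  database: 'mydb'
});

const BATCH_SIZE = 1000; // per the advice above: 200-1000 rows per round trip

function processBatch(lastId) {
  connection.query(
    'SELECT * FROM items WHERE id > ? ORDER BY id LIMIT ?',
    [lastId, BATCH_SIZE],
    (err, rows) => {
      if (err) throw err;
      if (rows.length === 0) {
        connection.end(); // the source table has been read completely
        return;
      }
      rows.forEach((row) => {
        // ... process one row ...
      });
      // Continue from the last id seen in this batch.
      processBatch(rows[rows.length - 1].id);
    }
  );
}

processBatch(0);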
