How to write large files to MongoDB using streams?
A relatively large .csv file (~500 MB) comes from my client as a stream, and I need to write all of it to the database, also using streams. My experience with streams, however, is limited to reading and writing files. Here's what I've written so far; it works on small files, but with a 500 MB upload, RAM usage climbs to 2.5 GB and the process dies. As I understand it, this happens because the stream keeps flowing without waiting for the data to be written to the database, since inserts are done one line at a time from readline events. How do I do this properly? I've read about backpressure and that you can pause the stream and resume it later, but I couldn't figure out how to implement that: what I tried changed nothing, and memory still leaks somewhere.
https://gist.github.com/rostikowb/566b7a525676fa25...
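For reference, a minimal sketch of the backpressure approach described above, assuming Node.js 12+ and the official `mongodb` driver. The connection string, database and collection names, and the batch size of 1000 are illustrative, and the CSV handling is deliberately naive (one document per raw line):

```javascript
const fs = require('fs');
const readline = require('readline');
const { MongoClient } = require('mongodb');

async function importCsv(path) {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const coll = client.db('test').collection('rows');

  const input = fs.createReadStream(path);
  const rl = readline.createInterface({ input, crlfDelay: Infinity });

  let batch = [];
  // Iterating with for await pauses the underlying file stream while the
  // loop body runs, so lines are only read as fast as they are inserted.
  for await (const line of rl) {
    batch.push({ raw: line });
    if (batch.length >= 1000) {
      await coll.insertMany(batch); // wait for Mongo before reading more
      batch = [];
    }
  }
  if (batch.length) await coll.insertMany(batch); // flush the final partial batch
  await client.close();
}
```

Because the loop awaits each `insertMany` before pulling more lines, memory stays bounded at roughly one batch. Calling `rl.pause()`/`rl.resume()` inside a `'line'` handler can achieve the same effect, but it is easier to get wrong.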