MongoDB
Messi, 2019-09-17 09:51:59

How to write a large array to MongoDB?

Hello! Please tell me the right way to do this. I have a large file (say, 15 GB). The file is processed, the data is collected into an array, and then a batchInsert writes the array to Mongo (into one collection).
For example, about 200 thousand records accumulate in the array before they are written to Mongo. What is the right approach: avoid building a huge array and instead insert each line into Mongo as soon as it is processed, or collect the huge array and do a single batchInsert?


1 answer
Gip, 2019-09-17
@FitTech

If RAM allows, batch as much as possible, even everything at once.
I have a similar task in one project; there I batchInsert 1000 objects at a time so as not to put too much pressure on the server's RAM.
I load from a large exchange file, roughly like this:

    $handle = fopen($filePath, 'rb'); // $filePath: path to the exchange file
    $objects = [];
    while (($line = fgets($handle)) !== false) {
        $objects[] = json_decode($line);
        // For speed, load in batches of whatever fits in RAM
        // (the number of objects per batch is set in the config)
        if (count($objects) === Yii::$app->objects->insertBatchCount) {
            Yii::$app->mongodb->createCommand()->batchInsert('objects', $objects);
            $objects = [];
        }
    }
    // Don't forget to flush the last, incomplete batch
    if ($objects !== []) {
        Yii::$app->mongodb->createCommand()->batchInsert('objects', $objects);
    }
    fclose($handle);
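For completeness: Yii::$app->objects->insertBatchCount in the snippet above is a custom application component, not part of yii2-mongodb, and the answer does not show how it is defined. A minimal sketch of what the corresponding Yii2 config entry and component could look like (the component name objects, the class app\components\ObjectsConfig, and the property name are assumptions taken from the code above):

    // config/console.php (hypothetical; names inferred from the snippet above)
    return [
        'components' => [
            'mongodb' => [
                'class' => 'yii\mongodb\Connection',
                'dsn'   => 'mongodb://localhost:27017/mydatabase',
            ],
            'objects' => [
                'class' => 'app\components\ObjectsConfig', // assumed helper component
                'insertBatchCount' => 1000, // batch size used by the import loop
            ],
        ],
    ];

    // app/components/ObjectsConfig.php (hypothetical)
    namespace app\components;

    use yii\base\Component;

    class ObjectsConfig extends Component
    {
        public $insertBatchCount = 1000;
    }

Note also that batching in the application does not change server-side limits: as far as I know, the MongoDB driver splits an oversized bulk write to respect the server's maxWriteBatchSize (100,000 operations on recent servers), so a single giant batchInsert of 200 thousand records would still go through, but PHP would have to hold all of those documents in memory at once, which is exactly what the batch size above avoids.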
