Laravel
Vit, 2015-04-13 10:20:41

How to process 250,000 table rows in Laravel?

Good afternoon. To calculate statistics, I need to process 250,000 records from a table.
I do it like this:

DB::connection()->disableQueryLog();
foreach (DB::table('table_name')->get() as $row)
{
  //
}

I get "Allowed memory size of 536870912 bytes exhausted".
Then I tried this:
DB::table('table_name')->chunk(100, function($rows)
{
  foreach($rows as $row)
  {
    //
  }
});

The mysqld process climbs to about 70% CPU and just hangs there; no result ever comes back. The same thing happens even with this:
DB::table('table_name')->limit(10)->chunk(10, function($rows)
{
  foreach($rows as $row)
  {
    //
  }
});

The process still hangs. How do I do this correctly? Thank you.

3 answers
Vyacheslav Plisko, 2015-04-13
@AmdY

chunk() works fine even on a million records. The problem is most likely inside your closure or in your MySQL settings, especially since memory is no longer being exhausted, which is exactly what you would expect with chunk.
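
A minimal sketch of a memory-safe chunk() loop, assuming the table has an auto-increment id column; the column name 'amount' and the aggregates are invented for illustration, not taken from the question:

DB::connection()->disableQueryLog(); // the query log grows with every chunked query

$total = 0;
$count = 0;

DB::table('table_name')->orderBy('id')->chunk(500, function ($rows) use (&$total, &$count) {
    foreach ($rows as $row) {
        // keep only running aggregates; do not collect $row into an array
        // that outlives the closure, or memory will grow with every chunk
        $total += $row->amount; // 'amount' is a hypothetical column
        $count++;
    }
});

$average = $count ? $total / $count : 0;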

Sergey Gladkovskiy, 2015-04-13
@SMGladkovskiy

Do as much as possible in SQL. If the logic is complex and cannot be expressed in SQL, use commands (jobs) and queues. And if there is no way to raise the memory limit for the CLI, break the whole volume into blocks and process them in parallel or sequentially.
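
A rough sketch of the "break into blocks" idea, assuming an auto-increment id primary key; the block size and the per-row work are placeholders, and each id range could just as well be pushed onto a queue instead of processed inline:

$blockSize = 1000;
$lastId = 0;

do {
    // paging by primary key avoids the growing OFFSET cost of ->skip()/->take()
    $rows = DB::table('table_name')
        ->where('id', '>', $lastId)
        ->orderBy('id')
        ->limit($blockSize)
        ->get();

    foreach ($rows as $row) {
        // ... statistics work for one row (or dispatch a job for the whole block)
        $lastId = $row->id;
    }
} while (count($rows) === $blockSize);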

index0h, 2015-04-13
@index0h

Do you have to use PHP for this at all? Partial (or even full) processing could theoretically be done in MySQL itself, with the results stored in a temporary table. It also makes sense to add indexes on the fields you filter on (not full-text ones).
I'm not strong in Laravel, but judging by the docs, limit() is not needed together with chunk(), and you can chunk more records at a time, for example 500.
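
A rough illustration of pushing the work into MySQL itself, as suggested above; the aggregation, the column names and the temporary table name are invented for the example:

// aggregate inside MySQL and read back only the small result set
DB::statement('
    CREATE TEMPORARY TABLE tmp_stats AS
    SELECT user_id, COUNT(*) AS cnt, SUM(amount) AS total
    FROM table_name
    GROUP BY user_id
');

$stats = DB::table('tmp_stats')->get();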
