How to process 250,000 table rows in Laravel?
Good afternoon. To calculate statistics, I need to process 250,000 records from a table.
I do it like this:
// Variant 1: disable the query log and load the whole table at once
DB::connection()->disableQueryLog();

foreach (DB::table('table_name')->get() as $row)
{
    //
}

// Variant 2: process in batches of 100
DB::table('table_name')->chunk(100, function($rows)
{
    foreach ($rows as $row)
    {
        //
    }
});

// Variant 3: limit to 10 records and chunk by 10
DB::table('table_name')->limit(10)->chunk(10, function($rows)
{
    foreach ($rows as $row)
    {
        //
    }
});
The chunk approach works fine even on a million records; the problem is more likely inside your closure or in the MySQL settings, especially since memory should not keep growing when chunk is used correctly.
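For illustration, a minimal sketch of batch processing that keeps memory flat by chunking on the primary key, if your Laravel version has chunkById() ('table_name' and 'id' are placeholders for your schema):

DB::connection()->disableQueryLog();

// chunkById() pages with "WHERE id > last seen id" instead of a growing OFFSET,
// so every batch costs roughly the same and memory usage stays flat.
DB::table('table_name')->chunkById(1000, function ($rows) {
    foreach ($rows as $row) {
        // accumulate your statistics here
    }
}, 'id');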
Lean on SQL as much as possible. If the logic is complex and cannot be expressed in SQL, use jobs and queues. And if you cannot raise the memory limit for the CLI, split the whole volume into blocks and process them in parallel or sequentially.
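A rough sketch of the blocks-plus-queue idea (CalculateStatsChunk is a hypothetical job class that processes one id range; 'table_name' and 'id' are placeholders):

use App\Jobs\CalculateStatsChunk; // hypothetical job: processes rows with ids in [$start, $end]
use Illuminate\Support\Facades\DB;

$blockSize = 10000;
$minId = DB::table('table_name')->min('id');
$maxId = DB::table('table_name')->max('id');

// One queued job per block of ids; queue workers can then run them in parallel.
for ($start = $minId; $start <= $maxId; $start += $blockSize) {
    dispatch(new CalculateStatsChunk($start, min($start + $blockSize - 1, $maxId)));
}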
Do you have to use PHP for this at all? Partial (or even full) processing is theoretically possible in MySQL itself, with the results stored in a temporary table. It also makes sense to add indexes on the fields you filter by (not full-text ones).
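As a hedged sketch of pushing the aggregation into MySQL itself (the user_id and amount columns are made up for illustration):

// Let MySQL do the aggregation and keep the result in a temporary table,
// which lives for the duration of the current connection.
DB::statement('
    CREATE TEMPORARY TABLE stats_tmp AS
    SELECT user_id, COUNT(*) AS cnt, SUM(amount) AS total
    FROM table_name
    GROUP BY user_id
');

$stats = DB::select('SELECT * FROM stats_tmp');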
I'm not strong in Laravel, but judging by the docs, limit() is not needed here, and you can chunk more records at a time, for example 500.
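In other words, something like this (a sketch; 'id' is assumed to be the primary key):

// chunk() walks the whole table on its own, so limit() is not needed;
// a bigger batch size simply means fewer queries.
DB::table('table_name')->orderBy('id')->chunk(500, function ($rows) {
    foreach ($rows as $row) {
        //
    }
});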