Ofigenen, 2015-08-03 04:44:39
MySQL

PHP/Yii2: how to speed up the execution of ~1 million queries in a row?

Hello.
A large array of data is fetched from a third-party server through an API and needs to be distributed across several tables. In total, a million-odd records. Solving the problem head-on obviously won't work:

foreach ($array as $attributes) {
    $model = new Model();
    $model->setAttributes($attributes);
    $model->save();
}

I tried it on smaller volumes: the server already can't cope with ~200,000 records, and the current million is only the beginning. The options I've thought of:
1. Fetch and write in parts. Inconvenient.
2. Write all the data into each table with a single query. How? A transaction?
Maybe there is something else?


9 answer(s)
Alexander Taratin, 2015-08-03
@Taraflex

2. Write all the data into each table with a single query. How? A transaction?

Plain mysqli without any wrappers, in batches of as many inserts as will fit into one query at a time (a ~1 MB query string should fit).
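
A minimal sketch of that approach, assuming the $array of attribute rows from the question and hypothetical table/column names (my_table, col1..col3):

// Plain mysqli, no ORM: build multi-row INSERTs in batches of 1000 rows.
$mysqli = new mysqli('localhost', 'user', 'password', 'dbname');

foreach (array_chunk($array, 1000) as $chunk) {
    $values = [];
    foreach ($chunk as $attributes) {
        // Escape every value before putting it into the query string.
        $escaped = array_map(function ($v) use ($mysqli) {
            return "'" . $mysqli->real_escape_string((string) $v) . "'";
        }, $attributes);
        $values[] = '(' . implode(',', $escaped) . ')';
    }
    // One query per batch; keep the total length under max_allowed_packet.
    $mysqli->query('INSERT INTO my_table (col1, col2, col3) VALUES ' . implode(',', $values));
}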

Papa Stifflera, 2015-08-03
@PapaStifflera

Use queues.
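
The answer doesn't name a concrete queue backend; with the yii2-queue extension (an assumption, not something the author specifies) the idea could look roughly like this: push chunks of the API data as jobs and let a background worker do the inserts.

// Assumes yiisoft/yii2-queue is configured as Yii::$app->queue; names are illustrative.
class ImportChunkJob implements \yii\queue\JobInterface
{
    public $rows; // one chunk of attribute arrays received from the API

    public function __construct(array $rows)
    {
        $this->rows = $rows;
    }

    public function execute($queue)
    {
        // One multi-row INSERT per job, executed by the queue worker in the background.
        \Yii::$app->db->createCommand()
            ->batchInsert('my_table', ['col1', 'col2'], $this->rows)
            ->execute();
    }
}

// Producer side: split the incoming data into jobs instead of saving it in the web request.
foreach (array_chunk($array, 1000) as $chunk) {
    \Yii::$app->queue->push(new ImportChunkJob($chunk));
}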

Oleg, 2015-08-03
@ollisso

I see two options:
1. LOAD DATA - the link has already been given above.
2. Insert many rows at once (see the sketch after this answer). Example:
dev.mysql.com/doc/refman/5.6/en/insert.html
+ Best of all, tune the server so that it is ready for such a volume.
But how exactly does the server fail? ("a million rows" by itself doesn't say anything)
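
For option 2, Yii2 itself can generate such a multi-row INSERT. A hedged sketch, assuming $array holds plain arrays of values in column order and the table/column names are placeholders:

// Yii builds one INSERT ... VALUES (...), (...), ... statement per chunk of 1000 rows.
foreach (array_chunk($array, 1000) as $chunk) {
    Yii::$app->db->createCommand()
        ->batchInsert('my_table', ['col1', 'col2'], $chunk)
        ->execute();
}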

matperez, 2015-08-03
@matperez

You can also dump everything into a file and then import it via the console.
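
A rough sketch of that idea (file name, credentials and the escaping are illustrative only):

// Dump one INSERT per line into a file (naive escaping, for the sketch only).
$fh = fopen('/tmp/import.sql', 'w');
foreach ($array as $attributes) {
    $values = array_map(function ($v) {
        return "'" . addslashes((string) $v) . "'";
    }, $attributes);
    fwrite($fh, 'INSERT INTO my_table (col1, col2) VALUES (' . implode(',', $values) . ");\n");
}
fclose($fh);

// Then import the whole file from the console in one go:
//   mysql -u user -p dbname < /tmp/import.sql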

Mikhail Osher, 2015-08-03
@miraage

1) www.yiiframework.com/doc-2.0/yii-db-command.html#b...
2) I have seen this trick:

file_put_contents(
    '/path/to/some/insert/file.sql', // where to write
    'INSERT INTO table VALUES ...',  // the INSERT query, built with already-escaped data
    FILE_APPEND // flag that appends to the end of the file
);

Then, in the console, run the following command:
What we get: the script appends INSERT queries to the file, and tail -f streams each new line straight into mysql.
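
The command itself is not shown in the answer; presumably it is something along these lines (user, password and database name are placeholders):

tail -f /path/to/some/insert/file.sql | mysql -u user -ppassword dbname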

He11ion, 2015-08-03
@He11ion

https://dev.mysql.com/doc/refman/5.6/en/load-data.html
The standard way to load large volumes.
"The server can't cope" most likely means there is not enough memory: break the data into chunks that fit into memory, or allocate more of it via Yii if necessary.

Marcuzy, 2015-08-03
@Marcuzy

I propose a simple buffer implementation that batch-writes to the database every n rows: https://gist.github.com/marcuzy/3bff7ab350ac6741def0
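
The gist itself isn't reproduced here; the idea behind such a buffer could be sketched roughly like this (class, table and column names are illustrative):

// Accumulates rows and flushes them to the DB with one batchInsert per N rows.
class InsertBuffer
{
    private $table;
    private $columns;
    private $size;
    private $rows = [];

    public function __construct($table, array $columns, $size = 1000)
    {
        $this->table = $table;
        $this->columns = $columns;
        $this->size = $size;
    }

    public function add(array $row)
    {
        $this->rows[] = $row;
        if (count($this->rows) >= $this->size) {
            $this->flush();
        }
    }

    public function flush()
    {
        if ($this->rows) {
            Yii::$app->db->createCommand()
                ->batchInsert($this->table, $this->columns, $this->rows)
                ->execute();
            $this->rows = [];
        }
    }
}

// Usage: add rows one by one, the buffer writes every 1000; flush() the tail at the end.
$buffer = new InsertBuffer('my_table', ['col1', 'col2']);
foreach ($array as $attributes) {
    $buffer->add($attributes);
}
$buffer->flush();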

movetz, 2015-08-03
@movetz

We had a similar problem in an environment with a very weak server and dealt with it like this: collected everything into one text and then loaded it all via LOAD DATA. There was an even more crutch-like solution via a CSV file and exec.
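
A hedged sketch of the CSV-plus-exec variant (paths, credentials and column handling are assumptions; mysqlimport derives the table name from the file name):

// Name the CSV after the target table so mysqlimport knows where to load it.
$fh = fopen('/tmp/my_table.csv', 'w');
foreach ($array as $attributes) {
    fputcsv($fh, $attributes);
}
fclose($fh);

// Hand the actual load to the MySQL client tools instead of PHP.
exec("mysqlimport --local --fields-terminated-by=',' --fields-optionally-enclosed-by='\"' "
    . "-u user -ppassword dbname /tmp/my_table.csv");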

asd111, 2015-08-03
@asd111

Split the database into parts and put the parts on different servers, i.e. sharding, in other words horizontal scaling.
If the data is not particularly critical, you can try switching to memcache.
Slides on the topic: www.slideshare.net/profyclub_ru/web-20-c-9951602
