MySQL
Sergey, 2015-01-13 08:48:57

How do I properly implement a bulk MySQL data update?

Good day. I have a question about bulk updating of data. Initially the algorithm was written with INSERT ... ON DUPLICATE KEY UPDATE and worked fine, but it was too slow because of algorithmic waste in the PHP code (quadratic complexity, which produced an enormous number of loop iterations and caused a script timeout on large amounts of data). The algorithm has now been rewritten, and I had to abandon the INSERT ... ON DUPLICATE KEY UPDATE construct, because it now gives me two data arrays: one that needs to be inserted into the database, and another that needs to be updated. If I run INSERT ... ON DUPLICATE KEY UPDATE on the part that only needs updating, not only will the insert never execute, but auto_increment will also keep growing.
The question is how to find an approach that would let me update, say, 10,000 records in a short time. The solutions I came up with by googling:
1. Prepared statements (see the sketch after the CASE example below)
2. Keep INSERT ... ON DUPLICATE KEY UPDATE
3. Use a construct like:

UPDATE tbl_country SET price = CASE
WHEN code = 1 THEN 123
WHEN code = 2 THEN 456
…
ELSE price
END
WHERE code IN (1,2,…)
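
For option 1, here is a minimal sketch of server-side prepared statements in plain MySQL, reusing tbl_country from the CASE example above (in practice the EXECUTE calls would be driven by a loop in the PHP code; @price and @code are just session variables chosen for illustration):

-- prepare the statement once, then re-execute it per row
PREPARE upd FROM 'UPDATE tbl_country SET price = ? WHERE code = ?';

SET @price = 123, @code = 1;
EXECUTE upd USING @price, @code;

SET @price = 456, @code = 2;
EXECUTE upd USING @price, @code;

DEALLOCATE PREPARE upd;

Note that this still runs one UPDATE per row, so it saves parsing time but not network round trips.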


4 answers
Sergey, 2015-01-13
@Pjeroo

Ran tests: INSERT ... ON DUPLICATE KEY UPDATE works perfectly, even if the insert never executes. Updating 10,000 rows takes 0.04 s. Thanks everyone for the tips.

index0h, 2015-01-13
@index0h

It is possible to use UPDATE + JOIN.
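
For example, a minimal sketch of that approach, assuming the new values are first bulk-loaded into a temporary table (tmp_prices is a hypothetical name, and the column types are assumptions; tbl_country, code and price are taken from the question):

-- load the new values in bulk
CREATE TEMPORARY TABLE tmp_prices (
    code INT PRIMARY KEY,
    price DECIMAL(10,2)
);

INSERT INTO tmp_prices (code, price) VALUES
(1, 123),
(2, 456);

-- a single UPDATE applies all the values via the join
UPDATE tbl_country c
JOIN tmp_prices t ON t.code = c.code
SET c.price = t.price;

DROP TEMPORARY TABLE tmp_prices;

Since the whole update is one statement, this avoids both the per-row round trips and the auto_increment growth of INSERT ... ON DUPLICATE KEY UPDATE.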

ivankomolin, 2015-01-13
@ivankomolin

To update a large number of records in a table, you can use INSERT ... ON DUPLICATE KEY UPDATE with the following construct:

INSERT INTO
{{table}}
(id, field1, field2)
VALUES
(1, "value1.1", "value1.2"),
(2, "value2.1", "value2.2"),
(3, "value3.1", "value3.2"),
(4, "value4.1", "value4.2"),
(5, "value5.1", "value5.2")
ON DUPLICATE KEY UPDATE
field1 = VALUES(field1), field2 = VALUES(field2)

However, you should take the amount of data in the fields into account: with 10,000 rows the query text can exceed the server's packet limit (max_allowed_packet in MySQL). In that case, either raise the limit or split the data into several batches.
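
A quick way to check and, if necessary, raise that limit (the 256 MB value below is purely illustrative; changing the GLOBAL variable requires administrative privileges, and only new connections pick up the new value):

SHOW VARIABLES LIKE 'max_allowed_packet';

-- illustrative value: 256 MB
SET GLOBAL max_allowed_packet = 268435456;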

Eugene, 2015-01-13
@Nc_Soft

Isn't this a one-time operation? If such updates are launched every day, then the problem isn't in the updates themselves.
