MySQL
G0rDi, 2011-03-15 13:16:49

Importing an array of data into a MySQL table?

Good day.
The task is to parse a TXT file and then update a table with the parsed data.
The first subtask is solved without problems; the second one works too, but the solution does not look quite right to me.
TXT:

001-03-0027,4000.50,5
001-03-0031,8000.50,6
001-03-0028,12000.50,7
001-03-0033,16000.50,8
...

Table:
CREATE TABLE properties (
  ...
  id_1c VARCHAR(255) DEFAULT NULL,
  price FLOAT DEFAULT 0,
  count INT(11) DEFAULT 0
);

Processing the file line by line, I get three variables: $array['id_1c'], $array['price'], $array['count'], which I put into the query:

'UPDATE properties SET price='.$array['price'].', count='.$array['count'].' WHERE id_1c = "'.$array['id_1c'].'"';

As a result, the number of queries is directly proportional to the number of lines in the file, and there are about 5000 of them.
Is there any way to optimize this process?
Keep in mind that I also need to track whether each update succeeded:
ID 001-03-0027: updated successfully
ID 001-03-0031: not found in the database
ID 001-03-002: not found in the database
ID 001-03-0033: updated successfully
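
The per-row version with the status report currently looks roughly like this (a sketch; the connection details and file name are illustrative, not from my real code):

$db = new mysqli('localhost', 'user', 'pass', 'mydb');   // illustrative credentials
$stmt = $db->prepare('UPDATE properties SET price = ?, count = ? WHERE id_1c = ?');

foreach (file('prices.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    list($id, $price, $count) = explode(',', $line);
    $stmt->bind_param('dis', $price, $count, $id);
    $stmt->execute();
    // affected_rows distinguishes "updated" from "not found"
    // (note: it is also 0 when the row exists but the values did not change)
    echo 'ID ' . $id . ': '
        . ($stmt->affected_rows > 0 ? 'updated successfully' : 'not found in the database')
        . "\n";
}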

Thank you.


3 answers
JeanLouis, 2011-03-15
@JeanLouis

Sequence of actions:
1. First read the record IDs from the file and find out with a SELECT which of them are in the database and which are not. That covers the monitoring.
2. Load the file into the table with LOAD DATA INFILE using the REPLACE keyword.
See the syntax of this command here: www.mysql.ru/docs/man/LOAD_DATA.html
That's all, no extra load on the server; it will run practically instantly. A sketch of both steps is below.
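
A rough sketch of both steps, assuming a mysqli connection $db, a UNIQUE index on id_1c (REPLACE needs one to overwrite a row instead of just appending), and LOCAL INFILE being enabled; the file name is made up:

// Step 1: monitoring — collect the IDs from the file and ask which of them exist.
$ids = array();
foreach (file('prices.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    list($id) = explode(',', $line);
    $ids[] = $db->real_escape_string($id);
}
$found = array();
$res = $db->query("SELECT id_1c FROM properties WHERE id_1c IN ('" . implode("','", $ids) . "')");
while ($row = $res->fetch_assoc()) {
    $found[$row['id_1c']] = true;
}
foreach ($ids as $id) {
    echo 'ID ' . $id . ': ' . (isset($found[$id]) ? 'will be updated' : 'not found in the database') . "\n";
}

// Step 2: bulk load. Note that REPLACE deletes the matching row and inserts a new
// one, so columns that are not listed below fall back to their defaults.
$db->query("LOAD DATA LOCAL INFILE 'prices.txt'
            REPLACE INTO TABLE properties
            FIELDS TERMINATED BY ','
            LINES TERMINATED BY '\\n'
            (`id_1c`, `price`, `count`)");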
If there is quite a lot of data, experiment with these additional settings:
SET autocommit = 0;
SET unique_checks = 0;
SET foreign_key_checks = 0;
SET sql_log_bin = 0;
-- your LOAD DATA INFILE statement goes here
COMMIT;
In my task all of them are used to speed up loading a large amount of data.

Anatoly, 2011-03-15
@taliban

Just a theory!
Do a multi-row insert of all the records into another table in one query, and then something like this:

UPDATE table1, table2
SET table1.field1 = table2.field1,
    table1.field2 = table2.field2
WHERE table1.id = table2.id

I am not sure such an update will work, but I would look in this direction; a sketch mapped to your table is below.
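
A sketch of this idea mapped to the table from the question, assuming a mysqli connection $db; the temporary table name and file name are made up for illustration (multi-table UPDATE is valid MySQL syntax):

// Staging table that exists only for this connection.
$db->query("CREATE TEMPORARY TABLE tmp_prices (
                id_1c VARCHAR(255) NOT NULL,
                price FLOAT NOT NULL,
                count INT NOT NULL,
                PRIMARY KEY (id_1c)
            )");

// One multi-row INSERT for all parsed lines
// (split into chunks if the statement would exceed max_allowed_packet).
$values = array();
foreach (file('prices.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    list($id, $price, $count) = explode(',', $line);
    $values[] = sprintf("('%s', %f, %d)", $db->real_escape_string($id), $price, $count);
}
$db->query('INSERT INTO tmp_prices (id_1c, price, count) VALUES ' . implode(',', $values));

// Single multi-table UPDATE: only rows that already exist in properties are
// touched, so the other columns are left alone (unlike REPLACE).
$db->query('UPDATE properties p, tmp_prices t
            SET p.price = t.price, p.count = t.count
            WHERE p.id_1c = t.id_1c');

// Monitoring: IDs from the file that have no match in properties.
$res = $db->query('SELECT t.id_1c FROM tmp_prices t
                   LEFT JOIN properties p ON p.id_1c = t.id_1c
                   WHERE p.id_1c IS NULL');
while ($row = $res->fetch_assoc()) {
    echo 'ID ' . $row['id_1c'] . ": not found in the database\n";
}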

P
pentarh, 2011-03-15
@pentarh

Why are you not satisfied with 5k queries? That is a perfectly normal amount.
Multiple inserts can be packed into a single query (as long as it stays under max_allowed_packet), but the updates would have to be run one after another.
