How to speed up inserts into an SQL database?
We have an XML file with 1.5 million records; it weighs about 3 GB. I parse it with DOMDocument and insert the rows into the database one by one in a loop.
The code is roughly this:
foreach ($leiLEIobj as $leiLEIs)
{
    foreach ($leiLEIs->childNodes as $row)
    {
        $LEI = $row->nodeValue;
        $arResult[$LEI][$row->nodeName] = $LEI;
        //$LEIs[$row->nodeName] = $row->nodeValue;
        // ... many, many more nested foreach loops
    }
    $query = "INSERT INTO `leis` ($fields) VALUES ($values)";
}
Your best bet is to do a batch insert, or else convert the XML into the corresponding SQL (with the same batch inserts) and execute it via the mysql console client.
Note that with -ppassword there is, if my memory serves me, no space between -p and the password.
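For example, a pre-generated dump could be fed to the client like this (the user, database, and file names are illustrative, and this of course needs a running MySQL server):

```shell
# Import a pre-generated SQL dump; note there is no space after -p
mysql -u myuser -pmypassword mydb < leis_dump.sql
```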
For the SQL variant, format it like this:
INSERT INTO tbl_name
(a,b,c)
VALUES
(1,2,3),
(4,5,6),
(7,8,9);
<?php
$batchSize = 1000;
$counter = 0;
$valuesBatch = array();

foreach ($leiLEIobj as $leiLEIs)
{
    foreach ($leiLEIs->childNodes as $row)
    {
        $LEI = $row->nodeValue;
        $arResult[$LEI][$row->nodeName] = $LEI;
        //$LEIs[$row->nodeName] = $row->nodeValue;
    }
    $valuesBatch[] = "($values)";
    $counter++;
    if ($counter == $batchSize)
    {
        $query = strtr(
            "INSERT INTO `leis` ($fields) VALUES :text",
            array(
                ':text' => implode(",\r\n", $valuesBatch)
            )
        );
        // Execute the query, or append it to an accumulator variable
        $counter = 0;
        $valuesBatch = array();
    }
}
// Don't forget to flush the last, incomplete batch
if ($counter > 0)
{
    $query = strtr(
        "INSERT INTO `leis` ($fields) VALUES :text",
        array(':text' => implode(",\r\n", $valuesBatch))
    );
    // Execute the final query
}
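The batching idea can also be sketched as a standalone helper; the table name, column names, and sample rows below are purely illustrative, and the values are assumed to be already escaped and quoted:

```php
<?php
// Build multi-row INSERT statements from $rows, $batchSize rows per statement.
// A sketch under the assumption that values are pre-escaped; not production code.
function buildBatchInserts(string $table, array $fields, array $rows, int $batchSize): array
{
    $queries = array();
    foreach (array_chunk($rows, $batchSize) as $chunk) {
        // Turn each row into a "(v1,v2,...)" tuple
        $values = array_map(function (array $row) {
            return '(' . implode(',', $row) . ')';
        }, $chunk);
        $queries[] = sprintf(
            'INSERT INTO `%s` (%s) VALUES %s;',
            $table,
            implode(',', $fields),
            implode(",\n", $values)
        );
    }
    return $queries;
}

$rows = array(array(1, 2, 3), array(4, 5, 6), array(7, 8, 9));
$queries = buildBatchInserts('leis', array('a', 'b', 'c'), $rows, 2);
// $queries[0] inserts the first two rows, $queries[1] the remaining one
```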
LOAD DATA LOCAL INFILE 'abc.csv' INTO TABLE abc
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(col1, col2, col3, col4, col5...);
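To produce such a CSV from PHP, something like this sketch could be used (the file name, columns, and sample values are illustrative):

```php
<?php
// Write rows to a CSV matching the LOAD DATA options above:
// fields terminated by ',', enclosed by '"'.
// The $eol argument of fputcsv (PHP 8.1+) sets "\r\n" line endings
// to match LINES TERMINATED BY '\r\n'.
$rows = array(
    array('col1', 'col2', 'col3'),                    // header, skipped via IGNORE 1 LINES
    array('LEI-EXAMPLE-001', 'Example Org', 'DE'),    // illustrative data row
);

$fh = fopen('abc.csv', 'w');
foreach ($rows as $row) {
    fputcsv($fh, $row, ',', '"', '\\', "\r\n");
}
fclose($fh);
```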
Specify which DBMS you are using.
To increase insert performance, you can disable or drop the indexes on the table receiving the inserts, then turn them back on once the operation completes. Rebuilding an index once after the insert is faster than recalculating it after every inserted row.
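In MySQL, for example, non-unique indexes on a MyISAM table can be toggled like this (the table name is illustrative; for InnoDB this statement is a no-op, so there you would drop and re-create the indexes instead):

```sql
ALTER TABLE leis DISABLE KEYS;
-- ... perform the bulk insert ...
ALTER TABLE leis ENABLE KEYS;
```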
In addition, DBMSs usually support bulk insert operations (batch insert); for this you may have to convert the file into a format the DBMS understands. MySQL does not seem to have this, in which case only the option proposed by Dmitry remains.
In MSSQL, BULK INSERT works faster than the usual INSERT INTO ... VALUES (), so maybe it makes sense to convert the XML into a plain text file and import that into the database instead?
I'm puzzled by the gradual drop in performance.
My unconfirmed hunch is that the cause lies in the transaction-support machinery of the database you are using.
You are trying to insert a huge number of records in a single transaction, and the mechanisms that support transaction rollback (in case a rollback command is issued or the session crashes) require significant database resources. Up to roughly 100,000 rows those resources suffice, and beyond that things start to degrade (the details depend on the database).
What I would advise to start with:
1. Using the database administration tools, look at where resources are being spent (to understand what the time goes on).
2. If my guesses are confirmed, a batch insert alone is unlikely to help you (although it can be implemented differently in different databases).
3. Try committing every 50,000 rows. That way you relieve the load on the transaction-rollback machinery.
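The periodic-commit idea can be sketched with PDO against an in-memory SQLite table (the table, row count, and batch size are illustrative; with MySQL only the DSN and credentials would change):

```php
<?php
// Insert rows in chunks, committing every $commitEvery rows so the
// rollback machinery never has to track the whole data set at once.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE leis (id INTEGER PRIMARY KEY, lei TEXT)');

$commitEvery = 50000;
$stmt = $pdo->prepare('INSERT INTO leis (lei) VALUES (?)');

$pdo->beginTransaction();
$counter = 0;
for ($i = 0; $i < 120000; $i++) {          // stand-in for the parsed XML rows
    $stmt->execute(array('LEI' . $i));
    if (++$counter % $commitEvery === 0) {
        $pdo->commit();                    // release rollback resources
        $pdo->beginTransaction();          // start the next chunk
    }
}
$pdo->commit();                            // flush the final partial chunk

$count = $pdo->query('SELECT COUNT(*) FROM leis')->fetchColumn();
```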