How to write to a PostgreSQL database faster?
There are 700,000 entries. I have a script that parses a CSV file and writes the data to the database.
Writing takes a very long time, about 200 records per second. Can you tell me how to speed this up?
foreach ($ipPool as $ip) {
    $date = new \DateTime($csv[5]);

    Resource::updateOrCreate([
        'ip'           => trim($ip),
        'domain'       => $csv[1],
        'url'          => $csv[2],
        'organization' => $csv[3],
        'number'       => $csv[4],
        'date'         => $date,
    ], [
        'ip'           => trim($ip),
        'domain'       => $csv[1],
        'url'          => $csv[2],
        'organization' => $csv[3],
        'number'       => $csv[4],
        'date'         => $date,
        'version_date' => $updatedDate,
    ]);
}
The fastest option is a COPY query, ideally run directly against a CSV file that is local to the DBMS host.
From the application, COPY is still by far the best choice, though PDO's COPY interface is awkward.
Worse: relatively large multi-row inserts.
Even worse: many individual queries inside one transaction.
Many queries without a single shared transaction is, obviously, going to be very slow.
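As a sketch of the COPY route (the `resources` table name and column list here are assumptions based on the model in the question, not something stated by the asker):

```php
<?php
// Option 1 (fastest): run COPY from psql with the file local to the client:
//   \copy resources (domain, url, organization, number, date, ip)
//       from 'data.csv' with (format csv, header true)
//
// Option 2: from the application via the pgsql PDO driver.
$pdo = new PDO('pgsql:host=localhost;dbname=app', 'user', 'secret');

// pgsqlCopyFromFile() wraps COPY ... FROM STDIN, streaming the file from
// the PHP side, so the file only needs to exist on the app server.
// Caveat: it uses COPY's *text* format with a delimiter, not CSV format,
// so quoted CSV fields would need preprocessing first.
$pdo->pgsqlCopyFromFile(
    'resources',                                   // target table (assumed name)
    '/path/to/data.csv',                           // file on the app server
    ',',                                           // field delimiter
    '\\N',                                         // NULL representation
    'domain, url, organization, number, date, ip'  // assumed column list
);
```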
I hope you are not writing one model at a time in a loop.
Do a batch insert of 1,000-10,000 rows at a time (the optimal batch size is found empirically).
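A hedged sketch of that batch approach for the asker's loop, assuming Laravel 8+ (for `upsert()`) and a unique index covering the conflict columns; `$csvRows` and the column layout are assumptions based on the question:

```php
<?php
// Collect all rows first, then write them in chunks with one multi-row
// upsert per chunk, instead of one updateOrCreate() query pair per record.
$rows = [];
foreach ($csvRows as $csv) {
    foreach ($ipPool as $ip) {
        $rows[] = [
            'ip'           => trim($ip),
            'domain'       => $csv[1],
            'url'          => $csv[2],
            'organization' => $csv[3],
            'number'       => $csv[4],
            'date'         => new \DateTime($csv[5]),
            'version_date' => $updatedDate,
        ];
    }
}

// 1,000-10,000 per chunk; tune empirically, as the answer suggests.
foreach (array_chunk($rows, 5000) as $chunk) {
    Resource::upsert(
        $chunk,
        ['ip', 'domain', 'url', 'organization', 'number', 'date'], // match columns (need a unique index)
        ['version_date']                                           // columns updated on conflict
    );
}
```

Each `upsert()` call here becomes a single `INSERT ... ON CONFLICT DO UPDATE` statement, which is why it is orders of magnitude faster than per-row `updateOrCreate()`.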