PostgreSQL
kaxa3201, 2019-06-05 12:58:11

How to write to a PostgreSQL database faster?

There are 700,000 records. A script parses a CSV file and writes the data to the database.
Inserting takes a very long time, about 200 records per second. Can you tell me how to speed this up?

foreach ($ipPool as $ip) {
    $date = new \DateTime($csv[5]);
    Resource::updateOrCreate([
        'ip'           => trim($ip),
        'domain'       => $csv[1],
        'url'          => $csv[2],
        'organization' => $csv[3],
        'number'       => $csv[4],
        'date'         => $date,
    ], [
        'ip'           => trim($ip),
        'domain'       => $csv[1],
        'url'          => $csv[2],
        'organization' => $csv[3],
        'number'       => $csv[4],
        'date'         => $date,
        'version_date' => $updatedDate,
    ]);
}


2 answers
Melkij, 2019-06-05
@melkij

The fastest is a COPY command, ideally reading straight from a CSV file local to the DBMS server.
From the application, COPY is still by far the best option, though PDO's COPY interface is terrible.
Worse: relatively large multi-row INSERTs.
Even worse: a pile of single-row queries inside one transaction.
A pile of queries with no shared transaction at all will obviously be very slow.
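
For illustration, a minimal sketch of the application-side COPY route using PDO's pgsqlCopyFromFile(), which wraps COPY ... FROM STDIN. The table name, column list, file path, and connection settings below are assumptions based on the question's code. Note that this method expects a plain delimiter-separated file without quoted fields, so a CSV that uses quoting may need preprocessing, or a server-side COPY ... WITH (FORMAT csv) run directly in psql.

<?php
// Hypothetical connection settings; adjust to your environment.
$pdo = new \PDO('pgsql:host=localhost;dbname=mydb', 'user', 'secret');

// One round trip loads the whole file via COPY ... FROM STDIN.
// Table and column names are assumed from the question's code.
$ok = $pdo->pgsqlCopyFromFile(
    'resources',                                  // target table
    '/path/to/data.csv',                          // file readable by the PHP process
    ',',                                          // field delimiter
    '\\N',                                        // NULL marker (PostgreSQL's default)
    'ip, domain, url, organization, number, date' // column list
);

if (!$ok) {
    throw new \RuntimeException('COPY failed');
}

// Server-side alternative (the file must be readable by the PostgreSQL
// server itself, not just by the application):
//   COPY resources (ip, domain, url, organization, number, date)
//   FROM '/path/on/db/server/data.csv' WITH (FORMAT csv);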

Konata Izumi, 2019-06-05
@Konata69lol

I hope you aren't saving one model at a time in a loop.
Do batch inserts of 1,000-10,000 rows each (the optimal batch size is found empirically).
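
A sketch of what that batching could look like for the question's code, assuming the same Resource model, columns, and variables. insert() goes through the query builder, bypasses Eloquent model events, and writes each chunk as a single multi-row statement.

// Accumulate plain row arrays instead of saving one model per iteration.
$rows = [];
foreach ($ipPool as $ip) {
    $rows[] = [
        'ip'           => trim($ip),
        'domain'       => $csv[1],
        'url'          => $csv[2],
        'organization' => $csv[3],
        'number'       => $csv[4],
        'date'         => $csv[5],       // raw value; insert() does no casting
        'version_date' => $updatedDate,
    ];
}

// Write in chunks; tune the size (1,000-10,000) empirically.
foreach (array_chunk($rows, 5000) as $chunk) {
    Resource::insert($chunk);
}

If existing rows must be updated rather than duplicated, as updateOrCreate was doing, Laravel 8+ also has a bulk upsert() that takes the same row arrays plus the columns that identify a duplicate.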
