PostgreSQL
Alexander, 2016-10-17 08:53:44

How do I get rid of row-level locks and improve the performance of large tables in PostgreSQL?

There are two situations:

  1. A table with ~20k rows, some of which are updated frequently (up to several times per second) with queries like "UPDATE table_1 SET counter = counter + 1 WHERE id = 123". In this case a row can stay locked for up to two hours, and every subsequent query against it hangs until the lock is released (see the sketch after this list).
  2. A table with ~2.5M rows: data is added in bulk (INSERT ... SELECT) and then deleted row by row. Here the performance is simply abysmal.
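For context, a minimal reproduction of the blocking in the first situation; the table and column names are taken from the question, and the long-running transaction is hypothetical:

    -- Session 1: a long transaction updates the hot row and does not commit
    BEGIN;
    UPDATE table_1 SET counter = counter + 1 WHERE id = 123;
    -- ... other work; the transaction stays open ...

    -- Session 2: this statement queues behind session 1's row lock
    -- and waits until session 1 commits or rolls back
    UPDATE table_1 SET counter = counter + 1 WHERE id = 123;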

Indexes have been added on the frequently used columns. How do I deal with these problems? I suspect autovacuum, at least in the second case, but I have not found any sensible recommendations on how to tune it.
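For reference, autovacuum can be tuned per table through storage parameters; the values below are illustrative assumptions, not settings taken from this thread:

    -- Vacuum table_1 after roughly 5% of its rows change,
    -- instead of the default 20% scale factor
    ALTER TABLE table_1 SET (
        autovacuum_vacuum_scale_factor = 0.05,
        autovacuum_vacuum_threshold = 200
    );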
Environment: a PostgreSQL 9.5 server with 16 GB of RAM, an SSD drive, and pgbouncer as the connection pooler.
Thanks in advance.

2 answers
Alexander, 2016-10-23
@BetsuNo

As it turned out, the first problem was related to a peculiarity of pgbouncer. The point is that pgbouncer does not reserve a connection with an open transaction exclusively for the client that started it, and in one place we had a rather long transaction (it could hang for up to 15 minutes) on a completely unrelated group of tables. It turned out that a stream of updates to a single row of one table went through the pooler, some of them landed inside the transaction that was meant for the other tables, and after the commit the server took a long time to work out the order in which to apply it all. The solution: wrap the important data-changing queries (in our case, the counter update) in their own explicit transaction, even when they are issued one at a time, as in the sketch below.
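A minimal sketch of that fix, reusing the query from the question; the explicit transaction keeps the counter update from being merged into an unrelated open transaction:

    -- The hot counter update runs in its own short transaction
    BEGIN;
    UPDATE table_1 SET counter = counter + 1 WHERE id = 123;
    COMMIT;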
The second problem was solved by changing the structure of the table.

Sergey, 2016-10-17
@begemot_sun

At a minimum, don't delete rows one at a time; mark them as deleted instead, and then remove them in bulk (a sketch follows below).
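A sketch of that approach; the table name table_2, the deleted column, and the id in the WHERE clause are illustrative assumptions, not details from the answer:

    -- Add a flag instead of deleting rows immediately
    ALTER TABLE table_2 ADD COLUMN deleted boolean NOT NULL DEFAULT false;

    -- "Delete" a row by marking it
    UPDATE table_2 SET deleted = true WHERE id = 456;

    -- Periodically remove all marked rows in a single statement
    DELETE FROM table_2 WHERE deleted;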
