PostgreSQL
er12, 2015-08-01 19:59:01

How to optimize a query to delete records from a table?

I have the most common delete query (delete from table t where ...).
How can I optimize it when the table is very large? About 200 thousand records need to be deleted, and the table contains millions. Each delete also removes rows from dependent tables (one record has on the order of 30 dependent records in different tables).
explain analyze shows:

Delete on t  (cost=0.00..37370.84 rows=4758 width=6) (actual time=146.687..146.687 rows=0 loops=1)
  ->  Seq Scan on t  (cost=0.00..37370.84 rows=4758 width=6) (actual time=146.682..146.682 rows=0 loops=1)
        Filter: (event_id = 437)
        Rows Removed by Filter: 1497652
Total runtime: 146.726 ms

In this example, about 3 thousand records are deleted.
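For reference, the statement behind that plan, reconstructed from the filter shown above (the table name t and the constant 437 are taken from the plan):

EXPLAIN ANALYZE
DELETE FROM t WHERE event_id = 437;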


3 answers
sim3x, 2015-08-01
@sim3x

Premature optimization

Alexey Skahin, 2015-08-02
@pihel

Here you need to optimize not the query but the table:
* disable all indexes so they are not rebuilt on every delete
* disable the foreign keys so there are no integrity checks (clean up the dependent rows manually)
* if there are millions of records, you can partition the table and clean out whole partitions: ALTER TABLE t1 TRUNCATE PARTITION p0; (example from Oracle)
* disable all triggers
In short, limit the influence of all external factors; a PostgreSQL sketch follows.
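A minimal PostgreSQL sketch of the index, FK, and trigger points, assuming a table t with an event_id column; the index name t_event_id_idx is hypothetical, and DISABLE TRIGGER ALL needs superuser rights because it also disables the internal foreign-key triggers:

BEGIN;

-- Disable every trigger on t, including the internal FK-enforcement
-- triggers (superuser only; dependent rows must be cleaned up manually).
ALTER TABLE t DISABLE TRIGGER ALL;

-- Drop secondary indexes so they are not maintained row by row.
-- t_event_id_idx is an assumed index name.
DROP INDEX IF EXISTS t_event_id_idx;

DELETE FROM t WHERE event_id = 437;

ALTER TABLE t ENABLE TRIGGER ALL;

COMMIT;

-- Rebuild the dropped index once the bulk delete is done.
CREATE INDEX t_event_id_idx ON t (event_id);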

Alexander Evseev, 2015-08-06
@alex1t

You can use the "Soft Delete" approach. Then deletion becomes a simple update Table set IsDeleted = 1 where ...; a sketch follows.
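A minimal sketch of this in PostgreSQL, assuming a boolean is_deleted flag (the column and index names are illustrative):

-- Add the flag once; existing rows default to "live".
ALTER TABLE t ADD COLUMN is_deleted boolean NOT NULL DEFAULT false;

-- "Deleting" becomes a cheap update that touches no dependent tables.
UPDATE t SET is_deleted = true WHERE event_id = 437;

-- A partial index keeps lookups over live rows fast.
CREATE INDEX t_live_event_id_idx ON t (event_id) WHERE NOT is_deleted;

Note that on a table with millions of rows, adding a column with a non-null default rewrites the whole table on older PostgreSQL versions, so it is best done during a maintenance window.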
