How to optimize a query to delete records from a table?
I have an ordinary delete query (delete from t where ...).
How can I optimize it when the table is very large? The table contains millions of records, and about 200 thousand of them need to be deleted. Each delete also cascades to dependent tables (roughly 30 dependent records per row, spread across several tables). explain analyze shows:
Delete on t (cost=0.00..37370.84 rows=4758 width=6) (actual time=146.687..146.687 rows=0 loops=1)
-> Seq Scan on t (cost=0.00..37370.84 rows=4758 width=6) (actual time=146.682..146.682 rows=0 loops=1)
Filter: (event_id = 437)
Rows Removed by Filter: 1497652
Total runtime: 146.726 ms
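The plan above spends its time in a sequential scan: ~1.5 million rows are read and filtered out just to find the matching ones. A sketch of the usual first step (assuming PostgreSQL; `t` and `event_id` come from the plan, while the index and child-table names are hypothetical):

```sql
-- Index the filter column so the planner can use an index scan
-- instead of reading the whole table on every DELETE.
CREATE INDEX idx_t_event_id ON t (event_id);

-- Also index the referencing columns in the dependent tables:
-- PostgreSQL does not create these automatically, and without them
-- every cascaded delete re-scans each child table.
-- (child_table and t_id are hypothetical names.)
CREATE INDEX idx_child_t_id ON child_table (t_id);

-- Refresh statistics so the planner notices the new index.
ANALYZE t;

DELETE FROM t WHERE event_id = 437;
```

Indexing the foreign-key columns on the child tables is often the real win here: with ~30 dependent rows per parent, an unindexed child table turns 200 thousand deletes into 200 thousand sequential scans.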
Here it is not the query that needs optimizing, but the table:
* Disable all indexes so they are not rebuilt on every delete.
* Disable the foreign keys so there are no integrity checks (clean up the dependent rows manually).
* If there are millions of records, partition the table and clean whole partitions at once: ALTER TABLE t1 TRUNCATE PARTITION p0; (example from Oracle).
* Disable all triggers.
In short, limit the influence of all external factors.
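For PostgreSQL, the steps above can be sketched roughly as follows (table and column names are hypothetical; note that DISABLE TRIGGER ALL also disables the foreign-key triggers, so dependent rows must be cleaned up manually first):

```sql
BEGIN;

-- Disable all triggers on the table, including FK enforcement triggers
-- (requires appropriate privileges on t).
ALTER TABLE t DISABLE TRIGGER ALL;

-- FK checks are off, so delete the dependent rows manually first.
DELETE FROM child_table
 WHERE t_id IN (SELECT id FROM t WHERE event_id = 437);

DELETE FROM t WHERE event_id = 437;

ALTER TABLE t ENABLE TRIGGER ALL;

COMMIT;
```

With declarative partitioning (PostgreSQL 10+), the Oracle TRUNCATE PARTITION example translates to detaching and dropping a partition, which removes all of its rows without touching them one by one:

```sql
-- Assuming t is partitioned so that event_id = 437 lives in its own
-- partition (partition name is hypothetical).
ALTER TABLE t DETACH PARTITION t_event_437;
DROP TABLE t_event_437;
```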
You can also use the "soft delete" approach. Then the delete becomes a simple update: UPDATE t SET IsDeleted = 1 WHERE ...
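A minimal sketch of soft delete (column and index names are hypothetical; the partial index is PostgreSQL-specific):

```sql
-- Add a deletion flag instead of removing rows physically.
ALTER TABLE t ADD COLUMN is_deleted boolean NOT NULL DEFAULT false;

-- "Delete" is now a cheap update: no cascades, no index-entry removal.
UPDATE t SET is_deleted = true WHERE event_id = 437;

-- A partial index keeps queries over live rows fast and small.
CREATE INDEX idx_t_live ON t (event_id) WHERE NOT is_deleted;
```

The trade-off: every query must now filter with WHERE NOT is_deleted, and the dead rows still occupy disk space until they are purged in a later batch job.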