How do I get the previous value before inserting a new record into a large Postgres table?
We have a table of customer debts. It is large, about 5 million records. The essence of the problem: before inserting a new debt value for a specific client, I need to fetch the record with that client's previous debt (debts are usually accrued in batches by a cron job). I came up with two approaches:
1) A trigger is attached to this large table: before each insert, the id of the last record for the given client id is looked up and written to an intermediate table. But this option does not seem very practical, because the query that selects the id of the last record I need takes ~1.2 seconds, and about 50,000 records are inserted into the table per night. It is quite possible that I am not writing the selection query correctly; if something is wrong, please point me in the right direction. Here is my query:
SELECT id
FROM debt
WHERE id < 6534794
  AND cus_id = 143867
ORDER BY id DESC
LIMIT 1
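A lookup like this usually only takes a second because Postgres has no index matching both the filter and the sort, so it falls back to scanning. A composite b-tree index on (cus_id, id) would likely let it read a single index entry instead; a sketch using the table and columns from the question (the index name is made up):

```sql
-- A b-tree index whose leading column matches the WHERE clause
-- and whose second column matches the ORDER BY lets Postgres
-- jump straight to the client's latest row.
CREATE INDEX debt_cus_id_id_idx ON debt (cus_id, id DESC);
```

After creating it, `EXPLAIN` on the query above should show an index scan on `debt_cus_id_id_idx` rather than a sequential scan.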
Why not just use RETURNING?
You can attach almost anything to it. For example, when deleting some data, I return what was actually deleted and pull in everything else I need with joins:
DELETE FROM
    waybills
USING
    invoices
    JOIN invoice_data ON
        invoice_data.invoice_uuid = invoices.invoice_uuid
WHERE
    waybills.invoice_uuid = invoices.invoice_uuid AND
    waybills.waybill_uuid = ?
RETURNING
    invoice_data.order_data_id,
    invoices.order_id,
    invoices.invoice_uuid
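The same idea can be applied to the question above: fetch the client's previous debt and insert the new row in one statement using a data-modifying CTE, so no trigger or intermediate table is needed. A sketch using the debt table from the question (the `amount` column and the `$1`/`$2` parameter placeholders are assumptions):

```sql
-- Assumed schema: debt(id, cus_id, amount).
-- prev finds the client's latest existing debt row,
-- ins inserts the new debt and returns its id.
WITH prev AS (
    SELECT id AS prev_id, amount AS prev_amount
    FROM debt
    WHERE cus_id = $1
    ORDER BY id DESC
    LIMIT 1
), ins AS (
    INSERT INTO debt (cus_id, amount)
    VALUES ($1, $2)
    RETURNING id
)
SELECT ins.id, prev.prev_id, prev.prev_amount
FROM ins
LEFT JOIN prev ON true;  -- LEFT JOIN so a client's first debt still inserts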