PostgreSQL
beduin01, 2021-03-03 10:16:39

What is the best way to insert into a database?

The situation is this: I have PostgreSQL and a JSON document whose fields correspond to the database schema; I turn it into INSERT statements. For example:

{
  "id": 123,
  "price": 824,
  "objs": [{ "name": "apple" }, { "name": "milk" }, { "name": "water" }]
}


Accordingly, I split this data across the tables.

The problem is this: sometimes I receive ids that already exist in the database, so I need to delete the old records and insert the new ones. Performance is extremely important to me, and I don't know the best way to do this.

At first I checked whether the given id already existed in the database tables and deleted it if so. Then I realized this was slow and switched to catching the exception raised when inserting a duplicate: on an exception, I delete the old records and retry the insert.

But now I wonder: is that efficient? Would it be better to wrap it in a transaction like this:

BEGIN;
DELETE FROM mainTable WHERE id = 123; -- delete from the main table
DELETE FROM obj WHERE id = 123;       -- delete from the second table
INSERT INTO mainTable ...;
INSERT INTO obj ...;
COMMIT;
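The delete-then-insert-in-one-transaction pattern above can be sketched in application code. This is a minimal sketch using Python's stdlib `sqlite3` as a stand-in for PostgreSQL (the table names and a `price` column follow the JSON example; with a PostgreSQL driver such as psycopg2 the pattern is the same: group all statements into one transaction and commit once):

```python
import sqlite3

# In-memory SQLite stands in for PostgreSQL here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mainTable (id INTEGER PRIMARY KEY, price INTEGER)")
conn.execute("CREATE TABLE obj (id INTEGER, name TEXT)")

# Pre-existing (stale) data for id=123.
conn.execute("INSERT INTO mainTable VALUES (123, 500)")
conn.execute("INSERT INTO obj VALUES (123, 'bread')")

# Incoming JSON payload.
record = {"id": 123, "price": 824,
          "objs": [{"name": "apple"}, {"name": "milk"}, {"name": "water"}]}

with conn:  # one transaction: all four statements commit or roll back together
    conn.execute("DELETE FROM mainTable WHERE id = ?", (record["id"],))
    conn.execute("DELETE FROM obj WHERE id = ?", (record["id"],))
    conn.execute("INSERT INTO mainTable VALUES (?, ?)",
                 (record["id"], record["price"]))
    conn.executemany("INSERT INTO obj VALUES (?, ?)",
                     [(record["id"], o["name"]) for o in record["objs"]])

print(conn.execute("SELECT price FROM mainTable WHERE id = 123").fetchone()[0])  # 824
print(conn.execute("SELECT COUNT(*) FROM obj WHERE id = 123").fetchone()[0])     # 3
```

Besides correctness (no half-replaced records are ever visible), committing once per record instead of once per statement is usually the bigger performance win.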


I also considered INSERT ... ON CONFLICT ... DO UPDATE (UPSERT), but in theory the second-level tables (the nested parts of the JSON) may have composite keys that don't match the new data: if the new payload contains fewer rows than the old one, some of the old rows will survive the UPSERT.
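The leftover-rows concern is real: an upsert only touches the rows present in the new payload, so child rows that are absent from it are never removed. A minimal demonstration, again using stdlib `sqlite3` (which supports the same `ON CONFLICT` clause as PostgreSQL; the composite key `(id, name)` is assumed for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE obj (id INTEGER, name TEXT, PRIMARY KEY (id, name))")

# Old data for id=123 has three child rows.
conn.executemany("INSERT INTO obj VALUES (?, ?)",
                 [(123, "apple"), (123, "milk"), (123, "water")])

# The new payload has only two rows; upserting touches only those two...
new_objs = [(123, "apple"), (123, "juice")]
conn.executemany(
    "INSERT INTO obj VALUES (?, ?) ON CONFLICT(id, name) DO NOTHING", new_objs)

# ...so 'milk' and 'water' survive: an upsert alone cannot shrink the child set.
count = conn.execute("SELECT COUNT(*) FROM obj WHERE id = 123").fetchone()[0]
print(count)  # 4 rows remain, not 2
```

This is why the child table typically still needs a DELETE (or a `DELETE ... WHERE id = ? AND name NOT IN (...)`) even if the parent table is upserted.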

What is the best approach here? What are the options?


1 answer(s)
Dr. Bacon, 2021-03-03
@bacon

Use UPSERT, and make sure that the "theoretical" mismatch situation cannot actually occur. If the insert is bulk, run it as a single transaction so that you don't commit after every statement.
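The suggestion above (upsert the whole batch inside one transaction) can be sketched as follows, again with stdlib `sqlite3` standing in for PostgreSQL; the `ON CONFLICT(id) DO UPDATE SET ... = excluded....` syntax shown is the same in both engines:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mainTable (id INTEGER PRIMARY KEY, price INTEGER)")
conn.execute("INSERT INTO mainTable VALUES (123, 500)")  # pre-existing row

rows = [(123, 824), (124, 99), (125, 17)]  # id 123 collides, 124/125 are new

with conn:  # single transaction for the whole batch: one commit, not three
    conn.executemany(
        "INSERT INTO mainTable (id, price) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET price = excluded.price",
        rows)

print(conn.execute("SELECT price FROM mainTable WHERE id = 123").fetchone()[0])  # 824
```

The existing row is updated and the new rows are inserted in one pass, with no exception handling and no separate existence check.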
