How can I speed up a script that syncs data between two databases?
There is a Python script that reads data from one database and checks whether each row already exists in a second database (matching by GUID). Rows that are missing get inserted into the second database.
The problem is that the script is very slow, and I'm thinking about how to speed it up. The first idea that comes to mind is to select the rows by ID, split them into two halves, and then check the GUIDs against the other database in two threads. But maybe there are better options?
Just in case, I'll clarify that the databases are not identical and only certain fields need to be synchronized.
PostgreSQL database
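For context, a minimal sketch of the per-row pattern the question seems to describe, which is typically slow because every source row costs a separate existence check and insert (one round trip each). Here sqlite3 stands in for PostgreSQL so the example is self-contained; the table and column names (`items`, `payload`) are invented:

```python
import sqlite3
import uuid

# sqlite3 stands in for PostgreSQL so the sketch is runnable;
# a real script would use a PostgreSQL driver such as psycopg2.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE items (guid TEXT PRIMARY KEY, payload TEXT)")
dst.execute("CREATE TABLE items (guid TEXT PRIMARY KEY, payload TEXT)")

rows = [(str(uuid.uuid4()), f"row {i}") for i in range(5)]
src.executemany("INSERT INTO items VALUES (?, ?)", rows)
# Pre-seed the target with two rows that are already "synced".
dst.executemany("INSERT INTO items VALUES (?, ?)", rows[:2])

# Naive per-row sync: one existence check and one insert per source row.
# Each iteration is a separate round trip, which is what makes it slow
# over a network connection.
inserted = 0
for guid, payload in src.execute("SELECT guid, payload FROM items"):
    exists = dst.execute(
        "SELECT 1 FROM items WHERE guid = ?", (guid,)
    ).fetchone()
    if exists is None:
        dst.execute(
            "INSERT INTO items (guid, payload) VALUES (?, ?)",
            (guid, payload),
        )
        inserted += 1
dst.commit()
print(inserted)  # 3
```

Threading splits this work but keeps the per-row round trips; the answers below instead replace them with set-based queries.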
Grab a profiler and see where you're wasting your time.
What class of DBMS are we talking about?
For example, if we are talking about a transactional RDBMS and the rows are large, you can do this:
create temporary table updatetable(guid uuid not null)
In batches of 1000 GUIDs, read the GUIDs from the source and write them to this temporary table, then run:
select guid from updatetable where not exists (select 1 from normaltable where normaltable.guid = updatetable.guid)
This gives a list of GUIDs that do not exist in the target database. Then fetch the full rows for those GUIDs from the source and write them in batches to the target database.
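The steps above can be sketched in Python. sqlite3 (in-memory) stands in for PostgreSQL so the example is runnable end to end; with real PostgreSQL you would use a driver such as psycopg2, and could bulk-load the temp table faster with COPY. The table names `updatetable` and `normaltable` come from the answer; the `payload` column and batch size are invented for illustration:

```python
import sqlite3
import uuid

# In-memory sqlite3 databases stand in for the two PostgreSQL servers.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE normaltable (guid TEXT PRIMARY KEY, payload TEXT)")

rows = [(str(uuid.uuid4()), f"row {i}") for i in range(2500)]
src.executemany("INSERT INTO normaltable VALUES (?, ?)", rows)
# 1000 rows are already present in the target.
dst.executemany("INSERT INTO normaltable VALUES (?, ?)", rows[:1000])

BATCH = 500  # batch size is a tunable; the answer suggests ~1000
dst.execute("CREATE TEMPORARY TABLE updatetable (guid TEXT NOT NULL)")

cur = src.execute("SELECT guid FROM normaltable")
total_copied = 0
while True:
    batch = cur.fetchmany(BATCH)
    if not batch:
        break
    # 1. Load one batch of source GUIDs into the temp table.
    dst.execute("DELETE FROM updatetable")
    dst.executemany("INSERT INTO updatetable (guid) VALUES (?)", batch)
    # 2. One set-based query finds the GUIDs missing from the target.
    missing = [g for (g,) in dst.execute(
        "SELECT guid FROM updatetable u "
        "WHERE NOT EXISTS (SELECT 1 FROM normaltable n WHERE n.guid = u.guid)"
    )]
    if not missing:
        continue
    # 3. Fetch the full rows for the missing GUIDs from the source
    #    and bulk-insert them into the target.
    placeholders = ",".join("?" * len(missing))
    full = src.execute(
        f"SELECT guid, payload FROM normaltable WHERE guid IN ({placeholders})",
        missing,
    ).fetchall()
    dst.executemany("INSERT INTO normaltable VALUES (?, ?)", full)
    total_copied += len(full)
dst.commit()
print(total_copied)  # 1500
```

The point of the design is that existence checks happen in a handful of set-based queries instead of one query per row, so the round-trip count drops from O(rows) to O(rows / batch size).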