How do I optimize reads from a database with frequent inserts/updates?
I understand the architecture was not well thought out, but there is no way to redo it now; it has to be fixed as it is.
The situation is this: rows are inserted/updated in the database fairly frequently, a few per minute, sometimes dozens. At the same time, the data has to be read often and quickly, several times per second.
I understand that inserts and updates cause index maintenance, which takes time. Dropping the indexes is also a bad option, since selects would then be slow. But even now, SELECT queries are noticeably slow. How can I optimize this?
This is not a heavy load at all; I would even say it is no load. This is normal operation. Set up proper indexes and work as usual: until the table holds tens of millions of rows, it should stay fast. It depends on the structure, of course, but in the general case that holds.
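To make the point concrete, here is a minimal sketch using SQLite (table and index names are hypothetical, not from the question): thousands of inserts are cheap, and a read that filters on the indexed columns is resolved by an index search rather than a full table scan, as `EXPLAIN QUERY PLAN` confirms.

```python
import sqlite3

# Hypothetical schema illustrating the workload from the question:
# frequent inserts plus frequent reads filtered by the same columns.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts INTEGER, payload TEXT)"
)
# Index on exactly the columns the frequent SELECT filters by.
conn.execute("CREATE INDEX idx_events_user_ts ON events (user_id, ts)")

# Simulate the write load; index maintenance on dozens of rows
# per minute is negligible even at this volume.
conn.executemany(
    "INSERT INTO events (user_id, ts, payload) VALUES (?, ?, ?)",
    [(i % 100, i, "x") for i in range(10_000)],
)

# The frequent read: with the index this is a range search,
# not a scan of all 10,000 rows.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = ? AND ts > ?",
    (42, 5000),
).fetchall()
print(plan[0][3])  # plan detail mentions idx_events_user_ts
```

The plan text names the index, which is the quick way to verify a query is actually covered by the indexes you created.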
Do I understand correctly that you have two indexes on the table with five columns each, but your SELECT filters on a combination of columns different from either index?
I would restructure those indexes into something saner (if that query of yours is critical and frequently used), so that the query's filter columns match an index.
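The point about the column combination can be demonstrated with a small SQLite sketch (table `t` and index `idx_wide` are made up for illustration): a wide composite index is only usable when the query constrains its leading columns; filtering on other columns falls back to a full scan.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# `payload` is deliberately left out of the index so the index
# cannot serve as a covering index for SELECT *.
conn.execute("CREATE TABLE t (a, b, c, d, e, payload)")
# A wide five-column composite index, leading column `a`.
conn.execute("CREATE INDEX idx_wide ON t (a, b, c, d, e)")

# Filtering on a leading prefix of the index -> the index is used.
good = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM t WHERE a = 1 AND b = 2"
).fetchall()[0][3]

# Filtering only on non-leading columns -> full table scan,
# despite those columns being "in" the index.
bad = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM t WHERE c = 3 AND e = 5"
).fetchall()[0][3]

print(good)  # mentions idx_wide
print(bad)   # a plain scan of t, no index
```

This is why two arbitrary five-column indexes may not help the actual query at all: it is the combination and order of columns that matters, not their mere presence in some index.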