Is it possible to reduce this query's execution time?
There is a table with statistical data. It currently holds 6.5 million rows, and about 300,000 rows are added every day. The following query needs to be run regularly:
SELECT MAX(players) AS max, AVG(players) AS avg_players, AVG(ping) AS avg_ping FROM y_update_sa WHERE server_id = :id
One option: store the statistics separately and update them each time a record is added. For each server_id, keep the maximum of players, the sum of all players, the sum of all pings, and the row count. On every insert into the main table (via a trigger, for example), update the values in the statistics table. Reading the result then costs one row read and two divisions, instead of a near-full scan of the whole table.
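A minimal sketch of this approach, using SQLite for illustration (MySQL trigger syntax differs slightly; the table and column names follow the question, while the stats table `y_stats` and its columns are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Main table, with the columns used in the question's query
cur.execute("""CREATE TABLE y_update_sa (
    server_id INTEGER, players INTEGER, ping INTEGER)""")

# Per-server running aggregates, maintained incrementally
cur.execute("""CREATE TABLE y_stats (
    server_id   INTEGER PRIMARY KEY,
    max_players INTEGER,
    sum_players INTEGER,
    sum_ping    INTEGER,
    cnt         INTEGER)""")

# Trigger: create the stats row if missing, then fold the new values in
cur.execute("""
CREATE TRIGGER y_update_sa_ai AFTER INSERT ON y_update_sa
BEGIN
    INSERT OR IGNORE INTO y_stats VALUES (NEW.server_id, 0, 0, 0, 0);
    UPDATE y_stats SET
        max_players = MAX(max_players, NEW.players),
        sum_players = sum_players + NEW.players,
        sum_ping    = sum_ping + NEW.ping,
        cnt         = cnt + 1
    WHERE server_id = NEW.server_id;
END""")

cur.executemany("INSERT INTO y_update_sa VALUES (?, ?, ?)",
                [(1, 10, 50), (1, 30, 70), (2, 5, 20)])

# One-row read replaces the aggregation over millions of rows
row = cur.execute("""SELECT max_players,
                            1.0 * sum_players / cnt,
                            1.0 * sum_ping / cnt
                     FROM y_stats WHERE server_id = 1""").fetchone()
print(row)  # (30, 20.0, 60.0)
```

The same idea works in MySQL with an `AFTER INSERT` trigger and `INSERT ... ON DUPLICATE KEY UPDATE` on the stats table.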
Switch to XtraDB, enable file-per-table mode, and increase the cache; that should speed things up. But these are general tips — you need to test and look at the environment variables and server settings.
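A sketch of the settings mentioned above as a `my.cnf` fragment (the buffer-pool size here is a placeholder — it should be tuned to the server's available RAM):

```ini
[mysqld]
# one tablespace file per table instead of a shared ibdata file
innodb_file_per_table = 1
# the main InnoDB/XtraDB cache; commonly 50-70% of RAM on a dedicated DB server
innodb_buffer_pool_size = 4G
```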
As SagePtr said, keep the statistics separately and increment/decrement the values when the main tables change. That is the right order of work: first normalize the database, then denormalize where optimization calls for it.
You can also read about database optimization at ruhighload.com, in the "Databases" section.