MySQL
Pavel, 2015-03-18 12:26:47

What is the best way to organize the database and update product data in MySQL?

There is a database with 2 million product items and their prices; 300–400k items are updated daily, i.e. old supplier products must be removed and new ones added. Every day this heavily overloads the database, and the server periodically crashes. Please tell me how best to organize the update to reduce the load.

There is one table: ID, SUPPLIER, NAME, PRICE. The NAME column is searched with full-text search. The table is currently MyISAM, in a MySQL database. I would appreciate any advice on how best to proceed in this situation.
UPD:
Import comes from an XML file. When a supplier's data is updated, all records for that supplier are first deleted from the table, e.g. DELETE FROM baza WHERE postavshik = 2, and then the new data is added with multi-row INSERTs of 10,000 rows each. A single update can touch 40–50 suppliers, with roughly 10–20k items per supplier.
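The current flow, as described above, looks roughly like this (table and column names are the ones quoted in the question; the batch size is the 10,000 rows mentioned). Note that on MyISAM each DELETE and INSERT takes a full-table lock, which blocks concurrent full-text searches for the duration of the statement:

```sql
-- 1. Drop all rows of the supplier being refreshed:
DELETE FROM baza WHERE postavshik = 2;

-- 2. Re-insert the fresh data in multi-row batches:
INSERT INTO baza (postavshik, name, price) VALUES
  (2, 'Product A', 100.00),
  (2, 'Product B', 250.50);
  -- ... up to ~10,000 rows per statement, repeated per batch
```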


4 answers
Andrey Mokhov, 2015-03-18
@mokhovcom

Load the data into temporary table(s), then lock the database, drop the existing table(s), and rename the temporary ones.
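A minimal sketch of this swap, assuming the table and column names from the question (`baza_new` and `baza_old` are hypothetical staging names). `RENAME TABLE` is atomic in MySQL, so readers never see a half-loaded table:

```sql
-- 1. Build a fresh copy off to the side (no locks on the live table):
CREATE TABLE baza_new LIKE baza;

-- 2. Bulk-load the new data into the staging table:
INSERT INTO baza_new (postavshik, name, price) VALUES
  (2, 'Product A', 100.00),
  (2, 'Product B', 250.50);
  -- ... repeat in batches for all suppliers

-- 3. Swap atomically: readers switch to the new table in one step.
RENAME TABLE baza TO baza_old, baza_new TO baza;

-- 4. Clean up the old copy once nothing references it:
DROP TABLE baza_old;
```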

ivankomolin, 2015-03-18
@ivankomolin

More information is needed: how are the deletion and insertion of new records organized?
Perhaps you delete and add one record at a time in a loop )))
Operations like these should not heavily load the server if you use multi-row INSERTs.
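The difference this answer points at, sketched with the question's table: a single multi-row `INSERT` replaces thousands of one-row statements and amortizes statement parsing, lock acquisition, and index maintenance across the whole batch:

```sql
-- Slow: one round-trip, one lock, one index update per row.
INSERT INTO baza (postavshik, name, price) VALUES (2, 'Product A', 100.00);
INSERT INTO baza (postavshik, name, price) VALUES (2, 'Product B', 250.50);

-- Fast: one statement per batch of several thousand rows.
INSERT INTO baza (postavshik, name, price) VALUES
  (2, 'Product A', 100.00),
  (2, 'Product B', 250.50),
  (2, 'Product C', 99.90);
```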

whats, 2015-03-18
@whats

Locking will happen. And he probably needs not to delete everything and then load fresh data, but only to update the existing prices in bulk. The approach is the same either way: a staging table into which the new batch is loaded with LOAD DATA. Then, matching either by ID or by the other two fields, delete the corresponding records from the main table; the deletion will be fast if there is an index on the matched fields. The third step is an INSERT from the staging table into the main one.

I have a similar situation now, except the database holds more than 50 million items and up to 30 million of them can be updated every day. First I load the CSV file into a temporary table. Deletion matches on the fields manufacturer and name, and then comes the insert. All of this runs in Postgres; such an operation on 100 thousand records takes under a second, and nothing gets blocked.
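The three steps of this answer, sketched in MySQL terms for the question's table (`baza_stage` and the file path are hypothetical; the answerer's own setup is Postgres, but `LOAD DATA` is the MySQL mechanism the answer names):

```sql
-- 1. Bulk-load the new batch into a staging table:
CREATE TABLE baza_stage LIKE baza;
LOAD DATA INFILE '/tmp/batch.csv'
  INTO TABLE baza_stage
  FIELDS TERMINATED BY ','
  (postavshik, name, price);

-- 2. Delete matching rows from the main table. An index on the
--    matched fields (here: supplier + name) keeps this fast.
DELETE b
  FROM baza b
  JOIN baza_stage s
    ON b.postavshik = s.postavshik AND b.name = s.name;

-- 3. Insert the fresh rows from the staging table:
INSERT INTO baza (postavshik, name, price)
  SELECT postavshik, name, price FROM baza_stage;

DROP TABLE baza_stage;
```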

imhuman, 2018-01-25
@imhuman

How to visually rotate a playing card in CSS? — almost your case
