MySQL
smoln, 2020-12-11 10:13:13

How to import a very large dump of a MySQL table?

Good afternoon!
I have dumps of two database tables: the first dump is 9.5 GB, the second is 2.5 GB. Each dump has more than 50 million rows.
I import through the console. The import proceeds in steps of roughly two million rows, then after a while it freezes: it simply does not go any further and does not report any errors. It just hangs as if it were still working, but nothing happens; it stalls on the last two-million step and that's it (I waited for a day).
When I look at the table in the database, it stops at 48-49 million rows and loads no further, as if there were some kind of limit.
Other dumps with fewer than 50 million rows import without problems, but these do not: the import reaches ~50 million and stops. If I run the import again without clearing the table, the process continues past 50 million, reaches ~100 million and freezes again (which is no use anyway, since it produces duplicates).
I import through the openserver console; I have tried MySQL 5.7, 5.8, 8.0, and also MariaDB. Most likely some settings are to blame, but I cannot figure out which ones!
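One setting-related cause of stalls on very large InnoDB imports is per-row integrity checking combined with per-statement commits. A minimal sketch of wrapping a dump so those are deferred until the end (the file names here are hypothetical stand-ins for the real dumps, and whether this fixes the stall depends on the actual cause):

```shell
# Hypothetical stand-in for the real multi-GB dump file.
printf 'INSERT INTO t VALUES (1);\n' > dump.sql

# Prepend session settings that defer per-row checks and commits...
cat > head.sql <<'EOF'
SET autocommit=0;
SET unique_checks=0;
SET foreign_key_checks=0;
EOF

# ...and append a single COMMIT that restores them at the end.
cat > tail.sql <<'EOF'
COMMIT;
SET unique_checks=1;
SET foreign_key_checks=1;
EOF

cat head.sql dump.sql tail.sql > dump_wrapped.sql

# Then import with a generous client packet size, e.g.:
# mysql --max_allowed_packet=1G dbname < dump_wrapped.sql
```

Checking the server error log for disk-space or timeout messages, as suggested in the answers below, is still the first step before tuning anything.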


3 answers
Vitaly Karasik, 2020-12-11
@vitaly_il1

There are no special settings for this.
As said above, check error.log and the free disk space. Also, what does top show: is mysqld doing anything while it "hangs"? If so, it is probably building indexes.
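A minimal sketch of these checks, assuming a Linux host and a default log location (paths vary by install; openserver keeps its logs under its own directory):

```shell
# Is mysqld actually doing work, or sitting idle?
# (no output if the process is not running)
top -b -n 1 | grep -i mysql || true

# A full disk or data partition makes an import stall silently:
df -h

# Tail the error log for warnings (path is an assumption):
tail -n 50 /var/log/mysql/error.log 2>/dev/null || true

# From a second mysql session, these show what the server is doing:
#   SHOW FULL PROCESSLIST;
#   SHOW ENGINE INNODB STATUS\G
```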

ThunderCat, 2020-12-11
@ThunderCat

https://github.com/fadlee/bigdump

Denis, 2020-12-11
@sidni

Openserver comes with HeidiSQL, which can split large files by itself (without loading them whole); give it a try.
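If HeidiSQL is not an option, the standard `split` utility can do much the same from the console. A minimal sketch (the dump name and chunk size are placeholders, and line-based splitting only works when each INSERT statement fits on one line, which is mysqldump's default):

```shell
# Tiny stand-in for the real dump file.
printf 'INSERT INTO t VALUES (1);\nINSERT INTO t VALUES (2);\nINSERT INTO t VALUES (3);\n' > dump.sql

# Split into numbered chunks of 2 lines each (use something like
# 1000000 lines per chunk for a real 50-million-row dump):
split -l 2 -d dump.sql part_

ls part_*   # part_00 part_01

# Import the chunks in order:
# for f in part_*; do mysql dbname < "$f"; done
```

Importing chunk by chunk also makes it obvious which part of the dump the server chokes on.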
