MySQL backup takes the site down for a few minutes?
I back up with mysqldump. The database has one large table with 8 million rows, and the site most likely goes down while that table is being dumped.
One idea is to lower the priority of mysqldump:
# lowest CPU priority and best-effort I/O class for the dump process
nice -n 19 ionice -c2 -n7 mysqldump mybd | gzip > mybd.sql.gz
What is the best way to solve this problem?
If you have a spare server, replicate the database to it as a master-slave pair and take backups from the slave; as a bonus, you get a hot standby of the database in case the main server fails.
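A minimal sketch of such a setup, assuming MySQL's built-in asynchronous replication; the host names, credentials, and binlog coordinates below are all placeholders:

# On the master (my.cnf): enable binary logging and pick a unique server id
# [mysqld]
# server-id = 1
# log_bin   = /var/log/mysql/mysql-bin.log

# On the slave (my.cnf): a different server id
# [mysqld]
# server-id = 2

# On the master: create a dedicated replication account (placeholder credentials)
mysql -u root -p -e "CREATE USER 'repl'@'%' IDENTIFIED BY 'PASSWORD';
GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';"

# On the master: note the current binlog file and position
mysql -u root -p -e "SHOW MASTER STATUS;"

# On the slave: point it at the master using the coordinates from above
mysql -u root -p -e "CHANGE MASTER TO
  MASTER_HOST='master.example.com',
  MASTER_USER='repl',
  MASTER_PASSWORD='PASSWORD',
  MASTER_LOG_FILE='mysql-bin.000001',
  MASTER_LOG_POS=107;
START SLAVE;"

# From now on, dump from the slave so the production master is never locked
mysqldump -h slave.example.com mybd | gzip > mybd.sql.gz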
Google says:
"When dumping a database in MySQL using the mysqldump utility, by default it makes the tables unwritable for the duration of the dump. This leads to negative consequences..."
That was the first thing that came to mind: maybe the site goes down because of this locking?
The original task is not quite clear. Do you need a backup, or do you need the site not to go down during mysqldump? In the first case, besides mysqldump, you could try:
a) Sypex Dumper
or
b) mysqlhotcopy (see the sketch after this list)
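A hedged usage sketch for option b); the backup directory is a placeholder. Note that mysqlhotcopy works only for MyISAM and ARCHIVE tables:

# Copy the mybd table files to a backup directory; this is a file-level copy,
# so it is much faster than a logical dump
mysqlhotcopy mybd /var/backups/mysql/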
Mysqldump is a fine stock utility, but at these data volumes it becomes unacceptably slow.
I recommend the excellent presentations from Percona [1, 2] and switching to a different database backup/restore tool.
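The answer does not name a specific tool, but Percona's own hot-backup tool is Percona XtraBackup; a minimal sketch, assuming it is installed and with placeholder credentials and paths:

# Take a hot backup of InnoDB data while the server keeps serving traffic
xtrabackup --backup --user=USER --password=PASSWORD --target-dir=/data/backups/

# Apply the redo log so the copied files are consistent and ready for restore
xtrabackup --prepare --target-dir=/data/backups/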
Add --single-transaction, --quick, and --lock-tables=false to avoid locking the tables for the duration of the dump. Note that --single-transaction gives a consistent snapshot only for transactional storage engines such as InnoDB.
Example:
mysqldump -u USER -pPASSWORD --single-transaction --quick --lock-tables=false DATABASE | gzip > backupDB.gz
Source: https://dev.mysql.com/doc/refman/8.0/en/mysqldump....
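For example, a hypothetical cron entry for a nightly non-locking dump; the schedule and backup path are assumptions:

# Run every night at 03:30, compressing straight into the backup directory
30 3 * * * mysqldump -u USER -pPASSWORD --single-transaction --quick --lock-tables=false DATABASE | gzip > /var/backups/backupDB.gz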