Debian
Valery, 2016-08-20 12:38:12

How to make a backup without bringing the server down?

Hello everyone,

we have a group-buying project (something like an online store).
It runs on a dedicated server with a Xeon E3-1230v2 and 16 GB of RAM.
The files take up about 200 GB in total.

All of this is backed up with ZBackup (previously we used the standard ISPmanager backups, but they are even worse).

If the backup starts at midnight, it runs for 10-12 hours and noticeably slows the server down the whole time:
load average: 3.99, 4.50, 4.52 or higher

Naturally, this affects how responsive the site feels, and I would like to get rid of the problem somehow.

Perhaps someone has run into a similar problem.
How can the load from creating backups be reduced?
How are backups of large (volume-wise) projects usually done?

3 answers
Roman Mirilaczvili, 2016-08-20
@2ord

I think this is because LZMA compression is used: it compresses very well, but it is very CPU-intensive (especially on poorly compressible data such as JPG). It makes sense to try LZO, which focuses on compression speed rather than compression ratio.
You can also try playing with the --cache-size and --threads parameters.
However, it is worth changing them one at a time first, otherwise it will not be clear which parameter has an effect and which does not.
By the way, for JPG you can also experiment with the lepton packer (lossless compression/decompression).
In general, you can choose the best compression method based on the nature of the data.
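A minimal sketch of what that might look like when a tar stream is fed into zbackup (the repository path and thread count are invented for the example; flag availability depends on your zbackup version, so verify against zbackup --help):

# create an unencrypted repository once (path is only an example)
zbackup init --non-encrypted /srv/zbackup-repo

# back up with LZO instead of the default LZMA, limiting the number of worker threads
tar -c -C /var/www . \
  | zbackup backup --non-encrypted --compression lzo --threads 2 \
      /srv/zbackup-repo/backups/www-$(date +%F)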
And finally, the most obvious solution: use a dedicated compute resource solely for backing up the data. Naturally, it must have direct access to the data.
As an incremental backup solution there is
zbackup-tar

zbackup-tar is a very relaxed incremental tarring tool. Very relaxed means that the cost of tarring a file we don't need is very low (since it will be deduplicated) so we can tar files we don't strictly need, so long as we never miss tarring a file we do need.
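A rough sketch of the dedicated-machine idea, assuming the files can be exported read-only over NFS (the host name and paths are invented for the example):

# on the backup host: mount the production files read-only
mount -o ro shop-server:/var/www /mnt/shop

# tar the mounted tree and deduplicate it into a zbackup repository that lives
# on the backup host, so the CPU load of compression stays off the web server
tar -c -C /mnt/shop . | zbackup backup --non-encrypted /srv/zbackup-repo/backups/shop-$(date +%F)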

АртемЪ, 2016-08-20
@Jump, curator of the Backup tag

Apparently the problem is that you are doing a full backup.
Why back up 200 GB every time if at most 2-3 GB changes per day?
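As an illustration, GNU tar can do simple file-level incremental backups by itself using a snapshot file (the paths here are only placeholders):

# first run: a full (level-0) backup; file metadata is recorded in the snapshot file
tar --listed-incremental=/var/backups/shop.snar -czf /backups/shop-full.tar.gz /var/www

# later runs with the same snapshot file archive only files changed since the previous run
tar --listed-incremental=/var/backups/shop.snar -czf /backups/shop-incr-$(date +%F).tar.gz /var/www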

Владимир, 2016-08-20
@MechanID

Take a look at nice and ionice.
These utilities let you lower a process's CPU and disk I/O priority, either when it is launched or after it is already running.
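For example (the PID and repository path are only illustrative):

# start the backup with the lowest CPU priority and the "idle" I/O scheduling class,
# so it only gets disk time when nothing else needs it
nice -n 19 ionice -c3 sh -c 'tar -c -C /var/www . | zbackup backup --non-encrypted /srv/zbackup-repo/backups/www-$(date +%F)'

# or lower the priority of a backup that is already running
renice -n 19 -p 12345
ionice -c3 -p 12345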
