Backup
lecherg, 2018-08-03 10:09:26

How to back up continuously changing data?

There is a large site: about 60 GB of data and a 200 MB database. The data in the database and on the site changes every second. No matter which archiving method I try, the resulting archives won't open, because the data at the start of archiving differs from the data at the end. What tools exist for this purpose? The server runs CentOS 8.


3 answers
Artem @Jump, 2018-08-03

60 GB data and 200 MB database

A database backup should be made with the tools of the DBMS that manages it. Whether you're running PostgreSQL or MySQL, each has its own backup utilities.
To get a consistent backup of the files, use file-system-level tools: snapshots. On Linux this can be done, for example, with LVM (and not only with it); on Windows the equivalent is shadow copies (VSS).
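As a sketch of the DBMS-level approach: for InnoDB tables, `mysqldump` can take a consistent dump without blocking writes, and `pg_dump` always dumps a consistent snapshot. Database name, user, and paths below are placeholders.

```shell
# MySQL: consistent dump of InnoDB tables without locking writers
# (database name, user, and output path are placeholders).
mysqldump --single-transaction --quick --routines \
    -u backup_user -p mydb | gzip > /backup/mydb-$(date +%F).sql.gz

# PostgreSQL: pg_dump takes a consistent snapshot by itself;
# -Fc produces a compressed custom-format archive restorable with pg_restore.
pg_dump -U backup_user -Fc mydb > /backup/mydb-$(date +%F).dump
```

Note that `--single-transaction` only guarantees consistency for transactional engines such as InnoDB; MyISAM tables would still need locking.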

R
RidgeA, 2018-08-03
@RidgeA

Wouldn't replication work?
As an option: set up replication -> take the slave out of rotation -> make a backup from the slave -> return it to the cluster -> wait until it catches up -> repeat.
In general, a lot depends on the DBMS...
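For MySQL, the cycle above could look roughly like this (hostname and credentials are placeholders; on MySQL 8.0.22+ the commands are `STOP REPLICA` / `START REPLICA`, on older versions `STOP SLAVE` / `START SLAVE`):

```shell
# Pause replication so the replica's data stops changing.
mysql -h replica1 -u admin -p -e "STOP REPLICA;"

# Dump the frozen state of the replica.
mysqldump -h replica1 -u admin -p --all-databases \
    | gzip > /backup/all-$(date +%F).sql.gz

# Resume replication; the replica catches up with the master on its own.
mysql -h replica1 -u admin -p -e "START REPLICA;"
```

The master keeps serving writes the whole time; only the replica is briefly out of sync.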

D
Dmitry Shitskov, 2018-08-03
@Zarom

You can try snapshots. I recommend ZFS or Btrfs for snapshots, but LVM snapshots work too; they are just less convenient to operate.
Procedure:

  1. Take a snapshot
  2. Mount and copy static data from snapshot
  3. Delete Snapshot
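The three steps above, sketched with LVM (the volume group `vg0`, logical volume `www`, and paths are placeholder names; a snapshot of this size assumes no more than 5 GB of writes during the copy):

```shell
# 1. Take a snapshot of the volume holding the site data.
lvcreate --size 5G --snapshot --name www-snap /dev/vg0/www

# 2. Mount the snapshot read-only and copy the static data out of it.
mkdir -p /mnt/www-snap
mount -o ro /dev/vg0/www-snap /mnt/www-snap
rsync -a /mnt/www-snap/ /backup/www/

# 3. Unmount and delete the snapshot so it stops consuming space.
umount /mnt/www-snap
lvremove -f /dev/vg0/www-snap
```

The copy is consistent as of the moment `lvcreate` ran, even though the live volume keeps changing underneath.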
