Backup

WebforSelf, 2022-03-09 22:37:50

How to back up a large site?

Initial situation: there is a site that weighs about 500 GB.
There is a home PC running Linux (Ubuntu).
I need to back up this huge site to the local machine.
The problem is that the site contains an enormous number of files; downloading it over FTP, for example, just stalls completely.
Another option was to make an archive on the same disk; I created a tar.gz, but downloading it also takes forever.

There was also the option of connecting via the terminal and pulling the site with rsync, without archiving.
That failed too: for some reason not all the files were transferred, and rsync kept spitting out errors along the way.

Then there was an attempt to upload the 500 GB archive to Yandex.Disk.
The client application was installed on the server, but when trying to synchronize, Yandex reports that it cannot sync a 500 GB archive. How do you make backups reliably in situations like this? Split everything into pieces?


3 answers
Roman Mirilaczvili, 2022-03-10
@WebforSelf

Try ZBackup (deduplication + LZMA compression) on the server, and then rsync the result to your machine.
When you create an archive, the output is a large number of ~2 MB blob files, and those download without problems. I have been using ZBackup myself for a long time.
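
A rough sketch of how that could look; the paths /srv/backups/zbackup-repo and /var/www/site are just placeholders, and the repository is created unencrypted for simplicity:

    # on the server: create a ZBackup repository
    zbackup init --non-encrypted /srv/backups/zbackup-repo

    # stream a tar of the site into the repository; ZBackup deduplicates and compresses it
    tar -cf - /var/www/site | zbackup backup /srv/backups/zbackup-repo/backups/site-2022-03-10.tar

    # on the home PC: pull the repository (many small blob files, easy to resume)
    rsync -av --partial server:/srv/backups/zbackup-repo/ /home/user/backups/zbackup-repo/

    # restore later by streaming the backup back out through tar
    zbackup restore /home/user/backups/zbackup-repo/backups/site-2022-03-10.tar | tar -xf -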

rPman, 2022-03-09
@rPman

Split it into pieces, and back it up not to your machine with its half-dead Internet connection, but to a server in a data center or, as you already tried, to Yandex.Disk, where it will remain available later.
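
A minimal sketch of the "split into pieces" idea, assuming the site lives in /var/www/site (a placeholder path) and there is enough free space for the chunks:

    # create a compressed archive and cut it into 2 GB chunks as it is written
    tar -czf - /var/www/site | split -b 2G - site-backup.tar.gz.part-

    # later, reassemble the chunks and unpack them
    cat site-backup.tar.gz.part-* | tar -xzf -

Individual 2 GB chunks are much easier to upload to cloud storage, or to re-download after a failed transfer, than one 500 GB file.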

Drno, 2022-03-09
@Drno

Try rclone instead of rsync.
Configure where to copy from and where to, and it will back everything up.
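
A rough example of what that could look like with the Yandex.Disk backend; the remote name "yandex" and the paths are assumptions for illustration:

    # one-time interactive setup of a remote, e.g. a Yandex.Disk remote named "yandex"
    rclone config

    # mirror the site to the remote; rclone retries and resumes individual files on its own
    rclone sync /var/www/site yandex:site-backup --transfers 8 --progress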
