Collecting backups from multiple hosts
Hello!
Please suggest a solution. There are several hosting accounts, quite a mixed bag. There is a home machine running Ubuntu 11.04. All remote machines are reachable over SSH. The task: initiate incremental backups of files and databases from the home machine and pull the archives down to it. It is desirable to keep all the data (logs, exclusion lists, etc.) only on the home machine.
PS
Most of the solutions I have found require installing additional software on the remote side, but that is not always possible, since some of the hosts are shared hosting. So the most that can be counted on there is probably tar and gzip.
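For illustration, the most I can probably run there is something like the following (the hostname, paths, credentials and the availability of mysqldump are all assumptions on my part):

# stream a gzipped tarball of the site files over SSH to the home machine
ssh user@sharedhost.example.com 'tar czf - -C /home/user public_html' > site-$(date +%F).tar.gz
# dump the database the same way, assuming mysqldump is present on the hosting
ssh user@sharedhost.example.com 'mysqldump -u dbuser -p"secret" dbname | gzip' > db-$(date +%F).sql.gz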
Thanks.
If the local machine is Windows, install Cygwin on it.
On the local machine, write a primitive script along these lines:
scp user@host1:/path1/files/* /local/path1/
scp user@host2:/path2/files/* /local/path2/
And on the servers, key-based authorization is set up.
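A minimal sketch of that key setup, assuming OpenSSH on both sides (the key path and host are placeholders):

# generate a key pair once on the home machine; empty passphrase so cron can use it unattended
ssh-keygen -t rsa -N "" -f ~/.ssh/backup_key
# install the public key on each server so scp/rsync can log in without a password
ssh-copy-id -i ~/.ssh/backup_key.pub user@host1.example.com
# test it
ssh -i ~/.ssh/backup_key user@host1.example.com 'echo ok'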
Instead of scp it is better to use rsync - it is more efficient in terms of traffic, especially when only some of the files in the list change (for even more efficiency, rsync can be run on the servers as a daemon; for *nix this is not a problem).
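A hedged sketch of the rsync variant pulling over plain SSH (hosts and paths are placeholders; no daemon is strictly required for this form):

# -a preserves permissions/timestamps, -z compresses in transit, only changed files are transferred
rsync -az -e ssh user@host1.example.com:/path1/files/ /local/path1/
rsync -az -e ssh user@host2.example.com:/path2/files/ /local/path2/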
rsnapshot.org/ - makes incremental backups over the network from any machine that has ssh and rsync installed.
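For example, a minimal rsnapshot.conf fragment might look like this (fields must be tab-separated; the paths, hosts and retention counts are placeholders, and older rsnapshot versions use the keyword interval instead of retain):

# where snapshots are kept on the home machine
snapshot_root	/local/backups/
# needed for pulling from remote hosts over ssh
cmd_ssh	/usr/bin/ssh
retain	daily	7
retain	weekly	4
backup	user@host1.example.com:/var/www/	host1/
backup	user@host2.example.com:/home/user/	host2/

A cron entry such as 0 3 * * * /usr/bin/rsnapshot daily then produces hard-linked incremental snapshots, with everything stored only on the home machine.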
I can't help with your exact scheme, but a service like this might suit you: www.dropmysite.com.
The free plan gives 2 GB of space for backups; it collects them automatically on a schedule over FTP and can also connect to the database directly.
I've been using it for a couple of days, it seems to be working fine :)