Is it possible to organize a backup of files to a remote server without archiving them on a production server?
Hello.
Until recently, the following scheme was used for backup:
1) An archive of the files and databases was created on the main server.
2) The archive was copied to a remote FTP server.
3) The archive was then deleted from the main server.
The main server has a 50 GB disk limit (more resources can be bought, but sooner or later I will hit the ceiling).
Once the files on the server exceeded 25 GB, the backup could no longer be made: there was simply not enough free disk space left for the archive.
Question: is it possible to organize the backup so that the archive is created directly on the FTP server, without using space on the main one? Or so that the backup is created in parts: archive, say, 1 GB, push it to FTP, delete it on the main server, create the next volume of the archive, and so on (a rough sketch of this idea is shown below).
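A minimal sketch of that "in parts" idea, assuming GNU split (for --filter) and curl are available; the host name, credentials, and paths are hypothetical placeholders. The archive is streamed, cut into 1 GB pieces, and each piece is piped straight to curl, so no full archive ever lands on the local disk:

# Hypothetical paths and credentials; requires GNU split and curl.
tar -cz /var/www \
  | split -b 1G -d --filter='curl -sS -T - "ftp://backup_user:[email protected]/backups/$FILE"' \
      - backup.tar.gz.part.

The $FILE variable is expanded by split itself (hence the single quotes around the filter). To restore, download the pieces and run: cat backup.tar.gz.part.* | tar -xz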
Easily.
Since there are a million ways to do this, plus another thousand and one, there is no point in listing them all.
If you clarify the question, namely what software you are using, then a concrete answer can be given.
Send the files directly: pipe the archive to curl and let curl write its input stream to a file on the remote password-protected FTP server.
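A minimal sketch of that approach (host, credentials, and paths are hypothetical): curl's "-T -" option uploads from stdin, so the archive is created on the fly and never stored on the main server.

tar -cz /var/www /etc/nginx \
  | curl -sS -T - "ftp://backup_user:[email protected]/backups/site-$(date +%F).tar.gz"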
Set up SSH and copy straight away via scp:
tar -cz <paths to archive> | scp /dev/stdin [email protected]:/backup-path
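Note that some scp builds refuse to copy a non-regular file such as /dev/stdin, so a more portable variant of the same streaming idea (assuming key-based SSH access to a hypothetical backup.example.com) is to pipe the archive into ssh and write it with cat on the remote side:

tar -cz /var/www /etc/nginx \
  | ssh [email protected] "cat > /backup-path/site-$(date +%F).tar.gz"

Either way the archive is produced on the fly and only ever stored on the backup host.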
It's possible, I'll allow it.
Given how general the question is, no other answer would make sense.
You have more than 20 GB of data and you push all of it over FTP?!
Cron + mysqldump + rsync = I always have a local copy of the Bitrix site on an internal Debian server, refreshed every night. You can experiment on it, try out solutions, and restore the main site from the database copy if something goes wrong. Why bother with "built-in tools" and other crutches? A rough sketch of such a nightly job is shown below.
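A minimal sketch of that cron + mysqldump + rsync approach; the database name, credentials, host, and paths are hypothetical placeholders, and it assumes key-based SSH access to the backup host:

#!/bin/sh
# nightly-backup.sh: rough sketch of a nightly dump-and-sync job.
set -eu

# Stream the database dump straight to the backup host over SSH,
# so no dump file needs to fit on the production disk.
# (In practice put the password in ~/.my.cnf rather than on the command line.)
mysqldump --single-transaction -u backup_user -p'secret' site_db \
  | gzip \
  | ssh [email protected] "cat > /backup/db/site_db-$(date +%F).sql.gz"

# Mirror the site files; rsync transfers only what changed since the last run.
rsync -az --delete /var/www/ [email protected]:/backup/files/

The script can then be scheduled from cron, for example:

0 3 * * * /usr/local/bin/nightly-backup.sh >> /var/log/nightly-backup.log 2>&1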