How to organize a backup of a 150 GB file to a remote server?
On the disk there is a file container created with TrueCrypt, about 150 GB in size; it's where I keep all my dirtiest secrets. Once a week, on the night from Friday to Saturday, this file is backed up to another HDD. I have been thinking for a long time about how to implement a backup to a remote server or to the cloud, but I have not yet settled on a final solution.
So far I have a few rough ideas:
1) As I understand it, a file of this size cannot be uploaded to any cloud in one piece (correct me if I'm wrong). You could split the file locally into 4 GB pieces with WinRAR and then upload them to the cloud on a schedule using dedicated software. The problem is: how do I automate the splitting without my involvement and then copy these fragments to the cloud? (A rough sketch of what I have in mind is at the end of the question.)
2) Stop storing the encrypted container on the PC altogether.
Say this 150 GB file container lives in the cloud, and I can mount it in TrueCrypt just as if it were on a local disk, as it is now. Can this be implemented? Or am I a dreamer?
3) Drop TrueCrypt and use the cryptographic protection offered by the cloud services themselves. To be honest, I don't trust them, just as I don't trust cloud CRMs.
I would welcome advice from those who have solved similar problems. How else could one do encryption + backup of 150 GB of data?
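For idea 1, here is a rough sketch of the kind of unattended splitting I have in mind, in Python (the paths and chunk size are placeholders; a real setup would run this from the Task Scheduler before the upload step):

```python
# split_container.py - naive sketch: cut the container into 4 GiB
# pieces so each one fits under typical cloud upload limits.
# All paths are placeholders.
import os

SRC = r"D:\secrets\container.tc"   # the TrueCrypt container
DST_DIR = r"D:\backup_chunks"      # staging folder for the upload tool
CHUNK = 4 * 1024**3                # 4 GiB per piece
BUF = 8 * 1024**2                  # copy in 8 MiB buffers to keep RAM low

os.makedirs(DST_DIR, exist_ok=True)
with open(SRC, "rb") as src:
    index = 0
    while True:
        written = 0
        part_path = os.path.join(DST_DIR, f"container.tc.{index:03d}")
        with open(part_path, "wb") as part:
            while written < CHUNK:
                buf = src.read(min(BUF, CHUNK - written))
                if not buf:
                    break
                part.write(buf)
                written += len(buf)
        if written == 0:
            os.remove(part_path)  # nothing left to write, drop empty file
            break
        index += 1
```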
If the question is purely about how to regularly move such a large amount of data, then you need a protocol that works with blocks rather than whole files.
At a minimum, btsync: not ideal, but at least it does not have to send the whole file every time.
Ideally you want a mechanism built on deduplication, for example BranchCache.
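To illustrate the block idea (a toy sketch, not how btsync actually works; the file names are made up): hash the container in fixed-size blocks, compare with the hashes from the previous run, and only the changed blocks need to go over the wire.

```python
# block_diff.py - toy illustration of block-level sync: only blocks of
# the container that changed since the last run need to be re-uploaded.
import hashlib
import json
import os

CONTAINER = "container.tc"          # placeholder path
STATE = "container.tc.hashes.json"  # block hashes from the previous run
BLOCK = 4 * 1024**2                 # 4 MiB blocks

def block_hashes(path):
    hashes = []
    with open(path, "rb") as f:
        while block := f.read(BLOCK):
            hashes.append(hashlib.sha256(block).hexdigest())
    return hashes

old = json.load(open(STATE)) if os.path.exists(STATE) else []
new = block_hashes(CONTAINER)

# These indices are the only byte ranges that would have to be sent.
changed = [i for i, h in enumerate(new) if i >= len(old) or old[i] != h]
print(f"{len(changed)} of {len(new)} blocks changed")

with open(STATE, "w") as f:
    json.dump(new, f)
```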
In general, point by point.
1) Uploading such a volume to the cloud is somewhere between problematic and impossible, and splitting it manually is slow and unreliable.
2) A container stored in the cloud cannot be mounted. That is, it is theoretically possible, but in practice such a solution will not work.
3) You are right not to trust them.
That said, there are services that do encryption competently, for example https://mega.co.nz : there the encryption happens on the client side, and the cloud receives already-encrypted data.
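The client-side principle in miniature (a generic sketch using the cryptography package, not mega's actual scheme; key handling is deliberately oversimplified): the file is encrypted locally, and only ciphertext ever leaves the machine.

```python
# encrypt_then_upload.py - client-side encryption in miniature: the
# cloud only ever sees ciphertext. Requires the 'cryptography' package.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)    # in reality, derive and store this key properly
nonce = os.urandom(16)  # must be unique per encryption run

encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
with open("container.tc", "rb") as src, open("container.enc", "wb") as dst:
    dst.write(nonce)  # prepend the nonce so decryption can recover it
    while chunk := src.read(8 * 1024**2):  # stream in 8 MiB chunks
        dst.write(encryptor.update(chunk))
    dst.write(encryptor.finalize())
# container.enc is what gets uploaded; the key never leaves the client.
```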
You can use the FTP protocol. Rent a remote server with the required HDD capacity, set up an FTP server on it, and upload the container there.
There are plenty of programs that upload a file to FTP on a schedule, and there are no file-size restrictions; the only problems are errors if the Internet connection drops, or limits imposed by the server itself.
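A minimal sketch of the upload step with Python's standard ftplib (host, credentials, and paths are placeholders; a scheduler would run this weekly):

```python
# ftp_upload.py - minimal upload sketch using only the standard library.
# Host, credentials, and paths are placeholders.
from ftplib import FTP  # note: plain FTP sends credentials and data in
                        # cleartext; ftplib.FTP_TLS is the safer variant

HOST = "backup.example.com"
USER = "backup"
PASSWORD = "secret"
LOCAL = "container.tc"
REMOTE = "container.tc"

with FTP(HOST) as ftp:
    ftp.login(USER, PASSWORD)
    with open(LOCAL, "rb") as f:
        # storbinary streams the file block by block, so a 150 GB
        # file never has to fit in memory.
        ftp.storbinary(f"STOR {REMOTE}", f, blocksize=1024 * 1024)
```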
Why exactly 150 GB?
Isn't that too much? How much space is actually used? How much of it is taken up by the "dirty secrets", and how much is just personal photos that are rarely updated and make no sense to keep in the cloud?
If there is actually not that much data, you can recreate the container as a dynamic one, which takes up only as much space on disk as there is data and grows automatically as you add more.