What is the correct way to copy 10TB of data from one server to another?
Hello.
There are two Linux servers. 10TB of data needs to be transferred from one to the other, in the form of folders and files.
What tools would be best to do this?
My main concern is connection drops: the transfer should be able to resume correctly afterwards.
Thank you.
Apache+Wget :)
If the connection is broken, wget will resume the download ;)
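Something like this, assuming the tree is published at http://server1/data/ (the URL is a placeholder; any static HTTP server works on the sending side):

  # On server #2: mirror the tree; -c resumes partially downloaded files,
  # -r recurses, -np stops it from climbing above /data/
  wget -c -r -np -nH --cut-dirs=1 http://server1/data/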
There is a longer way, provided that the servers are on the same network:
On server #1, share the folder via Samba.
On server #2, mount the folder from server #1 (see the sketch after these steps).
Copy what you need.
Unmount the folder.
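If you go this route, the mount step on server #2 might look like this (share name, mount point, and credentials are placeholders; requires cifs-utils):

  mkdir -p /mnt/data
  mount -t cifs //server1/data /mnt/data -o username=copyuser,password=secret
  cp -a /mnt/data/. /local/data/   # copy, preserving attributes
  umount /mnt/data

Note that plain cp will not resume after a drop; running rsync against the mounted path instead would.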
It's not clear how far apart the servers are. If they are on different continents, you can use something like UFTP (a UDP-based file transfer protocol), because TCP throughput drops badly at high latency.
If they are close by, plain FTP or rsync is the simplest option.
rsync is useful in that it will repair files that were already transferred but got corrupted for some reason. If the servers are on the same network, it's better to run the rsync daemon: in SSH mode, CPU load increases because the traffic is encrypted.
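A typical run, assuming the data lives in /data on both sides (hostnames and paths are placeholders); --partial keeps half-transferred files, so rerunning the same command after a drop picks up where it left off:

  # Over SSH (encrypted, higher CPU load):
  rsync -a --partial --progress /data/ user@server2:/data/

  # Against an rsync daemon on a trusted LAN (no encryption overhead;
  # assumes server2 has an rsyncd module named "data" configured):
  rsync -a --partial --progress /data/ rsync://server2/data/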
FTP on a local network is one of the fastest protocols at hand. But I'm not sure that FTP console clients handle resume and batch mode. You could try Midnight Commander (it supports FTP), but I'm not sure it will give you maximum speed.
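For what it's worth, lftp does handle both: its mirror command runs in batch mode, and -c continues a partially transferred tree. A sketch, with host, credentials, and paths as placeholders:

  lftp -u user,pass -e "mirror -c /remote/data /local/data; quit" ftp://server1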
And of course, start the copy inside a screen or tmux session so that it isn't interrupted when the terminal disconnects.
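For example:

  tmux new -s copy                                 # start a named session
  rsync -a --partial /data/ user@server2:/data/    # run the transfer inside it
  # detach with Ctrl-b d; the transfer keeps running
  tmux attach -t copy                              # reattach later to check on it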