Data transfer
SimpleQ, 2019-02-16 11:41:00

What is the best way to transfer a file > 100 GB client-server?

Periodically we need to receive a large file (> 100 GB) from colleagues for further processing. Requirements:
1) Resuming the transfer when the connection is broken.
2) Open-source software.
3) High download speed.
Anything can be deployed on the server; I lean towards FreeBSD.
P.S.: There is a public ("white") external IP. The channel is 100 Mbps for now, but we will increase it over time.


5 answer(s)
rPman, 2019-02-16
@rPman

100+ gigabytes looks like a self-inflicted problem (even with an ideal provider you can't keep pulling tricks like that for long). Is it perhaps an archive packed into one file? And inside, I suspect, 99% of the files were already transferred earlier.
The correct solution is not to pack the files into an archive, but to keep them unpacked, ideally at the source (that is, where the data originates), and then use rsync (install an rsync server on the source side): it compares the files by hashes, copies only the needed ones, deletes the old ones, and does this as efficiently as possible.
P.S. If your large files do change, though not significantly, then from rsync's point of view each is still a new file. In that case store two copies on the source, one from the last transfer and the current one, and use diff (there are various utilities; historically diff was text-only, but now there are universal and specialized binary-diff tools) to generate a patch. The new version can then be reconstructed from the old copy plus a small patch file, so only the patch needs to be transferred.
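To serve files this way, the source side needs an rsync daemon. A minimal configuration sketch (the module name and path below are made up for illustration):

```ini
# /etc/rsyncd.conf — minimal read-only module sketch
[bigdata]
    path = /srv/bigdata
    read only = yes
    # clients then pull with: rsync -a rsync://source-host/bigdata/ /local/dest/
```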
--------------
Tell me, what kind of files are these? What's in them? A recurring 100 GB looks unusual. Do you have a mini collider?

Dim Boy, 2019-02-16
@twix007

rsync solution #1

VoidVolker, 2019-02-16
@VoidVolker

  • Deploy FreeNAS on the server and use any file service it offers (from Samba and WebDAV to cloud solutions like Seafile).
  • Torrent. Create the distribution locally, specifying the server's ip:port as a web seed. Send the torrent file to the client, and the client downloads from it.

Evgeny Petryaev, 2019-02-16
@Gremlin92

Install Windows and set up FileZilla Server: the settings are trivial (with a bit of fiddling you can even enable encryption); there is no program like it under Linux. Then download any FTP client, for example FileZilla itself.

Vladislav, 2019-02-17
@ghostiam

I personally transferred > 100 GB of data to a server via rsync just a few days ago.
To resume after a broken connection, rsync has the flags "--partial --append-verify".
