linux
Troodi Larson, 2021-05-11 23:00:42

How to download a file from the server faster?

There is a Linux machine with no speed limits (according to the hoster). There is a 50 GB file, but it downloads over SFTP at only 5-12 megabytes per second. What could the limitation be? Is the hoster throttling the speed, or is something hitting a resource limit, even though the server itself is very powerful? At the same time, when I upload something to the server, it easily sustains 20 megabytes per second. Please tell me what to check.


2 answers
ky0, 2021-05-11

Besides the hoster, go call every intermediate Internet provider between the server and the place you are downloading the file to. What if somewhere along the way there is a link whose bandwidth is not guaranteed? </sarcasm>

rPman, 2021-05-12

If you are downloading to your home connection, remember that having a gigabit tariff does not guarantee that speed, especially over a single connection.
If all the other possible causes have been ruled out (remote disk speed, local disk speed, CPU) and the bottleneck really is one of the provider links, then I recommend using a torrent.
But the point is not simply to run a torrent client on both ends (console rtorrent, for example), but to run it on several servers at once, placed so that different network paths are used. It is hard to pick the right locations up front, but you can find them empirically by testing the speeds of different providers on the receiving side, since that is much cheaper (ask 2, 3 ... 5 friends in your city who are on different providers to open a link; almost anyone can temporarily spare 50 GB). The resulting speeds can literally add up.
Torrent technology delivers random pieces of the file to each node, so every client downloads its pieces and immediately serves them to the others (you can also play with traffic limits on the different connections, since the extra clients will fetch pieces not only from the original server but also from each other).
This advice is not just theory: I once nearly tripled the download speed of a large file this way, by downloading it in three places at once. Back then the speed would not rise above 3 MB/s (local providers throttled external traffic, but intra-city traffic was unlimited), and this pushed it almost to the then-current limit of 100 Mbps, about 11 MB/s.
P.S. Torrent is just an example of a ready-made technology; you could also write the multi-threaded download utility yourself, managing the connections on your own (see the sketch below).
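
To make that last point concrete, here is a minimal sketch of such a multi-connection downloader in Python. It assumes the file is also reachable over plain HTTP(S) (for example, served by nginx) and that the server honours Range requests; the URL, output filename and connection count are made up for illustration.

#!/usr/bin/env python3
# Minimal multi-connection downloader sketch (Python 3.8+).
# Assumption: the file is also served over HTTP(S) and the server
# supports Range requests; the URL and filenames here are hypothetical.
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://example.com/big.file"  # hypothetical HTTP mirror of the file
OUT = "big.file"
CONNECTIONS = 8                       # parallel TCP connections

def total_size(url):
    # HEAD request to learn the file size
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return int(resp.headers["Content-Length"])

def fetch_range(url, out, start, end):
    # Download bytes [start, end] and write them at the matching offset
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp, open(out, "r+b") as f:
        f.seek(start)
        while chunk := resp.read(1 << 20):
            f.write(chunk)

def main():
    size = total_size(URL)
    with open(OUT, "wb") as f:
        f.truncate(size)              # pre-allocate so workers can seek freely
    step = size // CONNECTIONS + 1
    ranges = [(i, min(i + step, size) - 1) for i in range(0, size, step)]
    with ThreadPoolExecutor(max_workers=CONNECTIONS) as pool:
        futures = [pool.submit(fetch_range, URL, OUT, s, e) for s, e in ranges]
        for fut in futures:
            fut.result()              # re-raise any download error

if __name__ == "__main__":
    main()

Ready-made tools do the same thing: aria2c -x 8 <url> opens several connections to one server, and aria2 can also pull the same file from several mirrors at once, which is close to the multi-server torrent trick described above.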
