SSH
IRT, 2017-10-30 10:32:09

Reliable data transfer over an unstable connection?

There is a MacBook with a slightly buggy Wi-Fi adapter: after 5-10 GB transferred over Samba or SSHFS, the connection drops. If the transfer is restarted immediately, it runs fine again.
The question: how do you transfer many gigabytes quickly over unstable, constantly dropping Wi-Fi? There is rsync, but it has no GUI and you end up cobbling together scripts like:

#!/bin/bash

# Keep rerunning rsync until it exits cleanly; --partial keeps partially
# transferred files so a retry resumes instead of starting from scratch.
while true
do
    rsync -avz --partial source dest
    if [ "$?" -eq 0 ] ; then
        echo "rsync completed normally"
        exit 0
    else
        echo "rsync failure, backing off and retrying..."
        sleep 180
    fi
done
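
A variant of the same loop with flags that should help on a flaky link (just a sketch, assuming a reasonably recent rsync 3.x; "source" and "dest" are placeholders): --partial-dir keeps interrupted files in a scratch directory, --timeout makes a stalled connection fail fast instead of hanging, and a final --checksum pass re-reads and re-copies anything that does not match.

#!/bin/bash

# Retry until rsync exits cleanly; a stalled socket now fails after 30 s
# instead of hanging the whole loop.
while ! rsync -avz --partial-dir=.rsync-partial --timeout=30 source dest
do
    echo "rsync interrupted, retrying in 30 s..."
    sleep 30
done

# Verification pass: -c forces checksum comparison, so anything that does
# not match on the destination gets copied again.
rsync -avc source dest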

Over SSHFS with Midnight Commander (launched inside screen on the MacBook) it goes like this: copying starts, after a while there is an error, you click "Continue"... segmentation fault, mc crashes.
How do you copy hundreds of gigabytes over the network and be confident that everything arrived bit for bit? Checking MD5 hashes for every copied file is not an option. What I picture is some kind of reliable network file system: data is sent in chunks of, say, 64 MB, each chunk's hash is checked automatically after transmission, and a chunk gets about 10 retransmission attempts before the transfer finally gives up (roughly along the lines of the sketch below). The main thing is that while the file system is retrying a chunk, Midnight Commander and other programs should not see an I/O error and should not crash with a segmentation fault.
For the record, the problem is not the server: over a cable everything transfers stably.
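
Roughly what the chunk-and-verify idea could look like as a shell sketch (an illustration only, not the file system itself; the chunk size, retry count, and the dd/stat/shasum invocations are assumptions, and re-reading a freshly written chunk through an SSHFS/SMB mount may be served from the local cache rather than the remote side):

#!/bin/bash
# Sketch: copy "$1" to "$2" (e.g. a path on an SSHFS or SMB mount) in 64 MB
# chunks, hash each chunk after writing it, retry a bad chunk up to 10 times.
# Assumes the destination file does not already exist (no resume logic here).

SRC="$1"
DST="$2"
CHUNK=$((64 * 1024 * 1024))   # 64 MB chunk size
RETRIES=10                    # attempts per chunk before giving up
BACKOFF=5                     # seconds between attempts

# File size: GNU stat first, BSD/macOS stat as a fallback.
size=$(stat -c%s "$SRC" 2>/dev/null || stat -f%z "$SRC")
chunks=$(( (size + CHUNK - 1) / CHUNK ))

for (( i = 0; i < chunks; i++ )); do
    # Hash of the source chunk we are about to send.
    want=$(dd if="$SRC" bs="$CHUNK" skip="$i" count=1 2>/dev/null | shasum -a 256 | cut -d' ' -f1)
    ok=0
    for (( attempt = 1; attempt <= RETRIES; attempt++ )); do
        # Write the chunk into place without truncating the rest of the file.
        dd if="$SRC" of="$DST" bs="$CHUNK" skip="$i" seek="$i" count=1 conv=notrunc 2>/dev/null
        # Read it back from the destination and compare hashes.
        got=$(dd if="$DST" bs="$CHUNK" skip="$i" count=1 2>/dev/null | shasum -a 256 | cut -d' ' -f1)
        if [ "$got" = "$want" ]; then
            ok=1
            break
        fi
        echo "chunk $i: hash mismatch or I/O error, attempt $attempt of $RETRIES" >&2
        sleep "$BACKOFF"
    done
    if [ "$ok" -ne 1 ]; then
        echo "chunk $i failed after $RETRIES attempts, giving up" >&2
        exit 1
    fi
done
echo "all $chunks chunks written and verified"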
