linux
kiranananda, 2019-01-18 15:50:12

Linux: replicating data across multiple servers?

Good afternoon.
We have a Docker Swarm cluster made up of several servers, and the web application can run on any of them. The data on disk needs to be shared so that every running node sees the same files. At the moment lsyncd is configured on the master, which distributes updates to all the other servers, each of which runs rsync. NFS is not an option here: it occasionally freezes, and file access is noticeably slower than it would be locally. Fault tolerance also matters: if one of the nodes goes down, even the master, everything should keep working, whereas lsyncd, for example, stops working correctly as soon as one of the servers disappears. As for cluster file systems, I don't know how they would behave here either, in particular how fast file access would be; the site there is not small. I have little experience with this, but it needed to be up and running yesterday :) ...
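For reference, the lsyncd-on-master setup described above might look roughly like the following sketch (paths, hostnames, and timing values are hypothetical placeholders, and this assumes the stock `default.rsync` mode; one `sync{}` block per replica node):

```lua
-- Hypothetical lsyncd config on the master (paths/hosts are placeholders).
settings {
    logfile    = "/var/log/lsyncd/lsyncd.log",
    statusFile = "/var/log/lsyncd/lsyncd.status",
}

-- lsyncd watches the source directory via inotify and pushes changes
-- to the replica with rsync after a short aggregation delay.
sync {
    default.rsync,
    source = "/var/www/site",
    target = "node2:/var/www/site",
    delay  = 1,
    rsync  = {
        archive  = true,
        compress = true,
    }
}
```

A setup like this is one-directional (master to replicas), which is why losing the master breaks the whole scheme, as the question notes.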


1 answer
pfg21, 2019-01-18
@kiranananda

Syncthing (free, open source) and Resilio Sync (formerly BitTorrent Sync; commercial, with a free limited version) are peer-to-peer file synchronization tools.
Both support inotify, both run as daemons, both are lightweight native binaries built for almost every OS.
Both projects also provide upstream package repositories.
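To illustrate the Syncthing route: a shared folder is declared per node in Syncthing's `config.xml` (normally done through the web GUI rather than by hand). A minimal sketch of what the relevant fragment looks like, with a placeholder path and an obviously fake device ID standing in for the real peer:

```xml
<!-- Sketch of a shared-folder entry in Syncthing's config.xml.
     Path and device ID below are hypothetical placeholders. -->
<folder id="www" label="www" path="/var/www/site" type="sendreceive"
        rescanIntervalS="3600" fsWatcherEnabled="true">
    <!-- Each peer node that should receive this folder is listed here
         by its device ID (shown in the Syncthing GUI on that node). -->
    <device id="DEVICE-ID-OF-NODE-2"></device>
</folder>
```

Because every node is an equal peer, there is no single master: if one server drops out, the remaining nodes keep synchronizing among themselves, which addresses the fault-tolerance concern in the question.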
