Data synchronization
vovansystems, 2013-03-02 18:55:24

What is the best way to synchronize a large number of files in real time?

There are two servers (Debian) in different data centers, and load balancing will be configured between them. Directories containing thousands of small files need to be synchronized in real time. Roughly a megabyte of data (about 10 files) is added per minute. What is the best tool for this?
What are some good options? I have looked at NFS, an rsync daemon, and iSCSI, but each has its own drawbacks...


3 answers
Ilya Evseev, 2013-03-02
@IlyaEvseev

habrahabr.ru/post/132098/ - found via a Google search for "csync2 inotify".
Pay attention not only to the article itself but also to the comments on it.
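A minimal sketch of the idea behind that approach, assuming csync2 is already installed and configured on both nodes and that the Python watchdog package is available (both the paths and the use of watchdog are my assumptions, not part of the linked article): watch the directory for changes and trigger a csync2 run whenever files change.

# Sketch: trigger csync2 whenever the watched directory changes.
# Assumes csync2 is configured and `pip install watchdog` has been done.
import subprocess
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

WATCH_DIR = "/var/www/shared"   # hypothetical path, adjust to your layout

class SyncHandler(FileSystemEventHandler):
    def on_any_event(self, event):
        if event.is_directory:
            return
        # Push local changes to the other node; -x runs a check-and-sync cycle.
        subprocess.run(["csync2", "-x"], check=False)

observer = Observer()
observer.schedule(SyncHandler(), WATCH_DIR, recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)
finally:
    observer.stop()
    observer.join()

In practice you would also want to debounce bursts of events so that a flood of new files does not launch one csync2 run per file; the article and its comments discuss variations of this.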

shadowalone, 2013-03-03
@shadowalone

Take a look at GlusterFS.

miragenn, 2013-03-03
@miragenn

Are you serving these files over HTTP? I had the same task. I reviewed a lot of options before settling on this:
There is a main node; all writes go to it, and it periodically rsyncs to the second node.
On both nodes nginx serves these files, but on the second node, if a file is not found, the request is proxied to the main node. Organize the files in such a way that each rsync run covers a smaller volume.
Rsync works faster if you run it in several threads; here is a small example.
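The original "small example" link is not preserved here. As an illustration of the parallel-rsync idea only, here is a minimal sketch, assuming the top-level subdirectories can be synced independently; the paths and destination host are hypothetical.

# Sketch of "rsync in several threads": sync each top-level subdirectory
# in its own worker so transfers overlap instead of running one big job.
import os
import subprocess
from concurrent.futures import ThreadPoolExecutor

SRC_ROOT = "/var/www/files"                 # hypothetical source directory
DEST = "backup-node:/var/www/files"         # hypothetical rsync-over-ssh target

def sync_subdir(name):
    # -a preserves attributes, --delete keeps the replica exact.
    subprocess.run(
        ["rsync", "-a", "--delete",
         os.path.join(SRC_ROOT, name) + "/",
         f"{DEST}/{name}/"],
        check=False,
    )

subdirs = [d for d in os.listdir(SRC_ROOT)
           if os.path.isdir(os.path.join(SRC_ROOT, d))]

with ThreadPoolExecutor(max_workers=4) as pool:
    pool.map(sync_subdir, subdirs)

The worker count is a tuning knob: a handful of parallel transfers usually hides per-file latency over thousands of small files without saturating the link.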
