How to deploy multiple failover servers?
There is a web application that needs to be as fault-tolerant as possible. We plan to use multiple servers: two at first, possibly more later. If one server fails (say, a fire in its data center), the application should keep running on the second or third server in another data center.
What deployment options are there? I would rather not configure each server by hand (installing and configuring Nginx, Ruby, Rails, PostgreSQL). Is there a way to set up one server and clone it to the others?
I also need the PostgreSQL databases replicated (master-master, so that both servers can write to the database) and application files synchronized between servers. What are the best solutions for this?
Is it possible to keep all files synchronized between servers, so that updating the software on one server is enough for the changes to propagate to the rest?
Point by point:
Use configuration-management tools such as Puppet, Chef, or Ansible. That is, you write the scenario once with the chosen tool and then "replay" it on each target server.
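As a rough illustration, an Ansible playbook for such a scenario might look like the sketch below (the `webservers` group, package names, and template paths are hypothetical, not from the question):

```yaml
# Hypothetical playbook sketch: host group, packages, and paths are assumptions.
- hosts: webservers
  become: true
  tasks:
    - name: Install Nginx and the PostgreSQL client
      apt:
        name: [nginx, postgresql-client]
        state: present
    - name: Deploy the nginx config from a template
      template:
        src: nginx.conf.j2
        dest: /etc/nginx/nginx.conf
      notify: restart nginx
  handlers:
    - name: restart nginx
      service:
        name: nginx
        state: restarted
```

Running `ansible-playbook site.yml` against an inventory of two or three hosts brings them all to the same state, which answers the "set up one server and clone it" part of the question.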
I wouldn't do master-master, I strongly advise against it! It is better to run several databases and shard the data across them. For example, read about sharding here: ruhighload.com/index.php/2009/05/06/%D1%88%D0%B0%D...
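The core of sharding is a deterministic mapping from a record key to one of several databases. A minimal sketch, assuming hypothetical shard hostnames (real code would then open a connection with a driver such as psycopg2):

```python
# Minimal key-based sharding sketch.
# The shard host list is a made-up example, not from the question.
import hashlib

SHARDS = ["db1.example.com", "db2.example.com", "db3.example.com"]

def shard_for(user_id: str) -> str:
    """Pick a shard deterministically from the record key.

    Hashing (rather than plain modulo on a numeric id) keeps the
    distribution even when ids are not uniformly distributed.
    """
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]
```

Because the mapping is deterministic, every application server routes the same key to the same database without any coordination, which is what removes the need for master-master writes.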
Better still, some (if not all) of the data can be kept in key/value stores like memcached/Redis, or in document/object stores like Hadoop/MongoDB.
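The usual way a key/value store is wired into a web application is the cache-aside pattern: check the store first, fall back to the database on a miss. A sketch with a plain dict standing in for Redis/memcached (a real setup would use a client library such as redis-py):

```python
# Cache-aside sketch: a dict stands in for Redis/memcached.
cache = {}

def get_profile(user_id, load_from_db):
    """Return a user profile, hitting the database only on a cache miss.

    load_from_db is whatever function queries PostgreSQL; it is a
    placeholder here, not an API from the question.
    """
    key = f"profile:{user_id}"
    if key in cache:
        return cache[key]          # cache hit: no database round trip
    value = load_from_db(user_id)  # cache miss: load and remember
    cache[key] = value
    return value
```
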
For the files, the best option is a distributed file system such as GlusterFS, GPFS, CephFS, or Lustre. The simplest option is plain rsync, but it is slow, lags behind, and is inefficient.
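To make the rsync trade-off concrete: it is a periodic one-way copy of files that are missing or newer on the source, not a live shared file system, so the replicas always lag by one sync interval. The logic can be sketched as:

```python
# rsync-style one-way sync sketch: copy files that are missing on the
# destination or newer on the source. Illustration only; the real rsync
# also does delta transfer, deletion, and remote transport over SSH.
import os
import shutil

def sync_dir(src: str, dst: str) -> None:
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target = os.path.join(dst, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target, name)
            # Copy when the file is absent or the source copy is newer.
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                shutil.copy2(s, d)
```

A distributed file system like GlusterFS or CephFS removes that lag by making all servers read and write the same volume, at the cost of a more complex setup.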