Efficient data exchange between three websites?
Good afternoon,
I inherited three Ruby on Rails web services that I am gradually improving. There is one architectural problem I have kept avoiding and refusing to deal with, but the time has come ...
So, the PROBLEM:
All three applications share a single database (i.e. each of them reads from and writes to it directly). This approach has its drawbacks:
- Models have to be copied into every application (along with their tests)
- Again, you have to check very carefully that all data validations behave identically everywhere, so that nothing terrible happens
- Migrations are a problem: at the very least, every application has to be restarted after a migration is deployed (so that they all pick up the new schema)
- It is hard to guarantee data security when all three applications have the same level of access to the database. If only one app had direct access and served the data to the others, I would sleep better.
I would like to make one application the main one - so that only one of them has access to the database.
Options I considered but rejected:
- Database replication - each application not only reads but also writes to the database, so I would need two-way replication between three databases. That sounds scary; I don't want to go there.
- Building out a JSON API - we already have an API in the "main" application, which serves data to the mobile apps. But it would have to be expanded significantly (the mobile apps' functionality is extremely limited, and they do not consume all the data the other sites need), and we would also have to write the client code that reads this data (roughly the sketch below, for every resource) - a lot of work for our small team, in my opinion.
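To give a feel for the scale of that client code, here is a minimal sketch of the kind of reader each consuming app would need per resource (the host, path, and token handling are hypothetical placeholders, not our real API):

```ruby
# Hypothetical read client a consumer app would need for ONE resource;
# the host, endpoint, and token are placeholders.
require "net/http"
require "json"
require "uri"

def fetch_orders(api_token)
  uri = URI("https://main-app.example.com/api/v1/orders")
  request = Net::HTTP::Get.new(uri)
  request["Authorization"] = "Bearer #{api_token}"

  response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
    http.request(request)
  end
  JSON.parse(response.body)
end
```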
What matters most to me:
- Maximum data security (preferably an encrypted channel)
- Very few resources at my disposal (3 people, one of them is a tester) - therefore, maximum ease of implementation
- Possibility of gradual introduction - we are unlikely to be able to immediately transfer all applications to a new architecture
- Fast reads (so that nothing has to be cached on the consuming apps) - ideally a permanently open connection to the main server. (Writes to the database can be eventually consistent.)
Options I'm considering:
- Event Sourcing - we already use Redis, and it could handle the write path. Alternatively, we could keep reading directly from the database but route all writes through Event Sourcing (a rough sketch follows this list).
- WebSockets - the option that seems to me the most suitable now
- ActiveRecord::Live - I have never even tried it, so I still don’t understand what it is.
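To make the Event Sourcing option concrete, here is a rough sketch of what I have in mind, assuming the redis gem; the queue name and event shape are made up, not a final design:

```ruby
# Rough sketch: side apps publish write events instead of touching the DB;
# the "events" list name and event fields are illustrative only.
require "redis"
require "json"

redis = Redis.new

# In a side application: publish the intended change as an event.
event = { type: "user.updated", id: 42, attrs: { name: "Alice" } }
redis.lpush("events", event.to_json)

# In the main application: a worker is the only process that writes to the DB.
loop do
  _list, payload = redis.brpop("events")
  event = JSON.parse(payload)
  # apply the change here, e.g. User.find(event["id"]).update!(event["attrs"])
end
```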
I would love to hear your opinion on how to solve such a problem.
Basically, there are two options:
- If all three applications are this closely intertwined, then accept that they are really one application, and merge everything into a single codebase.
- If the applications can be split apart, move the independent parts into separate databases and turn the small shared part into a standalone service.
Perhaps this suggestion is off the mark, but why not do it with engines?
Keep the models in the main scope, spread the controllers across several engines, and sort everything out via routing (a sketch below).
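Something like this, as a minimal sketch (the engine and mount names are made up):

```ruby
# Each site becomes an engine mounted into one host app; the models stay in
# the host's main scope, so there is one schema and one set of tests.

# site_a/lib/site_a/engine.rb
module SiteA
  class Engine < ::Rails::Engine
    isolate_namespace SiteA
  end
end

# host app: config/routes.rb
Rails.application.routes.draw do
  mount SiteA::Engine, at: "/a"
  mount SiteB::Engine, at: "/b"
end
```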
- Remove the code duplication.
- Make it a single multi-site application - one shared database eliminates the need to exchange data between sites = less code = fewer errors.
- Replication is better set up for what it is meant for: keeping the data in the database safe.
> WebSockets - the option that seems to me the most suitable now

Overkill.

> Event Sourcing

If we are talking about https://developer.mozilla.org/en-US/docs/Web/API/E... then it is best suited to a hand-rolled form of asynchronous front-end-to-back-end interaction.

> It is difficult to guarantee data security when all three applications have the same access to the database. If one app had access to the database and distributed it to others, I would have slept better.

And you get a single point of failure for all three sites, with the same level of security.

> Maximum data security (preferably an encrypted channel)

No one is going to wedge themselves in between your sites; they will take root in one of them and get access to everything they need.
Our company has the same task, with the difference that we have not three but two web services, written in different languages on different platforms.
My vision: migrate component by component to a central JSON API web service that alone has access to the DBMS. The API must require authentication (JSON Web Token or OAuth 2) and be accessible only over HTTPS with HSTS (a rough sketch of the token check is below).
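For illustration, a minimal sketch of the JWT check on the central API, assuming the jwt gem (the claim name and secret source are illustrative):

```ruby
# Base controller for the central API: every request must carry a valid
# JWT in the Authorization header. Claim names here are illustrative.
require "jwt"

class Api::BaseController < ActionController::API
  before_action :authenticate!

  private

  def authenticate!
    token = request.headers["Authorization"].to_s.split(" ").last
    payload, _header = JWT.decode(token, Rails.application.secret_key_base,
                                  true, algorithm: "HS256")
    @current_service = payload["service"] # which consumer app is calling
  rescue JWT::DecodeError
    head :unauthorized
  end
end
```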
All domain logic should live in one place; this ensures correct behavior and saves development effort.
Correctness is guaranteed by covering that code with tests. Development becomes easier, and the logic no longer has to be dragged around between applications. From there you can gradually move towards splitting it into smaller services, and it becomes easier to delimit each developer's area of responsibility.
If there are shared tasks, especially long-running ones, use message queues backed by software such as Beanstalk or RabbitMQ, or use a third-party API. Queues give you scalability and fault tolerance and decouple the services and the technologies they use (a sketch below).
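For example, with RabbitMQ and the bunny gem it could look roughly like this (the queue name and payload are made up):

```ruby
# Producer and consumer for a long-running task via RabbitMQ (bunny gem).
# In practice these would live in different services/processes.
require "bunny"
require "json"

connection = Bunny.new # amqp://guest:guest@localhost by default
connection.start

channel = connection.create_channel
queue = channel.queue("reports", durable: true)

# Producer: enqueue the job and move on.
queue.publish({ report_id: 123 }.to_json, persistent: true)

# Consumer: a worker in the owning service processes jobs as they arrive.
queue.subscribe(block: true) do |_delivery_info, _properties, body|
  job = JSON.parse(body)
  # build the report for job["report_id"] here
end
```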
Fast reads are achieved through correct application architecture and by caching data on the API server, informed by the typical usage scenarios. All sorts of solutions like memcached, Aerospike, etc. fit in here (a sketch below).
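As a sketch, read-side caching on the API server can be as simple as Rails' built-in cache (the key, TTL, and query below are illustrative):

```ruby
# Cache a hot read on the API server so consumer apps never need their own
# caches. The key, expiry, and query are illustrative.
class Api::ProductsController < Api::BaseController
  def index
    products = Rails.cache.fetch("products/index", expires_in: 5.minutes) do
      Product.order(:name).limit(100).as_json
    end
    render json: products
  end
end
```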
I agree with sim3x that WebSockets are overkill for this purpose.