How to stream data from MySQL/PostgreSQL?
There are several Go server applications that will run on different servers but work with the same data. I need the applications to learn immediately about changes in a number of tables, in particular the list of users. For example, one application bans a user, and another immediately finds out about it and drops his connection.
Everything runs on AWS. Initially, DynamoDB seemed like an ideal solution: the servers can be located in different parts of the world, it replicates everything to all regions by itself, so the data is always close, and with Streams the applications learn about changes. I had almost everything coded, but then I realized that reading out all the data (the list of users for the admin panel, for example) with the Scan operation is extremely expensive. Even if you dump all the data for synchronization only once a day, it is still costly: either you spend a lot of money and the entire provisioned throughput to get the data, or you pull out one row per hour.
Is there a cheap way around this? I have almost resigned myself to switching to MySQL/PostgreSQL, but is there any way to stream data from them in real time? People suggest Debezium + Kafka, but that looks like a solution with too much overhead.
What are your ideas on how the problem could be solved?
Thanks in advance.
In Postgres, this is done via LISTEN + NOTIFY
https://www.postgresql.org/docs/current/static/sql...
or a queue like
pgq
https://wiki.postgresql.org/wiki/PGQ_Tutorial
There are a number of extensions, bindings, and so on for this; an example of the LISTEN side in Go is sketched below.
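A minimal sketch of the listening side in Go, assuming the lib/pq driver; the channel name users_changed and the connection string are made up for illustration. The writing side would send to the channel with NOTIFY users_changed, '<payload>' or SELECT pg_notify('users_changed', '<payload>'), e.g. from an AFTER UPDATE trigger on the users table:

package main

import (
	"fmt"
	"log"
	"time"

	"github.com/lib/pq"
)

func main() {
	// Hypothetical connection string; replace with your own credentials.
	connStr := "postgres://user:pass@localhost/app?sslmode=disable"

	// The listener reconnects automatically if the connection drops.
	listener := pq.NewListener(connStr, 10*time.Second, time.Minute,
		func(ev pq.ListenerEventType, err error) {
			if err != nil {
				log.Println("listener event error:", err)
			}
		})

	// Subscribe to the channel that the trigger notifies.
	if err := listener.Listen("users_changed"); err != nil {
		log.Fatal(err)
	}

	for {
		select {
		case n := <-listener.Notify:
			if n == nil {
				continue // nil is delivered after a reconnect; notifications may have been missed
			}
			// n.Extra carries the payload, e.g. the id of the banned user.
			fmt.Println("got notification:", n.Channel, n.Extra)
		case <-time.After(90 * time.Second):
			// Periodically check that the connection is still alive.
			go listener.Ping()
		}
	}
}

Note that NOTIFY only tells you that something changed; after a reconnect (the nil case above) you still need a cheap query to resynchronize state.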
The Pub/Sub model is suitable for notifying subscribers. AWS SNS, for example.
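For comparison, a sketch of the SNS route with aws-sdk-go (v1); the topic ARN, region, and message format here are made up for illustration. The application that bans a user publishes to the topic, and each server subscribes to it (via an HTTP endpoint or its own SQS queue) and reacts to the event:

package main

import (
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sns"
)

func main() {
	sess := session.Must(session.NewSession(&aws.Config{
		Region: aws.String("us-east-1"), // hypothetical region
	}))
	svc := sns.New(sess)

	// Hypothetical topic ARN and payload.
	_, err := svc.Publish(&sns.PublishInput{
		TopicArn: aws.String("arn:aws:sns:us-east-1:123456789012:user-events"),
		Message:  aws.String(`{"event":"user_banned","user_id":42}`),
	})
	if err != nil {
		log.Fatal(err)
	}
}

This decouples the notification from the database entirely, which also works if you stay on DynamoDB and only need the "user was banned" signal, not a full data stream.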