Do I need to cache data in Redis from MongoDB?
Greetings. Here is the task: we have a Rooms collection in MongoDB. Each room stores the ids of its participants, but since Socket.IO does not let you open a connection with a specific ID of your choosing, I have to keep a user_id => socket_id mapping in Redis.
Every time a message is sent to a room, in order to notify the other participants I have to read the user ids from Rooms, look up each user's socket_id in Redis, and emit to that socket. In other words, I am constantly reading from MongoDB.
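To make the flow above concrete, here is a minimal sketch of it, assuming Node.js with Socket.IO, the official `mongodb` driver, and `ioredis`. The `rooms` collection shape, the `socket:<user_id>` key naming, and the `handshake.auth.userId` field are illustrative assumptions, not something fixed by the question.

```ts
import { MongoClient, ObjectId } from "mongodb";
import Redis from "ioredis";
import { Server } from "socket.io";

const mongo = new MongoClient("mongodb://localhost:27017");
await mongo.connect(); // top-level await: assumes an ES module
const redis = new Redis();
const io = new Server(3000);

interface Room {
  _id: ObjectId;
  participants: string[]; // user ids (assumed field name)
}

// Remember which socket belongs to which user as connections come and go.
io.on("connection", (socket) => {
  const userId = String(socket.handshake.auth.userId); // assumed auth payload
  void redis.set(`socket:${userId}`, socket.id);
  socket.on("disconnect", () => void redis.del(`socket:${userId}`));
});

// Current flow: every message hits MongoDB for the participant list,
// then Redis once per participant for the socket id.
async function notifyRoom(roomId: string, senderId: string, text: string) {
  const rooms = mongo.db("chat").collection<Room>("rooms");
  const room = await rooms.findOne({ _id: new ObjectId(roomId) });
  if (!room) return;

  for (const userId of room.participants) {
    if (userId === senderId) continue;
    const socketId = await redis.get(`socket:${userId}`);
    if (socketId) io.to(socketId).emit("message", { roomId, senderId, text });
  }
}
```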
I see the solution as follows: when a user writes to a room, we read that room's participant ids from MongoDB and put them into Redis (so as not to keep the room there forever, the entry should probably get a lifetime counted from the last message). On every subsequent message we check whether the room is already in Redis and, if so, work with the cached copy. Any change to the room's data is written to MongoDB first and then overwritten in Redis.
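A sketch of that cache-aside scheme, reusing the `mongo`, `redis`, and `Room` declarations from the previous snippet. The `room:<id>:members` key, the 30-minute TTL, and the `addParticipant` helper are assumptions chosen only to illustrate the read-on-miss and write-through steps.

```ts
const ROOM_TTL_SECONDS = 30 * 60; // assumed lifetime since the last message

async function getRoomMembers(roomId: string): Promise<string[]> {
  const key = `room:${roomId}:members`;
  const cached = await redis.smembers(key);
  if (cached.length > 0) {
    await redis.expire(key, ROOM_TTL_SECONDS); // refresh TTL on activity
    return cached;
  }
  // Cache miss: fall back to MongoDB and populate Redis.
  const room = await mongo
    .db("chat")
    .collection<Room>("rooms")
    .findOne({ _id: new ObjectId(roomId) });
  if (!room) return [];
  if (room.participants.length > 0) {
    await redis.sadd(key, ...room.participants);
    await redis.expire(key, ROOM_TTL_SECONDS);
  }
  return room.participants;
}

// Any change goes to MongoDB first, then the Redis copy is overwritten.
async function addParticipant(roomId: string, userId: string) {
  await mongo
    .db("chat")
    .collection<Room>("rooms")
    .updateOne({ _id: new ObjectId(roomId) }, { $addToSet: { participants: userId } });
  const key = `room:${roomId}:members`;
  if (await redis.exists(key)) {
    await redis.sadd(key, userId);
    await redis.expire(key, ROOM_TTL_SECONDS);
  }
}
```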
The question: which approach is more correct here, working with MongoDB alone, or adding Redis on top of it?
Hmm... why keep it in Redis at all? Keeping it directly in the process's own memory is more efficient, no? Well, that is beside the point.
Storing socket IDs in MongoDB (or in any other DBMS) is a bad idea, as you will see after the first restart of the system, dropped connections, and so on. Besides, you would have to hit the database for every little thing, and that is relatively slow compared with a key => value store in memory, let alone a plain map inside the process.
Caching rooms in Redis is a good idea, though: hot data should live in memory and access time should be minimized. That said, I suspect you will not see loads heavy enough for this to hurt, so solve problems as they actually arise.
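For completeness, a minimal sketch of the in-memory variant the answer hints at, assuming a single Node.js process; with several Socket.IO instances you would still need a shared store (Redis, or the Socket.IO Redis adapter). The `socketsByUser` map and the `emitToUser` helper are illustrative names.

```ts
import { Server } from "socket.io";

const io = new Server(3000);
const socketsByUser = new Map<string, string>(); // user_id -> socket_id

io.on("connection", (socket) => {
  const userId = String(socket.handshake.auth.userId); // assumed auth payload
  socketsByUser.set(userId, socket.id);
  socket.on("disconnect", () => {
    // Entries vanish with the process anyway, but clean up on disconnect
    // so stale socket ids are not kept around.
    if (socketsByUser.get(userId) === socket.id) socketsByUser.delete(userId);
  });
});

// Lookup is a synchronous Map access instead of a round trip to Redis.
function emitToUser(userId: string, event: string, payload: unknown) {
  const socketId = socketsByUser.get(userId);
  if (socketId) io.to(socketId).emit(event, payload);
}
```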