Node.js
Al, 2016-07-29 10:45:43

How do I use socket.io to send a message to a specific client when the application uses cluster and runs as several processes on different ports?

The application starts in cluster mode, each worker sets up its own socket server, and we use the Redis adapter:

app.set('port', httpPort);
let server = http.createServer(app);
let io = require('./socketServer')(server);
io.adapter(redis({host: host, port: port}));
app.set('io', io);

Here we plug in the main socket.io module (socketServer), where socket authorization takes place. In its connection handler we read the session id into the socketID variable and store the current client in the io.clients array:
io.sockets.on('connection', (socket) => {
    var socketID = socket.handshake.user.sid;
    io.clients[socketID] = socket;
    io.clients[socketID].broadcast.emit('loggedIn', socket.handshake.user.data);

    socket.on('disconnect', () => {
        delete io.clients[socketID];
    });
});

In front of all of this sits nginx with an upstream configured for "sticky sessions" (as described here: socket.io/docs/using-multiple-nodes/#nginx-configu...).
Then, when we want to send a message to a specific client, the controller looks up the user's session id by their user id (we store this mapping in Redis during authorization) and sends the message like this:
this.redis.getByMask(`sid_clients:*`, (err, rdbData) => {
    Async.each(clients, (client, next) => {
        let sid = `sid_clients:${client}`;
        let currentClient = rdbData[sid];
        // skip users that have no live socket in this process
        if (!currentClient || !this.io.clients[currentClient]) return next();
        this.io.clients[currentClient].emit(event, data);
        return next();
    });
});

Everything works fine when the application runs in a single process. In cluster mode, all clients on all processes do receive the 'loggedIn' broadcast on connection, but when a message is sent from one process to a client that connected on another process, it never arrives: each process has its own io.clients array, their contents always differ, and so the message does not reach the target client.
So, what is the correct way to send a message to a specific client in cluster mode? How can all connected sockets be kept in one place (maybe in Redis) to avoid situations like mine? I suspect I have over-complicated things and do not really understand how io.adapter(redis(...)) works.
So far the only idea that comes to mind is this: when we want to emit from one worker to a client whose socket lives on another worker, we first send a message to the master process (with all the data we want to deliver); the master looks up, for example via Redis, whether that client exists and which worker it is on; and then the master forwards the data to the right worker, which catches the message in process.on('message') and emits it to the right client.
But what happens if the application is clustered across several physical machines (servers)? Everything breaks again. In general I do not like this idea, but I would still like advice from people who know better!
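Roughly, the master-forwarding idea would look something like the sketch below (a minimal sketch only: the clients map and the emitToClient helper are hypothetical stand-ins for the io.clients bookkeeping above, and it still only works within a single machine):

const cluster = require('cluster');
const http = require('http');

if (cluster.isMaster) {
    // Master: fork workers and relay "remote emit" requests between them.
    const workers = [];
    for (let i = 0; i < 2; i++) {
        const worker = cluster.fork();
        workers.push(worker);
        worker.on('message', (msg) => {
            if (msg && msg.type === 'remote-emit') {
                // The master does not know which worker owns the socket,
                // so it forwards to every other worker; only the owner emits.
                workers.forEach((w) => { if (w !== worker) w.send(msg); });
            }
        });
    }
} else {
    // Worker: the same kind of setup as in the question.
    const server = http.createServer().listen(0);
    const io = require('socket.io')(server);
    const clients = {}; // sid -> socket, like io.clients above

    io.sockets.on('connection', (socket) => {
        const sid = socket.handshake.user.sid; // set by the same authorization as above
        clients[sid] = socket;
        socket.on('disconnect', () => { delete clients[sid]; });
    });

    // Hypothetical helper: emit locally if this worker owns the socket,
    // otherwise ask the master to relay the request to the other workers.
    function emitToClient(sid, event, data) {
        if (clients[sid]) return clients[sid].emit(event, data);
        process.send({ type: 'remote-emit', sid: sid, event: event, data: data });
    }

    process.on('message', (msg) => {
        if (msg.type === 'remote-emit' && clients[msg.sid]) {
            clients[msg.sid].emit(msg.event, msg.data);
        }
    });
}

Any worker would call emitToClient(sid, event, data); if the socket lives elsewhere, the master relays the request and the owning worker delivers it.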


1 answer
Alexander Aksentiev, 2016-07-29
@Sanu0074

socket.io/docs/using-multiple-nodes
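In short: with io.adapter(redis(...)) in place there is no need for a shared io.clients map at all. A minimal sketch of the rooms-based approach that page describes, reusing the names from the question: each socket joins a room named after its session id, and any worker can then emit to that room; the Redis adapter relays the packet over Redis pub/sub to whichever process actually holds the socket.

// in the connection handler: join a room named after the session id
io.sockets.on('connection', (socket) => {
    var socketID = socket.handshake.user.sid;
    socket.join(socketID);
});

// in the controller, from any worker: emit to the room instead of
// looking the socket up in a local io.clients map
this.io.to(currentClient).emit(event, data);

The 'loggedIn' broadcast already behaves this way: broadcast.emit goes through the adapter, which is why clients on every process receive it.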
