socket.on inside socket.on in the server-side socket.io API in Node.js

Why do we need to put socket.on inside socket.on here? What does this represent? Is there any other way of writing this?
This is the code of server.js in Node.js:
var objExpress = require('express')();
var objHttp = require('http').createServer(objExpress);
var objSocketIO = require('socket.io')(objHttp);

objExpress.get('/', (request, result) => result.send('hello'));

objSocketIO.on('connection', (argSocket) => {
    console.log('A user connected!');
    argSocket.on('message', (argMsg) => {
        console.log(argMsg);
        argSocket.broadcast.emit('message-broadcast-xyz', argMsg);
    });
});

objHttp.listen(3000, () => {
    console.log("Listening on port 3000");
});

Inside of this:
objSocketIO.on('connection', argSocket => {
    // here's where you know the socket for a newly connected socket
});
is the only place where you get notified of a newly connected socket. If you want to listen for events on that newly connected socket, then this is the place to install those event listeners.
objSocketIO.on('connection', argSocket => {
    // here's where you know the socket for a newly connected socket
    // and where you can install event listeners on that newly connected socket
    argSocket.on('message', (argMsg) => {
        // here's where you get a message on that connected socket
        // from the previously installed event handler
        argSocket.broadcast.emit('message-broadcast-xyz', argMsg);
    });
});
Why do we need to put socket.on inside socket.on here?
Well, that's how event-driven programming works. You listen for an event by installing an event listener. In this case, when you get an event from your server that a new socket has connected, you install event listeners on that new socket so you can get events from it.
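As a minimal illustration of that pattern with Node's built-in EventEmitter (the emitter and event names here are made up for the example):
const EventEmitter = require('events');

const door = new EventEmitter(); // hypothetical emitter, just for illustration

// install the listener first...
door.on('open', (who) => {
    console.log(who + ' opened the door');
});

// ...then the callback runs every time the event fires
door.emit('open', 'Alice');
door.emit('open', 'Bob');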
Is there any other way of writing this?
Other ways could be dreamed up, but they would have to do something like this under the covers, because listening for events is how you program with servers and sockets in Node.js.

Related

Do you always have to wrap your socket.on in an io.on connection block?

I am new to socket.io and have noticed that all server-side event listeners are wrapped in an io.on connection block:
io.on('connection', socket => {
    socket.on('event name', callback)
})
I am a little confused about two things:
Do I always have to wrap my socket.on in this kind of block? Is it because the socket represents the individual client that raised the connection event?
Why should I use socket.on inside the block instead of io.on, since they seem to do the same thing (like io.on('connection', () => io.on('event name', callback)))?
io.on() means adding an event listener to the server
io.on('connection', socket => {
    socket.on('event name', callback)
})
socket.on() means adding an event listener to the client as it connects.
You cannot add an event listener to a client that isn't connected yet.
So what we do is add an event listener to the server that triggers when a client connects. Once the client is connected, we add event listeners to it that trigger when it sends data (for example).
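Putting both together, a minimal sketch of a standalone socket.io server (the event names are just examples):
const io = require('socket.io')(3000); // standalone mode, listening on port 3000

// listener on the server: fires once for every client that connects
io.on('connection', (socket) => {
    // listeners on this individual client, installed only after it has connected
    socket.on('chat message', (msg) => {
        console.log('received:', msg);
    });

    socket.on('disconnect', () => {
        console.log('client disconnected');
    });
});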

Best practice for handling Socket.io events?

I am switching from AJAX polling to socket.io live data pushing, but I have a question about the best practice for managing events.
Let's say I have a basic server like this:
var Clients = [];

// Event fired every time a new client connects:
io.on('connection', function(socket) {
    Clients.push(socket);

    // What else should go in here?
    // Individual functions for each event?
    socket.on('chat message', function(msg) {
        console.log(':> ' + msg);
        io.emit('chat message', msg);
    });

    // When socket disconnects, remove it from the list:
    socket.on('disconnect', function() {
        var index = Clients.indexOf(socket);
        if (index != -1) { Clients.splice(index, 1); }
    });
});
In the example there is a chat message event, but ideally there are many other events I'd like to push to clients in real-time, such as:
User sends message, user starts/stops typing, user likes a post, user invites you to become friends/group member, and so on.
There are many events that I'd like to capture in real-time but it seems like there has to be a better way than cramming the io.on() statement full of them all.
I've looked into OOP in Node, but I'm not sure whether it would necessarily help in this application.
So where we have the socket.on() statements, what would the best practice be for including event catching like I described?
There are many events that I'd like to capture in real-time but it seems like there has to be a better way than cramming the io.on() statement full of them all.
Nope. Inside the io.on('connection', ...) callback is where you have a closure with the socket variable for a new connection. So any event handler you want to respond to from that socket goes in there, or in some function you call from there and pass the socket to. That's just how it works.
FYI, in the interest of modularity, you don't have to put all the event handlers inside of one io.on('connection', ...). If you export the io object to other modules or pass it to them in a module constructor, you can have other modules create their own io.on('connection', ...) listeners and install their own event handlers inside their own io.on('connection', ...). io is an EventEmitter so it can support as many listeners for the connection event as you want and all will be notified.
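A minimal sketch of that modular layout (the file names and event names here are hypothetical):
// chat.js - a hypothetical feature module
module.exports = function (io) {
    io.on('connection', (socket) => {
        socket.on('chat message', (msg) => {
            io.emit('chat message', msg);
        });
    });
};

// server.js - each module installs its own 'connection' listener on the shared io
const http = require('http').createServer();
const io = require('socket.io')(http);

require('./chat')(io);

http.listen(3000);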

How to listen for Worker events with the cluster module in Node

I'm using the cluster module to build a multi-process application. I want to listen for events on a Worker; the example in the documentation looks like this:
cluster.fork().on('listening', cb);
But I want to separate the worker and the master, so in my worker.js file I write this:
const http = require('http');
const cluster = require('cluster');
const worker = cluster.worker;

http.createServer((req, res) => {
    res.writeHead(200);
    res.end('hello world');
}).listen(8099);

worker.on('listening', (address) => {
    console.log(`address:${address}`);
});
But the listening callback never runs, so does that mean I can only listen for the Worker's events in master.js?
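For reference, the Worker 'listening' event is emitted in the master process rather than inside the worker itself, so a minimal sketch of the documented pattern would live in a master.js (assuming the file above is saved as worker.js):
// master.js - hypothetical companion to the worker.js above
const cluster = require('cluster');

cluster.setupMaster({ exec: 'worker.js' });

cluster.fork().on('listening', (address) => {
    console.log('worker listening on port ' + address.port);
});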

Socket.io Leak - How do I remove this mongoose event listener that is registered onConnect?

I have the following subscribe event handler that runs when a user connects via socket.io:
socket.on('subscribe', function(room) {
    console.log('joining room', room.id);
    socket.join(room.id);
    socket.roomName = room.id;
    // Insert sockets below
    require('../api/item/item.socket').register(socket);
});
The "item.socket" code attaches mongoose event handlers for DB events (save, delete, etc.) and emits socket messages when they happen. Here is that code:
var Item = require('./it');

exports.register = function(socket) {
    Item.schema.post('save', function (doc) {
        console.log("hit save message");
        onSave(socket, doc);
    });
    Item.schema.post('update', function (doc) {
        console.log("hit update message");
        onSave(socket, doc);
    });
    Item.schema.post('remove', function (doc) {
        onRemove(socket, doc);
    });
};

function onSave(socket, doc, cb) {
    socket.emit(doc._id + ':item:save', doc);
}

function onRemove(socket, doc, cb) {
    socket.emit(doc._id + ':item:remove', doc);
}
When a client disconnects the following code is executed:
function onDisconnect(socket) {
    console.log('disconnected: ' + socket.roomName);
    socket.leave(socket.roomName);
    socket.removeAllListeners("subscribe");
}
The basic problem I'm having is that when a user connects, they get that item.socket mongoose handler to send them updates about model changes. When they disconnect (reload the page, leave and come back, etc.), that event handler never goes away. For instance, if I make a change to "item", I'll get 10 "hit save" messages if I reload the page 10 times.
Ideally, when a client leaves the page, that mongoose handler would be destroyed so it isn't trying to send messages to a client that is no longer there.
EDIT: It's very clear to me that this is an event handler leak. Basically I need to do something to destroy the handler on disconnect. I need to somehow create a reference to require('../api/item/item.socket').register(socket); and then unregister it on disconnect, but I can't figure out how to do that properly with the mongoose models and the way I have it set up.
function onDisconnect(socket) {
    console.log('disconnected: ' + socket.roomName);
    socket.leave(socket.roomName);
    socket.removeAllListeners("subscribe");
    // destroy handler here
}
One possible solution is as follows.
You could maintain a collection of the sockets that are connected inside "item.socket".
"item.socket" would then export functions to add and remove sockets from this collection. So you would require item.socket in your code once, call its add method with a socket when a connection is established, and call its remove method with the socket when it disconnects. Your mongoose event handlers within item.socket should iterate over the collection of sockets and emit messages as applicable.
If you require the module in such a way that multiple instances get loaded, you could attach the socket collection to the global namespace instead.
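A minimal sketch of that suggestion (the model path and module shape here are assumptions, not the asker's actual code):
// item.socket.js - required once, e.g. from the main server file
var Item = require('./item.model'); // assumed model path
var sockets = [];

exports.add = function (socket) {
    sockets.push(socket);
};

exports.remove = function (socket) {
    var index = sockets.indexOf(socket);
    if (index !== -1) {
        sockets.splice(index, 1);
    }
};

// the mongoose hooks are registered once, when the module is first loaded
Item.schema.post('save', function (doc) {
    sockets.forEach(function (socket) {
        socket.emit(doc._id + ':item:save', doc);
    });
});

Item.schema.post('remove', function (doc) {
    sockets.forEach(function (socket) {
        socket.emit(doc._id + ':item:remove', doc);
    });
});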
Thanks for the insight; sometimes you just stare at a problem too long and then realize the answer is straightforward. I pretty much did what you said and moved to registering item.socket only once. However, instead of keeping a list of sockets around, we just use the socket.io way of sending messages to the right sockets. The code is as follows:
'use strict';

var Item = require('./item.model');
var log4js = require('log4js');
var logger = log4js.getLogger();

exports.register = function(socketio) {
    Item.schema.post('save', function (doc) {
        logger.debug("hit save message " + doc._id);
        onSave(socketio, doc);
    });
    Item.schema.post('update', function (doc) {
        logger.debug("hit update message");
        onSave(socketio, doc);
    });
    Item.schema.post('remove', function (doc) {
        onRemove(socketio, doc);
    });
};

function onSave(socketio, doc, cb) {
    socketio.sockets.in(doc._id).emit(doc._id + ':item:save', doc);
}

function onRemove(socketio, doc, cb) {
    socketio.sockets.in(doc._id).emit(doc._id + ':item:remove', doc);
}
Where doc._id is the room a client joins when subscribing. Leak fixed!!! Thanks :-)

Release event handlers on disconnection of Socket.IO client

I'm using Socket.IO like in this sample:
io.sockets.on("connection", function (socket) {
    myService.on("myevent", function() {
        socket.emit("myevent", { /* ... */ });
        // some stuff happens here of course
    });
});
myService is a singleton and a subclass of EventEmitter which triggers myevent over time. Everything works fine; however, I suspect I'm creating some kind of leak here. How does my service know that it doesn't need to call the handler once the connection is destroyed? Is there some kind of destroy event I can catch so that I can then remove the handler from myService?
Listen for the socket's disconnect event, and when it fires, remove the relevant event handler from the myService object.
You should be able to do that like this:
io.sockets.on("connection", function (socket) {
    function handler() {
        socket.emit("myevent", { /* ... */ });
        // some stuff happens here of course
    }

    myService.on("myevent", handler);

    socket.on("disconnect", function() {
        myService.removeListener("myevent", handler);
    });
});
If what you're trying to do is broadcast to all connected sockets, you could just install one "myevent" listener (not one per connection) and use io.emit() to broadcast to all sockets, without having to handle the connect or disconnect events for this purpose.
If you are planning to send data to all sockets when some other event fires, you don't need to add and remove listeners every time a client connects or disconnects.
It is a lot more efficient and easier to simply fire the socket.io event to all sockets that are connected right now using io.sockets (which is a reference to the default namespace with all clients on it by default) and io.sockets.emit:
myService.on('myevent', () => {
    io.sockets.emit('myevent', { /* ... */ });
});
If you only need to fire this event to some subset of your users, try using specific namespaces or rooms:
myService.on('myevent', () => {
    // with namespaces
    io.of('namespace').emit('myevent', { /* ... */ });

    // with rooms
    io.to('room').emit('myevent', { /* ... */ });
});
