When using the socket.io library, I am a little confused about where to place the different event handlers.
In a very simple chat application I have server.js:
io.sockets.on('connection', function(socket) {
  // some methods to handle when clients join.
  socket.on('text', function(msg) {
    socket.broadcast.emit('text', msg);
  });
});
and client.js:
var socket = io.connect();
socket.on('connect', function() {
  // some methods to fire when client joins.
  socket.on('text', function(msg) {
    console.log(msg);
  });
});
Right now, the handlers that run when a client joins AND the handlers for sending and receiving messages afterwards are all placed inside the connect / connection event callbacks, on both the server and the client side. However, this structure seems to work just as well on the client side:
var socket = io.connect();
socket.on('connect', function() {
  // some methods to fire when client joins.
});
socket.on('text', function(msg) {
  console.log(msg);
});
+potentially many more methods...
My question is: what is the fundamental difference between placing a handler inside the connect callback and outside it, and which is considered the best option?
When you call this,
socket.on('xyz', function listener() {});
you listen for the event xyz and add the function listener as its event handler; it is executed whenever xyz occurs. So when you do:
socket.on('connect', function() {
  socket.on('text', function(msg) {
    console.log(msg);
  });
});
The event handler/listener for text is added only when the connect event fires (that is, when the connect handler runs). Before connect there is only one listener; after connect there are two. But when you do:
socket.on('connect', function() {
  // some methods to fire when client joins.
});
socket.on('text', function(msg) {
  console.log(msg);
});
There are two listeners at all times, both before and after connect happens.
The first (nested) approach is more efficient and more logical. Logical, because text cannot happen before connect happens, so why listen for it? Efficient, because the event loop does not have unnecessary listeners to check. Adding a few extra listeners may not hurt much, but for performance-critical applications it can become a drag. The second approach just looks tidier, with all the event handlers placed one after another.
Related
I am new to socket.io and have noticed that every server-side event listener is wrapped in an io.on('connection') block:
io.on('connection', socket => {
  socket.on('event name', callback)
})
I am a little confused about two things:
Do I always have to wrap my socket.on calls in this kind of block? Is it because socket represents the individual client that raised the connection event?
Why should I use socket.on inside the block instead of io.on, since they seem to do the same thing (like io.on('connection', () => io.on('event name', callback)))?
io.on() means adding an event listener to the server
io.on('connection', socket => {
  socket.on('event name', callback)
})
socket.on() means adding an event listener to a client once it connects.
You cannot add an event listener to a client that isn't connected yet.
So what we do is add an event listener to the server that triggers when a client connects. Once the client is connected, we add event listeners to it that trigger when it sends data (for example).
I am switching from AJAX polling to socket.io live data pushing, but I have a question about best practices for managing events.
Let's say I have a basic server like this:
var Clients = [];

// Event fired every time a new client connects:
io.on('connection', function(socket) {
  Clients.push(socket);

  // What else should go in here?
  // Individual functions for each event?
  socket.on('chat message', function(msg) {
    console.log(':> ' + msg);
    io.emit('chat message', msg);
  });

  // When socket disconnects, remove it from the list:
  socket.on('disconnect', function() {
    var index = Clients.indexOf(socket);
    if (index != -1) { Clients.splice(index, 1); }
  });
});
In the example there is a chat message event, but ideally there are many other events I'd like to push to clients in real-time, such as:
User sends message, user starts/stops typing, user likes a post, user invites you to become friends/group member, and so on.
There are many events that I'd like to capture in real-time but it seems like there has to be a better way than cramming the io.on() statement full of them all.
I've looked into Node OOP but not sure if it would necessarily help in this application.
So where we have the socket.on() statements, what would the best practice be for including event catching like I described?
There are many events that I'd like to capture in real-time but it seems like there has to be a better way than cramming the io.on() statement full of them all.
Nope. Inside the io.on('connection', ...) is where you have a closure with the socket variable that pertains to a new connection. So, any event handler you want to respond to from that socket goes in there or in some function you call from there and pass the socket to. That's just how it works.
FYI, in the interest of modularity, you don't have to put all the event handlers inside of one io.on('connection', ...). If you export the io object to other modules or pass it to them in a module constructor, you can have other modules create their own io.on('connection', ...) listeners and install their own event handlers inside their own io.on('connection', ...). io is an EventEmitter so it can support as many listeners for the connection event as you want and all will be notified.
I have the following subscribe even happening when a user connects via socket.io:
socket.on('subscribe', function(room) {
  console.log('joining room', room.id);
  socket.join(room.id);
  socket.roomName = room.id;
  // Insert sockets below
  require('../api/item/item.socket').register(socket);
});
The "item.socket" code attaches mongoose event handlers for db events (save, delete, etc..) and emits socket messages when they happen. Here is that code:
var Item = require('./it');

exports.register = function(socket) {
  Item.schema.post('save', function (doc) {
    console.log("hit save message");
    onSave(socket, doc);
  });
  Item.schema.post('update', function (doc) {
    console.log("hit update message");
    onSave(socket, doc);
  });
  Item.schema.post('remove', function (doc) {
    onRemove(socket, doc);
  });
};

function onSave(socket, doc, cb) {
  socket.emit(doc._id + ':item:save', doc);
}

function onRemove(socket, doc, cb) {
  socket.emit(doc._id + ':item:remove', doc);
}
When a client disconnects the following code is executed:
function onDisconnect(socket) {
  console.log('disconnected: ' + socket.roomName);
  socket.leave(socket.roomName);
  socket.removeAllListeners("subscribe");
}
The basic problem I'm having is that when a user connects, they get the item.socket mongoose handler that sends them updates about model changes. When they disconnect (reload the page, leave and come back, etc.), that event handler never goes away. For instance, if I make a change to an item, I'll get 10 "hit save" messages after reloading the page 10 times.
Ideally, when a client leaves the page, that mongoose handler would be destroyed so it's not trying to send messages to no one.
EDIT: It's very clear to me that this is an event handler leak, and basically I need to do something to destroy the handler on disconnect. I need to somehow keep a reference to require('../api/item/item.socket').register(socket); and then unregister it on disconnect, but I can't figure out how to do that properly with the mongoose models and the way I have it set up.
function onDisconnect(socket) {
  console.log('disconnected: ' + socket.roomName);
  socket.leave(socket.roomName);
  socket.removeAllListeners("subscribe");
  // destroy handler here
}
One possible solution is as follows.
You could maintain a collection of the sockets that are connected in "item.socket".
"item.socket" would then export functions to add and remove sockets from this collection. So, you would "require" item.socket in your code once and then call its add method with a socket when a connection is established. Similarly you would call its remove method with the socket when it disconnects. Your mongoose event handlers within item.socket should iterate over the collection of sockets and emit messages as applicable.
If the module could be required in such a way that multiple instances are loaded, you might as well attach the socket collection to the global namespace.
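A minimal sketch of that idea, with hypothetical names (addSocket, removeSocket, onSave) and a stub socket in place of real socket.io and mongoose objects:

```javascript
// Hypothetical "item.socket" sketch: the mongoose hooks are installed
// once at startup, and emit to whatever sockets are currently in the
// collection. addSocket/removeSocket would be called from the
// 'subscribe' and 'disconnect' handlers respectively.
const sockets = new Set();

function addSocket(socket) { sockets.add(socket); }
function removeSocket(socket) { sockets.delete(socket); }

// This would be the body of the post('save') hook:
function onSave(doc) {
  for (const s of sockets) s.emit(doc._id + ':item:save', doc);
}

// Quick demonstration with a stub socket:
const received = [];
const stub = { emit: (event) => received.push(event) };
addSocket(stub);
onSave({ _id: 'abc' });
removeSocket(stub);
onSave({ _id: 'abc' }); // nothing delivered after removal
console.log(received); // ['abc:item:save']
```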
Thanks for the insight; sometimes you just stare at a problem too long before realizing the answer is straightforward. I did pretty much what you said and moved to registering item.socket only once. However, instead of keeping a list of sockets around, we use socket.io's rooms to send messages to just the right sockets. The code is as follows:
'use strict';

var Item = require('./item.model');
var log4js = require('log4js');
var logger = log4js.getLogger();

exports.register = function(socketio) {
  Item.schema.post('save', function (doc) {
    logger.debug("hit save message " + doc._id);
    onSave(socketio, doc);
  });
  Item.schema.post('update', function (doc) {
    logger.debug("hit update message");
    onSave(socketio, doc);
  });
  Item.schema.post('remove', function (doc) {
    onRemove(socketio, doc);
  });
};

function onSave(socketio, doc, cb) {
  socketio.sockets.in(doc._id).emit(doc._id + ':item:save', doc);
}

function onRemove(socketio, doc, cb) {
  socketio.sockets.in(doc._id).emit(doc._id + ':item:remove', doc);
}
Where doc._id is the room a client joins when subscribing. Leak fixed!!! Thanks :-)
I'm using Socket.IO like in this sample:
io.sockets.on("connection", function (socket) {
  myService.on("myevent", function() {
    socket.emit("myevent", { /* ... */ });
    // some stuff happens here of course
  });
});
myService is a singleton and a subclass of EventEmitter which emits myevent over time. Everything works fine; however, I suspect I am creating some kind of leak here. How does my service know that it no longer needs to call the handler once the connection is destroyed? Is there some kind of destroy event I can catch so I can remove the handler from myService?
Listen to the socket disconnect event and when you get a disconnect event, remove the relevant event handler from the myService object.
You should be able to do that like this:
io.sockets.on("connection", function (socket) {
  function handler() {
    socket.emit("myevent", { /* ... */ });
    // some stuff happens here of course
  }

  myService.on("myevent", handler);

  socket.on("disconnect", function() {
    myService.removeListener("myevent", handler);
  });
});
If what you're trying to do is to broadcast to all connected sockets, you could just install one "myevent" listener (not one per connection) and use io.emit() to broadcast to all sockets too and not have to handle the connect or disconnect events for this purpose.
If you are planning to send data to all sockets when some other event fires, you don't need to add and remove listeners every time a client connects or disconnects.
It is a lot more efficient and easier to simply fire the socket.io event to all sockets that are connected right now using io.sockets (which is a reference to the default namespace with all clients on it by default) and io.sockets.emit:
myService.on('myevent', () => {
  io.sockets.emit('myevent', {/*...*/});
});
If you only need to fire this event to some subset of your users, try using specific namespaces or rooms:
myService.on('myevent', () => {
  // with namespaces
  io.of('namespace').emit('myevent', {/*...*/});
  // with rooms
  io.to('room').emit('myevent', {/*...*/});
});
I am trying to factor out the anonymous callback functions from my socket.on() events for two reasons:
It makes the code slightly easier to read and document.
I believe it can be more memory efficient.
Since the first reason is more of a personal preference, I will not address it--although I am definitely interested in seeing style guides and/or general recommendations.
The second reason is because the only way I have seen socket.io used is like this:
var io = require('socket.io')(server);

io.on('connection', function(socket) {
  socket.on('event', function(data) {
    // logic goes here
    socket.emit('different event', differentData);
  });
});
This can work quite well, but I believe the anonymous functions are instantiated for each and every incoming connection.
I would factor this out through the following:
io.on('connection', function(socket) {
  socket.on('event', eventHandler);

  function eventHandler(data) {
    // logic goes here
    socket.emit('different event', differentData);
  }
});
Still, that appears to create a new instance of eventHandler for every connection.
My current attempt at factoring this out looks like this:
var sockets = {};

io.on('connection', function(socket) {
  socket.on('login', login.bind(socket));
  socket.on('get information', getInformation);
});
function login(data) {
  // logic goes here
  // Save the socket to a persistent object
  sockets[data.userId] = this;
  this.emit('logged in', loginData);
}
function getInformation(data) {
  // authentication, db access logic goes here
  // Avoid binding by using previously stored socket
  sockets[data.userId].emit('got information', infoData);
}
This works, but I believe that bind() just creates another function anyway, subverting any real benefit of not defining the login callback inside the connection callback.
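That intuition is correct: Function.prototype.bind returns a brand-new function object on every call, which you can verify directly (fakeSocket here is just a stand-in object):

```javascript
function login(data) { /* logic would go here */ }

const fakeSocket = {};
const boundA = login.bind(fakeSocket);
const boundB = login.bind(fakeSocket);

console.log(boundA === login);  // false: a new function, not the original
console.log(boundA === boundB); // false: each bind() call creates another one
```

So binding once per connection still allocates one function per connection per handler, though only for the handlers that actually need the socket.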
As for the other handlers, it seems that a persistent object is necessary, as evidenced by EtherPad's method (SocketIORouter.js:62). Since I plan to be listening to a sizable number of events, it seems like pulling the callbacks out of the connection callback is the best way to go about it. Maybe bind will create a new function each time it is called, but in this case, it will only be called twice per connection rather than 10-20 times. I imagine that for thousands of clients, that would make a difference, but I am not sure.
So, to sum up, is this a proper approach, or is there perhaps a better one that I should be using? Or, is this just another case of premature optimization?