I'm trying to pass the socket object to my 'ConnectionHandler' class, but when that socket object is used I get this error from socket.io: 'cannot read property socket of undefined'.
Server class:
Server.prototype.handleConnections = function ()
{
    this.queueTime = 15; // Queue time in seconds
    var that = this; // Keep a reference to the server object for use inside the callbacks below

    // On incoming connection
    this.io.on('connection', function (socket) {
        console.log('connection incoming...'); // Log a message to the server console

        // When a client tries to join the queue
        socket.on('client_join_queue', function (username) {
            // Check if the username is valid
            if (!(username.length < 3)) {
                var newPlayer = new player(username);
                var connectionHandler = new connectionHandling(socket, that, newPlayer);
                that.connections.push(connectionHandler);
            }
        });
    });
};
ConnectionHandler class:
'use strict';

var ConnectionHandler = function (_socket, _server, _player)
{
    this.socket = _socket;
    this.server = _server;
    this.player = _player;

    this.server.queueHandler.addPlayer(this.player);
    this.server.connections[0].socket.emit('player_joined_queue', this.player, this.server.queueHandler.getQueue().length);

    var that = this;
    socket.on('disconnect', function () {
        console.log("user disconnected");
        console.log("queue:", that.server.queueHandler.getQueue());
    });
};

module.exports.ConnectionHandler = ConnectionHandler;
I've absolutely no idea what I'm doing wrong.
As fljs was saying, you are emitting an event called 'player_joined_queue'.
But you are listening for the event 'client_join_queue'. You need to listen for the same event name that you are emitting, so you would need to change one or the other. For example,
socket.on('player_joined_queue', function (username) {
...
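Either change works; as a concrete sketch (the handler body here is hypothetical), the client emits the event name the server is already listening for, and the server replies with a differently named event:

// Client side: emit the same name the server listens for
socket.emit('client_join_queue', 'someUsername');

// Server side: listen for that exact name, then notify the client with another event
io.on('connection', function (socket) {
    socket.on('client_join_queue', function (username) {
        // ... validate the username and register the player here ...
        socket.emit('player_joined_queue', username);
    });
});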
I am a newbie to Node.js and I am writing a DAO layer for HBase which will wrap Thrift and provide a clean interface to the other layers. I am trying to write unit tests for it using sinon.js and mocha, but I am not sure how to mock one event of the Thrift connection class and its event handler.
My DAO code is as follows:
var thrift = require('thrift');

var libDirRelativePath = "../../../lib";
var hbaseThriftDirPath = libDirRelativePath + "/hbase-gen-nodejs";
var hbase = require(hbaseThriftDirPath + '/THBaseService');
var hbaseTypes = require(hbaseThriftDirPath + '/hbase_types');

var thritfPrimaryServerAddress = 'nn2';
var thritfBackupServerAddress = 'backup-nn2';
var thriftServerPort = 9090;

exports.putRows = function(tableName, putObjectArray, callback) {
    var primaryClusterConnection = thrift.createConnection(thritfPrimaryServerAddress, thriftServerPort, {
        transport: thrift.TBufferedTransport,
        protocol : thrift.TBinaryProtocol
    });
    console.log('DEBUG : connection object created.');
    var client = thrift.createClient(hbase, primaryClusterConnection);
    console.log('DEBUG : client object created.');

    primaryClusterConnection.on('connect', onConnectOfPutRows);
    primaryClusterConnection.on('connect', function() {
        console.log('Connected to HBase thrift server at ' + thritfPrimaryServerAddress + ":" + thriftServerPort);
        client.putMultiple(tableName, putObjectArray, callback);
        connection.close();
    });
    primaryClusterConnection.on('error', function() {
        console.log('Error occurred in HBase thirft server connection.');
    });
}
For the above code I just want to create stubs for the primaryClusterConnection and client objects, which I have managed, but the problem is that the stub of primaryClusterConnection doesn't know anything about the connect event and its handler, so the console.log('Connected to HBase thrift server at '... line never gets executed. I want to test that part of the code as well. Can anyone please help me write proper stubs/mocks for this problem?
My test code is as follows:
var hbaseDao = require('../../../src/dao/hbase/HBaseDao.js');

var libDirRelativePath = "../../../lib";
var hbaseThriftDirPath = libDirRelativePath + "/hbase-gen-nodejs";
var hbase = require(hbaseThriftDirPath + '/THBaseService');

var chai = require('chai');
var should = chai.should();
var expect = chai.expect;
var sinon = require('sinon');

describe("HBaseDao", function() {
    describe(".putRows()", function() {
        it("Should execute callback after inserting objects in HBase.", function(done) {
            var commonStub = sinon.stub();
            var connection = {
                close : function() {
                    console.log('connection closed.');
                }
            };
            commonStub.withArgs('nn2', 9090).returns(connection);

            var client = {};
            commonStub.withArgs(hbase, connection).returns(client);

            var tableName = 'DUMMY_READINGS_TABLE';
            var callBackMethod = function() {
                console.log('dummy callback function.');
            };
            commonStub.withArgs(tableName, [], callBackMethod).returns(0);

            hbaseDao.putRows(tableName, [], callBackMethod);

            expect(hbaseDaoSpy.callCount).to.equal(1);
            done();
        });
    });
});
Let's start by simplifying the problem a bit.
it.only("Should execute callback after inserting objects in HBase.", function(done) {
var events = require('events');
var hbaseDao = new events.EventEmitter();
hbaseDao.putRows = function() {
console.log('putting rows');
this.emit('notify');
};
hbaseDao.on('notify', function(){
console.log('notify event fired');
done(); //here's where you call the callback to assert that the event has fired
});
sinon.spy(hbaseDao, 'putRows');
var commonStub = sinon.stub();
var tableName = 'DUMMY_READINGS_TABLE';
var client = {};
var connection = {
close : function() {
console.log('connection closed.');
}
};
var callBackMethod = function() {
console.log('dummy callback function.');
};
commonStub.withArgs('nn2', 9090).returns(connection);
commonStub.withArgs({}, connection).returns(client);
commonStub.withArgs(tableName, [], callBackMethod).returns(0);
hbaseDao.putRows(tableName, [], callBackMethod);
//assertions
assert(hbaseDao.putRows.calledOnce);
});
The above test will just work, because it creates a new "hbaseDao" from a simple event emitter and has both the method and the notify event ready to go.
Because we're doing an async test, we need the done callback in the spec. Notice that done only fires when the event has occurred, so the test will not pass unless the event fires. Also notice that we're spying specifically on hbaseDao's 'putRows' and asserting that it's called once, another way to ensure that the test is working. Now consider this example and apply it to your original question.
I think you almost got it, but you need to put your done callback in the callback stub, like so:
var callBackMethod = function() {
    console.log('dummy callback function.');
    done();
};
That way, when your primaryClusterConnection.on('connect') event fires, the supplied callback will call done and complete the test.
That being said, you should leave your primaryClusterConnection intact and keep the implementation details of hbaseDao out of your test.
You mentioned that:
primaryClusterConnection doesn't have any idea about connect
But that can't be right, because you're creating a new connection in the test and there's nothing in your implementation that tells me you have changed the event handler for the connection.
So I think, in the end, you're missing the point of the test, which is simply "should execute callback...", and you're stubbing out stuff that you don't even need to.
Try something like this:
// use it.only to make sure there's no other tests running
it.only("Should execute callback after inserting objects in HBase.", function(done) {
    // get the class
    var hbaseDao = require('../../../src/dao/hbase/HBaseDao.js');

    // spy on the method
    sinon.spy(hbaseDao, 'putRows');

    // create a table name
    var tableName = 'DUMMY_READINGS_TABLE';

    // create callback method with done
    var callBackMethod = function() {
        console.log('dummy callback function.');
        done();
    };

    // run the function under test
    hbaseDao.putRows(tableName, [], callBackMethod);

    // assert called once
    assert(hbaseDao.putRows.calledOnce);
});
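If you do decide to stub the thrift layer itself rather than leave it intact, one way to give the stubbed connection a real 'connect' event is to hand back a plain EventEmitter and fire the event from the test. This is only a sketch under the assumption that HBaseDao calls thrift.createConnection and thrift.createClient exactly as shown in the question; fakeConnection and fakeClient are made-up names, and note that the connection.close() line inside putRows refers to an undefined variable, so it would need to read primaryClusterConnection.close() before this path runs cleanly.

var EventEmitter = require('events').EventEmitter;
var thrift = require('thrift');

// A fake connection that can emit 'connect' on demand, plus a no-op close()
var fakeConnection = new EventEmitter();
fakeConnection.close = function() {};

// A fake client whose putMultiple immediately invokes the supplied callback
var fakeClient = { putMultiple: sinon.stub().yields() };

sinon.stub(thrift, 'createConnection').returns(fakeConnection);
sinon.stub(thrift, 'createClient').returns(fakeClient);

hbaseDao.putRows(tableName, [], callBackMethod); // registers the 'connect' handlers
fakeConnection.emit('connect');                  // drives the handler, which calls the callback

thrift.createConnection.restore();
thrift.createClient.restore();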
So I want to be able to share methods across different Node.js processes created using the cluster module.
If I run the code below I can share the method server.handleRequest across the child processes; however, if I uncomment //server.test(); in the second file and try to use the test method in the original process, Node crashes.
"use strict";
var os = require('os');
var http = require('http');
var cluster = require('cluster');
function testMethod() {
console.log('Test');
}
function handleRequest(req, res) {
res.writeHead(200);
res.end("This answer comes from the process " + process.pid);
}
var createServer = function createServer(opts) {
var server = {};
server.test = testMethod;
server.handleRequest = handleRequest;
if (cluster.isMaster) {
var cpuCount = require('os').cpus().length;
for (var i = 0; i < cpuCount; i += 1) {
cluster.fork();
}
return server;
} else {
// Create HTTP server.
http.Server(function(req, res) {
server.handleRequest(req, res);
}).listen(8080);
}
}
module.exports = {
createServer: createServer,
};
The second file, which requires the file above:
"use strict";
var router = require('./test.js');
var server = router.createServer();
//server.test();
But if I don't use the cluster module I can use the test method outside the factory function without crashing. So how do I share methods created in factory functions across all Node processes while using the cluster module? And why do the child processes execute test when only the original process calls server.test()?
var http = require('http');

function testMethod() {
    console.log('Test');
}

function handleRequest(req, res) {
    res.writeHead(200);
    res.end("This answer comes from the process " + process.pid);
}

var createServer = function createServer(opts) {
    var server = {};
    server.test = testMethod;
    server.handleRequest = handleRequest;

    http.Server(function(req, res) {
        server.handleRequest(req, res);
    }).listen(8080);

    return server;
}

module.exports = {
    createServer: createServer,
};
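One thing worth noting: cluster.fork() re-runs the whole entry script in every worker, so the second file's server.test() line executes in each child as well, and in the workers createServer falls through the else branch without a return, which leaves server undefined there. A minimal sketch of one way around this (an assumption about the intended design, not the only option) is to return server from both branches so every process gets the same object back:

var createServer = function createServer(opts) {
    var server = {};
    server.test = testMethod;
    server.handleRequest = handleRequest;

    if (cluster.isMaster) {
        // Only the master forks workers
        var cpuCount = os.cpus().length;
        for (var i = 0; i < cpuCount; i += 1) {
            cluster.fork();
        }
    } else {
        // Workers serve HTTP
        http.Server(function (req, res) {
            server.handleRequest(req, res);
        }).listen(8080);
    }

    // Returned in every process, so server.test() works in master and workers alike
    return server;
};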
In the following code I'm expecting console.log to output the data that is passed along with the custom 'output' event, but that's not occurring. From what I can tell, Looper.prototype.output is called properly from within the server handler, but it's not responding to the 'output' events emitted in Looper.prototype.run. Why isn't my output event handler recognizing these events?
var express = require('express');
var http = require('http');
var spawn = require('child_process').spawn;
var util = require('util');
var fs = require('fs');
var EventEmitter = require("events").EventEmitter;
var sys = require("sys");

function Looper(req) {
    this.req = req;
    EventEmitter.call(this);
}
sys.inherits(Looper, EventEmitter);

Looper.prototype.run = function() {
    var cmd = spawn('./flow', [this.req]); // <-- script that outputs req every second
    cmd.stdout.setEncoding('utf8');
    cmd.stdout.on('data', function(data) {
        this.emit('output', data);
    });
}

Looper.prototype.output = function(callback) {
    this.on('output', function(data) {
        return callback(data.trim());
    });
}

var looper = new Looper('blah');
looper.run();

var app = express();
var webServer = http.createServer(app);
app.use(express.static(__dirname + '/public'));

app.get('/', function(req, res) {
    res.send(
        "<h1>hello world</h1>"
    );
    looper.output(function(res) {
        console.log('blah');
        console.log(res);
    });
});

webServer.listen(3000);
Looper.prototype.run = function() {
    var cmd = spawn('./flow', [this.req]); // <-- script that outputs req every second
    cmd.stdout.setEncoding('utf8');
    cmd.stdout.on('data', function(data) {
        this.emit('output', data);
        // ^ not what you think it is.
    });
}
I think that this is not what you think it is in that callback. You need to capture the value of this outside of the callback first.
Looper.prototype.run = function() {
    var self = this; // save this
    var cmd = spawn('./flow', [this.req]); // <-- script that outputs req every second
    cmd.stdout.setEncoding('utf8');
    cmd.stdout.on('data', function(data) {
        self.emit('output', data); // use the previously saved value of this
    });
}
Otherwise, this inside the callback refers to cmd.stdout (the emitter that fired the 'data' event) rather than your Looper instance, so the 'output' event is emitted on an object that no one is listening to.
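An equivalent fix, as a sketch (assuming your Node version supports arrow functions; otherwise Function.prototype.bind works the same way), avoids the separate self variable altogether:

Looper.prototype.run = function() {
    var cmd = spawn('./flow', [this.req]);
    cmd.stdout.setEncoding('utf8');
    // An arrow function keeps the enclosing this, so the event is emitted on the Looper instance
    cmd.stdout.on('data', (data) => {
        this.emit('output', data);
    });
};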
Is it possible to have a socket.io client respond to all events without to have specify each event individually?
For example, something like this (which obviously doesn't work right now):
var socket = io.connect("http://myserver");
socket.on("*", function(){
// listen to any and all events that are emitted from the
// socket.io back-end server, and handle them here.
// is this possible? how can i do this?
});
I want this callback function to be called when any / all events are received by the client-side socket.io code.
Is this possible? How?
Updated solution for socket.io-client 1.3.7
var onevent = socket.onevent;
socket.onevent = function (packet) {
    var args = packet.data || [];
    onevent.call(this, packet);       // original call
    packet.data = ["*"].concat(args);
    onevent.call(this, packet);       // additional call to catch-all
};
Use like this:
socket.on("*",function(event,data) {
console.log(event);
console.log(data);
});
None of the answers worked for me, though the ones from Mathias Hopf and Maros Pixel came close; this is my adjusted version.
NOTE: this only catches custom events, not connect/disconnect etc
It looks like the socket.io library stores these in a dictionary. As such, I don't think this would be possible without modifying the source.
From source:
EventEmitter.prototype.on = function (name, fn) {
    if (!this.$events) {
        this.$events = {};
    }

    if (!this.$events[name]) {
        this.$events[name] = fn;
    } else if (io.util.isArray(this.$events[name])) {
        this.$events[name].push(fn);
    } else {
        this.$events[name] = [this.$events[name], fn];
    }

    return this;
};
Finally, there is a module called socketio-wildcard which allows using wildcards on the client and server side:
var io = require('socket.io')();
var middleware = require('socketio-wildcard')();

io.use(middleware);

io.on('connection', function(socket) {
    socket.on('*', function(){ /* … */ });
});

io.listen(8000);
Here you go ...
var socket = io.connect();
var globalEvent = "*";

socket.$emit = function (name) {
    if (!this.$events) return false;
    // Pass 0 dispatches to the named handlers, pass 1 to the "*" handlers
    for (var i = 0; i < 2; ++i) {
        if (i == 0 && name == globalEvent) continue;
        var args = Array.prototype.slice.call(arguments, 1 - i);
        var handler = this.$events[i == 0 ? name : globalEvent];
        if (!handler) handler = [];
        if ('function' == typeof handler) handler.apply(this, args);
        else if (io.util.isArray(handler)) {
            var listeners = handler.slice();
            for (var j = 0, l = listeners.length; j < l; j++) // use j so the outer loop counter isn't clobbered
                listeners[j].apply(this, args);
        } else return false;
    }
    return true;
};

socket.on(globalEvent, function(event) {
    var args = Array.prototype.slice.call(arguments, 1);
    console.log("Global Event = " + event + "; Arguments = " + JSON.stringify(args));
});
This will catch events like connecting, connect, disconnect, reconnecting too, so do take care.
Note: this answer is only valid for socket.io 0.x
You can override socket.$emit
With the following code you have two new functions to:
Trap all events
Trap only events which are not trapped by the old method (it is a default listener)
var original_$emit = socket.$emit;

socket.$emit = function() {
    var args = Array.prototype.slice.call(arguments);

    original_$emit.apply(socket, ['*'].concat(args));

    if (!original_$emit.apply(socket, arguments)) {
        original_$emit.apply(socket, ['default'].concat(args));
    }
}

socket.on('default', function(event, data) {
    console.log('Event not trapped: ' + event + ' - data: ' + JSON.stringify(data));
});

socket.on('*', function(event, data) {
    console.log('Event received: ' + event + ' - data: ' + JSON.stringify(data));
});
As described in the v3.0 documentation:
socket.onAny((event, ...args) => {
    console.log(`got ${event}`);
});
The current (Apr 2013) GitHub doc on exposed events mentions a socket.on('anything'). It appears that 'anything' is a placeholder for a custom event name, not an actual keyword that would catch every event.
I've just started working with WebSockets and Node.js, and immediately had a need to handle any event, as well as to discover what events were sent. I can't quite believe this functionality is missing from socket.io.
socket.io-client 1.7.3
As of May 2017 I couldn't make any of the other solutions work quite how I wanted, so I made an interceptor, used in Node.js for testing purposes only:
var socket1 = require('socket.io-client')(socketUrl)

socket1.on('connect', function () {
    console.log('socket1 did connect!')

    var oldOnevent = socket1.onevent
    socket1.onevent = function (packet) {
        if (packet.data) {
            console.log('>>>', {name: packet.data[0], payload: packet.data[1]})
        }
        oldOnevent.apply(socket1, arguments)
    }
})
References:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Function/apply
https://github.com/socketio/socket.io-client/blob/ff4cb3eed04a95c9725b8aaba8b64fa9fa1ca413/lib/socket.js#L257
Because your question was pretty general in asking for a solution, I'll pitch this one, which requires no hacking of the code, just a change in how you use the socket.
I just decided to have my client app send the exact same event, but with a different payload.
socket.emit("ev", { "name" : "miscEvent1"} );
socket.emit("ev", { "name" : "miscEvent2"} );
And on the server, something like...
socket.on("ev", function(eventPayload) {
myGenericHandler(eventPayload.name);
});
I don't know if always using the same event could cause any issues, maybe collisions of some kind at scale, but this served my purposes just fine.
There is a long discussion about this topic on the Socket.IO repository issue page. A variety of solutions are posted there (e.g., overriding EventEmitter with EventEmitter2). lmjabreu released another solution a couple of weeks ago: an npm module called socket.io-wildcard that patches a wildcard event onto Socket.IO (works with the current Socket.IO, ~0.9.14).
Even though this is an old question, I had the same problem and solved it using the native socket in Node.js, which has a 'data' event that fires every time some data comes in. So this is what I've done so far:
const net = require('net')

const server = net.createServer((socket) => {
    // 'connection' listener.
    console.log('client connected')

    // The stuff I was looking for
    socket.on('data', (data) => {
        console.log(data.toString())
    })

    socket.on('end', () => {
        console.log('client disconnected')
    })
})

server.on('error', (err) => {
    throw err;
})

server.listen(8124, () => {
    console.log('server bound');
})
All the methods I found (including socket.io-wildcard and socketio-wildcard) didn't work for me. Apparently there is no $emit in socket.io 1.3.5...
After reading the socket.io code, I patched up the following, which DID work:
var Emitter = require('events').EventEmitter;
var emit = Emitter.prototype.emit;

[...]

var onevent = socket.onevent;
socket.onevent = function (packet) {
    var args = ["*"].concat(packet.data || []);
    onevent.call(this, packet); // original call
    emit.apply(this, args);     // additional call to catch-all
};
This might be a solution for others as well. However, at the moment I don't exactly understand why nobody else seems to have issues with the existing "solutions". Any ideas? Maybe it's my old Node version (0.10.31)...
Regarding Matthias Hopf's answer: here is an updated version for v1.3.5. There was a bug with args if you want to listen to the original event and the * event together.
var Emitter = require('events').EventEmitter;
var emit = Emitter.prototype.emit;

// [...]

var onevent = socket.onevent;
socket.onevent = function (packet) {
    var args = packet.data || [];
    onevent.call(this, packet);           // original call
    emit.apply(this, ["*"].concat(args)); // additional call to catch-all
};
In v4, Socket.IO has Catch-all listeners. For example:
socket.prependAny(() => {
    console.log("This will be fired first");
});
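As a small follow-up sketch (assuming socket.io v3 or later): onAny registers a catch-all listener and offAny removes it again, which is handy if you only want to log traffic temporarily.

// Register a named catch-all listener so it can be removed later
const logAll = (event, ...args) => {
    console.log(`caught ${event}`, args);
};

socket.onAny(logAll);

// ...later: stop catching everything (call offAny() with no argument to remove all catch-all listeners)
socket.offAny(logAll);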
I'm using Angular 6 and the npm package: ngx-socket-io
import { Socket } from "ngx-socket-io";
...
constructor(private socket: Socket) { }
...
After connecting the socket, I use this code, which handles all custom events:
const onevent = this.socket.ioSocket.onevent;
this.socket.ioSocket.onevent = function (packet: any) {
    const args = packet.data || [];
    onevent.call(this, packet); // original call
    packet.data = ["*"].concat(args);
    onevent.call(this, packet); // additional call to catch-all
};

this.socket.on("*", (eventName: string, data: any) => {
    if (typeof data === 'object') {
        console.log(`socket.io event: [${eventName}] -> data: [${JSON.stringify(data)}]`);
    } else {
        console.log(`socket.io event: [${eventName}] -> data: [${data}]`);
    }
});