I'm trying to write a simple wrapper class around the Paho MQTT JavaScript client. (The idea is to put some extra validation around MQTT messaging, to ensure messages are processed in the correct order.)
I'm not very comfortable with JavaScript classes, and I'm getting in a mess trying to work out what's wrong with this...
class Hermes {
constructor(uri, topic, callback) {
var clientId = "clientID_" + parseInt(Math.random() * 1000);
this.client = new Paho.MQTT.Client(uri, clientId);
this.topic = topic;
this.callback = callback;
this.client.onMessageArrived = this._onMessageArrived;
this.client.onConnectionLost = this._onConnectionLost;
this.client.connect({
onSuccess: this._onConnect,
onFailure: this._onFailure
});
}
_onConnect() {
// Once a connection has been made, make a subscription and send a message.
console.log("_onConnect: " + this.client.clientId)
this.client.subscribe(this.topic);
}
// called when connection fails
_onFailure(responseObject) {
console.log("_onFailure: "+responseObject.errorMessage);
}
// called when a message arrives
_onMessageArrived(message) {
console.log("_onMessageArrived: "+message.payloadString)
// TODO: validate message and pass to callback
}
// called when client loses connection
_onConnectionLost(responseObject) {
if (responseObject.errorCode !== 0) {
console.log("onConnectionLost: "+responseObject.errorMessage);
}
}
}
function handleMessage(message) {
// TODO: handle message
}
var hermes = new Hermes("ws://mqtt.example.com:9001/mqtt", "test", handleMessage);
Expected result:
_onConnect: clientID_xxx should be logged in the console when the client successfully connects.
Actual result:
onConnectionLost: AMQJS0005E Internal error. Error Message: undefined is not an object (evaluating 'this.client.clientId'), Stack trace: _onConnect#file:///Users/richardguy/Desktop/hermes.js:16:45
The MQTT broker is running on a VPS and I can publish and subscribe to messages successfully using the Paho JavaScript library outside of a class, like so...
uri = "ws://mqtt.example.com:9001/mqtt"
var clientId = "clientID_" + parseInt(Math.random() * 1000);
client = new Paho.MQTT.Client(uri, clientId);
client.onConnectionLost = onConnectionLost;
client.onMessageArrived = onMessageArrived;
client.connect({
onSuccess: onConnect,
onFailure: onFailure
});
function onConnect() {
// Once a connection has been made, make a subscription and send a message.
console.log("_onConnect: " + client.clientId)
client.subscribe("test");
}
// called when connection fails
function onFailure(responseObject) {
console.log("_onFailure: "+responseObject.errorMessage);
}
// called when a message arrives
function onMessageArrived(message) {
console.log("_onMessageArrived: "+message.payloadString)
// TODO: validate message and pass to callback
}
// called when client loses connection
function onConnectionLost(responseObject) {
if (responseObject.errorCode !== 0) {
console.log("onConnectionLost: "+responseObject.errorMessage);
}
}
Is this just a mistake in the class definition, or something to do with the Paho MQTT library?
Solution:
I needed to pass an object (in this case the instance of the Hermes class) to use as the context for the onSuccess callback, rather than relying on this (which isn't what I thought it was, as usual...). This is done with invocationContext in the connection options.
class Hermes {
constructor(uri, topic, callback) {
var clientId = "clientID_" + parseInt(Math.random() * 1000);
this.client = new Paho.MQTT.Client(uri, clientId);
this.topic = topic;
this.callback = callback;
this.client.onMessageArrived = this._onMessageArrived;
this.client.onConnectionLost = this._onConnectionLost;
this.client.connect({
onSuccess: this._onConnect,
onFailure: this._onFailure,
invocationContext: this
});
}
_onConnect(responseObject) {
// Once a connection has been made, make a subscription and send a message.
let self = responseObject.invocationContext;
self.client.subscribe(self.topic);
}
// called when connection fails
_onFailure(responseObject) {
console.log("_onFailure: "+responseObject.errorMessage);
}
// called when a message arrives
_onMessageArrived(message) {
console.log("_onMessageArrived: "+message.payloadString)
// TODO: validate message and pass to callback
}
// called when client loses connection
_onConnectionLost(responseObject) {
if (responseObject.errorCode !== 0) {
console.log("onConnectionLost: "+responseObject.errorMessage);
}
}
}
function handleMessage(message) {
}
var hermes = new Hermes("ws://mqtt.example.com:8080/mqtt", "test", handleMessage);
Your problem is that this is not what you think it is.
The callbacks are all made from the client's network handler, so this is actually a reference to the handler.
You can pass an object to use as the context for the onSuccess and onFailure callbacks in the connection options using invocationContext, but not for the other callbacks.
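If you also need the right this inside onMessageArrived and onConnectionLost, another option is to bind every handler to the instance in the constructor. This is plain JavaScript (Function.prototype.bind), not anything Paho-specific; a minimal sketch based on the constructor above:
constructor(uri, topic, callback) {
    var clientId = "clientID_" + parseInt(Math.random() * 1000);
    this.client = new Paho.MQTT.Client(uri, clientId);
    this.topic = topic;
    this.callback = callback;
    // bind() fixes `this` to the Hermes instance no matter who invokes the handler later
    this.client.onMessageArrived = this._onMessageArrived.bind(this);
    this.client.onConnectionLost = this._onConnectionLost.bind(this);
    this.client.connect({
        onSuccess: this._onConnect.bind(this),
        onFailure: this._onFailure.bind(this)
    });
}
With that change, _onConnect can keep using this.client and this.topic directly instead of going through invocationContext.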
(note, I'm using Node.js to create a telnet server and handle user input)
I'm having issues using variables/arrays in JavaScript inside functions in other scripts.
Let me set up a simpler example:
var connections = []
is an array in my main.js.
Broadcast.js is a module I put in a separate file, and I attempt to use its broadcast() function in my main.js.
The error I get states that connections is undefined. How do I get Broadcast.js to see the connections array in main.js?
~~
For example in my main.js I set an array to handle clients connecting to a server
//point to Telnet library
const Telnet = require('ranvier-telnet');
const logger = require('./logger.js')
var outspeak = []
var connections = []
var clients = []
let server = new Telnet.TelnetServer(rawSocket => {
let telnetSocket = new Telnet.TelnetSocket();
//attaching socket
telnetSocket.attach(rawSocket);
//setting telnet options
telnetSocket.telnetCommand(Telnet.Sequences.WILL, Telnet.Options.OPT_EOR);
//giving clients a name
telnetSocket.name = rawSocket.remoteAddress + ":" + rawSocket.remotePort
//pushing client names to array
clients.push(telnetSocket.name);
//pushing client connections to an array
connections.push(rawSocket);
console.log(`${telnetSocket.name} has connected`)
logger(`${telnetSocket.name} has connected`)
broadcast(telnetSocket.name + " connected.")
telnetSocket.on('data', function (data) {
//broadcast (telnetSocket.name + ">" + data, telnetSocket);
});
function broadcast (message, sender) {
connections.forEach(function (connection) {
//don't want to send it to sender
if (connection === sender) return;
connection.write(`${message} \n`);
});
}
Now inside my main script, I can call that array, push to it, and read from it, as long as I write the function inside the main.js file.
And it can easily use the broadcast function.
Now I want to make it more advanced and reduce the lines in my main.js,
but it breaks once I separate the broadcast function into its own module:
'use strict'
//broadcast function
function broadcast (message, sender) {
connections.forEach(function (connection) {
//don't want to send it to sender
if (connection === sender) return;
connection.write(`${message} \n`);
});
}
module.exports = broadcast
I get a "connections is not defined" error any time I try to invoke that broadcast function. It's like my global variable/array can't be seen by the broadcast.js function.
This is how I'm invoking it:
// handle input
telnetSocket.on('data', function (data) {
broadcast (telnetSocket.name + ">" + data, telnetSocket);
});
And yes, const broadcast = require('./broadcast.js'); has been added at the top of the file.
Here's the complete broken code:
'use strict'
//point to Telnet library
const Telnet = require('ranvier-telnet');
const logger = require('./logger.js');
const broadcast = require('./broadcast.js');
var connections = []
var clients = []
//had to call message as a global variable
//Asan's timestamp function
//telnetstuff
console.log("Starting...");
let server = new Telnet.TelnetServer(rawSocket => {
let telnetSocket = new Telnet.TelnetSocket();
//attaching socket
telnetSocket.attach(rawSocket);
//setting telnet options
telnetSocket.telnetCommand(Telnet.Sequences.WILL, Telnet.Options.OPT_EOR);
//giving clients a name
telnetSocket.name = rawSocket.remoteAddress + ":" + rawSocket.remotePort
//pushing client names to array
clients.push(telnetSocket.name);
//pushing client connections to an array
connections.push(rawSocket);
console.log(`${telnetSocket.name} has connected`)
logger(`${telnetSocket.name} has connected`)
broadcast(telnetSocket.name + " connected.")
// handle input
telnetSocket.on('data', function (data) {
broadcast (telnetSocket.name + ">" + data, telnetSocket);
});
//removing client/connection from array
rawSocket.on('end', function () {
clients.splice(clients.indexOf(telnetSocket), 1);
connections.splice(connections.indexOf(rawSocket), 1);
broadcast(telnetSocket.name + " has left.\n");
logger(telnetSocket.name + " has left.");
console.log(telnetSocket.name + " has left.");
});
}).netServer
server.listen(4000);
console.log('ServerRunning...');
logger('>Server started.');
What am I missing here? Also, I apologize in advance: this is my first question ever asked, and I've spent much of today just figuring out how to ask it, so maybe I'm not using the correct lingo/terms. Any help is appreciated.
refactor\broadcast.js:5
connections.forEach(function (connection) {
^
ReferenceError: connections is not defined
In Node.js, when you declare a variable outside of any function definition, it is scoped to that file (module) only. (This is different from browser JavaScript.) If you want something to be accessible from outside, you need to export it:
module.exports.connections = connections;
Then import it into the other file:
const connections = require(myFile);
This will work as long as you don't reassign the variable in either file; if you do, the two files will end up pointing at separate objects. Mutating it, calling methods on it, etc. will work fine.
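Since main.js already requires broadcast.js, exporting connections from main.js would create a circular require. One way around that (my own suggestion, not something your code requires) is to keep the shared array in its own tiny module; the file name connections.js below is just an example:
// connections.js - the shared array lives in its own module
module.exports = [];

// broadcast.js
'use strict';
const connections = require('./connections.js');

function broadcast(message, sender) {
    connections.forEach(function (connection) {
        // don't send the message back to the sender
        if (connection === sender) return;
        connection.write(`${message} \n`);
    });
}

module.exports = broadcast;

// main.js (only the relevant lines)
const connections = require('./connections.js');
const broadcast = require('./broadcast.js');
// ...push rawSocket onto connections exactly as before; broadcast() now sees
// the same array, because both files get the same cached module instance.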
I am using service workers to intercept requests for me and provide the responses to the fetch requests by communicating with a Web worker (also created from the same parent page).
I have used message channels for direct communication between the worker and service worker. Here is a simple POC I have written:
var otherPort, parentPort;
var dummyObj;
var DummyHandler = function()
{
this.onmessage = null;
var selfRef = this;
this.callHandler = function(arg)
{
if (typeof selfRef.onmessage === "function")
{
selfRef.onmessage(arg);
}
else
{
console.error("Message Handler not set");
}
};
};
function msgFromW(evt)
{
console.log(evt.data);
dummyObj.callHandler(evt);
}
self.addEventListener("message", function(evt) {
var data = evt.data;
if(data.msg === "connect")
{
otherPort = evt.ports[1];
otherPort.onmessage = msgFromW;
parentPort = evt.ports[0];
parentPort.postMessage({"msg": "connect"});
}
});
self.addEventListener("fetch", function(event)
{
var url = event.request.url;
var urlObj = new URL(url);
if(!isToBeIntercepted(url))
{
return fetch(event.request);
}
url = decodeURI(url);
var key = processURL(url).toLowerCase();
console.log("Fetch For: " + key);
event.respondWith(new Promise(function(resolve, reject){
dummyObj = new DummyHandler();
dummyObj.onmessage = function(e)
{
if(e.data.error)
{
reject(e.data.error);
}
else
{
var content = e.data.data;
var blob = new Blob([content]);
resolve(new Response(blob));
}
};
otherPort.postMessage({"msg": "content", param: key});
}));
});
Roles of the ports:
otherPort: Communication with worker
parentPort: Communication with parent page
In the worker, I have a database say this:
var dataBase = {
"file1.txt": "This is File1",
"file2.txt": "This is File2"
};
The worker just serves the correct data according to the key sent by the service worker. In reality these will be very large files.
The problem I am facing with this is the following:
Since I am using a global dummyObj, the older dummyObj (and hence the older onmessage) is lost, and only the latest request gets responded to with the received data.
In fact, file2 gets "This is File1", because the latest dummyObj is the one for file2.txt, but the worker sends the data for file1.txt first.
I tried this by creating iframes directly, and all the requests inside them are intercepted:
<html>
<head></head>
<body><iframe src="tointercept/file1.txt" ></iframe><iframe src="tointercept/file2.txt"></iframe>
</body>
</html>
Here is what I get as output: both iframes end up showing "This is File1".
One approach could be to write all the files that could be fetched into IndexedDB in the worker before creating the iframe, and then fetch them from IndexedDB in the service worker. But I don't want to save all the resources in IndexedDB, so this approach is not what I want.
Does anybody know a way to accomplish what I am trying to do in some other way? Or is there a fix to what I am doing.
Please Help!
UPDATE
I have got this to work by queuing the dummyObjs in a global queue instead of keeping a single global object. On receiving a response from the worker in msgFromW, I dequeue an element and call its callHandler function.
But I am not sure whether this is a reliable solution, as it assumes that responses arrive in the same order as the requests were sent. Is this assumption correct?
I'd recommend wrapping your message passing between the service worker and the web worker in promises, and then pass a promise that resolves with the data from the web worker to fetchEvent.respondWith().
The promise-worker library can automate this promise-wrapping for you, or you could do it by hand, using this example as a guide.
If you were using promise-worker, your code would look something like:
var promiseWorker = new PromiseWorker(/* your web worker */);
self.addEventListener('fetch', function(fetchEvent) {
if (/* some optional check to see if you want to handle this event */) {
fetchEvent.respondWith(promiseWorker.postMessage(/* file name */));
}
});
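If you'd rather wire it up by hand, a rough sketch of the same idea inside your service worker could look like the following. The message shape (the added id field) and the sendToWorker helper are my own additions, not part of any library, and the worker would have to echo the id back alongside the data:
var nextId = 0;
var pending = {};               // request id -> { resolve, reject }

function sendToWorker(key) {
    return new Promise(function (resolve, reject) {
        var id = nextId++;
        pending[id] = { resolve: resolve, reject: reject };
        otherPort.postMessage({ msg: "content", param: key, id: id });
    });
}

// otherPort.onmessage = msgFromW stays the same as in your "connect" handler.
function msgFromW(evt) {
    var entry = pending[evt.data.id];
    if (!entry) return;
    delete pending[evt.data.id];
    if (evt.data.error) {
        entry.reject(evt.data.error);
    } else {
        entry.resolve(new Response(new Blob([evt.data.data])));
    }
}

self.addEventListener("fetch", function (event) {
    var url = decodeURI(event.request.url);
    if (!isToBeIntercepted(url)) return;
    var key = processURL(url).toLowerCase();
    event.respondWith(sendToWorker(key));
});
Because each request resolves its own promise keyed by id, out-of-order responses from the worker are no longer a problem.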
I'm playing with Node.js and WebSockets. There is the upgrade event with its head parameter; as I understood from here, that is basically data that directly trails the headers. But for my use case it's always empty, so I don't really know what it means. I'd be glad if someone could provide a simple use case where the head parameter of the upgrade event isn't empty.
Looking at the source that emits that upgrade event in the node repo, you'll see the following implementation:
function socketOnData(d) {
assert(!socket._paused);
debug('SERVER socketOnData %d', d.length);
var ret = parser.execute(d);
onParserExecuteCommon(ret, d);
}
function onParserExecute(ret, d) {
debug('SERVER socketOnParserExecute %d', ret);
onParserExecuteCommon(ret, undefined);
}
function onParserExecuteCommon(ret, d) {
if (ret instanceof Error) {
debug('parse error');
socket.destroy(ret);
} else if (parser.incoming && parser.incoming.upgrade) {
// Upgrade or CONNECT
var bytesParsed = ret;
var req = parser.incoming;
debug('SERVER upgrade or connect', req.method);
if (!d)
d = parser.getCurrentBuffer();
socket.removeListener('data', socketOnData);
socket.removeListener('end', socketOnEnd);
socket.removeListener('close', serverSocketCloseListener);
unconsume(parser, socket);
parser.finish();
freeParser(parser, req, null);
parser = null;
var eventName = req.method === 'CONNECT' ? 'connect' : 'upgrade';
if (EventEmitter.listenerCount(self, eventName) > 0) {
debug('SERVER have listener for %s', eventName);
var bodyHead = d.slice(bytesParsed, d.length);
// TODO(isaacs): Need a way to reset a stream to fresh state
// IE, not flowing, and not explicitly paused.
socket._readableState.flowing = null;
self.emit(eventName, req, socket, bodyHead);
...
The passed parameter, bodyHead, reflects the data from the parameter d of the socketOnData function, which is the default socket data event handler. On the other hand, when onParserExecuteCommon is called from onParserExecute, d starts out undefined and is filled in from parser.getCurrentBuffer(). I'd have to look through the source more to understand which case applies depending on how your server is implemented. Maybe you could enable the debug logs to see which methods are being called.
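To see a non-empty head in practice, the trailing bytes have to arrive in the same chunk as the request headers. A minimal sketch (the port and protocol name are arbitrary choices of mine) that usually reproduces this against a local server:
const http = require('http');
const net = require('net');

const server = http.createServer();
server.on('upgrade', (req, socket, head) => {
    // head holds whatever bytes trailed the request headers in the parsed chunk
    console.log('head:', head.toString());   // typically logs the trailing bytes below
    socket.end();
});

server.listen(8080, () => {
    // Raw client: send the headers and the trailing bytes in ONE write, so they
    // normally arrive in the same TCP segment and get parsed from the same buffer.
    const client = net.connect(8080, () => {
        client.write(
            'GET / HTTP/1.1\r\n' +
            'Host: localhost:8080\r\n' +
            'Connection: Upgrade\r\n' +
            'Upgrade: my-protocol\r\n' +
            '\r\n' +
            'these bytes trail the headers'
        );
    });
});
A typical WebSocket client waits for the 101 response before sending any frames, which is why head is empty in that case.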
Is it possible to have a socket.io client respond to all events without to have specify each event individually?
For example, something like this (which obviously doesn't work right now):
var socket = io.connect("http://myserver");
socket.on("*", function(){
// listen to any and all events that are emitted from the
// socket.io back-end server, and handle them here.
// is this possible? how can i do this?
});
I want this callback function to be called when any / all events are received by the client-side socket.io code.
Is this possible? How?
Updated solution for socket.io-client 1.3.7
var onevent = socket.onevent;
socket.onevent = function (packet) {
var args = packet.data || [];
onevent.call (this, packet); // original call
packet.data = ["*"].concat(args);
onevent.call(this, packet); // additional call to catch-all
};
Use like this:
socket.on("*",function(event,data) {
console.log(event);
console.log(data);
});
None of the answers worked for me, though the ones from Mathias Hopf and Maros Pixel came close; this is my adjusted version.
NOTE: this only catches custom events, not connect/disconnect etc
It looks like the socket.io library stores these in a dictionary. As such, I don't think this would be possible without modifying the source.
From source:
EventEmitter.prototype.on = function (name, fn) {
if (!this.$events) {
this.$events = {};
}
if (!this.$events[name]) {
this.$events[name] = fn;
} else if (io.util.isArray(this.$events[name])) {
this.$events[name].push(fn);
} else {
this.$events[name] = [this.$events[name], fn];
}
return this;
};
Finally, there is a module called socket.io-wildcard which allows using wildcards on client and server side
var io = require('socket.io')();
var middleware = require('socketio-wildcard')();
io.use(middleware);
io.on('connection', function(socket) {
socket.on('*', function(){ /* … */ });
});
io.listen(8000);
Here you go ...
var socket = io.connect();
var globalEvent = "*";
socket.$emit = function (name) {
if(!this.$events) return false;
for(var i=0;i<2;++i){
if(i==0 && name==globalEvent) continue;
var args = Array.prototype.slice.call(arguments, 1-i);
var handler = this.$events[i==0?name:globalEvent];
if(!handler) handler = [];
if ('function' == typeof handler) handler.apply(this, args);
else if (io.util.isArray(handler)) {
var listeners = handler.slice();
// use a separate index so the outer loop's i isn't clobbered
for (var j=0, l=listeners.length; j<l; j++)
listeners[j].apply(this, args);
} else return false;
}
return true;
};
socket.on(globalEvent,function(event){
var args = Array.prototype.slice.call(arguments, 1);
console.log("Global Event = "+event+"; Arguments = "+JSON.stringify(args));
});
This will catch events like connecting, connect, disconnect, reconnecting too, so do take care.
Note: this answer is only valid for socket.io 0.x
You can override socket.$emit
With the following code you have two new functions to:
Trap all events
Trap only events which are not trapped by the old method (it is a default listener)
var original_$emit = socket.$emit;
socket.$emit = function() {
var args = Array.prototype.slice.call(arguments);
original_$emit.apply(socket, ['*'].concat(args));
if(!original_$emit.apply(socket, arguments)) {
original_$emit.apply(socket, ['default'].concat(args));
}
}
socket.on('default',function(event, data) {
console.log('Event not trapped: ' + event + ' - data:' + JSON.stringify(data));
});
socket.on('*',function(event, data) {
console.log('Event received: ' + event + ' - data:' + JSON.stringify(data));
});
As per the v3.0 documentation:
socket.onAny((event, ...args) => {
console.log(`got ${event}`);
});
The current (Apr 2013) GitHub doc on exposed events mentions a socket.on('anything'). It appears that 'anything' is a placeholder for a custom event name, not an actual keyword that would catch any event.
I've just started working with web sockets and Node.js, and immediately had a need to handle any event, as well as to discover what events were sent. I can't quite believe this functionality is missing from socket.io.
socket.io-client 1.7.3
As of May 2017, I couldn't make any of the other solutions work quite how I wanted, so I made an interceptor. I'm using it in Node.js for testing purposes only:
var socket1 = require('socket.io-client')(socketUrl)
socket1.on('connect', function () {
console.log('socket1 did connect!')
var oldOnevent = socket1.onevent
socket1.onevent = function (packet) {
if (packet.data) {
console.log('>>>', {name: packet.data[0], payload: packet.data[1]})
}
oldOnevent.apply(socket1, arguments)
}
})
References:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Function/apply
https://github.com/socketio/socket.io-client/blob/ff4cb3eed04a95c9725b8aaba8b64fa9fa1ca413/lib/socket.js#L257
Because your question was pretty general in asking for a solution, I'll pitch this one, which requires no hacking of the code, just a change in how you use the socket.
I just decided to have my client app send the exact same event, but with a different payload.
socket.emit("ev", { "name" : "miscEvent1"} );
socket.emit("ev", { "name" : "miscEvent2"} );
And on the server, something like...
socket.on("ev", function(eventPayload) {
myGenericHandler(eventPayload.name);
});
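myGenericHandler isn't shown in the answer; purely as an illustration, it could be a lookup table keyed by the name carried in the payload (the handler names below are made up):
// Illustrative only: dispatch on the name carried in the payload.
var handlers = {
    miscEvent1: function () { console.log("handling miscEvent1"); },
    miscEvent2: function () { console.log("handling miscEvent2"); }
};

function myGenericHandler(name) {
    var handler = handlers[name];
    if (handler) {
        handler();
    } else {
        console.warn("no handler registered for " + name);
    }
}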
I don't know if always using the same event could cause any issues, maybe collisions of some kind at scale, but this served my purposes just fine.
There is a long discussion about this topic going on at the Socket.IO repository issue page. There are a variety of solutions posted there (e.g., overriding EventEmitter with EventEmitter2). lmjabreu released another solution a couple of weeks ago: an npm module called socket.io-wildcard that patches a wildcard event onto Socket.IO (works with the current Socket.IO, ~0.9.14).
Even though this is an old question, I had the same problem and solved it using the native socket in Node.js, which has a 'data' event fired every time some data comes in. This is what I've done so far:
const net = require('net')
const server = net.createServer((socket) => {
// 'connection' listener.
console.log('client connected')
// The stuff I was looking for
socket.on('data', (data) => {
console.log(data.toString())
})
socket.on('end', () => {
console.log('client disconnected')
})
})
server.on('error', (err) => {
throw err;
})
server.listen(8124, () => {
console.log('server bound');
})
None of the methods I found (including socket.io-wildcard and socketio-wildcard) worked for me. Apparently there is no $emit in socket.io 1.3.5...
After reading socket.io code, I patched up the following which DID work:
var Emitter = require('events').EventEmitter;
var emit = Emitter.prototype.emit;
[...]
var onevent = socket.onevent;
socket.onevent = function (packet) {
var args = ["*"].concat (packet.data || []);
onevent.call (this, packet); // original call
emit.apply (this, args); // additional call to catch-all
};
This might be a solution for others as well. However, ATM I don't exactly understand why nobody else seems to have issues with the existing "solutions"?!? Any ideas? Maybe it's my old node version (0.10.31)...
@Matthias Hopf's answer
Updated answer for v1.3.5. There was a bug with args if you want to listen on the old event and the * event together.
var Emitter = require('events').EventEmitter;
var emit = Emitter.prototype.emit;
// [...]
var onevent = socket.onevent;
socket.onevent = function (packet) {
var args = packet.data || [];
onevent.call (this, packet); // original call
emit.apply (this, ["*"].concat(args)); // additional call to catch-all
};
In v4, Socket.IO has Catch-all listeners. For example:
socket.prependAny(() => {
console.log("This will be fired first");
});
I'm using Angular 6 and the npm package: ngx-socket-io
import { Socket } from "ngx-socket-io";
...
constructor(private socket: Socket) { }
...
After connecting the socket, I use this code, which handles all custom events...
const onevent = this.socket.ioSocket.onevent;
this.socket.ioSocket.onevent = function (packet: any) {
const args = packet.data || [];
onevent.call(this, packet); // original call
packet.data = ["*"].concat(args);
onevent.call(this, packet); // additional call to catch-all
};
this.socket.on("*", (eventName: string, data: any) => {
if (typeof data === 'object') {
console.log(`socket.io event: [${eventName}] -> data: [${JSON.stringify(data)}]`);
} else {
console.log(`socket.io event: [${eventName}] -> data: [${data}]`);
}
});