NodeJS ZeroMQ - When is the producer ready to send messages after connection? - javascript

I have done some small research on the patterns supported by ZeroMQ. I would like to describe a problem with the PUB/SUB pattern, though I probably hit the same problem in my recent project with the PUSH/PULL pattern as well. I use the NodeJS zeromq implementation.
I prepared two examples (server.js & client.js). I noticed that the first message from server.js is lost every time I restart the server (a message is sent every second): client.js doesn't get that first message. It is probably caused by sending too soon after startup. When I start sending messages after some delay (e.g. 1 second), everything works fine. I think zmq needs some time to initialize the connection between publisher and subscriber.
I would like to know when the producer (server) is ready to send messages to subscribed clients. How do I get this information?
I don't understand why client.js, which is connected and subscribed to messages, doesn't get them just because the server is not ready to serve subscriptions right after a restart.
Maybe it works like this by design.
server.js:
var zmq = require('zmq');
console.log('server zmq: ' + zmq.version);

var publisher = zmq.socket('pub');
publisher.bindSync("tcp://*:5555");

var i = 0;
var msg = "get_status OK ";

function sendMsg() {
  console.log(msg + i);
  publisher.send(msg + i);
  i++;
  setTimeout(sendMsg, 1000);
}
sendMsg();

process.on('SIGINT', function() {
  publisher.close();
  process.exit();
});
client.js:
var zmq = require('zmq');
console.log('client zmq: ' + zmq.version);

var subscriber = zmq.socket('sub');
subscriber.subscribe("get_status");

subscriber.on('message', function(data) {
  console.log(data.toString());
});

subscriber.connect("tcp://127.0.0.1:5555");

process.on('SIGINT', function() {
  subscriber.close();
  process.exit();
});

The node zmq lib repo lists the supported monitoring events. Subscribing to these will allow you to monitor your connection, in this case via the accept event. However, don't forget that you'll also have to call the monitor() function on the socket to activate monitoring.
You should end up with something like:
var publisher = zmq.socket('pub');

publisher.on('accept', function(fd, ep) {
  sendMsg();
});

publisher.monitor(100, 0);
publisher.bindSync("tcp://*:5555");
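While debugging, it can also help to log the other monitor events and to make sure the send loop is only started once. A minimal sketch along those lines, assuming the same zmq npm package and the sendMsg() function from server.js above:

var zmq = require('zmq');
var publisher = zmq.socket('pub');

// log a few of the monitor events listed in the zmq lib repo while debugging
['listen', 'accept', 'accept_error', 'disconnect', 'close'].forEach(function(event) {
  publisher.on(event, function(value, endpoint) {
    console.log('monitor event:', event, endpoint);
  });
});

// start the send loop only once, on the first accepted subscriber connection
var started = false;
publisher.on('accept', function(fd, ep) {
  if (!started) {
    started = true;
    sendMsg();
  }
});

publisher.monitor(100, 0);          // check for monitor events every 100 ms
publisher.bindSync("tcp://*:5555");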

Related

Sending data in a timer by JS WebRTC crashes when reloading before the end

I have this simple piece of code in a server.js JavaScript file run by node:
function multiStep(myConnection, data) {
  var i = 0;
  var myTimer = setInterval(function() {
    if (i < data.length) {
      var element = JSON.stringify(data[i]);
      console.log("mando elemento: " + element);
      myConnection.send(element);
      i++;
    } else {
      clearInterval(myTimer);
    }
  }, 3000);
}
//require our websocket library
var WebSocketServer = require('ws').Server;
//creating a websocket server at port 9090
var wss = new WebSocketServer({port: 9090});

//when a user connects to our server
wss.on('connection', function(connection) {
  loadJSON(function(response) {
    //when server gets a message from a connected user
    connection.on('message', function(message) {
      console.log("Got message from a user:", message);
    });
    var json = JSON.parse(response);
    multiStep(connection, json, 0);
  });
});
loadJSON simply loads a JSON data file from another web site.
When I run the client application the first time, or after the timer has ended, everything goes fine. Yet if I reload the page while the timer has not finished, I get a crash when the server tries to use the connection of the old page, with this report:
/var/www/html/MATERIALI/phonegap/node_modules/ws/lib/WebSocket.js:219
    else throw new Error('not opened');
               ^
Error: not opened
    at WebSocket.send (/var/www/html/MATERIALI/phonegap/node_modules/ws/lib/WebSocket.js:219:16)
    at null.<anonymous> (/var/www/html/MATERIALI/phonegap/WebRTC/server.js:36:9)
    at wrapper [as _onTimeout] (timers.js:261:14)
    at Timer.listOnTimeout [as ontimeout] (timers.js:112:15)
As a matter of fact, I could simply ignore the old session, given that the page has been reloaded. How do I stop the server from crashing in these circumstances?
Ok, I think I found the solution; function multiStep becomes:
function multiStep(myConnection, data) {
  var i = 0;
  clearInterval(myTimer);   // myTimer now lives in an outer scope (no longer declared with var here)
  myTimer = setInterval(function() {
    if (i < data.length) {
      var element = JSON.stringify(data[i]);
      console.log("mando elemento: " + element);
      try {
        myConnection.send(element);
        console.log("mandato elemento");
      } catch (err) {
        console.log('Websocket error: %s', err);
      }
      i++;
    } else {
    }
  }, 3000);
}
And it does not crash any longer.
You need to do some checking along the way. Your code assumes that everything works 100% of the time.
var json = JSON.parse(response);
multiStep(connection, json, 0);
You assume there is data in response (it might be empty, or contain non-JSON data).
You should also check that json is a valid array before passing it to multiStep.
The function multiStep also assumes that data.length will return something numeric.
This may not be the complete answer, but it should give you a start on making your code more robust.
It's probably failing at myConnection.send(element); but that is probably only a symptom of the missing checks along the way (you can also check whether myConnection is still valid before you send something to it).
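A minimal sketch of those checks, reusing the loadJSON, connection and multiStep names from the question (the exact error handling is up to you):

loadJSON(function(response) {
  var json;
  try {
    json = JSON.parse(response);   // response might be empty or contain non-JSON data
  } catch (err) {
    console.log('could not parse response as JSON:', err);
    return;
  }
  if (!Array.isArray(json)) {      // multiStep expects an array with a numeric length
    console.log('expected a JSON array, got:', typeof json);
    return;
  }
  multiStep(connection, json);
});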
Referring to https://developer.mozilla.org/en-US/docs/Web/API/WebSocket, you should be able to check the myConnection.readyState value:
Ready state constants
These constants are used by the readyState attribute to describe the state of the WebSocket connection.
Constant Value Description
CONNECTING 0 The connection is not yet open.
OPEN 1 The connection is open and ready to communicate.
CLOSING 2 The connection is in the process of closing.
CLOSED 3 The connection is closed or couldn't be opened.
Your code will look like this now:
console.log("mando elemento: " + element);
if (myConnection.readyState === 1) {
  myConnection.send(element);
} else {
  console.log("web socket not open");
}
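Since the crash comes from the old page's timer firing against a dead connection, you may also want to stop that timer entirely once the socket is no longer open. A small sketch of that variation inside multiStep:

if (myConnection.readyState === 1) {   // 1 === OPEN
  myConnection.send(element);
} else {
  console.log("web socket not open, stopping timer");
  clearInterval(myTimer);              // stop resending on a closed connection
}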

Processing external triggers in Node.js applications

I have a Node.js daemon application that runs on my Debian home server 24/7.
I would like it to process triggers generated by motion, a program installed on the same machine that monitors the video signal from cameras. Motion can execute a command on certain events, for example when motion is detected or a camera connection is lost.
I could write a script that processes these events and records them in a database, and have my daemon continuously poll that database. But that would be highly inefficient, right?
What would be the optimal way to process external triggers in Node.js applications?
Have a look at dnode. It allows you to do exactly what you are looking for.
In your daemon you will have something like this.
var dnode = require('dnode');

var server = dnode({
  transform: function (eventObject, cb) {
    //handle the event
    cb(callbackDataHere)
  }
});
server.listen(5004);
You will then need to create the command that Motion will call:
var dnode = require('dnode');

var d = dnode.connect(5004);
d.on('remote', function (remote) {
  var eventDataToSend = {};
  remote.transform(eventDataToSend, function (s) {
    //Do stuff with arguments sent back from the callback on the server
  });
});
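As a usage note, the script above can pull the event details straight from its command-line arguments, so a single command configured in Motion can report different events. A sketch, assuming Motion invokes it as something like node notify.js motion_detected camera1 (the script name and arguments are made up for illustration):

var dnode = require('dnode');

var d = dnode.connect(5004);
d.on('remote', function (remote) {
  var eventDataToSend = {
    type: process.argv[2],          // e.g. "motion_detected" or "camera_lost"
    camera: process.argv[3],        // e.g. "camera1"
    time: new Date().toISOString()
  };
  remote.transform(eventDataToSend, function (s) {
    d.end();                        // close the connection once the daemon has replied
  });
});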

PeerJS Auto Reconnect

I have recently developed a web app using PeerJS, and am trying to add reconnect functionality.
Basically, my app works by someone creating a server that clients then connect to. The person running the server can control what the clients are doing, but it's basically two-way communication.
If a client disconnects, they simply reconnect and it works normally. However, if the server user refreshes the page, or their computer crashes, they need to be able to re-establish control over the clients.
The start of this is regaining the original connection ID and peer API ID, which is fine and easy as they are stored in a database and assigned a unique ID the server user can use to query them. Then, to enable the client to reconnect, I do this upon close:
// connection is closed by the host involuntarily...
conn.on('close', function() {
  // if the client's connection closes, set up a reconnect request loop - when the host takes back control
  // the client will auto reconnect...
  connected = false;
  conn = null;
  var reconnect_timer = setInterval(function () {
    console.log('reconnecting...'); // make a fancy animation here...
    conn = peer.connect(connectionid, {metadata: JSON.stringify({'type': 'hello', 'username': username})});
    // upon connection
    conn.on('open', function() { // if this fails need to provide an error message... DO THIS SOON
      // run the connect function...
      connected = true;
      connect(conn);
    });
    // didn't connect yet
    conn.on('error', function(err) {
      connected = false;
    });
    if (connected === true) {
      clearInterval(reconnect_timer);
    }
  }, 1000);
});
This appears to work, as on the server end the client looks like it has reconnected - the connect function has fired etc. However, messages can't be sent between them, and the client console says:
Error: Connection is not open. You should listen for the `open` event before sending messages.(…)
even though the 'open' event is listened for, as shown above...
I hope this is clear - any help is appreciated :)
So in the end, to create an auto-reconnect script, I simply dealt with the client end of things, ensuring the server was set to the same api_key (for cloud servers) and key:
peer = new Peer(return_array.host_id, {key: return_array.api_key});
and then having the client do the following when the connection closes:
// connection is closed by the host involuntarily...
conn.on('close', function() {
  // if the client's connection closes, set up a reconnect request loop - when the host takes back control
  // the client will auto reconnect...
  peer.destroy();      // destroy the link
  connected = false;   // set the connected flag to false
  conn = null;         // destroy the conn
  peer = null;         // destroy the peer
  // set a variable which means function calls to launchPeer will not overlap
  var run_next = true;
  // periodically attempt to reconnect
  reconnect_timer = setInterval(function() {
    if (connected === false && run_next === true) {
      run_next = false; // stop this bit rerunning before launchPeer has finished...
      if (launchPeer(false) === true) {
        clearInterval(reconnect_timer);
      } else {
        run_next = true;
      }
    }
  }, 1000);
});
Here launchPeer attempts to launch a new peer. To ensure continuity, the new ID from the client replaces the old ID from the client and everything is a smooth takeover. The hardest part in the end was having the setInterval fire only once, which is achieved (badly...) through the use of boolean flags.
Thanks to anybody who read and thought how they could help :)
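If the boolean-flag juggling bothers you, an alternative is to chain setTimeout calls so that a new attempt is only scheduled once the previous one has finished. A rough sketch, where launchPeerAsync is a hypothetical callback-based version of the launchPeer function mentioned above:

function tryReconnect() {
  if (connected) return;                 // already back online, nothing to do
  console.log('reconnecting...');
  launchPeerAsync(function (ok) {        // hypothetical: calls back with true once the peer is open
    if (ok) {
      connected = true;
    } else {
      setTimeout(tryReconnect, 1000);    // schedule the next attempt only after this one failed
    }
  });
}

conn.on('close', function() {
  connected = false;
  tryReconnect();
});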

Storing RabbitMQ connection in NodeJs

I am currently forced to create a new RabbitMQ connection every time a user loads a page on my website.
This creates a new TCP connection each time. I'm trying to reduce the number of TCP connections I make to Rabbit with the NodeJS AMQP plugin. Here is what I have:
var ex_conn = get_connection(uri); //http:rabbitm.com
if (ex_conn == false) {
  var tempConn = amqp.createConnection({
    url: uri
  });
  connections.push({
    host: uri,
    obj: tempConn
  });
} else {
  var tempConn = ex_conn.obj;
}
The issue I'm running into is that if I try to do:
tempConn.on('ready', function() {
});
then the ready handler does not get triggered. I'm assuming that is because the ready callback has already fired and is not going to be re-triggered. What I'm looking to do is bind a new queue by doing:
tempConn.queue('', {});
Any thoughts on how to get around this issue are much appreciated.
Thanks.
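One way around the "ready already fired" problem is to record readiness alongside each cached connection, so later page loads can either wait for 'ready' or proceed immediately. A hedged sketch reusing the get_connection and connections structures from the question (the ready flag and the withConnection helper are additions for illustration):

function withConnection(uri, cb) {
  var ex_conn = get_connection(uri);
  if (ex_conn == false) {
    var tempConn = amqp.createConnection({ url: uri });
    var entry = { host: uri, obj: tempConn, ready: false };
    connections.push(entry);
    tempConn.on('ready', function () {
      entry.ready = true;
      cb(tempConn);
    });
  } else if (ex_conn.ready) {
    cb(ex_conn.obj);                          // 'ready' already fired, call back straight away
  } else {
    ex_conn.obj.once('ready', function () { cb(ex_conn.obj); });
  }
}

// usage: declare a queue on the shared connection for this page load
withConnection(uri, function (conn) {
  conn.queue('', { exclusive: true }, function (q) {
    // bind / subscribe here
  });
});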

Connecting to an already established UNIX socket with node.js?

I am working on a node.js application that will connect to a UNIX socket (on a Linux machine) and facilitate communication between a web page and that socket. So far, I have been able to create a socket and communicate back and forth with this code in my main app.js:
var net = require('net');
var fs = require('fs');
var socketPath = '/tmp/mysocket';

fs.stat(socketPath, function(err) {
  if (!err) fs.unlinkSync(socketPath);
  var unixServer = net.createServer(function(localSerialConnection) {
    localSerialConnection.on('data', function(data) {
      // data is a buffer from the socket
    });
    // write to socket with localSerialConnection.write()
  });
  unixServer.listen(socketPath);
});
This code causes node.js to create a UNIX socket at /tmp/mysocket, and I get good communication when testing with nc -U /tmp/mysocket on the command line. However...
I want to establish a connection to an already existing UNIX socket from my node.js application. With my current code, if I create a socket from the command line (nc -Ul /tmp/mysocket) and then run my node.js application, there is no communication between the socket and my application (the 'connect' event is never fired on the node.js server object).
Any tips on how to go about accomplishing this? My experiments with the node.js function net.createSocket instead of net.createServer have so far failed, and I'm not sure that's even the right track.
The method you're looking for is net.createConnection(path):
var client = net.createConnection("/tmp/mysocket");

client.on("connect", function() {
  // ... do something when you connect ...
});

client.on("data", function(data) {
  // ... do stuff with the data ...
});
I was just trying to get this to work with Linux's abstract sockets and found them to be incompatible with node's net library. Instead, the following code can be used with the abstract-socket library:
const abstract_socket = require('abstract-socket');

let client = abstract_socket.connect('\0my_abstract_socket');

client.on("connect", function() {
  // ... do something when you connect ...
});

client.on("data", function(data) {
  // ... do stuff with the data ...
});
You can also connect to a socket like this:
http://unix:/path/to/my.sock:
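That unix: URL form is what some HTTP client libraries accept for talking HTTP over a UNIX socket; with Node's core http module the equivalent is the socketPath option. A small sketch, assuming something is serving HTTP on that socket:

var http = require('http');

var req = http.request({
  socketPath: '/path/to/my.sock',   // connect through the UNIX socket instead of host/port
  path: '/'                         // the HTTP path requested over that socket
}, function (res) {
  res.on('data', function (chunk) {
    console.log(chunk.toString());
  });
});
req.end();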
