Connecting to an already established UNIX socket with node.js? - javascript

I am working on a node.js application that will connect to a UNIX socket (on a Linux machine) and facilitate communication between a web page and that socket. So far, I have been able to create a socket and communicate back and forth with this code in my main app.js:
var net = require('net');
var fs = require('fs');
var socketPath = '/tmp/mysocket';

fs.stat(socketPath, function(err) {
    if (!err) fs.unlinkSync(socketPath);

    var unixServer = net.createServer(function(localSerialConnection) {
        localSerialConnection.on('data', function(data) {
            // data is a buffer from the socket
        });
        // write to socket with localSerialConnection.write()
    });
    unixServer.listen(socketPath);
});
This code causes node.js to create a UNIX socket at /tmp/mysocket and I am getting good communication by testing with nc -U /tmp/mysocket on the command line. However...
I want to establish a connection to an already existing UNIX socket from my node.js application. With my current code, if I create a socket from the command line (nc -Ul /tmp/mysocket) and then run my node.js application, there is no communication between the socket and my application (the 'connect' event is never fired on the node.js server object).
Any tips on how to go about accomplishing this? My experiments with the node.js function net.createSocket instead of net.createServer have so far failed, and I'm not sure if that's even the right track.

The method you're looking for is net.createConnection(path):
var net = require('net');

var client = net.createConnection("/tmp/mysocket");
client.on("connect", function() {
    // do something when you connect
});
client.on("data", function(data) {
    // do stuff with the data
});
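One thing worth adding (a small sketch, not part of the original answer): if nothing exists at that path yet, the client emits an 'error' event instead of 'connect', so it's worth handling:
client.on("error", function(err) {
    // e.g. ENOENT if /tmp/mysocket does not exist yet
    console.error('connection error: ' + err.message);
});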

I was just trying to get this to work with Linux's abstract sockets and found them to be incompatible with node's net library. Instead, the following code can be used with the abstract-socket library:
const abstract_socket = require('abstract-socket');

let client = abstract_socket.connect('\0my_abstract_socket');
client.on("connect", function() {
    // do something when you connect
});
client.on("data", function(data) {
    // do stuff with the data
});

You can also connect to a socket with a URL of this form (understood by some HTTP client libraries, such as the request module):
http://unix:/path/to/my.sock:
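If you're using Node's built-in http module rather than a client library that accepts that URL form, a minimal sketch of the equivalent is to pass the socket path through the socketPath option (the socket path and request path here are placeholders):
var http = require('http');

// Issue a GET over a UNIX socket via the built-in http module.
var req = http.request({
    socketPath: '/path/to/my.sock',
    path: '/',
    method: 'GET'
}, function(res) {
    res.on('data', function(chunk) {
        console.log(chunk.toString());
    });
});
req.end();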

Related

How to get "receiver" socket object in socket.io /lib/client.js file on v > 1.4.0?

I had built interception calls into the socket.io file located under node_modules/socket.io/lib/client.js on version 1.3.7 (at least I think so; however, I have to update to 1.4.5 because of other requirements). These changes let me spoof information coming from the sender socket, and they were applied before the data was passed on to the receiver socket.
Before (around 1.3.7), the method run before sending a packet was the following:
Client.prototype.packet = function(packet, preEncoded, volatile){
    var self = this;
    var sockets = this.sockets[0]; // this holds the socket object
but now (1.4.5) socket.io has changed the call to the following:
Client.prototype.packet = function(packet, opts){
    var sockets = this.sockets[0]; // gives undefined
I looked through the available objects but couldn't find the sockets of the receiving user.
Back in 1.3.7 I was able to effortlessly attach properties to a socket object (e.g. socket.someProperty = 1; in the .js file run by node.js at the root of the server) and later read someProperty back in node_modules/socket.io/lib/client.js whenever the receiver got a packet, so I could intercept the call. Now this no longer works, and I would like to adapt my old code to this new context so that it all functions again.
var socketObject = {};

io.sockets.on('connection', function (client) {
    socketObject[client.id] = {socket: client};

    client.on('data', function (someData) {
        socketObject[client.id].data = someData;
    });

    client.on('disconnect', function() {
        delete socketObject[client.id];
    });
});
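If the underlying issue is that this.sockets is no longer an array in 1.4.x but an object keyed by socket id (which would explain why this.sockets[0] comes back undefined), a hedged sketch of the patched packet method could recover the socket objects by iterating the keys instead. This assumes that layout, so check what this.sockets actually contains in your version:
Client.prototype.packet = function(packet, opts){
    var self = this;
    // Assumption: in 1.4.x this.sockets is keyed by socket id, not by index
    var ids = Object.keys(this.sockets);
    var socket = ids.length ? this.sockets[ids[0]] : undefined;
    if (socket) {
        // read back whatever custom properties you attached on connection,
        // e.g. socket.someProperty, before the packet is encoded and sent
    }
    // ...rest of the original packet logic...
};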

Processing external triggers in Node.js applications

I have a Node.js daemon application that runs on my Debian home server 24/7.
I would like it to process triggers generated by motion, a program installed on the same machine that monitors the video signal from cameras. Motion can execute a command on certain events, for example when motion is detected or a camera connection is lost.
I could write a script that processes these events and records them in a database, and have my daemon continuously poll that database. But that would be highly inefficient, right?
What would be the optimal way to process external triggers in Node.js applications?
Have a look at dnode. It allows you to do exactly what you are looking for.
In your daemon you will have something like this.
var dnode = require('dnode');

var server = dnode({
    transform : function (eventObject, cb) {
        // handle the event
        cb(callbackDataHere);
    }
});
server.listen(5004);
You will then need to create the command that Motion will call
var dnode = require('dnode');

var d = dnode.connect(5004);
d.on('remote', function (remote) {
    var eventDataToSend = {};
    remote.transform(eventDataToSend, function (s) {
        // Do stuff with the arguments sent back from the callback on the server
    });
});
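Motion itself then just needs to be pointed at this command. A hedged example (assuming your motion build supports the on_event_start hook; the script path is hypothetical):
on_event_start node /path/to/motion-client.js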

NodeJS ZeroMQ - When producer is ready to send message after connection?

I have done some small research on the patterns supported by zeromq. I would like to describe a problem with the PUB/SUB pattern, though I have probably also hit it in my recent project with the PUSH/PULL pattern. I use the NodeJS zeromq implementation.
I prepared two examples (server.js & client.js). I noticed that the first message from server.js is lost every time I restart the server (a message is sent every second); client.js never receives that first message. It is probably caused by too short a delay before sending: when I start sending messages only after some time (e.g. 1 second), everything works fine. I think that zmq needs some time to initialize the connection between publisher and subscriber.
I would like to know when the producer (server) is ready to send messages to subscribed clients. How do I get this information?
I don't understand why client.js, which is connected and subscribed for messages, doesn't get the first one just because the server isn't ready to serve subscriptions right after a restart.
Maybe it works like this by design.
server.js:
var zmq = require('zmq');
console.log('server zmq: ' + zmq.version);

var publisher = zmq.socket('pub');
publisher.bindSync("tcp://*:5555");

var i = 0;
var msg = "get_status OK ";

function sendMsg () {
    console.log(msg + i);
    publisher.send(msg + i);
    i++;
    setTimeout(sendMsg, 1000);
}
sendMsg();

process.on('SIGINT', function() {
    publisher.close();
    process.exit();
});
client.js:
var zmq = require('zmq');
console.log('client zmq: ' + zmq.version);

var subscriber = zmq.socket('sub');
subscriber.subscribe("get_status");
subscriber.on('message', function(data) {
    console.log(data.toString());
});
subscriber.connect("tcp://127.0.0.1:5555");

process.on('SIGINT', function() {
    subscriber.close();
    process.exit();
});
The node zmq lib repo lists the supported monitoring events. Subscribing to these will allow you to monitor your connection, in this case via the accept event. However, don't forget that you'll also have to call the monitor() function on the socket to activate monitoring.
You should end up with something like:
var publisher = zmq.socket('pub');

publisher.on('accept', function(fd, ep) {
    sendMsg();
});

publisher.monitor(100, 0);
publisher.bindSync("tcp://*:5555");
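If socket monitoring feels heavy, another approach is possible: a hedged sketch, assuming your zmq binding supports the 'xpub' socket type, is to bind an XPUB socket instead of a plain PUB one. Subscriptions then arrive at the publisher as ordinary messages, so it knows exactly when the first subscriber has joined:
var zmq = require('zmq');

var publisher = zmq.socket('xpub');
publisher.on('message', function(frame) {
    // For XPUB sockets the first byte of the frame is 1 for subscribe,
    // 0 for unsubscribe; the rest of the frame is the topic.
    if (frame[0] === 1) {
        console.log('subscriber joined topic: ' + frame.slice(1).toString());
        sendMsg();
    }
});
publisher.bindSync("tcp://*:5555");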

Connect to socket-io already defined in index.js using external node script

I am using hapijs in my MEAN stack and implemented socket.io (using this for reference: http://matt-harrison.com/using-hapi-js-with-socket-io/). Everything works fine, no problems there. It works great in my application!
However, there is a script I will be running separately via the command line (which will be doing some maintenance on the application) that I was hoping could connect to the same web socket and push messages to clients when data needs to be refreshed.
My index.js taken straight from the example:
var Hapi = require('hapi');

var server = new Hapi.Server();
server.connection({ port: 3000 });

var io = require('socket.io')(server.listener);

io.on('connection', function (socket) {
    socket.emit('Hello');
});

server.start();
I tried to create a separate JS file, and do a:
var socket = require('socket.io');
var io = socket.listen(3000);
Then I passed io around to send a message. This doesn't seem right... I guess I'm wondering if this can even be done. Messing around, I've either created a separate web socket server or no connection to the client at all.
Please let me know if I need to provide more information.
Thanks.
T
In your provided code, you're creating two servers. io.listen() listens on a port as a server.
What you need to do instead to pass messages around is to create a socket.io client in your separate script. There's a separate module for this called socket.io-client, which you can require to be a client:
client.js
var io = require('socket.io-client');
var socket = io('http://localhost:3000');

socket.on('beep', function () {
    console.log('beep');
    socket.emit('boop');
});
server.js
Here's a slightly updated version of your server script too (hapi v9.0.0 has a mandatory callback for server.start()):
var Hapi = require('hapi');

var server = new Hapi.Server();
server.connection({ port: 3000 });

var io = require('socket.io')(server.listener);

io.on('connection', function (socket) {
    socket.emit('beep');
    socket.on('boop', function () {
        console.log('boop');
    });
});

server.start(function () {
    console.log('Started Server!');
});
If you open up a couple of terminals and run these, you should see messages passed between them and beep and boop logged out.
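To push a refresh to the browsers from the maintenance script, the hapi server has to do the fan-out, since the script connects as just another client. A minimal sketch (the event names here are made up for illustration): have the server listen for a custom event from the script and rebroadcast it with io.emit():
// inside io.on('connection', ...) in server.js:
socket.on('maintenance:refresh', function () {
    io.emit('refresh'); // broadcast to every connected client, browsers included
});
The maintenance script then just calls socket.emit('maintenance:refresh') whenever its work requires clients to reload.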

Listening for outside events. Bash to NodeJS bridge

Being inside of a NodeJS process, how can I listen for events from bash?
For example
NodeJS side
obj.on("something", function (data) {
    console.log(data);
});
Bash side
$ do-something 'Hello World'
Then the "Hello World" message should appear on the NodeJS process's stdout.
How can I do this?
I guess it's related to signal events.
The problem with using signals is that you can't pass arguments and most of them are reserved for system use already (I think SIGUSR2 is really the only safe one for node since SIGUSR1 starts the debugger and those are the only two that are supposed to be for user-defined conditions).
Instead, the best way that I've found to do this is by using UNIX sockets; they're designed for inter process communication.
The easiest way to set up a UNIX socket in node is by setting up a standard net server with net.createServer() and then simply passing a file path to server.listen() to create the socket at the path you specified. Note: it's important that a file at that path doesn't already exist, otherwise you'll get an EADDRINUSE error.
Something like this:
var net = require('net');

var server = net.createServer(function(connection) {
    connection.on('data', function(data) {
        // data is a Buffer, so we'll .toString() it for this example
        console.log(data.toString());
    });
});

// This creates a UNIX socket in the current directory named "nodejs_bridge.sock"
server.listen('nodejs_bridge.sock');

// Make sure we close the server when the process exits so the file it created is removed
process.on('exit', function() {
    server.close();
});

// Call process.exit() explicitly on ctrl-c so that we actually get that event
process.on('SIGINT', function() {
    process.exit();
});

// Resume stdin so that we don't just exit immediately
process.stdin.resume();
Then, to actually send something to that socket in bash, you can pipe to nc like this:
echo "Hello World" | nc -U nodejs_bridge.sock
What about using FIFOs?
NodeJS code:
process.stdin.on('readable', function() {
    var chunk = process.stdin.read();
    if (chunk !== null) {
        process.stdout.write('data: ' + chunk);
    }
});
NodeJS startup (the 3>/tmp/... is a trick to keep FIFO open):
mkfifo /tmp/nodeJsProcess.fifo
node myProgram.js </tmp/nodeJsProcess.fifo 3>/tmp/nodeJsProcess.fifo
Bash linkage:
echo Hello >/tmp/nodeJsProcess.fifo
The signals described in the page that you've linked are used to send a specific "command" to a process. This is called "Inter Process Communication". You can see here a first definition of IPC.
You can instruct your node.js code to react to a specific signal, as in this example:
// Start reading from stdin so we don't exit.
process.stdin.resume();

process.on('SIGUSR1', function() {
    console.log('Got SIGUSR1. Here you can do something.');
});
Please note that the signal is sent to the process, and not to a specific object in the code.
If you need to communicate in a more specific way to the node.js daemon you can listen on another port too, and use it to receive (and eventually send) control commands.
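A minimal sketch of that last idea (the port number and command name are made up for illustration): listen on a local TCP port with the net module and treat each incoming line as a control command:
var net = require('net');

// Hypothetical control channel: one command per connection on a local TCP port.
var control = net.createServer(function(connection) {
    connection.on('data', function(data) {
        var command = data.toString().trim();
        if (command === 'reload') {
            console.log('Got reload command');
            // ...re-read configuration, flush caches, etc...
        }
    });
});
control.listen(5005, '127.0.0.1');
From bash you could then send a command with: echo reload | nc 127.0.0.1 5005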
