In my Express server there are some functions that I need to run as child processes, because otherwise they tie up the server and other people can't access it. They're already using the async module, but they still block the server unless they're run as child processes.
One problem is passing the req and res parameters to them.
How can this be done?
Using child_process.fork, you can send messages to child processes.
Edit: I incorrectly advised passing req and res as message parameters to the child process. This is not possible, as all messages to and from child processes are serialized to JSON. Instead, you could keep some kind of queue in your server. The below is only meant as an example; you may want something more robust:
child.js:
process.on('message', function(message) {
  // Process data
  process.send({id: message.id, data: 'some result'});
});
server.js:
var child_process = require('child_process');

var child = child_process.fork(__dirname + '/child.js');
var taskId = 0;
var tasks = {};

function addTask(data, callback) {
  var id = taskId++;
  child.send({id: id, data: data});
  tasks[id] = callback;
}

child.on('message', function(message) {
  // Look up the callback bound to this id, invoke it with the result,
  // and drop the entry so the map doesn't grow forever
  tasks[message.id](message.data);
  delete tasks[message.id];
});

app.post('/foo', function(req, res) {
  addTask('some data', function(result) {
    res.send(result);
  });
});
It's a bit more involved, but it should work. You may quickly grow out of such a system, and may be better served by a proper queue.
I am currently using the node-serialport module for serial port communication. I send a command ATEC and the device responds with ECHO.
However, this process of sending and receiving data is async (after I send the data, I won't know when it will arrive in the data event). The example code is below:
// Register the data event from the serial port
port.on('data', (data) => {
  console.log(data);
});

// Send data using serialport
port.write('ATEC');
Is there any way I could write it in this way?
// When I send the command, I receive the data in return
port.write('ATEC').then((data) => {
  console.log(data);
});
Is this possible to achieve?
In HTTP communication using the request client, we can do something like
request.get('http://google.com')
  .on('response', (res) => {
    console.log(res);
  });
I want to replicate the same behaviour using serialport
I wrapped the serial data receive in a promise:
function sendSync(port, src) {
  return new Promise((resolve, reject) => {
    port.write(src);
    port.once('data', (data) => {
      resolve(data.toString());
    });
    port.once('error', (err) => {
      reject(err);
    });
  });
}
Please note that the events use once instead of on, to prevent handlers from stacking up (please check the comments below for more information; thanks @DKebler for spotting it).
Then I can write the code in a synchronous style, as below:
sendSync(port, 'AThello\n').then((data) => {
  // receive data
});

sendSync(port, 'ATecho\n').then((data) => {
  // receive data
});
or I can use a generator, via the co package:
co(function* () {
  const echo = yield sendSync(port, 'echo\n');
  const hello = yield sendSync(port, 'hello 123\n');
  return [echo, hello];
}).then((result) => {
  console.log(result);
}).catch((err) => {
  console.error(err);
});
We had a similar problem in a project I'm working on: we needed a synchronous send/receive loop for serial, and the serialport package makes that kind of awkward.
Our solution is to make some sort of queue of functions/promises/generators/etc. (depending on your architecture) that the serial port "data" event services. Every time you write something, push a function/promise/etc. onto the queue.
Let's assume you're just throwing functions into the queue. When the "data" event fires, it passes the currently aggregated receive buffer to the first element of the queue, which can check whether it contains all of the data it needs; if so, it does something with it and removes itself from the queue somehow.
This allows you to handle multiple different kinds of architecture (callback/promise/coroutine/etc) with the same basic mechanism.
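Here's a rough sketch of that idea with plain callbacks. All the names (pending, rxBuffer, send) are made up for illustration, and it assumes an already-open serialport instance called port:

const pending = [];              // handlers waiting for a response, oldest first
let rxBuffer = Buffer.alloc(0);  // aggregated receive buffer

port.on('data', (chunk) => {
  rxBuffer = Buffer.concat([rxBuffer, chunk]);
  if (pending.length === 0) return;

  // Hand the aggregated buffer to the oldest waiter; the handler
  // returns true once it has seen all the data it needs.
  if (pending[0](rxBuffer)) {
    pending.shift();             // remove the satisfied handler
    rxBuffer = Buffer.alloc(0);  // start a fresh buffer for the next command
  }
});

function send(cmd, handler) {
  pending.push(handler);
  port.write(cmd);
}

// Usage: the handler decides for itself when the response is complete
send('ATEC\n', (buf) => {
  if (!buf.toString().includes('\n')) return false; // not done yet
  console.log('response:', buf.toString().trim());
  return true;
});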
As an added bonus: if you have full control of both sides of the protocol, you can add a "\n" to the end of those strings and then use serialport's "readline" parser, so you'll only get data events on whole strings. That might make things a bit easier than constantly checking input validity as it comes in pieces.
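If you go that route, here is a sketch assuming serialport v5 or later, where parsers are transform streams you pipe the port into (check the docs for your installed version; the device path and baud rate are placeholders):

const SerialPort = require('serialport');
const port = new SerialPort('/dev/ttyUSB0', { baudRate: 9600 });

const parser = port.pipe(new SerialPort.parsers.Readline({ delimiter: '\n' }));
parser.on('data', (line) => {
  // fires once per complete line instead of once per raw chunk
  console.log('line:', line);
});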
Update:
And now that code has been finished and tested (see the ET312 module in http://github.com/metafetish/buttshock-js), here's how I do it:
function writeAndExpect(data, length) {
  return new Promise((resolve, reject) => {
    // Buffer.alloc replaces the deprecated `new Buffer(length)`
    const buffer = Buffer.alloc(length);
    this._port.write(data, (error) => {
      if (error) {
        reject(error);
        return;
      }
    });
    let offset = 0;
    let handler = (d) => {
      try {
        // Append each incoming byte at its own position in the buffer
        Uint8Array.from(d).forEach((byte, i) => buffer.writeUInt8(byte, offset + i));
        offset += d.length;
      } catch (err) {
        reject(err);
        return;
      }
      if (offset === length) {
        resolve(buffer);
        this._port.removeListener("data", handler);
      }
    };
    this._port.on("data", handler);
  });
}
The above function takes a list of uint8s and an expected amount of data to get back, and returns a promise. We write the data, then install ourselves as the "data" event handler. We use that to read until we get the amount of data we expect, then resolve the promise and remove ourselves as a "data" listener (this is important, otherwise you'll stack handlers!), and finish.
This code is very specific to my needs, and won't handle cases other than very strict send/receive pairs with known parameters, but it might give you an idea to start with.
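For illustration only, a hypothetical call site, assuming an object device that holds the open port on this._port and has writeAndExpect attached as a method:

// `device` is a hypothetical object wrapping an open SerialPort
device.writeAndExpect([0x0d], 1).then((buffer) => {
  console.log('received:', buffer);
}).catch((err) => {
  console.error('write or read failed:', err);
});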
I am currently developing APIs in Express. I want to write a function that saves analytics to the DB, but I should be able to call it in a fire-and-forget way. The function should accept parameters and do its work; it should act like a separate thread, and the current code execution should not wait for its response, similar to the way Akka actors work in Java. Can someone suggest a way to do this, or a link to refer to?
Node is async by default. Just send your response outside of the db query callback:
app.get("/ping", function (req, res) {
// fire
dbConnection.query("UPDATE analytics SET count = count + 1", function(err, result) {
// forget
});
res.send("Pong");
});
You can add your information to some kind of message queue and then launch another process which listens to the MQ and processes messages accordingly.
It's not exactly how actors work, but that's how it's usually done in the Node.js realm.
For example, you can use kue, AWS SQS, Google Pub/Sub, or any other available solution:
// example with kue
// http-process.js
var kue = require('kue');
var queue = kue.createQueue();
...
app.post('/something-somewhere', (req, res, next) => {
  var job = queue.create('event', {
    data: 'analytics, data',
    median: 5.3,
  }).save(function(err) {
    if (err) return next(err);
    res.send('ok');
  });
});

// event-processor.js
var kue = require('kue');
var queue = kue.createQueue();

queue.process('event', function(job, done) {
  someKindOfORM.myEventsTable.insert(job.data).notify(done);
});
I have a child process worker that receives some data and sends results back to a dynamically attached listener.
Simplified code:
// app.js
var worker = childProcess.fork('./app_modules/workers/worker1.js');
worker.setMaxListeners(0);

require('./app_modules/sockets-user/foobar.js')(io, worker);

// foobar.js
io.sockets.on('connection', function (socket) {
  socket.on('trigger', function (data) {
    worker.send(data);
    worker.once('message', function(responseData) {
      // here I get a response from the worker
      socket.emit('response', responseData);
    });
  });
});
It was working great until I discovered that if socket.on('trigger', ...) fires at the exact same moment for different users, every listener receives the same message.
I could change worker.once to worker.on, but that's not a fix, because I would have to filter the incoming data and then probably find a way to clear the dynamically added listeners. What did I do wrong here?
Probably one of the easiest solutions would be to pass some user-specific data (e.g. remote IP address and port, or some other unique identifier) to the worker that merely gets passed right back to the parent in the response. This way you can match up the response with the correct socket.
This means that you would only have one message listener (added outside of the socket.io connection handler). You would then look up the socket based on the information passed in the response, and send whatever data back to that client. For example:
// foobar.js
worker.on('message', function(responseData) {
  // assuming the worker returns `{id: ..., data: ...}`
  var socket = io.sockets.sockets[responseData.id];
  if (socket)
    socket.emit('response', responseData.data);
});

io.sockets.on('connection', function (socket) {
  socket.on('trigger', function (data) {
    worker.send({ id: socket.id, data: data });
  });
});
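For completeness, the worker side only has to echo the id back untouched. A minimal hypothetical worker1.js (doWork stands in for whatever processing you actually do):

// app_modules/workers/worker1.js (hypothetical sketch)
process.on('message', function (message) {
  var result = doWork(message.data); // doWork is a stand-in for your real logic

  // Echo the id back untouched so the parent can route
  // the response to the right socket
  process.send({ id: message.id, data: result });
});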
I am attempting to use the subscribe function described here. However, when editing /assets/js/app.js, I am getting this error:
Uncaught ReferenceError: Room is not defined
So, I am not entirely sure why, but it cannot find my model. Here is my code:
Room.subscribe(req, [{id: "5278861ab9a0d2cd0e000001"}], function (response) {
  console.log('subscribed?');
  console.log(response);
});
and here it is in the context of app.js:
(function (io) {

  // as soon as this file is loaded, connect automatically,
  var socket = io.connect();
  if (typeof console !== 'undefined') {
    log('Connecting to Sails.js...');
  }

  socket.on('connect', function socketConnected() {

    // Listen for Comet messages from Sails
    socket.on('message', function messageReceived(message) {

      ///////////////////////////////////////////////////////////
      // Replace the following with your own custom logic
      // to run when a new message arrives from the Sails.js
      // server.
      ///////////////////////////////////////////////////////////
      log('New comet message received :: ', message);
      //////////////////////////////////////////////////////

    });

    ///////////////////////////////////////////////////////////
    // Here's where you'll want to add any custom logic for
    // when the browser establishes its socket connection to
    // the Sails.js server.
    ///////////////////////////////////////////////////////////
    log(
      'Socket is now connected and globally accessible as `socket`.\n' +
      'e.g. to send a GET request to Sails, try \n' +
      '`socket.get("/", function (response) ' +
      '{ console.log(response); })`'
    );
    ///////////////////////////////////////////////////////////

    // This is the part I added:
    Room.subscribe(req, [{id: "5278861ab9a0d2cd0e000001"}], function (response) {
      console.log('subscribed?');
      console.log(response);
    });
    //

  });

  // Expose connected `socket` instance globally so that it's easy
  // to experiment with from the browser console while prototyping.
  window.socket = socket;

  // Simple log function to keep the example simple
  function log () {
    if (typeof console !== 'undefined') {
      console.log.apply(console, arguments);
    }
  }

})(window.io);
Am I going about this the right way? Should I be storing this directly in app.js?
To subscribe to a model instance, I use the following Real-Time Model Event pattern, some of which resides on the client and some on the server. Keep in mind the client can't just subscribe itself: you have to send a request to the server letting it know that you'd like to be subscribed. This is the only way to do it securely. (For example, you might want to publish notifications containing sensitive information, so you want to make sure a connected socket has permission to see that information before subscribing it.)
I'm going to use an example of an app with a User model. Let's say I want to notify folks when existing users log in.
Client-Side (Part I)
On the client-side, for simplicity, I’m going to use the existing app.js file in the /assets/js folder (or /assets/linker/js folder if you used the --linker switch when you built the app.)
To send my socket request to the server within assets/js/app.js, I’m going to use the socket.get() method. This method mimics the functionality of an AJAX “get” request (i.e. $.get() ) but uses sockets instead of HTTP. (FYI: You also have access to socket.post(), socket.put(), and socket.delete()).
The code would look something like this:
// Client-side (assets/js/app.js)
// This will run the `welcome()` action in `UserController.js` on the server-side.
//...
socket.on('connect', function socketConnected() {
  console.log("This is from the connect: ", this.socket.sessionid);

  socket.get('/user/welcome', function gotResponse () {
    // we don't really care about the response
  });
  //...
Server-Side (Part I)
Over in the welcome() action in UserController.js, we can now actually subscribe this client (socket) to notifications using the User.subscribe() method.
// api/UserController.js
//...
welcome: function (req, res) {
  // Get all of the users
  User.find().exec(function (err, users) {
    // Subscribe the requesting socket (e.g. req.socket) to all users (e.g. users)
    User.subscribe(req.socket, users);
  });
}
//...
Back on the client-side (Part II)...
I want the socket to ‘listen’ for messages I’m going to send it from the server. To do this I’ll use:
// Client-side (assets/js/app.js)
// This will run the `welcome()` action in `UserController.js` on the backend.
//...
socket.on('connect', function socketConnected() {
  console.log("This is from the connect: ", this.socket.sessionid);

  socket.on('message', function notificationReceivedFromServer ( message ) {
    // e.g. message ===
    // {
    //   data: { name: 'Roger Rabbit' },
    //   id: 13,
    //   verb: 'update'
    // }
  });

  socket.get('/user/welcome', function gotResponse () {
    // we don't really care about the response
  });
  // ...
Back on the server-side (Part II)...
Finally, I’ll start sending out messages, server-side, by using: User.publishUpdate(id);
// api/SessionController.js
//...
// User session is created
create: function(req, res, next) {
  User.findOneByEmail(req.param('email'), function foundUser(err, user) {
    if (err) return next(err);

    // Authenticate the user using the existing encrypted password...
    // If authenticated, log the user in...

    // Inform subscribed sockets that this user logged in
    User.publishUpdate(user.id, {
      loggedIn: true,
      id: user.id,
      name: user.name,
      action: ' has logged in.'
    });
  });
}
//...
You can also check out Building a Sails Application: Ep21 - Integrating socket.io and sails with custom controller actions using Real Time Model Events for more information.
Say my server is preparing a new object to send out in response to a POST request:
var responseObj = {
  UserID: "0",  // default value
  ItemID: "0",  // default value
  SomeData: foo
};
Now, when I create this new object, I want to increment the UserID and ItemID counters that I'm using in Redis to track both items. But that seemingly requires two separate asynchronous callbacks, which is a problem, because I can't just stick the rest of my response-writing code into one of the callbacks.
What I mean is, if I only had one key and one callback to worry about, I would write something like:
app.post('/', function(req, res, next) {
  // do some pre-processing
  var responseObj = {};

  redis.incr('UserID', function(err, id) {
    responseObj.UserID = id;
    // do more work, write some response headers, etc.
    res.send(responseObj);
  });
});
But what do I do with two INCR callbacks I need to make? I don't think this would be right, since everything is asynchronous and I can't guarantee my response would be correctly set...
app.post('/', function(req, res, next) {
  // do some pre-processing
  var responseObj = {};

  redis.incr('UserID', function(err, id) {
    responseObj.UserID = id;
    // do some work
  });

  redis.incr('ItemID', function(err, id) {
    responseObj.ItemID = id;
    // do some work
  });

  res.send(responseObj); // This can't be right...
});
I feel like I'm missing something obvious, as a newbie node.js and redis programmer...
You can execute multiple Redis commands in one call, either through a transaction or a Lua script. That way you don't have to deal with one callback per command; you execute multiple commands and deal with only one callback. For example, look at the multi method/command in the redis client.
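For instance, a sketch using node_redis's multi(): both INCR commands are queued and executed atomically, and a single callback receives the replies in command order (foo is the placeholder value from the question):

app.post('/', function(req, res, next) {
  redis.multi()
    .incr('UserID')
    .incr('ItemID')
    .exec(function(err, replies) {
      if (err) return next(err);
      // replies come back in the same order the commands were queued
      res.send({
        UserID: replies[0],
        ItemID: replies[1],
        SomeData: foo
      });
    });
});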