ExpressJs(NodeJs) Fire & Forget - javascript

I am currently developing APIs in Express. I want to write a function which saves analytics to the DB, and I should be able to call it in a fire-and-forget way. The function should accept parameters and do its work. It should run like a separate thread, and the current code execution should not wait for its response, similar to the way Akka Actors work in Java. Can someone suggest a way to do it or some link to refer to?

Node is async by default. Just send your response outside of the db query callback:
app.get("/ping", function (req, res) {
// fire
dbConnection.query("UPDATE analytics SET count = count + 1", function(err, result) {
// forget
});
res.send("Pong");
});

You can add your information to some kind of message queue and then launch another process which listens to the queue and processes messages accordingly.
It's not exactly how Actors work, but that's how it's usually done in the Node.js realm.
For example, you can use kue, AWS SQS, Google Pub/Sub, or any other available solution.
// example with kue
// http-process.js
var kue = require('kue');
var queue = kue.createQueue();
...
app.post('/something-somewhere', (req, res, next) => {
  var job = queue.create('event', {
    data: 'analytics, data',
    median: 5.3,
  }).save(function (err) {
    if (err) return next(err); // pass the error on if the job could not be queued
    res.send('ok');
  });
});
// event-processor.js
var kue = require('kue');
var queue = kue.createQueue();
queue.process('event', function (job, done) {
  someKindOfORM.myEventsTable.insert(job.data).notify(done);
});

Related

How do I make a MongoDB query throw an error if there is no database connection? [duplicate]

I'm new to Mongo. I needed a database for a simple project and ended up following a tutorial using Mongo with Monk but I have problems understanding how to handle errors.
Background: I have a registration form on the client side. When the user clicks a button, the data is sent via AJAX to the controller that (upon validation, but this is not relevant now) inserts such data into the database and sends back either success or error. When the db is up all seems to work fine.
The problem: If I don't start the db and try to send the request anyway, no error is returned. Simply nothing happens. After some time on the console I get: POST /members/addmember - - ms - -.
I think some error should be returned to the user in this case, so how could I do this?
The post request is below (pretty much as from the tutorial):
// app.js
var db = monk('localhost:27017/dbname');
[...]
// I realize it might be not optimal here
app.use(function (req, res, next) {
  req.db = db;
  next();
});
// members.js
router.post('/addmember', function (req, res) {
  var db = req.db;
  var collection = db.get('memberstest');
  collection.insert(req.body, function (err, result) {
    res.json(
      (err === null) ? { msg: 'success' } : { msg: err }
    );
  });
});
If the db is down, I guess the problem actually starts even earlier than the insert, in that db.get(). So how can I check whether that get can actually be done? I suppose that, given the asynchronous nature of Node, something like a try/catch would be pointless here. Correct?
EDIT: After Neil's answer and a bit of trying, I put together the following, which seems to do the job. However, given my scarce degree of confidence in this, I'd appreciate a comment on whether the code below works because it makes sense or by chance. I added the bufferMaxEntries: 0 option and modified the controller as follows. In the AJAX callback I simply have an alert for now that shows the error message thrown (if any).
router.post('/addmember', async (req, res) => {
  try {
    let db = req.db;
    let collection = db.get('memberstest');
    collection.insert(req.body, function (err, result) {
      res.json(
        (err === null) ? { msg: 'success' } : { msg: err }
      );
    });
    await db.then(() => 1);
  } catch (e) {
    res.json({ msg: e.message });
  }
});
Well, you can actually set the bufferMaxEntries option ( documented under Db but deprecated for that object usage; use it at the "top level as demonstrated" instead ) on the connection, which essentially stops requests from being "queued" on the driver when no connection is actually present.
As a minimal example:
index.js
const express = require('express'),
      morgan = require('morgan'),
      db = require('monk')('localhost/test', { bufferMaxEntries: 0 }),
      app = express();
const routes = require('./routes');

app.use(morgan('combined'));
app.use((req, res, next) => {
  req.db = db;
  next();
});
app.use('/', routes);

(async function () {
  try {
    await db.then(() => 1);
    let collection = db.get('test');
    await collection.remove({});
    await collection.insert(Array(5).fill(1).map((e, i) => ({ a: i + 1 })));
    console.log('inserted test data');
    await app.listen(3000, '0.0.0.0');
    console.log('App waiting');
  } catch (e) {
    console.error(e);
  }
})();
routes.js
var router = require('express').Router();

router.get('/', async (req, res) => {
  try {
    let db = req.db,
        collection = db.get('test');
    let response = await collection.find();
    res.json(response);
  } catch (e) {
    res.status(500).json(e);
  }
});

module.exports = router;
So I am actually awaiting the database connection to at least be present on "start up" here, but really only for the example, since I want to insert some data to actually retrieve. It's not required, but the basic concept is to wait for the connection Promise to resolve:
await db.then(() => 1);
Kind of trivial, and not really required for your actual code. But I still think it's good practice.
The real test is done by stopping mongod or otherwise making the server unreachable and then issuing a request.
Since we set the connection options to { bufferMaxEntries: 0 }, the failure is returned immediately as soon as you attempt to issue a command to the database while no actual connection is present.
Of course when the database becomes available again, you won't get the error and the instructions will happen normally.
Without the option, the default is to "enqueue" the operations until a connection is resolved, and then the "buffer" is essentially "played" against the database.
You can simulate this ( as I did ) by "stopping" the mongod daemon and issuing requests, then "starting" the daemon and issuing requests again. With the option set, a request made while the server is unreachable should simply return the caught error response.
NOTE: Not required, but in fact the whole purpose of the async/await syntax is to make things like try..catch valid again, since you can scope error handling as blocks rather than using Promise.catch() or err callback arguments to trap the errors. The same principles apply when either of those structures is actually in use, though.
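For contrast, a minimal sketch of the same route without async/await, trapping the same failure through Promise.catch() instead ( this relies on monk returning Promises, as in the example above ):
var router = require('express').Router();

router.get('/', (req, res) => {
  req.db.get('test').find()
    .then(response => res.json(response))  // success path
    .catch(e => res.status(500).json(e));  // no connection: the rejection lands here
});

module.exports = router;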

Pausing Node.js Readable Stream

I'm building a barcode scanning app using node-serialport. Where I'm stuck is making an AJAX call to trigger a scan and then having the Express server respond with the data from the readable stream.
Initialize Device:
// Open device port
var SerialPort = require('serialport');
var port = '/dev/cu.usbmodem1411';
var portObj = new SerialPort(port, (err) => {
  if (err) {
    console.log('Connection error ' + err);
  }
});

// Construct device object
var Scanner = {
  // Trigger Scan
  scan: () => {
    portObj.write(<scan cmd>, (err) => {
      if (err) {
        console.log('Error on scan' + err);
      }
    });
  }
};
I've tried two approaches and neither produces the 'scan-read-respond' behavior I'm looking for.
First, I tried putting an event listener immediately following a scan, then using a callback in the listener to respond to the AJAX request. With this approach I get a 'Can't set headers after they are sent' error. From what I understand, Node throws this error because res.send is being called multiple times.
First Approach -- Response as callback in listener:
app.get('/dashboard', (req, res) => {
  Scanner.scan(); // fire scanner
  portObj.on('data', (data) => {
    res.send(data); // 'Can't set headers after they are sent' error
  });
});
In the second approach, I store the scan data in a local variable (scanned_data) and move the response outside the listener block. The problem with this approach is that res.send executes before the scanned data is captured in the local variable, so it comes up as undefined. Also intriguing is that the scanned_data captured in the listener block seems to multiply with each scan.
Second Approach -- Response outside listener:
app.get('/dashboard', (req, res) => {
  var scanned_data; // declare variable outside listener block
  Scanner.scan(); // trigger scan
  portObj.on('data', (data) => {
    scanned_data = data;
    // displays scanned data, but it multiplies with each scan
    // (e.g. 3 triggers log 'barcode_data barcode_data barcode_data')
    console.log(scanned_data);
  });
  console.log(scanned_data); // undefined
  res.send(scanned_data);
});
I'm a front end developer but have gotten to learn a lot about Node trying to figure this out. Alas, I think I've come to a dead end at this point. I tinkered with the .pipe() command, and have a hunch that's where the solution lies, but wasn't able to zero in on a solution that works.
Any thoughts or suggestions?
You should not make assumptions about what chunk of data you get in a 'data' event. Expect one byte or many bytes. You need to know the underlying protocol being used to know when you have received a full "message" so you can stop listening for data. At that point you should then send a response to the HTTP request.
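For example, here is a minimal sketch that buffers chunks until a full message has arrived and only then responds, assuming the scanner terminates each barcode with a carriage return ( the real delimiter depends on your device's protocol ) and reusing portObj and Scanner from the question:
app.get('/dashboard', (req, res) => {
  var buffer = '';
  function onData(chunk) {
    buffer += chunk.toString();
    var end = buffer.indexOf('\r'); // assumed message terminator
    if (end !== -1) {
      portObj.removeListener('data', onData); // stop listening: one response per request
      res.send(buffer.slice(0, end));
    }
  }
  portObj.on('data', onData);
  Scanner.scan();
});
Removing the listener once a full message has been received also explains the "multiplying" output in the second approach: every request added another 'data' listener that was never removed, so each new scan fired all of the accumulated listeners.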

Node Express 4 send a response after multiple API calls

I am building a NodeJS server using Express 4. I use this server as a middleman between a frontend Angular app and a 3rd-party API.
I created a certain path that my frontend app requests, and on that path I wish to call the API multiple times, merge all of the responses, and then send the resulting response.
I am not sure how to do this as I need to wait until each API call is finished.
Example code:
app.post('/SomePath', function (req, res) {
  var merged = [];
  for (var i in req.body.object) {
    // APIObject.sendRequest uses the superagent module to handle requests and responses
    APIObject.sendRequest(req.body.object[i], function (err, result) {
      merged.push(result);
    });
  }
  // After all is done send result
  res.send(merged);
});
As you can see, I'm calling the API within a loop, making one APIObject.sendRequest call per item received in the request.
How can I send a response after all is done and the API responses are merged?
Thank you.
Check out this answer; it uses the Async module to make a few requests at the same time and then invokes a callback when they are all finished.
As per #sean's answer, I believe each would fit better than map.
It would then look something like this:
var async = require('async');

var merged = [];
async.each(req.body.object, function (item, callback) {
  APIObject.sendRequest(item, function (err, result) {
    if (err)
      callback(err);
    else {
      merged.push(result);
      callback();
    }
  });
}, function (err) {
  if (err)
    res.sendStatus(500); // Example
  else
    res.send(merged);
});
First of all, you can't just fire async calls in a loop and send the response right after it; the loop finishes before any of the callbacks has run.
You can use the async module's map function.
app.post('/SomePath', function (req, res) {
  async.map(req.body.object, APIObject.sendRequest, function (err, result) {
    if (err) {
      res.status(500).send('Something broke!');
      return;
    }
    res.send(result);
  });
});

Dealing with asynchronous node_redis functions, awkward using INCR on two keys at once

Say my server is preparing a new object to send out in response to a POST request:
var responseObj = {
  UserID: "0", // default value
  ItemID: "0", // default value
  SomeData: foo
};
Now, when I create this new object, I want to increment the UserID and ItemID counters that I'm using in redis to track both items. But that seemingly requires two separate asynchronous callbacks, which seems like a problem to me, because I can't just stick the rest of my response-writing code into one of the callbacks.
What I mean is, if I only had one key and one callback to worry about, I would write something like:
app.post('/', function (req, res, next) {
  // do some pre-processing
  var responseObj = {};
  redis.incr('UserID', function (err, id) {
    responseObj.UserID = id;
    // do more work, write some response headers, etc.
    res.send(responseObj);
  });
});
But what do I do with two INCR callbacks I need to make? I don't think this would be right, since everything is asynchronous and I can't guarantee my response would be correctly set...
app.post('/', function (req, res, next) {
  // do some pre-processing
  var responseObj = {};
  redis.incr('UserID', function (err, id) {
    responseObj.UserID = id;
    // do some work
  });
  redis.incr('ItemID', function (err, id) {
    responseObj.ItemID = id;
    // do some work
  });
  res.send(responseObj); // This can't be right...
});
I feel like I'm missing something obvious, as a newbie node.js and redis programmer...
You can execute multiple redis commands in one round trip, either through a transaction or a Lua script. That way you don't have to deal with one callback per command; you execute multiple commands and deal with only one callback. For example, look at the multi method/command in the redis client.
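As a minimal sketch with the node_redis client ( assuming its classic callback-style multi API ), both INCR commands are sent in one MULTI block and the replies arrive, in command order, in a single exec callback:
var redis = require('redis');
var client = redis.createClient();

app.post('/', function (req, res, next) {
  client.multi()
    .incr('UserID')
    .incr('ItemID')
    .exec(function (err, replies) {
      if (err) return next(err);
      res.send({
        UserID: replies[0], // result of the first INCR
        ItemID: replies[1], // result of the second INCR
        SomeData: foo       // 'foo' as in the question's responseObj
      });
    });
});
Because MULTI queues the commands and runs them atomically on the server, both counters are incremented together and there is only one callback to coordinate.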

Spawning child processes in an express server

In my express server there are some functions which I need to run as child processes because otherwise they'll tie up the server and other people won't be able to access it. They're already using the async module but they still tie up the server unless they're run as child processes.
One problem is passing the req and res parameters to them.
How can this be done?
Using child_process.fork, you can send messages to child processes.
Edit: I incorrectly advised to pass req and res as message parameters to the child process. This is not possible, as all messages to and from child processes are converted to JSON. Instead, you could keep some kind of queue in your server. The below is only meant as an example, you may want something more robust:
child.js:
process.on('message', function (message) {
  // Process data
  process.send({ id: message.id, data: 'some result' });
});
server.js:
var child_process = require('child_process');
var child = child_process.fork(__dirname + '/child.js');

var taskId = 0;
var tasks = {};

function addTask(data, callback) {
  var id = taskId++;
  child.send({ id: id, data: data });
  tasks[id] = callback;
}

child.on('message', function (message) {
  // Look up the callback bound to this id and invoke it with the result
  tasks[message.id](message.data);
});

app.post('/foo', function (req, res) {
  addTask('some data', function (result) {
    res.send(result);
  });
});
It's a bit more involved, but it should work. You may quickly grow out of such a system, and may be better served by a proper queue.
