I am currently building a small Node application that makes a few API calls and renders a webpage with charts on it. I'm using Express and Jade as the render engine.
The problem is that I'm quite new to JavaScript and I don't know how to structure my HTTP requests so that, when there is more than one request, I can collect the variables I got from the API (HTTP GET) into a single object and pass it to the Jade rendering engine.
Here is what I have so far:
app.get('/test', function(req, res) {
  apiRequestGoesHere(name, function(error, profile) {
    //Get some data here
  });
  anotherApiRequest(tvshow, function(error, list) {
    //Get some data here
  });
  res.render('test', data);
});
As it is right now, the page renders before the requests have finished, and if I place res.render inside one of the request callbacks, I can't access the other's data.
So what I want is a way to set it up so I can have multiple API calls, then build an object out of some elements of what the REST API returns and send it to Jade so I can use the data on the page.
You probably want to use async to help with this. async.parallel is a good choice for something simple like this:
app.get('/test', function(req, res) {
  async.parallel([
    function(next) {
      apiRequestGoesHere(name, function(error, profile) {
        //Get some data here
        next(null, firstData);
      });
    },
    function(next) {
      anotherApiRequest(tvshow, function(error, list) {
        //Get some data here
        next(null, secondData);
      });
    }
  ], function(err, results) {
    // results is [firstData, secondData]
    res.render('test', ...);
  });
});
The first argument to those next functions should be an error if relevant (I passed null) - as soon as one of them is called with an error, the final function is called with that same error and the results of the remaining callbacks are ignored.
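For the original question, a minimal sketch of how the pieces could fit together - assuming apiRequestGoesHere and anotherApiRequest use the usual error-first callback convention, and that the Jade template expects locals named profile and list (both assumptions; the task callbacks are renamed cb here to avoid clashing with Express's next):
app.get('/test', function(req, res, next) {
  async.parallel([
    function(cb) {
      apiRequestGoesHere(name, function(error, profile) {
        cb(error, profile); // forward the error so parallel can short-circuit
      });
    },
    function(cb) {
      anotherApiRequest(tvshow, function(error, list) {
        cb(error, list);
      });
    }
  ], function(err, results) {
    if (err) return next(err); // let Express handle the failure
    // results[0] is profile, results[1] is list - build one object for Jade
    res.render('test', { profile: results[0], list: results[1] });
  });
});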
You can use async.parallel.
async.parallel([
  function(callback){
    // Make http requests
    // Invoke callback(err, result) after the http request succeeds or fails
  },
  function(callback){
    // Make http requests
    // Invoke callback(err, result) after the http request succeeds or fails
  }
],
// optional callback
function(err, results){
  // results is an array of the result values passed to each callback
});
The reason your page renders without the data is that the callbacks haven't "called back" yet. To do what you want, you would need to do something like:
app.get('/test', function(req, res) {
  apiRequestGoesHere(name, function(error, profile) {
    //Get some data here
    anotherApiRequest(tvshow, function(error, list) {
      //Get some data here
      var data = { profile: profile, list: list }; // combine both results into one object
      res.render('test', data);
    });
  });
});
This strategy leads to what is known as "pyramid code" because your nested callback functions end up deeper and deeper.
I would also recommend the step library by Tim Caswell. It would make your code look something like:
var step = require('step');
app.get('/test', function(req, res) {
  step(
    function () {
      apiRequestGoesHere(name, this);
    },
    function (error, profile) {
      if (error) throw error;
      anotherApiRequest(tvshow, this);
    },
    function done(error, list) {
      if (error) throw error;
      res.render('test', list);
    }
  );
});
You could also use the group method to make the calls in parallel and still maintain the sequence of your callbacks.
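For example, a rough sketch of the parallel variant - this assumes step's this.parallel() helper (this.group() works similarly when you have an array of calls) and the same apiRequestGoesHere / anotherApiRequest functions as above:
app.get('/test', function(req, res) {
  step(
    function () {
      // both requests start immediately; step collects the results in order
      apiRequestGoesHere(name, this.parallel());
      anotherApiRequest(tvshow, this.parallel());
    },
    function done(error, profile, list) {
      if (error) throw error;
      res.render('test', { profile: profile, list: list });
    }
  );
});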
Gl,
Aaron
I'm trying to provide the response of an external API call when I perform a local GET request, but I'm struggling with how to get this to work.
My code at the moment is:
app.get('/', function(req, res){
  request.post('http://data.fixer.io/api/latest?access_key=' + apikey +
    '&symbols=gbp', function(err, res, body) {
    console.log(body)
  })
  res.render('index')
})
My knowledge of and experience with callbacks and async programming are limited, but how do I pass the response of the POST request into the GET handler so I can then pass it to the index view?
Thanks!
Callbacks can be difficult to understand, and the problem you describe isn't uncommon (it even has a name - Callback Hell). That is partly why the async / await syntax was introduced - here's the equivalent of your code in that style:
app.get('/', async (req, res, next) => {
  try {
    const uri = `http://data.fixer.io/api/latest?access_key=${apikey}&symbols=gbp`;
    // note: this assumes an HTTP client that returns a promise
    // (e.g. request-promise, got or axios) - the plain `request` module does not
    const data = await request.post(uri);
    return res.render('index', { data }); // or pass whatever you need from `data` into the view
  } catch (e) {
    return next(e);
  }
});
Notice the one big difference? No callbacks - you get all the benefits of asynchronous code with the bonus of writing it in a synchronous style.
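The same style also scales to several calls at once. A rough sketch - again assuming a promise-returning HTTP client, and with a second URL made up purely for illustration:
app.get('/', async (req, res, next) => {
  try {
    const ratesUri = `http://data.fixer.io/api/latest?access_key=${apikey}&symbols=gbp`;
    const otherUri = 'http://example.com/some-other-endpoint'; // hypothetical second call
    // both requests run concurrently; await resolves once both have finished
    const [rates, other] = await Promise.all([
      request.post(ratesUri),
      request.get(otherUri),
    ]);
    return res.render('index', { rates, other });
  } catch (e) {
    return next(e);
  }
});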
You can chain calls in Express, so it's very easy to call an external service within a GET request, e.g.
app.get('/', function(req, res){
  request.post('http://data.fixer.io/api/latest?access_key=' + apikey + '&symbols=gbp', function(err, response, body) {
    console.log(body)
    res.send(body);
  })
})
In this case we're sending back the raw response from the POST; it is easy to wrap this in another object, e.g.
res.send( { status: 'ok', post_result: body });
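And if, as in the original snippet, you want to hand the data to the index view rather than return it raw, a sketch along these lines could work (assuming fixer.io returns a JSON body with a rates object - adjust the locals to whatever your template expects):
app.get('/', function(req, res, next){
  request.post('http://data.fixer.io/api/latest?access_key=' + apikey + '&symbols=gbp', function(err, response, body) {
    if (err) return next(err);
    var data;
    try {
      data = JSON.parse(body); // body arrives as a JSON string
    } catch (parseErr) {
      return next(parseErr);
    }
    res.render('index', { rates: data.rates });
  });
});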
I want to use the flickrapi (https://www.npmjs.com/package/flickrapi) package. I need to authorize it:
Flickr.tokenOnly(flickrOptions, function(error, flickr) {
//I need this flickr variable
});
and I want to use this flickr variable in my express code
app.get('/', function (req, res) {
//do something with flickr
});
How should I do it?
Modular approach:
Put your flickr connectivity code in a separate module:
flickr-public.js
var Flickr = require("flickrapi"),
flickrOptions = {
api_key: "API key that you get from Flickr",
secret: "API key secret that you get from Flickr"
};
module.exports = (function(){
Flickr.tokenOnly(flickrOptions, function(error, flickr) {
//handle error here
console.log('Flickr Object Obtained');
return flickr;
});
})();
Note: it is better to instantiate the flickr object from your app.js file, so that it gets created as soon as the server starts. This flickr object is for the public API only and does not need to authenticate again and again.
You can trigger the instantiation by simply requiring the module in your app.js file:
require('./flickr-public');
Now access the flickr object anywhere by requiring the same module.
routes.js
const flickr = require('../path-to/flickr-public');
app.get('/', function (req, res) {
  // use flickr.instance to perform actions
  // (it stays null until the tokenOnly callback has fired)
});
Explanation:
From the Node.js documentation:
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
Multiple calls to require('foo') may not cause the module code to be executed multiple times.
Just put it inside your get handler:
app.get('/', function (req, res) {
  Flickr.tokenOnly(flickrOptions, function(error, flickr) {
    // do something with flickr, then e.g. res.status(200).send('what you want here');
  });
});
Use it directly inside your route callback:
app.get('/', function (req, res) {
  Flickr.tokenOnly(flickrOptions, function(error, flickr) {
    // call some other method to get photos etc. and finally call res.send()
    res.send(photos); // where photos is obtained from flickr, or anything else that should be the response of your request
  });
});
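For instance, a sketch that fetches some photos before responding - flickr.photos.search and its parameters follow the flickrapi README, but treat the exact call and the result shape as assumptions and check the package docs:
app.get('/', function (req, res, next) {
  Flickr.tokenOnly(flickrOptions, function(error, flickr) {
    if (error) return next(error);
    flickr.photos.search({ user_id: 'some-user-id', per_page: 10 }, function(err, result) {
      if (err) return next(err);
      res.send(result.photos); // result.photos holds the search results
    });
  });
});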
I am building a NodeJS server using Express 4. I use this server as a middleman between a frontend Angular app and a 3rd-party API.
I created a path that my frontend app requests; on that path I wish to call the API multiple times, merge all of the responses, and then send the resulting response.
I am not sure how to do this, as I need to wait until each API call is finished.
Example code:
app.post('/SomePath', function(req, res) {
  var merged = [];
  for (var i in req.body.object) {
    // APIObject.sendRequest uses superagent module to handle requests and responses
    APIObject.sendRequest(req.body.object[i], function(err, result) {
      merged.push(result);
    });
  }
  // After all is done send result
  res.send(merged);
});
As you can see, I'm calling the API within a loop, once for each object received in the request body.
How can I send a response after all is done and the API responses are merged?
Thank you.
Check out this answer; it uses the Async module to make a few requests at the same time and then invokes a callback when they are all finished.
As per @sean's answer, I believe async.each would fit better than async.map.
It would then look something like this:
var async = require('async');

var merged = [];
async.each(req.body.object, function(item, callback) {
  APIObject.sendRequest(item, function(err, result) {
    if (err)
      callback(err);
    else {
      merged.push(result);
      callback();
    }
  });
}, function(err) {
  if (err)
    res.sendStatus(500); //Example
  else
    res.send(merged);
});
First of all, you can't just fire off async calls in a plain loop and send the response straight away - the loop finishes before any of the callbacks have run.
You can use the async module's map function.
app.post('/SomePath', function(req, res) {
  // note: if sendRequest uses `this` internally, pass APIObject.sendRequest.bind(APIObject) instead
  async.map(req.body.object, APIObject.sendRequest, function(err, result) {
    if(err) {
      res.status(500).send('Something broke!');
      return;
    }
    res.send(result);
  });
});
I'm making a web application using the MEAN framework and MVC design pattern. I am trying to perform a POST request from the Angular front-end for finding a document in my server-side MongoDB (version 2.4.9). The console logs show that the query is successful, but when I try to send the response back to the client, the query result is undefined.
I understand that NodeJS is asynchronous and uses callbacks, but I am having trouble understanding what is wrong with my code. I tried using returns and callbacks but I can't get it working. I'm confused how to use the controller to access the model and have the controller ultimately send the response.
Here is my code to connect to the database (model):
var MongoClient = require('mongodb').MongoClient;

module.exports = {
  readDocument: function(callback, coll, owner) {
    // Connect to database
    MongoClient.connect("mongodb://localhost:27017/tradingpost", function(err, db) {
      if (err) {
        console.log("Cannot connect to db (db.js)");
        callback(err);
      }
      else {
        console.log("Connected to DB from db.js: ", db.databaseName);
        // Read document by owner
        // Get the documents collection
        var collection = db.collection(coll);
        // Find document
        collection.find({owner: owner}).toArray(function (err, result) {
          if (err) {
            console.log(err);
          } else if (result.length) {
            console.log('Found:', result);
          } else {
            console.log('No document(s) found with defined "find" criteria!');
          }
          // Close connection
          db.close();
          return callback(result);
        });
      }
    });
  }
};
And here is my controller that sends the response:
var model = require('../models/db');

exports.sendRecentPosts = function (req, res) {
  // Connect to the DB
  // Run query for recent posts
  // Close the connection
  // Send the data to the client
  var result = model.readDocument(dbCallback, "gs", "Mana");
  res.end( result );
};
Client's post request:
// Use post for secure queries
// Need recent posts for display
$http.post('/recent').
success(function(responseData) {
$scope.testValue = responseData;
}).
error(function(responseData) {
console.log('Recent posts POST error. Received: ', responseData);
});
Snippet for my express route:
var goodsServices = require('../controllers/gs-server-controller.js');
app.post('/recent', goodsServices.sendRecentPosts);
I have been struggling with this for a long time and searched the forum for solutions but could not find any. Thanks for any feedback.
I do not know why this question has not been answered yet. When I faced the same problem, I learnt that the response to a DB query is only available once the DB transaction is complete. Try placing db.close() within the success callback of the find() call.
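A sketch of what that can look like in the model above, and of how the controller then consumes the result through the callback rather than a return value (readDocument itself returns nothing) - the error-first callback(err, result) signature used here is a small change from the question's code:
// model (db.js) - inside the connected branch
collection.find({owner: owner}).toArray(function (err, result) {
  db.close(); // close once the query has completed
  callback(err, result); // hand the data (or the error) back to the caller
});

// controller - respond from inside the callback, not from a return value
exports.sendRecentPosts = function (req, res) {
  model.readDocument(function (err, result) {
    if (err) return res.status(500).end();
    res.json(result);
  }, "gs", "Mana");
};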
A web app I'm building will send out invoices to clients every third month. This will be a scheduled event that is run in the middle of the night, but under development I have put this code into a route so I can test it.
In short, I want the code to do the following.
Query all unsent invoices from the DB.
Make a call to Mandrill for each invoice (in this call I'm also invoking a function that creates a Mandrill message object from the invoice).
For every message Mandrill sends, update the DB invoice with sent: true.
When all invoices are sent, make a final callback in the async.waterfall.
The code below works, but I have some concerns regarding the _.each.
invoices.post('/invoices/send/', function(req, res, next) {
  async.waterfall([
    // Query all unsent invoices
    function(callback) {
      db.invoices.find({sent: false}).toArray(callback);
    },
    // Send all unsent invoices
    function(invoices, callback) {
      if (invoices.length === 0) {
        var err = new Error('There are no unsent invoices');
        err.status = 400;
        return next(err); //Quick escape if there are no matching invoice to process
      }
      // Make a call to Mandrill transactional email service for every invoice.
      _.each(invoices, function(invoice) {
        mandrillClient.messages.sendTemplate({template_name: "planpal-invoice", template_content: null, message: mandrillClient.createInvoiceMessage(invoice)}, function(sendResult) {
          console.log(sendResult);
          db.invoices.updateById(invoice._id, {$set: {sent: true}}, function(err, saveResult) {
            console.log(saveResult);
          });
        }, function(err) {
          return next(err);
        });
      });
      callback(null, 'done');
    }
  ],
  function(err, result) {
    if (err) {
      return next(err);
    }
    res.json(result);
  });
});
I'm thinking I should use async.eachLimit instead... but I don't know how to write it.
I have no idea what I should set the limit to, but I guess several parallel requests would be better than running all the Mandrill requests in series like above, am I wrong? EDIT: _.each starts the requests in parallel; the difference from async.each is that I don't get a "final callback".
Conclusion: should I use async.eachLimit above? If yes, what is a good limit value?
I think you can use the https://github.com/caolan/async#each function.
It will execute the queries in parallel too.
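A rough sketch of how the send-all step could look with async.eachLimit, reusing the helpers from the question - the limit of 5 concurrent Mandrill calls is an arbitrary assumption, and Mandrill's separate success/error callbacks are kept as in the original code:
// inside the second async.waterfall step, replacing the _.each block
async.eachLimit(invoices, 5, function(invoice, done) {
  mandrillClient.messages.sendTemplate(
    {template_name: "planpal-invoice", template_content: null, message: mandrillClient.createInvoiceMessage(invoice)},
    function(sendResult) {
      console.log(sendResult);
      db.invoices.updateById(invoice._id, {$set: {sent: true}}, function(err, saveResult) {
        done(err); // this invoice is finished (or failed while saving)
      });
    },
    function(err) {
      done(err); // Mandrill rejected the send
    }
  );
}, function(err) {
  // called once every invoice has been processed, or as soon as one errors
  callback(err, 'done');
});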