So, I have the following JS files:
test_api.js:
var request = require("request");
// I actually have a url here. API call which returns JSON.
var url = "";
request({
    url: url,
    json: true
}, function (error, response, body) {
    if (!error && response.statusCode === 200) {
        module.exports = body;
        console.log(body); // Prints the json response
    }
});
test.js:
var api = require('./test_api.js');
console.log(api);
So, when I run node test.js I get:
console.log(body)  // Prints the JSON response
console.log(api);  // Prints an empty object
Any idea why I get an empty object back?
When you call request(), you pass it a callback function. That callback function is called sometime in the future (it's an asynchronous callback). Meanwhile, the rest of your module continues to execute, and your module's initialization completes with nothing assigned to module.exports yet.
So, when the caller does:
var api = require('./test_api.js');
The module has already finished loading, and nothing has been assigned to module.exports yet, so it is still an empty object; api therefore contains only an empty object.
Then, sometime later, your request() operation finishes and calls its callback. You then assign something to module.exports, but it's too late: the module has already been loaded and the caller already grabbed the old module.exports before you replaced it.
All network I/O in node.js is asynchronous. This means that the completion callback is called some indeterminate time in the future while the rest of your JavaScript continues to execute. The only place you can process asynchronous results is inside the completion callback or in some other function that is called from that callback. Or, you can use promises to do that type of work for you.
So, basically, you can't return results retrieved by asynchronous operations from the loading of a module, and you can't assign them to module.exports after the fact. Instead, the modern way to design this is to export either a promise or a function that returns a promise; the caller can then use .then() on the promise to get access to the results.
Here would be a modern way to implement what it looks like you're trying to do using a promise.
var request = require("request");
// I actually have a url here. API call which returns JSON.
var url = "";

function requestP(options) {
    return new Promise((resolve, reject) => {
        request(options, (error, response, body) => {
            if (error) {
                reject(error);
            } else if (response.statusCode !== 200) {
                reject(new Error(`Network request returned status code ${response.statusCode}`));
            } else {
                resolve(body);
            }
        });
    });
}

module.exports = requestP({url, json: true});
Then, the caller would use that like this:
let api = require('./test_api.js');
api.then(body => {
    // process body here
}).catch(err => {
    // process err here
});
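As mentioned, you could also export a function that returns a promise rather than the promise itself, so the request only runs when the caller invokes it. A minimal sketch of that variant (the getBody name is made up for illustration):
// in test_api.js, after requestP is defined
module.exports = function getBody() {
    return requestP({url, json: true});
};

// caller (test.js)
let getBody = require('./test_api.js');
getBody().then(body => {
    // process body here
}).catch(err => {
    // process err here
});
With this variant each caller triggers its own request, whereas exporting the promise directly (as above) fetches once at module load and every caller shares that single result.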
For a more general discussion of returning asynchronous results, see How do I return the response from an asynchronous call?
You cannot assign to module.exports (or to exports) asynchronously. Instead, consider exporting a function that accepts a callback and performs the request (caching/reusing the result if needed).
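A minimal sketch of that approach, reusing the request options from the question (the caching behaviour is an illustrative assumption):
var request = require("request");
var url = ""; // the real API url goes here
var cached = null;

module.exports = function getData(callback) {
    if (cached) {
        // reuse the previously fetched body on later calls
        return process.nextTick(function () { callback(null, cached); });
    }
    request({ url: url, json: true }, function (error, response, body) {
        if (error) return callback(error);
        if (response.statusCode !== 200) {
            return callback(new Error("Unexpected status code " + response.statusCode));
        }
        cached = body;
        callback(null, body);
    });
};

// caller (test.js)
var getData = require('./test_api.js');
getData(function (err, api) {
    if (err) return console.error(err);
    console.log(api); // the JSON body
});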
I am starting to learn about promises in JavaScript and I am still not getting my head around it. The code below is mostly real code. I have placed several debugger statements so the program stops and I can understand how the flow works and inspect some variables. I have read some blog posts about promises and I still can't understand everything. This is from an app which uses AngularJS and the Q library.
A few questions:
1- What does deferred.resolve() do exactly? What is it doing with response.data? When I inspected the 'deferred' object and its 'promise' object, I couldn't see any trace of response.data.
2- When I resumed execution after debugger #1, I thought the http post statement would run but execution jumped to the return statement. I guess that's where the promise jumped in and the post will happen in the future?
3- How do I know when the post will happen when the function returns? The caller will get the return promise, what is the caller expected to do with it?
this.GetData = function ()
{
    var data = blahblah;
    var deferred = this.$q.defer();
    debugger; // 1
    this.$http.post(someurl, data,
    {
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        handleErrors: false
    })
    .then(function (response) {
        debugger; // 2
        // (do something...)
        deferred.resolve(response.data);
    },
    function (error) {
        // (log error...)
        deferred.reject(error);
    });
    debugger; // 3
    return deferred.promise;
};
It's appropriate to use q.defer() (an explicit creation of a promise) when wrapping callback-style code, i.e. "promisifying" it...
this.timeoutWithAPromise = function (msec) {
    let defer = q.defer();
    setTimeout(() => defer.resolve(), msec);
    return defer.promise;
};
The foregoing says: "Create a promise and return it right away. That promise will be fulfilled when msec has passed."
Question 1: resolve() fulfills the promise, calling whatever function has been set with then() and passing it whatever argument was passed to resolve().
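For instance, here is a tiny standalone illustration of that value flow (assuming q refers to Q, or Angular's $q; the "hello" value is made up):
let deferred = q.defer();
deferred.promise.then(function (value) {
    console.log(value); // logs "hello" once resolve is called
});
deferred.resolve("hello"); // the argument becomes the promise's fulfillment value
In your code, response.data plays the role of "hello": it becomes the value handed to whoever calls .then() on deferred.promise.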
Question 2: You're right, the async operation is begun and a promise for its completion is returned right away.
Question 3a: The post will begin (or the timeout will start, in my example) as soon as that invocation is made. The underlying work proceeds in the background, concurrently with execution of the calling code, until it completes, which, as you understand from Question 1, is done by resolving the promise and thereby invoking its then() handler.
Question 3b: What is the caller to do with a returned promise? Attach a block to it that you wish to run upon completion, possibly including additional async actions. Taking my example...
let self = this;
self.timeoutWithAPromise(1000).then(() => {
    console.log('1000msec have passed');
    return self.timeoutWithAPromise(1000);
}).then(() => {
    console.log('now, another 1000msec have passed');
    // ...
});
Rewriting your example, taking advantage of the fact that $http already returns a promise...
var data = blahblah;
let headers = { 'Content-Type': 'application/x-www-form-urlencoded' };
let config = { headers: headers, handleErrors: false };
let httpPromise = this.$http.post(someurl, data, config).then((response) => {
    console.log('got a response. tell our caller about it');
    return response.data;
}).catch((error) => {
    console.log('got an error. either handle it here or rethrow it');
    throw error;
});
// this is important: return the httpPromise so callers can chain off of it
return httpPromise;
Now a caller can say:
let promiseToDoEvenMore = this.GetData().then((data) => {
    console.log(data);
    return this.GetMoreData(); // etc.
});
return promiseToDoEvenMore; // and so on
I am trying to utilize request with Bluebird's Promises:
const request = Promise.promisify(require('request'));
Promise.promisifyAll(request);
Unfortunately, the result I am getting does not reflect what I expected based on the examples:
request('http://google.com').then(function (content) {
    // content !== String
    // Object.keys(content) => ['0', '1']
});
content is not a string
I have to access the content via content['0'] and content['1'], and that is where the response I actually expected, the HTML string, ends up.
This seems fishy to me, like I misused the Promise API here. What have I done wrong?
Use .spread(function(response, content){}) instead of .then(function(content){}).
Why not use Node's native Promises?
function req() {
    return new Promise(function (resolve, reject) {
        request('http://google.com', function (error, response, body) {
            if (error) {
                return reject(error);
            }
            resolve(body); // if JSON, use JSON.parse(body) instead of body
        });
    });
}

req().then(data => doSomethingWithData(data));
You should use spread() instead of then().
Here is a link to the Bluebird docs, which explain the usage of spread():
https://github.com/petkaantonov/bluebird/blob/2.x/API.md#spreadfunction-fulfilledhandler--function-rejectedhandler----promise
The callback of request takes three parameters (err, response, body). In the promisified request, err is handled by catch().
But how can the other two parameters be passed to the next chained callback? That can't be done with then(), which only delivers a single value.
So the promisified request resolves with an array of two values, the response and the body. They can be unpacked with spread(), which spreads a fixed-size array of results across the handler's parameters.
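For example, a short sketch of that usage (this assumes Bluebird 2.x; with Bluebird 3.x you would pass { multiArgs: true } to promisify to get the same array behaviour):
var Promise = require('bluebird');
var request = Promise.promisify(require('request'));

request('http://google.com').spread(function (response, body) {
    // response is the http.IncomingMessage, body is the HTML string
    console.log(response.statusCode);
    console.log(body.substring(0, 100));
}).catch(function (err) {
    // network-level errors end up here
    console.error(err);
});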
I have an event triggering a Meteor.call():
Meteor.call("runCode", myCode, function (err, response) {
    Session.set('code', response);
    console.log(response);
});
But my runCode function inside the server's Meteor.methods contains a callback of its own, and I can't find a way to make it return something to response in the code above.
runCode: function (myCode) {
    var command = 'pwd';
    child = exec(command, function (error, stdout, stderr) {
        console.log(stdout.toString());
        console.log(stderr.toString());
        // I want to return stdout.toString()
        // returning here causes undefined because runCode doesn't actually return
    });
    // I can't really return here because I don't yet have the value of stdout.toString();
}
I'd like a way to have the exec callback's result be returned as the result of runCode, without resorting to setInterval, which would work but feels like a hack to me.
You should use Future from fibers.
See docs here : https://npmjs.org/package/fibers
Essentially, what you want to do is wait until some asynchronous code has run and then return its result in a procedural fashion; this is exactly what Future does.
You will find out more here : https://www.eventedmind.com/feed/Ww3rQrHJo8FLgK7FF
Finally, you might want to use the Async utilities provided by this package: https://github.com/arunoda/meteor-npm, it will make your life easier.
// load future from fibers
var Future = Npm.require("fibers/future");
// load exec
var exec = Npm.require("child_process").exec;

Meteor.methods({
    runCode: function (myCode) {
        // this method call won't return immediately, it will wait for the
        // asynchronous code to finish, so we call unblock to allow this client
        // to queue other method calls (see Meteor docs)
        this.unblock();
        var future = new Future();
        var command = myCode;
        exec(command, function (error, stdout, stderr) {
            if (error) {
                console.log(error);
                // report the error through the future so the waiting
                // method call (and thus the client) sees it
                return future.throw(new Meteor.Error(500, command + " failed"));
            }
            future.return(stdout.toString());
        });
        return future.wait();
    }
});
I am making a call to an API within a function. At this point I get "undefined" as the returned value. I know the call to the API is successful, since the URLs that I am trying to get in the call print to the terminal no problem. I am 99% sure the call to the encapsulating function returns before the request callback is done ("undefined" is returned before the URLs are listed). I wanted to confirm this and ask if there is a tutorial or code snippet someone could point me to with a good description of the pattern I should follow in this case. <-- Obviously still struggling with the async nature of the beast :)
function GetPhotoURLs(url) {
    var photo_urls = new Array();
    request({
        url: url,
        json: true
    }, function (error, response, body) {
        if (!error && response.statusCode === 200)
        {
            //console.log(body) // Print the json response
            var inhale_jsonp = body;
            var blog_stream = inhale_jsonp.substring(22, inhale_jsonp.length - 2); // getting JSON out of the wrapper
            blog_stream = JSON.parse(blog_stream);
            for (var i = 0; i < blog_stream.posts.length; i++)
            {
                photo_urls[i] = blog_stream['posts'][i]['photo-url-500'];
                console.log(photo_urls[i] + "\n"); // checking that I am getting all the URLs
            }
            console.log("success!");
            console.log(photo_urls[1]);
            return photo_urls;
        }
        else
        {
            photo_urls[0] = 'none';
            console.log("nope!");
            console.log(photo_urls[0]);
            return photo_urls;
        }
    });
}
output sequence -->
1. undefined
2. Listing of URLs
3. Success message + second element from URLs array
The request() function is asynchronous. As such, it finishes long after your original function has completed. Thus, you can't return the result from it (the result isn't even known yet when the function returns). Instead, you must process the result IN your callback or call a function from within that callback and pass the result to that function.
As for general design patterns for this type of work, you can either process the result in the callback as suggested above or switch to using promises to help you manage the asynchronous nature of the request. In all cases, you will be processing the result in some sort of callback.
You can read this answer for some more detail on handling async responses. That specific answer is written for client-side ajax calls, but the concepts for handling async responses are identical.
In your specific case, you may want to make GetPhotoURLs() take a callback function which you can call with the result:
function GetPhotoURLs(url, callback) {
    .....
    request(..., function (error, response, body) {
        ...
        // when you have all the results
        callback(photo_urls);
    });
}
GetPhotoURLs(baseurl, function (urls) {
    // process urls here
    // other code goes here after processing the urls
});
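As a side note, the conventional Node callback shape is error-first, which lets the failure branch report an error instead of returning ['none']. A rough sketch of that variant (the parsing step is elided and the error-handling details are assumptions):
var request = require("request");

function GetPhotoURLs(url, callback) {
    request({ url: url, json: true }, function (error, response, body) {
        if (error || response.statusCode !== 200) {
            return callback(error || new Error("Bad status code: " + response.statusCode));
        }
        var photo_urls = [];
        // ... parse body and fill photo_urls exactly as in the question ...
        callback(null, photo_urls);
    });
}

GetPhotoURLs(baseurl, function (err, urls) {
    if (err) return console.log("nope!", err);
    // process urls here
});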
I'm using node.js and the async package.
Here's the code I have:
async.waterfall(
    [
        function (callback) {
            var data = getSomeData();
            callback(null, data);
        },
        function (data, callback) {
            someFunctionThatNeedsData(data);
            callback(null, 'done');
        }
    ],
    function (err, result) {
    }
);
getSomeData has an asynchronous HTTP request that grabs some data from a web service. I'd like to wait until I get a response, and then return that data and pass it to someFunctionThatNeedsData.
What I expected was that getSomeData -- including the callback inside of it -- would have to complete before moving on to invoke someFunctionThatNeedsData.
The problem is that, despite using the waterfall function here, data is undefined by the time it gets to someFunctionThatNeedsData.
Additionally, from console.log I can see that the end of getSomeData is reached before the callback inside of getSomeData even begins.
Am I using waterfall incorrectly, or is it just not the right tool here? If it's just not right, what can I use to achieve the desired effect?
Or do I have to resign myself to deeply nested callbacks (which, with future work, I will have) and just mitigate it by extracting inline code into named functions?
getSomeData() has an asynchronous http request that grabs some data from a web service.
This is the issue: getSomeData() returns before its HTTP request has finished, so the execution flow already continued to your waterfall callback and executed it with data still undefined. This is how asynchronous functions work!
You have to pass the callback to getSomeData, which calls it once the HTTP request finishes. So yes: you may need to nest the callbacks.
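For example, a minimal sketch of that shape (the use of request and the someUrl variable inside getSomeData are assumptions for illustration; the point is that getSomeData reports back only through its callback):
var async = require("async");
var request = require("request");
var someUrl = "http://example.com/data.json"; // placeholder

function getSomeData(callback) {
    request({ url: someUrl, json: true }, function (error, response, body) {
        if (error) return callback(error);
        callback(null, body);
    });
}

async.waterfall(
    [
        function (callback) {
            getSomeData(callback); // waterfall passes (err, data) on to the next task
        },
        function (data, callback) {
            someFunctionThatNeedsData(data);
            callback(null, 'done');
        }
    ],
    function (err, result) {
        // err is set if any task failed
    }
);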
If you have an async operation, you don't necessarily need async.waterfall. You could just do it in a promise-chain style.
getSomeData().then(function (data)
{
    var changeData = changeYourData(data);
    return changeData;
}).then(function (changedData)
{
    // do some more stuff with it. You can keep on forwarding to the next `then`
}).catch(function (err)
{
    // any error thrown at any point will get caught here
}).finally(function ()
{
    // this one is guaranteed to get called no matter what,
    // exactly like async.waterfall's end-of-chain callback
});
This example will work with Q, When, and any promise library that follows the same conventions.
If you need to use async.waterfall (for example, because you are building the task list with an Array.map), you just need to call the callback inside your then():
async.waterfall(
    [
        function (callback) {
            // Option A: resolve the promise here and pass the plain data on
            getSomeData().then(function (data)
            {
                callback(null, data);
            });
            // Option B (alternative, use one or the other): just pass the whole promise on
            callback(null, getSomeData());
        },
        function (data, callback) {
            // Option A: data is already the plain value
            someFunctionThatNeedsData(data);
            // Option B: data is a promise, so unwrap it first
            data.then(function (resolvedData)
            {
                someFunctionThatNeedsData(resolvedData);
                callback(null, 'done');
            });
        }
    ],
    function (err, result) {
    });
Hope this helps.