I'm probably missing the point somewhere here so I'm looking for advice.
I have a nodejs server which is listening for client connections and, based on the data received, makes calls to an API.
The very first call to that API gets an ID which needs to be used on subsequent calls to group them together.
Where I'm struggling: the call to the API is necessarily asynchronous, and in the callback I'm assigning the ID to a variable. While that async call is being processed by the API server, more data is coming in from the client and needs further API calls, but I can't fire them until I know the result of the first call, since the subsequent calls depend on it.
What's the proper way to handle this? I feel like I should be using Q to promise the results of the first API call to the second, but I'm not sure how it should be structured. Or should I just be queueing up the API calls until the first completes? How would I do that?
Example problem code:
var server = net.createServer();
//set up the callback handler
server.on('connection', handleConnection);
function handleConnection(conn) {
  //do some stuff...
  firstAPICall();
  conn.on('data', handleData);
}
function handleData(data) {
  //do some stuff...
  otherAPICall();
}
function firstAPICall() {
  client.get("http://myAPI/getID", function (data, response) {
    conn.myID = data[0].myID;
  });
}
function otherAPICall() {
  //How do I make sure I actually have a value
  //in conn.myID from the first function???
  client.post("http://myAPI/storeData", { data: {myID: conn.myID, data: someData} }, function (data, response) {
    //do some stuff...
  });
}
Yes, you should be using promises for this. Make a promise for the id that is asynchronously resolved from the first call, and then use it in the subsequent calls:
function handleConnection(conn) {
//do some stuff...
var idPromise = firstAPICall();
conn.on('data', function handleData(data) {
//do some stuff...
otherAPICall(idPromise).then(function(result) {
…
});
});
}
function firstAPICall() {
return Q.Promise(function(resolve, reject) {
client.get("http://myAPI/getID", function (data, response) {
resolve(data[0].myID);
});
});
}
function otherAPICall(idPromise) {
return idPromise.then(function(myID) {
return new Promise(function(resolve, reject) {
client.post("http://myAPI/storeData", {
data: {myID:myID, data:someData}
}, function (data, response) {
//do some stuff...
resolve(…);
});
});
});
}
You should probably factor out creating a promise for the result of a client.get call into a separate helper function. Also make sure to handle errors correctly there and call reject with them. If client followed the Node callback convention (error-first callbacks), Q even has helper functions for that, such as Q.nfcall and Q.denodeify.
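A minimal sketch of such a helper, assuming the client.get callback signature (data, response) from the question; getAsPromise and the status-code check are illustrative, so adapt them to whatever your HTTP client actually reports:
// Hypothetical helper: wraps client.get in a Q promise.
function getAsPromise(url) {
  return Q.Promise(function (resolve, reject) {
    client.get(url, function (data, response) {
      if (response && response.statusCode >= 400) {
        reject(new Error("Request failed with status " + response.statusCode));
      } else {
        resolve(data);
      }
    });
  });
}

// firstAPICall could then be written as:
function firstAPICall() {
  return getAsPromise("http://myAPI/getID").then(function (data) {
    return data[0].myID;
  });
}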
Try using promises, and then use 'then' to chain otherAPICall() after the first call.
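For instance, a minimal sketch of that idea, assuming firstAPICall() returns a promise for the ID and otherAPICall() accepts the resolved ID directly:
firstAPICall().then(function (myID) {
  // the ID is guaranteed to be available here
  return otherAPICall(myID);
});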
I think you can assume they will be sending data immediately after connecting, so you can simplify: just check in otherAPICall whether you already have an ID and, if not, fetch it first via a callback. Promises or the async/await keywords might make things nicer down the line but aren't required for this.
var server = net.createServer();
//set up the callback handler
server.on('connection', handleConnection);
function handleConnection(conn) {
conn.on('data', function(data) { handleData(conn, data); });
}
function handleData(conn, data) {
//do some stuff...
otherAPICall(conn);
}
function checkID(conn, cb) {
if (!conn.myID) {
client.get("http://myAPI/getID", function (data, response) {
conn.myID = data[0].myID;
cb();
});
} else {
cb();
}
}
function otherAPICall(conn) {
checkID(conn, function() {
client.post("http://myAPI/storeData", { data: {myID:conn.myID, data:someData} }, function (data, response) {
//do some stuff...
});
});
}
Promises can chain values: each then callback receives the value returned by the previous step, and runs only after that step has resolved. For example:
function async(value) {
var deferred = $q.defer();
var asyncCalculation = value / 2;
deferred.resolve(asyncCalculation);
return deferred.promise;
}
var promise = async(8)
.then(function(x) {
return x+1;
})
.then(function(x) {
return x*2;
})
.then(function(x) {
return x-1;
});
promise.then(function(x) {
console.log(x);
});
This value passes through all the success callbacks and so the value 9 is logged ((8 / 2 + 1) * 2 - 1).
I am trying to wrap my post/get/put/delete calls so that any time they are called, if they fail they will check for expired token, and try again if that is the reason for failure, otherwise just resolve the response/error. Trying to avoid duplicating code four times, but I'm unsure how to resolve from a non-anonymous callback.
factory.post = function (url, data, config) {
var deferred = $q.defer();
$http.post(url, data, config).then(factory.success, factory.fail);
return deferred.promise;
}
factory.success = function (rsp) {
if (rsp) {
//how do I resolve the parent's promise from here?
}
}
The alternative is to duplicate this 4 times:
.then(function (rsp) {
factory.success(rsp, deferred);
}, function (err) {
factory.fail(err, deferred);
});
One solution might be to use the bind function:
function sum(a){
return a + this.b;
}
function callFn(cb){
return cb(1);
}
function wrapper(b){
var extra = {b: b};
return callFn(sum.bind(extra));
}
console.log(wrapper(5));  // 1 + 5  -> 6
console.log(wrapper(-5)); // 1 + -5 -> -4
console.log(wrapper(50)); // 1 + 50 -> 51
For your solution, see the example below:
factory.post = function (url, data, config) {
var deferred = $q.defer();
$http.post(url, data, config).then(factory.success.bind({deferred: deferred}), factory.fail.bind({deferred: deferred}));
return deferred.promise;
}
factory.success = function (rsp) {
if (rsp) {
this.deferred.resolve(rsp); // the parent's promise is resolved here via the bound deferred
} else {
//retry or reject here
}
}
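For completeness, a consumer of this wrapper might then look like the following sketch (the URL and payload here are made up); whether the caller ever sees an error depends on the retry-or-reject branch in factory.fail:
factory.post('/api/items', payload, config).then(function (rsp) {
  // resolved by factory.success via the bound deferred
}, function (err) {
  // rejected by factory.fail once retries are exhausted
});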
From what I understand, you just want to resolve the deferred object on success and retry on error in the case of an expired token. You probably also want to keep a count of the number of retries.
Edit - Seems I misunderstood the question. The answer suggested by Atiq should work. Alternatively, if you are using a functional JS library such as Underscore or Ramda, you can use partial application: pre-fill the deferred argument so that the remaining arguments are supplied later when the HTTP call settles. I have modified the code snippet to use _.partial from Underscore (Ramda's curry would work similarly):
factory.post = function (url, data, config) {
var deferred = $q.defer();
$http.post(url, data, config).then(
_.partial(factory.success, deferred),
_.partial(factory.fail, deferred));
return deferred.promise;
}
factory.success = function (deferred, rsp) {
if (rsp) {
//handle resp
deferred.resolve(rsp);
}
}
factory.fail = function(deferred, err){
//handle retry
deferred.reject(err);
}
I'm using Angular 1.5.8. The views in my app require different combinations of the same 3 ajax requests. Some views require data from all three, others require data from two, or even one single endpoint.
I'm working on a function that will manage the retrieval of this data, requiring the app to only call each endpoint once. I want the ajax requests to be called as needed, but only when needed. Currently I've created a function which works, but seems like it could use improvement.
The following function is contained within the $rootScope. It uses the fetchData() function to cycle through the get requests as requested. When data is retrieved, it is stored in the global variable $rootScope.appData and then fetchData() is called again. When all data is retrieved the deferred promise is resolved and the data is returned to the controller.
$rootScope.appData = {};
$rootScope.loadAppData = function(fetch) {
var deferred = $q.defer();
function getUser() {
$http
.get('https://example.com/api/getUser')
.success(function(result){
$rootScope.appData.currentUser = result;
fetchData();
});
}
function getPricing() {
$http
.get('https://example.com/api/getPricing')
.success(function(result) {
$rootScope.appData.pricing = result;
fetchData();
});
}
function getBilling() {
$http
.get('https://example.com/api/getBilling')
.success(function(result) {
$rootScope.appData.billing = result;
fetchData();
});
}
function fetchData() {
if (fetch.user && !$rootScope.appData.currentUser) {
getUser();
} else if (fetch.pricing && !$rootScope.appData.pricing) {
getPricing();
} else if (fetch.billing && !$rootScope.appData.billing) {
getBilling();
} else {
deferred.resolve($rootScope.appData);
}
}
if ($rootScope.appData.currentUser && $rootScope.appData.pricing && $rootScope.appData.billing) {
deferred.resolve($rootScope.appData);
} else {
fetchData();
}
return deferred.promise;
};
An object, fetch, is passed as an argument; it indicates which ajax requests to make. An example call to $rootScope.loadAppData() where only user and pricing data are requested would look like this:
$rootScope.loadAppData({user: true, pricing: true}).then(function(data){
//execute view logic.
});
I'm wondering:
Should the chaining of these functions be done differently? Is the fetchData() function sufficient, or is this an odd way to execute this functionality?
Is there a way to call all needed Ajax requests simultaneously, but wait for all required calls to complete before resolving the promise?
Is it unusual to store data like this in the $rootScope?
I'm aware that this function is not currently handling errors properly. This is functionality I will add before using this snippet, but isn't relevant to my question.
Instead of using the deprecated .success method, use the .then method and return the data from its success handler so it can be chained:
function getUserPromise() {
var promise = $http
.get('https://example.com/api/getUser')
.then( function successHandler(result) {
//return data for chaining
return result.data;
});
return promise;
}
Use a service instead of $rootScope:
app.service("myService", function($q, $http) {
this.loadAppData = function(fetchOptions) {
//Create first promise
var promise = $q.when({});
//Chain from promise
var p2 = promise.then(function(appData) {
if (!fetchOptions.user) {
return appData;
} else {
var derivedPromise = getUserPromise()
.then(function(user) {
appData.user = user;
//return data for chaining
return appData;
});
return derivedPromise;
}
});
//chain from p2
var p3 = p2.then(function(appData) {
if (!fetchOptions.pricing) {
return appData;
} else {
var derivedPromise = getPricingPromise()
.then(function(pricing) {
appData.pricing = pricing;
//return data for chaining
return appData;
});
return derivedPromise;
}
});
//chain from p3
var p4 = p3.then(function(appData) {
if (!fetchOptions.billing) {
return appData;
} else {
var derivedPromise = getBillingPromise()
.then(function(billing) {
appData.billing = billing;
//return data for chaining
return appData;
});
return derivedPromise;
}
});
//return final promise
return p4;
}
});
The above example creates a promise for an empty object. It then chains three operations. Each operation checks whether a fetch is necessary. If needed, a fetch is executed and the result is attached to the appData object; if no fetch is needed, the appData object is passed on to the next operation in the chain.
USAGE:
myService.loadAppData({user: true, pricing: true})
.then(function(appData){
//execute view logic.
}).catch(function rejectHandler(errorResponse) {
console.log(errorResponse);
throw errorResponse;
});
If any of the fetch operations fail, subsequent operations in the chain will be skipped and the final reject handler will be called.
Because calling the .then method of a promise returns a new derived promise, it is easily possible to create a chain of promises. It is possible to create chains of any length and since a promise can be resolved with another promise (which will defer its resolution further), it is possible to pause/defer resolution of the promises at any point in the chain. This makes it possible to implement powerful APIs. -- AngularJS $q Service API Reference - Chaining Promises
Found a good way to answer question 2 in the original post. Using $q.all() allows the promises to execute simultaneously, resolving once they all complete or failing as soon as one of them fails. Following @georgeawg's advice, I've also moved this logic into a service. Here's my rewrite, putting the code into a service and running all calls at the same time:
services.factory('appData', function($http, $q) {
var appData = {};
var coreData = {};
appData.loadAppData = function(fetch) {
var deferred = $q.defer();
var getUser = $q.defer();
var getPricing = $q.defer();
var getBilling = $q.defer();
if (!fetch.user || coreData.currentUser) {
getUser.resolve();
} else {
$http
.get('https://example.com/api/getUser')
.success(function(result){
coreData.currentUser = result;
getUser.resolve();
}).error(function(reason) {
getUser.reject(reason);
});
}
if (!fetch.billing || coreData.billing) {
getBilling.resolve();
} else {
$http
.get('https://example.com/api/getBilling')
.success(function(result) {
coreData.billing = result;
getBilling.resolve();
}).error(function(reason) {
getBilling.reject(reason);
});
}
if (!fetch.pricing || coreData.pricing) {
getPricing.resolve();
} else {
$http
.get('https://example.com/api/getPricing')
.success(function(result) {
coreData.pricing = result;
getPricing.resolve();
}).error(function(reason) {
getPricing.reject(reason);
});
}
$q.all([getPricing.promise, getUser.promise, getBilling.promise]).then(function(result) {
deferred.resolve(coreData);
}, function(reason){
deferred.reject(reason);
});
return deferred.promise;
};
return appData;
});
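As a side note, the extra deferreds aren't strictly necessary: $http already returns promises, and Angular's $q.all also accepts an object hash. A condensed sketch of the same idea under that assumption (getOrCached is an illustrative helper, not part of the code above):
appData.loadAppData = function (fetch) {
  // Resolve a key from the cache if present; otherwise fetch it, but only if requested.
  function getOrCached(key, url, wanted) {
    if (!wanted || coreData[key]) {
      return $q.when(coreData[key]);
    }
    return $http.get(url).then(function (response) {
      coreData[key] = response.data;
      return response.data;
    });
  }

  return $q.all({
    currentUser: getOrCached('currentUser', 'https://example.com/api/getUser', fetch.user),
    pricing: getOrCached('pricing', 'https://example.com/api/getPricing', fetch.pricing),
    billing: getOrCached('billing', 'https://example.com/api/getBilling', fetch.billing)
  }).then(function () {
    return coreData;
  });
};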
I have this API call where I make sure the data returns in the same order I send it. However, I realized that's not really what I want; I want to make sure each piece of data is sent and handled one at a time:
data[n] has returned before data[n+1] is sent.
The reason for this is:
If I do it as seen below, the server still gets the requests in an unpredictable order and therefore saves the data in my DB in that order (well, not exactly random: heavier data just gets processed more slowly).
var promiseArray = [];
for (var i = 0; i < data.length; i++) {
var dataPromise = $http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
.then (function (response) {
//return data for chaining
return response.data;
});
promiseArray.push(dataPromise);
}
$q.all(promiseArray).then(function (dataArray) {
//success
}).catch (function (errorResponse) {
//error
});
How can I make sure the data is sent, processed, and returned one at a time, in a clean way?
You could do something like this:
var i = -1;
processNextdata();
function processNextdata() {
i++;
if(angular.isUndefined(data[i]))
return;
$http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
.then(processNextdata)
}
Update:
Callback after every result:
var i = -1;
processNextdata();
function processNextdata() {
i++;
if(angular.isUndefined(data[i]))
return;
return $http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
.then(function(result) {
// do something with a single result
return processNextdata();
}, errorCallback);
}
Callback after everything is done:
var i = -1, resultData = [];
processNextdata()
.then(function(result) {
console.log(result);
}, errorCallback);
function processNextdata() {
i++;
if(angular.isUndefined(data[i]))
return $q.when(resultData);
return $http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
.then(function(result) {
resultData.push(result.data);
return processNextdata();
}, $q.reject);
}
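An equivalent way to build the same sequential chain, without the shared counter variable, is Array.prototype.reduce; a sketch using the same data array and the $http/$q/$httpParamSerializer services:
// Chain one POST per item, strictly in order, collecting the results.
function postSequentially(items) {
  return items.reduce(function (chain, item) {
    return chain.then(function (results) {
      return $http.post('/api/bla/blabla', $httpParamSerializer(item))
        .then(function (response) {
          results.push(response.data);
          return results;
        });
    });
  }, $q.when([]));
}

postSequentially(data).then(function (resultData) {
  // all requests completed, in order
}, function (errorResponse) {
  // the first failure stops the chain and lands here
});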
When using the Promise.all([...]) method, the documentation shows the following:
The Promise.all(iterable) method returns a promise that resolves when all of the promises in the iterable argument have resolved, or rejects with the reason of the first passed promise that rejects.
What this tells us is that there is no guaranteed ordering: the promises run concurrently and can complete in any order.
In your case, there is an expected order that you want your promises to run in, so using Promise.all([...]) won't satisfy your requirements.
What you can do instead is chain the promises that must run in sequence, and reserve Promise.all([...]) for the ones that can safely run in parallel.
I would create a method that takes a request as an argument, then returns the generated promise:
// Assumes the 'request' HTTP library, e.g. var request = require('request');
function makeRequest (req) {
  return new Promise(function (resolve, reject) {
    request({
      url: url
      , port: <port>
      , body: req
      , json: <true/false>
      , method: '<POST/GET>'
      , headers: {
      }
    }, function (error, response, body) {
      if (error) {
        reject(error);
      } else {
        resolve(body);
      }
    });
  });
}
You can then call this function and store the returned promise:
var responsePromise = makeRequest(myRequest);
Alternatively, you could create an array of your requests and then call the function:
var requests = [request1, request2, ..., requestN];
var responses = [];
for (var i = 0; i < requests.length; i++) {
responses.push(makeRequest(requests[i]));
}
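Note that responses ends up holding promises rather than resolved bodies, so you still have to wait on them; for example, when the relative order of completion doesn't matter:
Promise.all(responses).then(function (bodies) {
  // bodies[i] corresponds to requests[i]
}).catch(function (error) {
  // the first rejection ends up here
});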
Let's say I have some code that looks like this:
var doSomething = function(parameter){
//send some data to the other function
return when.promise(function(resolveCallback, rejectCallback) {
var other = doAnotherThing(parameter);
//how do I check and make sure that other has resolved
//go out and get more information after the above resolves and display
});
};
var doAnotherThing = function(paramers){
return when.promise(function(resolveCallback, rejectCallback) {
//go to a url and grab some data, then resolve it
var s = "some data I got from the url";
resolveCallback({
data: s
});
});
};
How do I ensure that var other has completely resolved before finishing and resolving the outer doSomething() function? I'm still wrapping my head around Node's async characteristics.
I really didn't know how else to explain this, so I hope this makes sense! Any help is greatly appreciated.
EDIT: In this example, I am deleting things from an external resource, then, when that is done, going out to the external resource and grabbing a fresh list of the items.
UPDATED CODE
var doSomething = function(parameter){
//send some data to the other function
doAnotherThing(parameter).then(function(){
//now we can go out and retrieve the information
});
};
var doAnotherThing = function(paramers){
return when.promise(function(resolveCallback, rejectCallback) {
//go to a url and grab some data, then resolve it
var s = "some data I got from the url";
resolveCallback({
data: s
});
});
};
The return value of doAnotherThing is already a promise, so you can simply chain a then and use its callback to work with other. Since then itself returns a promise, you can return that instead.
// Do stuff
function doSomething(parameter){
return doAnotherThing(parameter).then(function(other){
// Do more stuff
return other
});
}
// Usage
doSomething().then(function(other){
// other
});
Below is how to accomplish what you're trying to do with bluebird.
You can use Promise.resolve() and Promise.reject() within any function to return data in a Promise that can be used directly in your promise chain. Essentially, by wrapping your result data with these methods and returning it, you can make any function usable within a Promise chain.
var Promise = require('bluebird');
var doSomething = function(parameter) {
// Call our Promise returning function
return doAnotherThing()
.then(function(value) {
// Handle value returned by a successful doAnotherThing call
})
.catch(function(err) {
// if doAnotherThing() had a Promise.reject() in it
// then you would handle whatever is returned by it here
});
}
function doAnotherThing(parameter) {
var s = 'some data I got from the url';
// Return s wrapped in a Promise
return Promise.resolve(s);
}
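Usage is then ordinary promise chaining; a small sketch:
doSomething('some parameter').then(function () {
  // runs only after doAnotherThing has resolved and its value has been handled
});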
You can use the async module and its waterfall method to chain the functions together:
var async = require('async');
async.waterfall([
function(callback) { // the first waterfall task receives only the callback
doSomething(parameter, function(err, other) {
if (err) return callback(err);
callback(null, other); // callback with null error and `other` object
});
},
function(other, callback) { // `other` is passed into the next function in the chain
doAnotherThing(other, function(err, result) {
if (err) return callback(err);
callback(null, result);
});
}
], function(err, result) {
if (err) return next(err);
res.send(result); // send the result when the chain completes
});
This makes it a little easier, in my opinion, to wrap your head around the series of async steps. See the async documentation for an explanation.
It is a common pattern to cascade across a list of data sources, with the first success breaking the chain, like this:
var data = getData1();
if (!data) data = getData2();
if (!data) data = getData3();
Et cetera. If the getDataN() functions are asynchronous, however, this leads us to 'callback hell':
var data;
getData1(function() {
getData2(function () {
getData3(function () { alert('not found'); })
})
});
where the implementations may look something like:
function getData1(callback) {
$.ajax({
url: '/my/url/1/',
success: function(ret) { data = ret },
error: callback
});
}
...with promises I would expect to write something like this:
$.when(getData1())
.then(function (x) { data = x; })
.fail(function () { return getData2(); })
.then(function (x) { data = x; })
.fail(function () { return getData3(); })
.then(function (x) { data = x; });
where the second .then actually refers to the return value of the first .fail, which is itself a promise, and which I understood was chained in as the input to the succeeding chain step.
Clearly I'm wrong, but what is the correct way to write this?
In most promise libs, you could chain .fail() or .catch() as in @mido22's answer, but jQuery's .fail() doesn't "handle" an error as such. It is guaranteed always to pass on the input promise (with unaltered state), which would not allow the required "break" of the cascade if/when success happens.
The only jQuery Promise method that can return a promise with a different state (or different value/reason) is .then().
Therefore you could write a chain which continues on error by specifying the next step as a then's error handler at each stage.
function getDataUntilAsyncSuccess() {
return $.Deferred().reject()
.then(null, getData1)
.then(null, getData2)
.then(null, getData3);
}
//The nulls ensure that success at any stage will pass straight through to the first non-null success handler.
getDataUntilAsyncSuccess().then(function (x) {
//"success" data is available here as `x`
}, function (err) {
console.log('not found');
});
But in practice, you might more typically create an array of functions or data objects which are invoked in turn with the help of Array method .reduce().
For example :
var fns = [
getData1,
getData2,
getData3,
getData4,
getData5
];
function getDataUntilAsyncSuccess(fns) {
return fns.reduce(function(promise, fn) {
return promise.then(null, fn);
}, $.Deferred().reject());// a rejected promise to get the chain started
}
getDataUntilAsyncSuccess(fns).then(function (x) {
//"success" data is available here as `x`
}, function (err) {
console.log('not found');
});
Or, as is probably a better solution here:
var urls = [
'/path/1/',
'/path/2/',
'/path/3/',
'/path/4/',
'/path/5/'
];
function getDataUntilAsyncSuccess(urls) {
return urls.reduce(function(promise, url) {
return promise.then(null, function() {
return getData(url);// call a generalised `getData()` function that accepts a URL.
});
}, $.Deferred().reject());// a rejected promise to get the chain started
}
getDataUntilAsyncSuccess(urls).then(function (x) {
//"success" data is available here as `x`
}, function (err) {
console.log('not found');
});
As a beginner, stumbling across the same problem, I just realized how much simpler this has become with async and await:
The synchronous pattern
var data = getData1();
if (!data) data = getData2();
if (!data) data = getData3();
can now easily be applied to asynchronous code:
let data = await getData1();
if (!data) data = await getData2();
if (!data) data = await getData3();
Just remember to add the async keyword to the function that this code is used in.
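For example, a sketch of the whole fallback wrapped in an async function with basic error handling (loadFirstAvailable is an illustrative name, and each getDataN is assumed to return a promise that resolves to a falsy value when nothing is found):
async function loadFirstAvailable() {
  try {
    let data = await getData1();
    if (!data) data = await getData2();
    if (!data) data = await getData3();
    return data;
  } catch (err) {
    console.log('not found', err);
  }
}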