I have an $http request that returns a bunch of rows, and I need to process each of those results synchronously. I'm having trouble wrapping my brain around how to do this in Angular.
Each of the records needs to be processed against a local SQLite database on an iOS device, and that is an asynchronous call.
If any of the records in the loop fails, I need to abort the entire operation (and the loop).
Here's the code, in case it helps...
var username = $rootScope.currentUser;
window.logger.logIt("Executing incremental sync with username " + username);
var url = $rootScope.serviceBaseUrl + 'SyncData/GetSyncItems?userid=' + username + '&lastSyncDate=' + lastSyncDate.toString();
var encoded = encoder.encode($CONFIG.serviceAccount);
$http.defaults.headers.common.Authorization = 'Basic ' + encoded;
$http({ method: 'GET', url: url })
  .success(function (data, status, headers, config) {
    var processes = [];
    for (var i in data) {
      var params = data[i].Params;
      var paramsMassaged = params.replaceAll("[", "").replaceAll("]", "").replaceAll(", ", ",").replaceAll("'", "");
      var paramsArray = paramsMassaged.split(",");
      var process;
      if (data[i].TableName === "Table1") {
        window.logger.logIt("setting the process for a Table1 sync item");
        process = $Table1_DBContext.ExecuteSyncItem(data[i].Query, paramsArray);
      } else if (data[i].TableName === "Table2") {
        window.logger.logIt("setting the process for a Table2 sync item");
        process = $Table2_DBContext.ExecuteSyncItem(data[i].Query, paramsArray);
      } else {
        window.logger.logIt("This table is not included in the sync process. You have an outdated version of the application. Table: " + data[i].TableName);
      }
      window.logger.logIt("got to here...");
      processes.push(process);
    }
    window.logger.logIt("Finished syncing all " + data.length + " records in the list...");
    $q.all(processes)
      .then(function (result) {
        // Update the LastSyncDate here
        $DBConfigurations_DBContext.UpdateLastSyncDate(data[i].CreatedDate);
        alert("finished syncing all records");
      }, function (result) {
        alert("an error occurred.");
      });
  })
  .error(function (data, status, headers, config) {
    alert("An error occurred retrieving the items that need to be synced.");
  });
Table2 ExecuteSyncItem function:
ExecuteSyncItem: function (script, params) {
  //window.logger.logIt("In the Table2 ExecuteSyncItem function...");
  //$DBService.ExecuteQuery(script, params, null);
  var deferred = $q.defer();
  var data = $DBService.ExecuteQuery(script, params, null);
  if (data) {
    deferred.resolve(data);
  } else {
    deferred.reject(data);
  }
  return deferred.promise;
}
DB Service code:
ExecuteQuery: function (query, params, success) {
  $rootScope.db.transaction(function (tx) {
    tx.executeSql(query, params, success, onError);
  });
},
Update: In response to Maxim's question "did you log the process method?", here's what I'm doing...
ExecuteSyncItem: function (script, params) {
  window.logger.logIt("In the Experiment ExecuteSyncItem function...");
  //$DBService.ExecuteQuery(script, params, null);
  var deferred = $q.defer();
  var data = $DBService.ExecuteQuery(script, params, function () { window.logger.logIt("successCallback"); });
  if (data) {
    window.logger.logIt("success");
    deferred.resolve(data);
  } else {
    window.logger.logIt("fail");
    deferred.reject(data);
  }
  return deferred.promise;
}
"data" is undefined everytime. "fail" is logged everytime, as well as "successCallback". Also, the executeQuery IS working, and updating the data the way I expect.
So now, it's just a matter of the promise syntax I guess. If the ExecuteQuery isn't actually populating the "data" variable since it's asynchronous, how do I set the deferred.resolve() and deferred.reject stuff?
You are on the right track.
I would use $q.all:
$q.all([async1(), async2(), ...])
Combines multiple promises into a single promise that is resolved when all of the input promises are resolved.
Returns a single promise that will be resolved with an array/hash of values, each value corresponding to the promise at the same index/key in the promises array/hash. If any of the promises is resolved with a rejection, this resulting promise will be rejected with the same rejection value.
For example:
var processes = [];
processes.push(Process1);
processes.push(Process2);
/* ... */

$q.all(processes)
  .then(function (result) {
    /* here all above mentioned async calls finished */
    $scope.response_1 = result[0];
    $scope.response_2 = result[1];
  }, function (result) {
    alert("Error: No data returned");
  });
From your example, you run a loop and call the async methods (Process1, Process2) 10 times (8 and 2 times respectively). In order to use $q.all, Process1 and Process2 must return a promise.
So I would write it something like this:
var Process1 = function (stuff) {
  var deferred = $q.defer();
  var data = $DBService.ExecuteQuery(stuff.query); // This is asynchronous
  if (data) {
    deferred.resolve(data);
  } else {
    deferred.reject(data);
  }
  return deferred.promise;
}
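Since ExecuteQuery runs the query asynchronously, its return value is undefined by the time ExecuteSyncItem checks it; the deferred has to be resolved or rejected from inside the WebSQL callbacks instead. A minimal sketch of that idea (not the original code: it assumes the DB service is extended to accept an error callback in addition to the success callback):

// In the DB service: pass an error callback through instead of the hard-wired onError
ExecuteQuery: function (query, params, success, error) {
  $rootScope.db.transaction(function (tx) {
    tx.executeSql(query, params, success, error);
  });
},

// In the DB context: settle the deferred from the executeSql callbacks
ExecuteSyncItem: function (script, params) {
  var deferred = $q.defer();
  $DBService.ExecuteQuery(script, params,
    function (tx, resultSet) {
      // success callback from tx.executeSql: the statement completed
      deferred.resolve(resultSet);
    },
    function (tx, err) {
      // error callback from tx.executeSql: reject so $q.all fails fast
      deferred.reject(err);
      return true; // a truthy return rolls the transaction back (WebSQL convention)
    });
  return deferred.promise;
}

With something along those lines, $q.all(processes) rejects as soon as any record fails, which matches the abort-on-failure requirement in the question.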
I need to retrieve data from multiple API calls and aggregate them with the click of a button in the UI. I need to print the data once they all have been executed fully. I am returning a Promise from the function that runs a for-loop to make all API calls in succession. I am also processing the API call results as I am receiving them. Hence I resolve the promise just outside that for-loop. (Reminder, the for-loop makes some API calls inside.) Now, when I call this function, the promise gets resolved immediately and the success function runs which basically gives me empty aggregate data which is not expected. Where/how should I resolve the promise in such case?
Basic structure of my code:
forLoopFunction(xyz)
  .then(function () { /* print aggregate data when successful */ })
  .catch(function () { /* print couldn't retrieve data */ });

function forLoopFunction() {
  return new Promise(function (resolve, reject) {
    for (var i = 0; i < ...) {
      api1_Call().$promise
        .then(
          api2_call().$promise
            .then(
              // work on aggregate data
            )
            .catch(reject(error))
        )
        .catch(reject(error));
    }
    // end of for loop
    resolve(aggregated_data);
  });
}
Edited code structure:
//$scope.requests is populated before this function call, have seen it printed
$scope.myFunc()
  .then(function (result) { console.log(result); })
  .catch(function (error) { console.log("failed"); });

$scope.myFunc = function () {
  var promiseArray = [];
  for (var i = 0; i < $scope.requests.data.length; i++) {
    promiseArray.push(new Promise(function (resolve, reject) {
      var requests = $scope.requests.data[i];
      WorkflowsService.get({ id: requests.key }).$promise
        .then(
          function (data) {
            StepsService.query({ workflowId: workflowData.id }).$promise
              .then(
                function (steps) {
                  // some variables calculation
                  var metadata = []; // this belongs to a single work request
                  // some more values pushed to metadata array
                  // switch case to calculate appropriate endpoint and system id
                  //$.ajaxSetup({async: false});
                  $.ajax({
                    url: apiEndpoint + systemId,
                    type: 'GET',
                    success: function (resp) {
                      compartmentId = resp.compartmentId;
                      $.get("/api/compartments/" + compartmentId, function (resp) {
                        // some values pushed to metadata
                      });
                    },
                    error: function (resp) {
                      // put dummy data to metadata array
                    }
                  });
                  // construct a URL to be pushed into metadata
                  $scope.metadataString += metadata.join("\n");
                  Promise.resolve("resolved promise");
                })
              .catch(function (error) { Promise.reject("rejected"); console.log(error); });
          })
        .catch(function (error) { Promise.reject("rejected"); console.log(error); });
    }));
    promiseArray.push(promiseObject);
  }
  return Promise.all(promiseArray).then(function () { return $scope.metadataString; });
}
You should keep a promise for each API call and then resolve them all in one go, which I am doing here with Promise.all; this gives you the data in the then callback as an array, one entry per API call.
forLoopFunction(xyz)
  .then(function (data) { console.log(data); /* print aggregate data when successful */ })
  .catch(function () { /* print couldn't retrieve data */ });

function forLoopFunction() {
  var promiseArray = [];
  for (var i = 0; i < ...) {
    var promiseObject = new Promise(function (resolve, reject) {
      api1_Call().$promise
        .then(function () {
          api2_call().$promise
            .then(function (aggregated_data) {
              resolve(aggregated_data);
            })
            .catch(function (error) { reject(error); });
        })
        .catch(function (error) { reject(error); });
    });
    promiseArray.push(promiseObject);
  }
  return Promise.all(promiseArray);
}
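As a side note, the extra new Promise wrapper is not strictly required: the $promise objects are already promises, so each chain can be returned directly and collected with Promise.all. A rough sketch along those lines, reusing the services from the question (buildMetadata is a hypothetical helper standing in for the per-request work, and the workflow id is assumed to come from the first call's result):

$scope.myFunc = function () {
  // map() gives each iteration its own request variable, avoiding the shared-i closure issue
  var promiseArray = $scope.requests.data.map(function (request) {
    return WorkflowsService.get({ id: request.key }).$promise
      .then(function (workflow) {
        return StepsService.query({ workflowId: workflow.id }).$promise;
      })
      .then(function (steps) {
        // build and return this request's metadata instead of appending to shared state
        return buildMetadata(steps); // hypothetical helper
      });
  });

  // Resolves with one metadata entry per request, in order; rejects if any request fails
  return Promise.all(promiseArray).then(function (allMetadata) {
    return allMetadata.join("\n");
  });
};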
I am trying to wrap my post/get/put/delete calls so that any time they are called, if they fail they will check for expired token, and try again if that is the reason for failure, otherwise just resolve the response/error. Trying to avoid duplicating code four times, but I'm unsure how to resolve from a non-anonymous callback.
factory.post = function (url, data, config) {
  var deferred = $q.defer();
  $http.post(url, data, config).then(factory.success, factory.fail);
  return deferred.promise;
}

factory.success = function (rsp) {
  if (rsp) {
    // how to resolve parent's promise from here
  }
}
The alternative is to duplicate this 4 times:
.then(function (rsp) {
  factory.success(rsp, deferred);
}, function (err) {
  factory.fail(err, deferred);
});
One solution might be to use the bind function.
function sum(a) {
  return a + this.b;
}

function callFn(cb) {
  return cb(1);
}

function wrapper(b) {
  var extra = { b: b };
  return callFn(sum.bind(extra));
}

console.log(wrapper(5));
console.log(wrapper(-5));
console.log(wrapper(50));
For your solution, check the example below:
factory.post = function (url, data, config) {
  var deferred = $q.defer();
  $http.post(url, data, config).then(factory.success.bind({ deferred: deferred }), factory.fail.bind({ deferred: deferred }));
  return deferred.promise;
}

factory.success = function (rsp) {
  if (rsp) {
    this.deferred.resolve(rsp);
    // how to resolve parent's promise from here
  } else {
    // retry or reject here
  }
}
From what I understand, you just want to resolve the deferred object on success and retry on error in case of an expired token. You probably also want to keep a count of the number of retries.
Edit: It seems I misunderstood the question. The answer suggested by Atiq should work, or, if you are using a functional JS library such as Underscore or Ramda, you could use a curry function. With curry, you can pass some of the parameters up front, and the function only executes once all of its parameters have been supplied. I have modified the code snippet to use a curry function.
factory.post = function (url, data, config) {
  var deferred = $q.defer();
  $http.post(url, data, config).then(
    _.curry(factory.success)(deferred),
    _.curry(factory.fail)(deferred));
  return deferred.promise;
}

factory.success = function (deferred, rsp) {
  if (rsp) {
    // handle resp
    deferred.resolve(rsp);
  }
}

factory.fail = function (deferred, err) {
  // handle retry
  deferred.reject(err);
}
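Another option, shown here only as a sketch: skip the extra deferred entirely and chain on the promise that $http.post already returns. The isTokenExpired and refreshToken helpers below are hypothetical placeholders for the token check described in the question:

// Sketch: retry once on an expired token, otherwise propagate the error.
// isTokenExpired and refreshToken are hypothetical helpers, not part of the original factory.
factory.post = function (url, data, config) {
  return $http.post(url, data, config)
    .catch(function (err) {
      if (isTokenExpired(err)) {
        return refreshToken().then(function () {
          return $http.post(url, data, config); // retry the same request once
        });
      }
      return $q.reject(err); // not a token problem, fail as usual
    });
};

The same shape works for get/put/delete, so the retry logic only lives in one place.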
I've written a program that makes an HTTP GET request for three distinct URLs. The program is supposed to output the message body in the order the URLs are provided; however, it's not doing so, even though I'm making the callbacks in exactly that order.
The final program is supposed to require the user to input the URLs via the command line; however, I've simply made variable assignments for ease of testing.
I realize this code could be more object-oriented; however, I'm new to JavaScript and it's not my focus to learn that at the moment.
var http = require('http')

// var url_1 = process.argv[2]
// var url_2 = process.argv[3]
// var url_3 = process.argv[4]

var url_1 = 'http://youvegotmail.warnerbros.com/cmp/0frameset.html'
var url_2 = 'http://www.3riversstadium.com/index2.html'
var url_3 = 'http://toastytech.com/evil/'

var output_1 = ''
var output_2 = ''
var output_3 = ''

function getHttp_1 (callback) {
  http.get(url_1, function getResponse (response1) {
    response1.setEncoding('utf8')
    response1.on('data', function (data) {
      output_1 = output_1 + data
    })
    response1.on('end', function processData () {
      console.log("Printing Result 1:")
      callback(output_1)
    })
  })
}

function getHttp_2 (callback) {
  http.get(url_2, function getResponse (response2) {
    response2.setEncoding('utf8')
    response2.on('data', function (data) {
      output_2 = output_2 + data
    })
    response2.on('end', function processData () {
      console.log("Printing Result 2:")
      callback(output_2)
    })
  })
}

function getHttp_3 (callback) {
  http.get(url_3, function getResponse (response3) {
    response3.setEncoding('utf8')
    response3.on('data', function (data) {
      output_3 = output_3 + data
    })
    response3.on('end', function processData () {
      console.log("Printing Result 3:")
      callback(output_3)
    })
  })
}

function printResults (output) {
  console.log("Result")
  // console.log(output)
}

getHttp_1(printResults)
getHttp_2(printResults)
getHttp_3(printResults)
EDIT:
Results I'm generally getting:
Printing Result 3:
Result
Printing Result 2:
Result
Printing Result 1:
Result
Results I'm expecting:
Printing Result 1:
Result
Printing Result 2:
Result
Printing Result 3:
Result
In contrast to the sequential callback approach proposed by some answers, using Promises will make this both more efficient (the requests will be made in parallel) and simpler:
var http = require('http'),
    urls = [
      'http://youvegotmail.warnerbros.com/cmp/0frameset.html',
      'http://www.3riversstadium.com/index2.html',
      'http://toastytech.com/evil/'
    ];

Promise.all(urls.map(getUrl))
  .then(function (results) {
    results.forEach(function (output, i) {
      console.log("Result #" + (i + 1) +
        " with length: " + output.length);
    });
  });

function getUrl(url, i) {
  return new Promise(function (resolve, reject) {
    http.get(url, function getResponse(resp) {
      var output = '';
      resp.setEncoding('utf8');
      resp.on('data', function (data) {
        output += data;
      });
      resp.on('end', function processData() {
        console.log("Resolving Result " + (i + 1) + ":");
        resolve(output);
      });
    });
  });
}
Welcome to the asynchronous life of Node.js! As you fire off those HTTP requests, each one does not wait for the request before it to finish before it fires. You are seeing this odd behavior because you are practically sending all 3 requests at once and simply printing as you see the responses.
Edit: If you do want to see them in the correct order, fire off the second HTTP request inside the callback of the first, and then the third inside the callback of the second. That guarantees you won't get the data until each request before it has finished.
function getHttp_1 (callback) {
  http.get(url_1, function getResponse (response1) {
    response1.setEncoding('utf8')
    response1.on('data', function (data) {
      output_1 = output_1 + data
    })
    response1.on('end', function processData () {
      console.log("Printing Result 1:")
      callback(output_1)
      getHttp_2(callback)
    })
  })
}
The async module can really help for controlling how async tasks are executed. For example, if you want your requests to happen one after the other:
async.series([
  function (next) { makeRequest(url_1, next); },
  function (next) { makeRequest(url_2, next); },
  function (next) { makeRequest(url_3, next); },
], function (err, result) {
  // All done
});

// Or you can get fancy
//async.series([
//  makeRequest.bind(null, url_1),
//  makeRequest.bind(null, url_2),
//  makeRequest.bind(null, url_3),
//]);

function makeRequest(url, callback) {
  http.get(url, function getResponse (res) {
    var output = '';
    res.setEncoding('utf8')
    res.on('data', function (data) {
      output += data
    })
    res.on('end', function processData() {
      // async expects error-first callbacks: null signals success
      callback(null, output)
    })
  })
}
If you don't care what order they occur in but want to output them in order:
async.parallel([
  function (next) { makeRequest(url_1, next); },
  function (next) { makeRequest(url_2, next); },
  function (next) { makeRequest(url_3, next); },
], function (err, results) {
  if (err) {
    return void console.error('Got an error:', err.stack);
  }
  console.log(results); // Will output array of every result in order
});
If the requests are dependent on each other, async.auto is useful to tie the result of one request to the request of another.
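A rough sketch of that wiring (assuming async 2.x, where a dependent task's function receives (results, callback); the task names and the derived second URL are made up for illustration):

// Sketch: 'second' only runs after 'first' and can use its result.
async.auto({
  first: function (callback) {
    makeRequest(url_1, callback);
  },
  second: ['first', function (results, callback) {
    // results.first is the body returned by the first request
    makeRequest(url_2 + '?prevLength=' + results.first.length, callback);
  }]
}, function (err, results) {
  if (err) { return void console.error('Got an error:', err.stack); }
  console.log(results.first.length, results.second.length);
});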
JavaScript/AJAX calls are async, so they don't complete in the order you call them. To call them in sequence, in a specific order, do something like this:
$(function () {
  //setup an array of AJAX options, each object is an index that will specify information for a single AJAX request
  var ajaxes = [{ url : '<url>', dataType : 'json' }, { url : '<url2>', dataType : 'utf8' }],
      current = 0;

  //declare your function to run AJAX requests
  function do_ajax() {
    //check to make sure there are more requests to make
    if (current < ajaxes.length) {
      //make the AJAX request with the given data from the `ajaxes` array of objects
      $.ajax({
        url : ajaxes[current].url,
        dataType : ajaxes[current].dataType,
        success : function (serverResponse) {
          ...
          //increment the `current` counter and recursively call this function again
          current++;
          do_ajax();
        }
      });
    }
  }

  //run the AJAX function for the first time once `document.ready` fires
  do_ajax();
});
Another option could be:
function callA() {
  $.ajax({
    ...
    success: function () {
      //do stuff
      callB();
    }
  });
}

function callB() {
  $.ajax({
    ...
    success: function () {
      //do stuff
      callC();
    }
  });
}

function callC() {
  $.ajax({
    ...
  });
}

callA();
Ref: Multiple Calls in Order
I am using the when library with Node.js. I create a deferred object, place the resolve inside an encapsulated Mongoose findOne() function, and return the promise outside. But it seems my promise is always returned before the data is retrieved.
User.prototype.getProfile = function (criteria) {
  var deferred = when.defer();
  var options = {
    criteria: criteria,
    select: 'name id email'
  };
  this.User.load(options, function (err, data) {
    if (data) {
      this.name = data.name;
      this.email = data.email;
      this.id = data.id;
    } else {
      return false;
    }
    console.log(data);
    deferred.resolve();
  });
  console.log('returning promise');
  return deferred.promise;
};
Caller
User.getProfile(req.query).then(
function success(data) {
res.send('Hello ' + User.name);// Hello ''
}
);
Outputs 'returning promise' before the data
Yes, the promise will be returned to the caller instead of the data, and that is how we take advantage of asynchronous functions. This is the common sequence of actions when handling async calls:
Make an async call.
Return a Promise to the caller.
At this point, the caller doesn't have to wait for the result. It can simply define a then function, which knows what to do when the data is ready, and move on to the next task.
At a later point in time, resolve (or reject, if it failed) the promise when you get the result from the async call.
The then function is then executed on the Promise object, with the result from the async call.
So, your code will have to be modified a little bit, like this
User.prototype.getProfile = function (criteria) {
  var deferred = when.defer();
  var options = {
    criteria: criteria,
    select: 'name id email'
  };
  this.User.load(options, function (err, data) {
    if (err) {
      // Reject, if there is an error
      deferred.reject(err);
    } else {
      // Resolve it with actual data
      deferred.resolve(data);
    }
  });
  return deferred.promise;
};
Then your caller will do something like this
userObject.getProfile()
  .then(function (profileObject) {
    console.log(profileObject);
    // Do something with the retrieved `profileObject`
  })
  .catch(function (err) {
    console.error("Failed to get Profile", err);
  });

// Do something else here, as you don't have to wait for the data
Here, the caller just calls getProfile, attaches a function which says what to do with the returned data, and moves on.
Edit: If you want the same object to be updated, you can use similar code, but you need to preserve this in another variable, because the binding of this happens at runtime.
User.prototype.getProfile = function (criteria) {
  var deferred = when.defer();
  var options = {
    criteria: criteria,
    select: 'name id email'
  };
  var self = this;
  this.User.load(options, function (err, data) {
    if (err) {
      // Reject, if there is an error
      deferred.reject(err);
    } else {
      self.name = data.name;
      self.email = data.email;
      self.id = data.id;
    }
    deferred.resolve(data);
  });
  return deferred.promise;
};
That's how promises work.
Since you have an async task that takes some time, and JavaScript is a single-threaded language, you don't want to block your code and wait for that async operation to complete; otherwise nobody would use JavaScript!
So what do you do? You create a promise and continue with your code.
You add callbacks to that promise, and when the promise is resolved your callbacks are invoked.
I haven't used the when library, but what you want to do is something like this:
User.prototype.getProfile = function (criteria) {
  var deferred = when.defer();
  var options = {
    criteria: criteria,
    select: 'name id email'
  };
  this.User.load(options, function (err, data) {
    if (data) {
      this.name = data.name;
      this.email = data.email;
      this.id = data.id;
      console.log(data);
      // the callback will be invoked after the deferred object is resolved.
      deferred.promise.then(function (o) { console.log('resolved!!!'); });
      deferred.resolve(data);
    } else {
      deferred.reject('something bad occured');
      return false;
    }
  });
  return deferred.promise;
};
I'm trying to use the AngularJS promise/then with a recursive function, but the then function is never called (none of the error, success, or notify callbacks gets called).
Here is my code:
recursive function
loadSection2 = function () {
  var apiURL = "http://..."
  var deferred = $q.defer();
  $http({
    method: "GET",
    url: apiURL
  }).success(function (result, status, headers, config) {
    console.log(result);
    loadCount++;
    if (loadCount < 10) {
      newSectionArray.push(result);
      loadSection2();
    } else {
      loadCount = 0;
      deferred.resolve();
      return deferred.promise;
    }
  }).error(function () {
    return deferred.reject();
  });
  deferred.notify();
  return deferred.promise;
};
then
loadSection2().then(function () {
  console.log("NEW SECTIONS LOADED, start adding to document");
  addContent();
}, function () {
  console.log("ERROR CALLBACK");
}, function () {
  console.log("NOTIFY CALLBACK");
}).then(function () {
  loadScrollActive = false;
});
I think the then should at least get the first notify callback, but no callback fires at all.
Does then not work with a recursive function?
EDIT - 11/11/2015 There is a much cleaner way if you don't care about notify:
loadSection2 = function () {
  var apiURL = "http://..."
  return $http.get(apiURL)
    .then(function (response) {
      loadCount++;
      if (loadCount < 10) {
        newSectionArray.push(response.data);
        return loadSection2();
      }
      loadCount = 0;
    });
};
Old answer available here:
You could continuously pass the promise all the way through.
loadSection2 = function (deferred) {
  if (!deferred) {
    deferred = $q.defer();
  }
  var apiURL = "http://..."
  $http({
    method: "GET",
    url: apiURL
  }).success(function (result, status, headers, config) {
    console.log(result);
    loadCount++;
    if (loadCount < 10) {
      newSectionArray.push(result);
      loadSection2(deferred);
    } else {
      loadCount = 0;
      deferred.resolve();
      return deferred.promise;
    }
  }).error(function () {
    return deferred.reject();
  });
  deferred.notify();
  return deferred.promise;
};
I wanted to make a solution that doesn't pass the "deferred" variable around, and even though I wouldn't say it is a better approach, it works and I learned a lot from it (jsfiddle).
19/Aug/14 - Updated the code to a much shorter version by removing the creation of another promise in f1(). I hope it is clear how it relates to the original question. If it isn't, let me know in a comment.
f1().then(function () {
  console.log("done");
});

function f1(counter) {
  if (!counter) {
    counter = 0;
  }
  counter++;
  console.log(counter);
  return asyncFunc().then(function () {
    if (counter < 10) {
      return f1(counter);
    } else {
      return;
    }
  });
}

function asyncFunc() {
  var deferred = $q.defer();
  $timeout(function () {
    deferred.resolve();
  }, 100);
  return deferred.promise;
}
Fauphi,
Recursion is totally viable but not a particularly "promisy" approach.
Given that you have deferreds/promises available, you can dynamically build a .then() chain, which delivers a promise of a populated array.
function loadSection2(arr) {
  return $http({
    method: "GET",
    url: "http://..."
  }).then(function (result) {
    console.log(result);
    arr.push(result.data);
    return arr; // make the array available to the next call to loadSection2().
  }, function () {
    console.log("GET error");
    return $q.when(arr); // allow the chain to continue, despite the error.
    //or, since $q's .then() wraps plain return values, the much simpler form ...
    //return arr; // allow the chain to continue, despite the error.
  });
};

var newSectionPromise = $q.when([]); // note that the chain starts with a promise resolved with an anonymous new array.

//Now we build a .then() chain, ten long, ...
for (var i = 0; i < 10; i++) {
  newSectionPromise = newSectionPromise.then(loadSection2);
}

// ... and do something with the populated array when the GETs have done their thing.
newSectionPromise.then(function (arr) {
  console.log(arr.length + " new sections loaded, start adding to document");
  addContent(arr);
}, function () {
  console.log("ERROR CALLBACK");
}).then(function () {
  loadScrollActive = false;
});
untested
What was newSectionArray is now created anonymously and passed down the .then() chain regardless of success/failure of the individual GETs, emerging as arr in the final .then's success handler, where it is passed to addContent(). This avoids the need for a newSectionArray member in the outer scope.
Rearranging slightly, loadSection2 could be made anonymous, further reducing the number of members added to the outer scope.
The need for an explicit notification disappears as :
there is no longer a master deferred to be notified
console.log(result); in the GET success handler provides all the notification necessary.