Loop calling an asynchronous function - javascript

Introduction to the problem
I need to call an asynchronous function within a loop until a condition is satisfied. This particular function sends a POST request to a website form.php and performs some operations with the response, which is a JSON string representing an object with an id field. So, when that id is null, the outer loop must conclude. The function does something like the following:
function asyncFunction(session) {
    (new Request({
        url: "form.php",
        content: "sess=" + session,
        onComplete: function (response) {
            var response = response.json;
            if (response.id) {
                doStaff(response.msg);
            } else {
                // Break loop
            }
        }
    })).get();
}
Note: Although I ran into this problem while implementing a Firefox add-on, I think this is a general JavaScript question.
Implementing the loop recursively
I've tried implementing the loop recursively, but it didn't work and I'm not sure this is the right approach.
...
if (response.id) {
    doStaff(response.msg);
    asyncFunction(session);
} else {
    // Break loop
}
...
Using jsdeferred
I've also tried with the jsdeferred library:
Deferred.define(this);

// Instantiate a new deferred object
var deferred = new Deferred();

// Main loop: stops when we receive the exception
Deferred.loop(1000, function() {
    asyncFunction(session, deferred);
    return deferred;
}).error(function() {
    console.log("Loop finished!");
});
And then calling:
...
if (response.id) {
    doStaff(response.msg);
    d.call();
} else {
    d.fail();
}
...
This achieves serialization, but it starts repeating previous calls on every iteration. For example, by the third call to asyncFunction it also repeats the calls with the parameters from iterations 1 and 2.

Your question is not exactly clear, but the basic architecture must be that the completion event handler for the asynchronous operation decides whether to try again or to simply return. If the results of the operation warrant another attempt, the handler should call the parent function again. If not, then simply exiting will bring the cycle to an end.
You can't code something like this in JavaScript with anything that looks like a simple "loop" structure, for the very reason that the operations are asynchronous. The results of the operation don't happen in such a way as to allow the looping mechanism to perform a test on the results; the loop may run thousands of iterations before the result is available. To put it another way, you don't "wait" for an asynchronous operation with code. You wait by doing nothing, and allowing the registered event handler to take over when the results are ready.
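Roughly, a sketch of that architecture using the Request/onComplete shape from the question (form.php, session and doStaff are the question's own placeholders) might look like this:
function pollUntilDone(session) {
    (new Request({
        url: "form.php",
        content: "sess=" + session,
        onComplete: function (response) {
            var data = response.json;
            if (data.id) {
                doStaff(data.msg);
                pollUntilDone(session); // the handler decides to try again
            }
            // else: do nothing, and the "loop" simply ends
        }
    })).get();
}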

Thank you guys for your help. This is what I ended up doing:
var sess = ...;

Deferred.define(this);

function asyncFunction(session) {
    Deferred.next(function() {
        var d = new Deferred();
        (new Request({
            url: "form.php",
            content: "sess=" + session,
            onComplete: function (response) {
                d.call(response.json);
            }
        })).get();
        return d;
    }).next(function(resp) {
        if (resp.id) {
            asyncFunction(session);
            console.log(resp.msg);
        }
    });
}

asyncFunction(sess);

Why wouldn't you just use a setInterval loop? In the case of an SDK-based extension, this would look like:
https://builder.addons.mozilla.org/addon/1065247/latest/
The big benefit of promise-like patterns over using timers is that you can do things in parallel and express much more complicated dependencies between tasks. A simple loop like this is done just as easily and neatly using setInterval.
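For illustration, a rough setInterval version, reusing the question's Request/form.php/doStaff placeholders (the 1000 ms interval is an arbitrary choice, and in SDK code setInterval/clearInterval come from the timers module):
var timer = setInterval(function () {
    (new Request({
        url: "form.php",
        content: "sess=" + session,
        onComplete: function (response) {
            var data = response.json;
            if (data.id) {
                doStaff(data.msg);
            } else {
                clearInterval(timer); // id is null: stop polling
            }
        }
    })).get();
}, 1000);
Note that this fires a new request on every tick rather than waiting for the previous response, so it's a slightly looser pattern than the handler-driven recursion above.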

If I correctly understand what you want to do, Deferred is a good approach. Here's an example using jQuery which has Deferred functionality built in (jQuery.Deferred)
A timeout is used to simulate an HTTP request. When each timeout completes (as an HTTP request would), a random number is returned, standing in for the result of your HTTP request.
Based on the result of the request you can decide if you need another http request or want to stop.
Try out the snippet below: include the jQuery file and then the snippet. It keeps printing values to the console and stops once a zero is reached.
This could take a while to understand, but it's useful.
$(function() {
    var MAXNUM = 9;

    function newAsyncRequest() {
        var def = $.Deferred(function(defObject) {
            setTimeout(function() {
                defObject.resolve(Math.floor(Math.random() * (MAXNUM + 1)));
            }, 1000);
        });

        def.done(function(val) {
            if (val !== 0)
                newAsyncRequest();
            console.log(val);
        });
    }

    newAsyncRequest();
});
Update after suggestion from #canuckistani
#canuckistani is correct in his answer: for this problem the solution is simpler. Without using Deferred, the above code snippet becomes the following. Sorry I led you toward a tougher solution.
$(function() {
    var MAXNUM = 9;

    function newAsyncRequest() {
        setTimeout(function() {
            var val = Math.floor(Math.random() * (MAXNUM + 1));
            if (val !== 0)
                newAsyncRequest();
            console.log(val);
        }, 1000);
    }

    newAsyncRequest();
});

Related

How to structure these nested asynchronous requests to complete a batch before proceeding?

I need to do a main AJAX form submit. However, I want to perform a series of other preliminary form submits and AJAX requests partway through, before continuing the main form submit.
Below is the idea, but with a lot of pseudocode. I want to call the ajaxFunction as shown, complete all its tasks, then proceed with the main form submission:
$('#mainform').submit(function(e){
    e.preventDefault();
    var main = this;
    var data = $('#section :input', main).serialize();

    //preliminary nested ajax requests
    var mainresult = ajaxFunction('arg1', 'arg2');

    alert("All preliminary AJAX done, proceeding...");

    if(mainresult){
        //final ajax
        $.post('mainurl', data, function(result){
            console.log(result);
        });
    }else{
        //do nothing
    }
});
function ajaxFunction(param1, param2){
    //ajax1
    ajaxFetchingFunction1('url1', function(){
        //ajax2
        ajaxFetchingFunction2('url2', function(){
            //submit handler
            $('#anotherform').submit(function(){
                if(someparam === 1){
                    return true;
                }else{
                    return false;
                }
            });
        });
    });
}
As it is now, I know it won't work as expected because of all the asynchronous nested AJAX calls. What I get is that alert("All preliminary AJAX done, proceeding..."); executes even before any of the AJAX calls in ajaxFunction have completed.
I believe that this is just the kind of scenario ("callback hell") for which the Deferred/Promise concept was introduced, but I've been struggling to wrap my head around this. How can I structure these different AJAX requests, such that code execution would wait until ajaxFunction completes and returns mainresult for subsequent use?
How can I structure these different AJAX requests, such that code execution would wait until ajaxFunction completes and returns mainresult for subsequent use?
You can't and you don't. JavaScript will not "wait" for an asynchronous operation to complete. Instead, you move the code that wants to run after the async operation is done into a callback that is then called when the async operation is done. This is true whether using plain async callbacks or structured callbacks that are part of promises.
Asynchronous programming in JavaScript requires rethinking and restructuring the flow of control so that things you want to run after an async operation is done are put into a callback function rather than just sequentially on the next line of code. Async operations are chained in sequence through a series of callbacks. Promises are a means of simplifying the management of those callbacks, and particularly of simplifying error propagation and the synchronization of multiple async operations.
If you stick with callbacks, then you can communicate completion of ajaxFunction() with a completion callback:
function ajaxFunction(param1, param2, doneCallback){
    //ajax1
    ajaxFetchingFunction1('url1', function(){
        //ajax2
        ajaxFetchingFunction2('url2', function(){
            doneCallback(someResult);
        });
    });
}
And, then use it here:
$('#mainform').submit(function(e){
    e.preventDefault();
    var main = this;
    var data = $('#section :input', main).serialize();

    //preliminary nested ajax requests
    ajaxFunction('arg1', 'arg2', function(result) {
        // process result here
        alert("All preliminary AJAX done, proceeding...");
        if(result){
            //final ajax
            $.post('mainurl', data, function(result){
                console.log(result);
            });
        }else{
            //do nothing
        }
    });
});
Note: I removed your $('#anotherform').submit() from the code because inserting an event handler in a function that will be called repeatedly is probably the wrong design here (since it ends up creating multiple identical event handlers). You can insert it back if you're sure it's the right thing to do, but it looked wrong to me.
This would generally be a great place to use promises, but your code is a bit abstract to show you exactly how to use promises. We would need to see the real code for ajaxFetchingFunction1() and ajaxFetchingFunction2() to illustrate how to make this work with promises since those async functions would need to create and return promises. If you're using jQuery ajax inside of them, then that will be easy because jQuery already creates a promise for an ajax call.
If both ajaxFetchingFunction1() and ajaxFetchingFunction2() are modified to return a promise, then you can do something like this:
function ajaxFunction(param1, param2){
    return ajaxFetchingFunction1('url1').then(function() {
        return ajaxFetchingFunction2('url2');
    });
}
And, then use it here:
$('#mainform').submit(function(e){
    e.preventDefault();
    var main = this;
    var data = $('#section :input', main).serialize();

    //preliminary nested ajax requests
    ajaxFunction('arg1', 'arg2').then(function(result) {
        // process result here
        alert("All preliminary AJAX done, proceeding...");
        if(result){
            //final ajax
            $.post('mainurl', data, function(result){
                console.log(result);
            });
        }else{
            //do nothing
        }
    });
});
Promises make handling multiple AJAX requests trivial; however, the implications of "partial forms" for GUI design are perhaps more of a challenge. You have to consider things like:
One form divided into sections, or one form per partial?
Show all partials at the outset, or reveal them progressively?
Lock previously validated partials to prevent meddling after validation?
Revalidate all partials at each stage, or just the current partial?
One overall submit button or one per partial?
How should the submit button(s) be labelled (to help the user understand the process he is involved in)?
Let's assume (as is the case for me but maybe not the OP) that we don't know the answers to all those questions yet, but that they can be embodied in two functions - validateAsync() and setState(), both of which accept a stage parameter.
That allows us to write a generalised master routine that will cater for as yet unknown validation calls and a variety of GUI design decisions.
The only real assumption needed at this stage is the selector for the form/partials. Let's assume they all have class="partialForm":
$('.partialForm').on('submit', function(e) {
    e.preventDefault();
    $.when(setState(1)) // set the initial state, before any validation has occurred.
        .then(validateAsync.bind(null, 1)).then(setState.bind(null, 2))
        .then(validateAsync.bind(null, 2)).then(setState.bind(null, 3))
        .then(validateAsync.bind(null, 3)).then(setState.bind(null, 4))
        .then(function aggregateAndSubmit() {
            var allData = ....; // here aggregate all three forms' data into one serialization.
            $.post('mainurl', allData, function(result) {
                console.log(result);
            });
        }, function(error) {
            console.log('validation failed at stage: ' + error.message);
            // on screen message for user ...
            return $.when(); // inhibit .fail() handler below.
        })
        .fail(function(error) {
            console.log(error);
            // on screen message for user ...
        });
});
It's syntactically convenient here to call setState() as a .then() callback even though it's (probably) synchronous.
Sample validateAsync() :
function validateAsync(stage) {
    var data, jqXHR;
    switch(stage) {
        case 1:
            data = $("#form1").serialize();
            jqXHR = $.ajax(...);
            break;
        case 2:
            data = $("#form2").serialize();
            jqXHR = $.ajax(...);
            break;
        case 3:
            data = $("#form3").serialize();
            jqXHR = $.ajax(...);
    }
    return jqXHR.then(null, function() {
        return new Error(stage);
    });
}
Sample setState() :
function setState(stage) {
    switch(stage) {
        case 1: // initial state, ready for input into form1
            $("#form1").disableForm(false);
            $("#form2").disableForm(true);
            $("#form3").disableForm(true);
            break;
        case 2: // form1 validated, ready for input into form2
            $("#form1").disableForm(true);
            $("#form2").disableForm(false);
            $("#form3").disableForm(true);
            break;
        case 3: // form1 and form2 validated, ready for input into form3
            $("#form1").disableForm(true);
            $("#form2").disableForm(true);
            $("#form3").disableForm(false);
            break;
        case 4: // form1, form2 and form3 validated, ready for final submission
            $("#form1").disableForm(true);
            $("#form2").disableForm(true);
            $("#form3").disableForm(true);
    }
    return stage;
}
As written, setState() will need the jQuery plugin .disableForm():
jQuery.fn.disableForm = function(bool) {
    return this.each(function(i, form) {
        if(!$(form).is("form")) return true; // continue
        $(form.elements).each(function(i, el) {
            el.readOnly = bool;
        });
    });
};
As I say, validateAsync() and setState() above are just rudimentary samples. As a minimum, you will need to:
flesh out validateAsync()
modify setState() to reflect the User Experience of your choice.

Returning the data from a deferred?

I have a class that uses Google's places service. A user can enter an address and Google will return information about it.
Later on I wish to find out the lat and lng coordinates of this place, so I have this method which utilizes Google's places service to get the coords.
I return a deferred as this may take some time.
p.getLatLong = function() {
    var dfd = $.Deferred();
    this.placesService.getDetails({
        reference: this.pacReference
    }, function(details, status) {
        if (details) {
            dfd.resolve({'lat': details.geometry.location.lat(), 'lng': details.geometry.location.lng()});
        } else {
            dfd.reject();
        }
    });
    return dfd;
};
I want to be able to call the above method and just get the coords back, or null (if the dfd is rejected), but the method returns a deferred.
How can I just return the result of the dfd rather than the dfd itself?
I do not wish to have to call:
this.geo.getLatLng().done(function(data){ console.log(data); });
But something like this:
console.log(this.geo.getLatLng());
I get your point, but promises exist for a reason: the asynchronous nature of asking for data.
There is a way; I used to think it was good before I understood the goal of promises. You could return a reference to the 'to be populated' data, but then, when will you be able to use it? Are you planning on polling the state of an object...? I hope not. Seriously, stick to promises; you will avoid a lot of problems for the small cost of a few extra keystrokes.
Deferred objects are meant to allow the thread to continue while long running operations proceed in the background. They serve a specific purpose, and shouldn't work the way you describe by design.
Remember that JavaScript is single-threaded. That means that if you pause the thread waiting for a long operation to complete, the entire page/UI will be frozen as well.
That warning stated, you could potentially accomplish what you want by wrapping all this into your own closure with a loop that checks to see if the process completes.
Please note this is dangerous, will freeze the page, and should be avoided. It is here for academic reasons only.
var getGetLatLng = (function () {
    var running = false;
    return function () {
        var latlng, breakLoop = false;
        // While we haven't instructed the loop to break.
        while (!breakLoop) {
            // If we haven't instructed the API call to execute in this iteration of the loop.
            if (!running) {
                // On the next iteration, tell it we are already running, to prevent multiple requests being fired.
                running = true;
                // Your logic here for getLatLng
                this.geo.getLatLng()
                    // When it completes successfully, set latlng
                    .done(function (data) {
                        latlng = data;
                    })
                    // Always break the loop when the HTTP request completes.
                    .always(function () {
                        breakLoop = true;
                    });
            }
        }
        // Return latlng - it could be undefined if there was an error.
        return latlng;
    };
})();
You could wrap this same structure around your original p.getLatLng function body too. Again, I don't recommend it.
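For comparison, the promise-based usage the answer recommends sticking with is just a sketch like this (using the getLatLong method as defined in the question; the null fallback follows the question's wording):
this.geo.getLatLong()
    .done(function (coords) {
        console.log(coords.lat, coords.lng); // use the coordinates once they arrive
    })
    .fail(function () {
        console.log(null); // the lookup was rejected; fall back to null
    });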

javascript function completion

I have the following lines of code. The issue is that the function GetScore() takes some time to complete (about 2 seconds). What I want to do is set the score header accordingly. The issue right now is that execution goes to line 3 and the score is computed some time later. How could I "wait" for the score to be ready and only then execute line 3?
$("#ScoreHeader").html('Calculating...');
score = GetScore();
$("#ScoreHeader").html('Done');
Any ideas will be much appreciated.
You can try to refactor GetScore to take a callback function.
Define GetScore like this:
function GetScore(cb){
    var score = ... // the score calculation logic
    cb(score);
}
And then you can do
$("#ScoreHeader").html('Calculating...');
var score;
GetScore(function(data){
score = data;
$("#ScoreHeader").html('Done');
});
Since you are using jQuery, there is a very efficient pattern that you can use to simplify the interface of asynchronous functions (usually ajax).
function getScore() {
    return $.ajax({...}); // $.ajax returns a promise/deferred object
}
Then you can do something like
getScore().done(function (score) {
});
I strongly advise you to read about the Deferred object in jQuery.
EDIT: If you are not performing an ajax request but use setTimeout or setInterval instead to process the data in an asynchronous way, you could still use the Deferred object.
Here's an example where we have a function that sums values asynchronously and returns a Deferred object that allows the client code to efficiently handle the control flow.
function sumValuesAsync(values) {
    var deferred = arguments[1] || $.Deferred(),
        i = arguments[2] || 0,
        sum = arguments[3] || 0;

    sum += values[i];

    if (++i === values.length) {
        deferred.resolve(sum); // notify observers that the process is completed
    } else {
        setTimeout(function () {
            sumValuesAsync(values, deferred, i, sum);
        }, 500);
    }

    return deferred;
}

sumValuesAsync([1, 1, 1, 1]).done(function (total) {
    console.log(total);
});
If GetScore() is an asynchronous function that doesn't block execution until it's done (which your description makes it sound like it is), then you can't pause JavaScript execution until it finishes.
You would need to delve into the details of how GetScore() is implemented and create a notification (callback) inside of GetScore() that will get called when it's actually complete. If GetScore() uses AJAX to retrieve the score, then the core AJAX implementation will have a notification when it's actually done and you can use that to trigger your own callback.
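For instance, a minimal sketch of what that might look like if GetScore() uses jQuery AJAX internally (the /score URL and the response shape are assumptions, not part of the question):
function GetScore(done) {
    // Hypothetical endpoint; the real URL and response format depend on your app.
    $.get('/score', function (data) {
        done(data.score); // the AJAX completion triggers our own callback
    });
}

$("#ScoreHeader").html('Calculating...');
GetScore(function (score) {
    // Runs only once the score is actually available.
    $("#ScoreHeader").html('Done');
});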

NodeJS and asynchronous hell

I just came to this awful situation where I have an array of strings, each representing a possibly existing file (e.g. var files = ['file1', 'file2', 'file3']). I need to loop through these file names and see whether each exists in the current directory, and if it does, stop looping and forget the rest of the remaining files. So basically I want to find the first existing file of those, and fall back to a hard-coded message if nothing was found.
This is what I currently have:
var fs = require('fs');

var found = false;
files.forEach(function(file) {
    if (found) return;
    try {
        fs.readFileSync(path + file); // blocking check: throws if the file doesn't exist
        found = true;
        continueWithStuff();
    } catch (err) {
        // File doesn't exist; keep looking.
    }
});
if (found === false) {
    // Handle this scenario.
}
This is bad. It's blocking (readFileSync), thus it's slow.
I can't just supply callbacks to fs.readFile; it's not that simple, because I need to take the first found item... and the callbacks may be called in any random order. I think one way would be to have a callback that increments a counter and keeps a list of found/not-found information, and when the counter reaches files.length, check through the found/not-found info and decide what to do next.
This is painful. I do see the performance benefits of evented I/O, but this is unacceptable. What choices do I have?
Don't use sync stuff in a normal server environment: things are single-threaded and this will completely lock things up while it waits for the results of this I/O-bound loop. CLI utility = probably fine; server = only okay at startup.
A common library for asynchronous flow control is
https://github.com/caolan/async
async.filter(['file1', 'file2', 'file3'], path.exists, function(results){
    // results now equals an array of the existing files
});
And if you want to, say, avoid the extra calls to path.exists, then you could pretty easily write a function 'first' that performs the operations until some test succeeds. Similar to https://github.com/caolan/async#until, but you're interested in the output.
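A minimal sketch of such a 'first' helper (a hypothetical function, not part of the async library), assuming the test takes a callback with a boolean result like path.exists does:
// Calls test(item, cb) on each item in order and stops at the first success.
function first(items, test, done) {
    (function next(i) {
        if (i >= items.length) return done(null); // nothing matched
        test(items[i], function (exists) {
            if (exists) return done(items[i]);     // first match wins
            next(i + 1);                           // otherwise try the next one
        });
    })(0);
}

first(files, path.exists, function (file) {
    if (file) {
        continueWithStuff();
    } else {
        // none of the files exist; use the hard-coded fallback message
    }
});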
The async library is absolutely what you are looking for. It provides pretty much all the types of iteration that you'd want in a nice asynchronous way. You don't have to write your own 'first' function though. Async already provides a 'some' function that does exactly that.
https://github.com/caolan/async#some
async.some(files, path.exists, function(result) {
    if (result) {
        continueWithStuff();
    } else {
        // Handle this scenario
    }
});
If you or someone reading this in the future doesn't want to use Async, you can also do your own basic version of 'some.'
function some(arr, func, cb) {
    var count = arr.length - 1;
    (function loop() {
        if (count == -1) {
            return cb(false);
        }
        func(arr[count--], function(result) {
            if (result) cb(true);
            else loop();
        });
    })();
}

some(files, path.exists, function(found) {
    if (found) {
        continueWithStuff();
    } else {
        // Handle this scenario
    }
});
You can do this without third-party libraries by using a recursive function. Pass it the array of filenames and a pointer, initially set to zero. The function should check for the existence of the indicated (by the pointer) file name in the array, and in its callback it should either do the other stuff (if the file exists) or increment the pointer and call itself (if the file doesn't exist).
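A minimal sketch of that recursive approach, using the question's files and continueWithStuff placeholders and fs.access for the existence check (older Node versions used path.exists or fs.exists instead):
var fs = require('fs');

function checkFile(files, index) {
    if (index >= files.length) {
        // None of the files exist; fall back to the hard-coded message.
        return;
    }
    fs.access(files[index], function (err) {
        if (!err) {
            continueWithStuff();         // found the first existing file
        } else {
            checkFile(files, index + 1); // try the next file name
        }
    });
}

checkFile(files, 0);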
Use async.waterfall to control the async calls in Node.js. For example, include the async library and use its waterfall call:
var async = require('async');

async.waterfall([
    function(callback) {
        callback(null, taskFirst(rootRequest, rootRequestFrom, rootRequestTo, callback, res));
    },
    function(arg1, callback) {
        if (arg1 !== undefined) {
            callback(null, taskSecond(arg1, rootRequest, rootRequestFrom, rootRequestTo, callback, res));
        }
    }
]);
(Edit: removed sync suggestion because it's not a good idea, and we wouldn't want anyone to copy/paste it and use it in production code, would we?)
If you insist on using async stuff, I think a simpler way to implement this than what you described is to do the following:
var path = require('path'), fileCounter = 0;

function existCB(fileExists) {
    if (fileExists) {
        global.fileExists = fileCounter;
        continueWithStuff();
        return;
    }
    fileCounter++;
    if (fileCounter >= files.length) {
        // none of the files exist, handle stuff
        return;
    }
    path.exists(files[fileCounter], existCB);
}

path.exists(files[0], existCB);

Force Javascript function call to wait until previous one is finished

I have a simple Javascript function:
makeRequest();
It does a bunch of stuff and places a bunch of content into the DOM.
I make a few calls like so:
makeRequest('food');
makeRequest('shopping');
However, they both fire so quickly that they are stepping on each other's toes. Ultimately I need it to have the functionality of:
makeRequest('food');
wait....
makeRequest('shopping'); only if makeRequest('food') has finished
Thoughts on getting these to execute only one at a time?
Thanks!
If these functions actually do an AJAX request, you are better off keeping them asynchronous. You can make a synchronous AJAX request, but it will stop the browser from responding and lead to a bad user experience.
If what you require is that these AJAX requests are made one after the other because they depend on each other, you should investigate your function to see if it provides a callback mechanism.
makeRequest('food', function() {
    // called when food request is done
    makeRequest('shopping');
});
Using jQuery, it looks something like this:
$.get("/food", function(food)
{
// do something with food
$.get("/shopping", function(shopping)
{
// do something with shopping
});
});
I would recommend that you simply write them asynchronously--for example, call makeRequest('shopping'); from the AJAX completion handler of the first call.
If you do not want to write your code asynchronously, see Javascript Strands
I suppose that you have a callback method that takes care of the response for the request? Once it has done that, let it make the next request.
Declare an array for the queue, and a flag to keep track of the status:
var queue = [], requestRunning = false;
In the makeRequest method:
if (requestRunning) {
    queue.push(requestParameter);
} else {
    requestRunning = true;
    // do the request
}
In the callback method, after taking care of the response:
if (queue.length > 0) {
    var requestParameter = queue.splice(0, 1)[0];
    // do the request
} else {
    requestRunning = false;
}
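Putting those fragments together, a minimal sketch of the whole queueing wrapper might look like this (sendRequest and handleResponse are hypothetical stand-ins for the actual AJAX call and its response handling):
var queue = [], requestRunning = false;

function makeRequest(requestParameter) {
    if (requestRunning) {
        queue.push(requestParameter);  // a request is in flight; wait our turn
    } else {
        requestRunning = true;
        sendRequest(requestParameter, onComplete);
    }
}

function onComplete(response) {
    handleResponse(response);          // take care of the response first
    if (queue.length > 0) {
        var next = queue.splice(0, 1)[0];
        sendRequest(next, onComplete); // keep requestRunning true and start the next one
    } else {
        requestRunning = false;
    }
}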
