So I've got this basket functionality where you enter, say, an author's name and it lists the available books. You select what you want and then you can click to select another author. When you do, you get a list looking roughly like this:
Stephen King
The Stand [remove]
The Talisman [remove]
Pet Sematary [remove]
Terry Pratchett
Mort [remove]
Guards Guards [remove]
In the example above, the Stephen King books have been stored in Session, the Terry Pratchett books have not. If I click the remove button on a Pratchett book, some jquery will just hide those. If I remove a Stephen King book, an ajax request is triggered to remove it from Session before jquery hides it.
So my javascript looks something like this:
$('.book-remove').click(removeBook);
function deleteFromBasket(bookId) {
var ajaxArgs = { BookId : bookId };
return $.ajax({
// blah blah, no problems here
success: function(e) {
hideRow(bookId);
}
});
}
function hideRow(bookId) {
$('.book-id-' + bookId).hide();
}
function removeBook(e) {
e.preventDefault();
if ($(this).hasClass('needs-ajax-call')) {
var promise = deleteFromBasket($(this).prop("id").replace("book-id-", ""));
// this is the problem. how do I wait here for the ajax to complete?
}
else
hideRow();
// if i put promise.done(hideRow) up there, it still runs this too soon.
doMoreStuff();
}
You can structure all your code (both the code path with ajax and the one without) to use promises. The path that doesn't need an ajax call can just start with a promise that is already resolved; both code paths then execute the same sequence in the same order (the non-ajax path is simply faster because it has nothing to wait for).
Any stuff you want done AFTER the ajax call simply has to be moved into the promise.then() handler:
function removeBook(e) {
e.preventDefault();
var promise;
if ($(this).hasClass('needs-ajax-call')) {
promise = deleteFromBasket($(this).prop("id").replace("book-id-", ""));
} else {
// get an already resolved promise since no ajax call is required
promise = $.Deferred().resolve();
}
// you can return this promise if you want the calling code
// to be able to know when you're done with everything
return promise.then(function() {
hideRow(bookId); // assumes you've calculated a bookId somewhere
doOtherStuff();
});
}
This has the advantage that the lion's share of your code is in one code path rather than in two separate code paths, and it solves your issue because none of the code in the .then() handler will execute until after the ajax call is done.
Here's an answer to the question plus some suggestions for tidier DOM/javascript.
First, let's make sure that:
the "session" entries are in static container(s) (eg a <div>) with class="sessionResultsWrapper"
the "non session" entries are in static container(s) (eg a <div>) with class="otherResultsWrapper"
each entry is an element (eg an <li>) with class="entry" and data-bookID="xxxx"
Now you are in a better position:
to select elements without the need for cumbersome id parsing
to establish a click handler that will fire on all existing "remove" buttons and any that are added later to the static containers.
$(".sessionResultsWrapper, .otherResultsWrapper").on('click', '.book-remove', function(e) {
e.preventDefault();
var $this = $(this),
promise;
if ($this.parents(".sessionResultsWrapper").length) {
promise = deleteFromBasket($this.closest('.entry').data('bookID'));
} else {
promise = $.when(); // this is the most compact syntax for a resolved promise
}
// At this point, you have a promise regardless of whether a "session" or "non-session" button was clicked.
// The only difference is that a "non-session" promise will be already resolved, while a "session" promise will be resolved later.
// Because of the way promises work, we can simply chain a .then(fn), which will fire immediately (non session, Terry Pratchett title) or later (when ajax successfully returns, Stephen King title).
promise.then(function() {
$this.closest(".entry").hide();//or, if the entry will never be re-shown then .remove()
doMoreStuff();
});
});
You could add doMoreStuff as a callback to the promise. Something like this:
function removeBook(e) {
e.preventDefault();
if ($(this).hasClass('needs-ajax-call')) {
var promise = deleteFromBasket($(this).prop("id").replace("book-id-", ""));
promise.done(doMoreStuff);
}
else{
hideRow();
doMoreStuff();
}
}
I have the need to do a main AJAX form submit. However, I want to perform a series of other preliminary form submits and AJAX requests halfway through, before continuing the main form submit.
Below is the idea, but with a lot of pseudocode. I want to call the ajaxFunction as shown, complete all its tasks, then proceed with the main form submission:
$('#mainform').submit(function(e){
e.preventDefault();
var main = this;
var data = $('#section :input', main).serialize();
//preliminary nested ajax requests
var mainresult = ajaxFunction('arg1', 'arg2');
alert("All preliminary AJAX done, proceeding...");
if(mainresult){
//final ajax
$.post('mainurl', data, function(result){
console.log(result);
});
}else{
//do nothing
}
});
function ajaxFunction(param1, param2){
//ajax1
ajaxFetchingFunction1('url1', function(){
//ajax2
ajaxFetchingFunction2('url2', function(){
//submit handler
$('#anotherform').submit(function(){
if(someparam === 1){
return true;
}else{
return false;
}
});
});
});
}
As it is now, I know it won't work as expected because of all the asynchronous nested AJAX calls. What I get is that alert("All preliminary AJAX done, proceeding..."); executes even before any of the AJAX calls in ajaxFunction.
I believe that this is just the kind of scenario ("callback hell") for which the Deferred/Promise concept was introduced, but I've been struggling to wrap my head around this. How can I structure these different AJAX requests, such that code execution would wait until ajaxFunction completes and returns mainresult for subsequent use?
How can I structure these different AJAX requests, such that code execution would wait until ajaxFunction completes and returns mainresult for subsequent use?
You can't and you don't. Javascript will not "wait" for an asynchronous operation to complete. Instead, you move the code that wants to run after the async operation is done into a callback that is then called when the async operation is done. This is true whether using plain async callbacks or structured callbacks that are part of promises.
Asynchronous programming in Javascript requires a rethinking and restructuring of the flow of control so that things you want to run after an async operation is done are put into a callback function rather than just sequentially on the next line of code. Async operations are chained in sequence through a series of callbacks. Promises are a means of simplifying the management of those callbacks, and particularly of simplifying the propagation of errors and the synchronization of multiple async operations.
If you stick with callbacks, then you can communicate completion of ajaxFunction() with a completion callback:
function ajaxFunction(param1, param2, doneCallback){
//ajax1
ajaxFetchingFunction1('url1', function(){
//ajax2
ajaxFetchingFunction2('url2', function(){
doneCallback(someResult);
});
});
}
And, then use it here:
$('#mainform').submit(function(e){
e.preventDefault();
var main = this;
var data = $('#section :input', main).serialize();
//preliminary nested ajax requests
ajaxFunction('arg1', 'arg2', function(result) {
// process result here
alert("All preliminary AJAX done, proceeding...");
if(result){
//final ajax
$.post('mainurl', data, function(result){
console.log(result);
});
}else{
//do nothing
}
});
});
Note: I removed your $('#anotherform').submit() from the code because inserting an event handler in a function that will be called repeatedly is probably the wrong design here (since it ends up creating multiple identical event handlers). You can insert it back if you're sure it's the right thing to do, but it looked wrong to me.
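If that handler really is needed, one hedged option (just a sketch) is to bind it once at page setup, outside ajaxFunction(), so repeated calls don't stack identical handlers; someparam is the question's own variable and is assumed to be in scope when the handler fires:
// Bind #anotherform's submit handler once, rather than inside a function
// that runs on every ajax cycle (which would attach duplicate handlers).
$(function () {
    $('#anotherform').submit(function () {
        // allow or block the submit exactly as in the original handler
        return someparam === 1;
    });
});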
This would generally be a great place to use promises, but your code is a bit abstract to show you exactly how to use promises. We would need to see the real code for ajaxFetchingFunction1() and ajaxFetchingFunction2() to illustrate how to make this work with promises since those async functions would need to create and return promises. If you're using jQuery ajax inside of them, then that will be easy because jQuery already creates a promise for an ajax call.
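For illustration, here is a hedged sketch of what such a modification could look like if ajaxFetchingFunction1() wraps a jQuery ajax call; the URL parameter and settings are assumptions, since the real body of the function isn't shown:
function ajaxFetchingFunction1(url) {
    // $.ajax() already returns a thenable jqXHR, so returning it is all that's needed
    return $.ajax({
        url: url,
        type: 'POST'
    });
}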
If both ajaxFetchingFunction1() and ajaxFetchingFunction2() are modified to return a promise, then you can do something like this:
function ajaxFunction(param1, param2){
return ajaxFetchingFunction1('url1').then(function() {
return ajaxFetchingFunction2('url2');
});
}
And, then use it here:
$('#mainform').submit(function(e){
e.preventDefault();
var main = this;
var data = $('#section :input', main).serialize();
//preliminary nested ajax requests
ajaxFunction('arg1', 'arg2').then(function(result) {
// process result here
alert("All preliminary AJAX done, proceeding...");
if(result){
//final ajax
$.post('mainurl', data, function(result){
console.log(result);
});
}else{
//do nothing
}
});
});
Promises make the handling of multiple ajax requests really trivial; however, the implications of "partial forms" for GUI design are maybe more of a challenge. You have to consider things like:
One form divided into sections, or one form per partial?
Show all partials at the outset, or reveal them progressively?
Lock previously validated partials to prevent meddling after validation?
Revalidate all partials at each stage, or just the current partial?
One overall submit button or one per partial?
How should the submit button(s) be labelled (to help the user understand the process he is involved in)?
Let's assume (as is the case for me but maybe not the OP) that we don't know the answers to all those questions yet, but that they can be embodied in two functions - validateAsync() and setState(), both of which accept a stage parameter.
That allows us to write a generalised master routine that will cater for as yet unknown validation calls and a variety of GUI design decisions.
The only real assumption needed at this stage is the selector for the form/partials. Let's assume it/they all have class="partialForm":
$('.partialForm').on('submit', function(e) {
e.preventDefault();
$.when(setState(1)) // set the initial state, before any validation has occurred.
.then(validateAsync.bind(null, 1)).then(setState.bind(null, 2))
.then(validateAsync.bind(null, 2)).then(setState.bind(null, 3))
.then(validateAsync.bind(null, 3)).then(setState.bind(null, 4))
.then(function aggregateAndSubmit() {
var allData = ....; // here aggregate all three forms' data into one serialization.
$.post('mainurl', allData, function(result) {
console.log(result);
});
}, function(error) {
console.log('validation failed at stage: ' + error.message);
// on screen message for user ...
return $.when(); //inhibit .fail() handler below.
})
.fail(function(error) {
console.log(error);
// on screen message for user ...
});
});
It's syntactically convenient here to call setState() as a .then() callback, even though it is (probably) synchronous.
Sample validateAsync():
function validateAsync(stage) {
var data, jqXHR;
switch(stage) {
case 1:
data = $("#form1").serialize();
jqXHR = $.ajax(...);
break;
case 2:
data = $("#form2").serialize();
jqXHR = $.ajax(...);
break;
case 3:
data = $("#form3").serialize();
jqXHR = $.ajax(...);
}
return jqXHR.then(null, function() {
return new Error(stage);
});
}
Sample setState():
function setState(stage) {
switch(stage) {
case 1: //initial state, ready for input into form1
$("#form1").disableForm(false);
$("#form2").disableForm(true);
$("#form3").disableForm(true);
break;
case 2: //form1 validated, ready for input into form2
$("#form1").disableForm(true);
$("#form2").disableForm(false);
$("#form3").disableForm(true);
break;
case 3: //form1 and form2 validated, ready for input into form3
$("#form1").disableForm(true);
$("#form2").disableForm(true);
$("#form3").disableForm(false);
break;
case 4: //form1, form2 and form3 validated, ready for final submission
$("#form1").disableForm(true);
$("#form2").disableForm(true);
$("#form3").disableForm(true);
}
return stage;
}
As written, setState() will need the jQuery plugin .disableForm():
jQuery.fn.disableForm = function(bool) {
return this.each(function(i, form) {
if(!$(form).is("form")) return true; // continue
$(form.elements).each(function(i, el) {
el.readOnly = bool;
});
});
}
As I say, validateAsync() and setState() above are just rudimentary samples. As a minimum, you will need to:
flesh out validateAsync()
modify setState() to reflect the User Experience of your choice.
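For example, here is a hedged sketch of fleshing out one case of validateAsync() with a concrete jQuery ajax call; the validation endpoint URL is hypothetical, and the error handling simply mirrors the sample above:
function validateAsync(stage) {
    // stage 1 shown; cases 2 and 3 follow the same pattern with #form2 / #form3
    var data = $("#form1").serialize();
    var jqXHR = $.ajax({
        url: "/validate/stage" + stage, // hypothetical endpoint
        type: "POST",
        data: data
    });
    // as in the sample above, turn a failure into an Error carrying the stage number
    return jqXHR.then(null, function() {
        return new Error(stage);
    });
}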
In my previous question I thought I'd got it sorted, but I've found an intermittent edge condition where the "then" part is being carried out before all the promises resolve in the Q.all call.
Simply, I have the following set up where a generic calculation is called multiple times with different parameters: (code simplified a bit to remove irrelevant code and actual parameter names)
var promiseArray = [];
promiseArray.push(genericCalc(param1, param2, param3));
promiseArray.push(genericCalc(param1, param2, param3));
promiseArray.push(genericCalc(param1, param2, param3));
var calcPromise = Q.all(promiseArray);
return calcPromise
.then(calcsDone)
.fail(calcsFailed);
function calcsDone(res) {
calcTotals();
setChart(selectedRow());
console.log("done recalculation routine");
}
function calcsFailed() {
logger.logError("Failure in Calculations", "", null, true);
}
genericCalc (with some irrelevant code removed)
var genericCalc = function (param1, param2, param3) {
//specific calculation is determined using parameters passed in and varies but is same structure for each as below
calcPromise = specificCalc(params...);
return calcPromise
.then(calcsDone)
.fail(calcsFailed);
function calcsDone(res) {
//some minor subtotalling in here using "res" and flag setting
return Q.resolve();
}
function calcsFailed() {
//handle error....
}
};
There are 3 or 4 different specific calculations and they all have roughly the same sort of layout:
function specificCalc1(params...) {
var promise = calcContext.ajaxCalc(params);
return promise
.then(calcsDone)
.fail(calcsFailed);
function calcsDone(res) {
rv(res);
console.log("done specific calc 1");
return Q.resolve();
}
function calcsFailed() {
logger.logError("Failure in specific calc 1", "", null, true);
return Q.resolve();
}
};
Now, I know it's not a great idea to push the result of the ajax calc into a return value, but at present I don't want to change this as there is just too much code change involved. Even if it's not the best methodology, this is a learning curve for me, and I'll address that part once I have my head around this strange edge condition.
So... what throws me is that every now and again when I change some values on screen which trigger a recalculation, one of the "done" console log messages from one of the specific calcs appears AFTER the "done recalculation routine" one from the first section!
I thought it was being caused by a poorly-constructed promise leading to the function being executed immediately. The REALLY weird thing, though, is that when I put a debug stop in the server code for that calculation, everything works correctly: the client browser is paused until the server code is continued, and only then do the client-side debug messages show that the next points are being hit.
9 times out of 10 it all works perfectly. On the 10th one, I see "done recalculation routine" in the console immediately followed by "done specific calc 2" and this causes the totals to be wrong.
I cannot see how this can be happening. I've artificially put time delay loops in all the specific calc functions to make them take several seconds and never once do I see the out of sequence debug messages, but when there is no artificial slowdown in place, I see it 1 in 10 times approximately.
Please would someone try and put me out of my misery here... I just want to know I can rely on the "Q.all" call working in the first block of code, such that when it hits "calcsDone" it has ALWAYS done the generic and specific calcs!
Edit 1:
To explain the "rv" or returnvalue a bit. In "genericCalc", I declare an observable:
var res = ko.observable(); //holds returned results
And as part of the call to "specificCalc" I pass that return value in, for example:
calcPromise = specificCalc1(isPackaging, res, thisArticle);
In the specificCalc I just put the ajax result into the observable, so it is available in "calcsDone" in "genericCalc" and, once all calcs are completed from the "Q.all()" function, the calculation routines can tot up the individual results from each specific Calc.
Edit 2
The console log, when it goes wrong, is:
done specificCalc1
done specificCalc2, isPackaging=true
Done specificCalc3, redraw = false
done calctotals
chart replotted
done recalculation routine
done specificCalc2, isPackaging=false
...you can see the "done specificCalc2" after "done recalculation routine" there.
Edit 3
I reduced the promiseArray to just one item to check, using the parameters for what seems to be the troublesome call (specificCalc2):
var promiseArray = [];
promiseArray.push(genericCalc(param1, param2, param3));
And then I put a stop point in the server code.
The result is that the stop in the server code happens and the console already says "done", so it's a problem with the promise construction after all, which was somehow being masked by one of the other calls completing. Once I release the stop in the server code, I get my console messages from the ajax call function AND the genericCalc function saying "done".
So, it appears to me that the problem is in the main Q.all call, as that doesn't wait for the generic calc function to complete before it carries on with its "calcsDone" function.
I just tried returning the genericCalc promise:
return genericCalc("eol_", true, false, false)
//return calcPromise
.then(calcsDone)
.fail(calcsFailed);
...and it instantly fails with "cannot call method 'then' of undefined", thus confirming that my problem is in the generic calc function, so off to look at that now.
Edit 4
Narrowed the problem down to a further function call within genericCalc. As part of the calculation, this calls a function to remove the impact value as it currently stands before the calculation is done. When the calculation returns, it then adds the result back into the overall amount.
If I "return Q.resolve()" from genericCalc on the line before I do:
impactRemove(currentPrefix, currentArticle); //remove impacts for this prefix
then the promise works; if I do it on the line after, it fails. So for some reason calling that function seems to resolve the promise. Investigating further now.
Edit 5
So the problem is caused when I call a standard function midway through the genericCalc routine. The logic behind this is:
Change on browser form retriggers calculation
Function is called that sets up array of promises (several of which call the same function with different parameters)
Inside that generic function (genericCalc) I call a standard non-promise function that removes the current calculation totals from the project total
Calculation is complete
Standard non-promise function called to add results of calculation back to project total
GenericCalc returns promise to main function
Overall totals updated with latest project totals, graphics are updated
What actually seems to happen is that when I call the standard javascript functions within genericCalc, they execute immediately, and although the ajax call is still made, the Q.all call does not wait: it treats the work as already settled, because genericCalc is returning undefined and not a promise.
At this point, Bergi is screaming at me about my hideous anti-pattern noob coding and I tend to agree with him. Trouble is, I'd like to get it working this way so I have something to test against when I finally adapt it to work correctly.
So... if I have two functions called from within "genericCalc" like so:
impactRemove(currentPrefix, currentArticle); //remove impacts for this prefix
and
impactAdd(currentPrefix, currentArticle); // add impacts for this prefix
Which are basically like this:
function impactAdd(prefix, prjArt) {
if (!prjArt) {return} //just get out as there's nothing to calculate
factorArray.forEach(function (f) {
var orig_projValue = pGlobalVar.editProject()[prefix + f]();
var new_projArtValue = prjArt[prefix + f](); //must be set first!
pGlobalVar.editProject()[prefix + f](orig_projValue + new_projArtValue); //add new value back in to project value
});
}
...then how do I call these "midpoint" functions within the promise of genericCalc so that I can still return a promise when a) impactRemove has been completed, b) the remote ajax call has been completed and c) the impactAdd function has been completed?
Do I need to set up code within genericCalc to do something like:
impactRemove(params...).then(<ajax function that returns new calculation results>).then(impactAdd(params))
...or will (params) after my functions automatically invoke those functions, thus resolving the promise too early? This is all so confusing compared to what I'm used to!
Edit6
All genericCalc does is this:
var genericCalc = function (param1, param2, param3) {
//specific calculation is determined using parameters passed in and varies but is same structure for each as below
calcPromise = specificCalc(params...);
impactRemove(currentPrefix, currentArticle); //remove impacts for this prefix
return calcPromise
.then(calcsDone)
.fail(calcsFailed);
function calcsDone(res) {
//some minor subtotalling in here using "res" and flag setting
impactAdd(currentPrefix, currentArticle); // add impacts for this prefix
return Q.resolve();
}
function calcsFailed() {
//handle error....
}
};
"specificCalc" returns a promise - that one works as I've checked the contents of the promise at a breakpoint. If I remove the calls to "impactRemove" and "impactAdd" above, then "genericCalc" also works. It is those calls that cause the problem.
This is why I think I need something like:
impactRemove(params)
.then(return calcPromise(params)
.then(impactAdd(params);
...but neither impactAdd nor impactRemove do anything asynchronously and I'm also not sure how I can set this up as I need to use params and yet you said those params will mean the functions are immediately invoked...
Edit 7
So, as mentioned in the lengthy comments section, this is being caused by a "forEach" loop in genericCalc:
var genericCalc = function (calcPrefix, other params...) {
gcCount++;
console.log("generic calc starting for: " + calcPrefix + ", count=" + gcCount);
pGlobalVar.projectIsCalculating(true); //controls spinner gif
var res_Array = ko.observable(); //holds returned results
var _prjArticleArray = []; //create local array to use
if (currentArticle == true) {
_prjArticleArray.push(pGlobalVar.editProjectArticle());
} else {
projectResults().projectArticles().forEach(function (pa) {
_prjArticleArray.push(pa);
});
};
_prjArticleArray.forEach(function (thisArticle) {
var calcPromise;
switch (calcPrefix) {
case "eol_":
calcPromise = Q.all([calcEOL(isPackaging, res_Array, thisArticle)]);
break;
case "use_":
calcPromise = Q.all([calcUse(isPackaging, res_Array, thisArticle)]);
break;
case "air_":
calcPromise = Q.all([calcFlight(isPackaging, res_Array, thisArticle)]);
break;
default:
break;
}
impactRemove(calcPrefix, thisArticle); //remove impacts for this prefix
return calcPromise
.then(calcsDone)
.fail(calcsFailed);
function calcsDone(args) {
//do some calcs and totalling based on returned results
impactAdd(calcPrefix, thisArticle); //add this article impact into project total
console.log("generic calc done for: " + calcPrefix + ", count=" + gcCount);
calcTotals(); //accmulates individual results into the "total_xxx" used on the results table and in the chart
setChart(selectedRow());
pGlobalVar.projectIsCalculating(false);
}
function calcsFailed() {
logger.logError("Failure in " + calcPrefix + "calculation", "", null, true);
impactAdd(calcPrefix); //will add back in results as they were at start of calc
pGlobalVar.projectIsCalculating(false);
}
});
};
The only reason I've posted it in all its ugly glory is to point out that all works perfectly IF I remove the "forEach" loop and just run this for one article. Why on earth would the forEach loop make it die a horrible death?
I think you just want to exchange the order of the specificCalc() and impactRemove() calls. Even if the first is asynchronous, it will start doing its task right now - only the results will arrive in the next turn. If you want to chain anything after a synchronous task, just put it on the next line ("semicolon monad").
Also, if impactRemove does assign to your global (!) variable calcPromise, it might not be a promise any more and throw when calling a .then() method on it. What you want seems to be
function genericCalc(param1, param2, param3) {
impactRemove(currentPrefix, currentArticle); //remove impacts for this prefix
return specificCalc(params...).finally(function() {
impactAdd(currentPrefix, currentArticle); // add impacts for this prefix
}).then(function calcsDone(res) {
// some minor subtotalling in here using "res" and flag setting
return subtotal;
}, function calcsFailed() {
// handle error....
});
}
Why on earth would the forEach loop make it die a horrible death?
Because you're not returning a promise. A forEach loop has its own callback, from which you can return, but from the genericCalc function itself nothing is returned. Q.all will not fret about that; it just takes the undefined as a value. The async action is still started and you get your callbacks, but Q.all won't wait for it, because it does not know of it.
The solution is a quite simple change and has already been explained here.
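For concreteness, here is a hedged sketch of that change, reusing the names from Edit 7: collect the per-article promises with .map() instead of .forEach(), and return Q.all() of that array so genericCalc hands back a promise rather than undefined. buildArticleArray is an assumed helper standing in for the array-building logic shown in Edit 7:
var genericCalc = function (calcPrefix, isPackaging, currentArticle) {
    var res_Array = ko.observable(); // holds returned results, as before
    var _prjArticleArray = buildArticleArray(currentArticle); // assumed helper, same logic as in Edit 7

    var articlePromises = _prjArticleArray.map(function (thisArticle) {
        var calcPromise;
        switch (calcPrefix) {
            case "eol_": calcPromise = calcEOL(isPackaging, res_Array, thisArticle); break;
            case "use_": calcPromise = calcUse(isPackaging, res_Array, thisArticle); break;
            case "air_": calcPromise = calcFlight(isPackaging, res_Array, thisArticle); break;
        }
        impactRemove(calcPrefix, thisArticle);
        // returning the chain from .map() puts it into the array
        return calcPromise.then(function calcsDone() {
            impactAdd(calcPrefix, thisArticle);
        }, function calcsFailed() {
            impactAdd(calcPrefix); // add back the impacts as they were at the start of the calc
        });
    });

    // one promise covering every article - this is what the outer Q.all can wait on
    return Q.all(articlePromises);
};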
I'm using deferred as I need to execute several processes asynchronously.
To be clearer, here is what my treatments mean:
Treatment1: a call to an ajax service providing user rights
Treatment2: a call to an ajax service providing links and labels.
I need to call these 2 services at the same time and then get the unified response of both services in order to display links depending on rights (my real problem is with a 3rd ajax service, but let's stick to 2 to keep things simple).
First, I declare the deferreds as global vars:
var treatment1 = $.Deferred();
var treatment2 = $.Deferred();
Then, when I need to do the job, I call the resolve method with the data needed for the single global treatment:
when my 1st ajax service responds: treatment1.resolve(responseData1)
when my 2nd ajax service responds: treatment2.resolve(responseData2)
When treatments 1 and 2 are finished, the done handler is fired:
$.when(treatment1, treatment2).done(function(responseData1, responseData2) {
DoGlobalTreatmentWithAllResponseData(responseData1, responseData2);
});
My problem is that deferred works only once.
As my website is mostly built with ajax, I need to fire the event multiple times.
The user can click a button to search for users. Then a list of users is displayed and the ajax services are all called asynchronously. This operation can be repeated infinitely.
I just need a way to reuse the deferred principle multiple times. I know that this problem has already been discussed and everyone says deferreds can't work this way.
But, is it really not possible to reset the deferred state or reset the promises (even by implementing a custom solution, using AOP or something else)?
If it's impossible, what solution could I use? I don't want to fire the treatments one after another; I really want to do a global treatment after all the treatments are finished (that is to say, after the last active treatment finishes), and I want to use the responseData of each service.
Here is my sample code that I would like to customize : http://jsfiddle.net/PLce6/14/
I hope to be clear as English is not my native language.
Thank you in advance for your help.
Deferreds can be resolved/rejected only once... However, I think the issue is how you're structuring your code...
As long as you're initializing your deferred each time, there isn't any problem in doing this...
I think the issue is this:
First, I declare the deferreds as global vars:
var treatment1 = $.Deferred();
var treatment2 = $.Deferred();
Instead, can you try doing this in a function that's invoked on the button click?
The user can click a button to search for users
so have a function like so:
function onClick() {
var treatment1 =$.ajax({url: '/call1'});
var treatment2 = $.ajax({url: '/call2'});
$.when(treatment1, treatment2).done(function(obj1, obj2) {
// do whatever else you need
});
}
Now, from the rest of your post, it looks like you're trying to reuse the deferreds - but in that case, your original solution should not have a problem with keeping deferreds as global, since your done will be called with whatever data they were resolved with.
Can you post some more of your code to help explain what you're trying to do?
Updated from my own comment below for elaboration
Based on the OP's fiddle, he wants to be able to trigger the dependent action multiple times. The solution is to have the dependent action create new deferreds and hook up a $.when to itself. See the updated fiddle at http://jsfiddle.net/PLce6/15/
// global
var d1 = $.Deferred();
var d2 = $.Deferred();
var d3 = $.Deferred();
// here's the reset
function resetDeferreds() {
d1 = $.Deferred();
d2 = $.Deferred();
d3 = $.Deferred();
$.when(d1, d2, d3).done(
function (responseData1, responseData2, responseData3) {
DoGlobalTreatmentWithAllResponseData(responseData1, responseData2, responseData3);
resetDeferreds();
});
}
// the onclick handlers
function do3() {
d3.resolve('do3 ');
return d3;
}
// the top level $.when
$.when(d1, d2, d3).done(function (responseData1, responseData2, responseData3) {
DoGlobalTreatmentWithAllResponseData(responseData1, responseData2, responseData3);
resetDeferreds();
});
Perhaps your code is not well designed?
I do not see how that would be an issue. The asynchronous process should be responsible for creating a new Deferred object every time.
function doSomething() {
var d = $.Deferred();
setTimeout(function () {
d.resolve();
}, 1000);
return d;
}
function doSomethingElse() {
var d = $.Deferred();
setTimeout(function () {
d.resolve();
}, 1000);
return d;
}
Then you can always do the following:
$.when(doSomething(), doSomethingElse()).done(function () {
console.log('done');
});
There's always a solution:
If you absolutely need to be able to call resolve multiple times on the same Deferred, then you should wrap the Deferred in another object, let's say DeferredWrapper, which would expose the same API as a Deferred but would delegate all method calls to its encapsulated Deferred.
In addition to delegating the function calls, the DeferredWrapper would have to keep track of all listening operations (e.g. done, always, fail...) that were made on the object. The DeferredWrapper could store all actions as [functionName, arguments] tuples in an internal this._actions property.
Finally, you would need to provide a special implementation for state changing operations (e.g. reject, resolve, resolveWith...etc) that would look like:
1. Let d be the internal Deferred referenced by this._deferred.
2. Let fn be the function name of the function being called.
3. If d.state() is not pending:
   3.1 Do d = this._deferred = [[native jQuery Deferred]]
   3.2 Apply all recorded actions on d.
4. Return the result of d[fn].apply(d, arguments)
Note: You would also need to provide a custom promise implementation and make sure it behaves correctly. You can probably use a similar approach to the one described.
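To make the idea concrete, here is a minimal sketch of such a wrapper; it is an illustration under the assumptions above, not a full implementation, and only done and resolve are shown:
function DeferredWrapper() {
    this._deferred = $.Deferred();
    this._actions = []; // recorded [functionName, arguments] tuples
}

DeferredWrapper.prototype.done = function () {
    this._actions.push(['done', arguments]);
    this._deferred.done.apply(this._deferred, arguments);
    return this;
};

DeferredWrapper.prototype.resolve = function () {
    var d = this._deferred;
    if (d.state() !== 'pending') {
        // already settled: swap in a fresh Deferred and replay the recorded listeners
        d = this._deferred = $.Deferred();
        this._actions.forEach(function (action) {
            d[action[0]].apply(d, action[1]);
        });
    }
    d.resolve.apply(d, arguments);
    return this;
};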
I'm going to suggest a small change. One element you weren't clear on is whether or not the treatment1 and treatment2 results are different each time. If they are, then do what #raghu and #juan-garcia suggest:
function onClick() {
var treatment1 =$.ajax({url: '/call1'});
var treatment2 = $.ajax({url: '/call2'});
$.when(treatment1, treatment2).done(function(obj1, obj2) {
// do whatever else you need
});
}
If they don't change, then do this:
var treatment1 =$.ajax({url: '/call1'});
var treatment2 = $.ajax({url: '/call2'});
function onClick() {
$.when(treatment1, treatment2).done(function(obj1, obj2) {
// do whatever else you need
});
}
Or some variation of that. Once they are complete, your callback function will always execute right away. It's still asynchronous, but it doesn't need to wait since everything is ready to go. This serves both use cases. It's a very common pattern for data that may take a few seconds to load before it's functionally useful when drawing a new component in the page, and it's a very useful lazy-load mechanism. Once it's in, though, everything looks as if it's responding instantaneously.
I reworked the javascript in your example on JSFiddle to show just the basics I think you needed to see. Given your example, I think the mistake is in believing that resolve must be called multiple times to trigger a behavior. Invoking done queues a one-time behavior, and each invocation of done loads a new behavior into the queue. Resolve is called one time; $.when().done() you call as many times as you have behaviors dependent on the specific when() condition.
I have a function that is bound to mouse click events on a Google Map. Due to the nature of the function, it can take a few moments for processing to complete (0.1 to 2 seconds, depending on connection speed). In itself this is not much of a problem; however, if the user gets click-happy it can cause problems, as later calls depend somewhat on the previous one.
What would be the best way to have the later calls wait for previous ones to complete? Or even the best way to handle failures of previous calls?
I have looked at doing the following:
Using a custom .addEventListener (Link)
Using a while loop that waits until the previous one has processed
Using a simple if statement that checks whether the previous one needs to be re-run
Using other forms of callbacks
Now for some sample code for context:
this.createPath = function(){
//if previous path segment has no length
if (pathSegment[this.index-1].getPath().length === 0){
//we need the previous path segment recreated using this same function
pathSegment[this.index-1].createPath();
//now we can retry this path segment again
this.createPath();
}
//all is well, create this path segment using Google Maps direction service
else {
child.createPathLine(pathSegment[this.index-1].getEndPoint(), this.clickCoords);
}
}
Naturally this code as it is would loop like crazy and create many requests.
This is a good use case for promises.
They work like this (example using jQuery promises, but there are other APIs for promises if you don't want to use jQuery):
function doCallToGoogle() {
var defer = $.Deferred();
callToGoogleServicesThatTakesLong({callback: function(data) {
defer.resolve(data);
}});
return defer.promise();
}
/* ... */
var responsePromise = doCallToGoogle();
/* later in your code, when you need to wait for the result */
responsePromise.done(function (data) {
/* do something with the response */
});
The good thing is that you can chain promises:
var newPathPromise = previousPathPromise.then(
function (previousPath) { /* build new path */ });
Take a look at:
http://documentup.com/kriskowal/q/
http://api.jquery.com/promise/
To summarize: promises are an object abstraction over the use of callbacks, and they are very useful for control flow (chaining, waiting for all the callbacks, avoiding lots of callback nesting).
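Applied to the click-handler problem above, here is a hedged sketch of serializing the clicks with a promise chain, so each new path request starts only after the previous one has finished (doCallToGoogle is the helper from the example; the clickCoords parameter is an assumption):
// Start with an already-resolved promise so the first click can proceed immediately.
var lastRequest = $.Deferred().resolve().promise();

function onMapClick(clickCoords) {
    // Chain the new request onto the previous one. Returning a promise from the
    // .then() callback makes the chain wait for it before the next click's work runs.
    lastRequest = lastRequest.then(function () {
        return doCallToGoogle(clickCoords);
    });
    return lastRequest;
}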
Introduction to the problem
I need to call an asynchronous function within a loop until a condition is satisfied. This particular function sends a POST request to a website form.php and performs some operations with the response, which is a JSON string representing an object with an id field. So, when that id is null, the outer loop must conclude. The function does something like the following:
function asyncFunction(session) {
(new Request({
url: "form.php",
content: "sess=" + session,
onComplete: function (response) {
var response = response.json;
if (response.id) {
doStaff(response.msg);
} else {
// Break loop
}
}
})).get();
}
Note: although I found this problem while implementing an add-on for Firefox, I think it is a general javascript question.
Implementing the loop recursively
I've tried implementing the loop recursively, but it didn't work and I'm not sure that this is the right way.
...
if (response.id) {
doStaff(response.msg);
asyncFunction(session);
} else {
// Break loop
}
...
Using jsdeferred
I have also tried the jsdeferred library:
Deferred.define(this);
//Instantiate a new deferred object
var deferred = new Deferred();
// Main loop: stops when we receive the exception
Deferred.loop(1000, function() {
asyncFunction(session, deferred);
return deferred;
}).
error(function() {
console.log("Loop finished!");
});
And then calling:
...
if (response.id) {
doStaff(response.msg);
d.call();
} else {
d.fail();
}
...
This achieved serialization, but it started repeating previous calls on every iteration. For example, if it was the third time asyncFunction was called, it would also repeat the calls with the corresponding parameters from iterations 1 and 2.
Your question is not exactly clear, but the basic architecture must be that the completion event handler for the asynchronous operation decides whether to try again or to simply return. If the results of the operation warrant another attempt, then the handler should call the parent function. If not, then by simply exiting, the cycle will come to an end.
You can't code something like this in JavaScript with anything that looks like a simple "loop" structure, for the very reason that the operations are asynchronous. The results of the operation don't happen in such a way as to allow the looping mechanism to perform a test on the results; the loop may run thousands of iterations before the result is available. To put it another way, you don't "wait" for an asynchronous operation with code. You wait by doing nothing, and allowing the registered event handler to take over when the results are ready.
Thank you guys for your help. This is what I ended up doing:
var sess = ...;
Deferred.define(this);
function asyncFunction (session) {
Deferred.next(function() {
var d = new Deferred();
(new Request({
url: "form.php",
content: "sess=" + session,
onComplete: function (response) {
d.call(response.json);
}
})).get();
return d;
}).next(function(resp) {
if (resp.id) {
asyncFunction(session);
console.log(resp.msg);
}
});
}
asyncFunction(sess);
Why wouldn't you just use a setInterval loop? In the case of an SDK-based extension, this would look like:
https://builder.addons.mozilla.org/addon/1065247/latest/
The big benefit of promise-like patterns over using timers is that you can do things in parallel, and use much more complicated dependencies for various tasks. A simple loop like this is done just as easily and neatly using setInterval.
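For completeness, here is a hedged sketch of what that setInterval version could look like, using the Request API from the question; the interval length and the availability of session in scope are assumptions:
// Poll form.php on a fixed interval and stop once the response no longer carries an id.
var pollId = setInterval(function () {
    (new Request({
        url: "form.php",
        content: "sess=" + session,
        onComplete: function (response) {
            var data = response.json;
            if (data.id) {
                doStaff(data.msg);
            } else {
                clearInterval(pollId); // condition met: stop polling
            }
        }
    })).get();
}, 1000);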
If I correctly understand what you want to do, Deferred is a good approach. Here's an example using jQuery, which has Deferred functionality built in (jQuery.Deferred).
A timeout is used to simulate an http request. When each timeout is complete (or the http request is complete), a random number is returned, which stands in for the result of your http request.
Based on the result of the request you can decide if you need another http request or want to stop.
Try out the below snippet. Include the jQuery file and then the snippet. It keeps printing values in the console and stops after a zero is reached.
This could take a while to understand, but it's useful.
$(function() {
var MAXNUM = 9;
function newAsyncRequest() {
var def = $.Deferred(function(defObject) {
setTimeout(function() {
defObject.resolve(Math.floor(Math.random() * (MAXNUM+1)));
}, 1000);
});
def.done(function(val) {
if (val !== 0)
newAsyncRequest();
console.log(val);
});
};
newAsyncRequest();
});
Update after suggestion from #canuckistani
#canuckistani is correct in his answer. For this problem the solution is simpler. Without using Deferred the above code snippet becomes the following. Sorry I led you to a tougher solution.
$(function() {
var MAXNUM = 9;
function newAsyncRequest() {
setTimeout(function() {
var val = Math.floor(Math.random() * (MAXNUM+1));
if (val !== 0)
newAsyncRequest();
console.log(val);
}, 1000);
}
newAsyncRequest();
});