I have a function that is bound to mouse click events on a Google Map. Due to the nature of the function, it can take a few moments for processing to complete (0.1-2 seconds depending on connection speed). In itself this is not much of a problem; however, if the user gets click happy this can cause problems, and later calls depend somewhat on the previous one.
What would be the best way to have the later calls wait for previous ones to complete? Or even the best way to handle failures of previous calls?
I have looked at doing the following:
Using a custom .addEventListener (Link)
Using a while loop that waits until the previous one has processed
Using a simple if statement that checks if previous one needs to be re-run
Using other forms of callbacks
Now for some sample code for context:
this.createPath = function(){
    //if previous path segment has no length
    if (pathSegment[this.index-1].getPath().length === 0){
        //we need the previous path segment recreated using this same function
        pathSegment[this.index-1].createPath();
        //now we can retry this path segment again
        this.createPath();
    }
    //all is well, create this path segment using Google Maps direction service
    else {
        child.createPathLine(pathSegment[this.index-1].getEndPoint(), this.clickCoords);
    }
}
Naturally this code as it is would loop like crazy and create many requests.
This is a good use case for promises.
They work like this (example using jQuery promises, but there are other APIs for promises if you don't want to use jQuery):
function doCallToGoogle() {
    var defer = $.Deferred();
    callToGoogleServicesThatTakesLong({callback: function(data) {
        defer.resolve(data);
    }});
    return defer.promise();
}
/* ... */
var responsePromise = doCallToGoogle();
/* later in your code, when you need to wait for the result */
responsePromise.done(function (data) {
/* do something with the response */
});
The good thing is that you can chain promises:
var newPathPromise = previousPathPromise.then(
function (previousPath) { /* build new path */ });
Take a look to:
http://documentup.com/kriskowal/q/
http://api.jquery.com/promise/
To summarize, promises are an object abstraction over the use of callbacks that is very useful for control flow (chaining, waiting for all callbacks to complete, avoiding deeply nested callbacks).
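Applied to the original click handler, here is a minimal sketch of serializing the requests by chaining each one onto the previous segment's promise. It assumes createPathLine is adapted to return such a promise, and getLastEndPoint is a hypothetical helper; neither appears in the question's code as-is.

// Sketch only: serialize click-triggered path requests by chaining each new
// request onto the previous one's promise (assumes createPathLine returns a promise).
var lastPathPromise = $.Deferred().resolve().promise(); // start with an already-resolved promise

function onMapClick(clickCoords) {
    lastPathPromise = lastPathPromise.then(function () {
        // Only runs once the previous segment has finished (or its failure has been handled)
        return child.createPathLine(getLastEndPoint(), clickCoords); // hypothetical helper
    });
}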
Related
I have a third-party library whose events I'm listening to. I get a chance to modify the data which that library is going to append to the UI. Everything is fine as long as the data modification is synchronous. As soon as I involve Ajax callbacks/promises, it stops working. Let me give an example to showcase the problem.
Below is how I'm listening to the event:
d.on('gotResults', function (data) {
    // If I alter data directly it works fine.
    data.title = 'newTitle';
    // The line above alters the text correctly.

    // I want some properties to be grabbed from elsewhere, so I make an Ajax call.
    $.ajax('http://someurl...', {
        data: { id: data.id },
        success: function (res) {
            data.someProperty = res.thatProperty;
        }
    });
    // The code above doesn't wait for the ajax call to complete; the library just
    // goes ahead and renders the page without the data change.

    // Yes, I tried promises, but that doesn't help either:
    return fetch('http://someurl...').then(function (res) {
        data.someProperty = res.thatProperty;
        return true;
    });
    // The code above also fires the request and moves on; it doesn't wait for then() to complete.
});
I cannot change/alter the third party library. All I have is to listen to event and alter that data.
Any better solutions? I can't use async/await or generators, because I need this to work in ES5 browsers.
You cannot make a synchronous function wait for an asynchronous response, it's simply not possible by definition. Your options pretty much are:
BAD IDEA: Make a synchronous AJAX request. Again: BAD IDEA. Not only will this block the entire browser, it is also a deprecated practice and should not be used in new code, or indeed ever.
Fetch the asynchronous data first and store it locally, so it's available synchronously when needed. That obviously only works if you have an idea what data you'll be needing ahead of time (see the sketch after this list).
Alter the 3rd party library to add support for asynchronous callbacks, or request that of the vendor.
Find some hackaround where you'll probably let the library work with incomplete data first and then update it when the asynchronous data is available. That obviously depends a lot on the specifics of that library and the task being done.
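A minimal sketch of the second option, assuming you know up front which records you will need. The names propertyCache, prefetchProperties and listOfIdsYouExpect are made up for this example; the URL placeholder is the one from the question.

// Illustrative sketch: prefetch the extra properties into a local cache up front,
// so the synchronous event handler can read them without waiting.
var propertyCache = {};

function prefetchProperties(ids) {
    return $.when.apply($, ids.map(function (id) {
        return $.get('http://someurl...', { id: id }).done(function (res) {
            propertyCache[id] = res.thatProperty;
        });
    }));
}

prefetchProperties(listOfIdsYouExpect).done(function () {
    // Only register the handler once the cache is warm; inside it everything is synchronous.
    d.on('gotResults', function (data) {
        data.title = 'newTitle';
        data.someProperty = propertyCache[data.id];
    });
});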
Does the gotResults callback function really need to return anything else than true? If not, then you could just write regular asynchronous code without this library knowing about it. Let me explain myself by rewriting your pseudocode:
d.on('gotResults', function (data) {
    // Altering data directly works fine.
    data.title = 'newTitle';

    // Properties that need to be grabbed from elsewhere are fetched with an Ajax call,
    // and the data is updated (and re-rendered if needed) when the response arrives:
    $.ajax('http://someurl...', { data: { id: data.id } }).then(function (res) {
        data.someProperty = res.thatProperty;
        // EDIT: now it should render properly
        // maybe render again here?
    }).catch(function (err) {          // use .fail() in jQuery < 3
        handleError(err);              // handle errors so they don't disappear silently
    });

    return true; // this line runs before any of the above asynchronous code, but do we care?
});
I process thousands of points asynchronously in the ArcGIS JS API. In the main function, I call functions that process individual features, but I need to finalize the processing when all the features have been processed. There should be an event for this, though I didn't find any and I'm afraid it doesn't even exist - it would be hard to state that the last item processed was the last of all. .ajaxStop() would do this, but I don't use jQuery, just Dojo. The closest thing I found in Dojo was Fetch and its onComplete, but as far as I know that is about fetching data via AJAX, not from another JS function.
The only workaround idea I have now is to count how many features are to be processed and then fire when the output points array reaches the desired length, but I would need to know that number up front. How do I determine it at load time? Tracking the data back to the point where they are read from the server would mean modifying functions I'm not even supposed to know about, which is not possible.
EDIT - some of my code:
addData: function (data) {
    dojo.addOnLoad(
        this.allData = data,
        this._myFunction()
    );
},
Some comments:
data is an array of graphics
when I view data in debugger, its count is 2000, then 3000, then 4000...
without dojo.addOnLoad, the count started near zero, now it's around 2000, but still a fraction of the real number
_myFunction() processes all the 2000...3000...4000... graphics in this._allData, and returns wrong results because it needs them all to work correctly
I need to delay execution of _myFunction() until all data load, perhaps by some other event instead of dojo.addOnLoad.
Workarounds I already thought of:
a) setTimeout()
This is clearly a wrong option - any magic number of milliseconds to wait for would fail to save me if the data contains too many items, and it would delay even the case of a single point in the array.
b) length-based delay
I could replace the event with something like this:
if (data.length == allDataCount) {
    this._myFunction();
}
setTimeout(this._thisFunction, someDelay);
or some other implementation of the same, through a loop or a counter incremented in asynchronously called functions. The problem is how to make sure the allDataCount variable is definitive and not just the number of features loaded so far.
EDIT2: the pointer to deferreds and promises from #tik27 definitely helped me, but the best thing I found on converting synchronous code to a deferred was this simple example. I probably misunderstood something, because it doesn't work any better than the original synchronous code; this.allData still can't be guaranteed to hold all the data. The loading function now looks like this:
addData: function (data) {
    var deferred = new Deferred();
    this._addDataSync(data, function (error, result) {
        if (error) {
            deferred.reject(error);
        }
        else {
            deferred.resolve(result);
        }
    });
    deferred.promise.then(this._myFunction());
},

_addDataSync: function (data, callback) {
    callback(this.allData = data);
},
I know most use cases of deferred suppose requesting data from some server. But this is the first time where I can work with data without breaking functions I shouldn't change, so tracking the data back to the request is not an option.
addOnLoad is for waiting for the DOM.
If you are waiting for one function to complete before running another function, deferreds/promises are what is used.
I would need more info on your program to give you more specific answers.
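A hedged sketch of the deferred/promise approach in Dojo: collect one promise per feature and finalize when all of them resolve. processFeatureAsync and finalizeProcessing are hypothetical stand-ins for the real per-feature processing and the final step.

// Sketch only, assuming AMD-style Dojo with dojo/Deferred and dojo/promise/all.
require(["dojo/Deferred", "dojo/promise/all"], function (Deferred, all) {
    function processFeature(feature) {
        var d = new Deferred();
        processFeatureAsync(feature, function (result) { // hypothetical async worker
            d.resolve(result);
        });
        return d.promise;
    }

    function processAll(features) { // `features` would be the graphics array passed to addData
        var promises = features.map(processFeature);
        all(promises).then(function (results) {
            finalizeProcessing(results); // runs once every feature has been processed
        });
    }
});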
I sort of solved my problem by delaying the call to my layer's constructor until the map loads completely and the "onUpdateEnd" event fires. This is probably how it should properly be done, so I post this as an answer and not as an edit of my question. On the other hand, I have no control over other calls of my class, and I would prefer to have another line of defense against incomplete inputs, or at least a way to tell whether I should complain about incomplete data or not, so I keep the answer unaccepted and the question open for more answers.
This didn't work when I reloaded the page, but then I figured out how to properly chain event listeners together, so I can now combine "onUpdateEnd" with extent change or any other event. That's perfectly enough for my needs.
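For reference, a hedged sketch of what waiting for "onUpdateEnd" might look like with legacy Dojo event connections; map and MyCustomLayer are placeholder names and this is not the asker's actual code.

// Sketch only: construct the custom layer after the map fires onUpdateEnd, i.e.
// once all layers have finished updating.
var handle = dojo.connect(map, "onUpdateEnd", function () {
    dojo.disconnect(handle);            // run once, then detach
    map.addLayer(new MyCustomLayer());  // all data is loaded at this point
});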
I have the following setup, and I'm curious if this is the correct way to do it. It works correctly, but I'm just making sure that I'm doing it right, or if there is a better way to accomplish the same task.
//custom ajax wrapper
var pageLoadPromise = ajax({
url: //call to webmethod
});
//this is the way I have been doing it
pageLoadPromise.done(cB1)
    .done(cB2)
    .done(cB3)
    .done(cB4)
    .done(function(){cB5(args);});
//this function requires that cB1 has been completed

//I tried this and it worked as well
pageLoadPromise.done(cB1, cB2, cB3, cB4)
    .done(function(){cB5(args);});
Doing it both ways works, but like I said, I am wondering: is this the correct way to accomplish this?
UPDATE:
I have made a small adjustment to my code, specifically for cB1 and the callback to cB5
pageLoadPromise.done(
    function(data){
        cB1(data).done(function(){
            cB5(args);
        });
    }, cB2, cB3, cB4
);

function cB1(data){
    var cB1Promise = $.Deferred();
    ...
    cB1Promise.resolve();
    return cB1Promise;
}
As pointed out by #Bergi, regardless of how you add the callbacks, they are all run in the order they are attached using done. So, promise.done(cb1, cb2, cb3, cb4).done(cb5) is the same as promise.done(cb1).done(cb2).done(cb3).done(cb4).done(cb5).
To make sure cb5 runs after cb1 use:
promise.done( function(data) {cb1(data).done(cb5);}, cb2, cb3, cb4);
Remove data if you don't need it.
I played around with the scenarios in http://jsbin.com/moqiko/4/edit?js,console,output
Doing it both ways works
Yes, they are pretty much equivalent (except for the .done(function(){cB5}); which doesn't work).
I am wondering if this is the correct way to accomplish this?
Use the one you like better. This is more a design question than one of "correctness". However, both ways look quite odd in my eyes, and I've seen lots of promise code. I would recommend two different structures, depending on how your app is structured:
You use the pageLoadPromise as a global cache for your initial data. It is then consumed in very different places, possibly at different times, for multiple different things (or maybe even repeatedly for the same thing). Then use pageLoadPromise repeatedly in each module:
var pageLoadPromise = ajax({url: …}); // initialisation
pageLoadPromise.done(cB1); // somewhere
…
pageLoadPromise.done(cB2); // somewhere else
…
pageLoadPromise.done(cB3); // other place or time
…
You use the pageLoadPromise in one place only, and want to basically do one thing when it's loaded, except that it is structured in multiple subtasks; and each subtask needs only a part of the data, not the whole structure. Then use a single callback only:
ajax({url: …}).then(function(data) {
cb1(data.d.cb1data);
cb2(data.d.cb2data);
cb3(data.d.cb3data);
cb4(data.d.cb4data);
cb5(data.d.cb5data, some_additional_data);
});
I have made a small adjustment to my code, specifically for cB1 and the callback to cB5
You should not make cb1 return a promise when it doesn't do anything asynchronous. Don't modify it. If you want to express explicitly that cb5 needs to be executed with the result of cb1, then you should use .then for chaining:
var pageLoadPromise = ajax({url: …}); // initialisation
var cB1promise = pageLoadPromise.then(cB1);
cB1promise.done(cb5); // does get called with the return value of cB1
or
ajax({url: …}).then(function(data) {
var res1 = cb1(data.d.cb1data);
…
cb5(data.d.cb5data, some_additional_data, res1);
});
Update. Thanks to #Bergi, who pointed out that jQuery's done() in fact returns the same promise. I've updated the answer based on that.
If cB2,cB3,cB4 are not interconnected and all of them process the same data from the ajax call, then you can add them to the same promise (pageloadPromise).
With the above assumption in mind, your second version of the code can be simplified so that no new promise needs to be created in cB1(), and without adding an extra level of indentation:
pageLoadPromise.then(cB1).done(cB5);
pageLoadPromise.done(cB2, cB3, cB4);

function cB1(data){
    // ...
    // data2 would be the argument value passed when resolving
    // your original cB1Promise
    return data2;
}
What happens here is that the .then() call creates a new promise that gets resolved with whatever data cB1 returns, allowing cB5 to receive that data without creating an extra callback and without involving another promise (as we already have one in hand).
However if cB1 needs another ajax then your original implementation of cB1 would be more appropriate (the callback scheduling remains the same though).
And one final note: I didn't notice any failure handlers for the case where the ajax call fails.
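If cB1 does need its own ajax call, a hedged sketch of how that could be chained together with a failure handler; the '/details' URL is a placeholder, and ajax is assumed to be the custom wrapper from the question returning a jQuery promise.

// Sketch: returning a promise from .then() makes the chain wait for it before
// running cB5; .fail() catches errors from either the page load or the follow-up call.
pageLoadPromise
    .then(function (data) {
        return ajax({ url: '/details' }); // placeholder URL; the chain waits on this
    })
    .done(cB5)
    .fail(function (err) {
        console.error('page load or follow-up call failed', err);
    });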
I'm using deferred as I need to execute several processes asynchronously.
To be clearer, here is what my treatments do:
Treatment1 : call of an ajax service providing user rights
Treatment2 : call of an ajax service providing links and labels.
I need to call these 2 services at the same time and then get the combined response of both services in order to display links depending on rights (my real problem involves a 3rd ajax service, but let's stick with 2 to simplify).
First, I declare the deferreds as global vars:
var treatment1 = $.Deferred();
var treatment2 = $.Deferred();
Then, when I need to do the job, I call the resolve method with the data needed by the single global treatment:
when my 1st ajax service responds : treatment1.resolve(responseData1)
when my 2nd ajax service responds : treatment2.resolve(responseData2)
When treatments 1 & 2 are finished, the done callback fires:
$.when(treatment1, treatment2).done(function(responseData1, responseData2) {
    DoGlobalTreatmentWithAllResponseData(responseData1, responseData2);
});
My problem is that deferred works only once.
As my website is mostly ajax-driven, I need to fire the event multiple times.
The user can click a button to search for users. Then a list of users is displayed and the ajax services are all called asynchronously. This operation can be repeated infinitely.
I just need a way to reuse the principle of deferred but multiple times. I know that this problem has already been discussed and everyone says deferred can't work this way.
But, is it really not possible to reset the deferred state or reset the promises (even by implementing a custom solution, using AOP or something else)?
If it's impossible, what solution could I use? I don't want to fire the treatments one after another; I really want to do a global treatment after all the treatments are finished (that is to say, after the last active treatment is finished), and I want to use the responseData of each service.
Here is my sample code that I would like to customize : http://jsfiddle.net/PLce6/14/
I hope to be clear as English is not my native language.
Thank you in advance for your help.
Deferreds can be resolved/rejected only once... However, I think the issue is how you're structuring your code...
As long as you're initializing your deferred each time, there isn't any problem in doing this...
I think the issue is this:
First, i declare the deferred as global var:
var treatment1 =$.Deferred();
var treatment2 = $.Deferred();
Instead, can you try doing this in a function that's invoked on the button click?
The user can click a button to search for users
so have a function like so:
function onClick() {
    var treatment1 = $.ajax({url: '/call1'});
    var treatment2 = $.ajax({url: '/call2'});
    $.when(treatment1, treatment2).done(function(obj1, obj2) {
        // do whatever else you need
    });
}
Now from the rest of your post, it looks like you're trying to reuse the deferreds - but in that case, your original solution should not have a problem with keeping the deferreds global, since your done will be called with whatever data they were resolved with.
Can you post some more of your code to help explain what you're trying to do?
Updated from my own comment below for elaboration
Based on the OP's fiddle, he wants to be able to trigger the dependent action multiple times. The solution is to have the dependent action create new deferreds and hook up a $.when to itself. See the updated fiddle at http://jsfiddle.net/PLce6/15/
// global
var d1 = $.Deferred();
var d2 = $.Deferred();
var d3 = $.Deferred();

// here's the reset
function resetDeferreds() {
    d1 = $.Deferred();
    d2 = $.Deferred();
    d3 = $.Deferred();
    $.when(d1, d2, d3).done(
        function (responseData1, responseData2, responseData3) {
            DoGlobalTreatmentWithAllResponseData(responseData1, responseData2, responseData3);
            resetDeferreds();
        });
}

// the onclick handlers
function do3() {
    d3.resolve('do3 ');
    return d3;
}

// the top level $.when
$.when(d1, d2, d3).done(function (responseData1, responseData2, responseData3) {
    DoGlobalTreatmentWithAllResponseData(responseData1, responseData2, responseData3);
    resetDeferreds();
});
Perhaps your code is not well designed?
I do not see how that would be an issue. The asynchronous process should be responsible for creating a new Deferred object every time.
function doSomething() {
    var d = $.Deferred();
    setTimeout(function () {
        d.resolve();
    }, 1000);
    return d;
}

function doSomethingElse() {
    var d = $.Deferred();
    setTimeout(function () {
        d.resolve();
    }, 1000);
    return d;
}
Then you can always do the following:
$.when(doSomething(), doSomethingElse()).done(function () {
console.log('done');
});
There's always a solution:
If you absolutely need to be able to call resolve multiple times on the same Deferred, then you should wrap the Deferred into another object, let's say DeferredWrapper, which would expose the same API as a Deferred but would delegate all method calls to its encapsulated Deferred.
In addition of delegating the function calls, the DeferredWrapper would have to keep track of all listening operations (e.g. done, always, fail...) that were made on the object. The DeferredWrapper could store all actions as [functionName, arguments] tuples in an internal this._actions property.
Finally, you would need to provide a special implementation for state-changing operations (e.g. reject, resolve, resolveWith, etc.) that would work like this:
1. Let d be the internal Deferred referenced by this._deferred.
2. Let fn be the name of the function being called.
3. If d.state() is not "pending":
   3.1 Set d = this._deferred = a fresh jQuery Deferred.
   3.2 Re-apply all recorded actions on d.
4. Return the result of d[fn].apply(d, arguments).
Note: You would also need to implement a custom promise implementation and make sure it behaves correctly. You can probably use an approach similar to the one described above.
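A hedged sketch of the DeferredWrapper idea described above; the names are illustrative and this is not an existing jQuery API.

// Sketch: record listener registrations and replay them onto a fresh $.Deferred
// whenever a state-changing call arrives after the current one has already settled.
function DeferredWrapper() {
    this._deferred = $.Deferred();
    this._actions = []; // recorded [functionName, arguments] tuples
}

// Listener registrations are recorded, then delegated to the current Deferred.
["done", "fail", "always", "progress"].forEach(function (name) {
    DeferredWrapper.prototype[name] = function () {
        this._actions.push([name, arguments]);
        this._deferred[name].apply(this._deferred, arguments);
        return this;
    };
});

// State-changing calls get a fresh Deferred when the current one has settled;
// the recorded listeners are re-attached to it before delegating the call.
["resolve", "reject", "notify"].forEach(function (name) {
    DeferredWrapper.prototype[name] = function () {
        if (this._deferred.state() !== "pending") {
            var d = this._deferred = $.Deferred();
            this._actions.forEach(function (action) {
                d[action[0]].apply(d, action[1]);
            });
        }
        return this._deferred[name].apply(this._deferred, arguments);
    };
});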
I'm going to suggest a small change. One thing you weren't clear on is whether or not the treatment1 and treatment2 results are different each time. If they are, then do what #raghu and #juan-garcia suggest:
function onClick() {
    var treatment1 = $.ajax({url: '/call1'});
    var treatment2 = $.ajax({url: '/call2'});
    $.when(treatment1, treatment2).done(function(obj1, obj2) {
        // do whatever else you need
    });
}
If they don't change then do this :
var treatment1 = $.ajax({url: '/call1'});
var treatment2 = $.ajax({url: '/call2'});

function onClick() {
    $.when(treatment1, treatment2).done(function(obj1, obj2) {
        // do whatever else you need
    });
}
Or some variation of that. Once they are complete, your callback function will always execute right away. It's still asynchronous, but it doesn't need to wait since everything is ready to go. This serves both use cases. It is a very common pattern for data that may take a few seconds to load before it's functionally useful when drawing a new component on the page: a lazy-load mechanism that's very useful. Once the data is in, though, everything looks as if it's responding instantaneously.
I reworked the JavaScript in your example on JSFiddle to show just the basics I think you needed to see. That is here. Given your example, I think the mistake is in believing that resolve must be called multiple times to trigger a behavior. Each invocation of done() queues one more behavior to run when the promise resolves; resolve is called only once. You call $.when().done() as many times as you have behaviors that depend on that specific when() condition.
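A small sketch of that point, assuming jQuery: handlers attached with done() after resolution still fire immediately with the cached values.

// Each deferred is resolved exactly once; done() can be attached any number of times,
// before or after resolution, and each handler receives the cached resolution values.
var treatment1 = $.Deferred().resolve('rights data');
var treatment2 = $.Deferred().resolve('links data');

$.when(treatment1, treatment2).done(function (r1, r2) { console.log('handler 1:', r1, r2); });
$.when(treatment1, treatment2).done(function (r1, r2) { console.log('handler 2:', r1, r2); });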
I need to perform several functions in my JavaScript/jQuery, but I want to avoid blocking the UI.
AJAX alone is not a viable solution: because of the nature of the application, those functions will easily reach the thousands, and firing them all asynchronously at once would kill the browser.
So, I need some way of chaining the functions the browser needs to process, and only send the next function after the first has finished.
The algorithm is something like this
For steps from 2 to 15
HTTP:GET the number of items for the current step (ranging from a couple of hundred to multiple thousands)
For every item, HTTP:GET the results
As you see, I have two GET-request "chains" I somehow need to manage... The innermost loop in particular crashes the browser almost instantly if it's done asynchronously - but I'd still like the user to be able to operate the page, so a purely (blocking) synchronous approach will not work.
You can easily do this asynchronously without firing all requests at once. All you need to do is manage a queue. The following is pseudo-code for clarity. It's easily translatable to real AJAX requests:
// Basic structure of the request queue. It's a list of objects
// that defines ajax requests:
var request_queue = [{
    url : "some/path",
    callback : function_to_process_the_data
}];
// This function implements the main loop.
// It looks recursive but is not because each function
// call happens in an event handler:
function process_request_queue () {
    // If we have anything in the queue, do an ajax call.
    // Otherwise do nothing and let the loop end.
    if (request_queue.length) {
        // Get one request from the queue. We can either
        // shift or pop depending on whether you prefer
        // depth first or breadth first processing:
        var req = request_queue.pop();
        ajax(req.url, function(result){
            req.callback(result);
            // At the end of the ajax request, process
            // the queue again:
            process_request_queue();
        });
    }
}
// Now get the ball rolling:
process_request_queue();
So basically we turn the ajax call itself into a pseudo loop. It's basically the classic continuation passing style of programming done recursively.
In your case, an example of a request would be:
request_queue.push({
    url : "path/to/OUTER/request",
    callback : function (result) {
        // You mentioned that the result of the OUTER request
        // should trigger another round of INNER requests.
        // To do this simply add the INNER requests to the queue:
        request_queue.push({
            url : result.inner_url,
            callback : function_to_handle_inner_request
        });
    }
});
This is quite flexible because you not only have the option of processing requests either breadth first or depth first (shift vs pop). But you can also use splice to add stuff to the middle of the queue or use unshift vs push to put requests at the head of the queue for high priority requests.
You can also increase the number of simultaneous requests by popping more than one request per loop. Just be sure to call process_request_queue only once per callback to avoid exponential growth of simultaneous requests:
// Handling two simultaneous request channels:
function process_request_queue () {
    if (request_queue.length) {
        var req1 = request_queue.pop();
        ajax(req1.url, function(result){
            req1.callback(result);
            // Best to loop on the first request.
            // The second request below may never fire
            // if the queue runs out of requests.
            process_request_queue();
        });
    }
    if (request_queue.length) {
        // Use a separate variable so the first callback above keeps
        // referring to its own request object:
        var req2 = request_queue.pop();
        ajax(req2.url, function(result){
            req2.callback(result);
            // DON'T CALL process_request_queue here.
            // We only want to call it once per "loop",
            // otherwise the "loop" would grow into a "tree".
        });
    }
}
You could make that ASYNC and use a small library I wrote some time ago that will let you queue function calls.
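For illustration, a minimal sketch of what such a function-call queue could look like; this is not the library mentioned above, just an assumed shape, and ajax stands in for your request function.

// Illustrative sketch only: run queued functions one at a time; each function
// receives a `next` callback it must call when its (possibly asynchronous) work is done.
function FunctionQueue() {
    this._queue = [];
    this._running = false;
}

FunctionQueue.prototype.add = function (fn) {
    this._queue.push(fn);
    if (!this._running) {
        this._run();
    }
};

FunctionQueue.prototype._run = function () {
    var self = this;
    var fn = this._queue.shift();
    if (!fn) {
        this._running = false;
        return;
    }
    this._running = true;
    fn(function next() {
        self._run();
    });
};

// Usage: each queued function calls next() when finished, so the calls run in order.
var queue = new FunctionQueue();
queue.add(function (next) { ajax("step/1", function () { next(); }); });
queue.add(function (next) { ajax("step/2", function () { next(); }); });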