Deferring Execution Till Needed - javascript

I know that you use jQuery deferreds in the following scenario:
I have an asynchronous request, and I'd like to execute one or more sections of code whenever this request comes back.
Here's my issue: I don't want the async request to actually be sent out until it hits the first .when statement.
For instance:
I have an init function that loads up say 50 different deferred objects for various data requests. I obviously don't want all of these to fire at once, just when needed. The data will be loaded (the deferred object resolved) on future .when statements.
Sort of complicated but thanks for reading this! :D
Here's what we have
var init = function () {
    var data1 = someDeferredRequest();
    //blah, lots of these
};
var doSomethingElse = function () {
    //I only want the request behind data1 to be started the first time I try to use it, like the instance below. I do NOT want it fired in init()
    $.when(data1).then();
    //other stuff, still want data1 to be completed by here
    $.when(data1).then();
};

Give the objects a method named fire() or run(), whatever suits you, that actually issues the request and does whatever you want with the data.
The object will then sit idle until you call object.run().
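A minimal sketch of that idea, assuming jQuery and a hypothetical URL per request: the ajax call is only issued on the first run(), and the cached promise is reused afterwards.
function lazyRequest(url) {
    var promise = null; // nothing requested yet
    return {
        run: function () {
            // issue the ajax call only on the first run(); reuse the promise afterwards
            if (!promise) {
                promise = $.ajax({ url: url, dataType: 'json' });
            }
            return promise;
        }
    };
}

// init() only builds the lazy wrappers, no network traffic yet
var data1 = lazyRequest('/api/data1');

// later, the first consumer triggers the actual request
$.when(data1.run()).then(function (result) { /* use result */ });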


Hold on callback function execution until promise is completed

I have a third party library whose events I'm listening to. I get a chance to modify data which that library is going to append to the UI. It is all fine as long as that data modification is synchronous. As soon as I involve Ajax callbacks/promises, this fails to work. Here is an example to showcase the problem.
Below is how I'm listening to the event:
d.on('gotResults', function (data) {
    // If I alter data directly it works fine.
    data.title = 'newTitle';
    // The code above alters the text correctly.

    // I want some properties to be grabbed from elsewhere, so I make an Ajax call.
    $.ajax('http://someurl...', {
        data: { id: data.id },
        success: function (res) {
            data.someProperty = res.thatProperty;
        }
    });
    // The code above doesn't wait for the ajax call to complete; the library just goes
    // ahead and renders the page without the data change.

    // Yes, I tried promises, but it doesn't help:
    return fetch('http://someurl...').then(function (res) {
        data.someProperty = res.thatProperty;
        return true;
    });
    // The code above also triggers the url and moves on; it doesn't wait for then() to complete.
});
I cannot change/alter the third party library. All I can do is listen to the event and alter that data.
Any better solutions? I can't use async/await or generators, because I need this to work in ES5 browsers.
You cannot make a synchronous function wait for an asynchronous response; it's simply not possible by definition. Your options pretty much are:
BAD IDEA: Make a synchronous AJAX request. Again: BAD IDEA. Not only will this block the entire browser, it is also a deprecated practice and should not be used in new code, or indeed ever.
Fetch the asynchronous data first and store it locally, so it's available synchronously when needed. That obviously only works if you have an idea what data you'll be needing ahead of time.
Alter the 3rd party library to add support for asynchronous callbacks, or request that of the vendor.
Find some hackaround where you'll probably let the library work with incomplete data first and then update it when the asynchronous data is available. That obviously depends a lot on the specifics of that library and the task being done.
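A minimal sketch of the second option, assuming you know the relevant ids ahead of time and can hit a hypothetical /properties endpoint: fetch the data up front, cache it, and read it synchronously inside the event handler.
var propertyCache = {}; // filled before the library fires its event

// prefetch: run this early, before the library renders (knownIds is a hypothetical list)
$.getJSON('/properties', { ids: knownIds.join(',') }).done(function (res) {
    propertyCache = res; // e.g. { "42": { thatProperty: "..." }, ... }
});

// later, inside the synchronous event handler, the lookup is synchronous
d.on('gotResults', function (data) {
    var cached = propertyCache[data.id];
    if (cached) {
        data.someProperty = cached.thatProperty;
    }
    return true;
});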
Does the gotResults callback function really need to return anything other than true? If not, then you could just write regular asynchronous code without this library knowing about it. Let me explain by rewriting your pseudocode:
d.on('gotResults', function (data) {
    // Altering data synchronously still works fine.
    data.title = 'newTitle';

    // Grab the extra properties asynchronously; the library doesn't need to know about it.
    $.ajax('http://someurl...', {
        data: { id: data.id }
    }).then(function (res) {
        data.someProperty = res.thatProperty;
        // maybe render again here?
    }).catch(function (err) {
        handleError(err); // handle errors so they don't disappear silently
    });

    return true; // this line runs before any of the asynchronous code above, but do we care?
});

How to run a function when all the data loaded?

I process thousands of points asynchronously in the ArcGIS JS API. In the main function, I call functions that process individual features, but I need to finalize the processing once all the features are processed. There should be an event for this, though I didn't find any and I'm afraid it doesn't even exist - it would be hard to tell that the last item processed was the last of all. .ajaxStop() should do this, but I don't use jQuery, just Dojo. The closest thing I found in Dojo was Fetch and its OnComplete, but as far as I know that's about fetching data via AJAX, not from another JS function.
The only workaround idea I have now is to count how many features are to be processed and then fire when the output points array reaches the desired length, but I need to know that count first. How do I get it at load time? Tracking the data back to the point where they are read from the server would mean modifying functions I'm not even supposed to know about, which is not possible.
EDIT - some of my code:
addData: function (data) {
dojo.addOnLoad(
this.allData = data,
this._myFunction()
);
},
Some comments:
data is an array of graphics
when I view data in debugger, its count is 2000, then 3000, then 4000...
without dojo.addOnLoad, the count started near zero, now it's around 2000, but still a fraction of the real number
_myFunction() processes all the 2000...3000...4000... graphics in this._allData, and returns wrong results because it needs them all to work correctly
I need to delay execution of _myFunction() until all data load, perhaps by some other event instead of dojo.addOnLoad.
Workarounds I have already thought of:
a) setTimeout()
This is clearly a wrong option - any magic number of milliseconds to wait would fail me if the data contains too many items, and it would add a delay even when the array holds a single point.
b) length-based delay
I could replace the event with something like this:
if (data.length == allDataCount) {
    this._myFunction();
} else {
    // not all data yet, check again after a delay
    setTimeout(this._thisFunction, someDelay);
}
or some other implementation of the same, through a loop or a counter incremented in the asynchronously called functions. The problem is how to make sure the allDataCount variable is definitive and not just the number of features loaded so far.
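A minimal sketch of the counter idea, assuming each asynchronous processing step can report back through a callback (all names here are hypothetical placeholders): finalize only when the number of completions reaches the known total.
function makeCompletionTracker(totalExpected, onAllDone) {
    var completed = 0;
    var results = [];
    return function onFeatureProcessed(result) {
        results.push(result);
        completed += 1;
        if (completed === totalExpected) {
            onAllDone(results); // every feature has reported back
        }
    };
}

// usage: the total must be known up front, which is exactly the hard part described above
var track = makeCompletionTracker(features.length, finalizeProcessing);
features.forEach(function (feature) {
    processFeatureAsync(feature, track);
});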
EDIT2: the pointers to deferreds and promises from #tik27 definitely helped me, but the best thing I found on converting synchronous code to a deferred was this simple example. I probably misunderstood something, because it doesn't work any better than the original synchronous code; this.allData still can't be guaranteed to hold all the data. The loading function now looks like this:
addData: function (data) {
var deferred = new Deferred();
this._addDataSync(data, function (error, result) {
if (error) {
deferred.reject(error);
}
else {
deferred.resolve(result);
}
});
deferred.promise.then(this._myFunction());
},
_addDataSync: function (data, callback) {
callback(this.allData = data);
},
I know most use cases of deferred suppose requesting data from some server. But this is the first time where I can work with data without breaking functions I shouldn't change, so tracking the data back to the request is not an option.
addOnLoad is for waiting on the DOM.
If you are waiting for one function to complete before running another, deferreds/promises are what you use.
I would need more info on your program to give you more specific answers.
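A minimal sketch of that approach, assuming Dojo's dojo/Deferred and dojo/promise/all modules; processFeature and finalize are hypothetical placeholders for the per-feature work and the final step.
require(["dojo/Deferred", "dojo/promise/all"], function (Deferred, all) {
    // wrap the asynchronous per-feature processing in a promise
    function processFeatureAsync(feature) {
        var deferred = new Deferred();
        processFeature(feature, function (err, result) {
            if (err) {
                deferred.reject(err);
            } else {
                deferred.resolve(result);
            }
        });
        return deferred.promise;
    }

    // all() resolves once every per-feature promise has resolved
    all(data.map(processFeatureAsync)).then(function (results) {
        finalize(results); // safe to do the final processing here
    });
});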
I sort of solved my problem by delaying the call to my layer's constructor until the map loads completely and the "onUpdateEnd" event fires. This is probably how it should properly be done, so I post this as an answer and not as an edit of my question. On the other hand, I have no control over other calls of my class and I would prefer to have another line of defense against incomplete inputs, or at least a way to tell whether I should complain about incomplete data or not, so I keep the answer unaccepted and the question open for more answers.
This didn't work when I reloaded the page, but then I figured out how to properly chain event listeners together, so I can now combine "onUpdateEnd" with the extent change or any other event. That's perfectly enough for my needs.
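A minimal sketch of that event chaining, assuming the ArcGIS JS API 3.x on()-style event names ("load" on the map, "update-end" on the layer); the exact names depend on the API version, and buildMyLayer is a hypothetical placeholder.
require(["dojo/on"], function (on) {
    // wait for the map to finish loading, then for the layer to finish updating
    on.once(map, "load", function () {
        on.once(featureLayer, "update-end", function () {
            // all graphics are in; safe to construct the custom layer now
            buildMyLayer(featureLayer.graphics);
        });
    });
});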

Chaining multiple callbacks to a single jquery promise

I have the following setup, and I'm curious if this is the correct way to do it. It works correctly, but I'm just making sure that I'm doing it right, or if there is a better way to accomplish the same task.
//custom ajax wrapper
var pageLoadPromise = ajax({
    url: '...' // call to web method
});
//this is the way I have been doing it
pageLoadPromise.done(cB1)
.done(cB2)
.done(cB3)
.done(cB4)
.done(function(){cB5(args);});
//this function requires that cB1 has been completed
//I tried this and it worked as well
pageLoadPromise.done(cB1, cB2, cB3, cB4)
    .done(function(){ cB5(args); });
Doing it both ways works, but like I said, I am wondering whether this is the correct way to accomplish this.
UPDATE:
I have made a small adjustment to my code, specifically for cB1 and the callback to cB5
pageloadPromise.done(
function(data){
cB1(data).done(function(){
cB5(args);
});
},cB2,cB3,cB4
);
function cB1(data){
var cB1Promise = $.Deferred();
...
cB1Promise.resolve();
return cB1Promise;
}
As pointed out by #Bergi, regardless of how you add the callbacks, they are all run in the order they are attached using done. So, promise.done(cb1, cb2, cb3, cb4).done(cb5) is the same as promise.done(cb1).done(cb2).done(cb3).done(cb4).done(cb5).
To make sure cb5 runs after cb1 use:
promise.done( function(data) {cb1(data).done(cb5);}, cb2, cb3, cb4);
Remove data if you don't need it.
I played around with the scenarios in http://jsbin.com/moqiko/4/edit?js,console,output
Doing it both ways works
Yes, they are pretty much equivalent (except for the .done(function(){cB5}); which doesn't work).
I am wondering if this is the correct way to accomplish this?
Use the one you like better. This is more a design question than one of "correctness". However, both ways look quite odd in my eyes, and I've seen lots of promise code. I would recommend two different structures, depending on how your app is structured:
You use the pageLoadPromise as a global cache for your initial data. It is then consumed in very different places, possibly at different times, for multiple different things (or maybe even repeatedly for the same thing). Then use pageLoadPromise repeatedly in each module:
var pageLoadPromise = ajax({url: …}); // initialisation
pageLoadPromise.done(cB1); // somewhere
…
pageLoadPromise.done(cB2); // somewhere else
…
pageLoadPromise.done(cB3); // other place or time
…
You use the pageLoadPromise in one place only, and basically want to do one thing when it's loaded, except that it is structured in multiple subtasks, each of which needs only a part of the structure rather than the whole thing. Then use a single callback only:
ajax({url: …}).then(function(data) {
cb1(data.d.cb1data);
cb2(data.d.cb2data);
cb3(data.d.cb3data);
cb4(data.d.cb4data);
cb5(data.d.cb5data, some_additional_data);
});
I have made a small adjustment to my code, specifically for cB1 and the callback to cB5
You should not make cb1 return a promise when it doesn't do anything asynchronous. Don't modify it. If you want to express explicitly that cb5 needs to be executed with the result of cb1, then you should use .then for chaining:
var pageLoadPromise = ajax({url: …}); // initialisation
var cB1promise = pageLoadPromise.then(cB1);
cB1promise.done(cb5); // does get called with the return value of cB1
or
ajax({url: …}).then(function(data) {
var res1 = cb1(data.d.cb1data);
…
cb5(data.d.cb5data, some_additional_data, res1);
});
Update. Thanks to #Bergi, who pointed out that jQuery's done() in fact returns the same promise. I've updated the answer based on that.
If cB2,cB3,cB4 are not interconnected and all of them process the same data from the ajax call, then you can add them to the same promise (pageloadPromise).
With the above assumption in mind, your second version of the code can be simplified without creating a new promise in cB1() and without adding an extra level of indentation:
pageloadPromise.then(cB1).done(cB5);
pageloadPromise.done(cB2, cB3, cB4);
function cB1(data){
// ...
//data2 would be the argument value passed when resolving
// your original cB1Promise
return data2;
}
What happens here is that the .then() call creates a new promise that gets resolved with whatever data cB1 returns, allowing cB5 to receive that data without creating an extra callback and without involving another promise (as we already have one in hand).
However if cB1 needs another ajax then your original implementation of cB1 would be more appropriate (the callback scheduling remains the same though).
And one final note: I didn't notice any failure handlers for the case where the ajax call fails.
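A minimal sketch of adding one, assuming jQuery's fail(): attach it once to the shared promise so a failed request doesn't go unnoticed.
pageloadPromise.fail(function (jqXHR, textStatus, errorThrown) {
    // runs once if the underlying ajax call fails
    console.error('page load failed:', textStatus, errorThrown);
});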

Reuse Deferred more than once

I'm using deferreds as I need to execute several processes asynchronously.
To be clearer, here is what my treatments do:
Treatment1: call to an ajax service providing user rights
Treatment2: call to an ajax service providing links and labels.
I need to call these 2 services at the same time and then get the combined response of both in order to display links depending on rights (my real problem involves a 3rd ajax service, but let's stick to 2 to keep it simple).
First, I declare the deferreds as global vars:
var treatment1 = $.Deferred();
var treatment2 = $.Deferred();
Then, when I need to do the job, I call the resolve method with the data needed for the global treatment:
when my 1st ajax service responds: treatment1.resolve(responseData1)
when my 2nd ajax service responds: treatment2.resolve(responseData2)
When treatments 1 & 2 are both finished, the done callback fires:
$.when(treatment1, treatment2).done(function(responseData1, responseData2) {
    DoGlobalTreatmentWithAllResponseData(responseData1, responseData2);
});
My problem is that a deferred works only once.
As my website is mostly ajax-driven, I need to fire the event multiple times.
The user can click a button to search for users. Then a list of users is displayed and the ajax services are all called asynchronously. This operation can be repeated infinitely.
I just need a way to reuse the principle of deferred but multiple times. I know that this problem has already been discussed and everyone says deferred can't work this way.
But, is it really not possible to reset the deferred state or reset the promises (even by implementing a custom solution, using AOP or something else)?
If it's impossible, what solution could I use? I don't want to fire the treatments one after another; I really want to run a global treatment after all the treatments are finished (that is to say, after the last active treatment finishes), and I want to use the responseData of each service.
Here is my sample code that I would like to customize : http://jsfiddle.net/PLce6/14/
I hope I'm being clear, as English is not my native language.
Thank you in advance for your help.
Deferreds can be resolved/rejected only once... However, I think the issue is how you're structuring your code...
As long as you're initializing your deferred each time, there isn't any problem in doing this...
I think the issue is this:
First, I declare the deferreds as global vars:
var treatment1 = $.Deferred();
var treatment2 = $.Deferred();
Instead, can you try doing this in a function that's invoked on the button click?
The user can click a button to search for users
so have a function like so:
function onClick() {
var treatment1 = $.ajax({url: '/call1'});
var treatment2 = $.ajax({url: '/call2'});
$.when(treatment1, treatment2).done(function(obj1, obj2) {
// do whatever else you need
});
}
Now, from the rest of your post, it looks like you're trying to reuse the deferreds - but in that case, your original solution should not have a problem with keeping the deferreds global, since your done will be called with whatever data they were resolved with.
Can you post some more of your code to help explain what you're trying to do?
Updated from my own comment below for elaboration
Based on the OP's fiddle, he wants to be able to trigger the dependent action multiple times. The solution is to have the dependent action create new deferreds and hook up a $.when to itself. See the updated fiddle at http://jsfiddle.net/PLce6/15/
// global
var d1 = $.Deferred();
var d2 = $.Deferred();
var d3 = $.Deferred();
// here's the reset
function resetDeferreds() {
    d1 = $.Deferred();
    d2 = $.Deferred();
    d3 = $.Deferred();
    $.when(d1, d2, d3).done(
        function (responseData1, responseData2, responseData3) {
            DoGlobalTreatmentWithAllResponseData(responseData1, responseData2, responseData3);
            resetDeferreds();
        });
}
// the onclick handlers
function do3() {
d3.resolve('do3 ');
return d3;
}
// the top level $.when
$.when(d1, d2, d3).done(function (responseData1, responseData2, responseData3) {
DoGlobalTreatmentWithAllResponseData(responseData1, responseData2, responseData3);
resetDeferreds();
});
Perhaps your code is not well designed?
I do not see how that would be an issue. The asynchronous process should be responsible for creating a new Deferred object every time.
function doSomething() {
var d = $.Deferred();
setTimeout(function () {
d.resolve();
}, 1000);
return d;
}
function doSomethingElse() {
var d = $.Deferred();
setTimeout(function () {
d.resolve();
}, 1000);
return d;
}
Then you can always do the following:
$.when(doSomething(), doSomethingElse()).done(function () {
console.log('done');
});
There's always a solution:
If you absolutely need to be able to call resolve multiple times on the same Deferred, then you should wrap the Deferred into another object, say DeferredWrapper, which would expose the same API as a Deferred but delegate all method calls to its encapsulated Deferred.
In addition to delegating the function calls, the DeferredWrapper would have to keep track of all listening operations (e.g. done, always, fail...) that were made on the object. The DeferredWrapper could store these actions as [functionName, arguments] tuples in an internal this._actions property.
Finally, you would need to provide a special implementation for state-changing operations (e.g. reject, resolve, resolveWith... etc.) that would look like:
1. Let d be the internal Deferred referenced by this._deferred.
2. Let fn be the name of the function being called.
3. If d.state() is not pending:
   3.1. Set d = this._deferred = [[new native jQuery Deferred]].
   3.2. Re-apply all recorded actions on d.
4. Return the result of d[fn].apply(d, arguments).
Note: You would also need to provide a custom promise implementation and make sure it behaves correctly. You could probably use a similar approach to the one described above.
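A rough sketch of such a wrapper, under the assumptions above; the DeferredWrapper name and the limited set of delegated methods are illustrative, not a standard jQuery API.
function DeferredWrapper() {
    this._deferred = $.Deferred();
    this._actions = []; // recorded [functionName, arguments] tuples
}

// listening operations: record them, then delegate to the current Deferred
['done', 'fail', 'always'].forEach(function (name) {
    DeferredWrapper.prototype[name] = function () {
        this._actions.push([name, arguments]);
        this._deferred[name].apply(this._deferred, arguments);
        return this;
    };
});

// state-changing operations: swap in a fresh Deferred if the old one is already settled
['resolve', 'reject'].forEach(function (name) {
    DeferredWrapper.prototype[name] = function () {
        if (this._deferred.state() !== 'pending') {
            var fresh = $.Deferred();
            this._actions.forEach(function (action) {
                // replay the recorded listeners on the fresh Deferred
                fresh[action[0]].apply(fresh, action[1]);
            });
            this._deferred = fresh;
        }
        return this._deferred[name].apply(this._deferred, arguments);
    };
});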
I'm going to suggest a small change. One element you weren't clear on is whether or not the treatment1 and treatment2 results are different each time. If they are, then do what #raghu and #juan-garcia suggested:
function onClick() {
var treatment1 = $.ajax({url: '/call1'});
var treatment2 = $.ajax({url: '/call2'});
$.when(treatment1, treatment2).done(function(obj1, obj2) {
// do whatever else you need
});
}
If they don't change then do this :
var treatment1 = $.ajax({url: '/call1'});
var treatment2 = $.ajax({url: '/call2'});
function onClick() {
$.when(treatment1, treatment2).done(function(obj1, obj2) {
// do whatever else you need
});
}
Or some variation of that. Once they are complete, your callback function will always execute right away. It's still asynchronous, but it doesn't need to wait, since everything is ready to go. This serves both use cases. It's a very common pattern for data that takes a few seconds to load before it's functionally useful when drawing a new component on the page - a lazy-load mechanism that's very useful. Once the data is in, though, everything appears to respond instantaneously.
I reworked the javascript in your example on JSFiddle to show just the basics I think you needed to see. That is here. Given your example, I think the mistake is in believing that resolve must be called multiple times to trigger a behavior. Each invocation of done() queues one behavior to run when the when() condition is met. resolve() is called one time; $.when().done() you call as many times as you have behaviors that depend on that specific when() condition.
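A minimal sketch of that point with two plain Deferreds: resolve() runs once, but every done() handler attached to the same $.when() promise fires, including handlers attached after resolution.
var d1 = $.Deferred();
var d2 = $.Deferred();
var ready = $.when(d1, d2);

ready.done(function (a, b) { console.log('first handler', a, b); });
ready.done(function (a, b) { console.log('second handler', a, b); });

d1.resolve('one');
d2.resolve('two'); // both handlers fire now

// a handler attached later still fires immediately, with the same values
ready.done(function (a, b) { console.log('late handler', a, b); });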

javascript outside of scope

Based on Chrome developer tools and breakpoints, I think I'm dealing with a scope issue I can't figure out. Is it the way I define the function? The script below is an included js file, and I want the timeStamp array available for use in other functions without having to call my loadData function every time.
The timeStamp array shows as undefined once execution leaves the for loop, before it even leaves the function.
var timeStamp = []; // Want this array to be global
function loadData(url) {
    $.getJSON(url, function (json) {
        for (var i = 0; i < json.length; i++) {
            timeStamp.push(json[i].TimeStamp);
        }
        console.log(timeStamp); // logs the values
    });
    console.log(timeStamp); // empty here - the ajax call hasn't completed yet
}
Thank you for any help.
It looks like the issue is that getJSON is asynchronous. The call only STARTS the networking operation to retrieve the data and then returns, while your code continues on; the actual networking operation does not complete until some time later.
When it does complete, the success handler is called (as specified as the second argument to your getJSON() call) and you populate the timeStamp array. ONLY after that success handler has been called is the timeStamp array valid.
As such, you cannot use the timeStamp array in code that immediately follows the getJSON() call (it hasn't been filled in yet). If other code needs the timeStamp array, you should call that code from the success handler or use some other timing mechanism to make sure that the code that uses the timeStamp array doesn't try to use it until AFTER the success handler has been called and the timeStamp array has been populated.
It is possible to make some Ajax calls be synchronous instead of asynchronous, but that is generally a very bad idea because it locks up the browser during the entire networking operation which is very unfriendly to the viewer. It is much better to fix the coding logic to work with asynchronous networking.
A typical design pattern for an ajax call like this is as follows:
function loadData (url){
$.getJSON(url, function(json) {
// this will execute AFTER the ajax networking finishes
var timeStamp = [];
for (var i=0;i<json.length;i++) {
timeStamp.push(json[i].TimeStamp);
}
console.log(timeStamp);
// now call other functions that need timeStamp data
myOtherFunc(timeStamp);
});
// this will execute when the ajax networking has just been started
//
// timeStamp data is NOT valid here because
// the ajax call has not yet completed
// You can only use the ajax data inside the success handler function
// or in any functions that you call from there
}
And here's another person who doesn't understand basic AJAX...
getJSON is asynchronous. Meaning, code keeps running after the function call and before the successful return of the JSON request.
You can "fix" this by forcing the request to be synchronous with an appropriate flag, but that's a really bad idea for many reasons (the least of which is that you're violating the basic idea of AJAX). The best way is to remember how AJAX works and instead put all your code that should be executed when the AJAX returns, in the right place.
