A JS control calls a data service and continues rendering itself without waiting for the result. Sometimes the service returns after the control is fully rendered, sometimes before. How do you implement WaitForAll in JS? I'm using jQuery.
Here's what I've done myself (Utils.WaitForAll simply counts the number of hits; once it matches the count, it calls handle):
// before we started
var waiter = Utils.WaitFor({handle: function(e){ alert("got called"); }, count: 2});
the way it gets triggered:
// place one
waiter.Notify({one: {...}});
and then
// place two (can occur before one though)
waiter.Notify({two: {...}});
which triggers handle; handle has the values tagged as one & two in its e. Waiter is an extra 'global' var travelling down the stack, which I didn't quite like, and it's another new object after all... Any obvious problems with my approach?
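For reference, the counting helper described above can be sketched framework-free; the names WaitFor, Notify, handle and count simply mirror the question's description and are not any real library's API:

```javascript
// Hypothetical minimal version of the counting waiter described above:
// it merges each tagged partial result and fires `handle` once `count`
// notifications have arrived, in whatever order they come.
function WaitFor(options) {
  var remaining = options.count;
  var results = {};
  return {
    Notify: function (partial) {
      // merge the tagged values (e.g. {one: ...}) into the accumulated result
      for (var key in partial) {
        if (Object.prototype.hasOwnProperty.call(partial, key)) {
          results[key] = partial[key];
        }
      }
      remaining -= 1;
      if (remaining === 0) {
        options.handle(results);
      }
    }
  };
}
```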
You should take a look at the promise interface of CommonJS (implemented by jQuery.Deferred); it provides a progress callback which can be used in this case.
sample code:
var waiter = $.Deferred();
var len = 2;
waiter.done(function() {
    alert("Hooray!!!");
});
waiter.progress(function() {
    if (--len === 0) {
        waiter.resolve();
    }
});
// somewhere
$.ajax({
    ...
    data: somedata,
    success: function() {
        waiter.notify();
    }
});
// somewhere else
$.ajax({
    ...
    data: someotherdata,
    success: function() {
        waiter.notify();
    }
});
More about deferred:
jQuery Deferred API
Learn how to use Deferred here
How to use deferred objects in jQuery (from OP's answer to the same question)
I've found exactly what I need: jQuery Deferred. See the article:
http://richardneililagan.com/2011/05/using-deferred-objects-in-jquery-1-5/
Related
I have the following code:
$("#submit_financials").live('click', function(event){
    event.preventDefault();
    // using serialize here to pass the POST variables to the django view function
    var serialized_data = $("#financials_filter_form").serialize();
    $.post("/ajax/custom_filter/", serialized_data, function(response){
        // create a graph
    });
    $.post("/ajax/force_download/", serialized_data, function(response){
        alert('hello');
    });
});
However, when I run this code, I get the 'hello' alert before the graph. Why is this happening? And how would I change it so that I get the graph first?
Async: you can never know which function will finish first...
Think of async operations like telling a group of people to run a mile; do you know who will finish first? (Yes, Jon Skeet, then Chuck Norris...)
You can use a callback to run the second ajax call:
$.post("/ajax/custom_filter/", serialized_data, function(response) {
    // create a graph
    ...
    $.post("/ajax/force_download/", serialized_data, function(response) {
        alert('hello');
    });
});
You can try using deferred objects if you want to generate the graph before the alert but want both calls to be completed.
$.when(
    $.post("/ajax/custom_filter/", serialized_data),
    $.post("/ajax/force_download/", serialized_data)
).done(function(a1, a2){
    /* a1 and a2 are the arguments resolved for the
       custom_filter and force_download post requests */
    var customFilterResponse = a1[0];
    /* each argument is [ data, statusText, jqXHR ] */
    // generate graph.
    alert('hello');
});
Option 1 is to nest the post requests (gdoron's response). If this is not possible or practical, you can use a mutually scoped variable as a flag: change its value in the response callbacks, then use setTimeout and recursion (or setInterval) to keep watching for the change, and when you see it, fire the next $.post request.
---EDITED--- Due to my ignorance, this is actually the same as all the other AJAX-type questions out there... I need to get into the right mindset. Leaving it here for posterity's sake; maybe it will help others take a second look at callbacks before posting.
I would like to say up front that I think this is not the standard "how do I return a value from an ajax call" issue where people aren't waiting for the async call to finish. I think this is a variable-scope misunderstanding with JavaScript module patterns, so any guidance would be appreciated.
I am following this SO post on constructing my ajax call, so I am using deferred objects to crunch my data after the call finishes, and also several tutorials on the JavaScript module pattern, like this and this. It seems fairly straightforward to return values from a private module inside my outer module; however, myObj.roots() is always undefined, even though it is defined as an array of X values when I check with breakpoints. What simple thing am I missing? Any hints? Thanks! Sorry for the simple question; I'm entirely new to JS module patterns and trying to build my own library...
My JS code:
var myObj = (function(window,document,$,undefined){
    var _baseUri = 'http://example.com/',
        _serviceUri = 'services/',
        _bankId = '1234',
        _myUri = _baseUri + _serviceUri + 'objectivebanks/' + _bankId,
        _roots = [];

    function myAjax(requestURL) {
        return $.ajax({
            type: 'GET',
            url: requestURL,
            dataType: 'json',
            async: true
        });
    }

    var getRoots = function() {
        var _url = _myUri + '/objectives/roots';
        _roots = [];
        myAjax(_url).done(function(data) {
            $.each(data, function(index, nugget) {
                _roots.push(nugget);
            });
            return _roots;
        }).fail(function(xhr, textStatus, thrownError) {
            console.log(thrownError.message);
        });
    }

    return {
        roots: getRoots
    };
})(this,document,jQuery);
My error (from Chrome's developer tools' console):
myObj.roots()
undefined
Your "getRoots" function does not return anything; using the $.ajax success callback or the $.ajax().done() pattern is the same thing. You are not deferring anything.
There is no way you can do this without callbacks, events or promises.
Callbacks and events are basically the same, only the latter allow better architectural decoupling (highly debatable fact).
Promises mean that you can write var x = getRoots() and x will be undefined until the browser gets a response back from the server. Your application has to account for this: either you start coding with the async pattern in mind (callbacks, events), or you design applications that handle null/undefined values gracefully.
Using callbacks:
function getStuff(callback) {
    $.ajax(...).done(function(data) {
        // maybe process data?
        callback(data);
    });
}

getStuff(function(data) {
    // this is where I can use data
});
This way you can write your getStuff methods in a separate module, say "DataService", so the MVC logic is not polluted.
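To illustrate the deferred idea itself without jQuery, here is a minimal and deliberately naive sketch of a deferred object whose done() callbacks fire once resolve() is called, whether they were registered before or after resolution. This is not jQuery's implementation, just the core mechanism:

```javascript
// Tiny illustrative "deferred": done() either queues the callback or,
// if the value has already arrived, invokes it immediately.
function Deferred() {
  var callbacks = [];
  var resolved = false;
  var value;
  return {
    done: function (cb) {
      if (resolved) { cb(value); } else { callbacks.push(cb); }
      return this;
    },
    resolve: function (v) {
      resolved = true;
      value = v;
      // flush everything registered before resolution
      callbacks.forEach(function (cb) { cb(v); });
      callbacks = [];
    }
  };
}
```

The key point for the question above: the caller never reads a return value; it registers a callback against the deferred and the data is delivered whenever it exists.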
Just from seeing what I've written now, I can see that one is much smaller, so in terms of code golf Option 2 is the better bet; but as far as which is cleaner, I prefer Option 1. I would really love the community's input on this.
Option 1
something_async({
    success: function(data) {
        console.log(data);
    },
    error: function(error) {
        console.log(error);
    }
});
Option 2
something_async(function(error, data){
    if (error) {
        console.log(error);
    } else {
        console.log(data);
    }
});
They are not exactly the same. Option 2 will still log the (possibly undefined) data, whereas Option 1 will only log data on success. (Edit: at least it was that way before you changed the code.)
That said, Option 1 is more readable. Programming is not (and should not be) a competition to see who can write the fewest lines that do the most things. The goal should always be to create maintainable, extendable (if necessary) code, in my humble opinion.
Many people will find option 1 easier to read and to maintain: two different callback functions for two different purposes. It is commonly used by all promise libraries, where two arguments are passed. Of course, the question of multiple arguments vs. an options object is independent from that (while the options object is useful in jQuery.ajax, it doesn't make sense for promise.then).
However, option 2 is the Node.js convention (see also the NodeGuide) and is used in many libraries influenced by it, for example the famous async.js. This convention is debatable, though; the top Google results I found are WekeRoad: NodeJS Callback Conventions and Stackoverflow: What is the suggested callback style for Node.js libraries?.
The reason for the single callback function with an error argument is that it always reminds the developer to handle errors, which is especially important in server-side applications. Many beginners at client-side ajax functions forget about error handling, for example, asking themselves why the success callback doesn't get invoked. On the other hand, promises with then-chaining are based on the optionality of error callbacks, propagating them to the next level - of course the error still needs to be caught there.
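As a tiny illustration of the error-first convention discussed above (the function and its behavior are invented for the example, not taken from any library):

```javascript
// Error-first (Node-style) callback: the error occupies the first
// argument slot, so the caller has to at least see it before the data.
function parseNumber(text, callback) {
  var n = Number(text);
  if (isNaN(n)) {
    callback(new Error("not a number: " + text)); // error in first position
  } else {
    callback(null, n); // null error signals success
  }
}

parseNumber("42", function (err, n) {
  if (err) { return console.error(err.message); } // forced to consider errors
  console.log(n);
});
```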
In all honesty, I prefer to take them one step further, into Promises/Futures/Deferreds/etc...
Or (/and) go with a "custom event" queue, using a moderator (or an observer/pub-sub, if there is good reason for one particular object to be the source of data).
This isn't a 100%-of-the-time thing. Sometimes you just need a single callback. However, if you have multiple views which need to react to a change (in model data, or to visualize user interaction), then a single callback with a bunch of hard-coded results isn't appropriate.
moderator.listen("my-model:timeline_update", myView.update);
moderator.listen("ui:data_request", myModel.request);
button.onclick = function () { moderator.notify("ui:data_request", button.value); }
Things are now much less dependent upon one big callback and you can mix and match and reuse code.
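The moderator object used in the snippets above is never defined; a minimal pub-sub hub along those lines might look like this (purely illustrative, not Backbone's or any other library's actual API):

```javascript
// Minimal pub-sub "moderator": listen() registers callbacks under a
// message name, notify() invokes every callback registered for it.
function Moderator() {
  var channels = {};
  return {
    listen: function (msg, callback) {
      (channels[msg] = channels[msg] || []).push(callback);
    },
    notify: function (msg, data) {
      (channels[msg] || []).forEach(function (cb) { cb(data); });
    }
  };
}
```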
If you want to hide the moderator, you can make it a part of your objects:
var A = function () {
        var sys = null,
            notify = function (msg, data) {
                if (sys && sys.notify) { sys.notify(msg, data); }
            },
            listen = function (msg, callback) {
                if (sys && sys.listen) { sys.listen(msg, callback); }
            },
            attach = function (messenger) { sys = messenger; };
        return {
            attach : attach
            /* ... */
        };
    },
    B = function () { /* ... */ },
    shell = Moderator(),
    a = A(),
    b = B();

a.attach(shell);
b.attach(shell);

a.listen("do something", a.method.bind(a));
b.notify("do something", b.property);
If this looks a little familiar, it's similar behaviour to, say, Backbone.js (except that they extend() the behaviour onto objects, and others will bind, where my example has simplified wrappers to show what's going on).
Promises would be the other big win for usable, maintainable and easy-to-read code (as long as people know what a "promise" is: basically, it passes around an object which holds the callback subscriptions).
// using jQuery's "Deferred"
var ImageLoader = function () {
    var cache = {},
        public_function = function (url) {
            if (cache[url]) { return cache[url].promise(); }
            var img = new Image(),
                loading = $.Deferred(),
                promise = loading.promise();
            img.onload = function () { loading.resolve(img); };
            img.onerror = function () { loading.reject("error"); };
            img.src = url;
            cache[url] = loading;
            return promise;
        };
    return public_function;
};
// returns promises
var loadImage = ImageLoader(),
    myImg = loadImage("//site.com/img.jpg");

myImg.done( lightbox.showImg );
myImg.done( function (img) { console.log(img.width); } );
Or
var blog_comments = [ /* ... */ ],
    comments = BlogComments();

blog_comments.forEach(function (comment) {
    var el = makeComment(comment.author, comment.text),
        img = loadImage(comment.img);
    img.done(el.showAvatar);
    comments.add(el);
});
All of the cruft there is to show how powerful promises can be.
Look at the .forEach call there.
I'm using image loading instead of AJAX because it might seem a little more obvious in this case:
I can load hundreds of blog comments; if the same user makes multiple posts, the image is cached, and if not, I don't have to wait for images to load or write nested callbacks. Images load in any order, but still appear in the right spots.
This is 100% applicable to AJAX calls, as well.
Promises have proven to be the way to go as far as async goes, and libraries like bluebird embrace node-style callbacks (using the (err, value) signature), so it seems beneficial to utilize node-style callbacks.
But the examples in the question can easily be converted into either format with the functions below (untested):
function mapToNodeStyleCallback(callback) {
    return {
        success: function(data) {
            return callback(null, data);
        },
        error: function(error) {
            return callback(error);
        }
    };
}

function alterNodeStyleCallback(propertyFuncs) {
    return function () {
        var args = Array.prototype.slice.call(arguments);
        var err = args.shift();
        if (err) return propertyFuncs.error.apply(null, [err]);
        return propertyFuncs.success.apply(null, args);
    };
}
Apart from making synchronous AJAX calls (if you can and think it is appropriate), what is the best way to handle something like this?
var A = getDataFromServerWithAJAXCall(whatever);
var B = getDataFromServerWithAJAXCallThatDependsOnPreviousData(A);
var C = getMoreDataFromServerWithAJAXCall(whatever2);
processAllDataAndShowResult(A,B,C);
Provided that I can pass callbacks to those functions, I know I can use closures and lambdas to get the job done like this:
var A, B, C;

getDataFromServerWithAJAXCall(whatever, function(AJAXResult) {
    A = AJAXResult;
    getDataFromServerWithAJAXCallThatDependsOnPreviousData(A, function(AJAXResult2) {
        B = AJAXResult2;
        processAllDataAndShowResult(A, B, C);
    });
});

getMoreDataFromServerWithAJAXCall(whatever2, function(AJAXResult) {
    C = AJAXResult;
    processAllDataAndShowResult(A, B, C);
});

function processAllDataAndShowResult(A, B, C) {
    if (A && B && C) {
        //Do stuff
    }
}
But it doesn't feel right or clean enough to me. So is there a better, or at least cleaner, way to do the same thing, or is it just that I'm not used to JavaScript functional programming?
By the way, I'm using jQuery (1.4.2) if that helps.
Thank you.
Yes, jQuery's Deferred Object is super handy.
Here's the example from the $.when() function documentation, illustrating a solution to your problem:
$.when($.ajax("/page1.php"), $.ajax("/page2.php")).done(function(a1, a2){
    /* a1 and a2 are arguments resolved for the
       page1 and page2 ajax requests, respectively */
    var jqXHR = a1[2]; /* each argument is [ data, statusText, jqXHR ] */
    if ( /Whip It/.test(jqXHR.responseText) ) {
        alert("First page has 'Whip It' somewhere.");
    }
});
Cheers!
Make the callback function of each AJAX call check/store its result in a common local container. And have another processing function that reads from this container, maybe at regular intervals or triggered by each callback. This way you keep your functions clean and focused on the Ajax call. It also keeps the accumulation scalable to n Ajax calls, and you don't have to modify existing code when adding a new call.
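A sketch of that shared-container idea (all names here are made up for the example): each AJAX success callback stores its result under a key, and a completion check runs after every store rather than on a timer:

```javascript
// Collects results under named keys; fires onComplete once every
// expected key has arrived, regardless of arrival order.
function ResultCollector(expectedKeys, onComplete) {
  var results = {};
  return function store(key, value) {
    results[key] = value;
    var ready = expectedKeys.every(function (k) {
      return Object.prototype.hasOwnProperty.call(results, k);
    });
    if (ready) { onComplete(results); }
  };
}
```

Each AJAX callback then just calls something like store('graph', response); adding an nth call only means adding one more key to the expected list.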
If you can use jQuery 1.5, you should be able to accomplish this using the deferred object and $.when():
$.when(getDataFromServerWithAJAXCall("Call 1"), getMoreDataFromServerWithAJAXCall("Call 2")).done(function(a1, a2) {
    var jqXHR = a1[2];
    getDataFromServerWithAJAXCallThatDependsOnPreviousData(jqXHR.responseText);
});
Simply put, when the first two functions complete, it will execute the third function.
Example on jsfiddle
Use a so-called 'countdown latch':
Each of the functions has its own callback.
Have a variable called countdownlatch that is incremented each time a function is called and decremented when each of the callbacks is reached (be sure to count down on async error as well).
Each of the callbacks separately checks whether countdownlatch === 0; if so, it calls processAllDataAndShowResult.
The beauty of JavaScript for this kind of async synchronization is that implementing a countdown latch is super easy: because JavaScript is single-threaded, there is no way countdownlatch could end up with funky numbers due to race conditions, since these are non-existent (in this situation).
EDIT
Didn't see that B depended on A, but the same principle applies:
var A, B, C;
var cdlatch = 2;

getDataFromServerWithAJAXCall(whatever, function(AJAXResult) {
    A = AJAXResult;
    getDataFromServerWithAJAXCallThatDependsOnPreviousData(A, function(AJAXResult2) {
        B = AJAXResult2;
        if (--cdlatch === 0) {
            processAllDataAndShowResult(A, B, C);
        }
    });
});

getMoreDataFromServerWithAJAXCall(whatever2, function(AJAXResult) {
    C = AJAXResult;
    if (--cdlatch === 0) {
        processAllDataAndShowResult(A, B, C);
    }
});

function processAllDataAndShowResult(A, B, C) {
    //Do stuff
}
I must admit it's not as clear as the general case I described earlier, oh well.
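For reuse, the inline countdown above could be factored into a tiny helper; this is just a sketch of the pattern described in this answer, not an existing library:

```javascript
// Minimal countdown latch: `countDown` is called from each async
// callback, and `done` fires once the count reaches zero. JS being
// single-threaded means the decrement needs no locking.
function CountdownLatch(count, done) {
  var remaining = count;
  return {
    countDown: function () {
      remaining -= 1;
      if (remaining === 0) { done(); }
    }
  };
}

// usage sketch: var latch = CountdownLatch(2, processAllDataAndShowResult);
// ...then call latch.countDown() at the end of each AJAX callback.
```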
I have an application that uses Ajax.Request and its onSuccess event handler in lots of places.
I need to call a function (that will check the response) before all these onSuccess events fire. I tried using Ajax.Responders.register with the onComplete event, but it fires after Ajax.Request's onSuccess event. Any suggestions?
Similar to Aleksander Krzywinski's answer, but I believe this would prevent you from having to sprinkle the use of "wrap" everywhere, by consolidating it in the onCreate responder:
Ajax.Responders.register({
    onCreate: function(request) {
        request.options['onSuccess'] = request.options['onSuccess'].wrap(validateResponse);
    }
});
There are several events to choose from. Here is the event chain for Ajax.Request:
onCreate
onUninitialized
onLoading
onLoaded
onInteractive
onXYZ, onSuccess or onFailure
onComplete
onLoading, onLoaded and onInteractive sound interesting, but according to the spec they are not guaranteed to happen. That leaves you with the possibility of hooking on to onCreate, which is called just after the request object is built, but before the request is actually made.
This might be a little late, but for the benefit of anyone else wondering about the same problem, I will propose this solution:
You can use Prototype's own implementation of aspect-oriented programming to do this. Granted, you will have to modify all your onSuccess parameters, but it can be done with a simple search-and-replace instead of updating all your callback functions. Here is an example Ajax.Request creation:
new Ajax.Request('example.html', {
    parameters: { action: 'update' },
    onSuccess: this.updateSuccessful
});
Say you have similar code snippets spread all over your code, and you want to precede them all with a certain function that validates the response before the actual function runs (or even prevents it from running at all). By using Function.wrap supplied by Prototype, we can do this by extending the code above:
new Ajax.Request('example.html', {
    parameters: { action: 'update' },
    onSuccess: this.updateSuccessful.wrap(validateResponse)
});
Where 'validateResponse' is a function similar to this:
// If you use the X-JSON header of the response for JSON, add the third param
function validateResponse(originalFn, transport /*, json */) {
    // Validate the transport
    if (someConditionMet) {
        originalFn(transport /*, json */);
    }
}
Thus you have extended your onSuccess functions in one central place with just a quick search for onSuccess and pasting in '.wrap(validateResponse)'. This also gives you the option of having several wrapper functions, depending on the needs of the particular Ajax request.
You can run your method before the other code in onSuccess and return false if something is wrong.
Don't know if this is the cleanest solution, but for me it did the trick.
var tmp = Ajax.Request;
Ajax.Request = function(url, args) {
    // stuff to do before the request is sent
    var c = Object.clone(args);
    c.onSuccess = function(transport){
        // stuff to do when request is successful, before the callback
        args.onSuccess(transport);
    };
    var a = new tmp(url, c);
    return a;
};
Ajax.Request.prototype = new tmp();
Ajax.Request.Events = tmp.Events;
delete tmp;
A "general solution", independent of the JS framework (kind of):
var oldFunc = Ajax.Request.onSuccess;
Ajax.Request.onSuccess = function foo() {
    alert('t');
    oldFunc.apply(this, arguments);
};
This will "extend" your JS function, making it do exactly what it used to do, except it will show an alert box every time before it executes.
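The same before-advice idea can be written framework-free. This generic sketch (illustrative only, not Prototype's Function.wrap) wraps any function so that a given hook runs first, preserving `this`, the arguments, and the return value:

```javascript
// Returns a new function that runs `before` and then the original,
// forwarding `this`, all arguments, and the original's return value.
function wrapBefore(original, before) {
  return function () {
    before.apply(this, arguments);
    return original.apply(this, arguments);
  };
}
```

For instance, any callback could be replaced in place: handler = wrapBefore(handler, logCall);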