I need to visit each node in a tree, do some asynchronous work, and then find out when all of the asynchronous work has completed. Here are the steps.
1. Visit a node and modify its children asynchronously.
2. When async modifications to the children are done, visit all children (which might require async work).
3. When all asynchronous work for all descendants is done, do something else.
Update:
I ended up using a pattern that looks like a monitor/lock (but isn't) for each node to know when to begin step 2. I used events and attributes to keep track of all descendants of a node to know when to begin step 3.
It works, but man is this difficult to read! Is there a cleaner pattern?
function step1(el) { // recursive
var allDone = false;
var monitor = new Monitor();
var lock = monitor.lock(); // obtain a lock
$(el).attr("step1", ""); // step1 in progress for this node
// fires each time a descendant node finishes step 1
$(el).on("step1done", function (event) {
if (allDone) return;
var step1Descendants = $(el).find("[step1]");
if (step1Descendants.length === 0) {
// step 1 done for all descendants (so step 2 is complete)
step3(el); // not async
allDone = true;
}
});
// fires first time all locks are unlocked
monitor.addEventListener("done", function () {
$(el).removeAttr("step1"); // done with step 1
step2(el); // might have async work
$(el).trigger("step1done");
});
doAsyncWork(el, monitor); // pass monitor to lock/unlock
lock.unlock(); // immediately checks if no other locks outstanding
};
function step2(el) { // visit children
$(el).children().each(function (i, child) {
step1(child);
});
};
Here's an updated version that walks the node tree: it processes each child of the initial root node, then descends recursively into each child's tree and processes its child nodes, and so on.
Here's a jsfiddle demo
// Pass the root node, and the callback to invoke
// when the entire tree has been processed
function processTree(rootNode, callback) {
var i, l, pending;
// If there are no child nodes, just invoke the callback
// and return immediately
if( (pending = rootNode.childNodes.length) === 0 ) {
callback();
return;
}
// Create a function to call, when something completes
function done() {
--pending || callback();
}
// For each child node
for( i = 0, l = rootNode.childNodes.length ; i < l ; i++ ) {
// Wrap the following to avoid the good ol'
// index-closure-loop issue. Pass the function
// a child node
(function (node) {
// Process the child node asynchronously.
// I'm assuming the function takes a callback argument
// it'll invoke when it's done.
processChildNodeAsync(node, function () {
// When the processing is done, descend into
// the child's tree (recurse)
processTree(node, done);
});
}(rootNode.childNodes[i]));
}
}
Original Answer
Here's a basic example you might be able to use... though without the specifics of your problem, it's half pseudo-code
function doAsyncTreeStuff(rootNode, callback) {
var pending = 0;
// Callback to handle completed DOM node processes
// When pending is zero, the callback will be invoked
function done() {
--pending || callback();
}
// Recurse down through the tree, processing each node
function doAsyncThingsToNode(node) {
pending++;
// I'm assuming the async function takes some sort of
// callback it'll invoke when it's finished.
// Here, we pass it the `done` function
asyncFunction(node, done);
// Recursively process child nodes
for( var i = 0 ; i < node.children.length ; i++ ) {
doAsyncThingsToNode(node.children[i]);
}
}
// Start the process
doAsyncThingsToNode(rootNode);
}
It seems the right pattern for this problem and for async work in general is Promises. The idea is that any function that will do asynchronous work should return a promise object, to which the caller can attach functions that should be called when the asynchronous work is completed.
jQuery has a great API for implementing this pattern. It's called a jQuery.Deferred object. Here's a simple example:
function asyncWork() {
var deferred = $.Deferred();
setTimeout(function () {
// pass arguments via the resolve method
deferred.resolve("Done.");
}, 1000);
return deferred.promise();
}
asyncWork().then(function (result) {
console.log(result);
});
Very tidy. What's the difference between a Deferred object and its promise object? Good question.
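As a quick illustration (using the same jQuery API as above), the promise is essentially a read-only view of the Deferred: consumers can attach callbacks but cannot resolve it themselves.
var deferred = $.Deferred();
var promise = deferred.promise();
typeof promise.then;       // "function": consumers can subscribe
typeof promise.resolve;    // "undefined": only the Deferred can resolve
deferred.resolve("Done."); // the producer keeps control of settlement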
Here's how you might apply this pattern to solve this problem.
function step1(el) { // recursive
var deferred = $.Deferred();
// doAsyncWork needs to return a promise
doAsyncWork(el).then(function () {
step2(el).then(function () {
step3(el); // not async
deferred.resolve();
});
});
return deferred.promise();
};
function step2(el) { // visit children
var deferred = $.Deferred();
var childPromises = [];
$(el).children().each(function (i, child) {
childPromises.push(step1(child));
});
// When all child promises are resolved…
$.when.apply(this, childPromises).then(function () {
deferred.resolve();
});
return deferred.promise();
};
So much cleaner. So much easier to read.
This is something you would probably prefer to do with threads so you could continue other work, but since you are using JavaScript you need to work around it with some sort of polling instead. One way is to make an initially empty list of finished tasks, make the asynchronous calls, and have each call register itself on the list when it finishes. While you are waiting for the calls, run a loop with a timer, and at each iteration check whether the finished-tasks list is complete; if so, continue with the other tasks. You may want to give up if your loop runs too long.
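For what it's worth, a rough sketch of that polling approach might look like this (doAsyncTask, onAllDone and onTimeout are placeholder names, not real APIs):
var tasks = [taskA, taskB, taskC]; // whatever async work you need to run
var finished = [];
tasks.forEach(function (task) {
    doAsyncTask(task, function () { // assumed async helper with a completion callback
        finished.push(task);        // each call registers itself when it finishes
    });
});
var started = Date.now();
var timer = setInterval(function () {
    if (finished.length === tasks.length) { // the finished-tasks list is complete
        clearInterval(timer);
        onAllDone();
    } else if (Date.now() - started > 10000) { // give up if the polling runs too long
        clearInterval(timer);
        onTimeout();
    }
}, 50);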
Related
I'm trying to work through this JS/async scenario, and I want to know how the rest of the JS world handles it.
function doStuff(callback) {
    cursor.each(function(err, blahblah) {
        // ...doing stuff here takes some time
    });

    // ...execute this code ONLY after the `cursor.each` loop is finished
    callback();
}
EDIT
Here's a more concrete example, updated using most of the suggestions below, which still doesn't work.
function doStuff(callback) {
MongoClient.connect(constants.mongoUrl, function(err, db) {
var collection = db.collection('cases2');
var cursor = collection.find();
var promises = []; // array for storing promises
cursor.each(function(err, item) {
console.log('inside each'); // NEVER GETS LOGGED UNLESS I COMMENT OUT THIS LINE: return Q.all(promises).then(callback(null, items));
var def = Q.defer(); // Create deferred object and store
promises.push(def.promise); // Its promise in the array
if(item == null) {
return def.resolve();
}
def.resolve(); // resolve the promise
});
console.log('items'); // ALWAYS GETS CALLED
console.log(items);
// IF I COMMENT THIS LINE OUT COMPLETELY,
// THE LOG STATEMENT INSIDE CURSOR.EACH ACTUALLY GETS LOGGED
return Q.all(promises).then(callback(null, items));
});
}
Without using promises or any other dependencies/libraries, you can simply add a counter:
function doStuff(callback) {
    var cursor = new Array(); // init with some array data
    var cursorTasks = cursor.length;

    function cursorTaskComplete() {
        cursorTasks--;
        if (cursorTasks <= 0) {
            // this gets called after the last task reports completion
            callback();
        }
    }

    for (var i = 0; i < cursor.length; i++) {
        // ...do stuff here that takes some time and does some async work;
        // check after each async request: when the async operation is complete, call
        cursorTaskComplete();
    }
}
Without knowing the details of the async calls you're making within the cursor.each loop, I shall assume that you have the ability to invoke a callback each time the functions invoked therein have completed their async task:
function doStuff() {
var promises = []; // array for storing promises
cursor.each(function(err, blahblah) {
var def = Q.defer(); // create deferred object and store
promises.push(def.promise); // its promise in the array
call_async_function(..., def.resolve); // resolve the promise in the async function's callback
});
// pass the array to Q.all, only when all are resolved will "callback" be called
return Q.all(promises);
}
and the usage then becomes:
doStuff().then(callback)
Note how the invocation of the callback now never touches the doStuff function - that function now also returns a promise. You can now register multiple callbacks, failure callbacks, etc, all without modifying doStuff. This is called "separation of concerns".
[NB: all the above based on the Q promises library - https://github.com/kriskowal/q]
EDIT: further discussion and experimentation have determined that the .each call is itself async and gives no indication to the outside when the last row has been seen. I've created a Gist that demonstrates a resolution to this problem.
If you want to do it with the async module, you can make use of its forEachSeries function.
Code snippet:
function doStuff(callback) {
    async.forEachSeries(cursor, function(cursorSingleObj, callbackFromForEach) {
        // ...do stuff which takes time
        // this callback tells forEachSeries the current item is finished,
        // so it can move on to the next one
        callbackFromForEach();
    }, function() {
        // here the forEachSeries run is over, so the main callback is called
        callback();
    });
}
In my mind an elegant/ideal solution would be to have something like
cursor.each(........).then( function() { ....your stuff});
But without that, you can do the following. UPDATED:
http://plnkr.co/edit/27l7t5VLszBIW9eFW4Ip?p=preview
The gist of it is shown below.
var allMyAsyncPromises = [];

var doStuff = function(callback) {
    cursor.forEach(function(cursorStep) {
        var deferred = $q.defer();
        allMyAsyncPromises.push(deferred.promise);
        cursorStep.execFn(cursorStep.stepMeta);
        deferred.resolve(); // resolve once the step's work has been kicked off
    });
    // $q.all (not $q.when) waits for an array of promises
    $q.all(allMyAsyncPromises).then(callback);
}
After hitting the start button, wait a few seconds: the async tasks are simulated to finish in 5 seconds, so the status will update accordingly.
Not having access to a real cursor object, I had to resort to a fake cursor-like array.
I've been developing in JavaScript for quite some time, but I'm not yet a cowboy developer, as one of the many things that always haunts me is syncing JavaScript's callbacks.
I will describe a generic scenario when this concern will be raised: I have a bunch of operations to perform multiple times by a for loop, and each of the operations has a callback. After the for loop, I need to perform another operation but this operation can only execute successfully if all the callbacks from the for loop are done.
Code Example:
for ... in ... {
myFunc1(callback); // callbacks are executed asynchly
}
myFunc2(); // can only execute properly if all the myFunc1 callbacks are done
Suggested Solution:
Initialize a counter at the beginning of the loop holding the length of the loop, and have each callback decrement that counter. When the counter hits 0, execute myFunc2. This essentially lets each callback check whether it is the last one in the sequence and, if it is, call myFunc2 when it's done.
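A minimal sketch of that suggestion (assuming myFunc1 accepts a completion callback; someObject stands in for whatever you are iterating over):
var keys = Object.keys(someObject);
var remaining = keys.length;      // counter holding the length of the loop
keys.forEach(function (key) {
    myFunc1(function () {         // each callback decrements the counter
        if (--remaining === 0) {
            myFunc2();            // the last callback is done, safe to continue
        }
    });
});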
Problems:
A counter is needed for every such sequence in your code, and having meaningless counters everywhere is not a good practice.
If you recall how thread conflicts arise in classical synchronization problems: when multiple threads all call var-- on the same variable, undesirable outcomes can occur. Does the same happen in JavaScript?
Ultimate Question:
Is there a better solution?
The good news is that JavaScript is single threaded; this means that solutions will generally work well with "shared" variables, i.e. no mutex locks are required.
If you want to serialize async tasks, followed by a completion callback, you could use this helper function:
function serializeTasks(arr, fn, done)
{
var current = 0;
fn(function iterate() {
if (++current < arr.length) {
fn(iterate, arr[current]);
} else {
done();
}
}, arr[current]);
}
The first argument is the array of values that needs to be passed in each pass, the second argument is a loop callback (explained below) and the last argument is the completion callback function.
This is the loop callback function:
function loopFn(nextTask, value) {
myFunc1(value, nextTask);
}
The first argument that's passed is a function that will execute the next task; it's meant to be passed to your async function. The second argument is the current entry of your array of values.
Let's assume the asynch task looks like this:
function myFunc1(value, callback)
{
console.log(value);
callback();
}
It prints the value and afterwards it invokes the callback; simple.
Then, to set the whole thing in motion:
serializeTasks([1,2, 3], loopFn, function() {
console.log('done');
});
Demo
To parallelize them, you need a different function:
function parallelizeTasks(arr, fn, done)
{
var total = arr.length,
doneTask = function() {
if (--total === 0) {
done();
}
};
arr.forEach(function(value) {
fn(doneTask, value);
});
}
And your loop function will be this (only parameter name changes):
function loopFn(doneTask, value) {
myFunc1(value, doneTask);
}
Demo
The second problem is not really a problem as long as every one of those is in a separate function and the variable is declared correctly (with var); local variables in functions do not interfere with each other.
The first problem is a bit more of a problem. Other people have gotten annoyed, too, and ended up making libraries to wrap that sort of pattern for you. I like async. With it, your code might look like this:
async.each(someArray, myFunc1, myFunc2);
It offers a lot of other asynchronous building blocks, too. I'd recommend taking a look at it if you're doing lots of asynchronous stuff.
You can achieve this by using a jQuery deferred object.
var deferred = $.Deferred();
var success = function () {
// resolve the deferred with your object as the data
deferred.resolve({
result: ... // your result data
});
};
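The snippet above resolves a single Deferred; to wait for all of the loop's callbacks you would typically collect one Deferred per call and hand them to $.when. A sketch, assuming myFunc1 is changed to accept a completion callback and someObject is whatever you iterate over:
var deferreds = [];
for (var key in someObject) {               // the question's for ... in loop
    (function () {
        var d = $.Deferred();
        deferreds.push(d);
        myFunc1(function (result) {         // assumed completion callback
            d.resolve({ result: result });
        });
    })();
}
$.when.apply($, deferreds).then(function () {
    myFunc2();                              // runs only after every deferred has resolved
});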
With this helper function:
function afterAll(callback,what) {
what.counter = (what.counter || 0) + 1;
return function() {
callback();
if(--what.counter == 0)
what();
};
}
your loop will look like this:
function whenAllDone() { ... }
for (... in ...) {
myFunc1(afterAll(callback,whenAllDone));
}
Here afterAll creates a proxy function for the callback; it also decrements the counter and calls the whenAllDone function when all callbacks are complete.
A single thread is not always guaranteed; don't take it the wrong way.
Case 1:
For example, if we have 2 functions as follows.
var count = 0;
function function1() {
    alert("this thread will be suspended, count:" + count);
}
function function2() {
    // anything
    count++;
    dump(count + "\n");
}
Then, before function1 returns, function2 will also be called; if a single thread were guaranteed, function2 would not be called before function1 returns. You can try this, and you will find that count keeps going up while you are being alerted.
Case 2: in Firefox chrome code, before one function returns (no alert inside), another function can also be called.
So a mutex lock is indeed needed.
There are many, many ways to achieve this, I hope these suggestions help!
First, I would transform the callback into a promise! Here is one way to do that:
function aPromise(arg) {
return new Promise((resolve, reject) => {
aCallback(arg, (err, result) => {
if(err) reject(err);
else resolve(result);
});
})
}
Next, use reduce to process the elements of an array one by one!
const arrayOfArg = ["one", "two", "three"];
const promise = arrayOfArg.reduce(
(promise, arg) => promise.then(() => aPromise(arg)), // after the previous promise, return the result of the aPromise function as the next promise
Promise.resolve(null) // initial resolved promise
);
promise.then(() => {
// carry on
});
If you want to process all elements of an array at the same time, use map and Promise.all!
const arrayOfArg = ["one", "two", "three"];
const promise = Promise.all(arrayOfArg.map(
arg => aPromise(arg)
));
promise.then(() => {
// carry on
});
If you are able to use async / await (inside an async function), then you could simply do this:
const arrayOfArg = ["one", "two", "three"];
for(let arg of arrayOfArg) {
await aPromise(arg); // wow
}
// carry on
You might even use my very cool synchronize-async library like this:
const arrayOfArg = ["one", "two", "three"];
const context = {}; // can be any kind of object, this is the threadish context
for(let arg of arrayOfArg) {
synchronizeCall(aPromise, arg); // synchronize the calls in the given context
}
join(context).then(() => { // join will resolve when all calls in the context are finished
// carry on
});
And last but not least, use the fine async library if you really don't want to use promises.
const arrayOfArg = ["one", "two", "three"];
async.each(arrayOfArg, aCallback, err => {
if(err) throw err; // handle the error!
// carry on
});
I have a handler (callback), an object to handle, and four functions that collect data into the object. In my case I wish to asynchronously call four data retrievers and, when execution of all four is complete, handle the resulting object (something similar to the following):
var data = {};
function handle (jsObj) {}
// data retrieving
function getColorData () {}
function getSizeData () {}
function getWeightData () {}
function getExtraData () {}
data.color = getColorData();
data.size = getSizeData();
data.weight = getWeightData();
data.extra = getExtraData();
handle( data );
Of course, this code will not work properly. And if I chain the data-retrieving functions, they will be called one after another, right?
All four functions should be called asynchronously, because they take too long to be called one by one.
Updated:
Thanks to everybody for your suggestions! I preferred $.Deferred(), but I found it slightly difficult to make it work the way I need. What I need is to asynchronously build a view, which requires four kinds of data (extraData, colorData, sizeData & weightData), and I have three objects: App, Utils & Tools.
Just a small description: the view is created by calling App.getStuff with App.handleStuff passed as a callback. The callback in the body of App.getStuff is called only after $.when(App.getExtraData(), App.getColorData(), App.getSizeData(), App.getWeightData()) resolves. Before that, Utils.asyncRequest is called with Tools.parseResponse passed as a callback.
So, now the question is: should I create a deferred object inside each App.get*Data() and return deferred.promise() from each of them?
And should I call deferred.resolve() in the last function in my chain (Tools.parseResponse for App.getExtraData in my example)?
var view,
App,
Utils = {},
Tools = {};
// Utils
Utils.asyncRequest = function (path, callback) {
var data,
parseResponse = callback;
// do something with 'data'
parseResponse( data );
};
// Tools
Tools.parseResponse = function (data) {
var output = {};
// do something to make 'output' from 'data'
/* So, should the deferred.resolve() be done here? */
deferred.resolve(output);
/// OR deferred.resolve();
/// OR return output;
};
// App
App = {
// Only one method really works in my example
getExtraData : function () {
var deferred = new jQuery.Deferred();
Utils.asyncRequest("/dir/data.txt", Tools.parseResponse);
return deferred.promise();
},
// Others do nothing
getColorData : function () { /* ... */ },
getSizeData : function () { /* ... */ },
getWeightData : function () { /* ... */ }
};
App.getStuff = function (callback) {
$.when(
App.getExtraData(),
App.getColorData(),
App.getSizeData(),
App.getWeightData()
)
.then(function (extraData, colorData, sizeData, weightData) {
var context,
handleStuff = callback;
// do something to make all kinds of data become a single object
handleStuff( context );
});
};
App.handleStuff = function (stuff) { /* ... */ };
/// RUN
view = App.getStuff( App.handleStuff );
I did not expect the code in my example above to work; it is for illustrative purposes.
I've been trying to solve this for quite a long time and it still gives no result. The documentation for jQuery.Deferred() and the discussions around it, unfortunately, did not help me. So, I would be very glad and grateful for any help or advice.
Conceptually, you would use a counter that gets incremented as each asynchronous call completes. The main caller should proceed after the counter has been incremented by all the asynchronous calls.
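A minimal sketch of that idea, assuming each get*Data function is changed to accept a completion callback (the names below mirror the question):
var data = {};
var completed = 0;
var expected = 4;
function collect(key) {
    return function (value) {
        data[key] = value;
        if (++completed === expected) { // proceed only once all four have reported back
            handle(data);
        }
    };
}
getColorData(collect("color"));
getSizeData(collect("size"));
getWeightData(collect("weight"));
getExtraData(collect("extra"));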
I think what you're looking for are Promises / Deferreds.
With promises you can write something like:
when(getColorData(), getSizeData(), getWeightData(), getExtraData()).then(
function (colorData, sizeData, weightData, extraData) {
handle(/*..*/);
}
)
The get*Data() functions will return a promise that they fulfill when their asynchronous call is complete.
Ex:
function getData() {
    // pseudo-code promise object; with jQuery this would be $.Deferred()
    var promise = new Promise();
    doAjax("getData", { "foo": "bar" }, function (result) {
        promise.resolve(result);
    });
    return promise;
}
The when simply counts the number of arguments; once all of its promises are resolved, it will call then with the results from the promises.
jQuery has an OK implementation: http://api.jquery.com/jQuery.when/
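With jQuery specifically, a sketch might look like this ($.Deferred takes the place of the generic Promise above; doAjax is still an assumed helper):
function getColorData() {
    var deferred = $.Deferred();
    doAjax("getColorData", {}, function (result) {
        deferred.resolve(result);
    });
    return deferred.promise();
}
// same pattern for getSizeData, getWeightData and getExtraData
$.when(getColorData(), getSizeData(), getWeightData(), getExtraData())
    .then(function (colorData, sizeData, weightData, extraData) {
        handle({ color: colorData, size: sizeData, weight: weightData, extra: extraData });
    });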
What I would suggest for this scenario is something like this.
Write a function like this:
var completed = 0;
var checkHandler = function() {
    if (completed == 4) {
        handle(data);
    }
};
where 4 is the number of successful callbacks you must receive.
As soon as each function's callback fires, you increment the "completed" counter and invoke the checkHandler function, and you're done!
For example:
function getColorData() {
$.get('ajax/test.html', function(data) {
completed++;
checkHandler();
});
}
If I have a function that's passed this function:
function(work) {
work(10);
work(20);
work(30);
}
(There can be any number of work calls with any number in them.)
work performs some asynchronous activity; say, for this example, it is just a timeout. I have full control over what work does on completion of this operation (and, in fact, over its definition in general).
What's the best way of determining when all the calls to work are done?
My current method increments a counter when work is called and decrements it when it completes, and fires the all work done event when the counter is 0 (this is checked after every decrement). However, I worry that this could be a race condition of some sort. If that is not the case, do show my why and that would be a great answer.
There are a ton of ways you can write this program, but your simple technique of using a counter will work just fine.
The important thing to remember, the reason this will work, is because Javascript executes in a single thread. This is true of all browsers and node.js AFAIK.
Based on the thoughtful comments below, the solution works because the JS event loop will execute the functions in an order like:
function(work)
work(10)
counter++
Start async function
work(20)
counter++
Start async function
work(30)
counter++
Start async function
-- back out to event loop --
Async function completes
counter--
-- back out to event loop --
Async function completes
counter--
-- back out to event loop --
Async function completes
counter--
Counter is 0, so you fire your work done message
-- back out to event loop --
There's no race condition. There is the added requirement that every request made performs a decrement when it's finished (always! including on HTTP failure, which is easy to forget). But that can be handled in a more encapsulated way by wrapping your calls.
Untested, but this is the gist (I've implemented an object instead of a counter, so theoretically you can extend this to have more granular queries about specific requests):
var ajaxWrapper = (function() {
var id = 0, calls = {};
return {
makeRequest: function() {
$.post.apply($, arguments); // for example
calls[id] = true;
return id++;
},
finishRequest: function(id) {
delete calls[id];
},
isAllDone: function(){
var prop;
for(prop in calls) {
if(calls.hasOwnProperty(prop)) {return false;}
}
return true;
}
};
})();
Usage:
Instead of $.post("url", ... function(){ /*success*/ } ... ); we'll do:
var requestId;
requestId = ajaxWrapper.makeRequest("url", ...
function(){ /*success*/ ajaxWrapper.finishRequest(requestId); } ... );
If you wanted to be even more sophisticated you could add the calls to finishRequest yourself inside the wrapper, so usage would be almost entirely transparent:
ajaxWrapper.makeRequest("url", ... function(){ /*success*/ } ... );
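A sketch of how makeRequest might look in that more transparent variant, assuming the success callback is the third argument as in $.post(url, data, success):
makeRequest: function(url, data, success) {
    var requestId = id++;
    calls[requestId] = true;
    // wrap the caller's success callback so the bookkeeping happens automatically
    $.post(url, data, function() {
        if (success) { success.apply(this, arguments); }
        delete calls[requestId];   // the finishRequest step is now done for the caller
    });
    return requestId;
},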
I have an after utility function.
var after = function _after(count, f) {
var c = 0, results = [];
return function _callback() {
switch (arguments.length) {
case 0: results.push(null); break;
case 1: results.push(arguments[0]); break;
default: results.push(Array.prototype.slice.call(arguments)); break;
}
if (++c === count) {
f.apply(this, results);
}
};
};
The following code would just work, because JavaScript is single threaded.
function doWork(work) {
work(10);
work(20);
work(30);
}
WorkHandler(doWork);
function WorkHandler(cb) {
var counter = 0,
finish;
cb(function _work(item) {
counter++;
// somethingAsync calls `finish` when it's finished
somethingAsync(item, function _cb() {
finish()
});
});
finish = after(counter, function() {
console.log('work finished');
});
};
I guess I should explain.
We pass the function that does work to the workhandler.
The work handler calls it and passes in work.
The function that does work calls work multiple times, incrementing the counter.
Since the function that does work is not asynchronous (very important), we can define the finish function after it has finished.
The asynchronous work that is being done cannot finish (and call the undefined finish function) before the current synchronous block of work (the execution of the entire WorkHandler) has finished.
This means that after the entire WorkHandler has finished (and the variable finish is set), the asynchronous work jobs will start to end and call finish. Only once all of them have called finish will the callback sent to after fire.
I have a Javascript object that requires 2 calls out to an external server to build its contents and do anything meaningful. The object is built such that instantiating an instance of it will automatically make these 2 calls. The 2 calls share a common callback function that operates on the returned data and then calls another method. The problem is that the next method should not be called until both methods return. Here is the code as I have implemented it currently:
foo.bar.Object = function() {
this.currentCallbacks = 0;
this.expectedCallbacks = 2;
this.function1 = function() {
// do stuff
var me = this;
foo.bar.sendRequest(new RequestObject, function(resp) {
me.commonCallback(resp);
});
};
this.function2 = function() {
// do stuff
var me = this;
foo.bar.sendRequest(new RequestObject, function(resp) {
me.commonCallback(resp);
});
};
this.commonCallback = function(resp) {
this.currentCallbacks++;
// do stuff
if (this.currentCallbacks == this.expectedCallbacks) {
// call new method
}
};
this.function1();
this.function2();
}
As you can see, I am forcing the object to continue after both calls have returned using a simple counter to validate they have both returned. This works but seems like a really poor implementation. I have only worked with Javascript for a few weeks now and am wondering if there is a better method for doing the same thing that I have yet to stumble upon.
Thanks for any and all help.
Unless you're willing to serialize the AJAX there is no other way that I can think of to do what you're proposing. That being said, I think what you have is fairly good, but you might want to clean up the structure a bit to not litter the object you're creating with initialization data.
Here is a function that might help you:
function gate(fn, number_of_calls_before_opening) {
return function() {
arguments.callee._call_count = (arguments.callee._call_count || 0) + 1;
if (arguments.callee._call_count >= number_of_calls_before_opening)
fn.apply(null, arguments);
};
}
This function is what's known as a higher-order function - a function that takes functions as arguments. This particular function returns a function that calls the passed function when it has been called number_of_calls_before_opening times. For example:
var f = gate(function(arg) { alert(arg); }, 2);
f('hello');
f('world'); // An alert will popup for this call.
You could make use of this as your callback method:
foo.bar = function() {
var callback = gate(this.method, 2);
sendAjax(new Request(), callback);
sendAjax(new Request(), callback);
}
The second callback, whichever it is, will ensure that method is called. But this leads to another problem: the gate function calls the passed function without any context, meaning this will refer to the global object, not the object that you are constructing. There are several ways to get around this: you can either close over this by aliasing it to me or self, or you can create another higher-order function that does just that.
Here's what the first case would look like:
foo.bar = function() {
var me = this;
var callback = gate(function(a,b,c) { me.method(a,b,c); }, 2);
sendAjax(new Request(), callback);
sendAjax(new Request(), callback);
}
In the latter case, the other higher order function would be something like the following:
function bind_context(context, fn) {
return function() {
return fn.apply(context, arguments);
};
}
This function returns a function that calls the passed function in the passed context. An example of it would be as follows:
var obj = {};
var func = function(name) { this.name = name; };
var method = bind_context(obj, func);
method('Your Name!');
alert(obj.name); // Your Name!
To put it in perspective, your code would look as follows:
foo.bar = function() {
var callback = gate(bind_context(this, this.method), 2);
sendAjax(new Request(), callback);
sendAjax(new Request(), callback);
}
In any case, once you've made these refactorings you will have cleared up the object being constructed of all its members that are only needed for initialization.
I can add that Underscore.js has a nice little helper for this:
Creates a version of the function that will only be run after first
being called count times. Useful for grouping asynchronous responses,
where you want to be sure that all the async calls have finished,
before proceeding.
_.after(count, function)
The code for _after (as-of version 1.5.0):
_.after = function(times, func) {
return function() {
if (--times < 1) {
return func.apply(this, arguments);
}
};
};
The license info (as-of version 1.5.0)
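Applied to the object in the question, usage might look like this (a sketch; it skips the per-response processing that commonCallback did):
var next = _.after(2, function () {
    // the method that must wait for both responses goes here
});
foo.bar.sendRequest(new RequestObject(), next);
foo.bar.sendRequest(new RequestObject(), next);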
There is hardly any other way than to have this counter. Another option would be to use an object {}, add a key for every request, and remove it when the request finishes. This way you would know immediately which requests have returned, but the solution stays essentially the same.
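A sketch of that key-tracking variant, using the question's sendRequest calls (pendingRequests and allDone are names made up for illustration):
var pendingRequests = {};
function start(name) {
    pendingRequests[name] = true;
}
function finish(name) {
    delete pendingRequests[name];
    if (Object.keys(pendingRequests).length === 0) {
        allDone();                        // nothing left pending, call the next method
    }
}
["function1", "function2"].forEach(function (name) {
    start(name);
    foo.bar.sendRequest(new RequestObject(), function (resp) {
        finish(name);                     // you know immediately which request returned
    });
});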
You can change the code a little bit. If, as in your example, you only need to call another function inside of commonCallback (I called it otherFunction), then you don't need commonCallback at all. In order to preserve the context, you already used closures. Instead of
foo.bar.sendRequest(new RequestObject, function(resp) {
me.commonCallback(resp);
});
you could do it this way
foo.bar.sendRequest(new RequestObject, function(resp) {
--me.expectedCallbacks || me.otherFunction(resp);
});
That's some good stuff Mr. Kyle.
To put it a bit simpler, I usually use a Start and a Done function.
-The Start function takes a list of functions that will be executed.
-The Done function gets called by the callbacks of your functions that you passed to the start method.
-Additionally, you can pass a function, or list of functions to the done method that will be executed when the last callback completes.
The declarations look like this.
var PendingRequests = 0;
function Start(Requests) {
PendingRequests = Requests.length;
for (var i = 0; i < Requests.length; i++)
Requests[i]();
};
//Called when async responses complete.
function Done(CompletedEvents) {
PendingRequests--;
if (PendingRequests == 0) {
for (var i = 0; i < CompletedEvents.length; i++)
CompletedEvents[i]();
}
}
Here's a simple example using the google maps api.
//Variables
var originAddress = "*Some address/zip code here*"; //Location A
var formattedAddress; //Formatted address of Location B
var distance; //Distance between A and B
var location; //Location B
//This is the start function above. Passing an array of two functions defined below.
Start(new Array(GetPlaceDetails, GetDistances));
//This function makes a request to get detailed information on a place.
//Then callsback with the **GetPlaceDetailsComplete** function
function GetPlaceDetails() {
var request = {
reference: location.reference //Google maps reference id
};
var PlacesService = new google.maps.places.PlacesService(Map);
PlacesService.getDetails(request, GetPlaceDetailsComplete);
}
function GetPlaceDetailsComplete(place, status) {
if (status == google.maps.places.PlacesServiceStatus.OK) {
formattedAddress = place.formatted_address;
Done(new Array(PrintDetails));
}
}
function GetDistances() {
distService = new google.maps.DistanceMatrixService();
distService.getDistanceMatrix(
{
origins: originAddress,
destinations: [location.geometry.location], //Location contains lat and lng
travelMode: google.maps.TravelMode.DRIVING,
unitSystem: google.maps.UnitSystem.IMPERIAL,
avoidHighways: false,
avoidTolls: false
}, GetDistancesComplete);
}
function GetDistancesComplete(results, status) {
if (status == google.maps.DistanceMatrixStatus.OK) {
distance = results[0].distance.text;
Done(new Array(PrintDetails));
}
}
function PrintDetails() {
alert("Whatever you feel like printing.");
}
So in a nutshell, what we're doing here is
-Passing an array of functions to the Start function
-The Start function calls the functions in the array and sets the number of PendingRequests
-In the callbacks for our pending requests, we call the Done function
-The Done function takes an array of functions
-The Done function decrements the PendingRequests counter
-If there are no more pending requests, we call the functions passed to the Done function
That's a simple but practical example of synchronizing web calls. I tried to use an example of something that's widely used, so I went with the Google Maps API. I hope someone finds this useful.
Another way would be to have a sync point thanks to a timer. It is not beautiful, but it has the advantage of not having to add the call to the next function inside the callback.
Here the function execute_jobs is the entry point. It takes a list of data to process simultaneously. It first sets the number of jobs to wait for to the size of the list. Then it sets a timer to test for the end condition (the number falling to 0). And finally it sends a job for each piece of data. Each job decreases the number of awaited jobs by one.
It would look something like this:
var g_numJobs = 0;
function async_task(data) {
//
// ... execute the task on the data ...
//
// Decrease the number of jobs left to execute.
--g_numJobs;
}
function execute_jobs(list) {
// Set the number of jobs we want to wait for.
g_numJobs = list.length;
// Set the timer (test every 50ms).
var timer = setInterval(function() {
if(g_numJobs == 0) {
clearInterval(timer);
do_next_action();
}
}, 50);
// Send the jobs.
for(var i = 0; i < list.length; ++i) {
async_task(list[i]);
}
}
To improve this code you could create Job and JobList classes. The Job would execute a callback and decrease the number of pending jobs, while the JobList would aggregate the timer and call the callback for the next action once the jobs are finished.
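A rough sketch of what those Job and JobList classes might look like (an assumption on my part: async_work(data, cb) is a callback-style variant of async_task that calls cb() when it finishes):
function Job(data, work, done) {
    // a Job executes the async work and reports back when it completes
    this.run = function () {
        work(data, done);                 // work(data, cb) is assumed to call cb() when finished
    };
}

function JobList(list, work, next_action) {
    var pending = list.length;
    function jobDone() { pending--; }     // each job decreases the pending count

    this.start = function () {
        // the JobList aggregates the timer: fire next_action once all jobs have finished
        var timer = setInterval(function () {
            if (pending === 0) {
                clearInterval(timer);
                next_action();
            }
        }, 50);
        // send the jobs
        list.forEach(function (data) {
            new Job(data, work, jobDone).run();
        });
    };
}

// usage sketch
new JobList([1, 2, 3], async_work, do_next_action).start();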
I shared the same frustration. As I chained more asynchronous calls, it turned into callback hell. So, I came up with my own solution. I'm sure there are similar solutions out there, but I wanted to create something very simple and easy to use. Asynq is a script that I wrote to chain asynchronous tasks. So to run f2 after f1, you can do:
asynq.run(f1, f2)
You can chain as many functions as you want. You can also specify parameters or run a series of tasks on elements in an array too. I hope this library can solve your issues or similar issues others are having.