Multiple Sequential Async JavaScript Functions

Let's say I have a function that looks like this:
var foo = function(callback) {
    var final = {};
    asyncFuncOne(function(x) {
        final.x = x;
    });
    asyncFuncTwo(function(y) {
        final.y = y;
    });
    callback(final);
};
Obviously, this doesn't do what I want it to do (call callback on final when it has both x and y). I have several questions:
Is there a way to do what I want it to do without nesting everything?
Does the current form introduce a race condition? Are both async functions accessing the same final?

Approach #0. Painful life without promises. Yet, it's life
Your code practically cries out to be rewritten with promises, and trust me, that refactoring is something you really need. But OK, let's try to solve this particular problem without invoking promises at all, just as an exercise. Before the promise era, the usual pattern was to introduce a special function that checks whether we can consider ourselves done or not.
In your particular case such function is:
function weAreDone() {
    return final.hasOwnProperty('x') && final.hasOwnProperty('y');
}
Then we can introduce asyncFuncDecorator:
var asyncFuncDecorator = function(asyncFunc, asyncFuncHandler) {
    return function(doneFunc, doneHandler) {
        asyncFunc(function() {
            // store this result first, then check whether everything has arrived
            asyncFuncHandler.apply(this, arguments);
            if (doneFunc()) {
                doneHandler();
            }
        });
    };
};
With these two functions introduced, you can write something like:
var foo = function(callback) {
    var final = {};
    // here go the above-mentioned declarations
    ...
    asyncFuncDecorator(asyncFuncOne, function(x) {
        final.x = x;
    })(weAreDone, function() { callback(final); });
    asyncFuncDecorator(asyncFuncTwo, function(y) {
        final.y = y;
    })(weAreDone, function() { callback(final); });
};
You can keep working on making this approach more flexible and universal, but, once again, trust me:
you'll end up with something very similar to promises, so you might as well use promises ;)
Approach #1. Promisifying existing functions
If, for some reason, you are not ready to rewrite all your functions from callback style to promises,
you can promisify existing functions by using, once again, a decorator. Here's how it can be done for native Promises, which are present in all modern browsers already (for alternatives, check this question):
function promisify(asyncCall) {
    return new Promise(function(resolve, reject) {
        asyncCall(resolve, reject);
    });
}
In that case you can rewrite your code in this fashion:
var foo = function(callback) {
    // here go the above-mentioned declarations (including final)
    ...
    Promise.all([promisify(asyncFuncOne), promisify(asyncFuncTwo)]).then(function(data) {
        // by the way, I'd rather not call any variable "final" ))
        final.x = data[0];
        final.y = data[1];
        return final;
    }).then(callback);
};
Not to mention that foo itself would be better off promisified ;)
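For illustration, here is a minimal sketch (not from the original answer) of what a promisified foo could look like, reusing the promisify helper above:
var foo = function() {
    // no callback parameter at all: foo just returns a promise
    return Promise.all([promisify(asyncFuncOne), promisify(asyncFuncTwo)])
        .then(function(data) {
            return { x: data[0], y: data[1] };
        });
};

// usage
foo().then(function(result) {
    // result.x and result.y are both set here
});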
Approach #2. Promises everywhere. From the very beginning
It's worth reiterating this point: as soon as you need to trigger some function after N other async functions have completed, promises are unbeatable in 99% of cases. It is almost always worth trying to rewrite existing code in a promise-based style. Here's how such code can look:
Promise.all([asyncFuncOne(), asyncFuncTwo()]).then(function(data) {
    return Promise.resolve({
        x: data[0],
        y: data[1]
    });
}).then(callback);
See how much better it becomes. Also, a common mistake when using promises is to build a sequential waterfall of thens: retrieving the first chunk of data, only after that the second one, and only after that the third one. You should never do this unless the Nth request actually depends on data received in one of the previous requests; otherwise, just use the all method.
This is crucial to understand, and it is one of the main reasons promises are so often perceived as excessively complicated.
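To make the difference concrete, here is a small sketch contrasting the two shapes, assuming (as in the snippet above) that both functions return promises:
// sequential waterfall: the second request only starts after the first has finished
asyncFuncOne().then(function(x) {
    return asyncFuncTwo().then(function(y) {
        return { x: x, y: y };
    });
});

// independent requests: both start immediately and run in parallel
Promise.all([asyncFuncOne(), asyncFuncTwo()]).then(function(data) {
    return { x: data[0], y: data[1] };
});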
Sidenote: as of December 2014, native Promises are supported by all major modern browsers except IE, and Node.js has had native promise support since version 0.11.13, so in real life you will still most probably need a promise library. There are a lot of implementations of the Promise spec; you can check this page for a list of standalone promise libraries. It's quite long; the most popular solutions are, I guess, Q and bluebird.
Approach #3. Generators. Our bright future. Well, maybe
This is something worth mentioning: generators are de facto supported in Firefox, Chromium-based browsers, and Node.js (when launched with the --harmony_generators option). So there are cases where generators can be used, and actually already are used, in production code. If you are writing a general-purpose web app, you should be aware of this approach, but you probably won't use it for a while. The idea is that generators in JS give you two-way communication through yield/iterator.next(), which you can exploit like this:
function async(gen) {
    var it = gen();
    var state = it.next();
    var next = function() {
        if (state.done) {
            return state.value;
        }
        // state.value is the yielded async function; pass it a callback
        // that feeds the result back into the generator and continues
        state.value(function(res) {
            state = it.next(res);
            next();
        });
    };
    next();
}
async(function* () {
    var res = {
        x: yield asyncFuncOne,
        y: yield asyncFuncTwo
    };
    callback(res);
});
Actually, there are already dozens of libraries which do this generator wrapping job for you.
You can read more about this approach and related libraries here.

Another solution is to create a setter:
var foo = function(callback) {
    var final = {
        setter: function(attr, value) {
            this[attr] = value;
            if (this.hasOwnProperty("x") && this.hasOwnProperty("y"))
                callback(this);
        }
    };
    asyncFuncOne(function(x) {
        final.setter("x", x);
    });
    asyncFuncTwo(function(y) {
        final.setter("y", y);
    });
};

final.x and final.y do get set on final, but only after final has already been passed to callback, so unless the callback itself waits, x and y are still undefined when callback looks at them.
You could check to see if one has come back in the response of the others and call out to the callback:
var foo = function(callback) {
    var final = {};
    asyncFuncOne(function(x) {
        final.x = x;
        if (typeof final.y !== 'undefined') {
            callback(final);
        }
    });
    asyncFuncTwo(function(y) {
        final.y = y;
        if (typeof final.x !== 'undefined') {
            callback(final);
        }
    });
};
You could nest your callbacks, though this will cause asyncFuncTwo not to be called until asyncFuncOne has finished:
var foo = function(callback) {
    var final = {};
    asyncFuncOne(function(x) {
        final.x = x;
        asyncFuncTwo(function(y) {
            final.y = y;
            callback(final);
        });
    });
};
Then there are Promises. These are the future of async; however, they are not fully supported across all browsers (namely, all of IE, 11 and below at this time). In fact, 40% of all browser users are not using a browser that natively supports Promises. This means you would have to use a polyfill library for support, adding substantial file size to your page. For this simple problem, and at this given time, I wouldn't recommend using Promises. However, you should definitely read up on how they are used.
If you want to see what that could look like, it'd be this:
var asyncFuncOne = function() {
    return new Promise(function(resolve, reject) {
        // A 500ms async op that resolves x as 5
        setTimeout(function() { resolve(5); }, 500);
    });
};
var asyncFuncTwo = function() {
    return new Promise(function(resolve, reject) {
        // A 750ms async op that resolves y as 10
        setTimeout(function() { resolve(10); }, 750);
    });
};
var foo = function() {
    var final = {};
    return new Promise(function(resolve, reject) {
        Promise.all([
            asyncFuncOne(),
            asyncFuncTwo()
        ]).then(function(values) {
            final.x = values[0];
            final.y = values[1];
            resolve(final);
        });
    });
};
foo().then(function(final) {
    // After foo()'s Promise has resolved (750ms)
    console.log(final.x + ', ' + final.y);
});
Note there are no callbacks, just use of then. In a real scenario you would also use catch and reject. Read more about Promises here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise but, again, I personally don't see a strong need to use them for this single, specific issue (but, to each their own).
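As a rough sketch of what that error handling could look like (the .catch(reject) line is an addition that simply forwards any failure out of foo; everything else mirrors the code above):
var foo = function() {
    var final = {};
    return new Promise(function(resolve, reject) {
        Promise.all([
            asyncFuncOne(),
            asyncFuncTwo()
        ]).then(function(values) {
            final.x = values[0];
            final.y = values[1];
            resolve(final);
        }).catch(reject); // forward a rejection from either async function
    });
};

foo().then(function(final) {
    console.log(final.x + ', ' + final.y);
}).catch(function(err) {
    // runs if asyncFuncOne or asyncFuncTwo rejected
    console.log('Something went wrong: ' + err);
});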

One pretty bad idea, which I've nevertheless had to use before (because I wasn't about to import a 50k promise library for a single function), is to set a looping timeout that checks whether all the required variables are set, and then calls the callback.
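For what it's worth, here is a minimal sketch of that polling approach (the 50ms interval is arbitrary, and setInterval stands in for the looping timeout):
var foo = function(callback) {
    var final = {};
    asyncFuncOne(function(x) { final.x = x; });
    asyncFuncTwo(function(y) { final.y = y; });

    // poll until both results have arrived, then fire the callback exactly once
    var poll = setInterval(function() {
        if (final.hasOwnProperty('x') && final.hasOwnProperty('y')) {
            clearInterval(poll);
            callback(final);
        }
    }, 50);
};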

Related

Using a Promise to create "atomic" blocks of code in Javascript

Coming from a Java background I am now trying to wrap my mind around the asynchronous nature of Javascript. I use promises in my code to do this and until now everything works like a charm, but now I am having a conceptual question and didn't find a definitive answer even after reading the Promise/A+ spec multiple times.
My requirements are this: I have a method that modifies a shared object, stores the update in a PouchDB and reads it back afterwards in order to get an updated revision id field from the db (optimistic locking). Storing and updating the data in Pouch is asynchronous (I am omitting storing "this" to call the methods from within the promises for brevity):
var _doc = ...;
var _pouch = new PouchDB(...);

function setValue(key, value) {
    _doc[key] = value;
    _pouch.put(_doc)
        .then(function() {
            return _pouch.get(_doc._id);
        })
        .then(function(updatedDoc) {
            _doc = updatedDoc;
        });
}
Now, I want to make sure that no other key is set on _doc while it is being written to the db and before it has been read back again. Is it (a) even possible that another setValue() call executes a put() (with an outdated revision id) while the get() call to Pouch has not yet been executed (given the message-queue approach that JS uses), and (b) if it is possible, is the following solution fail-safe? It works in my tests, but I don't know whether my tests cover all possibilities (storing "this" is again omitted):
var _doc = ...;
var _pouch = new PouchDB(...);
var _updatePromise;

function setValue(key, value) {
    if (_updatePromise == null) {
        setValueInternal(key, value);
    }
    else {
        // make sure the previous setValue() call is executed completely before
        // starting another one...
        _updatePromise.then(function() {
            setValueInternal(key, value);
        });
    }
}

function setValueInternal(key, value) {
    _doc[key] = value;
    _updatePromise = new Promise(function(done, reject) {
        _pouch.put(_doc)
            .then(function() {
                return _pouch.get(_doc._id);
            })
            .then(function(updatedDoc) {
                _doc = updatedDoc;
                _updatePromise = null;
                done();
            })
            .catch(function(error) {
                _updatePromise = null;
                reject(error);
            });
    });
}
I think it should work correctly if fulfilling a promise (calling done()) will synchronously call the next then() function, but I am unable to find a definitive answer whether this is the case.
Any clarification is greatly appreciated and thanks for your help.
Chaining promises as you're attempting to do here does indeed work as expected, but I do not believe there is any guarantee that done is called synchronously. I think your code would work, but you have some anti-patterns in it. I would recommend simplifying to avoid explicit creation of the promises.
Also think about: if you call setValue 4 times in a row, how many round trips to the server should that make? Doing it this way is going to make it take 4. Did you want to batch them into 1 or 2? (A possible batching sketch follows the code below.)
One Round Trip Per setValue:
var _doc = ...;
var _pouch = new PouchDB(...);
var _updatePromise = Promise.resolve();

function setValue(key, value) {
    // make sure the previous setValue() call is executed completely before
    // starting another one...
    _updatePromise = _updatePromise.then(function() {
        _doc[key] = value;
        return _pouch.put(_doc)
            .then(function() {
                return _pouch.get(_doc._id);
            })
            .then(function(updatedDoc) {
                _doc = updatedDoc;
            });
    });
}

Running code sequentially in Node.js

I have a function that fetches data from a database:
recentItems = function() {
    Items.find({item_published: true}).exec(function(err, item) {
        if (!err)
            return item;
    });
};
And I want to use it like this:
var x = recentItems();
But this fails with an undefined value due to the async behavior of recentItems. I know that I can change my function to use a callback like this:
recentItems = function(callback) {
    Items.find({item_published: true}).exec(function(err, item) {
        if (!err)
            callback(item);
    });
};
And:
recentItems(function(result) {
    var x = result;
});
But I don't want to use this method, because I have a situation like this: I have a function that should do two operations, push their results to an array, and after both of them, fire a callback and return the value:
var calc = function(callback) {
    var arr = [];
    var b = getValues();
    arr.push(b);
    recentItems(function(result) {
        var x = result;
        arr.push(x);
    });
    callback(arr);
};
In this situation, the value of b is pushed to arr and the main callback is called, and only after that is the value of x fetched by recentItems, due to its async behavior. But I need these two operations to run sequentially, one after the other. Only after both have completed should the last line run and the callback fire.
How can I resolve this? I read about Promises and Async libraries, but I don't know which of them is my answer. Can I overcome this with raw Node.js? If so, I would prefer that.
There are some ways of doing what you want, but none of them are perfect yet.
There is an ES7 proposal for native async/await that will be callback heaven, but at the moment you can do:
Nested callbacks (native, but very ugly and unmaintainable code)
Promises (good, but still too verbose)
Async/Await library (an amazing library, but far from native, and performance isn't great)
ES7 transpiler - you can write the ES7 code today and it will transpile to ES5 for you (e.g. Babel)
But if you're already using the newest version of Node.js (4.0.0 at the time of writing) - and if you're not, you really should be - the best way of achieving what you want is to use generators.
Combined with a small library named co, they will help you achieve almost what the ES7 async/await proposal offers, mostly using native code, so both readability and performance are really good:
var co = require('co');

// co.wrap turns the generator into a regular function that returns a promise
var calc = co.wrap(function* calc() {
    var arr = [];
    var b = getValues();
    arr.push(b);
    var items = yield recentItems();
    arr.push(items);
    return arr;
});

function recentItems() {
    return new Promise(function(resolve, reject) {
        Items.find({item_published: true}).exec(function(err, item) {
            if (err) {
                reject(err); // reject on error so the promise doesn't hang
            } else {
                resolve(item);
            }
        });
    });
}
You can read more about this subject in this awesome blog post by Thomas Hunter.
You've almost got it. There is no way to work around callbacks. However, you can certainly use callbacks to do what you want. Simply nest them:
var calc = function(callback) {
    var arr = [];
    getValues(function(b) {
        arr.push(b);
        recentItems(function(result) {
            var x = result;
            arr.push(x);
            callback(arr);
        });
    });
};
You can try something like this. It still nests the callbacks, but the code is a little cleaner.
var callA = function(callback) {
    // Run the first call (async work simulated here with setTimeout)
    setTimeout(function() { callback('dataA'); }, 100);
};
var callB = function(callback) {
    // Some other call
    setTimeout(function() { callback('dataB'); }, 100);
};
callA(function(dataA) {
    callB(function(dataB) {
        // Place a handler function here
        console.log(dataA + " " + dataB);
    });
});

How to implement dependency between asynchronous functions in JavaScript?

As a simplified case, I have two async functions, foo and bar. bar needs the result of foo, i.e. bar depends on foo. I have no idea about which function will be called first.
If bar is invoked first, bar will call foo and start itself right after foo is done.
If foo is invoked first and done, bar can use the result of foo.
If foo is invoked first and bar is invoked before foo is done, bar needs to wait for foo's result. (Don't invoke a new call to foo, just wait for the already-fired call to foo)
How can I achieve this?
Is it possible to register an async function dependency chain (something like the dependency mechanism in require.js: define(['foo'], function(foo) { bar(); }))?
Can I use $.deferred() to achieve it?
How?
In circumstances like this, the standard approach is to cache the lower level promise.
Typically you will establish, in some suitable outer scope, a js plain object as a promise cache, and always look there first before calling your async process.
var promiseCache = {};

function foo() {
    if (!promiseCache.foo) {
        promiseCache.foo = doSomethingAsync();
    }
    return promiseCache.foo;
}

function bar() {
    return foo().then(doSomethingElseAsync);
}
Of course, there's nothing to prevent you also caching the higher level promise, if appropriate.
function bar() {
    if (!promiseCache.bar) {
        promiseCache.bar = foo().then(doSomethingElseAsync);
    }
    return promiseCache.bar;
}
EDIT: forceRefresh feature
You can force a function to refresh its cached promise by passing an (extra) parameter.
function foo(any, number, of, other, arguments, forceRefresh) {
    if (forceRefresh || !promiseCache.foo) {
        promiseCache.foo = doSomethingAsync();
    }
    return promiseCache.foo;
}
By making forceRefresh the last argument, leaving it out is the same as passing false and foo will use the cached promise if available. Alternatively, pass true to guarantee that doSomethingAsync() be called and the cached value be refreshed.
EDIT 2: setName()/getName()
With the forceRefresh mechanism in place in getName() :
setName(newName).then(getName.bind(null, true)); //set new name then read it back using forceRefresh.
Alternatively, omit the forceRefresh mechanism and, assuming the cache property to be promiseCache.name :
setName(newName).then(function() {
    promiseCache.name = $.when(newName); // update the cache with a simulated `getName()` promise.
});
The first method is more elegant, the second more efficient.
You can simply think of both functions as independent. That way, you don't go daisy-chaining dependencies that operate asynchronously. You can then have one other module that uses them.
Since they do async stuff, consider using promises. You can use jQuery's deferreds for compatibility. Think of deferreds as read/write while promises are read-only.
// foo.js
define(function() {
    return function() {
        return new Promise(function(resolve, reject) {
            // Do async stuff. Call resolve/reject accordingly
        });
    };
});

// bar.js
define(function() {
    return function() {
        return new Promise(function(resolve, reject) {
            // Do async stuff. Call resolve/reject accordingly
        });
    };
});

// Your code (Excuse the CommonJS format. Personal preference)
define(function(require) {
    // Require both functions
    var foo = require('foo');
    var bar = require('bar');
    // Use them
    foo(...).then(function(response) {
        return bar();
    }).then(function() {
        // all done
    });
});
Try creating an object property with the possible values undefined, "pending", and true; call deferred.resolve() when obj.active is true, and deferred.reject() when obj.active is "pending":
var res = {
    active: void 0
};

var foo = function foo(state) {
    var t;
    var deferred = function(type) {
        return $.Deferred(function(dfd) {
            if (res.active === "pending" || state && state === "pending") {
                res.active = "pending";
                dfd.rejectWith(res, [res.active]);
            } else {
                res.active = state || "pending";
                t = setInterval(function() {
                    console.log(res.active);
                }, 100);
                setTimeout(function() {
                    clearInterval(t);
                    res.active = true;
                    dfd.resolveWith(res, [res.active]);
                }, 3000);
            }
            return dfd.promise();
        })
        .then(function(state) {
            console.log("foo value", state);
            return state;
        }, function(err) {
            console.log("foo status", err);
            return err;
        });
    };
    return deferred();
};

var bar = function bar(result) {
    var deferred = function(type) {
        return $.Deferred(function(dfd) {
            if (result && result === true) {
                setTimeout(function() {
                    dfd.resolveWith(result, [true]);
                }, 1500);
            } else {
                dfd.rejectWith(res, [res.active || "pending"]);
            }
            return dfd.promise();
        });
    };
    return deferred().then(function(data) {
        console.log("bar value", data);
    }, function(err) {
        console.log("bar status", err);
    });
};

$("button").click(function() {
    $(this).is(":first")
        ? foo().then(bar, bar)
        : bar(res.active === true ? res.active : "pending")
            .then(foo, foo).then(bar, bar);
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js">
</script>
<button>foo</button>
<button>bar</button>
Not sure I understood the question correctly, but here is my take on it:
Put your function foo into a variable:
var foo_fn = function foo(foo_args) { /* Your async code goes here */ };
foo is async and returns something at some point. In your definition of foo, I recommend that you use promises; the concept is designed to manage composition of asynchronous functions in a clean and scalable way. jQuery's implementation of the concept is convenient in a lot of simple use cases, but it suffers from some drawbacks which make it interesting, at some point, to use one of the many promise libraries which follow the Promises/A specification. For more information, you can refer to:
Cf. https://thewayofcode.wordpress.com/2013/01/22/javascript-promises-and-why-jquery-implementation-is-broken/ and https://blog.domenic.me/youre-missing-the-point-of-promises
So, say foo takes args and returns a promise which later resolves to some value.
var foo_fn = function foo(foo_args) {
    return foo_fn.promise = new RSVP.Promise(function(resolve, reject) {
        // Your async code goes here
    });
};
Here I use the RSVP promise library but any promise library following the Promises/A specification could do the job.
When bar is called, you can just do:
function bar(bar_args) {
    var foo_promise = foo_fn.promise;
    // if foo was called, whether the computation is in progress or finished,
    // the foo_fn.promise field will be non-empty, as foo returns immediately
    // with a promise any time it is called

    if (!foo_promise) {
        // foo has not yet been called so call it
        foo_promise = foo_fn(foo_args);
    }
    foo_promise.then(function(foo_result) { /* some async code here */ });
}
NOTE: This solution is quite similar to the one proposed by Roamer-1888. One difference is that in Roamer's proposal, the foo function will always return the same value once it has performed its asynchronous computation. I don't know if this is the intended behaviour. In my implementation, foo executes the async computation every time it is called. bar will use the latest computed value, which is stored in the field foo_fn.promise. Older computations are lost, and a possible computation in progress is not taken into account.
If this pattern is going to be used often in your code, you can also create a function modelled on the define function in require.js.
You will need:
a registry to hold the dependency functions (foo in your example)
the dependent function (bar in your example) to accept the dependency functions' computed values as part of its signature. For example, a hash of the dependencies could be passed as the first parameter, so bar's signature could be: {foo: foo_result}, other_bar_args...
the dependency functions to follow the model of my previous answer, i.e. register their promise as a property on themselves when they execute.
Reminder: you need to name those dependency functions so you can reference them inside their own body, and then add them to the registry.
In the define function body, you wrap the dependent function into another one which:
gets all dependencies from the registry
gets all dependency values, executing the dependencies when necessary (similarly to my previous answer). This means you end up with a list of promises whose results you then aggregate together (RSVP.hash, for example, with the RSVP promise library; I believe jQuery has a similar function in jQuery.when)
calls the dependent function (bar) with this hash of results as the first argument, the other arguments being the same as those of the wrapped function
that wrapped function is the new bar, so when bar is called, it is the wrapped function that will actually be called.
A bit lengthy, but it should work. If you want to see some code, let me know if this is what you were looking for. In any case, if you are going to have complex async behaviour in your code, it could be interesting for you to use a compliant promise library. $.deferred is also to be used only when you have nothing better in sight, as it makes it harder to track the behaviour of your functions: you need to keep track of all the places where the deferred appears to be able to reason about your program.
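For illustration only, here is a rough sketch of that define-like helper, using native Promise.all to aggregate results instead of RSVP.hash; the names register and defineDependent are invented for the example:
// registry of dependency functions, each of which caches its promise on itself
var registry = {};

function register(name, fn) {
    registry[name] = fn;
}

function defineDependent(depNames, fn) {
    // returns the wrapped version of fn, which resolves its dependencies first
    return function() {
        var args = [].slice.call(arguments);
        var promises = depNames.map(function(name) {
            var dep = registry[name];
            // reuse the cached promise if the dependency already ran, otherwise run it
            return dep.promise || (dep.promise = dep());
        });
        return Promise.all(promises).then(function(values) {
            var hash = {};
            depNames.forEach(function(name, i) { hash[name] = values[i]; });
            // the hash of dependency results is passed as the first argument
            return fn.apply(null, [hash].concat(args));
        });
    };
}

// usage: bar receives {foo: foo_result} as its first argument
register('foo', foo_fn);
var bar = defineDependent(['foo'], function(deps, bar_args) {
    // deps.foo holds foo's resolved value
});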

How to handle these async functions (design-pattern?)

I've got an application which needs to execute some booting/startup tasks like:
ajax
dynamic requirejs
routing, set up something else
before it is able to run.
I'm now having difficulties organizing these tasks together in a solid way.
The async behavior especially is giving me a headache.
Currently I am using events to share fetched results and watch the state of the application. Unfortunately this resulted in illogical, inconvenient crap.
Then I tried using some promise libraries like Q and jQuery.Deferred, but they don't really match my problem.
This is a simplified version of the code:
// this could be an ajax call fetching some user data
var fetchUser = function() {
    var user = {};
    // lets emulate the ajax call with setTimeout
    setTimeout(function() {
        // set some values
        user.username = 'bodo';
        user.password = 'helloKitty';
        // return the user object we "fetched"
        return user;
    }, 300);
};
// this could fetch some config or some requirejs modules
var fetchConfig = function() {
    var config = {};
    // we emulate this too...
    setTimeout(function() {
        return config;
    }, 200);
};
// this could be anything else like setting up some objects
var justSetUpSomething = function() {
    var someObj = {};
    someObj.router = 'this could be a router object for example';
    someObj.logger = 'or a logger';
    return someObj;
};
// in the final step everything should be merged together
// and be passed as event argument
var finalStep = function(user, config, someObj) {
    var mainObj = {};
    mainObj.user = user;
    mainObj.config = config;
    mainObj.someObj = someObj;
    // trigger some event system
    trigger('everything:ready', mainObj);
};
Also viewable at:
http://jsfiddle.net/uzJrs/3/
I hope this describes my problem:
There are three totally different async tasks. When they are all ready, their results have to be merged and somehow passed to another object.
Events make this workable but far from understandable, and promises also don't really make me happy. Isn't there another helpful design pattern?
At first, as you already have requirejs there, you can use it to load your three modules. Requirejs resolves the dependencies asynchronously and in parallel, and calls the final factory function when they're loaded.
Also, every Deferred library offers functions to merge promises: the merged one resolves when all the single ones are resolved, and becomes rejected if any of them is rejected. The respective functions are Q.all and jQuery.when:
function fetchX() {
    var d = new $.Deferred();
    asynchronousThing(function callback(x) {
        d.resolve(x);
    });
    return d.promise();
}

$.when(fetchX(), fetchY(), …).then(function finalStep(x, y, …) {
    // merge everything together
    return new Event();
}).done(function readyCallback(e) {
    // this is effectively your 'everything:ready' "event"
});
It sounds like the majority of your functionality is actually synchronous once the dependencies are resolved (i.e. config and someObj don't do anything asynchronous, they simply might be loaded asynchronously). If you are using require.js, have you tried simply using its define mechanism to make the require calls look synchronous from your code's perspective? Then you only have to worry about the call which really has to be asynchronous (fetchUser).
// In app/main.js perhaps?
define(["data/user",
        "app/config",
        "app/someObject",
        "app/events"],
    function(user, config, someObj, eventBus) {
        // By the sound of it config and someObj are ready once loaded
        var mainObj = {"config": config, "someObj": someObj};
        eventBus.register_once("data:user:loaded", function(user) {
            mainObj.user = user;
            eventBus.trigger("everything:ready", mainObj);
        });
        user.fetchUser();
    });

// in data/user.js
define(["app/config", "app/events", "vendor/jquery"],
    function(config, eventBus, $) {
        function user_loaded(user) {
            eventBus.trigger("data:user:loaded", user);
        }
        function fetchUser() {
            $.get(config.dataendpoint + "/user/1", user_loaded);
        }
        return {"fetchUser": fetchUser};
    });

How to execute a Javascript function only after multiple other functions have completed?

My specific problem is that I need to execute a (potentially) large number of Javascript functions to prepare something like a batch file (each function call adds some information to the same batch file) and then, after all those calls are completed, execute a final function to send the batch file (say, send it as an HTML response). I'm looking for a general Javascript programming pattern for this.
Generalized problem:
Given the Javascript functions funcA(), funcB(), and funcC(), I would like to figure out the best way to order execution so that funcC is only executed after both funcA and funcB have executed. I know that I could use nested callback functions like this:
funcA = function() {
    //Does funcA stuff
    funcB();
};
funcB = function() {
    //Does funcB stuff
    funcC();
};
funcA();
I could even make this pattern a little more general by passing in callback parameters, however, this solution becomes quite verbose.
I am also familiar with Javascript function chaining where a solution might look like:
myObj = {};
myObj.answer = "";
myObj.funcA = function() {
    //Do some work on this.answer
    return this;
};
myObj.funcB = function() {
    //Do some more work on this.answer
    return this;
};
myObj.funcC = function() {
    //Use the value of this.answer now that funcA and funcB have made their modifications
    return this;
};
myObj.funcA().funcB().funcC();
While this solution seems a little cleaner to me, as you add more steps to the computation, the chain of function executions grows longer and longer.
For my specific problem, the order in which funcA, funcB, etc. are executed DOES NOT matter. So in my solutions above, I am technically doing more work than is required, because I am placing all the functions in a serial ordering. All that matters to me is that funcC (some function for sending the result or firing off a request) is only called after funcA and funcB have ALL completed execution. Ideally, funcC could somehow listen for all the intermediate function calls to complete and only THEN execute. I'm hoping to learn a general Javascript pattern to solve such a problem.
Thanks for your help.
Another Idea:
Maybe pass a shared object to funcA and funcB, and when they complete execution, mark the shared object, like sharedThing.funcA = "complete" or sharedThing.funcB = "complete", and then somehow have funcC execute once the shared object reaches a state where all fields are marked complete. I'm not sure exactly how you could make funcC wait for this.
Edit:
I should note that I'm using server-side Javascript (Node.js) and I would like to learn a pattern to solve it just using plain old Javascript (without the use of jQuery or other libraries). Surely this problem is general enough that there is a clean pure-Javascript solution?
If you want to keep it simple, you can use a counter-based callback system. Here's a draft of a system that allows when(A, B).then(C) syntax. (when/then is actually just sugar, but then again the whole system arguably is.)
var when = function() {
    var args = arguments; // the functions to execute first
    return {
        then: function(done) {
            var counter = 0;
            for (var i = 0; i < args.length; i++) {
                // call each function with a function to call on done
                args[i](function() {
                    counter++;
                    if (counter === args.length) { // all functions have notified they're done
                        done();
                    }
                });
            }
        }
    };
};
Usage:
when(
    function(done) {
        // do things
        done();
    },
    function(done) {
        // do things
        setTimeout(done, 1000);
    },
    ...
).then(function() {
    // all are done
});
If you don't use any asynchronous functions and your script doesn't break the order of execution, then the most simple solution is, as stated by Pointy and others:
funcA();
funcB();
funcC();
However, since you're using node.js, I believe you're going to use asynchronous functions and want to execute funcC after an async IO request has finished, so you have to use some kind of counting mechanism, for example:
var call_after_completion = function(callback) {
    this._callback = callback;
    this._args = [].slice.call(arguments, 1);
    this._queue = {};
    this._count = 0;
    this._run = false;
};

call_after_completion.prototype.add_condition = function(str) {
    if (this._queue[str] !== undefined)
        throw new TypeError("Identifier '" + str + "' used twice");
    else if (typeof str !== "string" && str.toString === undefined)
        throw new TypeError("Identifier has to be a string or needs a toString method");

    this._queue[str] = 1;
    this._count++;
    return str;
};

call_after_completion.prototype.remove_condition = function(str) {
    if (this._queue[str] === undefined) {
        console.log("Removal of condition '" + str + "' has no effect");
        return;
    }
    else if (typeof str !== "string" && str.toString === undefined)
        throw new TypeError("Identifier has to be a string or needs a toString method");

    delete this._queue[str];
    if (--this._count === 0 && this._run === false) {
        this._run = true;
        this._callback.apply(null, this._args);
    }
};
You can simplify this object by ignoring the identifier str and just increasing/decreasing this._count, however this system could be useful for debugging.
In order to use call_after_completion, you simply create a new call_after_completion with your desired function func as argument and add conditions with add_condition. func will only be called once all conditions have been removed.
Example:
var foo = function() { console.log("foo"); };
var bar = new call_after_completion(foo);
var i;

bar.add_condition("foo:3-Second-Timer");
bar.add_condition("foo:additional function");
bar.add_condition("foo:for-loop-finished");

function additional_stuff(cond) {
    console.log("additional things");
    cond.remove_condition("foo:additional function");
}

for (i = 0; i < 1000; ++i) {
}

console.log("for loop finished");
bar.remove_condition("foo:for-loop-finished");
additional_stuff(bar);

setTimeout(function() {
    console.log("3 second timeout");
    bar.remove_condition("foo:3-Second-Timer");
}, 3000);
JSFiddle Demo
If you don't want to use any helper libraries, then you need to write some helper yourself; there's no simple one-line solution for this.
If you'd like to end up with something that reads as it would in the synchronous case, try some deferred/promise implementation (it's still plain JavaScript), e.g. using the deferred package you may end up with something as simple as:
// Invoke one after another:
funcA()(funcB)(funcC);
// Invoke funcA and funcB simultaneously and afterwards funcC:
funcA()(funcB())(funcC);
// If want result of both funcA and funcB to be passed to funcC:
deferred(funcA(), funcB())(funcC);
Have a look into jQuery's deferred objects. This provides a sophisticated means of controlling what happens when in an asynchronous environment.
The obvious use-case for this is AJAX, but it is not restricted to this.
Resources:
jQuery docs: deferred object
good introduction to deferred object patterns
Non-AJAX use for jQuery's deferred objects
I was looking for the same kind of pattern. I am using APIs that interrogate multiple remote data sources. The APIs each require that I pass a callback function to them. This means that I cannot just fire off a set of my own functions and wait for them to return. Instead I need a solution that works with a set of callbacks that might be called in any order depending on how responsive the different data sources are.
I came up with the following solution. JS is way down the list of languages I am most familiar with, so this may not be very idiomatic JS.
function getCallbackCreator(number_of_data_callbacks, final_callback) {
    var all_data = {};
    return function(data_key) {
        return function(data_value) {
            all_data[data_key] = data_value;
            if (Object.keys(all_data).length == number_of_data_callbacks) {
                final_callback(all_data);
            }
        };
    };
}

var getCallback = getCallbackCreator(2, inflatePage);

myGoogleDataFetcher(getCallback('google'));
myCartoDataFetcher(getCallback('cartodb'));
Edit: The question was tagged with node.js but the OP said, "I'm looking for a general Javascript programming pattern for this," so I have posted this even though I am not using node.
Nowadays, one can do something like this:
Let's say we have funcA, funcB and funcC:
If one wants funcA's and funcB's results to be passed to funcC:
var promiseA = new Promise(async (resolve, reject) => {
    resolve(await funcA());
});
var promiseB = new Promise(async (resolve, reject) => {
    resolve(await funcB());
});
var promise = Promise.all([promiseA, promiseB]).then(results => {
    // results = [result from funcA, result from funcB]
    return funcC(results);
});
If one wants funcA, then funcB, and then funcC:
var promise = (
new Promise(async resolve => resolve( await funcA() ))
).then(result_a => funcB(result_a)).then(result_b => funcC(result_b));
And finally:
promise.then(result_c => console.log('done.'));
how about:
funcC(funcB(funcA()));
I think the question is that some of the functions run longer, and there might be a situation where we run funcC before funcA or funcB has finished executing.
