How to sync JavaScript callbacks? - javascript

I've been developing in JavaScript for quite some time, but I'm not yet a cowboy developer: one of the many things that always haunts me is syncing JavaScript's callbacks.
I will describe a generic scenario in which this concern is raised: I have a bunch of operations to perform multiple times in a for loop, and each operation has a callback. After the for loop, I need to perform another operation, but this operation can only execute successfully if all the callbacks from the for loop are done.
Code Example:
for ... in ... {
    myFunc1(callback); // callbacks are executed asynchronously
}
myFunc2(); // can only execute properly if all the myFunc1 callbacks are done
Suggested Solution:
Initiate a counter at the beginning of the loop holding the length of the loop, and have each callback decrement that counter. When the counter hits 0, execute myFunc2. This essentially lets each callback check whether it is the last one in the sequence and, if it is, call myFunc2 when it's done.
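A minimal sketch of that counter approach (items, myFunc1 and myFunc2 are placeholders matching the pseudocode above; myFunc1 is assumed to invoke its callback when its async work finishes):

var items = [/* ... */];     // whatever the loop iterates over
var pending = items.length;  // counter holding the length of the loop

items.forEach(function (item) {
    myFunc1(function () {    // callback passed to the async operation
        pending--;
        if (pending === 0) { // this was the last callback to finish
            myFunc2();
        }
    });
});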
Problems:
A counter is needed for every such sequence in your code, and having meaningless counters everywhere is not good practice.
If you recall the classical synchronization problem of thread conflicts: when multiple threads all call var-- on the same variable, undesirable outcomes can occur. Does the same happen in JavaScript?
Ultimate Question:
Is there a better solution?

The good news is that JavaScript is single threaded; this means that solutions will generally work well with "shared" variables, i.e. no mutex locks are required.
If you want to serialize async tasks, followed by a completion callback, you could use this helper function:
function serializeTasks(arr, fn, done) {
    var current = 0;

    fn(function iterate() {
        if (++current < arr.length) {
            fn(iterate, arr[current]);
        } else {
            done();
        }
    }, arr[current]);
}
The first argument is the array of values that need to be passed in each pass, the second argument is a loop callback (explained below) and the last argument is the completion callback function.
This is the loop callback function:
function loopFn(nextTask, value) {
    myFunc1(value, nextTask);
}
The first argument that's passed is a function that will execute the next task; it's meant to be passed to your async function. The second argument is the current entry of your array of values.
Let's assume the asynch task looks like this:
function myFunc1(value, callback) {
    console.log(value);
    callback();
}
It prints the value and afterwards it invokes the callback; simple.
Then, to set the whole thing in motion:
serializeTasks([1, 2, 3], loopFn, function() {
    console.log('done');
});
To parallelize them, you need a different function:
function parallelizeTasks(arr, fn, done) {
    var total = arr.length,
        doneTask = function() {
            if (--total === 0) {
                done();
            }
        };

    arr.forEach(function(value) {
        fn(doneTask, value);
    });
}
And your loop function will be this (only the parameter name changes):
function loopFn(doneTask, value) {
    myFunc1(value, doneTask);
}

The second problem is not really a problem as long as every one of those is in a separate function and the variable is declared correctly (with var); local variables in functions do not interfere with each other.
The first problem is a bit more of a problem. Other people have gotten annoyed, too, and ended up making libraries to wrap that sort of pattern for you. I like async. With it, your code might look like this:
async.each(someArray, myFunc1, myFunc2);
It offers a lot of other asynchronous building blocks, too. I'd recommend taking a look at it if you're doing lots of asynchronous stuff.

You can achieve this by using a jQuery deferred object.
var deferred = $.Deferred();
var success = function () {
    // resolve the deferred with your object as the data
    deferred.resolve({
        result: ...
    });
};
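The snippet above stops short of the loop; a hedged sketch of how it might be combined across multiple iterations (assuming myFunc1 from the question calls its callback when finished; someArray is a placeholder) is to collect one deferred per iteration and wait on all of them with $.when:

var deferreds = [];

someArray.forEach(function (item) {
    var deferred = $.Deferred();
    deferreds.push(deferred);
    myFunc1(function () {
        deferred.resolve(); // mark this iteration as done
    });
});

// $.when waits for every deferred passed to it
$.when.apply($, deferreds).then(function () {
    myFunc2();
});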

With this helper function:
function afterAll(callback, what) {
    what.counter = (what.counter || 0) + 1;
    return function() {
        callback();
        if (--what.counter == 0)
            what();
    };
}
your loop will look like this:
function whenAllDone() { ... }

for (... in ...) {
    myFunc1(afterAll(callback, whenAllDone));
}
Here afterAll creates a proxy function for the callback; the proxy also decrements the counter and calls the whenAllDone function once all callbacks are complete.

A single thread is not always guaranteed, so do not take that for granted.
Case 1:
For example, if we have 2 functions as follows.
var count = 0;

function function1() {
    alert("this thread will be suspended, count: " + count);
}

function function2() {
    // anything
    count++;
    dump(count + "\n");
}
Then, before function1 returns, function2 can also be called; if a single thread were guaranteed, function2 would not be called before function1 returns. You can try this, and you will find that count keeps going up while you are being alerted.
Case 2: with Firefox chrome code, another function can also be called before the first function returns (with no alert inside).
So a mutex lock is indeed needed.

There are many, many ways to achieve this, I hope these suggestions help!
First, I would transform the callback into a promise! Here is one way to do that:
function aPromise(arg) {
    return new Promise((resolve, reject) => {
        aCallback(arg, (err, result) => {
            if (err) reject(err);
            else resolve(result);
        });
    });
}
Next, use reduce to process the elements of an array one by one!
const arrayOfArg = ["one", "two", "three"];
const promise = arrayOfArg.reduce(
    (promise, arg) => promise.then(() => aPromise(arg)), // after the previous promise, return the result of the aPromise function as the next promise
    Promise.resolve(null) // initial resolved promise
);

promise.then(() => {
    // carry on
});
If you want to process all elements of an array at the same time, use map and Promise.all!
const arrayOfArg = ["one", "two", "three"];
const promise = Promise.all(arrayOfArg.map(
    arg => aPromise(arg)
));

promise.then(() => {
    // carry on
});
If you are able to use async / await then you could just simply do this:
// (note: the await below must run inside an async function)
const arrayOfArg = ["one", "two", "three"];
for (let arg of arrayOfArg) {
    await aPromise(arg); // wow
}
// carry on
You might even use my very cool synchronize-async library like this:
const arrayOfArg = ["one", "two", "three"];
const context = {}; // can be any kind of object, this is the threadish context

for (let arg of arrayOfArg) {
    synchronizeCall(aPromise, arg); // synchronize the calls in the given context
}

join(context).then(() => { // join will resolve when all calls in the context are finished
    // carry on
});
And last but not least, use the fine async library if you really don't want to use promises.
const arrayOfArg = ["one", "two", "three"];
async.each(arrayOfArg, aCallback, err => {
    if (err) throw err; // handle the error!
    // carry on
});

Related

How to execute promises in series?

var promiseReturningFuncs = [];
for (var i = 0; i < 5; i++) {
    promiseReturningFuncs.push(askQuestion);
}

var programmers = [];

Promise.reduce(promiseReturningFuncs, function(resp, x) {
    console.log(typeof resp);
    if (typeof resp != "function") {
        programmers.push(resp);
    }
    return x();
})
.then(function(resp) {
    programmers.push(resp);
    console.log(programmers);
});
My goal: execute the askQuestion function in series and resolve an array of objects created by that function. (this function must execute in series so that it can respond to user input)
So imagine that the askQuestion function returns a promise that resolves an object I want to add to an array.
This is my messy way of doing it.
I am looking for a cleaner way of doing it; ideally, I wouldn't even need to push to an array, I would just have a final .then where the response is an array.
Since you appear to be using the Bluebird promise library, you have a number of built-in options for sequencing your promise returning functions. You can use Promise.reduce(), Promise.map() with a concurrency value of 1, Promise.mapSeries or Promise.each(). If the iterator function returns a promise, all of these will wait for the next iteration until that promise resolves. Which to use depends more upon the mechanics of how your data is structured and what result you want (neither of which you actually show or describe).
Let's suppose you have an array of promise returning functions and you want to call them one at a time, waiting for the one to resolve before calling the next one. If you want all the results, then I'd suggest Promise.mapSeries():
let arrayOfPromiseReturningFunctions = [...];

// call all the promise returning functions in the array, one at a time
// wait for one to resolve before calling the next
Promise.mapSeries(arrayOfPromiseReturningFunctions, function(fn) {
    return fn();
}).then(function(results) {
    // results is an array of resolved results from all the promises
}).catch(function(err) {
    // process error here
});
Promise.reduce() could also be used, but it would accumulate a single result, passing it from one to the next and end with one final result (like Array.prototype.reduce() does).
Promise.map() is a more general version of Promise.mapSeries() that lets you control the concurrency number (the number of async operations in flight at the same time).
Promise.each() will also sequence your functions, but does not accumulate a result. It assumes you either don't have a result or you are accumulating the result out-of-band or via side effects. I tend to not like to use Promise.each() because I don't like side effect programming.
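As a hedged sketch of the concurrency-controlled variant described above, Bluebird's Promise.map with a concurrency option of 1 behaves like mapSeries:

// Same sequential behaviour as Promise.mapSeries, expressed via the concurrency option
Promise.map(arrayOfPromiseReturningFunctions, function(fn) {
    return fn();
}, { concurrency: 1 }).then(function(results) {
    // results arrive in the same order as the input array
});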
You could solve this in pure JS using ES6 (ES2015) features:
function processArray(arr, fn) {
    return arr.reduce(
        (p, v) => p.then((a) => fn(v).then(r => a.concat([r]))),
        Promise.resolve([])
    );
}
It applies the function given to the array in series and resolves to an array of the results.
Usage:
const numbers = [0, 4, 20, 100];
const multiplyBy3 = (x) => new Promise(res => res(x * 3));

// Prints [ 0, 12, 60, 300 ]
processArray(numbers, multiplyBy3).then(console.log);
You'll want to double check browser compatibility but this works on reasonably current Chrome (v59), NodeJS (v8.1.2) and probably most others.
You can use recursion so that you can move to the next iteration in a then block.
function promiseToExecuteAllInOrder(promiseReturningFunctions /* array of functions */) {
    var resolvedValues = [];

    return new Promise(function(resolve, reject) {
        function executeNextFunction() {
            var nextFunction = promiseReturningFunctions.pop();
            if (nextFunction) {
                nextFunction().then(function(result) {
                    resolvedValues.push(result);
                    executeNextFunction();
                });
            } else {
                resolve(resolvedValues);
            }
        }
        executeNextFunction();
    });
}
Executing one after another using a recursive function (in a non-promise way):
(function iterate(i, result, callback) {
    if (i > 5) return callback(result);
    askQuestion().then(res => iterate(i + 1, result.concat([res]), callback));
})(0, [], console.log);
For sure this can be wrapped in a promise:
function askFive() {
    return new Promise(function(callback) {
        (function iterate(i, result) {
            if (i > 5) return callback(result);
            askQuestion().then(res => iterate(i + 1, result.concat([res])));
        })(0, []);
    });
}
askFive().then(console.log);
Or:
function afteranother(i, promise) {
    return new Promise(function(resolve) {
        if (!i) return resolve([]);
        afteranother(i - 1, promise).then(val => promise().then(val2 => resolve(val.concat([val2]))));
    });
}
afteranother(5,askQuestion).then(console.log);

Nodejs callback next loop when function is finished

Supposing we have a script that will execute a certain task for each row in an array.
function execute(err, array) {
    loop(array, function(err, object) {
        console.log(object);
        // do a certain task; when it's finished, move on to the next object successively
    });
}

function loop(array, callback) {
    array.forEach(function(object) {
        callback(null, object);
    });
}

function array(callback) {
    callback(null, [1, 2, 3, 4, 5]);
}

setTimeout(function() {
    array(execute);
}, 6000);
Questions:
How to get into the next loop only after finishing the task?
Is my function considered asynchronous?
You can do something like this to iterate over your array:
var myarray = [1, 2, 3, 4, 5];
next(myarray, 0);

function next(array, idx) {
    if (idx !== array.length) {
        // do something
        console.log(array[idx]);
        // run another function with a callback
        another_fct(array[idx], function() {
            next(array, idx + 1); // then the next loop
        });
        // or run the next loop directly instead (not both):
        // next(array, idx + 1);
    } else {
        // the entire array has been processed
        console.log('an array of ' + array.length + ' elements has been processed');
    }
}

function another_fct(array_element, callback) {
    // do something with array_element
    console.log('value : ' + array_element);
    callback(); // run the next loop after processing
}
This method will process your array elements sequentially.
Try this:
function execute(err, array) {
    loop(array, function(err, object, next) {
        console.log(object);
        next(); // this will call recur inside loop function
    }, function() {
        console.log('All done');
    });
}

function loop(array, callback, finish) {
    var copy = array.slice();
    (function recur() {
        var item = copy.shift();
        if (item) {
            callback(null, item, recur);
        } else {
            if (typeof finish == 'function') {
                finish();
            }
        }
    })();
}
No, your function is not asynchronous, but you call it asynchronously using setTimeout.
async provides async.series and other helpers for this purpose.
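A minimal sketch of async.series (the task bodies here are placeholders; each task receives a callback and must call it when done):

var async = require('async');

async.series([
    function(cb) {
        // first task; report completion (and optionally a result)
        cb(null, 'one');
    },
    function(cb) {
        // second task; runs only after the first has called its callback
        cb(null, 'two');
    }
], function(err, results) {
    // runs after every task above has finished, in order
    console.log(results); // ['one', 'two']
});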
You can wrap your head around it better if you refactor the code and remove any redundant anonymous functions. The thing is that passing an anonymous function as an argument to another function does not by itself make that function asynchronous. You can read a more in-depth explanation in "Does taking a callback make a function asynchronous?".
From the refactored version,
setTimeout(function() {
    [1, 2, 3, 4, 5].forEach(function(object) {
        console.log(object);
        // do a certain task when it's...
    });
}, 6000);
it can be seen that forEach is called on the array. The forEach method executes the provided function once per array element, and it does so synchronously.
So, to answer your question:
yes, it calls the next item only after finishing the previous one (but if doing anything asynchronous, see below)
this is not considered asynchronous since it doesn't perform any asynchronous operations (not considering the outer setTimeout)
However, if you choose to start an asynchronous operation in the forEach function, then things change considerably. The result is that all operations are in-progress at the same time. This is potentially a hazardous operation resource-wise. There are libraries such as async for handling this use case gracefully.
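For instance, a hedged sketch using async.eachLimit to cap how many of those operations are in flight at once (items and processItem are placeholder names):

var async = require('async');

// At most 3 async operations run at the same time.
async.eachLimit(items, 3, function(item, done) {
    processItem(item, done); // hypothetical async task; must call done(err) when finished
}, function(err) {
    if (err) return console.error(err);
    console.log('all items processed');
});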
(Btw, good use of Node.js errbacks, where the first function argument is reserved for passing a potential error.)

Javascript how to execute code after for loop completes

I'm trying to work through this js/async scenario and I'm trying to learn how the rest of the JS world handles this.
function doStuff(callback) {
    cursor.each(function(err, blahblah) {
        // ...doing stuff here takes some time
    });

    // ... Execute this code ONLY after the `cursor.each` loop is finished
    callback();
}
EDIT
Here's a more concrete example updated using most of the suggestions below which still doesn't work.
function doStuff(callback) {
    MongoClient.connect(constants.mongoUrl, function(err, db) {
        var collection = db.collection('cases2');
        var cursor = collection.find();
        var promises = []; // array for storing promises

        cursor.each(function(err, item) {
            console.log('inside each'); // NEVER GETS LOGGED UNLESS I COMMENT OUT THIS LINE: return Q.all(promises).then(callback(null, items));
            var def = Q.defer(); // Create deferred object and store
            promises.push(def.promise); // Its promise in the array
            if (item == null) {
                return def.resolve();
            }
            def.resolve(); // resolve the promise
        });

        console.log('items'); // ALWAYS GETS CALLED
        console.log(items);

        // IF I COMMENT THIS LINE OUT COMPLETELY,
        // THE LOG STATEMENT INSIDE CURSOR.EACH ACTUALLY GETS LOGGED
        return Q.all(promises).then(callback(null, items));
    });
}
Without using promises or any other dependencies/libraries, you can simply add a counter:
function doStuff(callback) {
    var cursor = new Array(); // init with some array data
    var cursorTasks = cursor.length;

    function cursorTaskComplete() {
        cursorTasks--;
        if (cursorTasks <= 0) {
            // this gets called after every task has reported to be complete
            callback();
        }
    }

    for (var i = 0; i < cursor.length; i++) {
        // ...doing stuff here takes some time and does some async stuff
        // check after each async request:
        // ...when the async operation is complete, call
        cursorTaskComplete();
    }
}
Without knowing the details of the async calls you're making within the cursor.each loop, I shall assume that you have the ability to invoke a callback each time the functions invoked therein have completed their async task:
function doStuff() {
    var promises = []; // array for storing promises
    cursor.each(function(err, blahblah) {
        var def = Q.defer(); // create deferred object and store
        promises.push(def.promise); // its promise in the array
        call_async_function(..., def.resolve); // resolve the promise in the async function's callback
    });
    // pass the array to Q.all, only when all are resolved will "callback" be called
    return Q.all(promises);
}
and the usage then becomes:
doStuff().then(callback)
Note how the invocation of the callback now never touches the doStuff function - that function now also returns a promise. You can now register multiple callbacks, failure callbacks, etc, all without modifying doStuff. This is called "separation of concerns".
[NB: all the above based on the Q promises library - https://github.com/kriskowal/q]
EDIT: further discussion and experimentation has determined that the .each call is itself async, and gives no indication to the outside when the last row has been seen. I've created a Gist that demonstrates a resolution to this problem.
If you want to do it with the async module, you can make use of the async forEachSeries function.
Code snippet:
function doStuff(callback) {
    async.forEachSeries(cursor, function(cursorSingleObj, callbackFromForEach) {
        // ...do stuff which takes time
        // this callback is to tell when everything gets over, execute the next function
        callbackFromForEach();
    }, function() {
        // over here the execution of forEach gets over and then the main callback is called
        callback();
    });
}
In my mind an elegant/ideal solution would be to have something like
cursor.each(........).then( function() { ....your stuff});
But without that you can do this (UPDATED):
http://plnkr.co/edit/27l7t5VLszBIW9eFW4Ip?p=preview
The gist of this is as shown below.
var allMyAsyncPromises = [];

var doStuff = function(callback) {
    cursor.forEach(function(cursorStep) {
        var deferred = $q.defer();
        var promise = deferred.promise;
        allMyAsyncPromises.push(promise);
        cursorStep.execFn(cursorStep.stepMeta);
        deferred.resolve();
    });
    $q.all(allMyAsyncPromises).then(callback);
};
After hitting the start button, wait for a few seconds; the async tasks have been simulated to finish in 5 seconds, so the status will update accordingly.
Not having access to a real cursor object, I had to resort to a fake cursor-like array.

Iterate over array running async task on each element

I have a list of items, and I want to run an async task on each of them.
I want to synchronize so each element will be processed once the preceding element has completed. What I've tried so far is:
function processItems(items) {
    var i = 0;
    while (i < items.length) {
        asyncFunc(items[i], function() { i++; }); // asyncFunc takes a callback parameter
    }
}
However this loops forever (I believe i is out of scope in the callback function).
Is there a better way to achieve this?
I think the following accomplishes what you're looking for:
function processItems(items) {
    var i = 0,
        length = items.length,
        fn = function() {
            if (i < length) {
                asyncFunc(items[i], fn);
                i++;
            }
        };
    fn();
}
fn is a function that sets the callback equal to itself as long as the index is less than the length. Then I call fn once to start it off. Here's a fiddle:
http://jsfiddle.net/8A8CG/
Alternatively, if asyncFunc returns a promise, you can use Array#reduce to process the items in a series:
function processItems(items) {
    return items.reduce((promise, item) => {
        return promise.then(() => asyncFunc(item));
    }, Promise.resolve());
}
If you need to execute a set of async functions in series, I highly recommend the async module for node.
It offers a series method to execute async tasks in series. It also offers a waterfall method that will pass the results of the previous task to the next task.
Oops. You are looking for eachSeries. Here's how it might work for you:
var async = require('async');

async.eachSeries(items, asyncFunc, function(err) {
    if (err) return "OH NO";
    return "Yay!";
});
Make sure asyncFunc invokes a callback with either an error or null.
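Since the answer above also mentions series and waterfall, here is a minimal hedged sketch of async.waterfall, where each task passes its result on to the next (the task bodies are placeholders):

var async = require('async');

async.waterfall([
    function(cb) {
        cb(null, 'first result'); // pass a result on to the next task
    },
    function(previousResult, cb) {
        cb(null, previousResult + ' -> second result');
    }
], function(err, finalResult) {
    if (err) return console.error(err);
    console.log(finalResult); // 'first result -> second result'
});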

How to execute a Javascript function only after multiple other functions have completed?

My specific problem is that I need to execute a (potentially) large number of Javascript functions to prepare something like a batch file (each function call adds some information to the same batch file) and then, after all those calls are completed, execute a final function to send the batch file (say, send it as an HTML response). I'm looking for a general Javascript programming pattern for this.
Generalized problem:
Given the Javascript functions funcA(), funcB(), and funcC(), I would like to figure out the best way to order execution so that funcC is only executed after funcA and funcB have executed. I know that I could use nested callback functions like this:
funcA = function() {
    // Does funcA stuff
    funcB();
};

funcB = function() {
    // Does funcB stuff
    funcC();
};

funcA();
I could even make this pattern a little more general by passing in callback parameters; however, this solution becomes quite verbose.
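As a hedged sketch, the callback-parameter version alluded to above might look like this (function bodies are placeholders):

funcA = function(next) {
    // Does funcA stuff, then hands control to whatever was passed in
    next();
};

funcB = function(next) {
    // Does funcB stuff
    next();
};

// Still strictly serial, and the nesting grows with every step you add:
funcA(function() {
    funcB(function() {
        funcC();
    });
});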
I am also familiar with Javascript function chaining where a solution might look like:
myObj = {};
myObj.answer = "";

myObj.funcA = function() {
    // Do some work on this.answer
    return this;
};

myObj.funcB = function() {
    // Do some more work on this.answer
    return this;
};

myObj.funcC = function() {
    // Use the value of this.answer now that funcA and funcB have made their modifications
    return this;
};

myObj.funcA().funcB().funcC();
While this solution seems a little cleaner to me, as you add more steps to the computation, the chain of function executions grows longer and longer.
For my specific problem, the order in which funcA, funcB, etc. are executed DOES NOT matter. So in my solutions above, I am technically doing more work than is required, because I am placing all the functions in a serial ordering. All that matters to me is that funcC (some function for sending the result or firing off a request) is only called after funcA and funcB have ALL completed execution. Ideally, funcC could somehow listen for all the intermediate function calls to complete and THEN execute. I'm hoping to learn a general Javascript pattern to solve such a problem.
Thanks for your help.
Another Idea:
Maybe pass a shared object to funcA and funcB, and when they complete execution mark the shared object like sharedThing.funcA = "complete" or sharedThing.funcB = "complete", and then somehow have funcC execute when the shared object reaches a state where all fields are marked complete. I'm not sure how exactly you could make funcC wait for this.
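A minimal sketch of that shared-object idea (names are placeholders; a check runs after every completion mark):

var sharedThing = { funcA: "pending", funcB: "pending" };

function checkDone() {
    var allComplete = Object.keys(sharedThing).every(function(key) {
        return sharedThing[key] === "complete";
    });
    if (allComplete) {
        funcC(); // fires only once, when the last field is marked complete
    }
}

funcA = function() {
    // ...do funcA's async work, then in its completion callback:
    sharedThing.funcA = "complete";
    checkDone();
};

funcB = function() {
    // ...do funcB's async work, then in its completion callback:
    sharedThing.funcB = "complete";
    checkDone();
};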
Edit:
I should note that I'm using server-side Javascript (Node.js) and I would like to learn a pattern to solve it just using plain old Javascript (without the use of jQuery or other libraries). Surely this problem is general enough that there is a clean pure-Javascript solution?
If you want to keep it simple, you can use a counter-based callbacks system. Here's a draft of a system that allows when(A, B).then(C) syntax. (when/then is actually just sugar, but then again the whole system arguably is.)
var when = function() {
    var args = arguments; // the functions to execute first
    return {
        then: function(done) {
            var counter = 0;
            for (var i = 0; i < args.length; i++) {
                // call each function with a function to call on done
                args[i](function() {
                    counter++;
                    if (counter === args.length) { // all functions have notified they're done
                        done();
                    }
                });
            }
        }
    };
};
Usage:
when(
    function(done) {
        // do things
        done();
    },
    function(done) {
        // do things
        setTimeout(done, 1000);
    },
    ...
).then(function() {
    // all are done
});
If you don't use any asynchronous functions and your script doesn't break the order of execution, then the most simple solution is, as stated by Pointy and others:
funcA();
funcB();
funcC();
However, since you're using node.js, I believe you're going to use asynchronous functions and want to execute funcC after an async IO request has finished, so you have to use some kind of counting mechanism, for example:
var call_after_completion = function(callback) {
    this._callback = callback;
    this._args = [].slice.call(arguments, 1);
    this._queue = {};
    this._count = 0;
    this._run = false;
};

call_after_completion.prototype.add_condition = function(str) {
    if (this._queue[str] !== undefined)
        throw new TypeError("Identifier '" + str + "' used twice");
    else if (typeof str !== "string" && str.toString === undefined)
        throw new TypeError("Identifier has to be a string or needs a toString method");

    this._queue[str] = 1;
    this._count++;
    return str;
};

call_after_completion.prototype.remove_condition = function(str) {
    if (this._queue[str] === undefined) {
        console.log("Removal of condition '" + str + "' has no effect");
        return;
    }
    else if (typeof str !== "string" && str.toString === undefined)
        throw new TypeError("Identifier has to be a string or needs a toString method");

    delete this._queue[str];

    if (--this._count === 0 && this._run === false) {
        this._run = true;
        this._callback.apply(null, this._args);
    }
};
You can simplify this object by ignoring the identifier str and just increasing/decreasing this._count, however this system could be useful for debugging.
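A hedged sketch of that simplified, count-only variant (under the hypothetical name call_after_count):

// Same idea without string identifiers: just count pending conditions.
function call_after_count(callback) {
    this._callback = callback;
    this._count = 0;
}
call_after_count.prototype.add_condition = function() {
    this._count++;
};
call_after_count.prototype.remove_condition = function() {
    if (--this._count === 0) {
        this._callback();
    }
};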
In order to use call_after_completion you simply create a new call_after_completion with your desired function func as argument and add_conditions. func will only be called if all conditions have been removed.
Example:
var foo = function() { console.log("foo"); };
var bar = new call_after_completion(foo);
var i;

bar.add_condition("foo:3-Second-Timer");
bar.add_condition("foo:additional function");
bar.add_condition("foo:for-loop-finished");

function additional_stuff(cond) {
    console.log("additional things");
    cond.remove_condition("foo:additional function");
}

for (i = 0; i < 1000; ++i) {
}

console.log("for loop finished");
bar.remove_condition("foo:for-loop-finished");
additional_stuff(bar);

setTimeout(function() {
    console.log("3 second timeout");
    bar.remove_condition("foo:3-Second-Timer");
}, 3000);
If you don't want to use any helper libraries, then you need to write some helper yourself; there's no simple one-line solution for this.
If you'd like to end up with something that looks as readable as it would in the synchronous case, try some deferred/promise concept implementation (it's still plain JavaScript), e.g. using the deferred package you may end up with something as simple as:
// Invoke one after another:
funcA()(funcB)(funcC);
// Invoke funcA and funcB simultaneously and afterwards funcC:
funcA()(funcB())(funcC);
// If want result of both funcA and funcB to be passed to funcC:
deferred(funcA(), funcB())(funcC);
Have a look into jQuery's deferred objects. This provides a sophisticated means of controlling what happens when in an asynchronous environment.
The obvious use-case for this is AJAX, but it is not restricted to this.
Resources:
jQuery docs: deferred object
good introduction to deferred object patterns
Non-AJAX use for jQuery's deferred objects
I was looking for the same kind of pattern. I am using APIs that interrogate multiple remote data sources. The APIs each require that I pass a callback function to them. This means that I cannot just fire off a set of my own functions and wait for them to return. Instead I need a solution that works with a set of callbacks that might be called in any order depending on how responsive the different data sources are.
I came up with the following solution. JS is way down the list of languages that I am most familiar with, so this may not be very idiomatic JS.
function getCallbackCreator(number_of_data_callbacks, final_callback) {
    var all_data = {};
    return function(data_key) {
        return function(data_value) {
            all_data[data_key] = data_value;
            if (Object.keys(all_data).length == number_of_data_callbacks) {
                final_callback(all_data);
            }
        };
    };
}
var getCallback = getCallbackCreator(2, inflatePage);

myGoogleDataFetcher(getCallback('google'));
myCartoDataFetcher(getCallback('cartodb'));
Edit: The question was tagged with node.js but the OP said, "I'm looking for a general Javascript programming pattern for this," so I have posted this even though I am not using node.
Nowadays, one can do something like this:
Let's say we have funcA, funcB and funcC.
If one wants the results of funcA and funcB to be passed to funcC:
var promiseA = new Promise(async (resolve, reject) => {
    resolve(await funcA());
});

var promiseB = new Promise(async (resolve, reject) => {
    resolve(await funcB());
});

var promise = Promise.all([promiseA, promiseB]).then(results => {
    // results = [result from funcA, result from funcB]
    return funcC(results);
});
If one wants funcA, then funcB and then funcC:
var promise = (
new Promise(async resolve => resolve( await funcA() ))
).then(result_a => funcB(result_a)).then(result_b => funcC(result_b));
And finally:
promise.then(result_c => console.log('done.'));
how about:
funcC(funcB(funcA));
I think the question exists because some of the functions run longer, and there might be a situation where we run funcC while funcA or funcB has not finished executing.
