How to execute promises in series? - javascript

var promiseReturningFuncs = [];
for (var i = 0; i < 5; i++) {
    promiseReturningFuncs.push(askQuestion);
}
var programmers = [];
Promise.reduce(promiseReturningFuncs, function(resp, x) {
    console.log(typeof resp);
    if (typeof resp != "function") {
        programmers.push(resp);
    }
    return x();
})
.then(function(resp) {
    programmers.push(resp);
    console.log(programmers);
});
My goal: execute the askQuestion function in series and resolve an array of objects created by that function. (The function must execute in series so that it can respond to user input.)
So imagine that the askQuestion function returns a promise that resolves to an object I want to add to an array.
This is my messy way of doing it.
I am looking for a cleaner way of doing it; ideally, I wouldn't even need to push to an array, I would just have a final .then where the response is an array.

Since you appear to be using the Bluebird promise library, you have a number of built-in options for sequencing your promise-returning functions. You can use Promise.reduce(), Promise.map() with a concurrency value of 1, Promise.mapSeries() or Promise.each(). If the iterator function returns a promise, all of these will wait for that promise to resolve before starting the next iteration. Which to use depends more upon the mechanics of how your data is structured and what result you want (neither of which you actually show or describe).
Let's suppose you have an array of promise-returning functions and you want to call them one at a time, waiting for each one to resolve before calling the next. If you want all the results, then I'd suggest Promise.mapSeries():
let arrayOfPromiseReturningFunctions = [...];
// call all the promise returning functions in the array, one at a time
// wait for one to resolve before calling the next
Promise.mapSeries(arrayOfPromiseReturningFunctions, function(fn) {
    return fn();
}).then(function(results) {
    // results is an array of resolved results from all the promises
}).catch(function(err) {
    // process error here
});
Promise.reduce() could also be used, but it would accumulate a single result, passing it from one to the next and end with one final result (like Array.prototype.reduce() does).
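For example, a rough sketch of how Promise.reduce() could collect the same array of results (assuming Bluebird's (input, reducer, initialValue) signature); mapSeries() above is usually the cleaner choice:
Promise.reduce(arrayOfPromiseReturningFunctions, function(results, fn) {
    return fn().then(function(result) {
        results.push(result);   // append this step's result to the accumulator
        return results;         // hand the accumulator to the next iteration
    });
}, []).then(function(results) {
    // results is the array of resolved values, in order
});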
Promise.map() is a more general version of Promise.mapSeries() that lets you control the concurrency number (the number of async operations in flight at the same time).
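A minimal sketch of that, again assuming Bluebird, which should behave like mapSeries() when the concurrency is 1:
Promise.map(arrayOfPromiseReturningFunctions, function(fn) {
    return fn();   // the next call doesn't start until this promise resolves
}, { concurrency: 1 }).then(function(results) {
    // results is in the same order as the input array
});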
Promise.each() will also sequence your functions, but does not accumulate a result. It assumes you either don't have a result or you are accumulating the result out-of-band or via side effects. I tend to not like to use Promise.each() because I don't like side effect programming.

You could solve this in pure JS using ES6 (ES2015) features:
function processArray(arr, fn) {
    return arr.reduce(
        (p, v) => p.then((a) => fn(v).then(r => a.concat([r]))),
        Promise.resolve([])
    );
}
It applies the function given to the array in series and resolves to an array of the results.
Usage:
const numbers = [0, 4, 20, 100];
const multiplyBy3 = (x) => new Promise(res => res(x * 3));
// Prints [ 0, 12, 60, 300 ]
processArray(numbers, multiplyBy3).then(console.log);
You'll want to double check browser compatibility but this works on reasonably current Chrome (v59), NodeJS (v8.1.2) and probably most others.
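If your environment already supports async/await, roughly the same thing can be written as a plain loop (a sketch of the same idea, not a drop-in replacement for older runtimes):
async function processArray(arr, fn) {
    const results = [];
    for (const item of arr) {
        results.push(await fn(item)); // wait for each call before starting the next
    }
    return results;
}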

You can use recursion so that you can move to the next iteration in a then block.
function promiseToExecuteAllInOrder(promiseReturningFunctions /* array of functions */) {
    var resolvedValues = [];
    return new Promise(function(resolve, reject) {
        function executeNextFunction() {
            // shift() takes functions from the front so they run in their original order
            var nextFunction = promiseReturningFunctions.shift();
            if (nextFunction) {
                nextFunction().then(function(result) {
                    resolvedValues.push(result);
                    executeNextFunction();
                }).catch(reject);
            } else {
                resolve(resolvedValues);
            }
        }
        executeNextFunction();
    });
}

Executing one after another using a recursive function (in a non-promise way):
(function iterate(i, result, callback) {
    if (i >= 5) return callback(result);
    askQuestion().then(res => iterate(i + 1, result.concat([res]), callback));
})(0, [], console.log);
For sure this can be wrapped in a promise:
function askFive() {
    return new Promise(function(callback) {
        (function iterate(i, result) {
            if (i >= 5) return callback(result);
            askQuestion().then(res => iterate(i + 1, result.concat([res])));
        })(0, []);
    });
}
askFive().then(console.log);
Or:
function afteranother(i, promise) {
    return new Promise(function(resolve) {
        if (!i) return resolve([]);
        afteranother(i - 1, promise).then(val => promise().then(val2 => resolve(val.concat([val2]))));
    });
}
afteranother(5,askQuestion).then(console.log);

Related

Why is the variable undefined?

I can access the variable 'savedCards' from the first promise and it has a value. Inside the second promise it's undefined, but the variable 'eCard' has a value. Please explain why.
saveCard(eCard: IEcards) {
    var savedCards: IEcards[] = [];
    this.storage.get("key").then((value) => {
        if (value.saves == undefined) {
            var saves = savedCards;
            value.saves = saves;
        }
        savedCards = value.saveCard; // have value and can be accessed
        console.log(savedCards);
    }).then((data) => {
        console.log(savedCards); // savedCards is undefined but eCard.id has value
        this.globalProvider.IsCardExist(eCard.id, savedCards).then((data) => {
            if (!data.response) {
                this.globalProvider.AddEcardToStorage("saves", eCard);
            }
        });
    });
}
When you need to access intermediate values in your chain, you should split your chain apart into the single pieces that you need. Instead of attaching one callback and somehow trying to use its parameter multiple times, attach multiple callbacks to the same promise - wherever you need the result value.
function getExample() {
    var a = promiseA(…);
    var b = a.then(function(resultA) {
        // some processing
        return promiseB(…);
    });
    return Promise.all([a, b]).then(function([resultA, resultB]) {
        // more processing
        return // something using both resultA and resultB
    });
}
[Edit]
You want to know why, and here is the answer: ES6 came with generator functions, which allow you to break execution apart into pieces at arbitrarily placed yield keywords. Those slices can be run one after another, independently, even asynchronously - and that's just what we do when we want to wait for a promise to resolve before running the next step.
Your code is resolving the second promise before the first one. You cannot be sure that your code will work as you want just by using then(). If you want a synchronous resolution, you should go another way.
[Edit 2]
Try to use await and see if you are able to solve your problem. More info here: http://2ality.com/2017/08/promise-callback-data-flow.html
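Applied to the snippet above, a minimal sketch of that advice would be to return the value the next step needs from the first callback instead of relying on the outer savedCards variable (this assumes this.storage.get("key") resolves to the stored object; adjust the property names to whatever is actually stored):
saveCard(eCard: IEcards) {
    this.storage.get("key").then((value) => {
        // return the saved cards so the next then receives them as its parameter
        return value.saves || [];
    }).then((savedCards) => {
        return this.globalProvider.IsCardExist(eCard.id, savedCards).then((data) => {
            if (!data.response) {
                this.globalProvider.AddEcardToStorage("saves", eCard);
            }
        });
    });
}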

Flattening promise chain with readable function name

I saw promise implementation in Handling multiple catches in promise chain which produce a very readable chain
return validateInput
.then(checkLoginPermission)
.then(checkDisableUser)
.then(changePassword);
However, in order to do this, does each function need to return a value instead of a Promise? Since a Promise can resolve to either a value or another Promise, this is not a problem. My goal is to give every function readable, clear logic like this.
The problem occurs when trying to unwind the nested promise function
return validateInput
    .then(function(resultA) {
        return checkLoginPermission
            .then(function(resultB) {
                // Do something with resultA
            })
    });
Imagine the original implementation involves accessing the value from a previous promise. With nested promises, that is easily achievable. But with a flattened chain, I would need to break up each function like this:
const validateInput = function (resultA) {
    return Promise.resolve({resultA: resultA, resultB: ...})
}
const checkLoginPermission = function (mix) {
    let resultA = mix.resultA;
    let resultB = mix.resultB;
    // Do something with resultA
    ...
}
This is worse when the last function in the chain relies on something from the very beginning. That means the value has to be passed down from the start of the chain even where it is not used.
So am I accidentally stepping into some kind of anti-pattern that might affect performance? How else can I achieve good readability without all these hassles?
This is actually where async and await come in. It's good when you need results across multiple asynchronous calls/promises to be in scope. If you can use that, I'd say try it.
async function foo () {
    const input = await validateInput()
    const hasPermission = await checkLoginPermission(input)
    const result = await checkDisableUser(hasPermission)
    return await changePassword(result)
}
Just pass the variables into whichever functions need them; the above is only an example. I was also a bit unsure of how you're setting validateInput; I think you need to put await in front of the function call itself.
If you cannot use async/await, I usually go with your 2nd code snippet, or define the higher-scope variables on top:
let resultA
return validateInput
    .then(function(result) {
        resultA = result
        return checkLoginPermission
            .then(function(resultB) {
                // Do something with resultA
            })
    });
Promises are a pattern related to functional programming, where passing data directly from one function to the next is fundamental (it's called compose; examples here: http://scott.sauyet.com/Javascript/Talk/Compose/2013-05-22/). So it's not an anti-pattern by any means.
I can't see any problem with such a pattern. You can pass any data you want to the next Promises, and in nested Promises grab what they need. It's pretty transparent and clear:
function validateInput() {
    return Promise.resolve({resultA: 1});
}
function checkLoginPermission(result) {
    return new Promise(function(resolve, reject) {
        // ...
        // code
        // ...
        result.resultB = 2;
        return resolve(result);
    });
}
function checkDisableUser(result) {
    return new Promise(function(resolve, reject) {
        // grab some data from previous function
        let resultB = result.resultB;
        // ...
        // code
        // ...
        result.resultC = 3;
        return resolve(result);
    });
}
function changePassword(result) {
    return new Promise(function(resolve, reject) {
        // grab some data from previous functions
        let resultB = result.resultB;
        let resultC = result.resultC;
        // ...
        // code
        // ...
        result.resultD = resultB * resultC;
        return resolve(result);
    });
}
validateInput()
    .then(checkLoginPermission)
    .then(checkDisableUser)
    .then(changePassword);
You could also collect data in some variable declared before the Promises, so you would not have to pass result along. But that would destroy the functional nature of Promises.
The inner .then(/* ... */) callbacks can return either a primitive value or a Promise that resolves to some value. If it is another promise then the next .then won't start until the inner promise is resolved. Essentially, Promises always resolve to a non-promise type. If you resolve or return another Promise, it will be automatically unwrapped.
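A tiny sketch of that unwrapping behavior:
Promise.resolve(1)
    .then(() => Promise.resolve(42))      // returning a promise here...
    .then(value => console.log(value));   // ...logs 42 once that promise resolves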
I would like to propose a solution using ramda.js#pipeP().
The good thing about this function is that it resolves promises sequentially.
We can rewrite your example using pipeP():
import pipeP from 'ramda/src/pipeP'
pipeP(
    checkLoginPermission,
    checkDisableUser,
    changePassword
)(initialValue)
    .then(responseChangePassword => { ... })
The results of a previous promise are passed to the following one.

Is this a good way to generate a chain of promises for an array?

I have an array of items that I want to insert into an SQL server. I am using promises for this and in order to execute each insert sequentially I wrote the following method:
var ForeachPromise = function (array, func) {
    var promise = func(array[0]);
    for (var i = 1; i < array.length; i++) {
        promise = promise.then(function() { return func(array[i]) });
    }
    return promise;
}
The idea is that when func is called it will return a promise, which will then be chained to the previous promise.
...
return ForeachPromise(type.subprocessen, function(subproces) {
    return newSubproces(subproces, typeId, dienstId, createData, s + 1);
});
I haven't actually tested it yet, but I assume that something like this will work. My question however is am I using promises correctly? Promises are great but easily misunderstood and I just want to be sure that I'm not making any fundamental mistakes.
Yes, that approach is fine, and works well with promises. Two minor quibbles:
You should take care of the case of an empty array. Start your chain with Promise.resolve() (a promise fulfilled with undefined), and begin your loop at index 0.
As the then callback is asynchronous, your i variable has the wrong value by the time it runs - the classic closure-in-a-loop fallacy (an alternative using let is sketched after the code below).
Using the .reduce method does help with both problems:
function foreachPromise(array, func) {
    return array.reduce(function(promise, elem, i) {
        return promise.then(function() { return func(elem) });
    }, Promise.resolve());
}
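For completeness, the closure problem in the original loop would also disappear if the index were declared with ES6 let, since block scoping gives each iteration its own i; the reduce version above is still the tidier fix. A minimal sketch, assuming ES6 is available:
function foreachPromise(array, func) {
    var promise = Promise.resolve();  // also handles the empty-array case
    for (let i = 0; i < array.length; i++) {  // let: each iteration captures its own i
        promise = promise.then(function() { return func(array[i]); });
    }
    return promise;
}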

Generator returns undefined for a value while waiting on promise to resolve

I have a function connectImpl referenced in multiple places. I am trying to invoke this promise and return its value to the calling function synchronously by going through a generator. If I call .next() on the generator, it returns the promise in a pending state:
{ value: { state: 'pending' }, done: false }
I would like to wait on the value of this generator until it is no longer pending. I have tried multiple versions of waitOn to accomplish this, but I cannot seem to make it work properly.
I am open to implementation suggestions. This is driving me a bit batty. Surprisingly, the final 'created' log from the promise fires later in the execution chain - after the generator is done. I am obviously missing something:
let models = null;
let connectImpl = function() {
    console.log('connectImpl')
    let orm = setupImpl()
    let config = getConfigImpl()
    let qInitialize = q.nbind(orm.initialize, orm)
    if (models) {
        console.log('connectImpl:cached')
        return q(models)
    } else {
        console.log('connectImpl:create')
        return qInitialize(config).then(function(m) {
            console.log('connectImpl:created')
            models = m
            return models
        })
    }
}
let waitOn = function(generator) {
    console.log('waitOn')
    let done = false;
    let generatorValue = null
    while (!done) {
        var generatorResult = generator.next()
        console.log(generatorResult)
        done = generatorResult.done
        generatorValue = generatorResult.value
    }
    return generatorValue
}
let domainImpl = function() {
    console.log('domainImpl')
    let getConnection = function *() {
        console.log('domainImpl:getConnection')
        yield connectImpl()
    }
    var generator = getConnection()
    return waitOn(generator)
}
console.log('START')
console.log(domainImpl())
console.log('END')
I am able to invoke it and get the following output:
START
domainImpl
waitOn
domainImpl:getConnection
connectImpl
connectImpl:create
{ value: { state: 'pending' }, done: false }
{ value: undefined, done: true }
undefined
END
connectImpl:created
I am able to execute the connectImpl promise within middleware via this function - but I can't seem to adapt this to my use case above:
let domainMiddlewareImpl = function () {
    return function *(next) {
        let models = yield connectImpl()
        this.request.models = models.collections;
        this.request.connections = models.connections;
        yield next
    };
};
This looks fun. Let's see how we can yield promises. Our end goal is to write something like:
waitOn(function*(){
    console.log("hello");
    yield Q.delay(2000); // a placeholder, your calls in your example
    console.log("World"); // this should run two seconds late.
});
Your issue here is that you're yielding the promises without waiting for them. First of all, you can skip to the end for a 'ready' solution (don't!), and here is a fiddle of what we're making. Let's go through implementing waitOn with generators.
Let's start:
function waitOn(gen){
}
So, our function takes a generator, the first thing we'll have to do is invoke it since we need to execute the generator to get its results:
function waitOn(gen){
    let sequence = gen(); // call the generator
}
Next, we'll want to wrap everything in a Promise since our waitOn will yield promises and return a promise for being done itself:
function waitOn(gen){
    let sequence = gen(); // call the generator
    return Promise.resolve(); // this is Q.resolve with Q
}
Now, what cases do we have:
The generator is done and returned a value - that is a return
The generator yielded a regular value and we do not have to wait for it
The generator yielded a promise and we have to wait for it. We also have to deal with exceptions (what if we yield a promise that rejects?)
So our basic structure is something like:
function waitOn(gen){
    let sequence = gen(); // call the generator
    return Promise.resolve().then(function cont(value){
        let {done, value: yielded} = sequence.next(value); // get the next item
        // depending on the case do what's appropriate
    });
}
Note the destructuring assignment - I assume that's OK since your code has ES6 statements in it too (the yielded result is renamed so it doesn't shadow the value parameter). Since this is the first call, value is undefined, but generally we'll want to pass on the value from our last call. Now to handle the cases:
function waitOn(gen){
    let sequence = gen(); // call the generator
    return Promise.resolve().then(function cont(value){
        let {done, value: yielded} = sequence.next(value); // get the next item
        if (done) return yielded; // return case
        if (!yielded || !yielded.then) return cont(yielded); // plain value case, recurse
        return yielded.catch(e => sequence.throw(e)).then(cont); // promise case
    });
}
Note the .catch clause - we're throwing the rejection from the promise back into the generator so it can handle it, which lets us try/catch around the yielded promises.
That's it! In 9 lines of JavaScript we've implemented a generator runner for promises. Now, on to your code - you can yield any promise:
let conn = q.nbind(orm.initialize, orm);
waitOn(function*(){
    console.log("Starting")
    let handle = yield conn(config);
    console.log("Handle created!", handle); // connected here
});
Happy coding and enjoy the power of coroutines. After we've had our fun - it's worth mentioning that Q already ships with Q.async and other newer promise libraries like Bluebird ship with their own (Bluebird has Promise.coroutine). If you're using a promise library - you can utilise those. This implementation works with native promises too.
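For instance, with Q's built-in helper the hand-rolled waitOn isn't needed; a rough sketch using the connectImpl from the question (Bluebird's equivalent would be Promise.coroutine):
// q.async wraps a generator and returns a function that returns a promise
// for the generator's final value; yielded promises are awaited automatically.
let getConnection = q.async(function* () {
    let models = yield connectImpl(); // waits here until the promise resolves
    return models;
});
getConnection().then(function (models) {
    // use models here
});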

How to sync JavaScript callbacks?

I've been developing in JavaScript for quite some time, but I'm not yet a cowboy developer, as one of the many things that always haunts me is syncing JavaScript's callbacks.
I will describe a generic scenario where this concern is raised: I have a bunch of operations to perform multiple times in a for loop, and each of the operations has a callback. After the for loop, I need to perform another operation, but this operation can only execute successfully if all the callbacks from the for loop are done.
Code Example:
for ... in ... {
    myFunc1(callback); // callbacks are executed asynchronously
}
myFunc2(); // can only execute properly if all the myFunc1 callbacks are done
Suggested Solution:
Initialize a counter at the beginning of the loop holding the length of the loop, and have each callback decrement that counter. When the counter hits 0, execute myFunc2. This essentially lets each callback check whether it is the last one in the sequence and, if it is, call myFunc2 when it's done.
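A rough sketch of that counter idea (items here is a stand-in for whatever the for loop iterates over; myFunc1 is assumed to accept a completion callback as in the snippet above):
var pending = items.length;                // one pending slot per operation
items.forEach(function (item) {
    myFunc1(function callback() {
        if (--pending === 0) {             // the last callback to finish...
            myFunc2();                     // ...triggers the follow-up work
        }
    });
});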
Problems:
A counter is needed for every such sequence in your code, and having meaningless counters everywhere is not a good practice.
If you recall thread conflicts in classical synchronization problems, where multiple threads all call var-- on the same variable, undesirable outcomes can occur. Does the same happen in JavaScript?
Ultimate Question:
Is there a better solution?
The good news is that JavaScript is single threaded; this means that solutions will generally work well with "shared" variables, i.e. no mutex locks are required.
If you want to serialize asynch tasks, followed by a completion callback you could use this helper function:
function serializeTasks(arr, fn, done)
{
    var current = 0;
    fn(function iterate() {
        if (++current < arr.length) {
            fn(iterate, arr[current]);
        } else {
            done();
        }
    }, arr[current]);
}
The first argument is the array of values that needs to be passed in each pass, the second argument is a loop callback (explained below) and the last argument is the completion callback function.
This is the loop callback function:
function loopFn(nextTask, value) {
    myFunc1(value, nextTask);
}
The first argument that's passed is a function that will execute the next task, it's meant to be passed to your asynch function. The second argument is the current entry of your array of values.
Let's assume the asynch task looks like this:
function myFunc1(value, callback)
{
    console.log(value);
    callback();
}
It prints the value and afterwards it invokes the callback; simple.
Then, to set the whole thing in motion:
serializeTasks([1, 2, 3], loopFn, function() {
    console.log('done');
});
Demo
To parallelize them, you need a different function:
function parallelizeTasks(arr, fn, done)
{
    var total = arr.length,
        doneTask = function() {
            if (--total === 0) {
                done();
            }
        };
    arr.forEach(function(value) {
        fn(doneTask, value);
    });
}
And your loop function will be this (only parameter name changes):
function loopFn(doneTask, value) {
    myFunc1(value, doneTask);
}
Demo
The second problem is not really a problem as long as every one of those is in a separate function and the variable is declared correctly (with var); local variables in functions do not interfere with each other.
The first problem is a bit more of a problem. Other people have gotten annoyed, too, and ended up making libraries to wrap that sort of pattern for you. I like async. With it, your code might look like this:
async.each(someArray, myFunc1, myFunc2);
It offers a lot of other asynchronous building blocks, too. I'd recommend taking a look at it if you're doing lots of asynchronous stuff.
You can achieve this by using a jQuery deferred object.
var deferred = $.Deferred();
var success = function () {
    // resolve the deferred with your object as the data
    deferred.resolve({
        result: ...
    });
};
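That snippet stops short of the waiting part; a rough sketch of how several deferreds could be collected and waited on with $.when (assuming jQuery, and myFunc1 taking a completion callback as in the question):
// one deferred per asynchronous operation
var deferreds = [1, 2, 3].map(function () {
    var deferred = $.Deferred();
    myFunc1(function () { deferred.resolve(); }); // resolve when this callback fires
    return deferred;
});
// $.when waits for every deferred before running the done handler
$.when.apply($, deferreds).done(function () {
    myFunc2();
});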
With this helper function:
function afterAll(callback, what) {
    what.counter = (what.counter || 0) + 1;
    return function() {
        callback();
        if (--what.counter == 0)
            what();
    };
}
your loop will look like this:
function whenAllDone() { ... }
for (... in ...) {
    myFunc1(afterAll(callback, whenAllDone));
}
Here afterAll creates a proxy function for the callback; it also decrements the counter and calls the whenAllDone function when all callbacks are complete.
A single thread is not always guaranteed - don't get this wrong.
Case 1:
For example, say we have the following two functions:
var count = 0;
function function1() {
    alert("this thread will be suspended, count: " + count);
}
function function2() {
    // anything
    count++;
    dump(count + "\n");
}
Then, before function1 returns, function2 can also be called. If a single thread were guaranteed, function2 would not be called before function1 returns. You can try this, and you will find that count keeps going up while you are looking at the alert.
Case 2: in Firefox chrome code, before one function returns (with no alert inside), another function can also be called.
So a mutex lock is indeed needed.
There are many, many ways to achieve this, I hope these suggestions help!
First, I would transform the callback into a promise! Here is one way to do that:
function aPromise(arg) {
    return new Promise((resolve, reject) => {
        aCallback(arg, (err, result) => {
            if (err) reject(err);
            else resolve(result);
        });
    });
}
Next, use reduce to process the elements of an array one by one!
const arrayOfArg = ["one", "two", "three"];
const promise = arrayOfArg.reduce(
    (promise, arg) => promise.then(() => aPromise(arg)), // after the previous promise resolves, return the result of the aPromise function as the next promise
    Promise.resolve(null) // initial resolved promise
);
promise.then(() => {
    // carry on
});
If you want to process all elements of an array at the same time, use map and Promise.all!
const arrayOfArg = ["one", "two", "three"];
const promise = Promise.all(arrayOfArg.map(
    arg => aPromise(arg)
));
promise.then(() => {
    // carry on
});
If you are able to use async / await then you could just simply do this:
// inside an async function
const arrayOfArg = ["one", "two", "three"];
for (let arg of arrayOfArg) {
    await aPromise(arg); // wow
}
// carry on
You might even use my very cool synchronize-async library like this:
const arrayOfArg = ["one", "two", "three"];
const context = {}; // can be any kind of object, this is the threadish context
for (let arg of arrayOfArg) {
    synchronizeCall(aPromise, arg); // synchronize the calls in the given context
}
join(context).then(() => { // join will resolve when all calls in the context are finished
    // carry on
});
And last but not least, use the fine async library if you really don't want to use promises.
const arrayOfArg = ["one", "two", "three"];
async.each(arrayOfArg, aCallback, err => {
    if (err) throw err; // handle the error!
    // carry on
});
