I'm working my way through promises and I'm stuck on my use case.
I have an array of transformer functions (each function returns a promise and modifies some JSON structure).
Let me show some code.
Let's say this is my JSON structure (an array):
var data = [{a: 1, b:2}, {a:3, b:4}];
transformFunctions holds the definitions of the transform functions that modify the data in a certain way. These two functions add a c and a d property to the JSON structure above:
var transformFunctions = {
transform1: function (data) { // This function adds `c` property to each object from `a`
return new Promise(function (resolve) {
for (var i = 0; i < data.length; i++) {
data[i].c = data[i].a;
}
return resolve(data);
})
},
transform2: function (data) { // This function adds `d` property to each object from `c`
return new Promise(function (resolve) {
for (var i = 0; i < data.length; i++) {
data[i].d = data[i].c;
}
return resolve(data);
})
},
...
}
Then, from a UI, the user specifies which transformer functions to use and in what order. Let's say they picked the natural order, like this:
var userTransformList = ['transform1', 'transform2'];
The transform1 method should modify the data, and its result should be passed to the transform2 method.
I was looking at Promise.all, but it does not seem to care about the order of the promises, and most importantly I need to pass the previous result on to the next promise.
Note: As adeneo pointed out in the comments, use promises only if you are dealing with asynchronous code.
Create an array of the functions to be executed, and make sure that they all return a Promise.
Then you can use Promise.reduce to reduce the initial value to the transformed final value, by returning the result of executing the current promise-returning function on every iteration.
Finally, you can attach a then handler to get the actual value, and a catch handler in case any of the promises is rejected.
Let's say we have two transform functions like this.
Note: To repeat, you should never use promises with functions like these. Use promises only when the functions you are dealing with are genuinely asynchronous.
// Just add a property called `c` to all the objects and return a Promise object
function transform1(data) {
return Promise.resolve(data.map(function(currentObject) {
currentObject.c = currentObject.a + currentObject.b;
return currentObject;
}));
}
// Just add a property called `d` to all the objects and return a Promise object
function transform2(data) {
return Promise.resolve(data.map(function(currentObject) {
currentObject.d = currentObject.a + currentObject.b + currentObject.c;
return currentObject;
}));
}
Then you can transform the original value, like this
Promise.reduce([transform1, transform2], function (result, currentFunction) {
return currentFunction(result);
}, [{a: 1, b: 2}, {a: 3, b: 4}]) // Initial value
.then(function (transformedData) {
console.log(transformedData);
})
.catch(function (err) {
console.error(err);
});
Output
[ { a: 1, b: 2, c: 3, d: 6 }, { a: 3, b: 4, c: 7, d: 14 } ]
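Note that Promise.reduce comes from the Bluebird library rather than from native promises. If you want to stay with standard promises, a minimal sketch of the same sequencing using Array.prototype.reduce (reusing transform1 and transform2 from above) could look like this:
// Thread each transform's result into the next one, starting from the initial data
[transform1, transform2].reduce(function (promise, transform) {
    return promise.then(transform);
}, Promise.resolve([{a: 1, b: 2}, {a: 3, b: 4}]))
.then(function (transformedData) {
    console.log(transformedData); // same output as above
})
.catch(function (err) {
    console.error(err);
});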
You can chain Promises the way you always do: using .then().
Let's say you have these two transformations:
function increment(x) {
return Promise.resolve(x + 1);
}
function double(x) {
return Promise.resolve(2 * x);
}
In a real scenario, these would be doing asynchronous work. You could just:
increment(1).then(double)
But, you don't know the order or number of transformations. Let's put them into an array, and then() them one by one:
var transformations = [increment, double]
var promise = Promise.resolve(1);
for (var i = 0; i < transformations.length; i++)
promise = promise.then(transformations[i]);
You can attach a catch() handler before you start, after you finish or even per-transformation.
This isn't efficient if you're going to apply hundreds of transformations. In that case, you should use reduce() as thefourtheye suggests in his answer.
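As a quick illustration of the per-transformation catch mentioned above, here is a sketch built on the same loop (the handler just logs and re-throws, purely as an example):
var transformations = [increment, double];
var promise = Promise.resolve(1);
for (var i = 0; i < transformations.length; i++) {
    promise = promise
        .then(transformations[i])
        // per-transformation handler: log the failure and re-throw
        // so later steps (and any final catch) still see it
        .catch(function (err) {
            console.error('A transformation failed:', err);
            throw err;
        });
}
promise.then(console.log); // 4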
I am trying to build an array of objects where each ID is paired with the array that belongs to it.
Here is the scenario:
let obj = {}
const arr = []
for (let i = 0; i < someIds.length; i++) {
const res = await someApiCall() // returns array
obj = {
id: someIds[i],
res
}
arr.push(obj)
return arr
}
The problem here is that arr ends up containing only a single object instead of one per iteration.
For example I get
[{id: 122, res: "whatever122"}]
where I want to get
[{id: 122, res: "whatever122"}, {id: 111, res: "whatever111"}, {id: 133, res: "whatever133"}]
How do I concatenate all?
You return arr too early. It should be returned outside the for loop:
let obj = {}
const arr = []
for (let i = 0; i < someIds.length; i++) {
const res = await someApiCall() // returns array
obj = {
id: someIds[i],
res
}
arr.push(obj)
}
return arr
Move the return statement outside of the for loop, so that all of the objects are added to the array before the function returns.
let obj = {}
const arr = []
for (let i = 0; i < someArr.length; i++) {
const res = await someApiCall() // returns array
obj = {
id: someIds[i],
res
}
arr.push(obj)
}
return arr
The problem is produced by the return statement, which should be outside the for loop: return completes the execution of the current function and hands back (an optional value) to the caller.
A simpler way to write what you need is to use Promise.all() to run all the calls to someApiCall() in parallel (assuming they are independent of each other). This way the code runs much faster:
async function doSomething() {
return await Promise.all(
someIds.map(
async (id) => {
return {
id: id,
res: await someApiCall(id),
};
}
)
);
}
How it works
someIds.map() calls the callback function that it receives as argument for each item of someIds and returns a new array that contains the values returned by the callback function.
Because the callback function is asynchronous and uses await, it returns a Promise as soon as the await-ed call starts, without waiting for it to complete.
The array of promises returned by someIds.map() is passed to Promise.all(), which waits for all of them and resolves with an array containing the values the promises resolved with (that is, the objects built by the callback).
Remark
The await in front of Promise.all() is not strictly needed. Without it, doSomething() returns the promise returned by Promise.all(); with it, the function returns an array of objects. Either way, the call to doSomething() still needs to be await-ed, and the awaited value is an array of objects.
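A minimal usage sketch (someIds and someApiCall are assumed to exist, as in the question):
async function run() {
    const results = await doSomething();
    console.log(results); // e.g. [{id: 122, res: [...]}, {id: 111, res: [...]}, ...]
}
run().catch(console.error);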
var promiseReturningFuncs = [];
for(var i = 0; i < 5; i++){
promiseReturningFuncs.push(askQuestion);
}
var programmers = [];
Promise.reduce(promiseReturningFuncs, function(resp, x) {
console.log(typeof resp);
if(typeof resp != "function") {
programmers.push(resp);
}
return x();
})
.then(function(resp) {
programmers.push(resp);
console.log(programmers);
});
My goal: execute the askQuestion function in series and resolve to an array of objects created by that function. (The function must execute in series so that it can respond to user input.)
So imagine that the askQuestion function returns a promise that resolves to an object I want to add to an array.
This is my messy way of doing it.
I am looking for a cleaner way of doing it. Ideally, I wouldn't even need to push to an array; I would just have a final .then where the response is an array.
Since you appear to be using the Bluebird promise library, you have a number of built-in options for sequencing your promise returning functions. You can use Promise.reduce(), Promise.map() with a concurrency value of 1, Promise.mapSeries or Promise.each(). If the iterator function returns a promise, all of these will wait for the next iteration until that promise resolves. Which to use depends more upon the mechanics of how your data is structured and what result you want (neither of which you actually show or describe).
Let's suppose you have an array of promise returning functions and you want to call them one at a time, waiting for the one to resolve before calling the next one. If you want all the results, then I'd suggest Promise.mapSeries():
let arrayOfPromiseReturningFunctions = [...];
// call all the promise returning functions in the array, one at a time
// wait for one to resolve before calling the next
Promise.mapSeries(arrayOfPromiseReturningFunctions, function(fn) {
return fn();
}).then(function(results) {
// results is an array of resolved results from all the promises
}).catch(function(err) {
// process error here
});
Promise.reduce() could also be used, but it would accumulate a single result, passing it from one to the next and end with one final result (like Array.prototype.reduce() does).
Promise.map() is a more general version of Promise.mapSeries() that lets you control the concurrency number (the number of async operations in flight at the same time).
Promise.each() will also sequence your functions, but does not accumulate a result. It assumes you either don't have a result or you are accumulating the result out-of-band or via side effects. I tend to not like to use Promise.each() because I don't like side effect programming.
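For example, a sketch of the Promise.map() variant with a concurrency of 1, which behaves much like Promise.mapSeries() here:
Promise.map(arrayOfPromiseReturningFunctions, function (fn) {
    return fn();
}, { concurrency: 1 }).then(function (results) {
    // results are in the same order as the input array
}).catch(function (err) {
    // process error here
});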
You could solve this in pure JS using ES6 (ES2015) features:
function processArray(arr, fn) {
return arr.reduce(
(p, v) => p.then((a) => fn(v).then(r => a.concat([r]))),
Promise.resolve([])
);
}
It applies the function given to the array in series and resolves to an array of the results.
Usage:
const numbers = [0, 4, 20, 100];
const multiplyBy3 = (x) => new Promise(res => res(x * 3));
// Prints [ 0, 12, 60, 300 ]
processArray(numbers, multiplyBy3).then(console.log);
You'll want to double check browser compatibility but this works on reasonably current Chrome (v59), NodeJS (v8.1.2) and probably most others.
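If you can rely on async/await (Node 8+ or a current browser), a roughly equivalent sketch of processArray would be:
async function processArray(arr, fn) {
    const results = [];
    for (const value of arr) {
        // wait for each call to settle before starting the next one
        results.push(await fn(value));
    }
    return results;
}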
You can use recursion so that you can move to the next iteration in a then block.
function promiseToExecuteAllInOrder(promiseReturningFunctions /* array of functions */) {
var resolvedValues = [];
return new Promise(function(resolve, reject) {
function executeNextFunction() {
var nextFunction = promiseReturningFunctions.shift(); // take from the front so the functions run in array order
if(nextFunction) {
nextFunction().then(function(result) {
resolvedValues.push(result);
executeNextFunction();
});
} else {
resolve(resolvedValues);
}
}
executeNextFunction();
}
}
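A minimal usage sketch, with a stand-in askQuestion (any promise-returning function would do):
// hypothetical stand-in for the real askQuestion
function askQuestion() {
    return Promise.resolve(Math.random());
}

promiseToExecuteAllInOrder([askQuestion, askQuestion, askQuestion])
    .then(function(resolvedValues) {
        console.log(resolvedValues); // three results, in the order the functions ran
    });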
Executing them one after another using a recursive function (without wrapping it in a promise):
(function iterate(i, result, callback){
    if (i > 5) return callback(result);
    askQuestion().then(res => iterate(i + 1, result.concat([res]), callback));
})(0, [], console.log);
Of course this can be wrapped in a promise:
function askFive(){
    return new Promise(function(callback){
        (function iterate(i, result){
            if (i > 5) return callback(result);
            askQuestion().then(res => iterate(i + 1, result.concat([res])));
        })(0, []);
    });
}
askFive().then(console.log);
Or:
function afteranother(i,promise){
return new Promise(function(resolve){
if(!i) return resolve([]);
afteranother(i-1, promise).then(val => promise().then(val2 => resolve(val.concat([val2]))));
});
}
afteranother(5,askQuestion).then(console.log);
I'm attempting to run a series of functions based upon the Q API using their first strategy for sequences. This suggests the pattern:
var funcs = [foo, bar, baz, qux];
var result = Q(initialVal);
funcs.forEach(function (f) {
result = result.then(f);
});
return result;
What structure are each of the functions within the array meant to take? I am quite confused about when to use return def.promise;. Is that simply always the last line? Will it frequently or always immediately follow def.resolve(someVar)? Is something like this the intended structure?
function foo(f){
var def = Q.defer();
f++;
def.resolve(f);
return def.promise;
}
So that each subsequent function within the array will receive the newly calculated value of f: in this case, with var initialVal = 1; and four functions each incrementing f, will the returned result be 5? How do I access that returned value? console.log(result) prints { state: 'pending' }.
What structure are each of the functions within the array meant to take?
Q.js allows promises to be created in several ways. For example :
function foo(value) {
var def = Q.defer();
def.resolve(value + 1);
return def.promise;
}
function foo(value) {
return Q(value + 1);
}
function foo(value) {
return Q.Promise(function(resolve, reject) {
resolve(value + 1);
});
}
Other Promise libs are similar, but not necessarily so flexible. Native js Promises must be constructed with the third of these approaches.
However, in the real world you will only rarely need to create your own Promise. You will typically be dealing with promise-returning lib methods that someone else has written. For example :
function foo(value) {
return lib.doSomethingAsync(value, and, other, params);
}
How do I access that returned value?
The code is easier to understand if the variable name result is replaced with promise, and result.then(f) is rewritten with an anonymous function that calls f().
function performAsyncSequence() {
var promise = Q(initialVal);
funcs.forEach(function (f) {
promise = promise.then(function(previousResult) {
return f(previousResult);
});
});
return promise;
}
This is 100% equivalent to the code in the question, but now it should be clearer how the previous result is passed down the promise chain.
Accessing all previous promise results in the sequence is more complicated. The answers here discuss the subject comprehensively.
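One simple way to keep every intermediate result is to collect them as the chain runs; this sketch builds on the function above and is just one possible approach:
function performAsyncSequenceCollectingResults() {
    var results = [];
    var promise = Q(initialVal);
    funcs.forEach(function (f) {
        promise = promise.then(function (previousResult) {
            results.push(previousResult);
            return f(previousResult);
        });
    });
    return promise.then(function (finalResult) {
        results.push(finalResult);
        return results; // [initialVal, every intermediate result, finalResult]
    });
}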
I am kind of new to promises and am stuck on the following exercise.
I have an array of values and I want to execute an async call on each one.
In the callback, I want to execute another call on the outcome of the first call.
Basically, my frustration is in the following:
The order of execution should be '1x2x3x' but the order is '123xxx'
In other words, the loop has already moved on to the next iteration while the nested promise from the first promise is not yet fulfilled.
var values = ["1", "2", "3"];
function do(val) {
var deferred = Q.defer();
asyncCall(val)
.then( function( response ) {
console.log(val);
asyncCall(response)
.then( function ( response ) {
console.log('x');
deferred.resolve(true)
});
});
return deferred.promise;
}
var result = do(values[0]);
values.forEach( function(f) {
result = result.then(do(f));
});
There is probably an easy solution but I'm stuck on it.
You don't need the deferred; that's the deferred anti-pattern you have there, since promises chain.
Also, you have to return a promise from a .then handler if you want it to wait for it to resolve.
You can simply use a for loop:
function do(val) {
var q = Q();
for(var i = 0; i < val; i++){
q = q.then(asyncCall.bind(null,i))
.then(console.log.bind(console))
.then(console.log.bind(console,"x"));
}
return q; // in case you want to chain
}
Note: bind just fixes arguments of the function call in advance. In this case, since the first param (the this value) is null, it effectively performs a "partial application": it returns a new function with some arguments pre-filled. For more info see the MDN docs.
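A tiny sketch of that partial-application behaviour:
// console.log.bind(console, "x") returns a new function with "x"
// pre-filled as the first argument
var logX = console.log.bind(console, "x");
logX(1); // logs: x 1
logX(2); // logs: x 2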
I am using the Q javascript promises library and am running in a browser, and I want to figure out how to chain together groups of promises so that each group gets executed sequentially. For example, if I have items A, B, C, and D, I want to group A and B together and then C and D together, so that both A and B must fulfill before C and D get executed. I created this simple jsfiddle to show my current attempt.
var work_items = [ 'A','B','C','D','E','F','G','H','I' ];
var n = 2; // group size
var wait = 1000;
var getWorkPromiseFn = function (item) {
log("Getting promise function for " + item);
return function () {
log("Starting " + item);
var deferred = Q.defer();
setTimeout(function () {
var status = "Finished " + item;
log(status);
deferred.resolve(status);
}, wait);
return deferred.promise;
};
};
var queue = Q();
//log('Getting sequentially'); // One-by-one in sequence works fine
//work_items.forEach(function (item) {
// queue = queue.then(getWorkPromiseFn(item));
//});
log('Getting ' + n + ' at a time'); // This section does not
while (work_items.length > 0) {
var unit = [];
for (var i=0; i<n; i++) {
var item = work_items.shift();
if (item) {
unit.push(getWorkPromiseFn(item));
}
}
queue.then(Q.all(unit));
}
var inspect = queue.inspect(); // already fulfilled, though no work is done
It looks like I am probably passing the wrong array to Q.all here, since I'm passing in an array of functions which return promises rather than an array of the promises themselves. When I tried to use promises directly there (with unit.push(Q().then(getWorkPromiseFn(item))), for example), the work for each one began immediately and there was no sequential processing. I guess I'm basically unclear on a good way to represent a group so that its execution is appropriately deferred.
So, how can I defer execution of a group of promises like this?
This can be done by first pre-processing the array of items into groups, then applying the two patterns (not the anti-patterns) provided here under the heading "The Collection Kerfuffle".
The main routine can be coded as a single chain of array methods.
var work_items = [ 'A','B','C','D','E','F','G','H','I' ];
var wait = 3000;
//Async worker function
function getWorkPromise(item) {
console.log("Starting " + item);
var deferred = Q.defer();
setTimeout(function () {
var status = "Finished " + item;
console.log(status);
deferred.resolve(status);
}, wait);
return deferred.promise;
};
function doAsyncStuffInGroups(arr, n) {
/*
* Process original array into groups, then
* process the groups in series,
* progressing to the next group
* only after performing something asynchronous
* on all group members in parallel.
*/
return arr.map(function(currentValue, i) {
return (i % n === 0) ? arr.slice(i, i+n) : null;
}).filter(function(item) {
return item;
}).reduce(function(promise, group) {
return promise.then(function() {
return Q.all(group.map(function(item) {
return getWorkPromise(item);
}));
});
}, Q());
}
doAsyncStuffInGroups(work_items, 2).then(function() {
console.log("All done");
});
The delay of 3s gives you time to appreciate what's going on; I found 1s too quick.
Solutions like this are elegant and concise but not especially readable. In production code I would add more comments to help whoever comes after me.
For the record:
The opening arr.map(...).filter(...) processes arr (non-destructively) into an array of arrays, each inner array representing a group of length n (with a shorter final group for any remainder).
The chained .reduce(...) is an async "serializer" pattern.
The nested Q.all(group.map(...)) is an async "parallelizer" pattern.
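To see the grouping step in isolation, the map/filter pair (with the same work_items and n = 2) produces:
var groups = work_items.map(function (currentValue, i) {
    return (i % 2 === 0) ? work_items.slice(i, i + 2) : null;
}).filter(function (item) {
    return item;
});
console.log(groups);
// [ ['A','B'], ['C','D'], ['E','F'], ['G','H'], ['I'] ]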
The .then function of a promise does not mutate the promise, so when you do:
p.then(function(){
// stuff
});
You do not change the promise p at all, instead, you need to assign it to something:
p = p.then(....)
This is why your queue promise was always resolved: it never changed beyond the initial Q().
In your case, something like changing:
queue.then(Q.all(unit));
Into:
queue = queue.then(function(){ return Q.all(unit); });
Or, with ES6 promises and libraries that use the same syntax, such as the Bluebird mentioned in the other answer:
queue = queue.then(function(){ return Promise.all(unit); });
The thing that confused me most is that the async function being chained needs to return a function that returns a promise. Here's an example:
function setTimeoutPromise(ms) {
return new Promise(function (resolve) {
setTimeout(resolve, ms);
});
}
function foo(item, ms) {
return function() {
return setTimeoutPromise(ms).then(function () {
console.log(item);
});
};
}
var items = ['one', 'two', 'three'];
function bar() {
var chain = Promise.resolve();
for (var i in items) {
chain = chain.then(foo(items[i], (items.length - i)*1000));
}
return chain.then();
}
bar().then(function () {
console.log('done');
});
Notice that foo returns a function that returns a promise. foo() does not return a promise directly.
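In other words (a tiny sketch using the foo above):
var startOne = foo('one', 1000); // nothing runs yet; startOne is just a function
startOne();                      // now the timer starts and a promise is returned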
I would suggest you use Bluebird; it is the best-performing promise library out there: https://github.com/petkaantonov/bluebird
There is also an example of chaining here: https://github.com/petkaantonov/bluebird#how-do-long-stack-traces-differ-from-eg-q