javascript loop promises iterating with array in arguments - javascript

I am trying to loop through an array of async work to be done. I can't flood the system by starting all the async work at the same time, so I am trying to run the tasks one by one with promises. My problem is that I need to go through an array of values so that each async task works on one value of the array. I managed to do it with the code below, but it only works for my specific case and I can't make it general. What would be the approach to make it reusable for other kinds of arrays? I have seen some solutions with array.reduce and promises but can't figure them out. I have also seen examples with Q, but I'm not using it; if it can be done with plain JavaScript, that would be better.
My Code:
function doSomething(ObjIn1, ObjIn2) {
    return new Promise(function(resolve, reject) {
        console.log("doSomething: ObjIn1: " + ObjIn1 + " ObjIn2: " + ObjIn2);
        setTimeout(function() {
            console.log("doSomething Done: ObjIn1: " + ObjIn1 + " ObjIn2: " + ObjIn2);
            resolve(ObjIn1, ObjIn2);
        }, 500);
    })
}
function LoopPromises(Function2Loop, functionOptions, Counter, Max) {
    console.log("Counter: " + Counter);
    if (Counter < Max) {
        Function2Loop.apply(this, [functionOptions[0][Counter], functionOptions[1]]).then(function() {
            Counter++;
            LoopPromises(Function2Loop, functionOptions, Counter, Max);
        });
    }
}
LoopPromises(doSomething, [
    ["A1", "A2", "A3"], "ARG2TESTE"
], 0, 3)

You're overthinking this :) A function with arguments is the same as a function without arguments closing over a function with arguments so:
a(1,2,3,4);
Is the same as
(() => a(1,2,3,4))();
Except perhaps negligibly slower. I'm assuming you need to queue the work for an arbitrary number of promises. If you need to do it for a fixed number, you can just chain .then calls between them. Let's see how we can do this:
// runs the functions in the array in sequence
function sequence(fns) { // fns - functions returning promises
    return fns.reduce((prev, nextFn) => { // 'fold' the array
        return prev.then(nextFn); // after the previous is done, execute the next
    }, Promise.resolve()); // start with an empty promise
}
Make sure you understand reduce first. For convenience - let's see an example without it:
function sequence(fns) { // fns - functions returning promises
    var queue = Promise.resolve();
    fns.forEach(fn => queue = queue.then(fn));
    return queue;
}
We're iterating through our array of work (functions) and executing them one after the other: the next one runs only after the promise returned by the previous one has resolved. The values wait for each other based on the promise resolving (via then). This would let you do:
sequence([
    () => new Promise(r => setTimeout(r, 500)),
    () => console.log("I only run after the previous work completed")
]);
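To map this back to the original question, wrap each value of your array in a function that closes over the extra argument. A minimal sketch, reusing the doSomething and sequence functions from above:
var values = ["A1", "A2", "A3"];
var extraArg = "ARG2TESTE";

// build one promise-returning function per value, then run them in sequence
sequence(values.map(function(v) {
    return function() {
        return doSomething(v, extraArg);
    };
}));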

Related

Promise flattening fails - why?

const simplePromise = i => {
    return new Promise(function(resolve, reject) {
        console.log(i);
        setTimeout(function(){
            resolve();
        }, 2000);
    });
}
var anchor = simplePromise(0);
for (var i = 1; i < 5; i++) {
    anchor = anchor.then(_ => simplePromise(i));
}
prints:
0
4
4
4
4
instead of:
0
1
2
3
4
1. Can someone explain why? and 2. tell me how to achieve this?
I can see that the first promise is executed (i=0), then the loop runs and then the value of i(=4) gets passed to the next promise. Shouldn't this be solved by having a function inside then (_ => simplePromise(i)) ?
This happens because you use var. Try changing var to let and that will fix your problem.
UPDATE
That problem is explained more clearly in this article and this one (scroll to the section Difference Details -> Closure in Loop), and there is a great explanation of the let keyword.
EXPLANATION
Let's take this piece of code:
for (var i = 0; i < 5; ++i) {
    setTimeout(function () {
        console.log(i); // outputs '5' 5 times
    }, 100);
}
In that example each iteration creates a function with a closure over the variable i, which will be executed in the future. The problem is that var declares a variable which
...is scoped to the nearest function block and let is scoped to the nearest enclosing block, which can be smaller than a function block.
i.e. all of the created functions close over the same variable, and by the time they execute, i === 5, so all of the functions print the same value.
How does let solve that problem?
let in the loop re-binds the variable on each iteration, making sure to re-assign it the value from the end of the previous loop iteration, so it can be used to avoid the issue with closures.
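For comparison, the same loop with let, where each callback gets its own copy of i:
for (let i = 0; i < 5; ++i) {
    setTimeout(function () {
        console.log(i); // outputs 0, 1, 2, 3, 4
    }, 100);
}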
Your mistake is one of the most common mistakes in JS - if not the most common - using a for loop to manipulate a state variable in an asynchronous situation. Your promises and your loop do not run in sync. The loop finishes much faster than any of your promises do.
Yet you use i in your promise callback, which only ever runs after the loop is done. Don't do that. There are ways to prevent it, but this has been discussed so often that I will only suggest researching and reading a few of the existing answers.
I strongly suspect that you do not even want to loop over a fixed range of numbers. You actually have an array of items.
Simply drop the for loop and use the array iteration tools to dodge the scoping problem. Array#reduce is the perfect candidate here.
const simplePromise = val => {
    return new Promise(function(resolve, reject) {
        setTimeout(function(){
            console.log(val);
            resolve(val);
        }, 200);
    });
}
var items = [0,1,2,3,4];
console.log("array iteration starts");
items
    .reduce((acc, i) => acc.then(_ => simplePromise(i)), Promise.resolve())
    .then(val => console.log("chain execution end w/ " + val));
console.log("array iteration done");
/*
acc = Promise.resolve()
then simplePromise(0)
then simplePromise(1)
then simplePromise(2)
then simplePromise(3)
then simplePromise(4)
then console.log("...")
*/

How to execute promises in series?

var promiseReturningFuncs = [];
for(var i = 0; i < 5; i++){
    promiseReturningFuncs.push(askQuestion);
}
var programmers = [];
Promise.reduce(promiseReturningFuncs, function(resp, x) {
    console.log(typeof resp);
    if(typeof resp != "function") {
        programmers.push(resp);
    }
    return x();
})
.then(function(resp) {
    programmers.push(resp);
    console.log(programmers);
});
My goal: execute the askQuestion function in series and resolve an array of objects created by that function. (This function must execute in series so that it can respond to user input.)
So imagine that the askQuestion function returns a promise that resolves an object I want to add to an array.
This is my messy way of doing it.
I am looking for a cleaner way of doing it; ideally, I wouldn't even need to push to an array, I would just have a final .then where the response is an array.
Since you appear to be using the Bluebird promise library, you have a number of built-in options for sequencing your promise returning functions. You can use Promise.reduce(), Promise.map() with a concurrency value of 1, Promise.mapSeries or Promise.each(). If the iterator function returns a promise, all of these will wait for the next iteration until that promise resolves. Which to use depends more upon the mechanics of how your data is structured and what result you want (neither of which you actually show or describe).
Let's suppose you have an array of promise returning functions and you want to call them one at a time, waiting for the one to resolve before calling the next one. If you want all the results, then I'd suggest Promise.mapSeries():
let arrayOfPromiseReturningFunctions = [...];
// call all the promise returning functions in the array, one at a time
// wait for one to resolve before calling the next
Promise.mapSeries(arrayOfPromiseReturningFunctions, function(fn) {
    return fn();
}).then(function(results) {
    // results is an array of resolved results from all the promises
}).catch(function(err) {
    // process error here
});
Promise.reduce() could also be used, but it would accumulate a single result, passing it from one to the next and end with one final result (like Array.prototype.reduce() does).
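For example, a sketch of Promise.reduce() collecting the resolved values into an array, using Bluebird's (input, reducer, initialValue) signature:
Promise.reduce(arrayOfPromiseReturningFunctions, function(acc, fn) {
    return fn().then(function(result) {
        acc.push(result);
        return acc; // the accumulator is passed on to the next iteration
    });
}, []).then(function(results) {
    // results holds the resolved values, in order
});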
Promise.map() is a more general version of Promise.mapSeries() that lets you control the concurrency number (the number of async operations in flight at the same time).
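A sketch of that Promise.map() variant, which behaves like Promise.mapSeries() when the concurrency option is set to 1:
Promise.map(arrayOfPromiseReturningFunctions, function(fn) {
    return fn();
}, { concurrency: 1 }).then(function(results) {
    // same results as Promise.mapSeries(), only one call in flight at a time
}).catch(function(err) {
    // process error here
});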
Promise.each() will also sequence your functions, but does not accumulate a result. It assumes you either don't have a result or you are accumulating the result out-of-band or via side effects. I tend to not like to use Promise.each() because I don't like side effect programming.
You could solve this in pure JS using ES6 (ES2015) features:
function processArray(arr, fn) {
    return arr.reduce(
        (p, v) => p.then((a) => fn(v).then(r => a.concat([r]))),
        Promise.resolve([])
    );
}
It applies the function given to the array in series and resolves to an array of the results.
Usage:
const numbers = [0, 4, 20, 100];
const multiplyBy3 = (x) => new Promise(res => res(x * 3));
// Prints [ 0, 12, 60, 300 ]
processArray(numbers, multiplyBy3).then(console.log);
You'll want to double check browser compatibility but this works on reasonably current Chrome (v59), NodeJS (v8.1.2) and probably most others.
You can use recursion so that you can move to the next iteration in a then block.
function promiseToExecuteAllInOrder(promiseReturningFunctions /* array of functions */) {
    var resolvedValues = [];
    return new Promise(function(resolve, reject) {
        function executeNextFunction() {
            var nextFunction = promiseReturningFunctions.shift(); // take from the front to preserve order
            if(nextFunction) {
                nextFunction().then(function(result) {
                    resolvedValues.push(result);
                    executeNextFunction();
                });
            } else {
                resolve(resolvedValues);
            }
        }
        executeNextFunction();
    });
}
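A hypothetical usage, assuming askQuestion returns a promise:
promiseToExecuteAllInOrder([askQuestion, askQuestion, askQuestion])
    .then(function(answers) {
        console.log(answers); // resolved values, in the order the questions were asked
    });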
Executing one after another using a recursive function (in a non-promise way):
(function iterate(i, result, callback) {
    if (i >= 5) return callback(result);
    askQuestion().then(res => iterate(i + 1, result.concat([res]), callback));
})(0, [], console.log);
For sure this can be wrapped in a promise:
function askFive() {
    return new Promise(function(resolve) {
        (function iterate(i, result) {
            if (i >= 5) return resolve(result);
            askQuestion().then(res => iterate(i + 1, result.concat([res])));
        })(0, []);
    });
}
askFive().then(console.log);
Or:
function afteranother(i, promise) {
    return new Promise(function(resolve) {
        if(!i) return resolve([]);
        afteranother(i - 1, promise).then(val => promise().then(val2 => resolve(val.concat([val2]))));
    });
}
afteranother(5,askQuestion).then(console.log);

Is this a good way to generate a chain of promises for an array?

I have an array of items that I want to insert into an SQL server. I am using promises for this and in order to execute each insert sequentially I wrote the following method:
var ForeachPromise = function (array, func) {
    var promise = func(array[0]);
    for (var i=1; i < array.length; i++) {
        promise = promise.then(function() { return func(array[i]) });
    }
    return promise;
}
The idea is that when func is called it will return a promise, which will then be chained to the previous promise.
...
return ForeachPromise(type.subprocessen, function(subproces) {
return newSubproces(subproces, typeId, dienstId, createData, s + 1);
});
I haven't actually tested it yet, but I assume that something like this will work. My question however is am I using promises correctly? Promises are great but easily misunderstood and I just want to be sure that I'm not making any fundamental mistakes.
Yes, that approach is fine, and works well with promises. Two minor quibbles:
you should take care of the case of an empty array. Start your chain with Promise.resolve() (a promise fulfilled with undefined), and begin your loop at index 0.
As the then callback is asynchronous, your i variable has the wrong value - the classical closure in a loop fallacy.
Using the .reduce method does help with both problems:
function foreachPromise(array, func) {
    return array.reduce(function(promise, elem, i) {
        return promise.then(function() { return func(elem) });
    }, Promise.resolve());
}
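A hypothetical usage with the insert function from the question (names taken from the snippet above):
foreachPromise(type.subprocessen, function(subproces) {
    return newSubproces(subproces, typeId, dienstId, createData, s + 1);
}).then(function() {
    // all inserts have finished, one after the other
});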

How can I chain together groups of promises?

I am using the Q javascript promises library and am running in a browser, and I want to figure out how to chain together groups of promises so that each group gets executed sequentially. For example, if I have items A, B, C, and D, I want to group A and B together and then C and D together, so that both A and B must fulfill before C and D get executed. I created this simple jsfiddle to show my current attempt.
var work_items = [ 'A','B','C','D','E','F','G','H','I' ];
var n = 2; // group size
var wait = 1000;
var getWorkPromiseFn = function (item) {
    log("Getting promise function for " + item);
    return function () {
        log("Starting " + item);
        var deferred = Q.defer();
        setTimeout(function () {
            var status = "Finished " + item;
            log(status);
            deferred.resolve(status);
        }, wait);
        return deferred.promise;
    };
};
var queue = Q();
//log('Getting sequentially'); // One-by-one in sequence works fine
//work_items.forEach(function (item) {
//    queue = queue.then(getWorkPromiseFn(item));
//});
log('Getting ' + n + ' at a time'); // This section does not
while (work_items.length > 0) {
    var unit = [];
    for (var i=0; i<n; i++) {
        var item = work_items.shift();
        if (item) {
            unit.push(getWorkPromiseFn(item));
        }
    }
    queue.then(Q.all(unit));
}
var inspect = queue.inspect(); // already fulfilled, though no work is done
It looks like I am probably passing the wrong array to Q.all here, since I'm passing in an array of functions which return promises rather than an array of the promises themselves. When I tried to use promises directly there (with unit.push(Q().then(getWorkPromiseFn(item))), for example), the work for each item began immediately and there was no sequential processing. I guess I'm basically unclear on a good way to represent the group in a way that appropriately defers execution of the group.
So, how can I defer execution of a group of promises like this?
This can be done by first pre-processing the array of items into groups, then applying the two patterns (not the anti-patterns) provided here under the heading "The Collection Kerfuffle".
The main routine can be coded as a single chain of array methods.
var work_items = [ 'A','B','C','D','E','F','G','H','I' ];
var wait = 3000;
// Async worker function
function getWorkPromise(item) {
    console.log("Starting " + item);
    var deferred = Q.defer();
    setTimeout(function () {
        var status = "Finished " + item;
        console.log(status);
        deferred.resolve(status);
    }, wait);
    return deferred.promise;
};
function doAsyncStuffInGroups(arr, n) {
    /*
     * Process original array into groups, then
     * process the groups in series,
     * progressing to the next group
     * only after performing something asynchronous
     * on all group members in parallel.
     */
    return arr.map(function(currentValue, i) {
        return (i % n === 0) ? arr.slice(i, i+n) : null;
    }).filter(function(item) {
        return item;
    }).reduce(function(promise, group) {
        return promise.then(function() {
            return Q.all(group.map(function(item) {
                return getWorkPromise(item);
            }));
        });
    }, Q());
}
doAsyncStuffInGroups(work_items, 2).then(function() {
    console.log("All done");
});
See fiddle. Delay of 3s gives you time to appreciate what's going on. I found 1s too quick.
Solutions like this are elegant and concise but pretty well unreadable. In production code I would provide more comments to help whoever came after me.
For the record:
The opening arr.map(...).filter(...) processes arr (non destructively) into an array of arrays, each inner array representing a group of length n (plus terminal remainders).
The chained .reduce(...) is an async "serializer" pattern.
The nested Q.all(group.map(...)) is an async "parallelizer" pattern.
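As a standalone illustration of the grouping stage, here is what the map(...).filter(...) step produces for the data above:
var work_items = [ 'A','B','C','D','E','F','G','H','I' ];
var n = 2;
var groups = work_items.map(function(currentValue, i) {
    return (i % n === 0) ? work_items.slice(i, i+n) : null;
}).filter(function(item) {
    return item;
});
console.log(groups); // [ ['A','B'], ['C','D'], ['E','F'], ['G','H'], ['I'] ]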
The .then function of a promise does not mutate the promise, so when you do:
p.then(function(){
    // stuff
});
You do not change the promise p at all, instead, you need to assign it to something:
p = p.then(....)
This is why your queue promise was always resolved, it never changed beyond Q().
In your case, something like changing:
queue.then(Q.all(unit));
Into:
queue = queue.then(function(){ return Q.all(unit); });
Or in ES6 promises and libraries that use their syntax like Bluebird the other answer mentioned:
queue = queue.then(function(){ return Promise.all(unit); });
The thing that confused me most is that the async function being chained needs to return a function that returns a promise. Here's an example:
function setTimeoutPromise(ms) {
    return new Promise(function (resolve) {
        setTimeout(resolve, ms);
    });
}
function foo(item, ms) {
    return function() {
        return setTimeoutPromise(ms).then(function () {
            console.log(item);
        });
    };
}
var items = ['one', 'two', 'three'];
function bar() {
    var chain = Promise.resolve();
    for (var i in items) {
        chain = chain.then(foo(items[i], (items.length - i)*1000));
    }
    return chain.then();
}
bar().then(function () {
    console.log('done');
});
Notice that foo returns a function that returns a promise. foo() does not return a promise directly.
See this Live Demo
I would suggest you use Bluebird; it's the best-performing promise library out there: https://github.com/petkaantonov/bluebird
The example for chaining should also be here: https://github.com/petkaantonov/bluebird#how-do-long-stack-traces-differ-from-eg-q

How to sync JavaScript callbacks?

I've been developing in JavaScript for quite some time but am not yet a cowboy developer, as one of the many things that always haunts me is syncing JavaScript's callbacks.
I will describe a generic scenario where this concern is raised: I have a bunch of operations to perform multiple times in a for loop, and each of the operations has an asynchronous callback. After the for loop, I need to perform another operation, but this operation can only execute successfully if all the callbacks from the for loop are done.
Code Example:
for ... in ... {
    myFunc1(callback); // callbacks are executed asynchronously
}
myFunc2(); // can only execute properly if all the myFunc1 callbacks are done
Suggested Solution:
Initialize a counter at the beginning of the loop, holding the length of the loop, and have each callback decrement that counter. When the counter hits 0, execute myFunc2. This essentially lets each callback check whether it is the last one to finish and, if it is, call myFunc2 when it's done.
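A minimal sketch of this counter approach (keys and the callback body are hypothetical placeholders):
var remaining = keys.length; // counter initialized to the loop length
keys.forEach(function (key) {
    myFunc1(function () {      // callback passed to the async operation
        if (--remaining === 0) {
            myFunc2();         // runs only after every callback has fired
        }
    });
});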
Problems:
A counter is needed for every such sequence in your code, and having meaningless counters everywhere is not a good practice.
If you recall the thread conflicts in the classical synchronization problem, when multiple threads all call var-- on the same variable, undesirable outcomes occur. Does the same happen in JavaScript?
Ultimate Question:
Is there a better solution?
The good news is that JavaScript is single threaded; this means that solutions will generally work well with "shared" variables, i.e. no mutex locks are required.
If you want to serialize asynch tasks, followed by a completion callback you could use this helper function:
function serializeTasks(arr, fn, done)
{
    var current = 0;
    fn(function iterate() {
        if (++current < arr.length) {
            fn(iterate, arr[current]);
        } else {
            done();
        }
    }, arr[current]);
}
The first argument is the array of values that needs to be passed in each pass, the second argument is a loop callback (explained below) and the last argument is the completion callback function.
This is the loop callback function:
function loopFn(nextTask, value) {
    myFunc1(value, nextTask);
}
The first argument that's passed is a function that will execute the next task; it's meant to be passed to your asynch function. The second argument is the current entry of your array of values.
Let's assume the asynch task looks like this:
function myFunc1(value, callback)
{
    console.log(value);
    callback();
}
It prints the value and afterwards it invokes the callback; simple.
Then, to set the whole thing in motion:
serializeTasks([1, 2, 3], loopFn, function() {
    console.log('done');
});
Demo
To parallelize them, you need a different function:
function parallelizeTasks(arr, fn, done)
{
    var total = arr.length,
        doneTask = function() {
            if (--total === 0) {
                done();
            }
        };
    arr.forEach(function(value) {
        fn(doneTask, value);
    });
}
And your loop function will be this (only parameter name changes):
function loopFn(doneTask, value) {
    myFunc1(value, doneTask);
}
Demo
The second problem is not really a problem as long as every one of those is in a separate function and the variable is declared correctly (with var); local variables in functions do not interfere with each other.
The first problem is a bit more of a problem. Other people have gotten annoyed, too, and ended up making libraries to wrap that sort of pattern for you. I like async. With it, your code might look like this:
async.each(someArray, myFunc1, myFunc2);
It offers a lot of other asynchronous building blocks, too. I'd recommend taking a look at it if you're doing lots of asynchronous stuff.
You can achieve this by using a jQuery deferred object.
var deferred = $.Deferred();
var success = function () {
    // resolve the deferred with your object as the data
    deferred.resolve({
        result: ...
    });
};
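To wait for several of these, you can collect one deferred (or promise) per loop iteration and combine them with $.when. A rough sketch, where items and startWork are hypothetical placeholders and startWork kicks off myFunc1 and returns a promise:
var deferreds = [];
for (var i = 0; i < items.length; i++) {
    deferreds.push(startWork(items[i])); // each call returns a $.Deferred().promise()
}
$.when.apply($, deferreds).done(function () {
    myFunc2(); // runs once every deferred has resolved
});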
With this helper function:
function afterAll(callback, what) {
    what.counter = (what.counter || 0) + 1;
    return function() {
        callback();
        if (--what.counter == 0)
            what();
    };
}
your loop will look like this:
function whenAllDone() { ... }
for (... in ...) {
    myFunc1(afterAll(callback, whenAllDone));
}
Here afterAll creates a proxy function for the callback; it also decrements the counter and calls the whenAllDone function when all callbacks are complete.
A single thread is not always guaranteed; don't get me wrong.
Case 1:
For example, if we have 2 functions as follows.
var count = 0;
function function1() {
    alert("this thread will be suspended, count:" + count);
}
function function2() {
    // anything
    count++;
    dump(count + "\n");
}
Then, before function1 returns, function2 can also be called. If a single thread were guaranteed, function2 would not be called before function1 returns. You can try this, and you will find that count keeps going up while you are being alerted.
Case 2: in Firefox chrome code, before one function returns (with no alert inside), another function can also be called.
So a mutex lock is indeed needed.
There are many, many ways to achieve this, I hope these suggestions help!
First, I would transform the callback into a promise! Here is one way to do that:
function aPromise(arg) {
    return new Promise((resolve, reject) => {
        aCallback(arg, (err, result) => {
            if (err) reject(err);
            else resolve(result);
        });
    })
}
Next, use reduce to process the elements of an array one by one!
const arrayOfArg = ["one", "two", "three"];
const promise = arrayOfArg.reduce(
    (promise, arg) => promise.then(() => aPromise(arg)), // after the previous promise, return the result of the aPromise function as the next promise
    Promise.resolve(null) // initial resolved promise
);
promise.then(() => {
    // carry on
});
If you want to process all elements of an array at the same time, use map and Promise.all!
const arrayOfArg = ["one", "two", "three"];
const promise = Promise.all(arrayOfArg.map(
    arg => aPromise(arg)
));
promise.then(() => {
    // carry on
});
If you are able to use async / await then you could just simply do this:
// note: await must be used inside an async function
async function run() {
    const arrayOfArg = ["one", "two", "three"];
    for (let arg of arrayOfArg) {
        await aPromise(arg); // wow
    }
    // carry on
}
You might even use my very cool synchronize-async library like this:
const arrayOfArg = ["one", "two", "three"];
const context = {}; // can be any kind of object, this is the threadish context
for (let arg of arrayOfArg) {
    synchronizeCall(aPromise, arg); // synchronize the calls in the given context
}
join(context).then(() => { // join will resolve when all calls in the context are finished
    // carry on
});
And last but not least, use the fine async library if you really don't want to use promises.
const arrayOfArg = ["one", "two", "three"];
async.each(arrayOfArg, aCallback, err => {
    if (err) throw err; // handle the error!
    // carry on
});
