Looking for batching approach for JS Async generator - javascript

I am trying to add batching capability to an async JS generator. The idea is to have a function that wraps a non-batched generator: it would call the generator's next method several times to launch several async operations concurrently, then return the first value, taking care to refill the batch object as it hands items back to the client. The following examples demonstrate the working case that doesn't use the wrapper, as well as the wrapper case that produces correct results but doesn't achieve the desired behavior of concurrent execution of the batched promises.
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}
async function foo(v) {
  await sleep(1000);
  return v;
}
const createAsyncGenerator = async function*(){
  for (let i = 0; i < 5000; i++) {
    yield foo(i);
  }
}
const createBatchedAsyncGenerator = async function*(batch_size){
  const batch = [];
  for (let i = 0; i < batch_size; i++) {
    batch.push(foo(i));
  }
  for (let i = batch_size; i < 500; i++) {
    batch.push(foo(i));
    yield batch.shift();
  }
}
function batchAsyncGenerator(generator) {
  return {
    batch: [],
    [Symbol.asyncIterator]() {
      while (this.batch.length < 5) {
        this.batch.push(generator.next());
      }
      return {
        batch: this.batch,
        async next() {
          this.batch.push(generator.next());
          const result = this.batch.shift();
          return result;
        }
      }
    }
  }
}
const batching_works = async () => {
  const asyncGenerator = createBatchedAsyncGenerator(5);
  for await (const item of asyncGenerator) {
    console.log(item)
  }
}
const batching_doesnt_work = async () => {
  const asyncGenerator = batchAsyncGenerator(createAsyncGenerator());
  for await (const item of asyncGenerator) {
    console.log(item)
  }
}
batching_works()
//batching_doesnt_work()

This function would call the generator's next method several times to launch several async operations concurrently
That is unfortunately (?) not how async generators work. If you yield x in an async function*, what is actually happening is
await yield await x
(see steps 5 and 8.b of the AsyncGeneratorYield abstract operation). This means that all actions inside the generator body will be strictly sequential and may not overlap just because the async generator is iterated "faster". Every .next() call you make on an async generator is in fact queued to make awaits inside the generator body possible.
You can achieve the desired behaviour using a synchronous iterator that yields promises, though.
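For instance, here is a minimal sketch of that idea (not from the original answer), reusing the foo() helper from the question: a plain synchronous generator yields promises, and a small consumer keeps a fixed number of them in flight at once.

function* createSyncGenerator() {
  // Calling next() on a *sync* generator starts the async work immediately,
  // without the internal queuing that async generators perform.
  for (let i = 0; i < 5000; i++) {
    yield foo(i);
  }
}

async function consumeBatched(gen, size) {
  const batch = [];
  for (const promise of gen) {
    batch.push(promise);
    if (batch.length >= size) {
      console.log(await batch.shift()); // up to `size` promises are pending here
    }
  }
  while (batch.length) {
    console.log(await batch.shift()); // drain whatever is still in flight
  }
}

consumeBatched(createSyncGenerator(), 5);

(The same caveat about rejected promises in the batch, discussed below, applies to this sketch as well.)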
Finally, note a huge drawback of your implementations (both createBatchedAsyncGenerator and batchAsyncGenerator): they don't properly handle errors, and will potentially cause unhandled promise rejections to crash your application. See Waiting for more than one concurrent await operation and Any difference between await Promise.all() and multiple await? for details.

The batching function could be a generator function itself:
function sleep(ms) {
return new Promise(resolve => setTimeout(() => resolve(), ms));
}
const foo = async(v) => {
await sleep(150)
return v
}
const createAsyncGenerator = async function*() {
for (let i = 0; i < 50; i++) {
const r = await foo(i)
yield r
}
}
// creating the batches & handling the results
const asyncBatchGenerator = async function*({
batch,
fn
}) {
const a = [...Array(batch)].fill('').map((_) => fn.next())
const res = await Promise.all(a)
yield res
}
// wrapper function that handles the different states
// (e.g. asyncBatchGenerator is re-initialized, while fn
// is kept until "closed")
const batchingWrapper = async(fn) => {
const genFn = fn()
const bf = () => asyncBatchGenerator({
batch: 8,
fn: genFn
})
let res = []
let genClosed = false
while (!genClosed) {
for await (let value of bf()) {
// creating the return array: only items that
// hold a real value are added to the result
res = [...res, ...value.filter(({
done
}) => !done)]
// logging the growing return array:
console.log(res)
if (value.some(({
done
}) => done)) {
genClosed = true
}
}
}
console.log('generator closed')
}
// calling the wrapper with the argument:
batchingWrapper(createAsyncGenerator)
The batching generator is re-created on every pass - until the wrapper generator function closes.

Related

Can I prevent an `AsyncGenerator` from yielding after its `return()` method has been invoked?

AsyncGenerator.prototype.return() - JavaScript | MDN states:
The return() method of an async generator acts as if a return statement is inserted in the generator's body at the current suspended position, which finishes the generator and allows the generator to perform any cleanup tasks when combined with a try...finally block.
Why then does the following code print 0–3 rather than only 0–2?
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
const values = (async function* delayedIntegers() {
let n = 0;
while (true) {
yield n++;
await delay(100);
}
})();
await Promise.all([
(async () => {
for await (const value of values) console.log(value);
})(),
(async () => {
await delay(250);
values.return();
})(),
]);
I tried adding log statements to better understand where the "current suspended position" is. From what I can tell, when I call the return() method the AsyncGenerator instance isn't suspended (the body execution isn't at a yield statement); instead of returning once it reaches the yield statement, the next value is yielded and then the generator suspends, at which point the "return" finally happens.
Is there any way to detect that the return() method has been invoked and not yield afterwards?
I can implement the AsyncIterator interface myself but then I lose the yield syntax supported by async generators:
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
const values = (() => {
let n = 0;
let done = false;
return {
[Symbol.asyncIterator]() {
return this;
},
async next() {
if (done) return { done, value: undefined };
if (n !== 0) {
await delay(100);
if (done) return { done, value: undefined };
}
return { done, value: n++ };
},
async return() {
done = true;
return { done, value: undefined };
},
};
})();
await Promise.all([
(async () => {
for await (const value of values) console.log(value);
})(),
(async () => {
await delay(250);
values.return();
})(),
]);
Why does the code print 0–3 rather than only 0–2? From what I can tell, when I call the return() method the AsyncGenerator instance isn't suspended (the body execution isn't at a yield statement); instead of returning once it reaches the yield statement, the next value is yielded and then the generator suspends, at which point the "return" finally happens.
Yes, precisely this is what happens. The generator is already running because the for await … of loop did call its .next() method, and so the generator will complete that before considering the .return() call.
All the methods that you invoke on an async generator are queued. (In a sync generator, you'd get a "TypeError: Generator is already running" instead). One can demonstrate this by immediately calling next multiple times:
const values = (async function*() {
let i=0; while (true) {
await new Promise(r => { setTimeout(r, 1000); });
yield i++;
}
})();
values.next().then(console.log, console.error);
values.next().then(console.log, console.error);
values.next().then(console.log, console.error);
values.return('done').then(console.log, console.error);
values.next().then(console.log, console.error);
Is there any way to detect that the return() method has been invoked and not yield afterwards?
No, not from within the generator. And really you probably still should yield the value if you already expended the effort to produce it.
It sounds like what you want to do is to ignore the produced value when you want the generator to stop. You should do that in your for await … of loop - and you can also use it to stop the generator by using a break statement:
const delay = (ms) => new Promise((resolve) => {
setTimeout(resolve, ms);
});
async function* delayedIntegers() {
let n = 0;
while (true) {
yield n++;
await delay(1000);
}
}
(async function main() {
const start = Date.now();
const values = delayedIntegers();
for await (const value of values) {
if (Date.now() - start > 2500) {
console.log('done:', value);
break;
}
console.log(value);
}
})();
But if you really want to abort the generator from the outside, you need an out-of-band channel to signal the cancellation. You can use an AbortSignal for this:
const delay = (ms, signal) => new Promise((resolve, reject) => {
function done() {
resolve();
signal?.removeEventListener("abort", stop);
}
function stop() {
reject(this.reason);
clearTimeout(handle);
}
signal?.throwIfAborted();
const handle = setTimeout(done, ms);
signal?.addEventListener("abort", stop, {once: true});
});
async function* delayedIntegers(signal) {
let n = 0;
while (true) {
yield n++;
await delay(1000, signal);
}
}
(async function main() {
try {
const values = delayedIntegers(AbortSignal.timeout(2500));
for await (const value of values) {
console.log(value);
}
} catch(e) {
if (e.name != "TimeoutError") throw e;
console.log("done");
}
})();
This will actually permit stopping the generator during the timeout, not only after the full second has elapsed.
Is there a way to prevent this "extra yield" after invoking the return method? If not, are there libraries, patterns, etc. our there that avoid this while still implementing these AsyncIterator interface optional properties?
As @Bergi clearly explained, the extra yield cannot be avoided with the AsyncGenerator.return() method. This is a really interesting case, but I don't think you will find libraries that fix it. @Bergi proposed a clever solution using an AbortSignal; I have tried a different approach using only Promises:
(async function test() {
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
const wrapIntoStoppable = function (generator) {
const newGenerator = {
isGeneratorStopped: false,
resolveStopPromise: null,
async *[Symbol.asyncIterator]() {
let stoppedSymbol = Symbol('stoppedPromise')
let stoppingPromise
while (true) {
if (this.isGeneratorStopped)
return
stoppingPromise = new Promise((resolve, _) => this.resolveStopPromise = resolve)
.then(_ => stoppedSymbol)
const nextValuePromise = generator.next()
const result = await Promise.race([nextValuePromise, stoppingPromise])
this.resolveStopPromise() // resolve the promise in case it is still pending
if (result === stoppedSymbol)
return
else
yield result.value
}
},
stop: function() {
this.resolveStopPromise()
this.isGeneratorStopped = true
}
}
const handler = {
get: function(target, prop, receiver) {
if (['next', 'return', 'throw'].includes(prop))
return generator[prop].bind(generator)
else
return newGenerator[prop].bind(newGenerator)
}
}
return new Proxy(newGenerator, handler)
}
const values = wrapIntoStoppable((async function* delayedIntegers() {
let n = 0;
while (true) {
yield n++;
await delay(100);
}
})());
await Promise.all([
(async () => {
for await (const value of values) {
console.log(Date.now())
console.log(value);
}
console.log(Date.now())
// console.log(await values.next())
// console.log(await values.return())
// console.log(await values.throw())
})(),
(async () => {
await delay(250);
values.stop()
})(),
]);
})();
The idea is that I wrap an async generator with an object that has an async iterator. All the elements yielded by the wrapping generator are yielded by the original generator, but now 2 promises are started:
nextValuePromise that will return the value to yield
stoppingPromise that will end the iteration if resolved before the previous one
In this way, if the stop() method (which resolves stoppingPromise) is called before the first promise is resolved, then Promise.race() will immediately return a dummy Symbol. When the result of the race is this symbol, the iterator returns. The stop() function also sets the isGeneratorStopped flag, which makes sure the iterator will eventually return even if stop() is called after the stoppingPromise has already been resolved.
I have also used a Proxy to make sure that the wrapping object behaves as a true AsyncGenerator. Calls to next(), return() or throw() are simply forwarded to the wrapped generator.
Let's see the pros:
wrapIntoStoppable can become a util method that just wraps any async generator. This is certainly convenient because you don't have to use signals every time there is a pending Promise
Once the stop() method is called on the async generator, the for await...of loop immediately returns. Note: this doesn't mean that pending Promises are aborted
And now the cons:
Maybe too much code to maintain? Now the generator has a proxy that wraps another wrapper... I would like to simplify the design at least
After the generator is stopped, the nextValuePromise could still resolve in the meantime, causing some potential side effects. This is the main reason why it is a pretty dangerous library function.
Actually, I think you could even merge @Bergi's and my solution and manage to abort a Promise when the stop() method is called. However, in this case, all the promises need to handle the abort signals.
await suspends the async part of the function, but not the generator part, so AsyncGenerator.return() cannot act on an await suspension, only on a yield suspension. And I think that's why AsyncGenerator.return() returns a promise, but Generator.return() does not.
Yes, Bergi is right. The for await loop invokes .next() right after consuming each value, so a yield is already queued before the return. What happens is:
# 0ms 0 gets yielded and .next() puts yield in charge to yield 1 once resolved.
#100ms 1 gets yielded and .next() puts yield in charge to yield 2 once resolved.
#200ms 2 gets yielded and .next() puts yield in charge to yield 3 once resolved.
#250ms a values.return() is enqueued but yield has already been queued to yield 3.
#300ms 3 gets yielded and generator finalizes along with the iterable values.
Now the thing is, if we find a way to resolve or reject the promise waiting for 3 prematurely at #250ms, then we are fine. Yet without using the abort abstraction you can still do this with naked promises, and even without using an async generator. You just need to lift the resolve and reject functions out of the generator function's scope and invoke them from there. I think it's best to reject prematurely and catch the rejection at the outer scope (silently or not).
Here is a way to accomplish this:
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms)),
prior = {};
const values = (function* delayedIntegers() {
let n = 0,
p = new Promise((v,x) => Object.assign(prior,{v,x}));
prior.v(n++); // resolve p with 0 and increment n
while (true) {
yield p
p = new Promise((v,x) => Object.assign(prior,{v,x}));
delay(100).then(_ => prior.v(n++));
}
})();
(async () => {
try {
for await (const value of values) console.log(value);
}
catch(e){
console.log(e);
values.return();
}
})();
delay(250).then(_ => prior.x("Finalized..!"));
This is almost like your code, but there is this prior object which holds the resolve and reject functions of a promise callback as v and x respectively.
Since the prior object is accessible from the outer context, we can invoke its x method (rejection) before the while loop in the generator resolves 3, and catch the rejection with the employed catch(e).

How to run setTimeout synchronously in a loop [duplicate]

for (let i = 0; i < 10; i++) {
const promise = new Promise((resolve, reject) => {
const timeout = Math.random() * 1000;
setTimeout(() => {
console.log(i);
}, timeout);
});
// TODO: Chain this promise to the previous one (maybe without having it running?)
}
The above will give the following random output:
6
9
4
8
5
1
7
2
3
0
The task is simple: Make sure each promise runs only after the other one (.then()).
For some reason, I couldn't find a way to do it.
I tried generator functions (yield), tried simple functions that return a promise, but at the end of the day it always comes down to the same problem: The loop is synchronous.
With async I'd simply use async.series().
How do you solve it?
As you already hinted in your question, your code creates all promises synchronously. Instead they should only be created at the time the preceding one resolves.
Secondly, each promise that is created with new Promise needs to be resolved with a call to resolve (or reject). This should be done when the timer expires. That will trigger any then callback you would have on that promise. And such a then callback (or await) is a necessity in order to implement the chain.
With those ingredients, there are several ways to perform this asynchronous chaining:
With a for loop that starts with an immediately resolving promise
With Array#reduce that starts with an immediately resolving promise
With a function that passes itself as resolution callback
With ECMAScript2017's async / await syntax
With ECMAScript2020's for await...of syntax
But let me first introduce a very useful, generic function.
Promisifying setTimeout
Using setTimeout is fine, but we actually need a promise that resolves when the timer expires. So let's create such a function: this is called promisifying a function, in this case we will promisify setTimeout. It will improve the readability of the code, and can be used for all of the above options:
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
See a snippet and comments for each of the options below.
1. With for
You can use a for loop, but you must make sure it doesn't create all promises synchronously. Instead you create an initial immediately resolving promise, and then chain new promises as the previous ones resolve:
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
for (let i = 0, p = Promise.resolve(); i < 10; i++) {
p = p.then(() => delay(Math.random() * 1000))
.then(() => console.log(i));
}
So this code creates one long chain of then calls. The variable p only serves to not lose track of that chain, and allow a next iteration of the loop to continue on the same chain. The callbacks will start executing after the synchronous loop has completed.
It is important that the then-callback returns the promise that delay() creates: this will ensure the asynchronous chaining.
2. With reduce
This is just a more functional approach to the previous strategy. You create an array with the same length as the chain you want to execute, and start out with an immediately resolving promise:
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
[...Array(10)].reduce( (p, _, i) =>
p.then(() => delay(Math.random() * 1000))
.then(() => console.log(i))
, Promise.resolve() );
This is probably more useful when you actually have an array with data to be used in the promises.
3. With a function passing itself as resolution-callback
Here we create a function and call it immediately. It creates the first promise synchronously. When it resolves, the function is called again:
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
(function loop(i) {
if (i >= 10) return; // all done
delay(Math.random() * 1000).then(() => {
console.log(i);
loop(i+1);
});
})(0);
This creates a function named loop, and at the very end of the code you can see it gets called immediately with argument 0. This argument is the counter, i. The function will create a new promise if that counter is still below 10, otherwise the chaining stops.
When delay() resolves, it will trigger the then callback which will call the function again.
4. With async/await
Modern JS engines support this syntax:
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
(async function loop() {
for (let i = 0; i < 10; i++) {
await delay(Math.random() * 1000);
console.log(i);
}
})();
It may look strange, as it seems like the promises are created synchronously, but in reality the async function returns when it executes the first await. Every time an awaited promise resolves, the function's running context is restored, and proceeds after the await, until it encounters the next one, and so it continues until the loop finishes.
5. With for await...of
With ECMAScript 2020, the for await...of found its way to modern JavaScript engines. Although it does not really reduce code in this case, it allows you to isolate the definition of the random interval chain from the actual iteration of it:
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
async function * randomDelays(count, max) {
for (let i = 0; i < count; i++) yield delay(Math.random() * max).then(() => i);
}
(async function loop() {
for await (let i of randomDelays(10, 1000)) console.log(i);
})();
You can use async/await for this. I would explain more, but there's nothing really to it. It's just a regular for loop but I added the await keyword before the construction of your Promise
What I like about this is your Promise can resolve a normal value instead of having a side effect like your code (or other answers here) include. This gives you powers like in The Legend of Zelda: A Link to the Past where you can affect things in both the Light World and the Dark World – ie, you can easily work with data before/after the Promised data is available without having to resort to deeply nested functions, other unwieldy control structures, or stupid IIFEs.
// where DarkWorld is in the scary, unknown future
// where LightWorld is the world we saved from Ganondorf
LightWorld ... await DarkWorld
So here's what that will look like ...
async function someProcedure (n) {
for (let i = 0; i < n; i++) {
const t = Math.random() * 1000
const x = await new Promise(r => setTimeout(r, t, i))
console.log (i, x)
}
return 'done'
}
someProcedure(10)
.then(console.log)
.catch(console.error)
0 0
1 1
2 2
3 3
4 4
5 5
6 6
7 7
8 8
9 9
done
See how we don't have to deal with that bothersome .then call within our procedure? And async keyword will automatically ensure that a Promise is returned, so we can chain a .then call on the returned value. This sets us up for great success: run the sequence of n Promises, then do something important – like display a success/error message.
Based on the excellent answer by trincot, I wrote a reusable function that accepts a handler to run over each item in an array. The function itself returns a promise that allows you to wait until the loop has finished and the handler function that you pass may also return a promise.
loop(items, handler) : Promise
It took me some time to get it right, but I believe the following code will be usable in a lot of promise-looping situations.
Copy-paste ready code:
// SEE https://stackoverflow.com/a/46295049/286685
const loop = (arr, fn, busy, err, i=0) => {
const body = (ok,er) => {
try {const r = fn(arr[i], i, arr); r && r.then ? r.then(ok).catch(er) : ok(r)}
catch(e) {er(e)}
}
const next = (ok,er) => () => loop(arr, fn, ok, er, ++i)
const run = (ok,er) => i < arr.length ? new Promise(body).then(next(ok,er)).catch(er) : ok()
return busy ? run(busy,err) : new Promise(run)
}
Usage
To use it, call it with the array to loop over as the first argument and the handler function as the second. Do not pass parameters for the third, fourth and fifth arguments, they are used internally.
const loop = (arr, fn, busy, err, i=0) => {
const body = (ok,er) => {
try {const r = fn(arr[i], i, arr); r && r.then ? r.then(ok).catch(er) : ok(r)}
catch(e) {er(e)}
}
const next = (ok,er) => () => loop(arr, fn, ok, er, ++i)
const run = (ok,er) => i < arr.length ? new Promise(body).then(next(ok,er)).catch(er) : ok()
return busy ? run(busy,err) : new Promise(run)
}
const items = ['one', 'two', 'three']
loop(items, item => {
console.info(item)
})
.then(() => console.info('Done!'))
Advanced use cases
Let's look at the handler function, nested loops and error handling.
handler(current, index, all)
The handler gets passed 3 arguments. The current item, the index of the current item and the complete array being looped over. If the handler function needs to do async work, it can return a promise and the loop function will wait for the promise to resolve before starting the next iteration. You can nest loop invocations and all works as expected.
const loop = (arr, fn, busy, err, i=0) => {
const body = (ok,er) => {
try {const r = fn(arr[i], i, arr); r && r.then ? r.then(ok).catch(er) : ok(r)}
catch(e) {er(e)}
}
const next = (ok,er) => () => loop(arr, fn, ok, er, ++i)
const run = (ok,er) => i < arr.length ? new Promise(body).then(next(ok,er)).catch(er) : ok()
return busy ? run(busy,err) : new Promise(run)
}
const tests = [
[],
['one', 'two'],
['A', 'B', 'C']
]
loop(tests, (test, idx, all) => new Promise((testNext, testFailed) => {
console.info('Performing test ' + idx)
return loop(test, (testCase) => {
console.info(testCase)
})
.then(testNext)
.catch(testFailed)
}))
.then(() => console.info('All tests done'))
Error handling
Many promise-looping examples I looked at break down when an exception occurs. Getting this function to do the right thing was pretty tricky, but as far as I can tell it is working now. Make sure to add a catch handler to any inner loops and invoke the rejection function when it happens. E.g.:
const loop = (arr, fn, busy, err, i=0) => {
const body = (ok,er) => {
try {const r = fn(arr[i], i, arr); r && r.then ? r.then(ok).catch(er) : ok(r)}
catch(e) {er(e)}
}
const next = (ok,er) => () => loop(arr, fn, ok, er, ++i)
const run = (ok,er) => i < arr.length ? new Promise(body).then(next(ok,er)).catch(er) : ok()
return busy ? run(busy,err) : new Promise(run)
}
const tests = [
[],
['one', 'two'],
['A', 'B', 'C']
]
loop(tests, (test, idx, all) => new Promise((testNext, testFailed) => {
console.info('Performing test ' + idx)
loop(test, (testCase) => {
if (idx == 2) throw new Error()
console.info(testCase)
})
.then(testNext)
.catch(testFailed) // <--- DON'T FORGET!!
}))
.then(() => console.error('Oops, test should have failed'))
.catch(e => console.info('Successfully caught error: ', e))
.then(() => console.info('All tests done'))
UPDATE: NPM package
Since writing this answer, I turned the above code in an NPM package.
for-async
Install
npm install --save for-async
Import
var forAsync = require('for-async'); // Common JS, or
import forAsync from 'for-async';
Usage (async)
var arr = ['some', 'cool', 'array'];
forAsync(arr, function(item, idx){
return new Promise(function(resolve){
setTimeout(function(){
console.info(item, idx);
// Logs 3 lines: `some 0`, `cool 1`, `array 2`
resolve(); // <-- signals that this iteration is complete
}, 25); // delay 25 ms to make async
})
})
See the package readme for more details.
If you are limited to ES6, one option is Promise.all. Promise.all(array) returns a single promise that resolves to an array of results once all the promises in the array argument have resolved.
Suppose you want to update many student records in the database; the following code demonstrates the concept of Promise.all in such a case:
let promises = students.map((student, index) => {
//where students is a db object
student.rollNo = index + 1;
student.city = 'City Name';
//Update whatever information on student you want
return student.save();
});
Promise.all(promises).then(() => {
//All the save queries will be executed when .then is executed
//You can do further operations here after as all update operations are completed now
});
map is just one way to produce the loop here. You can also use a for, for...in, or forEach loop. The concept is pretty simple: start the loop in which you want to do bulk async operations, push every such async operation (promise) into an array declared outside the scope of that loop, and after the loop completes, call Promise.all with the prepared array of such queries/promises as its argument.
The basic concept is that the JavaScript loop is synchronous whereas the database call is async, and the push call in the loop is also synchronous. So the problem of asynchronous behavior doesn't occur inside the loop. A plain for loop version of the same idea is sketched below.
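A minimal sketch of that for loop variant, assuming the same students collection and save() method used above:

const savePromises = [];
for (let index = 0; index < students.length; index++) {
  const student = students[index];
  student.rollNo = index + 1;
  student.city = 'City Name';
  savePromises.push(student.save()); // push the pending promise, don't await here
}
Promise.all(savePromises).then(() => {
  // all save operations have completed at this point
});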
here's my 2 cents worth:
reusable function forpromise()
emulates a classic for loop
allows for early exit based on internal logic, returning a value
can collect an array of results passed into resolve/next/collect
defaults to start=0,increment=1
exceptions thrown inside loop are caught and passed to .catch()
function forpromise(lo, hi, st, res, fn) {
if (typeof res === 'function') {
fn = res;
res = undefined;
}
if (typeof hi === 'function') {
fn = hi;
hi = lo;
lo = 0;
st = 1;
}
if (typeof st === 'function') {
fn = st;
st = 1;
}
return new Promise(function(resolve, reject) {
(function loop(i) {
if (i >= hi) return resolve(res);
const promise = new Promise(function(nxt, brk) {
try {
fn(i, nxt, brk);
} catch (ouch) {
return reject(ouch);
}
});
promise.catch(function(brkres) {
hi = lo - st;
resolve(brkres)
}).then(function(el) {
if (res) res.push(el);
loop(i + st)
});
})(lo);
});
}
//no result returned, just loop from 0 thru 9
forpromise(0, 10, function(i, next) {
console.log("iterating:", i);
next();
}).then(function() {
console.log("test result 1", arguments);
//shortform:no result returned, just loop from 0 thru 4
forpromise(5, function(i, next) {
console.log("counting:", i);
next();
}).then(function() {
console.log("test result 2", arguments);
//collect result array, even numbers only
forpromise(0, 10, 2, [], function(i, collect) {
console.log("adding item:", i);
collect("result-" + i);
}).then(function() {
console.log("test result 3", arguments);
//collect results, even numbers, break loop early with different result
forpromise(0, 10, 2, [], function(i, collect, break_) {
console.log("adding item:", i);
if (i === 8) return break_("ending early");
collect("result-" + i);
}).then(function() {
console.log("test result 4", arguments);
// collect results, but break loop on exception thrown, which we catch
forpromise(0, 10, 2, [], function(i, collect, break_) {
console.log("adding item:", i);
if (i === 4) throw new Error("failure inside loop");
collect("result-" + i);
}).then(function() {
console.log("test result 5", arguments);
}).catch(function(err) {
console.log("caught in test 5:[Error ", err.message, "]");
});
});
});
});
});
You can use for await...of (available since ES2018):
(async () => {
for await (const num of asyncIterable) {
console.log(num);
}
// My action here
})();
For more information, see the for await...of documentation.
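Since asyncIterable isn't defined in the snippet, here is a minimal, assumed stand-in (an async generator yielding one number per second) to make it runnable:

const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
// hypothetical async iterable: yields 0..4, one value per second, in order
async function* makeAsyncIterable() {
  for (let i = 0; i < 5; i++) {
    await delay(1000);
    yield i;
  }
}
(async () => {
  for await (const num of makeAsyncIterable()) {
    console.log(num);
  }
  // My action here
})();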
I found the previous answers confusing, so I coded the following, inspired by them. I think its logic is more obvious: I call the function recursively to replace the original for loop:
async function pointToCountry(world, data) { // Data is for loop array
if (data.length > 0) { // For condition
const da = data.shift(); // Get current data and modified data one row code
// Some business logic
msg = da.info
pointofView(world, da);
// Await the current task
await new Promise(r => setTimeout(_ => {
r() // Resolve and finish the current task
}, 5000))
// Call itself and enter the next loop
pointToCountry(world, data)
} else { // Business logic after all tasks
pointofView(world, { longitude: 0, latitude: 0 });
world.controls().autoRotate = true;
}
}
// This is my main function - calculate all project by city
const projectCity = async (req, res, next) => {
try {
let record = [];
let cityList = await CityModel.find({active:true});
for (let j = 0; j < cityList.length; j++) {
let arr = [];
let projectList = await getProduct(cityList[j]._id)
arr.push({
_id:cityList[j]._id,
name:cityList[j].name,
projectList:projectList
})
record.push(arr);
}
return res.status(200).send({
status: CONSTANT.REQUESTED_CODES.SUCCESS,
result: record });
} catch (error) {
return res.status(400).json(UTILS.errorHandler(error));
}
};
async function getProduct(city){
let projectList = await ProjectModel.find({city:city});
return projectList;
}
I've created a snippet in Angular that loops a promise function indefinitely. You can start it, stop it, or restart it.
You basically need to recursively call the same method and await its current process, like so:
async autoloop(): Promise<void> {
if(this.running){
await this.runMe();
await this.autoloop();
}
return Promise.resolve();
}
JavaScript:
import { Component } from '@angular/core';
@Component({
selector: 'my-app',
templateUrl: './app.component.html',
styleUrls: ['./app.component.css'],
})
export class AppComponent {
messages: string[] = [];
counter = 1;
running = false;
constructor() {
this.start();
}
onClick(): void {
this.running = !this.running;
if(this.running){
this.start();
}
else{
this.stop();
}
}
async onRestartClick(): Promise<void>{
await this.stop();
this.messages = [];
this.counter = 1;
this.start();
}
start(): void{
this.running = true;
this.autoloop();
}
async stop(): Promise<void>{
this.running = false;
await this.delay(1000);
}
async autoloop(): Promise<void> {
if(this.running){
await this.runMe();
await this.autoloop();
}
return Promise.resolve();
}
async runMe(): Promise<void> {
await this.delay(1000);
if(this.running){
this.messages.push(`Message ${this.counter++}`);
}
return Promise.resolve();
}
async delay(ms: number) {
await new Promise<void>((resolve) => setTimeout(() => resolve(), ms));
}
}
Html:
<h1>Endless looping a promise every 1 second</h1>
<button (click)="onClick()">Start / stop</button>
<button (click)="onRestartClick()">Restart</button>
<p *ngFor="let message of messages">
{{message}}
</p>

What are the practical differences between an AsyncIterable and an Observable?

I've been hung up about this topic lately. It seems AsyncIterables and Observables both have stream-like qualities, though they are consumed a bit differently.
You could consume an async iterable like this
const myAsyncIterable = async function*() { yield 1; yield 2; yield 3; }
const main = async () => {
for await (const number of myAsyncIterable()) {
console.log(number)
}
}
main()
You can consume an observable like this
const Observable = rxjs
const { map } = rxjs.operators
Observable.of(1, 2, 3).subscribe(x => console.log(x))
<script src="https://unpkg.com/rxjs/bundles/rxjs.umd.min.js"></script>
My overarching question is based on this RxJS PR:
If the observable emits at a pace faster than the loop completes, there will be a memory build up as the buffer gets more full. We could provide other methods that use different strategies (e.g. just the most recent value, etc), but leave this as the default. Note that the loop itself may have several awaits in it, that exacerbate the problem.
It seems to me that async iterators inherently do not have a backpressure problem, so is it right to implement Symbol.asyncIterator (@@asyncIterator) on an Observable and default to a backpressure strategy? Is there even a need for Observables in light of AsyncIterables?
Ideally, you could show me practical differences between AsyncIterables and Observables with code examples.
The main difference is which side decides when to iterate.
In the case of Async Iterators the client decides by calling await iterator.next(). The source decides when to resolve the promise, but the client has to ask for the next value first. Thus, the consumer "pulls" the data in from the source.
Observables register a callback function which is called by the observable immediately when a new value comes in. Thus, the source "pushes" to the consumer.
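A tiny side-by-side sketch of the two styles (the push example assumes the rxjs UMD bundle loaded in the question above):

// Pull: the consumer asks for each value and awaits it
const numbers = (async function* () { yield 1; yield 2; yield 3; })();
(async () => {
  for (let r = await numbers.next(); !r.done; r = await numbers.next()) {
    console.log('pulled', r.value); // nothing happens until we call next()
  }
})();

// Push: the source calls us back whenever it has a value
rxjs.interval(500)
  .pipe(rxjs.operators.take(3))
  .subscribe(x => console.log('pushed', x)); // fires whenever the source emits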
An Observable could easily be used to consume an Async Iterator by using a Subject and mapping it to the next value of the async iterator. You would then call next on the Subject whenever you're ready to consume the next item. Here is a code sample
import { Subject, from } from 'rxjs';
import { concatMap, map } from 'rxjs/operators';
// `iter` is assumed to be an existing async iterator
const pull = new Subject();
const output = pull.pipe(
concatMap(() => from(iter.next())),
map(val => {
if(val.done) pull.complete();
return val.value;
})
);
//wherever you need this
output.pipe(
).subscribe(() => {
//we're ready for the next item
if(!pull.closed) pull.next();
});
This is the current implementation of Observable[Symbol.asyncIterator].
Here's a basic example of Symbol.asyncIterator implemented on an array:
const dummyPromise = (val, time) => new Promise(res => setTimeout(res, time * 1000, val));
const items = [1, 2, 3];
items[Symbol.asyncIterator] = async function * () {
yield * await this.map(v => dummyPromise(v, v));
}
!(async () => {
for await (const value of items) {
console.log(value);
}
})();
/*
1 - after 1s
2 - after 2s
3 - after 3s
*/
The way I understand generators (sync generators) is that they are pausable functions, meaning that you can request a value right now and another value 10 seconds later. Async generators follow the same approach, except that the value they produce is asynchronous, which means that you'll have to await it.
For instance:
const dummyPromise = (val, time) => new Promise(res => setTimeout(res, time * 1000, val));
const items = [1, 2, 3];
items[Symbol.asyncIterator] = async function * () {
yield * await this.map(v => dummyPromise(v, v));
}
const it = items[Symbol.asyncIterator]();
(async () => {
// console.log(await it.next())
await it.next();
setTimeout(async () => {
console.log(await it.next());
}, 2000); // It will take 4s in total
})();
Going back to the Observable's implementation:
async function* coroutine<T>(source: Observable<T>) {
const deferreds: Deferred<IteratorResult<T>>[] = [];
const values: T[] = [];
let hasError = false;
let error: any = null;
let completed = false;
const subs = source.subscribe({
next: value => {
if (deferreds.length > 0) {
deferreds.shift()!.resolve({ value, done: false });
} else {
values.push(value);
}
},
error: err => { /* ... */ },
complete: () => { /* ... */ },
});
try {
while (true) {
if (values.length > 0) {
yield values.shift();
} else if (completed) {
return;
} else if (hasError) {
throw error;
} else {
const d = new Deferred<IteratorResult<T>>();
deferreds.push(d);
const result = await d.promise;
if (result.done) {
return;
} else {
yield result.value;
}
}
}
} catch (err) {
throw err;
} finally {
subs.unsubscribe();
}
}
From my understanding:
values is used to keep track of synchronous values
If you have of(1, 2, 3), the values array will contain [1, 2, 3] before execution even reaches while (true) { }. And because you're using a for await (const v of ...), you'd be requesting values as if you were doing it.next(); it.next(); it.next() ....
Put differently, as soon as you consume one value from your iterator, you immediately request the next one, until the data producer has nothing left to offer.
deferreds is used for asynchronous values
So at your first it.next(), the values array is empty (meaning that the observable did not emit synchronously), so it falls back to the last else branch, which simply creates a promise that is added to deferreds, after which that promise is awaited until it either resolves or rejects.
When the observable finally emits, deferreds won't be empty, so the awaited promise will resolve with the newly arrived value.
const src$ = merge(
timer(1000).pipe(mapTo(1)),
timer(2000).pipe(mapTo(2)),
timer(3000).pipe(mapTo(3)),
);
!(async () => {
for await (const value of src$) {
console.log(value);
}
})();
StackBlitz
The observable stuff is mind-bending, and my understanding could be flawed. But an async iterator is just an iterator that returns promises, which can resolve to future events in a "live" stream of events (a hot observable). It could be implemented using a queue as follows.
function* iterateClickEvents(target) {
const queue = []
target.addEventListener('click', e => queue.shift()?.fulfill(e))
while (true)
yield new Promise(fulfill => queue.push({fulfill}))
}
//use it
for await (const e of iterateClickEvents(myButton))
handleEvent(e)
Then you can implement fluent operators like:
class FluentIterable {
constructor(iterable) {
this.iterable = iterable
}
filter(predicate) {
return new FluentIterable(this.$filter(predicate))
}
async* $filter(predicate) {
for await (const value of this.iterable)
if (predicate(value))
yield value
}
async each(fn) {
for await (const value of this.iterable)
fn(value)
}
}
//use it
new FluentIterable(iterateClickEvents(document.body))
.filter(e => e.target == myButton)
.each(handleEvent)
.catch(console.error)
https://codepen.io/ken107/pen/PojZjgB
You could implement a map operator that yields the results of inner iterators; things get complicated from there.
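For instance, a sketch of such an operator in the same style (here called flatMap, where fn maps each value to another async iterable whose results are drained in order; the names are mine, not from the answer):

class FlatMappableIterable {
  constructor(iterable) {
    this.iterable = iterable
  }
  flatMap(fn) {
    return new FlatMappableIterable(this.$flatMap(fn))
  }
  async* $flatMap(fn) {
    for await (const value of this.iterable)
      yield* fn(value) // drain the inner iterable before moving on
  }
  async each(fn) {
    for await (const value of this.iterable)
      fn(value)
  }
}

// hypothetical usage: map each click to an inner async iterable of coordinates
// new FlatMappableIterable(iterateClickEvents(document.body))
//   .flatMap(e => (async function* () { yield [e.clientX, e.clientY] })())
//   .each(pos => console.log(pos))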

Is it possible to take advantage of Mongo threads from a NodeJS app without forking the node process? [duplicate]

I have some code that is iterating over a list that was queried out of a database and making an HTTP request for each element in that list. That list can sometimes be a reasonably large number (in the thousands), and I would like to make sure I am not hitting a web server with thousands of concurrent HTTP requests.
An abbreviated version of this code currently looks something like this...
function getCounts() {
return users.map(user => {
return new Promise(resolve => {
remoteServer.getCount(user) // makes an HTTP request
.then(() => {
/* snip */
resolve();
});
});
});
}
Promise.all(getCounts()).then(() => { /* snip */});
This code is running on Node 4.3.2. To reiterate, can Promise.all be managed so that only a certain number of Promises are in progress at any given time?
P-Limit
I have compared promise concurrency limitation with a custom script, bluebird, es6-promise-pool, and p-limit. I believe that p-limit has the simplest, most stripped-down implementation for this need. See their documentation.
Requirements
To be compatible with the async/await used in the example:
ECMAScript 2017 (version 8)
Node version > 8.2.1
My Example
In this example, we need to run a function for every URL in the array (like, maybe, an API request). Here this is called fetchData(). If we had an array of thousands of items to process, limiting concurrency would definitely be useful to save CPU and memory resources.
const pLimit = require('p-limit');
// Example Concurrency of 3 promise at once
const limit = pLimit(3);
let urls = [
"http://www.exampleone.com/",
"http://www.exampletwo.com/",
"http://www.examplethree.com/",
"http://www.examplefour.com/",
]
// Create an array of our promises using map (fetchData() returns a promise)
let promises = urls.map(url => {
// wrap the function we are calling in the limit function we defined above
return limit(() => fetchData(url));
});
(async () => {
// Only three promises are run at once (as defined above)
const result = await Promise.all(promises);
console.log(result);
})();
The console log result is an array of your resolved promises response data.
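fetchData() itself isn't defined in the snippet; any function that returns a promise will do. A hypothetical stand-in that just simulates a request with a delay:

// hypothetical fetchData: pretend to hit the URL and resolve after 500 ms
const fetchData = (url) =>
  new Promise(resolve =>
    setTimeout(() => resolve(`response from ${url}`), 500));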
Using Array.prototype.splice
while (funcs.length) {
// 100 at a time
await Promise.all( funcs.splice(0, 100).map(f => f()) )
}
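Spelled out as a runnable sketch, assuming funcs is an array of functions that each return a promise (the task functions here are made up for illustration):

const sleep = ms => new Promise(r => setTimeout(r, ms));
// hypothetical work: 250 functions that each return a promise when called
const funcs = Array.from({ length: 250 }, (_, i) =>
  () => sleep(Math.random() * 100).then(() => i));

(async () => {
  const results = [];
  while (funcs.length) {
    // 100 at a time; splice removes them from funcs, so the loop terminates
    results.push(...await Promise.all(funcs.splice(0, 100).map(f => f())));
  }
  console.log(results.length); // 250
})();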
Note that Promise.all() doesn't trigger the promises to start their work, creating the promise itself does.
With that in mind, one solution would be to check whenever a promise is resolved whether a new promise should be started or whether you're already at the limit.
However, there is really no need to reinvent the wheel here. One library that you could use for this purpose is es6-promise-pool. From their examples:
var PromisePool = require('es6-promise-pool')
var promiseProducer = function () {
// Your code goes here.
// If there is work left to be done, return the next work item as a promise.
// Otherwise, return null to indicate that all promises have been created.
// Scroll down for an example.
}
// The number of promises to process simultaneously.
var concurrency = 3
// Create a pool.
var pool = new PromisePool(promiseProducer, concurrency)
// Start the pool.
var poolPromise = pool.start()
// Wait for the pool to settle.
poolPromise.then(function () {
console.log('All promises fulfilled')
}, function (error) {
console.log('Some promise rejected: ' + error.message)
})
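For the question's use case, the promiseProducer placeholder might be filled in like this (a sketch, assuming the users array and remoteServer.getCount from the question):

var i = 0
var promiseProducer = function () {
  if (i >= users.length) {
    return null // no work left; the pool will drain and settle
  }
  var user = users[i++]
  return remoteServer.getCount(user) // one in-flight request per produced promise
}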
If you know how iterators work and how they are consumed, you wouldn't need any extra library, since it can become very easy to build your own concurrency yourself. Let me demonstrate:
/* [Symbol.iterator]() is equivalent to .values()
const iterator = [1,2,3][Symbol.iterator]() */
const iterator = [1,2,3].values()
// loop over all items with for..of
for (const x of iterator) {
console.log('x:', x)
// notice how this loop continues the same iterator
// and consumes the rest of it, so that the
// outer loop doesn't log any more x's
for (const y of iterator) {
console.log('y:', y)
}
}
We can use the same iterator and share it across workers.
If you had used .entries() instead of .values() you would have gotten an iterator that yields [index, value], which I will demonstrate below with a concurrency of 2.
const sleep = t => new Promise(rs => setTimeout(rs, t))
const iterator = Array.from('abcdefghij').entries()
// const results = [] || Array(someLength)
async function doWork (iterator, i) {
for (let [index, item] of iterator) {
await sleep(1000)
console.log(`Worker#${i}: ${index},${item}`)
// in case you need to store the results in order
// results[index] = item + item
// or if the order does not matter
// results.push(item + item)
}
}
const workers = Array(2).fill(iterator).map(doWork)
// ^--- starts two workers sharing the same iterator
Promise.allSettled(workers).then(console.log.bind(null, 'done'))
The benefit of this is that you can have a generator function instead of having everything ready at once.
What's even more awesome is that you can do stream.Readable.from(iterator) in Node (and eventually in WHATWG streams as well), and with a transferable ReadableStream this could become very useful in the future if you are working with web workers for performance.
Note: the difference compared to the async-pool example is that this spawns two workers, so if one worker throws an error, say at index 5, it won't stop the other worker from doing the rest; you just go from a concurrency of 2 down to 1 (it won't stop there). So my advice is to catch all errors inside the doWork function, as sketched below.
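One way to follow that advice, reusing the sleep helper and the shared iterator from the snippet above (a sketch, not part of the original answer):

async function doWork (iterator, i) {
  for (let [index, item] of iterator) {
    try {
      await sleep(1000)
      console.log(`Worker#${i}: ${index},${item}`)
    } catch (err) {
      // record (or swallow) the error so this worker keeps pulling
      // items from the shared iterator instead of dying
      console.error(`Worker#${i} failed on ${index}:`, err)
    }
  }
}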
Instead of using promises for limiting http requests, use node's built-in http.Agent.maxSockets. This removes the requirement of using a library or writing your own pooling code, and has the added advantage of more control over what you're limiting.
agent.maxSockets
By default set to Infinity. Determines how many concurrent sockets the agent can have open per origin. Origin is either a 'host:port' or 'host:port:localAddress' combination.
For example:
var http = require('http');
var agent = new http.Agent({maxSockets: 5}); // 5 concurrent connections per origin
var request = http.request({..., agent: agent}, ...);
If making multiple requests to the same origin, it might also benefit you to set keepAlive to true (see docs above for more info).
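For instance, combining both options might look like this (a sketch; example.com is just a placeholder host):

var http = require('http');
// 5 concurrent sockets per origin, and keep them alive between requests
var agent = new http.Agent({maxSockets: 5, keepAlive: true});
http.get({hostname: 'example.com', path: '/', agent: agent}, function (res) {
  res.resume(); // drain the response so the socket can be reused
});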
bluebird's Promise.map can take a concurrency option to control how many promises should be running in parallel. Sometimes it is easier than .all because you don't need to create the promise array.
const Promise = require('bluebird')
function getCounts() {
return Promise.map(users, user => {
return new Promise(resolve => {
remoteServer.getCount(user) // makes an HTTP request
.then(() => {
/* snip */
resolve();
});
});
}, {concurrency: 10}); // <---- at most 10 http requests at a time
}
As all others in this answer thread have pointed out, Promise.all() won't do the right thing if you need to limit concurrency. But ideally you shouldn't even want to wait until all of the Promises are done before processing them.
Instead, you want to process each result ASAP as soon as it becomes available, so you don't have to wait for the very last promise to finish before you start iterating over them.
So, here's a code sample that does just that, based partly on the answer by Endless and also on this answer by T.J. Crowder.
// example tasks that sleep and return a number
// in real life, you'd probably fetch URLs or something
const tasks = [];
for (let i = 0; i < 20; i++) {
tasks.push(async () => {
console.log(`start ${i}`);
await sleep(Math.random() * 1000);
console.log(`end ${i}`);
return i;
});
}
function sleep(ms) { return new Promise(r => setTimeout(r, ms)); }
(async () => {
for await (let value of runTasks(3, tasks.values())) {
console.log(`output ${value}`);
}
})();
async function* runTasks(maxConcurrency, taskIterator) {
// Each async iterator is a worker, polling for tasks from the shared taskIterator
// Sharing the iterator ensures that each worker gets unique tasks.
const asyncIterators = new Array(maxConcurrency);
for (let i = 0; i < maxConcurrency; i++) {
asyncIterators[i] = (async function* () {
for (const task of taskIterator) yield await task();
})();
}
yield* raceAsyncIterators(asyncIterators);
}
async function* raceAsyncIterators(asyncIterators) {
async function nextResultWithItsIterator(iterator) {
return { result: await iterator.next(), iterator: iterator };
}
/** @type Map<AsyncIterator<T>,
Promise<{result: IteratorResult<T>, iterator: AsyncIterator<T>}>> */
const promises = new Map(asyncIterators.map(iterator =>
[iterator, nextResultWithItsIterator(iterator)]));
while (promises.size) {
const { result, iterator } = await Promise.race(promises.values());
if (result.done) {
promises.delete(iterator);
} else {
promises.set(iterator, nextResultWithItsIterator(iterator));
yield result.value;
}
}
}
There's a lot of magic in here; let me explain.
This solution is built around async generator functions, which many JS developers may not be familiar with.
A generator function (aka function* function) returns a "generator," an iterator of results. Generator functions are allowed to use the yield keyword where you might have normally used a return keyword. The first time a caller calls next() on the generator (or uses a for...of loop), the function* function runs until it yields a value; that becomes the next() value of the iterator. But the subsequent time next() is called, the generator function resumes from the yield statement, right where it left off, even if it's in the middle of a loop. (You can also yield*, to yield all of the results of another generator function.)
An "async generator function" (async function*) is a generator function that returns an "async iterator," which is an iterator of promises. You can call for await...of on an async iterator. Async generator functions can use the await keyword, as you might do in any async function.
In the example, we call runTasks() with an array of task functions. runTasks() is an async generator function, so we can call it with a for await...of loop. Each time the loop runs, we'll process the result of the latest completed task.
runTasks() creates N async iterators, the workers. (Note that the workers are initially defined as async generator functions, but we immediately invoke each function, and store each resulting async iterator in the asyncIterators array.) The example calls runTasks with 3 concurrent workers, so no more than 3 tasks are launched at the same time. When any task completes, we immediately queue up the next task. (This is superior to "batching", where you do 3 tasks at once, await all three of them, and don't start the next batch of three until the entire previous batch has finished.)
runTasks() concludes by "racing" its async iterators with yield* raceAsyncIterators(). raceAsyncIterators() is like Promise.race() but it races N iterators of Promises instead of just N Promises; it returns an async iterator that yields the results of resolved Promises.
raceAsyncIterators() starts by defining a promises Map from each of the iterators to promises. Each promise is a promise for an iteration result along with the iterator that generated it.
With the promises map, we can Promise.race() the values of the map, giving us the winning iteration result and its iterator. If the iterator is completely done, we remove it from the map; otherwise we replace its Promise in the promises map with the iterator's next() Promise and yield result.value.
In conclusion, runTasks() is an async generator function that yields the results of racing N concurrent async iterators of tasks, so the end user can just for await (let value of runTasks(3, tasks.values())) to process each result as soon as it becomes available.
I suggest the library async-pool: https://github.com/rxaviers/async-pool
npm install tiny-async-pool
Description:
Run multiple promise-returning & async functions with limited concurrency using native ES6/ES7
asyncPool runs multiple promise-returning & async functions in a limited concurrency pool. It rejects immediately as soon as one of the promises rejects. It resolves when all the promises complete. It calls the iterator function as soon as possible (under the concurrency limit).
Usage:
const timeout = i => new Promise(resolve => setTimeout(() => resolve(i), i));
await asyncPool(2, [1000, 5000, 3000, 2000], timeout);
// Call iterator (i = 1000)
// Call iterator (i = 5000)
// Pool limit of 2 reached, wait for the quicker one to complete...
// 1000 finishes
// Call iterator (i = 3000)
// Pool limit of 2 reached, wait for the quicker one to complete...
// 3000 finishes
// Call iterator (i = 2000)
// Iteration is complete, wait until running ones complete...
// 5000 finishes
// 2000 finishes
// Resolves, results are passed in given array order `[1000, 5000, 3000, 2000]`.
Here is my ES7 solution: a copy-paste-friendly and feature-complete Promise.all()/map() alternative, with a concurrency limit.
Similar to Promise.all() it maintains return order as well as a fallback for non-promise return values.
I also included a comparison of the different implementations, as it illustrates some aspects a few of the other solutions have missed.
Usage
const asyncFn = delay => new Promise(resolve => setTimeout(() => resolve(), delay));
const args = [30, 20, 15, 10];
await asyncPool(args, arg => asyncFn(arg), 4); // concurrency limit of 4
Implementation
async function asyncBatch(args, fn, limit = 8) {
// Copy arguments to avoid side effects
args = [...args];
const outs = [];
while (args.length) {
const batch = args.splice(0, limit);
const out = await Promise.all(batch.map(fn));
outs.push(...out);
}
return outs;
}
async function asyncPool(args, fn, limit = 8) {
return new Promise((resolve) => {
// Copy arguments to avoid side effect, reverse queue as
// pop is faster than shift
const argQueue = [...args].reverse();
let count = 0;
const outs = [];
const pollNext = () => {
if (argQueue.length === 0 && count === 0) {
resolve(outs);
} else {
while (count < limit && argQueue.length) {
const index = args.length - argQueue.length;
const arg = argQueue.pop();
count += 1;
const out = fn(arg);
const processOut = (out, index) => {
outs[index] = out;
count -= 1;
pollNext();
};
if (typeof out === 'object' && out.then) {
out.then(out => processOut(out, index));
} else {
processOut(out, index);
}
}
}
};
pollNext();
});
}
Comparison
// A simple async function that returns after the given delay
// and prints its value to allow us to determine the response order
const asyncFn = delay => new Promise(resolve => setTimeout(() => {
console.log(delay);
resolve(delay);
}, delay));
// List of arguments to the asyncFn function
const args = [30, 20, 15, 10];
// As a comparison of the different implementations, a low concurrency
// limit of 2 is used in order to highlight the performance differences.
// If a limit greater than or equal to args.length is used the results
// would be identical.
// Vanilla Promise.all/map combo
const out1 = await Promise.all(args.map(arg => asyncFn(arg)));
// prints: 10, 15, 20, 30
// total time: 30ms
// Pooled implementation
const out2 = await asyncPool(args, arg => asyncFn(arg), 2);
// prints: 20, 30, 15, 10
// total time: 40ms
// Batched implementation
const out3 = await asyncBatch(args, arg => asyncFn(arg), 2);
// prints: 20, 30, 10, 15
// total time: 45ms
console.log(out1, out2, out3); // prints: [30, 20, 15, 10] x 3
// Conclusion: Execution order and performance is different,
// but return order is still identical
Conclusion
asyncPool() should be the best solution as it allows new requests to start as soon as any previous one finishes.
asyncBatch() is included as a comparison as its implementation is simpler to understand, but it should be slower in performance, since all requests in the same batch are required to finish before the next batch starts.
In this contrived example, the non-limited vanilla Promise.all() is of course the fastest, while the others could perform more desirably in a real-world congestion scenario.
Update
The async-pool library that others have already suggested is probably a better alternative to my implementation as it works almost identically and has a more concise implementation with a clever usage of Promise.race(): https://github.com/rxaviers/async-pool/blob/master/lib/es7.js
Hopefully my answer can still serve an educational value.
A Semaphore is a well-known concurrency primitive that was designed to solve similar problems. It's a very universal construct; implementations of Semaphore exist in many languages. This is how one would use a Semaphore to solve this issue:
const { Semaphore } = require('async-mutex');
async function main() {
const s = new Semaphore(100);
const res = await Promise.all(
users.map((user) =>
s.runExclusive(() => remoteServer.getCount(user))
)
);
return res;
}
I'm using the Semaphore implementation from async-mutex; it has decent documentation and TypeScript support.
If you want to dig deeper into topics like this, you can take a look at the book "The Little Book of Semaphores", which is freely available as a PDF here.
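If you'd rather not pull in a dependency, a bare-bones promise semaphore with the same runExclusive shape might look like this (a sketch of the idea, not the async-mutex implementation):
class SimpleSemaphore {
  constructor(max) {
    this.max = max;       // maximum number of tasks allowed to run at once
    this.running = 0;     // tasks currently holding a slot
    this.waiting = [];    // resolvers for tasks queued for a slot
  }
  async acquire() {
    if (this.running < this.max) {
      this.running += 1;
      return;
    }
    // Wait until release() hands us a slot; the slot count is transferred, not re-incremented
    await new Promise(resolve => this.waiting.push(resolve));
  }
  release() {
    const next = this.waiting.shift();
    if (next) next();          // hand the slot directly to the next waiter
    else this.running -= 1;    // nobody waiting, free the slot
  }
  async runExclusive(fn) {
    await this.acquire();
    try {
      return await fn();
    } finally {
      this.release();
    }
  }
}
// Usage mirrors the async-mutex example above:
// const s = new SimpleSemaphore(100);
// await Promise.all(users.map(user => s.runExclusive(() => remoteServer.getCount(user))));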
Unfortunately there is no way to do it with native Promise.all, so you have to be creative.
This is the quickest most concise way I could find without using any outside libraries.
It makes use of a newer javascript feature called an iterator. The iterator basically keeps track of what items have been processed and what haven't.
In order to use it in code, you create an array of async functions. Each async function asks the same iterator for the next item that needs to be processed. Each function processes its own item asynchronously, and when done asks the iterator for a new one. Once the iterator runs out of items, all the functions complete.
Thanks to @Endless for inspiration.
const items = [
'https://httpbin.org/bytes/2',
'https://httpbin.org/bytes/2',
'https://httpbin.org/bytes/2',
'https://httpbin.org/bytes/2',
'https://httpbin.org/bytes/2',
'https://httpbin.org/bytes/2',
'https://httpbin.org/bytes/2',
'https://httpbin.org/bytes/2'
]
// get a cursor that keeps track of what items have already been processed.
let cursor = items.entries();
// create 5 for loops that each run off the same cursor which keeps track of location
Array(5).fill().forEach(async () => {
for (let [index, url] of cursor){
console.log('getting url is ', index, url)
// run your async task instead of this next line
var text = await fetch(url).then(res => res.text())
console.log('text is', text.slice(0, 20))
}
})
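If you also need to know when all the workers are finished, a small variant of the same shared-cursor idea (my adjustment, not part of the original snippet) keeps the worker promises and awaits them; it reuses the items array from above:
(async () => {
  const cursor = items.entries();
  // Same shared-cursor trick, but map() keeps the worker promises
  const workers = Array(5).fill().map(async () => {
    for (const [index, url] of cursor) {
      const text = await fetch(url).then(res => res.text());
      console.log('finished', index, text.slice(0, 20));
    }
  });
  await Promise.all(workers); // resolves once every item has been processed
})();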
Here is a basic example of streaming with 'p-limit'. It streams an HTTP read stream to MongoDB.
const stream = require('stream');
const util = require('util');
const pLimit = require('p-limit');
const es = require('event-stream');
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;
const JSONStream = require('JSONStream');
const pipeline = util.promisify(stream.pipeline)
const outputDBConfig = {
dbURL: 'yr-db-url',
collection: 'some-collection'
};
const limit = pLimit(3);
const yrAsyncStreamingFunction = async (readStream) => {
const mongoWriteStream = streamToMongoDB(outputDBConfig);
const mapperStream = es.map((data, done) => {
let someDataPromise = limit(() => yr_async_call_to_somewhere())
someDataPromise.then(
function handleResolve(someData) {
data.someData = someData;
done(null, data);
},
function handleError(error) {
done(error)
}
);
})
await pipeline(
readStream,
JSONStream.parse('*'),
mapperStream,
mongoWriteStream
);
}
So many good solutions. I started out with the elegant solution posted by @Endless and ended up with this little extension method that does not use any external libraries and does not run in batches (although it assumes you have features like async/await, etc.):
Promise.allWithLimit = async (taskList, limit = 5) => {
const iterator = taskList.entries();
let results = new Array(taskList.length);
let workerThreads = new Array(limit).fill(0).map(() =>
new Promise(async (resolve, reject) => {
try {
let entry = iterator.next();
while (!entry.done) {
let [index, promise] = entry.value;
try {
results[index] = await promise;
}
catch (err) {
results[index] = err;
}
// Always advance the iterator, otherwise a rejected promise would loop forever
entry = iterator.next();
}
// No more work to do
resolve(true);
}
catch (err) {
// This worker is dead
reject(err);
}
}));
await Promise.all(workerThreads);
return results;
};
const demoTasks = new Array(10).fill(0).map((v,i) => new Promise(resolve => {
let n = (i + 1) * 5;
setTimeout(() => {
console.log(`Did nothing for ${n} seconds`);
resolve(n);
}, n * 1000);
}));
var results = Promise.allWithLimit(demoTasks);
@tcooc's answer was quite cool. Didn't know about it and will leverage it in the future.
I also enjoyed @MatthewRideout's answer, but it uses an external library!!
Whenever possible, I give a shot at developing this kind of thing on my own, rather than going for a library. You end up learning a lot of concepts which seemed daunting before.
class Pool{
constructor(maxAsync) {
this.maxAsync = maxAsync;
this.asyncOperationsQueue = [];
this.currentAsyncOperations = 0
}
runAnother() {
if (this.asyncOperationsQueue.length > 0 && this.currentAsyncOperations < this.maxAsync) {
this.currentAsyncOperations += 1;
this.asyncOperationsQueue.pop()()
.then(() => { this.currentAsyncOperations -= 1; this.runAnother() }, () => { this.currentAsyncOperations -= 1; this.runAnother() })
}
}
add(f){ // the argument f is a function of signature () => Promise
this.runAnother();
return new Promise((resolve, reject) => {
this.asyncOperationsQueue.push(
() => f().then(resolve).catch(reject)
)
})
}
}
//#######################################################
// TESTS
//#######################################################
function dbCall(id, timeout, fail) {
return new Promise((resolve, reject) => {
setTimeout(() => {
if (fail) {
reject(`Error for id ${id}`);
} else {
resolve(id);
}
}, timeout)
}
)
}
const dbQuery1 = () => dbCall(1, 5000, false);
const dbQuery2 = () => dbCall(2, 5000, false);
const dbQuery3 = () => dbCall(3, 5000, false);
const dbQuery4 = () => dbCall(4, 5000, true);
const dbQuery5 = () => dbCall(5, 5000, false);
const cappedPool = new Pool(2);
const dbQuery1Res = cappedPool.add(dbQuery1).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery2Res = cappedPool.add(dbQuery2).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery3Res = cappedPool.add(dbQuery3).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery4Res = cappedPool.add(dbQuery4).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery5Res = cappedPool.add(dbQuery5).catch(i => i).then(i => console.log(`Resolved: ${i}`))
This approach provides a nice API, similar to thread pools in scala/java.
After creating an instance of the pool with const cappedPool = new Pool(2), you provide promises to it simply with cappedPool.add(() => myPromise).
Obviously we must ensure that the promise does not start immediately, which is why we must "provide it lazily" with the help of a wrapping function.
Most importantly, notice that the result of the method add is a Promise which will be completed/resolved with the value of your original promise! This makes for very intuitive use.
const resultPromise = cappedPool.add( () => dbCall(...))
resultPromise
.then( actualResult => {
// Do something with the result form the DB
}
)
This solution uses an async generator to manage concurrent promises with vanilla JavaScript. The throttle generator takes 3 arguments:
An array of values to be supplied as arguments to a promise-generating function. (e.g. An array of URLs.)
A function that returns a promise. (e.g. Returns a promise for an HTTP request.)
An integer that represents the maximum concurrent promises allowed.
Promises are only instantiated as required in order to reduce memory consumption. Results can be iterated over using a for await...of statement.
The example below provides a function to check promise state, the throttle async generator, and a simple function that returns a promise based on setTimeout. The async IIFE at the end defines the reservoir of timeout values, sets up the async iterable returned by throttle, then iterates over the results as they resolve.
If you would like a more complete example for HTTP requests, let me know in the comments.
Please note that a recent Node.js version (16+) is required, as this relies on Promise.any and async generators.
const promiseState = function( promise ) {
const control = Symbol();
return Promise
.race([ promise, control ])
.then( value => ( value === control ) ? 'pending' : 'fulfilled' )
.catch( () => 'rejected' );
}
const throttle = async function* ( reservoir, promiseClass, highWaterMark ) {
let iterable = reservoir.splice( 0, highWaterMark ).map( item => promiseClass( item ) );
while ( iterable.length > 0 ) {
await Promise.any( iterable );
const pending = [];
const resolved = [];
for ( const currentValue of iterable ) {
if ( await promiseState( currentValue ) === 'pending' ) {
pending.push( currentValue );
} else {
resolved.push( currentValue );
}
}
console.log({ pending, resolved, reservoir });
iterable = [
...pending,
...reservoir.splice( 0, highWaterMark - pending.length ).map( value => promiseClass( value ) )
];
yield Promise.allSettled( resolved );
}
}
const getTimeout = delay => new Promise( ( resolve, reject ) => {
setTimeout(resolve, delay, delay);
} );
( async () => {
const test = [ 1100, 1200, 1300, 10000, 11000, 9000, 5000, 6000, 3000, 4000, 1000, 2000, 3500 ];
const throttledRequests = throttle( test, getTimeout, 4 );
for await ( const timeout of throttledRequests ) {
console.log( timeout );
}
} )();
The concurrent function below will return a Promise which resolves to an array of resolved promise values, while implementing a concurrency limit. No 3rd party library.
// waits 50 ms then resolves to the passed-in arg
const sleepAndResolve = s => new Promise(rs => setTimeout(()=>rs(s), 50))
// queue 100 promises
const funcs = []
for(let i=0; i<100; i++) funcs.push(()=>sleepAndResolve(i))
//run the promises with a max concurrency of 10
concurrent(10,funcs)
.then(console.log) // prints [0,1,2...,99]
.catch(()=>console.log("there was an error"))
/**
* Run concurrent promises with a maximum concurrency level
* @param concurrency The number of concurrently running promises
* @param funcs An array of functions that return promises
* @returns a promise that resolves to an array of the resolved values from the promises returned by funcs
*/
function concurrent(concurrency, funcs) {
return new Promise((resolve, reject) => {
let index = -1;
const p = [];
for (let i = 0; i < Math.max(1, Math.min(concurrency, funcs.length)); i++)
runPromise();
function runPromise() {
if (++index < funcs.length)
(p[p.length] = funcs[index]()).then(runPromise).catch(reject);
else if (index === funcs.length)
Promise.all(p).then(resolve).catch(reject);
}
});
}
Here's the Typescript version if you are interested
/**
* Run concurrent promises with a maximum concurrency level
* @param concurrency The number of concurrently running promises
* @param funcs An array of functions that return promises
* @returns a promise that resolves to an array of the resolved values from the promises returned by funcs
*/
function concurrent<V>(concurrency:number, funcs:(()=>Promise<V>)[]):Promise<V[]> {
return new Promise((resolve,reject)=>{
let index = -1;
const p:Promise<V>[] = []
for(let i=0; i<Math.max(1,Math.min(concurrency, funcs.length)); i++) runPromise()
function runPromise() {
if (++index < funcs.length) (p[p.length] = funcs[index]()).then(runPromise).catch(reject)
else if (index === funcs.length) Promise.all(p).then(resolve).catch(reject)
}
})
}
No external libraries. Just plain JS.
It can be solved using recursion.
The idea is that initially we immediately execute the maximum allowed number of queries and each of these queries should recursively initiate a new query on its completion.
In this example I collect successful responses together with errors and I execute all queries, but it's possible to slightly modify the algorithm if you want to terminate batch execution on the first failure.
async function batchQuery(queries, limit) {
limit = Math.min(queries.length, limit);
return new Promise((resolve, reject) => {
const responsesOrErrors = new Array(queries.length);
let startedCount = 0;
let finishedCount = 0;
let hasErrors = false;
function recursiveQuery() {
let index = startedCount++;
doQuery(queries[index])
.then(res => {
responsesOrErrors[index] = res;
})
.catch(error => {
responsesOrErrors[index] = error;
hasErrors = true;
})
.finally(() => {
finishedCount++;
if (finishedCount === queries.length) {
hasErrors ? reject(responsesOrErrors) : resolve(responsesOrErrors);
} else if (startedCount < queries.length) {
recursiveQuery();
}
});
}
for (let i = 0; i < limit; i++) {
recursiveQuery();
}
});
}
async function doQuery(query) {
console.log(`${query} started`);
const delay = Math.floor(Math.random() * 1500);
return new Promise((resolve, reject) => {
setTimeout(() => {
if (delay <= 1000) {
console.log(`${query} finished successfully`);
resolve(`${query} success`);
} else {
console.log(`${query} finished with error`);
reject(`${query} error`);
}
}, delay);
});
}
const queries = new Array(10).fill('query').map((query, index) => `${query}_${index + 1}`);
batchQuery(queries, 3)
.then(responses => console.log('All successful', responses))
.catch(responsesWithErrors => console.log('Some queries failed', responsesWithErrors));
So I tried to make some of the examples shown here work for my code, but since this was only for an import script and not production code, using the npm package batch-promises was surely the easiest path for me.
batch-promises
Easily batch promises
NOTE: Requires runtime to support Promise or to be polyfilled.
Api
batchPromises(int: batchSize, array: Collection, i => Promise: Iteratee)
The Promise: Iteratee will be called after each batch.
Use:
import batchPromises from 'batch-promises';
batchPromises(2, [1,2,3,4,5], i => new Promise((resolve, reject) => {
// The iteratee will fire after each batch resulting in the following behaviour:
// # 100ms resolve items 1 and 2 (first batch of 2)
// # 200ms resolve items 3 and 4 (second batch of 2)
// # 300ms resolve remaining item 5 (last remaining batch)
setTimeout(() => {
resolve(i);
}, 100);
}))
.then(results => {
console.log(results); // [1,2,3,4,5]
});
Recursion is the answer if you don't want to use external libraries. (Here downloadAll is a method on an object that also exposes someExpensiveRequest, which returns a promise.)
downloadAll(someArrayWithData){
var self = this;
var tracker = function(next){
return self.someExpensiveRequest(someArrayWithData[next])
.then(function(){
next++;//This updates the next in the tracker function parameter
if(next < someArrayWithData.length){//Did I finish processing all my data?
return tracker(next);//Go to the next promise
}
});
}
return tracker(0);
}
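The same recursive idea extends to a concurrency limit by running several such trackers over a shared index; a sketch (assuming someExpensiveRequest returns a promise):
function downloadAllLimited(someArrayWithData, someExpensiveRequest, limit = 3) {
  let next = 0;
  const tracker = () => {
    if (next >= someArrayWithData.length) return Promise.resolve();
    const item = someArrayWithData[next++];
    // Each finished request recursively pulls the next item
    return someExpensiveRequest(item).then(tracker);
  };
  // Start `limit` trackers; together they never exceed `limit` in-flight requests
  const workers = [];
  for (let i = 0; i < Math.min(limit, someArrayWithData.length); i++) {
    workers.push(tracker());
  }
  return Promise.all(workers);
}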
Expanding on the answer posted by @deceleratedcaviar, I created a 'batch' utility function that takes as arguments: an array of values, a concurrency limit and a processing function. Yes, I realize that using Promise.all this way is more akin to batch processing than true concurrency, but if the goal is to limit the number of HTTP calls in flight at any one time, I go with this approach due to its simplicity and no need for an external library.
async function batch(o) {
let arr = o.arr
let resp = []
while (arr.length) {
let subset = arr.splice(0, o.limit)
let results = await Promise.all(subset.map(o.process))
resp.push(results)
}
return [].concat.apply([], resp)
}
let arr = []
for (let i = 0; i < 250; i++) { arr.push(i) }
async function calc(val) { return val * 100 }
(async () => {
let resp = await batch({
arr: arr,
limit: 100,
process: calc
})
console.log(resp)
})();
One more solution with a custom promise library (CPromise):
Using generators (live codesandbox demo):
import { CPromise } from "c-promise2";
import cpFetch from "cp-fetch";
const promise = CPromise.all(
function* () {
const urls = [
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=1",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=2",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=3",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=4",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=5",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=6",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=7"
];
for (const url of urls) {
yield cpFetch(url); // add a promise to the pool
console.log(`Request [${url}] completed`);
}
},
{ concurrency: 2 }
).then(
(v) => console.log(`Done: `, v),
(e) => console.warn(`Failed: ${e}`)
);
// yeah, we are able to cancel the task and abort pending network requests
// setTimeout(() => promise.cancel(), 4500);
Using a mapper (live codesandbox demo):
import { CPromise } from "c-promise2";
import cpFetch from "cp-fetch";
const promise = CPromise.all(
[
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=1",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=2",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=3",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=4",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=5",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=6",
"https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=7"
],
{
mapper: (url) => {
console.log(`Request [${url}]`);
return cpFetch(url);
},
concurrency: 2
}
).then(
(v) => console.log(`Done: `, v),
(e) => console.warn(`Failed: ${e}`)
);
// yeah, we are able to cancel the task and abort pending network requests
//setTimeout(() => promise.cancel(), 4500);
Warning: this has not been benchmarked for efficiency and does a lot of array copying/creation.
If you want a more functional approach you could do something like:
import chunk from 'lodash.chunk';
const maxConcurrency = (max) => (dataArr, promiseFn) =>
chunk(dataArr, max).reduce(
async (agg, batch) => [
...(await agg),
...(await Promise.all(batch.map(promiseFn)))
],
[]
);
and then you could use it like:
const randomFn = (data) =>
new Promise((res) => setTimeout(
() => res(data + 1),
Math.random() * 1000
));
const result = await maxConcurrency(5)(
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
randomFn
);
console.log('result+++', result);
I had been using the bottleneck library, which I actually really liked, but in my case it wasn't releasing memory and kept tanking long-running jobs... which isn't great for running the massive jobs that you likely want a throttling/concurrency library for in the first place.
I needed a simple, low-overhead, easy to maintain solution. I also wanted something that kept the pool topped up, rather than simply batching predefined chunks... In the case of a downloader, this will stop that nGB file from holding up your queue for minutes/hours at a time, even though the rest of the batch finished ages ago.
This is the Node.js v16+, no-dependency, async generator solution I've been using instead:
const promiseState = function( promise ) {
// A promise could never resolve to a unique symbol unless it was in this scope
const control = Symbol();
// This helps us determine the state of the promise... A little heavy, but it beats a third-party promise library. The control is the second element passed to Promise.race() since it will only resolve first if the promise being tested is pending.
return Promise
.race([ promise, control ])
.then( value => ( value === control ) ? 'pending' : 'fulfilled' )
.catch( () => 'rejected' );
}
const throttle = async function* ( reservoir, promiseFunction, highWaterMark ) {
let iterable = reservoir.splice( 0, highWaterMark ).map( item => promiseFunction( item ) );
while ( iterable.length > 0 ) {
// When a promise has resolved we have space to top it up to the high water mark...
await Promise.any( iterable );
const pending = [];
const resolved = [];
// This identifies the promise(s) that have resolved so that we can yield them
for ( const currentValue of iterable ) {
if ( await promiseState( currentValue ) === 'pending' ) {
pending.push( currentValue );
} else {
resolved.push( currentValue );
}
}
// Put the remaining promises back into iterable, and top it to the high water mark
iterable = [
...pending,
...reservoir.splice( 0, highWaterMark - pending.length ).map( value => promiseFunction( value ) )
];
yield Promise.allSettled( resolved );
}
}
// This is just an example of what would get passed as "promiseFunction"... This can be the function that returns your HTTP request promises
const getTimeout = delay => new Promise( (resolve, reject) => setTimeout(resolve, delay, delay) );
// This is just the async IIFE that bootstraps this example
( async () => {
const test = [ 1000, 2000, 3000, 4000, 5000, 6000, 1500, 2500, 3500, 4500, 5500, 6500 ];
for await ( const timeout of throttle( test, getTimeout, 4 ) ) {
console.log( timeout );
}
} )();
I have a solution that creates chunks and uses the .reduce function to wait for each chunk's Promise.all to finish. I also add some delay in case the promise calls have rate limits.
export function delay(ms: number) {
return new Promise<void>((resolve) => setTimeout(resolve, ms));
}
export const chunk = <T>(arr: T[], size: number): T[][] => [
...Array(Math.ceil(arr.length / size)),
].map((_, i) => arr.slice(size * i, size + size * i));
const myIdList = []; // all items
const groupedIdList = chunk(myIdList, 20); // grouped by 20 items
await groupedIdList.reduce(async (prev, subIdList) => {
await prev;
// Make sure we wait for 500 ms after processing every page to prevent overloading the calls.
const data = await Promise.all(subIdList.map(myPromise));
await delay(500);
}, Promise.resolve());
Using tiny-async-pool ES9 for await...of API, you can do the following:
const asyncPool = require("tiny-async-pool");
const getCount = async (user) => ([user, await remoteServer.getCount(user)]);
const concurrency = 2;
for await (const [user, count] of asyncPool(concurrency, users, getCount)) {
console.log(user, count);
}
The above asyncPool function returns an async iterator that yields as soon as a promise completes (under concurrency limit) and it rejects immediately as soon as one of the promises rejects.
It is possible to limit requests to the server by using https://www.npmjs.com/package/job-pipe
Basically you create a pipe and tell it how many concurrent requests you want:
const pipe = createPipe({ throughput: 6, maxQueueSize: Infinity })
Then you take your function which performs the call and force it through the pipe to create a limited number of simultaneous calls:
const makeCall = async () => {...}
const limitedMakeCall = pipe(makeCall)
Finally, you call this method as many times as you need, as if it were unchanged, and it will limit itself on how many parallel executions it can handle:
await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
....
await limitedMakeCall()
Profit.
I suggest not downloading packages and not writing hundreds of lines of code:
async function async_arr<T1, T2>(
arr: T1[],
func: (x: T1) => Promise<T2> | T2, //can be sync or async
limit = 5
) {
let results: T2[] = [];
let workers = [];
let current = Math.min(arr.length, limit);
async function process(i) {
if (i < arr.length) {
results[i] = await Promise.resolve(func(arr[i]));
await process(current++);
}
}
for (let i = 0; i < current; i++) {
workers.push(process(i));
}
await Promise.all(workers);
return results;
}
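A hypothetical usage (fetchUser is an assumed helper, not part of the answer), keeping at most 5 calls in flight:
const fetchUser = async (id) => ({ id, name: `user-${id}` });

(async () => {
  const ids = Array.from({ length: 20 }, (_, i) => i + 1);
  const users = await async_arr(ids, fetchUser, 5); // at most 5 concurrent calls
  console.log(users.length); // 20, results kept in input order
})();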
Here's my recipe, based on killdash9's answer.
It allows you to choose the behaviour on exceptions (Promise.all vs Promise.allSettled).
// Given an array of async functions, runs them in parallel,
// with at most maxConcurrency simultaneous executions
// Except for that, behaves the same as Promise.all,
// unless allSettled is true, where it behaves as Promise.allSettled
function concurrentRun(maxConcurrency = 10, funcs = [], allSettled = false) {
if (funcs.length <= maxConcurrency) {
const ps = funcs.map(f => f());
return allSettled ? Promise.allSettled(ps) : Promise.all(ps);
}
return new Promise((resolve, reject) => {
let idx = -1;
const ps = new Array(funcs.length);
function nextPromise() {
idx += 1;
if (idx < funcs.length) {
(ps[idx] = funcs[idx]()).then(nextPromise).catch(allSettled ? nextPromise : reject);
} else if (idx === funcs.length) {
(allSettled ? Promise.allSettled(ps) : Promise.all(ps)).then(resolve).catch(reject);
}
}
for (let i = 0; i < maxConcurrency; i += 1) nextPromise();
});
}
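A hypothetical usage showing both modes (mayFail is an assumed helper, not from the answer):
// mayFail(n) returns a task that rejects for odd n and resolves for even n
const mayFail = (n) => () =>
  new Promise((resolve, reject) =>
    setTimeout(() => (n % 2 ? reject(new Error(`failed ${n}`)) : resolve(n)), 10 * n));

const tasks = [1, 2, 3, 4, 5].map(mayFail);

// Fail-fast, like Promise.all:
concurrentRun(2, tasks).catch(err => console.log(err.message)); // "failed 1"

// Collect every outcome, like Promise.allSettled:
concurrentRun(2, tasks, true).then(results =>
  console.log(results.map(r => r.status))); // e.g. ["rejected", "fulfilled", ...]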
I know there are a lot of answers already, but I ended up using a very simple solution that requires no library or sleep, just a few commands. Promise.all() simply lets you know when all the promises passed to it are finalized. So, you can check on the queue intermittently to see if it is ready for more work; if so, add more processes.
For example:
// init vars
const batchSize = 5
const calls = []
// loop through data and run processes
for (let [index, data] of [1,2,3].entries()) {
// pile on async processes
calls.push(doSomethingAsyncWithData(data))
// every 5th concurrent call, wait for them to finish before adding more
if (index % batchSize === 0) await Promise.all(calls)
}
// clean up for any data to process left over if smaller than batch size
const allFinishedProcs = await Promise.all(calls)
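One thing to note: calls keeps growing, so each checkpoint re-awaits everything accumulated so far (cheap, since already-settled promises resolve immediately), and the check also fires on index 0. A slight variant (my adjustment, not the original) that starts a fresh batch each time:
async function processInBatches(items, doSomethingAsyncWithData, batchSize = 5) {
  const results = [];
  let batch = [];
  for (const data of items) {
    batch.push(doSomethingAsyncWithData(data));
    if (batch.length === batchSize) {
      results.push(...await Promise.all(batch)); // wait for the full batch
      batch = [];                                // then start a fresh one
    }
  }
  results.push(...await Promise.all(batch)); // leftover items smaller than a batch
  return results;
}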
A good solution for controlling the maximum number of promises/requests is to split your list of requests into pages, and produce requests for only one page at a time.
The example below makes use of iter-ops library:
import {pipeAsync, map, page, wait} from 'iter-ops';
const i = pipeAsync(
users, // make it asynchronous
page(10), // split into pages of 10 items in each
map(p => Promise.all(p.map(u => u.remoteServer.getCount(u)))), // map into requests
wait() // resolve each page in the pipeline
);
// below triggers processing page-by-page:
for await(const p of i) {
//=> p = resolved page of data
}
This way it won't try to create more requests/promises than the size of one page.

Javascript: Run async task in series (or sequence) without libraries

I want to run some asynchronous tasks in a loop, but they should execute in sequential order (one after another). It should be vanilla JS, without any libraries.
var doSome = function(i) {
return Promise.resolve(setTimeout(() => {
console.log('done... ' + i)
}, 1000 * (i%3)));
}
var looper = function() {
var p = Promise.resolve();
[1,2,3].forEach((n) => {
p = p.then(() => doSome(n))
})
return p;
}
looper();
Current output:
calling for ...1
calling for ...2
calling for ...3
Promise {<resolved>: 8260}
done... 3
done... 1
done... 2
Expected output:
calling for ...1
calling for ...2
calling for ...3
Promise {<resolved>: 8260}
done... 1
done... 2
done... 3
Note: Kindly answer only if you tried it and it's working as expected.
So, from your comment below, I think your own example code doesn't quite match your description. I think what you want for your example is something closer to the snippet below:
var doSome = function(i) {
return new Promise((resolve, reject) => {
setTimeout(() => resolve(`Completing ${i}`), 1000*(i%3))
});
}
var looper = function() {
[1,2,3].forEach((n) => {
doSome(n).then(console.log);
});
}
looper();
Here, the array [1, 2, 3] is iterated over, and an asynchronous process is generated for each element. As each of those async processes completes, we .then on it and log its resolved result.
So, now the question becomes how best to queue the results. Below, I stored them into an array, then leveraged async/await in order to pause execution on the results until they complete, in order.
// This is probably what you want
var doSome = function(i) {
return new Promise((resolve, reject) => {
setTimeout(() => resolve(`Completing ${i}`), 1000*(i%3))
});
}
var looper = async function() {
const nums = [1,2,3];
const promises = []
nums.forEach((n) => {
console.log(`Queueing ${n}`);
promises.push(doSome(n));
});
for (let promise of promises) {
const result = await promise;
console.log(result);
}
}
looper();
Now, we could have eliminated a loop and only executed each task after the previous one completed:
// Don't use this-- it is less efficient
var doSome = function(i) {
return new Promise((resolve, reject) => {
setTimeout(() => resolve(`Completing ${i}`), 1000*(i%3))
});
}
var looper = async function() {
const nums = [1,2,3];
const promises = [];
for (let n of nums) {
console.log(`Queueing ${n}`);
const result = await doSome(n);
console.log(result);
};
}
looper();
But, as you can see in the log, this approach won't queue up the next async process until the previous one has completed. This is undesirable and doesn't match your use case. What we get from the two-loop approach preceding this one is that all async processes are started immediately, but then we order/queue the results so they respect our predefined order, not the order in which they resolve.
UPDATE
Regarding Promise.all, async/await and the intended behavior of the queueing:
So, if you want to avoid using async/await, I think you could write some sort of utility:
var doSome = function(i) {
return new Promise((resolve, reject) => {
setTimeout(() => resolve(`Completing ${i}`), 1000*(i%3))
});
}
function handlePromiseQueue(queue) {
let promise = queue.shift();
promise.then((data) => {
console.log(data)
if (queue.length > 0) {
handlePromiseQueue(queue);
}
})
}
var looper = function() {
const nums = [1,2,3];
const promises = []
nums.forEach((n) => {
console.log(`Queueing ${n}`);
promises.push(doSome(n));
});
handlePromiseQueue(promises);
}
looper();
HOWEVER, let me be clear-- if user Bergi's assertion is correct, and it is not important that each async promise be acted upon as soon as it resolves, only that none of them be acted upon until they all have come back, then this can 100% be simplified with Promise.all:
// This is probably what you want
var doSome = function(i) {
return new Promise((resolve, reject) => {
setTimeout(() => resolve(`Completing ${i}`), 1000*(i%3))
});
}
function handlePromiseQueue(queue) {
let promise = queue.shift();
promise.then((data) => {
console.log(data)
if (queue.length > 0) {
handlePromiseQueue(queue);
}
})
}
var looper = function() {
const nums = [1,2,3];
const promises = []
nums.forEach((n) => {
console.log(`Queueing ${n}`);
promises.push(doSome(n));
});
Promise.all(promises).then(() => handlePromiseQueue(promises));
}
looper();
Finally, as Bergi also pointed out, I am playing fast and loose here by not setting up any catch on these various promises-- I omitted them for brevity in the examples, but for your purposes you will want to include proper handling for errors or bad requests.
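For completeness, a sketch of the same queue with a basic catch added (my addition, not part of the original answer), so a rejection is logged in order instead of becoming an unhandled rejection:
function handlePromiseQueueSafe(queue) {
  const promise = queue.shift();
  promise
    .then(data => console.log(data))
    .catch(err => console.log('failed:', err))
    .finally(() => {
      if (queue.length > 0) handlePromiseQueueSafe(queue);
    });
}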
