I want to await a function to complete and then run another function - JavaScript

This is the code I am trying with:
const arr = ['a', 'b', 'c', 'd']
const func = async () => {
  let i = 0
  let interval = setInterval(() => {
    let x = arr[i++ % arr.length]
    console.log(x)
    if (i === 4) {
      clearInterval(interval)
    }
  }, 2000)
}
const another_func = () => {
  console.log('logic')
}
const main = async () => {
  await func()
  another_func()
}
main()
Output:
logic
a
b
c
d
When I run this program, "logic" gets printed before all the elements of the array.
How can I print all the elements of the array first, and only then run the other function and print "logic"?

For that, you need to use a Promise. Here is my solution:
const arr = ['a', 'b', 'c', 'd']
const func = () => new Promise((resolve, reject) => {
  let i = 0
  let interval = setInterval(() => {
    let com = arr[i++ % arr.length]
    console.log(com)
    if (i === 4) {
      clearInterval(interval);
      resolve('success');
    }
  }, 2000)
})
const another_func = () => {
  console.log('logic')
}
const main = async () => {
  await func()
  another_func()
}
main()

Your first async function doesn't use await, which is already a sign of a problem. setInterval schedules the execution of the callback, but the setInterval call itself immediately returns, so your async function returns, and the implicit promise it returns is resolved. So main is awaiting a promise that is immediately resolved.
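To illustrate (a minimal sketch of the problem, not the fix): an async function with no await in its body resolves as soon as it returns, so an awaiting caller resumes immediately:
const fireAndForget = async () => {
  setInterval(() => console.log('tick'), 2000) // the interval is scheduled, then the function returns
}
fireAndForget().then(() => console.log('already resolved')) // logs before any 'tick'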
You can typically get a correct implementation by promisifying setTimeout, i.e. you define a helper function that returns a promise which will resolve after a given delay. With that in place you can create a for loop with await in the first async function:
// Helper function that promisifies setTimeout
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
const arr = ['a', 'b', 'c', 'd'];
const func = async () => {
  for (let i = 0; i < 4; i++) {
    await delay(1000);
    let x = arr[i % arr.length];
    console.log(x);
  }
}
const another_func = () => {
  console.log('logic');
}
const main = async () => {
  await func();
  another_func();
}
main();

Related

Add delay between each item in an array when looping over array

I have an async function that gets called that loops over an array and calls a function for each item.
In this example, the function is hitting an API endpoint and I need to wait for one item to finish before moving on to the next.
However, what currently happens is that each function gets called at roughly the same time, which is causing issues in the API response. So I need to wait 1 second between each request.
This is what I currently have
const delayedLoop = async () => {
  const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
  const myAsyncFunc = async (i) => {
    console.log(`item ${i}`);
    await delay(0);
    return true;
  };
  const arr = ['one', 'two', 'three'];
  const promises = arr.map(
    (_, i) =>
      new Promise((resolve) =>
        setTimeout(async () => {
          await myAsyncFunc(i);
          resolve(true);
        }, 1000),
      ),
  );
  return Promise.all(promises);
}
const myFunc = async () => {
  console.log('START');
  await delayedLoop();
  console.log('FINISH');
}
myFunc();
What happens is:
Logs START
waits 1 second
Logs all item ${i} together (without a delay in between)
Immediately logs FINISH
What I want to happen is:
Logs START
waits 1 second
Logs item 1
waits 1 second
Logs item 2
waits 1 second
Logs item 3
Immediately logs FINISH
See JSFiddle to see it in action
You can do it like this, using a simple for loop:
const delayedLoop = async () => {
  const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
  const myAsyncFunc = async (i) => {
    console.log(`item ${i}`);
    return true;
  };
  const arr = ['one', 'two', 'three'];
  for (let i = 0; i < arr.length; i++) {
    await myAsyncFunc(i);
    await delay(1000);
  }
}
const myFunc = async () => {
  console.log('START');
  await delayedLoop();
  console.log('FINISH');
}
myFunc();
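Note that this version delays after each item, so there is a trailing 1-second pause before FINISH. If you want exactly the timing described in the question (wait, log an item, ..., FINISH immediately after the last item), the delay can be moved before the call inside the same loop, a small variation:
for (let i = 0; i < arr.length; i++) {
  await delay(1000);     // wait first
  await myAsyncFunc(i);  // then process the item
}
// no trailing delay, so FINISH is logged right after the last item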

How to run multiple async functions as fast as possible (JS)?

If you were given an array of async functions, and the task is to create a class that takes this array and runs the functions as fast as possible with the constraint that only 15 functions can run at the same time, what would be a way to do that?
If there wasn't a constraint of 15 functions, I believe Promise.all would be the way to go.
Using just async/await and waiting for one function to resolve before adding the next is very slow, since we have to wait for each function to resolve before we can add another, and a single slow function becomes a bottleneck.
Adding 15 functions to an array, running them with Promise.all, and then adding another 15 (or the rest) after that resolves is again not very efficient, since what we want is to start another function as soon as one of the running functions resolves.
Any ideas?
Let's create a stack that has an async popAsync method:
const createAsyncStack = () => {
  const stack = [];
  const waitingConsumers = [];
  const push = (v) => {
    if (waitingConsumers.length > 0) {
      const resolver = waitingConsumers.shift();
      resolver && resolver(v);
    }
    else {
      stack.push(v);
    }
  };
  const popAsync = () => {
    if (stack.length > 0) {
      const queueItem = stack.pop();
      return typeof queueItem !== 'undefined'
        ? Promise.resolve(queueItem)
        : Promise.reject(Error('unexpected'));
    }
    else {
      return new Promise((resolve) => waitingConsumers.push(resolve));
    }
  };
  return [push, popAsync];
};
This means that any consumer calling popAsync will be returned a Promise that only completes if / when an item is available in the stack.
We can now use this stack as a "gatekeeper" for a simple higher-order function (i.e. a function that returns a function).
Say we only want to allow maxDOP (maximum degrees-of-parallelism) concurrent invocations of an async function, we push maxDOP tokens into the stack (here, I've used empty objects as the tokens), then require that in order to proceed, it is necessary to acquire a token from this stack. When our function call is finished, we return our token to the stack (using push), where that token can then be consumed by any waiting consumers.
const withMaxDOP = (f, maxDop) => {
  const [push, popAsync] = createAsyncStack();
  for (let x = 0; x < maxDop; ++x) {
    push({});
  }
  return async (...args) => {
    const token = await popAsync();
    try {
      return await f(...args);
    }
    finally {
      push(token);
    }
  };
};
The function returns a new function that can be called in exactly the same way as the function that is supplied to it (i.e. it has the same signature).
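For instance, any async function could be wrapped directly rather than going through the runAsync helper introduced below; a hypothetical sketch (fetchUser, /api/users, and ids are made-up names for illustration):
const fetchUser = (id) => fetch(`/api/users/${id}`).then(r => r.json());
const limitedFetchUser = withMaxDOP(fetchUser, 15);
// called exactly like fetchUser, but with at most 15 requests in flight at once
const loadAll = (ids) => Promise.all(ids.map(id => limitedFetchUser(id)));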
Now, let's create a function that simply calls a supplied function with the supplied arguments:
const runAsync = (asyncFn, ...args) => asyncFn(...args);
and wrap it using the higher-order withMaxDOP function, which will return a new function with an identical signature to the wrapped function:
const limitedRunAsync = withMaxDOP(runAsync, 15);
Now we can use this function to call the functions in our array:
Promise.all(asyncFns.map(f => limitedRunAsync(f)))
  .then((returnValues) => console.log("all finished", returnValues));
which will ensure that there are never more than 15 "in-flight" invocations at any one time.
See this runnable snippet for a full example:
const createAsyncStack = () => {
  const stack = [];
  const waitingConsumers = [];
  const push = (v) => {
    if (waitingConsumers.length > 0) {
      const resolver = waitingConsumers.shift();
      resolver && resolver(v);
    } else {
      stack.push(v);
    }
  };
  const popAsync = () => {
    if (stack.length > 0) {
      const queueItem = stack.pop();
      return typeof queueItem !== 'undefined' ? Promise.resolve(queueItem) : Promise.reject(Error('unexpected'));
    } else {
      return new Promise((resolve) => waitingConsumers.push(resolve));
    }
  };
  return [push, popAsync];
};
const withMaxDOP = (f, maxDop) => {
  const [push, popAsync] = createAsyncStack();
  for (let x = 0; x < maxDop; ++x) {
    push({});
  }
  return async (...args) => {
    const token = await popAsync();
    try {
      return await f(...args);
    } finally {
      push(token);
    }
  };
};
const runAsync = (asyncFn, ...args) => asyncFn(...args);
const limitedRunAsync = withMaxDOP(runAsync, 15);
// set up an array of async functions
const delay = (durationMS) => new Promise((resolve) => setTimeout(() => resolve(), durationMS));
const asyncFns = [...Array(50)].map((_, i) => () => {
  console.log("starting " + i);
  return delay(Math.random() * 5000).then(v => {
    console.log("finished " + i);
    return i;
  });
});
// ...then wrap and call them all at once
Promise.all(asyncFns.map(f => limitedRunAsync(f))).then((returnValues) => console.log("all finished", returnValues));
...and see this TypeScript Playground Link for a fully type-annotated version of the same code.
Here's something I whipped up in the last 20 minutes that should do the job
I'm sure if I thought about it I could probably do it without the Promise constructor, but ... 20 minutes is 20 minutes :p
Please, if someone can rewrite this without the Promise constructor, I'd love to see it - because in the back of my mind, I'm sure there is a way
Note, this will run regardless of rejections.
Each result will be either result: actualResult or error: rejectionReason, so you can process both results and rejections.
function runPromises(arrayOfFunctions, maxLength) {
  return new Promise(resolve => {
    const queue = arrayOfFunctions.map((fn, index) => ({fn, index}));
    const results = new Array(arrayOfFunctions.length);
    let finished = 0;
    const doQ = () => {
      ++finished;
      if (queue.length) {
        const {fn, index} = queue.shift();
        fn()
          .then(result => results[index] = {result})
          .catch(error => results[index] = {error})
          .finally(doQ);
      } else {
        if (finished === arrayOfFunctions.length) {
          resolve(results);
        }
      }
    };
    queue.splice(0, maxLength).forEach(({fn, index}) => fn()
      .then(result => results[index] = {result})
      .catch(error => results[index] = {error})
      .finally(doQ)
    );
  });
}
//
// demo and show that maximum 15 inflight requests
//
let inFlight = 0;
let maxInFlight = 0;
const fns = Array.from({length: 50}, (_, i) => {
  return () => new Promise(resolve => {
    ++inFlight;
    maxInFlight = Math.max(inFlight, maxInFlight);
    setTimeout(() => {
      --inFlight;
      resolve(i);
    }, Math.random() * 200 + 100)
  });
});
runPromises(fns, 15).then(results => console.log(maxInFlight, JSON.stringify(results)));
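Since each slot in the returned array is either {result} or {error}, the caller can split them afterwards; an illustrative follow-up to the demo above:
runPromises(fns, 15).then(results => {
  const values = results.filter(r => 'result' in r).map(r => r.result);
  const errors = results.filter(r => 'error' in r).map(r => r.error);
  console.log('fulfilled:', values.length, 'rejected:', errors.length);
});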

Executing list of async functions

I have an exercise to make a function executeFunctions which takes as arguments a list of async functions and an argument, e.g. a number.
The functions have to run one after another, so when fun1 finishes, fun2 needs to start with the value that was returned from fun1.
The problem is that I can't use async and await. I wanted to do it using reduce, but I guess it executes const res1 and moves on before the value is returned (because of setTimeout).
Is there any way to do it without async and await?
const fun1 = function(value) {
  return setTimeout(() => value*2, 3000)
}
const fun2 = function(value) {
  return setTimeout(() => value*4, 3000)
}
const cb2 = (value) => {
  return value*10
}
const executeFunctions = (funTab, cb) => (n) => {
  const res1 = funTab[0](n)
  console.log(res1)
  const resReduce = funTab.reduce((prev, curr) => {
    const res2 = curr(prev)
    return prev + res2
  }, res1)
  return cb(resReduce)
};
executeFunctions([fun1, fun2], cb2)(2)
We can use Promise-chaining:
const fun1 = function(value) {
  return Promise.resolve(value * 2);
}
const fun2 = function(value) {
  return Promise.resolve(value * 2);
}
const fun3 = function(value) {
  return Promise.resolve(value * 2);
}
const executeFunctions = (funcList) => (n) => {
  let chain = Promise.resolve(n); // initial value
  for (let i = 0; i < funcList.length; i++) {
    chain = chain.then(funcList[i]); // keep chaining
  }
  return chain; // last promise
};
const main = () => {
  // we need to wait for the last promise in order to print the result
  executeFunctions([fun1, fun2, fun3])(2).then(x => console.log('solution is:', x));
}
main() // prints: "solution is: 16"
or, we can also use a modified version of the suggested reduce solution, by changing the implementation of executeFunctions as follows (the rest of the code should remain as in the previous snippet):
const executeFunctions = (funcList) => (n) => {
  const init = Promise.resolve(n);
  const res = funcList.reduce((p, c) => {
    return p.then(c)
  }, init);
  return res;
};

My Asynchronous Loop returns an array of unknown Numbers

I am trying to scrape the list of results from Google Maps.
Example: Visit https://www.google.com/maps/search/gym+in+nyc
Get all results in an array, start the loop and click element #1, extract the data, go back to the results page, and continue the loop.
const finalData = async function () {
  const arr = [];
  const resultList = [...[...document.querySelectorAll("[aria-label^='Results for']")][0].children].filter((even, i) => !(i % 2));
  for (const eachElement of resultList) {
    let response = await scrapePage(eachElement);
    arr.push(response);
  }
  return arr;
};
async function scrapePage(elem) {
  // Clicks each element
  let click = await elem.click();
  // Grabs Just the Title
  const titleText = await setTimeout(function () {
    let title = document.querySelector(".section-hero-header-title span").innerText;
    return title;
  }, 3000);
  // setTimeout to cause delay before click the back button
  setTimeout(function () {
    document.querySelector(".section-back-to-list-button").click();
  }, 5000);
  return titleText;
}
const final = finalData().then((value) => {
  return value;
});
I have no idea why, when I try the above code in devtools, only the last result is clicked, and why my const variable "final" is filled with an array of random numbers.
The problem is that you've assumed the await operator works on setTimeout; setTimeout, setInterval and related functions do not use promises at all.
When working with time-oriented code, I would generally set up a helper function that uses a promise:
const delay = seconds => new Promise(
  resolve => {
    setTimeout(resolve, seconds * 1000); // setTimeout expects milliseconds
  }
);
Updating your code to wait for the timeouts from here is simple:
const finalData = async () => {
  const arr = [];
  const resultList = [...document.querySelector("[aria-label^='Results for']").children].filter((even, i) => !(i % 2));
  for (const eachElement of resultList) {
    const response = await scrapePage(eachElement);
    arr.push(response);
  }
  return arr;
};
async function scrapePage(elem) {
  // Clicks each element
  const click = await elem.click(); // this is not asynchronous! click also returns undefined
  // Grabs Just the Title
  await delay(3);
  const titleText = document.querySelector(".section-hero-header-title span").innerText;
  // delay before click[ing] the back button
  await delay(5);
  document.querySelector(".section-back-to-list-button").click();
  return titleText;
}
const final = finalData();
And, on a related note, this piece of code waits for every single result in sequence; this takes at least 8 seconds per iteration:
const finalData = async () => {
  const arr = [];
  const resultList = [...document.querySelector("[aria-label^='Results for']").children].filter((even, i) => !(i % 2));
  for (const eachElement of resultList) {
    let response = await scrapePage(eachElement);
    arr.push(response);
  }
  return arr;
};
If you wanted to concurrently execute every iteration, you may want to consider mapping the function across the array and using Promise.all, like so:
const finalData = async () => {
  const resultList = [...document.querySelector("[aria-label^='Results for']").children].filter((_, i) => !(i % 2));
  return Promise.all(resultList.map(scrapePage));
};
You can try this:
async function scrapePage(elem) {
  // Clicks each element
  let click = await elem.click();
  // Grabs Just the Title
  const titleText = await new Promise((resolve) => {
    setTimeout(function () {
      let title = document.querySelector(".section-hero-header-title span").innerText;
      resolve(title);
    }, 3000);
  })
  // setTimeout to cause delay before clicking the back button
  setTimeout(function () {
    document.querySelector(".section-back-to-list-button").click();
  }, 5000);
  return titleText;
}

ES6 Promise replacement of async.eachLimit / async.mapLimit

In async, if I need to apply an asynchronous function to 1000 items, I can do that with:
async.mapLimit(items, 10, (item, callback) => {
  foo(item, callback);
});
so that only 10 items are processed at the same time, limiting overhead and allowing control.
With ES6 promises, while I can easily do:
Promise.all(items.map((item) => {
  return bar(item);
}));
that would process all 1000 items at the same time, which may cause a lot of problems.
I know Bluebird has ways to handle that, but I am searching for an ES6 solution.
If you don't care about the results, then it's quick to whip one up:
Promise.eachLimit = async (funcs, limit) => {
  let rest = funcs.slice(limit);
  await Promise.all(funcs.slice(0, limit).map(async func => {
    await func();
    while (rest.length) {
      await rest.shift()();
    }
  }));
};
// Demo:
var wait = ms => new Promise(resolve => setTimeout(resolve, ms));
async function foo(s) {
  await wait(Math.random() * 2000);
  console.log(s);
}
(async () => {
  let funcs = "ABCDEFGHIJKLMNOPQRSTUVWXYZ".split("").map(s => () => foo(s));
  await Promise.eachLimit(funcs, 5);
})();
A key performance property is running the next available function as soon as any function finishes.
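One way to sanity-check that property (an illustrative sketch, reusing the wait helper and Promise.eachLimit from above) is to track an in-flight counter and record its maximum:
let inFlight = 0, maxInFlight = 0;
const probe = s => async () => {
  ++inFlight;
  maxInFlight = Math.max(maxInFlight, inFlight);
  await wait(Math.random() * 200);
  console.log(s);
  --inFlight;
};
(async () => {
  const funcs = "ABCDEFGHIJKLMNOPQRSTUVWXYZ".split("").map(probe);
  await Promise.eachLimit(funcs, 5);
  console.log("max in flight:", maxInFlight); // expected: 5
})();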
Preserving results
Preserving the results in order makes it a little less elegant perhaps, but not too bad:
Promise.mapLimit = async (funcs, limit) => {
  let results = [];
  await Promise.all(funcs.slice(0, limit).map(async (func, i) => {
    results[i] = await func();
    while ((i = limit++) < funcs.length) {
      results[i] = await funcs[i]();
    }
  }));
  return results;
};
// Demo:
var wait = ms => new Promise(resolve => setTimeout(resolve, ms));
async function foo(s) {
  await wait(Math.random() * 2000);
  console.log(s);
  return s.toLowerCase();
}
(async () => {
  let funcs = "ABCDEFGHIJKLMNOPQRSTUVWXYZ".split("").map(s => () => foo(s));
  console.log((await Promise.mapLimit(funcs, 5)).join(""));
})();
There's nothing built in, but you can of course group them yourself into promise chains, and use a Promise.all on the resulting array of chains:
const items = /* ...1000 items... */;
const concurrencyLimit = 10;
const promise = Promise.all(items.reduce((promises, item, index) => {
  // What chain do we add it to?
  const chainNum = index % concurrencyLimit;
  let chain = promises[chainNum];
  if (!chain) {
    // New chain
    chain = promises[chainNum] = Promise.resolve();
  }
  // Add it
  promises[chainNum] = chain.then(_ => foo(item));
  return promises;
}, []));
Here's an example, showing how many concurrent promises there are at any given time (and also showing when each "chain" is complete, and only doing 200 instead of 1,000):
const items = buildItems();
const concurrencyLimit = 10;
const promise = Promise.all(items.reduce((promises, item, index) => {
  const chainNum = index % concurrencyLimit;
  let chain = promises[chainNum];
  if (!chain) {
    chain = promises[chainNum] = Promise.resolve();
  }
  promises[chainNum] = chain.then(_ => foo(item));
  return promises;
}, []).map(chain => chain.then(_ => console.log("Chain done"))));
promise.then(_ => console.log("All done"));
function buildItems() {
  const items = [];
  for (let n = 0; n < 200; ++n) {
    items[n] = n;
  }
  return items;
}
var outstanding = 0;
function foo(item) {
  ++outstanding;
  console.log("Starting " + item + " (" + outstanding + ")");
  return new Promise(resolve => {
    setTimeout(_ => {
      --outstanding;
      console.log("Resolving " + item + " (" + outstanding + ")");
      resolve(item);
    }, Math.random() * 500);
  });
}
I should note that if you want to track the result of each of those, you'd have to modify the above; it doesn't try to track the results (!). :-)
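For example, one such modification (a minimal sketch of the idea, reusing the same items, concurrencyLimit, and foo as above) is to have each chain link write its value into a shared array indexed by the item's original position:
const results = [];
const promise = Promise.all(items.reduce((promises, item, index) => {
  const chainNum = index % concurrencyLimit;
  let chain = promises[chainNum];
  if (!chain) {
    chain = promises[chainNum] = Promise.resolve();
  }
  promises[chainNum] = chain
    .then(_ => foo(item))
    .then(result => { results[index] = result; });
  return promises;
}, [])).then(_ => results); // promise now resolves to the results in original order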
Using Array.prototype.splice
while (funcs.length) {
  await Promise.all(funcs.splice(0, 100).map(f => f()))
}
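As written, this snippet mutates funcs and has to live inside an async function; a self-contained wrapper might look like this (an illustrative sketch that works on a copy of the array and collects results in order):
const runInBatches = async (funcs, limit) => {
  const queue = [...funcs]; // copy, so the caller's array is not mutated
  const results = [];
  while (queue.length) {
    // each batch waits for its slowest function before the next batch starts
    const batch = await Promise.all(queue.splice(0, limit).map(f => f()));
    results.push(...batch);
  }
  return results;
};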
This is the closest one to async.eachLimit
Promise.eachLimit = async (coll, limit, asyncFunc) => {
  let ret = [];
  const splitArr = coll.reduce((acc, item, i) => (i % limit) ? acc : [...acc, coll.slice(i, i + limit)], []);
  for (let i = 0; i < splitArr.length; i++) {
    ret[i] = await Promise.all(splitArr[i].map(ele => asyncFunc(ele)));
  }
  return ret;
}
const wait = ms => new Promise(resolve => setTimeout(resolve, ms));
async function foo(s) {
  await wait(Math.random() * 2000);
  console.log(s);
  return s.toLowerCase();
}
(async () => {
  let arr = "ABCDEFGHIJKLMNOPQRSTUVWXYZ".split("");
  console.log((await Promise.eachLimit(arr, 5, foo)));
})();
