Go to the next forEach iteration after promise resolves - javascript

I'm going through an array of data and on every iteration I deal with a promise.
What I want is to move to the next forEach iteration only when the promise of the current iteration has resolved.
I searched through different solutions and figured out that using for (... of ...) instead of forEach could do the trick, but I still can't figure it out.
const data = ['apple', 'orange', 'banana'];
data.forEach((row, rowIndex) => {
  let params = {
    fruitIndex: rowIndex,
  };
  axios.post('/fruits', { ...params })
    .then(response => {
      // go to the next iteration of forEach only after this promise resolves
    })
    .catch(error => console.log(error));
});

Recursion helps:
const data = ['apple', 'orange', 'banana'];
function request_fruit(n) {
  let params = {
    fruitIndex: n,
  };
  axios.post('/fruits', { ...params })
    .then(response => {
      // Work with received...
      // ....
      // Request next
      if (n + 1 < data.length) request_fruit(n + 1);
    })
    .catch(error => console.log(error));
}
request_fruit(0)
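For reference, here is a rough, self-contained sketch of the same recursive pattern with axios.post replaced by a timer-based mock (the mock is illustrative only), so the sequencing can be observed without a server:
// Stand-in for axios.post, purely for illustration: resolves after 300 ms,
// echoing the params back as response.data.
const axios = {
  post: (url, params) =>
    new Promise((resolve) => setTimeout(() => resolve({ data: params }), 300)),
};

const data = ['apple', 'orange', 'banana'];

function request_fruit(n) {
  axios.post('/fruits', { fruitIndex: n })
    .then((response) => {
      console.log('received', response.data); // logged before the next request starts
      if (n + 1 < data.length) request_fruit(n + 1);
    })
    .catch((error) => console.log(error));
}

request_fruit(0);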

If you can, I'd recommend an easy-to-read for...of loop with async/await. Note that top-level await is a fairly recent addition, so you should wrap the code in an async function (which you'd likely do anyway).
If you cannot use async/await, you can use reduce with an initially resolved promise to chain the subsequent calls.
Note that I use a sleep function that resolves a promise after some time; the calls to sleep are interchangeable with axios calls.
const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
const data = [1000, 2000, 3000];
function withReduce() {
  console.log('Sequential Promises with Reduce:');
  return data.reduce((previousPromise, ms) => {
    return previousPromise
      .then(() => {
        console.log(`sleeping for ${ms}ms`);
        return sleep(ms);
      });
  }, Promise.resolve());
}

async function withAsync() {
  console.log('Sequential Promises with for of and await:');
  for (let ms of data) {
    console.log(`sleeping for ${ms}ms`);
    await sleep(ms);
  }
}

withReduce().then(withAsync);

It looks like nobody has posted the specific async/await version yet, so this is how you use it with for…of:
const data = ['apple', 'orange', 'banana'];
for (const [rowIndex, row] of data.entries()) {
  let params = {
    fruitIndex: rowIndex,
  };
  let response;
  try {
    response = await axios.post('/fruits', { ...params });
  } catch (error) {
    console.log(error);
    continue;
  }
  // …
}
All of this needs to be placed inside an async function. (Also, { ...params } is a bit weird; what's wrong with passing params directly?)
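For instance, a minimal sketch of that wrapping (the function name postFruitsInOrder is arbitrary):
async function postFruitsInOrder() {
  const data = ['apple', 'orange', 'banana'];
  for (const [rowIndex, row] of data.entries()) {
    let response;
    try {
      // the loop pauses here until this request settles
      response = await axios.post('/fruits', { fruitIndex: rowIndex });
    } catch (error) {
      console.log(error);
      continue;
    }
    // use response and row here
  }
}

postFruitsInOrder();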

Just use await on the API call.
function postFruitData() {
  const data = ['apple', 'orange', 'banana'];
  data.forEach(async (row, rowIndex) => {
    let params = {
      fruitIndex: rowIndex,
    };
    const response = await axios.post('/fruits', { ...params });
  });
}

Related

How to push failed promises to separate array?

I am looping through ids, passing each one to an async function, and I want to return both the successful and the failed data; right now I am only returning the successful data.
const successContractSignature: LpContractSla[] = [];
for (const id of lpContractSlaIds) {
  const data = await createLpSignature(context, id);
  if (data) {
    successContractSignature.push(data);
  }
}
return successContractSignature;
If createLpSignature throws an error, do I need a try/catch here? But wouldn't that exit the loop?
How can I push the failed data to a separate array without breaking the loop?
Unless there's a specific reason to avoid running the calls in parallel, it's generally good practice not to start async calls in a for loop with await, since you wait for each promise to settle (resolve or reject) before starting the next one.
This is a better pattern, which also gives you the results of all the promises, whether they resolved or rejected:
const successContractSignature: PromiseSettledResult<LpContractSla>[] = await Promise.allSettled(
  lpContractSlaIds.map((id: string) => createLpSignature(context, id))
);
return successContractSignature;
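If you also want the failures in their own array, here is a minimal sketch of splitting the settled results into two arrays (the variable names are illustrative; createLpSignature as above):
const settled = await Promise.allSettled(
  lpContractSlaIds.map((id: string) => createLpSignature(context, id))
);

// Fulfilled values go into one array, rejection reasons into another.
const succeeded: LpContractSla[] = settled
  .filter((r): r is PromiseFulfilledResult<LpContractSla> => r.status === 'fulfilled')
  .map((r) => r.value);
const failed = settled
  .filter((r): r is PromiseRejectedResult => r.status === 'rejected')
  .map((r) => r.reason);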
But if for some particular reason you need to make these calls in sequence rather than in parallel, you can wrap each call in a try/catch block, which won't exit the loop:
for (const id of lpContractSlaIds) {
  let data;
  try {
    data = await createLpSignature(context, id);
  } catch (e) {
    data = e;
  }
  if (data) {
    successContractSignature.push(data);
  }
}
You can test it in this example:
const service = (id) =>
  new Promise((res, rej) =>
    setTimeout(
      () => (id % 2 === 0 ? res("ID: " + id) : rej("Error ID: " + id)),
      1000
    )
  );

const ids = [1, 2, 3, 4, 5];

const testParallelService = async () => {
  try {
    const data = await Promise.allSettled(ids.map(id => service(id)));
    return data.map(o => `${o.status}: ${o.reason ?? o.value}`);
  } catch (e) {
    console.log(e);
  }
};

testParallelService().then(data => console.log("Parallel data: ", data));

const testSequentialService = async () => {
  const res = [];
  for (const id of ids) {
    let data;
    try {
      data = await service(id);
    } catch (e) {
      data = e;
    }
    if (data) {
      res.push(data);
    }
  }
  return res;
};

testSequentialService().then((data) => console.log('Sequential Data: ', data));

Sequential Promise All call with a variable param

I have a function
this.config.apiFunc = (pageNo) => this.somePaginatedCall({
  page: {
    number: pageNo,
    size: 10
  }
});
Now, I want to fetch the data in a batch of 5 pages (by maintaining the sequence). I added a delay of 2000ms for the sake of testing. I created
config = {
  apiFunc: any,
  data: []
}

async getData() {
  const pageGroupList = [
    [1, 2, 3, 4],
    [5, 6, 7, 8]
  ];
  const groupedPromise = [];
  groupedPromise.push(this.pageGroupList.map(pageNo => this.config.apiFunc(pageNo))); // <-- This is making network requests
  // because I am triggering the function call with ()
  await this.asyncForEach(groupedPromise, this.fetchInBatch.bind(this));
}

private asyncForEach(promiseList, func): Promise<any> {
  return promiseList.reduce((p, apiList) => {
    return p.then(this.sleep(2000)).then(() => func(apiList));
  }, Promise.resolve());
}

private fetchInBatch(apiList) {
  return Promise.all(apiList).then((res: any) => {
    // this gets called after every 2 secs but I do not see any call in the Network tab
    this.config.data = [...this.config.data, ...[].concat(...res.map(r => r.data))];
  });
}

sleep(ms) {
  return (x) => new Promise(resolve => setTimeout(() => resolve(x), ms));
}
The problem is that I am making the API requests at groupedPromise.push(this.pageGroupList.map(pageNo => this.config.apiFunc(pageNo))), which I should not.
The data loads as expected (after the 2000 ms delay), but the network calls have already been made.
I want to load the second batch only after the first batch of pages (1, 2, 3, 4) has loaded; in this example, after 2 seconds.
The problem is that I want to pass pageNo to each API call before I invoke the function. I am slightly confused.
Try it like below. I have moved the map call inside Promise.all:
async getData() {
  const pageGroupList = [
    [1, 2, 3, 4],
    [5, 6, 7, 8]
  ];
  await this.asyncForEach(pageGroupList, this.fetchInBatch.bind(this));
}

private asyncForEach(pageGroupList, execFunc) {
  return pageGroupList.reduce((p, pageGroup) => {
    return p.then(() => execFunc(pageGroup));
  }, Promise.resolve());
}

private fetchInBatch(pageGroup) {
  return Promise.all(pageGroup.map(pageNo => this.config.apiFunc(pageNo))).then((res: any) => {
    this.config.data = [...this.config.data, ...[].concat(...res.map(r => r.data))];
  });
}
I think your problem is that you're mapping the results of calling the function, when you should be mapping functions. Try this instead:
// Returns [fn1, fn2, ...]
groupedPromise.push(...this.pageGroupList.map(pageNo => () => this.config.apiFunc(pageNo)));
Or better:
async getData() {
  const pageGroupList = [
    [1, 2, 3, 4],
    [5, 6, 7, 8]
  ];
  const groupedPromise = this.pageGroupList.map(pageNo => () => this.config.apiFunc(pageNo));
  await this.asyncForEach(groupedPromise, this.fetchInBatch.bind(this));
}
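For what it's worth, here is a rough standalone sketch of that "array of functions" idea with the API call mocked by a timer (the mock and names are illustrative only), showing that nothing fires until each batch's turn comes:
// Timer-based stand-in for the real paginated API call.
const apiFunc = (pageNo: number) =>
  new Promise<string>((resolve) => setTimeout(() => resolve(`page ${pageNo}`), 500));

const pageGroupList = [[1, 2, 3, 4], [5, 6, 7, 8]];

// Each group becomes a function; no request starts until it is invoked.
const batches = pageGroupList.map((group) => () =>
  Promise.all(group.map((pageNo) => apiFunc(pageNo)))
);

// Run the batches one after another with reduce.
batches
  .reduce(
    (prev, runBatch) => prev.then((acc) => runBatch().then((res) => [...acc, ...res])),
    Promise.resolve<string[]>([])
  )
  .then((allPages) => console.log(allPages)); // logs pages 1-8, batch by batch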

Using Async Functions in a For loop

I have an array where I need to call an API endpoint for each index. Once that call resolves, I need to append the result to that element. I want to return the updated array once this has completed for every index.
I tried using async/await this way:
// Let input be [{'x': 1, 'y': 2}, {'x': 11, 'y': 22}, ...]
async function hello(input) {
  await input.forEach(element => {
    fetch(url, options)
      .then((res) => {
        element['z'] = res
      })
  })
  return input
}
I need to use this function to update my state
hello(data)
  .then((res: any) => {
    this.setState((prevState) => ({
      ...prevState,
      inputData: res,
    }))
  })
The issue is that I need one more forced render for key 'z' to show.
How to resolve this?
I don't have much experience using async/await, so I am not sure if I am using it correctly.
The correct way is to use Promise.all and return its promise to the caller, since you want the entire updated input value to be set in state.
In your case forEach doesn't return a promise, so awaiting it is useless.
Also, even if you use await within the forEach callback, you still need a way to let the hello function's .then fire only after all promises have resolved; Promise.all does that for you.
function hello(input) {
  const promises = [];
  input.forEach(element => {
    promises.push(
      fetch(url, options)
        .then(res => res.json())
        .then((result) => {
          // return the updated object
          return {...element, z: result};
        })
    );
  });
  return Promise.all(promises);
}
...
hello(data)
  .then((res: any) => {
    this.setState((prevState) => ({
      ...prevState,
      inputData: res,
    }))
  })
P.S. Note that the response from fetch also needs to have res.json() called on it.
async/await won't work as expected in loops that use callbacks (forEach, map, etc.).
You can achieve your result using a for...of loop.
Try this and let me know if it works.
function getResult(element) {
  return new Promise((resolve) => {
    fetch(url, options)
      .then((res) => {
        return resolve(res);
      })
  })
}

async function hello(input) {
  for (let element of input) {
    let res = await getResult(element);
    element['z'] = res;
  }
}
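For comparison, a leaner sketch of the same idea that awaits fetch directly inside the loop, without the extra Promise wrapper (the name helloSequential is illustrative; url and options are assumed to be defined elsewhere, as in the question):
async function helloSequential(input) {
  for (const element of input) {
    // the next iteration does not start until this response arrives and is parsed
    const res = await fetch(url, options);
    element['z'] = await res.json();
  }
  return input;
}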

How do I make a long list of http calls in serial?

I'm trying to make only one HTTP call at a time, but when I log the responses from getUrl they pile up and I start to get 409s (too many requests).
function getUrl(url, i, cb) {
  const fetchUrl = `https://api.scraperapi.com?api_key=xxx&url=${url.url}`;
  fetch(fetchUrl).then(async res => {
    console.log(fetchUrl, 'fetched!');
    if (!res.ok) {
      const err = await res.text();
      throw err.message || res.statusText;
    }
    url.data = await res.text();
    cb(url);
  });
}

let requests = urls.map((url, i) => {
  return new Promise(resolve => {
    getUrl(url, i, resolve);
  });
});

const all = await requests.reduce((promiseChain, currentTask) => {
  return promiseChain.then(chainResults =>
    currentTask.then(currentResult => [...chainResults, currentResult]),
  );
}, Promise.resolve([]));
Basically I don't want the next HTTP request to start until the previous one has finished. Otherwise I hammer their server.
BONUS POINTS: Make this work with 5 at a time in parallel.
Since you're using await, it would be a lot easier to use it everywhere instead of mixing in confusing .thens with reduce. It'd also be good to avoid the explicit Promise construction antipattern. This should do what you want:
const results = [];
for (const url of urls) {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(response); // or whatever logic you need with errors
  }
  results.push(await response.text());
}
Then your results variable will contain an array of response texts (or an error will have been thrown, and the code won't reach the bottom).
The syntax for an async function is an async keyword before the argument list, just like you're doing in your original code:
const fn = async () => {
  const results = [];
  for (const url of urls) {
    const response = await fetch(url);
    if (!response.ok) {
      throw new Error(response); // or whatever logic you need with errors
    }
    results.push(await response.text());
  }
  // do something with results
};
To have a limited number of requests in flight at a time, make a queue system: when a request completes, recursively call a function that makes another request, something like this:
const results = [];
const queueNext = async () => {
  if (!urls.length) return;
  const url = urls.shift();
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(response); // or whatever logic you need with errors
  }
  results.push(await response.text());
  await queueNext();
};
await Promise.all(Array.from({ length: 5 }, queueNext));
// do something with results
You cannot use array methods to run async operations sequentially, because the array methods themselves are synchronous: they don't wait for the promises returned by their callbacks.
The easiest way to achieve sequential async tasks is a plain loop. Otherwise you would need to write a custom function that imitates a loop and runs .then after each async task ends, which is quite troublesome and unnecessary.
Also, fetch already returns a promise, so you don't have to create a promise yourself just to contain the one returned by fetch.
The code below is a working example, with small changes to your original code (see comments).
// Fake urls for example purpose
const urls = [{ url: 'abc' }, { url: 'def' }, { url: 'ghi' }];

// To imitate actual fetching
const fetch = (url) => new Promise(resolve => {
  setTimeout(() => {
    resolve({
      ok: true,
      text: () => new Promise(res => setTimeout(() => res(url), 500))
    });
  }, 1000);
});

function getUrl(url, i, cb) {
  const fetchUrl = `https://api.scraperapi.com?api_key=xxx&url=${url.url}`;
  return fetch(fetchUrl).then(async res => { // <-- changes here
    console.log(fetchUrl, 'fetched!');
    if (!res.ok) {
      const err = await res.text();
      throw err.message || res.statusText;
    }
    url.data = await res.text();
    return url; // <--- changes here
  });
}

async function getAllUrls(urls) {
  const result = [];
  for (const url of urls) {
    const response = await getUrl(url);
    result.push(response);
  }
  return result;
}

getAllUrls(urls)
  .then(console.log);
async/await is perfect for this.
Assuming you have an array of URLs as strings:
let urls = ["https://example.org/", "https://google.com/", "https://stackoverflow.com/"];
You simply need to do:
for (let u of urls) {
  await fetch(u).then(res => {
    // Handle response
  }).catch(e => {
    // Handle error
  });
}
The loop will not iterate until the current fetch() has resolved, which will serialise things.
The reason array.map doesn't work is as follows:
async function doFetch(url) {
  return await fetch(url).then(res => {
    // Handle response
  }).catch(e => {
    // Handle error
  });
}
let mapped = urls.map(doFetch);
is equivalent to:
let mapped = [];
for (const u of urls) {
  mapped.push(doFetch(u));
}
This will populate mapped with a bunch of Promises immediately, which is not what you want. The following is what you want:
let mapped = [];
for (const u of urls) {
  mapped.push(await doFetch(u));
}
But this is not what array.map() does. Therefore using an explicit for loop is necessary.
Many people have provided answers using a for loop, but in some situations await inside a for loop is not welcome, for example if you follow the Airbnb style guide.
Here is a solution using recursion.
// Fake urls for example purpose
const urls = [{ url: 'abc' }, { url: 'def' }, { url: 'ghi' }];

async function serialFetch(urls) {
  return await doSerialRecursion(
    async (url) => {
      return await fetch(url)
        .then((response) => {
          // handle response
        })
        .catch((err) => {
          // handle error
        });
    },
    urls,
    0
  );
}

async function doSerialRecursion(fn, array, startIndex) {
  if (!array[startIndex]) return [];
  const currResult = await fn(array[startIndex]);
  return [currResult, ...(await doSerialRecursion(fn, array, startIndex + 1))];
}

const yourResult = await serialFetch(urls);
The doSerialRecursion function will serially execute the function you passed in, which is fetch(url) in this example.
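As a quick sanity check, here is a hedged usage sketch where fetch is replaced by a timer-based mock (fakeFetch and sampleUrls are illustrative only), reusing the doSerialRecursion above:
// Timer-based stand-in for a network call.
const fakeFetch = (url) =>
  new Promise((resolve) => setTimeout(() => resolve(`fetched ${url}`), 300));

const sampleUrls = [{ url: 'abc' }, { url: 'def' }, { url: 'ghi' }];

doSerialRecursion((item) => fakeFetch(item.url), sampleUrls, 0)
  .then((results) => console.log(results));
// logs ['fetched abc', 'fetched def', 'fetched ghi'], one call at a time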

Javascript implement async to get result of long-running process into array

I have an array of files that I am adding data to, which conceptually works like this:
let filearray = ['file1.txt', 'file2.txt', 'file3.txt'];
let newarray = [];
for (let f of filearray) {
  let newstuff = 'newstuff';
  newarray.push([f, newstuff]);
}
console.log(newarray);
// Returns expected array of arrays
However, what I need to do is make newstuff = slow_promise_function(f), which involves lots of processing. How do I get the value from that promise into the array?
Ideally I'd like to use the new async feature in ES2017.
Update:
These answers are helping me understand the problems (and solutions):
https://stackoverflow.com/a/37576787/1061836
https://stackoverflow.com/a/43422983/1061836
You could use Promise.all, which returns a single promise that resolves when all of the given promises have resolved:
let loadData = async () => {
  let filearray = ['file1.txt', 'file2.txt', 'file3.txt'];
  try {
    let ops = filearray.map(f => slow_promise_function(f));
    let newarray = await Promise.all(ops);
    // TODO: use newarray
  } catch (err) {
    console.log(err);
  }
};

loadData();
async/await is a nice way to accomplish this, as you suspected:
console.log('Starting...');
let files = ['file1.txt', 'file2.txt', 'file3.txt'];
Promise.all(files.map(async f => [f, await slow_promise_function(f)]))
  .then(files => console.log('got results: ', files));

function slow_promise_function(file) {
  return new Promise(res => setTimeout(_ => res(`processed ${file}`), 500));
}
Well, that can simply be done with promises; more info at this link: Promises.
const newStuffFunc = async (f) => {
  try {
    // await gives you the resolved value directly, so no second await is needed
    let data = await slow_promise_function(f);
    // work with data here
  } catch (e) {
    // handle the rejection
  }
};

const slow_promise_function = (url) => {
  return new Promise((resolve, reject) => {
    // do something asynchronous which eventually calls either:
    //
    //   resolve(someValue); // fulfilled
    // or
    //   reject("failure reason"); // rejected
  });
};
This link shows more about using async/await and promises in JavaScript.
