I've found lots of solutions for this, typically something like
const serial = funcs =>
funcs.reduce((promise, func) =>
promise.then(result =>
func().then(Array.prototype.concat.bind(result))),
Promise.resolve([])
)
I'm trying to map an array of promises and run them one after another:
serial(Object.keys(tables).map(key =>
websocketExecute(store,dropTableSQL(tables[key]),null)))
.then(data => {console.log(data);success(data)})
They all run, however I get an error: TypeError: func is not a function
And the final .then() isn't resolved.
Any idea how I run a final .then() on a list of promises?
Your function serial expects its argument to be an array of functions that return Promises.
However,
Object.keys(tables).map(key => websocketExecute(store,dropTableSQL(tables[key]),null))
returns an array of the results of calling
websocketExecute(store,dropTableSQL(tables[key]),null)
which is not likely to be a function returning a promise, but rather some result value.
What you'll want to do is:
serial(Object.keys(tables).map(key => () => websocketExecute(store,dropTableSQL(tables[key]),null)))
.then(data => {console.log(data);success(data)})
Assuming websocketExecute returns a Promise
So now, the array returned by .map is an array of
() => websocketExecute(store,dropTableSQL(tables[key]),null)
which will be called in turn by .reduce.
Check out Promise.all() as well.
If I'm not mistaken, you should be able to do something like the following:
const promises: Promise<any>[] = Object.keys(tables).map(key => (
  websocketExecute(store, dropTableSQL(tables[key]), null)
));
Promise.all(promises).then((results: any[]) => { ...do stuff })
The TypeScript annotations are just for readability.
Related
How best can we flatten the calls below? I'm new to RxJS and trying to understand how this should be simplified. I've read about flatMap, forkJoin, switchMap and mergeMap, but I'm not finding the right way to integrate them below, and I'm not sure which is best in this scenario.
const useful = [];
a.get('abc').
subscribe((abcdatas) => {
abcdatas.forEach(abcdata => {
if(abcdata.exist) {
b.get('def').
subscribe((defdatas) => {
useful.push(defdatas.someval);
});
}
});
})
if(useful.length) {
c.get('ghi').
subscribe((ghidata) => {
completed...
});
}
Update
Updating my question here, and thanks for all the responses. useful is a global array of results that should be populated from the nested call in my case, and it should finally be passed to the last call.
Steps which I am trying:
a.get() => returns adata
b.get(adataset) => should perform a request for every adataset that has the exist attribute, and also populate the useful array, which will be used later
c.get(useful) => should trigger and exit.
Use a mapping function like switchMap or mergeMap to map the result from one request to the next request. Use forkJoin to execute multiple requests simultaneously.
So for a one to many scenario the general idea is:
firstRequest().pipe(
switchMap(results => forkJoin(results.map(r => nextRequest(r))))
)
For your case that would be something like:
useful = [];
a.get('abc').pipe(
switchMap(abcdatas => forkJoin(getUseFulRequests(abcdatas))),
tap(useful => useful.forEach(u => this.useful.push(u))),
switchMap(useful => useful.length ? c.get('ghi') : EMPTY)
).subscribe((ghidata) => {
completed...
});
function getUseFulRequests(abcdatas: AbcData[]): Observable<SomeVal>[] {
return abcdatas.reduce((acc, abcdata) => {
if (abcdata.exist) {
const request = b.get('def').pipe(
map(defdatas => defdatas.someval)
)
acc.push(request);
}
return acc;
}, []);
}
This won't emit anything if getUseFulRequests(abcdatas) returns an empty array or useful.length == 0.
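If you need the chain to emit even in those cases, one option (just a sketch along the same lines, not tested against your actual services) is to fall back to an empty-array observable, since forkJoin of an empty array completes without emitting:
a.get('abc').pipe(
  switchMap(abcdatas => {
    const requests = getUseFulRequests(abcdatas);
    // forkJoin([]) completes without emitting, so substitute of([]) instead
    return requests.length ? forkJoin(requests) : of([]);
  }),
  tap(useful => useful.forEach(u => this.useful.push(u))),
  // of(null) keeps the stream alive so the subscriber still fires
  switchMap(useful => useful.length ? c.get('ghi') : of(null))
).subscribe((ghidata) => {
  // completed...
});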
I believe the best way to handle this will be to use higher order observables
Consider below code
useful$ = a.get('abc').pipe(
  mergeMap(abcdatas =>
    abcdatas.some(abcdata => abcdata.exist)
      ? forkJoin(abcdatas.map(abcdata => b.get('def')))
      : of(undefined)
  ),
  map(defdatas => (defdatas || []).flat()),
  mergeMap(({ length }) => length ? c.get('ghi') : of(undefined))
);
useful$.subscribe({
next: () => {
// Completed...
}
})
We first pipe the result of a.get('abc') and use mergeMap to test whether any abcdata.exist. If it does exist, we return forkJoin(abcdatas.map(abcdata => b.get('def'))); this simply combines the array of observables generated by the map function over abcdatas.
map(defdatas => (defdatas || []).flat()) will transform the nested arrays into a single flat array
NOTE: flat() was introduced in ES2019
Next we destructure the length property, and if it is non-zero we return our final observable
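For reference, Array.prototype.flat() flattens nested arrays one level deep by default:
[['a'], ['b', 'c']].flat(); // ['a', 'b', 'c']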
I think that what you are trying to do is this:
a.get("abc").pipe(
mergeMap((abcdatas) => abcdatas.filter((abcdata) => abcdata.exist)), // let's create a stream with all those useful abcdata
mergeMap(abcdata => b.get('def')), // and for each one of those we perform a b.get request
toArray(), // once all the b.get requests have completed, emit a one value stream with an Array of those values
concatMap(useful => useful.length ? c.get('ghi') : EMPTY) // let's concat that result with the final request
)
I'm writing a then() statement for extracting the JSON data from an array of responses from fetch(). In the code below, queries is an array of promises returned by a series of calls to fetch(). I'm using async/await for the response because otherwise the promises would be returned without resolving (I found a solution in this question).
My first attempt worked properly: when I push into jsonified I obtain an array with the parsed results as elements:
return Promise.all(queries)
.then(async(responses)=> {
let jsonified = [];
for (let res of responses){
jsonified.push(await(res.json()));
}
return jsonified;
}).then(data => ...
But when I went to refactor it with Array.reduce(), I realised that when I push into the accumulator, instead of obtaining an array with the value as an element, acc is assigned a promise instead.
.then(responses=> {
return responses.reduce(async(acc, next) => {
acc.push(await(next.json()));
return acc;
}, [])
})
I can use the first version without any issue and the program works properly, but what's happening inside Array.reduce()? Why does pushing into the accumulator return a promise instead of an array? How could I refactor the code with Array.reduce()?
Although it's not what you've asked, you could avoid the pain of having to use reduce, and just utilise the Promise.all() that you are already using:
return Promise.all(queries.map(q => q.then(res => res.json())))
  .then(data => {...})
It's a much shorter way and less of a headache to read when you come back to it.
Have the accumulator's initial value be a Promise that resolves to an empty array, then await the accumulator on each iteration (so that all prior iterations resolve before the current iteration runs)
.then(responses=> {
return responses.reduce(async (accPromiseFromLastIter, next) => {
const arr = await accPromiseFromLastIter;
arr.push(await next.json());
return arr;
}, Promise.resolve([]))
})
(That said, your original code is a lot clearer, I'd prefer it over the .reduce version)
Live demo:
const makeProm = num => Promise.resolve(num * 2);
const result = [1, 2, 3].reduce(async(accPromiseFromLastIter, next) => {
const arr = await accPromiseFromLastIter;
arr.push(await makeProm(next));
return arr;
}, Promise.resolve([]));
result.then(console.log);
Unless you have to retrieve all data in serial, consider using Promise.all to call the .json() of each Promise in parallel instead, so that the result is produced more quickly:
return Promise.all(queries)
.then(responses => Promise.all(responses.map(response => response.json())));
If queries is an array of promises for Responses that came straight from fetch, it would be even better to chain the .json() call onto the original fetch call instead, e.g.:
const urls = [ ... ];
const results = await Promise.all(
urls.map(url => fetch(url).then(res => res.json()))
);
This way, you can consume the responses immediately when they come back, rather than having to wait for all responses to come back before starting to process the first one.
I need to recursively go through JSON and in some cases call a remote API. I need to return the whole modified JSON at the end, but I cannot figure out how to wait until all promises are fulfilled.
const getObjectsOfRelated = (xmlAsJson, token) => {
if (testIfIwantCallApi()) {
const jsonToReturn = JSON.parse(JSON.stringify(xmlAsJson))
jsonToReturn.elements = callApi(xmlAsJson.text).then(result => {
return result.data
})
return jsonToReturn
}
if (xmlAsJson.elements) {
const jsonToReturn = JSON.parse(JSON.stringify(xmlAsJson))
jsonToReturn.elements = xmlAsJson.elements.map(res => getObjectsOfRelated(res, token))
return jsonToReturn
}
return xmlAsJson
}
Even if I try to hack it using setTimeout, the result does not include the parts that were created using the external API.
As written, the code returns the correct structure but with promises instead of values. I want it either to return resolved promises or to be able to wait until the promises are fulfilled.
Wrap plain return values in Promises using Promise.resolve:
const getObjectsOfRelated = (xmlAsJson, token) => {
if (testIfIwantCallApi()) {
const jsonToReturn = JSON.parse(JSON.stringify(xmlAsJson))
return callApi(xmlAsJson.text).then(result => {
jsonToReturn.elements = result.data;
return jsonToReturn;
})
}
if (xmlAsJson.elements) {
const jsonToReturn = JSON.parse(JSON.stringify(xmlAsJson))
    return Promise.all(xmlAsJson.elements.map(res => getObjectsOfRelated(res, token)))
      .then((results) => {
        jsonToReturn.elements = results;
        return jsonToReturn;
      });
}
return Promise.resolve(xmlAsJson);
}
This way you will consistently return promises and you can use your function like this:
getObjectsOfRelated(xmlAsJson, token).then(result => console.log(result))
You can use "Promise.all"...
For a simple array, you map a function over the array: the function returns a promise for the "new value" of each element.
If you use Bluebird promises, you can even return a mixture of Promises and plain values.
( without having to wrap plain values in "Promise.resolve" )
Then you pass the array of promises to "Promise.all()", which waits for all of them to complete.
To transform a tree-shaped data structure (like JSON), you do the same sort of thing, but recursively. Each node in the tree would use "Promise.all" to wait for all its child nodes, and the root node would only "resolve" when every node in the tree has resolved.
Note that "Promise.all" is going to run all of your ASYNC functions at the same time. If you don't want that, you can instead use "Promise.mapSeries", which does the same thing, but it waits for each async function before starting the next. This can be better if you have large data and don't want to start too many simultaneous async functions at the same time.
I am trying to run async processes in order using chained promises.
I started with this:
var dataStore = {};
DBCalls.GetAllProjects()
.then((data) => ProcessData.StoreProjects(data,dataStore))
.then(ProcessData.DoStuff(dataStore))
The above finished the DoStuff function before the StoreProjects function (running in the wrong order).
var dataStore = {};
DBCalls.GetAllProjects()
.then((data) => ProcessData.StoreProjects(data,dataStore))
.then(() => {ProcessData.DoStuff(dataStore)})
This ran the functions in the correct order.
Can anyone explain what the differences in the syntax are?
Is it because the StoreProjects resolve returns nothing and the callback signatures are different?
Extra Info:
All functions used return promises.
If your functions return promises, you should use a return statement in the promise handlers so that the chain waits for each promise to resolve before moving on to the next handler.
var dataStore = {};
return DBCalls.GetAllProjects()
.then((data) => { return ProcessData.StoreProjects(data, dataStore); })
.then(() => { return ProcessData.DoStuff(dataStore); })
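The ordering difference in the question comes down to when the function is invoked. In .then(ProcessData.DoStuff(dataStore)), DoStuff(dataStore) is called immediately while the chain is being built, and its return value (a promise, not a function) is handed to .then(), which ignores non-function arguments. A tiny illustration with a hypothetical doStuff helper (not from the question):
const dataStore = {};
const doStuff = (store) => { console.log('doStuff runs'); return Promise.resolve(store); };
const somePromise = new Promise(resolve => setTimeout(resolve, 100));
// doStuff runs right away, before somePromise settles:
somePromise.then(doStuff(dataStore));
// doStuff only runs after somePromise settles, and its promise is awaited
// by the next .then() in the chain because it is returned:
somePromise.then(() => doStuff(dataStore));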
I know this is a hot topic on Stack Overflow, but I'm running into an issue while filling an external object with a promise function.
So basically what I want to do:
Through a promise get an array of objects I want to iterate over
Iterate over this array with a map function. Call a promise with each iteration
After this second promise resolves I want to push an Id and the result of the promise to an array
Naturally I cannot use a global object, because the promise will not be in the right scope. I have also experimented with Bluebird's Promise.map function, but that way I am not able to push anything more than the second promise's results to the object.
Here is my code so far. promiseTwo should populate an object, which I want to send in the res.json function (this is an Express/NodeJS app):
let promise = actie.groupActies()
let akties = {}
promise.then(function(aktieMaanden) {
let values = aktieMaanden.forEach((aktie) => {
let aktieId = aktie['_id']
let artikelen = aktie['artikelen']
let promiseTwo = order.getActieArtikelenOmzet(artikelen)
promiseTwo.then(function(orders) {
akties[aktieId] = orders
return akties
})
})
return akties
}).then((akties) => {
res.json({ message: "Omzet voor aktie", akties: akties })
})
Through a promise get an array of objects I want to iterate over
actie.groupActies()
Iterate over this array with a map function. Call a promise with each iteration
.then( acties => Promise.all(
acties.map(actie =>
order.getActieArtikelenOmzet(actie.artikelen)
.then(orders => [actie._id,orders])
)
))
After this second promise resolves I want to push an Id and the result of the promise to an array
.then(results=> res.json({results}))
The main idea here is to use Promise.all so that it only continues if all orders have finished.
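Putting those pieces together, the whole chain could look roughly like this (a sketch assuming, as in the question, that actie.groupActies() and order.getActieArtikelenOmzet() both return promises):
actie.groupActies()
  .then(acties => Promise.all(
    acties.map(actie =>
      order.getActieArtikelenOmzet(actie.artikelen)
        .then(orders => [actie._id, orders])
    )
  ))
  .then(results => {
    // results is an array of [aktieId, orders] pairs; rebuild the akties object
    const akties = {};
    results.forEach(([aktieId, orders]) => { akties[aktieId] = orders; });
    res.json({ message: "Omzet voor aktie", akties: akties });
  });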
Elaborating on Jonas w's answer: using Bluebird's Promise.map, the following works as well:
actie.groupActies()
.then(acties =>
Promise.map(acties, actie =>
order.getActieArtikelenOmzet(actie.artikelen)
.then(orders => [actie._id,orders])
)
)
.then(results => {
res.json({ message: "Omzet voor aktie", akties: results})
})