In Node.js, I have a Promise.all(array) resolution with a resulting value that I need to combine with the results of another asynchronous function call. I am having problems getting the results of this second function, since it resolves later than the Promise.all. I could add it to the Promise.all, but it would ruin my algorithm. Is there a way to get these values outside of their resolutions so I can modify them statically? Can I create a container that waits for their results?
To be more specific, I am reading from a Firebase realtime database that has been polling API data. I need to run an algorithm on this data and store it in a MongoDB archive. But the archive opens asynchronously and I can't get it to open before my results resolve (which need to be written).
Example:
module.exports = {
    news: async function getNews() {
        try {
            const response = await axios.get('https://cryptopanic.com/api/posts/?auth_token=518dacbc2f54788fcbd9e182521851725a09b4fa&public=true');
            //console.log(response.data.results);
            var news = [];
            response.data.results.forEach((results) => {
                news.push(results.title);
                news.push(results.published_at);
                news.push(results.url);
            });
            console.log(news);
            return news;
        } catch (error) {
            console.error(error);
        }
    },
    coins: async function resolution() {
        await Promise.all(firebasePromise).then((values) => {
            //code
            return value
        })
    }
}
I have tried the first solution, and it works for the first entry, but I may be writing my async function wrong on my second export, because it returns undefined.
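For what it's worth, here is a minimal sketch of a coins export that hands its result back to the caller, assuming firebasePromise is the array of promises from your Firebase reads:
coins: async function resolution() {
    const values = await Promise.all(firebasePromise);
    // ...run your algorithm on `values` here...
    return values;   // returning the processed value is what lets the caller's await/then see it
}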
You can return a Promise from getNews
module.exports = {
    news: function getNews() {
        return axios.get('https://cryptopanic.com/api/posts/?auth_token=518dacbc2f54788fcbd9e182521851725a09b4fa&public=true')
            .then(res => {
                // Do your stuff
                return res;
            })
    }
}
and then
let promiseSlowest = news();
let promiseOther1 = willResolveSoon1();
let promiseOther2 = willResolveSoon2();
let promiseOther3 = willResolveSoon3();
Promise.all([promiseOther1, promiseOther2, promiseOther3]).then(([data1, data2, data3]) => {
    promiseSlowest.then(lastData => {
        // data1, data2, data3, lastData all will be defined here
    })
})
The benefit here is that all the promises start and run concurrently, so your total waiting time will be roughly the time taken by promiseSlowest.
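If you prefer async/await, the same pattern can be written as a sketch like this (combineAll is just an illustrative name; the other functions are the ones above):
async function combineAll() {
    const promiseSlowest = news();   // starts running immediately
    const [data1, data2, data3] = await Promise.all([
        willResolveSoon1(),
        willResolveSoon2(),
        willResolveSoon3()
    ]);
    const lastData = await promiseSlowest;   // already in flight, we just wait for it here
    // data1, data2, data3, lastData are all available together here
}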
Check the link for more:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function#Examples
Related
I am using the code given below to run multiple HTTPS GET requests against the Wikipedia API.
app.get("/data_results", (req, res) => {
const articlesData = names.map(nameObj => {
let name = nameObj.name;
let articleExtract = "";
let contentURL =
`https://en.wikipedia.org/w/api.php?
action=query&titles=${name}&prop=extracts&format=json&exintro=1&explaintext=false&origin=*`;
// Getting article content
https.get(contentURL, response => {
response.on("data", async data => {
const wikiArticle = JSON.parse(data);
// Mapping the keys of the JSON data of query to its values.
articleExtract = await Object.keys(wikiArticle.query.pages).map(key => wikiArticle.query.pages[key])[0].extract;
nameObj.article = articleExtract.substring(0,350);
})
});
return nameObj;
});
res.send(articlesData);});
This is my names array
[
    { name: 'Indus%20Valley%20Civilisation' },
    { name: 'Ramayana' },
    { name: 'Mahabharata' },
    { name: 'Gautama%20Buddha' }
]
My aim:
Run the HTTPS GET request for every value in the names array sequentially.
Add the article extract with its name and save it in an array of objects.
You could also suggest better ways of doing this.
Try this solution. It creates an object like:
{
    Mahabharata: response
}
The code:
const getValues = async () => {
    const name_array = [
        { name: 'Indus%20Valley%20Civilisation' },
        { name: 'Ramayana' },
        { name: 'Mahabharata' },
        { name: 'Gautama%20Buddha' },
    ];
    const namePromises = name_array.map(value =>
        fetch('baseurl' + value.name).then(res => res.json()),
    );
    const response = await Promise.all(namePromises);
    return response.reduce((obj, responseResults, index) => {
        obj[name_array[index].name] = responseResults;
        return obj; // the accumulator must be returned on every iteration
    }, {});
};
I would suggest you use fetch and Promise.all. Map over your array of names to create the array of promises, then pass it to Promise.all, which resolves with the array of results.
Something like this:
const namePromises = [
    { name: 'Indus%20Valley%20Civilisation' },
    { name: 'Ramayana' },
    { name: 'Mahabharata' },
    { name: 'Gautama%20Buddha' }
].map((value) => fetch(baseurl + value.name).then(res => res.json()));

Promise.all(namePromises).then((response) => {
    console.log(response);
});
Problems:
So, you have several things going on here.
http.get() is non-blocking and asynchronous. That means that the rest of your code continues to run while your multiple http.get() operations are running. That means you end up calling res.send(articlesData); before any of the http requests are complete.
.map() is not promise-aware or asynchronous-aware, so it will not pause its loop for any asynchronous operation. As such, you're actually running ALL your http requests in parallel (they are all in-flight at the same time): the .map() loop starts them all and then they all finish some time later (see the short sketch after this list).
The data event for http.get() is not guaranteed to contain all your data or even a whole piece of data. It may just be a partial chunk of data. So, while your code may seem to get a complete response now, different network or server conditions could change all that and reading the result of your http requests may stop functioning correctly.
You don't show where names comes from in your request handler. If it's a statically declared array of objects declared in a higher-scoped variable, then this code has a problem that you're trying to modify the objects in that array in a request handler and multiple requests to this request handler can be conflicting with one another.
You don't show any error handling for errors in your http requests.
You're using await in a place it doesn't belong. await only does something useful when you are awaiting a promise.
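A quick sketch of what points 1 and 2 look like in practice (doRequest here is a hypothetical stand-in for your https.get call):
const articlesData = names.map(nameObj => {
    doRequest(nameObj);   // kicks off the request and returns immediately
    return nameObj;       // .article has not been attached yet
});
res.send(articlesData);   // runs right away, long before any response arrives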
Solutions:
Managing asynchronous operations in JavaScript (particularly when you have more than one to coordinate) is a whole lot easier when using promises instead of the plain callback that http.get() uses. And it's a whole lot easier to use promises when you use an http request interface that already supports promises. My go-to library for built-in promise support for http requests in node.js is got(). There are many good choices shown here.
You can use got() and promises to control the asynchronous flow and error handling like this:
const got = require('got');

app.get("/data_results", (req, res) => {
    Promise.all(names.map(nameObj => {
        let name = nameObj.name;
        // create separate result object
        let result = { name };
        let contentURL = `https://en.wikipedia.org/w/api.php?action=query&titles=${name}&prop=extracts&format=json&exintro=1&explaintext=false&origin=*`;
        return got(contentURL).json().then(wikiArticle => {
            // Mapping the keys of the JSON data of query to its values.
            let articleExtract = Object.keys(wikiArticle.query.pages).map(key => wikiArticle.query.pages[key])[0].extract;
            result.article = articleExtract.substring(0, 350);
            return result;
        });
    })).then(articlesData => {
        res.send(articlesData);
    }).catch(err => {
        console.log(err);
        res.sendStatus(500);
    });
});
This code attempts to fix all six of the above-mentioned problems while still running all your http requests in parallel.
If your names array was large, running all these requests in parallel may consume too many resources, either locally or on the target server. Or, it may run into rate limiting on the target server. If that is an issue, you can sequence the http requests to run them one at a time like this:
const got = require('got');

app.get("/data_results", async (req, res) => {
    try {
        let results = [];
        for (let nameObj of names) {
            let name = nameObj.name;
            let result = { name };
            let contentURL = `https://en.wikipedia.org/w/api.php?action=query&titles=${name}&prop=extracts&format=json&exintro=1&explaintext=false&origin=*`;
            const wikiArticle = await got(contentURL).json();
            let articleExtract = Object.keys(wikiArticle.query.pages).map(key => wikiArticle.query.pages[key])[0].extract;
            result.article = articleExtract.substring(0, 350);
            results.push(result);
        }
        res.send(results);
    } catch (e) {
        console.log(e);
        res.sendStatus(500);
    }
});
If you really want to stick with https.get(), then you can "promisify" it and use it in place of got():
function myGet(url, options = {}) {
    return new Promise((resolve, reject) => {
        https.get(url, options, (res) => {
            res.setEncoding('utf8');
            let rawData = '';
            res.on('data', (chunk) => { rawData += chunk; });
            res.on('end', () => {
                try {
                    const parsedData = JSON.parse(rawData);
                    resolve(parsedData);
                } catch (e) {
                    reject(e);
                }
            });
        }).on('error', reject);
    });
}

app.get("/data_results", async (req, res) => {
    try {
        let results = [];
        for (let nameObj of names) {
            let name = nameObj.name;
            let result = { name };
            let contentURL = `https://en.wikipedia.org/w/api.php?action=query&titles=${name}&prop=extracts&format=json&exintro=1&explaintext=false&origin=*`;
            const wikiArticle = await myGet(contentURL);
            let articleExtract = Object.keys(wikiArticle.query.pages).map(key => wikiArticle.query.pages[key])[0].extract;
            result.article = articleExtract.substring(0, 350);
            results.push(result);
        }
        res.send(results);
    } catch (e) {
        console.log(e);
        res.sendStatus(500);
    }
});
I'm trying to update some rows at once.
I'm building up all the queries before executing them together in a single transaction. However, I need some values to be returned by the query after the update. I'm using the returning() method for that.
I can't manage to get all the returned values at once as an array after the query has been executed. If I use a then() directly in the transaction function it returns the values one by one and my promise won't work. It also considerably increases the execution time.
How do I get all the returned values from my update requests at once, as an array?
return await knex.transaction(trx => {
    const queries = [];
    data.forEach(pair => {
        const query = knex('pair')
            .where('symbol', pair.s)
            .update({
                last_price: pair.c,
            })
            .returning(['id'])
            .transacting(trx)
            //.then(function (updated) {
            //    ... doing it one by one here
            //})
        queries.push(query);
    });
    return Promise.all(queries) // Once all queries are written
        .then(() => {
            trx.commit // We try to execute them all
                .catch((e) => {
                    trx.rollback // And rollback in case any of them goes wrong
                    console.error(e)
                });
        })
})
I found a solution without using trx.commit():
async function updateDB(data) {
    return await knex.transaction(async (trx) => {
        const queries = [];
        data.forEach(item => {
            const query = knex('core_exchangepair')
                .where('symbol_slug', item.s)
                .transacting(trx)
                .update({
                    last_price: item.c,
                })
                .returning(['id'])
            queries.push(query);
        });
        try {
            return await Promise.all(queries);
        } catch (e) {
            console.error(e);
            return e;
        }
    })
}
The await after the return is unnecessary. You need to await your Promise.all before returning it, so your trx function should be async.
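A compact sketch of that suggestion, keeping the same table and column names as above:
function updateDB(data) {
    // the caller can simply await the promise this returns; `return await` adds nothing
    return knex.transaction(async (trx) => {
        const queries = data.map(item =>
            knex('core_exchangepair')
                .where('symbol_slug', item.s)
                .transacting(trx)
                .update({ last_price: item.c })
                .returning(['id'])
        );
        // settle every update before the transaction callback returns
        const rows = await Promise.all(queries);
        return rows;
    });
}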
I'm developing the front end for my Spring Boot application. I set up an initial call wrapped in a React useEffect() hook:
useEffect(() => {
    const getData = async () => {
        try {
            const { data } = await fetchContext.authAxios.get(
                '/myapi/' + auth.authState.id
            );
            setData(data);
        } catch (err) {
            console.log(err);
        }
    };
    getData();
}, [fetchContext]);
The data returned isn't comprehensive and needs a further call to retrieve other pieces of information. For example, this initial call returns an employee id, but if I want to retrieve the employee's name and display it I need a subsequent call, and here I'm experiencing tons of issues:
const getEmployeeName = async id => {
    try {
        const name = await fetchContext.authAxios.get(
            '/employeeName/' + id
        );
        console.log(name["data"]); // <= Correctly displays the name
        return name["data"]; // returns an [object Promise]
    } catch (err) {
        console.log(err);
    }
};
I tried to wrap the return value in Promise.resolve(), but that didn't solve the problem. Upon reading similar questions here on Stack Overflow, most answers suggested creating a callback function or using the await keyword (as I've done), but unfortunately that didn't solve the issue either. I admit this may not be the most elegant way to do it; as I'm still learning JS/React, I'm open to suggestions on how to improve the API calls.
var output = Object.values(data).map((index) =>
    <Appointment
        key={index["storeID"].toString()}
        {/* other irrelevant props */}
        employee={name}
        approved={index["approved"]}
    />);
return output;
Async functions always return promises. Any code that needs to interact with the value needs to either call .then on the promise, or be in an async function and await the promise.
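For instance, with the getEmployeeName function from the question, either of these gets at the resolved value, whereas a bare getEmployeeName(id) only gives you the pending promise:
// inside another async function
const name = await getEmployeeName(id);
console.log(name);   // the actual string, not a pending promise

// or via .then
getEmployeeName(id).then(name => {
    console.log(name);
});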
In your case, you should just need to move your code into the existing useEffect, and setState when you're done. I'm assuming that the employeeID is part of the data returned by the first fetch:
const [name, setName] = useState('');

useEffect(() => {
    const getData = async () => {
        try {
            const { data } = await fetchContext.authAxios.get(
                "/myapi/" + auth.authState.id
            );
            setData(data);
            const name = await fetchContext.authAxios.get(
                '/employeeName/' + data.employeeID
            );
            setName(name.data);
        } catch (err) {
            console.log(err);
        }
    };
    getData();
}, [fetchContext]);
// ...
var output = Object.values(appointmentsData).map((index) =>
    <Appointment
        key={index["storeID"].toString()}
        {/* other irrelevant props */}
        employee={name}
        approved={index["approved"]}
    />);
return output;
Note that the above code will do a rerender once it has the data (but no name), and another later when you have the name. If you want to wait until both fetches are complete, simply move the setData(data) call down next to setName.
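Roughly like this (a sketch using the same endpoints as above):
const { data } = await fetchContext.authAxios.get("/myapi/" + auth.authState.id);
const name = await fetchContext.authAxios.get('/employeeName/' + data.employeeID);
// both pieces of state are set together, so there is a single rerender with complete data
setData(data);
setName(name.data);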
Is there any way I can make the below code run synchronously in a way where I can get all of the productLine ids and then loop through and delete all of them, then once all of this is complete, get all of the productIds and then loop through and delete all of them?
I really want to be able to delete each set of items in batch, but the next section can't run until the first section is complete or there will be referential integrity issues.
// Delete Product Lines
axios.get('https://myapi.com/ProductLine?select=id')
    .then(function (response) {
        const ids = response.data.value
        ids.forEach(id => {
            axios.delete('https://myapi.com/ProductLine/' + id)
        })
    })
    .catch(function (error) {
    })

// Delete Products (I want to ensure this runs after the above code)
axios.get('https://myapi.com/Product?select=id')
    .then(function (response) {
        const ids = response.data.value
        ids.forEach(id => {
            axios.delete('https://myapi.com/Product/' + id)
        })
    })
    .catch(function (error) {
    })
There's a lot of duplication in your code. To reduce it, you can create a helper function that is called with the appropriate arguments and contains the code to delete either product lines or products.
async function deleteHelper(getURL, deleteURL) {
    const response = await axios.get(getURL);
    const ids = response.data.value;
    return Promise.all(ids.map(id => (
        axios.delete(deleteURL + id)
    )));
}
With this helper function, your code is simplified and free of duplication.
Now to achieve the desired result, you could use one of the following ways:
Instead of two separate promise chains, use only one promise chain that deletes product lines and then deletes products.
const prodLineGetURL = 'https://myapi.com/ProductLine?select=id';
const prodLineDeleteURL = 'https://myapi.com/ProductLine/';

deleteHelper(prodLineGetURL, prodLineDeleteURL)
    .then(function() {
        const prodGetURL = 'https://myapi.com/Product?select=id';
        const prodDeleteURL = 'https://myapi.com/Product/';
        deleteHelper(prodGetURL, prodDeleteURL);
    })
    .catch(function (error) {
        // handle error
    });
Use async-await syntax.
// "delete" is a reserved word, so the function needs a different name
async function deleteAll() {
    try {
        const urls = [
            [ prodLineGetURL, prodLineDeleteURL ],
            [ prodGetURL, prodDeleteURL ]
        ];
        for (const [getURL, deleteURL] of urls) {
            await deleteHelper(getURL, deleteURL);
        }
    } catch (error) {
        // handle error
    }
}
One other thing you could improve in your code is to use Promise.all instead of the forEach() method to make the delete requests; the code above uses Promise.all inside the deleteHelper function.
Your code (and the other answers) executes the delete requests sequentially, which is a huge waste of time. You should use Promise.all() and execute them in parallel...
// Delete Product Lines
axios.get('https://myapi.com/ProductLine?select=id')
    .then(function (response) {
        const ids = response.data.value
        // execute all delete requests in parallel
        Promise.all(
            ids.map(id => axios.delete('https://myapi.com/ProductLine/' + id))
        ).then(
            // all delete requests are finished
        );
    })
    .catch(function (error) {
    })
All HTTP requests are asynchronous, but you can make the code read sync-like. How? Using async/await.
Suppose you have a function called retrieveProducts; you need to make that function async and then await the response before continuing.
Leaving you with:
const retrieveProducts = async () => {
    // Delete Product Lines
    const response = await axios.get('https://myapi.com/ProductLine?select=id')
    const ids = response.data.value
    ids.forEach(id => {
        axios.delete('https://myapi.com/ProductLine/' + id)
    })
    // Delete Products (I want to ensure this runs after the above code)
    const otherResponse = await axios.get('https://myapi.com/Product?select=id') // use a proper var name
    const otherIds = otherResponse.data.value // same here
    otherIds.forEach(id => {
        axios.delete('https://myapi.com/Product/' + id)
    })
}
But just keep in mind that it's not synchronous; it is still async.
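If the deletions themselves must finish (not just be fired off) before the next block starts, one possible variant is to await them as well, for example:
const retrieveProducts = async () => {
    const lineResponse = await axios.get('https://myapi.com/ProductLine?select=id')
    // wait until every product-line delete has completed
    await Promise.all(
        lineResponse.data.value.map(id => axios.delete('https://myapi.com/ProductLine/' + id))
    )
    const productResponse = await axios.get('https://myapi.com/Product?select=id')
    await Promise.all(
        productResponse.data.value.map(id => axios.delete('https://myapi.com/Product/' + id))
    )
}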
I'm quite a newbie in JavaScript and in Promises.
I'm trying to build an array of objects that I get from an API.
To do so, I've built two functions in a file MyFile.js.
The first one returns a promise when an axios promise is resolved. It's:
function get_items (url) {
    return new Promise((resolve, reject) => {
        let options = {
            baseURL: url,
            method: 'get'
        }
        axios(options)
            .then(response => {
                resolve(response.data)
            })
            .catch(error => {
                reject(error.stack)
            })
    })
}
The second one looks like this:
let output = []
let next_url = 'https://some_url.com/api/data'

async function get_data () {
    try {
        let promise = new Promise((resolve, reject) => {
            if (next_url) {
                get_items(next_url)
                    .then(response => {
                        output.push(...response.results)
                        if (response.next) {
                            next_url = response.next
                            console.log('NEXT_URL HERE', next_url)
                            get_data()
                        } else {
                            console.log('else')
                            next_url = false
                            get_data()
                        }
                    })
                    .catch(error => {
                        reject(error.stack)
                    })
            } else {
                console.log('before resolve')
                resolve(output)
            }
        })
        return await promise
    } catch(e) {
        console.log(e)
    }
}
This is where I'm grinding my teeth.
What I think I understand of this function is that:
it's returning the value of a promise (that's what I understand return await promise is doing)
it's a recursive function. So, if there is a next_url, the function continues on. But if there is not, it gets called one last time to go into the else part where it resolves the array output which contains the results (values not state) of all the promises. At least, when I execute it, and check for my sanity checks with the console.log I wrote, it works.
So, output is filled with data and that's great.
But, when I call this function from another file MyOtherFile.js, like this:
final_output = []

MyFile.get_data()
    .then(result => {
        console.log('getting data')
        final_output.push(...result)
    })
it never gets into the then part. And when I console.log MyFile.get_data(), it's a pending promise.
So, what I would like to do is make get_data() wait for all the promise results (without using Promise.all(), so the calls run in series rather than in parallel, which I guess would be better for performance?) and then be able to retrieve that response in the then part when calling this function from anywhere else.
Keep in mind that I'm really a newbie in promises and JavaScript in general (I'm more of a Python guy).
Let me know if my question isn't clear enough.
I've been scratching my head for two days now and it feels like I'm running in circles.
Thanks for being an awesome community!
This is a bit untested
const api_url = 'https://some_url.com/api/data';

get_data(api_url).then((results) => {
    console.log(results);
}).catch((error) => {
    // console.error(error);
});

function get_items (url) {
    const options = {
        baseURL: url,
        method: 'get'
    };
    return axios(options).then((response) => response.data);
}

async function get_data(next_url) {
    const output = [];
    while (next_url) {
        const { results, next } = await get_items(next_url);
        output.push(...results);
        next_url = next;
    }
    return output;
}
Basically it makes things a bit neater. I suggest looking at more examples with Promises, and at the advantages of async/await and when to use it. One thing to keep in mind: if you return a Promise, it will follow the entire then chain, and you always get back a Promise whose value is that of the last then... if that makes sense :)
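A tiny illustration of that last point:
Promise.resolve(1)
    .then(n => n + 1)             // resolves with 2
    .then(n => n * 10)            // resolves with 20
    .then(n => console.log(n));   // logs 20 -- the value of the last then wins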
There are a few problems. One is that you never resolve the initial Promise unless the else block is entered. Another is that you should return the recursive get_data call every time, so that it can be properly chained with the initial Promise. You may also consider avoiding the explicit promise construction antipattern - get_items already returns a Promise, so there's no need to construct another one (same for the inside of get_items, axios calls return Promises too).
You might consider a plain while loop, reassigning the next_url string until it's falsey:
function get_items (url) {
    const options = {
        baseURL: url,
        method: 'get'
    }
    // return the axios call, handle errors in the consumer instead:
    return axios(options)
        .then(res => res.data)
}

async function get_data() {
    const output = []
    let next_url = 'https://some_url.com/api/data'
    try {
        while (next_url) {
            const response = await get_items(next_url);
            output.push(...response.results)
            next_url = response.next;
        }
    } catch (e) {
        // handle errors *here*, perhaps
        console.log(e)
    }
    return output;
}
Note that .catch will result in a Promise being converted from a rejected Promise to a resolved one - you don't want to .catch everywhere, because that will make it difficult for the caller to detect errors.
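A small example of what that conversion looks like:
Promise.reject(new Error('boom'))
    .catch(e => console.log(e.message))   // handles the rejection...
    .then(value => {
        // ...so this runs with value === undefined: the chain is now resolved,
        // and a caller further down can no longer tell that anything failed
    });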
Another way of doing it is to not use async at all and just recursively return a promise:
const getItems = (url) =>
    axios({
        baseURL: url,
        method: 'get',
    }).then((response) => response.data);

const getData = (initialUrl) => {
    const recur = (result, nextUrl) =>
        !nextUrl
            ? Promise.resolve(result)
            : getItems(nextUrl).then((data) =>
                recur(result.concat([data.results]), data.next),
            );
    return recur([], initialUrl)
        .catch(e => Promise.reject(e.stack)); // reject with error stack
};
As CertainPerformance noted, you don't need to catch at every level; if you want getData to reject with error.stack you only need to catch once.
However, if you had 100 next urls and 99 of them were fine but the last one failed, would you like to reject in a way that keeps the results so far, so you can try again?
If so, the code could look something like this:
const getData = (initialUrl) => {
    const recur = (result, nextUrl) =>
        !nextUrl
            ? Promise.resolve(result)
            : getItems(nextUrl)
                .catch(e => Promise.reject([e, result])) // reject with error and result so far
                .then((data) =>
                    recur(result.concat([data.results]), data.next),
                );
    return recur([], initialUrl); // do not catch here, just let it reject with error and result
};
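A hypothetical caller could then recover the partial results from the rejection:
getData(initialUrl)
    .then(results => console.log('all pages fetched', results))
    .catch(([error, partialResults]) => {
        // the rejection carries both the error and everything fetched before the failure
        console.error(error);
        // e.g. persist partialResults here and retry later
    });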