Async/Promise issues when adding a Gatsby node field - javascript

I'm having some issues adding some fields on to a Gatsby node. The real issue comes down to the fact that I just can't seem to wrap my head around the asynchronous situation, since I'm creating these fields from API call results. I'm still trying to learn about promises/async/etc.
I make one API call to get location information and add it as a field (locationRequest, which is working just fine), and then run another call to get the orthodontists who work at that location.
When getOrthos runs and reaches the console.log that should print an array of orthodontist entities, I'm getting this instead:
Created Ortho Node... [ Promise { <pending> }, Promise { <pending> } ]
What am I doing wrong? I've gone through some Promise tutorials, but I can't figure out the best way to do this so that it returns the actual data rather than the promise.
Thank you for any guidance you can provide, and please excuse my ignorance.
const yextOrthos = node.acf.location_orthodontists;

const locationRequest = async () => {
  const data = await fetch("https://FAKEURL.COM")
    .then(response => response.json());
  if( data && data.response && data.response.count === 1 ){
    createNodeField({
      node,
      name: `yextLocation`,
      value: data.response.entities[0]
    });
  } else {
    console.log("NO LOCATIONS FOUND");
  }
};

const getOrthos = async () => {
  let orthodontists = await yextOrthos.map( async (ortho, i) => {
    let orthoID = ortho.acf.yext_entity_ortho_id;
    return await orthoRequest(orthoID);
  });
  if( orthodontists.length ){
    createNodeField({
      node,
      name: `yextOrthos`,
      value: orthodontists
    });
    console.log("Created Ortho Node...", orthodontists);
  } else {
    console.log("NO DOCTORS FOUND");
  }
};

const orthoRequest = async (orthoID) => {
  const dataPros = await fetch("https://FAKEURL.COM").then(response => response.json());
  if( dataPros && dataPros.response && dataPros.response.count === 1 ){
    return dataPros.response.entities[0];
  } else {
    return;
  }
}

locationRequest();
getOrthos();

What you need to remember is that await should only stand before a promise (or something that returns a promise). Array.prototype.map() returns an array, so you can't usefully await it directly. Promise.all(), on the other hand, accepts an array of promises and returns a single promise. The example Jose Vasquez gave seems sufficient.
Good luck

You should use Promise.all() for arrays, on this line:
let orthodontists = await Promise.all(yextOrthos.map( async (ortho, i) => {...}));
I hope it helps!
Edit:
A Promise which will be resolved with the value returned by the async function, or rejected with an uncaught exception thrown from within the async function.
If you wish to fully perform two or more jobs in parallel, you must use await Promise.all([job1(), job2()]) as shown in the parallel example.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function
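Applied to the question's getOrthos, a minimal sketch of the fix might look like the following. It reuses orthoRequest, yextOrthos, node, and createNodeField from the question; treat it as an outline rather than a drop-in implementation.

const getOrthos = async () => {
  // Promise.all turns the array of pending promises produced by .map()
  // into a single promise that resolves to an array of entities
  const orthodontists = await Promise.all(
    yextOrthos.map(ortho => orthoRequest(ortho.acf.yext_entity_ortho_id))
  );
  if (orthodontists.length) {
    createNodeField({ node, name: `yextOrthos`, value: orthodontists });
    console.log("Created Ortho Node...", orthodontists);
  } else {
    console.log("NO DOCTORS FOUND");
  }
};

// the caller should also await both calls rather than fire-and-forget, e.g.
// await Promise.all([locationRequest(), getOrthos()]);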

Related

Array of filtered axios results from paginated API is empty

In my code below I get an empty array on my console.log(response), but the console.log(filterdIds) inside the getIds function shows my desired data. I think my resolve is not right.
Note that I run the do..while only once for testing. The API is paged: if the records are from yesterday it keeps going; if not, the do..while stops.
Can somebody point me in the right direction?
const axios = require("axios");

function getToken() {
  // Get the token
}

function getIds(jwt) {
  return new Promise((resolve) => {
    let pageNumber = 1;
    const filterdIds = [];
    const config = {
      //Config stuff
    };
    do {
      axios(config)
        .then((response) => {
          response.forEach(element => {
            //Some logic, if true then:
            filterdIds.push(element.id);
            console.log(filterdIds);
          });
        })
        .catch(error => {
          console.log(error);
        });
    } while (pageNumber != 1)
    resolve(filterdIds);
  });
}

getToken()
  .then(token => {
    return token;
  })
  .then(jwt => {
    return getIds(jwt);
  })
  .then(response => {
    console.log(response);
  })
  .catch(error => {
    console.log(error);
  });
I'm also not sure where to put the reject inside the getIds function because of the do..while.
The fundamental problem is that resolve(filterdIds); runs synchronously before the requests fire, so it's guaranteed to be empty.
Promise.all or Promise.allSettled can help if you know how many pages you want up front (or if you're using a chunk size to make multiple requests--more on that later). These methods run in parallel. Here's a runnable proof-of-concept example:
const pages = 10; // some page value you're using to run your loop

axios
  .get("https://httpbin.org") // some initial request like getToken
  .then(response => // response has the token, ignored for simplicity
    Promise.all(
      Array(pages).fill().map((_, i) => // make an array of request promises
        axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${i + 1}`)
      )
    )
  )
  .then(responses => {
    // perform your filter/reduce on the response data
    const results = responses.flatMap(response =>
      response.data
        .filter(e => e.id % 2 === 0) // some silly filter
        .map(({id, name}) => ({id, name}))
    );
    // use the results
    console.log(results);
  })
  .catch(err => console.error(err))
;
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
The network tab shows the requests happening in parallel.
If the number of pages is unknown and you intend to fire requests one at a time until your API informs you of the end of the pages, a sequential loop is slow but can be used. Async/await is cleaner for this strategy:
(async () => {
  // like getToken; should handle err
  const tokenStub = await axios.get("https://httpbin.org");
  const results = [];

  // page += 10 to make the snippet run faster; you'd probably use page++
  for (let page = 1;; page += 10) {
    try {
      const url = `https://jsonplaceholder.typicode.com/comments?postId=${page}`;
      const response = await axios.get(url);

      // check whatever condition your API sends to tell you no more pages
      if (response.data.length === 0) {
        break;
      }

      for (const comment of response.data) {
        if (comment.id % 2 === 0) { // some silly filter
          const {name, id} = comment;
          results.push({name, id});
        }
      }
    }
    catch (err) { // hit the end of the pages or some other error
      break;
    }
  }

  // use the results
  console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
The network tab shows the sequential request waterfall.
A task queue or chunked loop can be used if you want to increase parallelization. A chunked loop would combine the two techniques to request n records at a time and check each result in the chunk for the termination condition. Here's a simple example that strips out the filtering operation, which is sort of incidental to the asynchronous request issue and can be done synchronously after the responses arrive:
(async () => {
  const results = [];
  const chunk = 5;

  for (let page = 1;; page += chunk) {
    try {
      const responses = await Promise.all(
        Array(chunk).fill().map((_, i) =>
          axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${page + i}`)
        )
      );

      for (const response of responses) {
        for (const comment of response.data) {
          const {name, id} = comment;
          results.push({name, id});
        }
      }

      // check end condition
      if (responses.some(e => e.data.length === 0)) {
        break;
      }
    }
    catch (err) {
      break;
    }
  }

  // use the results
  console.log(results);
})();
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
(The network tab shows only an excerpt of the 100 requests, but the chunk size of 5 at a time is visible.)
Note that these snippets are proofs-of-concept and could stand to be less indiscriminate with catching errors, ensure all throws are caught, etc. When breaking it into sub-functions, make sure to .then and await all promises in the caller--don't try to turn it into synchronous code.
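For instance, here's a minimal hedged sketch of keeping promises awaited in the caller; fetchPage and run are hypothetical names, not functions from the snippets above:

// hypothetical helper; it returns a promise, so callers must await or .then it
async function fetchPage(page) {
  const response = await axios.get(`https://jsonplaceholder.typicode.com/comments?postId=${page}`);
  return response.data;
}

async function run() {
  const firstPage = await fetchPage(1); // awaited in the caller, not fire-and-forget
  console.log(firstPage.length);
}

run().catch(err => console.error(err));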
See also
How do I return the response from an asynchronous call? and Why is my variable unaltered after I modify it inside of a function? - Asynchronous code reference which explain why the array is empty.
What is the explicit promise construction antipattern and how do I avoid it?, which warns against adding a new Promise to help resolve code that already returns promises.
To take a step back and think about why you ran into this issue, we have to think about how synchronous and asynchronous javascript code works together. Your synchronous getIds function is going to run to completion, stepping through each line until it gets to the end.
The axios function invocation is returning a Promise, which is an object that represents some future fulfillment or rejection value. That Promise isn't going to resolve until the next cycle of the event loop (at the earliest), and your code is telling it to do some stuff when that pending value is returned (which is the callback in the .then() method).
But your main getIds function isn't going to wait around... it invokes the axios function, gives the returned Promise something to do in the future, and keeps going, moving past the do/while loop and onto the resolve call, which fulfills the Promise you created at the beginning of the function... but the axios Promise hasn't resolved by that point, and therefore filterdIds hasn't been populated.
When you moved the resolve method for the promise you're creating into the callback that the axios resolved Promise will invoke, it started working because now your Promise waits for axios to resolve before resolving itself.
Hopefully that sheds some light on what you can do to get your multi-page goal to work.
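As a hedged illustration of that fix (single page only, matching the "run the do..while once" test case, and still using the explicit Promise constructor that the links above warn against), moving resolve into the axios callback looks roughly like this:

function getIds(jwt) {
  return new Promise((resolve, reject) => {
    const filterdIds = [];
    const config = { /* Config stuff, presumably using jwt */ };
    axios(config)
      .then((response) => {
        response.forEach(element => {
          // Some logic, if true then:
          filterdIds.push(element.id);
        });
        resolve(filterdIds); // resolve only after the data has arrived
      })
      .catch(reject);
  });
}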
I couldn't help thinking there was a cleaner way to allow you to fetch multiple pages at once, and then recursively keep fetching if the last page indicated there were additional pages to fetch. You may still need to add some additional logic to filter out any pages that you batch fetch that don't meet whatever criteria you're looking for, but this should get you most of the way:
async function getIds(startingPage, pages) {
  const pagePromises = Array(pages).fill(null).map((_, index) => {
    const page = startingPage + index;
    // set the page however you do it with axios query params
    config.page = page;
    return axios(config);
  });

  // get the last page you attempted, and if it doesn't meet whatever
  // criteria you have to finish the query, submit another batch query
  const lastPage = await pagePromises[pagePromises.length - 1];

  // the result from getIds is an array of ids, so we recursively get the rest of the pages here
  // and have a single level array of ids (or an empty array if there were no more pages to fetch)
  const additionalIds = !lastPage.done ? [] : await getIds(startingPage + pages, pages);

  // now we wait for all page queries to resolve and extract the ids
  const resolvedPages = await Promise.all(pagePromises);
  const resolvedIds = [].concat(...resolvedPages).map(elem => elem.id);

  // and finally merge the ids fetched in this methods invocation, with any fetched recursively
  return [...resolvedIds, ...additionalIds];
}
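A hedged usage sketch, assuming config is defined in an enclosing scope as the function above expects, and that your API exposes some end-of-pages indicator like the hypothetical lastPage.done:

(async () => {
  const ids = await getIds(1, 5); // start at page 1, fetch 5 pages per batch
  console.log(`Fetched ${ids.length} ids`);
})();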

New Promise inside an async function in order to grab the value outside of the async function

I am returning a new Promise inside an async function that makes a request to an API. I can console.log the response from the API just fine inside the function, but when I try to resolve the same value I logged (document.sold), it does not work. I expected the checkPriceId variable to return a promise which I could then catch with .then, but that does not seem to work. I have also tried using Promise.all around the documents.forEach loop, to no avail.
Any help would be greatly appreciated.
Here's the code
const checkPriceId = async test => {
  return new Promise((resolve, reject) => {
    const query = `*[_type == "products" && price_id == "${body.price_id}"]`
    client.fetch(query, {}).then(documents => {
      documents.forEach(document => {
        //console.log(document.sold)
        resolve(document.sold)
      })
    })
  })
}

checkPriceId.then(test => {
  console.log(test) // does nothing
})

console.log(checkPriceId) // just returns a async function

checkPriceId()
Why use the Promise constructor at all? client.fetch already returns a promise, and you're also inside an async function which also returns a promise. Assuming all documents have a .sold that you're trying to get back in an array:
const checkPriceId = async () => {
  const query = `*[_type == "products" && price_id == "${body.price_id}"]`
  const documents = await client.fetch(query, {})
  return documents.map(({ sold }) => sold)
}

checkPriceId().then((soldList) => {
  console.log(soldList)
})
This also removes the test argument to checkPriceId since it wasn't being used.
As #blex mentioned, you are not calling checkPriceId on line 13.
However, as also mentioned by #grodzi, you cannot resolve your promise multiple times. Once the resolve fn has been called (on line 7), subsequent calls are ignored.
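A minimal standalone sketch (unrelated to the original API) showing that only the first resolve call counts:

new Promise(resolve => {
  resolve("first");  // this value wins
  resolve("second"); // ignored
}).then(value => console.log(value)); // logs "first"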
Since mixing Promises and async the way you are can be verbose and opaque, I'd suggest you just use async/await here. This will greatly help your code readability.
Here's a fixed example:
const checkPriceId = async (test) => {
  const query = `*[_type == "products" && price_id == "${body.price_id}"]`;
  const documents = await client.fetch(query, {}); // this could throw
  return documents.map(document => document.sold);
}

checkPriceId().then(test => {
  console.log(test) // this will log the array of sold values returned by checkPriceId
})

How to get neo4j query result returned from .then with node js

I need to get the data returned by the Neo4j driver for Node.js. My problem is that I can print the value of 'online' to the console inside the .then call, but I can't seem to access it outside of that part. I have tried returning record.get('onl') and assigning it to a pre-defined variable outside the function, but nothing works: all I get if I try, for example, to print the value of online on the last line of this snippet is Promise { <pending> }. I suppose I'm not handling the promise right; I've looked at lots of tutorials and examples, but I can't work it out. So: how can I assign the returned data (record.get('onl')) to var online and get the actual result instead of the promise?
Thanks in advance :)
var online = session.run(cyp1, param).then(results => {
  return results.records.map(record => {
    console.log(record.get('onl'))
    return record.get('onl')
  })
}).then(() => {
  session.close()
});

console.log(online)
Currently, you are assigning the promise chain itself to var online, not its resolved value. You could use async/await, which lets you write asynchronous code in a synchronous style.
async function getRecords() {
  // session.run resolves to a Result; its .records array holds the rows
  const result = await session.run(cyp1, param);
  return result.records.map(record => record.get('onl'));
}

const online = await getRecords();
Use try/catch/finally
try {
  const online = await getRecords();
} catch (error) {
  // do something
} finally {
  await session.close()
}
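Note that await is only valid inside an async function (or at the top level of an ES module), so in a plain script you would wrap the above in an async IIFE; a minimal sketch:

(async () => {
  try {
    const online = await getRecords();
    console.log(online);
  } catch (error) {
    console.error(error);
  } finally {
    await session.close();
  }
})();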
If you want to continue using .then(), you need to use promise chaining and pass the value down the chain; this can turn into complex callback/promise-chain nesting.
session.run(cyp1, param).then(results => {
  return results.records.map(record => record.get('onl'))
}).then((online) => {
  console.log(online)
}).catch(() => {
  // do something
}).finally(() => {
  session.close()
});

Recursion with an API, using Vanilla JS

I'm playing with the Rick and Morty API and I want to get all of the universe's characters into an array, so I don't have to make more API calls for the rest of my code.
The endpoint https://rickandmortyapi.com/api/character/ returns the results in pages, so I have to use recursion to get all the data with one function call.
I can get it to spit out results into HTML, but I can't seem to get a complete array of JSON objects.
I'm using some ideas from Axios recursion for paginating an api with a cursor. I translated the concept for my problem, and I have it posted on my Codepen.
This is the code:
async function populatePeople(info, universePeople){ // Retrieve the data from the API
  let allPeople = []
  let check = ''
  try {
    return await axios.get(info)
      .then((res)=>{
        // here the current page results is in res.data.results
        for (let i=0; i < res.data.results.length; i++){
          item.textContent = JSON.stringify(res.data.results[i])
          allPeople.push(res.data.results[i])
        }
        if (res.data.info.next){
          check = res.data.info.next
          return allPeople.push(populatePeople(res.data.info.next, allPeople))
        }
      })
  } catch (error) {
    console.log(`Error: ${error}`)
  } finally {
    return allPeople
  }
}

populatePeople(allCharacters)
  .then(data => console.log(`Final data length: ${data.length}`))
Some sharp eyes and brains would be helpful.
It's probably something really simple and I'm just missing it.
The following line has problems:
return allPeople.push(populatePeople(res.data.info.next, allPeople))
Here you push a promise object into allPeople, and as .push() returns a number, you are returning a number, not allPeople.
Using a for loop to push individual items from one array to another is really a verbose way of copying an array. The loop is only needed for the HTML part.
Also, you are mixing .then() with await, which is making things complex. Just use await only. When using await, there is no need for recursion any more. Just replace the if with a loop:
while (info) {
  ....
  info = res.data.info.next;
}
You never assign anything to universePeople. You can drop this parameter.
Instead of the plain for loop, you can use the for...of syntax.
As from res you only use the data property, use a variable for that property only.
So taking all that together, you get this:
async function populatePeople(info) {
  let allPeople = [];
  try {
    while (info) {
      let {data} = await axios.get(info);
      for (let content of data.results) {
        const item = document.createElement('li');
        item.textContent = JSON.stringify(content);
        denizens.append(item);
      }
      allPeople.push(...data.results);
      info = data.info.next;
    }
  } catch (error) {
    console.log(`Error: ${error}`)
  } finally {
    section.append(denizens);
    return allPeople;
  }
}
Here is a working example of a recursive function:
async function getAllCharectersRecursively(URL, results){
  try{
    const {data} = await axios.get(URL);
    // concat the current page's results
    results = results.concat(data.results)
    if(data.info.next){
      // if there is a next page, call recursively
      return await getAllCharectersRecursively(data.info.next, results)
    }
    else{
      // at the last page there is no next page, so return the collected results
      return results
    }
  }
  catch(e){
    console.log(e)
  }
}

async function main(){
  let results = await getAllCharectersRecursively("https://rickandmortyapi.com/api/character/", [])
  console.log(results.length)
}

main()
I hesitate to offer another answer because Trincot's analysis and answer is spot-on.
But I think a recursive answer here can be quite elegant. And as the question was tagged with "recursion", it seems worth presenting.
const populatePeople = async (url) => {
  const {info: {next}, results} = await axios .get (url)
  return [...results, ...(next ? await populatePeople (next) : [])]
}

populatePeople ('https://rickandmortyapi.com/api/character/')
  // or wrap in an `async` main, or wait for global async...
  .then (people => console .log (people .map (p => p .name)))
  .catch (console .warn)

.as-console-wrapper {max-height: 100% !important; top: 0}
<script>/* dummy */ const axios = {get: (url) => fetch (url) .then (r => r .json ())} </script>
This is only concerned with fetching the data. Adding it to your DOM should be a separate step, and it shouldn't be difficult.
Update: Explanation
A comment indicated that this is hard to parse. There are two things that I imagine might be tricky here:
First is the object destructuring in {info: {next}, results} = <...>. This is just a nice way to avoid using intermediate variables to calculate the ones we actually want to use.
The second is the spread syntax in return [...results, ...<more>]. This is a simpler way to build an array than using .concat or .push. (There's a similar feature for objects.)
Here's another version, with some intermediate variables and an array concatenation instead. It does the same thing:
const populatePeople = async (url) => {
  const response = await axios .get (url)
  const next = response .info && response .info .next
  const results = response .results || []
  const subsequents = next ? await populatePeople (next) : []
  return results .concat (subsequents)
}
I prefer the original version. But perhaps you would find this one more clear.

Promise.all results are as expected, but individual items showing undefined

First of all, I know there are some known issues with console.log in Google Chrome not displaying results as expected. That is not the case here, as I am working in VSCode.
We begin with two async calls to the server.
promise_a = fetch(url)
promise_b = fetch(url)
Since fetch results are also promises, .json() will need to be called on each item. The helper function process will be used, as suggested by a Stack Overflow user (sorry, I lost the link).
let promiseResults = []

let process = prom => {
  prom.then(data => {
    promiseResults.push(data);
  });
};
Promise.all is called. The resulting array is passed to .then where forEach calls process on item.json() each iteration and fulfilled promises are pushed to promiseResults.
Promise.all([promise_a, promise_b])
  .then(responseArr => {
    responseArr.forEach(item => {
      process(item.json());
    });
  })
No argument is given to the final .then block because promiseResults is in the outer scope. console.log shows confusing results.
  .then(() => {
    console.log(promiseResults); // correct results
    console.log(promiseResults[0]); // undefined ?!?
  })
Any help will be greatly appreciated.
If you are familiar with async/await syntax, I would suggest not using an external variable promiseResults, but returning the results on the fly with this function:
async function getJsonResults(promisesArr) {
  // Get fetch promises response
  const results = await Promise.all(promisesArr);
  // Get JSON from each response promise
  const jsonResults = await Promise.all(results.map(r => r.json()));
  return jsonResults
}
This is a usage example:
promise_a = fetch(url1)
promise_b = fetch(url2)

getJsonResults([promise_a, promise_b])
  .then(theResults => console.log('All results:', theResults))
Use theResults variable to extract necessary results.
You can try this; it looks like the array loop is not completing properly within the promise environment. Specifically, promiseResults is filled after you log it.
var resultAll = Promise.all([promise_a, promise_b])
  .then(responseArr => {
    return Promise.all(responseArr.map(item => item.json()));
  });

resultAll.then(promiseResults => {
  console.log(promiseResults);
});
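To see the underlying timing issue the answers describe, here is a minimal standalone sketch (no fetches involved, names are illustrative): values pushed inside a .then callback land on a later tick of the event loop, after code that runs immediately has already read the array. (The asker's first log presumably looked populated because the console displayed a live reference that was filled in by the time it was inspected.)

const results = [];

Promise.resolve("data").then(value => {
  results.push(value); // runs on a later tick of the event loop
});

console.log(results);    // [] -- nothing has been pushed yet
console.log(results[0]); // undefined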
