I have some API endpoints.
One returns all server details (https://dnscheck.io/api/serverDetails/).
The others are server-specific endpoints (https://dnscheck.io/api/query/?id=2&type=A&hostname=test.com). For each server_Id (which I got from the serverDetails endpoint), I have to call the query endpoint.
What I have done is:
I loop over the results array (which I got from the serverDetails endpoint),
and on each iteration of the loop I call the query endpoint to get the IP.
loop:
for (const [index, item] of data.entries()) {
  const res = await fetch(
    `https://dnscheck.io/api/query/?id=${item.id}&type=${query.type}&hostname=${query.host}`
  );
  const result = await res.json();
  renderResult(result, item, index);
}
render-function:
const renderResult = (result, data, index) => {
  const ip = document.querySelector(`.ip-address${index + 1}`);
  ip.innerHTML = result.answers[0].address;
};
This way, the results are displayed in the DOM sequentially (one after another).
But what I want is to update the DOM with each result as soon as that result is ready.
What can I do?
Don't use await, as that blocks the for loop and orders the results. Use .then() instead.
for (const [index, item] of data.entries()) {
  fetch(
    `https://dnscheck.io/api/query/?id=${item.id}&type=${query.type}&hostname=${query.host}`
  )
    .then(res => res.json())
    .then(result => renderResult(result, item, index));
}
You can do them in parallel by using map on the array and using fetch within. You can know when they've all finished by using Promise.all to observe the overall result:
await Promise.all(
  data.map(async (item, index) => {
    const res = await fetch(
      `https://dnscheck.io/api/query/?id=${item.id}&type=${query.type}&hostname=${query.host}`
    );
    // You need to check `res.ok` here
    const result = await res.json();
    renderResult(result, item, index);
  })
);
Note that Promise.all will reject its promise immediately if any of the input promises rejects. If you want to know what succeeded and what failed, use allSettled instead:
const results = await Promise.allSettled(
  data.map(async (item, index) => {
    const res = await fetch(
      `https://dnscheck.io/api/query/?id=${item.id}&type=${query.type}&hostname=${query.host}`
    );
    // You need to check `res.ok` here
    const result = await res.json();
    renderResult(result, item, index);
  })
);
// Use `results` here, it's an array of objects, each of which is either:
// {status: "fulfilled", value: <the fulfillment value>}
// or
// {status: "rejected", reason: <the rejection reason>}
About my "You need to check res.ok here" note: this is unfortunately a footgun in the fetch API. It only rejects its promise on network failure, not HTTP errors. So a 404 results in a fulfilled promise. I write about it here. Typically the best thing is to have wrapper functions you call, for instance:
function fetchJSON(...args) {
  return fetch(...args)
    .then(response => {
      if (!response.ok) {
        throw new Error(`HTTP error ${response.status}`); // Or an error subclass
      }
      return response.json();
    });
}
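For example, the earlier loop could then go through the wrapper (a sketch, reusing the data, query, and renderResult from the question):
// A sketch of using the fetchJSON wrapper in the earlier loop
// (assumes the same data, query and renderResult from the question).
for (const [index, item] of data.entries()) {
  fetchJSON(
    `https://dnscheck.io/api/query/?id=${item.id}&type=${query.type}&hostname=${query.host}`
  )
    .then(result => renderResult(result, item, index))
    .catch(error => console.error(error)); // HTTP errors now land here too
}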
Related
I am struggling to extract data from an API call. I need it to return an array of keys but I keep getting [object Promise] when I try to render.
const apiKey = '*****';
const container = document.getElementById('root');
const Yelp = {
  async search(term, location, searchBy) {
    let response = await fetch(`https://cors-anywhere.herokuapp.com/https://api.yelp.com/v3/businesses/search?term=${term}&location=${location}&sort_by=${searchBy}`, {
      headers: {
        Authorization: `Bearer ${apiKey}`
      }
    })
    if (response.ok) {
      let jsonResponse = await response.json()
      return jsonResponse.businesses.map(business => {
        return business
      })
    }
  }
}
console.log(Yelp.search('pasta', 'london', 'best_match'));
Since the Yelp.search function is async, it will always return a promise. That's the reason you observe a promise when you log the immediate result after calling it. Instead, you should await the promise like:
Yelp.search('pasta', 'london', 'best_match').then(results => {
  console.log(results)
})
So to answer your question, you would call the then() method of the Promise to wait for its results to resolve. The then method takes two arguments. The first is a function that you provide to handle the results of the promise. The second is a function that you provide to handle any errors returned by the promise. For example:
Yelp.search('pasta', 'london', 'best_match').then(results => {
  // handle the results
  console.log(results)
}, err => {
  // handle the error
  console.error(err)
})
You may also catch exceptions thrown by the promise by invoking catch(), like so:
Yelp.search('pasta', 'london', 'best_match')
  .then(results => console.log(results), err => console.error(err))
  .catch(ex => console.error(ex))
I have this async function to get three separate requests from the swapi API to retrieve data. However, I'm only getting back the first page of data as it's paginated. I know I have to create a loop for data.next to make new requests but I'm unsure the best way to run it through my function.
(async function getData() {
  //Utility Functions for fetch
  const urls = ["https://swapi.co/api/planets/", "https://swapi.co/api/films/", "https://swapi.co/api/people/"];
  const checkStatus = res => res.ok ? Promise.resolve(res) : Promise.reject(new Error(res.statusText));
  const parseJSON = response => response.json();
  //Get Data
  await Promise.all(urls.map(url => fetch(url)
    .then(checkStatus)
    .then(parseJSON)
    .catch(error => console.log("There was a problem!", error))))
    .then(data => {
      let planets = data[0].results,
        films = data[1].results,
        people = data[2].results;
      buildData(films, planets, people);
    });
})();
You are trying to access all the data.results keys in the loop, which misses the point of using Promise.all. Promise.all collects all the results from the promises and stores them in a single array once all the promises are resolved.
So wait for the promises to resolve and use the array returned from Promise.all to build your data.
To get all the pages you need a recursive function, which means that the function will keep calling itself until a condition is met. Sort of like a loop, but with callbacks.
Every time you fetch a page, check if there is a next page by looking at the next property in the response object. If there is, call getAllPages again until there are no more pages left. At the same time all the results are concatenated into a single array. That array is passed on to the next call, which concatenates it again with the new results. At the end the collection variable, which contains all the concatenated arrays, is returned.
Let me know if you have any questions regarding the code.
(async function getData() {
  //Utility Functions for fetch
  const urls = ["https://swapi.co/api/planets/", "https://swapi.co/api/films/", "https://swapi.co/api/people/"];
  const checkStatus = res => res.ok ? Promise.resolve(res) : Promise.reject(new Error(res.statusText));
  const parseJSON = response => response.json();
  // Get a single endpoint.
  const getPage = url => fetch(url)
    .then(checkStatus)
    .then(parseJSON)
    .catch(error => console.log("There was a problem!", error));
  // Keep getting the pages until the next key is null.
  const getAllPages = async (url, collection = []) => {
    const { results, next } = await getPage(url);
    collection = [...collection, ...results];
    if (next !== null) {
      return getAllPages(next, collection);
    }
    return collection;
  }
  // Select data out of all the pages gotten.
  const [ planets, films, people ] = await Promise.all(urls.map(url => getAllPages(url)));
  buildData(films, planets, people);
})();
Edit2: Solution at the bottom
I am using the Chrome console and I am trying to output fetched data, and I only get the desired output by writing "await" in exactly the right place, even though another solution can do it earlier, and I don't know why/how that works.
solution() is the "official" solution from a web course I am doing. Both functions currently return the same thing. In myFunction I tried writing "await" in front of every function used and making every function "async", but I still can't get rid of the "await" inside the log, even though the other solution can.
const urls = ['https://jsonplaceholder.typicode.com/users']
const myFunction = async function() {
  // tried await before urls/fetch (+ make it async)
  const arrfetched = urls.map( url => fetch(url) );
  const [ users ] = arrfetched.map( async fetched => { //tried await in front of arrfetched
    return (await fetched).json(); //tried await right after return
  });
  console.log('users', await users); // but can't get rid of this await
}
const solution = async function() {
  const [ users ] = await Promise.all(urls.map(async function(url) {
    const response = await fetch(url);
    return response.json();
  }));
  console.log('users', users); // none here, so it can be done
}
solution();
myFunction();
I would think "await" works in a way that makes:
const a = await b;
console.log(a); // this doesn't work
the same as
const a = b;
console.log(await a); // this works
but it doesn't, and I don't understand why not. I feel like Promise.all does something unexpected, since simply writing "await" in the declaration can't achieve the same thing, only awaiting after the declaration does.
Edit1: this does not work
const myFunction = async function() {
  const arrfetched = await urls.map( async url => await fetch(url) );
  const [ users ] = await arrfetched.map( async fetched => {
    return await (await fetched).json();
  });
  console.log('users', users);
}
Edit2: Thanks for the help everyone. I tried putting ".toString()" on a lot of variables and switching where I put "await" in the code and where not.
As far as I understand it, if I don't use Promise.all then I need to await every time I want to use (as in the actual data, not just pass it around) a function or variable that holds promises. It is insufficient to only have await where the data is being processed and not further up.
In Edit1 above, users runs before any of the other awaits complete, so no matter how many awaits I write in, none of them have finished executing. Copying this code into the (in my case Chrome) console demonstrates it nicely:
const urls = [
  'https://jsonplaceholder.typicode.com/users',
]
const myFunction = async function() {
  const arrfetched = urls.map( async url => fetch(url) );
  const [ users ] = arrfetched.map( async fetched => {
    console.log('fetched', fetched);
    console.log('fetched wait', await fetched);
    return (await fetched).json();
  });
  console.log('users', users);
  console.log('users wait', await users);
}
myFunction();
// Output in the order below:
// fetched()
// users()
// fetched wait()
// users wait()
TL;DR: Promise.all is important there, but it's nothing magical. It just converts an array of Promises into a single Promise that resolves with an array.
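For instance, a minimal sketch of that conversion:
// Three promises in, one promise of an array out.
Promise.all([Promise.resolve(1), Promise.resolve(2), Promise.resolve(3)])
  .then(numbers => console.log(numbers)); // logs [1, 2, 3]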
Let's break down myFunction:
const arrfetched = urls.map( url => fetch(url) );
This returns an array of Promises, all good so far.
const [ users ] = arrfetched.map( async fetched => {
  return (await fetched).json();
});
You're destructuring an array to get the first member, so it's the same as this:
const arr = arrfetched.map( async fetched => {
  return (await fetched).json();
});
const users = arr[0];
Here we are transforming an array of promises into another array of promises. Notice that calling map with an async function will always result in an array of Promises.
You then move the first member of that array into users, so users now actually contains a single Promise. You then await it before printing it:
console.log('users', await users);
In contrast, the other snippet does something slightly different here:
const [ users ] = await Promise.all(urls.map(async function(url) {
  const response = await fetch(url);
  return response.json();
}));
Once again, let's separate the destructuring:
const arr = await Promise.all(urls.map(async function(url) {
  const response = await fetch(url);
  return response.json();
}));
const users = arr[0];
Promise.all transforms the array of Promises into a single Promise that results in an array. This means that, after await Promise.all, everything in arr has been awaited (you can sort of imagine await Promise.all like a loop that awaits everything in the array). This means that arr is just a normal array (not an array of Promises) and thus users is already awaited, or rather, it was never a Promise in the first place, and thus you don't need to await it.
Maybe the easiest way to explain this is to break down what each step achieves:
const urls = ['https://jsonplaceholder.typicode.com/users']
async function myFunction() {
  // You can definitely use `map` to `fetch` the urls
  // but remember that `fetch` is a method that returns a promise
  // so you'll just be left with an array filled with promises that
  // are waiting to be resolved.
  const arrfetched = urls.map(url => fetch(url));
  // `Promise.all` is the most convenient way to wait til everything's resolved
  // and it _also_ returns a promise. We can use `await` to wait for that
  // to complete.
  const responses = await Promise.all(arrfetched);
  // We now have an array of resolved promises, and we can, again, use `map`
  // to iterate over them to return JSON. `json()` _also_ returns a promise
  // so again you'll be left with an array of unresolved promises...
  const userData = responses.map(fetched => fetched.json());
  //...so we wait for those too, and destructure out the first array element
  const [users] = await Promise.all(userData);
  //... et voila!
  console.log(users);
}
myFunction();
await can only be used inside an async function; it is a reserved keyword there. You can't wait for something if the surrounding function isn't async. That's why it works in the console.log inside the async function but not in the global scope.
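For example (a minimal sketch, using the same jsonplaceholder URL as above):
// Inside an async function, await is valid:
async function example() {
  const res = await fetch('https://jsonplaceholder.typicode.com/users');
  console.log(await res.json());
}
example();
// At the top level of a classic (non-module) script, `await fetch(...)` is a syntax error.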
I'm quite a newbie in JavaScript and in Promises.
I'm trying to build an array of objects that I get from an API.
To do so, I've built two functions in a file MyFile.js.
The first one returns a promise when an axios promise is resolved. It's:
function get_items (url) {
  return new Promise((resolve, reject) => {
    let options = {
      baseURL: url,
      method: 'get'
    }
    axios(options)
      .then(response => {
        resolve(response.data)
      })
      .catch(error => {
        reject(error.stack)
      })
  })
}
The second one looks like this:
let output = []
let next_url = 'https://some_url.com/api/data'
async function get_data () {
  try {
    let promise = new Promise((resolve, reject) => {
      if (next_url) {
        get_items(next_url)
          .then(response => {
            output.push(...response.results)
            if (response.next) {
              next_url = response.next
              console.log('NEXT_URL HERE', next_url)
              get_data()
            } else {
              console.log('else')
              next_url = false
              get_data()
            }
          })
          .catch(error => {
            reject(error.stack)
          })
      } else {
        console.log('before resolve')
        resolve(output)
      }
    })
    return await promise
  } catch(e) {
    console.log(e)
  }
}
It's where I'm grinding my teeth.
What I think I understand of this function, is that:
it's returning the value of a promise (that's what I understand return await promise is doing)
it's a recursive function. So, if there is a next_url, the function continues on. But if there is not, it gets called one last time to go into the else part where it resolves the array output which contains the results (values not state) of all the promises. At least, when I execute it, and check for my sanity checks with the console.log I wrote, it works.
So, output is filled with data and that's great.
But, when I call this function from another file MyOtherFile.js, like this:
final_output = []
MyFile.get_data()
  .then(result => {
    console.log('getting data')
    final_output.push(...result)
  })
it never gets into the then part. And when I console.log MyFile.get_data(), it's a pending promise.
So, what I would like to do is make get_data() wait for all the promise results (without using Promise.all(), so the calls run in series rather than in parallel, which I guess would be better for performance?) and then be able to retrieve that response in the then part when calling this function from anywhere else.
Keep in mind that I'm really a newbie in promises and JavaScript in general (I'm more of a Python guy).
Let me know if my question isn't clear enough.
I've been scratching my head for two days now and it feels like I'm running in circle.
Thanks for being an awesome community!
This is a bit untested
const api_url = 'https://some_url.com/api/data';
get_data(api_url).then((results) => {
  console.log(results);
}).catch((error) => {
  // console.error(error);
});
function get_items (url) {
  const options = {
    baseURL: url,
    method: 'get'
  };
  return axios(options).then((response) => response.data);
}
async function get_data(next_url) {
  const output = [];
  while (next_url) {
    const { results, next } = await get_items(next_url);
    output.push(...results);
    next_url = next;
  }
  return output;
}
Basically it makes things a bit neater. I suggest looking at more examples with Promises, their advantages, and when to use await/async. One thing to keep in mind: if you return a Promise, it will follow the entire then chain, and it will always return a Promise whose value is that of the last then... if that makes sense :)
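For example, a small sketch of that chaining (assuming the same axios response shape as above):
// The caller's Promise resolves with the value of the last .then in the chain.
function getFirstItem(url) {
  return axios({ baseURL: url, method: 'get' })
    .then(response => response.data)
    .then(data => data.results[0]); // this is the value the returned Promise resolves with
}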
There are a few problems. One is that you never resolve the initial Promise unless the else block is entered. Another is that you should return the recursive get_data call every time, so that it can be properly chained with the initial Promise. You may also consider avoiding the explicit promise construction antipattern - get_items already returns a Promise, so there's no need to construct another one (same for the inside of get_items, axios calls return Promises too).
You might consider a plain while loop, reassigning the next_url string until it's falsey:
function get_items (url) {
  const options = {
    baseURL: url,
    method: 'get'
  }
  // return the axios call, handle errors in the consumer instead:
  return axios(options)
    .then(res => res.data)
}
async function get_data() {
  const output = []
  let next_url = 'https://some_url.com/api/data'
  try {
    while (next_url) {
      const response = await get_items(next_url);
      output.push(...response.results)
      next_url = response.next;
    }
  } catch (e) {
    // handle errors *here*, perhaps
    console.log(e)
  }
  return output;
}
Note that .catch will result in a Promise being converted from a rejected Promise to a resolved one - you don't want to .catch everywhere, because that will make it difficult for the caller to detect errors.
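A minimal sketch of that conversion:
// The .catch handles the rejection, so the resulting Promise is resolved
// (with undefined here), and the caller never sees the error.
const p = Promise.reject(new Error('boom')).catch(e => console.log(e.message));
p.then(value => console.log('fulfilled with', value)); // "fulfilled with undefined"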
Another way of doing it is to not use async at all and just recursively return a promise:
const getItems = (url) =>
  axios({
    baseURL: url,
    method: 'get',
  }).then((response) => response.data);

const getData = (initialUrl) => {
  const recur = (result, nextUrl) =>
    !nextUrl
      ? Promise.resolve(result)
      : getItems(nextUrl).then((data) =>
          recur(result.concat(data.results), data.next),
        );
  return recur([], initialUrl)
    .catch(e => Promise.reject(e.stack)); // reject with error stack
};
As CertainPerformance noted, you don't need to catch at every level; if you want getData to reject with error.stack you only need to catch it once.
However, if you had 100 next urls and 99 of them were fine but only the last one failed, would you want to reject in a way that keeps the results so far, so you can try again?
If you do then the code could look something like this:
const getData = (initialUrl) => {
  const recur = (result, nextUrl) =>
    !nextUrl
      ? Promise.resolve(result)
      : getItems(nextUrl)
          .catch(e => Promise.reject([e, result])) // reject with error and result so far
          .then((data) =>
            recur(result.concat(data.results), data.next),
          );
  return recur([], initialUrl); // do not catch here, just let it reject with error and result
};
tl;dr - if you have to filter the promises (say for errored ones) don't use async functions
I'm trying to fetch a list of urls with async and parse them. The problem is that if there's an error with one of the urls when I'm fetching - let's say for some reason the api endpoint doesn't exist - the program crashes during parsing with the obvious error:
UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): TypeError: ext is not iterable
I've tried checking if the res.json() is undefined, but obviously that's not it as it complains about the entire 'ext' array of promises not being iterable.
async function fetchAll() {
  let data
  let ext
  try {
    data = await Promise.all(urls.map(url=>fetch(url)))
  } catch (err) {
    console.log(err)
  }
  try {
    ext = await Promise.all(data.map(res => {
      if (res.json()==! 'undefined') { return res.json()}
    }))
  } catch (err) {
    console.log(err)
  }
  for (let item of ext) {
    console.log(ext)
  }
}
Question 1:
How do I fix the above so it won't crash on an invalid address?
Question 2:
My next step is to write the extracted data to the database.
Assuming a data size of 2-5 MB of content, is my approach of using Promise.all() memory efficient? Or would it be more memory efficient (and otherwise better) to write a for loop which handles each fetch, writes to the database in the same iteration, and only then handles the next fetch?
There are several fundamental problems with your code. We should address them in order, and the first is that you're not passing in any URLs!
async function fetchAll(urls) {
  let data
  let ext
  try {
    data = await Promise.all(urls.map(url=>fetch(url)))
  } catch (err) {
    console.log(err)
  }
  try {
    ext = await Promise.all(data.map(res => {
      if (res.json()==! 'undefined') { return res.json()}
    }))
  } catch (err) {
    console.log(err)
  }
  for (let item of ext) {
    console.log(ext)
  }
}
Next, you have several try/catch blocks on DEPENDENT DATA. They should all be in a single try/catch block:
async function fetchAll(urls) {
  try {
    let data = await Promise.all(urls.map(url=>fetch(url)))
    let ext = await Promise.all(data.map(res => {
      // also fixed the ==! 'undefined'
      if (res.json() !== undefined) { return res.json()}
    }))
    for (let item of ext) {
      console.log(ext)
    }
  } catch (err) {
    console.log(err)
  }
}
Next is the problem that res.json() returns a promise wrapped around an object if it exists
if (res.json() !== undefined) { return res.json()}
This is not how you should be using the .json() method. It will fail if there is no parsable json. You should be putting a .catch on it
async function fetchAll(urls) {
  try {
    let data = await Promise.all(urls.map(url => fetch(url).catch(err => err)))
    let ext = await Promise.all(data.map(res => res.json ? res.json().catch(err => err) : res))
    for (let item of ext) {
      console.log(ext)
    }
  } catch (err) {
    console.log(err)
  }
}
Now when it cannot fetch a URL or parse the JSON, you'll get the error and it will cascade down without throwing. Your try/catch block will ONLY throw if a different error happens.
Of course this means we're putting an error handler on each promise and cascading the error, but that's not exactly a bad thing as it allows ALL of the fetches to happen and for you to distinguish which fetches failed. Which is a lot better than just having a generic handler for all fetches and not knowing which one failed.
But now we have it in a form where we can see that there are some better optimizations that can be performed on the code:
async function fetchAll(urls) {
  try {
    let ext = await Promise.all(
      urls.map(url => fetch(url)
        .then(r => r.json())
        .catch(error => ({ error, url }))
      )
    )
    for (let item of ext) {
      console.log(ext)
    }
  } catch (err) {
    console.log(err)
  }
}
Now with a much smaller footprint, better error handling, and readable, maintainable code, we can decide what we eventually want to return. Now the function can live wherever, be reused, and all it takes is a single array of simple GET URLs.
The next step is to do something with them, so we probably want to return the array, which will be wrapped in a promise. Realistically we want errors to bubble since we've handled each fetch error, so we should also remove the try/catch. At that point making it async no longer helps, and actively harms. Eventually we get a small function that pairs every URL resolution, or error, with its respective URL, and that we can easily filter over, map over, and chain!
function fetchAll(urls) {
  return Promise.all(
    urls.map(url => fetch(url)
      .then(r => r.json())
      .then(data => ({ data, url }))
      .catch(error => ({ error, url }))
    )
  )
}
Now we get back an array of similar objects, each with the url it fetched, and either data or an error field! This makes chaining and inspecting SUPER easy.
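For example, a consumer could split the successes from the failures like this (a sketch with made-up URLs):
// Each result has a url plus either a data or an error field.
fetchAll(['https://example.com/a.json', 'https://example.com/b.json'])
  .then(results => {
    const successes = results.filter(r => !r.error);
    const failures = results.filter(r => r.error);
    successes.forEach(({ url, data }) => console.log('ok', url, data));
    failures.forEach(({ url, error }) => console.warn('failed', url, error));
  });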
You are getting a TypeError: ext is not iterable - because ext is still undefined when you caught an error and did not assign an array to it. Trying to loop over it will then throw an exception that you do not catch.
I guess you're looking for
async function fetchAll() {
  try {
    const data = await Promise.all(urls.map(url => fetch(url)));
    const ext = await Promise.all(data.map(res => res.json()));
    for (let item of ext) {
      console.log(item);
    }
  } catch (err) {
    console.log(err);
  }
}
Instead of fetch(url) on line 5, make your own function, customFetch, which calls fetch but maybe returns null, or an error object, instead of throwing.
something like
async function customFetch(url) {
  try {
    let result = await fetch(url);
    if (result.json) return await result.json();
  }
  catch(e) { return e }
}
if (res.json()==! 'undefined')
makes no sense whatsoever: res.json() is asynchronous and returns a promise, not a string. Remove that condition and just return res.json():
try {
  ext = await Promise.all(data.map(res => res.json()))
} catch (err) {
  console.log(err)
}
Whether or not your approach is "best" or "memory efficient" is up for debate. Ask another question for that.
You can have fetch and json not fail by catching the error and returning a special Fail object that you will filter out later:
function Fail(reason){this.reason=reason;};
const isFail = o => (o&&o.constructor)===Fail;
const isNotFail = o => !isFail(o);
const fetchAll = () =>
  Promise.all(
    urls.map(
      url =>
        fetch(url)
          .then(response=>response.json())
          .catch(error=>new Fail([url,error]))
    )
  );
//how to use:
fetchAll()
  .then(
    results => {
      const successes = results.filter(isNotFail);
      const fails = results.filter(isFail);
      fails.forEach(
        e => console.log(`failed url:${e.reason[0]}, error:`, e.reason[1])
      )
    }
  )
As for question 2:
Depending on how many urls you have, you may want to throttle your requests, and if the urls come from a large file (gigabytes) you could use a stream combined with throttling.
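For example, one simple way to throttle is to fetch the urls in fixed-size batches (a rough sketch, not a full streaming solution):
// Process the urls in batches of `limit`, waiting for each batch
// to finish before starting the next one.
async function fetchAllThrottled(urls, limit = 5) {
  const results = [];
  for (let i = 0; i < urls.length; i += limit) {
    const batch = urls.slice(i, i + limit);
    const settled = await Promise.all(
      batch.map(url => fetch(url).then(r => r.json()).catch(error => ({ error, url })))
    );
    results.push(...settled);
  }
  return results;
}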
async function fetchAll(url) {
  return Promise.all(
    url.map(
      async (n) => fetch(n).then(r => r.json())
    )
  );
}
fetchAll([...])
  .then(d => console.log(d))
  .catch(e => console.error(e));
Will this work for you?
If you don't depend on every resource being a success, I would go back to basics and skip async/await.
I would process each fetch individually so I can catch the error for just the one that fails:
function fetchAll() {
  const result = []
  const que = urls.map(url =>
    fetch(url)
      .then(res => res.json())
      .then(item => {
        result.push(item)
      })
      .catch(err => {
        // couldn't fetch the resource or the
        // response was not a json response
      })
  )
  return Promise.all(que).then(() => result)
}
Something good #TKoL said:
Promise.all errors whenever one of the internal promises errors, so whatever advice anyone gives you here, it will boil down to -- Make sure that you wrap the promises in an error handler before passing them to Promise.all
Regarding question 1, please refer to this:
Handling errors in Promise.all
Promise.all is all or nothing. It resolves once all promises in the array resolve, or rejects as soon as one of them rejects. In other words, it either resolves with an array of all the resolved values, or rejects with a single error.