Catching individual errors when using Fetch API - javascript

I'm trying to make multiple fetch calls at once; so far I've used this page to reach a point where I can make multiple calls correctly. However, the problem now is what happens if one of those calls returns an error. What I'd like is that if one URL is good and one is bad, the good URL still returns its JSON object while the bad URL returns the error. But the catch statement sees the one error and stops both calls.
My current code:
let resList = await Promise.all([
  fetch(goodURL),
  fetch(badURL)
]).then(responses => {
  return Promise.all(responses.map(response => {
    return response.json();
  }))
}).catch(error => {
  console.log(error);
});
console.log(resList);
Currently, logging resList prints undefined, when I want it to be the JSON object from the good URL.

As #jonrsharpe mentioned, Promise.allSettled() is probably your best bet.
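For reference, a minimal sketch of what that could look like with the question's two fetches (inside an async function, with goodURL and badURL as in the question):
const results = await Promise.allSettled([
  fetch(goodURL).then(res => res.json()),
  fetch(badURL).then(res => res.json())
]);
// each entry is { status: 'fulfilled', value } or { status: 'rejected', reason }
const goodData = results
  .filter(r => r.status === 'fulfilled')
  .map(r => r.value);
const errors = results
  .filter(r => r.status === 'rejected')
  .map(r => r.reason);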
You can also basically make your own version with something like this:
const results = await Promise.all(
  [a, b, c].map(p =>
    p.catch(err => {
      /* do something with it if you want */
    })
  )
);
Basically, instead of having one catch on Promise.all(), you attach a catch to each promise BEFORE giving it to Promise.all(). If one of them throws an error, it triggers its own catch, which you can handle however you like. As long as you don't return something else from that catch, the failed promise's slot ends up as undefined, which you can filter out.
For example, if a and c worked and b errored, your results would look like this:
['a', undefined, 'c']
And it'd be in the same order as you fed the promises in, so you could tell which was the failure.
const a = new Promise((resolve) => setTimeout(() => resolve('a'), 100));
const b = new Promise((_, reject) => setTimeout(() => reject('b'), 200));
const c = new Promise((resolve) => setTimeout(() => resolve('c'), 50));

(async () => {
  const results = await Promise.all(
    [a, b, c].map(p =>
      p.catch(err => console.error('err', err))
    )
  );
  console.log(results);
})();
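Applied to the original fetch code, the same idea might look like this (an untested sketch; goodURL and badURL are the question's own placeholders):
let resList = await Promise.all(
  [goodURL, badURL].map(url =>
    fetch(url)
      .then(response => response.json())
      .catch(error => {
        console.log(error);
        // no return here, so the failed URL's slot becomes undefined
      })
  )
);
console.log(resList); // e.g. [ { ...json from goodURL }, undefined ]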

Related

Learning Promises, Async/Await to control execution order

I have been studying promises, await, and async functions. While I was still just learning promises, I realized the following: when I would send out two requests, there was no guarantee that they would complete in the order they were written in the code. Of course, that's expected given network routing and packet timing. When I ran the code below, the requests would resolve in no particular order.
const getCountry = async country => {
  await fetch(`https://restcountries.com/v2/name/${country}`)
    .then(res => res.json())
    .then(data => {
      console.log(data[0]);
    })
    .catch(err => err.message);
};

getCountry('portugal');
getCountry('ecuador');
At that point I hadn't yet learned about async and await. Once I did, the following code works exactly the way I want it: each request waits until the previous one is done.
Is this the simplest way to do it? Are there any redundancies I could remove? I don't need a ton of alternate examples, unless I am doing something wrong.
const getCountry = async country => {
  await fetch(`https://restcountries.com/v2/name/${country}`)
    .then(res => res.json())
    .then(data => {
      console.log(data[0]);
    })
    .catch(err => err.message);
};

const getCountryData = async function () {
  await getCountry('portugal');
  await getCountry('ecuador');
};

getCountryData();
Thanks in advance,
Yes, that's the correct way to do it. Do realize, though, that you're running the requests one at a time, which is inefficient. As I mentioned, the beauty of JavaScript is its asynchronism, so take advantage of it: you can run all the requests almost concurrently, which speeds things up drastically. Take this example:
// get results...
const getCountry = async country => {
  const res = await fetch(`https://restcountries.com/v2/name/${country}`);
  const json = await res.json();
  return json;
};

const getCountryData = async countries => {
  const proms = countries.map(getCountry); // create an array of promises
  const res = await Promise.all(proms); // wait for all promises to complete
  // get the first value from each returned array
  return res.map(r => r[0]);
};

// demo:
getCountryData(['portugal', 'ecuador']).then(console.log);
// results come back in the order you listed the countries
getCountryData(['ecuador', 'portugal']).then(console.log);
// get lots of countries with speed
getCountryData(['mexico', 'china', 'france', 'germany', 'ecuador']).then(console.log);
Edit: I just realized that Promise.all keeps the results in the same order as the promises you pass in, so there's no need for an extra sort. Here's the sort fn anyway for reference, in case you take a different approach:
myArr.sort((a, b) =>
  countries.indexOf(a.name.toLowerCase()) > countries.indexOf(b.name.toLowerCase()) ? 1 :
  countries.indexOf(a.name.toLowerCase()) < countries.indexOf(b.name.toLowerCase()) ? -1 :
  0
);
I tried it the way #deceze recommended and it works fine: I removed all of the .then and replaced them with await. A lot cleaner this way. Now I can use normal try and catch blocks.
// GET COUNTRIES IN ORDER
const getCountry = async country => {
  try {
    const status = await fetch(`https://restcountries.com/v2/name/${country}`);
    const data = await status.json();
    renderCountry(data[0]); // Data is here. Now render HTML
  } catch (err) {
    console.log(err.name, err.message);
  }
};

const getCountryData = async function () {
  await getCountry('portugal');
  await getCountry('Ecuador');
};

btn.addEventListener('click', function () {
  getCountryData();
});
Thank you all.

JavaScript Promise inside async/await function resolve final array of responses

I'm quite a newbie in JavaScript and with promises.
I'm trying to build an array of objects that I get from an API.
To do so, I've built two functions in a file MyFile.js.
The first one returns a promise that resolves when an axios promise resolves. It's:
function get_items (url) {
  return new Promise((resolve, reject) => {
    let options = {
      baseURL: url,
      method: 'get'
    }
    axios(options)
      .then(response => {
        resolve(response.data)
      })
      .catch(error => {
        reject(error.stack)
      })
  })
}
The second one looks like this:
let output = []
let next_url = 'https://some_url.com/api/data'

async function get_data () {
  try {
    let promise = new Promise((resolve, reject) => {
      if (next_url) {
        get_items(next_url)
          .then(response => {
            output.push(...response.results)
            if (response.next) {
              next_url = response.next
              console.log('NEXT_URL HERE', next_url)
              get_data()
            } else {
              console.log('else')
              next_url = false
              get_data()
            }
          })
          .catch(error => {
            reject(error.stack)
          })
      } else {
        console.log('before resolve')
        resolve(output)
      }
    })
    return await promise
  } catch(e) {
    console.log(e)
  }
}
This is where I'm grinding my teeth.
What I think I understand of this function is that:
it's returning the value of a promise (that's what I understand return await promise to be doing);
it's a recursive function, so if there is a next_url the function continues on, but if there is not, it gets called one last time to go into the else part, where it resolves the array output containing the results (values, not states) of all the promises. At least, when I execute it and check the sanity checks I wrote with console.log, it works.
So, output is filled with data and that's great.
But, when I call this function from another file MyOtherFile.js, like this:
final_output = []
MyFile.get_data()
  .then(result => {
    console.log('getting data')
    final_output.push(...result)
  })
it never gets into the then part, and when I console.log MyFile.get_data(), it's a pending promise.
So, what I would like to do is make get_data() wait for all the promises' results (without using Promise.all(), so the calls run in series rather than in parallel - that would be great for performance, I guess?) and then be able to retrieve that response in the then part when calling this function from anywhere else.
Keep in mind that I'm really a newbie in promises and JavaScript in general (I'm more of a Python guy).
Let me know if my question isn't clear enough.
I've been scratching my head for two days now and it feels like I'm running in circle.
Thanks for being an awesome community!
This is a bit untested
const api_url = 'https://some_url.com/api/data';

get_data(api_url).then((results) => {
  console.log(results);
}).catch((error) => {
  // console.error(error);
});

function get_items (url) {
  const options = {
    baseURL: url,
    method: 'get'
  };
  return axios(options).then((response) => response.data);
}

async function get_data(next_url) {
  const output = [];
  while (next_url) {
    const { results, next } = await get_items(next_url);
    output.push(...results);
    next_url = next;
  }
  return output;
}
Basically it makes things a bit neater. I suggest looking at more examples with promises, their advantages, and when to use await/async. One thing to keep in mind: if you return a Promise, it will follow the entire then chain, and it will always return a Promise with the value of the last then.. if that makes sense :)
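To illustrate that last point, a small sketch using the same axios-style call as above (get_first_item is a hypothetical helper): the caller receives whatever the final .then returns.
function get_first_item(url) {
  return axios({ baseURL: url, method: 'get' })
    .then(response => response.data)  // value handed to the next .then
    .then(data => data.results[0]);   // this is what the caller's await/then receives
}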
There are a few problems. One is that you never resolve the initial Promise unless the else block is entered. Another is that you should return the recursive get_data call every time, so that it can be properly chained with the initial Promise. You may also consider avoiding the explicit promise construction antipattern - get_items already returns a Promise, so there's no need to construct another one (same for the inside of get_items, axios calls return Promises too).
You might consider a plain while loop, reassigning the next_url string until it's falsey:
function get_items (url) {
  const options = {
    baseURL: url,
    method: 'get'
  }
  // return the axios call, handle errors in the consumer instead:
  return axios(options)
    .then(res => res.data)
}
async function get_data() {
  const output = []
  let next_url = 'https://some_url.com/api/data'
  try {
    while (next_url) {
      const response = await get_items(next_url);
      output.push(...response.results)
      next_url = response.next;
    }
  } catch (e) {
    // handle errors *here*, perhaps
    console.log(e)
  }
  return output;
}
Note that .catch will result in a Promise being converted from a rejected Promise to a resolved one - you don't want to .catch everywhere, because that will make it difficult for the caller to detect errors.
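A quick illustration of that point:
Promise.reject(new Error('boom'))
  .catch(e => {
    console.log('handled:', e.message);
    // nothing is re-thrown or returned, so the chain continues as resolved
  })
  .then(value => console.log('resolved with:', value)); // logs "resolved with: undefined"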
Another way of doing it is to not use async at all and just recursively return a promise:
const getItems = (url) =>
  axios({
    baseURL: url,
    method: 'get',
  }).then((response) => response.data);

const getData = (initialUrl) => {
  const recur = (result, nextUrl) =>
    !nextUrl
      ? Promise.resolve(result)
      : getItems(nextUrl).then((data) =>
          recur(result.concat(data.results), data.next),
        );
  return recur([], initialUrl)
    .catch(e => Promise.reject(e.stack)); // reject with error stack
};
As CertainPerformance noted, you don't need to catch at every level; if you want getData to reject with error.stack you only need to catch it once.
However, if you had 100 next urls and 99 of them were fine but only the last one failed, would you like to reject in a way that keeps the results so far, so you can try again?
If you do then the code could look something like this:
const getData = (initialUrl) => {
  const recur = (result, nextUrl) =>
    !nextUrl
      ? Promise.resolve(result)
      : getItems(nextUrl)
          .catch(e => Promise.reject([e, result])) // reject with error and result so far
          .then((data) =>
            recur(result.concat(data.results), data.next),
          );
  return recur([], initialUrl); // do not catch here, just let it reject with error and result
};
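A hypothetical usage sketch for that variant - the caller can pull the partial results out of the rejection and decide what to do with them (initialUrl stands in for your starting API url):
getData(initialUrl)
  .then(allResults => console.log('done, pages:', allResults.length))
  .catch(([error, partialResults]) => {
    // partialResults holds everything fetched before the failing page
    console.error('failed after', partialResults.length, 'pages:', error);
  });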

Extracting array elements from JSON (javascript)

I'm trying to manipulate JSON data received from an API url (this is my first time handling this type of work)
The following function returns a promise of a 20 element array:
const articles = () => {
  return fetch(url)
    .then(res => res.json())
    .then(post => post.articles);
};
Console view: (screenshot of the returned array omitted)
Now, I'd like to extract the elements from the array - I tried something like:
articles()[0].name
but this doesn't work, and I'm not sure how else to go about it. Appreciate your help. Thanks
Your articles function returns a promise. You have to consume the promise (more on MDN):
articles().then(articleArray => {
  console.log(articleArray);
});
or within an async function:
const articleArray = await articles();
console.log(articleArray);
Side note: Your fetch code is missing a check for HTTP success (an HTTP failure isn't a rejection). You're far from the only person who misses this check, so much so that I've written a post on my anemic blog about it. With the check:
const articles = () => {
  return fetch(url)
    .then(res => {
      if (!res.ok) {
        throw new Error("HTTP error " + res.status);
      }
      return res.json();
    })
    .then(post => post.articles);
};
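With that version, a non-2xx response surfaces as an ordinary rejection, so the caller can handle it in the usual way (a small sketch, inside an async function):
try {
  const articleArray = await articles();
  console.log(articleArray[0].name);
} catch (err) {
  console.error('Fetch or HTTP error:', err);
}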

Bluebird Promise Order issue

I am watching videos to learn the MongoDB, Express.js, VueJS, Node.js (MEVN) stack.
I want to create a seed directory and also use promise functions.
// const delay = require('delay')
const Promise = require('bluebird')
const songs = require('./songs.json')
const users = require('./users.json')
const bookmarks = require('./bookmarks.json')
const historys = require('./history.json')

sequelize.sync({ force: true })
  .then(async function () {
    await Promise.all(
      users.map(user => {
        User.create(user)
      })
    )
    await Promise.all(
      songs.map(song => {
        Song.create(song)
      })
    )
    // I have to add this line
    // ---> await delay(1000)
    await Promise.all(
      bookmarks.map(bookmark => {
        Bookmark.create(bookmark)
      })
    )
    await Promise.all(
      historys.map(history => {
        History.create(history)
      })
    )
  })
I have four tables to seed, and the data for the last two tables must be created after the data for the first two tables (they reference them through foreign keys).
But every time I run this file, the data for the last two tables gets created first.
The only way I can prevent this is to add delay(1000) between them.
I am wondering if there is any efficient way to solve this issue~
Thank you.
Race conditions like this one are always caused by promises that weren't properly chained.
A promise should be returned from the map callback:
await Promise.all(
  users.map(user => User.create(user))
);
etc.
Not returning a value from map is virtually always a mistake. It can be prevented by using array-callback-return ESLint rule.
If User.create(user), etc. were Bluebird promises with default configuration, not chaining them would also result in this warning.
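For reference, enabling the array-callback-return rule mentioned above could look like this in an .eslintrc.js (a minimal sketch; merge it into your existing config):
module.exports = {
  rules: {
    // flags map/filter/etc. callbacks that don't return a value
    'array-callback-return': 'error'
  }
};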
My assumption about why your code might fail:
You're not returning the Promises that I guess /(User|Song|Bookmark|History).create/g return to the Promise.all() function, since your map callback is not returning anything.
If you're using arrow functions with curly brackets, then you need to explicitly specify the return value (using the familiar return keyword).
Otherwise you can just omit the curly brackets.
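In other words, a minimal sketch of both forms, using the User model from your code:
// with curly brackets, the value must be returned explicitly
users.map(user => {
  return User.create(user)
})
// without curly brackets, the expression is returned implicitly
users.map(user => User.create(user))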
My suggestion is that you refactor your code by utilizing Promise .then() chaining.
For your example, I would suggest something like this:
const Promise = require('bluebird')
const songs = require('./songs.json')
const users = require('./users.json')
const bookmarks = require('./bookmarks.json')
const histories = require('./history.json')

sequelize.sync({
  force: true
}).then(() =>
  Promise.all(
    users.map(user =>
      User.create(user)
    )
  ).then(() =>
    Promise.all(
      songs.map(song =>
        Song.create(song)
      )
    )
  ).then(() =>
    Promise.all(
      bookmarks.map(bookmark =>
        Bookmark.create(bookmark)
      )
    )
  ).then(() =>
    Promise.all(
      histories.map(history =>
        History.create(history)
      )
    )
  )
);

Using Promise.all() to fetch a list of urls with await statements

tl;dr - if you have to filter the promises (say for errored ones) don't use async functions
I'm trying to fetch a list of urls with async and parse them. The problem is that if there's an error with one of the urls when I'm fetching - let's say for some reason the api endpoint doesn't exist - the program crashes on the parsing with the obvious error:
UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): TypeError: ext is not iterable
I've tried checking if the res.json() is undefined, but obviously that's not it as it complains about the entire 'ext' array of promises not being iterable.
async function fetchAll() {
  let data
  let ext
  try {
    data = await Promise.all(urls.map(url => fetch(url)))
  } catch (err) {
    console.log(err)
  }
  try {
    ext = await Promise.all(data.map(res => {
      if (res.json()==! 'undefined') { return res.json()}
    }))
  } catch (err) {
    console.log(err)
  }
  for (let item of ext) {
    console.log(ext)
  }
}
Question 1:
How do I fix the above so it won't crash on an invalid address?
Question 2:
My next step is to write the extracted data to the database.
Assuming a data size of 2-5 MB of content, is my approach of using Promise.all() memory efficient? Or would it be more memory efficient to write a for loop which handles each fetch, then on the same iteration writes to the database, and only then handles the next fetch?
You have several problems with your code on a fundamental basis. We should address those in order, and the first is that you're not passing in any URLs!
async function fetchAll(urls) {
  let data
  let ext
  try {
    data = await Promise.all(urls.map(url => fetch(url)))
  } catch (err) {
    console.log(err)
  }
  try {
    ext = await Promise.all(data.map(res => {
      if (res.json()==! 'undefined') { return res.json()}
    }))
  } catch (err) {
    console.log(err)
  }
  for (let item of ext) {
    console.log(ext)
  }
}
First, you have several try catch blocks on DEPENDENT DATA. They should all be in a single try catch block:
async function fetchAll(urls) {
  try {
    let data = await Promise.all(urls.map(url => fetch(url)))
    let ext = await Promise.all(data.map(res => {
      // also fixed the ==! 'undefined'
      if (res.json() !== undefined) { return res.json()}
    }))
    for (let item of ext) {
      console.log(ext)
    }
  } catch (err) {
    console.log(err)
  }
}
Next is the problem that res.json() returns a promise wrapped around an object (when the body is parsable):
if (res.json() !== undefined) { return res.json()}
This is not how you should be using the .json() method. It will fail if there is no parsable JSON, so you should be putting a .catch on it:
async function fetchAll(urls) {
  try {
    let data = await Promise.all(urls.map(url => fetch(url).catch(err => err)))
    let ext = await Promise.all(data.map(res => res.json ? res.json().catch(err => err) : res))
    for (let item of ext) {
      console.log(ext)
    }
  } catch (err) {
    console.log(err)
  }
}
Now when it cannot fetch a URL, or parse a JSON you'll get the error and it will cascade down without throwing. Now your try catch block will ONLY throw if there is a different error that happens.
Of course this means we're putting an error handler on each promise and cascading the error, but that's not exactly a bad thing as it allows ALL of the fetches to happen and for you to distinguish which fetches failed. Which is a lot better than just having a generic handler for all fetches and not knowing which one failed.
But now we have it in a form where we can see that there are some better optimizations that can be performed on the code:
async function fetchAll(urls) {
  try {
    let ext = await Promise.all(
      urls.map(url => fetch(url)
        .then(r => r.json())
        .catch(error => ({ error, url }))
      )
    )
    for (let item of ext) {
      console.log(ext)
    }
  } catch (err) {
    console.log(err)
  }
}
Now with a much smaller footprint, better error handling, and readable, maintainable code, we can decide what we eventually want to return. Now the function can live wherever, be reused, and all it takes is a single array of simple GET URLs.
The next step is to do something with them, so we probably want to return the array, which will be wrapped in a promise. Realistically we want the error to bubble since we've handled each fetch error, so we should also remove the try catch. At that point, making the function async no longer helps and actively harms. Eventually we get a small function that groups all URL resolutions, or errors with their respective URL, that we can easily filter over, map over, and chain!
function fetchAll(urls) {
  return Promise.all(
    urls.map(url => fetch(url)
      .then(r => r.json())
      .then(data => ({ data, url }))
      .catch(error => ({ error, url }))
    )
  )
}
Now we get back an array of similar objects, each with the url it fetched, and either data or an error field! This makes chaining and inspecting SUPER easy.
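For example, splitting that result array afterwards could look like this (a sketch building on the fetchAll above; the URLs are placeholders):
fetchAll(['https://example.com/a.json', 'https://example.com/b.json'])
  .then(results => {
    const successes = results.filter(r => !r.error);
    const failures = results.filter(r => r.error);
    successes.forEach(({ url, data }) => console.log('ok:', url, data));
    failures.forEach(({ url, error }) => console.warn('failed:', url, error));
  });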
You are getting a TypeError: ext is not iterable - because ext is still undefined when you caught an error and did not assign an array to it. Trying to loop over it will then throw an exception that you do not catch.
I guess you're looking for
async function fetchAll() {
  try {
    const data = await Promise.all(urls.map(url => fetch(url)));
    const ext = await Promise.all(data.map(res => res.json()));
    for (let item of ext) {
      console.log(item);
    }
  } catch (err) {
    console.log(err);
  }
}
Instead of calling fetch(url) directly inside Promise.all, make your own function, customFetch, which calls fetch but maybe returns null, or an error object, instead of throwing.
something like
async function customFetch(url) {
  try {
    let result = await fetch(url);
    if (result.json) return await result.json();
  }
  catch (e) { return e }
}
if (res.json()==! 'undefined')
makes no sense whatsoever - res.json() is asynchronous and returns a promise, not the string 'undefined'. Remove that condition and just return res.json():
try {
  ext = await Promise.all(data.map(res => res.json()))
} catch (err) {
  console.log(err)
}
Whether or not your approach is "best" or "memory efficient" is up for debate. Ask another question for that.
You can have fetch and json not fail by catching the error and return a special Fail object that you will filter out later:
function Fail(reason){ this.reason = reason; }
const isFail = o => (o && o.constructor) === Fail;
const isNotFail = o => !isFail(o);

const fetchAll = () =>
  Promise.all(
    urls.map(
      url =>
        fetch(url)
          .then(response => response.json())
          .catch(error => new Fail([url, error]))
    )
  );

// how to use:
fetchAll()
  .then(
    results => {
      const successes = results.filter(isNotFail);
      const fails = results.filter(isFail);
      fails.forEach(
        e => console.log(`failed url:${e.reason[0]}, error:`, e.reason[1])
      )
    }
  )
As for question 2:
Depending on how many urls you have, you may want to throttle your requests, and if the urls come from a large file (gigabytes) you can use a stream combined with the throttle.
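One simple way to throttle without pulling in a library is to fetch in fixed-size batches; a rough sketch (the batch size is arbitrary):
async function fetchInBatches(urls, batchSize = 5) {
  const results = [];
  for (let i = 0; i < urls.length; i += batchSize) {
    const batch = urls.slice(i, i + batchSize);
    // at most batchSize requests are in flight at a time
    const settled = await Promise.all(
      batch.map(url =>
        fetch(url)
          .then(r => r.json())
          .catch(error => ({ error, url }))
      )
    );
    results.push(...settled);
  }
  return results;
}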
async function fetchAll(urls) {
  return Promise.all(
    urls.map(
      async (n) => fetch(n).then(r => r.json())
    )
  );
}

fetchAll([...])
  .then(d => console.log(d))
  .catch(e => console.error(e));
Will this work for you?
If you don't depend on every resource being a success, I would go back to basics and skip async/await.
I would process each fetch individually so I could catch the error for just the one that fails:
function fetchAll() {
  const result = []
  const que = urls.map(url =>
    fetch(url)
      .then(res => res.json())
      .then(item => {
        result.push(item)
      })
      .catch(err => {
        // couldn't fetch the resource or the
        // response was not a json response
      })
  )
  return Promise.all(que).then(() => result)
}
Something good #TKoL said:
Promise.all errors whenever one of the internal promises errors, so whatever advice anyone gives you here, it will boil down to -- Make sure that you wrap the promises in an error handler before passing them to Promise.all
Regarding question 1, please refer to this:
Handling errors in Promise.all
Promise.all is all or nothing. It resolves once all promises in the array resolve, or reject as soon as one of them rejects. In other words, it either resolves with an array of all resolved values, or rejects with a single error.
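A tiny demonstration of that all-or-nothing behaviour:
Promise.all([
  Promise.resolve(1),
  Promise.reject(new Error('nope')),
  Promise.resolve(3)
])
  .then(values => console.log(values))       // never runs
  .catch(err => console.error(err.message)); // logs "nope"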
