Variable is not storing Promise result in Node [duplicate] - javascript

This question already has answers here:
Why is my variable unaltered after I modify it inside of a function? - Asynchronous code reference
(7 answers)
How do I access previous promise results in a .then() chain?
(17 answers)
Closed 2 years ago.
I'm developing a script that connects to an API, does some operations on the JSON response, and then reformats the JSON to send it to another API.
But now I'm stuck on the first step, as my Promise is not working as expected. How can I store the API's response in a variable? For development purposes I stored one API response in a JSON file. This is my code:
declare var require: any;
let url = './data/big buy/big-bui-product-list.json';
const fs = require('fs');
let counter = 0;
const getProductList = () => {
    return new Promise((resolve, reject) => {
        fs.readFile(url, 'utf8', (err, data) => {
            if (err) {
                return reject(err);
            }
            else {
                return resolve(JSON.parse(data));
            }
        })
    })
}
const getProductStock = () => {
    return new Promise((resolve, reject) => {
        fs.readFile('./data/big buy/big-bui-product-stock.json', 'utf8', (err, data) => {
            if (err) {
                return reject(err);
            }
            else {
                return resolve(JSON.parse(data));
            }
        })
    })
}
try {
    let products;
    console.log('Products:');
    Promise.all([getProductList()])
        .then(function(result) {
            products = result[0];
        });
    console.log('Stocks:');
    const productStock = Promise.all([getProductStock()]);
    console.log(products);
}
catch(e) {
    console.log(`Ha ocurrido un error: ${e.message}`);
}
finally {
}
In this code, what I do is get a list of products and then get the stocks of all those products; later I will add a new function that filters by stock and gives me just the products whose stock is bigger than X units. Now when I launch it from the terminal I don't get the response stored in the products variable, but if I add .then((data) => console.log(data)) to the Promise I do see the JSON on screen. Since it isn't stored in any variable, I don't see how I can work with the objects I'm retrieving.

Promises are asynchronous. They are quite literally promises that a value will be yielded in the future. When you do getProductList().then(x => products = x), you're saying that JavaScript should fetch the product list in the background, and once it's finished, it should set the products variable to x. The key words there are "in the background", since the code afterwards keeps running while the product list is being fetched. The products variable is only guaranteed to be set after the .then portion is run. So, you can move things into the .then:
try {
    let products;
    getProductList().then(list => {
        products = list;
        return getProductStock(); // leverage promise chaining
    }).then(stocks => {
        console.log("Products:", products);
        console.log("Stocks:", stocks);
    });
}
catch(e) {
    console.log(`Ha ocurrido un error: ${e.message}`);
}
finally {
}

You seem to be missing some fundamental knowledge about promises. I suggest reading through the MDN Using Promises guide to familiarize yourself a bit with them.
A structure like below never works.
let variable;
promise.then(data => variable = data);
console.log(variable);
This is because it is executed in the following order (a working alternative is sketched after the list):
1. Create the variable variable.
2. Add a promise handler to the promise.
3. Log variable.
4. The promise gets resolved.
5. Execute the promise handler.
6. Set the variable variable to data.
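For contrast, a minimal sketch of a version that does work, assuming the code runs inside an async function so await is available (promise stands for any promise, as in the snippet above):
let variable;
variable = await promise; // wait for the promise to resolve before moving on
console.log(variable);    // logs the resolved value, not undefined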
If you are using Node.js 10 or higher you can use the promise API of the file system module (fs.promises, also exposed as the fs/promises module since Node 14) to simplify your code quite a bit.
const fs = require('fs/promises');

const readJSON = (path) => fs.readFile(path, "utf8").then((json) => JSON.parse(json));
const getProductList = () => readJSON("./data/big buy/big-bui-product-list.json");
const getProductStock = () => readJSON("./data/big buy/big-bui-product-stock.json");

Promise.all([
    getProductList(),
    getProductStock(),
]).then(([products, stocks]) => {
    console.log({ products, stocks });
}).catch((error) => {
    console.log("Ha ocurrido un error:", error.message);
}).finally(() => {
    // ...
});

Related

Iterating the creation of objects with asynchronous javascript map method data [duplicate]

This question already has answers here:
Using async/await with a forEach loop
(33 answers)
Use async await with Array.map
(9 answers)
Closed 27 days ago.
In an async IIFE at the bottom of this javascript, you'll see that I'm trying to: 1) read a JSON file, 2) get multiple RSS feed URLs from that data, 3) pull and parse the data from those feeds, and create an object with that data, so I can 4) write that pulled RSS data object to a JSON file. Everything for #1 and #2 is fine. I'm able to pull data from multiple RSS feeds in #3 (and log it to console), and I'm comfortable handling #4 when I get to that point later.
My problem is that, at the end of step #3, within the const parseFeed function, I am trying to create and push an object for that iteration of rssJSONValsArr.map() in the IIFE and it's not working. The rssFeedDataArr result is empty. Even though I am able to console.log those values, I can't create and push the new object I need in order to reach step #4. My creating of a similar object in #2 works fine, so I think it's the map I have to use within const parseFeed to pull the RSS data (using the rss-parser npm package) which is making object creation not work in step #3. How do I get rssFeedOject to work with the map data?
import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
import Parser from 'rss-parser';

const parser = new Parser();
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const feedsJSON = path.join(__dirname, 'rss-feeds-test.json');
const rssJSONValsArr = [];
const rssFeedDataArr = [];

const pullValues = (feedObject, i) => {
    const url = feedObject.feed.url;
    const jsonValsObject = {
        url: url,
    };
    rssJSONValsArr.push(jsonValsObject);
};

const parseFeed = async (url) => {
    try {
        const feed = await parser.parseURL(url);
        feed.items.forEach((item) => {
            console.log(`title: ${item.title}`); // correct
        });
        const rssFeedOject = {
            title: item.title,
        };
        rssFeedDataArr.push(rssFeedOject);
    } catch (err) {
        console.log(`parseFeed() ERROR 💥: ${err}`);
    }
};

(async () => {
    try {
        console.log('1: read feeds JSON file');
        const feedsFileArr = await fs.promises.readFile(feedsJSON, {
            encoding: 'utf-8',
        });
        const jsonObj = JSON.parse(feedsFileArr);
        console.log('2: get feed URLs');
        jsonObj.slice(0, 30).map(async (feedObject, i) => {
            await pullValues(feedObject, i);
        });
        console.log('rssJSONValsArr: ', rssJSONValsArr); // correct
        console.log('3: pull data from rss feeds');
        rssJSONValsArr.map(async (feedItem, i) => {
            await parseFeed(feedItem.url, i);
        });
        console.log('rssFeedDataArr: ', rssFeedDataArr); // empty !!!
        // console.log('4: write rss data to JSON file');
        // await fs.promises.writeFile(
        //     `${__dirname}/rss-bulk.json`,
        //     JSON.stringify(rssFeedDataArr)
        // );
        console.log('5: Done!');
    } catch (err) {
        console.log(`IIFE CATCH ERROR 💥: ${err}`);
    }
})();
Example JSON file with two RSS feed URLs:
[
    {
        "feed": {
            "details": {
                "name": "nodejs"
            },
            "url": "https://news.google.com/rss/search?q=nodejs"
        }
    },
    {
        "feed": {
            "details": {
                "name": "rss-parser"
            },
            "url": "https://news.google.com/rss/search?q=rss-parser"
        }
    }
]
Any and all help appreciated. Thanks
The problem is that you are printing rssFeedDataArr right after the .map call which, as stated in the comments, is being used incorrectly, since you are not using the returned value; forEach would be the way to go there. For every value in rssJSONValsArr you are calling an anonymous async function which in turn awaits parseFeed, so you are basically creating a Promise in each iteration, but those promises only resolve after your print statement has executed. You need to wait for all of those promises to be resolved before printing rssFeedDataArr. One way to do that, since you are creating a bunch of promises which can run in parallel, is to use Promise.all, like this:
await Promise.all(
    rssJSONValsArr.map(async (feedItem, i) => {
        await parseFeed(feedItem.url, i);
    })
)
and we can simplify it even more by returning the promise created by parseFeed directly:
await Promise.all(
    rssJSONValsArr.map((feedItem, i) => parseFeed(feedItem.url, i))
)
And in this case the right method is map, not forEach.
In the case of rssJSONValsArr it works because the call to pullValues resolves instantly: it doesn't actually run asynchronously, even though it's declared async, because there is no await inside the function definition.
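To make that last point concrete, here is a minimal sketch (hypothetical helper names, not from the question): an async function with no await runs its whole body synchronously, and only a function that actually awaits something defers the rest of its body.
const pushNow = async (arr) => {
    arr.push(1); // no await: this runs before the console.log below
};

const pushLater = async (arr) => {
    await Promise.resolve(); // everything after this runs in a later microtask
    arr.push(1);
};

const a = [];
const b = [];
pushNow(a);
pushLater(b);
console.log(a.length, b.length); // 1 0 — only the body with an await is deferred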

Making multiple web api calls synchronously without nesting in Node.js with Axios

Is there any way I can make the below code run synchronously in a way where I can get all of the productLine ids and then loop through and delete all of them, then once all of this is complete, get all of the productIds and then loop through and delete all of them?
I really want to be able to delete each set of items in batch, but the next section can't run until the first section is complete or there will be referential integrity issues.
// Delete Product Lines
axios.get('https://myapi.com/ProductLine?select=id')
    .then(function (response) {
        const ids = response.data.value
        ids.forEach(id => {
            axios.delete('https://myapi.com/ProductLine/' + id)
        })
    })
    .catch(function (error) {
    })

// Delete Products (I want to ensure this runs after the above code)
axios.get('https://myapi.com/Product?select=id')
    .then(function (response) {
        const ids = response.data.value
        ids.forEach(id => {
            axios.delete('https://myapi.com/Product/' + id)
        })
    })
    .catch(function (error) {
    })
There's a lot of duplication in your code. To reduce code duplication, you can create a helper function that can be called with appropriate arguments and this helper function will contain code to delete product lines and products.
async function deleteHelper(getURL, deleteURL) {
    const response = await axios.get(getURL);
    const ids = response.data.value;
    return Promise.all(ids.map(id => (
        axios.delete(deleteURL + id)
    )));
}
With this helper function, now your code will be simplified and will be without code duplication.
Now to achieve the desired result, you could use one of the following ways:
Instead of two separate promise chains, use only one promise chain that deletes product lines and then deletes products.
const prodLineGetURL = 'https://myapi.com/ProductLine?select=id';
const prodLineDeleteURL = 'https://myapi.com/ProductLine/';

deleteHelper(prodLineGetURL, prodLineDeleteURL)
    .then(function () {
        const prodGetURL = 'https://myapi.com/Product?select=id';
        const prodDeleteURL = 'https://myapi.com/Product/';
        return deleteHelper(prodGetURL, prodDeleteURL); // return it so errors reach the catch below
    })
    .catch(function (error) {
        // handle error
    });
Use async-await syntax.
// "delete" is a reserved word, so give the function a different name
async function deleteAll() {
    try {
        const urls = [
            [ prodLineGetURL, prodLineDeleteURL ],
            [ prodGetURL, prodDeleteURL ]
        ];
        for (const [getURL, deleteURL] of urls) {
            await deleteHelper(getURL, deleteURL);
        }
    } catch (error) {
        // handle error
    }
}
One other thing that you could improve in your code is to use Promise.all instead of the forEach() method to make the delete requests; the code above already does this inside the deleteHelper function.
Your code (and all other answers) executes the delete requests sequentially, which is a huge waste of time. You should use Promise.all() and execute them in parallel...
// Delete Product Lines
axios.get('https://myapi.com/ProductLine?select=id')
    .then(function (response) {
        const ids = response.data.value
        // execute all delete requests in parallel
        Promise.all(
            ids.map(id => axios.delete('https://myapi.com/ProductLine/' + id))
        ).then(
            // all delete requests are finished
        );
    })
    .catch(function (error) {
    })
All HTTP requests are asynchronous, but you can make the code read sync-like. How? Using async-await.
Suppose you have a function called retrieveProducts; you need to make that function async and then await each response before you keep processing.
Leaving it as:
const retrieveProducts = async () => {
    // Delete Product Lines
    const response = await axios.get('https://myapi.com/ProductLine?select=id')
    const ids = response.data.value
    ids.forEach(id => {
        axios.delete('https://myapi.com/ProductLine/' + id)
    })

    // Delete Products (runs after the code above)
    const otherResponse = await axios.get('https://myapi.com/Product?select=id')
    const otherIds = otherResponse.data.value
    otherIds.forEach(id => {
        axios.delete('https://myapi.com/Product/' + id)
    })
}
But just keep in mind that it's not truly synchronous; it is still async.
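If you also need the function to wait for the deletes themselves, and to run each batch in parallel as the other answers suggest, a minimal sketch could look like this (same endpoints as in the question; the function name is made up):
const deleteLinesThenProducts = async () => {
    // delete all product lines in parallel and wait for every request to finish
    const lineResponse = await axios.get('https://myapi.com/ProductLine?select=id')
    await Promise.all(
        lineResponse.data.value.map(id => axios.delete('https://myapi.com/ProductLine/' + id))
    )

    // only then delete the products, also in parallel
    const productResponse = await axios.get('https://myapi.com/Product?select=id')
    await Promise.all(
        productResponse.data.value.map(id => axios.delete('https://myapi.com/Product/' + id))
    )
}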

Returning data from a Fetch api to build a UI [duplicate]

This question already has answers here:
How do I return the response from an asynchronous call?
(41 answers)
Closed 2 years ago.
First time using Stack Overflow, sorry if I do something wrong.
I'm building a simple app that allows you to enter a characters name from Rick and Morty, and then build a small card that displays that characters name, a pic of the char, their status, and their current location.
I can currently get the input from the input field and send that data to the API no problem. My issue is getting the data back from the API and setting it to a variable so I can then extract each individual piece of info from the response, e.g. character.name, character.location, etc.
Here's my JavaScript so far:
// UI variables
const input = document.getElementById("searchChar");

// Event Listeners
input.addEventListener("keyup", (e) => {
    const userText = e.target.value;
    if (userText !== "") {
        let response = fetch(
            `https://rickandmortyapi.com/api/character/?name=${userText}`
        )
            .then((res) => res.json())
            .then((data) => console.log(data));
    }
});
I can't return the 'data' or set it to a variable from the .then(data), so how do I get it to use it elsewhere in my code?
Any help is appreciated. I'm new to the Fetch API. Thank you so much!
Updated the answer based on the suggestions in the comments below.
Many people wrap fetch in a new Promise, which is an anti-pattern: What is the explicit promise construction antipattern and how do I avoid it?
Not to use: wrapping your fetch call in a promise and returning that promise, as in this example:
function testfunction(arg1, arg2) {
    return new Promise((resolve, reject) => {
        fetch(apiPath)
            .then(resp => {
                if (resp.ok) return resp.json()
            })
            .then(resp => resolve(resp))
            .catch(err => {
                reject(err)
            })
    })
}
Calling the same from anywhere in your application:
testfunction(arg1, arg2)
    .then(result => {
        // here you can use the API result
    })
    .catch(error => {
        throw error;
    });
This is how you should write your fetch API method instead, so you can use it in all the places:
function testfunction(arg1, arg2) {
    // just return the promise fetch already gives you;
    // errors propagate to the caller's .catch
    return fetch(apiPath)
        .then(resp => {
            if (resp.ok) return resp.json()
        })
}
This is how you can use it across the application:
testfunction(arg1, arg2)
    .then(result => {
        // here you can use the API result
    })
    .catch(error => {
        throw error;
    });

Defining an asynchronous function and using it in a forEach loop

I am trying to compile tex files into PDFs using data from a Firestore database. After completion the script doesn't terminate. I found one can use process.exit() to make it quit; however, that terminates the child processes that are still compiling the tex files. So, I need an asynchronous function.
The guides I found on how to write one did not particularly help me. I am still very new to JavaScript and any bloat is still highly confusing to me.
Prepending the makePDF function below with async and the call of the function with await does not work and is, to my understanding, outdated.
I tried implementing a promise, but I don't understand how their syntax works. Does simply appending .then() to the function call in the loop suffice to make the loop wait for the function's completion?
How do I make this specific function asynchronous?
Does it matter that it already uses asynchronous functions in its body?
I understand that a promise is used to return what ever the producer has produced to a consumer. Now, my producer doesn't produce anything to be returned. Does this matter at all?
My function called from the for loop:
function makePDF(object){
    let input = fs.readFileSync('main.tex', 'utf8');
    const outNameTex = object.ID + '.tex';
    const outNamePDF = object.ID + '.pdf';
    makeTEX(object, input, outNameTex);
    const infile = fs.createReadStream(outNameTex);
    const outfile = fs.createWriteStream(outNamePDF);
    const pdf = latex(infile);
    pdf.pipe(outfile);
    pdf.on('error', err => console.error(err));
    pdf.on('finish', () => { console.log('PDF generated!') });
}
And my function with the loop:
firebase.auth().onAuthStateChanged((user) => {
    if (user) {
        console.log('user');
        db.collection('objects').where('printed', '==', false).get().then((snapshot) => {
            snapshot.forEach((doc) => {
                console.table(doc.data());
                makePDF(doc.data());
            })
            process.exit();
        })
        .catch((err) => {
            console.log('Error getting documents', err);
        });
    } else {
        console.log('no user');
    }
});
It outputs a table for each document, but no PDF generated.
async/await can be tricky to use with for loops; that is because async functions return a promise. If you convert the async/await syntax to native promise syntax you might figure out what the issue is.
What you want to do is use Array.map to map/convert each doc to a promise that resolves once makePDF is done, then use Promise.all to wait for all the promises to resolve.
The solution should look something like this
function makePDF(object) {
    return new Promise((resolve, reject) => {
        let input = fs.readFileSync('main.tex', 'utf8');
        const outNameTex = object.ID + '.tex';
        const outNamePDF = object.ID + '.pdf';
        makeTEX(object, input, outNameTex);
        const infile = fs.createReadStream(outNameTex);
        const outfile = fs.createWriteStream(outNamePDF);
        const pdf = latex(infile);
        pdf.pipe(outfile);
        pdf.on('error', reject);
        pdf.on('finish', () => { console.log('PDF generated!'); resolve(); });
    });
}
firebase.auth().onAuthStateChanged((user) => {
    if (user) {
        console.log('user');
        db.collection('objects').where('printed', '==', false).get().then((snapshot) => {
            const promiseArr = snapshot.docs.map((doc) => {
                console.table(doc.data());
                return makePDF(doc.data());
            })
            Promise.all(promiseArr)
                .then(() => {
                    process.exit();
                })
        })
        .catch((err) => {
            console.log('Error getting documents', err);
        });
    } else {
        console.log('no user');
    }
});

Using Promise.all() to fetch a list of urls with await statements

tl;dr - if you have to filter the promises (say for errored ones) don't use async functions
I'm trying to fetch a list of urls with async and parse them. The problem is that if there's an error with one of the urls when I'm fetching - let's say for some reason the api endpoint doesn't exist - the program crashes on the parsing with the obvious error:
UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): TypeError: ext is not iterable
I've tried checking if the res.json() is undefined, but obviously that's not it as it complains about the entire 'ext' array of promises not being iterable.
async function fetchAll() {
    let data
    let ext
    try {
        data = await Promise.all(urls.map(url => fetch(url)))
    } catch (err) {
        console.log(err)
    }
    try {
        ext = await Promise.all(data.map(res => {
            if (res.json() ==! 'undefined') { return res.json() }
        }))
    } catch (err) {
        console.log(err)
    }
    for (let item of ext) {
        console.log(ext)
    }
}
Question 1:
How do I fix the above so it won't crash on an invalid address?
Question 2:
My next step is to write the extracted data to the database.
Assuming the data size of 2-5mgb of content, is my approach of using Promise.all() memory efficient? Or will it be more memory efficient and otherwise to write a for loop which handles each fetch then on the same iteration writes to the database and only then handles the next fetch?
You have several problems with your code on a fundamental basis. We should address those in order, and the first is that you're not passing in any URLs!
async function fetchAll(urls) {
    let data
    let ext
    try {
        data = await Promise.all(urls.map(url => fetch(url)))
    } catch (err) {
        console.log(err)
    }
    try {
        ext = await Promise.all(data.map(res => {
            if (res.json() ==! 'undefined') { return res.json() }
        }))
    } catch (err) {
        console.log(err)
    }
    for (let item of ext) {
        console.log(ext)
    }
}
First, you have several try catch blocks on DEPENDENT DATA. They should all be in a single try catch block:
async function fetchAll(urls) {
    try {
        let data = await Promise.all(urls.map(url => fetch(url)))
        let ext = await Promise.all(data.map(res => {
            // also fixed the ==! 'undefined'
            if (res.json() !== undefined) { return res.json() }
        }))
        for (let item of ext) {
            console.log(ext)
        }
    } catch (err) {
        console.log(err)
    }
}
Next is the problem that res.json() returns a promise wrapped around an object if it exists
if (res.json() !== undefined) { return res.json()}
This is not how you should be using the .json() method. It will fail if there is no parsable json. You should be putting a .catch on it
async function fetchAll(urls) {
    try {
        let data = await Promise.all(urls.map(url => fetch(url).catch(err => err)))
        let ext = await Promise.all(data.map(res => res.json ? res.json().catch(err => err) : res))
        for (let item of ext) {
            console.log(ext)
        }
    } catch (err) {
        console.log(err)
    }
}
Now when it cannot fetch a URL, or parse a JSON you'll get the error and it will cascade down without throwing. Now your try catch block will ONLY throw if there is a different error that happens.
Of course this means we're putting an error handler on each promise and cascading the error, but that's not exactly a bad thing as it allows ALL of the fetches to happen and for you to distinguish which fetches failed. Which is a lot better than just having a generic handler for all fetches and not knowing which one failed.
But now we have it in a form where we can see that there are some better optimizations that can be performed on the code:
async function fetchAll(urls) {
    try {
        let ext = await Promise.all(
            urls.map(url => fetch(url)
                .then(r => r.json())
                .catch(error => ({ error, url }))
            )
        )
        for (let item of ext) {
            console.log(ext)
        }
    } catch (err) {
        console.log(err)
    }
}
Now with a much smaller footprint, better error handling, and readable, maintainable code, we can decide what we eventually want to return. Now the function can live wherever, be reused, and all it takes is a single array of simple GET URLs.
The next step is to do something with them, so we probably want to return the array, which will be wrapped in a promise. Realistically we want any remaining error to bubble up, since we've handled each fetch error, so we should also remove the try catch. At that point making the function async no longer helps, and actively harms. Eventually we get a small function that groups all URL resolutions, or errors, with their respective URL, that we can easily filter over, map over, and chain!
function fetchAll(urls) {
    return Promise.all(
        urls.map(url => fetch(url)
            .then(r => r.json())
            .then(data => ({ data, url }))
            .catch(error => ({ error, url }))
        )
    )
}
Now we get back an array of similar objects, each with the url it fetched, and either data or an error field! This makes chaining and inspecting SUPER easy.
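As a usage sketch (the URLs are placeholders; the data, error, and url fields come from the function above):
fetchAll(['https://example.com/a.json', 'https://example.com/b.json'])
    .then(results => {
        const successes = results.filter(r => !r.error);
        const failures = results.filter(r => r.error);
        successes.forEach(({ url, data }) => console.log('ok:', url, data));
        failures.forEach(({ url, error }) => console.log('failed:', url, error.message));
    });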
You are getting a TypeError: ext is not iterable - because ext is still undefined when you caught an error and did not assign an array to it. Trying to loop over it will then throw an exception that you do not catch.
I guess you're looking for
async function fetchAll() {
    try {
        const data = await Promise.all(urls.map(url => fetch(url)));
        const ext = await Promise.all(data.map(res => res.json()));
        for (let item of ext) {
            console.log(item);
        }
    } catch (err) {
        console.log(err);
    }
}
Instead of fetch(url) on line 5, make your own function, customFetch, which calls fetch but maybe returns null, or an error object, instead of throwing.
something like
async function customFetch(url) {
    try {
        let result = await fetch(url);
        if (result.json) return await result.json();
    }
    catch (e) { return e }
}
The condition if (res.json() ==! 'undefined') makes no sense whatsoever; res.json() is asynchronous and returns a promise. Remove that condition and just return res.json():
try {
    ext = await Promise.all(data.map(res => res.json()))
} catch (err) {
    console.log(err)
}
Whether or not your approach is "best" or "memory efficient" is up for debate. Ask another question for that.
You can keep fetch and json from failing the whole batch by catching the error and returning a special Fail object that you will filter out later:
function Fail(reason){ this.reason = reason; };
const isFail = o => (o && o.constructor) === Fail;
const isNotFail = o => !isFail(o);

const fetchAll = () =>
    Promise.all(
        urls.map(
            url =>
                fetch(url)
                    .then(response => response.json())
                    .catch(error => new Fail([url, error]))
        )
    );

//how to use:
fetchAll()
    .then(
        results => {
            const successes = results.filter(isNotFail);
            const fails = results.filter(isFail);
            fails.forEach(
                e => console.log(`failed url:${e.reason[0]}, error:`, e.reason[1])
            )
        }
    )
As for question 2:
Depending on how many urls you have, you may want to throttle your requests, and if the urls come from a large file (gigabytes) you can use a stream combined with the throttle; a rough batching sketch is shown below.
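As a rough illustration of throttling (the batch size and function name are made up for this sketch, not part of the answer above), you can process the urls in fixed-size batches so only a few requests are in flight at a time, reusing the Fail marker from before:
async function fetchInBatches(urls, batchSize = 5) {
    const results = [];
    for (let i = 0; i < urls.length; i += batchSize) {
        const batch = urls.slice(i, i + batchSize);
        // wait for the whole batch before starting the next one
        const settled = await Promise.all(
            batch.map(url =>
                fetch(url)
                    .then(response => response.json())
                    .catch(error => new Fail([url, error]))
            )
        );
        results.push(...settled);
    }
    return results;
}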
async function fetchAll(url) {
    return Promise.all(
        url.map(
            async (n) => fetch(n).then(r => r.json())
        )
    );
}

fetchAll([...])
    .then(d => console.log(d))
    .catch(e => console.error(e));
Will this work for you?
If you don't depend on every resource being a success, I would go back to basics and skip async/await.
I would process each fetch individually so I could catch the error for just the one that fails:
function fetchAll() {
    const result = []
    const que = urls.map(url =>
        fetch(url)
            .then(res => res.json())
            .then(item => {
                result.push(item)
            })
            .catch(err => {
                // couldn't fetch the resource or the
                // response was not a json response
            })
    )
    return Promise.all(que).then(() => result)
}
Something good #TKoL said:
Promise.all errors whenever one of the internal promises errors, so whatever advice anyone gives you here, it will boil down to -- Make sure that you wrap the promises in an error handler before passing them to Promise.all
Regarding question 1, please refer to this:
Handling errors in Promise.all
Promise.all is all or nothing. It resolves once all promises in the array resolve, or reject as soon as one of them rejects. In other words, it either resolves with an array of all resolved values, or rejects with a single error.
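A minimal sketch of that difference (the urls are placeholders for illustration):
const urls = ['https://example.com/ok.json', 'https://example.com/missing.json'];

// All or nothing: one rejection rejects the whole Promise.all
Promise.all(urls.map(url => fetch(url).then(r => r.json())))
    .then(all => console.log('every fetch succeeded', all))
    .catch(err => console.log('at least one fetch failed', err));

// Wrapping each promise in its own handler: Promise.all always resolves,
// and failed entries are marked so they can be filtered out later
Promise.all(urls.map(url =>
    fetch(url).then(r => r.json()).catch(error => ({ url, error }))
)).then(results => {
    const failed = results.filter(r => r && r.error);
    console.log(`${failed.length} of ${results.length} requests failed`);
});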
