Defining an asynchronous function and using it in a forEach loop - javascript

I am trying to compile tex files into PDF using data from a Firestore database. After completion the script doesn't terminate. I found one can use process.exit() to make it quit. However, that terminates the child processes still compiling the tex files. So, I need an asynchronous function.
The guides I found on how to write them did not particularly help me. I am still very new to JavaScript and any bloat is still highly confusing to me.
Prepending async to the makePDF function below and await to its call does not work and is, to my understanding, outdated.
I tried implementing a promise, but I don't understand how the syntax works. Does simply appending .then() to the function call in the for loop suffice to make the loop wait for the function's completion?
How do I make this specific function asynchronous?
Does it matter that it already uses asynchronous functions in its body?
I understand that a promise is used to return whatever the producer has produced to a consumer. Now, my producer doesn't produce anything to be returned. Does this matter at all?
My function called from the for loop:
function makePDF(object){
    let input = fs.readFileSync('main.tex', 'utf8');
    const outNameTex = object.ID + '.tex';
    const outNamePDF = object.ID + '.pdf';
    makeTEX(object, input, outNameTex);
    const infile = fs.createReadStream(outNameTex);
    const outfile = fs.createWriteStream(outNamePDF);
    const pdf = latex(infile);
    pdf.pipe(outfile);
    pdf.on('error', err => console.error(err));
    pdf.on('finish', () => { console.log('PDF generated!'); });
}
And my function with the loop:
firebase.auth().onAuthStateChanged((user) => {
    if (user) {
        console.log('user');
        db.collection('objects').where('printed', '==', false).get().then((snapshot) => {
            snapshot.forEach((doc) => {
                console.table(doc.data());
                makePDF(doc.data());
            });
            process.exit();
        })
        .catch((err) => {
            console.log('Error getting documents', err);
        });
    } else {
        console.log('no user');
    }
});
It outputs a table for each document, but no PDF is generated.

async/await can be tricky to use with for loops because async functions return a promise... if you convert the async/await syntax to native promise syntax, you might figure out what the issue is.
What you want to do is use Array.map to map/convert each doc to a promise that resolves once makePDF is done, then use Promise.all to wait for all the promises to resolve.
The solution should look something like this
function makePDF(object){
    return new Promise((resolve, reject) => {
        let input = fs.readFileSync('main.tex', 'utf8');
        const outNameTex = object.ID + '.tex';
        const outNamePDF = object.ID + '.pdf';
        makeTEX(object, input, outNameTex);
        const infile = fs.createReadStream(outNameTex);
        const outfile = fs.createWriteStream(outNamePDF);
        const pdf = latex(infile);
        pdf.pipe(outfile);
        pdf.on('error', reject);
        pdf.on('finish', () => { console.log('PDF generated!'); resolve(); });
    });
}
firebase.auth().onAuthStateChanged((user) => {
    if (user) {
        console.log('user');
        db.collection('objects').where('printed', '==', false).get().then((snapshot) => {
            const promiseArr = snapshot.docs.map((doc) => {
                console.table(doc.data());
                return makePDF(doc.data());
            });
            Promise.all(promiseArr)
                .then(() => {
                    process.exit();
                });
        })
        .catch((err) => {
            console.log('Error getting documents', err);
        });
    } else {
        console.log('no user');
    }
});
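If you would rather stay with async/await, a minimal sketch of the same idea (assuming the promisified makePDF above) replaces the forEach with a for...of loop, which, unlike forEach, actually waits at each await. Note that this variant compiles the PDFs one at a time instead of in parallel:
db.collection('objects').where('printed', '==', false).get().then(async (snapshot) => {
    // Sequential variant: each PDF finishes before the next one starts.
    for (const doc of snapshot.docs) {
        console.table(doc.data());
        await makePDF(doc.data());
    }
    process.exit();
});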

Related

Learning Promises, Async/Await to control execution order

I have been studying promises, await, and async functions. While I was still in the stage of learning promises, I realized the following: when I would send out two requests, there was no guarantee that they would come back in the order they are written in the code. Of course, that is to be expected with network routing and packets. When I ran the code below, the requests would resolve in no specific order.
const getCountry = async country => {
    await fetch(`https://restcountries.com/v2/name/${country}`)
        .then(res => res.json())
        .then(data => {
            console.log(data[0]);
        })
        .catch(err => err.message);
};
getCountry('portugal');
getCountry('ecuador');
At this point, I hadn't learned about async and await. Now, the following code works exactly the way I want it: each request waits until the other one is done.
Is this the simplest way to do it? Are there any redundancies that I could remove? I don't need a ton of alternate examples; unless I am doing something wrong.
const getCountry = async country => {
    await fetch(`https://restcountries.com/v2/name/${country}`)
        .then(res => res.json())
        .then(data => {
            console.log(data[0]);
        })
        .catch(err => err.message);
};
const getCountryData = async function () {
    await getCountry('portugal');
    await getCountry('ecuador');
};
getCountryData();
Thanks in advance,
Yes, that's the correct way to do so. Realize, though, that you're awaiting each request, so they run one at a time, which is inefficient. As I mentioned, the beauty of JavaScript is its asynchronism, so take advantage of it. You can run all the requests almost concurrently, speeding your requests up drastically. Take this example:
// get results...
const getCountry = async country => {
    const res = await fetch(`https://restcountries.com/v2/name/${country}`);
    const json = res.json();
    return json;
};
const getCountryData = async countries => {
    const proms = countries.map(getCountry); // create an array of promises
    const res = await Promise.all(proms); // wait for all promises to complete
    // get the first value from the returned array
    return res.map(r => r[0]);
};
// demo:
getCountryData(['portugal', 'ecuador']).then(console.log);
// it orders by the countries you ordered
getCountryData(['ecuador', 'portugal']).then(console.log);
// get lots of countries with speed
getCountryData(['mexico', 'china', 'france', 'germany', 'ecuador']).then(console.log);
Edit: I just realized that Promise.all preserves the order of the input array for you, so there's no need to add an extra sort function. Here's the sort fn anyway for reference, if you take a different approach:
myArr.sort((a, b) =>
    (countries.indexOf(a.name.toLowerCase()) > countries.indexOf(b.name.toLowerCase())) ? 1 :
    (countries.indexOf(a.name.toLowerCase()) < countries.indexOf(b.name.toLowerCase())) ? -1 :
    0
);
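To see that order-preservation point concretely, here is a tiny sketch (with made-up delays) showing that Promise.all resolves to values in the same order as the input array, regardless of which promise settles first:
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms));
// The second promise settles first, but the result keeps the input order.
Promise.all([delay(300, 'portugal'), delay(100, 'ecuador')])
    .then(console.log); // ['portugal', 'ecuador']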
I tried it the way #deceze recommended and it works fine: I removed all of the .then and replaced them with await. A lot cleaner this way. Now I can use normal try and catch blocks.
// GET COUNTRIES IN ORDER
const getCountry = async country => {
    try {
        const status = await fetch(`https://restcountries.com/v2/name/${country}`);
        const data = await status.json();
        renderCountry(data[0]); // Data is here. Now render HTML
    } catch (err) {
        console.log(err.name, err.message);
    }
};
const getCountryData = async function () {
    await getCountry('portugal');
    await getCountry('Ecuador');
};
btn.addEventListener('click', function () {
    getCountryData();
});
Thank you all.

Variable is not storing Promise result in Node [duplicate]

This question already has answers here:
Why is my variable unaltered after I modify it inside of a function? - Asynchronous code reference
(7 answers)
How do I access previous promise results in a .then() chain?
(17 answers)
Closed 2 years ago.
I'm developing a script that connects to an API, then does some operations with the JSON response, and then reformats the JSON to send it to another API.
But now I'm stuck on the first step, as my Promise is not working as expected. How can I store the API's response in a variable? For development purposes I stored one API response in a JSON file. This is my code:
declare var require: any;
let url = './data/big buy/big-bui-product-list.json';
const fs = require('fs');
let counter = 0;
const getProductList = () => {
    return new Promise((resolve, reject) => {
        fs.readFile(url, 'utf8', (err, data) => {
            if (err) {
                return reject(err);
            }
            else {
                return resolve(JSON.parse(data));
            }
        })
    })
}
const getProductStock = () => {
    return new Promise((resolve, reject) => {
        fs.readFile('./data/big buy/big-bui-product-stock.json', 'utf8', (err, data) => {
            if (err) {
                return reject(err);
            }
            else {
                return resolve(JSON.parse(data));
            }
        })
    })
}
try {
let products;
console.log('Products:');
Promise.all([getProductList()])
.then(function(result) {
products = result[0];
});
console.log('Stocks:');
const productStock = Promise.all([getProductStock()]);
console.log(products);
}
catch(e) {
console.log((`Ha ocurrido un error: ${e.message}`));
}
finally {
}
In this code, what I do is get a list of products and then get the stocks of all the products; later I will add a new function that filters by stock and gets me just a list of products where stock is bigger than X units. Now when I launch it from the terminal I don't get the response stored in the products variable, but if I add .then((data) => console.log(data)) to the Promise I see the JSON on screen. Since I haven't stored it in any variable, though, I don't see how I can work with the objects I'm retrieving.
Promises are asynchronous. They are quite literally promises that a value will be yielded in the future. When you do getProductList().then(x => products = x), you're saying that Javascript should fetch the product list in the background, and once it's finished, it should set the products variable to x. The key words there are "in the background", since the code afterwards keeps running while the product list is being fetched. The products variable is only guaranteed to be set after the .then portion is run. So, you can move things into the .then:
try {
let products;
getProductList().then(list => {
products = list;
return getProductStock(); // leverage promise chaining
}).then(stocks => {
console.log("Products:", products);
console.log("Stocks:", stocks);
});
}
catch(e) {
console.log((`Ha ocurrido un error: ${e.message}`));
}
finally {
}
You seem to be missing some fundamental knowledge about promises. I suggest reading through the MDN Using Promises guide to familiarize yourself a bit with them.
A structure like below never works.
let variable;
promise.then(data => variable = data);
console.log(variable);
This is because it is executed in the following manner:
1. Create the variable variable.
2. Add a promise handler to the promise.
3. Log variable.
4. Promise gets resolved.
5. Execute the promise handler.
6. Set variable variable to data.
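In other words, the log runs before the handler ever fires. As a minimal sketch (assuming the same promise as above), consuming the value only after the promise settles, for example with await inside an async function, avoids the problem:
async function main() {
    const variable = await promise; // waits until the promise settles
    console.log(variable);          // variable now holds the resolved data
}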
If you are using Node.js 10 or higher you can use the promise API of file system to simplify your code quite a bit.
const fs = require('fs/promises');
const readJSON = (path) => fs.readFile(path, "utf8").then((json) => JSON.parse(json));
const getProductList = () => readJSON("./data/big buy/big-bui-product-list.json");
const getProductStock = () => readJSON("./data/big buy/big-bui-product-stock.json");
Promise.all([
getProductList(),
getProductStock(),
]).then(([products, stocks]) => {
console.log({ products, stocks });
}).catch((error) => {
console.log("Ha ocurrido un error:", error.message);
}).finally(() => {
// ...
});
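If you prefer async/await on top of the same helpers, a roughly equivalent sketch (assuming the readJSON, getProductList and getProductStock functions above) would be:
(async () => {
    try {
        // Start both reads, then wait for both to finish.
        const [products, stocks] = await Promise.all([getProductList(), getProductStock()]);
        console.log({ products, stocks });
    } catch (error) {
        console.log("Ha ocurrido un error:", error.message);
    } finally {
        // ...
    }
})();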

Returning PromiseValue when creating an object

Ok, so I've searched around and found nothing related to this problem.
My problem is something like this ->
I create an object (pushed into an array) with some info taken from an API. After getting the info from that API I need to call yet another API to get further information on users. Since there are multiple user keys, I'd like to be able to set them inline with a simple function.
I'm doing something like this ->
_item.push({
    Author: setPeople(item.Author.Title),
    Title: item.Title,
    ....
    Requester: setPeople(item.Requester.Title)
})
At the moment I am getting the whole Promise and not the PromiseValue. I know you usually do something like setPeople(name).then(() => {}), however that is not working in my object (the key gets set too fast).
Any tip on how I should approach this?
Updating with more code.
export const retrieveIrfItems = async (spId) => {
    let spQuery = "SITE" + SpQueryExtend1 + spQueryExpand;
    return new Promise((resolve, reject) => {
        let _items = [];
        axiosApi.get(SiteUrl + spQuery).then((response) => {
            //console.log(response.data.d);
            return response.data.d;
        }).then(async (item) => {
            //let requesterSP = setPeople()
            const createSPUser = async (user) => {
                let spUser;
                console.log("User prop is");
                console.log(user);
                setPeople(user).then((item) => {
                    spUser = item;
                });
                return spUser;
            }
            _item.push({
                Author: setPeople(item.Author.Title),
                Title: item.Title,
                ....
                Requester: setPeople(item.Requester.Title)
            })
Ignore the unused function; I'm still doing tests to find a way around this problem.
Found the fix thanks to the comments.
Using async/await alone wouldn't help me, since I'd still get a pending promise or undefined. What I had to use is ->
Requester: await setPeople(item.Requester.Title).then((user) => { return user }),
Using that in my object seems to work, but my question is... how good is this approach? If there are lots of fields with this behaviour (currently 5), wouldn't that slow down the page by... a lot?
Then you should try something like this:
export const retrieveIrfItems = async (spId) => {
    // The async function already returns a promise, so no explicit new Promise is needed;
    // await can only be used directly inside an async function.
    let spQuery = "SITE" + SpQueryExtend1 + spQueryExpand;
    let _items = [];
    try {
        const axiosApiItem = await axiosApi.get(SiteUrl + spQuery);
        const item = axiosApiItem.data.d;
        _items.push({
            Author: await setPeople(item.Author.Title),
            Title: item.Title,
            ...
            Requester: await setPeople(item.Requester.Title)
        });
        return _items;
    } catch (e) {
        throw e; // rethrow so the caller's catch sees the error
    }
}
Async / await is a way to consume asynchronous, Promise-based functions. The async keyword tells the JavaScript engine that this function will consume one or more asynchronous functions. The await keyword tells it that the next function call is asynchronous and returns a promise.
In your code, your first then() function should return a promise, which is not the case; that's why the second then() can't be reached.
Also, in your code, your new Promise doesn't return anything. When you create a new Promise, you have to end it by calling resolve() or reject().
Hope it helps...
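As a rough illustration of those two points (settling the new Promise and returning from then()), here is a minimal sketch with a hypothetical fetchUser helper:
// fetchUser is a made-up helper used only for illustration.
const fetchUser = (id) => Promise.resolve({ id, name: 'user' + id });
new Promise((resolve, reject) => {
    // The executor must call resolve() (or reject()) for the chain to continue.
    resolve(1);
})
    .then((id) => fetchUser(id))         // returning a promise here...
    .then((user) => console.log(user));  // ...makes its resolved value available in the next then()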

Refactoring Complicated Nested Node.js Function

I have the following snippet of code below. It currently works, but I'm hoping to optimize/refactor it a bit.
Basically, it fetches JSON data, extracts the urls for a number of PDFs from the response, and then downloads those PDFs into a folder.
I'm hoping to refactor this code in order to process the PDFs once they are all downloaded. Currently, I'm not sure how to do that. There are a lot of nested asynchronous functions going on.
How might I refactor this to allow me to tack on another .then call before my error handler, so that I can then process the PDFs that are downloaded?
const axios = require("axios");
const moment = require("moment");
const fs = require("fs");
const download = require("download");
const mkdirp = require("mkdirp"); // Makes nested files...
const getDirName = require("path").dirname; // Current directory name...
const today = moment().format("YYYY-MM-DD");
function writeFile(path, contents, cb){
    mkdirp(getDirName(path), function(err){
        if (err) return cb(err);
        fs.writeFile(path, contents, cb);
    });
}
axios.get(`http://federalregister.gov/api/v1/public-inspection-documents.json?conditions%5Bavailable_on%5D=${today}`)
    .then((res) => {
        res.data.results.forEach((item) => {
            download(item.pdf_url).then((data) => {
                writeFile(`${__dirname}/${today}/${item.pdf_file_name}`, data, (err) => {
                    if (err) {
                        console.log(err);
                    } else {
                        console.log("FILE WRITTEN: ", item.pdf_file_name);
                    }
                });
            });
        });
    })
    .catch((err) => {
        console.log("COULD NOT DOWNLOAD FILES: \n", err);
    });
Thanks for any help you all can provide.
P.S. When I simply tack on the .then call right now, it fires immediately. Does this mean that my forEach loop is non-blocking? I thought that forEach loops were blocking.
The current forEach will run synchronously, and will not wait for the asynchronous operations to complete. You should use .map instead of forEach so you can map each item to its Promise from download. Then, you can use Promise.all on the resulting array, which will resolve once all downloads are complete:
axios.get(`http://federalregister.gov/api/v1/public-inspection-documents.json?conditions%5Bavailable_on%5D=${today}`)
    .then(processResults)
    .catch((err) => {
        console.log("COULD NOT DOWNLOAD FILES: \n", err);
    });
function processResults(res) {
    const downloadPromises = res.data.results.map((item) => (
        download(item.pdf_url).then(data => new Promise((resolve, reject) => {
            writeFile(`${__dirname}/${today}/${item.pdf_file_name}`, data, (err) => {
                if (err) reject(err);
                else resolve(console.log("FILE WRITTEN: ", item.pdf_file_name));
            });
        }))
    ));
    return Promise.all(downloadPromises)
        .then(() => {
            console.log('all done');
        });
}
If you wanted to essentially block the function on each iteration, you would want to use an async function in combination with await instead.
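For reference, a sketch of that sequential variant (assuming the same download and writeFile helpers as above; util.promisify is used here to await the callback-style writeFile):
const { promisify } = require("util");
const writeFileAsync = promisify(writeFile); // wraps the writeFile(path, contents, cb) helper above
async function processResultsSequentially(res) {
    // Each download and write finishes before the next one starts.
    for (const item of res.data.results) {
        const data = await download(item.pdf_url);
        await writeFileAsync(`${__dirname}/${today}/${item.pdf_file_name}`, data);
        console.log("FILE WRITTEN: ", item.pdf_file_name);
    }
    console.log("all done");
}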

Using Promise.all() to fetch a list of urls with await statements

tl;dr - if you have to filter the promises (say for errored ones) don't use async functions
I'm trying to fetch a list of urls with async and parse them. The problem is that if there's an error with one of the urls when I'm fetching - let's say for some reason the api endpoint doesn't exist - the program crashes on the parsing with the obvious error:
UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): TypeError: ext is not iterable
I've tried checking if the res.json() is undefined, but obviously that's not it as it complains about the entire 'ext' array of promises not being iterable.
async function fetchAll() {
let data
let ext
try {
data = await Promise.all(urls.map(url=>fetch(url)))
} catch (err) {
console.log(err)
}
try {
ext = await Promise.all(data.map(res => {
if (res.json()==! 'undefined') { return res.json()}
}))
} catch (err) {
console.log(err)
}
for (let item of ext) {
console.log(ext)
}
}
Question 1:
How do I fix the above so it won't crash on an invalid address?
Question 2:
My next step is to write the extracted data to the database.
Assuming the data size of 2-5mgb of content, is my approach of using Promise.all() memory efficient? Or will it be more memory efficient and otherwise to write a for loop which handles each fetch then on the same iteration writes to the database and only then handles the next fetch?
You have several problems with your code on a fundamental basis. We should address those in order, and the first is that you're not passing in any URLs!
async function fetchAll(urls) {
let data
let ext
try {
data = await Promise.all(urls.map(url=>fetch(url)))
} catch (err) {
console.log(err)
}
try {
ext = await Promise.all(data.map(res => {
if (res.json()==! 'undefined') { return res.json()}
}))
} catch (err) {
console.log(err)
}
for (let item of ext) {
console.log(ext)
}
}
First, you have several try catch blocks on DEPENDENT DATA. They should all be in a single try catch block:
async function fetchAll(urls) {
try {
let data = await Promise.all(urls.map(url=>fetch(url)))
let ext = await Promise.all(data.map(res => {
// also fixed the ==! 'undefined'
if (res.json() !== undefined) { return res.json()}
}))
for (let item of ext) {
console.log(ext)
}
} catch (err) {
console.log(err)
}
}
Next is the problem that res.json() returns a promise wrapped around an object if it exists
if (res.json() !== undefined) { return res.json()}
This is not how you should be using the .json() method. It will fail if there is no parsable json. You should be putting a .catch on it
async function fetchAll(urls) {
try {
let data = await Promise.all(urls.map(url => fetch(url).catch(err => err)))
let ext = await Promise.all(data.map(res => res.json ? res.json().catch(err => err) : res))
for (let item of ext) {
console.log(ext)
}
} catch (err) {
console.log(err)
}
}
Now when it cannot fetch a URL, or parse a JSON you'll get the error and it will cascade down without throwing. Now your try catch block will ONLY throw if there is a different error that happens.
Of course this means we're putting an error handler on each promise and cascading the error, but that's not exactly a bad thing as it allows ALL of the fetches to happen and for you to distinguish which fetches failed. Which is a lot better than just having a generic handler for all fetches and not knowing which one failed.
But now we have it in a form where we can see that there is some better optimizations that can be performed to the code
async function fetchAll(urls) {
try {
let ext = await Promise.all(
urls.map(url => fetch(url)
.then(r => r.json())
.catch(error => ({ error, url }))
)
)
for (let item of ext) {
console.log(ext)
}
} catch (err) {
console.log(err)
}
}
Now with a much smaller footprint, better error handling, and readable, maintainable code, we can decide what we eventually want to return. Now the function can live wherever, be reused, and all it takes is a single array of simple GET URLs.
Next step is to do something with them so we probably want to return the array, which will be wrapped in a promise, and realistically we want the error to bubble since we've handled each fetch error, so we should also remove the try catch. At that point making it async no longer helps, and actively harms. Eventually we get a small function that groups all URL resolutions, or errors with their respective URL that we can easily filter over, map over, and chain!
function fetchAll(urls) {
return Promise.all(
urls.map(url => fetch(url)
.then(r => r.json())
.then(data => ({ data, url }))
.catch(error => ({ error, url }))
)
)
}
Now we get back an array of similar objects, each with the url it fetched, and either data or an error field! This makes chaining and inspecting SUPER easy.
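For instance, a small usage sketch (assuming the fetchAll above and some urls array) that separates successes from failures:
fetchAll(urls).then(results => {
    // Each entry is { data, url } on success or { error, url } on failure.
    const successes = results.filter(r => !r.error);
    const failures = results.filter(r => r.error);
    successes.forEach(({ url, data }) => console.log('ok:', url, data));
    failures.forEach(({ url, error }) => console.log('failed:', url, error.message));
});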
You are getting a TypeError: ext is not iterable - because ext is still undefined when you caught an error and did not assign an array to it. Trying to loop over it will then throw an exception that you do not catch.
I guess you're looking for
async function fetchAll() {
try {
const data = await Promise.all(urls.map(url => fetch(url)));
const ext = await Promise.all(data.map(res => res.json()));
for (let item of ext) {
console.log(item);
}
} catch (err) {
console.log(err);
}
}
Instead of fetch(url) on line 5, make your own function, customFetch, which calls fetch but maybe returns null, or an error object, instead of throwing.
something like
async function customFetch(url) {
    try {
        let result = await fetch(url);
        if (result.json) return await result.json();
    }
    catch (e) { return e; }
}
if (res.json()==! 'undefined')
makes no sense whatsoever: res.json() is asynchronous and returns a promise. Remove that condition and just return res.json():
try {
ext = await Promise.all(data.map(res => res.json()))
} catch (err) {
console.log(err)
}
Whether or not your approach is "best" or "memory efficient" is up for debate. Ask another question for that.
You can have fetch and json not fail by catching the error and returning a special Fail object that you will filter out later:
function Fail(reason){this.reason=reason;};
const isFail = o => (o&&o.constructor)===Fail;
const isNotFail = o => !isFail(o);
const fetchAll = () =>
Promise.all(
urls.map(
url=>
fetch(url)
.then(response=>response.json())
.catch(error=>new Fail([url,error]))
)
);
//how to use:
fetchAll()
.then(
results=>{
const successes = results.filter(isNotFail);
const fails = results.filter(isFail);
fails.forEach(
e=>console.log(`failed url:${e.reason[0]}, error:`,e.reason[1])
)
}
)
As for question 2:
Depending on how many urls you have, you may want to throttle your requests, and if the urls come from a large file (gigabytes) you can use a stream combined with the throttling.
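As a rough sketch of what such throttling could look like (a simple batching approach rather than a true concurrency limiter; the batch size of 5 is arbitrary):
// Fetch urls in batches of `limit` so only a few requests are in flight at once.
async function fetchAllThrottled(urls, limit = 5) {
    const results = [];
    for (let i = 0; i < urls.length; i += limit) {
        const batch = urls.slice(i, i + limit);
        const settled = await Promise.all(
            batch.map(url => fetch(url).then(r => r.json()).catch(error => ({ error, url })))
        );
        results.push(...settled);
    }
    return results;
}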
async function fetchAll(url) {
return Promise.all(
url.map(
async (n) => fetch(n).then(r => r.json())
)
);
}
fetchAll([...])
.then(d => console.log(d))
.catch(e => console.error(e));
Will this work for you?
If you don't depend on every resource being a success, I would go back to basics and skip async/await.
I would process each fetch individually so I could catch the error for just the one that fails:
function fetchAll() {
    const result = []
    const que = urls.map(url =>
        fetch(url)
            .then(res => res.json())
            .then(item => {
                result.push(item)
            })
            .catch(err => {
                // couldn't fetch the resource or the
                // response was not a json response
            })
    )
    return Promise.all(que).then(() => result)
}
Something good #TKoL said:
Promise.all errors whenever one of the internal promises errors, so whatever advice anyone gives you here, it will boil down to -- Make sure that you wrap the promises in an error handler before passing them to Promise.all
Regarding question 1, please refer to this:
Handling errors in Promise.all
Promise.all is all or nothing. It resolves once all promises in the array resolve, or reject as soon as one of them rejects. In other words, it either resolves with an array of all resolved values, or rejects with a single error.
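A tiny sketch of that contrast, using the per-promise catch handlers suggested above (Promise.allSettled, available in newer runtimes, is another option):
const ok = Promise.resolve('ok');
const bad = Promise.reject(new Error('boom'));
// All or nothing: rejects as soon as one promise rejects.
Promise.all([ok, bad]).catch(err => console.log('rejected:', err.message));
// Wrapping each promise first: always resolves, and failures become values you can filter.
Promise.all([ok, bad].map(p => p.catch(error => ({ error }))))
    .then(results => console.log(results)); // ['ok', { error: Error: boom }]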
