How to require directories using async / await - javascript

I am trying to load data which is in javascript files into an object.
All the files are like this:
module.exports = {
  test: 'qwerty'
}
I'm using require, but I have to load several directories, so I need to do the loads one at a time.
I tried wrapping it in a promise:
function load(fileOrDirPath) {
  return new Promise(function(resolve, reject) {
    let data;
    debug(`Requiring: [${fileOrDirPath}]`);
    try {
      data = require(fileOrDirPath);
    }
    catch(e) {
      return reject(e);
    }
    debug(`Loaded data: [${JSON.stringify(data, 0, 2)}]`);
    return resolve(data);
  });
}
I use the function:
const dirPath = 'redacted'
const data = await load(dirPath);
debug(`Loaded: [${JSON.stringify(data, 0, 2)}]`);
And the logs show that the data is loaded inside the function. However, outside the function the data is always null.
Why doesn't the function await?
I tried looking on npm for a module but couldn’t find any.
How can I load a directory of javascript files recursively into an object?
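One rough sketch of such a recursive load, assuming every file exports a plain object (the loadDir name is only illustrative), is to walk the tree with fs.promises.readdir and require each .js file into a nested object:
const fs = require('fs');
const path = require('path');

// Sketch: recursively require every .js file under dirPath into an object
// keyed by file or directory name.
async function loadDir(dirPath) {
  const result = {};
  const entries = await fs.promises.readdir(dirPath, { withFileTypes: true });
  for (const entry of entries) {
    const fullPath = path.join(dirPath, entry.name);
    if (entry.isDirectory()) {
      result[entry.name] = await loadDir(fullPath);
    } else if (entry.name.endsWith('.js')) {
      result[path.basename(entry.name, '.js')] = require(fullPath);
    }
  }
  return result;
}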

Related

Iterating the creation of objects with asynchronous javascript map method data [duplicate]

This question already has answers here:
Using async/await with a forEach loop
Use async await with Array.map
In an async IIFE at the bottom of this javascript, you'll see that I'm trying to: 1) read a JSON file, 2) get multiple RSS feed URLs from that data, 3) pull and parse the data from those feeds, and create an object with that data, so I can 4) write that pulled RSS data object to a JSON file. Everything for #1 and #2 is fine. I'm able to pull data from multiple RSS feeds in #3 (and log it to console), and I'm comfortable handling #4 when I get to that point later.
My problem is that, at the end of step #3, within the parseFeed function, I am trying to create and push an object for each iteration of rssJSONValsArr.map() in the IIFE, and it's not working: the rssFeedDataArr result is empty. Even though I am able to console.log those values, I can't create and push the new object I need in order to reach step #4. Creating a similar object in #2 works fine, so I think it's the map I have to use within parseFeed to pull the RSS data (using the rss-parser npm package) that is making object creation fail in step #3. How do I get rssFeedOject to work with the map data?
import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
import Parser from 'rss-parser';

const parser = new Parser();

const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const feedsJSON = path.join(__dirname, 'rss-feeds-test.json');

const rssJSONValsArr = [];
const rssFeedDataArr = [];

const pullValues = (feedObject, i) => {
  const url = feedObject.feed.url;
  const jsonValsObject = {
    url: url,
  };
  rssJSONValsArr.push(jsonValsObject);
};

const parseFeed = async (url) => {
  try {
    const feed = await parser.parseURL(url);
    feed.items.forEach((item) => {
      console.log(`title: ${item.title}`); // correct
    });
    const rssFeedOject = {
      title: item.title,
    };
    rssFeedDataArr.push(rssFeedOject);
  } catch (err) {
    console.log(`parseFeed() ERROR 💥: ${err}`);
  }
};

(async () => {
  try {
    console.log('1: read feeds JSON file');
    const feedsFileArr = await fs.promises.readFile(feedsJSON, {
      encoding: 'utf-8',
    });
    const jsonObj = JSON.parse(feedsFileArr);

    console.log('2: get feed URLs');
    jsonObj.slice(0, 30).map(async (feedObject, i) => {
      await pullValues(feedObject, i);
    });
    console.log('rssJSONValsArr: ', rssJSONValsArr); // correct

    console.log('3: pull data from rss feeds');
    rssJSONValsArr.map(async (feedItem, i) => {
      await parseFeed(feedItem.url, i);
    });
    console.log('rssFeedDataArr: ', rssFeedDataArr); // empty !!!

    // console.log('4: write rss data to JSON file');
    // await fs.promises.writeFile(
    //   `${__dirname}/rss-bulk.json`,
    //   JSON.stringify(rssFeedDataArr)
    // );

    console.log('5: Done!');
  } catch (err) {
    console.log(`IIFE CATCH ERROR 💥: ${err}`);
  }
})();
Example JSON file with two RSS feed URLs:
[
  {
    "feed": {
      "details": {
        "name": "nodejs"
      },
      "url": "https://news.google.com/rss/search?q=nodejs"
    }
  },
  {
    "feed": {
      "details": {
        "name": "rss-parser"
      },
      "url": "https://news.google.com/rss/search?q=rss-parser"
    }
  }
]
Any and all help appreciated. Thanks
The problem is that you are printing rssFeedDataArr right after the .map call which, as stated in the comments, is being used incorrectly, since you are not using the returned value; forEach would be the way to go there. For every value in rssJSONValsArr you are calling an anonymous async function which in turn awaits parseFeed, so you are basically creating a Promise on each iteration, but those promises only resolve after your print statement has executed. You need to wait for all of those promises to resolve before printing rssFeedDataArr. Since you are creating a bunch of promises that can run in parallel, one way to do that is Promise.all, like this:
await Promise.all(
  rssJSONValsArr.map(async (feedItem, i) => {
    await parseFeed(feedItem.url, i);
  })
);
and we can simplify it even more by returning the promise created by parseFeed directly:
await Promise.all(
  rssJSONValsArr.map((feedItem, i) => parseFeed(feedItem.url, i))
);
And in this case the right method is map, not forEach.
In the case of rssJSONValsArr it works because the call to pullValues resolves instantly: it doesn't actually run asynchronously, even though it is declared async, because there is no await inside the function definition.
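Taking that one step further, a rough sketch of an alternative (not required for the fix above; the choice of feed.title as the kept field is only an example) is to have parseFeed return a value instead of pushing into a shared array, and let Promise.all collect the results:
// Sketch: parseFeed returns an object instead of mutating rssFeedDataArr.
const parseFeed = async (url) => {
  const feed = await parser.parseURL(url);
  return { title: feed.title }; // keep whichever fields you actually need
};

// inside the async IIFE:
const rssFeedDataArr = await Promise.all(
  rssJSONValsArr.map((feedItem) => parseFeed(feedItem.url))
);
console.log('rssFeedDataArr: ', rssFeedDataArr);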

how to fetch json from API using vanilla JS through Protractor

I am trying to download distance between 2 locations from tomtom api.
Protractor will not let me use:
*fetch - "fetch is not defined" - please use import
*import - "Cannot use import statement outside of module"
*when I add
{
  "type": "module"
}
to package.json, Protractor stops working, because the rest of the code is not an ES module
*browser.get - opens the page with the JSON data, but I cannot extract it.
Is there any other way? I tried importing the JSON into a different file and exporting response.data, but the module error stops me from doing that as well.
Protractor is for testing Angular web pages, but you can have the browser execute arbitrary JavaScript. To use fetch, you need to go through window:
function getTomTomData() {
  //replace this with your tomtom api call, and transform the response
  return window.fetch(TOM_TOM_URL);
}

browser.executeScript(getTomTomData).then(res => {
  //do something with the response
});
I did not manage to run node-fetch in my script, as Protractor kept rejecting the import. I managed to sort it out with require('https'):
const https = require('https');

let measureDistance = async function(pickup, dropoff) {
  let url = `https://api.tomtom.com/routing/1/calculateRoute/${pickup[0]}%2C${pickup[1]}%3A${dropoff[0]}%2C${dropoff[1]}/json?routeType=shortest&avoid=unpavedRoads&key=uwbU08nKLNQTyNrOrrQs5SsRXtdm4CXM`;
  await https.get(url, res => {
    let body = '';
    res.on('data', chunk => {
      body += chunk;
    });
    res.on("end", () => {
      try {
        let json = JSON.parse(body);
        howFar = json.routes[0].summary.lengthInMeters;
      } catch (error) {
        console.error(error.message);
      }
    }).on("error", (error) => {
      console.error(error.message);
    });
  });
};
Also, I had been putting require at the top of the file, like in Ruby, which seemed to be another issue.
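Note that https.get takes a callback and does not itself return a promise, so the await in the snippet above does not actually wait for the response; howFar is only assigned later, when the "end" event fires. A rough sketch of a promise-wrapped variant that can really be awaited (the fetchRoute name is only illustrative; the URL is built the same way as above):
const https = require('https');

// Sketch: wrap https.get in a Promise so callers can await the parsed body.
function fetchRoute(url) {
  return new Promise((resolve, reject) => {
    https.get(url, res => {
      let body = '';
      res.on('data', chunk => { body += chunk; });
      res.on('end', () => {
        try {
          resolve(JSON.parse(body));
        } catch (error) {
          reject(error);
        }
      });
    }).on('error', reject);
  });
}

// usage:
// const json = await fetchRoute(url);
// const howFar = json.routes[0].summary.lengthInMeters;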

Get all Amazon S3 files inside a bucket within Promise

I'm trying to grab thousands of files from Amazon S3 within a Promise, but I can't figure out how to include the ContinuationToken when the list is truncated, and then gather everything together inside the promise. I'm a novice with JS and could use some help. Here's what I have so far:
getFiles()
  .then(filterFiles)
  .then(mapUrls)
;

function getFiles(token) {
  var params = {
    Bucket: bucket,
    MaxKeys: 5000,
    ContinuationToken: token
  };
  var allKeys = [];
  var p = new Promise(function(resolve, reject){
    s3.listObjectsV2(params, function(err, data) {
      if (err) {
        return reject(err);
      }
      allKeys.push(data.Contents)
      if (data.IsTruncated) {
        s3.listObjectsV2({Bucket: bucket, MaxKeys: 5000, ContinuationToken: data.NextContinuationToken})
        console.log('Getting more images...');
        allKeys.push(data.Contents)
      }
      resolve(data.Contents);
    });
  });
  return p;
}
I need the function to continue to run until I've created a list of all objects in the bucket to return.
You need ContinuationToken the second time only.
var params = {
  Bucket: bucket,
  MaxKeys: 5000,
};

if (data.IsTruncated) {
  s3.listObjectsV2({...params, ContinuationToken: data.NextContinuationToken})
IMO, this is just an S3 function called twice, more like a nested call. Recursion is when a function keeps calling itself until a specified condition is met.
Read more about recursion: https://medium.com/@vickdayaram/recursion-caad288bf621
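For context, a rough sketch of how that spread fits into a full listing loop (assuming the same s3 client and bucket, and the .promise() interface of the AWS SDK v2; the listAllKeys name is only illustrative):
// Sketch: page through the bucket, adding ContinuationToken only after the
// first request, until IsTruncated is false.
async function listAllKeys() {
  const keys = [];
  let token;
  do {
    const data = await s3.listObjectsV2({
      Bucket: bucket,
      MaxKeys: 5000,
      ...(token ? { ContinuationToken: token } : {})
    }).promise();
    data.Contents.forEach(obj => keys.push(obj.Key));
    token = data.IsTruncated ? data.NextContinuationToken : undefined;
  } while (token);
  return keys;
}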
I was able to list all objects in the bucket using async/await and the code below to populate an array.
async function getFiles(params, filelist = []) {
  const response = await s3.listObjectsV2(params).promise();
  response.Contents.forEach(obj => filelist.push(obj.Key));
  if (response.NextContinuationToken) {
    params.ContinuationToken = response.NextContinuationToken;
    await getFiles(params, filelist);
  }
  console.log(filelist.length)
  return filelist;
}
Thanks to all who helped!

How do I make a function wait for the completion of another function before running

The problem I'm facing is that I want to create a temporary work folder for a certain function to hold its assets and work on them.
So:
await function someFunc() {
  createFolder()
  ...
  makeSomeFiles()
  doOtherStuff()
  ...
  deleteFolder()
}
But the functions that I am using in Node.js are all async. Creating a folder is fs.mkdir(), deleting a folder is fs.rmdir(), and downloading images and saving them is also an async procedure of some kind.
The problem is this: the folder gets created, and deleted, before any of the code in the middle executes, so I get errors from the middle code saying the folder doesn't exist, because it gets deleted prematurely. How do I make fs.rmdir(), at the end, wait for all the middle code to run first before deleting the folder?
The specific code is this:
async function run() {
  //GENERATE SESSION ID AND FOLDER
  const sessionID = str.random(50);
  fs.mkdir('images/'+sessionID, (err) => {
    if (err) return err;
  });

  //DOWNLOAD IMAGE
  https.get('https://en.wikipedia.org/wiki/Main_Page#/media/File:RE_Kaja_Kallas.jpg', (file) => {
    file.pipe(fs.createWriteStream('images/'+sessionID+'/image.jpeg'));
  });

  //CLEANUP
  fs.rmdir('images/'+sessionID, { recursive: true }, (err) => {
    if (err) return err;
  });
}
I would use promise-based versions of functions that do these operations and then use async/await with those promises:
const stream = require('stream');
const {promisify} = require('util');
const fs = require('fs');
const fsp = fs.promises;
const got = require('got');

const pipeline = promisify(stream.pipeline);

async function run() {
  const sessionID = str.random(50);
  const dir = 'images/' + sessionID;
  await fsp.mkdir(dir);
  await pipeline(
    got.stream('https://en.wikipedia.org/wiki/Main_Page#/media/File:RE_Kaja_Kallas.jpg'),
    fs.createWriteStream(dir + '/image.jpeg')
  );
  // not sure why you're trying to remove a directory that you just
  // put a file in so it's not empty
  await fsp.rmdir(dir, { recursive: true })
}

run().then(() => {
  console.log("all done");
}).catch(err => {
  console.log(err);
});
But, this function isn't making a lot of sense to me because you're creating a directory, downloading a file to it and then trying to remove a non-empty directory.
This uses the got() library for downloading the file because it's my go-to library for http requests, since it has both stream and promise interfaces.

Return deferred function result to next Gulp pipe with gulp-tap

I'm creating a simple, static site where, in development, I'm using Handlebars.js and making some API calls to fill in the Handlebars templates. But for production, I want to precompile all the templates into static HTML.
I'm trying to automate that process with Gulp, so I've got a task that looks like this:
gulp.task('compileHtml', () => {
  return gulp
    .src('index.html')
    .pipe(someThing())
    .pipe(anotherThing())
    .pipe(compileTemplates())
    .pipe(someOtherThing())
    .pipe(gulp.dest('build'));
});
In my compileTemplates function, I'm using gulp-tap and jsdom to basically run the file with relevant scripts to make the API calls and fill in the Handlebars templates, then remove those scripts and send back the compiled HTML to the next pipe. But I'm having trouble deferring sending back the new DOM until jsdom has had ample time to run all the scripts.
Here's what I have so far:
const compileTemplates = file => {
  return tap(file => {
    const dom = new JSDOM(file.contents,
      {
        runScripts: 'dangerously',
        resources: 'usable',
        beforeParse(window) {
          window.fetch = require('node-fetch');
        },
      },
    );
    const document = dom.window.document;
    const script = document.querySelector('script[src$="handlebars.min.js"]');
    // dom.serialize() still contains my uncompiled templates at this point
    setTimeout(() => {
      script.remove();
      file.contents = Buffer.from(dom.serialize()); // this is what I want to return from this function
    }, 2500);
  });
};
I know I probably need to use a promise to send back file.contents when it's ready, but I'm not great with promises or with Gulp.
I've tried returning a promise that resolves inside the timeout, but I end up with TypeError: dest.on is not a function because the next pipe is ultimately expecting file and not a promise.
How could I refactor to either defer sending back my manipulated file to the next pipe or to send back a promise from this function and then resolve that promise to the new file in my compileHtml task?
I'm using Gulp 4.
After consulting Using setTimeout on Promise Chain, I figured out how to resolve the promise after a timeout.
const compileTemplates = file => {
  return tap(file => {
    const dom = new JSDOM(file.contents,
      {
        runScripts: 'dangerously',
        resources: 'usable',
        beforeParse(window) {
          window.fetch = require('node-fetch');
        },
      },
    );
    const document = dom.window.document;
    const script = document.querySelector('script[src$="handlebars.min.js"]');
    new Promise(resolve => {
      setTimeout(resolve, 2500);
    }).then(() => {
      script.remove();
      file.contents = Buffer.from(dom.serialize());
    });
  });
};
However, that didn't totally solve my problem because I needed to wait for that promise to resolve before sending the new file to the next pipe in my gulp task, but I couldn't find any good documentation on callbacks in gulp-tap.
So I ended up using through2, instead.
const compileHtml = file => {
  return through2.obj(function(file, encoding, callback) {
    const dom = new JSDOM(file.contents,
      {
        runScripts: 'dangerously',
        resources: 'usable',
        beforeParse(window) {
          window.fetch = require('node-fetch');
        },
      },
    );
    const document = dom.window.document;
    const script = document.querySelector('script[src$="handlebars.min.js"]');
    new Promise(resolve => {
      setTimeout(resolve, 2500);
    }).then(() => {
      script.remove();
      file.contents = Buffer.from(dom.serialize());
      this.push(file);
      callback();
    });
  });
}
If anyone knows what this function would look like using gulp-tap, feel free to post an answer!
