Files in a subdirectory aren't found in Node.js - javascript

I need to loop over subdirectories somehow, but it returns an error: ENOENT: no such file or directory, stat 'text3.txt'
Here are the files I use:
main.js
files
|_file1.txt
|_file2.txt
dir
|_text3.txt
Here is my main.js:
fs = require('fs'), aes = require('aes256'),
key = 'abc';

enc = file => {
  return aes.encrypt(key, file)
}
decr = encr => {
  return aes.decrypt(key, encr)
}

clf = dir => {
  files = fs.readdirSync(dir);
  // Filter files from folders
  for (let i of files) {
    stat = fs.statSync(i)
    if (stat.isFile()) {
      newFiles.push(i)
    }
    else dirs.push(i)
  }
  // Encrypt the files
  for (let file of newFiles) {
    fl = fs.readFileSync(file).toString();
    fs.writeFileSync(file, enc(fl));
  }
}

clf('./')
for (let c of dirs) clf(c);
The decrypt and encrypt functions use aes256 encryption and return strings. The clf function then checks whether each entry is a file and pushes folders to an array. Then we encrypt the files in the main directory, but nothing happens in the subdirectories; instead it returns an error:
ENOENT: no such file or directory, stat 'text3.txt'
But text3.txt IS in the dir directory!! So why do I get an error?

First off, declare every single variable you use. Using undeclared variables is a recipe for disaster. I would not even attempt to work on code like this without first declaring every single variable in the proper scope using let or const.
Second, when you do fs.statSync(i), the i here is just a plain filename with no path. If you do console.log(i), you will see it is only a filename. So, to reference the right file, you have to add the path back onto it from your readdirSync(dir) call and then pass that full path to fs.statSync().
You will find path.join() a convenient way to combine a path with a filename.
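Applied to the question's clf, a minimal sketch with both fixes (declared variables, full paths); it recurses into subdirectories directly instead of collecting them in a shared array, and assumes enc is the AES helper from the question:
const fs = require('fs');
const path = require('path');

const clf = dir => {
  const entries = fs.readdirSync(dir);
  for (const entry of entries) {
    // readdirSync returns bare names, so re-attach the directory
    const fullPath = path.join(dir, entry);
    const stat = fs.statSync(fullPath);
    if (stat.isFile()) {
      // enc is the aes256 encrypt helper from the question
      const contents = fs.readFileSync(fullPath).toString();
      fs.writeFileSync(fullPath, enc(contents));
    } else {
      clf(fullPath); // recurse into the subdirectory
    }
  }
};

clf('./');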

Related

How do I access a JSON file relative to my calling module in Node?

I'm defining a package, PackageA, that has a function (parseJson) that takes in a file path to a json file to parse. In another package, PackageB, I want to be able to call PackageA using a file I specify with a local path from PackageB. For example, if file.json is in the same directory as packageB, I'd like to be able to call PackageA.parseJson('./file.json'), without any extra code in PackageB. How would I do this? It seems that require requires a path from PackageA to the file, which is not what I want.
Edit: Currently, parseJson looks something like this:
public parseJson(filepath) {
  let j = require(filepath);
  console.log(j);
}
and PackageB is calling it like this:
let a = new PackageA();
a.parseJson("./file.json");
file.json is in the same directory as PackageB.
CommonJS modules have a __dirname variable in their scope, containing the path of the directory they reside in.
To get the absolute path for RELATIVE_PATH, use join(__dirname, RELATIVE_PATH) (join is from the path module).
example:
// PackageB .js file
const Path = require('path')
const PackageA = require(/* PackageA name or path */)
const PackageB_jsonPathRelative = /* relative path to json file */
// __dirname is the directory that contains this PackageB .js file
const PackageB_jsonPathAbsolute = Path.join(__dirname, PackageB_jsonPathRelative)
PackageA.parseJson(PackageB_jsonPathAbsolute)
UPDATED
If you can't change PackageB, but you know exactly how PackageA.parseJson is called by PackageB (e.g. directly, or through wrappers with a known depth), then you can get the path to PackageB from a stack trace.
example:
// PackageA .js file
// `npm install stack-trace@0.0.10` if you hit an `ERR_REQUIRE_ESM` error
const StackTrace = require('stack-trace')
const Path = require('path')

const callerFilename = (skip = 0) => StackTrace.get(callerFilename)[skip + 1].getFileName()

module.exports.parseJson = (caller_jsonPathRelative) => {
  // we want the direct caller of `parseJson`, so `skip = 0`
  // adjust the `skip` parameter if the caller chain changes
  const callerDir = Path.dirname(callerFilename())
  // absolute path to the json file, built from the path relative to the caller's file
  const jsonPath = Path.join(callerDir, caller_jsonPathRelative)
  console.log(jsonPath)
  console.log(JSON.parse(require('fs').readFileSync(jsonPath)))
}
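With that in place, PackageB can keep passing paths relative to itself, e.g. (a sketch; the require specifier is a placeholder):
// PackageB .js file
const PackageA = require(/* PackageA name or path */)
PackageA.parseJson('./file.json') // resolved relative to this file's directory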

No such file or directory when exporting function from another file

src/test.js
module.exports.test = function() {
  const { readFileSync } = require('fs');
  console.log(readFileSync('test.txt', 'utf8').toString())
}
index.js
const { test } = require('./src/test.js');
test();
Which results in No such file or directory. Does module.exports or exports not work when requiring files in another directory?
When you do something like this:
readFileSync('test.txt', 'utf8')
that attempts to read test.txt from the current working directory, which is determined by how and from where the main program was launched. It has nothing at all to do with the directory your src/test.js module is in.
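A minimal illustration of the difference (hypothetical contents of src/test.js):
// src/test.js
console.log(process.cwd()); // the directory node was launched from; varies per launch
console.log(__dirname);     // the directory containing this file, e.g. /project/src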
So, if test.txt is inside the same directory as your src/test.js and you want to read it from there, then you need to manually build a path that references your module's directory. To do that, you can use __dirname which is a special variable set for each module that points to the directory the module is in.
In this case, you can do this:
const path = require('path');

module.exports.test = function() {
  const { readFileSync } = require('fs');
  console.log(readFileSync(path.join(__dirname, 'test.txt'), 'utf8').toString())
}
And, that will reliably read test.txt from your module's directory.

Copy file to several dynamic paths using gulp

I have the following gulp task. The goal is to copy each html file to more than one directory, based on a json array (directoryData).
gulp.task("reorg", () => {
return gulp.src('./dist/**/*.html')
.pipe(rename(function (path) {
let fileName = path.dirname.split('/')[0];
let directoryName = directoryData[fileName][0];
path.dirname = `${directoryName}/${path.dirname}`;
}))
.pipe(gulp.dest('./dist/'));
});
Currently this task will only copy each file to their first directory in the json array. I'd like to iterate over directoryData[fileName] and copy the file to every directory listed.
I would create a variable for each destination folder, and then pipe the stream through one gulp.dest() per destination. Note that gulp.dest() takes a single path (a string or a function), but a stream can be piped to several dests in sequence.
example:
var1 = './dist/';
var2 = './assets/';
.pipe(gulp.dest(var1)).pipe(gulp.dest(var2))
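To actually iterate over directoryData[fileName] as the question asks, one option is to build one stream per destination index and merge them. A sketch, assuming every directoryData entry lists the same number of destinations and that the merge-stream package is installed:
const gulp = require('gulp');
const rename = require('gulp-rename');
const merge = require('merge-stream'); // assumption: `npm install merge-stream`

const targetCount = 2; // assumption: each directoryData entry has this many destinations

gulp.task('reorg', () => {
  const streams = [];
  for (let i = 0; i < targetCount; i++) {
    streams.push(
      gulp.src('./dist/**/*.html')
        .pipe(rename(function (path) {
          const fileName = path.dirname.split('/')[0];
          const directoryName = directoryData[fileName][i]; // i-th destination for this file
          path.dirname = `${directoryName}/${path.dirname}`;
        }))
        .pipe(gulp.dest('./dist/'))
    );
  }
  // a gulp task can return the merged stream so gulp knows when all copies finish
  return merge(...streams);
});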

Dynamic command handler with shared and separate commands

I am setting up a command handler for multiple channels (Twitch). At the moment I have all the commands divided into folders: user-specific ones and the generic ones. I access them using a Map(). I would like each user/channel to have access only to their own folder and the generic one. The map key is the name exported by each .js file.
So what would be the best way to do it? I've tried mapping over the generic folder and the folder that matches the user name on login, but I'm not aware of a way to change the "command" in client.commands.set(key, value) so it would be client.(nameChannel).set(key, value). Then I would probably be able to assign the default and the user-specific folder to the map.
Also, fs.readdirSync lists all of the .js files in a folder and its subfolders. How do I access all of them at once in require? Wildcards don't seem to work, so do I need to list them like shown below?
I want to be able to add more later and not hardcode them one-by-one if possible.
//hardcode example.
var moduleA = require( "./module-a.js" );
var moduleB = require( "../../module-b.js" );
var moduleC = require( "/my-library/module-c.js" );
The piece of code below is still a work in progress. What I'd like to achieve:
exclude channel-specific commands from being called from other channels.
know if/what the standard or recommended approach is.
how to require() all .js files from the readdirSync in one require.
client.commands = new Map();
// add a commands property to the client instance: a Map we can iterate over.
const commandFiles = fs.readdirSync("./commands").filter(file => file.endsWith(".js"));
// reads the file system and keeps only the .js files; returns an array of names.
for (const file of commandFiles) {
  const command = require(`./commands/${file}`); // only grabs files directly inside ./commands; wildcards can't be used
  // sets a new item in the collection.
  // Key of the map is the command name.
  client.commands.set(command.name, command);
}

// input validation etc. here
// check command
try {
  client.commands.get(commandFromMessage).execute(channel, commandFromMessage, argument);
} catch (error) {
  console.error(error);
}
pastebin of the folder tree: https://pastebin.com/XNJt98Ha
You can set string names as your keys in regular objects.
const channelSpecificCommands = new Map();
const channelFolder = fs.readdirSync(`./commands/${channelSpecificDir}`);
for (const file of channelFolder) {
  const commandFromFile = require(`./commands/${channelSpecificDir}/${file}`);
  channelSpecificCommands.set(commandFromFile.name, commandFromFile);
}
client[channelSpecificDir] = channelSpecificCommands;
For your second question - you should be using .json files instead of .js for this. You can load json files with JSON.parse(fs.readFileSync('myjsonfile.json', 'utf8')) to get the data without any issues that come with dynamic module resolution.
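If you do want to keep commands as .js modules and load a whole tree of them at once, here is a minimal recursive sketch (requireAll is a hypothetical helper name; the withFileTypes option needs Node 10+):
const fs = require('fs');
const path = require('path');

const requireAll = dir => {
  let commands = [];
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const fullPath = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      commands = commands.concat(requireAll(fullPath)); // recurse into subfolders
    } else if (entry.name.endsWith('.js')) {
      commands.push(require(path.resolve(fullPath))); // require wants an absolute or module-relative path
    }
  }
  return commands;
};

// usage: register every command found anywhere under ./commands
for (const command of requireAll('./commands')) {
  client.commands.set(command.name, command);
}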

How to delete all json files within a directory in Node.js

I need to delete only the json files within a directory (multiple levels deep). I'd hazard a guess that it's possible with fs.unlinkSync(path),
but I can't find a solution without specifying each individual file name.
I was hoping to solve it with the following...
fs.unlinkSync('./desktop/directory/*.json')
but unfortunately the asterisk wildcard isn't expanded. Any suggestions?
You can list files using fs.readdirSync, then call fs.unlinkSync to delete. This can be called recursively to traverse an entire tree.
const fs = require("fs");
const path = require("path");
function deleteRecursively(dir, pattern) {
let files = fs.readdirSync(dir).map(file => path.join(dir, file));
for(let file of files) {
const stat = fs.statSync(file);
if (stat.isDirectory()) {
deleteRecursively(file, pattern);
} else {
if (pattern.test(file)) {
console.log(`Deleting file: ${file}...`);
// Uncomment the next line once you're happy with the files being logged!
try {
//fs.unlinkSync(file);
} catch (err) {
console.error(`An error occurred deleting file ${file}: ${err.message}`);
}
}
}
}
}
deleteRecursively('./some_dir', /\.json$/);
I've actually left the line that deletes the file commented out. I'd suggest you run the script and confirm the files being logged are the right ones; then just uncomment the fs.unlinkSync line to delete the files.
