Copy file to several dynamic paths using gulp

I have the following gulp task. The goal is to copy each html file to more than one directory, based on a json array (directoryData).
gulp.task("reorg", () => {
return gulp.src('./dist/**/*.html')
.pipe(rename(function (path) {
let fileName = path.dirname.split('/')[0];
let directoryName = directoryData[fileName][0];
path.dirname = `${directoryName}/${path.dirname}`;
}))
.pipe(gulp.dest('./dist/'));
});
Currently this task only copies each file to the first directory in the JSON array. I'd like to iterate over directoryData[fileName] and copy the file to every directory listed.

I would create a variable for each destination folder and then pass those variables into the dest pipe: .pipe(dest([var1, var2])).
example:
var1 = './dist/';
var2 = './assets/';
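If dest() does not accept an array in your gulp version, here is a minimal sketch of an alternative: gulp.dest() re-emits the files it writes, so you can chain several dest pipes to copy the same files into more than one folder (the folder names below are just placeholders):
gulp.task('copy-to-many', () => {
  const dest1 = './dist/';   // placeholder destination
  const dest2 = './assets/'; // placeholder destination
  return gulp.src('./dist/**/*.html')
    .pipe(gulp.dest(dest1))  // dest() re-emits the files it writes...
    .pipe(gulp.dest(dest2)); // ...so a second dest() receives them too
});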

How do I access a JSON file relative to my calling module in Node?

I'm defining a package, PackageA, that has a function (parseJson) that takes in a file path to a JSON file to parse. In another package, PackageB, I want to be able to call PackageA using a file I specify with a local path from PackageB. For example, if file.json is in the same directory as PackageB, I'd like to be able to call PackageA.parseJson('./file.json'), without any extra code in PackageB. How would I do this? It seems that require needs a path from PackageA to the file, which is not what I want.
Edit: Currently, parseJson looks something like this:
public parseJson(filepath) {
  let j = require(filepath);
  console.log(j);
}
and PackageB is calling it like this:
let a = new PackageA();
a.parseJson("./file.json");
file.json is in the same directory as PackageB.
CommonJS modules have a __dirname variable in their scope, containing the path of the directory they reside in.
To get the absolute path of RELATIVE_PATH, use join(__dirname, RELATIVE_PATH) (join is from the path module).
example:
// PackageB .js file
const Path = require('path')
const PackageA = require(/* PackageA name or path */)
const PackageB_jsonPathRelative = /* relative path to json file */
// __dirname is directory that contains PackageB .js file
const PackageB_jsonPathAbsolute = Path.join(__dirname, PackageB_jsonPathRelative)
PackageA.parseJson(PackageB_jsonPathAbsolute)
UPDATED
If you can't change PackageB, but you know exactly how PackageA.parseJson is called by PackageB (e.g. directly, or through wrappers with a known depth), then you can get the path to PackageB from the stack trace.
example:
// PackageA .js file
// `npm install stack-trace@0.0.10` if you get an `ERR_REQUIRE_ESM` error
const StackTrace = require('stack-trace')
const Path = require('path')

const callerFilename = (skip = 0) => StackTrace.get(callerFilename)[skip + 1].getFileName()

module.exports.parseJson = (caller_jsonPathRelative) => {
  // we want the direct caller of `parseJson`, so `skip = 0`
  // adjust the `skip` parameter if the caller chain changes
  const callerDir = Path.dirname(callerFilename())
  // absolute path to the json file, resolved relative to the caller file
  const jsonPath = Path.join(callerDir, caller_jsonPathRelative)
  console.log(jsonPath)
  console.log(JSON.parse(require('fs').readFileSync(jsonPath)))
}
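With that in place, PackageB should not need any changes. A sketch of the call site, assuming PackageA exports parseJson as shown above:
// PackageB .js file (unchanged) -- the relative path is resolved against this file
const PackageA = require(/* PackageA name or path */)
PackageA.parseJson('./file.json')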

Dynamic command handler with shared and separate commands

I am setting up a command handler for multiple channels (Twitch). At the moment I have all the commands divided into folders: user-specific ones and the generic ones. I access them using a Map(), where the key is the name defined in each .js file. I would like each user/channel to have access only to their own folder and the generic one.
So what would be the best way to do this? I've tried mapping over the generic folder and the folder that matches the user name on login, but I am not aware of a way to change the "command" in client.commands.set(key, value) so that it becomes client.(nameChannel).set(key, value). With that, I could probably assign the default and user-specific folders to the map.
Also, fs.readdirSync lists all of the .js files in a folder and its subfolders. How do I require all of them at once? Wildcards don't seem to work, so do I need to list them like shown below?
I want to be able to add more later without hardcoding them one by one, if possible.
//hardcode example.
var moduleA = require( "./module-a.js" );
var moduleB = require( "../../module-b.js" );
var moduleC = require( "/my-library/module-c.js" );
The piece of code below is still a work in progress. What I'd like to achieve:
- exclude channel-specific commands from being called from other channels;
- know what the standard or recommended approach is;
- require() all the .js files found by readdirSync in one go.
client.commands = new Map();
// add a commands property to the client instance: a Map to iterate over.
const commandFiles = fs.readdirSync("./commands").filter(file => file.endsWith(".js"));
// read the file system and keep .js files only; returns an array of those items.
for (const file of commandFiles) {
  const command = require(`./commands/${file}`); // this only grabs the files directly in ./commands; wildcards can't be used.
  // set a new item in the collection.
  // The key of the map is the command name.
  client.commands.set(command.name, command);
}
// input validation etc. here
// check the command
try {
  client.commands.get(commandFromMessage).execute(channel, commandFromMessage, argument);
} catch (error) {
  console.error(error);
}
pastebin of the folder tree: https://pastebin.com/XNJt98Ha
You can use string names as keys on regular objects.
const channelSpecificCommands = new Map();
const channelFolder = fs.readdirSync(`./commands/${channelSpecificDir}`);
for (const file of channelFolder) {
  const commandFromFile = require(`./commands/${channelSpecificDir}/${file}`);
  channelSpecificCommands.set(commandFromFile.name, commandFromFile);
}
client[channelSpecificDir] = channelSpecificCommands;
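At message time you could then look the command up in the channel's own map first and fall back to the generic one. A rough sketch using the names from the question (channelName here is hypothetical):
// Hypothetical lookup: channel-specific commands first, then the generic map.
const channelCommands = client[channelName] || new Map();
const command = channelCommands.get(commandFromMessage) || client.commands.get(commandFromMessage);
if (command) {
  command.execute(channel, commandFromMessage, argument);
}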
For your second question: you should be using .json files instead of .js for this. You can load JSON files with JSON.parse(fs.readFileSync('myjsonfile.json', 'utf8')) to get the data without any of the issues that come with dynamic module resolution.
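A minimal sketch of that idea, assuming the command definitions are stored as .json files under ./commands (the file layout is hypothetical):
const fs = require('fs');
const path = require('path');

// read every .json file in the folder and parse it into an object
const loadCommandData = dir =>
  fs.readdirSync(dir)
    .filter(file => file.endsWith('.json'))
    .map(file => JSON.parse(fs.readFileSync(path.join(dir, file), 'utf8')));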

Files in subdirectory aren't found in Node.js

I need to somehow loop over subdirectories, but I get an error: ENOENT: no such file or directory, stat 'text3.txt'
Here are the files I use:
main.js
files
|_file1.txt
|_file2.txt
dir
|_text3.txt
Here is my main.js:
fs = require('fs'), aes = require('aes256'),
key = 'abc';
enc = file => {
  return aes.encrypt(key, file)
}
decr = encr => {
  return aes.decrypt(key, encr)
}
clf = dir => {
  files = fs.readdirSync(dir);
  // Filter files
  for (let i of files) {
    stat = fs.statSync(i)
    if (stat.isFile()) {
      newFiles.push(i)
    }
    else dirs.push(i)
  }
  // Encrypt folders
  for (let file of newFiles) {
    fl = fs.readFileSync(file).toString();
    fs.writeFileSync(file, enc(fl));
  }
}
clf('./')
for (let c of dirs) clf(c);
The decrypt and encrypt functions use aes256 encryption and return strings. The clf function then checks whether entries are files or folders and pushes the folders to an array. Then we encrypt the files in the main directory, but nothing happens in the subdirectories; it returns an error instead:
ENOENT: no such file or directory, stat 'text3.txt'
But text3.txt IS in the dir directory! Then why do I get this error?
First off, declare every single variable you use. Using undeclared variables like this is a recipe for disaster. I would not even attempt to work on code like this without first declaring every single variable in the proper scope using let or const.
Second, when you do fs.statSync(i), the i here is just a plain filename with no path. If you do console.log(i), you will see it is only a filename. So, to reference the right file, you have to add the path back onto it from your readdirSync(dir) call and then pass that full path to fs.statSync().
You will find path.join() a convenient way to combine a path with a filename.
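A minimal sketch of just the traversal part (the folder name is a placeholder), with every variable declared and the directory joined back onto each entry before statSync:
const fs = require('fs');
const path = require('path');

const walk = dir => {
  for (const name of fs.readdirSync(dir)) {
    const fullPath = path.join(dir, name); // full path, not just the bare name
    if (fs.statSync(fullPath).isFile()) {
      console.log('file:', fullPath);
    } else {
      walk(fullPath); // recurse into subdirectories
    }
  }
};

walk('./files');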

How to output files to the parent of their source directory using Webpack in Gulp?

So far I have this code, which I got from here:
var gulp = require('gulp');
var webpack = require('webpack-stream');
var named = require('vinyl-named');

gulp.task('default', function() {
  return gulp.src('*/lib/app.js', { base: '.' })
    .pipe(named())
    .pipe(webpack())
    .pipe(gulp.dest('.'));
});
My folder structure is like:
site1/lib/app.js
site2/lib/app.js
I want to create the output files like the following, with each file containing only their respective lib/app.js file's code (and any require()s made in them):
site1/app.js
site2/app.js
However, the code I have now just outputs to the project's root directory. I've tried several combinations, such as removing the { base: '.' }, but nothing works. If I remove the named() and webpack() pipes, though, the current code actually outputs to the correct directory. So it seems that somewhere in the process Webpack loses the originating directory information?
Also, is it possible to get a solution that also works with Webpack's "watch: true" option, so that compiling modified files is quick, rather than having Gulp iterate through every single file on every file change?
I assume you want to create an app.js for each site that packs only the code for that site (and not the others).
In that case you can use gulp-foreach to effectively iterate over all your app.js files and send each one down its own stream. Then you can use the built-in Node.js path module to figure out where the parent directory of each app.js file is and write the bundle there with gulp.dest().
var gulp = require('gulp');
var webpack = require('webpack-stream');
var named = require('vinyl-named');
var foreach = require('gulp-foreach');
var path = require('path');

gulp.task('default', function() {
  return gulp.src('*/lib/app.js')
    .pipe(foreach(function(stream, file) {
      var parentDir = path.dirname(path.dirname(file.path));
      return stream
        .pipe(named())
        .pipe(webpack())
        .pipe(gulp.dest(parentDir));
    }));
});
If you want to use webpack({watch: true}) you'll have to use a different approach. The following uses glob to iterate over all the app.js files. Each app.js file is again sent down its own stream; this time, however, all the streams are merged before being returned.
var gulp = require('gulp');
var webpack = require('webpack-stream');
var named = require('vinyl-named');
var path = require('path');
var merge = require('merge-stream');
var glob = require('glob');

gulp.task('default', function() {
  return merge.apply(null, glob.sync('*/lib/app.js').map(function(file) {
    var parentDir = path.dirname(path.dirname(file));
    return gulp.src(file)
      .pipe(named())
      .pipe(webpack({watch: true}))
      .pipe(gulp.dest(parentDir));
  }));
});

Gulp plugin "gulp-filter" not restoring files in the stream as intended

I have a gulp task that does a pretty simple job: it grabs all files in a folder, filters the HTML files, validates them, then restores the file stream and pushes every file type into the destination folder. This is the gulpfile:
// define gulp
var gulp = require('gulp');
// define plug-ins
var filter = require('gulp-filter');
var w3cjs = require('gulp-w3cjs');
var newer = require('gulp-newer');
// define paths
var src_path = 'src';
var dest_path = 'public';

// Copy all files from /src, validate html files, and push everything inside /public
gulp.task('files', function() {
  return gulp.src(src_path + '/*') // search for all files
    .pipe(newer(dest_path)) // if new go on, if old skip
    .pipe(filter('*.html')) // filter html files
    .pipe(w3cjs()) // validate filtered files
    .pipe(filter('*.html').restore()) // restore files to their pre-filter state
    .pipe(gulp.dest(dest_path)); // push to destination folder
});
It seems that the restore is not restoring the files; in fact only the HTML files end up in the production (/public) folder. What could be wrong? Thanks for any help.
Maybe assign the filter to a variable, e.g. var htmlFilter = filter('*.html'), and use that same instance to restore: .pipe(htmlFilter.restore()).
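A minimal sketch of that idea, assuming gulp-filter 3+ (where the filter is created with { restore: true } and the held-back files come back through its .restore stream):
var gulp = require('gulp');
var filter = require('gulp-filter');
var w3cjs = require('gulp-w3cjs');
var newer = require('gulp-newer');

gulp.task('files', function() {
  var htmlFilter = filter('*.html', { restore: true }); // keep a reference to the filter

  return gulp.src('src/*')
    .pipe(newer('public'))
    .pipe(htmlFilter)          // only HTML files continue down the stream
    .pipe(w3cjs())             // validate them
    .pipe(htmlFilter.restore)  // bring the non-HTML files back
    .pipe(gulp.dest('public'));
});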
