I have a bunch of text files (I'm calling them 'index files' here) in a directory, each containing a list of files separated by newlines.
In my NodeJS script I then want to iterate over these index files and make a call to tar, using the index file as input via the -T argument. For this I'm using spawnSync.
What should happen is that tar then archives all of the files listed in the index file.
Instead, what is happening is that I get a completely empty archive and no output.
Here is the relevant part of my script:
console.log("Processing index files");
process.chdir(sourcePath);
for(const key in indexFiles) {
let index = indexFiles[key];
console.log("Processing: "+index);
let commandLine = "tar acf "+outputPath+'/'+key+".tar.bz2 -T "+index;
console.log(process.cwd());
console.log(commandLine);
let tar = spawnSync("tar", ["acf", outputPath+"/"+key+".tar.bz2", "-T", index], {cwd:sourcePath, stdio:"inherit"});
}
In this script, sourcePath is the location of the files listed within the index file. I'm setting that as the CWD since that is how tar would work when I call it from the command line.
What's odd is that, as you can see, I am logging both the sourcePath and the equivalent command line for my spawnSync call. The output looks like this:
Processing index files
/media/chmo/NewLinux/scan/2022-03-22/temp
Processing: /media/chmo/NewLinux/wrangled/d01d36c8-698a-4791-9075-73fa4c0af881_0.txt
/media/chmo/NewLinux/scan/2022-03-22/temp
tar acf /media/chmo/NewLinux/wrangled/0.tar.bz2 -T /media/chmo/NewLinux/wrangled/d01d36c8-698a-4791-9075-73fa4c0af881_0.txt
I can literally take the second-to-last line, which should be the CWD for my call to spawnSync, cd into that directory, and then run the command as it appears in the last line, and that works perfectly. Yet when I do what should be the exact same operation from within NodeJS, it simply creates an empty tar file.
What's up with that? Not sure how else to explain it. It seems like something very basic that just isn't working, so I'm hoping that I've just done something wrong and someone can point out what that is.
You don't need to change the CWD like that; just pass it as the third argument to spawnSync:
spawnSync('tar', [args], { cwd: sourcePath })
I recommend using execSync over spawnSync in your situation, since you don't need any info other than the output of the tar command (which is empty on success).
I made a little script that is close to your requirement; hope this helps.
const fs = require('fs/promises')
const { execSync } = require('child_process')
const path = require('path')

const [, , indexfilePath] = process.argv

async function main() {
    const indexFilesDir = path.resolve(indexfilePath)
    const files = await fs.readdir(indexFilesDir, { withFileTypes: true })
    for (const f of files) {
        if (!f.isFile()) continue
        console.log('processing index file', f.name)
        const output = execSync(`tar -acf /tmp/node-${f.name}.tar.bz2 -T ${f.name}`, {
            cwd: indexFilesDir,
            // encoding: 'utf8',
        })
        console.log(`------> ${f.name} <------`)
        process.stdout.write(output)
        console.log(`------> end <------`)
    }
}

main()
    .then(() => {
        console.log('completed main')
    })
    .catch(console.error)
The above script tars each index file found in the given directory and saves the resulting tar files in the /tmp directory.
$ tree ./
.
├── cli.js
└── test
    ├── contents
    │   ├── en.txt
    │   └── fr.txt
    └── index-1.txt
File content of the index-1.txt index file:
$ cat test/index-1.txt
./contents/en.txt
./contents/fr.txt
Usage:
$ node cli.js ./test/
processing index file index-1.txt
------> index-1.txt <------
------> end <------
completed main
$ ls /tmp/
node-index-1.txt.tar.bz2
$ tar --list -f /tmp/node-index-1.txt.tar.bz2
./contents/en.txt
./contents/fr.txt
Note: if you are using this inside a web server or similar, use either exec or spawn wrapped in a Promise, because sync operations block Node.js's event loop.
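For reference, here is a minimal sketch of that Promise wrap using util.promisify (Node's child_process.exec has a built-in promisified form that resolves to { stdout, stderr }). The archive helper name and its arguments are just for illustration:

const { promisify } = require('util')
const exec = promisify(require('child_process').exec)

// Same tar invocation as above, but non-blocking: the await yields
// to the event loop until the command exits.
async function archive(indexFilesDir, name) {
    const { stdout } = await exec(`tar -acf /tmp/node-${name}.tar.bz2 -T ${name}`, {
        cwd: indexFilesDir,
    })
    process.stdout.write(stdout)
}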
Related
I'm creating a guide which will be a source of truth. I've got all my modules in one folder and all my page templates in another, and I'm creating a page which will display the modules and templates. Whenever a new module or template is created, it needs to be added straight to this page. I need a gulp task to automatically add the module to a JSON file which I then pull from.
1) Create a gulp task that pulls files from src and populates a JSON file ("modules-guide.json") with the following details as an array:
- Name (taken from the file name, removing the dashes and replacing them with spaces)
- File name (same as the file name minus the extension)
- Id (same as the file name minus the extension)
2) Pull information from modules-guide.json to populate an HTML file.
I've tried creating a gulp task which pulls the modules and outputs them into modules-guide.json.
The file structure:
//templates
+-- index.ejs
+-- aboutUs.ejs
+-- modules
| +-- header.ejs
+-- components
| +-- heading.ejs
| +-- buttons.ejs
import path from 'path'
import gulp from 'gulp'

const jsonGuideData = './src/content/modules-guide.json';

gulp.task('module-guide', function(){
    gulp.src(path.join(src, 'templates/modules/**'))
        .pipe(gulp.dest(path.join(src, 'content/modules-guide.json')));
});
I expect the output to be a page with modules that are automatically created when we create a new file. We don't want to manually add the files to the guide.
Create gulp task that pulls files from src and populates JSON file
The solution you proposed just copies the files from the source folder to the destination one.
As I see it, your task should consist of three stages:
reading files from a directory
parsing their names into a correct format
saving to JSON and storing it on disk
It can be done as follows:
// Importing required libraries
const gulp = require('gulp');
const glob = require('glob');
const fs = require('fs');
const path = require('path');

// Setting up constants
const ORIGINAL_JSON = './modules-guide.json';
const DIR_TO_READ = './**/*';

// Task
gulp.task('module-guide', function (cb) {
    // Getting the list of all the files
    const fileArray = glob.sync(DIR_TO_READ);

    // Mapping the files into your structure
    const fileStruct = fileArray.map(file => {
        // Skipping directories if any
        if (fs.lstatSync(file).isDirectory()) {
            return null;
        }
        const fileName = path.basename(file, path.extname(file));
        // Replace every dash, not just the first one
        const name = fileName.replace(/-/g, ' ');
        const id = fileName;
        return {
            name, fileName, id
        };
    });

    // Removing `nulls` (previously they were directories)
    const jsonGuideData = { files: fileStruct.filter(file => !!file) };

    // Storing results to JSON
    fs.writeFile(ORIGINAL_JSON, JSON.stringify(jsonGuideData), cb);
});
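Run it with gulp module-guide. With the template tree from the question and DIR_TO_READ pointed at the templates directory, the generated modules-guide.json would look roughly like this (a sketch, pretty-printed here; JSON.stringify without extra arguments emits it on one line, the file order depends on the glob, and since none of these example names contain dashes, name and id simply mirror fileName):

{
    "files": [
        { "name": "index", "fileName": "index", "id": "index" },
        { "name": "aboutUs", "fileName": "aboutUs", "id": "aboutUs" },
        { "name": "header", "fileName": "header", "id": "header" },
        { "name": "heading", "fileName": "heading", "id": "heading" },
        { "name": "buttons", "fileName": "buttons", "id": "buttons" }
    ]
}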
I'd like to run eslint on modified files only. I've created a new run target in package.json to run eslint from command line (git diff --name-only --relative | grep -E '.*\\.(vue|js)$' | xargs eslint --ext .js,.vue). In theory, this should work fine, but there's a little transformation step happening in my project (a string replacement) when bundling the files with webpack that will throw off eslint (some non-standard markup will be expanded to JS).
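For reference, such a run target might look like this in package.json (the lint:changed name is just an example; the command is the one quoted above):

{
    "scripts": {
        "lint:changed": "git diff --name-only --relative | grep -E '.*\\.(vue|js)$' | xargs eslint --ext .js,.vue"
    }
}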
What are my options and how would I go about implementing them? For instance, could I execute a particular webpack rule/loader and pipe the result to eslint? Another option I see is to include eslint into the webpack rule/loader process (instead of executing it from the command line), but how would I then filter on files that are currently modified (could this be handled by a temporary file that contains the git diff... result?)
I've got a somewhat working approach. I chose to modify webpack.base.conf.js instead of going for the command line solution to make use of the already existing string replacement loader.
The files are collected in the WebpackBeforeBuildPlugin callback function and, instead of a regex-based test variable, a function is used that checks against the previously collected files.
const exec = require('child_process').exec;
const WebpackBeforeBuildPlugin = require('before-build-webpack');

var modFilesList = new Set([]);
const srcPath = resolve('.');

...

rules: [{
    test: function(filename) {
        let relFilename = path.relative(srcPath, filename);
        let lint = modFilesList.has(relFilename);
        return lint;
    },
    loader: 'eslint-loader',
    include: resolve('src'),
    exclude: /node_modules/,
    options: {
        formatter: require('eslint-friendly-formatter'),
        cache: false
    }
}, {
    ... other string replacement loader ...
}]

plugins: [
    ...
    new WebpackBeforeBuildPlugin(function(stats, callback) {
        // Collect changed files before building.
        let gitCmd = 'git diff --name-only --relative | grep -E ".*\\.(vue|js)$"';
        const proc = exec(gitCmd, (error, stdout, stderr) => {
            if (stdout) {
                let files = stdout.split('\n');
                modFilesList = new Set(files);
            }
            if (error !== null) {
                console.log(`exec error: ${error}`);
            }
        });
        callback();
    })
]
The only problem at the moment is that git file changes don't trigger re-linting (i.e. a new file is changed, or changes made to a file before starting webpack-dev-server are discarded). I checked everything I could. The change is registered and stored in modFilesList, and the test function is executed and returns true (for a new change in a previously unchanged file) or false (in case the change was discarded). I also played with the cache option, to no avail. It seems that at initial load, eslint-loader caches the files it will lint in future (I don't know if that's a result of using a test function instead of a regex, or also the case with a regex). Does anyone have an idea, or has anyone seen this before (eslint-loader not updating the file list)?
Update
This seems to be a problem with webpack (or one of the other loaders), as the eslint-loader isn't even executed when the file changes. The test function, however, is executed, which is a bit weird. I don't fully understand how loaders work or how they play together, so there might be some other loader causing this...
I have written a file that needs to execute before index.js, since it uses commander to require the user to pass information to the index file. I have it placed in a bin directory, but I'm not sure how to make it run. I can cd into the directory, run node <file_name>, and pass it the values needed, and it runs fine (I export the index, import it into the file, and call it at the end), but is there not a way to add it to package.json so it can be run with an easier command?
Executable:
#!/usr/bin/env node
const program = require('commander');
const index = require('../src/index.js');

program
    .version('0.0.1')
    .option('-k, --key <key>')
    .option('-s, --secret <secret>')
    .option('-i, --id <id>')
    .parse(process.argv);

let key = program.key;
let secret = program.secret;
let publicId = program.id;

index(key, secret, publicId);
When a Node.js script is supposed to run as an executable, it's specified via the package.json bin option:
To use this, supply a bin field in your package.json which is a map of command name to local file name. On install, npm will symlink that file into prefix/bin for global installs, or ./node_modules/.bin/ for local installs.
It can be located in src or elsewhere:
{
    ...
    "bin": { "foo": "src/bin.js" },
    ...
}
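Once installed, npm creates the symlink described above, so the script can be invoked by name instead of with node. A hypothetical session (foo and the flags mirror the examples above; npm link is the usual way to get the global symlink while developing):

$ npm link
$ foo --key myKey --secret mySecret --id myId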
I have written a function in one file and want to call it from another file. Both files are in the same folder.
.
├── member.js
├── universal.js
member.js
function getRandomNo(min, max) {
    return Math.random() * (max - min) + min;
}

module.exports.getRandomNo = getRandomNo;
I'm accessing it in universal.js as:

const modelUniversal = require('./member.js');

// somewhere inside a function
modelUniversal.getRandomNo(a, b);
But I am not able to call the getRandomNo function when I run the script using nodemon or pm2.
When I run it using node universal.js, it works. Why?
How do I get rid of this problem?
node -v
v8.9.1
npm -v
5.5.1
I would like to be able to include files in a given order when compiling my CoffeeScript files into JS with coffeebar.
I would like to have the files settings.coffee and constants.coffee included first:
--
|-- settings.coffee
|-- constants.coffee
|-- page1.coffee
|-- page2.coffee
Code Snippet
fs = require 'fs'
{exec, spawn} = require 'child_process'
util = require 'util'

task 'watch', 'Coffee bar Combine and build', ->
    coffee = spawn 'coffeebar', ['-w', '-o', './../js/main/kc.js', './']
    coffee.stdout.on 'data', (data) ->
        console.log data.toString().trim()
    invoke 'minify'

task 'minify', ' Minify JS File', ->
    file = "./../js/main/kc"
    util.log "Minifying #{file}.js"
    exec "uglifyjs #{file}.js > #{file}.min.js", (err, stdout, stderr) ->
        if err
            util.log "Error minifying file"
            util.log err
        else
            util.log "Minified to #{file}.min.js"
            util.log '----------------------------'
For now, the script just compiles everything together according to its own logic.
I would appreciate any help on this.
It seems like you have three potential solutions, though none of them is particularly elegant:
I'm not sure, but try setting the inputPaths argument of coffeebar(inputPaths, [options]) as an explicit array of paths with file names, ordering the array elements as you need (see the sketch after this list)
try renaming the files with numeric prefixes like 01_settings.coffee and so on, in the order you need, so coffeebar will process them in that order
you can use an extra plugin, like rigger, to include all the files you need in the desired sequence in one root file, and process that file with coffeebar
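For the first option, a minimal sketch of what that might look like, run from the directory containing the .coffee files. This assumes coffeebar's programmatic coffeebar(inputPaths, [options]) API accepts an array and preserves its order, and that an output option corresponds to the -o CLI flag; I haven't verified either, so check against coffeebar's docs:

var coffeebar = require('coffeebar');

// Explicitly ordered inputs: settings and constants first.
coffeebar([
    './settings.coffee',
    './constants.coffee',
    './page1.coffee',
    './page2.coffee'
], { output: './../js/main/kc.js' });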