Running an executable before index.js

I have written a file that needs to execute before index.js, since it uses commander to require the user to pass information to the index file. I have it placed in a bin directory, but I'm not sure how to make it run. I can cd into the directory, run node <file_name>, and pass it the values needed, and it runs fine (as I export index, import it into the file, and call it at the end), but is there not a way to add it to package.json so it can be run with an easier command?
Executable:
#!/usr/bin/env node
const program = require('commander');
const index = require('../src/index.js');

program
  .version('0.0.1')
  .option('-k, --key <key>')
  .option('-s, --secret <secret>')
  .option('-i, --id <id>')
  .parse(process.argv);

const key = program.key;
const secret = program.secret;
const publicId = program.id;

index(key, secret, publicId);

When a Node.js script is supposed to run as an executable, it's specified via the package.json bin field:
To use this, supply a bin field in your package.json which is a map of command name to local file name. On install, npm will symlink that file into prefix/bin for global installs, or ./node_modules/.bin/ for local installs.
It can be located in src or elsewhere:
{
  ...
  "bin": { "foo": "src/bin.js" },
  ...
}
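For example, a package.json along these lines wires the question's script up as a command (the package and command names here are illustrative):

```json
{
  "name": "my-tool",
  "bin": { "my-tool": "bin/cli.js" }
}
```

After `npm link` during development (or a global `npm install -g`), `my-tool -k <key> -s <secret> -i <id>` runs the bin script directly; for a local install, `npx my-tool ...` or an npm script will resolve it from `node_modules/.bin`.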

Related

Spawning tar from nodejs with input file

I have a bunch of text files (I'm calling them 'index files' here) in a directory each containing a list of files separated by newlines.
In my NodeJS script I then want to iterate over these index files and make a call to tar using the index file as an input via the -T argument. For this I'm using spawnSync
What should happen is that tar then archives all of the files listed in the index file.
Instead what is happening is I get a completely empty archive, and no output.
Here is the relevant part of my script:
console.log("Processing index files");
process.chdir(sourcePath);
for (const key in indexFiles) {
  let index = indexFiles[key];
  console.log("Processing: " + index);
  let commandLine = "tar acf " + outputPath + "/" + key + ".tar.bz2 -T " + index;
  console.log(process.cwd());
  console.log(commandLine);
  let tar = spawnSync("tar", ["acf", outputPath + "/" + key + ".tar.bz2", "-T", index], { cwd: sourcePath, stdio: "inherit" });
}
In this script sourcePath is the location of the files listed within the index file. I'm setting that as the CWD since that is the way tar would work when I call it from the command line.
What's odd is that, as you can see, I am logging out both the sourcePath and the equivalent command line for my spawnSync call. So the output looks like this.
Processing index files
/media/chmo/NewLinux/scan/2022-03-22/temp
Processing: /media/chmo/NewLinux/wrangled/d01d36c8-698a-4791-9075-73fa4c0af881_0.txt
/media/chmo/NewLinux/scan/2022-03-22/temp
tar acf /media/chmo/NewLinux/wrangled/0.tar.bz2 -T /media/chmo/NewLinux/wrangled/d01d36c8-698a-4791-9075-73fa4c0af881_0.txt
I can literally take the second from last line, which should be the CWD for my call to spawnSync, cd into that directory, and then run the command as it appears in the last line, and that works perfectly. Yet when I do what should be the exact same operation but called from within NodeJS it simply creates an empty tar file.
What's up with that? Not sure how else to explain it. It seems like something very basic that just isn't working, so I'm hoping that I've just done something wrong and someone can point out what that is.
You don't need to change the cwd with process.chdir like that; just pass it in the options (third) argument of spawnSync:
spawnSync('tar', [args], { cwd: sourcePath })
I recommend using execSync over spawnSync in your situation, since you don't need any info other than the output of the tar command (which is empty on success).
I made a little script that is close to your requirement; hope this helps you.
const fs = require('fs/promises')
const { execSync } = require('child_process')
const path = require('path')

const [, , indexfilePath] = process.argv

async function main() {
  const indexFilesDir = path.resolve(indexfilePath)
  const files = await fs.readdir(indexFilesDir, { withFileTypes: true })
  for (const f of files) {
    if (!f.isFile()) continue
    console.log('processing index file', f.name)
    const output = execSync(`tar -acf /tmp/node-${f.name}.tar.bz2 -T ${f.name}`, {
      cwd: indexFilesDir,
      // encoding: 'utf8',
    })
    console.log(`------> ${f.name} <------`)
    process.stdout.write(output)
    console.log(`------> end <------`)
  }
}

main()
.then(() => {
console.log('completed main')
})
.catch(console.error)
The above script tars each index file in the given directory and saves the archive in the /tmp directory.
$ tree ./
.
├── cli.js
└── test
├── contents
│   ├── en.txt
│   └── fr.txt
└── index-1.txt
File content of the index-1.txt index file:
$ cat test/index-1.txt
./contents/en.txt
./contents/fr.txt
Usage:
$ node cli.js ./test/
processing index file index-1.txt
------> index-1.txt <------
------> end <------
completed main
$ ls /tmp/
node-index-1.txt.tar.bz2
$ tar --list -f /tmp/node-index-1.txt.tar.bz2
./contents/en.txt
./contents/fr.txt
Note: if you are using this inside a web server or similar, use either exec or spawn wrapped in a Promise, because sync operations block Node.js's event loop.

Load a variable from dotenv file when starting PM2

I am starting instances of my app as a package.json script with PM2 this way:
"start:pm2": "pm2 start -i max node myapp.js"
I found out that not all members of the team always want to use max as the value for the number of instances while developing; some prefer a lower value.
To avoid changing package.json, I would rather let them change the value in the .env file, since we already use it, so that the value from it is passed as the parameter to pm2.
I know I can create a wrapper js or bash script to load the variable from the .env file and pass it to pm2, but it would be better to have a solution without one.
How can I achieve this?
You can create an ecosystem.config.js file and declare your environment variables under the env attribute; in your case, NODE_APP_INSTANCE can be used to set the number of instances:
module.exports = {
  apps: [{
    name: "MyApp",
    script: "./myapp.js",
    env: {
      NODE_ENV: "development",
      NODE_APP_INSTANCE: "max"
    },
    env_production: {
      NODE_ENV: "production",
    }
  }]
}
Then call pm2 start, or pm2 start /path/to/ecosystem.config.js to load an ecosystem from another folder.
A better pattern here is to remove dotenv from your code and "require" it on the command line. This makes your code nicely transportable between any environment (including cloud-based) - which is one of the main features of environment variables.
a) code up your .env file alongside your script (e.g. app.js)
b) to run your script without pm2:
node -r dotenv/config app.js
c) in pm2.config.js:
module.exports = {
  apps: [{
    name: 'My Application',
    script: 'app.js',
    node_args: '-r dotenv/config',
    ...
  }],
}
and then
pm2 start pm2.config.js
Note: the use of dotenv/config on the command line is one of the best practices recommended by dotenv themselves

npm global packages: Reference content files from package

I'm in the process of building an npm package which will be installed globally. Is it possible to have non-code files installed alongside code files that can be referenced from code files?
For example, if my package includes someTextFile.txt and a module.js file (and my package.json includes "bin": {"someCommand":"./module.js"}) can I read the contents of someTextFile.txt into memory in module.js? How would I do that?
The following is an example of a module that loads the contents of a file (string) into the global scope.
core.js : the main module file (entry point of package.json)
//:Understanding: module.exports
module.exports = {
  reload: (cb) => {
    console.log("[>] Magick reloading to memory");
    ReadSpellBook(cb);
  }
}

//:Understanding: global object
// The following function is only accessible inside the magick module.
const ReadSpellBook = (cb) => {
  require('fs').readFile(__dirname + "/spellBook.txt", "utf8", (e, theSpells) => {
    if (e) {
      console.log("[!] The Spell Book is MISSING!\n");
      cb(e);
    } else {
      console.log("[*] Reading Spell Book");
      // since we want to make the contents of the .txt accessible:
      global.SpellBook = theSpells; // global.SpellBook is now shared across all the code (global scope)
      cb(); // callback
    }
  });
}

//·: Initialize :.
console.log("[+] Time for some Magick!");
ReadSpellBook((e) => e ? console.log(e) : console.log(SpellBook));
spellBook.txt
ᚠ ᚡ ᚢ ᚣ ᚤ ᚥ ᚦ ᚧ ᚨ ᚩ ᚪ ᚫ ᚬ ᚭ ᚮ ᚯ
ᚰ ᚱ ᚲ ᚳ ᚴ ᚵ ᚶ ᚷ ᚸ ᚹ ᚺ ᚻ ᚼ ᚽ ᚾ ᚿ
ᛀ ᛁ ᛂ ᛃ ᛄ ᛅ ᛆ ᛇ ᛈ ᛉ ᛊ ᛋ ᛌ ᛍ ᛎ ᛏ
ᛐ ᛑ ᛒ ᛓ ᛔ ᛕ ᛖ ᛗ ᛘ ᛙ ᛚ ᛛ ᛜ ᛝ ᛞ ᛟ
ᛠ ᛡ ᛢ ᛣ ᛤ ᛥ ᛦ ᛧ ᛨ ᛩ ᛪ ᛫ ᛬ ᛭ ᛮ ᛯ
If you require it from another piece of code, you will see how it prints to the console and initializes by itself.
If you want to achieve a manual initialization, simply remove the last 3 lines (·: Initialize :.) and use reload():
const magick = require("./core.js");
magick.reload((error) => {
  if (error) { throw error }
  else {
    // now you know the SpellBook is loaded
    console.log(SpellBook.length);
  }
});
I have built some CLIs which were distributed privately, so I believe I can illuminate a bit here.
Let's say your global modules are installed in a directory, call it $PATH. When your package is installed on any machine, it is essentially extracted into that directory.
When you fire up someCommand from any terminal, the module.js kept at $PATH is invoked. If you initially kept the template file in the same directory as your package, it will be present at that location, local to module.js.
Assuming you edit the template as a string and then want to write it where the user wishes / pwd, you just have to use process.cwd() to get the path to that directory. This totally depends on how you code it out.
In case you want to explicitly include only certain files in the npm package, use the files attribute of package.json.
To particularly answer "how can my code file in the npm package locate the path to the globally installed npm folder in which it is located, in a way that is guaranteed to work across OSes and is future proof?": that is very different from the template thing you were trying to achieve. What you're simply asking here is the global path of npm modules. As a fail-safe option, use the path returned by require.main.filename within your code and keep that as a reference.
When you npm publish, it packages everything in the folder, excluding things noted in .npmignore. (If you don't have an .npmignore file, it'll dig into .gitignore. See https://docs.npmjs.com/misc/developers#keeping-files-out-of-your-package) So in short, yes, you can package the text file into your module. Installing the module (locally or globally) will get the text file into place in a way you expect.
How do you find the text file once it's installed? __dirname gives you the path of the current file ... if you ask early enough. See https://nodejs.org/docs/latest/api/globals.html#globals_dirname (If you use __dirname inside a closure, it may be the path of the enclosing function.) For the near-term of "future", this doesn't look like it'll change, and will work as expected in all conditions -- whether the module is installed locally or globally, and whether others depend on the module or it's a direct install.
So let's assume the text file is in the same directory as the currently running script:
var fs = require('fs');
var path = require('path');
var dir = __dirname; // capture early, per the note above

function runIt(cb) {
  var fullPath = path.join(__dirname, 'myfile.txt'); // path.join, not path.combine
  fs.readFile(fullPath, 'utf8', function (e, content) {
    if (e) {
      return cb(e);
    }
    // content now has the contents of the file
    cb(null, content); // error-first callback convention
  });
}

module.exports = runIt;
Sweet!

How do I make webpack not convert process.env variables to their values during build?

I have the following in one of my project files:
const baas = process.env.DBID;
console.log('baas', baas);
If I run:
cross-env PORT=4000 NODE_ENV=production WEBPACK_CONFIG=browser_prod,server_prod webpack --colors
My server.js file looks like:
const baas = undefined;
console.log('baas', baas);
As expected. However, I want to be able to set the ID when I run the built app, not when I build it, i.e.:
DBID=someotherid node dist/server.js
So I need webpack to not convert const baas = process.env.DBID to its value at build time, but rather leave it as is, so that server.js uses its value at runtime.
How do I do this?
Note: if I manually edit the built server.js and change undefined to process.env.DBID then the run script works and the app uses the env var from run time, but I don't want to edit files after building.
You are using the wrong target.
By default, webpack builds the application to run in the browser. This means it will mock native node modules like path, fs, and process.
Your target is node, so there is no need to mock these.
Add this to your webpack.config.js
module.exports = {
  target: 'node'
};
https://webpack.js.org/concepts/targets/#usage
What you need is process.argv, not process.env. Note that the first user-supplied argument is process.argv[2], since argv[0] is the node binary and argv[1] is the script path:
// server.js
const baas = process.argv[2];
console.log('baas', baas);
Then:
node dist/server.js baas_value
For convenience, you can use this module https://www.npmjs.com/package/yargs
I was able to prevent Webpack from converting process.env by accessing it indirectly like this:
const processText = "process";
const _process = global[processText];
app.listen(_process.env.PORT || 2000);
You need to get process indirectly, rather than just env, because the process variable itself is redefined by webpack to something like /* provided dependency */ var process = __webpack_require__(/*! process/browser */ "process/browser");

How do I get Bower to install a file to a specified path and name?

I have the following bower.json:
{
  "name": "myname",
  "dependencies": {
    "stripe": "https://js.stripe.com/v2/"
  }
}
This grabs the javascript at the associated url and creates the following file:
/bower_components/stripe/index
Note that the file is not index.js, but simply index. This is problematic, as my Brocfile refuses to use the index file, insisting that it has to be index.js. If I manually change the name to index.js, then the application works fine. Obviously, this isn't a satisfactory solution.
So is there a way to get bower to install the file as index.js rather than index?
If you need to set a different folder for bower you can create a .bowerrc file with the following:
{
  "directory": "public/bower"
}
I'm not exactly sure of your environment, but, for example, if you have Node.js you can create a gulp setup which does the rename before whatever other processes you need to run.
A quasi-example gulpfile.js:
var gulp = require('gulp');
var rename = require('gulp-rename');

gulp.task('prep', function () {
  return gulp.src('public/bower/stripe/index', { base: 'public/bower/stripe' })
    .pipe(rename('index.js'))                // no semicolon here, or the chain breaks
    .pipe(gulp.dest('public/bower/stripe')); // write index.js back into the stripe folder
});
