npm CLI global variable - JavaScript

I am developing a CLI, and it is all based on a UID that I have to store somehow. What is the most viable solution? I have tried using fs, but the file was created in the path from which the command is run.
#!/usr/bin/env node
const program = require("commander");
const { saveUid } = require("./commands");

program
  .command('setuid <uid>')
  .alias('b')
  .description('Set the uid of the album.')
  .action(uid => {
    saveUid(uid);
  });

program.parse(process.argv);
So, any ideas for the saveUid function?
const saveUid = (uid) => {
};

module.exports = {
  saveUid
};

You can use the mkdirp module on npm to create a folder in any directory. Once the directory exists, you can use fs to create a file in it.
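For example, here is a minimal sketch of what saveUid could look like, assuming you store the UID in a JSON file under the user's home directory (the ~/.mycli folder and config.json file name are arbitrary choices for illustration):

const fs = require('fs');
const os = require('os');
const path = require('path');

// Hypothetical location: ~/.mycli/config.json -- rename to match your CLI.
const configDir = path.join(os.homedir(), '.mycli');
const configFile = path.join(configDir, 'config.json');

const saveUid = (uid) => {
  // fs.mkdirSync with { recursive: true } (Node >= 10.12) can replace mkdirp.
  fs.mkdirSync(configDir, { recursive: true });
  fs.writeFileSync(configFile, JSON.stringify({ uid }), 'utf8');
};

const loadUid = () => {
  // Returns undefined until setuid has been run once.
  if (!fs.existsSync(configFile)) return undefined;
  return JSON.parse(fs.readFileSync(configFile, 'utf8')).uid;
};

module.exports = { saveUid, loadUid };

The key point is that the path is anchored to os.homedir() rather than the current working directory, so the same file is found no matter where the command is run.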

Related

How should I load package.json for a CLI app installed globally?

I created a CLI application which reads its version number from package.json with this bit of code:
const packageJson = JSON.parse(fs.readFileSync(path.resolve('./package.json'), 'utf8'))
This works fine if I run the app with yarn start or a similar command during development. But after the package is installed with npm install --global app-name, the user should be able to run the declared executable from any path on the system. So if I want to run it, say, in /Users/myUser/Desktop, I get an error like this:
Error: ENOENT: no such file or directory, open '/Users/myUser/Desktop/package.json'
So what's a good way of loading this package.json within my CLI, or is there a better way to approach this?
Later edit:
For clarity, my package.json contains this:
{
  ...
  "bin": {
    "clip": "./bin/clip.js"
  },
  ...
}
So what I mean by my problem is: I am running the executable "clip" from a different path, after having published it with npm publish.
After some research I tried this code (using the path.dirname function):
import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';

const __filename = fileURLToPath(import.meta.url)
const __dirname = path.dirname(__filename)

export const packageJsonLocation = path.join(__dirname, './../package.json')
const packageJson = JSON.parse(fs.readFileSync(packageJsonLocation, 'utf8'))
and this (just importing the file as JSON using an import assertion):
import * as packageJson from './../package.json' assert { type: 'json' }
In both cases I get the same result: the generated executable tries to read package.json from the current directory. Specifically, if I console.log() the path, I get the current path from which I am executing the global executable (clip in my case).
Use __dirname, because it always refers to the directory of the file that contains it, whereas ./ resolves to the current working directory, i.e. process.cwd().
const fs = require('fs');
const path = require('path');

const packageJson = JSON.parse(fs.readFileSync(
  path.join(__dirname, 'package.json'), 'utf8')
)
If you're using ES Modules, do this as well to get __dirname:
import fs from 'fs';
import path, { dirname } from 'path';
import { fileURLToPath } from 'url';

const __dirname = dirname(fileURLToPath(import.meta.url));

const packageJson = JSON.parse(fs.readFileSync(
  path.join(__dirname, 'package.json'), 'utf8')
)
Edit:
You installed the package globally with a bin, but the bin you're calling from the CLI is a symlink that lives in <npm_glob_path>/node_modules/bin, not in <npm_glob_path>/node_modules/app-name/bin. The package.json of your app is inside <npm_glob_path>/node_modules/app-name. And don't build paths with ./; always use path calls.
Hence, try this instead (replace app-name with your app's name):
import fs from 'fs';
import path, { dirname } from 'path';
import { fileURLToPath } from 'url';

const __dirname = dirname(fileURLToPath(import.meta.url))
console.log('__dirname: ' + __dirname) // TELL ME WHAT YOU SEE HERE WHEN YOU RUN THE CLI CMD

const packageJsonLocation = path.join(__dirname, '..', 'app-name', 'package.json')
const packageJson = JSON.parse(fs.readFileSync(packageJsonLocation, 'utf8'))
And please add console.log('__dirname: ' + __dirname) after defining __dirname. Which path do you see when you run the CLI app?
is there a better way to approach this?
Yes - you should store the version number in the actual package itself somewhere. This way it will always be available/accessible, and there's no risk of the package.json version and the installed version getting out of sync. For example, if someone adds your package to a project and runs yarn install, but later uses git pull to get an up-to-date version of their local files which happens to include a version bump for your package, there is a window where package.json has a different version number from the installed version.
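A minimal sketch of that approach, assuming the version lives in a small module that you bump as part of your release process (the file names and constant are illustrative):

// version.js -- hypothetical module shipped inside the package itself;
// bump it as part of your release process (e.g. alongside `npm version`).
export const VERSION = '1.2.3';

// bin/clip.js -- the CLI reads the version from the module, not from package.json,
// so it resolves the same way no matter where the executable is invoked from.
import { VERSION } from '../version.js';
console.log(`clip v${VERSION}`);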

How to assign variables in .env file?

Q1.
I have a .env file in my ReactJS app like this:
API_1_ROOT='http://my-api-1.com'
API_2_ROOT='http://my-api-2.com'
BASE_API=API_1_ROOT // This doesn't work as expected
I want to assign one of these API roots to my base API root. I tried doing this in my .env file, but it doesn't work as expected.
How can I do this in my .env file?
Q2.
Also, I am not able to destructure multiple items from process.env like this:
const { API_1_ROOT, API_2_ROOT } = process.env;
When I do this, I get the following error:
Uncaught ReferenceError: process is not defined
I have to do this to get both variables:
const { API_1_ROOT } = process.env;
const { API_2_ROOT } = process.env;
npm install dotenv --save
Next, add the following line to your app:
require('dotenv').config()
Then create a .env file in the root directory of your application and add the variables to it.
.env
# contents of .env
REACT_APP_API_1_ROOT='my-secret-api-key'
REACT_APP_API_2_ROOT='my-secret-api-key'
config.js
require('dotenv').config()

const config = {
  api1: process.env.REACT_APP_API_1_ROOT,
  api2: process.env.REACT_APP_API_2_ROOT,
}

export default config
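A minimal usage sketch, assuming a module that consumes the config object defined above (the api.js file name and the endpoint are illustrative):

// api.js -- hypothetical consumer of config.js
import config from './config';

// Pick whichever root should act as the base API at runtime.
const BASE_API = config.api1;

fetch(`${BASE_API}/albums`)
  .then(res => res.json())
  .then(data => console.log(data));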

No such file or directory when exporting function from another file

src/test.js
module.exports.test = function() {
  const { readFileSync } = require('fs');
  console.log(readFileSync('test.txt', 'utf8').toString())
}
index.js
const { test } = require('./src/test.js');
test();
Which results in No such file or directory. Does module.exports or exports not work when requiring files in another directory?
When you do something like this:
readFileSync('test.txt', 'utf8')
that attempts to read test.txt from the current working directory. The current working directory is determined by how the main program was started and what the working directory was when it was launched. It has nothing at all to do with the directory your src/test.js module is in.
So, if test.txt is inside the same directory as your src/test.js and you want to read it from there, then you need to manually build a path that references your module's directory. To do that, you can use __dirname which is a special variable set for each module that points to the directory the module is in.
In this case, you can do this:
const path = require('path');

module.exports.test = function() {
  const { readFileSync } = require('fs');
  console.log(readFileSync(path.join(__dirname, 'test.txt'), 'utf8').toString())
}
And that will reliably read test.txt from your module's directory.
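A small sketch to make the difference concrete (the where.js file name is hypothetical; run it from different directories and watch the first value change while the second stays fixed):

// src/where.js -- demo of cwd-relative vs module-relative paths
const path = require('path');

console.log('process.cwd():', process.cwd());  // depends on where node was launched
console.log('__dirname:    ', __dirname);      // always the directory containing where.js
console.log('stable path:  ', path.join(__dirname, 'test.txt'));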

Simple and elegant way to override a local file if it exists?

Looking for an elegant and simple solution to have "local configuration override" files.
The idea is to be able to have local configuration that will not ask to be added to the git repository every time.
For that I need to include config.local.js if it exists.
I have the global app configuration in config.js, with configuration like:
export const config = {
  API_URL: "https://some.host",
}
and config.local.js
export const config = {
  API_URL: "https://other.address",
}
and there's a .gitignore entry:
config.local.js
Difficulty:
I do not want to add a node module to the project just for this one thing. I believe there should be an elegant way to do this in one or a few lines, but I have not found any so far.
Things that I tried:
1.
try {
  const {
    apiUrl: API_URL,
  } = require('./config.local.js');
  config.API_URL = apiUrl;
} catch (e) {
}
require does not work inside a try {} block.
2.
const requireCustomFile = require.context('./', false, /config.local.js$/);
requireCustomFile.keys().forEach(fileName => {
  requireCustomFile(fileName);
});
does not work.
3.
export const config = require('./config.local.js') || {default:'config = {...}'}
does not work.
4.
Using .env and setting an environment variable: I need to override a whole array of configuration values, not one by one.
This solution uses process.argv. It is native to Node, as documented here, and does not use .env.
It inspects the command-line arguments used to start the app. Since these should differ between your local and production environments, it's an easy way to switch with no additional modules required.
Command prompt to start your Node app (this might also live in package.json and be invoked via npm start if you're using that approach):
$ node index.js local
index.js of your node app:
var express = require('express');
var config = require('./config');

if (process.argv[2] === 'local') {
  // the 3rd argument provided at startup (2nd index) was 'local', so here we are!
  config = require('./config_local');
}

var app = express();
// rest of owl…
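If you go the npm start route, the switch can live in the package.json scripts (the script names here are illustrative):

{
  "scripts": {
    "start": "node index.js",
    "start:local": "node index.js local"
  }
}

Then npm start runs with the default config and npm run start:local flips to the local one.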

How to run Docker and node.js with remote configurations

I'd like to provide an easy and simple Docker container for an open-source application that takes the URL of a configuration file as an argument and uses that file.
The Dockerfile is pretty straightforward:
FROM phusion/baseimage
# Use baseimage-docker's init system.
CMD ["/sbin/my_init"]
RUN curl -sL https://deb.nodesource.com/setup_4.x | sudo -E bash -
RUN apt-get update
RUN apt-get install -y nodejs git
ADD . /src
RUN cd /src; npm install; npm update
ENV NODE_ENV production
CMD ["/usr/bin/node", "/src/gitevents.js"]
I found no way of adding the file when the container runs (with ADD or ENTRYPOINT), so I'm trying to work it out in node.js:
docker run -e "CONFIG_URL=https://gist.githubusercontent.com/PatrickHeneise/c97ba221495df0cd9a3b/raw/fda1b8cd53874735349c6310a6643e6fc589a404/gitevents_config.js" gitevents
This sets CONFIG_URL as an environment variable that I can use in Node. However, I then need to download the file, which is async, and that doesn't really work in the current setup.
if (process.env.NODE_ENV === 'production') {
  // fs.existsSync returns a boolean; fs.accessSync returns undefined and throws on failure.
  var exists = fs.existsSync(path.join(__dirname, 'common', 'production.js'));
  if (exists) {
    config = require('./production');
  } else {
    // https download, but then `config` is undefined when running the app the first time.
  }
}
There's no synchronous download in node.js; any recommendations on how I could solve this?
I'd love to have Docker do the job, with ADD or CMD doing a curl download, but I'm not sure how that would work.
Another option would be to consider that your "config file" is not a file but just text, and to pass the content to the container at runtime:
CONFIG="$(curl -sL https://gist.githubusercontent.com/PatrickHeneise/c97ba221495df0cd9a3b/raw/fda1b8cd53874735349c6310a6643e6fc589a404/gitevents_config.js)"
docker run -e "CONFIG_URL=${CONFIG}" gitevents
How about a combination of ENTRYPOINT and an environment variable? You'd set ENTRYPOINT in the Dockerfile to a shell script that downloads the configuration file specified in the environment variable and then starts the application.
Since the entrypoint script receives whatever is in CMD as its arguments, the application start step could be accomplished by something like
# Execute CMD ("$@" is the argument list; "$#" would only be the argument count).
exec "$@"
I managed to rewrite my config script to work asynchronously; still not the best solution in my eyes.
var config = {};
var https = require('https');
var fs = require('fs');
var path = require('path');

config.load = function(fn) {
  if (process.env.NODE_ENV === 'production') {
    // fs.access's callback only receives an error argument;
    // no error means the file exists and is readable.
    fs.access(path.join(__dirname, 'production.js'), fs.R_OK, function(error) {
      if (!error) {
        config = require('./production');
        return fn(config);
      } else {
        var file = fs.createWriteStream(path.join(__dirname, 'production.js'));
        var url = process.env.CONFIG_URL;
        if (!url) {
          process.exit(-1);
        } else {
          https.get(url, function(response) {
            response.pipe(file);
            file.on('finish', function() {
              file.close(function() {
                return fn(require('./production'));
              });
            });
          });
        }
      }
    });
  } else if (process.env.NODE_ENV === 'test') {
    return fn(require('./test'));
  } else {
    return fn(require('./development'));
  }
};

module.exports = exports = config;
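Usage then looks something like this (a sketch; the require path assumes the config module lives in the common/ folder mentioned in the question):

// gitevents.js -- defer app startup until the config is available
var config = require('./common/config');

config.load(function(config) {
  // config is now fully loaded (possibly freshly downloaded), safe to start the app
  console.log('config loaded for', process.env.NODE_ENV || 'development');
  // ... start servers, register webhooks, etc.
});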
