Pass Node.js environment variable with Windows PowerShell [duplicate] - javascript

I'm trying to pass an environment variable to Node.js with PowerShell, like this:
C:\Users\everton\my-project> $env:MY_VAR = 8000 node index.js
But I get an error in PowerShell:
Unexpected token 'node' in expression or statement.

Set the environment variable MY_VAR first, then run your app like this:
C:\Users\everton\my-project> $env:MY_VAR="8000" ; node index.js
You can access the environment variable MY_VAR inside index.js via
process.env.MY_VAR
Note: PowerShell doesn't directly support command-scoped environment variables. The above command sets the environment variable for that PowerShell session.
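For reference, a minimal index.js that reads the variable might look like the sketch below (treating MY_VAR as a port number is just an assumption based on the value 8000 in the question):
// index.js - minimal sketch that reads MY_VAR from the environment
const http = require('http');
const port = process.env.MY_VAR || 3000; // fall back to a default if MY_VAR is unset
http.createServer((req, res) => res.end('Hello')).listen(port, () => {
  console.log(`Listening on port ${port} (MY_VAR=${process.env.MY_VAR})`);
});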

My answer requires the use of Node.js and npm libraries.
...or you can skip the pain of shell-specific scripting altogether and use one of these command-scoped (and cross-platform) Node.js helpers:
cross-env (for inline)
cross-env MYVAR=MYVALUE node index.js
env-cmd (from .env file)
env-cmd .env node index.js
with
#.env file
MYVAR=MYVALUE

Note: If you can assume that Node.js is already installed - as is by definition the case when you're invoking node - consider using npm helper packages, as shown in Cyril CHAPON's helpful answer.
This answer focuses on generic solutions from within PowerShell.
tl;dr
# Set env. variable temporarily, invoke the external utility,
# then remove / restore old value.
$oldVal, $env:MY_VAR = $env:MY_VAR, 8000; node index.js; $env:MY_VAR = $oldVal
# Scoped alternative that uses a *transient* helper variable.
& { $oldVal, $env:MY_VAR = $env:MY_VAR, 8000; node index.js; $env:MY_VAR = $oldVal }
More simply, if there's no preexisting MY_VAR value that must be restored:
$env:MY_VAR=8000; node index.js; $env:MY_VAR=$null
See below for an explanation and an alternative based on a helper function.
To complement Harikrishnan's effective answer:
PowerShell has no equivalent to the command-scoped method of passing environment variables that POSIX-like shells offer (as of PowerShell v7 - however, introducing it to PowerShell - not necessarily with the same syntax - is being discussed in GitHub issue #3316); e.g.:
# E.g., in *Bash*:
# Define environment variable MY_VAR for the child process being invoked (`node`)
# ONLY; in other words: MY_VAR is scoped to the command being invoked
# (subsequent commands do not see it).
MY_VAR=8000 node index.js
In PowerShell, as Harikrishnan's answer demonstrates, you have to define the environment variable first and then, in a separate statement, call the external program, so $env:MY_VAR="8000"; node index.js is the right PowerShell solution. It is worth noting, however, that $env:MY_VAR stays in scope for the remainder of the session (it is set at the process level).
Note that even using a script block invoked with & to create a child scope doesn't help here, because such child scopes only apply to PowerShell variables, not environment variables.
You can, of course, remove the environment variable manually after the node call:
Remove-Item env:MY_VAR or even just $env:MY_VAR = $null, as the simpler command at the top shows.
A more structured alternative - perhaps better in the case of setting multiple environment variables and/or invoking multiple commands - is to use a script block invoked with &:
& { $oldVal, $env:MY_VAR = $env:MY_VAR, 8000; node index.js; $env:MY_VAR = $oldVal }
This takes advantage of:
{ ... } is a script block that provides a clearly visible grouping for the commands in it; invoked with &, it creates a local scope, so that helper variable $oldVal automatically goes out of scope on exiting the block.
$oldVal, $env:MY_VAR = $env:MY_VAR, 8000 saves the old value (if any) of $env:MY_VAR in $oldVal while changing the value to 8000; this technique of assigning to multiple variables at once (known as destructuring assignment in some languages) is explained in Get-Help about_Assignment_Operators, section "ASSIGNING MULTIPLE VARIABLES".
Alternatively, use the helper function below, or use a try { ... } finally { ... } approach, as demonstrated in this related answer.
Helper function for command-scoped environment modifications.
If you define the helper function below (remember that function definitions must be placed before they're invoked), you can achieve command-scoped modification of your environment as follows:
# Invoke `node index.js` with a *temporarily* set MY_VAR environment variable.
Invoke-WithEnvironment @{ MY_VAR = 8000 } { node index.js }
Invoke-WithEnvironment() source code:
function Invoke-WithEnvironment {
<#
.SYNOPSIS
Invokes commands with a temporarily modified environment.
.DESCRIPTION
Modifies environment variables temporarily based on a hashtable of values,
invokes the specified script block, then restores the previous environment.
.PARAMETER Environment
A hashtable that defines the temporary environment-variable values.
Assign $null to (temporarily) remove an environment variable that is
currently set.
.PARAMETER ScriptBlock
The command(s) to execute with the temporarily modified environment.
.EXAMPLE
> Invoke-WithEnvironment @{ PORT=8080 } { node index.js }
Runs node with environment variable PORT temporarily set to 8080; its
previous value, if any, is restored afterward.
#>
  param(
    [Parameter(Mandatory)] [System.Collections.IDictionary] $Environment,
    [Parameter(Mandatory)] [scriptblock] $ScriptBlock
  )
  # Modify the environment based on the hashtable and save the original
  # one for later restoration.
  $htOrgEnv = @{}
  foreach ($kv in $Environment.GetEnumerator()) {
    $htOrgEnv[$kv.Key] = (Get-Item -EA SilentlyContinue "env:$($kv.Key)").Value
    Set-Item "env:$($kv.Key)" $kv.Value
  }
  # Invoke the script block.
  try {
    & $ScriptBlock
  } finally {
    # Restore the original environment.
    foreach ($kv in $Environment.GetEnumerator()) {
      # Note: setting an environment var. to $null or '' *removes* it.
      Set-Item "env:$($kv.Key)" $htOrgEnv[$kv.Key]
    }
  }
}

Related

npm-config Environment Variables

I read that the best way to store secret strings is to use the config package and environment variables. This is how I set it up.
Created a config folder with 2 files (default.json, custom-environment-variables.json)
In default.json I created this:
{
"passPrivateKey": ""
}
In custom-environment-variables.json I created this:
{
"passPrivateKey": "nodeProject_passPrivateKey"
}
After that I set the variable in the terminal with this command:
npm config set nodeProject_passPrivateKey=randomKey
When I read the variable from the terminal with the command below, it works fine and shows the correct value:
npm config get nodeProject_passPrivateKey
However in code I have these lines:
if (!config.get("nodeProject_passPrivateKey")) {
console.error("nodeProject_passPrivateKey has not been set");
}
So the problem is that config.get() is not reading the value, and I am getting the "not set" error from above. I tried running everything in VS Code as admin, and calling config.get with both "nodeProject_passPrivateKey" and "passPrivateKey", but the method still does not read any value.
Why not use dotenv?
You create an .env file where you store all your secrets and you access them through
process.env.myvar
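A minimal sketch of that approach (the MY_SECRET name and the .env contents here are placeholders, not taken from the question):
// .env (keep this file out of version control):
//   MY_SECRET=randomKey

// As early as possible in your entry file:
require('dotenv').config();

if (!process.env.MY_SECRET) {
  console.error('MY_SECRET has not been set');
}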

detect whether ES Module is run from command line in Node

When using CommonJS modules in Node, you can detect whether a script is being run from the command line using require.main === module.
What is an equivalent way to detect whether a script is being run from the command line when using ES Modules in Node (with the --experimental-modules flag)?
Use
if (import.meta.url === `file://${process.argv[1]}`) {
// module was not imported but called directly
}
See the MDN docs on import.meta for details.
Update Sep 27, 2021
Perhaps more robust, but involving an extra import (via Rich Harris)
import {pathToFileURL} from 'url'
if (import.meta.url === pathToFileURL(process.argv[1]).href) {
// module was not imported but called directly
}
There is none - yet (it's still experimental!). Although the prevailing opinion is that such a check is a bad practice anyway and you should just provide separate scripts for the library and the executable, there is an idea to provide a boolean import.meta.main property for this purpose.
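If such a property does land, the check could be as simple as the sketch below (import.meta.main is only a proposal here and may not exist in your runtime; runCli is a hypothetical function standing in for your executable code):
// Hypothetical: relies on the proposed import.meta.main flag.
if (import.meta.main) {
  // module was invoked directly rather than imported
  runCli();
}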
The other answers get close, but miss the mark for a pretty typical use case: CLI scripts exposed via the bin property in package.json files.
These scripts are symlinked into the node_modules/.bin folder and can be invoked through npx or as scripts defined in the scripts object of package.json. In that case, process.argv[1] will be the symlink, not the actual file referenced by import.meta.url.
Furthermore, we need to convert the file path to an actual file:// URL, otherwise the comparison will not work correctly across platforms.
import { realpathSync } from "fs";
import { pathToFileURL } from "url";

function wasCalledAsScript() {
  // We use realpathSync to resolve symlinks, as cli scripts will often
  // be executed from symlinks in the `node_modules/.bin`-folder
  const realPath = realpathSync(process.argv[1]);
  // Convert the file-path to a file-url before comparing it
  const realPathAsUrl = pathToFileURL(realPath).href;
  return import.meta.url === realPathAsUrl;
}

if (wasCalledAsScript()) {
  // module was executed and not imported by another file.
}
I would have posted this as a comment on the accepted answer, but apparently I'm not allowed to comment with a fresh account.
The module global variable will be defined in CommonJS, but won’t exist at
all in an ES module. Yes, there is an inconsistency there, that ES modules are
the things that don’t have module variables.
You can check for an undefined variable by seeing if typeof v is the string
(not value!) 'undefined'.
That turns into:
const inCommonJs = typeof module !== 'undefined';
console.log(`inCommonJs = ${inCommonJs}`);
If we put that exact code into both .cjs and .mjs files, we get the correct answers:
$ node foo.mjs
inCommonJs = false
$ cp foo.mjs foo.cjs
$ node foo.cjs
inCommonJs = true
I like import.meta.url === `file://${process.argv[1]}`, but it does not work on Windows inside a bash shell. This alternative only checks the basename:
import path from "path";
const runningAsScript = import.meta.url.endsWith(path.basename(process.argv[1]));
It looks like there is a documented way to do this now:
if (require.main === module) {
console.log('executed directly');
. . .
}

npm global packages: Reference content files from package

I'm in the process of building an npm package which will be installed globally. Is it possible to have non-code files installed alongside code files that can be referenced from code files?
For example, if my package includes someTextFile.txt and a module.js file (and my package.json includes "bin": {"someCommand":"./module.js"}) can I read the contents of someTextFile.txt into memory in module.js? How would I do that?
The following is an example of a module that loads the contents of a file (string) into the global scope.
core.js : the main module file (entry point of package.json)
//:Understanding: module.exports
module.exports = {
reload:(cb)=>{ console.log("[>] Magick reloading to memory"); ReadSpellBook(cb)}
}
//:Understanding: global object
//the following function is only accessible by the magick module
const ReadSpellBook=(cb)=>{
require('fs').readFile(__dirname+"/spellBook.txt","utf8",(e,theSpells)=>{
if(e){ console.log("[!] The Spell Book is MISSING!\n"); cb(e)}
else{
console.log("[*] Reading Spell Book")
//since we want to make the contents of .txt accessible :
global.SpellBook = theSpells // global.SpellBook is now shared across all the code (global scope)
cb()//callBack
}
})
}
//·: Initialize :.
console.log("[+] Time for some Magick!")
ReadSpellBook((e)=>e?console.log(e):console.log(SpellBook))
spellBook.txt
ᚠ ᚡ ᚢ ᚣ ᚤ ᚥ ᚦ ᚧ ᚨ ᚩ ᚪ ᚫ ᚬ ᚭ ᚮ ᚯ
ᚰ ᚱ ᚲ ᚳ ᚴ ᚵ ᚶ ᚷ ᚸ ᚹ ᚺ ᚻ ᚼ ᚽ ᚾ ᚿ
ᛀ ᛁ ᛂ ᛃ ᛄ ᛅ ᛆ ᛇ ᛈ ᛉ ᛊ ᛋ ᛌ ᛍ ᛎ ᛏ
ᛐ ᛑ ᛒ ᛓ ᛔ ᛕ ᛖ ᛗ ᛘ ᛙ ᛚ ᛛ ᛜ ᛝ ᛞ ᛟ
ᛠ ᛡ ᛢ ᛣ ᛤ ᛥ ᛦ ᛧ ᛨ ᛩ ᛪ ᛫ ᛬ ᛭ ᛮ ᛯ
If you require it from another piece of code, you will see how it prints to the console and initializes by itself.
If you want manual initialization instead, simply remove the last 3 lines (·: Initialize :.) and use reload():
const magick = require("./core.js")
magick.reload((error)=>{ if(error){throw error}else{
  //now you know the SpellBook is loaded
  console.log(SpellBook.length)
}})
I have built some CLIs which were distributed privately, so I believe I can illuminate a bit here.
Let's say your global modules are installed in a directory called $PATH. When your package is installed on any machine, it will essentially be extracted into that directory.
When you fire up someCommand from any terminal, the module.js that was kept at $PATH will be invoked. If you initially kept the template file in the same directory as your package, then it will be present at that location, which is local to module.js.
Assuming you edit the template as a string and then want to write it locally to wherever the user wishes (e.g. the pwd), you just have to use process.cwd() to get the path to that directory. This totally depends on how you code it out.
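For instance, a rough sketch of that idea (the output.txt name and its contents are made up for illustration):
const fs = require('fs');
const path = require('path');

// Write output into whatever directory the user invoked the command from.
const outFile = path.join(process.cwd(), 'output.txt');
fs.writeFileSync(outFile, 'edited template contents go here');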
In case you want to explicitly include only certain files in the npm package, use the files attribute of package.json.
To specifically answer "how can my code file in the npm package locate the path to the globally installed npm folder in which it is located, in a way that is guaranteed to work across OSes and is future proof?": that is very different from the template scenario you were trying to achieve. What you're really asking for here is the global path of npm modules. As a fail-safe option, use the path returned by require.main.filename within your code and keep that as a reference.
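As a rough illustration of that fail-safe (a sketch assuming a CommonJS entry point, since require.main is undefined inside ES modules; someTextFile.txt is the example file from the question):
const fs = require('fs');
const path = require('path');

// Directory of the script that was actually invoked (e.g. the globally installed module.js).
const installDir = path.dirname(require.main.filename);

// A content file shipped alongside it can then be read relative to that directory.
const template = fs.readFileSync(path.join(installDir, 'someTextFile.txt'), 'utf8');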
When you npm publish, it packages everything in the folder, excluding things noted in .npmignore. (If you don't have an .npmignore file, it'll dig into .gitignore. See https://docs.npmjs.com/misc/developers#keeping-files-out-of-your-package) So in short, yes, you can package the text file into your module. Installing the module (locally or globally) will get the text file into place in a way you expect.
How do you find the text file once it's installed? __dirname gives you the path of the current file ... if you ask early enough. See https://nodejs.org/docs/latest/api/globals.html#globals_dirname (If you use __dirname inside a closure, it may be the path of the enclosing function.) For the near-term of "future", this doesn't look like it'll change, and will work as expected in all conditions -- whether the module is installed locally or globally, and whether others depend on the module or it's a direct install.
So let's assume the text file is in the same directory as the currently running script:
var fs = require('fs');
var path = require('path');
var dir = __dirname;

function runIt(cb) {
  var fullPath = path.join(__dirname, 'myfile.txt');
  fs.readFile(fullPath, 'utf8', function (e, content) {
    if (e) {
      return cb(e);
    }
    // content now has the contents of the file
    cb(content);
  });
}
module.exports = runIt;
Sweet!

node.js set process.env variable in test

When I run a test in Node.js with Mocha, how can I set temporary environment variables?
In a module, I have a variable that depends on an environment variable:
var myVariable = process.env.ENV_VAR;
Right now I use the rewire module:
var rewire = require('rewire');
var myModule = rewire('../myModule');
myModule.__set__('myVariable', 'someValue');
Is there a simpler way, without the rewire module?
In your myModule.js file, export a function that takes the variable as an argument eg:
module.exports = function (envVar) {
  // return what you were exporting before
};
Then when you require it, require it like so:
var myModule = require('../myModule')(process.env.ENV_VAR);
My first instinct was to simply set the env var at the top of the test.js before any require statements. However, this may not work for you if you have a module that depends on an env var and is required multiple times in the same test run. Say you have an env-dependent module called mode.js:
module.exports = {
MODE : process.env.ENV_VAR
};
If you add a single test file called bTest.js with
process.env.ENV_VAR= "UNIT_TEST_MODE"
const mode = require('./mode.js')
// describe some tests scenarios that use mode.MODE
...
you will be OK. But if you add a second test file
const mode = require('./mode.js')
// describe some more tests scenarios that use mode.MODE
...
and name it aTest.js, the new file will run first in your suite, and mode.MODE will be undefined for all subsequent test files; the require command won't actually reload the same module multiple times.
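(For completeness, one workaround at the test-file level - a sketch, not part of the original answer - is to evict the module from Node's require cache so it re-reads the env var; the cleaner, config-based fix follows below.)
// Force mode.js to be re-evaluated so MODE picks up the value set in this file.
process.env.ENV_VAR = 'UNIT_TEST_MODE';
delete require.cache[require.resolve('./mode.js')];
const mode = require('./mode.js');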
Let's assume you aren't able to use the dotenv package in your tests. If so, you can set values on process.env programmatically in the Mocha config file. By default, this is found in .mocharc.json or .mocharc.yml, but it can easily be translated to .mocharc.js. Referring to the sample js file here: https://github.com/mochajs/mocha/blob/master/example/config/.mocharc.js
So your .mocharc.js could be
"use strict";
process.env.ENV_VAR = "UNIT_TEST_MODE";
// end of .mocharc.js
and ENV_VAR will be set before mocha requires or runs any of your modules.
Even if you are using dotenv, you can set other dotenv options from inside your Mocha config that you might not want to set in your local dev server's .env file. That way, your .env.mocha vars will be available to individual modules that don't require dotenv themselves.
"use strict";
require('dotenv').config({ debug: process.env.DEBUG, path: '/full/custom/path/to/.env.mocha' });
// end of .mocharc.js
Although in the second case, you may be better off just setting the dotenv env path as part of the test command in your package.json:
node -r dotenv/config ./node_modules/mocha/bin/_mocha dotenv_config_path=/full/custom/path/to/.env.mocha

Compile LiveScript that imports other js files

I'm probably misunderstanding how LiveScript works, but how should I import another js file in an .ls file and make it compile? For instance I'd like to access the DOM document like:
el = document.getElementById 'app'
and load mithril.js (which is in the same local dir):
require! 'mithril.js'
But when compiling like:
lsc -c file.ls
this currently tells me that it can't find 'document' or any other mithril specific variables (such as 'm').
If you are compiling your files to Javascript and running them in a browser, then you don't need to require Mithril.
Just make sure it's added before your script.
For example:
# file.ls
element = document.get-element-by-id 'example'
m.module element app
Then run lsc -c file.ls
// file.js
(function() {
var element = document.getElementById('example');
m.module(element, app);
}).call(this);
This is now just a regular JS file with some references to Mithril variables. We need to remember that when we link them to our HTML.
<script src='https://cdnjs.cloudflare.com/ajax/libs/mithril/0.1.34/mithril.js'></script>
<script src='file.js'></script>
It's important that Mithril comes first.
If you want to require Mithril, in order to use it outside of the browser, then you'll have to slightly readjust your require statement.
If you look at line 34 of the Mithril source, you'll see that m is just a local function. Then on line 1066 it tries to create a global variable m, if window exists. It won't in Node/io.js, so instead it attaches the value to module.exports.
This means you'll have to use the value returned by require:
m = require 'mithril.js'
m.module! # works!
Can't find document? Are you not running your code in the browser?
require! 'mithril.js' compiles to var mithril = require('mithril.js');
You want to refer to Mithril as m, not mithril, so do this:
require! mithril: m or this: m = require 'mithril'
