Here is my problem: I want to create a CLI that automatically runs a test. Without the CLI, I can run everything perfectly with the node command:
node test.js
Basically, I want the CLI to do exactly what that command does, so I googled for a technique and found this:
#!/usr/bin/env node
'use strict';
const options = process.argv;
const { execFile } = require('child_process');
const child = execFile('node', ['../dist/test.js'], (error, stdout, stderr) => {
  if (error) {
    throw error;
  }
  console.log(stdout);
});
This method doesn't work for me because test.js uses the ora package, and since ora renders real-time spinner animations, that output doesn't come through the buffered stdout.
Is there any way to execute my test.js in real time (without a subprocess) using Node? I'm open to other methods, but I want to publish the CLI on npm, so keep in mind that it has to be in JavaScript 😊.
You can find every file that I've talked about here on GitHub. Normally, you wouldn't need this link, but I'm including it in case you want a closer look.
You should simply call your test() function from your CLI code, after requiring the module that defines it. Have a look at mocha and jasmine: you will see that while both tools provide a CLI, they also provide instructions for invoking the test frameworks from arbitrary JS code.
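For instance, a minimal sketch (assuming test.js exports its entry point instead of running on load; the export shape here is hypothetical):
#!/usr/bin/env node
'use strict';

// Require the test module and invoke it in the same process, so ora's
// spinner animations render directly on the current terminal.
// Assumes ../dist/test.js does `module.exports = runTests` (hypothetical).
const runTests = require('../dist/test.js');
runTests();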
I can't think of a way without a sub-process, but this may help.
The child_process exec will not work with commands that produce continuous output, because it buffers the output; the process will halt when that buffer is full.
The suitable solution is spawn:
var spawn = require('child_process').spawn;
var child = spawn('node', ['../dist/test.js']);

child.stdout.on('data', function (data) {
  console.log(data.toString()); // data arrives as a Buffer
});

child.stderr.on('data', function (data) {
  console.log(data.toString());
});
Here is my solution: you can use the fs library to read the code of the file, and then simply use eval to execute it in the same process.
const fs = require("fs");

function run(file) {
  fs.readFile(file, (err, data) => {
    if (err) throw err;
    // evaluate the file's source in the current process
    eval(data.toString('utf8'));
  });
}
I'm writing a desktop web app that uses node.js to access the local file system. I can currently use node.js to open and copy files to different places on the hard drive. What I would also like to do is allow the user to open a specific file using the application that is associated with the file type. In other words, if the user selects "myfile.doc" in a Windows environment, it will launch MSWord with that file.
I must be a victim of terminology, because I haven't been able to find anything but the spawning of child processes that communicate with node.js. I just want to launch a file for the user to view and then let them decide what to do with it.
Thanks
You can do this:
var cp = require("child_process");
cp.exec("document.docx"); // notice this is without a callback..
process.exit(0); // exit this Node.js process
It's not safe though; to ensure that the command shows no errors or any undesired output, you should add the callback parameter:
child_process.exec(cmd, function (error, stdout, stderr) {});
Next, you can work with events so you won't block execution of the script, or even make use of an external Node.js script that launches and handles output from processes you spawn from a "master" script. For example:
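A minimal event-based sketch (the document name is just a placeholder):
var cp = require("child_process");

// exec returns a ChildProcess, so you can react to its events instead of
// blocking or exiting immediately
var child = cp.exec("document.docx");

child.on("error", function (err) {
  console.error("failed to launch:", err);
});

child.on("exit", function (code) {
  console.log("launcher exited with code", code);
});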
In the example below I have used the TextMate "mate" command to edit the file hello.js. You can run any command with child_process.exec, but the application you want to open the file in should provide you with command-line options.
var exec = require('child_process').exec;
exec('mate hello.js');
var childProcess = require('child_process');

childProcess.exec('start Example.xlsx', function (err, stdout, stderr) {
  if (err) {
    console.error(err);
    return;
  }
  console.log(stdout);
  process.exit(0); // exit process once the file is opened
});
Emphasis on where 'exit' is called. This executes properly on Windows.
Simply call your file (any file with an extension, including .exe) from the command prompt, or programmatically:
var exec = require('child_process').exec;
exec('C:\\Users\\Path\\to\\File.js', function (err, stdout, stderr) {
  if (err) {
    throw err;
  }
});
If you want to run a file without an extension, you can do almost the same, as follows:
var exec = require('child_process').exec;
exec('start C:\\Users\\Path\\to\\File', function (err, stdout, stderr) {
  if (err) {
    throw err;
  }
});
As you can see, we use start to open the file, letting Windows (or Windows letting us) choose an application.
If you prefer opening a file with the async/await pattern:
const util = require('util');
const exec = util.promisify(require('child_process').exec);

async function openFile(path) {
  try {
    // the promisified exec rejects on failure, so use try/catch rather
    // than destructuring an `err` property from the result
    await exec(path);
  } catch (err) {
    console.log(err);
    return;
  }
  process.exit(0);
}

openFile('D:\\Practice\\test.txt'); // enter your file location here
I'd like to publish to NPM within my CI/build system, so I found libnpmpublish which seems to be the correct tool, but explicitly states that it doesn't pack your code into a tarball, although the publish API requires that you pass it a tarball (as compared to, say, a folder or a path).
Their suggested solution is:
Since libnpmpublish does not generate tarballs itself, one way to build your own tarball for publishing is to do npm pack in the directory you wish to pack. You can then fs.createReadStream('my-proj-1.0.0.tgz') and pass that to libnpmpublish, along with require('./package.json').
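In code form, that suggestion would look roughly like this (a sketch; it assumes npm pack already produced my-proj-1.0.0.tgz in the current directory, and the token option is my assumption for auth):
const fs = require('fs');
const { publish } = require('libnpmpublish');

// tarball produced beforehand by running `npm pack`
const manifest = require('./package.json');
const tarball = fs.createReadStream('my-proj-1.0.0.tgz');

publish(manifest, tarball, { token: process.env.NPM_TOKEN });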
Is there a programmatic (in Node) way to script this process? I looked around the npm repositories and couldn't find a package dedicated to packing, though I did find this code, which seems to implement packing but lives in an archived repository; namely, it's not in libnpm.
The closest I can find is npm-packlist which, when given a folder, creates a list of files which can be forwarded to the NPM tar package, as demonstrated in the README for npm-packlist.
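For reference, a sketch along the lines of that README demonstration (paths are placeholders):
const packlist = require('npm-packlist');
const tar = require('tar');

const packageDir = '/path/to/package';
const packageTarball = '/path/to/package.tgz';

// packlist resolves to the list of files npm would include in the package
packlist({ path: packageDir })
  .then(files => tar.create({
    prefix: 'package/', // npm tarballs nest their contents under package/
    cwd: packageDir,
    file: packageTarball,
    gzip: true
  }, files))
  .then(() => {
    // packageTarball now exists and could be streamed to libnpmpublish
  });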
This is a bit of a hack using the command line, but it works for a similar use case where I'm creating and uploading a package via a REST command. I wrap it in a promise that resolves to a stream so that I can easily add that to the form-post data:
const exec = require('child_process').exec;
const fs = require('fs');
const p = require('path');

// `t` is the package to pack and `d` the directory to run in (both come
// from the surrounding code, along with the `debug` flag)
return new Promise((resolve, reject) => {
  exec(`npm pack ${t}`, { cwd: d }, (error, stdout, stderr) => {
    if (error) {
      console.error(error);
      reject(error);
      return;
    }
    // npm pack prints the generated tarball's filename on stdout
    var f = d + p.sep + stdout.trim();
    if (debug) console.log(`zip file "${f}" created.`);
    resolve(fs.createReadStream(f));
  });
});
Enjoy!
There is now libnpmpack, which will generate a tarball buffer in exactly the format that libnpmpublish expects, so this can be done with:
const pack = require('libnpmpack')
const { publish } = require('libnpmpublish')

async function packAndPublish(packagePath) {
  // readPackageJson omitted for brevity
  const manifest = readPackageJson(packagePath)
  const tarball = await pack(packagePath)
  return publish(manifest, tarball)
}
So let's say I have some code in JS:
const myApiKey = 'id_0001'
But instead of hardcoding it, I want to put it in some bash script with other env vars, read from that, and then replace it in the JS.
So let's say for prod I would read from prod-env.sh, and for dev from dev-env.sh, and then gulp or some other tool does the magic and replaces MY_API_KEY based on whatever is established inside prod-env.sh or dev-env.sh:
const myApiKey = MY_API_KEY
Update: I want to add that I only care about Unix OSes; I'm not concerned about Windows. In Go there is a way to read env vars, for example envVars.get('MY_API_KEY'); I'm looking for something similar, but for JS on the client side.
If you're using gulp, it sounds like you could use any gulp string replacer, like gulp-replace.
As for writing the gulp task(s): if you are willing to import the environment into your shell first, before running node, you can access it via process.env:
const gulp = require('gulp');
const replace = require('gulp-replace');

gulp.task('build', function () {
  return gulp.src(['example.js'])
    .pipe(replace('MY_API_KEY', process.env.MY_API_KEY))
    .pipe(gulp.dest('build/'));
});
If you don't want to import the environment files before running node, you can use a library like env2 to read shell environment files.
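A minimal sketch, assuming env2's documented pattern of loading an env file into process.env (worth checking whether it accepts your exact .sh syntax):
// load the key=value pairs from the env file into process.env
require('env2')('./prod-env.sh');

console.log(process.env.MY_API_KEY); // now visible to the gulp task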
Another option would be to use js/json to define those environment files, and load them with require.
prod-env.json
{
"MY_API_KEY": "api_key"
}
gulpfile.js
const gulp = require('gulp');
const replace = require('gulp-replace');
const myEnv = require('./prod-env');

gulp.task('build', function () {
  return gulp.src(['example.js'])
    .pipe(replace('MY_API_KEY', myEnv.MY_API_KEY))
    .pipe(gulp.dest('build/'));
});
Also, for a more generic, loopy version of the replace you can do:
gulp.task('build', function () {
  let stream = gulp.src(['example.js']);
  for (const key in process.env) {
    // reassign so each replace is actually applied to the pipeline
    stream = stream.pipe(replace('${' + key + '}', process.env[key]));
  }
  return stream.pipe(gulp.dest('build/'));
});
In that last example I added ${} around the environment variable name to make it less prone to accidents. So the source file becomes:
const myApiKey = ${MY_API_KEY}
This answer is an easy way to do this for someone who doesn't want to touch the code they are managing. For example, you are on the ops team but not the dev team and need to do what you are describing.
The environment variable NODE_OPTIONS can control many things about the Node.js runtime; see https://nodejs.org/api/cli.html#cli_node_options_options
One such option we can set is --require, which allows us to run code before anything else is even loaded.
So using this you can create an overwrite.js file to perform this replacement on any non-node_modules script files:
const fs = require('fs');
const original = fs.readFileSync;

// set some custom env variables:
//   API_KEY_ENV_VAR - the value to set
//   API_KEY_TEMPLATE_TOKEN - the token to replace with the value
if (!process.env.API_KEY_TEMPLATE_TOKEN) {
  console.error('Please set API_KEY_TEMPLATE_TOKEN');
  process.exit(1);
}
if (!process.env.API_KEY_ENV_VAR) {
  console.error('Please set API_KEY_ENV_VAR');
  process.exit(1);
}

fs.readFileSync = (file, ...args) => {
  if (file.includes('node_modules')) {
    return original(file, ...args);
  }
  const fileContents = original(file, ...args).toString(
    /* set encoding here, or let it default to utf-8 */
  );
  return fileContents
    .split(process.env.API_KEY_TEMPLATE_TOKEN)
    .join(process.env.API_KEY_ENV_VAR);
};
Then use it with a command like this:
export API_KEY_ENV_VAR=123;
export API_KEY_TEMPLATE_TOKEN=TOKEN;
NODE_OPTIONS="--require ./overwrite.js" node target.js
Supposing you had a script target.js:
console.log('TOKEN');
It would log 123. You can use this pretty much universally with node, so it should work fine with gulp, grunt, or any others.
I am aware this isn't the first post about fs.unlink not working, but I'm very new to both Visual Studio and Node.js.
I want to delete a file in the working folder, but I get an error and the file is not deleted.
Here is what I tried:
var fs = require('fs');
fs.unlink('test1.txt');
PS: I installed the necessary Node.js components in VS.
As far as the code goes, you're not invoking fs.unlink properly. For starters, it's asynchronous. You will need to provide it a callback. See example here:
https://nodejs.org/api/fs.html#fs_fs_unlink_path_callback
Secondly, you need to provide it the full file path, not just the name of the file (note the escaped backslashes), i.e.:
var fs = require('fs');
fs.unlink('C:\\path\\to\\my\\file\\test1.txt', (err) => {});
You can also use the variable __dirname to resolve the path against the directory of the script itself, rather than wherever you invoke node from. That would look something like:
let fs = require('fs');
let path = require('path');

fs.unlink(path.join(__dirname, 'test1.txt'), (err) => {
  if (err) throw err;
  console.log('test1.txt was deleted');
});
Currently, you can also invoke it synchronously using its single-parameter signature; thus you'd provide only the file path:
fs.unlinkSync('C:\\path\\to\\my\\file\\test1.txt');
But this is ill-advised, as it blocks the event loop. I'd only use the "sync" variant during some application bootstrapping process, where it'd be invoked only once or so, at startup. Try to resist the urge to use it because it seems "easier", and instead get comfortable with asynchronous logic.
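If you are on a Node version that ships the promise-based API (10+), a sketch of the non-blocking variant looks like this:
const fs = require('fs').promises;
const path = require('path');

async function removeFile() {
  try {
    // await keeps the code readable without blocking the event loop
    await fs.unlink(path.join(__dirname, 'test1.txt'));
    console.log('test1.txt was deleted');
  } catch (err) {
    console.error(err);
  }
}

removeFile();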
I have a little Grunt task that shells out via node and runs "composer install".
var done = this.async();
var exec = require('child_process').exec;

var composer = exec(
  'php bin/composer.phar install',
  function (error, stdout, stderr) {
    done(error === null);
  }
);

composer.stdout.on(
  'data',
  grunt.log.write
);
As you can see, I'm piping the stdout of this child process to grunt.log. All output shows up as expected, except that it is all in my default console color. If I run "composer install" directly, I get highlighting that improves readability.
Since I'm new to node, Grunt, and shelling out in general, I'm unsure in which part of the system the coloring gets lost, or even how to debug this efficiently.
Using spawn with the option stdio: 'inherit' worked to preserve the output color.
From the documentation:
options (Object)
cwd String Current working directory of the child process
stdio (Array|String) Child's stdio configuration. (See below)
...
As a shorthand, the stdio argument may also be one of the following
strings, rather than an array:
ignore - ['ignore', 'ignore', 'ignore']
pipe - ['pipe', 'pipe', 'pipe']
inherit - [process.stdin, process.stdout, process.stderr] or [0,1,2]
Here is an example of the working code:
require('child_process')
  .spawn('npm', ['install'], { stdio: 'inherit' })
  .on('exit', function (code) {
    // 'exit' passes the child's exit code; 0 means success
    if (!code) {
      console.log('Success!');
    }
  });
I wanted to make exec work but I did not find a way to access the same option.
The --colors flag worked for me. Node version 6.8.0...
--colors, -c force enabling of colors [boolean]
The following generic example would print the colors should any be returned...
var exec = require('child_process').exec;
exec('node someCommand --colors', function (error, stdout, stderr) {
  console.log(stdout || stderr); // yay colors!
});
In some cases, command-line programs will suppress colorized output when they are not run through a terminal, so you need to instruct the program to output the ANSI escape sequences anyway.
In this case, it's as simple as adding an '--ansi' flag, for example:
var done = this.async();
var exec = require('child_process').exec;

var composer = exec(
  'php bin/composer.phar install --ansi',
  function (error, stdout, stderr) {
    done(error === null);
  }
);

composer.stdout.on(
  'data',
  grunt.log.write
);
If, like myself, you are spawning a child node process (as opposed to a non-node script), you may find that the --ansi and --color options give you little success in retaining the colored output of child node processes.
Instead, you should inherit the stdio instances of the current process.
My particular use-case involved forking a node server as a background task in order to execute an end-to-end test suite against an active HTTP interface. Here was my final solution:
var spawn = require('child_process').spawn;
var _ = require('lodash');

// `grunt`, `done` and `mockApi` come from the surrounding Grunt task;
// node flags such as --debug belong in the argument array, before the script
var child = spawn('node', ['--debug', 'webserver/server.js'], {
  env: _.extend(process.env, {
    MOCK_API: mockApi
  }),
  // use process.stdout to retain ansi color codes
  stdio: [process.stdin, process.stdout, 'pipe']
});

// use custom error buffer in order to throw using grunt.fail()
var errorBuffer = '';
child.stderr.on('data', function (data) {
  errorBuffer += data;
});

child.on('close', function (code) {
  if (code) {
    grunt.fail.fatal(errorBuffer, code);
  } else {
    done();
  }
});
I was there too. If you:
Don't want to inherit the child stdout
Don't know which command is going to be executed (so you don't know whether e.g. --ansi or --colors will work)
Then you should spawn a PTY from node. I made this node package for this exact reason.
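The author's own package isn't named here; as an illustration only, a minimal sketch of the same PTY approach using the separate node-pty package:
const pty = require('node-pty');

// spawn the command inside a pseudo-terminal, so the child believes it is
// attached to a real TTY and keeps its colored output
const child = pty.spawn('node', ['../dist/test.js'], {
  name: 'xterm-color',
  cols: 80,
  rows: 30,
  cwd: process.cwd(),
  env: process.env
});

// everything the child writes (ANSI colors included) arrives here
child.onData(data => process.stdout.write(data));

child.onExit(({ exitCode }) => {
  console.log('child exited with code', exitCode);
});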