I'm working on a nodejs application and I need to pipe a multi-line string into a shell command. I'm not a pro at shell scripting but if I run this command in my terminal it works just fine:
$((cat $filePath) | dayone new)
Here's what I've got for the nodejs side. The dayone command does work but there is nothing piped into it.
const cp = require('child_process');
const terminal = cp.spawn('bash');
var multiLineVariable = 'Multi\nline\nstring';
terminal.stdin.write('mul');
cp.exec('dayone new', (error, stdout, stderr) => {
console.log(error, stdout, stderr);
});
terminal.stdin.end();
Thanks for any help!
Here, you're starting up bash using spawn, but then you're using exec to start your dayone program. They are separate child processes and aren't connected in any way.
'cp' is just a reference to the child_process module, and spawn and exec are just two different ways of starting child processes.
You could use bash and write your dayone command to stdin in order to invoke dayone (as your snippet seems to be trying to do), or you could just invoke dayone directly with exec (bear in mind exec still runs the command in a shell):
var multiLineVariable = 'Multi\nline\nstring';
// get the child_process module
const cp = require('child_process');
// open a child process
var process = cp.exec('dayone new', (error, stdout, stderr) => {
console.log(error, stdout, stderr);
});
// write your multiline variable to the child process
process.stdin.write(multiLineVariable);
process.stdin.end();
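For completeness, here is a rough sketch of the bash route mentioned above (this assumes dayone reads the entry from stdin, as the original pipe suggests; the here-document just carries the multi-line text through bash):
const cp = require('child_process');

const multiLineVariable = 'Multi\nline\nstring';
const bash = cp.spawn('bash');

bash.stdout.on('data', (data) => console.log(data.toString()));
bash.stderr.on('data', (data) => console.error(data.toString()));

// bash reads the command from its stdin; the here-document feeds dayone's stdin
bash.stdin.write(`dayone new <<'EOF'\n${multiLineVariable}\nEOF\n`);
bash.stdin.end();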
With Readable Streams it's really easy to listen to the input
const chunks = [];
process.stdin.on('readable', () => {
  let chunk;
  // read() returns null once there is nothing left to read for now
  while ((chunk = process.stdin.read()) !== null) {
    chunks.push(chunk);
  }
});
process.stdin.on('end', () => {
  const result = Buffer.concat(chunks);
  console.log(result.toString());
});
With Writable Streams you can write to the stdout
process.stdout.write('Multi\nline\nstring');
Hope this helps!
Related
So I have a Python script called main.py
#!/usr/bin/env python3
print("hello")
exit (12)
I am trying to capture the exit code via nodejs
const cp = require('child_process');
cp.exec('python main.py', (er, stdout, stderr) => {
console.log(stdout)
}).on('close', (code) => {
console.log(code)
});
This works and outputs "hello" and "12". But I need to execute the Python script via another cmd instance. I have tried the code below:
const cp = require('child_process');
cp.exec('start cmd /k python ./main.py', (er, stdout, stderr) => {
console.log(stdout)
}).on('close', (code) => {
console.log(code)
});
But this outputs "hello" and "0" instead of "12". I have been trying to figure this out for the past couple of hours but I'm not getting anywhere. What am I missing here?
Calling another child process is not a solution for me, since I am also trying to run the Python script on a development unit instead of in cmd.
I'm writing a desktop web app that uses node.js to access the local file system. I can currently use node.js to open and copy files to different places on the hard drive. What I would also like to do is allow the user to open a specific file using the application that is associated with the file type. In other words, if the user selects "myfile.doc" in a Windows environment, it will launch MSWord with that file.
I must be a victim of terminology, because I haven't been able to find anything but the spawning of child processes that communicate with node.js. I just want to launch a file for the user to view and then let them decide what to do with it.
Thanks
You can do this:
var cp = require("child_process");
cp.exec("document.docx"); // notice this without a callback..
process.exit(0); // exit this nodejs process
It's not safe though. To ensure that the command shows no errors or any undesired output, you should add the callback parameter:
child_process.exec(cmd,function(error,stdout,stderr){})
Next, you can work with events so you won't block execution of the script, or even make use of an external node.js script that launches and handles output from the processes you spawn from a "master" script.
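A minimal, hedged sketch of that event-based approach (the file name and the use of Windows' "start" here are assumptions for illustration):
const { spawn } = require('child_process');

// On Windows, "start" hands the file to its associated application;
// the empty "" is the window-title argument that start expects.
const child = spawn('cmd', ['/c', 'start', '', 'myfile.doc']);

child.on('error', (err) => console.error('Failed to launch:', err));
child.on('exit', (code) => console.log('Launcher exited with code', code));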
In the example below I have used TextMate's "mate" command to edit the file hello.js. You can run any command with child_process.exec, but the application you want to open the file in should provide you with command-line options.
var exec = require('child_process').exec;
exec('mate hello.js');
var childProcess = require('child_process');
childProcess.exec('start Example.xlsx', function (err, stdout, stderr) {
if (err) {
console.error(err);
return;
}
console.log(stdout);
process.exit(0);// exit process once it is opened
})
Note where 'exit' is called. This executes properly in Windows.
Simply call your file (any file with an extension, including .exe) from the command prompt, or programmatically:
var exec = require('child_process').exec;
exec('C:\\Users\\Path\\to\\File.js', function (err, stdout, stderr) {
if (err) {
throw err;
}
})
If you want to run a file without an extension, you can do almost the same, as follows:
var exec = require('child_process').exec;
exec('start C:\\Users\\Path\\to\\File', function (err, stdout, stderr) {
if (err) {
throw err;
}
})
As you can see, we use start to open the file, letting Windows (or Windows letting us) choose an application.
If you prefer opening a file with the async/await pattern:
const util = require('util');
const exec = util.promisify(require('child_process').exec);

async function openFile(path) {
  try {
    // the promisified exec rejects if the command fails
    await exec(path);
  } catch (err) {
    console.log(err);
    return;
  }
  process.exit(0);
}

openFile('D:\\Practice\\test.txt'); // enter your file location here
Is there a way to invoke a Windows batch file from inside JavaScript code? Or any other sensible way to do the below through a node package?
scripts.bat
ECHO "JAVASCRIPT is AWESOME"
PAUSE
scripts.js
// Code to read and run the batch file //
On the command prompt:
C:/> node scripts.js
One way to do this is with child_process. You just have to pass the file you want to execute.
const execFile = require('child_process').execFile;
const child = execFile('scripts.bat', [], (error, stdout, stderr) => {
if (error) {
throw error;
}
console.log(stdout);
});
I have a little Grunt task that shells out via node and runs "composer install".
var done = this.async();
var exec = require('child_process').exec;
var composer = exec(
'php bin/composer.phar install',
function(error, stdout, stderr) {
done(error===null);
}
);
composer.stdout.on(
'data',
grunt.log.write
);
As you can see, I'm outputting the stdout of this child process to grunt.log. All output is showing up nice and well as expected, except that the output is all in my default console color. If I run "composer install" directly I get highlighting that improves readability.
Since I'm new to node, Grunt and shelling out in general, I'm unsure about in which part of the system the coloring gets lost, or even how to debug this efficiently.
Using spawn with the option stdio: 'inherit' worked to include output color.
From the documentation:
options (Object)
cwd String Current working directory of the child process
stdio (Array|String) Child's stdio configuration. (See below)
...
As a shorthand, the stdio argument may also be one of the following
strings, rather than an array:
ignore - ['ignore', 'ignore', 'ignore']
pipe - ['pipe', 'pipe', 'pipe']
inherit - [process.stdin, process.stdout, process.stderr] or [0,1,2]
Here is an example of the working code:
require('child_process')
  .spawn('npm', ['install'], { stdio: 'inherit' })
  .on('exit', function (code) {
    if (code === 0) {
      console.log('Success!');
    }
  });
I wanted to make exec work but I did not find a way to access the same option.
The --colors flag worked for me. Node version 6.8.0...
--colors, -c force enabling of colors [boolean]
The following generic example would print the colors should any be returned...
var exec = require('child_process').exec;
exec('node someCommand --colors', function (error, stdout, stderr) {
console.log(stdout || stderr); // yay colors!
});
In some cases command line programs will prevent a colorized output when not run through a terminal, and thus you need to instruct the program to output the ANSI escape sequences.
In this case, it's as simple as adding an '--ansi' flag, for example:
var done = this.async();
var exec = require('child_process').exec;
var composer = exec(
'php bin/composer.phar install --ansi',
function(error, stdout, stderr) {
done(error===null);
}
);
composer.stdout.on(
'data',
grunt.log.write
);
If, like me, you are spawning a child node process (as opposed to a non-node script), you may find that the --ansi and --color options give you little success in retaining the colored output of child node processes.
Instead, you should inherit the instances of stdio of the current process.
My particular use-case involved forking a node server as a background task in order to execute an end-to-end test suite against an active HTTP interface. Here was my final solution:
var child = spawn('node', ['webserver/server.js', '--debug'], {
  // extra arguments belong in the args array above, not in the options object
  env: _.extend(process.env, {
    MOCK_API: mockApi
  }),
  // use process.stdout to retain ansi color codes
  stdio: [process.stdin, process.stdout, 'pipe']
});
// use custom error buffer in order to throw using grunt.fail()
var errorBuffer = '';
child.stderr.on('data', function(data) {
errorBuffer += data;
});
child.on('close', function(code) {
if (code) {
grunt.fail.fatal(errorBuffer, code);
} else {
done();
}
});
I was there too. If you:
Don't want to inherit the child stdout
Don't know which command is going to be executed (then you don't know which of e.g --ansi or --colors can work)
Then you should spawn a PTY from node. I made this node package for this exact reason.
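For illustration, here is a minimal sketch using the node-pty package (an assumption on my part; the answer above does not name its package, and node-pty's API may differ between versions). The PTY makes the child believe it is attached to a terminal, so it keeps emitting ANSI color codes:
const pty = require('node-pty');

// Run an arbitrary command inside a pseudo-terminal.
const child = pty.spawn('bash', ['-c', 'ls --color=auto'], {
  name: 'xterm-color',
  cols: 80,
  rows: 30,
  cwd: process.cwd(),
  env: process.env
});

child.onData((data) => process.stdout.write(data)); // color codes preserved
child.onExit(({ exitCode }) => console.log('exited with code', exitCode));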
I want to write a JavaScript function which will execute the system shell commands (ls for example) and return the value.
How do I achieve this?
I'll answer assuming that when the asker said "Shell Script" he meant Node.js backend JavaScript, possibly using commander.js to frame your code :)
You could use the child_process module from node's API. I pasted the example code below.
var exec = require('child_process').exec;
exec('cat *.js bad_file | wc -l',
function (error, stdout, stderr) {
console.log('stdout: ' + stdout);
console.log('stderr: ' + stderr);
if (error !== null) {
console.log('exec error: ' + error);
}
});
I don't know why the previous answers gave all sorts of complicated solutions. If you just want to execute a quick command like ls, you don't need async/await or callbacks or anything. Here's all you need - execSync:
const execSync = require('child_process').execSync;
// import { execSync } from 'child_process'; // replace ^ if using ES modules
const output = execSync('ls', { encoding: 'utf-8' }); // the default is 'buffer'
console.log('Output was:\n', output);
For error handling, add a try/catch block around the statement.
If you're running a command that takes a long time to complete, then yes, look at the asynchronous exec function.
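For example, the try/catch wrapper mentioned above might look like this (using ls purely for illustration):
const { execSync } = require('child_process');

try {
  const output = execSync('ls', { encoding: 'utf-8' });
  console.log('Output was:\n', output);
} catch (err) {
  // a non-zero exit code makes execSync throw; the exit status and captured
  // stderr are available on the error object
  console.error('Command failed with status', err.status);
  console.error(err.stderr);
}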
...a few years later...
ES6 has been accepted as a standard and ES7 is around the corner, so it deserves an updated answer. We'll use ES6 + async/await with nodejs + babel as an example; the prerequisites are:
nodejs with npm
babel
Your example foo.js file may look like:
import { exec } from 'child_process';
/**
* Execute simple shell command (async wrapper).
* @param {String} cmd
* @return {Object} { stdout: String, stderr: String }
*/
async function sh(cmd) {
return new Promise(function (resolve, reject) {
exec(cmd, (err, stdout, stderr) => {
if (err) {
reject(err);
} else {
resolve({ stdout, stderr });
}
});
});
}
async function main() {
let { stdout } = await sh('ls');
for (let line of stdout.split('\n')) {
console.log(`ls: ${line}`);
}
}
main();
Make sure you have babel:
npm i babel-cli -g
Install latest preset:
npm i babel-preset-latest
Run it via:
babel-node --presets latest foo.js
This depends entirely on the JavaScript environment. Please elaborate.
For example, in Windows Scripting, you do things like:
var shell = WScript.CreateObject("WScript.Shell");
shell.Run("command here");
In a nutshell:
// Instantiate the Shell object and invoke its execute method.
var oShell = new ActiveXObject("Shell.Application");
var commandtoRun = "C:\\Winnt\\Notepad.exe";
if (inputparms != "") {
var commandParms = document.Form1.filename.value;
}
// Invoke the execute method.
oShell.ShellExecute(commandtoRun, commandParms, "", "open", "1");
Note: These answers are from a browser based client to a Unix based web server.
Run command on client
You essentially can't. Security dictates that the code only runs within the browser, and its access to commands and the filesystem is limited.
Run ls on server
You can use an AJAX call to retrieve a dynamic page passing in your parameters via a GET.
Be aware that this also opens up a security risk, as you would have to do something to ensure that a rogue hacker does not get your application to run, say: /dev/null && rm -rf / ...
So in a nutshell, running shell commands from client-side JS is just a bad, bad idea... YMMV
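If you do go the AJAX route anyway, a minimal sketch might look like the following (the /api/ls endpoint and its dir parameter are purely hypothetical, and the server must strictly validate them):
// Browser side: ask the server to run the listing and return the result.
fetch('/api/ls?dir=' + encodeURIComponent('/var/www/uploads'))
  .then((res) => res.json())
  .then((listing) => console.log(listing))
  .catch((err) => console.error(err));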
With NodeJS it's as simple as this!
And if you want to run this script at each boot of your server, you can have a look at the forever-service application!
var exec = require('child_process').exec;
exec('php main.php', function (error, stdOut, stdErr) {
// do what you want!
});
function exec(cmd, handler = function (error, stdout, stderr) {
  console.log(stdout);
  if (error !== null) {
    console.log(stderr);
  }
}) {
  const childfork = require('child_process');
  return childfork.exec(cmd, handler);
}
This function can be easily used like:
exec('echo test');
//output:
//test
exec('echo test', function(err, stdout){console.log(stdout+stdout+stdout)});
//output:
//testtesttest
Here is a simple example that executes the ifconfig shell command of Linux:
var process = require('child_process');
process.exec('ifconfig',function (err,stdout,stderr) {
if (err) {
console.log("\n"+stderr);
} else {
console.log(stdout);
}
});
If you are using npm, you can use the shelljs package.
To install: npm install [-g] shelljs
var shell = require('shelljs');
shell.ls('*.js').forEach(function (file) {
// do something
});
See more: https://www.npmjs.com/package/shelljs
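shelljs also exposes exec for running arbitrary commands (a small sketch; the command shown is only an example):
var shell = require('shelljs');

var result = shell.exec('node --version', { silent: true });
if (result.code === 0) {
  console.log('node version: ' + result.stdout.trim());
} else {
  console.error(result.stderr);
}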
Another post on this topic with a nice jQuery/Ajax/PHP solution:
shell scripting and jQuery
In IE, you can do this :
var shell = new ActiveXObject("WScript.Shell");
shell.run("cmd /c dir & pause");
With nashorn you can write a script like this:
$EXEC('find -type f');
var files = $OUT.split('\n');
files.forEach(...
...
and run it:
jjs -scripting each_file.js
As far as I can tell, there is no built-in function, method or otherwise, in the official ECMAScript specification to run an external process. That said, extensions are allowed, see this note from the spec, for example:
NOTE Examples of built-in functions include parseInt and Math.exp. A
host or implementation may provide additional built-in functions that
are not described in this specification.
One such "host" is Node.js which has the child_process module. Let's try this code to execute the Linux shell command ps -aux, saved in runps.js, based on the child_process documentation:
const { spawn } = require('child_process');
const ps = spawn('ps', ['-aux']);
ps.stdout.on('data', (data) => {
console.log(`stdout: ${data}`);
});
ps.stderr.on('data', (data) => {
console.error(`stderr: ${data}`);
});
ps.on('close', (code) => {
console.log(`child process exited with code ${code}`);
});
Which produces the following example output, running it in docker:
$ docker run --rm -v "$PWD":/usr/src/app -w /usr/src/app node:17-bullseye node ./runps.js
stdout: USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 1 0.0 0.8 319312 33888 ? Ssl 11:08 0:00 node ./runps.js
root 13 0.0 0.0 6700 2844 ? R 11:08 0:00 ps -aux
child process exited with code 0
The thing I like about this module, is that it's included with the Node.js distribution, no npm install ... needed.
If you search the Node.js code in github for spawn you will find references to the implementation in C or C++ in the engine. Modern browsers like Firefox and Chrome would be reluctant to extend JavaScript with such features, for obvious security reasons, even if the underlying engine such as V8 supports it.
On that note, it's better not to run our container as root, let's try the above example again, adding a random user this time.
$ docker run --rm -u 7000 -v "$PWD":/usr/src/app -w /usr/src/app node:17-bullseye node ./runps.js
stdout: USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
7000 1 5.0 0.8 319312 33812 ? Ssl 11:19 0:00 node ./runps.js
7000 13 0.0 0.0 6700 2832 ? R 11:19 0:00 ps -aux
child process exited with code 0
Of course that's better but not enough. If this approach is used at all, more precautions must be taken, such as ensuring that no arbitrary user commands can be executed.
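One possible precaution, sketched under the assumption that you only ever need a small fixed set of commands, is to whitelist them and use execFile so no shell ever interprets user input:
const { execFile } = require('child_process');

// Only these commands, with these exact arguments, can ever run.
const ALLOWED = {
  list: { cmd: 'ls', args: ['-l'] },
  processes: { cmd: 'ps', args: ['-aux'] },
};

function runAllowed(name, callback) {
  const entry = ALLOWED[name];
  if (!entry) {
    return callback(new Error('Command not allowed: ' + name));
  }
  // execFile does not spawn a shell, so there is no shell injection to worry about
  execFile(entry.cmd, entry.args, callback);
}

runAllowed('list', (err, stdout) => {
  if (err) return console.error(err);
  console.log(stdout);
});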
Windows 10
My version of Windows 10 still has Windows Script Host which can run JScript on the console with the wscript.exe or cscript.exe programs, i.e. no browser needed. To try it out you can open a PowerShell Windows Terminal. Save the following code into a file which you can call shell.js:
WScript.StdOut.WriteLine("Hallo, ECMAScript on Windows!");
WScript.CreateObject("WScript.Shell").run("C://Windows//system32//mspaint.exe");
And on the command line, run:
cscript .\shell.js
Which shows the following and opens Paint:
Microsoft (R) Windows Script Host Version 5.812
Copyright (C) Microsoft Corporation. All rights reserved.
Hallo, ECMAScript on Windows!
Other variations exist. Find the documentation applicable to your preferred JavaScript runtime environment.
const fs = require('fs');
function ls(startPath) {
fs.readdir(startPath, (err, entries) => {
console.log(entries);
})
}
ls('/home/<profile_name>/<folder_name>')
The startPath used here refers to a Debian-based distro.
Js file
var oShell = new ActiveXObject("Shell.Application");
oShell.ShellExecute("E:/F/Name.bat","","","Open","");
Bat file
powershell -Command "& {ls | Out-File -FilePath 'E:/F/Name.txt'}"
Js file run with node namefile.js
const fs = require('fs')
fs.readFile('E:/F/Name.txt', (err, data) => {
if (err) throw err;
console.log(data.toString());
})
You can also do everything in one solution with an asynchronous function, as sketched below. Running files directly like this can pose security problems, though.
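A minimal sketch of that single-async-function approach, reusing the hypothetical E:/F paths from above:
const { promisify } = require('util');
const exec = promisify(require('child_process').exec);
const fs = require('fs').promises;

async function runBatAndReadOutput() {
  try {
    await exec('E:\\F\\Name.bat');                            // run the batch file
    const data = await fs.readFile('E:/F/Name.txt', 'utf8');  // read its output
    console.log(data);
  } catch (err) {
    console.error(err);
  }
}

runBatAndReadOutput();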