How to execute a shell script with node? [duplicate] - javascript

I am in the process of porting a CLI library from Ruby over to Node.js. In my code I execute several third party binaries when necessary. I am not sure how best to accomplish this in Node.
Here's an example in Ruby where I call PrinceXML to convert a file to a PDF:
cmd = system("prince -v builds/pdf/book.html -o builds/pdf/book.pdf")
What is the equivalent code in Node?

For newer versions of Node.js (v8.1.4 and later), the events and calls are similar or identical to older versions, but it's encouraged to use the newer standard language features. Examples:
For buffered, non-stream formatted output (you get it all at once), use child_process.exec:
const { exec } = require('child_process');

exec('cat *.js bad_file | wc -l', (err, stdout, stderr) => {
  if (err) {
    // node couldn't execute the command
    return;
  }

  // the *entire* stdout and stderr (buffered)
  console.log(`stdout: ${stdout}`);
  console.log(`stderr: ${stderr}`);
});
You can also use it with Promises:
const util = require('util');
const exec = util.promisify(require('child_process').exec);

async function ls() {
  const { stdout, stderr } = await exec('ls');
  console.log('stdout:', stdout);
  console.log('stderr:', stderr);
}

ls();
If you wish to receive the data gradually in chunks (output as a stream), use child_process.spawn:
const { spawn } = require('child_process');

const child = spawn('ls', ['-lh', '/usr']);

// use child.stdout.setEncoding('utf8'); if you want text chunks
child.stdout.on('data', (chunk) => {
  // data from standard output is here as buffers
});

// since these are streams, you can pipe them elsewhere
// (dest stands for any writable stream, e.g. a file stream or process.stderr)
child.stderr.pipe(dest);

child.on('close', (code) => {
  console.log(`child process exited with code ${code}`);
});
Both of these functions have a synchronous counterpart. An example for child_process.execSync:
const { execSync } = require('child_process');
// stderr is sent to stderr of parent process
// you can set options.stdio if you want it to go elsewhere
let stdout = execSync('ls');
As well as child_process.spawnSync:
const { spawnSync } = require('child_process');
const child = spawnSync('ls', ['-lh', '/usr']);
console.log('error', child.error);
console.log('stdout ', child.stdout);
console.log('stderr ', child.stderr);
Note: The following code is still functional, but is primarily targeted at users of ES5 and before.
The child_process module for spawning child processes is well documented in the Node.js documentation (v5.0.0). To execute a command and fetch its complete output as a buffer, use child_process.exec:
var exec = require('child_process').exec;
var cmd = 'prince -v builds/pdf/book.html -o builds/pdf/book.pdf';

exec(cmd, function(error, stdout, stderr) {
  // command output is in stdout
});
If you need to handle process I/O with streams, such as when you are expecting large amounts of output, use child_process.spawn:
var spawn = require('child_process').spawn;
var child = spawn('prince', [
  '-v', 'builds/pdf/book.html',
  '-o', 'builds/pdf/book.pdf'
]);

child.stdout.on('data', function(chunk) {
  // output will be here in chunks
});

// or if you want to send output elsewhere
child.stdout.pipe(dest);
If you are executing a file rather than a command, you might want to use child_process.execFile, whose parameters are almost identical to spawn's, but which has a fourth callback parameter like exec for retrieving output buffers. That might look a bit like this:
var execFile = require('child_process').execFile;

execFile(file, args, options, function(error, stdout, stderr) {
  // command output is in stdout
});
As of v0.11.12, Node supports synchronous spawn and exec. All of the methods described above are asynchronous, and each has a synchronous counterpart, documented in the Node.js child_process documentation. While they are useful for scripting, do note that unlike the methods used to spawn child processes asynchronously, the synchronous methods do not return an instance of ChildProcess.
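For example, spawnSync returns a plain result object instead of a ChildProcess (a minimal sketch for the same PrinceXML command):
var spawnSync = require('child_process').spawnSync;

var result = spawnSync('prince', [
  '-v', 'builds/pdf/book.html',
  '-o', 'builds/pdf/book.pdf'
]);

if (result.error) {
  // the process could not be spawned at all (e.g. prince is not installed)
  console.error(result.error);
} else if (result.status !== 0) {
  // non-zero exit code; result.stdout and result.stderr are Buffers
  console.error(result.stderr.toString());
}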

Node.js v15.8.0, LTS v14.15.4, and v12.20.1 --- Feb 2021
Async method (Unix):
'use strict';
const { spawn } = require( 'child_process' );

const ls = spawn( 'ls', [ '-lh', '/usr' ] );

ls.stdout.on( 'data', ( data ) => {
  console.log( `stdout: ${ data }` );
} );

ls.stderr.on( 'data', ( data ) => {
  console.log( `stderr: ${ data }` );
} );

ls.on( 'close', ( code ) => {
  console.log( `child process exited with code ${ code }` );
} );
Async method (Windows):
'use strict';
const { spawn } = require( 'child_process' );
// NOTE: Windows users, this command appears to differ for a few users.
// You can think of this as using Node to execute things in your Command Prompt.
// If `cmd` works there, it should work here.
// If you have an issue, try `dir`:
// const dir = spawn( 'dir', [ '.' ] );
const dir = spawn( 'cmd', [ '/c', 'dir' ] );
dir.stdout.on( 'data', ( data ) => console.log( `stdout: ${ data }` ) );
dir.stderr.on( 'data', ( data ) => console.log( `stderr: ${ data }` ) );
dir.on( 'close', ( code ) => console.log( `child process exited with code ${code}` ) );
Sync:
'use strict';
const { spawnSync } = require( 'child_process' );
const ls = spawnSync( 'ls', [ '-lh', '/usr' ] );
console.log( `stderr: ${ ls.stderr.toString() }` );
console.log( `stdout: ${ ls.stdout.toString() }` );
From Node.js v15.8.0 Documentation
The same goes for Node.js v14.15.4 Documentation and Node.js v12.20.1 Documentation

You are looking for child_process.exec
Here is the example:
const exec = require('child_process').exec;

const child = exec('cat *.js bad_file | wc -l',
  (error, stdout, stderr) => {
    console.log(`stdout: ${stdout}`);
    console.log(`stderr: ${stderr}`);
    if (error !== null) {
      console.log(`exec error: ${error}`);
    }
});

Since version 4 the closest alternative is the child_process.execSync method:
const {execSync} = require('child_process');
let output = execSync('prince -v builds/pdf/book.html -o builds/pdf/book.pdf');
⚠️ Note that the execSync call blocks the event loop.
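Note also that execSync throws if the command exits with a non-zero code, so you may want to wrap it (a minimal sketch):
try {
  const output = execSync('prince -v builds/pdf/book.html -o builds/pdf/book.pdf');
  console.log(output.toString());
} catch (err) {
  // err.status is the exit code, err.stderr holds the captured error output
  console.error(`prince failed with code ${err.status}: ${err.stderr}`);
}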

Now you can use shelljs (from Node v4) as follows:
var shell = require('shelljs');
shell.echo('hello world');
shell.exec('node --version');
Install with
npm install shelljs
See https://github.com/shelljs/shelljs
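If you need the output as a value rather than echoed to the terminal, shell.exec also returns an object with the exit code and the captured streams, for example:
var result = shell.exec('node --version', { silent: true });

console.log('exit code:', result.code);
console.log('stdout:', result.stdout.trim());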

const exec = require("child_process").exec;

exec("ls", (error, stdout, stderr) => {
  // do whatever here
});

If you want something that closely resembles the top answer but is also synchronous then this will work.
var execSync = require('child_process').execSync;
var cmd = "echo 'hello world'";
var options = {
  encoding: 'utf8'
};

console.log(execSync(cmd, options));

I just wrote a CLI helper to deal with Unix/Windows easily.
Javascript:
define(["require", "exports"], function (require, exports) {
/**
* Helper to use the Command Line Interface (CLI) easily with both Windows and Unix environments.
* Requires underscore or lodash as global through "_".
*/
var Cli = (function () {
function Cli() {}
/**
* Execute a CLI command.
* Manage Windows and Unix environment and try to execute the command on both env if fails.
* Order: Windows -> Unix.
*
* #param command Command to execute. ('grunt')
* #param args Args of the command. ('watch')
* #param callback Success.
* #param callbackErrorWindows Failure on Windows env.
* #param callbackErrorUnix Failure on Unix env.
*/
Cli.execute = function (command, args, callback, callbackErrorWindows, callbackErrorUnix) {
if (typeof args === "undefined") {
args = [];
}
Cli.windows(command, args, callback, function () {
callbackErrorWindows();
try {
Cli.unix(command, args, callback, callbackErrorUnix);
} catch (e) {
console.log('------------- Failed to perform the command: "' + command + '" on all environments. -------------');
}
});
};
/**
* Execute a command on Windows environment.
*
* #param command Command to execute. ('grunt')
* #param args Args of the command. ('watch')
* #param callback Success callback.
* #param callbackError Failure callback.
*/
Cli.windows = function (command, args, callback, callbackError) {
if (typeof args === "undefined") {
args = [];
}
try {
Cli._execute(process.env.comspec, _.union(['/c', command], args));
callback(command, args, 'Windows');
} catch (e) {
callbackError(command, args, 'Windows');
}
};
/**
* Execute a command on Unix environment.
*
* #param command Command to execute. ('grunt')
* #param args Args of the command. ('watch')
* #param callback Success callback.
* #param callbackError Failure callback.
*/
Cli.unix = function (command, args, callback, callbackError) {
if (typeof args === "undefined") {
args = [];
}
try {
Cli._execute(command, args);
callback(command, args, 'Unix');
} catch (e) {
callbackError(command, args, 'Unix');
}
};
/**
* Execute a command no matters what's the environment.
*
* #param command Command to execute. ('grunt')
* #param args Args of the command. ('watch')
* #private
*/
Cli._execute = function (command, args) {
var spawn = require('child_process').spawn;
var childProcess = spawn(command, args);
childProcess.stdout.on("data", function (data) {
console.log(data.toString());
});
childProcess.stderr.on("data", function (data) {
console.error(data.toString());
});
};
return Cli;
})();
exports.Cli = Cli;
});
Typescript original source file:
/**
 * Helper to use the Command Line Interface (CLI) easily with both Windows and Unix environments.
 * Requires underscore or lodash as global through "_".
 */
export class Cli {
  /**
   * Execute a CLI command.
   * Manages Windows and Unix environments and tries to execute the command on both if the first fails.
   * Order: Windows -> Unix.
   *
   * @param command Command to execute. ('grunt')
   * @param args Args of the command. ('watch')
   * @param callback Success.
   * @param callbackErrorWindows Failure on Windows env.
   * @param callbackErrorUnix Failure on Unix env.
   */
  public static execute(command: string, args: string[] = [], callback?: any, callbackErrorWindows?: any, callbackErrorUnix?: any) {
    Cli.windows(command, args, callback, function () {
      callbackErrorWindows();
      try {
        Cli.unix(command, args, callback, callbackErrorUnix);
      } catch (e) {
        console.log('------------- Failed to perform the command: "' + command + '" on all environments. -------------');
      }
    });
  }

  /**
   * Execute a command on a Windows environment.
   *
   * @param command Command to execute. ('grunt')
   * @param args Args of the command. ('watch')
   * @param callback Success callback.
   * @param callbackError Failure callback.
   */
  public static windows(command: string, args: string[] = [], callback?: any, callbackError?: any) {
    try {
      Cli._execute(process.env.comspec, _.union(['/c', command], args));
      callback(command, args, 'Windows');
    } catch (e) {
      callbackError(command, args, 'Windows');
    }
  }

  /**
   * Execute a command on a Unix environment.
   *
   * @param command Command to execute. ('grunt')
   * @param args Args of the command. ('watch')
   * @param callback Success callback.
   * @param callbackError Failure callback.
   */
  public static unix(command: string, args: string[] = [], callback?: any, callbackError?: any) {
    try {
      Cli._execute(command, args);
      callback(command, args, 'Unix');
    } catch (e) {
      callbackError(command, args, 'Unix');
    }
  }

  /**
   * Execute a command no matter what the environment is.
   *
   * @param command Command to execute. ('grunt')
   * @param args Args of the command. ('watch')
   * @private
   */
  private static _execute(command, args) {
    var spawn = require('child_process').spawn;
    var childProcess = spawn(command, args);

    childProcess.stdout.on("data", function (data) {
      console.log(data.toString());
    });

    childProcess.stderr.on("data", function (data) {
      console.error(data.toString());
    });
  }
}
Example of use:
Cli.execute(Grunt._command, args, function (command, args, env) {
  console.log('Grunt has been automatically executed. (' + env + ')');
}, function (command, args, env) {
  console.error('------------- Windows "' + command + '" command failed, trying Unix... ---------------');
}, function (command, args, env) {
  console.error('------------- Unix "' + command + '" command failed too. ---------------');
});

Use this lightweight npm package: system-commands
Look at it here.
Import it like this:
const system = require('system-commands')
Run commands like this:
system('ls').then(output => {
  console.log(output)
}).catch(error => {
  console.error(error)
})

If you don't mind a dependency and want to use promises, child-process-promise works:
installation
npm install child-process-promise --save
exec Usage
var exec = require('child-process-promise').exec;

exec('echo hello')
  .then(function (result) {
    var stdout = result.stdout;
    var stderr = result.stderr;
    console.log('stdout: ', stdout);
    console.log('stderr: ', stderr);
  })
  .catch(function (err) {
    console.error('ERROR: ', err);
  });
spawn usage
var spawn = require('child-process-promise').spawn;

var promise = spawn('echo', ['hello']);
var childProcess = promise.childProcess;

console.log('[spawn] childProcess.pid: ', childProcess.pid);

childProcess.stdout.on('data', function (data) {
  console.log('[spawn] stdout: ', data.toString());
});
childProcess.stderr.on('data', function (data) {
  console.log('[spawn] stderr: ', data.toString());
});

promise.then(function () {
    console.log('[spawn] done!');
  })
  .catch(function (err) {
    console.error('[spawn] ERROR: ', err);
  });
ECMAScript Modules import...from syntax
import {exec} from 'child-process-promise';
let result = await exec('echo hi');
console.log(result.stdout);

@hexacyanide's answer is almost complete.
On Windows, the prince command could be prince.exe, prince.cmd, prince.bat, or just prince (I'm not aware of how gems are bundled, but npm bins come with an sh script and a batch script, npm and npm.cmd).
If you want to write a portable script that would run on Unix and Windows, you have to spawn the right executable.
Here is a simple yet portable spawn function:
var child_process = require('child_process');

function spawn(cmd, args, opt) {
  // use /^win/ so that 'darwin' (macOS) is not mistaken for Windows
  var isWindows = /^win/.test(process.platform);

  if (isWindows) {
    if (!args) args = [];
    args.unshift(cmd);
    args.unshift('/c');
    cmd = process.env.comspec;
  }

  return child_process.spawn(cmd, args, opt);
}

var cmd = spawn("prince", ["-v", "builds/pdf/book.html", "-o", "builds/pdf/book.pdf"]);

// Use these props to get execution results:
// cmd.stdin;
// cmd.stdout;
// cmd.stderr;
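On newer Node versions you can often skip the cmd.exe juggling entirely by passing the shell option to child_process.spawn, which runs the command via cmd.exe on Windows and /bin/sh elsewhere (a sketch, assuming your Node version supports the option):
var proc = child_process.spawn('prince -v builds/pdf/book.html -o builds/pdf/book.pdf', {
  shell: true // let Node pick the platform's shell
});

proc.stdout.pipe(process.stdout);
proc.stderr.pipe(process.stderr);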

Related

Is there a way to get 'live' output lines from a python script spawned by child_process.execFile without flushing stdout every time?

I am trying to get the lines a ('never ending') python script puts into stdout. But currently my code would only log something to the console when the python process exits. Is there a way I can get the 'live' output of the python script line by line?
spawn_child.js:
let execFile = require("child_process").execFile;

var child = execFile("python3", ["PATH_TO_FILE"]);

child.stdout.on("data", data => {
  console.log(data.toString());
});
child.stderr.on("data", data => {
  console.log(data.toString());
});
child.on("exit", code => {
  console.log("Child exited with code " + code);
});
The python file:
from time import sleep

while True:
    sleep(3)
    print("test")
Edit: It works when using a nodejs script instead of a python script
Change the Python script to
import time
import sys

while True:
    time.sleep(1)
    print("test")
    sys.stdout.flush()
and increase the buffer size of the child process
const child = execFile("python", ["./runner.py"], {
  detached: true,
  maxBuffer: 10 * 1024 * 1024 * 1024
});
Or you can do it without flushing stdout by using python-shell:
const { PythonShell } = require('python-shell');

let pyshell = new PythonShell('runner.py');

pyshell.on('message', function (message) {
  console.log(message);
});

pyshell.end(function (err, code, signal) {
  if (err) throw err;
  console.log('The exit code was: ' + code);
  console.log('The exit signal was: ' + signal);
  console.log('finished');
});
Use spawn instead of execFile, and don't forget the shell and stdio options.
const spawn = require("child_process").spawn;

// With stdio: 'inherit' the child's output is forwarded straight to this
// process's stdout/stderr, so no 'data' handlers are needed
// (child.stdout is null in that case).
const child = spawn("python3", ["file.py"], { shell: true, stdio: "inherit" });

child.on("close", function (code) {
  console.log("Child process exited with exit code " + code);
});
You can also add a cwd option.
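For example (with a hypothetical project path):
const child = spawn("python3", ["file.py"], {
  shell: true,
  stdio: "inherit",
  cwd: "/path/to/project" // the child's working directory
});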
I was trying to implement something similar inside a Next.js application and wanted live output from my Python script. Using python-shell I had the same issue that it only gave output when the process exited, so I ended up using node-pty instead, which worked as expected:
import { spawn } from "node-pty"
const pyProcess = spawn("python", ["path/to/python/script"], {
  name: 'xterm-color',
  cols: 80,
  rows: 30,
  cwd: process.cwd(),
});

pyProcess.on('data', function (data: { toString: () => any; }) {
  console.log(data.toString());
});

pyProcess.on('exit', (code: any) => {
  console.log(`child process exited with code ${code}`);
});

Run cmd.exe and make some command with Electron.js

Is it possible to run cmd.exe and execute some command with Electron.js?
If yes then how can I do this?
In your main.js file, you can put the following code:
//Uses node.js process manager
const electron = require('electron');
const child_process = require('child_process');
const dialog = electron.dialog;

// This function will output the lines from the script
// and will invoke the callback with the exit code when it's done.
function run_script(command, args, callback) {
  var child = child_process.spawn(command, args, {
    encoding: 'utf8',
    shell: true
  });
  // You can also use a variable to save the output for when the script closes later
  child.on('error', (error) => {
    dialog.showMessageBox({
      title: 'Title',
      type: 'warning',
      message: 'Error occurred.\r\n' + error
    });
  });

  child.stdout.setEncoding('utf8');
  child.stdout.on('data', (data) => {
    //Here is the output
    data = data.toString();
    console.log(data);
  });

  child.stderr.setEncoding('utf8');
  child.stderr.on('data', (data) => {
    // Return some data to the renderer process with the mainprocess-response ID
    mainWindow.webContents.send('mainprocess-response', data);
    //Here is the output from the command
    console.log(data);
  });

  child.on('close', (code) => {
    //Here you can get the exit code of the script
    switch (code) {
      case 0:
        dialog.showMessageBox({
          title: 'Title',
          type: 'info',
          message: 'End process.\r\n'
        });
        break;
    }
    // Invoke the callback once the process has finished, passing the exit code
    if (typeof callback === 'function')
      callback(code);
  });
}
Now you can execute an arbitrary command (the example is from the Windows command prompt, but the function is universal) by calling:
run_script("dir", ["/A /B /C"], null);
The parameters of your command are in fact an array (["/A /B /C"]), and the last parameter is a callback to be executed; you can pass null if a special callback function is not needed.
It is possible by using Node's child_process. You can use this function:
const exec = require('child_process').exec;

function execute(command, callback) {
  exec(command, (error, stdout, stderr) => {
    callback(stdout);
  });
}

// call the function
execute('ping -c 4 0.0.0.0', (output) => {
  console.log(output);
});
There are also many packages on npm for this topic.

Cannot kill a linux process using node js process.kill(pid)

I'm trying to kill a background process using nodejs process.kill(pid, 'SIGTERM'), but the process is not getting killed.
I executed the node script mentioned below and later checked the process using ps -efww | grep 19783 | grep -v grep from the prompt to confirm it was still not killed.
I can confirm that the process it is trying to kill was started by the same user, so there is no permission issue.
Is there something I need to pass to get the process killed?
Node Version: 8.11.1
OS: Linux 3.10.0-327.10.1.e17.x86_64
Reference : Node process
Code :
'use strict';

const argv = require('yargs').argv;
const exec = require('child_process').exec;

function execute(command) {
  console.log("Executing Command : ", command);
  return new Promise((resolve, reject) => {
    exec(command, {
      maxBuffer: 1024 * 5000000
    }, (error, stdout, stderr) => {
      if (error) {
        console.log(`ERROR: Something went wrong while executing ${command}: ${error}`);
        reject(error);
      } else {
        resolve(stdout);
      }
    });
  });
}

function kill(pid) {
  try {
    console.log(`Killing Process : ${pid}`);
    process.kill(pid, 'SIGTERM');
    let command = `ps -efww | grep ${pid} | grep -v grep | grep -v dzdo `;
    let output = execute(command).then(res => {
      console.log(`output: ${res}`);
    }).catch(err => console.log(err));
  } catch (e) {
    console.log(`Invalid Process ID:${pid}, failed during kill, "ERROR: ${e}"`);
  }
}

function main() {
  // remove all spaces;
  if (argv.kill) {
    let allPIDs = argv.kill || undefined;
    // console.log(`ALL PID's: ${allPIDs}`);
    allPIDs = allPIDs.toString().replace(/\s/, '').split(',');
    if (allPIDs.length > 0) {
      allPIDs.forEach(pid => {
        if (!isNaN(pid)) {
          // console.log(`Valid PID: ${pid}`);
          kill(pid);
        } else {
          console.log(`ERROR: Invalid Process ID : ${pid}, Skipped Kill `);
        }
      });
    }
  }
}

main();
Assuming this code is saved as killer.js
Usage: node killer.js --kill=19783
Try SIGKILL instead of SIGTERM
The doc says
'SIGKILL' cannot have a listener installed, it will unconditionally terminate Node.js on all platforms.
So I think it's worth trying.
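A minimal sketch of that change inside the question's kill() function:
try {
  process.kill(pid, 'SIGKILL'); // cannot be caught or ignored by the target process
} catch (e) {
  // e.code is 'ESRCH' if no such process exists, 'EPERM' if you lack permission
  console.log(`Failed to kill ${pid}: ${e.code}`);
}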

How to ignore errors generated by child_process.exec?

I am executing shell script commands from my Node.js script. One of these commands is "npm install", followed by a command to run the index file of a Node.js project.
The npm install command is returning an error generated by node-gyp. In general, this error does not affect my service. However, child_process.exec is catching it and stopping the script. My question is, how do I trigger the exec command and ignore the error returned?
Below is a fraction of the code snippet
const exec = require('child_process').exec;

exec("npm install", {
  cwd: serviceDirectory + gitRepo
}, (error1, stdout, stderr) => {
  if (error1) {
    // this error is for testing purposes
    util.log(error1);
  }

  // run the service
  exec("node index.js", {
    cwd: serviceDirectory + gitRepo + "/"
  }, cb);
});
You can use try/catch to handle the errors, for example:
const { promisify } = require('util');
const exec = promisify(require('child_process').exec);

export default async function () {
  const dataFormat = {
    stdout: '',
    stderr: '',
  };
  // copy the template so cpu and diskUsed don't share the same object
  let cpu = { ...dataFormat };
  let diskUsed = { ...dataFormat };

  try {
    cpu = await exec('top -bn1 | grep "Cpu(s)" | sed "s/.*, *\\([0-9.]*\\)%* id.*/\\1/"');
  } catch (error) {
    cpu.stderr = error.stderr;
  }

  try {
    diskUsed = await exec("df -h | awk 'NR==2{printf $3}'");
  } catch (error) {
    diskUsed.stderr = error.stderr;
  }

  const payload = {
    cpu,
    diskUsed,
  };

  return payload;
}

How execute multiple commands on SSH2 using NodeJS

I'm trying to deploy from GitHub using SSH2, and I want to execute more than one command, in the order given in the array. The code I'm using now is included below.
async.series([
  ...
  // Deploy from GitHub
  function (callback) {
    // Console shizzle:
    console.log('');
    console.log('Deploying...'.red.bold);
    console.log();
    console.log();

    var deployFunctions = [
      {
        command: 'cd ' + envOptions.folder + ' && pwd',
        log: false
      },
      {
        command: 'pwd'
      },
      {
        command: 'su ' + envOptions.user,
        log: false
      },
      {
        command: 'git pull'
      },
      {
        command: 'chmod 0777 * -R',
        log: false
      }
    ];

    async.eachSeries(deployFunctions, function (item, callback) {
      deployment.ssh2.exec(item.command, function (err, stream) {
        deployment.logExec(item);

        stream.on('data', function (data, extended) {
          console.log(data.toString().trim());
          console.log();
        });

        function done() {
          callback(err);
        }

        stream.on('exit', done);
        stream.on('end', done);
      });
    }, function () {
      callback();
    });
  },
  ...
]);
But, after I cd'ed to the right directory, it forgets where it was and starts all over again.
$ cd /some/folder && pwd
/some/folder
$ pwd
/root
@robertklep is correct about why your cd doesn't persist: each command invokes a distinct shell instance which starts in its initial state. You could prefix each command with cd /home/jansenstok/domains/alcoholtesterwinkel.com/public_html/ && as a quick fix, but really you are setting yourself up for pain. What you want is a shell script with all the power of multiple lines, as opposed to a list of individual disconnected commands.
Look at using ssh2's sftp function to transfer a complete shell script to the remote machine as step 1, execute it via exec (/bin/bash /tmp/your_deploy_script.sh) as step 2, and then delete the script as step 3.
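A rough sketch of that approach with ssh2 (the host, credentials, and script paths are made up for illustration):
const fs = require('fs');
const { Client } = require('ssh2');

const conn = new Client();

conn.on('ready', () => {
  // Step 1: upload the deploy script
  conn.sftp((err, sftp) => {
    if (err) throw err;
    sftp.fastPut('./deploy.sh', '/tmp/deploy.sh', (err) => {
      if (err) throw err;
      // Step 2: run it in a single shell so cd, su, etc. persist between lines,
      // and remove it afterwards (step 3)
      conn.exec('/bin/bash /tmp/deploy.sh && rm /tmp/deploy.sh', (err, stream) => {
        if (err) throw err;
        stream.on('data', (data) => console.log(data.toString()));
        stream.stderr.on('data', (data) => console.error(data.toString()));
        stream.on('close', (code) => {
          console.log('deploy script exited with code ' + code);
          conn.end();
        });
      });
    });
  });
}).connect({
  host: 'example.com',
  username: 'deploy',
  privateKey: fs.readFileSync('/home/deploy/.ssh/id_rsa')
});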
I know this is a super old question, but I ran into this problem while trying to manage an ACE through my Node server. The answer didn't work for me, but several searches later led me to a wrapper that worked really well for me. Just wanted to share here because this was the top link in my Google search. It's called ssh2shell and can be found here: https://www.npmjs.com/package/ssh2shell
It's very simple to use, just pass an array of commands and they run one by one waiting for each command to complete before moving on to the next.
A practical example:
const { Client } = require('ssh2');

const client = new Client();
const cmds = [
  'ls -lah \n',
  'cd /mnt \n',
  'pwd \n',
  'ls -lah \n',
  'exit \n',
];

client.on('ready', () => {
  console.log('Client :: ready');
  client.shell((err, stream) => {
    stream.on('close', (code) => {
      console.log('stream :: close\n', { code });
    }).on('data', (myData) => {
      console.log('stream :: data\n', myData.toString());
    }).on('exit', (code) => {
      console.log('stream :: exit\n', { code });
      client.end();
    }).on('error', (e) => {
      console.log('stream :: error\n', { e });
      rej(e); // reject() of the surrounding Promise, if you wrap this in one
    });

    for (let i = 0; i < cmds.length; i += 1) {
      const cmd = cmds[i];
      stream.write(`${cmd}`);
    }
  });
}).connect({
  host: '127.0.0.1',
  port: 22,
  username: 'root',
  password: 'root',
});
All the examples in the docs use stream.end(), which causes the creation of a new session instead of using the current one.
You shouldn't use "shell" in your program, because the "shell" command invokes a new terminal on the system to do your job. You need to use the "exec" command instead; by default, "exec" emits "exit" after the command you gave has been executed.
