Controlling a specific process id in Node? - javascript

I'm developing a node application that allows a client to control a program running on the server. The program must always be running on its own terminal window. Ideal scenario is outlined as follows:
client clicks a button -> command is run in the terminal running the program -> program does something
I'm not too experienced with node, but I know that I can run command-line scripts using the ChildProcess event emitter. The issue I'm having is how to tell node to run a command against a particular process (i.e. the one running the program I'm trying to manipulate). Is there a way to execute commands on a specific process id? Is there a way to detect all current processes and their ids?
Any suggestions or direction would be greatly appreciated.

When you create a child process, you can assign it to a variable so that you can reference it later. In this instance, you may want to add it to an object or array so that you can reference a group of running processes.
You can refer to the documentation for spawn or exec for examples.
One way to send commands to a created child process is with signals, e.g. child.kill('SIGTERM');
For example:
var spawn = require('child_process').spawn;
var children = [];

function spawnChild() {
  // 'cmd' and its arguments are placeholders for the program you want to run
  var cmd = spawn('cmd', ['-p1', 'param']);
  cmd.stdout.on('data', function (data) {
    console.log('stdout: ' + data);
  });
  cmd.stderr.on('data', function (data) {
    console.log('stderr: ' + data);
  });
  cmd.on('close', function (code) {
    console.log('child process exited with code ' + code);
  });
  // Save a reference to this child
  children.push(cmd);
}

// Spawn 5 children
for (var i = 0; i < 5; i++) {
  spawnChild();
}

// Send a signal to each child after 5 seconds
setTimeout(function () {
  children.forEach(function (child) {
    console.log('Sending signal to child with PID: ' + child.pid);
    child.kill('SIGTERM');
  });
}, 5000);

Related

Node.js start updater and close main program

I am checking whether my application has an update by pinging a certain URL; the response tells me whether I need an update or not.
Now, I have a powershell file which actually handles the update, so I'm trying to launch this powershell file from inside of my application.
I have this working, I can spawn my updater file and it will run through and everything is good. However, my application stays open the whole time, which means that once the updater is finished I will have 2 instances of it running.
The obvious solution to this in my mind is to close the application if an update is found (after spawning the updater).
Here is my code:
child = spawn("powershell.exe",['-ExecutionPolicy', 'ByPass', '-File', require("path").resolve(__dirname, '../../../../updater.ps1')]);
child.unref();
self.close();
However, when I try to make the application close, it seems like the updater is never launched. Or rather, I believe it is launched but gets closed when the main application gets closed.
I have the line child.unref() which I thought was supposed to make the spawned window not attached to the main application, but the updater won't stay open.
I have also tried adding {detached: true} as the 3rd parameter of my spawn() command, but it didn't make a difference in the way it was running.
How can I spawn the updater completely separate from my application?
To start the updater separate from your application, I think you should use a script instead of an inline command. This ensures the OS creates a process fully separated from your node app. For example:
var fs = require('fs');
var spawn = require('child_process').spawn;

var out = fs.openSync('./out.log', 'a');
var err = fs.openSync('./out.log', 'a');

var child = spawn('./myscript.sh', [], {
  detached: true,
  stdio: ['ignore', out, err]
});
child.unref();

setTimeout(function () {
  process.exit();
}, 1000);
The myscript.sh looks like this:
sleep 5; ls >> out2.log
The code above forces node to exit after 1 second, but just before that it starts a bash script (which waits 5 seconds before running the ls command). Running this code produces 2 output files (out.log and out2.log). The first (out.log) is the output of the node app's child process, while the second (out2.log) is written by the detached script after the node process has exited.
A better and more elegant approach is to listen for the exit event with on. But this means your main process will actually wait for the child process to complete its execution. For example:
var fs = require('fs');
var spawn = require('child_process').spawn;

var out = fs.openSync('./out.log', 'a');
var err = fs.openSync('./out.log', 'a');

var child = spawn('ls', [], {
  detached: true,
  stdio: ['ignore', out, err]
});
child.on('exit', (code) => {
  console.log(`Child exited with code ${code}`);
});
child.unref();
In the second example, the ls result will be saved in the out.log file, since the main process waits for the child to complete.
So it all depends on what you want to achieve. The first solution is not pretty, but it starts something truly separate from your node app.

Is there a way to synchronously execute multiple JavaScript files in node?

So in Node I can execute a JavaScript file using a command like:
$ node src/someFile.js
But is there a way to execute all of the JavaScript files in a given directory synchronously (one file executes, then after it has finished the next one executes, etc)? Basically, is there a single command that would have the effect of something like
$ node src/firstFile.js
$ node src/secondFile.js
$ node src/thirdFile.js
...
I've tried commands like
$ node src/*.js
but with no success.
If there exists no such command, what's the best way to go about doing something like this?
I am not sure if this is going to work for you, because this is a feature of the shell rather than of the node runtime, but:
for f in src/*.js; do node "$f"; done
Or in Powershell:
Get-ChildItem .\*.js | Foreach-Object {
  node $_
}
You could use spawn to run a node process from node, like:
var cp = require('child_process');
var child = cp.spawn('node', ['src/firstFile.js']);
At this point you have to add some listeners:
// Listen for an exit event:
child.on('exit', function (exitCode) {
  console.log('Child exited with code: ' + exitCode);
});
// Listen for stdout data:
child.stdout.on('data', function (data) {
  console.log(data.toString());
});
// Listen for stderr data; on error, kill this child:
child.stderr.on('data', function (data) {
  console.log('err data: ' + data);
  child.kill();
});
Of course you need to serialize execution here, but that's easy, since child.on('exit') tells you when a process has ended, so you can start the next one.
See Controlling Multiple Processes in Node for my working example that runs multiple processes in node and waits for them to end/join.
Using a POSIX shell:
$ for js in src/*.js; do node "$js"; done
If calling each one from the shell isn't a hard requirement, I would kick them all off with a single node process from the shell. This node script would:
- traverse the directory of modules
- require the first one, which executes it, passing a callback which the module will call on completion
- when that completion callback is called, execute the next script in the directory

Node.js code ordering -- is it possible that data is emitted before the listener is in place

This code is from Professional Node.js: Building Javascript-based Scalable Software:
var spawn = require('child_process').spawn;

// Spawn the child with a node process executing the plus_one app
var child = spawn('node', ['06_plus_one.js']);

// Call this function every 1 second (1000 milliseconds):
setInterval(function () {
  // Create a random number smaller than 10.000
  var number = Math.floor(Math.random() * 10000);
  // Send that number to the child process:
  child.stdin.write(number + "\n");
  // Get the response from the child process and print it:
  child.stdout.on('data', function (data) {
    console.log('child replied to ' + number + ' with: ' + data);
  });
}, 1000);

child.stderr.on('data', function (data) {
  process.stdout.write(data);
});
The child process simply increments the number passed from the parent. Is it possible that child.stdin.write() reaches the child process, and the child emits its data event, before the parent has registered its data listener?
Also, a second question: the code originally had an incorrect child program file name, and that throws an error. How do I catch errors from spawn?
WARNING, memory leak detected! Don't attach a listener inside a loop (setInterval). Every second you're adding a listener. Put this code outside the setInterval callback:
child.stdout.on('data', function(data) {
console.log('child replied to ' + number + ' with: ' + data);
});
The child process simply increments the number passed from the parent. Is it possible that child.stdin.write() reaches the child process, and the child emits its data event, before the parent has registered its data listener?
No. Two reasons:
Read the previous comment. You MUST attach the listener outside the setInterval(), just like stderr.on("data").
The child will send the message in a later tick. In the current loop tick you write the message and in a future tick you get the response from the child. This is the definition of asynchronicity with 1 thread (in the javascript layer).
Also, a second question: the code originally had an incorrect child program file name, and that throws an error. How do I catch errors from spawn?
Have you tried try-catching the spawn() function?
The child process simply increment the number passed from the parent.
Is it possible that child.stdin.write() goes to child process and
before parent register its data listener that the child already emit
the data event?
Since the stream.write operation is truly async: no.

use grunt to restart a phantomjs process

I'm using grunt to have some tasks done every time I change my code (jshint, for example), and I want to reload a phantomJs process every time I have changes.
The first way I found is to use grunt.util.spawn to run phantomJs the first time.
// http://gruntjs.com/api/grunt.util#grunt.util.spawn
var phantomJS_child = grunt.util.spawn({
  cmd: './phantomjs-1.9.1-linux-x86_64/bin/phantomjs',
  args: ['./phantomWorker.js']
}, function () {
  console.log('phantomjs done!'); // we never get here...
});
And then, every time watch restarts, another task uses grunt.util.spawn to kill the phantomJs process, which is of course VERY ugly.
Is there any better way to do it?
The thing is that the phantomJs process is not terminating, because I use it as a webserver to serve a REST API with JSON.
Can I have a grunt callback or something whenever watch kicks in so I can close my previous phantomJs process before I re-run the task to create a new one?
I used grunt.event to make a handler, but I cannot see how to access the phantomjs process in order to kill it.
grunt.registerTask('onWatchEvent', function () {
  // whenever watch starts, do this...
  grunt.event.on('watch', function (event, file, task) {
    grunt.log.writeln('\n' + event + ' ' + file + ' | running-> ' + task);
  });
});
This entirely untested code could be a solution for your problem.
Node's native child spawning function exec immediately returns a reference to the child process, which we can keep around to later kill it. To use it we can create a custom grunt task on the fly, like so:
// THIS DOESN'T WORK. phantomjs is undefined every time the watcher re-executes the task
var exec = require('child_process').exec,
    phantomjs;

grunt.registerTask('spawn-phantomjs', function () {
  // if there's already a phantomjs instance, tell it to quit
  phantomjs && phantomjs.kill();

  // (re-)start phantomjs
  phantomjs = exec('./phantomjs-1.9.1-linux-x86_64/bin/phantomjs ./phantomWorker.js',
    function (err, stdout, stderr) {
      grunt.log.write(stdout);
      grunt.log.error(stderr);
      if (err !== null) {
        grunt.log.error('exec error: ' + err);
      }
    });

  // when grunt exits, make sure phantomjs quits too
  process.on('exit', function () {
    grunt.log.writeln('killing child...');
    phantomjs.kill();
  });
});

Execution of arbitrary JavaScript code (not command) in NodeJS child process

What is the easiest way to run some code (not command!) in a different process and communicate its result with the main process?
So I have a quite intensive task that needs to be split up into different processes. What's the easiest way to do something like this?
// in main process
var otherProcess = createAnotherProcess(function() {
console.log("this code is ran in another process");
return "some data";
});
otherProcess.on("done", function(data) {
console.log(data); // will output "some data"
});
Having a single source code file that is able to run code in multiple processes would be amazing! Is this even possible? I've tried reading a bit about "child_processes" in node but find it a little too convoluted.
Any help?
var spawn = require("child_process").spawn;
var stat = spawn("dstat", ["-r", "--noheaders", "--nocolor"]);

var output = function (output_data) {
  // do something with output_data
};
stat.stdout.on("data", output);
http://nodejs.org/docs/v0.4.11/api/child_processes.html
To run a command in a child process you can use the child_process.spawn method.
If you want to run some heavy JS code, you can split its execution into chunks, using process.nextTick so as not to block the IO.
Use dimas-parallel (on npm):
var dimas = require('dimas-parallel');
dimas.execute(function() {
// do some stuff on another process
});
You can use node -e <code>
Example:
const { spawn } = require('child_process')
const child = spawn('node', ['-e', 'console.log("in other process", process.pid)'])
child.stdout.on('data', data => {
console.log(data.toString())
})
