I'm getting a bit lost in the child_process docs. What is the recommended way to run server.js in a child process?
Should I run the code below? Also, if I kill the main process, will it kill the child process too?
const { exec } = require('child_process')
exec('node server.js')
Backstory: I'm trying to run webpack, but spin up the proxy API server from the webpack JS file.
So after some fiddling around, here is what I got to run the webpack server and the express server at the same time from the same file (NOTE: they do both get killed simultaneously :) )
In webpackDevServer.js
child_process.exec('node servers/devServer.js ' + API_SERVER_PORT, (err, stdout, stderr) => {
  if (err) {
    // Error's second argument is ignored here; fold the cause into the message
    throw new Error('Proxy server failed to run: ' + err.message);
  }
})
console.info('> API SERVER: running on port', API_SERVER_PORT)
I need to create a node.js app that connects to this ftp server and downloads files from this directory:
ftp://www.ngs.noaa.gov/cors/rinex/2021/143/nynb
I've tried following the ftp npm package docs but I feel like I am doing something horribly wrong:
import Client from "ftp";

/**
 * https://github.com/mscdex/node-ftp
 */
const c = new Client();

c.on("ready", function () {
  c.get(
    "ftp://www.ngs.noaa.gov/cors/rinex/2021/143/nynb",
    (error, stream) => {
      if (error) throw error;
      console.log(`stream`, stream);
      stream.once("close", function () {
        c.end();
      });
    }
  );
});

// connect to localhost:21 as anonymous
c.connect();
When I run npm run dev with nodemon I get:
Error: connect ECONNREFUSED 127.0.0.1:21
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1146:16)
[nodemon] app crashed - waiting for file changes before starting...
Can someone please help? I'm completely stumped.
Could someone show me a small example of how to connect to this remote FTP server?
There are a few points:
You're connecting to the local FTP server with c.connect(); you need to connect to www.ngs.noaa.gov to download files from there.
The path cors/rinex/2021/143/nynb is a directory on the remote host, so c.get won't work; you need to list all the files in the directory, then download them one by one.
The code below connects to the remote server and lists all files in the directory:
const Client = require('ftp');
const fs = require("fs");

const c = new Client();
c.on('ready', function () {
  c.list("/cors/rinex/2021/143/nynb", function (err, list) {
    if (err) throw err;
    console.dir(list);
  });
});

c.connect({
  host: "www.ngs.noaa.gov",
});
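To go from the listing to downloads: each entry returned by node-ftp's c.list() has a type field ("-" for plain files, "d" for directories, "l" for symlinks) and a name. A small pure helper can turn the listing into the remote paths to pass to c.get (the sample entries below are made up, just to show the shape):

```javascript
// Keep only plain files (type "-") from a node-ftp listing and
// build the full remote path for each one.
function filePaths(dir, list) {
  return list
    .filter((entry) => entry.type === '-')
    .map((entry) => dir + '/' + entry.name);
}

// Hypothetical sample listing, mimicking what c.list() returns:
const sample = [
  { type: 'd', name: 'subdir' },
  { type: '-', name: 'a.gz' },
  { type: '-', name: 'b.gz' },
];
const paths = filePaths('/cors/rinex/2021/143/nynb', sample);
console.log(paths);
```

Each resulting path can then be fetched with c.get(path, callback) and the returned stream piped into fs.createWriteStream to save the file locally.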
I have a React application that makes calls to a server.js file - these calls are requests to get data from a database with the use of queries (I'm using MSSQL).
Here is my server.js:
var express = require('express');
var app = express();
var sql = require("mssql");
var cors = require('cors');

app.use(cors())

var database = {
  server: 'xxx',
  authentication: {
    type: 'default',
    options: {
      userName: 'xxx',
      password: 'xxx',
    },
  },
  options: {
    database: 'xxx',
    encrypt: false,
  },
}

app.get('/gettingInfo', function (req, res) {
  sql.connect(database, function (err) {
    if (err) {
      console.log("ERROR HERE")
      console.log(err);
    }
    var request = new sql.Request();
    const finalRoomQuery = [query];
    request.query(finalRoomQuery, function (err, recordset) {
      if (err) {
        console.log(err);
      }
      res.send(recordset);
    });
  });
});

var server = app.listen(5000, function () {
  console.log('Server is running...');
});
Below is sample bit of code from one of my components that retrieves data from my server.js:
getData = () => {
  if (this.mounted === true) {
    fetch('http://localhost:5000/gettingInfo')
      .then(results => results.json())
      .then(results => this.setState({data: results}))
      .catch(err => console.error(err));
  }
}
When I run node server.js in my project directory, followed by npm start, my React application is able to retrieve data and render it appropriately.
I'm working on deploying my application. I ran npm run build to build the application, and I've deployed it to IIS. This is where I'm running into problems: if I don't have localhost:5000 running on my machine (i.e. if node server.js has not been started), my application cannot query for the data it needs, and I get a (failed) net::ERR_CONNECTION_REFUSED. However, when I have localhost:5000 running, the application on IIS runs as expected.
My question is: how can I host this server.js file as well, and/or how can I configure this project so that I don't have to have localhost:5000 running for the application to work properly? I apologize if this is a straightforward fix or if I'm missing something very basic; this is my first experience with web dev.
In your server code, you are not sending a response back in all cases:
app.get('/gettingInfo', function (req, res) {
  sql.connect(database, function (err) {
    if (err) {
      console.log("ERROR HERE")
      console.log(err);
      return res.send("Error Message");
    }
    var request = new sql.Request();
    const finalRoomQuery = [query];
    request.query(finalRoomQuery, function (err, recordset) {
      if (err) {
        console.log(err);
        return res.send("Error Message");
      }
      return res.send(recordset);
    });
  });
});
You also need to use a process manager such as PM2 to keep server.js running on the host, instead of starting it by hand in a console.
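For reference, a typical PM2 workflow looks like this (the process name "api" is arbitrary):

```shell
# install PM2 globally, then run server.js under it
npm install -g pm2
pm2 start server.js --name api

pm2 save      # persist the current process list
pm2 startup   # print a command that makes PM2 start at boot
pm2 logs api  # tail the server's output
```

Once registered with pm2 save and pm2 startup, the API keeps running in the background and restarts after reboots, so the IIS-hosted front end can always reach localhost:5000.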
I'm writing a desktop web app that uses node.js to access the local file system. I can currently use node.js to open and copy files to different places on the hard drive. What I would also like to do is allow the user to open a specific file using the application that is associated with the file type. In other words, if the user selects "myfile.doc" in a Windows environment, it will launch MSWord with that file.
I must be a victim of terminology, because I haven't been able to find anything but the spawning of child processes that communicate with node.js. I just want to launch a file for the user to view and then let them decide what to do with it.
Thanks
You can do this:
var cp = require("child_process");
cp.exec("document.docx"); // note: no callback here
process.exit(0); // exit this node.js process
It's not safe, though. To make sure the command shows no errors or undesired output, you should add the callback parameter:
child_process.exec(cmd, function (error, stdout, stderr) {})
Next you can work with events so you don't block execution of the script, or even use an external node.js script that launches and handles output from the processes you spawn from a "master" script.
In the example below I used the TextMate "mate" command to edit the file hello.js. You can run any command with child_process.exec, but the application you want to open the file in must provide command-line options.
var exec = require('child_process').exec;
exec('mate hello.js');
var childProcess = require('child_process');
childProcess.exec('start Example.xlsx', function (err, stdout, stderr) {
  if (err) {
    console.error(err);
    return;
  }
  console.log(stdout);
  process.exit(0); // exit the process once the file is opened
})
Emphasis on where 'exit' is called. This executes properly on Windows.
Simply call your file (any file with an extension, including .exe) from the command prompt, or programmatically:
var exec = require('child_process').exec;

exec('C:\\Users\\Path\\to\\File.js', function (err, stdout, stderr) {
  if (err) {
    throw err;
  }
})
If you want to run a file without an extension, you can do almost the same, as follows:
var exec = require('child_process').exec;

exec('start C:\\Users\\Path\\to\\File', function (err, stdout, stderr) {
  if (err) {
    throw err;
  }
})
As you can see, we use start to open the file, letting Windows (or Windows letting us) choose an application.
If you prefer opening a file with the async/await pattern (note that the promisified exec rejects on failure rather than returning an err property, so use try/catch):
const util = require('util');
const exec = util.promisify(require('child_process').exec);

async function openFile(path) {
  try {
    await exec(path);
    process.exit(0);
  } catch (err) {
    console.log(err);
  }
}

openFile('D:\\Practice\\test.txt'); // enter your file location here
Is there a way to invoke a Windows batch file from inside JavaScript code? Or any other clean way to do the below through a node package?
scripts.bat
ECHO "JAVASCRIPT is AWESOME"
PAUSE
scripts.js
// Code to read and run the batch file //
On the command prompt:
C:/> node scripts.js
One way to do this is with child_process. You just have to pass the file you want to execute:
const execFile = require('child_process').execFile;

const child = execFile('scripts.bat', [], (error, stdout, stderr) => {
  if (error) {
    throw error;
  }
  console.log(stdout);
});
A scenario of mine:
I'd like to convert JSX files (React) into plain JS.
And I need browserify or the like, because one of the modules needs require.
Watching files for every modification and repeating the tasks feels like overkill. Those tasks only need to run when the browser reloads (i.e. on a request).
I know the Rails development environment does this, but in this case it's a node.js app.
So I'd like to put an HTTP proxy in front of my app and have it execute those tasks before my app responds to the browser.
Are any tools available already? Or any advice on implementing such a proxy?
I don't mind if the tools are in any language (node.js, Python, or Ruby), but if there are none I'd like to implement one with node.js.
While the app server is running on port 3000, I wrote a quick and dirty reverse proxy like this.
This proxy listens on port 3100 and just forwards every request, but if the path is "/bundle.js" it executes a command before forwarding the request.
var exec = require('child_process').exec;
var http = require('http');
var httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer({ target: "http://localhost:3000" });

var proxyServer = http.createServer(function (req, res) {
  if (req.url == '/bundle.js') {
    exec("jsx src/ build/ && browserify build/app.js -o bundle.js", function (err, stdout, stderr) {
      if (err) {
        console.error('error: ' + err);
      }
      proxy.web(req, res);
    });
  } else {
    proxy.web(req, res);
  }
});

proxyServer.on('upgrade', function (req, socket, head) {
  proxy.ws(req, socket, head);
});

proxyServer.listen(3100);