NodeJS Tests - exposing app

I'm testing my NodeJS application with supertest. At the end of my app.js I expose my app, so I can use it within the test.js.
///////////////////
// https options
var options = {
  key: fs.readFileSync("./private/keys/Server.key"),
  cert: fs.readFileSync("./private/certs/Server.crt"),
  ca: fs.readFileSync("./private/ca/CA.crt"),
  requestCert: true,
  rejectUnauthorized: false
};
///////////////////
// start https server
var server = https.createServer(options, app).listen(app.get("port"), function(){
  console.log('Server listening on port ' + app.get('port'));
});
exports = module.exports = server;
When I require my server in test.js, the relative paths in options cause an error, since test.js lives in ./test/test.js whereas app.js is in ./app.js.
I'm struggling to find a clean solution for this problem.

If you make your options into a module, you can either use it directly or stub it in your test using something like rewire or sandboxed-module.
Alternatively, you could modify your exports to accept the options as a parameter, so that you require it with e.g. require('server')(config).
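For the second approach, a minimal sketch might look like the following; the config property names (keyPath, certPath, caPath) are only illustrative, not part of your existing code:
// app.js - export a factory that receives the certificate paths instead of hard-coding them
var https = require('https');
var fs = require('fs');
var express = require('express');

var app = express();
app.set('port', process.env.PORT || 8443);

module.exports = function (config) {
  var options = {
    key: fs.readFileSync(config.keyPath),
    cert: fs.readFileSync(config.certPath),
    ca: fs.readFileSync(config.caPath),
    requestCert: true,
    rejectUnauthorized: false
  };
  return https.createServer(options, app).listen(app.get('port'));
};

// test/test.js - build absolute paths so the current working directory no longer matters
var path = require('path');
var server = require('../app')({
  keyPath: path.join(__dirname, '../private/keys/Server.key'),
  certPath: path.join(__dirname, '../private/certs/Server.crt'),
  caPath: path.join(__dirname, '../private/ca/CA.crt')
});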

How to call function from Nodejs running as windows service

I have created a Windows service from a Node.js application using the node-windows package. Below is my code.
Main.js
var Service = require('node-windows').Service;

// Create a new service object
var svc = new Service({
  name: 'SNMPCollector',
  description: 'SNMP collector',
  script: './app.js',
  nodeOptions: [
    '--harmony',
    '--max_old_space_size=4096'
  ]
  //, workingDirectory: '...'
});

// Listen for the "install" event, which indicates the
// process is available as a service.
svc.on('install', function(){
  svc.start();
});

svc.install();
/* svc.uninstall(); */
App.js
const { workerData, parentPort, isMainThread, Worker } = require('worker_threads')

var NodesList = ["xxxxxxx", "xxxxxxx"]

module.exports.run = function (Nodes) {
  if (isMainThread) {
    while (Nodes.length > 0) {
      // my logic
    }
  }
}
Now when I run main.js, it creates a Windows service and I can see the service running in services.msc.
But how can I call this run() method, which is inside the running service, from any outside application? I couldn't find any solution for this; any help would be great.
You might consider simply importing your run function where you need it and running it there; then there is no need for a windows service or main.js. This assumes that "any outside application" is a Node application.
In your other application you do the following:
const app = require('<path to App.js>');
app.run(someNodes)
For broader usage, or if you do need to run it as a service, you could start an express server (or another web server) in your App.js with an endpoint that invokes your run function. Then from anywhere else you'll need to make an http call to that endpoint.
App.js
const express = require('express')
const bodyParser = require('body-parser')
const { workerData, parentPort, isMainThread, Worker } = require('worker_threads')
const app = express()
const port = 3000
var NodesList = ["xxxxxxx", "xxxxxxx"]
const run = function (Nodes) {
  if (isMainThread) {
    while (Nodes.length > 0) {
      // my logic
    }
  }
}
app.use(bodyParser.json())
app.post('/', (req, res) => res.send(run(req.body)))
app.listen(port, () => console.log(`Example app listening at http://localhost:${port}`))
(Based on the express hello-world example - https://expressjs.com/en/starter/hello-world.html)
You'll need to install both express and body-parser: $ npm install --save express body-parser from the directory of App.js.
From your other applications you will need to call the endpoint http://localhost:3000 with a POST request and the Nodes as a JSON array.
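A minimal sketch of such a caller, using Node's built-in http module (axios or node-fetch would work just as well); the host, port and payload here are only examples:
// caller.js - POST the Nodes array to the service's endpoint
const http = require('http');

const payload = JSON.stringify(['xxxxxxx', 'xxxxxxx']);

const req = http.request({
  hostname: 'localhost',
  port: 3000,
  path: '/',
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Content-Length': Buffer.byteLength(payload)
  }
}, (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => console.log('run() responded with:', body));
});

req.on('error', console.error);
req.write(payload);
req.end();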
You can expose it on a port like the other answer mentions, though you'll want to make sure you don't expose it more broadly depending on the environment you're running in. There's a good answer here on ensuring the port is locked down.
As an alternative to exposing it on a port you can simply call the function by running the command in any other application:
node -e 'require("/somePathToYourJS/app").run()'
One concern is that app.js will now run with whatever permissions the calling application has, although that can be resolved by running it via runas first. More details here, but an example is:
runas /user:domainname\username "node -e 'require(^"/somePathToYourJS/app^").run()'"

How to use socket.io across different routes in node.js

I have different routes in my Node.js application, and I have to use socket.io in every route to make my Node and React application realtime. This is the structure of my Node.js application:
router.js
const express = require('express');
const router = express.Router();
const worksheetController = require('../controllers/worksheet')
const attendenceController = require('../controllers/attendence')

router.route('/worksheets')
  .get(
    worksheetController.getWorksheet
  )
  .post(
    worksheetController.validateWorksheet,
    worksheetController.addWorksheet,
    attendenceController.markAttendence
  )

router.route('/attendances')
  .get(
    attendenceController.getAttendance
  )

module.exports = router;
server.js
const express = require('express');
const router = require('./router');
const app = express();
app.use('/api', router);
app.listen('5000', () => {
  console.log('Listening on port');
});
module.exports = app;
So, I want to know:
1) Do I need to use the http module to create a server in order to use socket.io?
2) How can I use socket.io for different routes?
I found posts on Stack Overflow that match my question (this, this and this), but I don't think they work for me. So please help me.
You can use the http module (or any of the other options shown in the socket.io docs) to create the server that socket.io attaches to.
I'm not sure about your exact setup, but when you want to implement socket.io I think you should run another Node app (meaning you have two Node.js apps: one for the normal HTTP API and one for the socket.io app). You can then use the path option when initializing the socket.io app (https://socket.io/docs/server-api/#new-Server-httpServer-options), the default being /socket.io/. When you deploy to production you should run your socket.io app behind a proxy server (e.g. nginx). Socket.io supports multiple transports and protocols, so if it shares a server with the HTTP/REST API you have to think about how connections are mapped from nginx to the socket.io app and how error handling is set up.
In your case, create a new file socket.js:
// socket.js
var http = require('http')
var socket_io = require('socket.io')

function init_socket(app) {
  const server = http.Server(app)
  const io = socket_io(server, { path: 'your-path-want-for-socket-io' }) // default: /socket.io/
  return { server, io }
}
module.exports = init_socket

// in server.js
var init_socket = require('./socket.js')
var { server, io } = init_socket(app)
server.listen(5000) // listen on this server (instead of app.listen) so express and socket.io share the port
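If the route handlers themselves need to emit events on the same io instance, one option is to stash io on the Express app when you initialize it. This is only a sketch; the app.set('io')/req.app.get('io') pattern and the event name are my additions, not part of the original answer:
// server.js (sketch) - after init_socket(app)
app.set('io', io) // make io reachable from any route handler

// controllers/worksheet.js (sketch) - emit to connected clients from a route handler
module.exports.addWorksheet = function (req, res, next) {
  // ... save the worksheet ...
  req.app.get('io').emit('worksheet:added', req.body)
  next()
}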

Make Node server.js file safer by removing security information

I have a public git[hub] project, and am now ready to switch it from development to production. We are in the research field, so we like to share our code too!
I have a server.js file that we start with node server.js like most tutorials.
In it, there is connection information for the SQL server, and the location of the HTTPS certificates. It looks something like this:
var https = require('https');
var express = require('express');
var ... = require('...');
var fs = require('fs');

var app = express();

var Sequelize = require('sequelize'),
    // ('database', 'username', 'password');
    sequelize = new Sequelize('db', 'uname', 'pwd', {
      logging: function () {},
      dialect: 'mysql',
      …
    });
…
var secureServer = https.createServer({
  key: fs.readFileSync('./location/to/server.key'),
  cert: fs.readFileSync('./location/to/server.crt'),
  ca: fs.readFileSync('./location/to/ca.crt'),
  requestCert: true,
  rejectUnauthorized: false
}, app).listen('8443', function() {
  var port = secureServer.address().port;
  console.log('Secure Express server listening at localhost:%s', port);
});
In PHP you can have the connection information in another file, then import the files (and therefore variables) into scope to use. Is this possible for the SQL connection (db, uname, pwd) and the file locations of the certs (just to be safe) so that we can commit the server.js file to git and ignore/not follow the secret file?
You can do this in a lot of different ways. One would be to use environment variables like MYSQL_USER=foo MYSQL_PASSWD=bar node server.js and then use process.env.MYSQL_USER in the code.
You can also read from files as you have suggested. You can do require("config.json") and node will automatically parse and import the JSON as JavaScript constructs. You can then .gitignore config.json and perhaps provide an example.config.json.
If you want to support both of these at once there is at least one library that allows you to do this simply: nconf.
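A minimal sketch of the nconf approach (command-line arguments override environment variables, which override the JSON file; the file names are only examples):
// config.js
const nconf = require('nconf');

nconf.argv()
     .env()
     .file({ file: 'config.json' }); // .gitignore this, commit an example.config.json instead

module.exports = nconf;

// elsewhere, e.g. server.js:
// const conf = require('./config');
// const sequelize = new Sequelize(conf.get('db'), conf.get('uname'), conf.get('pwd'), { dialect: 'mysql' });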
You can always just store the configuration information in a JSON file. Node natively supports JSON files. You can simply require it:
var conf = require('myconfig.json');
var key = fs.readFileSync(conf.ssl_keyfile);
There are also 3rd party libraries for managing JSON config files that add various features. I personally like config.json because it allows you to publish a sample config file with empty values then, without modifying the sample config file, you can override those values using a .local.json file. It makes it easier to deal with config files in repos and also makes it easier to publish changes to the config file.
Here is a great writeup about how you should organise your deployments.
Basically, all application-critical variables (db password, secret keys, etc.) should be accessible via environment variables.
You could do something like this:
// config.js
const _ = require('lodash');

const env = process.env.NODE_ENV || 'development';

const config = {
  default: {
    mysql: {
      poolSize: 5,
    },
  },
  development: {
    mysql: {
      url: 'mysql://localhost/database',
    },
  },
  production: {
    mysql: {
      url: process.env.DB_URI,
    },
  },
};

// deep-merge the env-specific settings over the defaults
module.exports = _.merge({}, config.default, config[env]);
// app.js
const config = require('./config');
// ....
const sequelize = new Sequelize(config.mysql.url);
Code is not perfect, but should be enough to get the idea.
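The DB_URI value from the sketch above would then be supplied from the environment when you start the app in production, along the lines of (values are placeholders):
NODE_ENV=production DB_URI=mysql://user:pwd@dbhost/database node app.js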

NodeJS: My node files have dependencies on variables in another file

I am creating an app with Node.js. In the app, I have an app.js script that is the entry point; it initializes both the app, as an expressjs app, and the http server that I use.
Just to clarify: modules here are not npm modules, they are my own files. I've written the app in modules. They are just separate script files used by require()-ing them.
This app has several modules that a main module handler initializes. It reads the contents of a folder, which contains my own modules, and then by convention calls .initialize on each module after running a require() call on the filenames without the .js extension.
However, I have one module that needs the app variable to create an endpoint, and one module that needs the httpServer variable to create a web socket. Both of these are instantiated in app.js.
Seeing as I don't know what kind of modules will be in the folder, I don't really want to send app and httpServer to every module if they are just needed by one module each. Something like dependency injection would be a nice fit, but is that possible without too much overhead?
Right now I just temporarily added app and httpServer to the GLOBAL object.
What I usually do is have app.js export app, so that modules elsewhere in my app can require it directly rather than having to deal with passing it around everywhere. I also slightly modify app.js so that it won't listen if it is required as a module; that way, if I later decide to wrap it with another app, I can do so with minimal changes. This is not important to your question, I just find it gives me more control when unit testing. All you really need from the code below is module.exports = app.
'use strict';

var express = require('express'),
    app = express(),
    config = require('config'),
    pkg = require('./package.json');

// trust reverse proxies
app.enable('trust proxy');
app.set('version', pkg.version);

module.exports = app; // <--- *** important ***

if (app.get('env') !== 'production') {
  app.set('debug', true);
}

// calling app.boot bootstraps the app
app.boot = function (skipStart) { // skipStart var makes it easy to unit test without actually starting the server
  // add middleware
  require('./middleware/');
  // setup models
  app.set('models', require('./models'));
  // setup routes
  require('./routes/');
  // wait for a dbconnection to start listening
  app.on('dbopen', function () {
    // setup hosting params
    if (!skipStart) {
      let server = app.listen(config.port, function () {
        app.emit('started');
        console.log('Web Server listening at: http://%s:%s', server.address().address, server.address().port);
        // mail server interceptor for dev
        if (app.get('env') !== 'production') {
          // Config smtp server for dev
          let SMTPServer = require('smtp-server').SMTPServer,
              mailServer = new SMTPServer({
                secure: false,
                disabledCommands: ['STARTTLS'],
                onData: function(stream, session, callback){
                  stream.pipe(process.stdout); // print message to console
                  stream.on('end', callback);
                },
                onAuth: function (auth, session, callback) {
                  callback(null, {user: 1, data: {}});
                }
              });
          // Start smtp server
          mailServer.listen(1025, '0.0.0.0');
        } else {
          // start agenda jobs only on production
          require('./jobs.js');
          console.log('Agenda Jobs Running.');
        }
      });
    } else {
      app.emit('booted');
    }
  });
};

// If this is the main module, run boot.
if (require.main === module) {
  // move all of this to next tick so we can require app.js in other modules safely.
  process.nextTick(app.boot);
}
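Any other file in the project can then require the exported app directly; a minimal sketch (the route and file name are just an illustration):
// routes/health.js (sketch) - uses the exported app instead of having it passed in
const app = require('../app');

app.get('/health', function (req, res) {
  res.json({ version: app.get('version') });
});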
Suppose you want to initialize two files from your main app.js.
app.js
var express = require('express');
var http = require('http');
var socketIni = require('./socketini.js'); // you have to pass server
var xyz = require('./xyz.js'); // you have to pass app

var app = express();
var server = http.createServer(app);

socketIni(server);
xyz(app);

server.listen(3000); // start the shared http server (port is just an example)
socketini.js
module.exports = function(server){
  // your socket initialization goes here
  var io = require('socket.io').listen(server);
}
xyz.js
module.exports = function(app){
  // you can access app here
}

How to start multiple node/socket servers on one machine

I am starting two different node servers, on different ports, but I still get the following error.
info - socket.io started
info - FlashPolicyFileServer received an error event:
listen EADDRINUSE
This is how I am starting the first server:
"use strict";
var
express = require('express'),
app = module.exports = express();
// set some config vars
var
server = require('http').createServer(app),
socket = require('./app/lib/socket');
// these settings are common to both environments
app.configure(function () {
// configuration left out
app.use(app.router);
});
// Load the routing
require('./app/routes')(app);
// run the server with socket.io
server.listen(3001);
socket.listen(server, session, app);
I am starting the second server the exact same way, except that the second-to-last line is changed to:
server.listen(3002);
socket.io is started like this in another file
exports.listen = function (server, sessionStore, app) {
var io = require('socket.io').listen(server);
...
Not sure how to fix this error.
The flash policy port defaults to 10843, so both apps will try to run their flash policy server off this port, which is the error you are getting. Either remove the flashsocket transport, or set a different port using
io.set('flash policy port', 3005)
Or you can just remove that transport altogether:
io.set('transports', [
  'websocket',
  'xhr-polling',
  'htmlfile',
  'jsonp-polling'
]);
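Applied to the socket module from the question, a sketch could let each server pass its own flash policy port; the extra parameter and the port numbers are assumptions, not part of the original code:
// app/lib/socket.js (sketch)
exports.listen = function (server, sessionStore, app, flashPolicyPort) {
  var io = require('socket.io').listen(server);
  io.set('flash policy port', flashPolicyPort || 10843); // e.g. 10844 for one app, 10845 for the other
  // ...
  return io;
};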
