Node.js Express app handle startup errors - javascript

I have an app in Node.js and Express. I need to write tests for it. I have a problem with handling Express app errors. I found this How do I catch node.js/express server errors like EADDRINUSE?, but it doesn't work for me, I don't know why. I want to handle errors that can occur while expressApp.listen() is executing (EADDRINUSE, EACCES etc.).
express = require('express')
listener = express()
# doesn't work for me
listener.on('uncaughtException', (err) ->
  # do something
)
# doesn't work either
listener.on("error", (err) ->
  # do something
)
# this works, but it catches all errors in the process; I want only the listener's errors
process.on('uncaughtException', (err) ->
  # do something
)
listener.listen(80) # port 80, for example, to trigger an error
Any ideas?

This should do the trick:
listener.listen(80).on('error', function(err) { });
What listener.listen actually does is create a HTTP server and call listen on it:
app.listen = function(){
  var server = http.createServer(this);
  return server.listen.apply(server, arguments);
};

First off, expressJS does not throw the uncaughtException event, process does, so it's no surprise your code doesn't work.
So use: process.on('uncaughtException',handler) instead.
Next, expressJS already provides a standard means of error handling which is to use the middleware function it provides for this purpose, as in:
app.configure(function(){
  app.use(express.errorHandler({ dumpExceptions: true, showStack: true }));
});
This function returns an error message to the client, with optional stacktrace, and is documented at connectJS errorHandler.
(Note that errorHandler is actually part of connectJS and is only re-exposed by expressJS; in Express 4+ both app.configure() and the bundled errorHandler were removed, the latter living on as the standalone errorhandler module.)
If the behavior the existing errorHandler provides is not sufficient for your needs, its source is located at connectJS's errorHandler middleware and can be easily modified to suit your needs.
Of course, rather than modifying this function directly, the "correct" way to do this is to create your own errorHandler, using the connectJS version as a starting point, as in:
var myErrorHandler = function(err, req, res, next){
  ...
  // note, using the typical middleware pattern, we'd call next() here, but
  // since this handler is a "provider", i.e. it terminates the request, we
  // do not.
};
And install it into expressJS as:
app.configure(function(){
  app.use(myErrorHandler);
});
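As a concrete (purely illustrative) starting point, a minimal custom handler might look like this; the JSON response shape is my own choice, not anything Express mandates:

```javascript
// A minimal error-handling middleware: Express recognizes it by its
// four-argument (err, req, res, next) signature. Uses only the raw
// http.ServerResponse API so it has no dependencies.
function myErrorHandler(err, req, res, next) {
  console.error(err.stack);
  res.statusCode = err.status || 500;      // err.status is optional
  res.setHeader('Content-Type', 'application/json');
  res.end(JSON.stringify({ error: err.message }));
  // no next(): as a "provider", this handler terminates the request
}

module.exports = myErrorHandler;
```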
See Just Connect it, Already for an explanation of connectJS's idea of filter and provider middleware and How To Write Middleware for Connect/Express for a well-written tutorial.
You might also find these useful:
How to handle code exceptions in node.js?
Recover from Uncaught Exception in Node.JS
Finally, an excellent source of information regarding testing expressJS can be found in its own tests.

Mention: Marius Tibeica's answer is complete and great, as is david_p's comment; Rob Raisch's answer is also interesting to explore.
https://stackoverflow.com/a/27040451/7668448
https://stackoverflow.com/a/13326769/7668448
NOTICE
The first method below is a bad one! I leave it as a reference. See the Update section for the good versions, and for the explanation of why.
 Bad version
For those who find this useful, here is a function that implements busy-port handling
(if the port is busy, it tries the next port until it finds a free one):
app.portNumber = 4000;
function listen(port) {
  app.portNumber = port;
  app.listen(port, () => {
    console.log("server is running on port :" + app.portNumber);
  }).on('error', function (err) {
    if (err.code === 'EADDRINUSE') {
      console.log(`----- Port ${port} is busy, trying with port ${port + 1} -----`);
      listen(port + 1);
    } else {
      console.log(err);
    }
  });
}
listen(app.portNumber);
The listen function calls itself recursively on a port-busy error, incrementing the port number each time.
Update: completely redone
 Callback full version
First of all, this version is the one that follows the same signature as the Node.js http.Server.listen() method!
function listen(server) {
  const args = Array.from(arguments);
  // __________________________________ overriding the callback (closure to pass the port)
  const lastArgIndex = arguments.length - 1;
  let port = args[1];
  if (typeof args[lastArgIndex] === 'function') {
    const callback = args[lastArgIndex];
    args[lastArgIndex] = function () {
      callback(port);
    };
  }
  const serverInstance = server.listen.apply(server, args.slice(1))
    .on('error', function (err) {
      if (err.code === 'EADDRINUSE') {
        console.log(`----- Port ${port} is busy, trying with port ${port + 1} -----`);
        port += 1;
        serverInstance.listen.apply(serverInstance, [port].concat(args.slice(2, lastArgIndex)));
      } else {
        console.log(err);
      }
    });
  return serverInstance;
}
Signature:
listen(serverOrExpressApp, [port[, host[, backlog]]][, callback])
just as per
https://nodejs.org/api/net.html#net_server_listen_port_host_backlog_callback
The callback signature is changed to
(port) => void
 usage:
const server = listen(app, 3000, (port) => {
  console.log("server is running on port :" + port);
});

// _____________ another example: port and host
const server = listen(app, 3000, 'localhost', (port) => {
  console.log("server is running on port :" + port);
});
 Explanation
Contrary to the old example, this method doesn't call itself!
Key elements:
app.listen(), on the first call, returns a net.Server instance.
After binding the error listener once, calling listen again on the same net.Server instance attempts to rebind.
The error event listener is always there: each time an error happens, we retry.
The port variable plays on the closure of the callback: when the callback is called, the right value is passed.
Importantly:
serverInstance.listen.apply(serverInstance, [port].concat(args.slice(2, lastArgIndex)));
Why are we skipping the callback here? Once added, the callback is held internally by the server instance in an array. If we added it again, it would fire multiple times: once per attempt, plus one. So we only include it in the first attempt.
That way we get the server instance returned directly and can keep using it for the retries. And it's done cleanly.
Simple version (port only)
This one, too, helps to understand it at a glance:
function listen(server, port, callback) {
  const serverInstance = server.listen(port, () => { callback(port); })
    .on('error', function (err) {
      if (err.code === 'EADDRINUSE') {
        console.log(`----- Port ${port} is busy, trying with port ${port + 1} -----`);
        port += 1;
        serverInstance.listen(port);
      } else {
        console.log(err);
      }
    });
  return serverInstance;
}
Here the port parameter plays on the closure: the callback sees the final value.
ES6 full version
function listen(server, ...args) {
  // __________________________________ overriding the callback (closure to pass the port)
  const lastArgIndex = args.length - 1;
  let port = args[0];
  if (typeof args[lastArgIndex] === 'function') {
    const callback = args[lastArgIndex];
    args[lastArgIndex] = function () {
      callback(port);
    };
  }
  const serverInstance = server.listen(...args)
    .on('error', function (err) {
      if (err.code === 'EADDRINUSE') {
        console.log(`----- Port ${port} is busy, trying with port ${port + 1} -----`);
        port += 1;
        serverInstance.listen(port, ...args.slice(1, lastArgIndex));
      } else {
        console.log(err);
      }
    });
  return serverInstance;
}
Why the old version is bad
To be fair, it's not really bad. But with the first version, we call the function itself on every failure, and each call creates a new server instance; the garbage collector has to flex some muscles. It hardly matters, since the function only executes once at startup, but the old version also didn't return the server instance.
Extra (for #sakib11)
You can look at #sakib11's comment to see the problem he ran into; it's instructive.
In the comments I also mentioned a promise version and a closure-getter pattern. I don't deem them interesting: the approach above respects the same signature as Node.js, the callback does just fine, and we get our server reference right away. With a promise version, a promise gets returned and at resolution we'd pass all the elements (serverInstance + port).
And if you wonder about the closure-getter pattern (it's a bad fit here): within our method we'd keep a reference to the current server instance. If we couldn't return the instance as we do (imagine it were impossible, so each retry created a new instance), the pattern would consist of creating a closure (a method in that scope) that returns the current instance, and returning that closure.
Usage would look like:
const getServer = listen(port, () => {
  console.log('Server running at port ' + getServer().address().port);
  const io = socketIo(getServer(), {});
});
But it's just overhead, especially since we'd need to wait for the server to be ready, unless we set it up with a callback or a promise. That over-complicates things for no gain; I mention it only because I brought it up.
The method above can also be tweaked to add a limit on the number of attempts, or to add some events or hooks. Generally, though, we only need a simple function that just retries until it makes it; for me the above is more than sufficient.
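For instance, the simple version could be tweaked with a retry limit roughly like this; maxAttempts and the listenWithLimit name are illustrative additions, not part of any API:

```javascript
// Same retry-on-EADDRINUSE pattern, but giving up after maxAttempts tries.
// The callback is registered only on the first listen() call; thanks to the
// closure over `port`, it receives the port that finally succeeded.
function listenWithLimit(server, port, callback, maxAttempts = 10) {
  let attempts = 0;
  const serverInstance = server.listen(port, () => { callback(port); })
    .on('error', function (err) {
      attempts += 1;
      if (err.code === 'EADDRINUSE' && attempts < maxAttempts) {
        port += 1;
        serverInstance.listen(port);   // retry on the same instance
      } else {
        console.log('giving up after', attempts, 'attempt(s):', err.code);
      }
    });
  return serverInstance;
}
```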
Good links
https://nodejs.org/api/http.html#http_http_createserver_options_requestlistener
https://nodejs.org/api/http.html#http_class_http_server
https://expressjs.com/en/4x/api.html#app.listen
From the doc
The app.listen() method returns an http.Server object and (for HTTP) is a convenience method for the following:
app.listen = function () {
  var server = http.createServer(this)
  return server.listen.apply(server, arguments)
}

Related

Why am I getting "Cannot access 'server' before initialization" error in NodeJS?

I am getting the dreaded Cannot access 'server' before initialization error in code that is identical to code that's running in production.
The only things that have changed are my OS version (macOS 10.11 -> 10.14), my Node.js version (10 -> 12) and my VSCode launch.json, but I cannot see anything in either that would cause an issue. My Node version went from 10 to 12, but in production it went from 8 to 15 without issue. I routinely keep launch.json pretty sparse, and the same error happens using node server in Terminal.
Here is the offending code. The issue occurs because I have shutdown() defined before server, and it references server. It's written to add an event handler and then trigger the event. Yes, it could be refactored, but it already works. It works, really, in 21 instances spread over 7 servers.
I have tried changing the declaration/init of server from const to var, but that does not fix it. As mentioned, this is code that's running in prod! What's wrong with my environment?
Maybe a better question is: why did this ever work?
'use strict'
const fs = require('fs');
const https = require('https');
const cyp = require('crypto').constants;
const stoppable = require('./common/stoppable.js');
const hu = require('./common/hostutil');

process.on('uncaughtException', err => {
  wslog.error(`Uncaught Exception: ${err} ${err.stack}`);
  shutdown();
});
process.on('unhandledRejection', (reason, p) => {
  wslog.error(`Unhandled Promise Rejection: ${reason} - ${p}`);
});
// 'shutdown' is a known static string sent from node-windows wrapper.js if the service is stopped
process.on('message', m => {
  if (m == 'shutdown') {
    wslog.info(`${wsconfig.appName} has received shutdown message`);
    shutdown();
  }
});
process.on('SIGTERM', shutdown);
process.on('SIGINT', shutdown);
process.on('SIGHUP', shutdown);

function shutdown() {
  httpStatus = 503; // Unavailable
  wslog.info(`${wsconfig.appName} httpStatus now ${httpStatus} - stopping server...`);
  // Error happens on this next line; it should not execute till after server is running already
  server.on('close', function () {
    wslog.info(`${wsconfig.appName} HTTP server has stopped, now exiting process.`);
    process.exit(0)
  });
  server.stop();
}

// Init and start the web server/listener
var combiCertFile = fs.readFileSync(wsconfig.keyFile, 'utf8');
var certAuthorityFile = fs.readFileSync(wsconfig.caFile, 'utf8');
var serverOptions = {
  key: combiCertFile,
  cert: combiCertFile,
  ca: certAuthorityFile,
  passphrase: wsconfig.certPass,
  secureOptions: cyp.SSL_OP_NO_TLSv1 | cyp.SSL_OP_NO_TLSv1_1
};
var server = https.createServer(serverOptions, global.app)
  .listen(wsconfig.port, function () {
    wslog.info(`listening on port ${wsconfig.port}.`);
  });
server.on('clientError', (err, socket) => {
  if (err.code === 'ECONNRESET' || !socket.writable) { return; }
  // ECONNRESET was already logged in socket.on.error. Here, we log others.
  wslog.warn(`Client error: ${err} ${err.stack}`);
  socket.end('HTTP/1.1 400 Bad Request\r\n\r\n');
});
server.on('error', (err) => {
  if (err.code === 'EADDRINUSE') {
    wslog.error(`${err.code} FATAL - Another ${wsconfig.appName} or app is using my port! ${wsconfig.port}`);
  } else {
    wslog.error(`${err.code} FATAL - Server error: ${err.stack}`);
  }
  shutdown();
})
combiCertFile = null;
certAuthorityFile = null;
// Post-instantiation configuration required (may differ between apps: need an indirect way to plug in app-specific behavior)
stoppable(server, wsconfig.stopTimeout);
// Load all RESTful endpoints
const routes = require('./routes/');
This is a runtime error which happens only in a very specific situation. And actually, this exact error shouldn't happen with var server = ..., only with const server = ... or let server = .... With var server = ..., the error message should say "Cannot read properties of undefined".
What happens
You have an error handler for uncaughtException which calls shutdown(), and in shutdown() you reference your server. But consider what happens if your code throws an exception before your server is initialized, for instance if your cert or key cannot be read from disk, or the cert or key is invalid. Then nothing is assigned to server, and an exception is raised.
Then the handler for your uncaught exception will fire and call the shutdown() function, which then tries to access the server, which of course hasn't been initialized yet.
How to fix
Check what the unhandled exception is that is thrown before your server is initialized, and fix it. In your production environment there is probably no exception, because the configuration and environment are properly set up. But there is at least one issue in your development environment which causes an exception.
Difference between var and const
And the difference between var server = ... and const server = ... is quite a subtle one. For both, the declaration of the variable is hoisted up to the top of its respective scope; in your case it's always global, also for const. But variables declared with var are initialized to undefined, whereas variables declared with let/const are not initialized at all (the so-called temporal dead zone), which is why accessing them raises a ReferenceError.
You can easily reproduce this error if you uncomment either error1 or error2 in the following code. But error3 alone won't produce this ReferenceError because bar will already be initialized. You can also replace const bar = with var bar = and you will see, that you get a different error message.
process.on("uncaughtException", err => {
  console.log("uncaught exception");
  console.log(err);
  foo();
});

function foo() {
  console.log("foo");
  console.log(bar.name);
}

function init() {
  // throw new Error("error1");
  return { name: "foobar" }
}

// throw new Error("error2");
const bar = init();
// throw new Error("error3");

How a function is getting called with correct request & response objects?

I have a piece of code:
var http = require('http');

function createApplication() {
  let app = function (req, res, next) {
    console.log("hello")
  };
  return app;
}

app = createApplication();

app.listen = function listen() {
  var server = http.createServer(this);
  return server.listen.apply(server, arguments);
};

app.listen(3000, () => console.log('Example app listening on port 3000!'))
Nothing fancy here. But when I run this code and go to localhost:3000, I can see hello being printed. I'm not sure how this function is getting called at all. Also, the function receives the req and res objects. Not sure what's happening here.
http.createServer() has a couple of optional arguments, one being requestListener, which is:
https://nodejs.org/api/http.html#http_http_createserver_options_requestlistener
The requestListener is a function which is automatically added to the
'request' event.
Since you call your listen() like so app.listen(), this inside that function is going to be a reference to the function you made and returned in createApplication. So you are basically doing:
http.createServer(function(req,res,next) {
  console.log("hello")
});
Hence your function is added as a callback for any request, which is why any request you make logs hello to the console.
If you want an equivalent more straight forward example
var http = require('http');
var server = http.createServer();
server.on('request', function (req, res, next) {
  // callback anytime a request is made
  console.log("hello")
});
server.listen(3000);

bull task in sailsjs not working?

Well, I (naively) tried to get bull working in a Sails application: ultimately I want a queue to which I can add/remove/check tasks based on incoming routes.
As I understand Sails, to create a queueing system that works globally I would have to add this setup in bootstrap.js.
/**
 * Bootstrap
 * (sails.config.bootstrap)
 *
 * An asynchronous bootstrap function that runs before your Sails app gets lifted.
 * This gives you an opportunity to set up your data model, run jobs, or perform some special logic.
 *
 * For more information on bootstrapping your app, check out:
 * https://sailsjs.com/config/bootstrap
 */
module.exports.bootstrap = function (done) {
  // It's very important to trigger this callback method when you are finished
  // with the bootstrap! (otherwise your server will never lift, since it's waiting on the bootstrap)
  let Queue = require('bull');
  let q = new Queue('test queue');
  q.process(function (job, done) {
    console.log("starting job");
    for (let i = 0; i < job.value; i += 1) {
      console.log(i);
    }
    done();
  });
  q.add({ 'value': 10 });
  global.DirectUpdateQueue = q;
  return done();
};
Given the above code, sails launches perfectly fine, and in the routes I can see global.DirectUpdateQueue existing.
What does not work, however, is the execution of the queued tasks. I do not see any log in the console ("starting job" is expected at least), nor does the code break when I put a breakpoint in the processing function.
So what is going on here?
EDIT: can this be due to me not having set up a (local) Redis server? I can't find any information on this subject, but I expected/hoped bull.js would handle this server internally and (even more importantly) not be limited to a specific (OS) environment.
So, first of all, you have to make sure you have Redis installed on your server.
When creating a queue, you can pass the Redis config; in my example below it's the default.
Then in bootstrap.js:
var Queue = require('bull');
var testQueue = new Queue('Website Queue', 'redis://127.0.0.1:6379');
testQueue.process(function (job, done) {
  console.log('job started');
  setTimeout(function () {
    console.log('10 seconds later');
    console.log(job.data);
  }, 10000)
  done();
});
global.testQueue = testQueue;
then from action/controller you can do this:
testQueue.add({'value':10});
First you must connect to a Redis server
var testQueue = new Queue('test', {
  redis: {
    port: 6379,
    host: '127.0.0.1',
    password: 'secret'
  }
});
According to the doc:
If the queue is empty the job will be executed directly, otherwise it will be placed in the queue and executed as soon as possible.
To access data in the job, use the job.data object:
testQueue.process((job) => {
  console.log("job with data 'foo' :", job.data.foo);
  // example with Promise
  return asynchTreatment()
    .then(() => { console.log('treatment ok'); })
    .catch((err) => { console.log('treatment ko :', err); });
}).on('completed', (job, result) => {
  // Job completed with output result!
  console.log('result :', result);
});
testQueue.add({ foo: 'bar' });
EDIT 1:
The doc says:
It creates a new Queue that is persisted in Redis. Everytime the same queue is instantiated it tries to process all the old jobs that may exist from a previous unfinished session.
So if the server restarts, you don't lose your jobs.
Just use job.data.value in your for loop:
for (let i = 0; i < job.data.value; i += 1) {
  console.log(i);
}

Best approach for a modular node.js/socket.io/express application

I'm currently creating an app using Node.js that makes use of Express and Socket.io. As time progresses it's becoming increasingly difficult to deal with one file. I'm in the process of moving certain things out; I know how, but was wondering about the best approach to do this.
I have a private area constructor similar to:
privateArea.js
function privateArea(props) {
  this.id = props.id;
  this.name = props.name;
  this.users = [];
}
privateArea.prototype.addUser = function (socketId) {
  this.users.push(socketId);
};
module.exports = privateArea;
I'd like this to also have access to the socket.io variable that's been set up for use in a separate sockets.js file, which can be included via the main app.js, plus a separate file for express.js.
So I'd like the structure as follows:
project
| app.js - joins it all together
| express.js - initialises and manages all express routing
| privateArea.js - constructor for private areas - must be able to reference socket.io
| sockets.js - initialises and manages all socket.io sockets and events
Any help/examples would be very appreciated.
Thanks
I use socket.io and express quite often in my projects, and I've developed a template which makes things easy. I like to have a fail-over in case the socket connections drops for some reason, or if a socket connection cannot be established. So I create http channels as well as socket channels. Here's a basic module template:
module.exports = function () {
  var exported = {};

  var someFunction = function (done) {
    //.. code here..//
    if (typeof done === "function") {
      done(null, true);
    }
  };
  // export the function
  exported.someFunction = someFunction;

  var apicalls = function (app) {
    app.get("/module/someFunction", function (req, res) {
      res.header("Content-Type", "application/json");
      someFunction(function (err, response) {
        if (err) return res.send(JSON.stringify(err));
        res.send(JSON.stringify(response));
      });
    });
  };
  exported.apicalls = apicalls;

  var socketcalls = function (io) {
    io.on("connection", function (socket) {
      socket.on('module-someFunction', function () {
        someFunction(function (err, response) {
          if (err) return socket.emit('module-someFunction', err);
          socket.emit('module-someFunction', response);
        });
      });
    });
  };
  exported.socketcalls = socketcalls;

  return exported;
}
So to use this, I'd first need to include the module in my app.js file and invoke it (since the module exports a factory function) like this:
var mymod = require('./myModule.js')();
And then I can enable access to this service from HTTP and over the websocket like this:
mymod.apicalls(app); // passing express to the module
mymod.socketcalls(io); // passing socket.io to the module
Finally, from the front-end, I can check whether I have a socket connection, and if so, I use the socket to emit "module-someFunction". If I don't have a socket connection, the front-end does an AJAX call to "/module/someFunction" instead, which hits the same server-side function it would have hit had I used the socket connection.
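A hedged sketch of that front-end fallback (browser code; the socket variable and the endpoint/event names mirror the module template above and are assumptions):

```javascript
// Prefer the socket when it is connected; otherwise fall back to the HTTP API.
// handleResponse receives the result either way.
function callSomeFunction(handleResponse) {
  if (typeof socket !== 'undefined' && socket.connected) {
    socket.once('module-someFunction', handleResponse); // reply arrives on the same event name
    socket.emit('module-someFunction');
  } else {
    fetch('/module/someFunction')
      .then(function (res) { return res.json(); })
      .then(handleResponse);
  }
}
```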
As an added bonus, if I need to utilize the function within the server, I could do that as well since the function is exported. That would look like this:
mymod.someFunction(function (err, response) {
  // ... handle result here ... //
});

Node.js / express: respond immediately to client request and continue tasks in nextTick

I would like to separate the server's CPU-intensive tasks from the user experience:
./main.js:
var express = require('express');
var Test = require('./resources/test');
var http = require('http');

var main = express();
main.set('port', process.env.PORT || 3000);
main.set('views', __dirname + '/views');
main.use(express.logger('dev'));
main.use(express.bodyParser());
main.use(main.router);

main.get('/resources/test/async', Test.testAsync);

main.configure('development', function () {
  main.use(express.errorHandler());
});

http.createServer(main).listen(main.get('port'), function () {
  console.log('Express server app listening on port ' + main.get('port'));
});
./resources/test.js:
function Test() {}
module.exports = Test;

Test.testAsync = function (req, res) {
  res.send(200, "Hello world, this should be sent immediately");
  process.nextTick(function () {
    console.log("Simulating large task");
    for (var j = 0; j < 1000000000; j++) {
      // Simulate large loop
    }
    console.log("phhhew!! Finished!");
  });
};
When requesting localhost:3000/resources/test/async, I would expect the browser to render "Hello world, this should be sent immediately" really fast, with node.js continuing to process, and after a while a "Finished!" message appearing in the console.
Instead, the browser keeps waiting until node.js finishes the large task and only then renders the content. I've tried res.set({ 'Connection': 'close' }); and also res.end();, but nothing works as expected. I've also googled with no luck.
How should it be done so that the response is sent to the client immediately and the server continues with its tasks?
EDIT
posted fork method in solution
Try waiting instead of hogging the CPU:
res.send("Hello world, this should be sent immediately");
console.log("Response sent.");
setTimeout(function () {
  console.log("After-response code running!");
}, 3000);
node.js is single-threaded. If you lock up the CPU with a busy loop, the whole thing grinds to a halt until that is done.
Thanks to Peter Lyons' help, it turned out the main problem was Firefox's buffer: the response was not long enough to flush it (so Firefox kept waiting).
Anyway, for high-CPU tasks node would stay blocked until finishing, so it would not attend to new requests. If someone needs it, this can be achieved by forking (with child_process; see the sample at http://nodejs.org/api/child_process.html).
I have to say that the context switch from forking could take longer than splitting the task into different ticks.
./resources/test.js:
var child = require('child_process');

function Test() {}
module.exports = Test;

Test.testAsync = function (req, res) {
  res.send(200, "Hello world, this should be sent immediately");
  var childTask = child.fork('child.js');
  childTask.send({ hello: 'world' });
};
./resources/child.js:
process.on('message', function (m) {
  console.log('CHILD got message:', m);
});
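The other option mentioned, splitting the task into different ticks, can be sketched with setImmediate; the chunk size here is an arbitrary choice, and runInChunks is an illustrative name:

```javascript
// Run a long loop in slices, yielding to the event loop between slices so
// pending requests keep being served while the work progresses.
function runInChunks(total, chunkSize, onDone) {
  var i = 0;
  function chunk() {
    var end = Math.min(i + chunkSize, total);
    for (; i < end; i++) {
      // one unit of simulated work
    }
    if (i < total) {
      setImmediate(chunk);   // give other events a chance to run
    } else {
      onDone();
    }
  }
  chunk();
}

runInChunks(10000000, 100000, function () {
  console.log("phhhew!! Finished!");
});
```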
A good solution is to use child_process.fork(): it allows you to execute another JavaScript file of your app in a different Node instance, and thus in a different event loop. Of course, you can still communicate between the two processes by sending messages: so, from your UI process, you can send a message to the forked process to ask it to execute something.
For example, in ui.js:
var ChildProcess = require('child_process');
var heavyTaskWorker = ChildProcess.fork('./heavyTaskWorker.js');
...
var message = {
  operation: "longOperation1",
  parameters: {
    param1: "value1",
    ...
  }
};
heavyTaskWorker.send(message);
And in heavyTaskWorker.js:
process.on('message', function (message) {
  switch (message.operation) {
    case 'longOperation1':
      longOperation1.apply(null, message.parameters);
      break;
    ...
  }
});
Tested here, and it works fine!
Hope that helps!
