How to keep mysql connections alive in node.js?

I'm using mysql connection pools in Node JS. After some idle time, the connections expire and the next time I perform a query, a new connection needs to be created. This can cause a delay of several seconds. Unacceptable!
I would like to implement keepalive functionality to periodically poll the database and ensure the consistent health of connections to the backend. I am looking for input from others who have attempted the same, or feedback on my approach.
const mysql = require('mysql');
const pool = createConnectionPool(); // app helper that wraps mysql.createPool(...)
setInterval(keepalive, 180000); // every 3 minutes
function keepalive() {
  pool._freeConnections.forEach((connection) => pool.acquireConnection(connection, function () {
    connection.query('SELECT 1 + 1 AS solution', function (err) {
      if (err) {
        console.log(err.code); // e.g. 'ER_BAD_DB_ERROR'
      }
      console.log('Keepalive RDS connection pool using connection id', connection.threadId);
    });
  }));
}
This keepalive has been somewhat successful:
- once a connection is opened, it stays open
- connections never time out
- if a connection is lost, it is recreated on the next interval
This keepalive is not ideal:
- the mysql connection pool is lazy, creating and restoring connections only as needed. With this keepalive, the pool is no longer lazy: once a connection is opened, the keepalive keeps it open, so the pool no longer scales with traffic.
- I'm not confident that iterating through the list of free connections and running a query on each is a wise approach. Is it possible for the application to check out a connection from the pool while the keepalive is still using it?
Another possible approach is to ditch the keepalive functionality within the application, and rely on heartbeat traffic to keep a minimum of connections alive.
Has anybody attempted to implement keepalive functionality, use a package or tool that provides this feature, or ended up using a different approach?

Did you try this approach when using a connection pool?
const mysql = require('mysql');
const pool = mysql.createPool({...});

function keepAlive() {
  pool.getConnection(function (err, connection) {
    if (err) { console.error('mysql keepAlive err', err); return; }
    console.log('ping db');
    connection.ping(); // this is what you want
    connection.release();
  });
}

setInterval(keepAlive, 60000); // ping the DB every minute
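If you also want to surface ping failures, connection.ping in the mysql driver accepts a callback; a small variation of the same keepalive that only releases the connection after the round trip completes:
function keepAlive() {
  pool.getConnection(function (err, connection) {
    if (err) { console.error('mysql keepAlive err', err); return; }
    // ping with a callback so errors are logged and the connection
    // is released only once the ping has finished
    connection.ping(function (pingErr) {
      if (pingErr) console.error('mysql keepAlive ping err', pingErr);
      connection.release();
    });
  });
}

setInterval(keepAlive, 60000);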

I don't use a pool, and this code works:
function pingdb() {
  // `con` is a single connection created elsewhere with mysql.createConnection(...)
  var sql_keep = `SELECT 1 + 1 AS solution`;
  con.query(sql_keep, function (err, result) {
    if (err) throw err;
    console.log("Ping DB");
  });
}

setInterval(pingdb, 40000);
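For completeness, a sketch of the single connection this snippet assumes; the host and credentials are placeholders:
const mysql = require('mysql');

// single connection used by pingdb(); the values below are placeholders
const con = mysql.createConnection({
  host: 'localhost',
  user: 'dbuser',
  password: 'dbpass',
  database: 'dbname'
});

con.connect(function (err) {
  if (err) throw err;
});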

Related

Kill all sleeping mysql pool connections using node

I am using Node for my project with connection pooling. Whenever I run SHOW PROCESSLIST I find more than 200 sleeping connections, even though I am releasing the connection after every query like this:
return new Promise((resolve, reject) => {
  sql.getConnection(function (err, conn) {
    if (err) {
      conn.release();
      reject(err);
    }
    conn.query('QUERY', function (err, rows) {
      conn.release();
      if (err) {
        reject(err);
      } else {
        resolve(rows[0]);
      }
    });
  });
});
Still I find 200-plus sleeping connections. Is there any way to kill these idle connections through Node? Or is it fine to have so many sleeping connections?
Thanks in advance!
If you are using connection pooling, you will need to close all the connections in the pool when you're finished using them. Otherwise the connections will stay open until they are closed by the MySQL server.
pool.end(function (err) {
  // all connections in the pool have ended
});
This GitHub issue explains a little more about the nuance between releasing a connection (conn.release()) and closing the underlying connection pool (pool.end()).
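As a side note, the query wrapper in the question calls conn.release() even when getConnection fails (at which point conn is undefined) and does not return after rejecting, so the query still runs on the error path. A sketch of a safer per-query pattern, reusing the same sql pool variable from the question:
function runQuery(query) {
  return new Promise((resolve, reject) => {
    sql.getConnection(function (err, conn) {
      if (err) return reject(err); // no connection to release here
      conn.query(query, function (err, rows) {
        conn.release(); // always hand the connection back to the pool
        if (err) return reject(err);
        resolve(rows[0]);
      });
    });
  });
}
Keep in mind that release() only returns a connection to the pool; it stays open (and shows up as Sleep in SHOW PROCESSLIST) until pool.end() closes it or the server times it out.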

I am unable to save data to my mongodb Atlas database

Github repo. I am trying to use a MongoDB Atlas database with my Node JS Login & Signup app for storing data. The problem is that the data is not saving to the database; in other words, the request isn't going through even though my app is connected to Atlas. Full code available on www.github.com/tahseen09/login
// Connection to mongodb atlas
const uri = "mongodb+srv://tahseen09:<PASSWORD>@cluster0-pirty.mongodb.net/userdb";
MongoClient.connect(uri, function (err, client) {
  if (err) {
    console.log('Error occurred while connecting to MongoDB Atlas...\n', err);
  }
  console.log('Connected to Atlas');
  const collection = client.db("userdb").collection("credentials");
  client.close();
});
// New User Registration
app.post('/register', function (req, res) {
  var cred = new credential();
  cred.uname = req.body.uname;
  const hash = bcrypt.hashSync(req.body.password, 10);
  cred.password = hash;
  collection.save(function (err, newuser) {
    if (err) {
      res.status(500).send("Username exists");
    } else {
      res.status(200).send("New User Created");
    }
  });
});
The code that is important is attached as a snippet and the rest of the code is available on www.github.com/tahseen09/login
Note: I am running this app on localhost.
Let me describe your flow so you can see where it goes wrong :)
1) Connect to MongoDB
2) Create a reference to the collection
3) Close the connection
By the time someone tries to access the /register route, you have already closed the connection. Thus, any attempted database operation will end with a connection error.
The documentation recommends calling MongoClient.connect once and reusing the database variable returned by the callback, i.e. do not close the connection manually; the driver creates and uses a pool of connections, so you don't need to worry about closing them. Check out the example code in the documentation.
Let's step through the code to see what happens:
MongoClient.connect(uri, function(err, client) {
A connection to MongoDB is created; at some point the connection either is established or fails, and then the callback gets called. Inside the callback you create a local variable holding the collection reference:
const collection = client.db("userdb").collection("credentials");
And then you close the connection:
client.close();
Then the callback ends:
});
which means that the variables inside it (including collection) can no longer be accessed and are therefore garbage-collected.
Now at some point (which might even happen before the DB connection gets established), someone requests the webpage and you try to do:
collection.save(/*...*/);
That won't work for various reasons:
1) The db might not even be opened
2) If it was opened already, it was also closed already.
3) Even if it were open at the moment, you still could not access collection, as it is not in scope.
Now to resolve that we have to:
1) only start the webserver once the DB connection is established
2) don't close the connection
3) expose the connection so that it can be used elsewhere
For that, it makes sense to create a function that establishes the connection and calls back with the collection:
function withCredentials(callback) {
  const uri = "mongodb+srv://tahseen09:<PASSWORD>@cluster0-pirty.mongodb.net/userdb";
  MongoClient.connect(uri, function (err, client) {
    if (err) {
      console.log('Error occurred while connecting to MongoDB Atlas...\n', err);
    } else {
      console.log('Connected to Atlas');
      const collection = client.db("userdb").collection("credentials");
      callback(collection);
    }
  });
}
So now you can use that:
withCredentials(function (credentials) {
  app.post('/register', function (req, res) {
    const cred = {};
    cred.uname = req.body.uname;
    cred.password = bcrypt.hashSync(req.body.password, 10);
    credentials.insertOne(cred, function (err, newuser) {
      if (err) {
        res.status(500).send("Username exists");
      } else {
        res.status(200).send("New User Created");
      }
    });
  });
});
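To also cover point 1 (start the webserver only once the connection is established), you can call app.listen inside the same callback; a minimal sketch, where 3000 is just a placeholder port:
withCredentials(function (credentials) {
  // ...register the /register route as shown above...

  // start listening only after the DB connection is ready
  app.listen(3000, function () { // 3000 is a placeholder port
    console.log('Server listening on port 3000');
  });
});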

Process-spawning issue with node js setInterval and nohup

I'm running a node.js script to broadcast data to all connected web-app users every 15 seconds. It's running with this command...
nohup node /pushNotifications.js &
And this is the code...
var https = require('https'),
    fs = require('fs'),
    app = require("express"),
    key = fs.readFileSync('apache.key', 'utf8'),
    cert = fs.readFileSync('apache.crt', 'utf8');

var server = https.createServer({key: key, cert: cert}, app);
server.listen(8080);

var io = require("socket.io").listen(server);
var mysql = require('mysql');

function handler(req, res) {
}

io.on('connection', function (socket) {
  setInterval(function () {
    var connection = mysql.createConnection({
      host: 'db.server.com', user: 'dbuser', password: 'dbpass', database: 'dbname', port: 3306
    });
    connection.connect(function (err) {
    });
    connection.query('SELECT someColumn FROM someTable ', function (err, rows, fields) {
      if (!err) {
        socket.emit('notifications', JSON.stringify(rows));
        connection.end();
      } else {
        connection.end();
      }
    });
  }, 15000); // 15 seconds
});
It's always worked fine, but recently I've started getting errors in the web app saying "User already has more than 'max_user_connections' active connections", and upon investigation at the DB level using MySQL's "show processlist", I see rapidly spawning/dying connections - all I need to do is kill/restart the pushNotifications.js script and everything is back to normal.
What I'm hoping is that somebody sees something wrong with my code that may be failing to handle a scenario that could lead to processes repeatedly spawning at intervals more regular than every 15 seconds. Appreciate any thoughts at all because I'm out of ideas to diagnose this further.
You're creating a new database connection for each client connection and each interval, which is rather wasteful.
It's much better to create a decently sized connection pool once, and use connections from that:
let pool = mysql.createPool({
  connectionLimit: 10, // this may require tweaking depending on # of clients
  ...
});

io.on('connection', function (socket) {
  setInterval(function () {
    pool.query('SELECT someColumn FROM someTable ', function (err, rows, fields) {
      if (!err) {
        socket.emit('notifications', JSON.stringify(rows));
      }
      // no connection.end() here: pool.query() acquires and releases
      // a pooled connection automatically
    });
  }, 15000);
});
Some additional remarks:
- once a client disconnects, its associated interval isn't cleared/stopped. At some point that will start causing problems, because it's running queries on behalf of a client that isn't there anymore; you should use a listener on the disconnect event to call clearInterval and clean up resources when the server detects a client disconnected (see the sketch below).
- your example code doesn't show whether the database query is specific to each client. If it's not, you should move the interval outside the io.on() block entirely and use Socket.IO broadcasting to send the data to all connected clients (instead of running the exact same query for each client separately).
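A minimal sketch of both suggestions, reusing the pool from above:
io.on('connection', function (socket) {
  // per-client interval (only needed if the query differs per client)
  var timer = setInterval(function () {
    pool.query('SELECT someColumn FROM someTable ', function (err, rows) {
      if (!err) socket.emit('notifications', JSON.stringify(rows));
    });
  }, 15000);

  // stop querying for this client once it goes away
  socket.on('disconnect', function () {
    clearInterval(timer);
  });
});

// if the data is the same for everyone, broadcast once per interval instead:
// setInterval(function () {
//   pool.query('SELECT someColumn FROM someTable ', function (err, rows) {
//     if (!err) io.emit('notifications', JSON.stringify(rows));
//   });
// }, 15000);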

Is opening a new connection for each query with a MongoDB database good practise?

I'm creating a web server that stores a user's data in a MongoDB database. The code behind the web requests uses asynchronous functions to insert a document into the database, but because these functions are asynchronous, a new connection is made to the server for every request.
exports.create_user = function (username, password, callback) {
  mongo.connect(url, function (err, db) {
    db.collection('users').insertOne({username: username, password: password}, function (err, result) {
      callback(result);
      db.close();
    });
  });
};
I'm under the impression that doing it this way is not best practice, but I can't think of a way to do it with the module pattern I'm using above. Any suggestions or advice would be appreciated.
I stumbled upon this question during my own research into whether opening a new MongoDB connection for each query or using connection pooling is the best practice. It turns out that MongoDB suggests connection pooling for most use cases.
Citing from the docs:
A Connection Pool is a cache of database connections maintained by the driver so that connections can be re-used when new connections to the database are required. To reduce the number of connection pools created by your application, we recommend calling MongoClient.connect once and reusing the database variable returned by the callback
I usually use the following form to establish and reuse a connection while firing queries:
// db.js
import { MongoClient } from 'mongodb';

// this will hold our cached database connection, which will itself hold
// multiple connections in a pool to be used
let connection,
    database;

export default {
  connect: (next) => {
    // already established? => return connection
    if (database) return next(undefined, database);
    // establish connection
    MongoClient.connect('mongodb://localhost:27017/admin', (err, db) => {
      if (err) return next(err);
      // save connection
      connection = db;
      // connect to database
      database = db.db('myDatabase');
      // call callback
      next(undefined, database);
    });
  },
  disconnect: (next) => {
    if (!connection) return next();
    // close connection
    connection.close();
    next();
  }
};
Firing queries:
import db from './db';

db.connect((err, database) => {
  if (err) throw err;
  database.collection('myUsers').insertOne({name: 'test'}, (err) => {
    if (err) throw err;
    db.disconnect((err) => {
      if (err) throw err;
      console.log('Everything finished, database connection closed');
    });
  });
});
Note: it is possible to set the maximum number of pooled connections manually (afaik the default is 5?). Refer to the docs for how to set the number of open connections via the MongoDB URL.
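For example, a minimal sketch assuming the same local server as above; maxPoolSize is the standard connection-string option and 10 is an arbitrary value:
import { MongoClient } from 'mongodb';

// cap the pool at 10 connections via the connection string
MongoClient.connect('mongodb://localhost:27017/admin?maxPoolSize=10', (err, client) => {
  if (err) throw err;
  const database = client.db('myDatabase');
  // ...run queries through `database`, sharing the pooled connections...
});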
By calling db.close() you can close the connection. If you don't close your connection, the event loop will keep the connection open and your process will not exit. If you are building a web server where your process will not be terminated, it's not necessary to close the connection.
For reference: node-mongodb-native

Node JS and pg module 'How can I really close connection?'

I'm going crazy with node pg module, getting 'too many clients already' error.
My app.js file, for example, manages some routes in which I query some data from Postgres. app.js looks like below:
// First I create a client
var client = new pg.Client(connectionString);

// Then I use that client for every route, for example:
ContPg.prototype.someController = function (req, res) {
  client.connect(function (error) {
    if (error) return console.error('error conectando', error);
    // Need to close client if there's an error connecting??
    client.query(someQuery, function (e, r) {
      client.end();
      // Here sometimes I don't end the client if I need to query more data
      if (e) return console.error('error consultando', e);
      // Do anything with result...
    });
  });
};
As I said, I use that client for all routes in the file pg.js, but in other files with other routes I do the same to connect to Postgres (create a client and use it for all the routes that file manages).
Questions
Is something wrong with my code? Am I ending the client connection the wrong way?
If there's nothing wrong, what could be causing 'too many clients already' error?
Thanks in advance!!
The recommended pattern is to use client pooling. From the node-postgres documentation:
Generally you will access the PostgreSQL server through a pool of clients. A client takes a non-trivial amount of time to establish a new connection. A client also consumes a non-trivial amount of resources on the PostgreSQL server - not something you want to do on every http request. Good news: node-postgres ships with built in client pooling.
var pg = require('pg');
var conString = "postgres://username:password@localhost/database";

// this initializes a connection pool
// it will keep idle connections open for a (configurable) 30 seconds
// and set a limit of 20 (also configurable)
pg.connect(conString, function (err, client, done) {
  if (err) {
    return console.error('error fetching client from pool', err);
  }
  client.query('SELECT $1::int AS number', ['1'], function (err, result) {
    // call `done()` to release the client back to the pool
    done();
    if (err) {
      return console.error('error running query', err);
    }
    console.log(result.rows[0].number);
    // output: 1
  });
});
Don't forget to call done() or you'll be in trouble!
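For what it's worth, newer releases of node-postgres dropped the implicit pg.connect pool in favour of an explicit Pool object; roughly the same pattern looks like this (the connection string is a placeholder):
const { Pool } = require('pg');

// one pool for the whole app; connection details are placeholders
const pool = new Pool({ connectionString: 'postgres://username:password@localhost/database' });

// pool.query() checks out a client, runs the query, and releases it automatically
pool.query('SELECT $1::int AS number', ['1'], function (err, result) {
  if (err) return console.error('error running query', err);
  console.log(result.rows[0].number); // output: 1
});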
