How to reuse mongo client using globals, best practices - javascript

So, I have read other topics here on Stack Overflow that touch on this but don't offer a clear solution to the problem.
First, I created a config file that builds the Mongo client and exports it.
const { MongoClient } = require('mongodb');

const authMechanism = 'SCRAM-SHA-1';
const user = encodeURIComponent(process.env.MONGODB_USER);
const password = encodeURIComponent(process.env.MONGODB_PASSWORD);
const uri = `mongodb://${user}:${password}@${process.env.MONGODB_URI}?authMechanism=${authMechanism}`;

const client = new MongoClient(uri, {
  useUnifiedTopology: true,
  useNewUrlParser: true,
  loggerLevel: 'info',
});

module.exports = client;
From there, I understand that the Mongo client must be initialised once, and only once, before the application starts listening. So I created this index.js (the entry point to the app), which does that and requires the typical app.js where all the Node config lives.
const app = require('./app');
const db = require('../configs/db/db-config');

const port = process.env.PORT;

db.connect((err, client) => {
  if (err) {
    throw err;
  }
  const database = client.db('dbnamegoeshere');
  app.listen(port, () => console.log(`Listening on port ${port}...`));
});
Now, what is the best practice for reusing that db anywhere I want to run queries? How could I add it globally? Would adding it globally affect performance or be bad practice?
I have seen other examples where people perform these two tasks using a class, but again everything is in the same file, not with an export or a global.
One final question: where, and why, should I close the db client connection?
Thank you.

I think adding the db connection object to the global object works perfectly. And if you are worried about performance, just reassign the global variables to local ones where you use them.
//dbconnection.js
const debug = require('debug')('someapp:mongo');
const mongoClient = require('mongodb').MongoClient;

const mongoOptions = {};
const mongoUrl = process.env.MONGO_URL || "mongodb://localhost:27017/dbname";

// Accept an optional callback so callers can wait until global.db is set,
// because the connection is established asynchronously.
module.exports = function (onReady) {
  mongoClient.connect(mongoUrl, mongoOptions, (err, client) => {
    if (err) {
      debug("MongoDB connection error: ", err);
      throw err;
    }
    const db = (global.db = client.db());
    if (onReady) onReady(db);
  });
};
Use it like this in your root application file. Because the connection is established asynchronously, only touch global.db after the callback fires; once it is set, the db connection is available in every other file in your project.
//server.js
require("./dbconnection")(() => {
  const userDB = global.db.collection("Users");
  userDB.find({}).toArray((err, items) => {});
});
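And if you want the "reassign global vars to local ones" optimisation mentioned above, here is a minimal sketch (the users.js module name and getUsers helper are made up for illustration):
// users.js (hypothetical module)
function getUsers() {
  // Grab the global handle into a local reference before using it repeatedly;
  // this assumes dbconnection has already connected and set global.db.
  const db = global.db;
  return db.collection('Users').find({}).toArray();
}

module.exports = { getUsers };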

Related

Mongo DB problem - connections accumulation

I have a problem with the approach I use to connect to MongoDB.
I use the following method:
import { Db, MongoClient } from "mongodb";

let cachedConnection: { client: MongoClient; db: Db } | null = null;

export async function connectToDatabase(mongoUri?: string, database?: string) {
  if (!mongoUri) {
    throw new Error(
      "Please define the MONGO_URI environment variable inside .env.local"
    );
  }
  if (!database) {
    throw new Error(
      "Please define the DATABASE environment variable inside .env.local"
    );
  }
  if (cachedConnection) return cachedConnection;
  cachedConnection = await MongoClient.connect(mongoUri, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  }).then((client) => ({
    client,
    db: client.db(database),
  }));
  return cachedConnection!;
}
Every time I need to connect to MongoDB I do the following:
const { db } = await connectToDatabase(config.URI, config.USERS_DATABASE);
const myUniversity = await db
  .collection(config.MY_COLLECTION)
  .findOne({});
Everything seems OK, so what is the problem?
The problem is that the connections to my DB don't close after I use them. I thought my server was stateless, so the connections would end every time I finished using the DB, but that is not true: they stay alive, and after a few hours of using my app MongoDB Atlas sends me an email saying that the connection limit is exceeded.
As you can see in this screenshot, the chart keeps growing, which means the connections stay open and accumulate. How do you think I can solve this problem?
Keep in mind that cachedConnection is reused only when the same instance handles the call. If I call a different API route than the first one, it creates another connection; it doesn't enter the if (cachedConnection) block but goes all the way to the end.
You can try this simple demo, which lets you use the same connection throughout the application in different modules. There are three modules: index.js is the starter program, dbaccess.js is where you create and maintain a connection that can be reused again and again, and apis.js is where you use the database connection to retrieve data.
index.js:
const express = require('express');
const mongo = require('./dbaccess');
const apis = require('./apis');

const app = express();

const init = async () => {
  await mongo.connect();
  app.listen(3000);
  apis(app, mongo);
};

init();
dbaccess.js:
const { MongoClient } = require('mongodb');

class Mongo {
  constructor() {
    this.client = new MongoClient("mongodb://127.0.0.1:27017/", {
      useNewUrlParser: true,
      useUnifiedTopology: true
    });
  }
  async connect() {
    await this.client.connect();
    console.log('Connected to MongoDB server.');
    this.db = this.client.db('test');
    console.log('Database:', this.db.databaseName);
  }
}

module.exports = new Mongo();
apis.js:
module.exports = function(app, mongo) {
  app.get('/', function(req, res) {
    mongo.db.collection('users').find().limit(1).toArray(function(err, result) {
      res.send('Doc: ' + JSON.stringify(result));
    });
  });
};
Change the appropriate values in the url, database name and collection name before trying.
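If the app runs in a serverless environment where each API route can get its own module instance (which looks like what happens in the question above), a commonly used variant is to park the cached client on the global object so the cache survives across modules. A minimal sketch, assuming the property name _mongoCache (made up here):
// db.js - sketch of a global-backed connection cache
const { MongoClient } = require("mongodb");

async function connectToDatabase(mongoUri, database) {
  // Reuse a client already cached on the global object, if any
  if (global._mongoCache) return global._mongoCache;

  const client = await MongoClient.connect(mongoUri);
  global._mongoCache = { client, db: client.db(database) };
  return global._mongoCache;
}

module.exports = { connectToDatabase };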

TypeError: Cannot read property 'execute' of undefined - Node.js: how to export an Oracle DB connection

Hi, I am new to Node and Oracle. I have created an app and made a successful connection to the DB.
I need to use the connection object across the application. How can I do that?
Below is my index.js file:
const express = require("express");
const app = express();
const authRoute = require("./routes/auth");

app.use(express.json());
app.use("/api", authRoute);

app.listen(3000, function() {
  console.log("Node Server : Running on port 3000...");
});
database connection file => connect.js
const oracledb = require('oracledb');
const dotenv = require('dotenv');
dotenv.config();

const connection = oracledb.getConnection(
  {
    user: process.env.USER,
    password: process.env.PASS,
    connectString: process.env.ConnectString
  },
  function (err, connection) {
    if (err) {
      console.error(err.message);
      return;
    }
    console.log('Connection was successful!');
    connection.close(function (err) {
      if (err) {
        console.error(err.message);
        return;
      }
    });
  });
module.exports = connection;
I want to use this db connection in my auth.js file
const router = require('express').Router();
const db = require('../database/connect');

router.post("/authenticate", function(req, res) {
  //console.log(req);
  const user = req.body.username;
  const username = {"name": user};
  const pass = req.body.key;
  const password = {"pass": pass};
  //const result = db.execute('select * from usertable'); // this doesn't work
  //console.log(result.rows);
  res.send('success');
});

module.exports = router;
When I run const result = db.execute('select * from usertable'); I get the error below.
TypeError: Cannot read property 'execute' of undefined
What am I doing wrong? Can anyone please help? Thanks in advance.
I faced this problem too. You must install Oracle Instant Client (version 19) on your machine; download the Instant Client package for your platform from Oracle's website.
(Update: there is a multi-part series with code showing what you want at https://github.com/oracle/oracle-db-examples/tree/main/javascript/rest-api)
Use a connection pool that is opened at app start. Then the pool cache can be used to get the pool (and then connections) in other modules.
For a web app like yours you definitely want to use a connection pool for performance.
There's a big section on connection pooling in the documentation. E.g. see Connection Pool Cache, which says:
When pools are created, they can be given a named alias. The alias can later be used to retrieve the related pool object for use. This facilitates sharing pools across modules and simplifies getting connections.
The examples are worth reviewing.
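A minimal sketch of that pooling pattern with node-oracledb (the pool alias 'main' and the usertable query are placeholders, not part of the question's code):
// db.js - create the pool once at app start
const oracledb = require('oracledb');

async function initPool() {
  // The pool is registered in node-oracledb's internal pool cache
  // under the alias, so other modules can look it up by name.
  await oracledb.createPool({
    user: process.env.USER,
    password: process.env.PASS,
    connectString: process.env.ConnectString,
    poolAlias: 'main'
  });
}

module.exports = { initPool };

// auth.js - borrow a connection per request and release it
const oracledb = require('oracledb');

async function authenticate() {
  let connection;
  try {
    connection = await oracledb.getPool('main').getConnection();
    const result = await connection.execute('select * from usertable');
    return result.rows;
  } finally {
    if (connection) await connection.close(); // returns it to the pool
  }
}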

Express.js: create additional mongodb DB in controller

I'm new to MongoDB.
When I create my Node.js server I use only one DB connection (I connect to it on start).
But imagine: I have one database with some generic collections, and more databases, one for each custom client.
How can I create those databases at runtime?
start.js:
const mongoose = require("mongoose");

// import environmental variables from variables.env file
require("dotenv").config({ path: "variables.env" });

mongoose.connect(process.env.DATABASE);
mongoose.Promise = global.Promise;
mongoose.connection.on("error", err => {
  console.error(`🚫 → ${err.message}`);
});

require("./models/MaintenanceType");
require("./models/Maintenance");

const app = require("./app");
app.set("port", process.env.PORT || 7777);
const server = app.listen(app.get("port"), () => {
  console.log('started');
});
variables.env (example):
NODE_ENV=development
DATABASE=mongodb://db:qwe123@sometest.server.com:412345/webtest
PORT=1234
SECRET=webtest
KEY=webtestcom
and controller:
const mongoose = require("mongoose");
const Maintenance = mongoose.model("Maintenance");

exports.createMaintenance = async (req, res) => {
  const maintenance = await new Maintenance(req.body).save();
  // ALSO create a db and table if not exists for this client and use it somehow
  res.json(maintenance);
};
Is it possible to do this?
You can create a new connection:
mongoose.connect('URI_FOR_ANOTHER_DATABASE')
But it's a bad idea to create new connections. The driver has a feature that lets an existing connection query another database; for this purpose you can check the useDb() method as shown here.
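A minimal sketch of how that could look in the controller above (the per-client database name is derived here from a hypothetical clientId field in the request body):
const mongoose = require("mongoose");
const maintenanceSchema = mongoose.model("Maintenance").schema;

exports.createMaintenance = async (req, res) => {
  // Reuse the existing connection pool, but switch to a per-client database.
  // The "client_<id>" naming scheme is only an example.
  const clientDb = mongoose.connection.useDb(`client_${req.body.clientId}`);
  const ClientMaintenance = clientDb.model("Maintenance", maintenanceSchema);

  const maintenance = await new ClientMaintenance(req.body).save();
  res.json(maintenance);
};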

Node.js Async/Await module export

I'm kinda new to module creation and was wondering about module.exports and waiting for async functions (like a Mongo connect function, for example) to complete before exporting the result. The variables get properly defined using async/await in the module, but when I try to log them from the file that requires the module, they show up as undefined. If someone could point me in the right direction, that'd be great. Here's the code I've got so far:
// module.js
const MongoClient = require('mongodb').MongoClient

const mongo_host = '127.0.0.1'
const mongo_db = 'test'
const mongo_port = '27017';

(async module => {
  var client, db
  var url = `mongodb://${mongo_host}:${mongo_port}/${mongo_db}`
  try {
    // Use connect method to connect to the Server
    client = await MongoClient.connect(url, {
      useNewUrlParser: true
    })
    db = client.db(mongo_db)
  } catch (err) {
    console.error(err)
  } finally {
    // Exporting mongo just to test things
    console.log(client) // Just to test things I tried logging the client here and it works. It doesn't show 'undefined' like test.js does when trying to console.log it from there
    module.exports = {
      client,
      db
    }
  }
})(module)
And here's the js that requires the module
// test.js
const {client} = require('./module')
console.log(client) // Logs 'undefined'
I'm fairly familiar with JS and am still actively learning about features like async/await, but yeah... I can't really figure this one out.
You have to export synchronously, so it's impossible to export client and db directly. However, you could export a Promise that resolves to client and db:
module.exports = (async function() {
  const client = await MongoClient.connect(url, {
    useNewUrlParser: true
  });
  const db = client.db(mongo_db);
  return { client, db };
})();
So then you can import it as:
const {client, db} = await require("yourmodule");
(that has to be in an async function itself)
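For example, test.js from the question could become something like this (a sketch, assuming the module above is saved as module.js):
// test.js
(async () => {
  // require() returns the exported Promise; await it to get client and db
  const { client, db } = await require('./module');
  console.log(client);
  console.log(db.databaseName);
})();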
PS: console.error(err) is not a proper error handler; if you can't handle the error, just crash.
The solution provided above by @Jonas Wilms works, but it requires calling require inside an async function every time we want to reuse the connection. An alternative is to use a callback function to return the MongoDB client object.
mongo.js:
const MongoClient = require('mongodb').MongoClient;

const uri = "mongodb+srv://<user>:<pwd>@<host and port>?retryWrites=true";

const mongoClient = async function(cb) {
  const client = await MongoClient.connect(uri, {
    useNewUrlParser: true
  });
  cb(client);
};

module.exports = { mongoClient };
Then we can use the mongoClient method in a different file (an Express route or any other JS file).
app.js:
var client;
const mongo = require('path to mongo.js');

mongo.mongoClient((connection) => {
  client = connection;
});

// declare express app and listen....

// simple post request to store a student
app.post('/', async (req, res, next) => {
  const newStudent = {
    name: req.body.name,
    description: req.body.description,
    studentId: req.body.studentId,
    image: req.body.image
  };
  try {
    await client.db('university').collection('students').insertOne(newStudent);
  } catch (err) {
    console.log(err);
    return res.status(500).json({ error: err });
  }
  return res.status(201).json({ message: 'Student added' });
});

Mongodb + Node: When to close

I am working on a Koa + MongoDB backend. My question is: when should I close the db connection, or does MongoDB manage that? I am not closing any connections right now and it seems fine.
// app.js
const Koa = require('koa')
const database = require('./database')

const app = new Koa()

database
  .connect()
  .then(() => { app.listen(8080) })
  .catch((err) => { console.error(err) })
// ./database.js
const MongoClient = require('mongodb').MongoClient
const Model = require('./model')
class Database {
  async connect() {
    if (!this.db) {
      this.db = await MongoClient.connect('mongodb://localhost:27017')
      this.item = new Model(this.db, 'item_collection')
    }
  }
}

module.exports = new Database()
// ./model.js
class Model {
  constructor(db, collectionName) {
    this.name = collectionName
    this.db = db
  }
  async findAll() {
    const result = await this.db.collection(this.name).find().toArray()
    if (!result) {
      throw new Error('error')
    }
    return result
  }
}

module.exports = Model
I also ran a stress test using vegeta, making API requests to the server at 100 requests / second, and the response times are good. So, am I worrying about premature optimization here? If not, when should I close the db connection?
As Koa keeps running (and in your case listening on port 8080), you should not close the db connection.
If you are running scripts that are expected to end (tasks running on cron, etc.), you should manually close the connection when you are finished with all of your db tasks.
You can take a look at this example for express.js (Koa's sister framework)
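For the script case, here is a minimal sketch of connecting, doing the work, and closing explicitly (the URL, database, and collection names are placeholders):
// cleanup-task.js - a one-off script that is expected to end
const MongoClient = require('mongodb').MongoClient

async function run() {
  const client = await MongoClient.connect('mongodb://localhost:27017', {
    useUnifiedTopology: true
  })
  try {
    const items = await client.db('test').collection('item_collection').find().toArray()
    console.log(`Found ${items.length} items`)
  } finally {
    // close explicitly so the Node process can exit once the work is done
    await client.close()
  }
}

run().catch((err) => { console.error(err) })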
