Multi-tenant with mongoose and Node.js. One or multiple connections? - javascript

We are building a SaaS application where, for technical reasons (backup, security), we need to adopt a multi-tenant architecture (i.e. one DB per customer). We are using Node.js (with TypeScript) and MongoDB (with the mongoose driver).
For a first test, we are doing as below. Please note that it is a simple draft that is easily reproducible.
We start a new connection in index.ts, the "entry-point":
try {
  // Define Mongo connection options
  const mongoOptions = {
    useNewUrlParser: true,
    useCreateIndex: true,
    useUnifiedTopology: true,
    useFindAndModify: false,
    autoIndex: true,
    poolSize: 10,
    bufferMaxEntries: 0,
    connectTimeoutMS: 10000,
    socketTimeoutMS: 30000,
  };
  // Creating mongo connection
  const connection = await mongoose.connect(
    `mongodb://${process.env.MONGO_HOSTNAME}:${process.env.MONGO_PORT}`,
    mongoOptions
  );
  console.log('Successfully connected to mongo database!');
} catch (err) {
  console.log(err);
}
Then, in a separate file, we have some logic for selecting the appropriate database and returning the Model:
import mongoose from 'mongoose';

export class ClientDbProvider {
  static async getTenantDb(
    tenantId: string,
    modelName: string,
    schema: mongoose.Schema
  ) {
    const dbName = `separateDB_${tenantId}`;
    const db = mongoose.connection.useDb(dbName, { useCache: true });
    db.model(modelName, schema);
    return db;
  }

  static async getModelForDb(
    databaseName: string,
    model: mongoose.Model<mongoose.Document>,
    schema: mongoose.Schema
  ) {
    const connection = await ClientDbProvider.getTenantDb(
      databaseName,
      model.modelName,
      schema
    );
    return connection.model(model.modelName);
  }
}
Then, in any route, we include the clientID to select the appropriate DB.
router.post('/api/data/:clientID', async (req: Request, res: Response) => {
  const { name, value } = req.body;
  const data = Data.build({
    name: name,
    value: value,
  });
  try {
    const dataModel = await ClientDbProvider.getModelForDb(
      req.params.clientID,
      Data,
      DataSchema
    );
    const doc = await dataModel.create(data);
  } catch (err) {
    console.log(err);
  }
  res.send({});
});
Basically, we call ClientDbProvider.getModelForDb to get a Model. getModelForDb switches to a different database using the same connection pool and returns a model.
Notes on the app:
The database will be continuously filled with data, as it will store telemetry from several sensors (some can generate data every second, some every 10 minutes...).
The API will mostly be used for reading data (sent as JSON).
Some queries could be long, as that depends on how much data the client asks for (even though we will put some default limits in place).
We will never have a huge number of customers (DBs). We plan to have 20 to 40 customers over the next two years (we will upgrade to another architecture, such as a sharded cluster, if needed). In any case, the number of customers will never be 1,000,000 or so...
All DBs are on the same server/mongo instance (for now).
Questions:
Could this 'draft' code cause trouble or performance issues?
Would it make sense to create a connection per DB (customer) with the mongoose.createConnection function and then cache the connections as described here (in a global variable, for example)?
As all of our DBs are on the same server, wouldn't increasing the connection pool size be sufficient?
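For reference, a minimal sketch of the cached-connection-per-tenant pattern from the second question might look like this (hedged: mongoose 5 option names; the cache key and URI format are illustrative assumptions, not the app's actual code):
import mongoose, { Connection } from 'mongoose';

// Cache one dedicated connection (with its own pool) per tenant.
const tenantConnections = new Map<string, Connection>();

export function getTenantConnection(tenantId: string): Connection {
  const cached = tenantConnections.get(tenantId);
  if (cached) return cached;
  // separateDB_<tenantId> mirrors the naming used in ClientDbProvider above.
  const conn = mongoose.createConnection(
    `mongodb://${process.env.MONGO_HOSTNAME}:${process.env.MONGO_PORT}/separateDB_${tenantId}`,
    { useNewUrlParser: true, useUnifiedTopology: true, poolSize: 10 }
  );
  tenantConnections.set(tenantId, conn);
  return conn;
}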

Related

Mongoose not saving data according to a function (discord.js)

Today, I decided to make a Discord RPG bot, which of course requires profile stats such as coins and the actual users. Therefore, I looked up a tutorial on how to do this with MongoDB, but I am running into one issue. When a guild member joins and the bot is running, the data does not save, with no error at all, and I am unsure why this is happening. I have tried troubleshooting the connection status by adding a line console.log(mongoose.connection.readyState) after the bot attempts to connect to the database. This returns 1, which means the connection is fine. I cannot find any other reason for this, so I decided to ask a question after hours of thinking.
(index.js): Connecting to the database
const mongoose = require("mongoose");

mongoose.connect(process.env.MONGO_SERVER, {
  useNewUrlParser: true,
  useUnifiedTopology: true
}).then(() => [
  console.log("Connected to MongoDB Database successfully!"),
  console.log(mongoose.connection.readyState)
]).catch((err) => {
  console.error(err);
});
(profileSchema.js): Creating a profile schema
const mongoose = require("mongoose");

const profileSchema = new mongoose.Schema({
  id: { type: String, required: true, unique: true },
  serverid: { type: String, required: true },
  coins: { type: Number, default: 0 },
  bank: { type: Number }
});

const model = mongoose.model("ProfileModels", profileSchema);
module.exports = model;
(guildMemberAdd.js): Creating and uploading the data into the database
const profileModel = require('../models/profileSchema');

module.exports = async (client, discord, member) => {
  let profile = await profileModel.create({
    id: member.id,
    serverid: member.guild.id,
    coins: 0,
    bank: 0
  });
  profile.save();
};
The reason has to do with the way you connect to Mongo.
By default, Mongo closes the connection after connecting to a database. To avoid this, pass in the keepAlive option when connecting to Mongo, so it would look something like:
mongoose.connect(process.env.MONGO_SERVER, {
  useNewUrlParser: true,
  useUnifiedTopology: true,
  keepAlive: true
})
This will mean an active connection to your database is kept open.
You are exporting 'model' from profileSchema.js but requiring 'profileModel' in guildMemberAdd.js, so import model from profileSchema, not profileModel.
Fix: Make sure the guildMemberAdd event is being called by adding console.log statements.
If it is not, check whether guildMemberAdd's code differs from your other event handlers.
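As a hedged illustration of that last suggestion (guildMemberAdd is discord.js's real event name; the wiring around it is assumed from the tutorial setup):
client.on('guildMemberAdd', (member) => {
  // If this line never prints, the handler is not being registered or fired.
  console.log(`guildMemberAdd fired for ${member.id}`);
});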

fastify and prisma with postgres session storage

I am building a Node.js API that uses Fastify, Prisma and Postgres. I have the API working with fastify-cookies and fastify-session, and I can get cookies just fine, but I need to be able to store the session cookies in the database. I saw a tutorial on doing this, but it was without Prisma, so I'm lost on how to connect fastify-session to the Prisma database pool.
I use the Prisma client to connect to the database for my normal calls in my routes, const data = await prisma.model.create({});
server.js
const fastify = require('fastify')({ logger: true });

const PORT = process.env.PORT || 3000;

// Session state
fastify.register(require('./sessions'));

// Register all our routes here.
...

// Startup code for the fastify server.
const start = async () => {
  try {
    await fastify.listen(PORT, '0.0.0.0');
  } catch (error) {
    fastify.log.error(error);
    process.exit(1);
  }
};

// Start the fastify server.
start();
sessions.js
const cookie = require('fastify-cookie');
const session = require('fastify-session');
const fp = require('fastify-plugin');

/**
 * @param {import('fastify').FastifyInstance} fastify
 */
const plugin = async (fastify) => {
  // All plugin data here is global to fastify.
  fastify.register(cookie);
  fastify.register(session, {
    secret: process.env.SESSION_SECRET,
    store: new SessionStore({
      tableName: 'UserSession',
      pool: ???, <--------------------------------- how to connect?
    }),
    saveUninitialized: false,
    cookie: {
      httpOnly: true,
      secure: false,
    },
  });
  fastify.addHook('preHandler', (req, reply, next) => {
    req.session.user = {};
    next();
  });
};

module.exports = fp(plugin);
If you want to use the Prisma connection pool, you would have to create a session storage library similar to connect-pg-simple, or modify that codebase to accept a Prisma connection. This is definitely a non-trivial implementation, and I don't think it would make a lot of sense without exceptional circumstances.
I would suggest creating a new pg.Pool or pgPromise instance and connecting with that, as shown in the tutorial video you linked to. There's no reason you can't have two separate connection pools open to the same database (one with Prisma and one with pg.Pool), as sketched below.
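A minimal sketch of that suggestion, assuming connect-pg-simple is installed and that fastify-session accepts express-session-compatible stores (the table name and environment variable are illustrative):
const { Pool } = require('pg');
const session = require('fastify-session');
const PgStore = require('connect-pg-simple')(session);

// A second pool, independent of Prisma, pointed at the same database.
const pgPool = new Pool({ connectionString: process.env.DATABASE_URL });

fastify.register(session, {
  secret: process.env.SESSION_SECRET,
  store: new PgStore({
    pool: pgPool,          // plain pg pool, not Prisma
    tableName: 'UserSession',
  }),
  saveUninitialized: false,
  cookie: { httpOnly: true, secure: false },
});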

Mongo DB problem - connections accumulation

I have a problem with the approach I use to connect to MongoDB.
I use the following method:
import { Db, MongoClient } from "mongodb";

let cachedConnection: { client: MongoClient; db: Db } | null = null;

export async function connectToDatabase(mongoUri?: string, database?: string) {
  if (!mongoUri) {
    throw new Error(
      "Please define the MONGO_URI environment variable inside .env.local"
    );
  }
  if (!database) {
    throw new Error(
      "Please define the DATABASE environment variable inside .env.local"
    );
  }
  if (cachedConnection) return cachedConnection;
  cachedConnection = await MongoClient.connect(mongoUri, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  }).then((client) => ({
    client,
    db: client.db(database),
  }));
  return cachedConnection!;
}
Every time I need to connect to MongoDB, I do as follows:
const { db } = await connectToDatabase(config.URI, config.USERS_DATABASE);
const myUniversity = await db
  .collection(config.MY_COLLECTION)
  .findOne({});
Everything seems OK, so what is the problem?
The problem is that the connections to my DB don't close after I use them. I thought that since my server is stateless, the connections would end after every use of the DB. But that is not true! They stay alive, and after a few hours of using my app, Mongo Atlas sends me an email saying that the connection limit is exceeded.
As you can see in this screenshot, the chart is ever growing. That means the connections stay open and accumulate. How do you think I can solve this problem?
Keep in mind that cachedConnection is only used when I reuse the same connection. If I call a different API than the first one, it creates another connection: it doesn't enter the if (cachedConnection) block, but goes forward to the end.
You can try this simple demo, which will allow you to use the same connection throughout the application in different modules. There are three modules: index.js is the starter program, dbaccess.js is where you have code to create and maintain a connection which can be used again and again, and apis.js is a module where you use the database connection to retrieve data.
index.js:
const express = require('express');
const mongo = require('./dbaccess');
const apis = require('./apis');

const app = express();

const init = async () => {
  await mongo.connect();
  app.listen(3000);
  apis(app, mongo);
};

init();
dbaccess.js:
const { MongoClient } = require('mongodb');

class Mongo {
  constructor() {
    this.client = new MongoClient("mongodb://127.0.0.1:27017/", {
      useNewUrlParser: true,
      useUnifiedTopology: true
    });
  }
  async connect() {
    await this.client.connect();
    console.log('Connected to MongoDB server.');
    this.db = this.client.db('test');
    console.log('Database:', this.db.databaseName);
  }
}

module.exports = new Mongo();
apis.js:
module.exports = function(app, mongo) {
  app.get('/', function(req, res) {
    mongo.db.collection('users').find().limit(1).toArray(function(err, result) {
      res.send('Doc: ' + JSON.stringify(result));
    });
  });
};
Change the appropriate values in the url, database name and collection name before trying.
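If the process ever shuts down gracefully, you can also close the shared client explicitly so connections don't linger; a hedged sketch, assuming the Mongo wrapper above:
const mongo = require('./dbaccess');

process.on('SIGINT', async () => {
  // Close the single shared client so no connections remain open on exit.
  await mongo.client.close();
  process.exit(0);
});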

SQL to MongoDB Migration using nodejs script

I'm new to Node.js and I'm currently doing a SQL to MongoDB migration. I have created a script to load data into MongoDB from SQL queries. I created the script with sample code from Google and it is working, but I'm facing the issue below and need a workaround for it.
I have an array of SQL queries, and I don't want to run any of them if any query has a syntax issue or any error in its result. (Say the second query has a syntax issue: then the data from the first query should not be loaded into Mongo either, but currently it is loaded in my case.) Basically, if any of the queries has any issue, the results should not be loaded into the Mongo collection. Also, if there are any issues on the Mongo side, the transactions should not be committed.
I have used Mongo transactions here to roll back the data on errors. Please find the code below; any help would be much appreciated. The SQL and Mongo credentials are mock data only.
config file code
var mongoCollection = 'collectionName';
exports.mongoCollection = mongoCollection;

var queryList = [
  'sample query one',
  'sample query two'
];
exports.queryList = queryList;
main script code
var MongoClient = require('mongodb').MongoClient;
var sql = require('mysql');
const config = require('./assets/config');

var sqlConfig = {
  user: 'username',
  password: 'password',
  server: 'servername',
  database: 'databasename',
  port: 'portname',
  multipleStatements: true
};

async function transaction() {
  const mongodbUrl = 'mongourl';
  const client = await MongoClient.connect(mongodbUrl, {
    useNewUrlParser: true,
    useUnifiedTopology: true
  });
  const db = client.db();
  config.queryList.forEach(query => {
    new sql.ConnectionPool(sqlConfig).connect().then(pool => {
      return pool.request().query(query);
    }).then(result => {
      (async () => {
        const session = client.startSession();
        session.startTransaction({
          readConcern: { level: 'snapshot' },
          writeConcern: { w: 'majority' }
        });
        try {
          const collection = client.db('mongodbName').collection(config.mongoCollection);
          await collection.insertMany(result.recordset, { session });
          await session.commitTransaction();
          session.endSession();
          console.log('transaction completed');
        } catch (error) {
          await session.abortTransaction();
          session.endSession();
          console.log('transaction aborted');
          throw error;
        }
      })();
      sql.close();
    }).catch(error => {
      sql.close();
      throw error;
    });
  });
}

transaction();
Depending on the volume of data, you might look at breaking the process into two parts:
Get the data from MySQL.
If there are no errors, load it into Mongo.
That would save you from having to roll back the Mongo writes.
You can also take advantage of the default Mongo pool size (5) and use a pool on the MySQL side too.
Currently, this code is creating a pool for every select, which isn't optimal:
config.queryList.forEach(query => {
  new sql.ConnectionPool(sqlConfig).connect().then(pool => { // <- New pool per query?
    return pool.request().query(query);
  });
});
Instead, you can set up a pool once, per the MySQL documentation.
It looks like that driver only has a callback API, but you can promisify the query to make it easier to work with.
So, to put it all together, you could try something like this (this isn't working/tested code, just a suggestion):
var MongoClient = require('mongodb').MongoClient;
var sql = require('mysql');
const config = require('./assets/config');

var pool = sql.createPool({
  connectionLimit: 5,
  host: 'servername',
  user: 'username',
  password: 'password',
  database: 'databasename'
});

async function transaction() {
  try {
    const mongodbUrl = 'mongourl';
    const client = await MongoClient.connect(mongodbUrl, {
      useNewUrlParser: true,
      useUnifiedTopology: true
    });
    const db = client.db();
    const collection = client.db('mongodbName').collection(config.mongoCollection);
    // Map your query list to an array of runSql promises;
    // this will complete when all queries return, and jump to the catch if any fail
    let results = await Promise.all(config.queryList.map(runSql));
    // Map the results to an array of mongo inserts
    let inserts = await Promise.all(results.map(r => collection.insertMany(r.recordset)));
    // Close all connections
    pool.end((err) => err ? console.error(err) : console.log('MySQL Closed'));
    client.close((err) => err ? console.error(err) : console.log('MongoDB Closed'));
  } catch (err) {
    console.error(err);
  }
}

transaction();

function runSql(queryStr) {
  return new Promise((resolve, reject) => {
    pool.query(queryStr, function (error, results, fields) {
      error ? reject(error) : resolve(results);
    });
  });
}
If data volume is a concern, you might want to look at getting streams from your MySQL selects instead of just running them; a sketch follows.
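A hedged sketch of that streaming idea, using the mysql driver's query stream (it reuses the pool from the suggestion above; the batch size and helper name are illustrative, and this is likewise untested):
function streamSqlToMongo(queryStr, collection, batchSize = 1000) {
  return new Promise((resolve, reject) => {
    const stream = pool.query(queryStr).stream();
    let batch = [];
    stream.on('data', (row) => {
      batch.push(row);
      if (batch.length >= batchSize) {
        // Pause the SQL stream while the current batch is written to Mongo.
        stream.pause();
        collection.insertMany(batch)
          .then(() => { batch = []; stream.resume(); })
          .catch(reject);
      }
    });
    stream.on('error', reject);
    stream.on('end', () => {
      // Flush whatever is left in the final partial batch.
      (batch.length ? collection.insertMany(batch) : Promise.resolve())
        .then(() => resolve(), reject);
    });
  });
}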

Mongoose and multiple database in single node.js project

I'm doing a Node.js project that contains sub-projects. Each sub-project will have one MongoDB database, and Mongoose will be used for wrapping and querying the db. But the problem is:
Mongoose doesn't allow you to use multiple databases in a single mongoose instance, as the models are built on one connection.
To use multiple mongoose instances, Node.js doesn't allow multiple module instances, as it has a caching system in require(). I know about disabling module caching in Node.js, but I think that is not a good solution, as it is only needed for mongoose.
I've tried to use createConnection() and openSet() in mongoose, but that was not the solution.
I've tried to deep-copy the mongoose instance (http://blog.imaginea.com/deep-copy-in-javascript/) to pass new mongoose instances to the sub-projects, but it throws RangeError: Maximum call stack size exceeded.
I want to know if there is any way to use multiple databases with mongoose, or any workaround for this problem. Because I think mongoose is quite easy and fast. Or are there any other modules you would recommend?
According to the fine manual, createConnection() can be used to connect to multiple databases.
However, you need to create separate models for each connection/database:
var conn = mongoose.createConnection('mongodb://localhost/testA');
var conn2 = mongoose.createConnection('mongodb://localhost/testB');

// stored in 'testA' database
var ModelA = conn.model('Model', new mongoose.Schema({
  title: { type: String, default: 'model in testA database' }
}));

// stored in 'testB' database
var ModelB = conn2.model('Model', new mongoose.Schema({
  title: { type: String, default: 'model in testB database' }
}));
I'm pretty sure that you can share the schema between them, but you have to check to make sure; a sketch follows.
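For instance, a minimal sketch of sharing one schema across both connections (unverified, per the caveat above):
var sharedSchema = new mongoose.Schema({
  title: { type: String }
});

// The same schema object, registered on each connection/database.
var SharedA = conn.model('Shared', sharedSchema);  // stored in 'testA'
var SharedB = conn2.model('Shared', sharedSchema); // stored in 'testB'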
Pretty late, but this might help someone. The current answers assume you are using the same file for your connections and models.
In real life, there is a high chance that you are splitting your models into different files. You can use something like this in your main file:
mongoose.connect('mongodb://localhost/default');

const db = mongoose.connection;
db.on('error', console.error.bind(console, 'connection error:'));
db.once('open', () => {
  console.log('connected');
});
which is just how it is described in the docs. And then in your model files, do something like the following:
import mongoose, { Schema } from 'mongoose';

const userInfoSchema = new Schema({
  createdAt: {
    type: Date,
    required: true,
    default: Date.now, // a function, so the date is evaluated per document
  },
  // ...other fields
});

const myDB = mongoose.connection.useDb('myDB');
const UserInfo = myDB.model('userInfo', userInfoSchema);

export default UserInfo;
Where myDB is your database name.
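A hedged usage sketch for the exported model above (the import path is illustrative, and top-level await assumes an ES module):
import UserInfo from './models/userInfo';

// Written to the 'myDB' database, not the default connection's database.
const info = await UserInfo.create({});
console.log(info.createdAt);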
One thing you can do is have subfolders for each project. Then install mongoose in those subfolders and require() mongoose from its own folder in each sub-application, not from the project root or globally. So: one sub-project, one mongoose installation, and one mongoose instance.
-app_root/
--foo_app/
---db_access.js
---foo_db_connect.js
---node_modules/
----mongoose/
--bar_app/
---db_access.js
---bar_db_connect.js
---node_modules/
----mongoose/
In foo_db_connect.js
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/foo_db');
module.exports = exports = mongoose;
In bar_db_connect.js
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/bar_db');
module.exports = exports = mongoose;
In db_access.js files
var mongoose = require("./foo_db_connect.js"); // bar_db_connect.js for bar app
Now, you can access multiple databases with mongoose.
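A hedged sketch of what db_access.js in foo_app could look like, building on the connect module above (the schema and model names are illustrative):
var mongoose = require('./foo_db_connect.js');

// Models registered here live in foo_db.
var fooSchema = new mongoose.Schema({ name: String });
module.exports = mongoose.model('Foo', fooSchema);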
As an alternative approach, Mongoose does export a constructor for a new instance from the default instance. So something like this is possible:
var Mongoose = require('mongoose').Mongoose;

var instance1 = new Mongoose();
instance1.connect('foo');

var instance2 = new Mongoose();
instance2.connect('bar');
This is very useful when working with separate data sources, and also when you want to have a separate database context for each user or request. You will need to be careful, as it is possible to create a LOT of connections when doing this. Make sure to call disconnect() when instances are not needed, and also to limit the pool size created by each instance; a sketch of both precautions follows.
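A minimal sketch of those two precautions (poolSize is the mongoose 5 option name; the URI is illustrative):
var Mongoose = require('mongoose').Mongoose;

var instance = new Mongoose();
// Cap the pool each instance creates.
instance.connect('mongodb://localhost/tenant_db', { poolSize: 5 });

// ...later, when this context is finished with:
instance.disconnect();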
Use useDb to solve this issue. Example:
// product database
const myDB = mongoose.connection.useDb('product');
module.exports = myDB.model("Snack", snackSchema);

// user database
const myDB = mongoose.connection.useDb('user');
module.exports = myDB.model("User", userSchema);
A slightly more optimized (for me, at least) solution. Write this to a file db.js, require it wherever needed, call it with a function call, and you are good to go.
const MongoClient = require('mongodb').MongoClient;

async function getConnections(url, db) {
  return new Promise((resolve, reject) => {
    MongoClient.connect(url, { useUnifiedTopology: true }, function (err, client) {
      if (err) {
        console.error(err);
        resolve(false);
      } else {
        resolve(client.db(db));
      }
    });
  });
}

module.exports = async function () {
  let dbs = {};
  dbs['db1'] = await getConnections('mongodb://localhost:27017/', 'db1');
  dbs['db2'] = await getConnections('mongodb://localhost:27017/', 'db2');
  return dbs;
};
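A hedged usage sketch for the db.js module above (the require path is illustrative):
const getDbs = require('./db');

(async () => {
  const dbs = await getDbs();
  // Each entry is a ready-to-use Db handle.
  const users = await dbs['db1'].collection('users').find().limit(1).toArray();
  console.log(users);
})();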
I have been using this method and it has worked great for me so far.
const mongoose = require('mongoose');

function makeNewConnection(uri) {
  const db = mongoose.createConnection(uri, {
    useNewUrlParser: true,
    useUnifiedTopology: true
  });

  db.on('error', function (error) {
    console.log(`MongoDB :: connection ${this.name} ${JSON.stringify(error)}`);
    db.close().catch(() => console.log(`MongoDB :: failed to close connection ${this.name}`));
  });

  db.on('connected', function () {
    mongoose.set('debug', function (col, method, query, doc) {
      console.log(`MongoDB :: ${this.conn.name} ${col}.${method}(${JSON.stringify(query)},${JSON.stringify(doc)})`);
    });
    console.log(`MongoDB :: connected ${this.name}`);
  });

  db.on('disconnected', function () {
    console.log(`MongoDB :: disconnected ${this.name}`);
  });

  return db;
}

// Use
const db1 = makeNewConnection(MONGO_URI_DB1);
const db2 = makeNewConnection(MONGO_URI_DB2);

module.exports = {
  db1,
  db2
};
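A hedged sketch of consuming those exported connections in a model file (the require path and schemas are illustrative):
const mongoose = require('mongoose');
const { db1, db2 } = require('./connections');

// Each model binds to the pool of the connection it is registered on.
const User = db1.model('User', new mongoose.Schema({ name: String }));
const Order = db2.model('Order', new mongoose.Schema({ total: Number }));

module.exports = { User, Order };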
