Fastify and Prisma with Postgres session storage - JavaScript

I am building a Node.js API that uses Fastify, Prisma, and Postgres. I have the API working with fastify-cookie and fastify-session, and I can get cookies just fine, but I need to store the sessions in the database. I saw a tutorial on doing this, but it was without Prisma, so I'm lost on how to connect fastify-session to the Prisma database pool.
I use the Prisma client to connect to the database for my normal calls in my routes: const data = await prisma.model.create({});
server.js
const fastify = require('fastify')({ logger: true });
const PORT = process.env.PORT || 3000;
// Session state
fastify.register(require('./sessions'));
// Register all our routes here.
...
// Startup code for the fastify server.
const start = async () => {
  try {
    await fastify.listen(PORT, '0.0.0.0');
  } catch (error) {
    fastify.log.error(error);
    process.exit(1);
  }
};
// Start the fastify server.
start();
sessions.js
const cookie = require('fastify-cookie');
const session = require('fastify-session');
const fp = require('fastify-plugin');
/**
 * @param {import('fastify').FastifyInstance} fastify
 */
const plugin = async (fastify) => {
  // All plugin data here is global to fastify.
  fastify.register(cookie);
  fastify.register(session, {
    secret: process.env.SESSION_SECRET,
    store: new SessionStore({
      tableName: 'UserSession',
      pool: ???, <--------------------------------- how to connect?
    }),
    saveUninitialized: false,
    cookie: {
      httpOnly: true,
      secure: false,
    },
  });
  fastify.addHook('preHandler', (req, reply, next) => {
    req.session.user = {};
    next();
  });
};
module.exports = fp(plugin);

If you want to use the Prisma connection pool you would have to create a session storage library similar to connect-pg-simple, or modify that library's codebase to accept a Prisma connection. This is a non-trivial implementation, and I don't think it would make sense without exceptional circumstances.
I would suggest creating a new pg.Pool or pg-promise instance and connecting with that, as shown in the tutorial video you linked to. There's no reason you can't have two separate connection pools open to the same database (one with Prisma and one with pg.Pool).
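A hedged sketch of what sessions.js could look like under that approach, assuming a connect-pg-simple-style store (the exact store constructor and its compatibility with fastify-session depend on the store library you pick, and reusing Prisma's DATABASE_URL for the second pool is an assumption about your env setup):

```javascript
const fp = require('fastify-plugin');
const cookie = require('fastify-cookie');
const session = require('fastify-session');
const { Pool } = require('pg');
// Hypothetical choice: any express-session-compatible Postgres store,
// e.g. connect-pg-simple.
const PgStore = require('connect-pg-simple')(session);

const plugin = async (fastify) => {
  // A second, independent pool to the same Postgres database Prisma talks to.
  const pool = new Pool({ connectionString: process.env.DATABASE_URL });

  fastify.register(cookie);
  fastify.register(session, {
    secret: process.env.SESSION_SECRET,
    store: new PgStore({ tableName: 'UserSession', pool }),
    saveUninitialized: false,
    cookie: { httpOnly: true, secure: false },
  });
};

module.exports = fp(plugin);
```

The two pools are independent: Prisma keeps its own connections, and the store checks sessions in and out of the pg.Pool.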

Related

Is it possible to implement socket.io connection in express route?

I implemented a payment service that depends on one of my express routes as a callback route. Whenever a user wants to make a payment, they are redirected to the payment service's link, which is on an entirely different domain from my backend/frontend. After a successful payment, the user is redirected to my express GET route (the callback route); in this route I give users their asset and then redirect them to the frontend.
EXPECTATION
My expectation is that whenever a user makes a purchase, the frontend gets a real-time update so others can see some details about the purchase without refreshing their browser.
WHAT I'VE TRIED
I thought socket.io would solve this, e.g. by adding a socket connection in the route to push the data to the frontend. But after doing a lot of research, no solution seems to work for me.
HERE IS A SIMPLE CODE OF WHAT I'VE TRIED
=============================== server.js ========================
const express = require("express")
const app = express()
const http = require("http")
const cors = require("cors")
const session = require("express-session")
const runSocket = require("./runSocket")
const { Server } = require("socket.io")
app.use(cors())
app.use(express.json())
const server = http.createServer(app)
server.listen(3004, () => {
  console.log("SERVER IS RUNNING")
})
const io = new Server(server, {
  cors: {
    origin: "http://localhost:3000",
    methods: ["GET", "POST"],
  },
})
const postRoute = require("./routes/postData")(io)
app.use("/post-data", postRoute)
==================================== postData Route ======================================
const router = require("express").Router()
module.exports = function (io) {
  router.post("/", async (req, res) => {
    const data = req?.body?.data?.message
    const room = req?.body?.data?.room
    io.on("connection", (socket) => {
      console.log("Socket Running...")
      socket.to(room).emit("the_message", data)
    })
    console.log("Under socket...")
    return res.status(200).json({ data: req.body.data })
  })
  return router
}
The console.log("Socket Running...") inside the postData route never prints.
EXPECTATION
My expectation is, whenever a user make a purchase, I would like to make a real time update on the frontend for others to see some details about the purchase.
UPDATE: The payment gateway config looks something like this:
const { body } = await got.post("https://payment-provider-link", {
  headers: { Authorization: "Bearer token for payment" },
  json: {
    email: "email@gmail.com",
    amount: amount * 100,
    initiate_type: "inline",
    callback_url: `${BackendBaseUrl}/payment-callback`, // <<<============
  },
})
Okay, so you don't need the io.on("connection") handler in your route. Remove that piece of code and simply change it to io.to(room).emit("the_message", data). Also make sure the other sockets have joined the room you're trying to emit to; otherwise they won't receive the data.
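A minimal sketch of the corrected handler. The express/socket.io wiring is omitted, and makePurchaseHandler is an illustrative name, not from the original code:

```javascript
// Sketch: emit directly on each POST instead of registering a new
// "connection" listener inside the route handler.
function makePurchaseHandler(io) {
  return function handler(req, res) {
    const message = req?.body?.data?.message;
    const room = req?.body?.data?.room;
    // Reaches every socket that previously called socket.join(room).
    io.to(room).emit("the_message", message);
    return res.status(200).json({ data: req.body.data });
  };
}
```

The key difference: io.on("connection") only fires for sockets that connect *after* it is registered, which is why the original log never printed; emitting directly uses the sockets that are already connected and joined to the room.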

Node.Js MSSQL Query Timeout Expired

I am using a Node Express API to run SQL queries that populate a dashboard of data, via the mssql package with the msnodesqlv8 driver. Sometimes it runs flawlessly; other times I get the following error:
[Error: [Microsoft][SQL Server Native Client 11.0]Query timeout expired]
I create a poolPromise with a ConnectionPool to the db, then I pass that object to my other controllers, which run the specific queries that populate the data. I run the server, which initiates the db.js script and connects to MSSQL with a pool connection.
db.js:
// for connecting to sql server
const sql = require('mssql/msnodesqlv8');
// db config to connect via windows auth
const dbConfig = {
  driver: 'msnodesqlv8',
  connectionString: 'Driver={SQL Server Native Client 11.0};Server={my_server};Database={my_db};Trusted_Connection={yes};',
  pool: {
    idleTimeoutMillis: 60000
  }
};
// create a ConnectionPool object to pass to controllers
// this should keep a sql connection open indefinitely that we can query while the server is running
const poolPromise = new sql.ConnectionPool(dbConfig)
  .connect()
  .then(pool => {
    console.log('Connected to MSSQL');
    return pool;
  })
  .catch(err => console.log('Database Connection Failed! Bad Config: ', err))
module.exports = { sql, poolPromise };
An example of one of my controllers and how I use the poolPromise object is below. I currently have about 7 of these controllers, each running its own specific query to populate a specific element on the dashboard. Each query runs within 1-10 seconds (depending on current server load; I am querying an enterprise production server/db, so this can vary). As I mentioned earlier, the queries sometimes run flawlessly and I have no issues, but at other times I do. Is this a symptom of querying a shared production server? Is it preferable to query a server with less load? Or am I doing something in my code that could be improved?
const { sql, poolPromise } = require('../db');
// function to get data
const getData = async (req, res) => {
  try {
    // create query parameters from user request
    let id = req.query.id;
    // create query from connection pool
    let pool = await poolPromise;
    let qry = `
      select * from tbl where id = @Id
    `;
    let data = await pool.request()
      .input('Id', sql.VarChar(sql.MAX), id)
      .query(qry);
    // send 200 status and return records
    res.status(200);
    res.send(data.recordset);
  } catch (err) {
    console.log('Error:');
    console.log(err);
    res.sendStatus(500);
  }
};
module.exports = { getData };
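One knob worth checking given queries that legitimately take 1-10 s under load: the mssql package's per-request timeout defaults to 15000 ms, so a query that occasionally exceeds it would produce exactly this intermittent "Query timeout expired" error. A hedged sketch of the same dbConfig with an explicit requestTimeout (60000 is an illustrative value, not a recommendation):

```javascript
// Sketch: dbConfig as above, plus an explicit per-query timeout.
// requestTimeout is in milliseconds; the mssql package default is 15000,
// which intermittently slow production queries can trip.
const dbConfig = {
  driver: 'msnodesqlv8',
  connectionString: 'Driver={SQL Server Native Client 11.0};Server={my_server};Database={my_db};Trusted_Connection={yes};',
  requestTimeout: 60000, // allow individual queries up to 60 s
  pool: {
    idleTimeoutMillis: 60000
  }
};
```

This doesn't rule out server-load issues, but it separates "the query is slow" from "the query failed".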

MongoDB problem - connections accumulation

I have a problem with the approach I use to connect to MongoDB.
I use the following method:
import { Db, MongoClient } from "mongodb";
let cachedConnection: { client: MongoClient; db: Db } | null = null;

export async function connectToDatabase(mongoUri?: string, database?: string) {
  if (!mongoUri) {
    throw new Error(
      "Please define the MONGO_URI environment variable inside .env.local"
    );
  }
  if (!database) {
    throw new Error(
      "Please define the DATABASE environment variable inside .env.local"
    );
  }
  if (cachedConnection) return cachedConnection;
  cachedConnection = await MongoClient.connect(mongoUri, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  }).then((client) => ({
    client,
    db: client.db(database),
  }));
  return cachedConnection!;
}
Every time I need to connect to MongoDB I do as follows:
const { db } = await connectToDatabase(config.URI, config.USERS_DATABASE);
const myUniversity = await db
  .collection(config.MY_COLLECTION)
  .findOne({})
Everything seems OK, so what is the problem?
The problem is that the connections to my DB don't close after I use them. I thought that since my server is stateless, the connections would end after each use of the DB, but that is not true! They stay alive, and after a few hours of using my app MongoDB Atlas sends me an email saying that the connection limit is exceeded.
As you can see in this screenshot, this chart is ever growing. That means connections stay open and accumulate. How do you think I can solve this problem?
Keep in mind that it uses cachedConnection only when I reuse the same connection. If I call a different API than the first one, it creates another connection; it doesn't enter the if (cachedConnection) block, but goes on to the end.
You can try this simple demo, which will allow you to use the same connection throughout the application in different modules. There are three modules: index.js is the starter program, dbaccess.js is where you create and maintain a connection that can be used again and again, and apis.js is where you use the database connection to retrieve data.
index.js:
const express = require('express');
const mongo = require('./dbaccess');
const apis = require('./apis');
const app = express();

const init = async () => {
  await mongo.connect();
  app.listen(3000);
  apis(app, mongo);
};

init();
dbaccess.js:
const { MongoClient } = require('mongodb');
class Mongo {
  constructor() {
    this.client = new MongoClient("mongodb://127.0.0.1:27017/", {
      useNewUrlParser: true,
      useUnifiedTopology: true
    });
  }
  async connect() {
    await this.client.connect();
    console.log('Connected to MongoDB server.');
    this.db = this.client.db('test');
    console.log('Database:', this.db.databaseName);
  }
}
module.exports = new Mongo();
apis.js:
module.exports = function(app, mongo) {
  app.get('/', function(req, res) {
    mongo.db.collection('users').find().limit(1).toArray(function(err, result) {
      res.send('Doc: ' + JSON.stringify(result));
    });
  });
}
Change the appropriate values in the url, database name and collection name before trying.

Multi-tenant with mongoose and Node.js. One or multiple connection?

We are building a SaaS application where we need, for technical reasons (backup, security), to adopt a multi-tenant architecture (aka one DB per customer). We are using Node.js (with TypeScript) and MongoDB (with the mongoose driver).
For a first test, we are doing as below. Please note that it is a simple draft, easily reproducible.
We start a new connection in index.ts, the "entry-point":
try {
  // Define Mongo connection options
  const mongoOptions = {
    useNewUrlParser: true,
    useCreateIndex: true,
    useUnifiedTopology: true,
    useFindAndModify: false,
    autoIndex: true,
    poolSize: 10,
    bufferMaxEntries: 0,
    connectTimeoutMS: 10000,
    socketTimeoutMS: 30000,
  };
  // Creating mongo connection
  const connection = await mongoose.connect(
    `mongodb://${process.env.MONGO_HOSTNAME}:${process.env.MONGO_PORT}`,
    mongoOptions
  );
  // connectToMongoDB;
  console.log('Successfully connected to mongo database!');
} catch (err) {
  console.log(err);
}
Then, in a separate file, we have some logic for selecting the appropriate database and returning the model:
import mongoose from 'mongoose';
export class ClientDbProvider {
  static async getTenantDb(
    tenantId: string,
    modelName: string,
    schema: mongoose.Schema
  ) {
    const dbName = `spearateDB_${tenantId}`;
    const db = mongoose.connection.useDb(dbName, { useCache: true });
    db.model(modelName, schema);
    return db;
  }

  static async getModelForDb(
    databaseName: string,
    model: mongoose.Model<mongoose.Document>,
    schema: mongoose.Schema
  ) {
    const connection = await ClientDbProvider.getTenantDb(
      databaseName,
      model.modelName,
      schema
    );
    return connection.model(model.modelName);
  }
}
Then, in any route, we include the clientID to use the appropriate DB.
router.post('/api/data/:clientID', async (req: Request, res: Response) => {
  const { name, value } = req.body;
  const data = Data.build({
    name: name,
    value: value,
  });
  try {
    const dataModel = await ClientDbProvider.getModelForDb(
      req.params.clientID,
      Data,
      DataSchema
    );
    const doc = await dataModel.create(data);
  } catch (err) {
    console.log(err);
  }
  res.send({});
});
Basically, we call ClientDbProvider.getModelForDb to get a model. getModelForDb switches to a different database using the same connection pool and returns a model.
Note on the app:
the database will be continuously filled with data, as it will store telemetry data from several sensors (some can generate data every second, some every 10 minutes...)
The API will be mostly for reading data (sending it as JSON).
Some queries could be long, as it depends on how much data the client asks for (even if we will set some default limits).
We will never have a huge number of customers (DBs). We plan to have, for the next two years, 20 to 40 customers (we will upgrade to another architecture if needed, such as a sharded cluster). In any case, the number of customers will never be 1,000,000 or so...
All DBs are on the same server/mongo instance (for now)
Questions:
Could this 'draft' code cause some trouble or performance issues?
Would it make sense to create a connection per DB (customer) with the mongoose.createConnection function and then cache the connections as described here (in a global variable, for example)?
As all of our DBs are on the same server, isn't increasing the connection pool size sufficient?
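For the connection-per-tenant variant asked about in the second question, the caching part can be sketched roughly like this. This is a sketch, not mongoose-specific: the createConnection factory is injected, so with mongoose you would pass a wrapper around mongoose.createConnection; the URI template and all names are illustrative:

```javascript
// Sketch: one cached connection per tenant, created lazily on first use.
// `createConnection` is an injected async factory so the caching logic
// stays library-agnostic.
const tenantConnections = new Map();

async function getTenantConnection(tenantId, createConnection) {
  if (tenantConnections.has(tenantId)) {
    return tenantConnections.get(tenantId);
  }
  // Illustrative URI template; one DB per customer, as in the question.
  const conn = await createConnection(`mongodb://localhost:27017/tenant_${tenantId}`);
  tenantConnections.set(tenantId, conn);
  return conn;
}
```

With 20-40 tenants this keeps the connection count bounded by the number of tenants times the pool size per connection, which is the trade-off to weigh against a single shared pool with useDb().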

Express.js: create additional mongodb DB in controller

I'm new to MongoDB.
When I create my Node.js server I use only one DB connection (I connect to it on start).
But imagine: I have one database with some generic tables, and more databases, one for each custom client.
How can I create those DBs at runtime?
start.js:
const mongoose = require("mongoose");
// import environmental variables from variables.env file
require("dotenv").config({ path: "variables.env" });
mongoose.connect(process.env.DATABASE);
mongoose.Promise = global.Promise;
mongoose.connection.on("error", err => {
  console.error(`🚫 → ${err.message}`);
});

require("./models/MaintenanceType");
require("./models/Maintenance");

const app = require("./app");
app.set("port", process.env.PORT || 7777);
const server = app.listen(app.get("port"), () => {
  console.log('started');
});
variables.env (example):
NODE_ENV=development
DATABASE=mongodb://db:qwe123@sometest.server.com:412345/webtest
PORT=1234
SECRET=webtest
KEY=webtestcom
and controller:
const mongoose = require("mongoose");
const Maintenance = mongoose.model("Maintenance");
exports.createMaintenance = async (req, res) => {
  const maintenance = await new Maintenance(req.body).save();
  // ALSO create a db and table if not exists for this client and use it somehow
  res.json(maintenance);
};
Is it possible to do this?
You can create a new connection:
mongoose.connect('URI_FOR_ANOTHER_DATABASE')
But it's a bad idea to create new connections, so the driver has a feature to use the existing connection to query another database; for this purpose you can check the useDb() method as shown here
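A minimal sketch of that approach. getClientModel and the client_ DB-name prefix are illustrative names; connection stands for the existing mongoose connection, whose useDb() reuses the same underlying socket pool:

```javascript
// Sketch: derive a per-client database from an existing mongoose-style
// connection via useDb(), then compile the model against it.
// useCache: true makes repeated calls return the same DB handle.
function getClientModel(connection, clientId, modelName, schema) {
  const clientDb = connection.useDb(`client_${clientId}`, { useCache: true });
  return clientDb.model(modelName, schema);
}
```

In the controller above you would then call something like getClientModel(mongoose.connection, clientId, "Maintenance", MaintenanceSchema) instead of opening a second mongoose.connect.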