NestJS stops working after a few requests

I'm running into a serious issue: NestJS stops responding after a few requests. I'm using Postman to call a specific route, and after fewer than 10 requests to the same route it becomes extremely slow and every Postman request times out.
main.ts:
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import { Logger } from '@nestjs/common';
import * as rateLimit from 'express-rate-limit';
import { WrapContentInterceptor } from './dashboard/dashboard.interceptors';

const PORT = 5000;
const TAG = 'main';

async function bootstrap() {
  const logger = new Logger(TAG);
  const app = await NestFactory.create(AppModule);
  // app.useGlobalInterceptors(new WrapContentInterceptor())
  app.enableCors();
  app.use(
    rateLimit({
      windowMs: 15 * 60 * 1000, // 15 minutes
      max: 1000, // limit each IP to 1000 requests per windowMs
    }),
  );
  await app.listen(PORT);
  logger.log(`listening port::${PORT}`);
}
bootstrap();
The controller where the issue occurs:

// some regular imports
@Controller('dashboard')
@UseGuards(AuthGuard()) // protecting routes
export class DashboardController {
  constructor(
    private dashboardService: DashboardService,
  ) {}

  // other routes

  @Get('/cockpit/general-numbers') // route where the issue occurs
  async getNumbersForCockpit(
    @Query('desiredDate', ParseStringToDatePipe) desiredDay?: Date,
  ): Promise<GeneralNumbersForCockpit> {
    this.logger.log(`getNumbersForCockpit::${desiredDay?.toISOString()}`)
    let installation = await this.dashboardService.getInstallationsOfDay(desiredDay)
    let averageTicket = await this.dashboardService.getAverageTicketPlanFromInterval(
      desiredDay ? getFirstDayOfMonth(desiredDay) : undefined,
      desiredDay,
    )
    return {
      averageTicket: averageTicket,
      installation: installation.result,
    }
  }
}
PS: I noticed something else: this error only happens on routes where the service layer runs a raw SQL query against a MariaDB database running in a Docker container.
For example, at the service layer:
```
async getData(dateField: Date = new Date()): Promise<SimpleNumericalData> {
  // this.logger.debug(`f::dateField:${dateField}`)
  this.logger.debug(`getData::dateField:${dateField.toISOString()}`)
  const queryRunner = this.connection.createQueryRunner()
  let res = await queryRunner.manager.query(`
    select count(*) as result from my_table
    where my_field = 'I'
    and DATE(date_field) = DATE('${dateField.toISOString()}')
  `)
  // await this.connection.close()
  // this.logger.debug(`Result:${JSON.stringify(res)}`)
  return {
    result: Number(res[0]['result'])
  }
}
```

I think your issue comes from the way you use the db connection. You created around 6-10 connections but never released them, and MariaDB's default connection pool limit is (I believe) 10. Eventually a new request tries to open a new connection, hits the limit, and waits forever for another connection to be released.
In this case you can also raise the limit (not the root cause, but good to know): add connectionLimit under extra in ormconfig.json (or wherever you create the db config):
```
{
  ...,
  "extra": { "connectionLimit": 10, ... }
}
```
More importantly, you have to release the connection you create in the getData function, right after the query finishes (whether it succeeds or fails):

```
...
await queryRunner.release()
```

Note: to make sure the release also happens when the query throws an error, you can use try/catch/finally, or a `.finally` on the query promise:

```
...
let res = await queryRunner.manager.query(`
  select count(*) as result from my_table
  where my_field = 'I'
  and DATE(date_field) = DATE('${dateField.toISOString()}')
`).finally(() => queryRunner.release())
...
```
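Putting it together, a minimal runnable sketch of the release pattern. The TypeORM wiring is replaced by a stand-in query runner object, so this only illustrates the try/finally shape, not the real API surface:

```javascript
// Stand-in for a TypeORM QueryRunner: records whether release() was called.
function makeFakeQueryRunner(shouldFail) {
  return {
    released: false,
    async query(sql) {
      if (shouldFail) throw new Error('query failed');
      return [{ result: '42' }];
    },
    async release() {
      this.released = true;
    },
  };
}

// The pattern from the answer: always release, even when the query throws.
async function getData(queryRunner) {
  try {
    const res = await queryRunner.query('select count(*) as result from my_table');
    return { result: Number(res[0]['result']) };
  } finally {
    await queryRunner.release();
  }
}

(async () => {
  const ok = makeFakeQueryRunner(false);
  console.log(await getData(ok), ok.released); // { result: 42 } true

  const bad = makeFakeQueryRunner(true);
  try {
    await getData(bad);
  } catch (e) {
    console.log('caught:', e.message, 'released:', bad.released); // caught: query failed released: true
  }
})();
```

With this shape the runner goes back to the pool on every code path, so the pool limit is never exhausted.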

Related

Next js API + MongoDB error 431 on dynamic request

I'm getting a 431 (Request Header Fields Too Large) on some API calls within a full-stack Next.js project. It only occurs on a dynamic API route (/author/get/[slug]), with the same result from both the frontend and Postman. The server runs locally, and other endpoints work fine with exactly the same fetching logic.
The request is never even handled by the Next API; no log appears anywhere.
The database is MongoDB and the API is plain JS. The goal is to fetch a single author (this will later move into getStaticProps).
The API call looks like this (no headers whatsoever):
try {
  const res = await fetch(`http://localhost:3000/api/author/get/${slug}`, { method: "GET" })
  console.log(res)
} catch (error) { console.log(error) }

And the endpoint:

// author/get/[slug].js
import { getClient } from "../../../../src/config/mongodb-config";

export default async function handler(req, res) {
  const { query } = req
  const { slug } = query
  if (req.method !== 'GET') {
    return
  }
  const clientPromise = await getClient()
  const author = clientPromise.db("database").collection("authors").findOne({ 'slug': slug })
  res.status(200).json(author)
  await clientPromise.close()
}
Tried without success: removing a nesting level (making the path /author/[slug]).
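A side note on the endpoint above (not the cause of the 431, which happens before the handler even runs): `findOne` returns a promise, and since it isn't awaited, `res.json(author)` serializes a pending promise, which JSON-stringifies to an empty object. A quick illustration:

```javascript
// A pending promise has no enumerable own properties, so it serializes to "{}".
const pending = Promise.resolve({ name: "some author" });
console.log(JSON.stringify(pending)); // {}

// Awaiting first yields the actual document.
(async () => {
  const doc = await pending;
  console.log(JSON.stringify(doc)); // {"name":"some author"}
})();
```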

Hooks.js running the db connection and results twice in sveltekit

I'm using SvelteKit and trying to understand all the new features added after Sapper was retired. One of them is hooks.js, which runs on the server and isn't accessible from the frontend, which makes dealing with the db safe. So I created a connection to my MongoDB to retrieve the user's data before using the db results in my getSession function. It works, but I noticed that it accesses my database TWICE. Here is my hooks.js code:
import * as cookie from 'cookie';
import { connectToDatabase } from '$lib/mongodb.js';
export const handle = async ({event, resolve})=>{
const dbConnection = await connectToDatabase();
const db = dbConnection.db;
const userinfo = await db.collection('users').findOne({ username: "a" });
console.log("db user is :" , userinfo) //username : John
const response = await resolve(event)
response.headers.set(
'set-cookie', cookie.serialize("cookiewithjwt", "sticksafterrefresh")
)
return response
}
export const getSession = (event)=>{
return {
user : {
name : "whatever"
}
}
}
The console.log you see here prints the user data twice: once as soon as I fire up the app at localhost:3000 with npm run dev,
db user is : John
and then, less than a second later, without clicking on anything, a second console.log prints
db user is : John
My understanding from the SvelteKit docs is that hooks.js runs every time SvelteKit receives a request. I removed all prerender and prefetch from my code and made sure I only have index.svelte in my app, but it still prints twice. The connection code I copied from an online post contains the following:
/**
* Global is used here to maintain a cached connection across hot reloads
* in development. This prevents connections growing exponentially
* during API Route usage.
*/
Here is my connection code:
import { MongoClient } from 'mongodb';
const mongoURI = "mongodb+srv://xxx:xxx@cluster0.qjeag.mongodb.net/xxxxdb?retryWrites=true&w=majority";
const mongoDB = "xxxxdb"
export const MONGODB_URI = mongoURI;
export const MONGODB_DB = mongoDB;
if (!MONGODB_URI) {
throw new Error('Please define the mongoURI property inside config/default.json');
}
if (!MONGODB_DB) {
throw new Error('Please define the mongoDB property inside config/default.json');
}
/**
* Global is used here to maintain a cached connection across hot reloads
* in development. This prevents connections growing exponentially
* during API Route usage.
*/
let cached = global.mongo;
if (!cached) {
  cached = global.mongo = { conn: null, promise: null };
}

export const connectToDatabase = async () => {
  if (cached.conn) {
    return cached.conn;
  }
  if (!cached.promise) {
    const opts = {
      useNewUrlParser: true,
      useUnifiedTopology: true
    };
    cached.promise = MongoClient.connect(MONGODB_URI).then((client) => {
      return {
        client,
        db: client.db(MONGODB_DB)
      };
    });
  }
  cached.conn = await cached.promise;
  return cached.conn;
}
So my question is: does hooks.js always run twice, once on the server and once on the front? If not, why is hooks.js running and printing the db results twice in my case?
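As the question itself notes, handle runs once per incoming HTTP request, not once per page view, so any second request the browser makes (for example the automatic /favicon.ico fetch, or a link prefetch) runs it again. A quick way to confirm is to log which path triggered each invocation; a minimal sketch with a stand-in event and resolve, since this runs outside SvelteKit:

```javascript
// Stand-in for SvelteKit's handle hook: record the path of every request served.
const seenPaths = [];

const handle = async ({ event, resolve }) => {
  seenPaths.push(event.url.pathname); // identify what actually triggered this run
  return resolve(event);
};

// Simulate two requests: the page itself, then the browser's favicon fetch.
(async () => {
  const fakeResolve = async () => ({ status: 200 });
  await handle({ event: { url: new URL('http://localhost:3000/') }, resolve: fakeResolve });
  await handle({ event: { url: new URL('http://localhost:3000/favicon.ico') }, resolve: fakeResolve });
  console.log(seenPaths); // [ '/', '/favicon.ico' ]
})();
```

In the real hook, a `console.log(event.url.pathname)` at the top would reveal whether the second db hit comes from a second page request or from some other resource.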

NodeJS: Controlling the order of loading modules

Here is the scenario:
I have 3 files (modules):
app.ts
(async () => {
  await connectToDB();
  let newRec = new userModel({
    ...someprops
  });
  await newRec.save();
})();
The app.ts is the entry point of the project.
database.ts
interface ConnectionInterface {
  [name: string]: mongoose.Connection;
}

export class Connection {
  public static connections: ConnectionInterface;

  public static async setConnection(name: string, connection: mongoose.Connection) {
    Connection.connections = {
      ...Connection.connections,
      [name]: connection,
    };
  }
}

export async function connectToDB() {
  const conn = await mongoose.createConnection('somePath');
  await Connection.setConnection('report', conn);
}
model.ts
const userSchema = new mongoose.Schema(
  {
    ...someprops
  },
);

const userModel = Connection.connections.report.model('User', userSchema);
export default userModel;
What I'm trying to do: I need multiple mongoose connections, so I use a static prop called connections on the Connection class (in database.ts). Every time I connect to a database I use setConnection to store the connection in that static prop, so I can access it from every module in my project by name, which is report in this case.
Later, in model.ts, I use Connection.connections.report to access the report connection and load my model.
Then, when I run app.ts, I get the following error, which makes sense:
const aggregationModel = Connection.connections.report.model('User', userSchema)
^
TypeError: Cannot read property 'report' of undefined
The cause (I think) is that while the modules imported by app.ts are being loaded, .report is not yet defined, because app.ts hasn't run to completion (connectToDB() is what defines the .report key).
The code shown has been simplified to avoid complexity; the original app is an Express app.
Now, how should I solve this error?
Thanks in advance.
You can wait for the connection to finish before using it if you change up your class slightly.

const connection = await Connection.getConnection()
const model = connection.example
...
class Connection {
  ...
  public static async getConnection() {
    if (!Connection.connection) {
      await Connection.setConnection()
    }
    return Connection.connection
  }
}
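A runnable sketch of this lazy pattern in plain JS, following the question's report example. The mongoose call is replaced by a stand-in fakeConnect, so only the caching shape is real here:

```javascript
// Stand-in for mongoose.createConnection: resolves to a fake connection object.
async function fakeConnect(name) {
  return { name, model: (modelName) => `${name}:${modelName}` };
}

class Connection {
  static connections = {};

  // Lazily create and cache a named connection on first use.
  static async getConnection(name) {
    if (!Connection.connections[name]) {
      Connection.connections[name] = await fakeConnect(name);
    }
    return Connection.connections[name];
  }
}

// Consumers await the connection instead of reading a static prop at import time.
(async () => {
  const conn = await Connection.getConnection('report');
  console.log(conn.model('User')); // report:User
  // A second call reuses the cached connection.
  console.log((await Connection.getConnection('report')) === conn); // true
})();
```

The key difference from the question's model.ts is that nothing touches the connection during module loading; the first await at call time is what creates it.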

Infura calls eth_getBlockByNumber ten times for each eth_call

I am trying to make a simple Next.js API route (https://nextjs.org/docs/api-routes/introduction) that connects to the Ethereum blockchain and performs a view function (requires no gas) on a smart contract.
I have a system where you can buy the rights to mint an NFT (ERC721), and this function checks whether the user has paid for any collectionIds that are not yet minted.
import Web3 from 'web3'
import { getPaidForCollectionsIds } from '../../database'

const mnemonic2 = 'main check ...'
var HDWalletProvider = require('truffle-hdwallet-provider')

export default async function (req, res) {
  const paidFor = await getPaidForCollectionsIds(req.body.userId)
  if (paidFor.length < 1) return res.json({ data: [] })
  const provider = new HDWalletProvider(mnemonic2, 'https://rinkeby.infura.io/v3/INFURAAPIKEY', 0)
  const web3 = new Web3(provider)
  const TheContractAddress = '0xfbeF...'
  const { abi } = require('../../abis/TheContract.json')
  const KS = new web3.eth.Contract(abi, TheContractAddress, {
    from: '0x5EE...', // default from address
    gasPrice: '20000000000' // default gas price in wei, 20 gwei in this case
  })
  const unminted = []
  await Promise.all(paidFor.data.map(async pf =>
    KS.methods.readCollectionIdIsUsed(pf.collectionId).call().then(d => {
      console.log(d, 'readCollectionIdIsUsed', pf.collectionId)
    }).catch(e => {
      unminted.push(sign)
    })
  ))
  res.statusCode = 200
  res.json({ data: unminted })
}
Here is the code for the readCollectionIdIsUsed method in the smart contract:

mapping (uint256 => bool) collectionIdIsUsed;

function readCollectionIdIsUsed(uint256 collectionId) external view returns (bool res) {
    require(collectionIdIsUsed[collectionId], 'This signature has not been used');
    res = collectionIdIsUsed[collectionId];
}
This all works fine, except that after a while I hit Infura's 100,000 request limit.
[screenshot: Infura top methods]
I don't know why it calls eth_getBlockByNumber 10 times for each call. Is this necessary, or is there a way around it?
Web3.js should not do this for calls, but does for sends.
This is because when you await any web3.js contract method, there is an internal, somewhat unexpected, implied "wait X blocks before the tx is confirmed" mechanism, and somehow this gets triggered, although your example code does not appear to contain any contract writes. It is documented here.
Because you are using an HTTPS connection instead of a WebSocket connection, Web3.js has to poll for new block numbers to count confirmations. If you switch to the Infura WebSocket provider, these calls should disappear, since Web3.js can simply subscribe to new-block events over the WebSocket.
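For reference, the provider switch is essentially a configuration change; a sketch assuming web3.js 1.x and Infura's Rinkeby WebSocket endpoint (the INFURAAPIKEY placeholder is from the question, not a real key):

```javascript
const Web3 = require('web3')

// wss:// endpoint instead of https:// — same project ID, extra "/ws" path segment.
const provider = new Web3.providers.WebsocketProvider(
  'wss://rinkeby.infura.io/ws/v3/INFURAAPIKEY'
)
const web3 = new Web3(provider)
// Contract calls now go over the socket, and new block headers arrive as
// subscription events instead of repeated eth_getBlockByNumber polls.
```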

Why is my data import to Algolia search using the API script timing out?

I am trying to implement single-index searching with Algolia for my iOS mobile app. I have about 110 users on my application. However, when I upload their data to the Algolia index, the function times out before uploading all users: it throws an error in the browser and reports a timeout in the Firestore console.
Firestore console:
sendCollectionToAlgolia
Function execution took 60044 ms, finished with status: 'timeout'
I created the function using this tutorial:
https://medium.com/@soares.rfarias/how-to-set-up-firestore-and-algolia-319fcf2c0d37
Although I ran into some complications, I highly recommend that tutorial if your app targets SwiftUI on iOS and you implement cloud functions in TypeScript.
Here's my function:
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
import algoliasearch from 'algoliasearch';

admin.initializeApp();
const db = admin.firestore();

const algoliaClient = algoliasearch(functions.config().algolia.appid, functions.config().algolia.apikey)
const collectionIndexName = functions.config().projectId === 'PROJECT-XXXX' ? 'prod_SEARCH' : 'dev_SEARCH';
const collectionIndex = algoliaClient.initIndex(collectionIndexName);

//rename to uploadUsersToAlgolia
export const sendCollectionToAlgolia = functions.https.onRequest(async (req, res) => {
  const algoliaRecords: any[] = [];
  const querySnapshot = await db.collection('users').get();

  querySnapshot.docs.forEach(doc => {
    const document = doc.data();
    const record = {
      objectID: doc.id,
      fullname: document.fullname,
      bio: document.bio,
      username: document.username,
      uid: document.uid,
      profileImageURL: document.profileImageURL,
      backgroundImageURL: document.backgroundImageURL,
      fcmToken: document.fcmToken,
      accountCreated: document.accountCreated,
      inspirationCount: document.inspriationCount,
      BucketListCount: document.BucketListCount,
      CompletedBucketListCount: document.CompletedBucketListCount,
      FriendsCount: document.FriendsCount
    };
    algoliaRecords.push(record);
  });

  // After all records are created, we save them to Algolia
  collectionIndex.saveObjects(algoliaRecords, (_error: any, content: any) => {
    res.status(200).send("users collection was indexed to Algolia successfully.");
  });
});
If you just want to change the default 1-minute timeout, you can do that when you configure the function:
functions.runWith({ timeoutSeconds: X }).https.onRequest(async (req, res) => ...)
Increasing the timeout won't help if your function never ends up sending a response, so you should also add some logging/debugging to figure out whether the final call to res.send() actually happens. If the function never sends a response, it will definitely time out no matter what happens.
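A sketch combining both suggestions, assuming the v4 algoliasearch client (where saveObjects returns a promise) and a hypothetical 300-second timeout. It reuses `functions`, `db`, and `collectionIndex` from the question's setup, so it is not runnable outside that Firebase project; the `...doc.data()` spread is a simplification of the question's explicit field list:

```javascript
// Sketch only: raise the default 60s timeout and only respond once the upload
// has actually finished (or failed), so the function never hangs silently.
exports.sendCollectionToAlgolia = functions
  .runWith({ timeoutSeconds: 300 })
  .https.onRequest(async (req, res) => {
    const querySnapshot = await db.collection('users').get();
    const algoliaRecords = querySnapshot.docs.map((doc) => ({
      objectID: doc.id,
      ...doc.data(),
    }));
    try {
      await collectionIndex.saveObjects(algoliaRecords);
      res.status(200).send('users collection was indexed to Algolia successfully.');
    } catch (err) {
      console.error(err); // surface failures instead of waiting out the timeout
      res.status(500).send('indexing failed');
    }
  });
```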
