I'm using SvelteKit and trying to understand all the new features added after Sapper was retired. One of those features is hooks.js, which runs on the server and is not accessible to the frontend, so it makes dealing with the db safe. I created a connection to my MongoDB to retrieve the user's data before using the db results in my getSession function. It works, but I noticed that it accesses my database TWICE. Here is my hooks.js code:
import * as cookie from 'cookie';
import { connectToDatabase } from '$lib/mongodb.js';
export const handle = async ({ event, resolve }) => {
    const dbConnection = await connectToDatabase();
    const db = dbConnection.db;
    const userinfo = await db.collection('users').findOne({ username: "a" });
    console.log("db user is :", userinfo); // username: John
    const response = await resolve(event);
    response.headers.set(
        'set-cookie', cookie.serialize("cookiewithjwt", "sticksafterrefresh")
    );
    return response;
};

export const getSession = (event) => {
    return {
        user: {
            name: "whatever"
        }
    };
};
The console.log you see here prints the user data twice. The first appears as soon as I fire up my app at localhost:3000 with npm run dev:

db user is : John

Less than a second later, without my clicking on anything, a second console.log prints the same information:

db user is : John
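One way to narrow this down is to log which request triggers the hook. As a minimal sketch, assuming a SvelteKit version where handle receives an event exposing the requested URL (as the destructuring above suggests):

```js
export const handle = async ({ event, resolve }) => {
    // log the path of every request that reaches this server hook,
    // to see whether the two logs come from one URL or from two
    // different requests (e.g. the page plus an asset request)
    console.log('handle called for:', event.url.pathname);
    return resolve(event);
};
```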
So my understanding from the SvelteKit docs is that hooks.js runs every time SvelteKit receives a request. I removed all prerender and prefetch code, and I made sure I only have index.svelte in my app, but it still prints twice. The connection code, which I copied from an online post, contains the following comment:
/**
* Global is used here to maintain a cached connection across hot reloads
* in development. This prevents connections growing exponentially
* during API Route usage.
*/
Here is my connection code:
import { MongoClient } from 'mongodb';

const mongoURI = "mongodb+srv://xxx:xxx@cluster0.qjeag.mongodb.net/xxxxdb?retryWrites=true&w=majority";
const mongoDB = "xxxxdb";

export const MONGODB_URI = mongoURI;
export const MONGODB_DB = mongoDB;

if (!MONGODB_URI) {
    throw new Error('Please define the mongoURI property inside config/default.json');
}
if (!MONGODB_DB) {
    throw new Error('Please define the mongoDB property inside config/default.json');
}
/**
* Global is used here to maintain a cached connection across hot reloads
* in development. This prevents connections growing exponentially
* during API Route usage.
*/
let cached = global.mongo;
if (!cached) {
    cached = global.mongo = { conn: null, promise: null };
}

export const connectToDatabase = async () => {
    if (cached.conn) {
        return cached.conn;
    }
    if (!cached.promise) {
        const opts = {
            useNewUrlParser: true,
            useUnifiedTopology: true
        };
        cached.promise = MongoClient.connect(MONGODB_URI, opts).then((client) => {
            return {
                client,
                db: client.db(MONGODB_DB)
            };
        });
    }
    cached.conn = await cached.promise;
    return cached.conn;
};
So my question is: does hooks.js always run twice, once on the server and once on the front end? If not, why is hooks.js running and printing the db results twice in my case?
Related
I use Next.js 13 Server Components to query MongoDB in the component. The data is rendered to the page.
If something changes in the database and I refresh the page, the changes should also be reflected on the page.
This works, but only in development mode (npm run dev).
When I start the server with npm run start, new database data is not reflected on the page; it is fetched just one time.
With npm run build the route is marked as Static. Is this correct?
/lib/mongodb.js
import { MongoClient } from 'mongodb'

if (!process.env.MONGODB_URI) {
    throw new Error('Invalid/Missing environment variable: "MONGODB_URI"')
}

const uri = process.env.MONGODB_URI
const options = {}

let client
let clientPromise

console.log("DEVELOPMENT MODE MONGODB")

// In development mode, use a global variable so that the value
// is preserved across module reloads caused by HMR (Hot Module Replacement).
if (!global._mongoClientPromise) {
    client = new MongoClient(uri, options)
    global._mongoClientPromise = client.connect()
}
clientPromise = global._mongoClientPromise

// Export a module-scoped MongoClient promise. By doing this in a
// separate module, the client can be shared across functions.
export default clientPromise
tickets.js
This component runs on the server
import clientPromise from "../../../lib/mongodb";

export default async function Tickets() {
    const client = await clientPromise;
    const db = client.db('sample_mflix');
    const collection = db.collection('contact');
    const result = await collection.find().toArray();
    console.log("Database fetched");
    return (
        <>
            {result.map(el => {
                return (
                    <div key={el._id}>
                        {el.name}
                        {el.email}
                        {el.message}
                    </div>
                );
            })}
        </>
    );
}
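Note that in the Next.js 13 app router, a page whose data fetching the build can resolve statically may be prerendered once at build time, which would match the behaviour described above. As a hedged sketch, assuming this component backs a route in the app directory (the path below is hypothetical), the route segment config can force dynamic rendering:

```js
// app/tickets/page.js (hypothetical path)
// Opt this route out of static rendering so the database is
// queried on every request instead of once at npm run build.
export const dynamic = 'force-dynamic';
// or equivalently: export const revalidate = 0;
```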
I am trying to use the revalidate function. I tried to follow the code that Vercel offers, but I keep getting an error. Here is the function that I am using:
export async function getServerSideProps() {
    const client = await clientPromise;
    const db = client.db("myFirstDatabase");
    let users = await db.collection("users").find({}).toArray();
    users = JSON.parse(JSON.stringify(users));
    return {
        props: {
            users,
        },
        revalidate: 15,
    };
}
And here is the mongodb file that returns the client:
import { MongoClient } from 'mongodb'

const uri = process.env.MONGODB_URI
const options = {
    useUnifiedTopology: true,
    useNewUrlParser: true,
}

let client
let clientPromise

if (!process.env.MONGODB_URI) {
    throw new Error('Please add your Mongo URI to .env.local')
}

if (process.env.NODE_ENV === 'development') {
    // In development mode, use a global variable so that the value
    // is preserved across module reloads caused by HMR (Hot Module Replacement).
    if (!global._mongoClientPromise) {
        client = new MongoClient(uri, options)
        global._mongoClientPromise = client.connect()
    }
    clientPromise = global._mongoClientPromise
} else {
    // In production mode, it's best to not use a global variable.
    client = new MongoClient(uri, options)
    clientPromise = client.connect()
}

export default clientPromise
I have been able to connect to the database and the code works fine if I remove the revalidate part. The error that I get is :
Error: Additional keys were returned from getServerSideProps. Properties intended for your component must be nested under the props key, e.g.:

return { props: { title: 'My Title', content: '...' } }

Keys that need to be moved: revalidate.
Read more: https://nextjs.org/docs/messages/invalid-getstaticprops-value
I am not sure what I am doing wrong. I want to get data from the database and update it every 15 seconds. Any help would be greatly appreciated.
revalidate is for getStaticProps; you are using it in getServerSideProps, which does not allow it.
I recommend you look at this library: https://swr.vercel.app/
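For illustration, here is the same query moved into getStaticProps, where revalidate is valid, assuming the clientPromise module from the question:

```js
export async function getStaticProps() {
    const client = await clientPromise;
    const db = client.db("myFirstDatabase");
    let users = await db.collection("users").find({}).toArray();
    users = JSON.parse(JSON.stringify(users));
    return {
        props: { users },
        // regenerate the page in the background at most every 15 seconds
        revalidate: 15,
    };
}
```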
So I've been searching for a long time for mqtt.js examples of structure and best practices and haven't found anything worthwhile. Thus [main]: how do you structure your mqtt.js code in your node/express application?
[1] The mqttjs/async-MQTT libraries provide some examples of connecting and handling messages, but in a real app with lots of subscriptions and publishes, how do you structure the code so that it initializes in app.js and uses the same client (returned from mqtt.connect) for all the sub/pub in different files?
[2] Following from [1], should my app use only one client for everything, or can it use multiple clients as needed in multiple files? (Let's say I have 3 files: mqttInit, subscriber, publisher. If I run the init in subscriber and get a client, should I export it, or just make a new instance of a client in the publisher file?)
[3] The mqttjs API provides only an onMessage function, so messages for all subscribed topics arrive there, and I use a switch or if/else to manage them. If we have a lot of topics, how do you manage this? (See the sketch at the end of this question.)
[4] My current setup is kind of messed up.
This is the initializer file, let's say:
mqttService.js
const mqtt = require("mqtt");
const { readFileSync } = require("fs");

module.exports = class mqttService {
    constructor() {
        this.client = mqtt.connect("mqtt://xxxxxxxxxxx", {
            cert: readFileSync(process.cwd() + "/certificates/client.crt"),
            key: readFileSync(process.cwd() + "/certificates/client.key"),
            rejectUnauthorized: false,
        });
        this.client.on("error", (err) => {
            console.log(err);
        });
        this.client.once("connect", () => {
            console.log("connected to MQTT server");
        });
    }
};
subscriber.js
This is the function (subscribe()) that I call in app.js to initialize the mqtt side of things:
const { sendDeviceStatus, sendSensorStatus } = require("../socketApi");
const { client } = new (require("./mqttService"))();

function subscribe() {
    let state = {
        timer: false,
    };
    ...
    let topics = {
        ....
    };
    client.subscribe([...]);
    client.on("message", async (topic, buffer) => {
        if (topic) {
            ...
        }
    });
}

module.exports = {
    subscribe,
    client,
};
publish.js
const { AsyncClient } = require("async-mqtt");
const _client = require("./subscribe").client;
const client = new AsyncClient(_client);

async function sendSensorList(daqId) {
    let returnVal = await client.publish(
        `${daqId}-GSL-DFC`,
        JSON.stringify(publishObject),
        { qos: 1 }
    );
    console.log(returnVal);
    return publishObject;
}
.....

module.exports = {
    sendSensorList,
    .......
};
So, as you can see from the above code, everything is linked together and messed up, which is why I need some pointers on how you structure code.
Thanks for reading; please feel free to share, and any info is much appreciated.
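For illustration only, here is a minimal sketch of the pattern asked about in [1]-[3]: one module creates and exports a single shared client, and messages are routed through a topic-to-handler map instead of one long if/else chain. All file names, the broker URL, and the topics are hypothetical:

```js
// mqttClient.js - create and export ONE shared client for the whole app
const mqtt = require("mqtt");
const client = mqtt.connect("mqtt://broker.example.com"); // hypothetical URL
module.exports = client;

// handlers.js - map each topic to the function that handles it
module.exports = {
    "sensors/temperature": (payload) => console.log("temp:", payload.toString()),
    "sensors/humidity": (payload) => console.log("humidity:", payload.toString()),
};

// app.js - subscribe once, dispatch by topic
const client = require("./mqttClient");
const handlers = require("./handlers");

client.on("connect", () => {
    client.subscribe(Object.keys(handlers));
});
client.on("message", (topic, payload) => {
    const handler = handlers[topic];
    if (handler) handler(payload);
});
```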
I'm getting a serious issue: NestJS stops working after a few requests. I'm using Postman to make a specific API call, and after fewer than 10 requests to the same route it gets extremely slow and I get a timeout from Postman every time.
main.ts:
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import { Logger } from '@nestjs/common';
import * as rateLimit from 'express-rate-limit';
import { WrapContentInterceptor } from './dashboard/dashboard.interceptors';

const PORT = 5000;
const TAG = 'main';

async function bootstrap() {
    const logger = new Logger(TAG);
    const app = await NestFactory.create(AppModule);
    // app.useGlobalInterceptors(new WrapContentInterceptor())
    app.enableCors();
    // register middleware before listen() so every request goes through it
    app.use(
        rateLimit({
            windowMs: 15 * 60 * 1000, // 15 minutes
            max: 1000, // limit each IP to 1000 requests per windowMs
        }),
    );
    await app.listen(PORT);
    logger.log(`listening port::${PORT}`);
}
bootstrap();
controller where I'm getting the issue:
// some regular imports

@Controller('dashboard')
@UseGuards(AuthGuard()) // protecting routes
export class DashboardController {
    private logger = new Logger('DashboardController'); // referenced below

    constructor(
        private dashboardService: DashboardService,
    ) {}

    // other routes

    @Get('/cockpit/general-numbers') // route where I'm getting the issue
    async getNumbersForCockpit(
        @Query('desiredDate', ParseStringToDatePipe) desiredDay?: Date
    ): Promise<GeneralNumbersForCockpit> {
        this.logger.log(`getNumbersForCockpit::${desiredDay?.toISOString()}`);
        let installation = await this.dashboardService.getInstallationsOfDay(desiredDay ? desiredDay : undefined);
        let averageTicket = await this.dashboardService.getAverageTicketPlanFromInterval(
            desiredDay ? getFirstDayOfMonth(desiredDay) : undefined,
            desiredDay ? desiredDay : undefined
        );
        return {
            averageTicket: averageTicket,
            installation: installation.result
        };
    }
}
PS: I realized another thing:
This error is happening in the routes where I'm using raw SQL at the service layer, against a MariaDB database running in a Docker container.
For example, at the service layer:
```
async getData(dateField: Date = new Date()): Promise<SimpleNumericalData> {
    // this.logger.debug(`f::dateField:${dateField}`)
    this.logger.debug(`getData::dateField:${dateField.toISOString()}`);
    const queryRunner = this.connection.createQueryRunner();
    let res = await queryRunner.manager.query(`
        select count(*) as result from my_table
        where my_field = 'I'
        and DATE(date_field) = DATE('${dateField.toISOString()}')
    `);
    // await this.connection.close()
    // this.logger.debug(`Result:${JSON.stringify(res)}`)
    return {
        result: Number(res[0]['result'])
    };
}
```
I guess your issue comes from the way you use the db connection. You created around 6 to 10 connections to the db but did not release them, and the default limit of MariaDB's connection pool is 10 (I guess).
Eventually a new request wants to create a new connection, but the limit has been reached, so it waits for another connection to be released, and it waits forever.
In this case, you can extend the limit (not the root cause, but I think it is good to know).
Add connectionLimit under extra in ormconfig.json (or wherever you create your db config):

{
    ...,
    "extra": { "connectionLimit": 10, ... }
}
And you have to release the connection you just created in the getData function, right after the query finishes (success or error):

...
...
await queryRunner.release()
Note: take care of the case where your query throws an error; you can use try/catch/finally:

...
let res = await queryRunner.manager.query(`
    select count(*) as result from my_table
    where my_field = 'I'
    and DATE(date_field) = DATE('${dateField.toISOString()}')
`).finally(() => queryRunner.release())
...
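Written out with an explicit try/finally, here is a sketch of the same getData method from the question:

```ts
async getData(dateField: Date = new Date()): Promise<SimpleNumericalData> {
    const queryRunner = this.connection.createQueryRunner();
    try {
        const res = await queryRunner.manager.query(`
            select count(*) as result from my_table
            where my_field = 'I'
            and DATE(date_field) = DATE('${dateField.toISOString()}')
        `);
        return { result: Number(res[0]['result']) };
    } finally {
        // always return the connection to the pool, even if the query throws
        await queryRunner.release();
    }
}
```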
I am trying to implement single-index searching using Algolia search for my iOS mobile app. I have about 110 users on my application. However, when I upload their data to Algolia search's index, the function times out before uploading all users. Instead it throws an error message in the http browser and declares a timeout in the Firestore console.
Firestore console:
sendCollectionToAlgolia
Function execution took 60044 ms, finished with status: 'timeout'
I created the function using this tutorial:
https://medium.com/@soares.rfarias/how-to-set-up-firestore-and-algolia-319fcf2c0d37
Although I ran into some complications, I highly recommend that tutorial if your app uses the SwiftUI iOS platform and implements cloud functions using TypeScript.
Here's my function:
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
import algoliasearch from 'algoliasearch';

admin.initializeApp();
const db = admin.firestore();

const algoliaClient = algoliasearch(functions.config().algolia.appid, functions.config().algolia.apikey);
const collectionIndexName = functions.config().projectId === 'PROJECT-XXXX' ? 'prod_SEARCH' : 'dev_SEARCH';
const collectionIndex = algoliaClient.initIndex(collectionIndexName);

// rename to uploadUsersToAlgolia
export const sendCollectionToAlgolia = functions.https.onRequest(async (req, res) => {
    const algoliaRecords: any[] = [];
    const querySnapshot = await db.collection('users').get();

    querySnapshot.docs.forEach(doc => {
        const document = doc.data();
        const record = {
            objectID: doc.id,
            fullname: document.fullname,
            bio: document.bio,
            username: document.username,
            uid: document.uid,
            profileImageURL: document.profileImageURL,
            backgroundImageURL: document.backgroundImageURL,
            fcmToken: document.fcmToken,
            accountCreated: document.accountCreated,
            inspirationCount: document.inspriationCount,
            BucketListCount: document.BucketListCount,
            CompletedBucketListCount: document.CompletedBucketListCount,
            FriendsCount: document.FriendsCount
        };
        algoliaRecords.push(record);
    });

    // After all records are created, we save them to the index
    collectionIndex.saveObjects(algoliaRecords, (_error: any, content: any) => {
        res.status(200).send("users collection was indexed to Algolia successfully.");
    });
});
If you just want to change the default 1-minute timeout, you can do that when you configure the function:

functions.runWith({ timeoutSeconds: X }).https.onRequest(async (req, res) => ...)
Increasing the timeout won't help if your function doesn't end up sending a response, so you should also add some logging/debugging to figure out if the final call to res.send() is actually happening. If the function never sends a response, it will definitely time out no matter what happens.
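Putting both suggestions together, here is a minimal sketch that always sends a response, assuming the v4 algoliasearch client (where saveObjects returns a promise you can await) and reusing db and collectionIndex from the question:

```ts
export const sendCollectionToAlgolia = functions
    .runWith({ timeoutSeconds: 300 }) // raise the limit from the default 60s
    .https.onRequest(async (req, res) => {
        try {
            const querySnapshot = await db.collection('users').get();
            const algoliaRecords = querySnapshot.docs.map(doc => ({
                objectID: doc.id,
                ...doc.data(),
            }));
            await collectionIndex.saveObjects(algoliaRecords);
            res.status(200).send("users collection was indexed to Algolia successfully.");
        } catch (error) {
            // log the failure and still end the request so it cannot hang
            console.error(error);
            res.status(500).send("indexing failed");
        }
    });
```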