I'm new to Next.js and I'm creating an API in Next.js that performs a DB update using pg-promise. However, it always hits the WARNING: Creating a duplicate database object for the same connection in the console when the app calls the API.
I tried browsing the docs but couldn't find a solution. I also tried the solution (update-2) mentioned on the Stack Overflow page below, but the warning still appears.
Where should I initialize pg-promise
I think the problem is in the way I set up the ColumnSet, but I can't find the proper way to do it. How should I fix it with pg-promise?
Db setting code:
import ConfigEnv from 'utils/configuration';
import * as pgLib from 'pg-promise';
const initOptions = {
capSQL: true,
};
const pgp = require('pg-promise')(initOptions);
interface IDatabaseScope {
db: pgLib.IDatabase<any>;
pgp: pgLib.IMain;
}
export function createSingleton<T>(name: string, create: () => T): T {
const s = Symbol.for(name);
let scope = (global as any)[s];
if (!scope) {
scope = {...create()};
(global as any)[s] = scope;
}
return scope;
}
export function getDB(): IDatabaseScope {
return createSingleton<IDatabaseScope>('my-app-db-space', () => {
return {
db: pgp(ConfigEnv.pgp),
pgp
};
});
}
API code:
import {getDB} from 'db/pgpdb';
const {db, pgp} = getDB();
const cs = new pgp.helpers.ColumnSet([
'?detail_id',
'age',
'name'
// 'last_modified_date',
], {
table: 'user_detail',
})
export default async (req, res) => {
try {
// generating the update query where it is needed:
const update = pgp.helpers.update(req.body.content, cs) + ` WHERE v.detail_id = t.detail_id`;
// executing the query
await db
.none(update)
.then(() => {
return res.status(200).end();
})
.catch((error) => {
console.log('error', error);
return res.status(500).send(error);
});
} catch (error) {
console.log(error);
}
};
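Not a confirmed fix for the warning, but since the ColumnSet setup is under suspicion: pg-promise's ColumnSet is designed to be created once and reused, so it can be cached through the same createSingleton helper as the db object. A minimal sketch, reusing the exports from the question's db/pgpdb file; the symbol name 'user-detail-cs' and the export name are illustrative, and the ColumnSet is wrapped in a plain object so the spread inside createSingleton does not mangle the instance:
import {getDB, createSingleton} from 'db/pgpdb';

const {pgp} = getDB();

// Cache the ColumnSet under its own global symbol so module re-evaluation
// (e.g. hot reload in dev) reuses a single instance instead of rebuilding it.
const {cs} = createSingleton('user-detail-cs', () => ({
  cs: new pgp.helpers.ColumnSet([
    '?detail_id',
    'age',
    'name'
  ], {
    table: 'user_detail'
  })
}));

export const userDetailCs = cs;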
I'm trying to dynamically load modules from a Nitro server in a Nuxt app, but I get the following error:
Cannot find module projectpath/.nuxt/services/listing imported from projectpath/.nuxt/dev/index.mjs
This is the snippet of code I'm using for the handler where the dynamic import should take place:
export default defineEventHandler(async (event) => {
const { method, resource, paramValue } = parseRequestResource(event.node.req)
let ServiceInstance = services[resource]
if (ServiceInstance) {
return callResourceMethod(ServiceInstance, method, paramValue, event)
} else {
try {
ServiceInstance = await import(`../services/${resource}`)
} catch (error) {
const Proto = Object.assign({}, Service.prototype, { tableName: resource })
ServiceInstance = Object.create(Proto)
services[resource] = ServiceInstance
}
return callResourceMethod(ServiceInstance, method, paramValue, event)
}
})
How can I get this to work? Is there some feature in Nitro/Nuxt that lets me do this?
I was able to achieve this functionality by using a Nitro plugin. However, the files being imported need to be *.mjs.
import fs from 'fs'
import { resolve } from 'path'
export default defineNitroPlugin(async (nitroApp) => {
const __dirname = resolve()
const servicesFolderPath = `${__dirname}/server/services`
const serviceFiles = fs.readdirSync(servicesFolderPath)
const services = {}
for (const fileName of serviceFiles) {
if (fileName == '__proto__.mjs') continue
try {
const moduleName = fileName.split('.')[0]
const module = await import(`${servicesFolderPath}/${fileName}`)
services[moduleName] = module.default
} catch (error) {
console.log(error);
}
}
nitroApp.$services = services
})
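The registered services can then be looked up from the Nitro app instance inside a handler. A sketch, reusing parseRequestResource, callResourceMethod and Service from the question; useNitroApp() should be auto-imported in Nitro server code, and the route file name is only illustrative:
// server/api/[...resource].js — resolve services from the plugin's registry
export default defineEventHandler(async (event) => {
  const { $services } = useNitroApp()
  const { method, resource, paramValue } = parseRequestResource(event.node.req)
  let ServiceInstance = $services[resource]
  if (!ServiceInstance) {
    // same fallback as the original handler for resources without a module
    const Proto = Object.assign({}, Service.prototype, { tableName: resource })
    ServiceInstance = Object.create(Proto)
    $services[resource] = ServiceInstance
  }
  return callResourceMethod(ServiceInstance, method, paramValue, event)
})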
I am using redux-toolkit, rtk-query (for querying other APIs, not just Firebase) and Firebase (for authentication and the DB).
The code below works just fine for retrieving and caching the data, but I wish to take advantage of both rtk-query caching and Firebase event subscribing, so that whenever a change is made in the DB (from any source, even directly in the Firebase console) the cache is updated.
I have tried both updateQueryCache and invalidateTags, but so far I am not able to find an ideal approach that works.
Any assistance in pointing me in the right direction would be greatly appreciated.
// firebase.ts
export const onRead = (
collection: string,
callback: (snapshot: DataSnapshot) => void,
options: ListenOptions = { onlyOnce: false }
) => onValue(ref(db, collection), callback, options);
export async function getCollection<T>(
collection: string,
onlyOnce: boolean = false
): Promise<T> {
let timeout: NodeJS.Timeout;
return new Promise<T>((resolve, reject) => {
timeout = setTimeout(() => reject('Request timed out!'), ASYNC_TIMEOUT);
onRead(collection, (snapshot) => resolve(snapshot.val()), { onlyOnce });
}).finally(() => clearTimeout(timeout));
}
// awards.ts
const awards = dbApi
.enhanceEndpoints({ addTagTypes: ['Themes'] })
.injectEndpoints({
endpoints: (builder) => ({
getThemes: builder.query<ThemeData[], void>({
async queryFn(arg, api) {
try {
const { auth } = api.getState() as RootState;
const programme = auth.user?.unit.guidingProgramme!;
const path = `/themes/${programme}`;
const themes = await getCollection<ThemeData[]>(path, true);
return { data: themes };
} catch (error) {
return { error: error as FirebaseError };
}
},
providesTags: ['Themes'],
keepUnusedDataFor: 1000 * 60
}),
getTheme: builder.query<ThemeData, string | undefined>({
async queryFn(slug, api) {
try {
const initiate = awards.endpoints.getThemes.initiate;
const getThemes = api.dispatch(initiate());
const { data } = (await getThemes) as ApiResponse<ThemeData[]>;
const name = slug
?.split('-')
.map(
(value) =>
value.substring(0, 1).toUpperCase() +
value.substring(1).toLowerCase()
)
.join(' ');
return { data: data?.find((theme) => theme.name === name) };
} catch (error) {
return { error: error as FirebaseError };
}
},
keepUnusedDataFor: 0
})
})
});
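For reference, the documented RTK Query pattern for pushing external events into the cache is the onCacheEntryAdded lifecycle. A sketch of getThemes with a live subscription, reusing the onRead helper and the auth state shape from the question (untested against the original code):
getThemes: builder.query<ThemeData[], void>({
  async queryFn(arg, api) {
    // same one-off fetch as in the question
    try {
      const { auth } = api.getState() as RootState;
      const programme = auth.user?.unit.guidingProgramme!;
      const themes = await getCollection<ThemeData[]>(`/themes/${programme}`, true);
      return { data: themes };
    } catch (error) {
      return { error: error as FirebaseError };
    }
  },
  async onCacheEntryAdded(
    arg,
    { updateCachedData, cacheDataLoaded, cacheEntryRemoved, getState }
  ) {
    let unsubscribe = () => {};
    try {
      await cacheDataLoaded; // wait for the initial fetch before subscribing
      const { auth } = getState() as RootState;
      const programme = auth.user?.unit.guidingProgramme;
      // onRead wraps onValue, which returns an unsubscribe function
      unsubscribe = onRead(`/themes/${programme}`, (snapshot) => {
        // replace the cached themes whenever Firebase reports a change
        updateCachedData(() => snapshot.val());
      });
    } catch {
      // cacheDataLoaded rejects if the initial query failed; nothing to clean up
    }
    await cacheEntryRemoved; // entry dropped from the cache: stop listening
    unsubscribe();
  },
  providesTags: ['Themes'],
}),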
I'm a beginner and tried to move the model to a separate file, but it didn't work for me; please suggest how to do it correctly. The question may seem silly, but if I knew the answer, I would not ask it.
file todo.controller.js
const fs = require("fs");
const { v4: uuidv4 } = require("uuid");
const data = fs.readFileSync("./data/data.json");
let todos = JSON.parse(data);
class todoController {
async createTodo(req, res) {
req.on("data", (data) => {
const jsondata = JSON.parse(data);
const title = jsondata.title;
const description = jsondata.description;
if (title && description) {
todos.push({
id: uuidv4(),
title,
description,
dateOfCreate: new Date(),
lastModified: new Date(),
check: new Boolean(false),
});
fs.writeFile(
"./data/data.json",
JSON.stringify(todos, null, 2),
(err) => {
if (err) throw err;
}
);
}
});
}}
file todo.router.js
const url = require("url");
const todoController = require("../controllers/todo.controller");
const todoRouter = (req, res) => {
const urlparse = url.parse(req.url, true);
if (urlparse.pathname == "/todos" && req.method == "POST") {
todoController.createTodo(req, res);
}
};
module.exports = todoRouter;
Here is the data.json file: data.json
You have two separate problems here: separating your code into a different file, and saving or persisting that data somewhere, in this case a file.
You have to create something like a data model and then import it into your other code.
// data.js
export const get = async () => {} // we will implement this just now
export const set = async (data) => {} // we will implement this just now
...
// controller.js
import {get, set} from './data.js' // import the methods we just created
...
const createTodo = async (req, res) => {
req.on("data", (data) => {
// here you can use get() if you want the existing data
set(JSON.parse(data)) // send the parsed body to your data model
})
}
Then we also have to actually do something with those methods.
// data.js
import { promises as fs } from 'fs' // promise-based fs so the calls can be awaited
export const get = async () => {
// may need to use JSON.parse here depending on how you'll use it
return fs.readFile('./data.json', 'utf8')
}
export const set = async (data) => {
return fs.writeFile('./data.json', JSON.stringify(data, null, 2))
}
So the idea is to have a model responsible for managing the data, retrieving it and saving it, then importing and using those methods in the main controller. The code above isn't perfect, it's just to show you how to think about it.
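For completeness, the read side would go through the same model; a sketch of a handler that returns the stored todos, reusing the get() import from above (the handler name getTodos is made up):
// controller.js — read path through the data model
export const getTodos = async (req, res) => {
  const raw = await get()                        // raw JSON text from data.js
  res.writeHead(200, { "Content-Type": "application/json" })
  res.end(raw)                                   // data.json already contains JSON
}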
I'm having trouble getting the AWS Secrets Manager module mocked for the Jest unit tests... The part it errors on is the .promise(). When I remove that, the code doesn't work against the real Secrets Manager, so I think it needs to stay. How do I mock getSecretValue so that getSecretValue(...).promise() will work in the mock?
Here is the SecretsManager.js code:
import AWS from 'aws-sdk';
export class SecretsManager {
constructor() {
AWS.config.update({
region: 'us-east-1',
});
this.secretsManager = new AWS.SecretsManager();
}
async getSecretData(secretName) {
try {
const response = await this.secretsManager.getSecretValue({
SecretId: secretName,
}).promise();
const secretString = response.SecretString;
const parsedSecret = JSON.parse(secretString);
return parsedSecret;
} catch (e) {
console.log('Failed to get data from AWS Secrets Manager.');
console.log(e);
throw new Error('Unable to retrieve data.');
}
}
}
Here is the SecretsManager.test.js code:
import { SecretsManager } from '../utils/SecretsManager';
jest.mock('aws-sdk', () => {
return {
config: {
update(val) {
},
},
SecretsManager: function () {
return {
async getSecretValue({
SecretId: secretName
}) {
return {
promise: function () {
return {
UserName: 'test',
Password: 'password',
};
}
};
}
};
}
}
});
describe('SecretsManager.js', () => {
describe('Given I have a valid secret name', () => {
describe('When I send a request for test_creds', () => {
it('Then the correct data is returned.', async () => {
const mockReturnValue = {
UserName: 'test',
Password: 'password',
};
const secretManager = new SecretsManager();
const result = await secretManager.getSecretData('test_creds');
expect(result).toEqual(mockReturnValue)
});
});
describe('When I send a request without data', () => {
it('Then an error is thrown.', async () => {
const secretManager = new SecretsManager();
await expect(secretManager.getSecretData()).rejects.toThrow();
});
});
});
});
This is the error I get when running the tests:
this.secretsManager.getSecretValue(...).promise is not a function
Any suggestions or pointers are greatly appreciated!
Thank you for looking at my post.
I finally got it to work... figures it'd happen shortly after posting the question, but instead of deleting the post I'll share how I changed the mock to make it work, in case it helps anyone else.
Note: This is just the updated mock, the tests are the same as in the question above.
// I added this because it's closer to how AWS returns data for real.
const mockSecretData = {
ARN: 'x',
Name: 'test_creds',
VersionId: 'x',
SecretString: '{"UserName":"test","Password":"password"}',
VersionStages: ['x'],
CreatedDate: 'x'
}
jest.mock('aws-sdk', () => {
return {
config: {
update(val) {
},
},
SecretsManager: function () {
return {
// Using a plain function for getSecretValue (instead of an async one) is what
// made the original ".promise() is not a function" error go away.
getSecretValue: function ({ SecretId }) {
if (SecretId === 'test_creds') {
return {
promise: function () {
return mockSecretData;
}
};
} else {
throw new Error('mock error');
}
}
};
}
}});
I ran into this issue as well. There may be a more elegant way to handle this that also allows for greater control and assertion, but I haven't found one. Note that the in-test option may work better with newer versions of Jest.
I personally solved this issue by making use of manual mocks and a custom mock file for aws-sdk. In your case, it would look something like the following:
// app_root/__tests__/__mocks__/aws-sdk.js
const exampleResponse = {
ARN: 'x',
Name: 'test_creds',
VersionId: 'x',
SecretString: '{"UserName":"test","Password":"password"}',
VersionStages: ['x'],
CreatedDate: 'x'
};
const mockPromise = jest.fn().mockResolvedValue(exampleResponse);
const getSecretValue = jest.fn().mockReturnValue({ promise: mockPromise });
function SecretsManager() { this.getSecretValue = getSecretValue };
const AWS = { SecretsManager };
module.exports = AWS;
Then in your test file:
// ... imports
jest.mock('aws-sdk');
// ... your tests
So, in a nutshell:
Instead of mocking directly in your test file, you're handing mocking control to a mock file, which Jest knows to look for in the __mocks__ directory.
You create a mock constructor for the SecretsManager in the mock file
SecretsManager returns an instance with the mock function getSecretValue
getSecretValue returns a mock promise
the mock promise returns the exampleResponse
Bada boom, bada bing. You can read more in the Jest documentation on manual mocks.
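Because the mock functions live at module scope in the mock file, every SecretsManager instance shares the same jest.fn, so the test can also assert on the call itself; a small sketch (the test name is illustrative):
import AWS from 'aws-sdk'; // resolves to the manual mock above
import { SecretsManager } from '../utils/SecretsManager';

jest.mock('aws-sdk');

it('passes the secret name through to getSecretValue', async () => {
  const secretManager = new SecretsManager();
  await secretManager.getSecretData('test_creds');
  // every mocked instance hands back the shared jest.fn from the mock file
  const mockGetSecretValue = new AWS.SecretsManager().getSecretValue;
  expect(mockGetSecretValue).toHaveBeenCalledWith({ SecretId: 'test_creds' });
});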
I ran into the same issue and tried to solve it as shown below. It worked perfectly in my case.
Terminalsecret.ts
import AWS from 'aws-sdk';
AWS.config.update({
region: "us-east-1",
});
const client = new AWS.SecretsManager();
export class Secret {
constructor(){}
async getSecret(secretName: string) {
let secret: any;
const data = await client.getSecretValue({ SecretId: secretName }).promise();
if ('SecretString' in data) {
secret = data.SecretString;
} else {
const buff = Buffer.from(data.SecretBinary as any, 'base64');
secret = buff.toString('ascii');
}
const secretParse = JSON.parse(secret);
return secretParse[secretName];
}
}
Terminalsecret.test.ts
import { SecretsManager as fakeSecretsManager } from 'aws-sdk';
import { Secret } from './terminalSecret';
jest.mock('aws-sdk');
const setup = () => {
const mockGetSecretValue = jest.fn();
fakeSecretsManager.prototype.getSecretValue = mockGetSecretValue;
return { mockGetSecretValue };
};
describe('success', () => {
it('should call getSecretValue with the argument', async () => {
const { mockGetSecretValue } = setup();
mockGetSecretValue.mockReturnValueOnce({
promise: async () => ({ SecretString: '{"userName": "go-me"}' })
});
const fakeName = 'userName';
const terminalSecretMock: Secret = new Secret();
await terminalSecretMock.getSecret(fakeName);
expect(mockGetSecretValue).toHaveBeenCalledTimes(1);
});
});
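A failure case can be sketched the same way by making the mocked .promise() reject (the error message is made up):
describe('failure', () => {
  it('should surface errors from getSecretValue', async () => {
    const { mockGetSecretValue } = setup();
    // make the mocked .promise() reject to simulate an AWS-side failure
    mockGetSecretValue.mockReturnValueOnce({
      promise: async () => { throw new Error('mock AWS error'); }
    });
    await expect(new Secret().getSecret('userName')).rejects.toThrow('mock AWS error');
  });
});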
I'm working with a 'State' class that has a model file and a controller file. However, when importing these in my main file (app.js), there seems to be a clash, as 'state.model.js' is required both in app.js and in state.controller.js. This gives me the error: State is not a constructor.
app.js:
const State = require('./model/state.model');
const stateController = require('./controller/state.controller');
stateController.insert(msg, State.STATE_REMINDER.name, State.STATE_REMINDER.key.day, selected_day, (err, doc) => {
if (err) throw err;
console.log("[STATE][INSERT]", doc);
});
state.controller.js:
const Database = require('../database');
const State = require('../model/state.model');
const db = Database.collection('states');
db.loadDatabase((err) => {
if (err) throw err;
console.log("[STATES] Database connected");
});
exports.insert = (msg, state, key, value, callback) => { // msg refers to Telegram Callback
let insertState = new State(undefined, msg.from.id, msg.chat.id, state, key, value);
console.log(insertState);
db.insert(insertState, (err, newDoc) => {
if (err) throw err;
callback(err, newDoc);
});
};
state.model.js:
const stateController = require('../controller/state.controller');
const STATE_REMINDER = {
name: "STATE_REMINDER",
key: {
day: "DAY",
time: "TIME"
}
};
class State {
constructor(id, user_id, chat_id, state, key, value) {
this._id = id;
this.user_id = user_id;
this.chat_id = chat_id;
this.state = state;
this.key = key;
this.value = value;
this.timestamp = new Date();
}
static get STATE_REMINDER() {
return STATE_REMINDER;
}
}
module.exports = State;
This problem is actually fixed when I swap the order of the requires in app.js to:
const stateController = require('./controller/state.controller');
const State = require('./model/state.model');
Why is that so? Any help is appreciated. Thanks!
Your code has a circular dependency loop:
app.js requires state.model.js, which requires state.controller.js, which requires state.model.js, which requires state.controller.js, and so on.
In this case, require detects the circular dependency and will NOT go on forever. However, this means that at some point it hands back the partially-completed (possibly still empty) exports object instead of the finished class. It plays out like this:
app.js -> requires state.model.js
state.model.js -> requires state.controller.js
state.controller.js -> requires state.model.js. At this point Node detects the circular loop and hands the controller state.model.js's current exports, which is still an empty object because module.exports = State only runs at the end of the file. The controller's State is therefore {}, and new State(...) later fails with 'State is not a constructor'.
Swapping the order changes which module ends up holding the incomplete export: when app.js loads state.controller.js first, the controller requires state.model.js, which in turn requires the still-loading controller; but since the model never actually uses stateController, it finishes loading normally and the controller receives the real State class. When the model is loaded first, it is the controller that receives the empty placeholder, and the constructor call fails.
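For what it's worth, one way to break the loop in this particular code is to note that state.model.js never actually uses stateController, so the require at the top of the model can simply be removed. If the model ever did need the controller, it could require it lazily inside the function that uses it; a sketch (someModelHelper is a made-up name):
// state.model.js — no top-level require of the controller any more

// If the model genuinely needed the controller, defer the require to call time,
// after both modules have finished loading:
function someModelHelper(...args) {
  const stateController = require('../controller/state.controller');
  return stateController.insert(...args);
}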