ConnectionNotFoundError: Connection "default" was not found. Can someone help me?

I use TypeORM with NestJS and I am not able to save an entity properly.
The connection is created successfully: Postgres is running on port 5432 and the credentials are correct.
However, when I try to save a resource with entity.save() I get:
Connection "default" was not found.
Error
at new ConnectionNotFoundError (/.../ConnectionNotFoundError.ts:11:22)
I checked the TypeORM ConnectionManager source (https://github.com/typeorm/typeorm/blob/master/src/connection/ConnectionManager.ts), and it seems that the first time TypeORM creates a connection it assigns the name "default" if none is provided, which is my case.
I set up TypeORM with TypeOrmModule as follows:
TypeOrmModule.forRoot({
  type: config.db.type,
  host: config.db.host,
  port: config.db.port,
  username: config.db.user,
  password: config.db.password,
  database: config.db.database,
  entities: [
    __dirname + '/../../dtos/entities/*.entity.js',
  ]
})
Of course my constants are correct. Any ideas?

You are trying to create a repository or manager before the connection has been established.
Try calling const shopkeeperRepository = getRepository(Shopkeeper); inside a function (i.e. after the connection has been created); it will work.
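A minimal sketch of that advice (Shopkeeper comes from the answer above; the entity path and the rest of the wiring are illustrative): getRepository() is only called inside a function that runs after createConnection() has resolved.
import { createConnection, getRepository } from 'typeorm';
import { Shopkeeper } from './entities/Shopkeeper'; // hypothetical path

async function listShopkeepers() {
  // runs at call time, i.e. after the connection has been created
  const shopkeeperRepository = getRepository(Shopkeeper);
  return shopkeeperRepository.find();
}

createConnection().then(async () => {
  console.log(await listShopkeepers());
});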

The upvoted answer is not necessarily the fix: if you do not specify a connection name it defaults to "default", but if you created the connection under a custom name you have to fetch it by that name:
const manager = getConnectionManager().get('your_orm_name');
const repository = manager.getRepository<AModel>(Model);
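A short sketch of that, assuming the connection was registered under a custom name at startup (the name, credentials and AModel entity are illustrative):
import { createConnection, getConnectionManager } from 'typeorm';
import { AModel } from './entities/AModel'; // hypothetical entity

async function setup() {
  // register the connection under a custom name at startup
  await createConnection({
    name: 'your_orm_name',
    type: 'postgres',
    host: 'localhost',
    port: 5432,
    username: 'user',      // illustrative credentials
    password: 'pass',
    database: 'mydb',
    entities: [AModel],
  });

  // later, fetch it by that same name instead of relying on "default"
  const manager = getConnectionManager().get('your_orm_name');
  const repository = manager.getRepository(AModel);
  return repository.find();
}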

If anyone else has this problem in the future, check this out just in case:
I accidentally called user.save() instead of userRepo.save(user), even though I had initialized the repository from the connection like this:
const userRepo = getConnection(process.env.NODE_ENV).getRepository(User)
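A small sketch of that mix-up, assuming the User entity and the NODE_ENV-named connection from the snippet above (everything else is illustrative):
import { getConnection } from 'typeorm';
import { User } from './entities/User'; // hypothetical path

async function saveUser(user: User) {
  const userRepo = getConnection(process.env.NODE_ENV).getRepository(User);
  // Wrong (the accidental call): user.save() only works for entities extending
  // BaseEntity and goes through the connection registered for it ("default"
  // unless configured otherwise), which may not exist here.
  // await user.save();
  // Right: save through the repository obtained from the named connection.
  await userRepo.save(user);
}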

We are using Lerna and consuming code from library A in package B.
The problem was that the TypeORM versions in the two packages differed.
The solution is to make sure you have exactly the same TypeORM version installed in each package.
To be on the safe side, delete your node_modules directory and reinstall everything with yarn install or npm install.
Check your yarn.lock for multiple entries of typeorm and make sure there is only one.

If anyone is using the Express Router with getRepository(), check the code below:
import { Router, Request, Response } from "express";
import { getRepository } from "typeorm";
import { User } from "../entity/User"; // adjust the path to your User entity

const router = Router();

router.get("/", async function (req: Request, res: Response) {
  // return all users; getRepository() is called inside the handler,
  // i.e. after the connection has been established
  const userRepository = getRepository(User);
  const users = await userRepository.find();
  res.json(users);
});

router.get("/:id", async function (req: Request, res: Response) {
  // return a single user by id
  const userRepository = getRepository(User);
  const results = await userRepository.findOne(req.params.id);
  return res.send(results);
});
Just make sure to call getRepository() in every route just like Saras Arya said in the accepted answer.

I follow the approach below, creating a Database class. If the connection doesn't exist it creates one, otherwise it returns the existing connection.
import { Connection, ConnectionManager, ConnectionOptions, createConnection, getConnectionManager } from 'typeorm';
export class Database {
private connectionManager: ConnectionManager;
constructor() {
this.connectionManager = getConnectionManager();
}
public async getConnection(name: string): Promise<Connection> {
const CONNECTION_NAME: string = name;
let connection: Connection;
const hasConnection = this.connectionManager.has(CONNECTION_NAME);
if (hasConnection) {
connection = this.connectionManager.get(CONNECTION_NAME);
if (!connection.isConnected) {
connection = await connection.connect();
}
} else {
const connectionOptions: ConnectionOptions = {
name: 'default',
type: 'mysql',
host: 'localhost',
port: 3306,
username: 'root',
password: 'password',
database: 'DemoDb',
synchronize: false,
logging: true,
entities: ['src/entities/**/*.js'],
migrations: ['src/migration/**/*.js'],
subscribers: ['src/subscriber/**/*.js'],
};
connection = await createConnection(connectionOptions);
}
return connection;
}
}
If you are using webpack, make sure the entities are imported explicitly and passed as an array:
import {User} from 'src/entities/User.ts';
import {Album} from 'src/entities/Album.ts';
import {Photos} from 'src/entities/Photos.ts';
const connectionOptions: ConnectionOptions = {
name: 'default',
type: 'mysql',
host: 'localhost',
port: 3306,
username: 'root',
password: 'password',
database: 'DemoDb',
synchronize: false,
logging: true,
entities: [User, Album, Photos],
migrations: ['src/migration/**/*.js'],
subscribers: ['src/subscriber/**/*.js'],
};
Finally
const connectionName = 'default';
const database = new Database();
const dbConn: Connection = await database.getConnection(connectionName);
const MspRepository = dbConn.getRepository(Msp);
await MspRepository.delete(mspId);

For those of you looking for another answer, check this out.
In my case, the issue was that I was passing a name in my db config.
export const dbConfig = {
name: 'myDB',
...
}
await createConnection(dbConfig) // like this
As a result, the only connection the server knows about is myDB, not default.
At the same time, in my service the repository was injected without a name, which falls back to default (so the service ends up looking for the default connection).
@Service() // typedi
export class Service {
  constructor(
    // injected without a name -> falls back to "default"
    @InjectRepository() private readonly repository
  ) {}
}
As a fix, I removed the name property from my db config.
Alternatively, you can pass myDB as a parameter to InjectRepository, like @InjectRepository('myDB'); either way works.

In my own case, the actual problem was that my index file imports my router file, which imports my controllers, which in turn import my services (where the call to getRepository is made). So the imports were resolving (and with them the calls to getRepository) before the connection was established.
I considered implementing Saras Arya's answer, but it would have left my code more verbose.
What I did was create a function to connect to the DB in a db/index.ts file
import { createConnection } from "typeorm";
export const getDBConnection = async () => {
const dbConnection = await createConnection();
if (!dbConnection.isConnected) await dbConnection.connect();
return dbConnection;
}
Then create an async function to bootstrap the app. I wait for getDBConnection to resolve before instantiating my app, and only then import my router file. That way the imports resolve only after the connection has been established.
routers/index.ts
import { Router } from 'express';
const router = Router();
/* ... route configurations ... */
export default router;
app.ts
import express from "express";
import http from "http";
import { getDBConnection } from "./db";

const bootstrap = async () => {
  try {
    // wait for the connection to be established
    await getDBConnection();
  } catch (error) {
    // log the error, then rethrow
    throw error;
  }
  // create the app
  const app = express();
  // some middleware configuration...
  // now import and set up the router
  const { default: router } = await import("./routers");
  app.use("/api", router);
  // some more middleware configuration...
  const server = http.createServer(app);
  server.listen(3000, () => console.log("app running at port: 3000"));
};

bootstrap();

I got this error while using getConnectionOptions for different environments, using one database for development and another for testing. This is how I fixed it:
const connectionOptions = await getConnectionOptions(process.env.NODE_ENV);
await createConnection({...connectionOptions, name:"default"});
I use getConnectionOptions to get the connection options for the current environment. For that to work, you have to change ormconfig.json into an array whose entries each have a "name" key for the environment they configure, like so:
[
  {
    "name": "development",
    "type": "postgres",
    "host": "localhost",
    "port": 5432,
    "username": "postgres",
    "password": "PASS",
    "database": "YOURDB"
  },
  {
    "name": "test",
    "type": "postgres",
    "host": "localhost",
    "port": 5432,
    "username": "postgres",
    "password": "PASSTEST",
    "database": "YOURDBTEST"
  }
]
Now connectionOptions will contain the connection parameters of the current environment, but passing it to createConnection threw the error you mentioned. Changing the connectionOptions name to "default" fixed the issue.

I know it is super weird, but someone might need this:
It is a Windows-related cause.
I had the same error because the current working directory was set with the drive letter in lower case (d:/apps/app-name/etc).
The problem was fixed once I updated the directory-change instruction to use a capital D (D:/apps/app-name/etc).

If, after verifying that the TypeORM version is the same in both packages (the external package and the consumer repository) as mentioned by @InsOp, the issue still persists, the cause could be the following:
when we create an external package, TypeORM tries to get the "default" connection options and, if none are found, throws:
ConnectionNotFoundError: Connection "default" was not found.
We can solve this by doing a sanity check before establishing a connection; luckily getConnectionManager() has a .has() method:
import { Connection, getConnectionManager, getConnectionOptions,
  createConnection, getConnection, QueryRunner } from 'typeorm';

async init() {
  let connection: Connection;
  let queryRunner: QueryRunner;
  if (!getConnectionManager().has('default')) {
    const connectionOptions = await getConnectionOptions();
    connection = await createConnection(connectionOptions);
  } else {
    connection = getConnection();
  }
  queryRunner = connection.createQueryRunner();
}
Above is a quick code snippet addressing the actual root cause of this issue, but if you are interested in complete working repositories (different examples):
External NPM Package :
Git Repo : git-unit-of-work (specific file- src/providers/typeorm/typeorm-uow.ts)
Published in NPM : npm-unit-of-work
Consumer of above package : nest-typeorm-postgre (specific files- package.json, src/countries/countries.service.ts & countries.module.ts)

In my case the problem was that I had an array of multiple connections instead of just one. You have two alternatives.
Have at least one connection named default, for example:
createConnections([
{
name: 'default',
type: 'mysql',
host: 'localhost',
port: 3306,
username: 'root',
password: 'root',
database: 'users',
entities: [`${__dirname}/entity/*{.js,.ts}`],
synchronize: true,
logging: true
}
]);
Or be specific when using the connection:
import {getConnection} from "typeorm";
const db1Connection = getConnection("db1Connection");
// you can work with "db1" database now...

I had this same problem with the following code:
import { HttpException, Inject, NotFoundException } from "@nestjs/common";
import { Not } from "typeorm";
import { Transactional } from "typeorm-transactional-cls-hooked";
import { TENANT_CONNECTION } from "../tenant/tenant.module";
import { Feriados } from './feriados.entity';

export class FeriadosService {
  repository: any;
  constructor(
    @Inject(TENANT_CONNECTION) private connection)
  {
    this.repository = connection.getRepository(Feriados)
  }
  @Transactional()
  async agregar(tablaNueva: Feriados): Promise<Number> {
    const tablaAGuardar = await this.repository.create(tablaNueva)
    return await this.guardar(tablaAGuardar)
  }
  @Transactional()
  async actualizar(tablaActualizada: Feriados): Promise<Number> {
    const tablaAGuardar = await this.repository.merge(tablaActualizada);
    return await this.guardar(tablaAGuardar)
  }
  async guardar(tabla: Feriados) {
    await this.repository.save(tabla)
    return tabla.id
  }
}
I fixed it by removing the two @Transactional() decorators.
I hope this helps someone.

In TypeORM v0.3 the Connection API was replaced by the DataSource API. NestJS adopted this change as well, so if you rely on the old API (e.g. the getConnection method) you might see the Connection "default" was not found error.
You can read about the changes and the new API in the release notes: https://github.com/typeorm/typeorm/releases/tag/0.3.0
If you used getConnection you can use app.get(DataSource) instead.
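A minimal sketch of that replacement, assuming a standard NestJS bootstrap and a hypothetical User entity:
import { NestFactory } from '@nestjs/core';
import { DataSource } from 'typeorm';
import { AppModule } from './app.module';
import { User } from './user.entity'; // hypothetical entity

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  // instead of getConnection(), resolve the DataSource from the Nest container
  const dataSource = app.get(DataSource);
  const userRepository = dataSource.getRepository(User);
  console.log(await userRepository.count());
  await app.listen(3000);
}
bootstrap();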

In the newer TypeORM version, 0.3.7, a solution to this problem is the following:
In app.module.ts, change the constructor of the AppModule class and create a method that returns the DataSource:
export class AppModule {
constructor(private dataSource: DataSource) {}
getDataSource() {
return this.dataSource;
}
}
Then, in the file where you need it, add:
const repository = app
.get(AppModule)
.getDataSource()
.getRepository('Entity_name');

Although Saras Arya has provided the correct answer, I encountered the same error
ConnectionNotFoundError: Connection "default" was not found.
because my TypeORM entity had an @Entity() decorator and also extended BaseEntity.
The two can't live together.

Related

How can I configure PostgreSQL in the NestJS way?

So I'm in the process of learning the NestJS way of doing things. I have a small NestJS backend with only a few routes; some of them call PostgreSQL. I don't want to use any ORM and instead use the pg package directly.
My next step is learning how to use ConfigService. I have successfully used it to configure all env vars in the backend, but I'm struggling to use it in the small file I use to configure PostgreSQL. This is the configuration file (pgconnect.ts):
import { Pool } from 'pg';
import configJson from './config/database.json';
import dotenv from 'dotenv';
dotenv.config();
const config = configJson[process.env.NODE_ENV];
const poolConfig = {
user: config.username,
host: config.host,
database: config.database,
password: config.password,
port: config.port,
max: config.maxClients
};
export const pool = new Pool(poolConfig)
database.json is a JSON file where I keep all connection values divided by environment. Then, in service classes, I just do:
import { Injectable } from '@nestjs/common';
import { Response } from 'express';
import { pool } from 'src/database/pgconnect';

@Injectable()
export class MyService {
  getDocumentByName(res: Response, name: string) {
    pool.query(
      <query, error treatment, etc>
    });
  }
  <...> more queries for insert, update, other selects, etc
}
So how could I use ConfigService inside my configuration file? I already tried to instantiate the class like this:
let configService = new ConfigService();
and what I would like to do is:
const config = configJson[configService.get<string>('NODE_ENV')];
but it didn't work. You have to pass the .env file path to new ConfigService(), and I need the NODE_ENV var to build that path, because it depends on the environment. To get NODE_ENV without ConfigService I would have to use dotenv, but if I'm going to use dotenv I don't need ConfigService in the first place.
So then I tried to create a class:
import { Injectable, HttpException, HttpStatus } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
const { Pool } = require('pg');
import configJson from './config/database.json';

@Injectable()
export class PgPool {
  constructor(private configService: ConfigService) { };
  config = configJson[this.configService.get<string>('NODE_ENV')];
  poolConfig = {
    user: this.config.username,
    host: this.config.host,
    database: this.config.database,
    password: this.config.password,
    port: this.config.port,
    max: this.config.maxClients
  };
  static pool = new Pool(this.poolConfig);
}
export const PgPool.pool;
But this doesn't work in several ways. If I use non-static members, I can't export the pool member, which is the only thing I need. If I use static members, one can't access the other, or at least I don't understand how one would access the other.
So, the questions are: how do I use ConfigService outside of a class, or how can I change the pgconnect.ts file to do its job? If it's done through a class, the best would be to export only the pool member.
Also, if you think there's a better way to configure PostgreSQL, I would be glad to hear it.
What I would do, if you're going to use the pg package directly, is create a PgModule that exposes the Pool you create as a provider that can be injected. You can also create a provider specifically for the options, for ease of swapping them out in tests. Something like this:
@Module({
  imports: [ConfigModule],
  providers: [
    {
      provide: 'PG_OPTIONS',
      inject: [ConfigService],
      useFactory: (config) => ({
        host: config.get('DB_HOST'),
        port: config.get('DB_PORT'),
        ...etc
      }),
    },
    {
      provide: 'PG_POOL',
      inject: ['PG_OPTIONS'],
      useFactory: (options) => new Pool(options),
    }
  ],
  exports: ['PG_POOL'],
})
export class PgModule {}
Now, when you need to use the Pool in another service, you add PgModule to that service's module's imports and add @Inject('PG_POOL') private readonly pg: Pool to the service's constructor.
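For example, a consuming service might look like this (UsersService and the users table are made up for illustration; PgModule has to appear in that service's module imports):
import { Inject, Injectable } from '@nestjs/common';
import { Pool } from 'pg';

@Injectable()
export class UsersService {
  constructor(@Inject('PG_POOL') private readonly pg: Pool) {}

  async findAll() {
    // use the shared pool exposed by PgModule
    const { rows } = await this.pg.query('SELECT * FROM users');
    return rows;
  }
}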
If you want to see an overly engineered solution, you can take a look at my old implementation here
I normally have my own pg module handling the pool, with either an additional config file (JSON) or by processing a .env file:
node-pg-sql.js:
/* INFO: Require json config file */
const fileNameConfigPGSQL = require('./config/pgconfig.json');
/* INFO: Require the pg package */
const { Pool } = require('pg');
const pool = new Pool(fileNameConfigPGSQL);

module.exports = {
  query: (text, params, callback) => {
    const start = Date.now();
    return pool.query(text, params, (err, res) => {
      const duration = Date.now() - start;
      // console.log('executed query', { text, duration, rows: res.rowCount })
      callback(err, res);
    });
  },
  getClient: (callback) => {
    pool.connect((err, client, done) => {
      // keep a reference to the original query method
      const query = client.query.bind(client);
      // monkey patch the query method to track the last executed query
      client.query = (...args) => {
        client.lastQuery = args;
        return query(...args);
      };
      // timeout of 5 secs, then the last query is logged
      const timeout = setTimeout(() => {
        // console.error('A client has been checked out for more than 5 seconds!')
        // console.error(`The last executed query on this client was: ${client.lastQuery}`)
      }, 5000);
      const release = (err) => {
        // call 'done' to return the client to the pool
        done(err);
        // clear the timeout
        clearTimeout(timeout);
        // restore the original query method (undo the monkey patch)
        client.query = query;
      };
      // hand the client and the release function to the caller
      callback(err, client, release);
    });
  }
};
pgconfig.json:
{
"user":"postgres",
"host":"localhost",
"database":"mydb",
"password":"mypwd",
"port":"5432",
"ssl":true
}
If you prefer processing a .env file:
NODE_ENV=development
NODE_PORT=45500
HOST_POSTGRESQL='localhost'
PORT_POSTGRESQL='5432'
DB_POSTGRESQL='mydb'
USER_POSTGRESQL='postgres'
PWD_POSTGRESQL='mypwd'
and process the file and export vars:
var path = require('path');
const dotenvAbsolutePath = path.join(__dirname, '.env');
/* INFO: Require dotenv package for retrieving and setting env-vars at runtime via absolute path (needed due to pkg) */
const dotenv = require('dotenv').config({
path: dotenvAbsolutePath
});
if (dotenv.error) {
console.log(`ERROR WHILE READING ENV-VARS:${dotenv.error}`);
throw dotenv.error;
}
module.exports = {
nodeEnv: process.env.NODE_ENV,
nodePort: process.env.NODE_PORT,
hostPostgresql: process.env.HOST_POSTGRESQL,
portPostgresql: process.env.PORT_POSTGRESQL,
dbPostgresql: process.env.DB_POSTGRESQL,
userPostgresql: process.env.USER_POSTGRESQL,
pwdPostgresql: process.env.PWD_POSTGRESQL,
};
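A pool module could then consume those exported values, for example (the file layout and the mapping below are just an assumption):
const { Pool } = require('pg');
// the module above that exports hostPostgresql, portPostgresql, etc.
const env = require('./env');

const pool = new Pool({
  host: env.hostPostgresql,
  port: env.portPostgresql,
  database: env.dbPostgresql,
  user: env.userPostgresql,
  password: env.pwdPostgresql,
});

module.exports = pool;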

MongoDB problem - connections accumulation

I have a problem with the approach I use to connect to MongoDB.
I use the following method:
import { Db, MongoClient } from "mongodb";
let cachedConnection: { client: MongoClient; db: Db } | null = null;
export async function connectToDatabase(mongoUri?: string, database?: string) {
if (!mongoUri) {
throw new Error(
"Please define the MONGO_URI environment variable inside .env.local"
);
}
if (!database) {
throw new Error(
"Please define the DATABASE environment variable inside .env.local"
);
}
if (cachedConnection) return cachedConnection;
cachedConnection = await MongoClient.connect(mongoUri, {
useNewUrlParser: true,
useUnifiedTopology: true,
}).then((client) => ({
client,
db: client.db(database),
}));
return cachedConnection!;
}
Everytime I need to connect to MongoDB I do as follows:
const { db } = await connectToDatabase(config.URI, config.USERS_DATABASE);
const myUniversity = await db
.collection(config.MY_COLLECTION)
.findOne({})
Everything seems OK, so what is the problem?
The problem is that the connections to my DB don't close after I use them. I thought that since my server is stateless the connections would end each time I used the DB, but that is not true: they stay alive, and after a few hours of using my app MongoDB Atlas sends me an email saying that the connection limit is exceeded.
As you can see in the screenshot, the chart keeps growing, which means connections stay open and accumulate. How do you think I can solve this problem?
Keep in mind that cachedConnection is only reused when the same connection is requested. If I call a different API than the first one, it creates another connection; it doesn't enter the if (cachedConnection) block but runs through to the end.
You can try this simple demo, which will allow you to use the same connection throughout the application in different modules. There are three modules: index.js is the starter program, dbaccess.js is where the connection is created and maintained so it can be reused, and apis.js is where you use the database connection to retrieve data.
index.js:
const express = require('express');
const mongo = require('./dbaccess');
const apis = require('./apis');
const app = express();
const init = async () => {
await mongo.connect();
app.listen(3000);
apis(app, mongo);
};
init();
dbaccess.js:
const { MongoClient } = require('mongodb');
class Mongo {
constructor() {
this.client = new MongoClient("mongodb://127.0.0.1:27017/", {
useNewUrlParser: true,
useUnifiedTopology: true
});
}
async connect() {
await this.client.connect();
console.log('Connected to MongoDB server.');
this.db = this.client.db('test');
console.log('Database:', this.db.databaseName);
}
}
module.exports = new Mongo();
apis.js:
module.exports = function(app, mongo) {
app.get('/', function(req, res) {
mongo.db.collection('users').find().limit(1).toArray(function(err, result) {
res.send('Doc: ' + JSON.stringify(result));
});
});
}
Change the appropriate values in the url, database name and collection name before trying.

nodejs MySQL - Server requests authentication using unknown plugin

When attempting to connect to a MySQL 8.0.21 server running on Ubuntu 20.04 using Node.js and the mysql2 package, I receive the common error below: Server requests authentication using unknown plugin sha256_password. I know that mysqljs and mysql2 do not support sha256_password, so I confirmed my user was set up for mysql_native_password:
ALTER USER 'userName'@'%' IDENTIFIED WITH mysql_native_password BY 'password';
I have also confirmed that default_authentication_plugin is set to mysql_native_password.
What makes this a strange issue is that it only occurs when unit testing the function with Mocha or Jest. When running the app normally, I can connect and make DB calls with no issues. To simplify troubleshooting, I created a new app.js file that only calls dbQuery.getRow(). The contents of those files and the output are given below.
app.js
(async function main () {
require('dotenv').config({ path: __dirname + '/config/.env' });
const dbQuery = require('./src/js/dbQuery');
let result = await dbQuery.getRow('table', 'c9024a7aead711eab20be6a68ff5219c');
console.log(result);
})();
dbQuery.js
const dbPool = require('./dbPool');
async function getRow(tableName, guid) {
try {
let sql = `
SELECT *
FROM \`${tableName}\`
WHERE guid='${guid}'`;
let [rows] = await dbPool.execute(sql);
return rows[0];
} catch (ex) {
console.log('dbQuery getRow failed with error: ' + ex);
return { error: true, message: ex };
}
}
dbPool.js
const { env } = require('process');
const mysql = require('mysql2/promise');
const dbPool = mysql.createPool({
host: env.DB_HOST,
port: env.DB_PORT,
database: env.DB_NAME,
user: env.DB_USER,
password: env.DB_PW,
// waitForConnections: env.WAIT_FOR_CONNECTIONS.toUpperCase() == 'TRUE' ? true : false,
connectTimeout: 10000,
connectionLimit: parseInt(env.CONNECTION_LIMIT),
queueLimit: parseInt(env.QUEUE_LIMIT)
});
module.exports = dbPool;
Terminal Output - Running the simplified app now returns the row as expected
node app.js
BinaryRow {
guid: 'c9024a7aead711eab20be6a68ff5219c',
name: 'spiffyRow',
displayValue: 'Spiffy Display Value'
}
However, when I attempt the same DB call in either Jest or Mocha, I run into the issue again: it appears mysql2 is attempting to use the wrong authentication plugin.
dbQuery.test.js - currently set up for Mocha, but Jest exposed the same issue
const dbQuery = require('../src/js/dbQuery');
describe('MySQL DB Operations', function () {
describe('#getRow()', function () {
it('Should return row with guid specified', async function (done) {
let result = await dbQuery.getRow('table', 'c9024a7aead711eab20be6a68ff5219c');
if (result.guid == 'c9024a7aead711eab20be6a68ff5219c') done();
else done(result.error);
});
});
});
Terminal Output
npm test
MySQL DB Operations
#getRow()
dbQuery getRow failed with error: Error: Server requests authentication using unknown plugin sha256_password. See TODO: add plugins doco here on how to configure or author authentication plugins.
1) Should return row with guid specified
0 passing (49ms)
1 failing
Thanks in advance for any help, please let me know if any additional information is needed.
When executing the tests, my env variables were not being populated. The fix was as simple as adding require('dotenv').config({ path: 'path/to/.env' }); to my test file. I was thrown off by the error message returned by MySQL. I'm still not sure why MySQL reports that sha256_password is requested when no credentials are provided, even though default_authentication_plugin is set to mysql_native_password, but once valid credentials were provided everything worked as expected.
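For reference, a minimal sketch of that fix in the Mocha test (the relative .env path is illustrative); the important part is loading the env vars before dbQuery/dbPool are required:
require('dotenv').config({ path: __dirname + '/../config/.env' }); // load env vars first
const dbQuery = require('../src/js/dbQuery');

describe('MySQL DB Operations', function () {
  describe('#getRow()', function () {
    it('Should return row with guid specified', async function () {
      const result = await dbQuery.getRow('table', 'c9024a7aead711eab20be6a68ff5219c');
      if (result.error) throw new Error(result.message);
    });
  });
});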

Sequelize with asynchronous configuration in nodejs

I have been bashing my head for days, as I cannot find a valid example of asynchronous configuration in Sequelize.
As you may know, you can simply configure a Sequelize instance like this:
const sequelize = new Sequelize('postgres://user:pass@example.com:5432/dbname')
and then declare your Model
const User = sequelize.define('User', {
// Model attributes are defined here
firstName: {
type: DataTypes.STRING,
allowNull: false
},
lastName: {
type: DataTypes.STRING
// allowNull defaults to true
}
}, {
// Other model options go here
});
However, what happens when the DB credentials come from an external service?
const credentials = await getDbCredentials();
const sequelize = new Sequelize({credentials})
Since Sequelize model creation is coupled to the instance creation (unlike in many other ORMs), this becomes a big problem.
My current solution is the following:
const Sequelize = require("sequelize");
// Models
const { User } = require("./User");
const env = process.env.NODE_ENV || "development";
const db = {};
let sequelize = null;
const initSequelize = async () => {
if (!sequelize) {
let configWithCredentials = {};
if (env === "production") {
const credentials = await getDbCredentials();
const { password, username, dbname, engine, host, port } = credentials;
configWithCredentials = {
username,
password,
database: dbname,
host,
port,
dialect: engine,
operatorsAliases: 0
};
}
const config = {
development: {
// Dev config
},
production: configWithCredentials,
};
sequelize = new Sequelize(config[env]);
sequelize.authenticate().then(() => {
console.log("db authenticated")
});
}
db.User = User;
db.sequelize = sequelize;
db.Sequelize = Sequelize;
};
initSequelize().then(() => {
console.log("done");
});
module.exports = db;
However, I feel that this is not a good approach because of the asynchronous nature of the initialization, and sometimes db is undefined.
Is there a better way to approach this?
Thanks
You can achieve this with beforeConnect hook, something like this:
sequelize = new Sequelize(config.database, '', '', config);
sequelize.beforeConnect(async (config) => {
config.username = await getSecretUsername();
config.password = await getSecretPassword();
});
Leave the initial credentials empty, then use beforeConnect to mutate the config. Not sure if this is the cleanest way to use it but seems to be working.
https://sequelize.org/master/manual/hooks.html
I think your db is sometimes undefined because in your async function you're not waiting for sequelize.authenticate() to resolve. Change this:
sequelize.authenticate().then(() => {
console.log("db authenticated")
});
To this:
await sequelize.authenticate()
console.log("db authenticated")
What was happening is that your initSequelize async function would resolve before the sequelize.authenticate promise did. This is a common pitfall in JS. I think this adjustment will solve your problem. Regarding "the best approach", I don't see much that can be done here, but of course I don't have the entire picture.
I found a 'pure' Sequelize way to do this through lifecycle hooks.
Basically, a generic setup in a db.js file would look like this:
const { Sequelize } = require('sequelize');
const asyncFetch = require('../util/async-fetch');
const sequelize = new Sequelize({
dialect: 'mysql',
database: 'db_name',
host: '127.0.0.1'
});
sequelize.beforeConnect(async (config) => {
const [username, password] = await Promise.all([
asyncFetch('username'),
asyncFetch('password')
]);
config.username = username;
config.password = password;
});
module.exports = sequelize;
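A minimal usage sketch against that db.js (the User model here is an assumption): because the credentials are only fetched inside beforeConnect, models can be defined immediately on the exported instance.
const { DataTypes } = require('sequelize');
const sequelize = require('./db');

// defining the model does not connect yet, so no credentials are needed here
const User = sequelize.define('User', {
  firstName: { type: DataTypes.STRING, allowNull: false },
});

async function main() {
  await sequelize.authenticate(); // triggers beforeConnect, which fetches the credentials
  console.log(await User.count());
}

main();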
The Sequelize model definition is really just a plain object, so it can be set early. Model initialisation, however, does require a sequelize instance to be passed in.
The setup was clearer to me when using ES6 class definitions for the models. sequelize.define is replaced with a call to Model.init, and this can all be done in an async setup function.
const Sequelize = require('sequelize')
const { Model } = Sequelize
class User extends Model {
static get modelFields(){
return {
id: {
type: Sequelize.UUID,
primaryKey: true,
defaultValue: Sequelize.UUIDV4,
},
name: {
type: Sequelize.STRING,
allowNull: false,
unique: true,
}
}
}
static get modelOptions(){
return {
version: true,
}
}
static init(sequelize){
const options = { ...this.modelOptions, sequelize }
return super.init(this.modelFields, options)
}
static associate(models) {
this.hasMany(models.Task)
}
}
module.exports = User
const Sequelize = require('sequelize')
const User = require('./User')

class Database {
  static async setup(){
    // getCredentials() is the async credential fetch mentioned in the question
    const credentials = await getCredentials()
    this.sequelize = new Sequelize(credentials)
    User.init(this.sequelize)
    this.User = User
    // When you have multiple models to associate add:
    this.User.associate(this)
  }
}

module.exports = Database
Due to the async credentials requirement, the rest of your app will just need to cope with the delay until the DB is set up. If this is a Koa/Express app, for example, you could delay the server's .listen() call until the Database.setup() promise has resolved.
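For instance, with Express that delay could look like this (a sketch; the port and file names are assumptions):
const express = require('express');
const Database = require('./Database');

async function start() {
  await Database.setup(); // fetch credentials and initialise the models first
  const app = express();
  // ... mount routes that use Database.User, etc.
  app.listen(3000, () => console.log('listening on 3000'));
}

start();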
As this would have changed a lot of my code, I ended up creating a script in Go that gets my credentials "asynchronously" before running my server.
I used some code from this package: https://github.com/telia-oss/aws-env
I then pass my starting script as a command argument, so it "inherits" the environment variables:
./getEnv exec -- node index.js

OpenRecord: TypeError: User.create is not a function

ORM OpenRecord
I'm getting an error ( TypeError: User.create is not a function ) when I do the following:
await User.create(req.body.data)
When I log User I get this: [Function: User]
Database config:
//config/database/openRecord.js
/** more code above */
let store = new Store({
database,
user,
password,
host,
autoConnect: true,
autoAttributes: true,
models: require("../../app/models/models")
});
Models:
//app/models/models.js
module.exports = [
require("./user")
]
Model:
//app/models/user.js
const Store = require('openrecord/store/mysql');
class User extends Store.BaseModel {
static definition(){
this.validatePresenceOf(
'first_name', 'last_name', 'email', 'password'
);
this.validatesConfirmationOf('password');
this.validateFormatOf('email', 'email');
}
fullName(){
return `${this.first_name} ${this.last_name}`
}
}
module.exports = User;
Why is it throwing an error when I've clearly defined my object the correct way?
openrecord github link
openrecord website documentation
Two small typos: validatePresenceOf should be validatesPresenceOf, and validateFormatOf should be validatesFormatOf.
However, this is not the error you are experiencing!
I'm pretty sure you forgot to wait until the store is ready.
If you use e.g. Express, I recommend starting to listen only after the store has loaded:
const express = require('express')
const store = require('./store')
const app = express()
async function start() {
// add middleware ...
await store.ready()
// start express server
app.listen()
}
start()
You have not exported the model. For example, with Mongoose we would write:
module.exports = mongoose.model("user", userobj)
This is what is missing.
