I've been reading and searching for a good design for working with the MongoDB driver.
As was said here, there is no need to open/close the connection on every request, so I was trying to avoid boilerplate code by using a dbLoader.js file like this (only one database is used):
const { MongoClient } = require('mongodb')
const logger = require('./logger')
const configLoader = require('./configLoader')
module.exports = (async () => {
  const config = await configLoader()
  const client = await MongoClient.connect(config.db.uri)
  const db = client.db(config.db.dbName)
  logger.log('info', `Database ${config.db.dbName} loaded`)
  return db
})()
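Other modules then consume the exported promise, for example (a minimal sketch; getUsers and the users collection are just placeholders):
// userService.js (hypothetical consumer of dbLoader.js)
const dbPromise = require('./dbLoader')

async function getUsers() {
  const db = await dbPromise // the IIFE runs once; every caller awaits the same cached connection
  return db.collection('users').find({}).toArray()
}

module.exports = { getUsers }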
I wonder if there is some other approach that is better, and why, for example attaching the db to app.locals (in Express.js), as sketched below.
I'm not using Mongoose and don't want to.
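A minimal sketch of that app.locals variant, assuming an Express app that consumes the dbLoader above (server.js and the /users route are made up for illustration):
// server.js (hypothetical entry point)
const express = require('express')
const dbPromise = require('./dbLoader')

dbPromise.then((db) => {
  const app = express()
  app.locals.db = db // route handlers reach it via req.app.locals.db

  app.get('/users', async (req, res) => {
    const users = await req.app.locals.db.collection('users').find({}).toArray()
    res.json(users)
  })

  app.listen(3000)
})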
Consider using an IoC container such as https://github.com/inversify/InversifyJS or similar. For starters, you get the ability to mock a defined dependency (such as the database connection) for testing purposes without hacky solutions such as manual module mocking.
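A minimal sketch of the idea with InversifyJS, without decorators (the TYPES identifier, the connection string, and fakeDb are illustrative, not part of any existing module):
require('reflect-metadata') // metadata polyfill that InversifyJS expects to be loaded once
const { Container } = require('inversify')
const { MongoClient } = require('mongodb')

const TYPES = { Db: Symbol.for('Db') }

const container = new Container()
container
  .bind(TYPES.Db)
  .toDynamicValue(async () => {
    const client = await MongoClient.connect('mongodb://localhost:27017')
    return client.db('myDb')
  })
  .inSingletonScope()

// application code asks the container for the db (inside any async function):
// const db = await container.get(TYPES.Db)

// a test rebinds the same identifier to a stub instead of touching the real database:
// container.rebind(TYPES.Db).toConstantValue(fakeDb)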
I am a beginner in Node.js.
I am developing a program and I want to use the database model throughout the application, for example something like WordPress's wpdb.
Is it best to create it as a global variable, or to require it as needed?
Please help me find the best approach.
Thanks.
No, this is not the best way to handle a DB connection. You only want the DB open for as long as necessary and no longer. Usually this means opening the connection at the place in your code that knows how long it needs to stay open, and closing it right after use.
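For example, a minimal sketch of that open/use/close flow (using the MongoDB driver for illustration; the URI, database, and collection names are placeholders):
const { MongoClient } = require('mongodb');

async function findUser(uri, id) {
  const client = await MongoClient.connect(uri); // open only when needed
  try {
    return await client.db('myFancyDb').collection('users').findOne({ _id: id });
  } finally {
    await client.close(); // and close as soon as the work is done
  }
}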
If you are simply referring to the configuration for opening a DB connection, then you could define that configuration object at application start and then pass it as a parameter to your db instantiation code.
import myUserDbCode from '../myUserDbFile';
import myUserMessagesCode from '../myUserMsgsDbFile';

const dbConfig = {
  dbname: 'myFancyDb',
  serverName: 'myDbServerName',
  connectionTimeout: 60
};

(async () => {
  const userList = await myUserDbCode(dbConfig)
    .then((dbResultSet) => {
      return dbResultSet.data;
    });
  const myUser = userList[42];
  const userMessages = await myUserMessagesCode(dbConfig, myUser)
    .then((dbResultSet) => {
      return dbResultSet.data;
    });
  console.log(userMessages);
})();
I recommend following some DB tutorials that build a complete application; they show common patterns for passing configuration options around and for managing DB connections.
I've started to develop a desktop app with Node and Electron. It has a package which implements a connection to some API. It is structured as one base class and some derived classes, in this way:
ApiBase
  ApiAuth extends ApiBase
    ApiAuth.login()
    ApiAuth.logout()
    etc...
  ApiTasks extends ApiBase
    ApiTasks.getTaskList()
    etc...
  etc...
Now I want to make a nice and convenient way to use these classes in my app, so I need to create some entry point which will provide access to my API implementation. But I do not have much experience with getting this right.
I thought about something like this:
index.js:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');
const apiAuth = new ApiAuth('www.sample-host.com');
const apiTasks = new ApiTasks('www.sample-host.com');

module.exports = {
  login: apiAuth.login,
  logout: apiAuth.logout,
  getTaskList: apiTasks.getTaskList,
  etc...
}
somewhere in the app:
const api = require("./lib/someApi");
// need to get task list for some reason
api.getTaskList(param1, param2)
But there are some problems with this approach that I've run into:
it is a problem to pass the host param to the constructors in index.js dynamically
I am not sure if creating these instances every time index.js is required is the right thing
So I want to know about some approaches I can use here, because I don't even know where to start researching. Thank you.
I think that you identified some of the most crucial decisions with this:
it is a problem to pass the host param to the constructors in index.js dynamically
IMO configuration and the interface are important considerations. Even though it can be refactored after the fact, an easy-to-configure and easy-to-consume interface will help drive adoption of your library. As you pointed out, the configuration is static right now and very brittle: a change to the URL will cascade to all clients and require them all to update.
A first intuitive alternative may be to allow dynamic configuration of the current structure:
apiAuth = new ApiAuth(process.env.API_AUTH_URL || 'www.sample-host.com');
apiTasks = new ApiTasks(process.env.API_TASKS_URL || 'www.sample-host.com');
While this allows the client to dynamically configure the URL, the configuration is "implicit". IMO this is unintuitive and difficult to document. It also requires a client to read the code to discover the environment variables and the instantiation flow.
I would favor exposing these classes to the client directly. I would consider this approach "explicit" as it forces the client to explicitly configure/instantiate your components. I think it's like providing your clients with primitives and allowing them to compose, build, and configure them in whatever way they want:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');
module.exports = {
  auth: ApiAuth,
  tasks: ApiTasks
}
This automatically namespaces the API behind its functions (auth|tasks) AND requires the client to instantiate the classes before using them:
const api = require("./lib/someApi");
const auth = new api.auth(process.env.SOMETHING, 'some-url');
This pulls the configuration further out in the architecture. It forces the client to decide how it wants to get the URL and explicitly instantiate the library. What if one of your clients doesn't use login/logout? This may be more flexible in that case.
I am not sure if creating these instances every time index.js is required is the right thing
If instantiation should remain hidden, another alternative would be to provide a builder function in order to encapsulate it:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');

module.exports = {
  auth: {
    build: (url) => {
      return new ApiAuth(url);
    }
  },
  tasks: {
    build: (url) => {
      return new ApiTasks(url);
    }
  }
}
This still hides each class while allowing the client to decide how to configure each one:
const api = require("./lib/someApi");
const auth = api.auth.build('my-url');
auth.login();
I am trying to make a small REST API with API gateway, lambda, and DynamoDB, while following good development practices such as TDD. I'm used to being able to use a DI container to provision my objects, which lends itself perfectly for mocking and testing. In an MVC framework, there would be a single entry point, where I could define my container configuration, bootstrap the application, and invoke the controller to handle the event. I could test the controller independently of the rest of the application, and inject mocked dependencies. I can't figure out how to decouple the dependencies a lambda function may have from the lambda function itself. For example:
const { DynamoDB } = require('aws-sdk')
const { UserRepo } = require('../lib/user-repo')
const client = new DynamoDB({ region: process.env.REGION }) // Should be resolved by DI container
const userRepo = new UserRepo(client) // Should be resolved by DI container
exports.handler = async (event) => {
  return userRepo.get(event.id)
}
Can anyone please point me in the right direction for structuring lambda code so it can be unit tested properly?
One way we've approached this in the project I'm currently working on is splitting out the requirements, so the handler is responsible for:
Creating the clients;
Extracting any config from the environment; and
Getting the parameters from the event.
Then it calls another function that does most of the work, and which we can test in isolation. Think of the handler like a controller, and the other function like the service that does the work.
In your specific case, that might look like:
const { DynamoDB } = require('aws-sdk');
const { UserRepo } = require('../lib/user-repo');
const doTheWork = (repo, id) => repo.get(id);
exports.handler = async (event) => {
  const client = new DynamoDB({ region: process.env.REGION });
  const userRepo = new UserRepo(client);
  return doTheWork(userRepo, event.id);
};
doTheWork can now be exercised at the unit level using test doubles for the repo object and whatever inputs you want. The UserRepo is already decoupled by constructor injection of the Dynamo client, so that should be pretty testable too.
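For instance, a minimal sketch of such a unit test, assuming doTheWork is also exported from the handler module and a Jest-style runner (the path and the fake data are made up):
const { doTheWork } = require('../src/handler'); // hypothetical path

test('doTheWork fetches the user by id', async () => {
  const fakeRepo = { get: jest.fn().mockResolvedValue({ id: '42', name: 'Ada' }) };

  const result = await doTheWork(fakeRepo, '42');

  expect(fakeRepo.get).toHaveBeenCalledWith('42');
  expect(result).toEqual({ id: '42', name: 'Ada' });
});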
We also have tests at the integration level that only mock out the AWS SDK stuff (you could alternatively use transport layer mocking or something like aws-sdk-mock) plus E2E testing that ensures the whole system works together.
I'm developing a Node.js ORM library based on Knex, similar to Bookshelf, for use in other personal projects.
Some components of my library require an initialised instance of Knex, so I wrapped them in an object that receives the Knex instance in its constructor, using a wrapper function to inject the Knex object so the user doesn't have to pass it in every time they use the library. I tried to do it similarly to how Knex and Bookshelf do it, but I found that code hard to read; besides, I use ES6 classes, so it's not quite the same.
This is my current code:
const _Entity = require('./Entity.js');
class ORM {
  constructor(knex) {
    // wrapper for the exposed class with the Knex dependency;
    // knex is the first argument of Entity's constructor
    this.Entity = function(...args) {
      return new _Entity(knex, ...args);
    };
    // exposed class without the Knex dependency
    this.Field = require('./Field.js');
  }
}

function init(knex) {
  return new ORM(knex);
}

module.exports = init;
The idea is that the user can use it something like this:
const ORM = require('orm')(knex);
const Entity = ORM.Entity;
const Field = ORM.Field;
const User = new Entity('user', [
  new Field.Id(),
  new Field.Text('name'),
  // define columns...
]);
let user = User.get({id: 5});
It bothers me that Entity is only indirectly exposed, and the code looks odd to me. Is there a more elegant or more "standard" way to expose components with dependencies?
Just use a regular function:
const _Entity = require('./Entity.js');
const Field = require('./Field.js');
module.exports = function init(knex) {
  return {
    Field,
    Entity: _Entity.bind(_Entity, knex)
  };
};
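Bound constructors still work with new: the bound knex argument is simply prepended to whatever the caller passes, so the usage from the question stays the same (a sketch, assuming the same Entity/Field API):
const { Entity, Field } = require('orm')(knex);

// knex is already bound; only the remaining constructor arguments are needed
const User = new Entity('user', [
  new Field.Id(),
  new Field.Text('name')
]);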
I haven't been able to find any documentation on whether it's a good or bad idea to keep a reference to a collection so that it can be re-used after the DB connection has been established.
For instance, our database.ts is called during our server start-up; it grabs the collections and stores them in the module, which can then be referenced throughout the project.
Example of storing the collections:
/* database.ts */
// module to keep 1 connection alive and allow us to have 1 instance of our collections.
import { MongoClient, Db, Collection } from 'mongodb';

let uri: string = 'connection_string';

export let db: Db;
export let usrCollection: Collection;
export let bookCollection: Collection;

export function initDb(cb: any) {
  MongoClient.connect(uri, function(err, database) {
    // handle err
    db = database;
    cb(); // returns now since db is assigned.
  });
}

export function initCollections() {
  usrCollection = db.collection('users'); // non-async
  bookCollection = db.collection('book'); // non-async
}

/* app.ts */
// in our express app when it starts up
import { db, initCollections, initDb } from './database';

initDb(function() {
  // this will hold up our server from starting, BUT
  // there is no reason to make it async.
  initCollections();
});
What are some of the shortcomings of this pattern? What can I improve, and what should I avoid for performance hits, specifically with managing the collections? Would this pattern keep my source code bug-free, or at least make bugs easy to find, and is it extensible for the future? Or is there an even better pattern? Please, no solutions using external libraries such as Mongoose, just the native MongoDB driver for Node.js.
IMO, this is not good practice. Users of database.ts will have to make sure that they have initialized these collections before using them. I would not even export any internal collections, since users should never care how the database is structured. You should export operation functions such as function addUser(user: User) { /* ... */ }. There is a neat code sample created by MS.
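A rough sketch of what that could look like, building on the initDb/initCollections code above (addUser and findUser are just illustrative names, written in plain JS for brevity):
// inside database.ts, instead of exporting the collections themselves
export function addUser(user) {
  return usrCollection.insertOne(user);
}

export function findUser(id) {
  return usrCollection.findOne({ _id: id });
}

// callers then only see operations, never the collections:
// import { addUser } from './database';
// await addUser({ name: 'Ada' });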
Edit: Yes, you can store collection references internally so that you don't need to type collection names every time; it's easy to introduce bugs by mixing up singular and plural names. The best practice would be:
import mongodb = require('mongodb');

const server = new mongodb.Server('localhost', 27017, { auto_reconnect: true });
const db = new mongodb.Db('mydb', server, { w: 1 });
db.open(function() {});

const userCollection = db.collection("users");
const bookCollection = db.collection("book");