Native MongoDB Driver Node.js - should we store collections? - javascript

I haven't been able to find any documentation outlining whether it's a good or bad idea to keep a reference to a collection so that it can be re-used after the db connection has been established.
For instance, our database.ts is called during server start-up. It grabs the collections and stores them in the module, where they can then be referenced throughout the project.
Example of storing the collections:
/* database.ts */
// Module to keep one connection alive and give us a single instance of our collections.
import { MongoClient, Db, Collection } from 'mongodb';

let uri: string = 'connection_string';

export let db: Db;
export let usrCollection: Collection;
export let bookCollection: Collection;

export function initDb(cb: any) {
    MongoClient.connect(uri, function(err, database) {
        // handle err
        db = database;
        cb(); // returns now, since db is assigned
    });
}

export function initCollections() {
    usrCollection = db.collection('users'); // non-async
    bookCollection = db.collection('book'); // non-async
}
/* app.ts */
// In our express app, when it starts up:
import { db, initCollections, initDb } from './database';

initDb(function() {
    // This will hold up our server from starting, BUT
    // there is no reason to make it async.
    initCollections();
});
What are some of the shortcomings of this pattern? What can I improve, and what should I avoid, specifically regarding performance hits from managing the collections? Would this pattern keep my source code bug-free, or at least make bugs easy to find, and stay extensible in the future? Or is there an even better pattern? Please, no external libraries (such as mongoose) aside from the native MongoDB driver for Node.js.

IMO, this is not good practice. Users of database.ts will have to make sure that they have initialized these collections before using them. I would not even export any internal collections, since users should never care how the database is structured. You should export operation functions instead, like function addUser(user: User) { /* ... */ }. There is a neat code sample created by MS.
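For illustration, a minimal sketch of that operation-function approach (plain JavaScript; getDb, addUser, and findUserByName are hypothetical names, not from the original post):
// users.js -- exposes operations; callers never touch the underlying collection
const { getDb } = require('./database'); // hypothetical helper that resolves to a connected Db

async function addUser(user) {
    const db = await getDb();
    return db.collection('users').insertOne(user);
}

async function findUserByName(name) {
    const db = await getDb();
    return db.collection('users').findOne({ name });
}

module.exports = { addUser, findUserByName };
Callers then depend on what the module does, not on how the database is laid out.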
// Edit. Yes, you can store collection references internally so that you don't need to type the collection names every time; it's easy to introduce bugs by mixing up singular and plural. The best practice would be:
import mongodb = require('mongodb');

const server = new mongodb.Server('localhost', 27017, { auto_reconnect: true });
const db = new mongodb.Db('mydb', server, { w: 1 });
db.open(function(err) {
    // handle err
});
const userCollection = db.collection("users");
const bookCollection = db.collection("book");

Related

Is it okay to declare the db instance as a global variable so it is accessible at all times, without needing to use a "require" statement in Node.js?

I am a beginner in Node.js.
I am developing a program and I want to use the database model everywhere, something like wpdb in WordPress.
Is the best way to create it as a global variable, or to use the require statement as needed?
Please help me find the best answer.
Thanks
No, this is not the best way to handle a db connection. You only want the db open for as long as necessary and no longer. Usually this means opening the db connection at the place in your code that knows how long it needs to stay open, and then closing the connection after using it.
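A minimal sketch of that open-use-close pattern (assuming a current mongodb driver; withDb and the db name are illustrative, not from the original answer):
const { MongoClient } = require('mongodb');

// Open the connection, run the work, and always close, even on errors.
async function withDb(uri, work) {
    const client = await MongoClient.connect(uri);
    try {
        return await work(client.db('myFancyDb'));
    } finally {
        await client.close();
    }
}

// usage: const users = await withDb(uri, db => db.collection('users').find().toArray());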
If you are simply referring to the configuration for opening a DB connection, then you could define that configuration object at application start and then pass it as a parameter to your db instantiation code.
import myUserDbCode from '../myUserDbFile';
import myUserMessagesCode from '../myUserMsgsDbFile';

const dbConfig = {
    dbname: 'myFancyDb',
    serverName: 'myDbServerName',
    connectionTimeout: 60
};

(async () => {
    const userList = await myUserDbCode(dbConfig)
        .then((dbResultSet) => {
            return dbResultSet.data;
        });
    const myUser = userList[42];
    const userMessages = await myUserMessagesCode(dbConfig, myUser)
        .then((dbResultSet) => {
            return dbResultSet.data;
        });
    console.log(userMessages);
})();
I recommend following some DB tutorials that show how to build a complete application to see some patterns about how to handle passing configuration options to code and how to manage DB connections.

Reusing Database Connections With Azure Functions Using Javascript

I cannot find clear information on how to manage database connections (MongoDB in my case) from an Azure function written in Javascript.
The Microsoft document below says not to create a connection for each invocation of the function; in C#, this is done with static variables using the .NET Framework Data Provider for SQL Server, with pooling handled by the client connection. It does not describe how to do this in JavaScript.
https://learn.microsoft.com/en-us/azure/azure-functions/manage-connections
A solution of creating a global variable to hold the database client between invocations is described here, but the author is not confident it is the correct way to do it.
http://thecodebarbarian.com/getting-started-with-azure-functions-and-mongodb.html
Has anyone used this in production or understand if this is the correct approach?
Yes, there's a very close equivalence between C#/SQL storing a single SqlConnection instance in a static variable and JS/MongoDB storing a single Db instance in a global variable. The basic pattern for JS/MongoDB in Azure Functions is (assuming you're up to date with async/await; alternatively you can use callbacks, as per your linked article):
// getDb.js
const { MongoClient } = require('mongodb');

const uri = process.env.MONGODB_URI; // connection string, e.g. from app settings

let dbInstance;

module.exports = async function() {
    if (!dbInstance) {
        dbInstance = await MongoClient.connect(uri);
    }
    return dbInstance;
};
// function.js
const getDb = require('./getDb.js');

module.exports = async function(context, trigger) {
    let db = await getDb();
    // ... do stuff with db ...
};
This means you only instantiate one Db object per host instance. Note this isn't one per Function App: if you're using a dedicated App Service Plan, there will be the number of instances you've specified in the plan, and if you're using a Consumption Plan, it'll vary depending on how busy your app is.
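One refinement (my addition, not from the original answer): caching the connection promise instead of the resolved instance avoids opening duplicate connections when several invocations arrive while the first connect is still in flight:
// getDb.js -- variation that caches the promise itself
const { MongoClient } = require('mongodb');

let dbPromise;

module.exports = function getDb() {
    if (!dbPromise) {
        // Concurrent cold-start invocations all await the same promise,
        // so only one connection attempt is ever made.
        dbPromise = MongoClient.connect(process.env.MONGODB_URI);
    }
    return dbPromise;
};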

MongoDB in node.js pattern

I've been reading and searching about a good design related to the mongodb driver.
As was said here, there is no need to open/close the connection, so I was trying to avoid boilerplate code using a dbLoader.js file like this (only using one database):
const { MongoClient } = require('mongodb')
const logger = require('./logger')
const configLoader = require('./configLoader')

module.exports = (async () => {
    const config = await configLoader()
    const client = await MongoClient.connect(config.db.uri)
    const db = client.db(config.db.dbName)
    logger.log('info', `Database ${config.db.dbName} loaded`)
    return db
})()
I wonder if there is some other approach that is better, and why. For example attaching it to app.locals (in express.js).
I'm not using mongoose and don't want to.
Consider using IoC containers such as https://github.com/inversify/InversifyJS or similar. For starters, you get the ability to mock a defined dependency (such as the database connection) for testing purposes, without hacky solutions such as manual module mocking.
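The core idea can also be sketched without any library, as plain factory injection (the names here are illustrative, not from the original answer): the module receives its db instead of requiring it, so a test can hand in a fake.
// userRepo.js -- receives the db rather than loading it itself
module.exports = function makeUserRepo(db) {
    return {
        findById: (id) => db.collection('users').findOne({ _id: id })
    };
};

// production wiring:
//   const db = await require('./dbLoader')
//   const userRepo = makeUserRepo(db)
// in a test:
//   const userRepo = makeUserRepo(fakeDb)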

javascript node.js variable share among modules

I'm a newbie looking for a solution for sharing a variable among files (modules) by reference. For example, here is my app.js:
const users = {},
    waitingQueue = []

module.exports = function(io){
    io.on('connection', (socket) => {
        users[socket.id] = socket
        socket.on("Search", (d, ack) => {
            waitingQueue.push(socket)
            // logic with waitingQueue
            // logic with users {}
        })
        socket.on("otherEvent", (d, ack) => {
            // logic with waitingQueue
            // logic with users {}
        })
    })
}
Now I'd like to divide it into modules. The new app.js:
const users = {},
    waitingQueue = []
const Search = require('./search')

module.exports = function(io){
    io.on('connection', (socket) => {
        users[socket.id] = socket
        socket.on("Search", Search(users, waitingQueue))
    })
}
Now my trouble: the second argument to
socket.on("Search", ...)
should be a function. If I use a closure,
socket.on("Search", () => ...)
and export the modified { users, waitingQueue } from Search.js, that works, but then I need to overwrite the users & waitingQueue variables, and that is not working.
I need to share these variables with other modules too, even if tightly coupled.
I also tried an event-emitter-based approach and Object.observe(), but was unable to fix the problem.
Can anyone help me fix this?
I would use Redis to store the shared values.
You can use the redis client to get/set the values, which are stored in the redis database.
Those values can then be get/set from any code in any process on the same local machine (as the db) or on a set of remote machines (that's why I mention scaling later on).
Redis can be benchmarked, and in my experience, compared to other db solutions, it is amazingly fast.
It's quite simple to use: An introduction to Redis data types and abstractions
Binary-safe strings.
Lists: collections of string elements sorted according to the order of insertion.
Sets: collections of unique, unsorted string elements.
Sorted sets: similar to Sets.
Hashes: maps composed of fields associated with values; both the field and the value are strings. This is very similar to Ruby or Python hashes.
...and some more...
I suggest you install the database server and client and try to get and set some values; I'm sure it will be useful knowledge for the future.
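For instance, a minimal sketch with the node redis client (assuming the v4 API; the key names are made up):
const { createClient } = require('redis');

(async () => {
    const client = createClient(); // defaults to localhost:6379
    await client.connect();

    // Keep the waiting queue as a Redis list, visible to every process.
    await client.rPush('waitingQueue', 'socket-id-123');
    const queued = await client.lRange('waitingQueue', 0, -1);
    console.log(queued); // ['socket-id-123']

    await client.quit();
})();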
Extra reasons:
It will give you even more power in your arsenal.
It's really good for scaling purposes (both vertical & horizontal scaling).
socket.io-redis (GitHub)
By running socket.io with the socket.io-redis adapter you can run multiple socket.io instances in different processes or servers that can all broadcast and emit events to and from each other.
socket.io-emitter (GitHub)
If you need to emit events to socket.io instances from a non-socket.io process.
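Wiring up the adapter is a one-liner (a sketch, assuming the classic socket.io-redis package and a local Redis on the default port):
const io = require('socket.io')(3000);
const redisAdapter = require('socket.io-redis');

// All socket.io instances pointed at the same Redis can now
// broadcast and emit to each other.
io.adapter(redisAdapter({ host: 'localhost', port: 6379 }));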
EDIT: I know you want to share variables between your code, but since I saw you are using socket.io, you might also want to share those variables across your slaves...
Socket.IO : passing events between nodes
Well, my presumption is that in a real-world application, users & waitingQueue would come from some sort of persistent storage, e.g. a database, so the notion of "sharing" data across modules would be a non-issue, as you'd most likely fetch directly from the DB.
However, if I did have in-memory "global" data that I wanted to share across different modules, then I'd most likely move it into its own module, e.g.
data.js
module.exports = {
    users: {},
    waitingQueue: []
}
app.js
const Search = require('../search');
const data = require('./data');

module.exports = io => {
    io.on('connection', (socket) => {
        data.users[socket.id] = socket;
        socket.on('Search', Search(data.users, data.waitingQueue));
    });
}
As its own module, data.js can then be shared across various other modules.

Koa: What is the most sensible way to connect to a database without an existing wrapper?

I am very new to node & koa, so please excuse my stupidity.
I am not sure if I am messing something up, but I would like to use Koa together with OrientDB. I can connect to OrientDB using Oriento (the module for Node), and I would like to use the power of Koa's generators.
Since the data in my OrientDB database relates to objects I'm using in my app, I would like to implement models (of course). So I guess the database-connection part would go into that.
Say I had a model named "Task"; then I would like it to expose a couple of methods and getters/setters. So Task.find() should get all Tasks from the OrientDB database.
As far as I understand it, I would hook that somewhere into the middleware stack. And it would be nice if I could use generators so that my middleware waits until it gets the data back, using yield. Some error handling would be good as well...
With all that said:
Are my assumptions correct? Or is there a better way?
Do I have to do all that myself? Or am I missing modules that facilitate what I am planning?
What would be a good point to start learning on how to properly do something like that?
Should I just look at existing wrappers for mongodb/mysql/whatever and abstract from that?
Thanks!
I've never used OrientDB, but looking at the GitHub page, it looks like it offers a connection pool and returns promises. Based on that, I would do something like this:
// in utils/oriento.js or similar
// Instantiate it once and share it in multiple places (models),
// since it offers connection pooling.
var Oriento = require('oriento');

var server = Oriento({
    host: 'localhost',
    port: 2424,
    username: 'root',
    password: 'yourpassword'
});

var db = server.use({
    name: 'mydb',
    username: 'admin',
    password: 'admin'
});

module.exports = db;
then for models:
// models/item.js
var db = require('../utils/oriento.js');

var Item = function(props){
};

Item.findById = function(id){
    // db.query returns a promise
    return db.query('select from OItem where id=:id', {
        params: {
            id: id
        },
        limit: 1
    });
};

// add getters, setters, find etc...
module.exports = Item;
then in controllers:
// server.js or wherever
var Item = require('./models/item');

// assumes an express-style router for koa (e.g. koa-route) that
// accepts generator handlers
app.get('/:id', function *(){
    var id = this.params.id;
    this.body = yield Item.findById(id);
});
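The question also asked about error handling; with koa 1.x generators, a plain try/catch around the yield catches a rejected promise (a sketch, my addition rather than part of the original answer):
app.get('/:id', function *(){
    try {
        this.body = yield Item.findById(this.params.id);
    } catch (err) {
        // a rejected promise is rethrown at the yield, so it lands here
        this.status = 500;
        this.body = { error: err.message };
    }
});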
Hopefully this helps
You might want to check out my screencasts on koajs at http://knowthen.com
