Reusing Database Connections With Azure Functions Using JavaScript

I cannot find clear information on how to manage database connections (MongoDB in my case) from an Azure function written in JavaScript.
The Microsoft document below says not to create a connection for each invocation of the function; in C#, using the .NET Framework Data Provider for SQL Server, you hold the connection in a static variable and pooling is handled by the client connection. It does not describe how to do this in JavaScript.
https://learn.microsoft.com/en-us/azure/azure-functions/manage-connections
A solution of creating a global variable to hold the database client between invocations is described here but the author is not confident this is the correct way to do it.
http://thecodebarbarian.com/getting-started-with-azure-functions-and-mongodb.html
Has anyone used this in production or understand if this is the correct approach?

Yes, there's a very close equivalence between C#/SQL storing a single SqlConnection instance in a static variable and JS/MongoDB storing a single Db instance in a global variable. The basic pattern for JS/MongoDB in Azure Functions is (assuming you're up to date for async/await - alternatively you can use callbacks as per your linked article):
// getDb.js
const { MongoClient } = require('mongodb');

const uri = process.env.MONGODB_URI; // your connection string

let dbInstance;
module.exports = async function() {
  if (!dbInstance) {
    // Cached across invocations on the same host instance.
    dbInstance = await MongoClient.connect(uri);
  }
  return dbInstance;
};
// function.js
const getDb = require('./getDb.js');

module.exports = async function(context, trigger) {
  let db = await getDb();
  // ... do stuff with db ...
};
This will mean you only instantiate one Db object per host instance. Note this isn't one per Function App - if you're using a dedicated App Service Plan then there will be the number of instances you've specified in the plan, and if you're using a Consumption Plan then it'll vary depending on how busy your app is.
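If you're on driver version 3.x or later, MongoClient.connect resolves to a MongoClient rather than a Db, so the cached value changes slightly. A minimal sketch of the same pattern under that assumption (MONGODB_URI and the database name are placeholders):
// getDb.js (3.x+ driver variant)
const { MongoClient } = require('mongodb');

const uri = process.env.MONGODB_URI; // placeholder connection string
let clientPromise;

module.exports = async function getDb() {
  if (!clientPromise) {
    // Cache the promise, not the client, so concurrent invocations
    // during a cold start share a single connect call.
    clientPromise = MongoClient.connect(uri);
  }
  const client = await clientPromise;
  return client.db('mydb'); // placeholder database name
};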

Related

Is it okay to declare the db instance as a global variable so it is accessible at all times, without needing a "require" statement in Node.js?

I am a beginner in Node.js.
I am developing a program and I want to use the database model everywhere, something like WordPress's "wpdb".
Is it best to create it as a global variable, or to use a require statement as needed?
Please help me find the best answer.
Thanks
No, this is not the best way to handle a db connection. You only want the db connection open for as long as necessary and no longer. Usually this means opening the db connection at the place in your code that knows how long it needs to stay open, and closing the connection after using it.
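For illustration, a minimal sketch of that open-use-close scoping with the MongoDB driver (the connection string, database, and collection names are placeholders):
const { MongoClient } = require('mongodb');

async function countUsers() {
  // Open the connection only for the duration of the work...
  const client = await MongoClient.connect('mongodb://localhost:27017');
  try {
    return await client.db('mydb').collection('users').countDocuments();
  } finally {
    // ...and close it as soon as the work is done.
    await client.close();
  }
}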
If you are simply referring to the configuration for opening a DB connection, then you could define that configuration object at application start and then pass it as a parameter to your db instantiation code.
import myUserDbCode from '../myUserDbFile';
import myUserMessagesCode from '../myUserMsgsDbFile';

const dbConfig = {
  dbname: 'myFancyDb',
  serverName: 'myDbServerName',
  connectionTimeout: 60
}; // the semicolon matters: without it the IIFE below parses as a call on the object

(async () => {
  const userList = await myUserDbCode(dbConfig)
    .then((dbResultSet) => dbResultSet.data);
  const myUser = userList[42];
  const userMessages = await myUserMessagesCode(dbConfig, myUser)
    .then((dbResultSet) => dbResultSet.data);
  console.log(userMessages);
})();
I recommend following some DB tutorials that show how to build a complete application to see some patterns about how to handle passing configuration options to code and how to manage DB connections.

What is the way to create unit tests for SailsJS (or another framework)?

Well, I embraced test-driven development in the past year while learning C# (those seem to go hand in hand). In JavaScript, however, I am struggling to find a good workflow for TDD. This is mainly due to the combination of many frameworks which seemingly treat testing as a second-class citizen.
As an example, consider a class worker. This class would have some functionality to act upon a database. So how would I write unit tests for the functionality of this class?
In C# (and the rest of the C/Java family) I'd write this class in such a way that the constructor takes a database-connection parameter. Then during test runs the object is called with a mock database-connection object instead of the real one. Thus no modification of the source.
In Python a similar approach can be used; however, apart from providing a mocking object to the constructor to handle HAS_A dependencies, we can also use dependency injection to mock IS_A dependencies.
Now apply this to JavaScript, and SailsJS in particular (though a similar problem occurs with Sencha and other frameworks). It seems that the code is so tightly coupled to the library/framework that I can't create manual stubs/mocks, other than by actually using a pre-run task to modify the source/config.js.
In Sails an object (say worker, a controller) has to reside in a specific folder to work, and it "connects" automatically to the database, without me providing any notion of a database object (thus preventing me from supplying it with my own object).
Say I have a database with a table "Students"; then a controller would look something like this (with Students being a model defined in api/models):
const request = require('request');

module.exports = {
  updateData: function (req, res) {
    let idx = req.params.jobNumber;
    Students.find({ Nr: idx })
      .exec(function (err, result) {
        //....
      });
  },
};
So how would I load the above function into a (Mocha) test? And how would I decouple the database (used implicitly by Sails) so that I can mock the functionality? And what should I actually mock?
I of course don't wish to do integration tests, so I shouldn't build a "development database": I don't wish to test the connection, I wish to test the controller functions.
In the documentation, they provide a nice quick example of how to set up testing using Mocha: https://sailsjs.com/documentation/concepts/testing
In the bootstrap.test.js file, all they're doing is lifting and lowering your application with Sails, so that your application has access to controllers/models/etc. within its test environment. They also show how to test individual controllers, which is essentially just making requests that hit the endpoints to fire off the controllers' actions.
To avoid testing the full lifecycle of a request, you can just require the controller file within a *.test.js file and test any exported action. Remember, though, Sails builds the request and response objects that get passed to the controllers. So, if you want all of the correct data and want those objects to be valid, it's best to let Sails handle it and only make a request to the endpoint, unless you know exactly how to build the request and response objects.
But that's the point of a framework: you use it as intended, and test against/with it. You don't use your own version of how it may work. TDD exists in all languages and frameworks; you just need to fit it within your technology.
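As a rough illustration of the require-the-controller-directly approach, here is a sketch with hand-rolled req/res stubs (the file path and stub shapes are assumptions, and it presumes the bootstrap file shown below has lifted Sails so that model globals like Students exist):
// test/controllers/StudentsController.test.js -- hypothetical path
const assert = require('assert');
const StudentsController = require('../../api/controllers/StudentsController');

describe('StudentsController.updateData', function () {
  it('is exported as a function', function () {
    assert.strictEqual(typeof StudentsController.updateData, 'function');
  });

  it('can be invoked with hand-rolled req/res stubs', function (done) {
    // Real Sails req/res objects carry far more than this -- which is
    // exactly why the answer above recommends letting Sails build them.
    const req = { params: { jobNumber: 42 } };
    const res = {
      json: () => done(),             // assumes the action replies via res.json
      serverError: (err) => done(err)
    };
    StudentsController.updateData(req, res);
  });
});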
If you don't want to use a database for your test environment, you can tell it to use the sails-disk adapter by creating an environment file under config/env/ for the test environment and forcing that environment to use sails-disk.
For example...
config/env/test.js --> test environment file
module.exports = {
  models: {
    connection: 'localDiskDb',
    migrate: 'drop',
  },
  port: 1337,
  host: '127.0.0.1',
};
In config/connections.js, add the below to the connections object (if not already there)...
localDiskDb: {
  adapter: 'sails-disk'
},
And finally, we need to tell it to use that environment when running the tests. Modify test/bootstrap.test.js like the following...
var sails = require('sails');

before(function(done) {
  // Increase the Mocha timeout so that Sails has enough time to lift.
  this.timeout(10000);
  // Set environment to testing
  process.env.NODE_ENV = 'test';
  sails.lift({
    // configuration for testing purposes
  }, function(err) {
    if (err) {
      return done(err);
    }
    //...
    done(err, sails);
  });
});
after(function(done) {
  // here you can clear fixtures, etc.
  // This will "refresh" the memory store so you
  // have a clean test datastore every time you run tests
  sails.once('hook:orm:reloaded', () => {
    sails.lower((err) => {
      done();
      if (err) {
        process.exit(1);
      } else {
        process.exit(0);
      }
    });
  });
  sails.emit('hook:orm:reload');
});
Adding Jason's suggestion in an "answer" format, so that others may find it more easily.
sails-mock-models allows simple mocking for Sails model queries, based on sinon.
Mock any of the standard query methods (e.g. 'find', 'count', 'update'); they will be called with no side effects.
I haven't actually tried it yet (just found this question), but I'll edit this if/when I have any problems.
Sails unit testing is also explained well in the following blog post; please refer to it:
https://www.packtpub.com/books/content/how-add-unit-tests-sails-framework-application

Error thrown when calling "new CouchDB.Database(queueDb);" for the second time

I am using meteor-couchdb and trying to connect to the db when an API call is made, then perform the required operation.
dbName = new CouchDB.Database('db_name');
But when the API call is made again, it throws the error below:
Error: A method named '/db_name/insert' is already defined
Depending on the API call, I should be able to select the db to connect to.
I tried doing it the Node way, i.e.
Cloudant.use('db_name');
But as Meteor is my server-side framework, I need to handle async functions synchronously using async/await or Meteor.wrapAsync().
What would be the suggested approach to connect to the db and perform the actions whenever an API call is made?
If I understand the meteor-couchdb implementation correctly, it connects to one db server and allows you to work with multiple databases, so there's essentially one single connection to the server no matter how many times you call new CouchDB.Database('db_name');
What you should do is the following:
// tasks.js
// create an instance of Tasks database only once
var Tasks = new CouchDB.Database('tasks');
// you may want to export it so you can use it elsewhere
exports.Tasks = Tasks;
// blabla.js
// in another file require the file
var Tasks = require('path/to/tasks.js').Tasks;
// and use it when needed
Tasks.find();
Additional code to answer the comment below:
You could have a file, let's call it dbs.js, that would handle dynamic creation of dbs for you:
var dbs = {};
exports.getDb = function(name) {
  if (!dbs[name])
    dbs[name] = new CouchDB.Database(name);
  return dbs[name];
};
Then use this anywhere you want:
var Tasks = require('./dbs.js').getDb('Tasks');
Tasks.find();

Sharing a variable among modules in Node.js

I'm a newbie looking for a solution for how to share a variable among files (modules) by reference. For example, here is my app.js:
const users = {},
      waitingQueue = []

module.exports = function(io){
  io.on('connection', (socket) => {
    users[socket.id] = socket
    socket.on("Search", (d, ack) => {
      waitingQueue.push(socket)
      // logic with waitingQueue
      // logic with users {}
    })
    socket.on("otherEvent", (d, ack) => {
      // logic with waitingQueue
      // logic with users {}
    })
  })
}
Now I'd like to divide it into modules. Here is the new app.js:
const users = {},
      waitingQueue = []
const Search = require('./search')

module.exports = function(io){
  io.on('connection', (socket) => {
    users[socket.id] = socket
    socket.on("Search", Search(users, waitingQueue))
  })
}
The trouble in this case is that the second argument to socket.on("Search", ...) should be a function, so I have to use a closure: socket.on("Search", () => ...).
Exporting the modified { users, waitingQueue } from Search.js works, but then I need to overwrite the users and waitingQueue variables, and that is not working.
I need to share these variables with other modules too, which makes everything tightly coupled.
I also tried an event-emitter-based approach and Object.observe(), but was unable to fix the problem.
Can anyone help me fix this?
I would use Redis to store the shared values.
You can use the redis client to get/set the values, which will be stored in the redis database.
Those values can then be get/set from any code in any process on the same local machine (as the db) or on a set of remote machines (that's why I mention scaling later on...).
Redis can be benchmarked, and in my experience, compared to other db solutions it's amazingly fast.
It's quite simple to use. An introduction to Redis data types and abstractions:
Binary-safe strings.
Lists: collections of string elements sorted according to the order of insertion.
Sets: collections of unique, unsorted string elements.
Sorted sets, similar to Sets.
Hashes: maps composed of fields associated with values. Both the field and the value are strings. This is very similar to Ruby or Python hashes.
...and some more...
I suggest you install the database server and client and try to get and set some values; I'm sure it will be useful knowledge for the future.
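A minimal get/set sketch with the node redis client (this assumes the callback-style API of the redis 3.x package; v4 switched to promises):
const redis = require('redis');
const client = redis.createClient(); // defaults to localhost:6379

// Store a value shared across processes...
client.set('waitingQueueLength', '3', (err) => {
  if (err) throw err;
  // ...and read it back from any process that can reach the same server.
  client.get('waitingQueueLength', (err, value) => {
    if (err) throw err;
    console.log(value); // '3' -- redis stores strings
    client.quit();
  });
});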
Extra reasons:
It will give even more power to your arsenal.
It's really good for scaling purposes (both vertical and horizontal scaling).
socket.io-redis (GitHub)
By running socket.io with the socket.io-redis adapter you can run multiple socket.io instances in different processes or servers that can all broadcast and emit events to and from each other.
socket.io-emitter (GitHub)
If you need to emit events to socket.io instances from a non-socket.io process.
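Wiring the adapter in is a one-liner; a sketch, assuming socket.io 2.x and a Redis server on localhost:
const io = require('socket.io')(3000);
const redisAdapter = require('socket.io-redis');

// Every socket.io instance pointed at the same Redis server can now
// broadcast and emit events to sockets connected to the other instances.
io.adapter(redisAdapter({ host: 'localhost', port: 6379 }));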
EDIT: I know you want to share variables between your code, but since I saw you are using socket.io, you might want to share those variables across your nodes as well...
Socket.IO : passing events between nodes
Well, my presumption is that in a real-world application users & waitingQueue would come from some sort of persistent storage, e.g. a database, so the notion of "sharing" data across modules would be a non-issue as you'd most likely fetch directly from the DB.
However, if I did have in-memory "global" data that I wanted to share across different modules, then I'd most likely move it into its own module, e.g.
data.js
module.exports = {
  users: {},
  waitingQueue: []
}
app.js
const Search = require('../search');
const data = require('./data');

module.exports = io => {
  io.on('connection', (socket) => {
    data.users[socket.id] = socket;
    socket.on('Search', Search(data.users, data.waitingQueue));
  });
}
As its own module, data.js can then be shared across various other modules.
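This works because Node caches modules: every require of data.js resolves to the same object, so mutations are visible to every importer. For instance, a hypothetical search.js (a possible shape for the module required above; paths are illustrative) could rely on that:
// search.js -- hypothetical shape of the module required above
const data = require('./data'); // adjust the path to wherever data.js lives

module.exports = (users, waitingQueue) => (d, ack) => {
  waitingQueue.push(d);
  // Node's module cache means this reads the very same array that
  // app.js passed in: data.waitingQueue === waitingQueue is true.
  ack(`queue length: ${data.waitingQueue.length}`);
};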

Native MongoDB Driver Node.js - should we store collections?

I haven't been able to find any documents outlining whether it's a good or bad idea to keep a reference to a collection so that it can be re-used after the db connection has been established.
For instance, take our database.ts, which is called during our server start-up. It grabs the collections and stores them in the module, which can then be referenced throughout the project.
example of storing the collection
/*database.ts*/
//module to keep 1 connection alive and allow us to have 1 instance of our collections.
import { MongoClient, Db, Collection } from 'mongodb';

let uri: string = 'connection_string';

export let db: Db;
export let usrCollection: Collection;
export let bookCollection: Collection;

export function initDb(cb: any) {
  MongoClient.connect(uri, function(err, database) {
    //handle err
    db = database;
    cb(); //returns now since db is assigned.
  });
}

export function initCollections() {
  usrCollection = db.collection('users'); //non-async
  bookCollection = db.collection('book'); //non-async
}
/*app.ts*/
//in our express app when it starts up
import { db, initCollections, initDb } from './database';

initDb(function() {
  //this will hold up our server from starting BUT
  //there is no reason to make it async.
  initCollections();
});
What are some of the shortcomings of this pattern/build? What can I improve, and what should I avoid for performance hits, specifically with managing the collections? Would this pattern keep my source code bug-free, or at least make bugs easy to find, and stay extensible for the future? Or is there an even better pattern? Please, no external libraries such as Mongoose; only solutions using the native MongoDB driver for Node.js.
IMO, this is not a good practice. Users of database.ts will have to make sure that they have initialized these collections before using them. I would not even export any internal collections, since users should never care how the database is structured. You should export some operation functions like function addUser(user: User) {/*...*/}. There is a neat code sample created by MS.
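A rough sketch of that shape in plain JavaScript (all names here are illustrative, not from the original post):
// userRepository.js -- illustrative module exposing operations, not collections
const { MongoClient } = require('mongodb');

let db;

async function initDb(uri) {
  const client = await MongoClient.connect(uri);
  db = client.db('mydb'); // placeholder database name
}

// Callers never touch collections directly; they go through operation
// functions, so the database structure can change in one place.
async function addUser(user) {
  return db.collection('users').insertOne(user);
}

async function findUserByName(name) {
  return db.collection('users').findOne({ name });
}

module.exports = { initDb, addUser, findUserByName };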
//Edit: Yes, you can store collection references internally so that you don't need to type collection names every time; it's easy to introduce bugs by mixing up singular and plural. The best practice would be:
import mongodb = require('mongodb');

// note: the Server/Db constructors are the legacy (pre-3.x) driver API;
// newer drivers connect via MongoClient instead.
const server = new mongodb.Server('localhost', 27017, { auto_reconnect: true });
const db = new mongodb.Db('mydb', server, { w: 1 });
db.open(function() {});

const userCollection = db.collection("users");
const bookCollection = db.collection("book");
