Dependency injection pattern in NodeJS

Recently, I've started dabbling in Node.js.
Now, I know that dependency injection is the go-to pattern for decoupling code and keeping it all organized.
However, what do you do when you have one dependency that uses other dependencies?
For example, let's say I have an invest module which uses the database module and the user module, and the user module also uses the database module.
And suppose I have a withdrawal module that uses the invest module, the database module, and the user module. Is it smart to define a property on each module's object and inject the other dependency into it throughout the application?
const dbc = require('./dbc');
const user = require('./user');
const invest = require('./invest');
const withdraw = require('./withdraw');
user.dbc = dbc;
invest.user = user;
invest.dbc = dbc;
withdraw.invest = invest;
withdraw.user = user;
withdraw.dbc = dbc;
Is this the best/smartest way to go about this? Something feels wrong about it.
Sure, I could just inject the database module into the user module, the user module into the invest module, and the invest module into the withdrawal module. But something about that doesn't feel clean either.
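For illustration, that chained approach would look something like this (the factory shapes and queries are made up):

// user.js - takes its one dependency as an argument
module.exports = function createUser(dbc) {
    return {
        findById: function(id) {
            return dbc.query('SELECT * FROM users WHERE id = ?', [id]);
        }
    };
};

// invest.js - receives the already-wired user module plus dbc
module.exports = function createInvest(user, dbc) {
    return {
        investFor: function(userId, amount) {
            return user.findById(userId); // then write to dbc, etc.
        }
    };
};

// main.js - all wiring happens once, at startup
const dbc = require('./dbc');
const user = require('./user')(dbc);
const invest = require('./invest')(user, dbc);
const withdraw = require('./withdraw')(invest, user, dbc);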
If someone could show me the light, the community-accepted standard practice for this, that would be great.

It is absolutely fine to require a module more than once. Node.js executes the code in a module only once and remembers what the module exported.
When you require the module a second time, the module is not executed again (and doesn't create another database connection). Instead, the result of require('./dbc') will always be a reference to the same object that represents the database connection.
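A quick sketch of that caching behavior (file contents are hypothetical):

// dbc.js - this body runs only once, no matter how many modules require it
console.log('connecting...'); // printed a single time
module.exports = { /* the connection object */ };

// user.js
const dbc = require('./dbc');
module.exports = { dbc: dbc };

// invest.js
const dbc = require('./dbc');
module.exports = { dbc: dbc };

// main.js
const user = require('./user');
const invest = require('./invest');
console.log(user.dbc === invest.dbc); // true - both got the same cached object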

The built-in Node.js dependency injection is not enough for me, for these reasons:
It doesn't naturally have component life-cycle management, i.e. singleton, transient, and wrapped. Say you want to use a module as a singleton in one context but as transient in another: you will have to maintain two different factory methods for creating that module.
Injecting dependencies into a module body via require() is not recommended, as it takes central dependency management out of our hands. Besides, it makes it hard to resolve circular dependencies, although that problem seldom happens.
Those reasons motivated me to find a better DI for Node.js apps. There are quite a few of them out there. I also wrote a lightweight DI container myself. Give it a try if you think it helps: https://github.com/robo-creative/nodejs-robo-container
With the container's support, I'm able to stick to declaring every module as a type; no more factory methods for them. Module life cycle is managed by the container. Your code will look like:
var $ = require('robo-container');
// dbc should only be a type, we can restrict it to have only one instance.
$.bind('dbc').to(require('./dbc')).asSingleton();
// following statements use Property Injection
// user should be a type that may have multiple instances (transient).
$.bind('user').to(require('./user')).set({ dbc: 'dbc' });
// invest and withdraw should be types.
$.bind('invest').to(require('./invest')).set({ user: 'user', dbc: 'dbc' });
$.bind('withdraw').to(require('./withdraw')).set({ user: 'user', dbc: 'dbc', invest: 'invest' });
The binding statements above use Property Injection. If your 'withdraw' module has a constructor that accepts user, dbc and invest, you can use Constructor Injection like this:
// the withdraw module.
function WithdrawModule(user, dbc, invest) {
    this.doSomething = function() {
    };
}
$.bind('withdraw').to(require('./withdraw')).use('user', 'dbc', 'invest');
To create a withdraw instance, call $() or $.resolve():
var withdraw = $('withdraw'); // or $.resolve('withdraw');
withdraw.doSomething();
That's it. I hope this answers your questions.

Related

Node js, split up files without having to pass dependencies around?

This may just be me lacking the 'bigger picture', so to speak, but I'm having trouble understanding why exporting modules is needed just to split up files.
I tried doing something like this:
// server.js
var app = require('koa')();
var othermodule1 = require('othermodule1')();
var othermodule2 = require('othermodule2')();
var router = require('./config/routes')();
app.use(router.routes());

// routes.js
module.exports = require('koa-router')()
    .get('*', function*(next) {
        othermodule1.something; // othermodule1 is not in scope here
    });
realizing that routes.js does not have access to othermodule1 after requiring it from server.js. I know that there's a way to pass needed variables in during the require call, but I have a lot more than just two modules that I would need to pass. So, from my probably naive perspective, this seems somewhat unnecessarily cumbersome. Would someone care to enlighten me, or is there actually a way to do this that I missed?
Each node.js module is meant to be a stand-alone sharable unit. It includes everything that it needs to do its job. That's the principle behind modules.
This principle makes for a little more overhead at the start of each module to require() in everything you need in that module, but it's only done once at the server startup and all modules are cached anyway so it isn't generally a meaningful performance issue.
You can make things global by assigning to the global object, but that often breaks modularity and definitely goes against the design spirit of independently shareable modules.
In your specific code, if routes needs access to othermodule1, then it should just require() it in as needed. That's how modules work. routes should just include the things it needs. Modules are cached so requiring it many times just gives every require() the same module handle from a cache.
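For instance, the minimal fix for the snippet above is to require it where it's used (assuming othermodule1 is resolvable from routes.js):

// routes.js - require what you need where you need it
var othermodule1 = require('othermodule1')();

module.exports = require('koa-router')()
    .get('*', function*(next) {
        othermodule1.something; // now in scope
    });
// server.js then just does: var router = require('./config/routes');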
This is an adjustment in thinking from other systems, but once you get used to it, you just do it and it's no big deal. Either require() in what you need (the plain shareable-module method), pass something into a module via its constructor (the push method), create an init() method so someone can initialize you properly, or call some other module to get the things you need (the pull method), as sketched below.
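Sketches of those last three patterns (module and function names are made up):

// push: the dependency arrives through a constructor-style export
// logger.js
module.exports = function(app) {
    return {
        log: function(msg) { app.emit('log', msg); }
    };
};

// init: the module exposes an explicit init() to be called once at startup
// mailer.js
var config;
exports.init = function(cfg) { config = cfg; };
exports.send = function(msg) { /* uses config */ };

// pull: the module asks for its own dependencies
// stats.js
var db = require('./dbc'); // pulls what it needs itself
exports.count = function() { return db.query('SELECT COUNT(*) FROM hits'); };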

The Fundamentals of the Node.js Module Paradigm?

I am struggling to really get a grasp on some fundamental basics here, and I feel it is not only holding me back but also resulting in crappy code, and I don't like that.
I understand the concept of breaking out functional chunks of code into separate modules, like routes, DB models, etc., but I'm having a really tough time understanding how to properly orchestrate the interdependent functioning of all these separate modules.
Let me give a few examples of where my struggles lie.
Example 1
My ExpressJS 'app' is set up in my main program module, just like you see in every tutorial. However, I need to access the app instance in other modules as well. How do I do that? One way I learned from various tutorials is to make the entire module export a function which takes the app as a param, then do what I need in the function. But that seems to add a lot of complexity. Not only is my entire module now enclosed in a function, but I seem to lose the ability to export multiple functions, objects, or other variables from that module.
Module as a Function
module.exports = function(app) {
    blah;
};
Module without a Function
exports.func1 = function() {
};
exports.func2 = function() {
};
The latter gives me much more flexibility in my eyes, but I often seem forced to use the former, because I need to pass in things like the app from somewhere else.
Example 2
I am using connect-rest for my REST API. All the code for my API lives in a separate module named simply 'api'. That was fine until recently. Now I need to access a function that is in the api module from inside my routes module. Presently my main routes are defined before my api, so I can't exactly pass my api export into my route function. I could probably reverse them, but this only covers up a larger issue.
In short, the issue is one of increasing interdependence.
As my codebase grows, I'm finding it more and more frequent that various modules need to work with each other; it just isn't feasible to keep them all completely separate. Sometimes it is possible, but it isn't clean.
I feel like I'm missing some basic Node.js (or maybe just JavaScript) paradigm that is used to manage all of this.
If anyone could help me understand, I would be most appreciative. I am an experienced developer in other languages such as C++ and Python, if it helps to couch things in those terms.
An attempt to sum up the issue
I feel that I did not adequately communicate my intention for posting, so let me try and sum up my issue with a working problem.
server.js
var http = require('http'),
    express = require('express'),
    app = express();

// Bunch of stuff done with app to get it set up

var routes = require('./routes.js')(app, express);
routes.js
module.exports = function(app, express) {
    var router = express.Router();
    // code for routes
    app.use('/', router);
};
In the above example, routes are split off into their own module, but that module needs app and express objects from server.js in order to function. So, by my current understanding, the only way to get those over into routes.js is to make routes.js export one big function which you then call with the two objects you need.
However, what if I want routes.js to export multiple functions that might be used in other places? By my understanding, I now can't. What if I wanted to do:
authentication.js
var routes = require('./routes');
// setup auth
routes.doSomethingFunction(auth);
I can't do that, because routes only exports that one mega-function.
Each Node.js module is simply an object. The part of that object which is available to the outside world is the module.exports object, which contains properties that can be functions or data.
The require('xxx') command gets you the exports object for that module (from a central cache, or loaded from the .js file if it hasn't yet been loaded).
So, code sharing is simple. Just have each module require() any other modules it wants to share code from, and have those modules make sure the shared functions are accessible via their own exports objects. This allows each module to be essentially stand-alone. It loads any other code that it needs, which makes it a lot easier to reuse code. Modules are cached, so lots of require() operations on the same module from lots of other modules are nothing more than cache lookups and not something to worry about.
Data sharing (such as your app object) can be accomplished in several different ways. The most common is to call some sort of initialization function on the module when you load it and pass it any data it might need. That is the push model. Alternatively, in the pull model, a module asks another module for some piece of data.
All of this is a lot easier with the right code organization. If you start to feel like you have spaghetti or interdependence, then perhaps you don't have the right code organization, or you're just a bit too shy about using require() to pull in everything a given module needs. Remember, each module will load whatever it needs itself, so you only have to worry about what you need. Load those modules and they will load what they need.
You may also want to think more in terms of objects, so that you put most properties on some sort of object rather than in lots of loose, individually shared variables. You can then share a single object, and all the properties of that object are automatically available to whomever you shared it with.
As to your question about sharing the app object with another module, you can do that like this:
// in your app module
var express = require('express');
var app = express();
var otherModule = require('otherModule');
otherModule.setApp(app);
// now otherModule has the singleton `app` object that it can use
// in this case, otherModule must be initialized this way before it can do its job
In this example, I just used a single method, .setApp(), to set the app object. That leaves all the other methods available for other uses of that module.
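A sketch of what otherModule might look like internally (names are hypothetical):

// otherModule.js
var app; // the shared singleton, set once at startup

exports.setApp = function(theApp) {
    app = theApp;
};

exports.doSomething = function() {
    // safe to use app here once setApp() has been called
    app.get('/ping', function(req, res) { res.send('pong'); });
};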
This could have also been done with a constructor-like method:
// in your app module
var express = require('express');
var app = express();
var otherModule = require('otherModule')(app);
This also works because the constructor can return an object with other methods on it if you want. If you want access to otherModule from within other modules, but obviously only want to initialize it once and not in those other places, then you can either do this:
var otherModule = require('otherModule')();
from those other modules, and have the constructor check whether anything was passed to it: if nothing was, this call isn't supplying the app object, so it should just return an object with the other methods. Or, you can use the first code block above, which returns all the methods from the initial require(). You're totally free to decide what to return from the require(). It can be a constructor-like function which then returns another object when it is called. It can be just an object with methods on it. Or (because functions are objects that can also have properties) it can even be a constructor-like function that ALSO has methods on it (though this is a bit less standard).
And your constructor function can decide what to do based on what is passed to it, giving it a multitude of different behaviors depending on its arguments.
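A sketch of such an argument-checking constructor (hypothetical module):

// otherModule.js
var app; // cached across all require() calls

module.exports = function(theApp) {
    if (theApp) {
        app = theApp; // the one initializing call passes the app object
    }
    // every call, initializing or not, gets the same API back
    return {
        doSomething: function() { /* uses app */ }
    };
};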

Angular.js - Javascript Dependency Injection

I read about DI and DI in Angular.js.
From what I understand, DI in Angular.js means that Angular.js allows a controller, factory, service, or other component to specify its dependencies without needing to create them.
Questions:
At some point the dependency has to be created, making the place where the dependency was created not DI'ed. How do I understand this?
What if I have:
var thing = function(dep) {
    this.dep = dep || new depCreator();
};
Is this DI'ed? Or does it depend on whether dep is passed to the function?
From what I see, DI means allowing a dependency to be set, be it in a function or an object. In the end, does it mean separating initialization/configuration/data from the other parts of the program (the logic? although we could also have initialization logic)?:
var dep1 = 'qwe';
var thing = function(dep) { this.dep = dep; };
var diedThing = new thing(dep1);
This would allow dep1 to be set from a configuration file, for example.
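For example (a hypothetical config object, however it gets loaded):

// config.js (or fetched JSON): the values live apart from the logic
var config = { dep1: 'qwe' };
var diedThing = new thing(config.dep1);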
If the plain JavaScript implementing DI is:
var thing = function(dep) {
    this.dep = dep;
};
instead of
var thing = function() {
    this.dep = new depCreator();
};
Is this right?
But what if depCreator depends on configuration files (or an extracted configuration); would this be DI'ed?
When I read that Angular.js has(?) DI, is it correct to think that this DI means Angular.js creates and looks up dependencies for me? Is there another meaning?
Lastly, if DI is so complex but just means separating configuration from implementation (or logic?), why not call it the single-responsibility principle: the method does what the method does, the configuration does what the configuration does, and so on.
In the end, DI is to me a subjective concept of how you imagine and split responsibilities in an application. Is this even close to correct?
Sorry for the long question.
The place where the dependency is created does not depend on it. Its sole purpose is usually to create the "thing" and register it with the DI subsystem. There is nothing weird or suspicious about that.
Why would you want to do this? Maybe instead depend on a service that creates the object for you if you need more flexibility.
DI means dependency injection - exactly that, you don't create the thing you depend on yourself. Instead you ask for it and voila, it is made available to you. You don't need to know how to create it, who created it etc.
If depCreator depends on the configuration files, then that is fine; it can use them. Prior to registering dep with the DI subsystem, it can do just about anything. That is what you would do: create a service/factory depCreator that registers dep and makes it available to the other components of your app.
No question mark needed. Angular has a DI subsystem, and it is actually one of the core ideas behind Angular. Angular provides many components out of the box, ready to be injected; the rest you have to create and register on your own.
I don't know if I would say DI is complex. Maybe it is tricky to implement, I wouldn't know, but once you learn to use it you will not want to go back. DI in Angular might just be the easiest to use I have ever seen. It's so good it's sort of transparent. After a while you don't even notice it's there, and it works so well.
Your last remark is sort of correct, I think. The way I see it, it is in a way about separation of concerns. But there are many, many good resources out there that explain DI, so I will not elaborate here. As always, I recommend reading ng-book for more Angular-specific details.
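For a concrete feel, here is a minimal Angular 1.x sketch of registering a dependency and asking for it (the names are made up):

var app = angular.module('demo', []);

// register: the factory knows how to create dep; consumers never call it directly
app.factory('dep', function() {
    return {
        greet: function(name) { return 'Hello, ' + name; }
    };
});

// inject: the controller just names what it needs and Angular supplies it
app.controller('MainCtrl', function($scope, dep) {
    $scope.message = dep.greet('world');
});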

How should I make configurable modules in AngularJS

I've been tinkering with AngularJS and I've built up a small collection of directives and services that I would like to package into a single JS file so that I can use them anywhere.
I have some website-specific settings that my module will need for API calls and that sort of thing. I'm just wondering what the Angular way of making configurable modules is. Obviously I don't want to have to modify my reusable JS file for each website, as that kind of defeats the purpose of having it. Seeing as the values will remain the same within each website, it seems like an awful lot of hassle to pass them in as arguments on each function call, and I'd rather stay away from global variables as much as possible.
I've searched a lot of questions for the answers I seek, and the closest pattern I've found so far is to have my reusable module depend on a not-included module called "settings" or something, and then define that module in the page's JS file, allowing the reusable module to pull the values from it. Here's an example to show what I mean.
This seems backwards to me. It's kind of like having a function pull values from global values instead of passing the values in as arguments.
Is this really the best way of doing this, or is there an alternative?
It sounds like you're looking for a provider.
You should use the Provider recipe only when you want to expose an API for application-wide configuration that must be made before the application starts. This is usually interesting only for reusable services whose behavior might need to vary slightly between applications.
Here's a very basic example of a provider:
myMod.provider('greeting', function() {
    var text = 'Hello, ';

    this.setText = function(value) {
        text = value;
    };

    this.$get = function() {
        return function(name) {
            alert(text + name);
        };
    };
});
This creates a new service, just like you might with myMod.service or myMod.factory, but provides an additional API that is available at config time—namely, a setText method. You can get access to the provider in config blocks:
myMod.config(function(greetingProvider) {
    greetingProvider.setText("Howdy there, ");
});
Now, when we inject the greeting service, Angular will call the provider's $get method (injecting any services it asks for in its parameters) and give you whatever it returns; in this case, $get returns a function that, when called with a name, alerts the name prefixed by whatever we've set with setText:
myMod.run(function(greeting) {
    greeting('Ford Prefect');
});
// Alerts: "Howdy there, Ford Prefect"
This is exactly how other providers, like $httpProvider and $routeProvider, work.
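For example, configuring those built-in providers follows the same pattern ($routeProvider requires the ngRoute module; the values here are made up):

myMod.config(function($httpProvider, $routeProvider) {
    // both APIs are only available at config time
    $httpProvider.defaults.headers.common['X-App'] = 'demo';
    $routeProvider.when('/', { templateUrl: 'home.html' });
});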
For more information on providers and dependency injection in general, check out this SO question on dependency injection.

Nodejs app structure

I'm wondering if I'm structuring my Node.js app appropriately for best performance. My major concern is how I'm passing my app reference around between modules.
Basically, in my app.js file I'm declaring all of my requires, libraries, etc.:
var app = {
    config     : require('../../config.json'),
    restify    : require('restify'),
    path       : require('path'),
    mongo      : require('mongodb'),
    model      : require('./models.js'),
    step       : require('step'),
    q          : require('q'),
    api        : require('./api_util.js'),
    underscore : require('underscore')
};
In my exports I'm passing around the entire app object. Now, given my knowledge of JavaScript (correct me if I'm wrong), this will not create new instances of the object; it simply passes the object as a pointer and references the same object in memory.
Aside from that (for ease), in my restify setup (the same can be done with Express) I'm appending the app value to the server request object like so:
app.server.pre(function (request, response, next) {
    request.app = app;
    return next();
});
Hence, on every single request, if I need quick access to any of my library declarations, config, etc., I can easily access request.app. I don't see this being an issue either; by the same logic, the object acts as a pointer back to the same memory space, so I'm not doubling memory usage or anything.
Is there a better/easier way of structuring this?
You are correct about references being passed instead of objects being duplicated. From that point of view, you are not wasting extra space when passing references to your app.
However, I would advise you against doing this: if you pass a reference to app around everywhere, what it tells me is that you don't really know what you will need in this or that module.
You should carefully plan your dependencies and know what each module will need so that you can pass the right dependencies for each module in your app.
Regarding things like underscore or mongodb, you should not be doing what you are doing. You should only pass around modules that need initialization. For things like underscore or mongodb, Node.js caches the module the first time you require() it, so you can simply call require() at the top of every module that needs it.
This will not incur any performance loss, and it will make it clearer what library each module requires.
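A sketch of that advice (file names and config fields are hypothetical): require stateless libraries directly, and pass in only what actually needs initialization:

// api_util.js - pulls stateless libraries itself
var _ = require('underscore'); // cached: same instance everywhere
var path = require('path');

// receives only what needs to be initialized for it
module.exports = function(config) {
    return {
        endpoint: function(name) {
            return path.join(config.apiRoot, _.escape(name));
        }
    };
};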
