Dependency Injection Framework - Dependency Propagation - javascript

I understand what Dependency Injection is but I still don't see the full picture yet in terms of how it benefits the consumer. See example below.
//bad
class Car {
  constructor() {
    this.tire = new Tire('snow');
  }
}

//good
class Car {
  constructor(tire) {
    this.tire = tire;
  }
}
Most articles I have read state that the example above is good because it removes the Tire dependency from Car and thus makes it more testable. But what about the class that instantiates a Car object? If a Driver class were to create a Car, wouldn't that force the Driver to instantiate the Car and the Tires as well? It seems as though the dependency always gets propagated further up. Where does this end? What actually instantiates the objects? Is this what DI frameworks are all about?

You are correct that the dependency requirements get "propagated up" all the way. The driver wanting to instantiate their car needs to bring the Car and the Tires. A driver pool with many drivers would need to bring the Drivers, Cars and Tires, and so on and so forth. A simple way to combat that is to bundle things into a factory:
class Driver {
  constructor(carFactory) {
    this.car = carFactory.newCarWithRegularTires();
  }
}
(Only for illustrative purposes, see below.)
You can inject a different factory which can create other cars, and the dependency of a Car can change without Driver needing to be aware of any of this.
Taken a step further, you can create a general global factory which can create all kinds of objects and knows how to satisfy the dependencies of each, which is a dependency injection container. You typically configure them in a textual format, declaring that a Car needs a Tire, and when you ask it for an instance of Car it knows to instantiate a Tire first and pass it.
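As a rough illustration of the idea, here is a toy sketch of such a container, not any particular framework's API; the register/resolve names are made up, and for simplicity the Driver here takes a Car directly rather than a factory:

class Container {
  constructor() {
    this.registry = new Map();
  }
  register(name, Ctor, deps = []) {
    // declare what each class needs instead of letting it construct things itself
    this.registry.set(name, { Ctor, deps });
  }
  resolve(name) {
    const { Ctor, deps } = this.registry.get(name);
    // recursively resolve the declared dependencies, then construct the instance
    return new Ctor(...deps.map(dep => this.resolve(dep)));
  }
}

const container = new Container();
container.register('tire', Tire);
container.register('car', Car, ['tire']);      // "a Car needs a Tire"
container.register('driver', Driver, ['car']);

const driver = container.resolve('driver');    // Tire, Car and Driver all get instantiated here

This is also the answer to "where does it end": the container (or whatever piece of startup code plays that role) is the one place where objects actually get instantiated, while everything else only declares what it needs.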
The disadvantage of a DI container is that while you can still swap dependencies simply by reconfiguring the container, that sometimes means reconfiguring the entire thing, and it can turn into a giant interdependent object registry. You should avoid creating one giant über-container and instead keep things modular.
Another word on factories: the above factory example isn't great. An object should use a factory if it needs to create new instances of objects at runtime. It should not use a factory simply to satisfy a dependency that could be injected directly.
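To illustrate that distinction (the CarPool class and its checkout method are hypothetical names; newCarWithRegularTires is the factory method from above):

// A factory is justified when new instances are needed at runtime:
class CarPool {
  constructor(carFactory) {
    this.carFactory = carFactory;
  }
  checkout() {
    // a brand new Car (and Tires) every time one is requested
    return this.carFactory.newCarWithRegularTires();
  }
}

// A single, fixed dependency is not: inject the Car itself, not a factory for it.
class Driver {
  constructor(car) {
    this.car = car;
  }
}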
In the end you want to strike a balance: classes should declare and receive their dependencies directly, but without creating extremely deep dependency hierarchies. Keeping dependencies as shallow as possible is a good start; introducing factories or modular DI containers at strategic points is another.

Related

Should I use a single Model object for the whole SPA or many small models for each component?

I am thinking about how to organize the Model in my web app. As I see it, I have two options:
Create a single, globally available Model object; when new components are instantiated they register themselves with the Model via its methods and update it when their state changes. Others would listen for changes in the Model by some key.
Create a model object for each component and let others know about state changes via some event dispatcher.
In the first case I feel it will be easier to manage all the state changes in one place and have a more standardized way of doing it, while with the latter scenario I think it will be harder to maintain consistency across the system.
What are the factors I should base the decision on?
I would go with the first, but borrowing the idea of the second: instead of components constantly checking keys for changes, have the single model signal the components when needed, after they register. You can pass the signal/callback along with the registration itself.
That way you don't need to create an army of models that update themselves (which might also mean having to avoid cyclic propagation). In addition, if I understand correctly, that would violate the single source of truth, and it also looks worse both memory- and performance-wise.
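A minimal sketch of that register-and-notify idea (the Model class and its register/set methods are illustrative names only, not a specific library):

// Single model; components register a callback for the keys they care about.
class Model {
  constructor() {
    this.state = {};
    this.listeners = {}; // key -> array of callbacks
  }
  register(key, callback) {
    (this.listeners[key] = this.listeners[key] || []).push(callback);
  }
  set(key, value) {
    this.state[key] = value;
    // notify only the components interested in this key
    (this.listeners[key] || []).forEach(cb => cb(value));
  }
}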

How should I be testing ES2015 classes?

My experience is with statically typed languages, and I must admit that I feel lost when trying to achieve the same things with dynamic languages. One of the things I would like to avoid is applying concepts that don't make sense in this context. Please assume that this class belongs to my project and that we want to test it with Jasmine:
class MyEs6Class {
  constructor(
      collaborator1 = new MyCollaborator(),
      factory = new MyFactory()) {
    this.collaborator1 = collaborator1;
    this.factory = factory;
  }

  method() {
    // Code
  }
}
I'm providing default instances of the objects in the constructor because that allows me to mock them when testing. I'm trying to use inversion of control the same way I would in, say, C#, but using the dynamic features of the language to avoid a dependency injection container. These two dependencies are required by the class, so from a structural point of view I think it's very clear that they must be provided through the constructor.
Also, I'm using the concept of a factory only because the class can require new objects from the factory several times while it is 'alive'.
From the point of view of ES6 classes, I know that there is no difference between private and public (https://stackoverflow.com/a/27853642/185027), so I could have the logic handled by the factory in a private method, but depending on that for testing seems just wrong. On the other hand, having something called a factory just because I need to fake the objects it returns seems weird, and maybe it's betraying my lack of knowledge.
What is the proper way to mock collaborators in this context?
Is it silly to have the concept of a factory only because I need to mock the objects it returns?
What would be a maintainable and elegant way of isolating the subject under test in JavaScript/ES6? Any interesting public codebase that I can study?
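For what it's worth, with constructor defaults like these a Jasmine spec can simply pass test doubles in. A sketch, assuming method() asks the factory for objects (its body is elided in the question, so the doWork/create spy names are hypothetical):

describe('MyEs6Class', () => {
  it('uses the injected collaborator and factory', () => {
    // Jasmine spy objects stand in for the real dependencies
    const collaborator = jasmine.createSpyObj('MyCollaborator', ['doWork']);
    const factory = jasmine.createSpyObj('MyFactory', ['create']);
    factory.create.and.returnValue({});

    const subject = new MyEs6Class(collaborator, factory);
    subject.method();

    // assumes method() requests new objects from the factory
    expect(factory.create).toHaveBeenCalled();
  });
});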

Managing globally needed services in web applications

I struggle to find a satisfying solution for how to expose service instances whose methods need to be accessed from multiple parts of my application.
The situation
First things first, by a 'service', I mean an instance of a function that holds properties & methods which are exposed through an API.
Consider a REST service whose purpose it is to provide convenient methods to access REST points on a server. I would make the following assumptions on that service:
It must be available throughout the application. It is likely that as the app grows, there will be new components that need access.
There is no need of multiple instances of this service. We can consider it a singleton.
My solutions
I can think of 2 possible solutions:
Concatenating scripts & utilizing the global object
I could combine all my script files (e.g. rest.service.js, app.js) into a single file and create an object property on the global object (like App).
Then, I could attach service instances to this object. This allows me to do something like this from everywhere within the app:
App.restService.get()
However, even if I wrap each service in an IIFE, I still have to add some variables on window in order to retrieve the instances.
Using commonJS / AMD modules
I could require() my service instances from everywhere by using require.js / browserify
The issues
Now I have a headache, because on the one hand people are telling me that polluting the global object is bad practice, and that singletons are bad practice too.
On the other hand, we make a lot of effort to 'uglify' scripts, with every byte saved considered an improvement. Using browserify would lead to the same script being bundled into multiple files, though (I'm using web components, so I have a lot of isolated scripts). Not to mention that I have no idea how to provide a state-safe service using browserify.
So how should I approach this problem?
How should I expose standard services that may or may not be instantiated multiple times? How should I implement state-safe ones?
Just a starting point (but too long for a comment): I really enjoy the strategy used by AngularJS, where you always instantiate services within a container, and every time you instantiate something you also specify which modules should be injected into it:
angular.module('myApp.services', []); // the second argument is the list of module dependencies (an empty array here)
At any point, you can retrieve your modules and add functionalities:
var services = angular.module('myApp.services');
services.factory('yourServiceName',
  ['other', 'service', 'dependencies',
  function(other, service, dependencies) {
    other.doStuff();
    service.doStuff();
    dependencies.doStuff();
    // [..]
  }]);
You can then list your services module as a dependency of other modules:
var myApp = angular.module('na', ['myApp.services']);
In Angular, the app is instantiated by the framework itself, but I guess you can develop an entry point for your app so that you can use your services.
...Unfortunately, I do not know exactly how this pattern is implemented; probably all the modules are stored within an instance of the application, so the global namespace is not polluted.
This problem also confuses me a lot, but I think there are two points I can make:
1) There must be an entry point for each service in the global scope, otherwise it is impossible to get the one you need from everywhere. It's not good to add many things to the global scope, but I think a service reference is one that deserves it.
2) Configure the service object rather than creating new ones: for example, there can be a single ajax service object with different configurations to do different things. Services are objects, so they can be merged and extended.
This is an interesting topic; I would like to see more opinions about it, and not just about the management of services but also of other resources like templates, objects, files, etc.
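As an illustration of the module-based approach from the question, a state-safe singleton can fall out of CommonJS module caching: require() returns the same cached instance to every consumer, without touching window. A sketch; restService.js, configure and the fetch-based get are hypothetical:

// restService.js
let baseUrl = '/api'; // module-scoped state, shared by all consumers

module.exports = {
  configure(url) { baseUrl = url; },
  get(path) { return fetch(baseUrl + path); } // thin wrapper around the REST endpoint
};

// any other script in the bundle
const restService = require('./restService');
restService.get('/users');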

Where would you instantiate nested components using Closure Library?

Closure library offers a basic life cycle for Components:
Instantiation
Rendering/Decoration
Document Entering
Document Exiting
Disposal
I'm focusing on the first two. In terms of design, when is it better to instantiate a nested component: during instantiation or during rendering/decoration?
Instantiation: the child needs to be held in a property until it is added through addChild, and it consumes memory before it is necessary. On the other hand, it allows some dependency injection or better initialization thanks to the parameters it receives.
Rendering/Decorating: it clutters the DOM creation, which can already be complicated because of the references to other objects it needs. It would also need the instantiation parameters previously stored in some property. On the other hand, it defers instantiation until it is needed.
Maybe a separate method, called after instantiation, which wraps the rendering? I'm asking because the Closure Library book and documentation don't talk about this.
Doing some refactoring and trying to split logic, I came to the following conclusion:
The best option I've found so far is to create the components inside the createDom method:
Normally, the parameters needed for components involve the data they present. My architecture uses DAOs, which means all data objects are conveniently connected. Subcomponents usually need some other data objects which are accessible through the parent's DAO. Since the parent object needs this DAO anyway, it's fine to store it in a property for use inside createDom.
The other thing is that instantiation and rendering in createDom theoretically only need two lines, which isn't a mess.
Again, it's the best solution I've found to increase cohesion.
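A minimal sketch of what that looks like, assuming a goog.ui.Component subclass (MyParent, MyChild and the DAO accessor are hypothetical names):

/** @override */
MyParent.prototype.createDom = function() {
  MyParent.superClass_.createDom.call(this);
  // the "two lines": build the child from data reachable through the parent's DAO,
  // then add it and render it into this component's element
  var child = new MyChild(this.dao_.getChildData());
  this.addChild(child, true /* opt_render */);
};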

Angular.js - Javascript Dependency Injection

I read about DI and DI in Angular.js.
From what I understand, DI in Angular.js means that Angular.js allows a controller, factory, service, or other component to specify its dependencies without having to create them itself.
Questions:
At some point the dependency has to be created, making the place where it was created not DIed. How do I understand this?
What if I have:
var thing = function(dep){
  this.dep = dep || new depCreator();
}
Is this DIed? Or does it depend on whether dep is passed to the function?
From what I see, DI means allowing a dependency to be set, whether in a function or an object. In the end, does it mean separating initialization/configuration/data from the other parts of the program (the logic? although we can also have initialization logic)? For example:
var dep1 = 'qwe';
var thing = function(dep){ this.dep = dep; }
var diedThing = new thing(dep1);
This would allow dep1 to be set in a configuration file, for example.
If the plain JavaScript implementation of DI is:
var thing = function(dep){
  this.dep = dep;
}
instead of
var thing = function(){
  this.dep = new depCreator();
}
Is this right?
But what if depCreator depends on configuration files (or an extracted configuration); would this be DIed?
When I read that Angular.js has(?) DI, is it correct to think that this DI means that Angular.js creates and looks up dependencies for me? Is there another meaning?
Lastly, if DI is so complex but just means separating configuration from implementation (or logic?), why not call it the single responsibility principle, i.e. the method does what the method does, the configuration does what the configuration does, and so on?
In the end, DI is to me a subjective concept about how you imagine and split responsibilities in an application; is this even near to correct?
Sorry for the long question.
The place where the dependency is created does not depend on it. Its sole purpose is usually to create the "thing" and register it with the DI subsystem. There is nothing weird or suspicious about that.
Why would you want to do this? Maybe instead depend on a service that creates the object for you if you need more flexibility.
DI means dependency injection - exactly that, you don't create the thing you depend on yourself. Instead you ask for it and voila, it is made available to you. You don't need to know how to create it, who created it etc.
If depCreator depends on the configuration files, then that is fine; it can use them. Prior to registering dep with the DI subsystem it can do just about anything. That is what you would do: create a service/factory depCreator that registers dep and makes it available to other components of your app.
No question mark. Angular has a DI subsystem and it is actually one of the core ideas behind angular. Angular provides many components for you out of the box ready to be injected, the rest you have to create and register on your own.
I don't know if I would say DI is complex. Maybe it is tricky to implement, I wouldn't know, but once you learn to use it you will not want to go back. DI in angular might just be the easiest to use I have ever seen. It's so good it's sort of transparent. After a while you don't even notice it's there and it works so well.
Your last remark is sort of correct I think. It is in a way about separation of concerns the way I see it. But there are many, many good resources out there that explain DI so I will not elaborate here. As always I would recommend reading ng-book for more angular specific details.
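To make this concrete, registering the example from the question with Angular's DI might look like the sketch below; the names dep and thing come from the question, while the module name 'app' is arbitrary:

angular.module('app', [])
  // the one place that knows how to create dep (the composition root)
  .factory('dep', function() {
    return 'qwe';
  })
  // thing only declares that it needs dep; Angular injects it when constructing thing
  .service('thing', ['dep', function(dep) {
    this.dep = dep;
  }]);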
