JavaScript dependency injection & DIP in Node: require vs constructor injection

I'm new to Node.js development, coming from the .NET world, and I'm searching the web for best practices regarding DI / DIP in JavaScript.
In .NET I would declare my dependencies in the constructor, whereas in JavaScript I see that a common pattern is to declare dependencies at the module level via a require statement.
To me it looks like when I use require I'm coupled to a specific file, while using a constructor to receive my dependency is more flexible.
What would you recommend as a best practice in JavaScript? (I'm looking for the architectural pattern, not an IoC technical solution.)
Searching the web I came across this blog post (which has some very interesting discussion in the comments):
https://blog.risingstack.com/dependency-injection-in-node-js/
It summarizes my conflict pretty well.
Here's some code from the blog post to illustrate what I'm talking about:
// team.js
var User = require('./user');

function getTeam(teamId) {
    return User.find({teamId: teamId});
}

module.exports.getTeam = getTeam;
A simple test would look something like this:
// team.spec.js
var Team = require('./team');
var User = require('./user');

describe('Team', function() {
    it('#getTeam', function* () {
        var users = [{id: 1}, {id: 2}];

        this.sandbox.stub(User, 'find', function() {
            return Promise.resolve(users);
        });

        var team = yield Team.getTeam();

        expect(team).to.eql(users);
    });
});
VS DI:
// team.js
function Team(options) {
    this.options = options;
}

Team.prototype.getTeam = function(teamId) {
    return this.options.User.find({teamId: teamId});
};

function create(options) {
    return new Team(options);
}

module.exports.create = create;
test:
// team.spec.js
var Team = require('./team');

describe('Team', function() {
    it('#getTeam', function* () {
        var users = [{id: 1}, {id: 2}];
        var fakeUser = {
            find: function() {
                return Promise.resolve(users);
            }
        };

        var team = Team.create({
            User: fakeUser
        });

        var result = yield team.getTeam();

        expect(result).to.eql(users);
    });
});

Regarding your question: I don't think there is a single common practice in the JS community. I've seen both approaches in the wild: require modifications (with tools like rewire or proxyquire) and constructor injection (often using a dedicated DI container). However, personally I think not using a DI container is a better fit for JS, because JS is a dynamic language with functions as first-class citizens. Let me explain:
Using a DI container enforces constructor injection for everything. It creates a huge configuration overhead, for two main reasons:
Providing mocks in unit tests
Creating abstract components that know nothing about their environment
Regarding the first argument: I would not adjust my code just for my unit tests. If it makes your code cleaner, simpler, more versatile and less error-prone, then go for it. But if your only reason is your unit tests, I would not take the trade-off. You can get pretty far with require modifications and monkey patching, and if you find yourself writing too many mocks, you should probably not be writing a unit test at all, but an integration test. Eric Elliott has written a great article about this problem.
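For example, with proxyquire (one of the require-modification tools mentioned above), the team.js module from the question can be tested without touching its constructor. A minimal sketch:

// a sketch using proxyquire, reusing the team.js module from the question
const proxyquire = require('proxyquire');

const users = [{ id: 1 }, { id: 2 }];
const fakeUser = {
    find: () => Promise.resolve(users)
};

// './user' is replaced only for this require; the real file stays untouched
const team = proxyquire('./team', { './user': fakeUser });

team.getTeam(1).then((result) => {
    // assert that result equals users
});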
Regarding the second argument: it is a valid one. If you want to create a component that only cares about an interface, not about the actual implementation, I would opt for simple constructor injection. However, since JS does not force you to use classes for everything, why not just use functions?
In functional programming, separating stateful IO from the actual processing is a common paradigm. For instance, if you're writing code that is supposed to count file types in a folder, one could write this (especially when coming from a language that enforces classes everywhere):
const fs = require("fs");

class FileTypeCounter {
    countFileTypes(dirname, callback) {
        fs.readdir(dirname, function (err, files) {
            if (err) return callback(err);
            // recursively walk all folders and count file types
            // ...
            callback(null, fileTypes);
        });
    }
}
Now if you want to test that, you need to change your code in order to inject a fake fs module:
class FileTypeCounter {
    constructor(fs) {
        this.fs = fs;
    }
    countFileTypes(dirname, callback) {
        this.fs.readdir(dirname, function (err, files) {
            // ...
        });
    }
}
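In a test, you could then hand the class a fake fs. A minimal sketch (the asserted value depends on the counting logic elided above):

const fakeFs = {
    readdir(dirname, callback) {
        callback(null, ['a.txt', 'b.txt', 'c.md']);
    }
};

const counter = new FileTypeCounter(fakeFs);
counter.countFileTypes('/any/dir', (err, fileTypes) => {
    // assert on fileTypes here, e.g. { txt: 2, md: 1 }
});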
Now, everyone who is using your class needs to inject fs into the constructor. Since this is boring and makes your code more complicated once you have long dependency graphs, developers invented DI containers where they can just configure stuff and the DI container figures out the instantiation.
However, what about just writing pure functions?
function fileTypeCounter(allFiles) {
    // count file types
    return fileTypes;
}

function getAllFilesInDir(dirname, callback) {
    // recursively walk all folders and collect all files
    // ...
    callback(null, allFiles);
}

// now let's compose both functions
function getAllFileTypesInDir(dirname, callback) {
    getAllFilesInDir(dirname, (err, allFiles) => {
        callback(err, !err && fileTypeCounter(allFiles));
    });
}
Now you have two super-versatile functions out of the box: one that does IO and one that processes the data. fileTypeCounter is a pure function and super easy to test. getAllFilesInDir is impure, but such a common task that you'll often find it already on npm, where other people have written integration tests for it. getAllFileTypesInDir just composes your functions with a little bit of control flow. This is a typical case for an integration test where you want to make sure that your whole application is working correctly.
By separating your code between IO and data processing, you won't find the need to inject anything at all. And if you don't need to inject anything, that's a good sign. Pure functions are the easiest thing to test and are still the easiest way to share code between projects.
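To illustrate, a minimal test sketch for fileTypeCounter (assuming the elided logic tallies extensions into an object):

// no IO, no mocks, no setup needed for a pure function
const assert = require('assert');

const result = fileTypeCounter(['a.txt', 'b.txt', 'c.md']);
assert.deepStrictEqual(result, { txt: 2, md: 1 });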

In the past, DI containers as we know them from Java and .NET did not exist in JavaScript. Node 6 brought ES6 Proxies, which opened up the possibility of such containers (Awilix, for example).
So let's rewrite your code to modern ES6.
class Team {
  constructor ({ User }) {
    this.User = User
  }

  getTeam (teamId) {
    return this.User.find({ teamId: teamId })
  }
}
And the test:
import Team from './Team'

describe('Team', function() {
  it('#getTeam', async function () {
    const users = [{id: 1}, {id: 2}]
    const fakeUser = {
      find: function() {
        return Promise.resolve(users)
      }
    }

    const team = new Team({
      User: fakeUser
    })

    const result = await team.getTeam()

    expect(result).to.eql(users)
  })
})
Now, using Awilix, let's write our composition root:
import { createContainer, asClass } from 'awilix'
import Team from './Team'
import User from './User'

const container = createContainer()
  .register({
    Team: asClass(Team),
    User: asClass(User)
  })

// Grab an instance of Team
const team = container.resolve('Team')
// Alternatively...
// const team = container.cradle.Team

// Use it
team.getTeam(123) // calls User.find()
That's as simple as it gets; Awilix can handle object lifetimes as well, just like the .NET / Java containers do. This lets you do cool stuff like injecting the current user into your services, or instantiating your services once per HTTP request.
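For illustration, a sketch of per-request lifetimes (scopePerRequest and req.user are assumptions here, not part of the original answer):

import { createContainer, asClass, asValue } from 'awilix'
import Team from './Team'
import User from './User'

const container = createContainer().register({
  Team: asClass(Team).scoped(),    // a new instance per scope (e.g. per request)
  User: asClass(User).singleton()  // one instance for the app's lifetime
})

// hypothetical HTTP middleware: one scope (and one Team) per request
function scopePerRequest (req, res, next) {
  req.scope = container.createScope()
  req.scope.register({
    currentUser: asValue(req.user) // assumes your auth layer sets req.user
  })
  next()
}

// later, in a route handler:
// const team = req.scope.resolve('Team')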

Related

Should I opt for code repetition or consolidation with api service - JS

I'm working on a large CMS system where a particular module and its submodules take advantage of the same backend API. The endpoint is exactly the same for each submodule aside from its "document type".
So a pattern like this is followed:
api/path/v1/{document-type}
api/path/v1/{document-type}/{id}
api/path/v1/{document-type}/{id}/versions
As time goes on the number of modules that use this API grows and I am left with many, many redundant api services that implement 7 CRUD methods:
getAllXs() {...}
getX(id) {...}
getXVersion(id, versionId) {...}
etc...
with an individual method looking like this
getAllXs() {
    let endpoint = BASE.URL + ENDPOINTS.X;
    let config = ...
    return http.get(endpoint, config)
        .then(response => response.data)
        .catch(...);
}
Where X would be the name of a particular Document Type.
I came to a point where I decided to make a single service and do something like this:
const BASE_URL = window.config.baseUrl + Const.API_ENDPOINT;
const ENDPOINTS = {
    "W": "/v1/W/",
    "X": "/v1/X/",
    "Y": "/v1/Y/",
    "Z": "/v1/Z/",
}

getAllDocuments(docType, config={}) {
    let endpoint = BASE_URL + ENDPOINTS[docType];
    return http.get(endpoint, config)
        .then(response => response.data)
        .catch(...);
}
...other methods
Where a type is specified and a mapped endpoint is used to build the path.
This reduces all of the document API services down to one. This is more concise code-wise, but it obviously now requires an extra parameter, and the terminology is more generic:
getAllXs() --> getAllDocuments()
and it's a bit less 'idiot-proof'. What makes me insecure about the current design is that there are 6 modules that use this API, each with the same 7 methods in its service.
The questions I keep asking myself are:
Am I bordering anti-pattern with the dynamic functions?
What if I had 10+ modules using the same API?
Your question made me think of a common object-relational mapping (ORM) design problem.
There is no single source of truth when it comes to design, but if you recognize an ORM in what you are building and value object-oriented design principles, I have some inspiration for you.
Here's an oversimplification of my own vanilla ES6 ORM that I have used on many projects (reusing your code snippets for relatability). This design is inspired by heavy ORM frameworks I have used in other languages.
class ORM {
    constructor() {
        this.BASEURL = window.config.baseUrl + Const.API_ENDPOINT;
        this.config = { foo: 'bar' }; // default config
    }

    getAll() {
        let endpoint = this.BASEURL + this.ENDPOINT; // ENDPOINT is set by each subclass
        return http.get(endpoint, this.config)
            .then(response => response.data)
            .catch(...);
    }

    get(id) {
        // ...
    }
}
And here are examples of extending that class (note the one with a special configuration):
class XDocuments extends ORM {
    constructor() {
        super();
        this.ENDPOINT = '/XDocument/';
    }

    otherMethod() {
        return 123;
    }
}

class YDocuments extends ORM {
    constructor() {
        super();
        this.ENDPOINT = '/YDocument/';
    }

    getAll() {
        this.config = { foo: 'not_bar' }; // the special configuration
        return super.getAll();
    }
}
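Usage would then look something like this (a sketch, assuming the same http client as in your snippets):

// every subclass shares the CRUD plumbing from ORM
const xDocs = new XDocuments();
xDocs.getAll().then(docs => console.log(docs)); // GET .../XDocument/
console.log(xDocs.otherMethod()); // 123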
Since you are specifically asking whether this is bordering on an anti-pattern, I would suggest reading about the SOLID and DRY principles, and about ORM designs in general. You will also find code smells about global constants, but that would only apply if you are in a window context. I see you are already on the right path, trying to avoid the code-duplication smell and the shotgun-surgery smell. :-)
Good luck and don't hesitate to ask further questions and add additional details in comments!
If you'd provided more of your original code, there would be more points to comment on.
At the moment I can just say that the DRY principle is generally a good idea (not always, but... well, it's a complicated topic). There are a lot of articles about DRY; Google them.
You're afraid that your code has become more complex. Don't be. In my experience, novice programmers fail miserably exactly because they discard the DRY principle. Only after a while, when they become stronger, do they start to fail at the KISS principle instead. And one extra argument, in addition to the 3 that are already there, doesn't add much complexity.
Am I bordering anti-pattern with the dynamic functions?
To be "dynamic" - that's the reason functions exist
What if I had 10+ modules using the same API?
Exactly! That's 10+ more chances to make a typo, to forget something, to misread, etc., and 10+ times the work if you ever need to change something. That is, if you don't DRY your code.
PS. And the name getAllDocuments is actually better than getAllDocType1s if that's the real name from your original code.
While I share most of x00's answer, I would take into account how static your endpoints really are.
Is there a chance module "X" changes any of its endpoint definitions? For instance, that you need to pass one more query param? Or are your modules all exactly the same, with no room for change?
If there is such a chance, there you go: to make that simple change you would have to refactor your whole code base (if you implemented it the way you propose, that is).
If there isn't, well, I see no reason not to implement your proposed dynamic functions. Personally, I would lean towards a service that my modules extend and use, just in case I do want to make minimal changes to them. For instance:
class MyGenericService {
    constructor() {
        this.url = window.config.baseUrl;
    }

    async getAllDocuments(config) {
        return http.get(this.url, config)
            .then(response => response.data)
            .catch(...);
    }

    // ...and so on
}
This allows my code to scale and stay modifiable, with one trade-off: you need to maintain one file per module, containing something like this:
class XService extends MyGenericService {
    constructor() {
        super();
        this.url = window.config.baseUrl + '/v1/x';
    }
}
If maintaining these extra files is too much overhead, you could receive the endpoint's URL in the constructor of MyGenericService, and you would just do stuff like this in your controllers:
const myXService = new MyGenericService('/v1/x');
const myYService = new MyGenericService('/v1/y');
// ...or it could use your endpoint url mapping
// I don't really know how is your code structured, just giving you ideas
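A sketch of that constructor-based variant (hypothetical, reusing the same http and window.config globals as above):

// endpoint passed in, so no per-module subclass files are needed
class MyGenericService {
    constructor(endpoint = '') {
        this.url = window.config.baseUrl + endpoint;
    }

    async getAllDocuments(config = {}) {
        const response = await http.get(this.url, config);
        return response.data;
    }

    // ...and so on
}

const myXService = new MyGenericService('/v1/x');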
There you have some options, hope it helps!

How to correctly make an entry point to my module, which contains multiple classes?

I've started to develop a desktop app with Node and Electron. It has a package which implements a connection to some API. It is structured as one base class and some derived classes, in this way:
ApiBase
ApiAuth extends ApiBase
ApiAuth.login()
ApiAuth.logout()
etc...
ApiTasks extends ApiBase
ApiTasks.getTaskList()
etc...
etc...
And now I want to make a nice and convenient way to use these classes in my app, so I need to create some entry point which will provide access to my API implementation. But I do not have much experience to get this right.
I thought about something like this:
index.js:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');

const apiAuth = new ApiAuth('www.sample-host.com');
const apiTasks = new ApiTasks('www.sample-host.com');

module.exports = {
    login: apiAuth.login,
    logout: apiAuth.logout,
    getTaskList: apiTasks.getTaskList,
    etc...
}
somewhere in the app:
const api = require("./lib/someApi");
// need to get task list for some reason
api.getTaskList(param1, param2)
But there are some problems I see with this approach:
it is a problem to pass the host param to the constructors in index.js dynamically
I am not sure if creating these instances every time index.js is required is the right thing
So I want to know what approaches I can use here, because I do not even know where to start researching. Thank you.
I think that you identified some of the most crucial decisions with this:
it is a problem to pass the host param to the constructors in index.js dynamically
IMO configuration and the interface are important considerations. Even though they can be refactored after the fact, an easy-to-configure and easy-to-consume interface will help adoption of your library. As you pointed out, the configuration is static right now and very brittle, i.e. a change to the URL will cascade to all clients and require all of them to update.
A first intuitive alternative may be to allow dynamic configuration of the current structure:
apiAuth = new ApiAuth(process.env.API_AUTH_URL || 'www.sample-host.com');
apiTasks = new ApiTasks(process.env.API_TASKS_URL || 'www.sample-host.com');
While this allows clients to dynamically configure the URL, the configuration is "implicit". IMO this is unintuitive and difficult to document; it also requires a client to look at the code to discover the environment variables and the instantiation flow.
I would favor exposing these classes to the client directly. I would consider this approach "explicit" as it forces the client to explicitly configure/instantiate your components. I think it's like providing your clients with primitives and allowing them to compose, build, and configure them in whatever way they want:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');

module.exports = {
    auth: ApiAuth,
    tasks: ApiTasks
}
This automatically namespaces the API behind its functions (auth|tasks) AND requires that the client instantiate the classes before using them:
const api = require("./lib/someApi");
const auth = new api.auth(process.env.SOMETHING, 'some-url');
This pulls the configuration further out in the architecture. It forces the client to decide how it wants to get the URL and explicitly instantiate the library. What if one of your clients doesn't use login/logout? This may be more flexible in that case.
I am not sure if creating these instances every time index.js is required is the right thing
If instantiation should remain hidden, another alternative would be to provide a builder function in order to encapsulate it:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');

module.exports = {
    auth: {
        build: (url) => {
            return new ApiAuth(url);
        }
    },
    tasks: {
        build: (url) => {
            return new ApiTasks(url);
        }
    }
}
This still hides each class but allows the client to decide how it configures each one:
const api = require("./lib/someApi");
const auth = api.auth.build('my-url');
auth.login();

How to structure lambda code for testability

I am trying to make a small REST API with API gateway, lambda, and DynamoDB, while following good development practices such as TDD. I'm used to being able to use a DI container to provision my objects, which lends itself perfectly for mocking and testing. In an MVC framework, there would be a single entry point, where I could define my container configuration, bootstrap the application, and invoke the controller to handle the event. I could test the controller independently of the rest of the application, and inject mocked dependencies. I can't figure out how to decouple the dependencies a lambda function may have from the lambda function itself. For example:
const { DynamoDB } = require('aws-sdk')
const { UserRepo } = require('../lib/user-repo')

const client = new DynamoDB({ region: process.env.REGION }) // Should be resolved by DI container
const userRepo = new UserRepo(client) // Should be resolved by DI container

exports.handler = async (event) => {
    return userRepo.get(event.id)
}
Please can anyone lead me in the right direction for structuring lambda code so it can be unit tested properly?
One way we've approached this in the project I'm currently working on is splitting out the requirements, so the handler is responsible for:
Creating the clients;
Extracting any config from the environment; and
Getting the parameters from the event.
Then it calls another function that does most of the work, and which we can test in isolation. Think of the handler like a controller, and the other function like the service that does the work.
In your specific case, that might look like:
const { DynamoDB } = require('aws-sdk');
const { UserRepo } = require('../lib/user-repo');

const doTheWork = (repo, id) => repo.get(id);

exports.handler = async (event) => {
    const client = new DynamoDB({ region: process.env.REGION });
    const userRepo = new UserRepo(client);
    return doTheWork(userRepo, event.id);
}

exports.doTheWork = doTheWork; // exported so tests can exercise it directly
doTheWork can now be exercised at the unit level, using test doubles for the repo object and whatever inputs you want. The UserRepo is already decoupled by constructor injection of the DynamoDB client, so that should be pretty testable too.
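For instance, a unit-level sketch with mocha, chai and sinon (the handler's path here is hypothetical):

const sinon = require('sinon');
const { expect } = require('chai');
const { doTheWork } = require('../src/handler'); // hypothetical path

describe('doTheWork', () => {
    it('gets the user by id from the repo', async () => {
        const fakeRepo = { get: sinon.stub().resolves({ id: '42' }) };

        const result = await doTheWork(fakeRepo, '42');

        expect(fakeRepo.get.calledOnceWith('42')).to.equal(true);
        expect(result).to.eql({ id: '42' });
    });
});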
We also have tests at the integration level that only mock out the AWS SDK stuff (you could alternatively use transport layer mocking or something like aws-sdk-mock) plus E2E testing that ensures the whole system works together.

Mediate and share data between different modules

I am just trying to get my head around event driven JS, so please bear with me. There are different kinds of modules within my app. Some just encapsulate data, others manage a part of the DOM. Some modules depend on others, sometimes one module depends on the state of multiple other modules, but I don't want them to communicate directly or pass one module to the other just for easy access.
I tried to create the simplest scenario possible to illustrate my problem (the actual modules are much more complex of course):
I have a dataModule that just exposes some data:
var dataModule = { data: 3 };
There is a configModule that exposes modifiers for displaying that data:
var configModule = { factor: 2 };
Finally there is a displayModule that combines and renders the data from the two other modules:
var displayModule = {
    display: function(data, factor) {
        console.log(data * factor);
    }
};
I also have a simple implementation of pub-sub, so I could just mediate between the modules like this:
pubsub.subscribe("init", function() {
    displayModule.display(dataModule.data, configModule.factor);
});

pubsub.publish("init"); // output: 6
However this way I seem to end up with a mediator that has to know all of the module-instances explicitly - is there even a way to avoid that? Also I don't know how this would work if there are multiple instances of these modules. What is the best way to avoid global instance-variables? I guess my question is what would be the most flexible way to manage something like that? Am I on the right track, or is this completely wrong? Sorry for not being very precise with my question, I just need someone to push me in the right direction.
You are on the right track; I'll try to give you that extra push you're talking about:
If you want loose coupling, pub-sub is a good way to go.
But you don't really need that "mediator"; each module should ideally be autonomous and encapsulate its own logic.
This is done in the following way: each module depends on the pubsub service, subscribes to all relevant events and acts upon them. Each module also publishes events which might be relevant to others (code samples in a minute, bear with me).
I think the bit you might be missing here is that modules which use events will hardly ever be just plain models. They will have some logic in them and can also hold a model (which they update when receiving events).
So instead of a dataModule you are more likely to have a dataLoaderModule, which will publish the data model (e.g. {data: 3}) once it finishes loading.
Another great requirement you set is sharing data while avoiding global instance variables. This is a very important concept and also a step in the right direction. What your solution misses here is Dependency Injection, or at least a module system that allows defining dependencies.
You see, having an event-driven application doesn't necessarily mean that every piece of the code should communicate using events. An application configuration model or a utility service is definitely something I would inject (when using DI, as in Angular), require (when using AMD/CommonJS) or import (when using ES6 modules), rather than communicate with using events.
In your example it's unclear whether configModule is a static app configuration or some knob the user can tweak from the UI. If it's a static app config, I would inject it.
Now, let's see some examples:
Assuming the following:
Instead of a dataModule we have a dataLoaderModule
configModule is a static configuration model.
We are using AMD modules (and not ES6 modules, which I prefer), since you seem to be sticking to ES5 features (I see no classes or consts).
We would have:
data-loader.js (aka dataLoaderModule)
define(['pubsub'], function (pubsub) {
    // ... load data using some logic...
    // and publish it
    pubsub.publish('data-loaded', {data: 3});
});
configuration.js (aka configModule)
define([], function () {
    return {factor: 2};
});
display.js (aka displayModule)
define(['configuration', 'pubsub'], function (configuration, pubsub) {
    var displayModule = {
        display: function (data, factor) {
            console.log(data * factor);
        }
    };

    pubsub.subscribe('data-loaded', function (payload) {
        // payload is the published model, e.g. {data: 3}
        displayModule.display(payload.data, configuration.factor);
    });
});
That's it.
You will notice that we have no global variables here (not even pubsub), instead we are requiring (or injecting) our dependencies.
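For reference, the pubsub module itself can be a tiny AMD module; a minimal sketch:

// pubsub.js (a minimal sketch, not production-ready)
define([], function () {
    var subscribers = {};

    return {
        subscribe: function (event, callback) {
            (subscribers[event] = subscribers[event] || []).push(callback);
        },
        publish: function (event, payload) {
            (subscribers[event] || []).forEach(function (callback) {
                callback(payload);
            });
        }
    };
});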
Here you might be asking: "and what if I meant for my config to change from the UI?", so let's see that too:
In this case, I'd rather rename configModule to settingsDisplayModule (following your naming convention).
Also, in a more realistic app, UI modules will usually hold a model, so let's do that too.
And let's also call them "views" instead of "displayModules", and we will have:
data-loader.js (aka dataLoaderModule)
define(['pubsub'], function (pubsub) {
    // ... load data using some logic...
    // and publish it
    pubsub.publish('data-loaded', {data: 3});
});
settings-view.js (aka settingsDisplayModule, aka config)
define(['pubsub'], function (pubsub) {
    var settingsModel = {factor: 2};

    var settingsView = {
        display: function () {
            console.log(settingsModel);
            // and when settings (aka config) change due to user interaction,
            // we publish the new settings ...
            pubsub.publish('setting-changed', settingsModel);
        }
    };

    return settingsView;
});
data-view.js (aka displayModule)
define(['pubsub'], function (pubsub) {
    var model = {
        data: null,
        factor: 0
    };

    var view = {
        display: function () {
            if (model.data && model.factor) {
                console.log(model.data * model.factor);
            } else {
                // whatever you do/show when you don't have data
            }
        }
    };

    pubsub.subscribe('data-loaded', function (payload) {
        // payload is the published model, e.g. {data: 3}
        model.data = payload.data;
        view.display();
    });

    pubsub.subscribe('setting-changed', function (settings) {
        model.factor = settings.factor;
        view.display();
    });

    return view;
});
And that's it.
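To wire everything up, a RequireJS entry point only needs to pull the modules in; a sketch (data-loader is listed last so the views have already subscribed; in a real app you would likely trigger loading explicitly):

// main.js
require(['data-view', 'settings-view', 'data-loader'], function () {
    // nothing else to do: data-loader publishes 'data-loaded',
    // and data-view renders in response
});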
Hope it helps :)
If not - comment!
You do not need a mediator. Just import data, config, and display and call display(data, config) where you need to.
// import data
// import config

function render() {
    display(data, config)
}

stubbing an entire class for testing in sinon

Preamble: I've read lots of SO and blog posts, but haven't seen anything that answers this particular question. Maybe I'm just searching for the wrong thing...
Suppose I'm developing a WidgetManager class that will operate on Widget objects.
How do I use sinon to test that WidgetManager is using the Widget API correctly without pulling in the whole Widget library?
Rationale: The tests for a WidgetManager should be decoupled from the Widget class. Perhaps I haven't written Widget yet, or perhaps Widget is an external library. Either way, I should be able to test that WidgetManager is using Widget's API correctly without creating real Widgets.
I know that sinon mocks can only work on existing classes, and as far as I can tell, sinon stubs also need the class to exist before they can be stubbed.
To make it concrete, how would I test that Widget.create() is getting called exactly once with a single argument 'name' in the following code?
code under test
// file: widget-manager.js
function WidgetManager() {
    this.widgets = [];
}

WidgetManager.prototype.addWidget = function(name) {
    this.widgets.push(Widget.create(name));
}
testing code
// file: widget-manager-test.js
var WidgetManager = require('../lib/widget-manager.js');
var sinon = require('sinon');

describe('WidgetManager', function() {
    describe('#addWidget', function() {
        it('should call Widget.create with the correct name', function() {
            var widget_manager = new WidgetManager();
            // what goes here?
        });

        it('should push one widget onto the widgets list', function() {
            var widget_manager = new WidgetManager();
            // what setup goes here?
            widget_manager.addWidget('fred');
            expect(widget_manager.widgets.length).to.equal(1);
        });
    });
});
Aside: Of course, I could define a MockWidget class for testing with the appropriate methods, but I'm more interested in really learning how to use sinon's spy / stub / mock facilities correctly.
The answer is really about dependency injection.
You want to test that WidgetManager is interacting with a dependency (Widget) in the expected way - and you want freedom to manipulate and interrogate that dependency. To do this, you need to inject a stub version of Widget at testing time.
Depending on how WidgetManager is created, there are several options for dependency injection.
A simple method is to allow the Widget dependency to be injected into the WidgetManager constructor:
// file: widget-manager.js
function WidgetManager(Widget) {
    this.Widget = Widget;
    this.widgets = [];
}

WidgetManager.prototype.addWidget = function(name) {
    this.widgets.push(this.Widget.create(name));
}
And then in your test you simply pass a stubbed Widget to the WidgetManager under test:
it('should call Widget.create with the correct name', function() {
    var stubbedWidget = {
        create: sinon.stub()
    };

    var widget_manager = new WidgetManager(stubbedWidget);
    widget_manager.addWidget('fred');

    expect(stubbedWidget.create.calledOnce).to.be.true;
    expect(stubbedWidget.create.calledWith('fred')).to.be.true;
});
You can modify the behaviour of your stub depending on the needs of a particular test. For example, to test that the widget list length increments after widget creation, you can simply return an object from your stubbed create() method:
var stubbedWidget = {
    create: sinon.stub().returns({})
}
This allows you to have full control over the dependency, without having to mock or stub all methods, and lets you test the interaction with its API.
There are also options like proxyquire or rewire which give more powerful options for overriding dependencies at test time. The most suitable option is down to implementation and preference - but in all cases you are simply aiming to replace a given dependency at testing time.
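For example, with rewire you could swap the module-level Widget without changing WidgetManager's constructor (a sketch, assuming widget-manager.js acquires Widget via a top-level require):

var rewire = require('rewire');
var sinon = require('sinon');

// load widget-manager with rewire so its module-private variables can be replaced
var WidgetManager = rewire('../lib/widget-manager.js');
var stubbedWidget = { create: sinon.stub().returns({}) };

// swap the module-level Widget for the duration of this test
WidgetManager.__set__('Widget', stubbedWidget);

var widget_manager = new WidgetManager();
widget_manager.addWidget('fred');
// assert via stubbedWidget.create, as above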
Your addWidget method does 2 things:
it "converts" a string to a Widget instance;
it adds that instance to internal storage.
I suggest you change the addWidget signature to accept an instance directly instead of a name, and move creation somewhere else. That will make testing easier:
Manager.prototype.addWidget = function (widget) {
    this.widgets.push(widget);
}

// no stubs needed for testing:
const manager = new Manager();
const widget = {};

manager.addWidget(widget);

assert.deepStrictEqual(manager.widgets, [widget]);
After that, you'll need a way of creating widgets by name, which should be pretty straightforward to test as well:

// Maybe this belongs somewhere else, not necessarily the Manager class…
Manager.createWidget = function (name) {
    return new Widget(name);
}

assert(Manager.createWidget('calendar') instanceof Widget);
