I am trying to make a small REST API with API Gateway, Lambda, and DynamoDB, while following good development practices such as TDD. I'm used to being able to use a DI container to provision my objects, which lends itself perfectly to mocking and testing.

In an MVC framework, there would be a single entry point where I could define my container configuration, bootstrap the application, and invoke the controller to handle the event. I could test the controller independently of the rest of the application and inject mocked dependencies. I can't figure out how to decouple the dependencies a Lambda function may have from the Lambda function itself. For example:
const { DynamoDB } = require('aws-sdk')
const { UserRepo } = require('../lib/user-repo')
const client = new DynamoDB({ region: process.env.REGION }) // Should be resolved by DI container
const userRepo = new UserRepo(client) // Should be resolved by DI container
exports.handler = async (event) => {
return userRepo.get(event.id)
}
Please can anyone lead me in the right direction for structuring lambda code so it can be unit tested properly?
One way we've approached this in the project I'm currently working on is splitting out the requirements, so the handler is responsible for:
Creating the clients;
Extracting any config from the environment; and
Getting the parameters from the event.
Then it calls another function that does most of the work, and which we can test in isolation. Think of the handler like a controller, and the other function like the service that does the work.
In your specific case, that might look like:
const { DynamoDB } = require('aws-sdk');
const { UserRepo } = require('../lib/user-repo');
const doTheWork = (repo, id) => repo.get(id);
exports.handler = async (event) => {
const client = new DynamoDB({ region: process.env.REGION });
const userRepo = new UserRepo(client);
return doTheWork(userRepo, event.id);
}
doTheWork can now be exercised at the unit level using test doubles for the repo object and whatever inputs you want. The UserRepo is already decoupled by constructor injection of the Dynamo client, so that should be pretty testable too.
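For instance, a unit test for doTheWork might pass a hand-rolled fake repo. This is a minimal sketch using mocha/chai, and it assumes doTheWork is also exported from the handler module (the snippet above doesn't show that export):
const { expect } = require('chai');
const { doTheWork } = require('./handler'); // assumes `exports.doTheWork = doTheWork`

describe('doTheWork', () => {
  it('fetches the user by id from the repo', async () => {
    // Hand-rolled test double standing in for UserRepo.
    const fakeRepo = { get: async (id) => ({ id, name: 'Jane' }) };

    const result = await doTheWork(fakeRepo, '123');

    expect(result).to.deep.equal({ id: '123', name: 'Jane' });
  });
});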
We also have tests at the integration level that only mock out the AWS SDK stuff (you could alternatively use transport layer mocking or something like aws-sdk-mock) plus E2E testing that ensures the whole system works together.
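For reference, an aws-sdk-mock based test might look roughly like the sketch below. It assumes the v2 aws-sdk (as in the snippets above) and assumes UserRepo calls DynamoDB.getItem under the hood; adjust to whatever calls your repo actually makes:
const AWSMock = require('aws-sdk-mock');

describe('handler (integration level)', () => {
  afterEach(() => AWSMock.restore('DynamoDB'));

  it('returns the user stored in DynamoDB', async () => {
    // Intercepts calls made through the aws-sdk, so the handler code is untouched.
    AWSMock.mock('DynamoDB', 'getItem', (params, callback) => {
      callback(null, { Item: { id: { S: '123' }, name: { S: 'Jane' } } });
    });

    process.env.REGION = 'eu-west-1'; // illustrative value
    const { handler } = require('./handler');
    const result = await handler({ id: '123' });
    // Assert on `result` according to how UserRepo unmarshalls the item.
  });
});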
I've started to develop a desktop app with Node and Electron. It has a package which implements a connection with some API. It is structured as one base class and some derived classes in this way:
ApiBase
ApiAuth extends ApiBase
ApiAuth.login()
ApiAuth.logout()
etc...
ApiTasks extends ApiBase
ApiTasks.getTaskList()
etc...
etc...
And now, I want to make a nice and convenient way to use these classes in my app. So I need to create some entry point which will provide access to my API implementation. But I do not have much experience to get it right.
I thought about something like this:
index.js:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');
const apiAuth = new ApiAuth('www.sample-host.com');
const apiTasks = new ApiTasks('www.sample-host.com');
module.exports = {
login: apiAuth.login,
logout: apiAuth.logout,
getTaskList: apiTasks.getTaskList,
etc...
}
somewhere in the app:
const api = require("./lib/someApi");
// need to get task list for some reason
api.getTaskList(param1, param2)
But there are some problems with this approach that I've run into:
it is a problem to pass the host param to the constructors in index.js dynamically
I am not sure if creating these instances every time index.js is required is the right thing
So I want to know about some approaches I can use here, because I do not even know where to start researching. Thank you.
I think that you identified some of the most crucial decisions with this:
it is a problem to pass the host param to the constructors in index.js dynamically
IMO configuration and the interface are important considerations. Even though it can be refactored after the fact, an easy-to-configure and easy-to-consume interface will help drive adoption of your library. As you pointed out, the configuration is static right now and very brittle, i.e. a change to the URL will cascade to all clients and require all clients to update.
A first intuitive alternative may be to allow dynamic configuration of the current structure:
apiAuth = new ApiAuth(process.env.API_AUTH_URL || 'www.sample-host.com');
apiTasks = new ApiTasks(process.env.API_TASKS_URL || 'www.sample-host.com');
While this allows clients to dynamically configure the URL, the configuration is "implicit". IMO this is unintuitive and difficult to document; it requires a client to look in the code to see the environment variables and the instantiation flow.
I would favor exposing these classes to the client directly. I would consider this approach "explicit" as it forces the client to explicitly configure/instantiate your components. I think it's like providing your clients with primitives and allowing them to compose, build, and configure them in whatever way they want:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');
module.exports = {
auth: ApiAuth,
tasks: ApiTasks
}
This automatically namespaces the API behind its functions (auth|tasks) AND requires that the client instantiate the classes before using them:
const api = require("./lib/someApi");
const auth = new api.auth(process.env.SOMETHING, 'some-url');
This pulls the configuration further out in the architecture. It forces the client to decide how it wants to get the URL and explicitly instantiate the library. What if one of your clients doesn't use login/logout? This may be more flexible in that case.
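For example, a client that only needs tasks would never touch the auth class at all (a sketch reusing the names from the question):
const api = require("./lib/someApi");

// Only the tasks API gets instantiated and configured here.
const tasks = new api.tasks(process.env.API_TASKS_URL);
tasks.getTaskList(param1, param2);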
I am not sure if creating these instances every time index.js is required is the right thing
If instantiation should remain hidden, another alternative would be to provide a builder function in order to encapsulate it:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');
module.exports = {
auth: {
build: (url) => {
return new ApiAuth(url);
}
},
tasks: {
build: (url) => {
return new ApiTasks(url);
}
}
}
This still hides each class but allows the client to decide how it configures each one:
const api = require("./lib/someApi");
const auth = api.auth.build('my-url');
auth.login();
Well, I embraced test-driven development in the past year while learning C# (those seem to go hand in hand). In JavaScript, however, I am struggling to find a good workflow for TDD. This is mainly due to the combination of many frameworks which seemingly treat testing as a second-class citizen.
As an example, consider a Worker class. This class would have some functionality to act upon a database. So how would I write unit tests for the functionality of this class?
In C# (and the rest of the C/Java family) I'd write this class in such a way that the constructor takes a database-connection parameter. Then during test runs the object is constructed with a mock database-connection object instead of the real one. Thus no modification of the source.
In python a similar approach can be used, however apart from providing a mocking object to the constructor to handle HAS_A dependencies, we can also use dependency injection to mock IS_A dependencies.
Now apply this to JavaScript, and Sails.js in particular (though a similar problem occurs with Sencha and other frameworks). It seems that the code is so tightly coupled to the library/framework that I can't create manual stubs/mocks, other than by actually using a pre-run task to modify the source/config.js.
In Sails, an object (say worker, a controller) has to reside in a specific folder to work, and it "connects" automatically to the database, without me providing any notion of a database object (thus preventing me from actually supplying it with my own object).
Say I have a database with a table "Students"; then a controller would look something like this (with Students being a model defined in api/models):
const request = require('request');
module.exports = {
updateData: function (req, res) {
let idx = req.param('jobNumber');
Students.find({Nr:idx})
.exec(function (err, result) {
//....
});
},
};
So how would I load the above function into a (Mocha) test? And how would I decouple the database (used implicitly by Sails) so that I can mock the functionality? And what should I actually mock?
I of course don't wish to do integration tests, so I shouldn't build a "development database": I don't wish to test the connection, I wish to test the controller functions.
In the documentation, they provide a nice quick example of how to set up testing using Mocha: https://sailsjs.com/documentation/concepts/testing
In the bootstrap.test.js file, all they're doing is lifting and lowering your application with Sails, just so your application has access to controllers/models/etc. within its test environment. They also show how to test individual controllers, which is essentially just making requests that hit the endpoints to fire off the controllers' actions. To avoid testing the full lifecycle of a request, you can just require the controller file within a *.test.js file and test any exported action. Remember, though, that Sails builds the request and response objects that get passed to the controllers. So, if you want all of the correct data and want those objects to be valid, it's best to just let Sails handle it and only make a request to the endpoint, unless you know exactly how to build the request and response objects. But that's the point of a framework: you use it as intended and test against/with it; you don't use your own version of how it may work. TDD exists in all languages and frameworks, you just need to fit it within your technology.
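If you do want to bypass the HTTP layer for a quick unit-level check, one rough sketch (not from the Sails docs; the paths and data below are illustrative) is to require the controller file directly and stub the Students global with sinon before calling the action:
// test/unit/StudentController.test.js (illustrative path)
const sinon = require('sinon');
const controller = require('../../api/controllers/StudentController');

describe('updateData (unit)', function () {
  before(function () {
    // Sails normally exposes models as globals after lifting; here we fake the global.
    global.Students = {
      find: sinon.stub().returns({ exec: (cb) => cb(null, [{ Nr: 123 }]) })
    };
  });

  it('queries Students by job number', function () {
    const req = { param: () => 123 };
    const res = {};
    controller.updateData(req, res);
    sinon.assert.calledWith(global.Students.find, { Nr: 123 });
  });
});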
If you don't want to use a database for your test environment, you can tell it to use the sails-disk adapter by creating an environment file under config/env/ for the test environment and forcing that environment to use sails-disk.
For example...
config/env/test.js --> test environment file
module.exports = {
models: {
connection: 'localDiskDb',
migrate: 'drop',
},
port: 1337,
host: '127.0.0.1',
};
In config/connections.js, add the below to the connections object (if not already there)...
localDiskDb: {
adapter: 'sails-disk'
},
And finally, we need to tell it to use that environment when running the tests. Modify test/bootstrap.test.js like the following...
var sails = require('sails');
before(function(done) {
// Increase the Mocha timeout so that Sails has enough time to lift.
this.timeout(10000);
// Set environment to testing
process.env.NODE_ENV = 'test';
sails.lift({
// configuration for testing purposes
}, function(err) {
if (err) {
return done(err);
}
//...
done(err, sails);
});
});
after(function(done) {
// here you can clear fixtures, etc.
// This will "refresh" the memory store so you
// have a clean test datastore every time you run tests
sails.once('hook:orm:reloaded', () => {
sails.lower((err) => {
done();
if (err) {
process.exit(1);
} else {
process.exit(0);
}
});
});
sails.emit('hook:orm:reload');
});
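With that bootstrap in place, an individual controller test can then just hit the endpoint over HTTP, for example with supertest. The route below is assumed, not taken from your code:
// test/integration/StudentController.test.js (illustrative path)
const supertest = require('supertest');

describe('StudentController.updateData', function () {
  it('responds to a request', function (done) {
    // sails.hooks.http.app is the underlying Express app that Sails exposes.
    supertest(sails.hooks.http.app)
      .get('/student/updateData?jobNumber=123')
      .expect(200, done);
  });
});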
Adding Jason's suggestion in an "answer" format, so that others may find it more easily.
sails-mock-models allows simple mocking of Sails model queries, based on sinon.
Mock any of the standard query methods (i.e. 'find', 'count', 'update'). They will be called with no side effects.
I haven't actually tried it yet (just found this question), but I'll edit this if/when I have any problems.
Sails unit testing is explained well in the following blog post:
https://www.packtpub.com/books/content/how-add-unit-tests-sails-framework-application
Please refer to it.
I've been reading and searching for a good design related to the MongoDB driver.
As was said here, there is no need to open/close connections, so I was trying to prevent boilerplate code using a dbLoader.js file like this (only using one database):
const { MongoClient } = require('mongodb')
const logger = require('./logger')
const configLoader = require('./configLoader')
module.exports = (async () => {
const config = await configLoader()
const client = await MongoClient.connect(config.db.uri)
const db = client.db(config.db.dbName)
logger.log('info', `Database ${config.db.dbName} loaded`)
return db
})()
I wonder if there is some other approach that is better, and why. For example, attaching it to app.locals (in Express.js).
I'm not using mongoose and don't want to.
Consider using IOC containers such as https://github.com/inversify/InversifyJS or similar. For starters, you get the ability to mock the defined dependency for testing purposes (such as database connection) without hacky solutions, such as manual module mocking.
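Even without pulling in a full container, the same idea can be applied by hand: let the loader take its collaborators as parameters so tests can substitute fakes. A sketch below; the factory name and shape are illustrative, not from your code:
const { MongoClient } = require('mongodb')
const logger = require('./logger')
const configLoader = require('./configLoader')

// Factory instead of a self-invoking async function: the real client,
// config loader, and logger are defaults that tests can override.
const createDbLoader = ({ client = MongoClient, loadConfig = configLoader, log = logger } = {}) =>
  async () => {
    const config = await loadConfig()
    const connection = await client.connect(config.db.uri)
    const db = connection.db(config.db.dbName)
    log.log('info', `Database ${config.db.dbName} loaded`)
    return db
  }

module.exports = createDbLoader

// In a test:
// createDbLoader({ client: fakeMongoClient, loadConfig: async () => testConfig, log: fakeLogger })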
I'm new to Node.js development, coming from the .NET world.
I'm searching the web for best practices regarding DI / DIP in JavaScript.
In .NET I would declare my dependencies in the constructor, whereas in JavaScript I see a common pattern of declaring dependencies at the module level via a require statement.
To me it looks like when I use require I'm coupled to a specific file, while using a constructor to receive my dependency is more flexible.
What would you recommend as a best practice in JavaScript? (I'm looking for the architectural pattern and not an IoC technical solution.)
Searching the web, I came across this blog post (which has some very interesting discussion in the comments):
https://blog.risingstack.com/dependency-injection-in-node-js/
It summarizes my conflict pretty well.
Here's some code from the blog post to show what I'm talking about:
// team.js
var User = require('./user');
function getTeam(teamId) {
return User.find({teamId: teamId});
}
module.exports.getTeam = getTeam;
A simple test would look something like this:
// team.spec.js
var Team = require('./team');
var User = require('./user');
describe('Team', function() {
it('#getTeam', function* () {
var users = [{id: 1}, {id: 2}];
this.sandbox.stub(User, 'find', function() {
return Promise.resolve(users);
});
var team = yield Team.getTeam();
expect(team).to.eql(users);
});
});
VS DI:
// team.js
function Team(options) {
this.options = options;
}
Team.prototype.getTeam = function(teamId) {
return this.options.User.find({teamId: teamId})
}
function create(options) {
return new Team(options);
}
module.exports.create = create;
test:
// team.spec.js
var Team = require('./team');
describe('Team', function() {
it('#getTeam', function* () {
var users = [{id: 1}, {id: 2}];
var fakeUser = {
find: function() {
return Promise.resolve(users);
}
};
var team = Team.create({
User: fakeUser
});
var result = yield team.getTeam();
expect(result).to.eql(users);
});
});
Regarding your question: I don't think that there is a common practice in the JS community. I've seen both types in the wild, require modifications (like rewire or proxyquire) and constructor injection (often using a dedicated DI container). However, personally, I think not using a DI container is a better fit with JS. And that's because JS is a dynamic language with functions as first-class citizens. Let me explain that:
Using DI containers enforces constructor injection for everything. It creates a huge configuration overhead, for two main reasons:
Providing mocks in unit tests
Creating abstract components that know nothing about their environment
Regarding the first argument: I would not adjust my code just for my unit tests. If it makes your code cleaner, simpler, more versatile and less error-prone, then go for it. But if your only reason is your unit test, I would not take the trade-off. You can get pretty far with require modifications and monkey patching. And if you find yourself writing too many mocks, you should probably not write a unit test at all, but an integration test. Eric Elliott has written a great article about this problem.
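As a concrete sketch of the "require modification" route, proxyquire can swap out ./user when loading the team.js module from the example above (the stub details here are assumed):
const proxyquire = require('proxyquire');
const sinon = require('sinon');

// Replace the ./user module that team.js requires with a stub.
const userStub = { find: sinon.stub().resolves([{ id: 1 }, { id: 2 }]) };
const team = proxyquire('./team', { './user': userStub });

// team.getTeam() now resolves with the stubbed users, no DI needed.
team.getTeam(42).then((users) => console.log(users));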
Regarding the second argument: This is a valid argument. If you want to create a component that only cares about an interface, but not about the actual implementation, I would opt for a simple constructor injection. However, since JS does not force you to use classes for everything, why not just use functions?
In functional programming, separating stateful IO from actual processing is a common paradigm. For instance, if you're writing code that is supposed to count file types in a folder, one could write this (especially when coming from a language that enforces classes everywhere):
const fs = require("fs");
class FileTypeCounter {
countFileTypes(dirname, callback) {
fs.readdir(dirname, function (err, files) {
if (err) return callback(err);
// recursively walk all folders and count file types
// ...
callback(null, fileTypes);
});
}
}
Now if you want to test that, you need to change your code in order to inject a fake fs module:
class FileTypeCounter {
constructor(fs) {
this.fs = fs;
}
countFileTypes(dirname, callback) {
this.fs.readdir(dirname, function (err) {
// ...
});
}
}
Now, everyone who is using your class needs to inject fs into the constructor. Since this is boring and makes your code more complicated once you have long dependency graphs, developers invented DI containers where they can just configure stuff and the DI container figures out the instantiation.
However, what about just writing pure functions?
function fileTypeCounter(allFiles) {
// count file types
return fileTypes;
}
function getAllFilesInDir(dirname, callback) {
// recursively walk all folders and collect all files
// ...
callback(null, allFiles);
}
// now let's compose both functions
function getAllFileTypesInDir(dirname, callback) {
getAllFilesInDir(dirname, (err, allFiles) => {
callback(err, !err && fileTypeCounter(allFiles));
});
}
Now you have two super-versatile functions out of the box, one which does IO and one which processes the data. fileTypeCounter is a pure function and super easy to test. getAllFilesInDir is impure but such a common task that you'll often find it already on npm, where other people have written integration tests for it. getAllFileTypesInDir just composes your functions with a little bit of control flow. This is a typical case for an integration test where you want to make sure that your whole application is working correctly.
By separating your code between IO and data processing, you won't find the need to inject anything at all. And if you don't need to inject anything, that's a good sign. Pure functions are the easiest thing to test and are still the easiest way to share code between projects.
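For instance, a test for fileTypeCounter needs no mocks at all; assuming it returns a count per file extension (an assumption, since its body is elided above), it could be as small as:
const assert = require('assert');

// Pure function in, plain assertion out; no IO, no stubs.
assert.deepStrictEqual(
  fileTypeCounter(['a.js', 'b.js', 'notes.txt']),
  { js: 2, txt: 1 }
);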
In the past, DI containers as we know them from Java and .NET did not exist. With Node 6 came ES6 Proxies which opened up the possibility of such containers - Awilix for example.
So let's rewrite your code to modern ES6.
class Team {
constructor ({ User }) {
this.User = User
}
getTeam (teamId) {
return this.User.find({ teamId: teamId })
}
}
And the test:
import Team from './Team'
describe('Team', function() {
it('#getTeam', async function () {
const users = [{id: 1}, {id: 2}]
const fakeUser = {
find: function() {
return Promise.resolve(users)
}
}
const team = new Team({
User: fakeUser
})
const result = await team.getTeam()
expect(result).to.eql(users)
})
})
Now, using Awilix, let's write our composition root:
import { createContainer, asClass } from 'awilix'
import Team from './Team'
import User from './User'
const container = createContainer()
.register({
Team: asClass(Team),
User: asClass(User)
})
// Grab an instance of Team
const team = container.resolve('Team')
// Alternatively...
// const team = container.cradle.Team
// Use it
team.getTeam(123) // calls User.find()
That's as simple as it gets; Awilix can handle object lifetimes as well, just like the .NET / Java containers did. This lets you do cool stuff like injecting the current user into your services, instantiating your services once per HTTP request, etc.
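Lifetimes are configured at registration time; a rough sketch (check the Awilix docs for the exact options) might look like:
import { createContainer, asClass } from 'awilix'
import Team from './Team'
import User from './User'

const container = createContainer()
  .register({
    // One shared User instance for the lifetime of the container...
    User: asClass(User).singleton(),
    // ...and a fresh Team every time one is resolved.
    Team: asClass(Team).transient()
  })

// Per-request lifetimes are handled by creating a scope per request,
// e.g. const scope = container.createScope()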
Preamble: I've read lots of SO and blog posts, but haven't seen anything that answers this particular question. Maybe I'm just looking for the wrong thing...
Suppose I'm developing a WidgetManager class that will operate on Widget objects.
How do I use sinon to test that WidgetManager is using the Widget API correctly without pulling in the whole Widget library?
Rationale: The tests for a WidgetManager should be decoupled from the Widget class. Perhaps I haven't written Widget yet, or perhaps Widget is an external library. Either way, I should be able to test that WidgetManager is using Widget's API correctly without creating real Widgets.
I know that sinon mocks can only work on existing classes, and as far as I can tell, sinon stubs also need the class to exist before it can be stubbed.
To make it concrete, how would I test that Widget.create() is getting called exactly once with a single argument 'name' in the following code?
code under test
// file: widget-manager.js
function WidgetManager() {
this.widgets = []
}
WidgetManager.prototype.addWidget = function(name) {
this.widgets.push(Widget.create(name));
}
testing code
// file: widget-manager-test.js
var WidgetManager = require('../lib/widget-manager.js')
var sinon = require('sinon');
describe('WidgetManager', function() {
describe('#addWidget', function() {
it('should call Widget.create with the correct name', function() {
var widget_manager = new WidgetManager();
// what goes here?
});
it('should push one widget onto the widgets list', function() {
var widget_manager = new WidgetManager();
// what setup goes here?
widget_manager.addWidget('fred');
expect(widget_manager.widgets.length).to.equal(1);
});
  });
});
Aside: Of course, I could define a MockWidget class for testing with the appropriate methods, but I'm more interested in really learning how to use sinon's spy / stub / mock facilities correctly.
The answer is really about dependency injection.
You want to test that WidgetManager is interacting with a dependency (Widget) in the expected way - and you want freedom to manipulate and interrogate that dependency. To do this, you need to inject a stub version of Widget at testing time.
Depending on how WidgetManager is created, there are several options for dependency injection.
A simple method is to allow the Widget dependency to be injected into the WidgetManager constructor:
// file: widget-manager.js
function WidgetManager(Widget) {
this.Widget = Widget;
this.widgets = [];
}
WidgetManager.prototype.addWidget = function(name) {
this.widgets.push(this.Widget.create(name));
}
And then in your test you simply pass a stubbed Widget to the WidgetManager under test:
it('should call Widget.create with the correct name', function() {
var stubbedWidget = {
create: sinon.stub()
}
var widget_manager = new WidgetManager(stubbedWidget);
widget_manager.addWidget('fred');
expect(stubbedWidget.create.calledOnce).to.equal(true);
expect(stubbedWidget.create.args[0][0]).to.equal('fred');
});
You can modify the behaviour of your stub depending on the needs of a particular test. For example, to test that the widget list length increments after widget creation, you can simply return an object from your stubbed create() method:
var stubbedWidget = {
create: sinon.stub().returns({})
}
This allows you to have full control over the dependency, without having to mock or stub all methods, and lets you test the interaction with its API.
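Putting that together with the second test from your skeleton might look like this:
it('should push one widget onto the widgets list', function() {
  var stubbedWidget = {
    create: sinon.stub().returns({})
  };
  var widget_manager = new WidgetManager(stubbedWidget);

  widget_manager.addWidget('fred');

  expect(widget_manager.widgets.length).to.equal(1);
});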
There are also options like proxyquire or rewire which give more powerful options for overriding dependencies at test time. The most suitable option is down to implementation and preference - but in all cases you are simply aiming to replace a given dependency at testing time.
Your addWidget method does 2 things:
"converts" a string to a Widget instance;
adds that instance to internal storage.
I suggest you change the addWidget signature to accept an instance directly, instead of a name, and move creation out to some other place. That will make testing easier:
Manager.prototype.addWidget = function (widget) {
this.widgets.push(widget);
}
// no stubs needed for testing:
const manager = new Manager();
const widget = {};
manager.addWidget(widget);
assert.deepStrictEqual(manager.widgets, [widget]);
After that, you'll need a way of creating widgets by name, which should be pretty straight-forward to test as well:
// Maybe this belongs to other place, not necessarily Manager class…
Manager.createWidget = function (name) {
return new Widget(name);
}
assert(Manager.createWidget('calendar') instanceof Widget);