I have a model which looks like this:
//myModel.js
define([], function () {
    var MyModel = Backbone.Model.extend({
        // my code
    });
    return MyModel;
});
Then, if I want to write a spec for this model, how should I load it using RequireJS?
I did try the following:
//myModel.spec.js
define([
    "js/models/myModel"
], function (MyModel) {
    describe("My model", function () {
        beforeEach(function () {
            this.myModel = new MyModel({
                name: "my title"
            });
        });
    });
});
Is this the right way?
Yes, this is correct. The great thing about using RequireJS for testing is that you're forced to declare all of your test dependencies in your define block. By definition, a unit test should only test one thing, so if you have multiple dependencies in one test, it's a code smell that you're not doing real "unit testing" at all.
Ideally, the only dependency should be the file under test. If that file itself has dependencies, such as server-side services or complex asynchronous APIs, you can use stubs and mocks to simulate them. Check out SinonJS for a great stubbing/mocking library.
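For example, a minimal sketch of stubbing such a dependency with SinonJS inside the same spec might look like this (fetchTitle is a hypothetical method on the model, and sinon is assumed to be loaded globally):
//myModel.spec.js (sketch)
define([
    "js/models/myModel"
], function (MyModel) {
    describe("My model", function () {
        beforeEach(function () {
            this.myModel = new MyModel({ name: "my title" });
            // Stub the method that would hit the server so the test stays isolated
            this.fetchStub = sinon.stub(this.myModel, "fetchTitle").returns("stubbed title");
        });
        afterEach(function () {
            // Restore the original method to avoid leaking state between specs
            this.fetchStub.restore();
        });
        it("uses the stubbed dependency", function () {
            expect(this.myModel.fetchTitle()).toEqual("stubbed title");
        });
    });
});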
We are planning to use web workers in an Ionic 2 application. However, Ionic 2 uses ionic-app-scripts (https://github.com/ionic-team/ionic-app-scripts, we are using version 1.0.0) with webpack. We need a web worker TypeScript file that is compiled to JS but NOT bundled with the other files into main.js.
What we had in mind was a file name format such as
servicetest.worker.ts, where the ".worker" part of the file name would be identified and compiled from TypeScript to JavaScript but not bundled along with the other files.
Any advice on this is much appreciated, as it seems we have to customize the app-scripts.
A really late answer, but perhaps it will help someone else.
Take a look at https://www.javascripttuts.com/how-to-use-web-workers-with-ionic-in-one-go/
I followed that article but had to make some adjustments, because I had to call a worker method on demand, not in the constructor.
In the ./src/assets folder, create a new folder called workers where your worker JS files will live. Yes, JS, not TS; as far as I know, TypeScript files won't compile to a usable web worker.
Create a web worker. I'll paste the main code of my fuzzySearch web worker (./assets/workers/fuzzysearch-worker.js):
'use strict';

var pattern, originalList;

self.onmessage = function (event) {
    // Receive a message and answer accordingly
    pattern = event.data.pattern;
    originalList = event.data.originalList;
    self.doFuzzySearch();
};

self.doFuzzySearch = function () {
    var filteredList;
    console.time('internalFastFilter');
    filteredList = self.fastFilter(originalList, (item) => self.hasApproxPattern(item.searchField, pattern));
    console.timeEnd('internalFastFilter');
    // Send the results back
    postMessage({ filteredList: filteredList });
};

// The code above is purposely incomplete
In your .ts file, declare the worker variable (usually above the constructor):
private readonly searchWorker: Worker = new Worker('./assets/workers/fuzzysearch-worker.js');
In the constructor:
constructor(/* Other injected dependencies here */
            public ngZone: NgZone) {
    this.searchWorker.onmessage = (event) => {
        // Inside ngZone for proper ChangeDetection
        this.ngZone.run(() => {
            this.dataList = event.data.filteredList;
            console.timeEnd('searchWorker');
        });
    };
}
Finally, in your "action function", let's say doSearch:
doSearch(event) {
    // ... extra code to do some magic
    console.time('searchWorker');
    this.searchWorker.postMessage({ command: 'doFuzzySearch', originalList: this.realList, pattern: searchFilter });
    // ... extra code to do some other magic
}
this.searchWorker.postMessage makes the call. All the heavy lifting happens inside the web worker.
Hope it helps.
Best regards.
I have an Angular 1.5 project with many modules, and each module may depend on other modules. To unit test, say, a controller that is part of a module, I would import the module like this:
angular.mock.module('SaidModule');
...then provide and inject its services where needed.
The problem is that SaidModule depends on AnotherModule1, AnotherModule2, AnotherModule3....
angular.module('SaidModule', ['AnotherModule1', 'AnotherModule2', 'AnotherModule3']);
So naturally, when I load SaidModule, the other modules are also loaded, which is out of scope for unit testing.
In the unit test I have tried the following solution:
angular.module('AnotherModule1',[]);
angular.module('AnotherModule2',[]);
angular.module('AnotherModule3',[]);
angular.mock.module('SaidModule');
Although this decouples the dependencies for the current unit test, it also destroys the actual AnotherModule1, AnotherModule2, and AnotherModule3, so when their turn comes to be unit tested they are no longer visible in the Angular project. That seems expected to me, since I am using angular.module to define a new module that happens to override the actual one.
This solution, though, is also suggested here: mocking module dependencies.
The Angular docs (see angular docs mock module) state:
If an object literal is passed each key-value pair will be registered on the module via $provide.value,
the key being the string name (or token) to associate with the value on the injector.
So it seems to me that the solution is to somehow use angular.mock.module to override the dependent modules, but so far I have not found one.
Any help is much appreciated.
By calling angular.module('AnotherModule1', []) you are redefining AnotherModule1, which I think is causing your downstream problems. Instead, use $provide for each dependent service; there's no need to mock the dependent modules.
Let's say your controller definition looks like this:
angular
    .module('SaidModule', ['AnotherModule1', 'AnotherModule2'])
    .controller('SaidController', [
        '$scope',
        'AnotherService',
        function($scope, AnotherService) {
            this.anotherService = AnotherService.helper();
        }
    ]);
Then your test might look like:
describe('SaidController', function() {
    var controller, scope, AnotherService;

    beforeEach(module('SaidModule'));

    beforeEach(module(function($provide) {
        AnotherService = { helper: function() { return 0; } };
        $provide.value('AnotherService', AnotherService);
    }));

    beforeEach(inject(function($controller, $rootScope) {
        scope = $rootScope.$new();
        controller = $controller('SaidController', {
            $scope: scope
        });
    }));

    it('creates controller', function() {
        expect(controller).not.toBeNull();
    });
});
There's no need to mock the dependent modules, just the dependent services.
I have an Angular service responsible for loading a config.json file. I would like to call it in my run phase so that I can set that JSON on my $rootScope and hence have it available for everyone afterwards.
Basically, this is what I've got:
angular.module('app.core', []).run(function(CoreRun) {
    CoreRun.run();
});
Where my CoreRun service is:
angular.module('app.core').factory('CoreRun', CoreRun);

CoreRun.$inject = ['$rootScope', 'config'];

function CoreRun($rootScope, config) {
    function run() {
        config.load().then(function(response) {
            $rootScope.config = response.data;
        });
    }

    return {
        run: run
    };
}
This works fine; the problem comes up when I try to test it. I want to spy on my config service so that it returns a fake promise. However, I cannot do this, because during the config phase of my test services are not yet available and I cannot inject $q.
As far as I can see, the only chance I have to mock my config service is there, in the config phase, since it is called by the run block.
The only way I have found so far is generating the promise using jQuery, which I really don't like.
beforeEach(module('app.core'));

var configSample, config;

beforeEach(module(function ($provide) {
    config = jasmine.createSpyObj('config', ['load']);
    config.load.and.callFake(function() {
        configSample = { baseUrl: 'someurl' };
        return jQuery.Deferred().resolve({ data: configSample }).promise();
    });
    $provide.value('config', config);
}));
it('Should load configuration using the corresponding service', function() {
    // assert
    expect(config.load).toHaveBeenCalled();
    expect($rootScope.config).toBe(configSample);
});
Is there a more correct way to do this?
EDIT: It is probably worth noting that this is an issue only when unit testing my run block.
It seems that it is not possible to inject $q the usual way, because the function in your run() block fires immediately. The run() block is considered part of the configuration phase in Angular, so inject() in tests only runs after config blocks; therefore, even if you inject() $q in the test, it will be undefined, because run() executes first.
After some time I was able to get $q into the module(function ($provide) {}) block with one very dirty workaround. The idea is to create an extra Angular module and include it in the test before your application module. This extra module should also have a run() block, which publishes $q to a global namespace. The injector will first call the extra module's run() and then the app module's run().
angular.module('global $q', []).run(function ($q) {
    window.$q = $q;
});

describe('test', function () {
    beforeEach(function () {
        module('global $q');
        module('app.core');
        module(function ($provide) {
            console.log(window.$q); // exists
        });
        inject();
    });
});
This extra module can be included as a separate file for the test suite, loaded before the spec files. If you put the module in the same file where the tests are, then you don't even need to use a global window variable; a variable within the file is enough.
Here is a working plunker (see the "script.js" file).
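For reference, a minimal sketch of that same-file variant, applied to the question's config spy, might look like this ('expose $q' is a module name made up for the sketch):
// A variable local to the spec file replaces window.$q
var $q;

// 'expose $q' is a made-up module name for this sketch
angular.module('expose $q', []).run(function (_$q_) {
    $q = _$q_;
});

describe('app.core run block', function () {
    var config;

    beforeEach(function () {
        module('expose $q'); // registered first, so its run block fires first
        module(function ($provide) {
            config = jasmine.createSpyObj('config', ['load']);
            config.load.and.callFake(function () {
                // By the time app.core's run block calls this, $q has been set
                return $q.when({ data: { baseUrl: 'someurl' } });
            });
            $provide.value('config', config);
        });
        module('app.core');
        inject();
    });

    it('loads the configuration', function () {
        expect(config.load).toHaveBeenCalled();
    });
});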
First solution (does not solve the issue):
You actually can use $q in this case, but you have to inject it into the test file. Here, you don't really inject it into the unit under test, but directly into the test file, so that you can use it inside the test. It therefore does not depend on the type of unit under test:
// variable that holds the injected $q service
var $q;

beforeEach(module(function ($provide) {
    config = jasmine.createSpyObj('config', ['load']);
    config.load.and.callFake(function() {
        var configSample = { baseUrl: 'someurl' };
        // create a new deferred object
        var deferred = $q.defer();
        // resolve the promise
        deferred.resolve({ data: configSample });
        // return the promise
        return deferred.promise;
    });
    $provide.value('config', config);
}));

// inject the $q service and save it to a spec-wide variable
// to be able to access it from mocks
beforeEach(inject(function (_$q_) {
    $q = _$q_;
}));
Resources:
Read more about inject() - angular.mock.inject
$q
And one more note: the config phase and the run phase are two different things. A config block can only be injected with providers and constants, while a run block can be injected with service instances and constants, but not providers. More info here: Module Loading & Dependencies.
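As a quick generic illustration of the difference (not tied to the question's app; 'phases-demo' and the header name are made up):
angular.module('phases-demo', [])
    .config(function ($httpProvider) {
        // Config block: only providers and constants can be injected here
        $httpProvider.defaults.headers.common['X-Demo'] = 'demo';
    })
    .run(function ($http, $rootScope) {
        // Run block: service instances are available, providers are not
        $rootScope.started = true;
    });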
I've just started using Jasmine with Maven. I have Jasmine working, but for some reason it cannot find my Backbone models. I have the JavaScript src directory pointing to the folder containing my Backbone.js models. In my JavaScript test directory I have a simple test like this:
describe('ToDo Model', function () {
    it('Test', function () {
        var todo = new ToDo();
    });
});
But I keep getting "ToDo is not defined". Do I have to write my tests inside my Backbone model files, or something? Thanks.
ToDo has to be in the global namespace as well. Try typing this in your Chrome/Firefox console:
window.ToDo
If it returns undefined, then that's the problem!
It's usually a good practice to define a global namespace for your app, for example:
window.Application = {
    Models: {},
    Views: {},
    Collections: {}
    // etc.
};
Then, I like to define models like this:
(function (Models) {
    Models.ToDo = Backbone.Model.extend({
        // etc...
    });
})(Application.Models);
The namespacing here isn't necessary, but seeing Models right at the top of the file is a nice visual cue, I think.
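With that in place, the spec can reference the namespaced model (a small sketch, assuming the namespace and model files are loaded before the spec):
describe('ToDo Model', function () {
    it('can be instantiated', function () {
        // The model now lives on the global Application namespace
        var todo = new Application.Models.ToDo();
        expect(todo).toBeDefined();
    });
});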
This is a trivial example that illustrates the crux of my problem:
var innerLib = require('./path/to/innerLib');

function underTest() {
    return innerLib.doComplexStuff();
}

module.exports = underTest;
I am trying to write a unit test for this code. How can I mock out the requirement for the innerLib without mocking out the require function entirely?
So this is me trying to mock out the global require and finding out that it won’t work even to do that:
var path = require('path'),
    vm = require('vm'),
    fs = require('fs'),
    indexPath = path.join(__dirname, './underTest');

var globalRequire = require;

require = function(name) {
    console.log('require: ' + name);
    switch(name) {
        case 'connect':
        case indexPath:
            return globalRequire(name);
    }
};
The problem is that the require function inside the underTest.js file has actually not been mocked out. It still points to the global require function. So it seems that I can only mock out the require function within the same file I’m doing the mocking in. If I use the global require to include anything, even after I’ve overridden the local copy, the files being required will still have the global require reference.
You can now!
I published proxyquire which will take care of overriding the global require inside your module while you are testing it.
This means you need no changes to your code in order to inject mocks for required modules.
Proxyquire has a very simple API which lets you resolve the module you are trying to test and pass along mocks/stubs for its required modules in one simple step.
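Applied to the example above, a minimal sketch might look like this (the stubbed return value is made up):
// test.js
var proxyquire = require('proxyquire');

var underTest = proxyquire('./underTest', {
    './path/to/innerLib': {
        doComplexStuff: function () {
            return 'stubbed result';
        }
    }
});

// underTest now uses the stub instead of the real innerLib
console.log(underTest()); // 'stubbed result'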
@Raynos is right that traditionally you had to resort to less than ideal solutions to achieve this, or do bottom-up development instead.
That is the main reason why I created proxyquire: to allow top-down test-driven development without any hassle.
Have a look at the documentation and the examples to gauge whether it fits your needs.
A better option in this case is to mock methods of the module that gets returned.
For better or worse, most node.js modules are singletons; two pieces of code that require() the same module get the same reference to that module.
You can leverage this and use something like Sinon to mock out items that are required. A Mocha test follows:
// in your testfile
var innerLib  = require('./path/to/innerLib');
var underTest = require('./path/to/underTest');
var sinon     = require('sinon');

describe("underTest", function() {
    it("does something", function() {
        sinon.stub(innerLib, 'toCrazyCrap').callsFake(function() {
            // whatever you would like innerLib.toCrazyCrap to do under test
        });
        underTest();
        sinon.assert.calledOnce(innerLib.toCrazyCrap); // sinon assertion
        innerLib.toCrazyCrap.restore(); // restore original functionality
    });
});
Sinon has good integration with chai for making assertions, and I wrote a module to integrate sinon with mocha to allow for easier spy/stub cleanup (to avoid test pollution).
Note that underTest cannot be mocked in the same way, as underTest exports only a function.
Another option is to use Jest mocks; see their documentation page.
I use mock-require. Make sure you define your mocks before you require the module to be tested.
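A minimal sketch with the question's example (the mocked return value is made up):
// test.js
const mock = require('mock-require');

// Register the mock first...
mock('./path/to/innerLib', {
    doComplexStuff: () => 'mocked result'
});

// ...then require the module under test
const underTest = require('./path/to/underTest');

console.log(underTest()); // 'mocked result'

// Undo the mock when done
mock.stop('./path/to/innerLib');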
Simple code to mock modules for the curious
Notice the parts where you manipulate require.cache, and note the require.resolve method, as this is the secret sauce.
class MockModules {
    constructor() {
        this._resolvedPaths = {}
    }
    add({ path, mock }) {
        const resolvedPath = require.resolve(path)
        this._resolvedPaths[resolvedPath] = true
        require.cache[resolvedPath] = {
            id: resolvedPath,
            file: resolvedPath,
            loaded: true,
            exports: mock
        }
    }
    clear(path) {
        const resolvedPath = require.resolve(path)
        delete this._resolvedPaths[resolvedPath]
        delete require.cache[resolvedPath]
    }
    clearAll() {
        Object.keys(this._resolvedPaths).forEach(resolvedPath =>
            delete require.cache[resolvedPath]
        )
        this._resolvedPaths = {}
    }
}
Use like:
describe('#someModuleUsingTheThing', () => {
    const mockModules = new MockModules()
    beforeAll(() => {
        mockModules.add({
            // use the same require path as you normally would
            path: '../theThing',
            // mock returns an object with "theThingMethod"
            mock: {
                theThingMethod: () => true
            }
        })
    })
    afterAll(() => {
        mockModules.clearAll()
    })
    it('should do the thing', async () => {
        const someModuleUsingTheThing = require('./someModuleUsingTheThing')
        expect(someModuleUsingTheThing.theThingMethod()).to.equal(true)
    })
})
BUT... Jest has this functionality built in, and I recommend that testing framework over rolling your own for testing purposes.
Mocking require feels like a nasty hack to me. I would personally try to avoid it and refactor the code to make it more testable.
There are various approaches to handle dependencies.
1) pass dependencies as arguments
function underTest(innerLib) {
    return innerLib.doComplexStuff();
}
This will make the code universally testable. The downside is that you need to pass dependencies around, which can make the code look more complicated.
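For example, a test can then hand the function a stub directly, with no require-level mocking at all (the fake value is made up):
// The dependency is injected, so the test controls it completely
const fakeInnerLib = {
    doComplexStuff: () => 'fake result'
};

console.assert(underTest(fakeInnerLib) === 'fake result');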
2) implement the module as a class, then use class methods/properties to obtain dependencies
(This is a contrived example, where class usage is not reasonable, but it conveys the idea)
(ES6 example)
const innerLib = require('./path/to/innerLib')

class underTestClass {
    getInnerLib () {
        return innerLib
    }
    underTestMethod () {
        return this.getInnerLib().doComplexStuff()
    }
}
Now you can easily stub the getInnerLib method to test your code.
The code becomes more verbose, but also easier to test.
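For instance, with Sinon it could be stubbed like this (a sketch; the require path and return value are made up):
const sinon = require('sinon');
const UnderTestClass = require('./underTestClass'); // hypothetical export of the class above

// Swap the dependency accessor for one that returns a stubbed innerLib
const stub = sinon.stub(UnderTestClass.prototype, 'getInnerLib').returns({
    doComplexStuff: () => 'stubbed result'
});

const instance = new UnderTestClass();
console.assert(instance.underTestMethod() === 'stubbed result');

stub.restore(); // put the original method back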
If you've ever used Jest, then you're probably familiar with Jest's mock feature.
Using jest.mock(...) you simply specify the string that would occur in a require statement somewhere in your code, and whenever a module is required using that string, a mock object is returned instead.
For example
jest.mock("firebase-admin", () => {
    const a = require("mocked-version-of-firebase-admin");
    a.someAdditionalMockedMethod = () => {};
    return a;
});
would completely replace all imports/requires of "firebase-admin" with the object you returned from that "factory"-function.
Well, you can do that when using jest because jest creates a runtime around every module it runs and injects a "hooked" version of require into the module, but you wouldn't be able to do this without jest.
I tried to achieve this with mock-require, but for me it didn't work for nested levels in my source. Have a look at the following issue on GitHub: mock-require not always called with Mocha.
To address this I have created two npm-modules you can use to achieve what you want.
You need one babel-plugin and a module mocker.
babel-plugin-mock-require
jestlike-mock
In your .babelrc, use the babel-plugin-mock-require plugin with the following options:
...
"plugins": [
    ["babel-plugin-mock-require", { "moduleMocker": "jestlike-mock" }],
    ...
]
...
and in your test file use the jestlike-mock module like so:
import { jestMocker } from "jestlike-mock";
...
jestMocker.mock("firebase-admin", () => {
    const firebase = new (require("firebase-mock").MockFirebaseSdk)();
    ...
    return firebase;
});
...
The jestlike-mock module is still very rudimentary and does not have a lot of documentation, but there's not much code either. I appreciate any PRs for a more complete feature set. The goal would be to recreate the whole "jest.mock" feature.
To see how Jest implements this, you can look at the code in the "jest-runtime" package. See https://github.com/facebook/jest/blob/master/packages/jest-runtime/src/index.js#L734 for example, where they generate an "automock" of a module.
Hope that helps ;)
You can't. You have to build up your unit test suite so that the lowest-level modules are tested first and the higher-level modules that require them are tested afterwards.
You also have to assume that any third-party code and Node.js itself are well tested.
I presume you'll see mocking frameworks arrive in the near future that overwrite global.require.
If you really must inject a mock, you can change your code to expose modular scope.
// underTest.js
var innerLib = require('./path/to/innerLib');

function underTest() {
    return innerLib.toCrazyCrap();
}

module.exports = underTest;
module.exports.__module = module;
// test.js
function test() {
    var underTest = require("underTest");
    underTest.__module.innerLib = {
        toCrazyCrap: function() { return true; }
    };
    assert.ok(underTest());
}
Be warned that this exposes .__module in your API, and any code can access modular scope at its own risk.
You can use the mockery library:
describe 'UnderTest', ->
  before ->
    mockery.enable( warnOnUnregistered: false )
    mockery.registerMock('./path/to/innerLib', { doComplexStuff: -> 'Complex result' })
    @underTest = require('./path/to/underTest')

  it 'should compute complex value', ->
    expect(@underTest()).to.eq 'Complex result'
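For those not using CoffeeScript, a roughly equivalent sketch in plain JavaScript might be (assuming Mocha and chai's expect):
const mockery = require('mockery');
const { expect } = require('chai');

describe('UnderTest', () => {
    let underTest;

    before(() => {
        mockery.enable({ warnOnUnregistered: false });
        mockery.registerMock('./path/to/innerLib', {
            doComplexStuff: () => 'Complex result'
        });
        underTest = require('./path/to/underTest');
    });

    after(() => {
        mockery.disable();
    });

    it('should compute complex value', () => {
        expect(underTest()).to.eq('Complex result');
    });
});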
I use a simple factory that returns a function that calls a function with all of its dependencies:
/**
 * fnFactory
 * Returns a function that calls a function with all of its dependencies.
 */
"use strict";

const fnFactory = ({ target, dependencies }) => () => target(...dependencies);

module.exports = fnFactory;
Wanting to test the following function:
/*
 * underTest
 */
"use strict";

const underTest = (innerLib, millions) => innerLib.doComplexStuff(millions);

module.exports = underTest;
I would setup my test (I use Jest) as follows:
"use strict";
const fnFactory = require("./fnFactory");
const _underTest = require("./underTest");
test("fnFactory can mock a function by returng a function that calls a function with all its dependencies", () => {
const fake = millions => `Didn't do anything with ${millions} million dollars!`;
const underTest = fnFactory({ target: _underTest, dependencies: [{ doComplexStuff: fake }, 10] });
expect(underTest()).toBe("Didn't do anything with 10 million dollars!");
});
See results of test
In production code I would manually inject the callee's dependencies as below:
/**
 * main
 * Entry point for the real application.
 */
"use strict";

const underTest = require("./underTest");
const innerLib = require("./innerLib");

underTest(innerLib, 10);
I tend to limit the scope of most of the modules that I write to one thing, which reduces the number of dependencies that have to be accounted for when testing and integrating them into the project.
And that's my approach to dealing with dependencies.