I am currently new to Sinon, Mocha and Supertest and in the process of writing tests. In my current scenario, I have an authentication library which verifies my "OTP" and, after verifying, proceeds to do the operation within the callback function.
I am unable to mock the callback to return null and carry on to test the rest of the code. Following is my code snippet:
Controller.js
var authy = require('authy')(sails.config.authy.token);
authy.verify(req.param('aid'), req.param('oid'), function(err, response) {
console.log(err);
if (err) {
return res.badRequest('verification failed.');
}
....
My test is :
var authy = require('authy')('token');
describe('Controller', function() {
before(function() {
var authyStub = sinon.stub(authy, 'verify');
authyStub.callsArgWith(2, null, true);
});
it('creates a test user', function(done) {
// This function will create a user again and again.
this.timeout(15000);
api.post('my_endpoint')
.send({
aid: 1,
oid: 1
})
.expect(201, done);
});
});
I essentially want to call authy verify and get null as "err" in the callback, so I can test the rest of the code.
Any help would be highly appreciated.
Thanks
The trouble is that you're using different instances of the authy object in your tests and in your code. See the authy GitHub repo.
In your code you do
var authy = require('authy')(sails.config.authy.token);
and in your test
var authy = require('authy')('token');
So your stub is generally fine, but it will never work like this because your code does not use your stub.
A way out is to allow the authy instance in your controller to be injected from the outside. Something like this:
function Controller(authy) {
    // instantiate the real authy client if no instance was passed in
    authy = authy || require('authy')(sails.config.authy.token);
    // ... controller logic using authy.verify(...) as before
In your tests you can then do:
describe('Controller', function() {
before(function() {
var authyStub = sinon.stub(authy, 'verify');
authyStub.callsArgWith(2, null, true);
// get a controller instance, however you do it,
// but pass in the stubbed authy instance explicitly
ctrl = Controller(authy);
});
});
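Another way to get the same-instance property in a Sails app that you exercise over HTTP with supertest is to keep one shared authy client in a service module and stub that shared object. The service name and shape below are assumptions for illustration, not part of the original code:
// api/services/AuthyService.js (hypothetical): one shared client for the whole app
var instance;
module.exports = {
    client: function () {
        // created lazily so sails.config is loaded by the time it is first used
        instance = instance || require('authy')(sails.config.authy.token);
        return instance;
    }
};
// Controller.js would then call:
//   AuthyService.client().verify(req.param('aid'), req.param('oid'), cb);
// and the test can stub the very same object the controller uses:
before(function () {
    sinon.stub(AuthyService.client(), 'verify').callsArgWith(2, null, true);
});
after(function () {
    AuthyService.client().verify.restore();
});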
I have a situation where there are many mocked HTTP requests. While working on an Angular upload, something fishy is happening in my case: it always returns a 200 success status and the complete HTML body as the response.
Below is my Angular upload code:
function fileUploadController(FileUploader) {
/* jshint validthis: true */
let vm = this;
vm.type = this.type;
vm.clientId = this.client;
let uploader = new FileUploader({
url: 'http://localhost:8001/prom-scenario-config/files?clientId=' + vm.clientId,
data: {type: vm.type}
});
vm.uploads = {uploader};
vm.upload = upload;
vm.uploads.uploader.queue = [];
vm.uploads.uploader.onCompleteItem = function (item, response) {
let type = item.uploader.data.type;
console.log('response => ', response);
};
}
The $httpBackend mock code looks like this:
$httpBackend.whenPOST(new RegExp('http://localhost:8001/prom-scenario-config/files\\?clientId=([a-zA-Z0-9-])$$'))
.respond(function () {
return [200, 'foo'];
});
But this has no effect.
Is there any error in how I am constructing my regex?
With or without the mock code, the response I receive is still 200.
There are so many mocked requests that I am having difficulty identifying which one is being called.
Is there a trick to identify which regex mock is matched, or to force my request to use the mock?
Below is a reference for the status and response, FYI.
A unit test supposes that the unit is tested in isolation. Anything that is not the unit under test, i.e. anything other than the controller here, should be mocked, especially third-party units.
Considering that it is tested with Jasmine, the FileUploader service should be stubbed:
beforeEach(() => {
    // a spy should be created inside beforeEach so it is fresh for every test
    module('app', { FileUploader: jasmine.createSpy() });
});
And then the controller is tested line by line, like:
it('...', inject(($controller, FileUploader) => {
    const ctrl = $controller('fileUploadController');
    ...
    expect(FileUploader).toHaveBeenCalledTimes(1);
    expect(FileUploader).toHaveBeenCalledWith({ url: '...', data: { type: '...' } });
    // called with new
    const uploader = FileUploader.calls.first().object;
    expect(uploader instanceof FileUploader).toBe(true);
    expect(ctrl.uploads.uploader).toBe(uploader);
    expect(uploader.onCompleteItem).toEqual(jasmine.any(Function));
    expect(uploader.queue).toEqual([]);
...
}));
In the code above, the clientId=([a-zA-Z0-9-]) part of the regexp matches only ids consisting of a single character (there is no + quantifier), which real client ids are not. That's why it is preferable to hard-code values in unit tests: human errors are easier to spot and detect. It's not possible to unambiguously identify the problem when the tests are too loose, and this results in wasted time.
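As a sketch, either hard-code the URL or widen the character class with a + quantifier (the clientId value here is made up for illustration):
// Option 1: hard-code the exact URL the component will hit
$httpBackend.whenPOST('http://localhost:8001/prom-scenario-config/files?clientId=abc-123')
    .respond(200, 'foo');
// Option 2: keep a regex, but allow ids longer than one character
$httpBackend.whenPOST(new RegExp('clientId=[a-zA-Z0-9-]+$'))
    .respond(200, 'foo');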
I'm working with a Node handler in AWS Lambda and I need to write another file with integration tests for that function, but I can't mock the transporter with sinon or mockery.
The index.js function:
var nodemailer = require('nodemailer');
exports.handler = (event, context, callback) => {
    var transporter = createTransporter();
    transporter.sendMail(data, function (error, success) {
        console.log(error);
        var response = getResponse(404, error);
        callback(null, response);
    });
};
function createTransporter() {
return nodemailer.createTransport({
service: "SMTP",
auth: {
user: "XXXX#XXX",
pass: "XXXX"
}
});
}
The purpose is to mock the function createTransporter() so that it doesn't send any email when it is called, in a JavaScript test file using mocha and expect:
var mockery = require('mockery');
var nodemailerMock = require('nodemailer-mock');
var index = require("../index.js");
describe("The handler function tests", function () {
before(function () {
mockery.enable({
warnOnUnregistered: false
});
mockery.registerMock('nodemailer', nodemailerMock);
});
it('JSON error html ', function () {
var callback = function (name, response) {
expect(JSON.stringify(response.statusCode)).to.be('404');
};
var context = {};
index.handler(event, context, callback);
});
});
I wrote nodemailer-mock :)
The problem you're having is that you are calling var index = require("../index.js"); before you mock nodemailer via mockery, so it is already in the module cache. I included // Make sure anything that uses nodemailer is loaded here, after it is mocked... in the examples in the README, but should probably make that clearer.
Move the require("../index.js") after nodemailer is mocked and it will work as expected.
var mockery = require('mockery');
var nodemailerMock = require('nodemailer-mock');
// don't require here since you will get the real nodemailer and cache it
var index;
describe("The handler function tests", function () {
before(function () {
mockery.enable({
warnOnUnregistered: false
});
mockery.registerMock('nodemailer', nodemailerMock);
// do the require() here after nodemailer is mocked
index = require("../index.js");
});
// your tests here should now use nodemailer-mock
it('JSON error html ', function () {
var callback = function (name, response) {
expect(JSON.stringify(response.statusCode)).to.be('404');
};
var context = {};
index.handler(event, context, callback);
});
});
Another option is to use { useCleanCache: true } together with calls to mockery.resetCache(), though I have had mixed results. See Controlling the Module Cache in the mockery documentation.
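A rough sketch of that variant, assuming the same test layout as above:
before(function () {
    mockery.enable({ useCleanCache: true, warnOnUnregistered: false });
    mockery.registerMock('nodemailer', nodemailerMock);
    index = require('../index.js'); // loaded into the clean cache, so it sees the mock
});
after(function () {
    mockery.deregisterAll();
    mockery.disable(); // restores the original module cache
});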
I'm not 100% sure why this would fail, but I suggest one of the following:
Try var createTransporter = function () {...}; there's a slight difference here that might be your issue.
Export createTransporter so you can assign a new value to it, either a mock or not. This isn't very "keep implementation details private", but it does work (see the sketch after this list).
Have your module return a class (or object, anyway) where you can set some "use this transporter" value, i.e. dependency injection.
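A minimal sketch of the second option, exporting the factory so a test can replace it (the stubbed transporter's shape is an assumption):
// index.js
exports.createTransporter = function () {
    return nodemailer.createTransport({ /* ... */ });
};
exports.handler = (event, context, callback) => {
    // call through exports so a test can swap the factory out
    var transporter = exports.createTransporter();
    // ...
};
// test.js
var index = require('../index.js');
index.createTransporter = function () {
    return { sendMail: function (data, cb) { cb(null, { accepted: true }); } };
};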
You can use the following option from Jest:
jest.mock('nodemailer', () => ({ /* mock implementation of the module */ }))
Remember to use this at the top of the file, before import or require statements.
Here is the official Jest documentation: https://jestjs.io/docs/manual-mocks#mocking-node-modules.
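For example, a rough sketch (the module path and the mock's shape are assumptions):
// at the top of the test file, before the handler is required
jest.mock('nodemailer', () => ({
    createTransport: () => ({
        // resolve every sendMail call without touching the network
        sendMail: (data, cb) => cb(null, { accepted: true })
    })
}));
const index = require('../index.js');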
I have a module where I load a mustache template file. I would like to write a unit test for this. I am trying to use mocha, chai and rewire.
Here is my module.js:
var winston = require('winston');
var fs = require('fs');
var config = require('./config.js');
exports.logger = new winston.Logger({
transports: [
new winston.transports.File(config.logger_config.file_transport),
new winston.transports.Console(config.logger_config.console_transport)
],
exitOnError: false
});
exports.readTemplateFile = function(templateFile, callback) {
fs.readFile(config.base_directory + templateFile + '.tpl.xml', 'utf8', function (err, data) {
if (err) {
logger.error('Could not read template ' + templateFile + ': ' + err);
}
callback(data);
});
};
In the callback function I use mustache to do something with the template.
What is the best way to test this?
Maybe I will have to rewire fs.readFile, as the file won't be there when the test is executed. The Winston logger is also an interesting part, I guess; I'm not sure if it will be initialized when I import this within a mocha test. My first test shows the logger is undefined.
One of the most important unit testing principles is to test a very small piece of code in isolation. To achieve that, you should definitely mock or stub calls to functions that do not belong to the code under test (readFile and logger.error in this case). For the provided code you can write three test cases (see the sketch at the end of this answer):
calling readFile with proper arguments
calling error if err is present
calling callback function with proper argument
Your callback function should be tested outside this code, e.g. by providing fake data as a parameter:
describe('Some test', () => {
    it('should return true', () => {
        expect(callbackFunction('fakeData')).to.be.ok;
    });
});
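For the first and third cases above, a minimal sketch using sinon to stub fs.readFile could look like this; chai's expect, the module path and the template name are assumptions. Note that the error branch in module.js refers to a bare logger while the module only defines exports.logger, which is likely why the logger shows up as undefined, so that needs fixing before the second case can pass.
var sinon = require('sinon');
var expect = require('chai').expect;
var fs = require('fs');
var templates = require('./module.js');
describe('readTemplateFile', function () {
    afterEach(function () {
        fs.readFile.restore(); // put the real function back after each test
    });
    it('calls fs.readFile with the template path and utf8 encoding', function () {
        sinon.stub(fs, 'readFile').yields(null, '<tpl/>');
        templates.readTemplateFile('foo', function () {});
        sinon.assert.calledWith(fs.readFile, sinon.match('foo.tpl.xml'), 'utf8', sinon.match.func);
    });
    it('passes the file contents to the callback', function (done) {
        sinon.stub(fs, 'readFile').yields(null, '<tpl/>');
        templates.readTemplateFile('foo', function (data) {
            expect(data).to.equal('<tpl/>');
            done();
        });
    });
});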
I'm new to Sails and don't know exactly where to put the initialisation of an object that should be unique across the whole app. After reading the docs I assumed that I could keep it in the global sails object, but I'm not sure that is the best place.
I'm using the new Appcelerator ArrowDB to store my users and objects. The docs talk about declaring the appropriate vars and using them with the APP_KEY.
var ArrowDB = require('arrowdb'),
arrowDBApp = new ArrowDB('<App Key>');
function login(req, res) {
var data = {
login: req.body.username,
password: req.body.password,
// the req and res parameters are optional
req: req,
res: res
};
arrowDBApp.usersLogin(data, function(err, result) {
if (err) {
console.error("Login error:" + (err.message || result.reason));
} else {
console.log("Login successful!");
console.log("UserInfo: " + JSON.stringify(result.body.response.users[0]));
}
});
}
But I will constantly need that arrowDBApp var to create, update and delete objects in the database, so I think the best way is to initialise it in the starting script app.js and share it across the app.
I tried that, but I was not able to store it in the sails var; it seems that this var is not available (or loses its config) until sails.lift() is executed.
This code (app.js file) shows nothing in the console:
// Ensure we're in the project directory, so relative paths work as expected
// no matter where we actually lift from.
process.chdir(__dirname);
// Ensure a "sails" can be located:
(function() {
var sails;
try {
sails = require('sails');
} catch (e) {
console.error('To run an app using `node app.js`, you usually need to have a version of `sails` installed in the same directory as your app.');
console.error('To do that, run `npm install sails`');
console.error('');
console.error('Alternatively, if you have sails installed globally (i.e. you did `npm install -g sails`), you can use `sails lift`.');
console.error('When you run `sails lift`, your app will still use a local `./node_modules/sails` dependency if it exists,');
console.error('but if it doesn\'t, the app will run with the global sails instead!');
return;
}
// Try to get `rc` dependency
var rc;
try {
rc = require('rc');
} catch (e0) {
try {
rc = require('sails/node_modules/rc');
} catch (e1) {
console.error('Could not find dependency: `rc`.');
console.error('Your `.sailsrc` file(s) will be ignored.');
console.error('To resolve this, run:');
console.error('npm install rc --save');
rc = function () { return {}; };
}
}
// My own code
var APP_KEY = 'mykey';
var ArrowDB = require('arrowdb');
sails.arrowDBApp = new ArrowDB(APP_KEY);
console.log("Hi" + JSON.stringify(sails));
// Start server
sails.lift(rc('sails'));
console.log("Finish");
})();
No "HI" and no "Finish" are printed. If I try to use sails.arrowDBApp in another controller, it is undefined.
Tips are welcome.
It's not advisable to modify app.js unless you really need to.
The usual space to save all configuration information (e.g. the APP_KEY) is in the config directory in your project root.
One-time initializations (e.g. ArrowDB initialization) can be added to config/bootstrap.js.
Update
In config/arrowdb.js (you need to create this file yourself):
module.exports.arrowdb = {
APP_KEY: 'yourappkey',
ArrowDBApp: null
};
In config/bootstrap.js:
var ArrowDB = require('arrowdb');
module.exports.bootstrap = function(next){
sails.config.arrowdb['ArrowDBApp'] = new ArrowDB(sails.config.arrowdb['APP_KEY']);
next(); // Don't forget to add this
};
In your controller:
'task': function(req, res, next) {
sails.config.arrowdb['ArrowDBApp'].usersLogin(...);
// and so on.
// You could also add something like
// var ADB = sails.config.arrowdb['ArrowDBApp'];
// at the top in case you need to use it on and on.
}
Use config/bootstrap.js to initialize something before Sails is lifted. This approach is also good when we want to put something in a global variable, like defining/overriding the native Promise with the Bluebird Promise.
Use api/services to put methods or other things that you will use regularly in your code (controllers, models, etc.), like a Mail Service that handles sending email within your application.
Use files in the config folder to predefine something at sails.config[something]. It can be an object, function, or whatever, in order to make it configurable, like putting a Twitter API key there to use the Twitter REST API.
To achieve what you want, I would use a service and bootstrap.js. Try this example:
Create a service file at api/services/ArrowDBService.js and put this code in it:
var ArrowDB = require('arrowdb'),
arrowDBApp = new ArrowDB('<App Key>');
module.exports = {
arrowDBApp : arrowDBApp,
login : function (req, res) {
var data = {
login: req.body.username,
password: req.body.password,
// the req and res parameters are optional
req: req,
res: res
};
arrowDBApp.usersLogin(data, function(err, result) {
if (err) {
console.error("Login error:" + (err.message || result.reason));
} else {
console.log("Login successful!");
console.log("UserInfo: " + JSON.stringify(result.body.response.users[0]));
}
});
}
};
Now you can use it via sails.services.arrowdbservice.login(req, res) or simply ArrowDBService.login(req, res) (note that service access is case sensitive). Since I don't know ArrowDB, you may need to explore the login method from your example yourself.
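For instance, a minimal controller action using it could look like this (the controller and action names are made up):
// api/controllers/AuthController.js (hypothetical)
module.exports = {
    login: function (req, res) {
        // services are exposed as globals by Sails, so no require is needed
        ArrowDBService.login(req, res);
    }
};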
I have the following code in server/statusboard.js;
var require = __meteor_bootstrap__.require,
request = require("request")
function getServices(services) {
services = [];
request('http://some-server/vshell/index.php?type=services&mode=json', function (error, response, body) {
var resJSON = JSON.parse(body);
_.each(resJSON, function(data) {
var host = data["host_name"];
var service = data["service_description"];
var hardState = data["last_hard_state"];
var currState = data["current_state"];
services+={host: host, service: service, hardState: hardState, currState: currState};
Services.insert({host: host, service: service, hardState: hardState, currState: currState});
});
});
}
Meteor.startup(function () {
var services = [];
getServices(services);
console.log(services);
});
Basically, it's pulling some data from a JSON feed and trying to push it into a collection.
When I start up Meteor I get the following exception:
app/packages/livedata/livedata_server.js:781
throw exception;
^
Error: Meteor code must always run within a Fiber
at [object Object].withValue (app/packages/meteor/dynamics_nodejs.js:22:15)
at [object Object].apply (app/packages/livedata/livedata_server.js:767:45)
at [object Object].insert (app/packages/mongo-livedata/collection.js:199:21)
at app/server/statusboard.js:15:16
at Array.forEach (native)
at Function.<anonymous> (app/packages/underscore/underscore.js:76:11)
at Request._callback (app/server/statusboard.js:9:7)
at Request.callback (/usr/local/meteor/lib/node_modules/request/main.js:108:22)
at Request.<anonymous> (/usr/local/meteor/lib/node_modules/request/main.js:468:18)
at Request.emit (events.js:67:17)
Exited with code: 1
I'm not too sure what that error means. Does anyone have any ideas, or can you suggest a different approach?
Just wrapping your function in a Fiber might not be enough and can lead to unexpected behavior.
The reason is that, along with the Fiber itself, Meteor requires a set of variables attached to the fiber. Meteor uses the data attached to a fiber as a dynamic scope, and the easiest way to use it with a third-party API is Meteor.bindEnvironment.
T.post('someurl', Meteor.bindEnvironment(function (err, res) {
// do stuff
// can access Meteor.userId
// still have MongoDB write fence
}, function () { console.log('Failed to bind environment'); }));
Watch these videos on Evented Mind if you want to know more:
https://www.eventedmind.com/posts/meteor-dynamic-scoping-with-environment-variables
https://www.eventedmind.com/posts/meteor-what-is-meteor-bindenvironment
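Applied to the getServices function from the question, a rough sketch could look like this (the request call and the Services collection are taken from the original code):
function getServices() {
    request('http://some-server/vshell/index.php?type=services&mode=json',
        Meteor.bindEnvironment(function (error, response, body) {
            // the callback now runs with the environment variables Meteor expects,
            // so collection writes are allowed here
            _.each(JSON.parse(body), function (data) {
                Services.insert({
                    host: data.host_name,
                    service: data.service_description,
                    hardState: data.last_hard_state,
                    currState: data.current_state
                });
            });
        }, function (err) {
            console.log('Failed to bind environment', err);
        })
    );
}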
As mentioned above, it is because you're executing code within a callback.
Any code you're running on the server-side needs to be contained within a Fiber.
Try changing your getServices function to look like this:
function getServices(services) {
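// Fiber comes from the fibers package, e.g. var Fiber = require('fibers')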
Fiber(function() {
services = [];
request('http://some-server/vshell/index.php?type=services&mode=json', function (error, response, body) {
var resJSON = JSON.parse(body);
_.each(resJSON, function(data) {
var host = data["host_name"];
var service = data["service_description"];
var hardState = data["last_hard_state"];
var currState = data["current_state"];
services+={host: host, service: service, hardState: hardState, currState: currState};
Services.insert({host: host, service: service, hardState: hardState, currState: currState});
});
});
}).run();
}
I just ran into a similar problem and this worked for me. I should say, though, that I am very new to this and I do not know if this is how it should be done.
You can probably get away with wrapping only your insert statement in the Fiber; based on my tests, wrapping the surrounding code the way the example above does was not enough, and you have to wrap the insert itself.
For example, I did this and it still failed with the Fibers error:
function insertPost(args) {
if(args) {
Fiber(function() {
post_text = args.text.slice(0,140);
T.post('statuses/update', { status: post_text },
function(err, reply) {
if(reply){
// TODO remove console output
console.log('reply: ' + JSON.stringify(reply,0,4));
console.log('incoming twitter string: ' + reply.id_str);
// TODO insert record
var ts = Date.now();
id = Posts.insert({
post: post_text,
twitter_id_str: reply.id_str,
created: ts
});
}else {
console.log('error: ' + JSON.stringify(err,0,4));
// TODO maybe store locally even though it failed on twitter
// and run service in background to push them later?
}
}
);
}).run();
}
}
I did this and it ran fine with no errors.
function insertPost(args) {
if(args) {
post_text = args.text.slice(0,140);
T.post('statuses/update', { status: post_text },
function(err, reply) {
if(reply){
// TODO remove console output
console.log('reply: ' + JSON.stringify(reply,0,4));
console.log('incoming twitter string: ' + reply.id_str);
// TODO insert record
var ts = Date.now();
Fiber(function() {
id = Posts.insert({
post: post_text,
twitter_id_str: reply.id_str,
created: ts
});
}).run();
}else {
console.log('error: ' + JSON.stringify(err,0,4));
// TODO maybe store locally even though it failed on twitter
// and run service in background to push them later?
}
}
);
}
}
I thought this might help others encountering this issue. I have not yet tested calling an async external service after internal code and wrapping that in a Fiber; that might be worth testing as well. In my case I needed to know the remote action had happened before I did my local action.
Hope this contributes to this question thread.