I just started playing around with Jasmine and I'm still struggling with the spyOn/mocking side of things. For example, I have a function
module.exports = (() => {
  ....
  function getUserInfo(id) {
    return new Promise((resolve, reject) => {
      redis.getAsync(id).then(result => {
        resolve(result)
      })
    })
  }
  return { getUserInfo: getUserInfo }
})()
Then I start writing the Jasmine spec
describe('Test user helper', () => {
  let userInfo

  beforeEach(done => {
    userHelper.getUserInfo('userid123')
      .then(info => {
        userInfo = info
        done()
      })
  })

  it('return user info if user is found', () => {
    expect(userInfo).toEqual('info of userid 123')
  })
})
It runs well, but my question is how can I mock the redis.getAsync call, so it can become a real isolated unit test?
Thanks.
Good question. You can mock out the redis dependency, but only if you rewrite your code slightly to make it more testable.
Here, that means making redis a parameter to the factory that returns the object containing getUserInfo.
Of course, this changes the API: callers now need to call the export to get the object. To fix this, we can create a wrapper module that calls the factory with the standard redis object and returns the result. Then we move the actual factory into an inner module, which can still be tested in isolation.
Here is what that might look like.
user-helper/factory.js
module.exports = redis => {
  ....
  function getUserInfo(id) {
    return redis.getAsync(id); // note: simplified, as new Promise was not needed
  }
  return {getUserInfo};
};
user-helper/index.js
// this is the wrapper that preserves the existing API
// (build or require your promisified redis client here, however the original module obtained it)
module.exports = require('./factory')(redis);
And now for the test
const userHelperFactory = require('./user-helper/factory');

function createMockRedis() {
  const users = [
    {userId: 'userid123'},
    // etc.
  ];
  return {
    getAsync: function (id) {
      // Note: I do not know off hand what redis returns, or if it throws,
      // if there is no matching record - adjust this to match.
      return Promise.resolve(users.find(user => user.userId === id));
    }
  };
}
describe('Test user helper', () => {
  const mockRedis = createMockRedis();
  const userHelper = userHelperFactory(mockRedis);
  let userInfo;

  beforeEach(async () => {
    userInfo = await userHelper.getUserInfo('userid123');
  });

  it('must return user info when a matching user exists', () => {
    expect(userInfo).toEqual({userId: 'userid123'}); // matches what the mock resolves with
  });
});
NOTE: As discussed in the comments, this was just my incidental approach to the situation at hand. There are plenty of other setups and conventions you can use, but the primary idea was based on the existing export of the result of an IIFE, which is a solid pattern, and I leveraged the NodeJS /index convention to preserve the existing API. You could also use one file and export via both module.exports = factory(redis) and module.exports.factory = factory, but that would, I believe, be less idiomatic in NodeJS. The broader point is that being able to mock for tests, and testability in general, is just about parameterization.
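For reference, a minimal sketch of that single-file variant (assuming the node_redis + bluebird promisifyAll setup that getAsync implies; swap in however you actually create your client):

// user-helper.js - single-file variant: default instance plus factory
const bluebird = require('bluebird');
const redis = require('redis');
bluebird.promisifyAll(redis.RedisClient.prototype); // provides getAsync and friends
const client = redis.createClient();

function factory(redisClient) {
  function getUserInfo(id) {
    return redisClient.getAsync(id);
  }
  return {getUserInfo};
}

module.exports = factory(client);   // default instance for normal callers
module.exports.factory = factory;   // named export for tests to inject a mock

Tests would then require('./user-helper').factory and pass in the mock redis, exactly as in the spec above.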
Parameterization is wonderfully powerful, and its simplicity is why developers working in functional languages sometimes laugh at OOP programmers, such as yours truly, and our clandestine incantations like "Oh glorious Dependency Injection Container, bequeath unto me an instanceof X" :)
It is not that OOP or DI get it wrong; it is that testability, DI, IoC, etc. are all just about parameterization.
Interestingly, if we were loading redis as a module, and if we were using a configurable module loader, such as SystemJS, we could do this by simply using loader configuration at the test level. Even Webpack lets you do this to some extent, but for NodeJS you would need to monkey-patch the require function or create a bunch of fake packages, neither of which is a good option.
To the OP's specific response
Thanks! That's a good idea, but practically it seems quite strange when I have tons of files to test, for each of which I would need to create a factory and an index.js.
To reduce the burden, you would need to restructure your API surface and simply export factories that consuming code must call, rather than the result of applying those factories. There are tradeoffs, though, and default instances are helpful to consumers.
Related
Can someone please explain the use of this pattern, with an example?
What I'm confused about is that I can have a database instance wherever I want, and I have the flexibility to do anything with it; am I wrong?
The repository pattern is a strategy for abstracting data access.
It's like putting a universal adapter between your application and your data, so it doesn't matter what data storage technology you use. All your app wants is a set of defined operations on items; it shouldn't have to care about how they're stored or where they come from.
Needless to say, the impact of any change is handled in one place instead of cascading all through your code!
Personally, I love this design pattern because it lets me focus on my business logic first instead of dealing with various databases. On top of that, it saves me a huge amount of headaches when it comes to writing tests: instead of stubbing or spying on databases, which can be a pain, I can simply enjoy a mock version of the operations.
Now let's implement a sample in JS. It can be as simple as the code below (a simplified sample, of course):
// userRepository.js
const userDb = [];

module.exports = {
  insert: async (user) => userDb.push(user),
  findAll: async () => userDb,
};
Here is how I use this pattern: first I write something like the code below in five minutes.
// userRepository.js
const userDb = new Map();

module.exports = Object.freeze({
  findById: async (id) => userDb.get(id),
  insert: async (user) => userDb.set(user.id, user),
  findAll: async () => Array.from(userDb.values()),
  removeById: async (id) => userDb.delete(id),
  update: async (updatedUser) => {
    if (!userDb.has(updatedUser.id)) throw new Error('user not found');
    userDb.set(updatedUser.id, { ...userDb.get(updatedUser.id), ...updatedUser });
  },
});
Then I start to write my unit tests for the repository I've just written, for the business use cases, and so on…
Any time I'm satisfied with everything, I can simply switch to a real database, because it's just an IO mechanism, isn't it? :)
So in the code above I'll replace userDb with a real database and write real data access methods, and of course expect all my tests to keep passing.
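For example, a unit test against the in-memory repository above could be as small as this (Jest syntax here purely as an illustration; the userRepository.test.js file name is just a convention):

// userRepository.test.js
const userRepository = require('./userRepository');

test('insert then findById returns the stored user', async () => {
  await userRepository.insert({ id: '1', name: 'Jane' });
  const found = await userRepository.findById('1');
  expect(found).toEqual({ id: '1', name: 'Jane' });
});

test('update rejects for a missing user', async () => {
  await expect(userRepository.update({ id: 'nope' })).rejects.toThrow('user not found');
});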
I have two different libraries that I'm using to make mocks in Jest. The libraries both have a function called get, which is a problem for my current implementation, since get is used by two different libraries. Is it possible to use an alias for mock functions (jest.fn()), or maybe some kind of workaround that doesn't ruin the integrity of the current implementation?
Here is my current implementation and I would I like to keep this way if possible:
let get: jest.Mock<{}>
jest.mock('rxjs/ajax', () => {
  get = jest.fn()
  return { ajax: { get } }
})

let get as cookieGet: jest.Mock<()> // Can I do something like this
jest.mock('js-cookie', () => {
  get = jest.fn()
  return { get }
})
I'm not too familiar with aliases in JS or how Jest handles things like this, so any help is much appreciated.
It's unnecessary to use the { get } shorthand property syntax for an object literal if it results in name collisions.
Another problem is that a variable needs to have a mock prefix in order to be used in the scope of a jest.mock factory function. As the documentation states:
A limitation with the factory parameter is that, since calls to jest.mock() are hoisted to the top of the file, it's not possible to first define a variable and then use it in the factory. An exception is made for variables that start with the word 'mock'. It's up to you to guarantee that they will be initialized on time!
It can be:

import ... from 'rxjs/ajax';
import ... from 'js-cookie';

let mockRxAjaxGet: jest.Mock<{}>
jest.mock('rxjs/ajax', () => {
  mockRxAjaxGet = jest.fn()
  return { ajax: { get: mockRxAjaxGet } }
})

let mockJsCookieGet: jest.Mock<{}>
jest.mock('js-cookie', () => {
  mockJsCookieGet = jest.fn()
  return { get: mockJsCookieGet }
})
The problem is that once jest.mock is hoisted above the imports, it will be evaluated while the let variables are still in the temporal dead zone and cannot be assigned.
So let should preferably be changed to var, which is hoisted. Alternatively, the mocked functions can be imported as usual and used with get as jest.Mock<...> where a spy is expected; the mocked helper can be used to enforce TypeScript type safety.
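Here is a minimal sketch of the var variant (the mock prefix is kept so the hoisted factories are allowed to reference the variables; the final assertion is only illustrative and assumes a typical babel-jest setup):

import { ajax } from 'rxjs/ajax';
import Cookies from 'js-cookie';

// var declarations are hoisted by the engine, so the jest.mock factories
// (which Jest hoists above the imports) can assign to them without a TDZ error
var mockRxAjaxGet;
var mockJsCookieGet;

jest.mock('rxjs/ajax', () => {
  mockRxAjaxGet = jest.fn()
  return { ajax: { get: mockRxAjaxGet } }
})

jest.mock('js-cookie', () => {
  mockJsCookieGet = jest.fn()
  return { get: mockJsCookieGet }
})

it('keeps the two get mocks separate', () => {
  expect(ajax.get).toBe(mockRxAjaxGet)
  expect(Cookies.get).toBe(mockJsCookieGet)
})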
I just wanted to know how to group all of my API calls together in an api.js file in my React app (some pseudocode would work). I have read an interesting article that introduces the idea, and I feel curious because that file structure really fits my needs. How would it look?
Moreover, the author states in a comment:
I usually just put all of my API calls into that file - they're usually small one-or-two-line functions that call out to axios, and I just export them.

export function login(username, password) { ... }
export function getFolders() { ... }

etc.
But I feel it lacks some details to reproduce it. I am new to JavaScript and React. Thanks.
Say you are using axios for HTTP calls; I guess it would be something like this:
api.js:
import axios from 'axios';
import { resolve } from './resolver.js';

export async function login(user, pass) {
  return await resolve(axios.post('http://some-api.com/auth', { user, pass }).then(res => res.data));
}

export async function getUser(id) {
  return await resolve(axios.get(`http://some-api.com/users/${id}`).then(res => res.data));
}
// and so on....
And as he said in the post, if your file starts to get too big, you can create a src/api/ directory with separate files like src/api/auth.js, src/api/users.js, etc.
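A rough sketch of that split (the index.js re-export is optional, just a convention I like so callers can keep a single import):

// src/api/auth.js
export async function login(user, pass) { /* axios.post(...) as above */ }

// src/api/users.js
export async function getUser(id) { /* axios.get(...) as above */ }

// src/api/index.js - re-export everything so components can still do
// `import * as api from './api'`
export * from './auth';
export * from './users';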
To resolve the promises I like to use the new async/await syntax and wrap it in a little module resolver.js:
export async function resolve(promise) {
  const resolved = {
    data: null,
    error: null
  };

  try {
    resolved.data = await promise;
  } catch (e) {
    resolved.error = e;
  }

  return resolved;
}
And your component would look something like:
// ...
componentDidMount() {
  this.getUser();
}

async getUser() {
  const user = await api.getUser(this.props.id);

  if (user.error)
    this.setState({ error: user.error });
  else
    this.setState({ user: user.data });
}
Again, this is just something I like to do; I think the code looks clearer when it keeps a synchronous structure. But I guess it's perfectly fine to resolve your promises with .then() and .catch() as well.
I hope that helped!
It depends on how many API functions a project has.
I usually stick with the project structure called "grouping by file type" mentioned on the official React website, and keep API-related files in a separate directory where every file holds the API functionality dedicated to a specific entity.
However, for small projects, it makes sense to keep all API functionality in one file.
I'm trying to figure out a way to mock redis in this module:
const Redis = require('ioredis');

const myFunction = {
  exists: (thingToCheck) => {
    let redis_client = new Redis(
      6379,
      process.env.REDIS_URL,
      {
        connectTimeout: 75,
        dropBufferSupport: true,
        retryStrategy: functionHere
      });

    redis_client.exists(thingToCheck, function (err, resp) {
      // handlings in here
    });
  }
};
Using this test-code:
const LambdaTester = require('lambda-tester');
const chai = require('chai');
const expect = chai.expect;
const sinon = require('sinon');
const mockRedis = sinon.mock(require('ioredis'));
describe( 'A Redis Connection error', function() {
before(() => {
mockRedis.expects('constructor').returns({
exists: (sha, callback) => {
callback('error!', null);
}
});
});
it( 'It returns a database error', function() {
return LambdaTester(lambdaToTest)
.event(thingToCheck)
.expectError((err) => {
expect(err.message).to.equal('Database error');
});
});
});
I also tried a few variations, but I'm kind of stuck as I basically need to mock the constructor and I'm not sure Sinon supports this?
mockRedis.expects('exists').returns(
  (thing, callback) => {
    callback('error!', null);
  }
);

sinon.stub(mockRedis, 'constructor').callsFake(() => console.log('test!'));
sinon.stub(mockRedis, 'exists').callsFake(() => console.log('test!'));
Not sure what else to try here. I also tried using rewire as suggested here, but using mockRedis.__set__("exists", myMock); never set that private variable.
I want to fake my error paths ultimately.
I would love to hear what others are doing to test redis in node js 😄.
Your problem is not whether Sinon supports this or that, but rather a missing understanding of how "classes" work in ECMAScript, as shown by the attempt at stubbing the constructor property in the test code. That will never work, as that property has nothing to do with how any resulting objects turn out; it is simply a reference to the function that was used to create the object. I have covered a very similar topic on the Sinon tracker that you might have interest in reading to gain some core JS foo :-) Basically, it is not possible to stub a constructor, but you can probably coerce your code to use another constructor function in its place through either DI or link seams.
As a matter of fact, a few answers down in the same thread, you will see me covering an example of how I myself designed a Redis-using class to be easily testable by supporting dependency injection. You might want to check it out, as it is more or less directly applicable to your example module above.
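To make that concrete for your module, dependency injection can be as simple as making the constructor a parameter (a rough sketch; the createMyFunction name and the trimmed-down options are just placeholders):

// my-module.js - the Redis constructor becomes injectable,
// defaulting to the real ioredis for production callers
const Redis = require('ioredis');

module.exports = function createMyFunction(RedisCtor = Redis) {
  return {
    exists: (thingToCheck) => {
      const redisClient = new RedisCtor(
        6379,
        process.env.REDIS_URL,
        {
          connectTimeout: 75,
          dropBufferSupport: true
        });

      redisClient.exists(thingToCheck, function (err, resp) {
        // handlings in here
      });
    }
  };
};

A test can then call the exported factory with a fake constructor whose exists invokes its callback with an error, which exercises the error path without stubbing any constructor.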
Another technique, which you have already tried getting to work, is using link seams (with rewire). The Sinon homepage has a nice article on doing this. Both rewire and proxyquire will do the job just fine here; I think you have just complicated the matter a bit by wrapping the require statement with a mock.
Even though I am on the Sinon maintainer team, I never use the mock functionality, so I cannot tell you how to use that, as I think it obscures the testing. But to get the basic link seams working using rewire, I would basically remove all the Sinon code first and get the basic case going (replacing redis with a stubbed module you have created).
Only then, add Sinon code as required.
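For reference, the basic link-seam case with proxyquire can be as small as this (no Sinon yet; './my-module' is a placeholder for wherever your module lives):

const proxyquire = require('proxyquire');

// a stand-in constructor that proxyquire hands out in place of ioredis
function RedisStub() {}
RedisStub.prototype.exists = (sha, callback) => callback(new Error('error!'), null);

// the module under test now builds RedisStub instances internally
const myFunction = proxyquire('./my-module', { ioredis: RedisStub });

// myFunction.exists(...) will hit the stubbed client's error path,
// and you can layer Sinon spies or stubs on top once this basic case works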
I have created a web server in Node.js using Express and Passport. It authenticates using an OAuth 2.0 strategy (https://www.npmjs.com/package/passport-canvas). When authenticated, I want to make a call such as:
app.get("/api/courses/:courseId", function(req, res) {
// pass req.user.accessToken implicitly like
// through an IIFE
createExcelToResponseStream(req.params.courseId, res).catch(err => {
console.log(err);
res.status(500).send("Ops!");
});
});
My issue is that I would like, in all subsequent calls from createExcelToResponseStream, to have access to my accessToken. I need to do a ton of API calls later in my business layer. I will call a method that looks like this:
const rq = require("request");

const request = url => {
  return new Promise((resolve, reject) => {
    rq.get(
      url,
      {
        auth: {
          bearer: CANVASTOKEN // should be req.user.accessToken
        }
      },
      (error, response) => {
        if (error) {
          return reject(new Error(error));
        }
        resolve(response);
      }
    );
  });
};
If I try to create global access to the access token, I will risk race conditions (I think) - i.e. that people get responses in the context of another person's access token.
If I pass the context as a variable, I have to refactor a lot of my code base, and a lot of business-layer functions have to know about something they don't need to know about.
Is there any way in JavaScript where I can pass the context across functions, modules and files, through the entire call stack (by scope, apply, bind, this...)? A bit the same way you could in a multithreaded environment where you have one user context per thread.
The only thing you could do would be
.bind(req);
But that has to be chained into every inner function call:
somefunc.call(this);
Or you use inline arrow functions only
(function () {
  const inner = () => alert(this);
  inner();
}).bind("Hi!")();
Alternatively, you could put all the functions onto an object and then create a new instance:
var reqAuthFunctions = {
  example: function () { alert(this.accessToken); }, // a regular function, so `this` is the instance
  accessToken: null
};

var instance = Object.assign(Object.create(reqAuthFunctions), { accessToken: 1234 });
instance.example();
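Applied to the route from your question, that pattern would mean one instance per request, e.g. (a rough sketch; fetchCourse and its URL are made-up placeholders):

const rq = require("request");

const authedApi = {
  accessToken: null,
  // any function reached through the instance sees its own request's token
  fetchCourse(courseId) {
    return new Promise((resolve, reject) =>
      rq.get(`https://example.com/api/v1/courses/${courseId}`,
        { auth: { bearer: this.accessToken } },
        (error, response) => (error ? reject(new Error(error)) : resolve(response))));
  }
};

app.get("/api/courses/:courseId", function (req, res) {
  // one instance per request, so tokens cannot leak between users
  const ctx = Object.assign(Object.create(authedApi), {
    accessToken: req.user.accessToken
  });

  ctx.fetchCourse(req.params.courseId)
    .then(() => res.sendStatus(200))
    .catch(err => {
      console.log(err);
      res.status(500).send("Ops!");
    });
});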
You could use a Promise to avoid race conditions.
Let's have this module:
// ContextStorage.js
let gotContext;
let failedGettingContext;

const getContext = new Promise((resolve, reject) => {
  gotContext = resolve;
  failedGettingContext = reject;
});

export { getContext, gotContext, failedGettingContext };
And this initialization:
// init.js
import {gotContext} from './ContextStorage';
fetch(context).then(contextIGot => gotContext(contextIGot));
And this thing that needs the context:
// contextNeeded.js
import {getContext} from './ContextStorage';
getContext.then(context => {
// Do stuff with context
}
This is obviously not very usable code, since it all executes on load, but I hope it gives you a framework of how to think about this issue with portals... I mean Promises...
When you call the imported gotContext, you actually resolve the promise exported as getContext. Hence, no matter the order of operations, you either resolve the promise after the context has been requested, setting the dependent operation in motion, or your singleton already holds a resolved promise, and the dependent operation continues right away.
On another note, you could easily fetch the context in the 'body' of the promise in the 'ContextStorage' singleton. However, that's not very modular, now is it? A better approach would be to inject the initializing function into the singleton in order to invert control, but I feel that would obfuscate the code a bit, hindering the purpose of the demonstration.