I've been wondering what the preferred way of injecting dependencies in Node.js code is.
I've been working on a project that has no dependency injection framework, so I have two ways of providing dependencies in my code:
Through constructor/function arguments - this has the disadvantage of exploding the number of arguments in functions, as I pass arguments from the higher levels of my program down to the lower ones,
function createListener(queue) {
  return function listen() {
    while (true) {
      const messages = queue.receiveMessages();
      ...
    }
  };
}
Using require() - this is equivalent to hardcoding those dependencies, making it harder to mock and test.
const queue = require('./queue');

function createListener() {
  return function listen() {
    while (true) {
      const messages = queue.receiveMessages();
      ...
    }
  };
}
I've been trying to hit a sweet spot between those. When providing a dependency that is a complex mechanism, I tend to inject it; when dealing with values or less crucial mechanisms, I use require. A sketch of that hybrid is below.
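For illustration, a minimal sketch of the hybrid (the ./config module and its batchSize value are hypothetical):
// The complex mechanism (queue) is injected and easy to mock;
// plain values come from a require()'d module.
const config = require('./config'); // hypothetical module exporting plain values

function createListener(queue) {
  return function listen() {
    while (true) {
      const messages = queue.receiveMessages(config.batchSize);
      ...
    }
  };
}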
Is that ok? What would be a better approach?
I have two different libraries that I'm using to make mocks in Jest. Both libraries have a function called get, which is a problem for my current implementation. Since get is used by two different libraries, is it possible to use an alias for mock functions (jest.fn()), or maybe some kind of workaround that doesn't ruin the integrity of the current implementation?
Here is my current implementation, and I would like to keep it this way if possible:
let get: jest.Mock<{}>
jest.mock('rxjs/ajax', () => {
  get = jest.fn()
  return { ajax: { get } }
})

let get as cookieGet: jest.Mock<()> // Can I do something like this
jest.mock('js-cookie', () => {
  get = jest.fn()
  return { get }
})
I'm not too familiar with aliases in JS or how Jest handles things like this, so any help is much appreciated.
It's unnecessary to use the { get } shorthand property syntax for an object literal if it results in name collisions.
Another problem is that a variable needs to have a mock prefix in order to be used in the scope of a jest.mock factory function. As the documentation states,
A limitation with the factory parameter is that, since calls to jest.mock() are hoisted to the top of the file, it's not possible to first define a variable and then use it in the factory. An exception is made for variables that start with the word 'mock'. It's up to you to guarantee that they will be initialized on time!
It can be:
import ... from 'rxjs/ajax';
import ... from 'js-cookie';
let mockRxAjaxGet: jest.Mock<{}>
jest.mock('rxjs/ajax', () => {
  mockRxAjaxGet = jest.fn()
  return { ajax: { get: mockRxAjaxGet } }
})

let mockJsCookieGet: jest.Mock<{}>
jest.mock('js-cookie', () => {
  mockJsCookieGet = jest.fn()
  return { get: mockJsCookieGet }
})
The problem is that once jest.mock is hoisted above the imports, it will be evaluated while the let variables are still in the temporal dead zone and cannot be assigned.
So let should preferably be changed to var, which is hoisted. Alternatively, the mocked function can be imported as usual and cast with get as jest.Mock<...> wherever a spy is expected; the mocked helper can be used to enforce TypeScript type safety.
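For illustration, a minimal sketch of the var variant (module names as in the question; the test body is hypothetical):
import { ajax } from 'rxjs/ajax';
import * as Cookies from 'js-cookie';

var mockRxAjaxGet: jest.Mock;
jest.mock('rxjs/ajax', () => {
  mockRxAjaxGet = jest.fn();
  return { ajax: { get: mockRxAjaxGet } };
});

var mockJsCookieGet: jest.Mock;
jest.mock('js-cookie', () => {
  mockJsCookieGet = jest.fn();
  return { get: mockJsCookieGet };
});

it('tracks each get under its own alias', () => {
  // Each spy now has a distinct name, so there is no collision.
  ajax.get('/api');
  Cookies.get('session');
  expect(mockRxAjaxGet).toHaveBeenCalledWith('/api');
  expect(mockJsCookieGet).toHaveBeenCalledWith('session');
});
The mocked helper mentioned above ships with ts-jest (import { mocked } from 'ts-jest/utils') and narrows an imported module's type to its mocked counterpart.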
Please help me deal with this garbage I produced:
Program.prototype.init = function () {
  loadText('../res/shaders/blinnPhong-shader.vsh', function (vshErr, vshText) {
    if (vshErr) {
      alert('Fatal error loading vertex shader.');
      console.error(vshErr);
    } else {
      loadText('../res/shaders/blinnPhong-shader.fsh', function (fshErr, fshText) {
        if (fshErr) {
          alert('Fatal error loading fragment shader.');
          console.error(fshErr);
        } else {
          loadJSON('../res/models/dragon.json', function (modelErr, modelObj) {
            if (modelErr) {
              alert('Fatal error loading model.');
              console.error(modelErr);
            } else {
              loadImage('../res/textures/susanTexture.png', function (imgErr, img) {
                if (imgErr) {
                  alert('Fatal error loading texture.');
                  console.error(imgErr);
                } else {
                  this.run = true;
                  RunProgram(vshText, fshText, img, modelObj);
                }
              });
            }
          });
        }
      });
    }
  });
};
My actual goal is to abstract the resource loading process for a WebGL program.
That means in the future there will be arrays of meshes, textures, shaders and I want to be able to connect certain dependencies between resources. For example: I want to create two GameObjects One and Two. One uses shaders and is loaded from a mesh but has no texture, whereas Two uses the same shaders as One but uses its own mesh and also needs a texture. What principles could I use to achieve building these dependencies in JavaScript (with asynchronous loading and so on)?
Edit:
So the following is happening with this code: I kept callbacks for now. However, this method is part of a singleton object. I edited the code because in the last else case I am setting a flag of the program to true. I keep a global reference to the program object in my main. However, due to the callbacks the reference is somehow lost: the global reference keeps its flag at false, so the main loop is never reached. It is clearly a problem with the callbacks, since the flag is set when I call this.run = true outside the nested callbacks. Any advice on that?
Using modern APIs like Promises and Fetch, plus sugar like arrow functions, your code can become:
Program.prototype.init = function () {
  return Promise.all([
    fetch('../res/shaders/blinnPhong-shader.vsh').then(r => r.text()),
    fetch('../res/shaders/blinnPhong-shader.fsh').then(r => r.text()),
    fetch('../res/models/dragon.json').then(r => r.json()),
    new Promise(function (resolve, reject) {
      var i = new Image();
      i.onload = () => resolve(i);
      i.onerror = () => reject('Error loading image ' + i.src);
      i.src = '../res/textures/susanTexture.png';
    })
  ])
  // Promise.all resolves with an array, so spread it into RunProgram,
  // which expects (vshText, fshText, img, modelObj).
  .then(([vshText, fshText, modelObj, img]) => RunProgram(vshText, fshText, img, modelObj));
};
You could spice things up even further by using related ES2017 features like async functions/await or go all in on compatibility by forgoing arrow functions and using seamless polyfills for promises and fetch. For some simple request caching, wrap fetch:
const fetchCache = Object.create(null);
function fetchCached (url) {
  // Cache the promise (not the resolved response) so that concurrent
  // requests for the same URL share a single fetch.
  if (!fetchCache[url])
    fetchCache[url] = fetch.apply(null, arguments);
  // A Response body can only be consumed once, so hand each caller a clone.
  return fetchCache[url].then(r => r.clone());
}
Note that you want your resources to be unique, so the above-mentioned caching still needs another layer of actual GPU resource caching on top of it: you don't want to create multiple shader programs from the same shader code, or multiple array buffers with the same vertex data in them.
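For illustration, the async/await variant mentioned above might look like the following sketch (same resource paths and RunProgram as before; loadImage is the Image-wrapping promise from the earlier snippet, extracted into a helper):
// Helper: wrap image loading in a promise (same logic as above).
function loadImage(src) {
  return new Promise(function (resolve, reject) {
    var i = new Image();
    i.onload = () => resolve(i);
    i.onerror = () => reject('Error loading image ' + src);
    i.src = src;
  });
}

Program.prototype.init = async function () {
  const [vshText, fshText, modelObj, img] = await Promise.all([
    fetch('../res/shaders/blinnPhong-shader.vsh').then(r => r.text()),
    fetch('../res/shaders/blinnPhong-shader.fsh').then(r => r.text()),
    fetch('../res/models/dragon.json').then(r => r.json()),
    loadImage('../res/textures/susanTexture.png')
  ]);
  RunProgram(vshText, fshText, img, modelObj);
};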
Your actual core question as to how you could manage dependencies is a bit too broad / application-specific to be answered here on SO. Regarding managing the async nature in such an environment, I see two options:
Use placeholder resources and seamlessly replace them once the actual resources are loaded
Wait until everything is loaded before you insert the GameObject into the rendering pipeline
Both approaches have their pros and cons, but usually I'd recommend the first option.
You can use promises for this. With the bluebird module, you can convert loadText into a promise-returning function with Promise.promisifyAll(the module loadText comes from), or, if it's your own module, you can make it return a new Promise(function (resolve, reject) {}).
Using promises, you can make an array of all the promises you want to run and pass it to Promise.all([loadText('shader'), loadText('other shader'), ...]).
More information on promises
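For illustration, a minimal sketch of the bluebird route (the ./loaders module path is hypothetical; note that promisifyAll adds Async-suffixed copies of the callback-style methods):
const Promise = require('bluebird');

// Hypothetical module exporting callback-style loadText(path, cb), etc.
const loaders = Promise.promisifyAll(require('./loaders'));

Promise.all([
  loaders.loadTextAsync('shader'),
  loaders.loadTextAsync('other shader')
]).then(([shader, otherShader]) => {
  // All resources are available here.
});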
I just started playing around with Jasmine and I'm still struggling with the spyOn/mocking things, e.g., I have a function
module.exports = (() => {
  ....
  function getUserInfo(id) {
    return new Promise((resolve, reject) => {
      redis.getAsync(id).then(result => {
        resolve(result)
      })
    })
  }
  return { getUserInfo: getUserInfo }
})()
Then I start writing the Jasmine spec
describe('Test user helper', () => {
  let userInfo
  beforeEach(done => {
    userHelper.getUserInfo('userid123')
      .then(info => {
        userInfo = info
        done()
      })
  })
  it('return user info if user is found', () => {
    expect(userInfo).toEqual('info of userid 123')
  })
})
It runs well, but my question is how can I mock the redis.getAsync call, so it can become a real isolated unit test?
Thanks.
Good question. You can mock out the redis dependency, but only if you rewrite your code slightly to be more testable.
Here, that means making redis a parameter to the factory that returns the object containing getUserInfo.
Of course, this changes the API, callers now need to call the export to get the object. To fix this, we can create a wrapper module that calls the function with the standard redis object, and returns the result. Then we move the actual factory into an inner module, which still allows it to be tested.
Here is what that might well look like
user-helper/factory.js
module.exports = redis => {
  ....
  function getUserInfo(id) {
    return redis.getAsync(id); // note: simplified, as `new Promise` was not needed
  }
  return { getUserInfo };
};
user-helper/index.js
// This wrapper preserves the existing API by applying the factory to the
// real client. How that client is created is app-specific, so the path
// below is hypothetical.
const redis = require('./redis-client'); // hypothetical: your promisified client instance
module.exports = require('./factory')(redis);
And now for the test
const userHelperFactory = require('./user-helper/factory');
function createMockRedis() {
  const users = [
    { userId: 'userid123' },
    // etc.
  ];
  return {
    getAsync: function (id) {
      // Note: I do not know off hand what redis returns, or if it throws,
      // if there is no matching record - adjust this to match.
      return Promise.resolve(users.find(user => user.userId === id));
    }
  };
}
describe('Test user helper', () => {
  const mockRedis = createMockRedis();
  const userHelper = userHelperFactory(mockRedis);
  let userInfo;

  beforeEach(async () => {
    userInfo = await userHelper.getUserInfo('userid123');
  });

  it('must return user info when a matching user exists', () => {
    // The mock resolves with the matching user record.
    expect(userInfo).toEqual({ userId: 'userid123' });
  });
});
NOTE: As discussed in the comments, this was just my incidental approach to the situation at hand. There are plenty of other setups and conventions you can use, but the primary idea was simply based on the existing export of the result of an IIFE, which is a solid pattern, and I leveraged the Node.js /index convention to preserve the existing API. You could also use one file and export via both module.exports = factory(redis) and module.exports.factory = factory, but that would, I believe, be less idiomatic in Node.js. The broader point is that being able to mock for tests, and testability in general, is just about parameterization.
Parameterization is wonderfully powerful, and its simplicity is why developers working in functional languages sometimes laugh at OOP programmers, such as yours truly, and our clandestine incantations like "Oh glorious Dependency Injection Container, bequeath unto me an instanceof X" :)
It is not that OOP or DI get it wrong; it is that testability, DI, IoC, etc. are just about parameterization.
Interestingly, if we were loading redis as a module, and if we were using a configurable module loader such as SystemJS, we could do this simply by using loader configuration at the test level. Even Webpack lets you do this to some extent, but for Node.js you would need to monkey-patch the require function or create a bunch of fake packages, neither of which is a good option.
Regarding the OP's specific response:
Thanks! That's a good idea, but practically, it seems it's quite strange when I have tons of file to test in which I will need to create a factory and index.js for each of them.
To reduce that burden, you would need to restructure your API surface and simply export factories that consuming code must call, rather than the result of applying those factories; but there are tradeoffs, and default instances are helpful to consumers.
I want to know if there is a better way to define the callback functions of an Angular 2 observable subscribe when dealing with HTTP calls, without violating the single responsibility principle; embedding the logic leads to ugly, dirty code.
I am trying to use function variables instead of arrow functions to separate the callbacks' logic, but then I can't access this or local function variables (state in the example).
updateState(state: string) {
  let proposition = new Proposition();
  proposition.id = this.id;
  proposition.state = state;
  this.propositionService.updateProposition(proposition).subscribe(
    (data) => {
      ....
      // instruction using local variable
      this.router.navigate(['/portfolio', state]);
      ....
    },
    .....
    // instruction using this
    (errors) => this.toastr.warning('Error.', 'ops !');
    .....
}
There are many options and all have upsides and downsides. You should choose the one with the most upsides and the fewest downsides on a case by case basis.
Here are a few options (there are many more)
Create a local binding for an arrow function.
updateState(state: string) {
  const withNext = (data: { values: {}[] }) => {
    console.info(data.values);
    ....
    // instruction using local variable
    this.router.navigate(['/portfolio', state]);
    ....
  };
  const withError = error => {
    this.toastr.warning('Error.', error);
  };
  this.propositionService.updateProposition(proposition)
    .subscribe(withNext, withError);
}
The downsides of this approach are that you need to create the callbacks before you use them, because the assignments will not be hoisted, and that you lose the type inference of the callback arguments, needing to restate the argument types redundantly.
To get around the declaration order issue, we can create a local function declaration
updateState(state: string) {
  const that = this; // alias `this` before subscribing so the callbacks can use it
  this.propositionService.updateProposition(proposition)
    .subscribe(withNext, withError);

  function withNext(data: { values: {}[] }) {
    console.info(data.values);
    ....
    // instruction using local variable
    that.router.navigate(['/portfolio', state]);
    ....
  }

  function withError(error) {
    that.toastr.warning('Error.', error);
  }
}
The downsides of this approach are that you need to alias this, and that, again, we lose type inference and must resort to redundantly, and perhaps incorrectly, specifying the argument types of the callbacks by hand.
If the observable only emits a single value, for example if it represents an HTTP request, we can use toPromise and enjoy clear and clean code with full type inference and no need for callbacks.
async updateState(state: string) {
  try {
    const data = await this.propositionService.updateProposition(proposition)
      .toPromise();
    console.info(data.values);
    ....
    // instruction using local variable
    this.router.navigate(['/portfolio', state]);
    ....
  } catch (error) {
    this.toastr.warning('Error.', error);
  }
}
The downside is that this approach only works for observables that emit at most a single value (e.g. HTTP requests).
The state parameter is accessible to all local declarations regardless of the approach, and is not a factor unless you wish to extract the success and failure logic to a location outside of the updateState method, as sketched below.
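For illustration, a minimal sketch of that extraction (the handler method names are hypothetical); since the extracted methods cannot close over state, it travels as an explicit argument:
updateState(state: string) {
  this.propositionService.updateProposition(proposition).subscribe(
    data => this.onUpdateSuccess(data, state),
    error => this.onUpdateError(error)
  );
}

// Hypothetical extracted handlers.
private onUpdateSuccess(data: { values: {}[] }, state: string) {
  console.info(data.values);
  this.router.navigate(['/portfolio', state]);
}

private onUpdateError(error: any) {
  this.toastr.warning('Error.', error);
}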
Please could I ask for some advice on a control flow issue with Node and Redis? (aka a Python coder trying to get used to JavaScript)
I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.
Basically I'd like to query a set, and then when I have the results for the set, I need to carry out a get for each result. When I've got all the data, I need to broadcast it back to the client.
Currently I do this inside two callbacks, using a global object, which seems messy. I'm not even sure if it's safe (will the code wait for one client.get to complete before starting another?).
The current code looks like this:
var all_users = [];

// Get all the users for this page.
client.smembers("page:" + current_page_id, function (err, user_ids) {
  // Now get the name of each of those users.
  for (var i = 0; i < user_ids.length; i++) {
    client.get('user:' + user_ids[i] + ':name', function (err, name) {
      var myobj = {};
      myobj[user_ids[i]] = name;
      all_users.push(myobj);
      // Broadcast when we have got to the end of the loop,
      // so all users have been added to the list -
      // is this the best way? It seems messy.
      if (i === (user_ids.length - 1)) {
        socket.broadcast('all_users', all_users);
      }
    });
  }
});
But this seems very messy. Is it really the best way to do this? How can I be sure that all lookups have been performed before calling socket.broadcast?
*scratches head* Thanks in advance for any advice.
I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.
That's what Node is. (I'm pretty sure this topic has been discussed more than enough times here; look through other questions, it's definitely there.)
How can I be sure that all lookups have been performed before calling socket.broadcast?
That's what err is for in the callback function. It's kind of a Node standard: the first parameter in a callback is an error object (null if everything is fine). So just use something like this to be sure no errors occurred:
if (err) {
  ... // handle errors.
  return // or not, it depends.
}
... // process results
But this seems very messy.
You'll get used to it. I actually find it nice when code is well formatted and the project is cleverly structured.
Other ways are:
Using libraries to control async code flow (Async.js, Step.js, etc.)
If spaghetti-style code is what you consider messy, define some named functions to process the results and pass them as parameters instead of anonymous ones, as sketched below.
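For illustration, a minimal sketch of the second option applied to the code from the question (client, socket, and current_page_id are as in the question); a completion counter replaces the fragile loop-index check:
var all_users = [];
var pending = 0;

// Named callback for the set lookup.
function onMembers(err, user_ids) {
  if (err) return console.error(err);
  pending = user_ids.length;
  if (pending === 0) return socket.broadcast('all_users', all_users);
  user_ids.forEach(function (user_id) {
    client.get('user:' + user_id + ':name', onName(user_id));
  });
}

// Returns a named callback bound to one user id.
function onName(user_id) {
  return function (err, name) {
    if (!err) {
      var myobj = {};
      myobj[user_id] = name;
      all_users.push(myobj);
    }
    // Broadcast only after every get has completed.
    if (--pending === 0) {
      socket.broadcast('all_users', all_users);
    }
  };
}

client.smembers('page:' + current_page_id, onMembers);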
If you totally dislike writing stuff callback-style, you might want to try streamlinejs:
var all_users = [];
// Get all the users for this page.
var user_ids = client.smembers("page:" + current_page_id, _);
// Now get the name of each of those users.
for (var i = 0; i < user_ids.length; i++) {
  var name = client.get('user:' + user_ids[i] + ':name', _);
  var myobj = {};
  myobj[user_ids[i]] = name;
  all_users.push(myobj);
}
socket.broadcast('all_users', all_users);
Note that a disadvantage of this variant is that only one username will be fetched at a time. Also, you should still be aware of what this code really does.
Async is a great library and you should take a look. Why? Clean code, a clear process, easy tracking, etc.
Also, keep in mind that all your async callbacks will be processed after your for loop. In your example, this may result in a wrong i value. Use a closure:
for (var i = 0; i < user_ids.length; i++) {
  (function (i) {
    client.get('user:' + user_ids[i] + ':name', function (err, name) {
      var myobj = {};
      myobj[user_ids[i]] = name;
      all_users.push(myobj);
      // Broadcast when we have got to the end of the loop,
      // so all users have been added to the list -
      // is this the best way? It seems messy.
      if (i === (user_ids.length - 1)) {
        socket.broadcast('all_users', all_users);
      }
    });
  })(i);
}
To know when it's finished, you should use a recursive pattern like async does (I think). It's much simpler than doing it yourself.
async.series({
  getMembers: function (callback) {
    client.smembers("page:" + current_page_id, callback);
  }
}, function (err, results) {
  var all_users = [];
  async.forEachSeries(results.getMembers, function (user_id, cb) {
    client.get('user:' + user_id + ':name', function (err, name) {
      var myobj = {};
      myobj[user_id] = name;
      all_users.push(myobj);
      cb(err);
    });
  }, function (err) {
    socket.broadcast('all_users', all_users);
  });
});
This code may not be valid, but you should be able to figure out how to do it.
The Step library is good too (and only ~30 lines of code, I think).
I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.
Right, so everyone agrees callback hell is no bueno. As of this writing, callbacks are a dying pattern in Node. Unfortunately, the Redis library does not have native support for returning Promises.
But there is a module you can require in like so:
const util = require("util");
This is a standard library module included in the Node runtime; it has a bunch of utility functions we can use, one of them being promisify:
https://nodejs.org/api/util.html#util_util_promisify_original
Now, of course, when you asked this question seven years ago, util.promisify(original) did not exist; it was added in the Node v8.0.0 release, so we can now update this question with a current answer.
So promisify is a function, and we can pass it a function like client.get(); it will return a new function that takes the nasty callback behavior and wraps it up nice and neat so that it returns a Promise instead.
So promisify takes any function that accepts a callback as the last argument and makes it return a Promise instead, which sounds like exactly the behavior you wanted seven years ago and are afforded today.
const util = require("util");
// Bind to the client so the promisified get keeps the right `this`.
client.get = util.promisify(client.get).bind(client);
So we are passing a reference to the .get() function to util.promisify().
This takes your function and wraps it up so that instead of taking a callback, it returns a Promise. So util.promisify() returns a new function that has been promisified.
So you can take that new function and override the existing one on client.get().
Nowadays, you do not have to use a callback for a Redis lookup. So now you can use the async/await syntax like so:
const cachedMembers = await client.get('user:' + user_ids[i]);
So we wait for this to be resolved and whatever it resolves with will be assigned to cachedMembers.
The code can be cleaned up even further by using an ES6 array helper method instead of your for loop, as sketched below. I hope this answer is useful for current readers, even though the original question is an old one.
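For illustration, a minimal sketch of that cleanup (same client, user_ids, and socket as in the original question; assumes client.get has been promisified as above and that this runs inside an async function):
// Fetch every name in parallel and broadcast once all have resolved.
const all_users = await Promise.all(
  user_ids.map(async (id) => {
    const name = await client.get('user:' + id + ':name');
    return { [id]: name };
  })
);
socket.broadcast('all_users', all_users);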