Mocking Postgres for unit tests with Sinon.js in Node.js

I am having trouble getting my head around how I can use Sinon to mock a call to Postgres that is required by the module I am testing, or whether it is even possible.
I am not trying to test the Postgres module itself, just my object, to ensure it is working as expected and that it is calling what it should be calling.
I guess the issue is Node's require setup: my module requires the Postgres module to hit the database, but here I don't want to run an integration test. I just want to make sure my code works in isolation, without caring what the database is doing; I will leave that to my integration tests.
I have seen some people setting up their functions to take an optional parameter for the mock/stub/fake, testing for its existence and, if present, using it instead of the required module, but that seems like a smell to me (I am new to Node, so maybe it isn't).
I would prefer to mock this out rather than try to hijack require, if that is possible.
Some code (please note this is not the real code, as I am running with TDD and the function doesn't really do anything yet, but the function names are real):
TEST SETUP
describe('#execute', function () {
  it('should return data rows when executing a select', function () {
    // Not sure what to do here
  });
});
SAMPLE FUNCTION
PostgresqlProvider.prototype.execute = function (query, cb) {
  var self = this;
  // return after each error callback so validation failures don't fall through
  if (self.connection === "")
    return cb(new Error('Connection can not be empty, set Connection using Init function'));
  if (query === null)
    return cb(new Error('Invalid Query Object - Query Object is Null'));
  if (!query.buildCommand)
    return cb(new Error('Invalid Query Object'));
  // Valid connection and query
};
It might look a bit funny to wrap the Postgres module like this, but there is a design reason: this app will have several "providers" and I want to expose the same API for them all so I can use them interchangeably.
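For illustration, the optional-parameter pattern mentioned above might look something like this (a sketch, not my real code; the pgOverride name and the pg.connect callback style are made up):
PostgresqlProvider.prototype.execute = function (query, cb, pgOverride) {
  var pgClient = pgOverride || pg; // fall back to the required module
  // ...validation as above, then:
  pgClient.connect(this.connection, function (err, client, done) {
    // build and run the command against client here
  });
};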
UPDATE
I decided that my test was too complicated: I was checking that the connect call had been made AND that data was returned, which smelt to me, so I stripped it back and split it into two tests:
The Mock Test
it('should call pg.connect when a valid Query object is parsed', function () {
  var mockPg = sinon.mock(pg);
  mockPg.expects('connect').once(); // note: .once without parentheses is a no-op
  Provider.init('ConnectionString');
  Provider.execute(stubQueryWithBuildFunc, null, mockPg);
  mockPg.verify();
});
This works (I think): without the Postgres connector code it fails, and with it, it passes (boom!).
The issue now is with the second method, for which I am going to use a stub (maybe a spy); it is passing 100% when it should fail, so I will pick that up in the morning.
UPDATE 2
I am not 100% happy with the test, mainly because I am not hijacking the client.query method, which is the one that hits the database; I am simply stubbing my execute method and forcing it down a path. Still, it allows me to see the result and assert against it to test behaviour, and I would be open to any suggested improvements.
I am using a stub to catch the method and return null and a faux object containing rows, like the real method would pass back. This test will change as I add more Query behaviour, but it gets me over my hurdle.
it('should return data rows when a valid Query object is parsed', function () {
  var fauxRows = [
    { 'id': 1000, 'name': 'Some Company A' },
    { 'id': 1001, 'name': 'Some Company B' }
  ];
  var stubPg = sinon.stub(Provider, 'execute').callsArgWith(1, null, fauxRows);
  Provider.init('ConnectionString');
  Provider.execute(stubQueryWithBuildFunc, function (err, rows) {
    rows.should.have.length(2);
  }, stubPg);
  stubPg.called.should.equal(true); // was .should.equal.true, a no-op property chain
  stubPg.restore();
});
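For anyone wanting to go a level deeper, here is a sketch that fakes pg at the client.query level instead, assuming execute eventually calls pg.connect, runs the query, and forwards result.rows to the callback (the fake objects are made up):
var stubClient = { query: sinon.stub().callsArgWith(1, null, { rows: fauxRows }) };
var fakePg = { connect: sinon.stub().callsArgWith(1, null, stubClient, function () {}) };

Provider.init('ConnectionString');
Provider.execute(stubQueryWithBuildFunc, function (err, rows) {
  rows.should.have.length(2);
}, fakePg);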

Use pg-pool: https://www.npmjs.com/package/pg-pool
It's about to be added to pg anyway, and purportedly makes unit testing (with mocks) easier. From BrianC (https://github.com/brianc/node-postgres/issues/1056#issuecomment-227325045):
Checkout https://github.com/brianc/node-pg-pool - it's going to be the pool implementation in node-postgres very soon and doesn't rely on singletons which makes mocking much easier. Hopefully that helps!
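A minimal sketch of why that helps (illustrative names and config): the pool is an instance you construct rather than a singleton, so a test can hand in a fake:
var Pool = require('pg-pool');

function PostgresqlProvider(pool) {
  // Accept the pool as a constructor argument so tests can pass a stub
  this.pool = pool || new Pool({ database: 'mydb' });
}

PostgresqlProvider.prototype.execute = function (query, cb) {
  this.pool.query(query.buildCommand(), function (err, result) {
    cb(err, result && result.rows);
  });
};

// In a test, no real pool is needed:
var fakePool = { query: sinon.stub().callsArgWith(1, null, { rows: fauxRows }) };
var provider = new PostgresqlProvider(fakePool);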

I very explicitly replace my dependencies. It's probably not the best solution, but all the other solutions I saw weren't that great either.
inject: function (_mock) {
  // 'real' is the module-level reference to the actual dependency
  if (_mock) { real = _mock; }
}
You add this code to the module under test. In my tests I call the inject method and replace the real object. The reason I don't like it 100% is that you have to add extra code only for testing.
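For context, the full pattern might look something like this (a sketch; provider.js and the pg dependency are placeholders):
// provider.js (the module under test)
var real = require('pg'); // the actual dependency

module.exports = {
  execute: function (query, cb) {
    // always go through 'real', so a test can swap it out
    real.connect(/* ... */);
  },
  inject: function (_mock) {
    if (_mock) { real = _mock; }
  }
};

// in a test:
var provider = require('./provider');
provider.inject({ connect: sinon.stub() });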
The other solution is to read the module file as a string and use vm to manually load the file. When I investigated this I found it a little too complex, so I went with just using the inject function. It's probably worth investigating this approach, though. You can find more information here.

Related

Using sinon, how does one fake/stub/mock a non-existing JS API in a way that sinon.restore removes all trace of it afterwards?

While running my tests I sometimes want to provide an API for my tests. I want this API to be defined for the duration of the test(s) alone, so I want to ensure that sinon.restore() removes it. This is not replacing an existing JS API (e.g. NOT something like window.requestAnimationFrame). This API is assumed to exist globally (e.g. on the global/window object).
Now, if I didn't care about removing this API after the test was done, I would do the following:
globalAPIObject.someTestApi = sinon.fake.returns('something');
However sinon.restore won't/can't remove globalAPIObject.someTestApi after the test has run.
I would like to be able to use fake the same way as stub (but sinon doesn't provide this):
// !! this API doesn't exist !!
sinon.fake(globalAPIObject, 'someTestApi').returns('something');
// !! this API doesn't exist !!
So I use stub instead:
// This doesn't work if globalAPIObject.someTestAPi doesn't already exist.
sinon.stub(globalAPIObject, 'someTestApi').returns('something');
However this only works for replacing props/functions that already exist, so I have to do:
globalAPIObject.someTestApi = () => {};
sinon.stub(globalAPIObject, 'someTestApi').returns('something');
Which is less than ideal, as globalAPIObject.someTestApi isn't removed at the end of the test by sinon.restore(), and I'd rather only have to write a single line.
Since I guess that providing a non-existent API is something lots of people want to do, I'm guessing I'm missing something obvious.
What is the best way to fake/stub/mock a new API in a way that sinon.restore removes all trace of it afterwards?
In short, you cannot do this using sinon.restore(). We explicitly made the API throw on trying to replace non-existent props, but we have had a longer discussion about this on the Sinon team and essentially came to the conclusion that we should add something like sinon.define(). Still, no one ever turned that into an actual feature request, so if I were you I would file a new issue with this feature request on the Sinon GitHub tracker. Lots of people would like this feature and it's really not that hard to implement, so make it visible :)
To actually do this using today's existing machinery, I would probably just do as mentioned in the comments: use your test framework's before/after hooks to set up and tear down manually constructed stubs.
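A sketch of what that could look like with mocha-style hooks (my own illustration, not a Sinon API):
var original;

beforeEach(function () {
  original = globalAPIObject.someTestApi; // usually undefined
  globalAPIObject.someTestApi = sinon.fake.returns('something');
});

afterEach(function () {
  if (original === undefined) {
    delete globalAPIObject.someTestApi; // remove all trace of the fake
  } else {
    globalAPIObject.someTestApi = original;
  }
});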
If I were to do it myself, though, I would probably redo the code to inject the API into the module instead of relying on globals. That's another school of thought, so, whatever rocks your boat :)
Given that 'oligofren' helpfully clarified that sinon.stub doesn't support our use case yet, we just extended it so it does.
// sinonExtensions.js
import sinon from "sinon";

if (sinon.stub !== enhancedStub) {
  sinon.originalStub = sinon.stub;
  sinon.stub = enhancedStub;
}

function enhancedStub(obj, method) {
  if (!obj)
    return sinon.originalStub();
  if (!obj[method])
    obj[method] = () => { };
  return sinon.originalStub(obj, method);
}

// TODO: enhance sinon.restore to reset non-existing methods back to undefined
Which we just import along with the rest of the sinon imports:
import chai from 'chai';
import sinon, { mock } from "sinon";
import sinonChai from 'sinon-chai';
import './sinonExtensions.js';
chai.use(sinonChai);
Which is good for all our current usages of sinon.stub.
And now this happily works:
sinon.stub(globalAPIObject, 'someTestApi').returns('something');

Safe way to let users register Handlebars helpers in Node.js

I have a Node.js web app that is using Handlebars. Users are asking me to let them register their own Handlebars helpers.
I'm quite hesitant about letting them do it... but I'll give it a go if there is a secure way of doing so.
var Handlebars = require("handlebars");
var fs = require("fs");

var content = fs.readFileSync("template.html", "utf8");

// This helper will be posted by the user
var userHandlebarsHelpers = "Handlebars.registerHelper('foo', function(value) { return 'Foo' + value; });";

// eval(userHandlebarsHelpers); This I do not like! Eval is evil.

// Compile handlebars with user-submitted helpers
var template = Handlebars.compile(content);
var handleBarContent = template({ foo: 'bar' });
// Save compiled template and some extra code.
Thank you in advance!
Because helpers are just JavaScript code, the only way you could safely run arbitrary JavaScript from the outside world on your server is to either run it in an isolated sandbox process or somehow sanitize the code before you run it.
The former can be done with isolated VMs and external control over the process, but that makes it quite a pain to have helper code in some external process, as you now have to develop ways to call it and pass data back and forth.
Sanitizing JavaScript to be safe from running exploits on your server is a pretty much impossible task when your API set is as large as Node.js's. The browser has a very tightly controlled set of things that JavaScript can do, to keep the underlying system safe from what browser JavaScript can do. Node.js has none of those safeguards. You could put code in one of these helpers to erase the entire hard drive of the server, or install viruses, or run pretty much whatever evil exploit you wanted to code. So, running arbitrary JavaScript will simply not be safe.
Depending upon the exact problems that need to be solved, one can sometimes develop a data-driven approach where, instead of code, the user provides some higher-level set of instructions (map this to that, substitute this with that, replace this with that, display from this set of data, etc.) that is not actually JavaScript, but rather non-executable metadata. That is much more feasible to make safe because you control all the code that acts on this metadata, so you just have to make sure that the code that processes the metadata can't be tricked into doing something evil.
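For example, a sketch of such a data-driven approach (the spec format and operation names here are purely illustrative):
// User submits data, not code:
var userHelperSpec = { name: 'stripDigits', op: 'stripDigits' };

// Trusted interpreter: only whitelisted operations can ever run.
var ops = {
  uppercase: function (v) { return String(v).toUpperCase(); },
  stripDigits: function (v) { return String(v).replace(/[0-9]/g, ''); }
};

Handlebars.registerHelper(userHelperSpec.name, function (value) {
  var op = ops[userHelperSpec.op];
  if (!op) throw new Error('Unknown operation: ' + userHelperSpec.op);
  return op(value);
});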
Following jfriend00's input, and after some serious testing, I found a way to do it using Node's vm module.
Users will input their helpers with this format:
[[HBHELPER 'customHelper' value]]
value.replace(/[0-9]/g, "");
[[/HBHELPER]]

[[HBHELPER 'modulus' index mod result block]]
if(parseInt(index) % mod === parseInt(result))
  block.fn(this);
[[/HBHELPER]]

// This will throw an error when executed: "Script execution timed out."
[[HBHELPER 'infiniteLoop' value]]
while(1){}
[[/HBHELPER]]
I translate that block into this and execute it:
Handlebars.registerHelper('customHelper', function (value) {
  // All the code is executed inside the VM
  return vm.runInNewContext('value.replace(/[0-9]/g, "");', {
    value: value
  }, {
    timeout: 1000
  });
});

Handlebars.registerHelper('modulus', function (index, mod, result, block) {
  return vm.runInNewContext('if(parseInt(index) % mod === parseInt(result)) block.fn(this);', {
    index: index,
    mod: mod,
    result: result,
    block: block
  }, {
    timeout: 1000
  });
});

Handlebars.registerHelper('infiniteLoop', function (value) {
  // Error: "Script execution timed out."
  return vm.runInNewContext('while(1){}', {
    value: value
  }, {
    timeout: 1000
  });
});
I have run multiple tests so far: trying to delete files, require modules, and run infinite loops. Everything went perfectly; all those operations failed.
Running the Handlebars helper callback inside a VM is what made this work for me, because my main problem with running the whole code inside a VM was adding those helpers to my global Handlebars object.
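For completeness, a sketch of how the [[HBHELPER]] blocks above might be parsed into vm-backed helpers (the regex and function names are illustrative, not my exact code):
var vm = require('vm');
var Handlebars = require('handlebars');

function registerUserHelpers(source) {
  var re = /\[\[HBHELPER\s+'(\w+)'((?:\s+\w+)*)\]\]([\s\S]*?)\[\[\/HBHELPER\]\]/g;
  var match;
  while ((match = re.exec(source)) !== null) {
    registerOne(match[1], match[2].trim().split(/\s+/).filter(Boolean), match[3].trim());
  }
}

function registerOne(name, argNames, body) {
  Handlebars.registerHelper(name, function () {
    var sandbox = {};
    for (var i = 0; i < argNames.length; i++) sandbox[argNames[i]] = arguments[i];
    // Fresh context per call, 1s timeout: no require, no fs, no endless loops
    return vm.runInNewContext(body, sandbox, { timeout: 1000 });
  });
}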
I'll update if I find a way to exploit it.

NodeJS, SocketIO and Express logic context build

I have read a lot about Express/Socket.IO, and it's crazy how rarely you get any example beyond a "Hello" transmitted directly from app.js. The problem is, it doesn't work like that in the real world... I'm actually stuck on a logic problem that seems far away from what the web gives me, which is why I wanted to point it out; I'm sure asking will be the solution! :)
I'm refactoring my app (because there were many mistakes, like using the global scope for libs, etc.). Let's say I've got a huge system based on Socket.IO and Node.js. There's a loader in app.js which starts the socket system.
When someone joins the app, it require()s another module: it initializes many socket.on() listeners which are loaded dynamically from some /*_socket.js files in a folder. Each function in those modules represents a socket listener, so it's much easier to call from the front-end; it might look like this:
// Will call `user_socket.js` and method `try_to_signin(some params)`
Queries.emit_socket('user.try_to_signin', {some params});
The system itself works really well. But there's a catch: the module that loads the files that interpret what the front-end has sent is also given libraries tied to req/res (sessions, cookies, etc.), and it must be, because the called methods are the core of the app and very often need those libraries.
In the previous example we obviously need to check that the user isn't already logged in.
// The *_socket.js file looks like this:
var $h = require(__ROOT__ + '/api/helpers');
module.exports = function ($s, $w) {
  var user_process = require(__ROOT__ + '/api/processes/user_process')($s, $w);
  return {
    my_method_called: function (reference, params, callback) {
      // Stuff using $s, $w, etc.
    }
  };
};

// And it's called this way:
// $s = services (a big object)
// $w = workers (a big object depending on $s)
// They are linked with the req/res from the page when they are instantiated
controller_instance = require('../sockets/' + controller_name + '_socket')($s, $w);

// After some processes ...
socket_io.on(socket_listener, function (datas, callback) {
  // Will call the correct function, etc.
  $w.queries.handle_socket($w, controller_name, method_name, datas);
});
The good news : basically, it works.
The bad news: every time I refresh the page, the listeners double up, because they are registered in a loop that runs on page load. What should have been one line in my logs appears twice.
So I should register all the socket.on('connection', ...) stuff outside of page loading, which means when the server starts... Yes, but I also need the req/res data to load the libraries, which I only get when the page is loaded!
It's a programming logic problem. I know I did something wrong, but I don't know where to go from here. I have this big system which "basically" works, but there's a paradox in the way I built it and I can't figure out how to resolve it... I've been stuck for a couple of hours.
How can I refactor to make it possible to get the current libraries, which depend on req/res, within a socket.on() call? Is there a trick? Should I completely change my approach?
Also, is there another way to do what I want to do ?
Thank you everyone !
NOTE : If I didn't explain well or if you want more code, just tell me :)
EDIT - SOLUTION: As seen below, we can use sockets.once() instead of sockets.on(), or there's also the sockets.removeAllListeners() solution, which is less clean.
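For reference, a sketch of the removeAllListeners() variant, adapted to the identifiers from my code above:
// Drop any handlers left over from the previous page load before re-registering
socket_io.removeAllListeners(socket_listener);
socket_io.on(socket_listener, function (datas, callback) {
  $w.queries.handle_socket($w, controller_name, method_name, datas);
});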
Try as below.
io.sockets.once('connection', function (socket) {
  io.sockets.emit('new-data', {
    channel: 'stdout',
    value: data
  });
});
Use once instead of on.
This problem is similar to the one in the following link:
https://stackoverflow.com/questions/25601064/multiple-socket-io-connections-on-page-refresh/25601075#25601075

How to use tinytest with publish/subscribe?

Tinytest is fairly undocumented at this point, but it looks like a nice lightweight framework. I have a package that depends on some publish/subscribe data and am a bit stymied about how to test it. I can't seem to subscribe to a user.
Do I need to guarantee the server block runs first, so that the publish blocks are available to be subscribed to?
I also made sure my package under test included the various required other/auth packages.
Package.on_test(function (api) {
  api.use([
    'accounts-ui',
    'accounts-facebook', ...
I was hoping to find some examples in the LiveData test suite or even the Iron-Router test suite, but haven't turned anything up yet. FWIW, I've reviewed other links on testing with Meteor, and Laika and RTD look a bit overkill for now.
Pointers appreciated on how to get a basic sample like the one below working.
if Meteor.isServer
  Tinytest.add "Chatbot data", (test) ->
    test.equal 1, 1, "server tests running"
    userdata = {
      email: "c#c.com"
      profile:
        nickname: "chaka"
        name: "chaka"
        icon: "/images/bots/chaka/icon/50.png"
    }
    Meteor.users.insert(userdata)
    check = Meteor.users.findOne()
    test.isNotNull check, "can't create user"
    console.log("userCheck", check) # ok

  Meteor.publish 'allUsers', ->
    return Meteor.users.find()

if Meteor.isClient
  Tinytest.add "Chatbot create", (test) ->
    Meteor.subscribe("allUsers")
    user = Meteor.users.findOne()
    if (user == undefined)
      test.fail("cant find user:", user)
Edit: this guidance on async tests seems to be very related.
If you want to test whether your method subscribes with the proper find selector, I would spy on Meteor.users.find() and check that it is called with the right arguments.
var actualFind = Meteor.users.find, receivedArguments;
Meteor.users.find = function () {
  receivedArguments = arguments;
  // Call through with the right `this` and return the cursor
  return actualFind.apply(Meteor.users, arguments);
};
Then do tests on receivedArguments.
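For instance, a follow-up assertion might look like this (a sketch; runCodeUnderTest and the expected selector are hypothetical):
Tinytest.add("find is called with the expected selector", function (test) {
  runCodeUnderTest(); // hypothetical: whatever triggers Meteor.users.find(...)
  test.equal(receivedArguments.length, 1);
  test.equal(receivedArguments[0].profile.nickname, "chaka");
});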
Testing subscribe/publish itself is something the Meteor team should do.
You should only ever write unit tests for code you've written.

Simplest approach to Node.js request serialisation

I've got the classic asynchronous/concurrency problem that folks writing a service in Node.js stumble into at some point. I have an object that fetches some data from an RDBMS in response to a request and emits a fin event (using an EventEmitter) when the row fetching is complete.
As you might expect, when the caller of the service makes several near-simultaneous calls to it, the rows are returned in an unpredictable order. The fin event is fired for rows that do not correspond to the calling function's understanding of the request that produced them.
Here's what I've got going on (simplified for relevance):
var mdl = require('model.js');

dispatchGet: function (req, res, sec, params) {
  var guid = umc.genGUID(36);
  mdl.init(this.modelMap[sec], guid);
  // mdl.load() creates and returns a 'new events.EventEmitter()'
  mdl.load(...).once('fin',
    function () {
      res.write(...);
      res.end();
    });
}
A simple test shows that the mdl.guid often does not correspond to the guid.
I would have thought that creating a new events.EventEmitter() inside the mdl.load() function would fix this problem by creating a discrete EventEmitter for every request, but evidently that is not the case; I suppose the same rules of object persistence apply to it as to any other object, irrespective of new.
I'm a C programmer by background: I can certainly come up with my own scheme for associating these replies with their requests, using some circular queue or hashing scheme. However, I am guessing this problem has already been solved many times over. My research has revealed many opinions on how to best handle this--various kinds of queuing implementations, Futures, etc.
What I'm wondering is, what's the simplest possible approach to good asynchronous flow control here? I don't want to get knee-deep in some dependency's massive paradigm shift if I don't have to. Is there a relatively simple, canonical, definitive solution, and/or widespread consensus on which third-party module is best?
Could it be that your model.js looks something like this?
module.exports = {
  init: function (model, guid) {
    this.guid = guid;
    ...
  }
};
You have to be aware that the object you're passing to module.exports there is a shared object, in the sense that every other module that runs require("model.js") will receive a reference to the same object.
So every time you run mdl.init(), the guid property of that object is changed, which would explain your comment that "...a simple test shows that the mdl.guid often does not correspond to the guid".
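A quick way to see this (a two-line illustration): Node caches modules, so every require returns the same object:
var a = require('./model.js');
var b = require('./model.js');
console.log(a === b); // true: both names point at one shared object, hence one shared guid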
It really depends on your exact implementation, but I think you'd want to use a class instead:
// model.js
var events = require('events');

var Mdl = function (model, guid) {
  this.guid = guid;
};

Mdl.prototype.load = function () {
  // Instantiate and return a new EventEmitter per instance,
  // so 'fin' events can't leak across concurrent requests.
  return new events.EventEmitter();
};

module.exports = Mdl;
// app.js
var Mdl = require('model.js');
...
var mdl = new Mdl(this.modelMap[sec], guid);
mdl.load(...)
