Given that I'm extending an existing module and it uses module.exports in the way shown below, can I even call the start method from (mocha) tests?
I suspect that there's no decent way to tap into it - and that's ok. I'd just rather test these methods if I'm able to, and would love to know how to do it if possible.
const NodeHelper = require("node_helper");
module.exports = NodeHelper.create({
  start: function() {
    // do stuff
  }
});
Edit: NodeHelper returns a function that appears to be "extended":
NodeHelper.create = function(moduleDefinition) {
  return NodeHelper.extend(moduleDefinition);
};
Looking closer at the linked code, it uses Resig's class.js, so you probably need to create an instance to call the prototype methods, i.e.
const YourNodeHelper = require('path/to/your/module');
const yourNodeHelper = new YourNodeHelper(); // create instance here
yourNodeHelper.start();
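If that works, a minimal mocha test could exercise start() directly. A sketch, assuming your helper lives at path/to/your/module and that start() sets some state you can assert on (the started flag below is purely hypothetical):

const assert = require('assert');
const YourNodeHelper = require('path/to/your/module');

describe('node_helper', function () {
  it('can call start() directly', function () {
    const yourNodeHelper = new YourNodeHelper(); // instance, as above
    yourNodeHelper.start(); // the prototype method is callable on the instance
    // assert on whatever start() actually sets up, e.g.:
    // assert.strictEqual(yourNodeHelper.started, true);
  });
});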
How can I monkey-patch some methods in the global jest object for all test files at once? I don't want to add any extra code to my test files; it has to be done somewhere in setup, and it can be an ugly hack.
I tried doing that from a custom environment, setupFiles, and setupFilesAfterEnv, but it looks like they all get a different instance of the jest object, and my changes aren't visible in test files.
Disclaimer: I know that it's a bad thing to do but I need it for some one-time benchmarking only and it's the easiest solution that gets the job done.
I got this working! You're right, Jest does re-construct the global jest object for every test case, but you can override the function it uses to do that. In jest.config.js, set globalSetup to something like <rootDir>/jest-global-setup.js. Then, in jest-global-setup.js, add this:
const jestRuntime = require('jest-runtime');
const { _createJestObjectFor } = jestRuntime.prototype;

jestRuntime.prototype._createJestObjectFor = function(...args) {
  // Call the original function to create a normal jest object.
  const jestObject = _createJestObjectFor.apply(this, args);
  // Apply your changes.
  jestObject.isMonkeyPatched = true;
  // Return the patched object.
  return jestObject;
};
// Jest expects to find a function of some sort as well,
// but we don't need it for this example.
module.exports = function() { /* do nothing */ }
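To sanity-check that the patch took effect, a test file can then read the flag straight off the global jest object (isMonkeyPatched being the example flag set above):

// some.test.js
test('jest object is patched', () => {
  expect(jest.isMonkeyPatched).toBe(true);
});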
I am using Protractor with the pageObject concept to do e2e testing.
However, I have difficulty understanding why creating new objects is needed for each pageObject.
Let me show my question in code.
Currently, I define the pageObject in pageObj.js as
var PageObj = function () {
  this.method1 = function() {
    // whatever content
  };
};
module.exports = PageObj;
and invoking it in test spec file as
var PageObj = require('./pageObject/pageObj.js');
var pageObj = new PageObj();
//use pageObj's method here;
pageObj.method1();
However, I think this way below is simpler, why shouldn't I use this?
Define the same method in pageObj.js
module.exports = {
  method1: function() {
    // whatever content
  }
};
Invoke it as
var pageObj = require('./pageObject/pageObj.js');
//use pageObj's method here;
pageObj.method1();
Sometimes you might have multiple tests using the same page object, and you might want to store data there representing the current or changing state of the page during one test or a suite of tests. Using it as a class/constructor function allows you to have a clean state between every test.
If your approach works for you now and in the future and doesn't limit you, it's completely fine; it's just that for these more complex cases you might need instantiated page objects to achieve what you need, as the sketch below shows.
At our company we prefer to stick to one pattern, so we don't have to adjust to using a page object as a plain object here and as a constructor function there. To keep it more uniform across our tests we just follow the recommended style; this pattern has already settled, and it's easier to switch between projects if they follow the same guidelines.
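For instance, creating a fresh instance per test guarantees a clean state. A sketch, reusing the PageObj constructor from the question:

var PageObj = require('./pageObject/pageObj.js');

describe('some feature', function () {
  var pageObj;

  beforeEach(function () {
    // a new instance per test: any state stored on the page object
    // during one test cannot leak into the next
    pageObj = new PageObj();
  });

  it('uses the page object', function () {
    pageObj.method1();
  });
});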
As @Tom mentions, using an object literal is fine, but it can be limiting. I use them if I don't have to extend other pages (e.g. a basePage). I also feel like instantiating page objects in the spec is a bit clunky, so I opt for a solution somewhere in between:
var PageObj = function() {
  this.method1 = function() {
    // whatever content
  };
};
module.exports = new PageObj();
And then your spec...
var pageObj = require('./pageObject/pageObj.js');
//use pageObj's method here;
pageObj.method1();
The doc for yeoman unit testing seems to be oriented around integration testing, namely running the entire generator and then examining the side effects produced, e.g. the existence of certain files. For this you can use helpers.run().
This is all fine and well, but I also want to be able to unit test a single method (or "priority") and test internal states of the generator i.e. internal vars. I have been able to do this before by using createGenerator like so:
subAngularGenerator = helpers.createGenerator('webvr-decorator:sub-angular',
  [path.join(__dirname, '../generators/sub-angular')],
  null,
  {'artifacts': artifacts, appName: APP_NAME, userNames: userNames}
);
This has no RunContext, but I can usually add enough things to the structure so that it will run. For instance:
// mixin common class
_.extend(subAngularGenerator.prototype, require('../lib/common.js'));
// we need to do this to properly feed in options and args
subAngularGenerator.initializing();
// override the artifacts hash
subAngularGenerator.artifacts = artifacts;
// call method
subAngularGenerator._injectDependencies(fp, 'controller', ['service1', 'service2']);
Which allows me to test internal state:
var fileContents = subAngularGenerator.fs.read(fp);
var regex = /\('MainCtrl', function \(\$scope, service1, service2\)/m;
assert(regex.test(fileContents));
This works fine as long as the method is basic JavaScript, like for/next loops and such. If the method makes use of any 'this' variables, like this.async(), I get 'this.async is not a function'.
initialPrompt: function () {
  var prompts = [];
  var done = this.async(); // if this weren't needed my ut would work
...
I can manually add a dummy this.async, but then I go down the rabbit hole with other errors, like 'no store available':
AssertionError: A store parameter is required
at Object.promptSuggestion.prefillQuestions (node_modules/yeoman-generator/lib/util/prompt-suggestion.js:98:3)
at RunContext.Base.prompt (node_modules/yeoman-generator/lib/base.js:218:32)
at RunContext.module.exports.AppBase.extend.prompting.initialPrompt (generators/app/index.js:147:12)
at Context.<anonymous> (test/test-app.js:158:42)
I tried to create a RunContext and then add my generator to that:
var helpers = require('yeoman-generator').test;
// p.s. is there a better way to get RunContext?
var RunContext = require('../node_modules/yeoman-generator/lib/test/run-context');

before(function (done) {
  appGenerator = helpers.createGenerator('webvr-decorator:app',
    [path.join(__dirname, '../generators/app')],
    null,
    {appName: APP_NAME, userNames: userNames}
  );
  app = new RunContext(appGenerator); // add generator to runContext
  done();
});

app.Generator.prompting.initialPrompt(); // gets async not defined
But this gets the same problem.
My theory is that the problem has to do with 'this' contexts. Normally the method runs with the 'this' context of the entire generator (which has a this.async, etc.), but when I run the method individually, the 'this' context is just that of the method/function itself (which has no async in its context). If this is true, then it's really more of a JavaScript question than a yeoman one.
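A quick illustration of that theory, independent of yeoman (all names here are made up for the demonstration):

var generator = {
  async: function () { return function done() {}; },
  prompting: {
    initialPrompt: function () {
      var done = this.async(); // 'this' is whatever the method was called on
    }
  }
};

generator.prompting.initialPrompt(); // TypeError: this.async is not a function,
                                     // because 'this' is generator.prompting
generator.prompting.initialPrompt.call(generator); // works: 'this' has async()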
It seems like there should be an easy way to unit test individual methods that depend on the generator context such as calls to this.async. I referred to generator-node as an example of best practices, but it only appears to be doing integration testing.
Does anyone have any better ideas, or do I need to just keep futzing around with JavaScript techniques?
Many Thanks.
I was able to get it to work, but it's a total hack. I was able to decorate a RunContext with the necessary artifacts and then, using call, put my generator in the context of the RunContext:
var appGenerator;
var app;

before(function (done) {
  // create a generator
  appGenerator = helpers.createGenerator('webvr-decorator:app',
    [path.join(__dirname, '../generators/app')],
    null,
    {appName: APP_NAME, userNames: userNames}
  );
  // get a RunContext
  app = new RunContext(appGenerator);
  // the following did *not* work -- prompts were not auto-answered
  app.withPrompts({'continue': true, 'artifactsToRename': {'mainCtrl': 'main'}});
  // add the following functions and hashes from the generator to the RunContext
  app.prompt = appGenerator.prompt;
  app._globalConfig = appGenerator._globalConfig;
  app.env = appGenerator.env;
  // the following two lines are specific to my app only
  app.globals = {};
  app.globals.MAIN_CTRL = 'main';
  done();
});

it('prompting works', function () {
  // Run the generator in the context of RunContext by using js 'call'
  appGenerator.prompting.initialPrompt.call(app);
});
I no longer get any 'missing functions' messages, but unfortunately the prompts are not being automatically provided by the unit test, so the method stops and waits for something to feed the prompts.
The big "secret" was to invoke the method with call (or apply), which lets you override the default 'this' context. I put the generator in the context of the RunContext, which verifies my theory that the problem is about running in the improper context.
I assume there's a much better way to do this and that I'm totally missing something, but I thought I'd at least document what I had to do to get it to work. In the end, I moved the variable initialization code from the 'prompting' method into the 'initializing' method, and since my 'initializing' method has no yeoman runtime dependencies, I was able to use a simple generator without a RunContext. But that was just fortuitous in this case. In the general case, I would still like to find out the proper way to invoke a single method.
I have an object that I initialize in my app.js file, and I would like to make this initialized object available in all modules. How could I do that? Passing this object to every module is one way, but I'm wondering if I'm missing anything or whether it should be done differently.
I saw that mongoose actually supports a default connection, which I init in app.js one time; anywhere in other modules, I can simply use it without passing it around. Is there any way I can do the same thing?
I also checked the global object doc from node.js, http://nodejs.org/api/globals.html, and am wondering if I should use global for this.
Thanks
A little advice:
You should only very rarely need to use a global. If you think you need one, you probably don't.
Singletons are usually an anti-pattern in Node.js, but sometimes (logging, config) they will get the job done just fine.
Passing something around is sometimes a useful and worthwhile pattern.
Here's an example of how you might use a singleton for logging:
lib/logger.js
var bunyan = require('bunyan'),
  mixIn = require('mout/object/mixIn'),
  // add some default options here...
  defaults = {},
  // singleton
  logger,
  createLogger = function createLogger(options) {
    var opts;
    if (logger) {
      return logger;
    }
    opts = mixIn({}, defaults, options);
    logger = bunyan.createLogger(opts);
    return logger;
  };

module.exports = createLogger;
lib/module.js
var logger = require('./logger.js'),
  log = logger();
log.info('Something happened.');
Hope that helps.
The solution, as you suggest, is to add the object as a property of the global object. However, I would recommend against doing this; instead, place the object in its own module that is required from every other module that needs it. You will gain benefits later on in several ways. For one, it is always explicit where this object comes from and where it is initialized. You will never have a situation where you try to use the object before it is initialized (assuming that the module that defines it also initializes it). Also, this will help make your code more testable.
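A minimal sketch of that approach (the file names and config shape are just examples):

// lib/shared.js -- this module owns and initializes the object
var shared = {
  config: null,
  init: function (config) {
    this.config = config;
  }
};
module.exports = shared;

// app.js -- initialize it once at startup
var shared = require('./lib/shared.js');
shared.init({env: 'production'});

// any other module -- Node's module cache hands back the same instance
var shared = require('./lib/shared.js');
console.log(shared.config.env); // 'production'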
There are multiple solutions to the problem, depending on how large your application is. The two solutions that you have mentioned are the most obvious ones. I would rather go for a third, which is based on re-architecting your code. The solution that I am providing looks a lot like the executor pattern.
First create actions which require your common module, in this particular form -
var Action_One = function(commonItems) {
  this.commonItems = commonItems;
};

Action_One.prototype.execute = function() {
  //..blah blah
  //Your action specific code
};

var Action_Two = function(commonItems) {
  this.commonItems = commonItems;
};

Action_Two.prototype.execute = function() {
  //..blah blah
  //Your action_two specific code
};
Now create an action initializer which will programmatically initialize your actions like this -
var ActionInitializer = function(commonItems) {
  this.commonItems = commonItems;
};

ActionInitializer.prototype.init = function(Action) {
  var obj = new Action(this.commonItems);
  return obj;
};
The next step is to create an action executor -
// You can create a more complex executor using the `Async` lib or something else
var Executor = function(ActionInitializer, commonItems) {
  this.initializer = new ActionInitializer(commonItems);
  this.actions = [];
};

// Use this to add an action to the executor
Executor.prototype.add = function(action) {
  var result = this.initializer.init(action);
  this.actions.push(result);
};

// Executes all the actions
Executor.prototype.executeAll = function() {
  var result = [];
  for (var i = this.actions.length - 1; i >= 0; i--) {
    result[i] = this.actions[i].execute();
  }
  this.actions = [];
  return result;
};
The idea was to decouple every module so that only one module (the Executor, in this case) is dependent on the common properties. Now let's see how it would work -
var commonProperties = {a: 1, b: 2};
// Pass the action initializer class and the common property object to just this one module
var e = new Executor(ActionInitializer, commonProperties);
e.add(Action_One);
e.add(Action_Two);
var results = e.executeAll();
console.log(results);
This way your program will be cleaner and more scalable. Shoot questions if it's not clear. Happy coding!
I've got a browser addon I've been maintaining for 5 years, and I'd like to share some common code between the Firefox and Chrome versions.
I decided to go with the JavaScript Module Pattern, and I'm running into a problem with, for example, loading browser-specific preferences, saving data, and other browser-dependent stuff.
What I'd like to do is have the shared code reference virtual, overrideable methods that could be implemented in the derived, browser-specific submodules.
Here's a quick example of what I've got so far, that I've tried in the Firebug console, using the Tight Augmentation method from the article I referenced:
var core = (function(core)
{
  // PRIVATE METHODS
  var over = function() { return "core"; };
  var foo = function() {
    console.log(over());
  };

  // PUBLIC METHODS
  core.over = over;
  core.foo = foo;

  return core;
}(core = core || {}));
var ff_specific = (function(base)
{
  var old_over = base.over;
  base.over = function() { return "ff_specific"; };
  return base;
}(core));
core.foo();
ff_specific.foo();
Unfortunately, both calls to foo() seem to print "core", so I think I've got a fundamental misunderstanding of something.
Essentially, I'm wanting to be able to call:
get_preference(key)
set_preference(key, value)
load_data(key)
save_data(key, value)
and have each browser do their own thing. Is this possible? Is there a better way to do it?
In JavaScript, functions have "lexical scope". This means that functions create their environment (scope) when they are defined, not when they are executed. That's why you can't substitute the "over" function later:
var over = function() { return "core"; };
var foo = function() {
  console.log(over());
};
// this closure over the "over" function cannot be changed later
Furthermore, you are "saying" that "over" should be a private method of "core" and that "ff_specific" should somehow extend "core" and change it (in this case, the private method, which by design is not intended to be overridden).
You never override your call to foo in the ff_specific code, and foo refers directly to the private function over() (which never gets overridden), not to core.over() (which does get overridden).
The way to solve it, based on your use case, is to change the call to over() into a call to core.over().
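A minimal sketch of that change, reusing the modules from the question:

var core = (function(core)
{
  var over = function() { return "core"; };
  var foo = function() {
    console.log(core.over()); // go through the public reference, so a
                              // later reassignment of core.over is picked up
  };
  core.over = over;
  core.foo = foo;
  return core;
}(core = core || {}));

var ff_specific = (function(base)
{
  base.over = function() { return "ff_specific"; };
  return base;
}(core));

core.foo(); // now prints "ff_specific"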
That said, you're really confusing yourself by reusing the names of things so much, imo. Maybe that's just for the example code. I'm also not convinced that you need to pass in core to the base function (just to the children).
Thanks for your help. I'd forgotten that I can't reassign closed-over functions after they're defined. I did figure out a solution.
Part of the problem was just blindly following the example code from the article, which meant that the anonymous function building the module was called immediately (the reuse of names Paul mentioned). Not being able to reassign the closed-over functions, even ones I had specifically made public, meant I couldn't later pass in an object with its own methods and check for them.
Here's what I wound up doing, and appears to work very well:
var ff_prefs = (function(ff_prefs)
{
  ff_prefs.foo = function() { return "ff_prefs browser specific"; };
  return ff_prefs;
}({}));

var chrome_prefs = (function(chrome_prefs)
{
  chrome_prefs.foo = function() { return "chrome_prefs browser specific"; };
  return chrome_prefs;
}({}));

var test_module = function(extern)
{
  var test_module = {};
  var talk = function() {
    if (extern.foo)
    {
      console.log(extern.foo());
    }
    else
    {
      console.log("No external function!");
    }
  };
  test_module.talk = talk;
  return test_module;
};
var test_module_ff = new test_module(ff_prefs);
var test_module_chrome = new test_module(chrome_prefs);
var test_module_none = new test_module({});
test_module_ff.talk();
test_module_chrome.talk();
test_module_none.talk();
Before, it was running itself, then when the extension started, it would call an init() function, which it can still do. It's just no longer an anonymous function.