Nested module 'not defined' - javascript

In this utils.js module, I'm also calling a constructor from constants.js, which lives at the same level as utils.js. My issue occurs when the test fires: I get "Constants is not defined".
describe("A suite", function() {
let Utils = require('../utils.js');
it("contains spec with an expectation", function() {
let utils = new Utils();
expect(utils.testFunction()).toBe(true);
});
});
Utils.js:
function Utils() {
    const CONSTANTS = new Constants();
    ...
There is clearly something I'm not yet doing in order to make the Constants() object valid when the Jasmine test creates a new instance of Utils.
I've tried this thinking that Constants would then exist before Utils tried to instantiate it, but Jasmine doesn't seem to have access to it this way:
describe("A suite", function() {
let Constants = require('../constants.js');
let Utils = require('../utils.js');
it("contains spec with an expectation", function() {
let utils = new Utils();
expect(utils.testFunction()).toBe(true);
});
});
NOTE: If I remove the use of Constants.js from the picture, my test runs fine, evaluates correctly and passes.
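A likely explanation (assumed here, since no answer is shown for this question): each Node module has its own scope, so requiring constants.js inside the spec does not make Constants visible inside utils.js; utils.js has to require its own dependency. A minimal sketch of utils.js under that assumption (the testFunction body and the constants.js export are hypothetical placeholders):

// utils.js - sketch: require the dependency here instead of relying on the test to load it
var Constants = require('./constants.js'); // assumes constants.js does `module.exports = Constants;`

function Utils() {
    const CONSTANTS = new Constants();
    // ...
}

// hypothetical placeholder so the spec above has something to call
Utils.prototype.testFunction = function () {
    return true;
};

module.exports = Utils;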

Related

JavaScript NodeJs - TypeError X is not a constructor [duplicate]

I've been working with nodejs lately and still getting to grips with the module system, so apologies if this is an obvious question. I want code roughly like the below:
a.js (the main file run with node)
var ClassB = require("./b");

var ClassA = function() {
    this.thing = new ClassB();
    this.property = 5;
}

var a = new ClassA();
module.exports = a;
b.js
var util = require("util");
var a = require("./a");

var ClassB = function() {
}

ClassB.prototype.doSomethingLater = function() {
    util.log(a.property);
}

module.exports = ClassB;
My problem seems to be that I can't access the instance of ClassA from within an instance of ClassB.
Is there any correct / better way to structure modules to achieve what I want?
Is there a better way to share variables across modules?
Try to set properties on module.exports instead of replacing it completely, e.g. module.exports.instance = new ClassA() in a.js and module.exports.ClassB = ClassB in b.js. When you create circular module dependencies, the requiring module gets a reference to an incomplete module.exports from the required module; you can add more properties to it later on, but when you assign an entirely new module.exports, you create a new object that the requiring module has no way to access.
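Applied to the a.js/b.js from the question, that suggestion might look roughly like this (a sketch of the idea, not code from the answer):

// a.js - add properties to module.exports instead of replacing it
var ClassB = require("./b").ClassB;

var ClassA = function() {
    this.thing = new ClassB();
    this.property = 5;
}

module.exports.instance = new ClassA();

// b.js
var util = require("util");
var a = require("./a"); // incomplete at this point, but the reference stays valid

var ClassB = function() {
}

ClassB.prototype.doSomethingLater = function() {
    util.log(a.instance.property); // a.instance exists by the time this is called
}

module.exports.ClassB = ClassB;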
While node.js does allow circular require dependencies, as you've found it can be pretty messy and you're probably better off restructuring your code to not need it. Maybe create a third class that uses the other two to accomplish what you need.
[EDIT] It's now 2015 and most libraries (e.g. express) have made updates with better patterns, so circular dependencies are no longer necessary. I recommend simply not using them.
I know I'm digging up an old answer here...
The issue here is that module.exports is defined after you require ClassB.
(which JohnnyHK's link shows)
Circular dependencies work great in Node, they're just defined synchronously.
When used properly, they actually solve a lot of common node issues (like accessing express.js app from other files)
Just make sure your necessary exports are defined before you require a file with a circular dependency.
This will break:
var ClassA = function(){};
var ClassB = require('classB'); //will require ClassA, which has no exports yet
module.exports = ClassA;
This will work:
var ClassA = module.exports = function(){};
var ClassB = require('classB');
I use this pattern all the time for accessing the express.js app in other files:
var express = require('express');
var app = module.exports = express();
// load in other dependencies, which can now require this file and use app
Sometimes it is really artificial to introduce a third class (as JohnnyHK advises), so in addition to Ianzz:
If you do want to replace the module.exports, for example if you're creating a class (like the b.js file in the above example), this is possible as well; just make sure that in the file that starts the circular require, the 'module.exports = ...' statement happens before the require statement.
a.js (the main file run with node)
var ClassB = require("./b");

var ClassA = function() {
    this.thing = new ClassB();
    this.property = 5;
}

var a = new ClassA();
module.exports = a;
b.js
var util = require("util");

var ClassB = function() {
}

ClassB.prototype.doSomethingLater = function() {
    util.log(a.property);
}

module.exports = ClassB;
var a = require("./a"); // <------ this is the only necessary change
The solution is to 'forward declare' your exports object before requiring any other controller. So if you structure all your modules like this, you won't run into any issues like that:
// Module exports forward declaration:
module.exports = {
};
// Controllers:
var other_module = require('./other_module');
// Functions:
var foo = function () {
};
// Module exports injects:
module.exports.foo = foo;
What about lazy requiring only when you need to? So your b.js looks as follows:
var util = require("util");

var ClassB = function() {
}

ClassB.prototype.doSomethingLater = function() {
    var a = require("./a"); // a.js has finished by now
    util.log(a.property);
}

module.exports = ClassB;
Of course it is good practice to put all require statements at the top of the file. But there are occasions where I forgive myself for picking something out of an otherwise unrelated module. Call it a hack, but sometimes this is better than introducing a further dependency, adding an extra module, or adding new structures (EventEmitter, etc.).
You can solve this easily: just export your data before you require anything else in modules where you use module.exports:
classA.js
class ClassA {
    constructor() {
        ClassB.someMethod();
        ClassB.anotherMethod();
    }

    static someMethod() {
        console.log( 'Class A Doing someMethod' );
    }

    static anotherMethod() {
        console.log( 'Class A Doing anotherMethod' );
    }
}

module.exports = ClassA;

var ClassB = require( "./classB.js" );

let classX = new ClassA();
classB.js
class ClassB {
    constructor() {
        ClassA.someMethod();
        ClassA.anotherMethod();
    }

    static someMethod() {
        console.log( 'Class B Doing someMethod' );
    }

    static anotherMethod() {
        console.log( 'Class B Doing anotherMethod' );
    }
}

module.exports = ClassB;

var ClassA = require( "./classA.js" );

let classX = new ClassB();
A solution which requires minimal change is to extend module.exports instead of overriding it.
a.js - app entry point and module which uses method do from b.js
_ = require('underscore'); // underscore provides extend() for shallow extend
b = require('./b'); // module `a` uses module `b`

_.extend(module.exports, {
    do: function () {
        console.log('doing a');
    }
});

b.do(); // call `b.do()` which in turn will circularly call `a.do()`
b.js - module which uses method do from a.js
_ = require('underscore');
a = require('./a');

_.extend(module.exports, {
    do: function () {
        console.log('doing b');
        a.do(); // call `a.do()` from within `b.do()` when `a` has only just been initialized
    }
});
It will work and produce:
doing b
doing a
While this code will not work:
a.js
b = require('./b');

module.exports = {
    do: function () {
        console.log('doing a');
    }
};

b.do();
b.js
a = require('./a');

module.exports = {
    do: function () {
        console.log('doing b');
    }
};

a.do();
Output:
node a.js
b.js:7
a.do();
^
TypeError: a.do is not a function
The important thing is not to re-assign the module.exports object that you have been given, because that object may have already been given to other modules in the cycle! Just assign properties inside module.exports and other modules will see them appear.
So a simple solution is:
module.exports.firstMember = ___;
module.exports.secondMember = ___;
The only real downside is the need to repeat module.exports. many times.
Similar to lanzz and setec's answers, I have been using the following pattern, which feels more declarative:
module.exports = Object.assign(module.exports, {
    firstMember: ___,
    secondMember: ___,
});
The Object.assign() copies the members into the exports object that has already been given to other modules.
The = assignment is logically redundant, since it is just setting module.exports to itself, but I am using it because it helps my IDE (WebStorm) to recognise that firstMember is a property of this module, so "Go To -> Declaration" (Cmd-B) and other tooling will work from other files.
This pattern is not very pretty, so I only use it when a cyclic dependency issue needs to be resolved.
It is fairly well suited to the reveal pattern, because you can easily add and remove exports from the object, especially when using ES6's property shorthand.
Object.assign(module.exports, {
    firstMember,
    //secondMember,
});
the extremely simple solution is often:
usually you'd have the require at the top of the file ...
var script = require('./script')

function stuff() {
    script.farfunction()
}
instead, just require it "in the function"
function stuff() {
    var _script = require('./script')
    _script.farfunction()
}
Another method I've seen people use is exporting on the first line and saving it as a local variable, like this:
let self = module.exports = {};
const a = require('./a');
// Exporting the necessary functions
self.func = function() { ... }
I tend to use this method; do you know of any downsides to it?
TL;DR
Just use exports.someMember = someMember instead of module.exports = { // new object }.
Extended Answer
After reading lanzz's response I could finally figure out what is happening here, so I'll give my two cents on the subject, extending his answer.
Let's see this example:
a.js
console.log("a starting");
console.log("a requires b");
const b = require("./b");
console.log("a gets b =", b);
function functionA() {
    console.log("function a");
}
console.log("a done");
exports.functionA = functionA;
b.js
console.log("b starting");
console.log("b requires a");
const a = require("./a");
console.log("b gets a =", a);
function functionB() {
    console.log("On b, a =", a)
}
console.log("b done");
exports.functionB = functionB;
main.js
const a = require("./a");
const b = require("./b");
b.functionB()
Output
a starting
a requires b
b starting
b requires a
b gets a = {}
b done
a gets b = { functionB: [Function: functionB] }
a done
On b, a = { functionA: [Function: functionA] }
Here we can see that at first b receives an empty object as a, and then, once a is fully loaded, that reference is updated through exports.functionA = functionA. If you instead replace the entire module with another object, through module.exports, then b will lose its reference to a, since it will keep pointing to the same empty object from the beginning instead of pointing to the new one.
So if you export a like this: module.exports = { functionA: functionA }, then the output will be:
a starting
a requires b
b starting
b requires a
b gets a = {}
b done
a gets b = { functionB: [Function: functionB] }
a done
On b, a = {} // same empty object
Actually I ended up requiring my dependency with
var a = null;
process.nextTick(()=>a=require("./a")); //Circular reference!
not pretty, but it works. It is more understandable and honest than changing b.js (for example, only augmenting module.exports), which is otherwise perfect as is.
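For concreteness, here is roughly how that could look in the b.js from the question (a sketch; it assumes doSomethingLater only runs after a.js has finished loading):

// b.js - defer the circular require to the next tick of the event loop
var util = require("util");

var a = null;
process.nextTick(() => a = require("./a")); // circular reference, resolved once a.js has finished

var ClassB = function() {
}

ClassB.prototype.doSomethingLater = function() {
    util.log(a.property); // safe only after the next tick has run
}

module.exports = ClassB;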
Here is a quick workaround that I've found useful.
On file 'a.js'
let B;

class A {
    constructor() {
        process.nextTick(() => {
            B = require('./b')
        })
    }
}

module.exports = new A();
In the file 'b.js', write the following:
let A;

class B {
    constructor() {
        process.nextTick(() => {
            A = require('./a')
        })
    }
}

module.exports = new B();
This way, on the next iteration of the event loop the classes will be defined correctly and those require statements will work as expected.
One way to avoid it is to not require one file in the other at all: just pass whatever you need from the other file as an argument to a function, as sketched below.
This way a circular dependency can never arise.
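A sketch of that dependency-injection idea, using the ClassA/ClassB files from the question (the wiring below is illustrative, not from the original post):

// b.js - knows nothing about a.js; whatever it needs is passed in as an argument
var util = require("util");

var ClassB = function() {
}

ClassB.prototype.doSomethingLater = function(a) {
    util.log(a.property); // `a` is injected by the caller
}

module.exports = ClassB;

// a.js - the only file that requires the other one
var ClassB = require("./b");

var ClassA = function() {
    this.thing = new ClassB();
    this.property = 5;
}

var a = new ClassA();
a.thing.doSomethingLater(a); // pass the instance in instead of requiring it from b.js

module.exports = a;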
If you just can't eliminate circular dependencies (e.g. useraccount <---> userlogin), there's one more option...
It's as simple as using setTimeout().
// useraccount.js
let UserLogin = {};
setTimeout(() => UserLogin = require('./userlogin.js'), 10);

class UserAccount {
    getLogin() {
        return new UserLogin(this.email);
    }
}

module.exports = UserAccount; // export so the deferred require above returns the class
// userlogin.js
let UserAccount = {};
setTimeout(() => UserAccount = require('./useraccount.js'), 15);

class UserLogin {
    getUser() {
        return new User(this.token);
    }
}

module.exports = UserLogin; // export so the deferred require above returns the class

Does requiring a module that requires the same file cause issues in Node.js? [duplicate]


Testing a Node.js module with an injected dependency

I am a Java/C# programmer with more than 10 years of experience. Now I have to write some code in JS and, really, I feel like Alice in Wonderland....
I have myModule.js :
module.exports = function (logger) {
    return {
        read: function (param) {
            ......
            logger.error(....)
        },
        write: function (param) {
            ......
            logger.info(....)
        }
    };
}
The logger.js (legacy code, I cannot change it) looks similar:
module.exports = function (some_settings) {
    return {
        info: function () {
            ......
        },
        error: function () {
            ......
        }
    };
}
Important thing: logger is not included in myModule via require("logger"), but injected from a "global" index.js file which looks like:
var settings = require("../someSettings.js")();
var logger = require("../logger.js")(settings);
var myModule = require("../myModule.js")(logger);
Now I want to write unit tests for myModule in Mocha, so I want to mock logger. And I hit a wall.
I tried proxyquire, but it is not a good solution here because logger is not included in myModule via require.
I tried sinon stub and mock e.g.:
var logger = require("logger.js");
var loggerMock = sinon.mock(logger);
var myModule= require("myModule.js")(loggerMock);
Or:
var logger = require("logger.js");
var loggerStub = sinon.stub(logger, "info", function() {return ....});
var myModule= require("myModule.js")(loggerStub);
But I still get errors like:
"Attempted to wrap undefined property error as function"...
Please, be my white rabbit in this javascript freeky world...
When you are testing your module, you can use a fake object for your logger object, i.e.:
const logger = {info: noop, error: noop, debug:...};
function noop() {}
or create a mock object with sinonjs if you prefer to create it programmatically and use spy features.
Then, you can inject it by wrapping your module definition in a closure (as you posted):
# myModule.js:
module.exports = (dep1, dep2, ...) => {
// module definition
}
Or use the rewire package to "inject" module-scoped dependencies (it works only for local vars, and there is no need to create a closure in your module):
# myModuleTest
myModule = rewire('./myModule');
myModule.__set__('logger', { /* fake logger */ });
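Putting the closure approach together, a minimal Mocha test could look something like this (a sketch: the require path is an assumption, and it assumes write() calls logger.info as in the snippet above):

const sinon = require('sinon');
const myModule = require('../myModule.js'); // path is an assumption

describe('myModule', function () {
    it('logs via the injected logger', function () {
        // fake logger injected in place of the legacy logger.js
        const logger = { info: sinon.spy(), error: sinon.spy() };
        const mod = myModule(logger);

        mod.write('some param');

        sinon.assert.calledOnce(logger.info);
    });
});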

Complex circular Node module dependency throwing "TypeError: The super constructor to 'inherits' must have a prototype"

I've got a complex Node SDK project that uses some class inheritance to try and static-ify Javascript. I'm using Node's module caching behavior to create a singleton-like behavior for the SDK (the Project class, a shared instance of ProjectClient). To init, it looks like this:
var Project = require('./project')
Project.init(params)
// Project is now an instance of ProjectClient
I've got some object type classes for data as well: Entity (a standard parsed JSON payload object) and User (a special type of Entity that contains user properties).
The ProjectClient class has several methods that allow RESTful API calls to occur, e.g. Project.GET(), Project.PUT(). These work just fine when instantiating the Project "singleton".
I'm now trying to create convenience methods attached to Entity that will leverage ProjectClient's RESTful operations, e.g. Entity.save(), Entity.refresh().
When I try to import Project into the Entity:
var Project = require('../project')
I get:
TypeError: The super constructor to `inherits` must have a prototype.
at Object.exports.inherits (util.js:756:11)
Troubleshooting leads me to this being related to util.inherits(ProjectUser, ProjectEntity) in User, because if I comment it out, I get this instead:
Uncaught TypeError: ProjectEntity is not a function
What's going on with inherits? Why does it think that Entity doesn't have a prototype? My best guess is that it's related to the fact that I'm recursively nesting modules within other modules (bad, I know), but I've even tried doing something like this in the various classes, to no avail:
module.exports = _.assign(module.exports, **ClassNameHere**)
Here's some reduced code for each class:
Entity
var Project = require('../Project'),
    _ = require('lodash')

var ProjectEntity = function(obj) {
    var self = this
    _.assign(self, obj)
    Object.defineProperty(self, 'isUser', {
        get: function() {
            return (self.type.toLowerCase() === 'user')
        }
    })
    return self
}

module.exports = ProjectEntity
User (a subclass of Entity)
var ProjectEntity = require('./entity'),
    util = require('util'),
    _ = require('lodash')

var ProjectUser = function(obj) {
    if (!ok(obj).has('email') && !ok(obj).has('username')) {
        // This is not a user entity
        throw new Error('"email" or "username" property is required when initializing a ProjectUser object')
    }
    var self = this
    _.assign(self, ProjectEntity.call(self, obj))
    return self
}

util.inherits(ProjectUser, ProjectEntity)

module.exports = ProjectUser
Project ("singleton" but not really)
'use strict'

var ProjectClient = require('./lib/client')

var Project = {
    init: function(options) {
        var self = this
        if (self.isInitialized) {
            return self
        }
        Object.setPrototypeOf(Project, new ProjectClient(options))
        ProjectClient.call(self)
        self.isInitialized = true
    }
}

module.exports = Project
Client
var ProjectUser = require('./user'),
    _ = require('lodash')

var ProjectClient = function(options) {
    var self = this
    // some stuff happens here to check options and init with default values
    return self
}

ProjectClient.prototype = {
    GET: function() {
        return function() {
            // async GET request with callback
        }
    },
    PUT: function() {
        return function() {
            // async PUT request with callback
        }
    }
}

module.exports = ProjectClient
So, as you have correctly deduced, there's a problem with a circular dependency. Your Entity module requires the Project module, which requires the Client module, which requires the User module, which requires the Entity module.
There's something you can do about it, but it will depend on your starting point. If you require the Project module first, then it should work with the code provided, because the Entity module doesn't actually do anything with the Project module. Nothing has been exported on that module yet, so it's just an empty object. Then again, any source of errors in that module would be related to whatever exported objects are depended on inside that module. So if you did need the object with init inside Entity, then there would be a problem.
You can export some methods/functions before initializing the dependency chain, which will make them available by then. Take the example from the Node.js docs:
a.js
console.log('a starting');
exports.done = false;
var b = require('./b.js');
console.log('in a, b.done = %j', b.done);
exports.done = true;
console.log('a done');
b.js
console.log('b starting');
exports.done = false;
var a = require('./a.js');
console.log('in b, a.done = %j', a.done);
exports.done = true;
console.log('b done');
main.js
console.log('main starting');
var a = require('./a.js');
var b = require('./b.js');
console.log('in main, a.done=%j, b.done=%j', a.done, b.done);
So, main.js is the starting point. It requires a.js, which immediately exports a done property and then requires b.js. b.js also exports a done property. The next line requires a.js, which doesn't load a.js again but returns the properties exported so far (which includes the done property). At this point, a is incomplete, but it has managed to give b enough to continue working. Its next line (in b.js) will print the exported property (a.done, which is false), then set its own exported done property to true. We are back in a.js at the require('./b.js') line. b.js is now loaded completely and the rest is pretty easy to explain.
Output is:
main starting
a starting
b starting
in b, a.done = false
b done
in a, b.done = true
a done
in main, a.done=true, b.done=true
Here's the example, just in case you want to read the official docs.
Right, so point being... what can you do?
It's possible to export some things before initializing the dependency cycle, just so long as you don't actually need those dependencies. For example you can:
a.js
exports.func = function(){ console.log('Hello world'); }
var b = require('./b.js');
console.log('a done');
b.js
var a = require('./a.js');
a.func(); //i'm still incomplete but i got func!
console.log('b done');
You can't:
a.js
b.func(); //b isn't even an object yet.
var b = require('./b.js');
console.log('a done');
b.js
exports.func = function(){ console.log('hello world'); }
var a = require('./a.js');
a.func();
console.log('b done');
However, if your a.js module only exports functions then there's no real problem as long as these functions aren't being called somewhere else:
a.js
exports.func = function(){ b.func(); }
var b = require('./b.js');
console.log('a done');
b.js
exports.func = function(){ console.log('hello world'); }
var a = require('./a.js');
console.log('b done');
You are exporting a function that uses b; the function doesn't need to know about b right then, only when it's called. So both modules are being loaded correctly. If you are just exporting functions, there's no problem with having the dependencies declared afterwards.
So you can require a.js from your main point and func will work correctly as the b reference points now to the complete b.js module. You can follow this pattern so long as you don't use the functions you export when loading the dependencies.
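Applied to the reduced code in the question, one option in the same spirit (a sketch, not a verified fix) is to export ProjectEntity before entering the require cycle, so that user.js gets a real constructor with a prototype when util.inherits runs:

// entity.js (sketch) - export the constructor before requiring Project
var _ = require('lodash')

var ProjectEntity = function(obj) {
    var self = this
    _.assign(self, obj)
    Object.defineProperty(self, 'isUser', {
        get: function() {
            return (self.type.toLowerCase() === 'user')
        }
    })
    return self
}

module.exports = ProjectEntity

// requiring Project only after exporting means that when the cycle comes back
// through user.js, util.inherits(ProjectUser, ProjectEntity) sees a function, not {}
var Project = require('../Project')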

How to augment a CommonJS module from requiring code?

I am wondering how I can augment a CommonJS module from another module that requires it.
Let's assume that I have three files, two of them CommonJS modules, as below:
my-example-module.js
function MyExampleModule(){}

MyExampleModule.prototype = {
    bindings: {
        some: 'random',
        prop: 'values'
    }
}

module.exports = MyExampleModule;
another-example-module.js
var MyExampleModule = require('./my-example-module');

function AnotherExampleModule(){}

AnotherExampleModule.prototype = {
    getMyExampleModuleBindings: function(){
        var module = new MyExampleModule();
        return module.bindings;
    }
}

module.exports = AnotherExampleModule;
app.js
var MyExampleModule = require('./my-example-module');
var AnotherExampleModule = require('./another-example-module');
//modify?!?
var anotherExampleModule = new AnotherExampleModule();
console.log(anotherExampleModule.getMyExampleModuleBindings());
So what I want to do is have //modify?!? be some kind of code that will alter the original MyExampleModule prototype so when anything else attempts to require MyExampleModule it will get the modified version.
A concrete question would be: what should I replace //modify?!? with so that the following gets logged to the console, with the assumption that my-example-module.js is read-only?
{
    some: 'random',
    prop: 'values',
    added: 'binding'
}
If you want to do this in nodejs/iojs, this is pretty simple. When node imports a CommonJS module (let's call it A), it creates an internal object.
When another module (B) wants to load the same module (A), it just gets the reference to that internal object. So if you change something on MyExampleModule in app.js it is also applied to the MyExampleModule in another-example-module.js:
app.js
var MyExampleModule = require('./my-example-module');
var AnotherExampleModule = require('./another-example-module');

//modify:
MyExampleModule.prototype.bindings = {
    some: 'random',
    prop: 'values',
    added: 'binding'
};

var anotherExampleModule = new AnotherExampleModule();
console.log(anotherExampleModule.getMyExampleModuleBindings());
Since you create a new instance of MyExampleModule in another-example-module.js after you call MyExampleModule.prototype.bindings = {...} in app.js, the new instance will already be created with the modified .prototype.
While I haven't tested this in Browserify, it certainly works in webpack's implementation of CommonJS as well.
Check out the working example on runnable (app.js is called server.js):
http://code.runnable.com/VZPdN5k65gE5vUIz
