I'm currently working on a small console project that depends a lot on the arguments that are passed initially and I'm looking for a good way to handle a configuration object in nodejs.
The project is currently fully working with the following example, but I think I'm relying on the caching of modules when using 'require'.
Let's assume a module options.js:
'use strict';
var options = {
    configName: '.jstail',
    colorActive: (process.platform === 'win32') ? false : true, // deactivate color by default on the Windows platform
    quiet: false,
    debug: false,
    config: null,
    logFile: null,
    setting: null
};
module.exports = options;
And my initial module init.js:
#!/usr/bin/env node
'use strict';
var options = require('./options'); // require the options module above
// modify the options object based on args
I then have a logger that depends on these options.
For example, if quiet is set to true, no logging should happen.
logger.js
'use strict';
var options = require('./options');
/**
 * Prints to the console if not explicitly suppressed.
 * @param {String} text
 */
function log(text) {
    if (!options.quiet) {
        console.log('[LOG]: ' + text);
    }
}
My big problem is (I think) that I'm relying on the caching of Node.js modules when I require the options module in the logger.
So my two questions are:
Am I right that this only works because of the caching of the modules that nodejs does for me?
Is there any better way to handle a dynamic global configuration?
I know there are several questions and tutorials around about using a config file, but that's not what I'm looking for.
Yes, this only works because of caching, though I'd call it lazy initialization rather than caching (the Node.js docs do call it caching). It's OK to rely on it; a lot of modules do some initialization on first require, and using it for configuration is also typical. Generally speaking, require is the Node.js way of accessing global singleton objects.
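As a quick illustration (a minimal sketch, assuming the options.js module above sits next to this file), both requires below resolve to the same cached object, so a change made through one reference is visible through the other:

// cache-demo.js: run with `node cache-demo.js`
var a = require('./options');
var b = require('./options');

console.log(a === b); // true: both calls return the same cached module object

a.quiet = true;
console.log(b.quiet); // true: the mutation is visible through every reference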
The other way to do it is to load the configuration from a single file, modify it, and then pass it to the other modules that need it, like this:
// index.js
var config = require('./config');
config.flag = false;
var module1 = require('./module1')(config);

// module1.js
module.exports = function (config) {
    // do stuff
};
It makes the code more decoupled and testable but adds complexity. The difference between these two approaches is basically the same as between using globals and dependency injection. Use whichever you like.
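For example, with the injected variant a test can hand the module a fake configuration without touching any real config file (a sketch; the object returned by module1 is made up for illustration):

// module1.js (sketch): the factory returns something that uses the config
module.exports = function (config) {
    return {
        isEnabled: function () { return config.flag; }
    };
};

// test.js (sketch): inject a fake config and assert on the behaviour
var assert = require('assert');
var module1 = require('./module1')({ flag: true });
assert.strictEqual(module1.isEnabled(), true);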
[EDIT]
Thanks to Stafano, who formalized my question in a better way:
- You have a module
- There are several files in this module
- All these files depend on a configuration whose path is unknown to the module itself
- This module does not do much on its own, and is meant to be used by other applications
- These applications should inject a configuration path into the module before it can be used
So I have this module, used from another application. It's composed of several submodules, and I want to configure it using a configuration object.
I already tried to inject the configuration into my submodules, but I had the same problem described in the original question.
For example, my module uses MongoDB (with Mongoose) as a store.
// app.js
// In the config object I have the URI of the Mongo instance (in order to create a connection).
var myModule = require('myModule')(config);

// myModule files:
// myModule/index.js exposes the module's functionalities;
// it is the entry point, so I create the mongoose connection here.
var mongoose = require('mongoose');
module.exports = function (config) {
    var connection = mongoose.createConnection(config.store.URL);
    // I need to expose this connection to the other submodules.
};
// myModule/storeController.js contains the business logic that uses the store (createItem, deleteItem, get, ...) and requires mongoose and my models (stored in the models folder).
var mongoose = require('mongoose');
var Item = require('./models/item.js');

exports.createItem = function (item) {
    Item.save(item, function (err, item) {
        if (err) throw err;
        // ...
    });
};
// myModule/models/item.js
// In this module I need to use the connection created by the application from the configuration.
var mongoose = require('mongoose');
var connection = null; // I don't know how to get it
var ItemSchema = new mongoose.Schema({
    name: String
});
module.exports = mongoose.model('item', ItemSchema);
If I inject the configuration object into item.js, I can't just do the module.exports of my model.
I hope this example clarifies my question, but the problem is simple: exposing an object after getting it as a parameter.
[PREVIOUS]
I have a Node.js application that requires a module. This module accepts a configuration file path (a JSON file).
I need to load that configuration on require and expose it to the module.
How can I achieve this behavior?
Something like:
// app.js
var myModule = require('myModule')(__dirname + '/config/myModuleCnfig.json');

// myModule.js
module.exports = function (configPath) {
    var config = require(configPath);
    module.exports = config; // This is wrong
};
Is there another way to get the configuration path, configure the module, and share the configuration?
With "share the configuration" i mean that i want to give the possibility to other files of my module to use that configuration.
Thanks for any suggestions!
FINAL EDIT:
After many misunderstandings, your problem is finally clear to me. To summarise what's in the comments, here is the situation:
- You have a module
- There are several files in this module
- All these files depend on a configuration whose path is unknown to the module itself
- This module does not do much on its own, and is meant to be used by other applications
- These applications should inject a configuration path into the module before it can be used
Since you cannot dynamically modify what a module exports, you should use another approach. As with most situations that you encounter in programming, there is not one way which is always right, as much depends on your requirements and limitations.
The easiest way to do this (which I don't recommend) is to use a global variable, which you set in your myModule.js file and which will be used by the other files in your module. The biggest drawback of this approach is that you wouldn't be able to use multiple instances of the module at the same time with different configurations. Also, any other module could easily modify (deliberately or not) your configuration at any time, simply by changing the value of the global variable, so it's also a security risk.
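For completeness, the global-variable version would look roughly like this (a sketch only, with a made-up global.myModuleConfig name, to show why it is fragile):

// myModule.js (sketch of the discouraged global-variable approach)
var fs = require('fs');

module.exports = function (configPath) {
    // every other file in the module reads this global; the name is hypothetical
    global.myModuleConfig = JSON.parse(fs.readFileSync(configPath));
};

// any other file in the module (sketch)
var config = global.myModuleConfig; // fragile: any code anywhere can overwrite it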
A much better way, which will probably require more work on your part - depending on how many files you have - is to implement some kind of Inversion of Control (IoC). In your case, you could turn all your exports into functions that accept a config, and then initialise them by passing the active configuration after you require the module. I don't know the specifics of your implementation, so here is some sample code:
// submodule1.js
module.exports = function (config) {
    // return something that uses the configuration
};

// myModule.js
var fs = require('fs');
var submodule1 = require('./submodule1');
var submodule2 = require('./submodule2');
// ...
module.exports = function (configPath) {
    var config = JSON.parse(fs.readFileSync(configPath));
    var sm1 = submodule1(config);
    var sm2 = submodule2(config);
    return /* an object that uses sm1 and sm2 */;
};
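Applied to the Mongoose model from your edit, the same idea means exporting a factory from item.js and binding the schema to the injected connection (a sketch; connection.model is the standard Mongoose way to register a model on a specific connection):

// myModule/models/item.js (sketch)
var mongoose = require('mongoose');

var ItemSchema = new mongoose.Schema({
    name: String
});

module.exports = function (connection) {
    // bind the model to the injected connection instead of the default one
    return connection.model('item', ItemSchema);
};

// myModule/index.js would then create the connection once and pass it down:
// var Item = require('./models/item.js')(connection);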
If your module is quite complex, you can use some IoC library that does the binding for you. A good one could be Electrolyte.
Hope this helps.
PREVIOUS ANSWER:
You can use a library called jsop:
var jsop = require('jsop');
var config = jsop('./config/myModuleCnfig.json');
If you don't want to add a dependency to this module, the linked GitHub page also has a snippet that you can use to load the json config using only native methods.
EDIT: I just realised that this module is only for Node 0.11, which you are probably not using. Since you probably don't need the writing functionality, you can use the following snippet instead:
var fs = require('fs')
var config = JSON.parse(fs.readFileSync('./config/myModuleCnfig.json'))
EDIT 2:
Now I think I understand your problem better. To pass the path to the required configuration, you can do something like this:
// myModule.js
var fs = require('fs');
module.exports = function (configPath) {
    var config = JSON.parse(fs.readFileSync(configPath));
    return config;
};
Is it possible to use the default node require function in a file that has been called through requirejs?
define(["require", "exports"], function(require, exports) {
//...
var Schema = require(DaoPublic._schemasDirectory + schemaFilename);
}
I always get ReferenceError: module is not defined. I also tried to load the schema using RequireJS, with the same result, because the file itself is coded as CommonJS, not AMD compatible.
Any solution?
Note that the loaded schema is in CommonJS and I need to keep it this way, since it's used by several DAOs, some in AMD and others in CommonJS. (That's the funny part.)
Example of requested file (schema):
var userSchema = {
    /**
     * User Login, used as id to connect between all our platforms.
     */
    login: {
        type: String,
        match: /^[a-zA-Z0-9_-]+$/,
        trim: true,
        required: true,
        notEmpty: true,
        unique: true,
        check: {
            minLength: 4,
            maxLength: 16
        }
    }
};
module.exports = userSchema;
The problem is that your code is set up so that RequireJS is able to find the CommonJS module by itself. However, when RequireJS is running in Node and cannot find a module, it will call Node's require function, which is what you need. So it is possible (with RequireJS) to have an AMD module use Node's require, but the trick is getting RequireJS to not see the module in the first place.
Proof of Concept
Here's a proof of concept. The main file named test.js:
var requirejs = require("requirejs");

function myRequire(path) {
    if (path.lastIndexOf("schemas/", 0) === 0)
        path = "./" + path;
    return require(path);
}

requirejs.config({
    paths: {
        "schemas": "BOGUS"
    },
    nodeRequire: myRequire
});

requirejs(['foo'], function (foo) {
    console.log(foo);
});
The file foo.js:
define(["require", "exports"], function(require, exports) {
return require("./schemas/x") + " by way of foo";
});
The file schemas/x.js:
module.exports = "x";
If you run it with node test.js, you'll get on the console:
x by way of foo
Explanation
I'm calling this a "proof of concept" because I've not considered all eventualities.
The paths setting is there to throw RequireJS off track. BOGUS must be a non-existent directory. When RequireJS tries to load the module ./schemas/x, it tries to load the file ./BOGUS/x.js and does not find it. So it calls Node's require.
The nodeRequire setting tells RequireJS that Node's require function is myRequire. This is a useful lie.
The myRequire function changes the path to add the ./ at the start before calling Node's require. The issue here is that for some reason RequireJS transforms ./schemas/x to schemas/x before it gives the path to Node's require function, and Node will then be unable to find the module. Adding back the ./ at the start of the path name fixes this. I've tried a whole bunch of path variants but none of them worked. Some variants were such that RequireJS was able to find the module by itself and thus never tried calling Node's require or they prevented Node from finding the module. There may be a better way to fix this, which I've not found. (This is one reason why I'm calling this a "proof of concept".) Note that I've designed this function to only alter the paths that start with schemas/.
Other Possibilities
I've looked at other possibilities but they did not appear to me very promising. For instance, customizing NODE_PATH would eliminate myRequire but such customization is not always doable or desirable.
Using RequireJS, I'm building an app which makes extensive use of widgets. For each widget I have at least 3 separate files:
request.js containing code for setting up request/response handlers to request a widget in another part of my application
controller.js containing handling between model and view
view.js containing handling between user and controller
Module definition in request.js:
define(['common/view/widget/entity/term/list/table/controller'],
function(WidgetController) { ... });
Module definition in controller.js:
define(['common/view/widget/entity/term/list/table/view'],
function(WidgetView) { ... });
Module definition of view.js is:
define(['module', 'require'], function (module, require) {
    'use strict';
    var WidgetView = <constructor definition>;
    return WidgetView;
});
I have lots of these little situations as above in the widgets I have developed. What I dislike is having to use the full path every time a module requires another module and both are located in the same folder. I'd like to simply specify it as follows (assuming we have a RequireJS plugin which solves this for us):
define(['currentfolder!controller'],
function(WidgetController) { ... });
For this, I have written a small plugin, as I couldn't find it on the web:
define({
    load: function (name, parentRequire, onload, config) {
        var path = parentRequire.toUrl('.').substring(config.baseUrl.length) + '/' + name;
        parentRequire([path], function (value) {
            onload(value);
        });
    }
});
As you might notice, in its basic form it looks like the example in the RequireJS plugin documentation.
Now in some cases, the above works fine (e.g. from the request.js to the controller.js), but in other cases a load timeout occurs (from controller.js to view.js). When I look at the paths which are generated, all are proper RequireJS paths. Looking at the load timeouts, the following is logged:
Timestamp: 13-09-13 17:27:10
Error: Error: Load timeout for modules: currentfolder!view_unnormalized2,currentfolder!view
http://requirejs.org/docs/errors.html#timeout
Source File: http://localhost/app/vendor/requirejs/require.js?msv15z
Line: 159
The above log was from a test I did where I only loaded view.js from controller.js, using currentfolder!view in the list of modules in the define statement. Since I only requested currentfolder!view once, I'm confused as to why I see both currentfolder!view_unnormalized2 and currentfolder!view in the message.
Any idea as to why this might be happening?
My answer may not answer your primary questions, but it will help you achieve what you're trying to do with your plugin.
In fact, RequireJS supports relative paths for requiring modules when using the CommonJS style. Like so:
define(function (require, exports, module) {
    var relativeModule = require("./subfolder/module");

    module.exports = function () {
        console.log(relativeModule);
    };
});
A small test app is set up like this:
init.js:
//#codekit-prepend "vendor/jquery-1.7.2.js"
//#codekit-prepend "vendor/underscore.js"
//#codekit-prepend "vendor/backbone.js"
// Setup namespace for the app
window.app = window.app || {};
//#codekit-append "models/Ride.js"
Ride.js:
(function () {
    window.app.Ride = Backbone.Model.extend({
        initialize: function () {
            console.log("Ride initialized");
        }
    });
})();
CodeKit's JSHint check reports that both Backbone and console are not defined. What am I missing here?
JSHint doesn't run your code, so it doesn't know about any modules you included in other files. You have to specifically tell it about all global variables you plan to use in Ride.js. In your case that is: /*global Backbone */. console is disallowed by default because it is not a very good idea to ship your software with leftover console.log calls. To remove this warning you can use /*jshint devel:true */.
So in the end your file should look like this to pass JSHint check:
/*jshint devel:true */
/*global Backbone */
(function () {
    window.app.Ride = Backbone.Model.extend({
        initialize: function () {
            console.log("Ride initialized");
        }
    });
})();
More info here: http://www.jshint.com/options/
Bryan here. CodeKit does check your files in a full, global context. (That is, it combines them first, so variables declared in an earlier file will be valid in a later one. This assumes you use CodeKit to combine the files, either with #codekit-prepend/append statements or drag/drop import links set up in CodeKit itself). If you're combining your JS files some other way (such as a build script) then CodeKit is unaware that the files go together and therefore it checks each one separately.
You can use the comment flags in the answer above, or you can configure JSHint's options directly in CodeKit. See the preferences window (or project settings area, if your project uses project-level settings). You can also enter custom globals there as well, which will remove those warnings.
Cheers!
I am currently using requirejs to manage module js/css dependencies.
I'd like to discover the possibilities of having node do this via a centralized config file.
So instead of manually doing something like
define([
    'jquery',
    'lib/somelib',
    'views/someview'
])
within each module.
I'd have Node inject the dependencies, i.e.
require('moduleA').setDeps('jquery','lib/somelib','views/someview')
Anyway, I'm interested in any projects looking at dependency injection for node.
Thanks.
I've come up with a solution for dependency injection. It's called injectr, and it uses node's vm library and replaces the default functionality of require when including a file.
So in your tests, instead of require('libToTest'), use injectr('libToTest', { 'libToMock': myMock });. I wanted to make the interface as straightforward as possible, with no need to alter the code being tested. I think it works quite well.
It's just worth noting that injectr files are relative to the working directory, unlike require which is relative to the current file, but that shouldn't matter because it's only used in tests.
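To make that concrete, a test might look something like this (a sketch; the file name and the fake fs object are made up, and the injectr(path, mocks) call follows the usage described above):

// test.js (sketch): note the path is relative to the working directory
var injectr = require('injectr');

var fakeFs = {
    readFileSync: function () { return 'stubbed contents'; }
};

// libToTest sees fakeFs wherever it would normally require('fs')
var libToTest = injectr('lib/libToTest.js', { fs: fakeFs });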
I've previously toyed with the idea of providing an alternate require to make a form of dependency injection available in Node.js.
Module code
For example, suppose you have following statements in code.js:
var fs = require('fs');
console.log(fs.readFileSync('text.txt', 'utf-8'));
If you run this code with node code.js, then it will print out the contents of text.txt.
Injector code
However, suppose you have a test module that wants to abstract away the file system.
Your test file test.js could then look like this:
var origRequire = global.require;
global.require = dependencyLookup;

require('./code.js');

function dependencyLookup(file) {
    switch (file) {
        case 'fs': return { readFileSync: function () { return "test contents"; } };
        default: return origRequire(file);
    }
}
If you now run node test.js, it will print out "test contents", even though it includes code.js.
I've also written a module to accomplish this, it's called rewire. Just use npm install rewire and then:
var rewire = require("rewire"),
    myModule = rewire("./path/to/myModule.js"); // exactly like require()

// Your module will now export a special setter and getter for private variables.
myModule.__set__("myPrivateVar", 123);
myModule.__get__("myPrivateVar"); // = 123

// This allows you to mock almost everything within the module, e.g. the fs module.
// Just pass the variable name as the first parameter and your mock as the second.
myModule.__set__("fs", {
    readFile: function (path, encoding, cb) {
        cb(null, "Success!");
    }
});

myModule.readSomethingFromFileSystem(function (err, data) {
    console.log(data); // = Success!
});
I've been inspired by Nathan MacInnes's injectr but used a different approach. I don't use vm to eval the test module; in fact, I use Node's own require. This way your module behaves exactly as if it were loaded with require() (except for your modifications). Also, debugging is fully supported.