I have an application with a Node.js backend and a RequireJS/Backbone frontend.
My backend has a config/settings system which, depending on the environment (dev, production, beta), can do different things. I would like to propagate some of the variables to the client as well, and have them affect some template rendering (e.g. change the title or the URL of the pages).
What is the best way to achieve that?
I came up with a way to do it, and it seems to be working, but I don't think it's the smartest approach, and I can't figure out how to make it work with the RequireJS optimizer anyway.
What I do is expose an /api/config endpoint (through GET) on the backend, and on the client I have the following module, config.js:
// This module loads an environment config
// from the server through an API
define(function(require) {
    // the text! plugin fetches /api/config as a plain string
    var cfg = require('text!/api/config');
    return JSON.parse(cfg);
});
Any page/module that needs the config will just do:
var cfg = require('config');
As I said, I am having a problem with this approach: I can't compile/optimize my client code with the RequireJS optimizer, since the /api/config file doesn't exist offline during optimization. And I am sure there are many other reasons my approach is a bad idea.
If you use a module bundler such as webpack to bundle JavaScript files for usage in a browser, you can reuse your Node.js modules for the client running in the browser. In other words, put your settings or configuration in Node.js modules, and share them between the backend and the client.
For example, you have the following settings in config.js:
Normal Node.js module: config.js
const MY_THIRD_PARTY_URL = 'https://a.third.party.url'
module.exports = { MY_THIRD_PARTY_URL }
Use the module in Node.js backend
const config = require('path-to-config.js')
console.log('My third party URL: ', config.MY_THIRD_PARTY_URL)
Share it in the client
import config from 'path-to-config.js'
console.log('My third party URL: ', config.MY_THIRD_PARTY_URL)
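Since the original question involves environment-dependent settings, here is a minimal sketch of the same idea with per-environment values (the NODE_ENV switch and the example values are my assumptions, not from the original answer):
// config.js - picks values based on NODE_ENV; usable from Node.js
// directly and from the browser via a bundler such as webpack
const env = process.env.NODE_ENV || 'development'

const settings = {
  development: { TITLE: 'My App (dev)', BASE_URL: 'http://localhost:3000' },
  beta: { TITLE: 'My App (beta)', BASE_URL: 'https://beta.example.com' },
  production: { TITLE: 'My App', BASE_URL: 'https://example.com' }
}

module.exports = settings[env]
When bundling, webpack's DefinePlugin (or the mode setting) substitutes process.env.NODE_ENV at build time, so the unused branches can be dropped from the bundle.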
I do the following (note that this is Jade; I have never used RequireJS or Backbone, but as long as you can pass variables from Express into your templating language, you should be able to place JSON in data-* attributes on any element you want):
// app.js
app.get('/', function(req, res){
    var bar = {
        a: "b",
        c: Math.floor(Math.random()*5)
    };
    res.locals.foo = JSON.stringify(bar);
    res.render('some-jade-template');
});
// some-jade-template.jade
!!!
html
  head
    script(type="text/javascript"
      , src="http://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js")
    script(type="text/javascript").
      $(document).ready(init);
      function init(){
        var json = $('body').attr('data-stackoverflowquestion');
        var obj = JSON.parse(json);
        console.log(obj);
      }
  body(data-stackoverflowquestion=locals.foo)
    h4 Passing data with data-* attributes example
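A small aside, not from the original answer: jQuery's .data() reads data-* attributes and automatically parses values that look like JSON, so the manual JSON.parse step can be skipped:
// equivalent shortcut (jQuery >= 1.4.3)
var obj = $('body').data('stackoverflowquestion');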
Assume the following scenario:
A REST client is a module which has many middlewares, which are themselves modules.
Now we're trying to create a new middleware that requires the client itself, in order to fetch/update metadata for the client's input URL.
Testing this middleware would bring in a published version of the client from the npm registry, because the middleware has a devDependency on the client. But we want to serve our local client.
Also, the published version of the client does not contain this new middleware, so it would not allow testing the request pipeline with this middleware.
We want to initiate the client with this middleware when we're testing the middleware itself, to send a request to fetch data.
The middleware is smart enough not to request metadata for metadata, so it will skip the second call.
Wrap the Node.js module loader so that it returns the local client instead of the published one whenever 'client' is required during test execution:
describe('middleware with local client', () => {
    const localClient = require('../client');
    const m = require('module');
    const orig = m._load;

    before(() => {
        // intercept module loading: return the local client
        // whenever 'client' is required
        m._load = function (name) {
            if (name === 'client') {
                return localClient;
            }
            return orig.apply(this, arguments);
        };
    });

    after(() => {
        // restore the original loader
        m._load = orig;
    });

    it('test goes here', () => {...});
});
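An alternative sketch (my suggestion, not part of the original answer) is the proxyquire library, which performs the same substitution per require call without patching module._load globally; the '../middleware' path here is hypothetical:
const proxyquire = require('proxyquire');
const localClient = require('../client');

// load the middleware under test, satisfying its require('client')
// with the local copy instead of the published package
const middleware = proxyquire('../middleware', { client: localClient });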
I use FayeJS, and the latest version has been modified to use RequireJS, so there is no longer a single file to link into the browser. Instead the structure is as follows:
/adapters
/engines
/mixins
/protocol
/transport
/util
faye_browser.js
I am using the following Node.js build script to try to end up with all of the above minified into a single file:
var fs = require('fs-extra'),
    requirejs = require('requirejs');

var config = {
    baseUrl: 'htdocs/js/dev/faye/',
    name: 'faye_browser',
    out: 'htdocs/js/dev/faye/dist/faye.min.js',
    paths: {
        dist: "empty:"
    },
    findNestedDependencies: true
};

requirejs.optimize(config, function (buildResponse) {
    // buildResponse is just a text output of the modules included.
    // Load the built file for the contents; use config.out to get
    // the optimized file contents.
    var contents = fs.readFileSync(config.out, 'utf8');
}, function (err) {
    // optimization error callback
    console.log(err);
});
The content of faye_browser.js is:
'use strict';
var constants = require('./util/constants'),
    Logging = require('./mixins/logging');

var Faye = {
    VERSION: constants.VERSION,
    Client: require('./protocol/client'),
    Scheduler: require('./protocol/scheduler')
};

Logging.wrapper = Faye;
module.exports = Faye;
As I understand it, the optimizer should pull in the required files, then if those files have required files it should pull those in too, etc., and output a single minified faye.min.js that contains the whole lot, refactored so that no additional server-side calls are necessary.
What happens is that faye.min.js gets created, but it only contains the content of faye_browser.js; none of the other required files are included.
I have searched all over the web and looked at a heap of different examples, and none of them work for me.
What am I doing wrong here?
For anyone else trying to do this: I missed that the download page says:
The Node.js version is available through npm. This package contains a
copy of the browser client, which is served up by the Faye server when
running.
So to get it you have to pull down the code via npm, go into the npm install dir, and it is in the "client" dir...
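Alternatively, if you do want to run faye_browser.js through r.js yourself, a hedged note: the Faye sources shown above are plain CommonJS modules (require/module.exports, with no define() wrapper), and by default r.js only traces dependencies declared through define, which would explain the output containing only the entry file. If that is the cause, the optimizer's cjsTranslate build option converts such files to AMD before tracing:
var config = {
    baseUrl: 'htdocs/js/dev/faye/',
    name: 'faye_browser',
    out: 'htdocs/js/dev/faye/dist/faye.min.js',
    // wrap plain CommonJS modules in define() so their
    // require() calls are traced as dependencies
    cjsTranslate: true,
    findNestedDependencies: true
};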
[EDIT]
Thanks to Stafano, who formalized my question in a better way:
You have a module
-) There are several files in this module
-) All these files depend on a configuration whose path is unknown to the module itself
-) This module does not do much on its own, and is meant to be used by other applications
-) These applications should inject a configuration path into the module before it can be used
So I have this module, used by another application. It's composed of other submodules, and I want to configure it using a configuration object.
I already tried to inject the configuration into my submodules, but I had the same problem described in the original question.
For example, my module uses MongoDB (with Mongoose) as a store.
// app.js
// in the config object I have the URI of the mongo instance
// (in order to create a connection)
var myModule = require('myModule')(config);

// myModule files:
// myModule/index.js exposes the module's functionality and is the
// entry point, so I create the mongoose connection here
var mongoose = require('mongoose');
module.exports = function(config){
    var connection = mongoose.createConnection(config.store.URL);
    // I need to expose this connection to the other submodules.
}

// myModule/storeController.js contains the business logic that uses the
// store (createItem, deleteItem, get...) and requires mongoose and my
// models (stored in the models folder)
var mongoose = require('mongoose');
var Item = require('./models/item.js');

exports.createItem = function(item){
    Item.save(item, function(err, item){
        if (err) throw err;
        ...
    });
}
// myModule/models/item.js
// In this module I need to use the connection created from the configuration.
var mongoose = require('mongoose');
var connection = // I don't know how to get it

var ItemSchema = new mongoose.Schema({
    name: String
});

module.exports = mongoose.model('item', ItemSchema);
If I inject the configuration object into item.js, I can't do the module.exports of my model.
I hope this example clarifies my question; the problem is simple: expose an object after receiving it as a parameter.
[PREVIOUS]
I have a Node.js application that requires a module. This module accepts the configuration file path (a JSON file).
I need to load that configuration on require and expose it to the module.
How can I achieve this behavior?
Something like:
// app.js
var myModule = require('myModule')(__dirname + '/config/myModuleCnfig.json');

// myModule.js
module.exports = function(configPath){
    var config = require(configPath);
    module.exports = config; // This is wrong
}
Is there another way to get the configuration path, configure the module, and share the configuration?
By "share the configuration" I mean that I want to give other files of my module the possibility to use that configuration.
Thanks for any suggestions!
FINAL EDIT:
After many misunderstandings, your problem is finally clear to me. To summarise what's in the comments, here is the situation:
You have a module
There are several files in this module
All these files depend on a configuration whose path is unknown to the module itself
This module does not do much on its own, and is meant to be used by other applications
These applications should inject a configuration path into the module before it can be used
Since you cannot dynamically modify what a module exports, you should use another approach. As with most situations you encounter in programming, there is no single way that is always right, as much depends on your requirements and limitations.
The easiest way to do this (which I don't recommend) is to use a global variable, which you set in your myModule.js file and which will be used by the other files in your module. The biggest drawback of this approach is that you wouldn't be able to use multiple instances of the module at the same time with different configurations. Also, any other module could easily modify (deliberately or not) your configuration at any time, simply by changing the value of the global variable, so it's also a security risk.
A much better way, which will probably require more work on your part (depending on how many files you have), is to implement some kind of Inversion of Control (IoC). In your case, you could turn all your exports into functions that accept a config, and then initialise them by passing the active configuration after you require the module. I don't know the specifics of your implementation, so here is some sample code:
// submodule1.js
module.exports = function(config) {
    // return something that uses the configuration
}

// myModule.js
var fs = require('fs');
var submodule1 = require('./submodule1');
var submodule2 = require('./submodule2');
// ...

module.exports = function(configPath){
    var config = JSON.parse(fs.readFileSync(configPath));
    var sm1 = submodule1(config);
    var sm2 = submodule2(config);
    return /* an object that uses sm1 and sm2 */;
}
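Applied to the Mongoose example from the question, the same pattern could look like this (a sketch; the factory shape is my suggestion, not code from the question):
// myModule/models/item.js
var mongoose = require('mongoose');

var ItemSchema = new mongoose.Schema({
    name: String
});

// export a factory instead of a ready-made model; the caller
// passes in the connection created in myModule/index.js
module.exports = function(connection) {
    return connection.model('item', ItemSchema);
};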
If your module is quite complex, you can use an IoC library that does the binding for you. A good one is Electrolyte.
Hope this helps.
PREVIOUS ANSWER:
You can use a library called jsop:
var jsop = require('jsop');
var config = jsop('./config/myModuleCnfig.json');
If you don't want to add a dependency to this module, the linked GitHub page also has a snippet that you can use to load the json config using only native methods.
EDIT: I just realised that this module only works on Node 0.11, which you are probably not using. Since you probably don't need the writing functionality, you can use the following snippet instead:
var fs = require('fs')
var config = JSON.parse(fs.readFileSync('./config/myModuleCnfig.json'))
EDIT 2:
Now I think I understand your problem better. To pass the path to the required configuration, you can do something like this:
// myModule.js
var fs = require('fs')

module.exports = function(configPath){
    var config = JSON.parse(fs.readFileSync(configPath))
    return config;
}
I'm currently working on a small console project that depends a lot on the arguments passed initially, and I'm looking for a good way to handle a configuration object in Node.js.
I currently have the project fully working with the following example, but I think I'm relying on the caching of modules when using require.
Let's assume a module, options.js:
'use strict';

var options = {
    configName: '.jstail',
    colorActive: process.platform !== 'win32', // deactivate color by default on Windows
    quiet: false,
    debug: false,
    config: null,
    logFile: null,
    setting: null
};

module.exports = options;
And my initial module, init.js:
#!/usr/bin/env node
'use strict';
var options = require('./options'); // require the options module above
// modify the options object based on args
I then have a logger that depends on these options.
For example, if quiet is set to true, no logging should happen.
logger.js
'use strict';
var options = require('./options');

/**
 * Prints to console if not explicitly suppressed.
 * @param {String} text
 */
function log(text) {
    if (!options.quiet) {
        console.log('[LOG]: ' + text);
    }
}
My big problem is (I think) that I'm relying on the caching of Node.js modules when I require the options module in the logger.
So my two questions are:
Am I right that this only works because of the caching of modules that Node.js does for me?
Is there a better way to handle a dynamic global configuration?
I know there are several questions and tutorials around that involve a config file, but that's not what I'm looking for.
Yes, this only works because of caching, though I wouldn't call it caching (even though the Node.js docs do) but rather lazy initialization. It's OK to rely on that: a lot of modules do some initialization on first require, and using it for configuration is also typical. Generally speaking, require is the Node.js way of accessing global singleton objects.
The other way to do it is to load the configuration from a single file, modify it, and then pass it to the other modules that need it, like this:
// index.js
var config = require('./config')
config.flag = false
var module1 = require('./module1')(config)

// module1.js
module.exports = function (config) {
    // do stuff
}
It makes the code more decoupled and testable, but adds complexity. The difference between these two approaches is basically the same as using globals vs dependency injection. Use whatever you like.
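One addition of mine, not from the original answer: if you keep the shared-module approach, tests can still obtain a fresh, unmodified options object by clearing Node's require cache:
// returns a freshly evaluated options object, bypassing the module cache
function freshOptions() {
    delete require.cache[require.resolve('./options')];
    return require('./options');
}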
I have one module which wraps the Socket.io functionality my app is using which looks something like this:
// realtime.js
var io = require('socket.io'),
    sio;

exports.init = function(expressServer) {
    sio = io.listen(expressServer);
}
...
The main app.js file looks like
// app.js
var rt = require('./realtime.js'),
    other = require('./other.js');
...
rt.init(expressServer);
The other module also uses rt.js
// other.js
var rt = require('./realtime.js');
...
My question is, will both other.js and app.js have the same instance of rt.js?
The answer on SO relating to Redis led me to believe the above statement is true, but the documentation here says:
Multiple calls to require('foo') may not cause the module code to be
executed multiple times. This is an important feature. With it,
"partially done" objects can be returned, thus allowing transitive
dependencies to be loaded even when they would cause cycles.
which seems to imply that it's not guaranteed to be the case?
Finally, this question appears to indicate that it depends on the filename, and that since there is only one instance of rt.js it shouldn't be executed more than once. If that's the case, does it depend only on rt.js being the same file, or does it depend on the path specified by require? Basically, if rt.js and other.js were in lib/ and app.js was one level down, the requires in other.js and app.js would point to rt.js from different files; does this matter?
I'd be grateful if anyone could clear this confusion up for me!
Modules are currently evaluated only once, but you should not rely on this. Keeping state in a module is considered bad practice. What prevents you from passing a reference to sio to other.js?
// realtime.js
var io = require('socket.io');

exports.init = function(expressServer) {
    return io.listen(expressServer);
}

// app.js
var rt = require('./realtime.js'),
    other = require('./other.js');
...
var sio = rt.init(expressServer);

// now ask other.js to use the same sio instance
other.use_sio(sio);
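For completeness, a sketch of what the receiving side in other.js could look like (the answer only shows the call, so this shape is an assumption):
// other.js
var sio;

exports.use_sio = function(instance) {
    // keep a reference to the shared socket.io instance
    sio = instance;
};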
Be sure not to install socket.io in more than one place. If you require socket.io in different modules, where each module searches for packages from a different path, then each module will load a separate instance of the package.
app directory layout:
-module1
--/node_modules   // has socket.io
---/socket.io
--/module1.js     // requires socket.io from module1/node_modules
-module2
--/node_modules   // has another socket.io installation
---/socket.io
--/module2.js     // requires socket.io from module2/node_modules (does not create a reference to what was required in module1)
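A quick way to verify which copy each module is loading (a small diagnostic, my addition):
// run this inside module1.js and module2.js; if the printed
// paths differ, each module is loading a separate instance
console.log(require.resolve('socket.io'));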
Hope this helps.