Node.js, split up files without having to pass dependencies around? - javascript

This may be just me lacking a 'bigger picture' so to speak, but I'm having trouble understanding why exporting modules is needed to just split up files.
I tried doing something like this:
//server.js
var app = require('koa')();
var othermodule1 = require('othermodule1')();
var othermodule2 = require('othermodule2')();
var router = require('./config/routes')();
app.use(router.routes());
//routes.js
module.exports = require('koa-router')()
    .get('*', function*(next){
        othermodule1.something;
    })
realizing that routes.js does not have access to 'othermodule1' after calling it from server.js. I know there's a way to pass needed variables during the require call, but I have a lot more than just two modules that I would need to pass. So from my probably naive perspective, this seems unnecessarily cumbersome. Can someone enlighten me, or is there actually a way to do this that I missed?

Each node.js module is meant to be a stand-alone sharable unit. It includes everything that it needs to do its job. That's the principle behind modules.
This principle makes for a little more overhead at the start of each module to require() in everything you need in that module, but it's only done once at the server startup and all modules are cached anyway so it isn't generally a meaningful performance issue.
You can make global things by assigning to the global object, but that often breaks modularity and definitely goes against the design spirit of independently shareable modules.
In your specific code, if routes needs access to othermodule1, then it should just require() it in as needed. That's how modules work. routes should just include the things it needs. Modules are cached so requiring it many times just gives every require() the same module handle from a cache.
This is an adjustment in thinking from other systems, but once you get used to it, you just do it and it's no big deal. Either require() in what you need (the plain shareable module method), pass something into a module via its constructor (the push method), create init() methods so someone can initialize the module properly, or call some other module to get the things you need (the pull method).
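As a concrete illustration of the plain shareable-module method, routes.js can simply require() what it uses itself; the module names here ('koa-router', 'othermodule1') come from the question, and the body is only a sketch:
//routes.js
var router = require('koa-router')();
var othermodule1 = require('othermodule1')(); // cached, so this is cheap after the first require()

module.exports = router
    .get('*', function*(next){
        // othermodule1 is in scope because this file required it itself
        othermodule1.something;
    });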

Related

Handling global variables in Meteor

I'm using a query on both server and client (pub/sub). So I have something like this at a few different locations.
const FOO = 'bar';
Collection.find({property:FOO})
Foo may potentially change and rather than have to update my code at separate locations, I was thinking it may be worth it to abstract this away to a global variable that is visible by both client and server.
I created a new file 'lib/constants.js' and simply did FOO = 'bar'; (note no keyword). This seems to work just fine. I found this solution in the accepted answer to How can I access constants in the lib/constants.js file in Meteor?
My question is if this a desired pattern in Meteor and even general JS.
I understand I can abstract this away into a module, but that may be overkill in this case. I also think using session/reactive vars is unsafe as it can kinda lead to action at a distance. I'm not even gonna consider using settings.json as that should only be for environment variables.
Any insights?
Yes, if you are using an older version of Meteor you can use settings.json, but for newer versions we have the import option.
I don't think the pattern is that bad. I would put that file in /imports though and explicitly import it.
Alternatively, you can write into Meteor.settings.public from the server, e.g., on start-up, and those values will be available on the client in the same location. You can do this without having a settings file, which is nice because it doesn't require you to make any changes between development and production.
Server:
Meteor.startup(() => {
// code to run on server at startup
Meteor.settings.public.FOO = 'bar';
});
Client:
> console.log(Meteor.settings.public.FOO);
bar
This is actually a b̶a̶d̶ unfavoured pattern, because with global variables you cannot track what changed, and in general constructing modular and replaceable components is much better. This pattern was only made possible in Meteor's early days, when the imports directory/pattern was not supported yet and your entire code was split up between server and client.
https://docs.meteor.com/changelog.html#v13220160415
You can find many write-ups about it online, and even Stack Overflow answers, so I don't want to restate the obvious.
Using a settings.json variable is not an option since the value may change dynamically, so what are our options? For me I'd say:
Store it in the database and either publish it or retrieve it using methods, with proper access scoping of course. You can also modify it dynamically using methods that author DB changes (a sketch of this option follows below).
Or, you may try using Meteor.EnvironmentVariable. I'd be lying if I said I know how to use it properly, but I've seen it used in a couple of Meteor projects to tackle a similar situation.
https://www.eventedmind.com/items/meteor-dynamic-scoping-with-environment-variables
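A minimal sketch of the database option, assuming a hypothetical app_settings collection, publication name, and document shape (none of these come from the original post):
// lib/settings.js (shared between client and server)
import { Mongo } from 'meteor/mongo';
export const AppSettings = new Mongo.Collection('app_settings');

// server/publications.js
import { Meteor } from 'meteor/meteor';
import { AppSettings } from '../lib/settings.js';

Meteor.publish('appSettings', function () {
  // scope access here as needed, e.g. check this.userId
  return AppSettings.find({ public: true });
});

// client: subscribe once, then read the value reactively
// Meteor.subscribe('appSettings');
// const foo = (AppSettings.findOne({ key: 'FOO' }) || {}).value;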
Why are global variables considered bad practice?

ES6 exports/imports use-case, compared to traditional namespacing

I don't understand WHY, and in what scenario, this would be used.
My current web setup consists of lots of components, which are just functions or factory functions, each in their own file, and each function "rides" the app namespace, like : app.component.breadcrumbs = function(){... and so on.
Then GULP just combines all the files, and I end up with a single file, so a page controller (each "page" has a controller which loads the components the page needs) can just load its components, like: app.component.breadcrumbs(data).
All the components can be easily accessed on demand, and the single JavaScript file is well cached and everything. This way of working seems extremely good; I never saw any problem with it. Of course, this can be (and is) scaled nicely.
So how are ES6 imports for functions any better than what I described?
What's the deal with importing functions instead of just attaching them to the app's namespace? It makes much more sense for them to be "attached".
Files structure
/dist/app.js // web app namespace and so on
/dist/components/breadcrumbs.js // some component
/dist/components/header.js // some component
/dist/components/sidemenu.js // some component
/dist/pages/homepage.js // home page controller
// GULP concat all above to
/js/app.js // this file is what is downloaded
Then inside homepage.js it can look like this:
app.routes.homepage = function(){
    "use strict";
    var DOM = { page : $('#page') };
    // append whatever components I want to this page
    DOM.page.append(
        app.component.header(),
        app.component.sidemenu(),
        app.component.breadcrumbs({a:1, b:2, c:3})
    );
};
This is an extremely simplified code example but you get the point
Answers to this are probably a little subjective, but I'm going to do my best.
At the end of the day, both methods support creating a namespace for a piece of functionality so that it does not conflict with other things. Both work, but in my view, modules, ES6 or any other, provide a few extra benefits.
Explicit dependencies
Your example seems very biased toward a "load everything" approach, but you'll generally find that to be uncommon. If your components/header.js needs to use components/breadcrumbs.js, assumptions must be made. Has that file been bundled into the overall JS file? You have no way of knowing. Your two options are:
Load everything
Maintain a file somewhere that explicitly lists what needs to be loaded.
The first option is easy and in the short term is probably fine. The second is hard to maintain: because it would be maintained as an external list, it would be very easy to stop needing one of your component files but forget to remove it from the list.
It also means that you are essentially defining your own syntax for dependencies when again, one has now been defined in the language/community.
What happens when you want to start splitting your application into pieces? Say you have an application that is a single large file that drives 5 pages on your site, because they started out simple and it wasn't big enough to matter. Now the application has grown and should be served with a separate JS file per-page. You have now lost the ability to use option #1, and some poor soul would need to build this new list of dependencies for each end file.
What if you start using a file in new places? How do you know which JS target files actually need it? What if you have twenty target files?
What if you have a library of components that are used across your whole company, and one of them starts relying on something new? How would that information be propagated to any number of the developers using these components?
Modules allow you to know with 100% certainty what is used where, with automated tooling. You only need to package the files you actually use.
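For illustration, here is roughly what the breadcrumbs component and the home page controller could look like as ES6 modules; the file layout follows the question, but the function bodies are just a sketch:
// components/breadcrumbs.js
export default function breadcrumbs(data) {
    // build and return the breadcrumbs element
    return $('<nav>').text(JSON.stringify(data));
}

// pages/homepage.js
import header from '../components/header.js';
import breadcrumbs from '../components/breadcrumbs.js';

export default function homepage() {
    // the bundler can now see exactly which components this page needs
    $('#page').append(header(), breadcrumbs({ a: 1, b: 2, c: 3 }));
}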
Ordering
Related to dependency listing is dependency ordering. If your library needs to create a special subclass of your header.js component, you are no longer only accessing app.component.header() from app.routes.homepage(), which would presumably be running at DOMContentLoaded. Instead you need to access it during the initial application execution. Simple concatenation offers no guarantee that it will have run yet. If you are concatenating alphabetically and your new thing is app.component.blueHeader(), then it would fail.
This applies to anything that you might want to do immediately at execution time. If you have a module that immediately looks at the page when it runs, or sends an AJAX request or anything, what if it depends on some library to do that?
This is another argument against #1 (load everything), so you start having to maintain a list again. That list is again going to be a custom thing you'll have come up with instead of a standardized system.
How do you train new employees to use all of this custom stuff you've built?
Modules execute files in order based on their dependencies, so you know for sure that the stuff you depend on will have executed and will be available.
Scoping
Your solution treats everything as a standard script file. That's fine, but it means that you need to be extremely careful not to accidentally create global variables by placing them in the top-level scope of a file. This can be solved by manually wrapping file content in (function(){ ... })();, but again, it's one more thing you need to know to do instead of having it provided for you by the language.
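A quick sketch of that difference (the variable name is made up for illustration):
// plain concatenated script: this leaks a global
var pageTitle = 'Home'; // becomes window.pageTitle

// manual fix: wrap the file content in an IIFE
(function () {
    var pageTitle = 'Home'; // now local to this function
})();

// ES6 module: top-level bindings are module-scoped automatically, no wrapper needed
const pageTitle = 'Home';
export default pageTitle;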
Conflicts
app.component.* is something you've chosen, but there is nothing special about it, and it is global. What if you wanted to pull in a new library from Github for instance, and it also used that same name? Do you refactor your whole application to avoid conflicts?
What if you need to load two versions of a library? That has obvious downsides if it's big, but there are plenty of cases where you'd still rather ship something bigger than something non-functional. If you rely on a global object, it is now up to that library to make sure it also exposes an API like jQuery's noConflict. What if it doesn't? Do you have to add it yourself?
Encouraging smaller modules
This one may be more debatable, but I've certainly observed it within my own codebase. With modules, and the lack of boilerplate necessary to write modular code with them, developers are encouraged to look closely at how things get grouped. It is very easy to end up making "utils" files that are giant bags of functions thousands of lines long, because it is easier to add to an existing file than it is to make a new one.
Dependency webs
Having explicit imports and exports makes it very clear what depends on what, which is great, but the side-effect of that is that it is much easier to think critically about dependencies. If you have a giant file with 100 helper functions, that means that if any one of those helpers needs to depend on something from another file, it needs to be loaded, even if nothing is ever using that helper function at the moment. This can easily lead to a large web of unclear dependencies, and being aware of dependencies is a huge step toward thwarting that.
Standardization
There is a lot to be said for standardization. The JavaScript community has moved heavily in the direction of reusable modules. This means that if you hop into a new codebase, you don't need to start off by figuring out how things relate to each other. Your first step, at least in the long run, won't be to wonder whether something is AMD, CommonJS, System.register or something else. By having a syntax in the language, it's one less decision to have to make.
The long and short of it is, modules offer a standard way for code to interoperate, whether that be your own code, or third-party code.
If your current process is to concatenate everything into a single large file, only ever execute things after the whole file has loaded, and you have 100% control over all code that you are executing, then you've essentially defined your own module specification based on your own assumptions about your specific codebase. That is totally fine, and no one is forcing you to change it.
No such assumptions can be made for the general case of JavaScript code, however. It is precisely the objective of modules to provide a standard in such a way as not to break existing code, but also to provide the community with a way forward. What modules offer is an approach that is standardized, and one that offers clearer paths for interoperability between your own code and third-party code.

The Fundamentals of the Node.js Module Paradigm?

I am struggling to really get a grasp on some fundamental basics here, and I feel it is not only holding me back, but resulting in crappy code and I don't like that.
I understand the concept of breaking out functional chunks of code into separate modules, like say routes, DB models, etc., but I'm having a real tough time understanding how to properly orchestrate the interdependent functioning of all these separate modules.
Let me give a few examples of where my struggles lie.
Example 1
My ExpressJS 'app' is setup in my main program module, just like you see in every tutorial. However I need to access the app instance in other modules as well. How do I do that? One way I learned from various tutorials is to make the entire module export a function which takes the app as a param, then do what I need in the function. But that seems to me to add a lot of complexity to things. Not only do I now have an entire module enclosed in a function, but I seem to lose the ability to actually export multiple functions, objects, or other variables out of that module.
Module as a Function
module.exports = function(app) {
    blah;
};
Module without a Function
exports.func1 = function() {
};
exports.func2 = function() {
};
The latter gives me much more flexibility in my eyes, but I seem to be forced often to use the former, because I need to pass in things like the app from somewhere else.
Example 2
I am using connect-rest for my REST API. All the code for my API lives in a separate module named simply 'api'. That has been fine until recently. Now I need to access a function that is in the api module, from inside my routes module. Presently my main routes are defined before my api, so I can't exactly pass my api export into my route function. I could reverse them probably, but this is only covering up a larger issue.
In short, the issue is one of increasing interdependence
As my codebase grows, I'm finding it more and more frequent that various modules need to work with each other - it just isn't feasible to keep them all completely separate. Sometimes it is possible, but it is unclean.
I feel like I'm missing some basic Node.js (or maybe just JavaScript) paradigm that is used to manage all of this.
If anyone could help me understand I would be most appreciative. I am an experienced developer in other languages such as C++ and Python if it helps to couch things in other terms.
An attempt to sum up the issue
I feel that I did not adequately communicate my intention for posting, so let me try and sum up my issue with a working problem.
server.js
var http = require('http'),
    express = require('express'),
    app = express();
// Bunch of stuff done with app to get it set up
var routes = require('./routes.js')(app, express);
routes.js
module.exports = function(app, express) {
    var router = express.Router();
    // code for routes
    app.use('/', router);
};
In the above example, routes are split off into their own module, but that module needs app and express objects from server.js in order to function. So, by my current understanding, the only way to get those over into routes.js is to make routes.js export one big function which you then call with the two objects you need.
However, what if I want routes.js to export multiple functions that might be used in other places? By my understanding, I now can't. What if I wanted to do:
authentication.js
var routes = require('./routes');
// setup auth
routes.doSomethingFunction(auth);
I can't do that because routes is only exporting that one mega function.
Each node module is simply an object. The part of that object which is available to the outside world is the module.exports object which contains properties which can be functions or data.
The require("xxx") command gets you the exports object for that module (from a central cache or loads it from the .js file is it hasn't yet been loaded).
So, code sharing is simple. Just have each module do a require() on any other modules that it wants to share code from and have those modules make sure the shared functions are accessible via it's own exports object. This allows each module to essentially be stand-alone. It loads any other code that it needs and makes it a lot easier to reuse code. Modules are cached so doing lots of require() operations on the same module from lots of other modules is nothing more than a cache lookup and is not something to worry about.
Data sharing (such as your app object) can be accomplished in several different ways. The most common way is when you load the module to just call some sort of initialization function for the module and pass it any data that it might need. That would be the push model. Or, you can also do the pull model where a module asks another module for some piece of data.
All of this is a lot easier with the right code organization. If you start to feel like you have spaghetti or tangled interdependence, then perhaps you don't have the right code organization, or you're just a bit too shy about using require() to pull in everything a given module needs. Remember, each module will load whatever it needs itself, so you only have to worry about what you need. Load those modules and they will load what they need.
You may also want to think more in terms of objects, so you put most properties on some sort of object rather than having lots of loose, individually shared variables. You can then share a single object, and it automatically makes all the properties of that object available to whomever you shared it with.
As to your question about sharing the app object with another module, you can do that like this:
// in your app module
var express = require('express');
var app = express();
var otherModule = require('otherModule');
otherModule.setApp(app);
// now otherModule has the singleton `app` object that it can use
// in this case, otherModule must be initialized this way before it can do its job
In this example, I just used a single method, .setApp(), to set the app object. That means the module's other methods remain available to any other code that requires it.
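For completeness, the receiving side of that pattern could look something like this; the answer doesn't show otherModule's internals, so the method names below are illustrative:
// otherModule.js
var app; // module-private reference, set once at startup

exports.setApp = function (theApp) {
    app = theApp;
};

exports.doSomething = function () {
    if (!app) {
        throw new Error('otherModule: setApp() must be called before use');
    }
    // use the shared app object here
};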
This could have also been done with a constructor-like method:
// in your app module
var express = require('express');
var app = express();
var otherModule = require('otherModule')(app);
This also works because the constructor can then return an object with other methods on it if you want. If you want to be able to get access to otherModule from within other modules, but you obviously only want to initialize it once and not in those other places, then you can either do this:
var otherModule = require('otherModule')();
from those other modules, and have the constructor check whether anything was passed to it: if nothing was, this call isn't the one providing the app object, so it should just return an object with the other methods. Or, you can use the first code block above that returns all the methods from the initial require(). You're totally free to decide what to return from the require(). It can be just a constructor-like function which then returns another object when it is called. It can be just an object that has methods on it, or (because functions are objects that can also have properties) you can even return a constructor-like function that ALSO has methods on it (though this is a bit less standard).
And your constructor function can decide what to do based on what is passed to it, giving it a multitude of different behaviors based on what you pass in.
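A sketch of what that constructor-style module might look like (again illustrative; the answer only describes the idea):
// otherModule.js -- constructor-style export
var app; // shared singleton, set by the one initializing call

module.exports = function (theApp) {
    if (theApp) {
        // this is the initialization call from the main module
        app = theApp;
    }
    // every caller gets the same public API back
    return {
        doSomething: function () {
            // app is available here once the main module has initialized us
        }
    };
};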

Mutually recursive AMD modules

I am trying to write a reasonably chunky client side web app using Javascript and jQuery. In order to organise my code, I read up on Javascript module systems and decided to go with AMD modules. Currently I'm using curl.js as my module loader, but I'm not particularly wedded to that.
Unfortunately I've now run up against an issue where two of my modules need to be mutually dependent. I was expecting it to Just Work --- but what actually happens is that loading the app just seems to stall half-way through and everything stops, with no error messages.
A quick Google shows practically no mention whatsoever of AMD and mutually recursive modules. Can I actually do this, and if so, how? (Do I need to change to a different module loader?)
If not, any suggestions on an alternative module system which does support mutually recursive modules?
So after realising that an alternative name for 'mutual recursion' is 'circular dependencies', I found some references online (notably the require.js manual page on the topic).
The short summary is: no, this doesn't work. There are various ways to get round it, but it fundamentally does not Just Work.
The simplest workaround is to break the dependency chain using an explicit synchronous require() call:
define(
    ["require", "NotLoadedYet"],
    function (require, NLY)
    {
        // NLY is undefined here
        return {
            doSomething: function()
            {
                var realNLY = require("NotLoadedYet"); // fetch the real NLY on demand
                realNLY.doSomething(); // actually call the method
            }
        };
    }
);
Obviously this only works if you can guarantee that NotLoadedYet really has been loaded by the time you call the method.
The idea of using dynamic late-binding in a language which already does dynamic late-binding is pretty ew, but it does work. Sigh. There seems to be a slightly less ew technique which involves switching to requirejs's CommonJS support instead, but I don't know how that works, so I'm sticking with this.
What I'll actually do is implement a NotLoadedYetImpl module which contains the implementation, and a NotLoadedYet module which proxies it through the above mechanism. It's a shame JavaScript doesn't do getters and setters on all properties of an object, or I could do it all automatically, too...
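A rough sketch of that proxy arrangement; the module names follow the answer, and doSomething() is just an illustrative method:
// NotLoadedYet.js -- thin proxy that defers loading the real implementation
define(["require"], function (require) {
    return {
        doSomething: function () {
            // resolved lazily, by which time the circular dependency is satisfied
            var impl = require("NotLoadedYetImpl");
            return impl.doSomething.apply(impl, arguments);
        }
    };
});

// NotLoadedYetImpl.js -- the actual implementation lives here
define([], function () {
    return {
        doSomething: function () {
            // real work goes here
        }
    };
});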

Nodejs app structure

I'm wondering if I'm structuring my Node.js app appropriately for best performance. My major concern is with how I'm moving my app reference around between modules.
Basically in my app.js file I'm declaring all of my requires, libraries etc:
var app = {
    config : require('../../config.json'),
    restify : require('restify'),
    path : require('path'),
    mongo : require('mongodb'),
    model : require('./models.js'),
    step : require('step'),
    q : require('q'),
    api : require('./api_util.js'),
    underscore : require('underscore')
};
In my exports I'm passing in the entire app object. Now given my knowledge of JavaScript (you can correct me if I'm wrong), this will not create new instances of the object, it will simply pass in the object as a pointer and reference the same object in memory.
Now what I find myself doing aside from that (for ease) is in my restify library (the same can be done with Express), I'm appending the app value to the server request object like so:
app.server.pre(function (request, response, next) {
    request.app = app;
    return next();
});
Hence, on every single request, if I need quick access to any of my library declarations, config, etc., I can easily access request.app. I don't see this being an issue either; by the same logic, the object acts as a pointer back to the same memory space, so I'm not doubling memory usage or anything.
Is there a better/easier way of structuring this?
You are correct about references being passed instead of objects being duplicated. From that point of view, you are not wasting extra space when passing references to your app.
However, I would advise you against doing this: if you pass a reference to app around everywhere, what it tells me is that you don't really know what you will need in this or that module.
You should carefully plan your dependencies and know what each module will need so that you can pass the right dependencies for each module in your app.
Regarding things like underscore or mongodb, you should not be doing what you are doing. You should only pass around modules that need initialization. For things like underscore or mongodb, Node.js caches the module the first time you require() it, so you can simply call require() at the top of every module that needs it.
This will not incur any performance loss, and it will make it clearer what library each module requires.
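As a sketch of what that recommendation looks like in practice, each module requires the libraries it uses directly and is passed only the things that genuinely need initialization; the init() signature and config.mongoUrl below are made up for illustration:
// models.js -- requires its own libraries instead of reading them off request.app
var _ = require('underscore');
var mongodb = require('mongodb');

var db; // set once during initialization

// only the pieces that need initialization are passed in explicitly
exports.init = function (config, callback) {
    mongodb.MongoClient.connect(config.mongoUrl, function (err, database) {
        if (err) return callback(err);
        db = database;
        callback(null);
    });
};

exports.findUsers = function (query, callback) {
    db.collection('users').find(_.pick(query, 'name', 'email')).toArray(callback);
};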
