Content
I have a large module that I am assembling in JavaScript, which is problematic because JS currently has poor native module support.
Since my module is large, I personally do not like having one massive file with my module object e.g.
var my_module = {
func_1: function(param) {console.log(param)},
...,
func_n: function(param_1, param_2) {console.log(param_1 - param_2)}
}
where func_n ends around line number 3000. I would much rather store each of my functions (or several related functions) in separate files. I personally find this easier to manage.
This poses a problem, however: although one could use synchronous calls to load the functions, the JavaScript will still be parsed asynchronously (to my understanding). Thus several independent synchronous calls to load files are insufficient, as the mth file might call something from the nth file (n < m) that has not yet been parsed, causing an error.
Thus a solution is apparent: recursively - synchronously - load each file in the callback of the previous one.
Consider the code at the bottom of this post.
Now this isn't perfect. It makes several assumptions, e.g. that each file contains one function and that the function has the same name as the file after stripping the extension (a() is in a.js; do_something(a, b, c) is in do_something.js). It also doesn't encapsulate private variables. However, this could be worked around by adding a JSON file with these variables, attaching that JSON to the module as module.config, and then passing the config object to each of the functions in the module.
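For illustration, that config workaround could look roughly like this (the file name config.json and the default_num key are just placeholders for this sketch):
// load a JSON config synchronously and attach it to the module
var request = new XMLHttpRequest()
request.open("GET", "test_module/config.json", false) // false = synchronous request
request.send()
test_module.config = JSON.parse(request.responseText)

// each function in the module then receives the config explicitly
test_module.log_num(test_module.config.default_num)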
In addition, this still pollutes the global namespace.
Question
My question is as follows:
what is a native JS way (I do not want a library that does this for me - jQuery included) to load functions stored across many files into a cohesive module, without polluting the namespace and while ensuring that all the files are parsed before any function calls?
Code to consider (my solution)
Code directory structure:
- directory
---- index.html
---- bundle.js
---- test_module/
-------- a.js
-------- b.js
-------- log_num.js
-------- many_parameters.js
index.html
<head>
<script src="bundle.js"></script>
</head>
bundle.js
// Give JS arrays an .empty() method via the prototype
if (!Array.prototype.empty) {
    Array.prototype.empty = function () {
        return this.length == 0;
    };
}

function bundle(module_object, list_of_files, directory="") {
    if (!list_of_files.empty()) {
        var current_file = list_of_files.pop()
        var [function_name, extension] = current_file.split(".")
        var new_script = document.createElement("script")
        document.head.appendChild(new_script)
        new_script.src = directory + current_file
        new_script.onload = function() {
            module_object[function_name] = eval(function_name)
            bundle(module_object, list_of_files, directory)
            /*
            Nullify the function in the global namespace; as this is - assumedly - the
            last reference to the function, garbage collection will remove it. Thus
            modules assembled by this function - bundle(obj, files, dir) - must be
            assembled FIRST, else one risks overwriting a function in the global
            namespace and then deleting it.
            */
            eval(function_name + "= undefined")
        }
    }
}
var test_module = {}
bundle(test_module, ["a.js", "b.js", "log_num.js", "many_parameters.js"], "test_module/")
a.js
function a() {
console.log("a")
}
b.js
function b() {
console.log("b")
}
log_num.js
// it works with parameters too
function log_num(num) {
console.log(num)
}
many_parameters.js
function many_parameters(a, b, c) {
var calc = a - b * c
console.log(calc)
}
If we restrict our tools to the "native JS way", there is an import() proposal, currently at Stage 3 in the TC39 proposal process:
https://github.com/tc39/proposal-dynamic-import
System.js offers a similar approach to dynamically load modules.
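For example, with dynamic import() (and assuming the individual files were rewritten as ES modules that export their functions), assembling the module could look roughly like this:
var test_module = {}
import("./test_module/a.js")
    .then(function (mod) {
        test_module.a = mod.a // the export goes onto the module object, no globals involved
    })
    .catch(function (err) {
        console.error("failed to load a.js", err)
    })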
Have you looked at RequireJS? From the home page:
RequireJS is a JavaScript file and module loader. It is optimized for in-browser use, but it can be used in other JavaScript environments, like Rhino and Node. Using a modular script loader like RequireJS will improve the speed and quality of your code.
It has support for Definition Functions with Dependencies
If the module has dependencies, the first argument should be an array of dependency names, and the second argument should be a definition function. The function will be called to define the module once all dependencies have loaded. The function should return an object that defines the module. The dependencies will be passed to the definition function as function arguments, listed in the same order as the order in the dependency array:
That would allow you to "split up" your module into whatever arbitrary pieces you decide and "assemble" them at load time.
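As a rough sketch (the file names here mirror the test_module layout from the question and are only assumptions):
// test_module/log_num.js
define([], function () {
    return function log_num(num) {
        console.log(num)
    }
})

// test_module/module.js - assembles the pieces
define(['./a', './b', './log_num'], function (a, b, log_num) {
    // all dependencies are loaded and parsed before this function runs
    return { a: a, b: b, log_num: log_num }
})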
Related
My goal is to build a set of JavaScript tools for functional programming, to be used by our company's web developers. I've tried taking a look at the Underscore annotated source, but I'm new to RequireJS and AMD, so it's quite confusing for me.
To start I just want to have a variable that gets available when my library is imported.
In this case booleans is a module that has functions returning boolean values. For example: _myLib.booleans.isDefined(var) - returns true if var is a defined variable.
Now I have RequireJS set up, but how do I make a variable available for usage?
My main.js:
requirejs(['app/booleans'], function (booleans) {
var _myLib = {};
_myLib.booleans = booleans;
return _myLib;
});
Of course _myLib is undefined; I suppose it's because it is not assigned to any scope.
Can anyone give me some lights on building this library?
Thanks in advance.
If you want to produce a proper AMD library, you need to set it so that it calls define to define itself as an AMD module.
define(['app/booleans'], function (booleans) {
var _myLib = {};
_myLib.booleans = booleans;
return _myLib;
});
If you called your file myLib.js and provided a good configuration to RequireJS for it to find it, then, when you want to use it, you can do:
require(["myLib"], function (myLib) {
myLib.booleans.isDefined("moo");
});
Or in another module:
define(["myLib"], function (myLib) {
myLib.booleans.isDefined("blah");
});
This is my second weekend playing with Node, so this is a bit of a newbie question.
I have a js file full of common utilities that provide stuff that JavaScript doesn't. Severely clipped, it looks like this:
module.exports = {
Round: function(num, dec) {
return Math.round(num * Math.pow(10,dec)) / Math.pow(10,dec);
}
};
Many other custom code modules - also included via require() statements - need to call the utility functions. They make calls like this:
module.exports = {
Init: function(pie) {
// does lots of other stuff, but now needs to round a number
// using the custom rounding fn provided in the common util code
console.log(util.Round(pie, 2)); // ReferenceError: util is not defined
}
};
The node.js file that is actually run is very simple (well, for this example). It just require()'s in the code and kicks off the custom code's Init() fn, like this:
var util = require("./utilities.js");
var customCode = require("./programCode.js");
customCode.Init(Math.PI);
Well, this doesn't work; I get a "ReferenceError: util is not defined" coming from the customCode. I know everything in each required file is "private" and this is why the error is occurring, but I also know that the variable holding the utility code object has GOT to be stored somewhere, perhaps hanging off of global?
I searched through global but didn't see any reference to utils in there. I was thinking of using something like global.utils.Round in the custom code.
So the question is, given that the utility code could be referred to as anything really (var u, util, or utility), how in heck can I organize this so that other code modules can see these utilities?
There are at least two ways to solve this:
If you need something from another module in a file, just require it. That's the easy one.
Provide something which actually builds the module for you. I will explain this in a second.
However, your current approach won't work, as the node.js module system doesn't provide globals the way you might expect from other languages. Except for the things exported with module.exports you get nothing from the required module, and the required module doesn't know anything about the requiring module's environment.
Just require it
To avoid the gap mentioned above, you need to require the other module beforehand:
// -- file: ./programCode.js
var util = require(...);
module.exports = {
    Init: function(pie) {
        console.log(util.Round(pie, 2));
    }
};
requires are cached, so don't think too much about performance at this point.
Keep it flexible
In this case you don't directly export the contents of your module. Instead, you provide a constructor that will create the actual content. This enables you to give some additional arguments, for example another version of your utility library:
// -- file: ./programCode.js
module.exports = {
    create: function(util) {
        return {
            Init: function(pie) {
                console.log(util.Round(pie, 2));
            }
        }
    }
};
// --- other file
var util = require(...);
var myModule = require('./module').create(util);
As you can see, this will create a new object each time you call create. As such, it will consume more memory than the first approach, so I recommend just require()ing things.
First a bit of history: we have an engine which is made up of many JavaScript files which are essentially modules. Each of these modules returns a single class that is assigned to the global scope, although under a specified namespace.
The engine itself is used to display eLearning content, with each eLearning course having slightly different needs, which is why we include JavaScript files into the page based on the necessary functionality. (There is only one entry page.)
I've been trying to weigh up whether it's worth changing to AMD, require.js and r.js, or whether it's better to stay with our current system, which includes everything required on the page and minifies it into one script.
One of my biggest problems with moving to AMD would be that it seems harder to extend a class easily. For example, sometimes we have to adjust the behaviour of the original class slightly, so we add another script include on the page that extends the original class: it copies the original prototype, executes the original function that's being overridden with apply, and then does whatever additional code is required.
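For concreteness, that pattern looks roughly like this (Player, render and the ENGINE namespace are made-up names for this sketch):
// extra script included on the page for one particular course
(function (ns) {
    var originalRender = ns.Player.prototype.render
    ns.Player.prototype.render = function () {
        originalRender.apply(this, arguments) // run the original behaviour first
        // ...then do whatever additional course-specific work is required
    }
})(window.ENGINE)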
Can you extend an AMD module without adapting the original file? Or am I missing the point and we're best staying with what we're doing at the moment?
I recently started a project using RequireJS, and the method I use to extend underscore boils down to something like this:
Relevant Directory Structure:
/scripts
/scripts/underscore.js
/scripts/base/underscore.js
The real underscore library goes to /scripts/base/underscore.js.
My extensions go in /scripts/underscore.js.
The code in /scripts/underscore.js looks like this:
define(['./base/underscore'], function (_) {
    'use strict';
    var exports = {};
    // add new underscore methods to exports
    _.mixin(exports); // underscore's method for adding methods to itself
    return _; // return the same object as returned from the underscore module
});
For a normal extension, it could look more like this:
define(['underscore', './base/SomeClass'], function (_, SomeClass) {
    'use strict';
    _.extend(SomeClass.prototype, {
        someMethod: function (someValue) {
            return this.somethingOrOther(someValue * 5);
        }
    });
    return SomeClass;
});
Note on underscore: Elsewhere I used the RequireJS shim-config to get underscore to load as an AMD module, but that should have no effect on this process with non-shimmed AMD modules.
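For reference, that shim configuration would look something like this (the baseUrl and path are assumptions matching the directory structure above):
requirejs.config({
    baseUrl: 'scripts',
    shim: {
        // treat the plain (non-AMD) underscore build as a module that exports the global _
        'base/underscore': { exports: '_' }
    }
})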
You can have modules that contain your constructor functions. When these modules get included, they are ready for use; you can then create objects out of them afterwards.
example in require:
//construction.js
define(function(){
    //expose a constructor function
    return function(){
        this....
    }
});

//then in foo.js
define(['construction'], function(Construction){
    var newObj = new Construction(); //one object using the constructor
});

//then in bar.js
define(['construction'], function(Construction){
    //play with Construction's prototype here, then use it
    var newObj = new Construction();
});
Background
I'm building a JavaScript-based application that works differently on mobile and desktop devices. However, except for the DOM manipulation, most of the code is common between both platforms, so I have structured all the files like:
* foo.core.js
* foo.mobile.js
* foo.web.js
I am hoping to leverage object-oriented techniques to write cleaner code.
Problem:
I have two JavaScript files, with classes
File 1:
function ClassA()
{}
ClassA.prototype.foo = function(){};
GreatGrandChildA.prototype = new GrandChildA(); // this is where the error is
function GreatGrandChildA ()
{}
File 2:
ChildA.prototype = new ClassA();
function ChildA () // ChildA inherits ClassA
{}
GrandChildA.prototype = new ChildA()
function GrandChildA () // GrandChildA inherits ClassA
{}
Normally, in a language like C++, I would forward declare GrandChildA right in File 1. I would like to know how to do that in JavaScript.
Edit:
If I make a single file containing all four classes - in the same order in which they are loaded, the example works exactly as expected:
http://jsfiddle.net/k2XKL/
Simple logic for unordered js file loading:
File1:
// ClassB: inherits from ClassA
(function ClassB_Builder() {
    if(window.ClassB) return;   // ClassB is already defined
    if(!window.ClassA) {        // ClassA is not defined yet
        setTimeout(ClassB_Builder, 0); // schedule class building for later
        return;
    }
    ClassB = function() {
    }
    ClassB.prototype = new ClassA;
    ClassB.prototype.constructor = ClassB; // can be important for inheritance!!!
})();
File2:
// ClassA: base class
(function ClassA_Builder() {
    ClassA = function() {
    }
})();
// ClassC: inherits from ClassB
(function ClassC_Builder() {
    if(window.ClassC) return;   // ClassC is already defined
    if(!window.ClassB) {        // ClassB is not defined yet
        setTimeout(ClassC_Builder, 0); // schedule class building for later
        return;
    }
    ClassC = function() {
    }
    ClassC.prototype = new ClassB;
    ClassC.prototype.constructor = ClassC; // can be important for inheritance!!!
})();
I assume that on your HTML page, you import File 1 and then File 2.
In File 1, you should see an exception because "GrandChildA" is undefined. Its function declaration has not been processed yet, because File 2 has not loaded.
In File 2, you are able to do:
ChildA.prototype = new ClassA();
function ChildA () // ChildA inherits ClassA
{}
because the JavaScript runtime hoisted your named function "ClassA" to the top of the file, so it is already defined by the time ChildA.prototype = new ClassA(); executes.
Please read more about function hoisting, and whether you should rely on it in such situations, at http://www.adequatelygood.com/2010/2/JavaScript-Scoping-and-Hoisting
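A minimal illustration of the difference:
// within a single file, hoisting makes this work:
ChildA.prototype = new ClassA() // fine - the declaration of ClassA below is hoisted
function ChildA() {}
function ClassA() {}

// but hoisting cannot reach into a file that has not loaded yet, which is why
// GreatGrandChildA.prototype = new GrandChildA() in File 1 throws a ReferenceError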
The most sane way to accomplish what you want is to make two separate versions of your source code. You're going to want to minify and obfuscate your code and merge all the source files anyway, so it would make sense to create a build script (Python would be a great language for a simple build script) that you configure to merge the mobile-specific files into one file (plus the files that both versions share) and the non-mobile-specific files into another (plus the shared files). In addition, you could later add automatic obfuscation and gzipping. Then you can serve the appropriate source version to the appropriate client.
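If you would rather keep the build step in JavaScript, a minimal Node.js sketch of such a script could look like this (the file names follow the foo.core/foo.web/foo.mobile layout above; the output names are made up):
var fs = require('fs')

function buildBundle(files, outFile) {
    // concatenate the given source files into a single bundle
    var merged = files.map(function (f) {
        return fs.readFileSync(f, 'utf8')
    }).join('\n')
    fs.writeFileSync(outFile, merged)
}

buildBundle(['foo.core.js', 'foo.web.js'], 'bundle.web.js')
buildBundle(['foo.core.js', 'foo.mobile.js'], 'bundle.mobile.js')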
As mentioned in the comments, the requested functionality is not possible.
This is not only a technical problem but also an indication that the application is not structured appropriately - the design should be improved.
As it stands, there is a kind of circular dependency that should be avoided.
For comparison, you mention that you would solve this in C++ with a forward declaration of the superclass. But that is not possible either: in C++, in order to declare a subclass you need to include the file with the declaration of the superclass, and you cannot solve the problem when there are circular dependencies.
I have some constants in JavaScript that I'd like to reuse in several files while saving typing, reducing bugs from mistyping, keeping runtime performance high, and being useful on either the node.js server scripts or on the client web browser scripts.
example:
const cAPPLE = 17;
const cPEAR = 23;
const cGRAPE = 38;
...(some later js file)...
for...if (deliciousness[i][cAPPLE] > 45) ...
Here are some things I could do:
1. Copy/paste the const list to the top of each file where it's used. Oh, yuck. I'd rather not. This is compatible with keeping the constant names short and simple, but it violates DRY and invites all sorts of awful bugs if anything in the list changes.
2. Constant list ---> const.js
In the browser, this is FINE ... the script gets fed in by the HTML file and works fine.
but on node.js, the require mechanism changes the constant names, interfering with code reuse and requiring more typing, because of how require works....
AFAIK this doesn't work, by design, in node.js, for any const.js, without using globals:
require('./const.js');
for...if...deliciousness[i][cAPPLE] > 45 ...;
This is the node.js way:
(... const.js ....)
exports.APPLE = 17;
(... dependency.js ... )
var C = require('./const.js');
for...if...deliciousness[i][C.APPLE] > 45.....
So I would either have to have two files of constants, one for the node.js requires and one for the browser, or I would have to go with something further down the list...
3. Make the constants properties of an object to be imported ... this still needs two files, since the node.js way of importing doesn't match the browser. It also makes the names longer and probably takes a little more time to do the lookups, which, as I've hinted, may occur in loops.
4. External constant list, internal adapter ... read the external constants, however stored, into an internal structure in each file instead of trying to use the external list directly.
const.js
exports.cAPPLE = 17
browser.js
const cAPPLE = exports.cAPPLE;
...code requiring cAPPLE...
node.js
CONST = require('./const.js')
const cAPPLE = CONST.cAPPLE;
...code requiring cAPPLE...
This requires a one-time hit per file to write the code that extracts the constants back out, and so would duplicate a bunch of code over and over - a slightly more evolved cut and paste.
It does allow the code requiring cAPPLE to continue to work with the short constant names.
Are there any other solutions, perhaps a more experienced JavaScripter might know, that I might be overlooking?
module.exports = Object.create({}, {
    "foo": { value: "bar", writable: false, enumerable: true }
});
The properties are not writable. This works in strict mode, unlike "const".
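Usage then looks like this (the ./const.js path is an assumption):
'use strict';
var constants = require('./const.js');
console.log(constants.foo); // "bar"
constants.foo = "baz"; // throws a TypeError in strict mode, silently ignored otherwise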
I would just make them global keys:
...(module const.js)...
global.APPLE = 17;
global.PEAR = 23;
global.GRAPE = 38;
...(some later js file)...
require('./const.js'); // just runs const.js, which defines the globals
for (var i = 0; i < something.length; i++) {
    if (deliciousness[i][global.APPLE] > 45) { blah(); }
}
They wouldn't be enforced constants, but if you stick to the ALL_CAPS naming convention for constants it should be apparent that they shouldn't be altered. And you should be able to reuse the same file for the browser if you include it and use it like so:
<script>var global = {};</script>
<script src="const.js"></script>
<script>
    if (someVar > global.GRAPE) { doStuff(); }
</script>
You can make an object unwritable using Object.freeze.
var configs = {
    ENVIRONMENT: "development",
    BUILDPATH: "./buildFiles/",
}
Object.freeze(configs);
module.exports = configs;
Then you can use it as a constant:
var config = require('./configs');
// config.BUILDPATH will act as a constant and will not be writable.