I am trying to create an extendable JavaScript library with submodules in different files. I am using the Module Pattern based on Ben Cherry's article.
module.js
var SERVICES = (function () {
    var service = {},
        privateVariable = 1;

    function privateMethod() {
        //
    }

    service.moduleMethod = function () {
        //
    };

    return service;
}());
submodule.js
SERVICES.submodule = (function (service) {
    var submodule = {},
        privateVariable = 1;

    submodule.moduleMethod = function () {
        //
    };

    return submodule;
}(SERVICES));
What I want to achieve is the following. module.js is the module I want to act as a library. Submodules live in separate files, which is why I am using the Module Pattern. Clients will be able to write new submodules for this library (module.js). I want a single script that exposes SERVICES with all submodules available; it will be included in the front end just once, and nothing should need to change when someone writes a new submodule. The library should simply pick it up.
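For context, Ben Cherry's article also describes a "loose augmentation" variant that removes the load-order dependency between the library and its submodules; a minimal sketch for a submodule file (assuming browser globals, with module.js taking the same window.SERVICES || {} argument so the two files can load in either order):
var SERVICES = (function (service) {
    // loose augmentation: create SERVICES if module.js has not loaded yet
    var submodule = {},
        privateVariable = 1;

    submodule.moduleMethod = function () {
        //
    };

    service.submodule = submodule;
    return service;
}(window.SERVICES || {}));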
I have a project that has old-school, pre-module style JS for use in a browser. The JS consists of hundreds of components built like this:
Root:
(function() {
    "use strict";

    window.App = {};
    App.SomeNameSpace = {};
    App.SomeOtherNameSpace = {};
})();
ExampleComponent:
(function() {
    "use strict";

    App.SomeNameSpace.ExampleComponent = function () { };

    App.SomeNameSpace.ExampleComponent.prototype.func = function () {
        // Do some random things here
    };
})();
ExampleComponent2:
(function() {
    "use strict";

    App.SomeNameSpace.ExampleComponent2 = function () { };

    App.SomeNameSpace.ExampleComponent2.prototype.func = function () {
        // Do some other random things here
    };
})();
And the file layout would be something like this:
some-namespace/component.0.js
some-namespace/component.1.js
some-other-namespace/other-component.0.js
root-component.js
And the bundling process would be to concatenate all scripts in a single folder (namespace), minify the result, and output it as a single bundle (some-namespace.js, for example). Then the website would output the scripts in the correct order so that root-component.js runs first (so that all namespaces are defined), and then the bundles with the actual "classes" would follow.
There are no imports and no ES5/ES6 features in there beyond some APIs that are polyfilled; it relies on jQuery and some other external libs.
Also, it's important to note that each namespace has its own bundle and not all of them get used on each page. Each app page picks and chooses the components it needs; only the root script is loaded on every page.
How can I "webpack-ify" this in the least painful way possible? Since there are no imports, there are no real entry points and everything is defined globally using namespaces.
You only need to change the root, to expose the App variable on window.
(function() {
    "use strict";
    var App = window.App || {};
    App.SomeNameSpace = {};
    App.SomeOtherNameSpace = {};
    window.App = App;
    // ----^ exposing the App variable on window
})();
Then create an additional file, let's call it loader.js, that will require all the files in the correct order.
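A minimal sketch of what that loader.js could look like in a webpack build (the paths are placeholders for your actual layout, and require.context is webpack-specific):
// loader.js -- sketch only; adjust the paths to your project
require('./root-component.js'); // defines window.App and the namespaces

// Pull in every component file of a namespace automatically,
// so new components are picked up without editing this file.
var someNamespace = require.context('./some-namespace', true, /\.js$/);
someNamespace.keys().forEach(someNamespace);
If each namespace should remain its own bundle, you can instead write one such loader per namespace and list them as separate entry points in the webpack configuration.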
I'm wondering how helper dependencies are managed in compiled dust templates, specifically when they are used on the client: is the helper method bundled with the compiled dust template? What about dependencies that might not be supported on the client? Or if that dependency has multiple other dependencies?
Here's a trivial example of a dust template I'd like to be able to use on the client:
// foo.dust
{#myHelper}
<div>{foo}{bar}</div>
{/myHelper}
// my-helper.js
const isomorphicDep = require('isomorphic-dep');
const nodeDep = require('node-dep');

module.exports = function(dust) {
    dust.helpers.myHelper = function(chunk, context, bodies, params) {
        // do some stuff using deps
        let foo = nodeDep.getFoo();
        let bar = isomorphicDep.getBar(params.someInput);
        return chunk.render(bodies.block, context.push({ foo, bar }));
    };
};
Thanks
A compiled template contains only instructions for how to render; it does not include the helper code itself.
For example, a simple template like this:
{#helper}foo{/helper}
Compiles into these two instruction sets:
function body_0(chk, ctx) {
    return chk.h("helper", ctx, {
        "block": body_1
    }, {}, "h");
}

function body_1(chk, ctx) {
    return chk.w("foo");
}
When the template is rendered, it asks Dust to look for the helper named helper and execute it (in the body_0 function). The code for helper is not included with the template.
So on the client, you'll need to include a file containing the helper that loads the correct isomorphic dep (like node-fetch vs whatwg-fetch, for example).
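For example, a client entry point might register the same helper with a browser-safe dependency swapped in (a sketch only; 'browser-dep' and the other module names are placeholders carried over from the question, not real packages):
// client-helpers.js
const dust = require('dustjs-helpers');
const isomorphicDep = require('isomorphic-dep');
const browserDep = require('browser-dep'); // client-safe stand-in for node-dep

dust.helpers.myHelper = function(chunk, context, bodies, params) {
    // same logic as the server-side helper, minus the Node-only dependency
    let foo = browserDep.getFoo();
    let bar = isomorphicDep.getBar(params.someInput);
    return chunk.render(bodies.block, context.push({ foo, bar }));
};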
Consider this example from the RequireJS website:
define(function (require) {
    var foo = require('foo');

    // Define this module as exporting a function
    return function () {
        foo.doSomething();
    };
});
My question is, as 'foo' is loaded asynchronously, how does the JavaScript below it not execute before it has loaded?
This is explained in http://requirejs.org/docs/api.html#cjsmodule and http://requirejs.org/docs/whyamd.html#sugar.
Require.js will at some point (before running the function) look at the string representation of the function, find all require calls to determine the dependencies and load them.
To make this easier, and to make it easy to do a simple wrapping around CommonJS modules, this form of define is supported, sometimes referred to as "simplified CommonJS wrapping":
define(function (require) {
    var dependency1 = require('dependency1'),
        dependency2 = require('dependency2');

    return function () {};
});
The AMD loader will parse out the require('') calls by using Function.prototype.toString(), then internally convert the above define call into this:
define(['require', 'dependency1', 'dependency2'], function (require) {
    var dependency1 = require('dependency1'),
        dependency2 = require('dependency2');

    return function () {};
});
This allows the loader to load dependency1 and dependency2 asynchronously, execute those dependencies, then execute this function.
I am working on converting part of a large Backbone application to use TypeScript. Some of the core JS files have been converted to TypeScript and are referenced by other non-TS JS files.
Before we converted, our RequireJS modules would be of the form:
View.js
define([backbone, deps...], function (backbone, deps...) {
    return backbone.View.extend({
        render: function () {...}
    });
});
Which in the consuming code we could do:
define([View, ...], function (View, ...) {
    var view = new View({...});
    view.render();
});
Using TypeScript external modules and exports, I can compile a file containing a TypeScript class to AMD, but the generated code sets a property on an exports object as the result, which means the consuming code now looks like this, given the file name is View.js:
var view = new View.View();
This is annoying, and it also means we can't maintain backward compatibility with the legacy non-TS code without (admittedly minor) changes. Is there any way to get around this?
EDIT:
Relevant generated code (the define callback)
function (require, exports) {
    var Backbone = require('backbone');
    var View = (function (_super) {
        __extends(View, _super);
        function View(options) {
            _super.call(this, options);
        }
        return View;
    })(Backbone.View);
    exports.View = View;
}
The aim is to have some core components in TypeScript so that we can use TS going forward, and have the older libraries require the compiled JS components.
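For reference, TypeScript's export assignment (export =) makes the class itself the value returned by the emitted AMD module, which keeps the old new View({...}) call sites working; a minimal sketch (typings simplified, adjust to your setup):
// View.ts -- sketch using export assignment
import Backbone = require('backbone');

class View extends Backbone.View<Backbone.Model> {
    constructor(options?: Backbone.ViewOptions<Backbone.Model>) {
        super(options);
    }

    render() {
        // render logic here
        return this;
    }
}

// With export assignment the emitted define callback returns View itself,
// so legacy code can keep writing `new View({...})`.
export = View;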
I'd like to require a module and somehow pass in the current module, or something like that, such that the module being required has the properties of the module requiring it.
For example if I have a file I'm requiring:
(function(global) {
    console.log(this.exists);
}(this));
And am requiring it like so:
this.exists = "I exist.";
var myFile = require("my-file"); // Somehow make require pass in 'this'.
The file being required should be able to see this.exists since I've passed this into the require function somehow. I'm not sure if this is possible. I would think you would need to fiddle with the different module objects.
The one constraint of this is that the module being required can't have any nodejs specific things, like code to export it on the module. It has to stay the same as the way I've written it.
Edit:
Turns out there is no way to do this exactly the way I want to. There have been some awesome suggestions on how to do this in other ways, though.
I had to do something similar to this once... the best way I figured out was through a level of indirection.
Part1:
define(function() {
    "use strict";

    function init() {
        console.log("this = " + this);
    }

    return init;
});
Part2:
var myFileInit = require("my-file");
var myFile = myFileInit.call(this); // the module returns init itself, so call it directly
Edit: Another possibility
Create a wrapper for the original module:
// my-file-wrapper
define(["my-file"], function(myFunc) {
    "use strict";

    function init() {
        myFunc.call(this);
    }

    return init;
});
// Elsewhere
var myFileWrapper = require("my-file-wrapper");
var myFile = myFileWrapper.call(this); // the wrapper module returns init, so call it directly
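A quick usage sketch of how the context flows through that wrapper (same hypothetical file names as above, assuming the same AMD/CommonJS setup as the rest of the answer):
// caller.js
this.exists = "I exist.";

var myFileWrapper = require("my-file-wrapper");
myFileWrapper.call(this);
// init runs with `this` bound to the caller's context, and
// myFunc.call(this) forwards it, so the caller's properties are visible inside my-file.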