RequireJS: apply plugin to module id dynamically

I'm using the requirejs-babel plugin which requires prepending 'es6!' to all module ids that need babel transpilation.
define(['es6!some-es6-module'], function(module) {
// ...
});
Is there an API in RequireJS that would allow me to inspect a module id and prepend the plugin id as-needed? For example, if I wanted to apply 'es6!' to all module ids in a specific directory?
Ultimately I need to be able to write defines like define(['some-es6-module'], ...) and have the es6! prefix added automatically depending on what the module id is.
Not looking for information on SystemJS or gulp tasks that do the transpilation ahead of time, etc.
The exact module ids are not known at configuration time; I just know that modules in certain locations/directories will need es6!.
This needs to work in the browser, at runtime.

I am not 100% sure of your overall objective (do you want the es6 prefix saved to the module ID permanently, or always auto-added?), but you may be able to use RequireJS mapping to substitute module IDs for defined modules. For example:
requirejs.config({
    map: {
        // * - for all modules that require these, do this
        '*': {
            'some-es6-module': 'es6!some-es6-module'
        }
    }
});
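If the handful of module ids that need transpiling is known, the same pattern simply extends to more entries; the module ids below are made up purely for illustration:
requirejs.config({
    map: {
        '*': {
            // hypothetical modules living in a directory of ES6 sources
            'es6-src/module-a': 'es6!es6-src/module-a',
            'es6-src/module-b': 'es6!es6-src/module-b'
        }
    }
});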
However, considering your use case you may need something more involved than this, as mapping assumes you have actual alternative versions of files and is generally intended for that purpose.
A more complicated solution, which I assume you are looking to avoid, would be to loop over your files before optimising them with r.js, loading and editing them via Node. It could get a little messy!
var config = requirejs.s.contexts._.config;
var needBabel = ['some-es6-module', 'another-module-name', 'another'];

for (var property in config.paths) {
    if (config.paths.hasOwnProperty(property) && needBabel.indexOf(property) > -1) {
        // load the module in Node
        // fs.readFileSync(__dirname + config.paths[property] + '.js');
        // dynamically modify this file with text replacement
        // save this file via Node again
    }
}
// run Require JS optimiser
// undo everything you've just done when optimisation is complete
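A minimal sketch of the load/edit/save step inside that loop, assuming a simple string replacement is good enough for your defines (the helper name is made up):
var fs = require('fs');

// hypothetical helper: rewrite one dependency id inside a file so it is
// loaded through the es6! plugin, then save the file back before running
// the r.js optimiser
function addEs6Prefix(filePath, moduleId) {
    var source = fs.readFileSync(filePath, 'utf8');
    var rewritten = source.split("'" + moduleId + "'").join("'es6!" + moduleId + "'");
    fs.writeFileSync(filePath, rewritten, 'utf8');
}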

I ended up overriding the load method. The override uses the standard load for modules with mapped paths; otherwise it uses the es6 (requirejs-babel) plugin to load the module.
require.standardLoad = require.load;

require.load = function (context, moduleName, url) {
    var config = requirejs.s.contexts._.config;

    // modules with an explicit path entry go through the normal loader
    if (moduleName in config.paths) {
        return require.standardLoad(context, moduleName, url);
    }

    // everything else is run through the requirejs-babel (es6) plugin
    require(['es6'], function (es6) {
        es6.load(
            moduleName,
            require,
            {
                fromText: function (text) {
                    require.exec(text);
                    context.completeLoad(moduleName);
                }
            },
            {});
    });
};
Here it is in action: https://gist.run/?id=7542e061bc940cde506b

Related

Are JavaScript classes acceptable Node modules?

Bear with me as I lead you through the process that elicited my question.
I'm working on a CLI app in node and I'm using objects to encapsulate my business logic using this pattern:
// my-project/lib/widget/myobject.js
var MyObject = function(x) {
    this.x = x;
};

MyObject.prototype.getX = function() {
    return this.x;
};

module.exports = MyObject;
I'm also testing these objects:
// my-project/test/lib/widget/myobject.spec.js
var MyObject = require('../../../lib/widget/myobject.js');
describe('MyObject', function() {
...
});
At one point I was unhappy with the naming and directory structure I had chosen. I found myself tediously counting those parent directory references (..) in several spec files when rewriting the relative paths. I figured there must be an easier way to reference a root directory containing these object definitions.
One of the recommendations I found here suggested "putting application-specific modules into node_modules".
Now, as I understand modules, they are the packages I download from npm and use in my project. They contain libraries of useful things with a single API exported to me when I call require. This is not how I view the simple single-purpose classes built specifically for the internal use of my application.
If you've stuck with me this far, thank you! Here is my question:
How do I make the internals of my application more "modular" so it properly follows the intent of the Node module system while remaining object oriented?
I'm not sure how suitable this is for production or for modules you plan on distributing, but in your main file you could add this:
process.env.NODE_PATH = __dirname;
require('module').Module._initPaths();
This would let you always require modules relative to the folder containing your main file. For example, if you had a file at:
library/some_file.js
then in tests/some_other_file.js you could just do:
require('library/some_file');
Or as an alternative you could add this in your main file:
global.__base = __dirname + '/';
and then in your other modules require using:
var MyObject = require(__base + 'lib/widget/myobject');

Using webpack to build a dependency before compiling the build

I'm using inline styles & my theme comes from a file called themes.json, which I create & inject into my build folder before calling webpack. I'd like webpack to handle this, too.
My first attempt was to use a plugin:
compiler.plugin('compilation', async (compilation, callback) => {
    const themes = {}; // empty for this example
    compilation.assets['themes.json'] = {
        source: function() {
            return themes;
        },
        size: function() {
            return themes.length;
        }
    };
    callback();
});
However, since plugins run asynchronously or in parallel, I run into a race condition: my index.js, which has a const themes = require('../build/themes.json'), tries to require the file before it exists.
How can I make sure themes.json exists and is usable within my build? I see 3 possibilities:
There's a way to run plugins in serial that I don't know about
I somehow use a loader to perform this
I write a plugin that looks for require('../build/themes.json') and, when it finds it, creates and injects the themes (a rough sketch of this idea follows below).
Any help on the right way to do this?
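For what it's worth, a very rough sketch of the third possibility might look like the plugin below. The plugin name, the output path, and the idea of writing the file to disk synchronously in apply() (so it already exists when webpack starts resolving require() calls) are all assumptions, not a tested solution:
// themes-plugin.js (hypothetical)
const fs = require('fs');
const path = require('path');

function ThemesPlugin(options) {
    this.options = options || {};
}

ThemesPlugin.prototype.apply = function (compiler) {
    // build the themes object however you normally would
    const themes = {};
    // assumes the build/ directory already exists
    const outFile = path.resolve(__dirname, 'build/themes.json');
    // apply() runs while the config is processed, before any compilation,
    // so the file is on disk before require('../build/themes.json') is resolved
    fs.writeFileSync(outFile, JSON.stringify(themes));
};

module.exports = ThemesPlugin;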

Writing commonjs module and load it using require (not using relative path)

What is the best practice for referencing a local CommonJS module without using a relative path, as below?
var custMod = require('customModule');
custMod(true);
custMod.save('foo');
Is there any reference for building a module like this?
If I write the module like below, I get undefined when I call custMod.save(12);
module.exports = customModule;

function customModule(mode) {
    var mode = 1;
    return {
        save: function (value) {
            console.log(mode, value);
        },
        get: function (id) {
            console.log(mode, id);
        }
    };
}
You can add a path for require to check using
require.paths.push('/my/path');
or
require.main.paths.push('/my/path');
depending on your Node version.
Then if customModule.js exists at /my/path/customModule.js, you can just use
require('customModule');
Do note though, you'd need to do this on every module that you intend to use this method on.
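Put together, the usage might look roughly like this (assuming /my/path/customModule.js exists; whether you push onto require.paths or require.main.paths depends on the Node version, as noted above):
// entry point of the app
require.main.paths.push('/my/path');

var custMod = require('customModule'); // resolves to /my/path/customModule.js
custMod(true);
custMod.save('foo');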
I wish node made this easier to be honest. One possibility:
project_root
`--node_modules
|--npm-module-1
|--npm-module-2
|--... (etc)
`--lib
|--my-module.js
`--my-other-module.js
With the above you can then type require('lib/my-module') from anywhere in your project. (Just make sure you never install an npm module named lib :) Another possibility:
project_root
|--node_modules
| |--npm-module-1
| |--npm-module-2
| `--... (etc)
`--lib
`--node_modules
|--my-module.js
`--my-other-module.js
With the above you can then type require('my-module'), but only from files under project_root/lib/.
An advantage of the former approach is that require('lib/my-module') makes it super easy, at a glance, to tell which modules are local to the project. However, the latter is less typing.
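For reference, the corresponding require calls for the two layouts would look roughly like this:
// first layout: node_modules/lib/my-module.js
var myModule = require('lib/my-module');        // works from anywhere in the project

// second layout: lib/node_modules/my-module.js
var myOtherModule = require('my-other-module'); // only from files under project_root/lib/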

Webpack: unable to require automatically resolved dependencies using variables

While working on a Web app using Webpack to manage JavaScript dependencies, I stumbled upon the problem I'm going to describe.
Loading dependencies passing strings to require() works beautifully:
// main.js
var jQuery = require('jquery');
Here, jquery is installed with Bower, and Webpack is correctly configured to automatically resolve Bower modules.
Now, I'm working on the problem of conditionally loading modules, with particular regard to the situation where modules have to be downloaded from a CDN, or from the local server if the CDN fails. I use scriptjs to asynchronously load from the CDN, by the way. The code I'm writing is something like this:
var jQuery = undefined;
try {
    jQuery = require('jquery-cdn');
} catch (e) {
    console.log('Unable to load jQuery from CDN. Loading local version...');
    require('script!jquery');
    jQuery = window.jQuery;
}
// jQuery available here
and this code works beautifully as well.
Now, since I obviously have a lot of dependencies (Handlebars, Ember, etc.) that I want to try to load from a CDN first, this code starts to get a little redundant, so the most logical thing I try to do is to refactor it out into a function:
function loadModule(module, object) {
    var lib = undefined;
    try {
        lib = require(module + '-cdn');
    } catch (e) {
        console.log('Cannot load ' + object + ' from CDN. Loading local version...');
        require('script!' + module);
        lib = window[object];
    }
    return lib;
}
var jQuery = loadModule('jquery', 'jQuery');
var Handlebars = loadModule('handlebars', 'Handlebars');
// etc...
The problem is that Webpack behaves in a particular way when it encounters expressions inside require statements, and that behaviour hinders my attempts to load modules in the way described above. In particular, when an expression is used inside require, Webpack "tries to include all files that are possible with your expression". The net effect is a huge pile of error messages when I try to run Webpack with the above code.
Though the linked resources suggest explicitly declaring the path of the JavaScript files to include, what I fail to see is how to do the same thing when I cannot, or don't want to, pass a precise path to require, but rather want to use the automatically resolved modules, as shown.
Thanks all
EDIT:
I still don't know how to use expressions to load those scripts; however, I designed a workaround. Basically, the idea is to explicitly write the require('script') calls inside callback functions, and then dynamically call those functions when it's time. More precisely, I prepared a configuration file like this:
// config.js
'use strict';

module.exports = {
    'lib': {
        'jquery': {
            'object': 'jQuery',
            'dev': function() { require('script!jquery'); },
            'dist': function() { return require('jquery-cdn'); },
            'cdn': '//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js'
        },
        'handlebars': {
            // ...
        }
    }
};
Inside my main code I then define an array of resources to load, like:
var config = require('./config.js');
var resources = [ config.lib.jquery, config.lib.handlebars, ... ];
And then, when I have to load the development version or the distribution version, I dynamically call:
// Inside some kind of cycle
// resource = resources[index]
try {
    window[resource.object] = resource.dist();
} catch (e) {
    console.log('Cannot load ' + resource.object + ' from CDN. Loading local version...');
    resource.dev();
}
Here is a more complete example of this in action.

NodeJS and Javascript (requirejs) dependency injection

I am currently using requirejs to manage module js/css dependencies.
I'd like to discover the possibilities of having node do this via a centralized config file.
So instead of manually doing something like
define([
    'jquery',
    'lib/somelib',
    'views/someview'
], ...)
within each module.
I'd have Node inject the dependencies, i.e.
require('moduleA').setDeps('jquery','lib/somelib','views/someview')
Anyway, I'm interested in any projects looking at dependency injection for node.
thanks
I've come up with a solution for dependency injection. It's called injectr, and it uses Node's vm library to replace the default functionality of require when including a file.
So in your tests, instead of require('libToTest'), use injectr('libToTest', { 'libToMock': myMock });. I wanted to make the interface as straightforward as possible, with no need to alter the code being tested. I think it works quite well.
It's just worth noting that injectr files are relative to the working directory, unlike require which is relative to the current file, but that shouldn't matter because it's only used in tests.
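Spelled out as a small test file, that usage might look roughly like this (the module names are the placeholders from above, and myMock is whatever stub your test needs):
// test file (injectr resolves paths relative to the working directory)
var injectr = require('injectr');

var myMock = { doSomething: function () { return 'mocked'; } };

// loads libToTest, but any require('libToMock') inside it receives the mock
var libToTest = injectr('libToTest', { 'libToMock': myMock });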
I've previously toyed with the idea of providing an alternate require to make a form of dependency injection available in Node.js.
Module code
For example, suppose you have following statements in code.js:
fs = require('fs');
console.log(fs.readFileSync('text.txt', 'utf-8'));
If you run this code with node code.js, then it will print out the contents of text.txt.
Injector code
However, suppose you have a test module that wants to abstract away the file system.
Your test file test.js could then look like this:
var origRequire = global.require;
global.require = dependencyLookup;
require('./code.js');

function dependencyLookup (file) {
    switch (file) {
        case 'fs': return { readFileSync: function () { return "test contents"; } };
        default: return origRequire(file);
    }
}
If you now run node test.js, it will print out "test contents", even though it includes code.js.
I've also written a module to accomplish this, it's called rewire. Just use npm install rewire and then:
var rewire = require("rewire"),
myModule = rewire("./path/to/myModule.js"); // exactly like require()
// Your module will now export a special setter and getter for private variables.
myModule.__set__("myPrivateVar", 123);
myModule.__get__("myPrivateVar"); // = 123
// This allows you to mock almost everything within the module e.g. the fs-module.
// Just pass the variable name as first parameter and your mock as second.
myModule.__set__("fs", {
readFile: function (path, encoding, cb) {
cb(null, "Success!");
}
});
myModule.readSomethingFromFileSystem(function (err, data) {
console.log(data); // = Success!
});
I've been inspired by Nathan MacInnes's injectr but took a different approach. I don't use vm to eval the test module; in fact, I use Node's own require. This way your module behaves exactly as it does with require() (apart from your modifications). Debugging is also fully supported.
