require.js require a module with index.js - javascript

So I'm trying to set up TypeScript and Chutzpah for testing purposes. TypeScript is set up to output in this format:
define(['require', 'exports', './someModule'], function(require, exports, someModule) {
//examplecode
});
This works fine; the problem occurs when someModule is actually a directory containing an index.js:
/app
    app.js
    /someModule
        index.js
require.js is unable to resolve someModule in this way and the test fails.
Is there any way to tell require.js that this is a module?

RequireJS won't automatically check for the presence of index.js and load that as your module. You need to tell RequireJS that when you want to load someModule, it should load someModule/index. I'd set a map in my call to require.config:
require.config({
    [ ... ]
    map: {
        '*': {
            someModule: 'someModule/index'
        }
    }
});
You have to adjust the name you give there so that it is a path relative to your baseUrl. It's not clear from the information you give in your question what it should be.
(For the record, there's also a packages setting that you could probably tweak to do what you want, but putting something in packages says "this is a package", which is not what you appear to have here. So I would not use it for what you are trying to do.)
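With a map entry like the one above in place, any module that asks for someModule is transparently handed someModule/index. A minimal usage sketch (the dependency name is the one from the question):
// Sketch: thanks to the map entry above, 'someModule' resolves to 'someModule/index'
define(['someModule'], function (someModule) {
    // someModule is whatever someModule/index.js defines
});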

I didn't like the configuration in map either. The simplest way I accomplished this was by writing a plugin for require.
Let's name the plugin mod, so it is used as mod!module/someModule; you could also call it index, as in index!module/someModule, whatever suits you best.
define(function(require, exports, module) {
// loading module/someModule/index.js with `mod!`
var someModule = require('mod!module/someModule');
// whatever this is about ..
module.exports = { .. };
});
So let's assume you have paths set in require's configuration and some sort of project structure:
- app
  - modules
    - someModule/index.js   // the index we want to load
    - someModule/..
    - someModule/..
    - etc
  - plugins
    - mod.js                // plugin to load a module with index.js
Require's config:
require.config({
    paths: {
        'module': 'app/modules',
        // the plugin we're going to use so
        // require knows what mod! stands for
        // (note: paths values omit the .js extension)
        'mod': 'app/plugins/mod'
    }
});
To read about all the aspects of writing a plugin, see the docs at requirejs.org. The simplest version is to just rewrite the name of the requested "module" you are attempting to access and pass it back to load.
app/plugins/mod.js
(function() {
    define(function () {
        // Turn the requested name into a URL pointing at its index.js
        function parse(name, req) {
            return req.toUrl(name + '/index.js');
        }
        return {
            normalize: function(name, normalize) {
                return normalize(name);
            },
            // Load the resolved index.js and hand its export back to require
            load: function (name, req, load) {
                req([parse(name, req)], function(o) {
                    load(o);
                });
            }
        };
    });
})();
This is not production code; it's just a simple way to demonstrate that require's config wasn't meant to solve problems like this.

Related

Loading webpack module in a require.js based project returns null

I'm trying to load a library that is compiled with Webpack into a require.js project. While the library exposes an object, it returns null when required from the require.js project:
define(function(require, exports, module) {
[...]
require("./ext/mylib.core.js"); // -> null
})
Are there any flags I can use in Webpack to enable AMD compliance? There are some references to AMD in the generated library, but as it is, it does not seem to do anything.
The solution was in the Webpack documentation: there is an output.libraryTarget option that can be set to "amd" or "umd", in which case Webpack produces AMD-compliant modules.
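A minimal sketch of such a Webpack configuration; the entry point and output file names here are assumed, not taken from the question:
// webpack.config.js (sketch; entry and output names are assumed)
var path = require('path');

module.exports = {
    entry: './src/mylib.js',
    output: {
        path: path.resolve(__dirname, 'ext'),
        filename: 'mylib.core.js',
        library: 'mylib',          // name used for the exported library
        libraryTarget: 'umd'       // emit a UMD wrapper so AMD loaders can consume it
    }
};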
EDIT 3 / EDIT 4:
It seems Webpack is not cooperating, so another possibility would be to expose the module with the shim config option:
require.config({
    paths: {
        // Tell require where to find the webpack thingy
        yourModule: 'path/to/the/webpack/asset'
    },
    shim: {
        // This lets require ignore that there is no define
        // call but will instead use the specified global
        // as the module export
        yourModule: {
            exports: 'theGlobalThatIsPutInPlaceByWebpack'
        }
    }
});
This obviously only works in the case that the webpack stuff is putting something in the global scope. Hope this helps!
EDIT 2:
So I got the question wrong as pointed out in the comments. I didn't find any built-in functionality to produce AMD modules from webpack - the end result seems to be a static asset js file. You could wrap the result in a
define(function () {
return /* the object that webpack produces */;
});
block, maybe with the help of some after-build event (e.g. using this after build plugin for webpack). Then you should be able to require the module with an AMD loader.
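Once the asset is wrapped in a define() call like that, it can be listed as an ordinary AMD dependency; a sketch using the path from the question:
// Sketch: consuming the wrapped asset as a regular AMD dependency
define(['./ext/mylib.core.js'], function (mylib) {
    console.log(mylib); // the object returned by the wrapper above
});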
Original Answer:
require.js loads its dependencies asynchronously; you have to declare them explicitly when you're not using the r.js optimizer or the like. So if the module exposes an AMD definition, it should work like this:
// It works the way you did it ...
define(['require', 'path/to/your/module'], function (require) {
    require('path/to/your/module'); // -> { ... }
});

// ... but I personally prefer this explicit syntax + it is
// friendlier to a code minifier
define(['path/to/your/module'], function (yourModule) {
    console.log(yourModule); // { ... }
});
Maybe you have to configure your require instance; there are docs for that.
EDIT 1: as pointed out, the way the module is being accessed is not wrong, but the dependencies were missing, so I added code that is closer to the original question.

Use node require inside a method called using requireJs

Is it possible to use the default node require function in a file that has been called through requirejs?
define(["require", "exports"], function(require, exports) {
//...
var Schema = require(DaoPublic._schemasDirectory + schemaFilename);
}
I always get ReferenceError: module is not defined. I also tried to load the schema using requireJs, with the same result, because the file itself is written as CommonJS, not AMD-compatible.
Any solution?
Note that the loaded schema is in CommonJS and I need to keep it that way, since it's used by several DAOs, some in AMD and others in CommonJS. (Funny part.)
Example of requested file (schema):
var userSchema = {
    /**
     * User Login, used as id to connect between all our platforms.
     */
    login: {
        type: String,
        match: /^[a-zA-Z0-9_-]+$/,
        trim: true,
        required: true,
        notEmpty: true,
        unique: true,
        check: {
            minLength: 4,
            maxLength: 16
        }
    }
};
module.exports = userSchema;
The problem is that your code is set up so that RequireJS is able to find the CommonJS module by itself. However, when RequireJS is running in Node and cannot find a module, it will call Node's require function, which is what you need. So it is possible (with RequireJS) to have an AMD module use Node's require, but the trick is getting RequireJS to not see the module in the first place.
Proof of Concept
Here's a proof of concept. The main file named test.js:
var requirejs = require("requirejs");

function myRequire(path) {
    if (path.lastIndexOf("schemas/", 0) === 0)
        path = "./" + path;
    return require(path);
}

requirejs.config({
    paths: {
        "schemas": "BOGUS"
    },
    nodeRequire: myRequire
});

requirejs(['foo'], function (foo) {
    console.log(foo);
});
The file foo.js:
define(["require", "exports"], function(require, exports) {
return require("./schemas/x") + " by way of foo";
});
The file schemas/x.js:
module.exports = "x";
If you run it with node test.js, you'll get on the console:
x by way of foo
Explanation
I'm calling this a "proof of concept" because I've not considered all eventualities.
The paths setting is there to throw RequireJS off track. BOGUS must be a non-existent directory. When RequireJS tries to load the module ./schemas/x, it tries to load the file ./BOGUS/x.js and does not find it. So it calls Node's require.
The nodeRequire setting tells RequireJS that Node's require function is myRequire. This is a useful lie.
The myRequire function changes the path to add the ./ at the start before calling Node's require. The issue here is that for some reason RequireJS transforms ./schemas/x to schemas/x before it gives the path to Node's require function, and Node will then be unable to find the module. Adding back the ./ at the start of the path name fixes this. I've tried a whole bunch of path variants but none of them worked. Some variants were such that RequireJS was able to find the module by itself and thus never tried calling Node's require or they prevented Node from finding the module. There may be a better way to fix this, which I've not found. (This is one reason why I'm calling this a "proof of concept".) Note that I've designed this function to only alter the paths that start with schemas/.
Other Possibilities
I've looked at other possibilities, but they did not appear very promising to me. For instance, customizing NODE_PATH would eliminate myRequire, but such customization is not always doable or desirable.

RequireJS plugin: load timeouts experienced when using plugin

Using RequireJS I'm building an app which makes extensive use of widgets. For each widget I have at least 3 separate files:
request.js containing code for setting up request/response handlers to request a widget in another part of my application
controller.js containing handling between model and view
view.js containing handling between user and controller
Module definition in request.js:
define(['common/view/widget/entity/term/list/table/controller'],
function(WidgetController) { ... });
Module definition in controller.js:
define(['common/view/widget/entity/term/list/table/view'],
function(WidgetView) { ... });
Module definition of view.js is:
define(['module','require'],function(module,require) {
'use strict';
var WidgetView = <constructor definition>;
return WidgetView;
});
I have lots of little situations like the above with the widgets I have developed. What I dislike is using the full path every time a module requires another module and both are located in the same folder. I'd like to simply specify it as follows (assuming we have a RequireJS plugin which solves this for us):
define(['currentfolder!controller'],
function(WidgetController) { ... });
For this, I have written a small plugin, as I couldn't find one on the web:
define({
    load: function (name, parentRequire, onload, config) {
        var path = parentRequire.toUrl('.').substring(config.baseUrl.length) + '/' + name;
        parentRequire([path], function (value) {
            onload(value);
        });
    }
});
As you might notice, in its basic form it looks like the example in the RequireJS plugin documentation.
Now in some cases, the above works fine (e.g. from the request.js to the controller.js), but in other cases a load timeout occurs (from controller.js to view.js). When I look at the paths which are generated, all are proper RequireJS paths. Looking at the load timeouts, the following is logged:
Timestamp: 13-09-13 17:27:10
Error: Error: Load timeout for modules: currentfolder!view_unnormalized2,currentfolder!view
http://requirejs.org/docs/errors.html#timeout
Source File: http://localhost/app/vendor/requirejs/require.js?msv15z
Line: 159
The above log was from a test I did with only loading the view.js from controller.js using currentfolder!view in the list of modules in the define statement. Since I only requested currentfolder!view once, I'm confused as to why I both see currentfolder!view_unnormalized2 and currentfolder!view in the message.
Any idea as to why this might be happening?
My answer may not answer your primary questions, but it will help you achieve what you're trying to do with your plugin.
In fact, RequireJS supports relative paths for requiring modules when using the CommonJS style, like so:
define(function( require, exports, module ) {
    var relativeModule = require("./subfolder/module");

    module.exports = function() {
        console.log( relativeModule );
    };
});
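Applied to the question's layout, controller.js could then reference its sibling view.js with a relative id; a sketch (only the relative require is the point here):
// common/view/widget/entity/term/list/table/controller.js (sketch)
define(function (require, exports, module) {
    'use strict';
    // './view' resolves relative to this module's id, i.e. .../list/table/view
    var WidgetView = require('./view');
    var WidgetController = function () { /* controller wiring goes here */ };
    module.exports = WidgetController;
});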

Access RequireJS path configuration

I notice in the documentation there is a way to pass custom configuration into a module:
requirejs.config({
    baseUrl: './js',
    paths: {
        jquery: 'libs/jquery-1.9.1',
        jqueryui: 'libs/jquery-ui-1.9.2'
    },
    config: {
        'baz': {
            color: 'blue'
        }
    }
});
Which you can then access from the module:
define(['module'], function (module) {
var color = module.config().color; // 'blue'
});
But is there also a way to access the top-level paths configuration, something like this?
define(['module', 'require'], function (module, require) {
console.log( module.paths() ); // no method paths()
console.log( require.paths() ); // no method paths()
});
FYI, this is not for a production site. I'm trying to wire together some odd debug/config code inside a QUnit test page. I want to enumerate which module names have a custom path defined. This question touched on the issue but only lets me query known modules, not enumerate them.
It is available, but it's an implementation detail that shouldn't be depended on in production code (which you've already said it's not for, but fair warning to others!).
The config for the main context is available at require.s.contexts._.config. Other configurations will also hang off of that contexts property with whatever name you associated with it.
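As a sketch of how the paths could be enumerated this way (again, require.s is an internal and may change between versions):
// Sketch only: reads RequireJS internals, not suitable for production code
var cfg = requirejs.s.contexts._.config;
Object.keys(cfg.paths || {}).forEach(function (name) {
    console.log(name, '->', cfg.paths[name]); // every module name with a custom path
});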
I don't believe require exposes that anywhere; at least I can't find it looking through the immense codebase. There are two ways you could achieve this, though. The first and most obvious is to define the config as a global variable. The second, and closer to what you want, is to create a require plugin that implements the load function to attach the config to the module:
define({
    load: function (name, req, onload, config) {
        req([name], function (value) {
            value.requireConfig = config;
            onload(value);
        });
    }
});
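A usage sketch, assuming the plugin above is saved under the module id withConfig (that id and the required module name are made up for illustration):
// Sketch: 'withConfig' and 'some/module' are hypothetical names
define(['withConfig!some/module'], function (someModule) {
    console.log(someModule.requireConfig.paths); // config attached by the plugin's load()
});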

Relative paths with RequireJS modules/packages

I'm fairly new to RequireJS and I've run into a bit of a problem. I've written a little framework built on Backbone using RequireJS and I want it to be re-usable in different projects. So, with some searching I learned that require allows packages. This seemed like what I was looking for. I have a main.js file to launch my app that essentially looks like this:
require.config({
packages: ['framework']
});
require(['framework'], function(framework) {
framework.createDash();
});
Then in the same directory as my main.js I have another directory called "framework" which contains another main.js which looks like this:
define(function(require, exports, module) {
    exports.createDash = function(dash, element) {
        require(['dash/dash.model', 'dash/dash.view'], function(DashModel, DashView) {
            return new DashView({
                model: new DashModel(dash),
                el: element ? element : window
            });
        });
    };
});
In searching I found this page, which indicates that the 'require' argument should be scoped to the submodule. However, when I try to require things they are still relative to my original main.js. I've tried a number of things and searched for hours to no avail. Is there any way I can have my require/define calls within my package resolved relative to the main.js in its root?
You need to define your submodule as a package in the require configuration:
require.config({
    packages: [{
        name: 'packagename',
        location: 'path/to/your/package/root', // default: 'packagename'
        main: 'scriptfileToLoad'               // default: 'main'
    }]
    ... some other stuff ...
});
To load your module, you just need to use your 'packagename' in the requirements:
define(['jquery', 'packagename'], function($, MyPackage) {
MyPackage.useIt()
});
In your package you must use the ./ prefix to load your files relative to your submodule:
define(['globalDependency', './myLocalFile'], function(Asdf, LocalFile) {
LocalFile.finallyLoaded();
});
There is a useful shortcut: if your package name equals its location and your main file is called 'main.js', then you can replace this
packages: [{
    name: 'packagename',
    location: 'packagename',
    main: 'main'
}]
to this:
packages: ['packagename']
As far as I can see, you already tried to define a package, but did you also use the ./ prefix? Without this prefix, require will try to find the files in its global root path. And without a package, ./ will be useless because the relative path is the same as the global root path.
Cheers
I figured out the answer to my question, and the solution (they were not the same apparently). I guess I'll post it here in case it can help someone else in the future.
Essentially what I wanted was to load my framework within its own context. I found the context option under the configuration section on require's website and an example of how to use it. Originally I tried this by doing something like:
var req = require.config({
    baseUrl: 'framework',
    context: 'framework',
    paths: {
        jQuery: 'lib/jquery/jquery-1.7.min.js',
        Underscore: 'lib/underscore/underscore.min.js',
        Backbone: 'lib/backbone/backbone.min.js',
        etc...
    }
});
req(['main'], function() {});
There were two problems with this. First, my 'req' variable was being defined outside of the framework, but I wanted the framework to define its own paths. And second, whenever a file outside of the framework would require a file within the framework, which would in turn require 'jQuery', for example, then jQuery (or whatever else) wouldn't be required from within the context of the framework instance of require and so it couldn't find the file.
What I ended up doing was defining my framework's main.js to look something like this:
var paths = {
    jQuery: 'lib/jquery/jquery-1.7.min.js',
    Underscore: 'lib/underscore/underscore.min.js',
    Backbone: 'lib/backbone/backbone.min.js',
    etc...
};

define(function() {
    var exports = {};

    exports.initialize = function(baseUrl, overridePaths, callback) {
        if(!overridePaths) {
            overridePaths = {};
        }
        if(baseUrl && baseUrl[baseUrl.length - 1] != '/') {
            baseUrl = baseUrl + '/';
        }

        var fullpaths = {};
        for(var path in paths) {
            // Don't add baseUrl to anything that looks like a full URL
            // like 'http://...' or anything that begins with a forward slash
            if(paths[path].match(/^(?:.*:\/\/|\/)/)) {
                fullpaths[path] = paths[path];
            }
            else {
                fullpaths[path] = baseUrl + paths[path];
            }
        }

        var config = {paths: fullpaths};
        for(var pathName in overridePaths) {
            config.paths[pathName] = overridePaths[pathName];
        }
        require.config(config);

        // Do anything else you need to do such as defining more functions for exports

        if(callback) {
            callback();
        }
    };

    return exports;
});
And then in my project's main.js file I just do this:
require(['framework/main'], function(framework) {
    // NOTE: This setTimeout() call is used because, for whatever reason, if you make
    // a 'require' call in here or in the framework without it, it will just hang
    // and never actually go fetch the files in the browser. There's probably a
    // better way to handle this, but I don't know what it is.
    setTimeout(function() {
        framework.initialize('framework', null, function() {
            // Do stuff here
        });
    }, 0);
});
This takes whatever is passed in to the framework's initialize() method for 'baseURL' and prepends that to any paths that the framework defines that do not start with a forward slash or 'anything://', unless they are override paths. This allows the package using the framework to override things like 'jQuery'.
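For example, a consuming project could override one of the framework's default paths when initializing; the replacement path below is made up:
// Sketch: overriding the framework's default jQuery path via overridePaths
framework.initialize('framework', { jQuery: 'lib/jquery/jquery-2.0.min.js' }, function() {
    // the merged paths have now been registered through require.config()
});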
This worked for me, adding a "./" prefix to the module names:
define(function (require, exports, module) {
    exports.createDash = function (dash, element) {
        require([ './dash/dash.model', './dash/dash.view' ], function (DashModel, DashView) {
            return new DashView({
                model : new DashModel(dash),
                el : element ? element : window
            });
        });
    };
});
A process that worked well for me for allowing a package with submodules to be used directly from data-main or from an outside framework, assuming that a main.js (or other package main) is called by a particular name, was to use var baseUrl = require.toUrl('packageName') + '/../' as a prefix in the require.config({ paths: { ... } }) call. For instance:
var music21Base = require.toUrl('music21') + '/../';

require.config({ paths: {
    'jquery': music21Base + 'ext/jquery/jquery.2.1.10.min',
    'subModuleLoader': music21Base + 'src/subModuleLoader'
} });
The setting of context: "xxx" worked fine for calling normal modules with ./modName but did not work for the paths argument for me.
