When multiple JavaScript modules are define()'d using AMD and concatenated into a single file, are those modules still considered asynchronous?
Generally speaking, just concatenating a bunch of AMD modules together won't make them synchronous. However, if you can live with additional limitations and select a loader that can do it, you can load AMD modules synchronously.
RequireJS
I don't know of a case where RequireJS will load anything synchronously, even if asynchronous loading is not needed. You could have the following in a <script> tag:
define("foo", [], function () {
});
require(["foo"], function (foo) {
});
There is nothing to load here because all the code is already present. The foo module does not need to be fetched from anywhere, its list of dependencies is known, and so on. Yet RequireJS will still handle it asynchronously.
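You can see this for yourself with a minimal sketch (assuming RequireJS is already loaded on the page):
define("foo", [], function () {
    return "foo value";
});

console.log("before require");
require(["foo"], function (foo) {
    // runs on a later tick, even though "foo" was already defined above
    console.log("inside callback: " + foo);
});
console.log("after require");
// typical output: "before require", "after require", "inside callback: foo value"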
One source of confusion for those wondering about RequireJS' capability to load modules synchronously could be RequireJS' synchronous form of require. You can do:
define(function (require) {
    var foo = require("foo");
});
The call to require here looks synchronous, but RequireJS transforms it behind the scenes: it scans the factory function for require() calls and adds "foo" to define's dependency list, keeping "require" itself as the first dependency:
define(["require", "foo"], function (require) {
    var foo = require("foo");
});
So while the require call looks synchronous, RequireJS still handles it asynchronously.
Almond
Almond is a loader made by James Burke, who is also the author of RequireJS, and it can load AMD modules synchronously. Almond does come with a series of limitations, however. One such limitation is that you can't load anything dynamically: the entire list of modules you want to load has to be part of the optimized bundle you create with r.js and hand to Almond for loading. If you can live with the limitations of Almond, then it is quite possible to load a bunch of AMD modules synchronously. I've provided details on how to do this in this answer.
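As a minimal sketch of that setup (the file names main.js, almond.js and bundle.js are assumptions for illustration), an r.js build profile along these lines bundles Almond together with your modules and starts the application from inside the bundle:
({
    baseUrl: ".",
    name: "almond",           // bundle Almond itself as the loader
    include: ["main"],        // main and everything it depends on get inlined
    insertRequire: ["main"],  // append a require(["main"]) call at the end of the bundle
    out: "bundle.js",
    wrap: true
})
Because every module is already defined inside bundle.js, Almond never has to fetch anything at require time, which is what makes synchronous resolution possible.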
Yes, they are still considered async.
While the modules themselves don't have to be loaded from disk, they still need to be executed, and a callback is still made.
Just because you can combine some modules into a single file doesn't mean you have to combine all of them -- nor does RequireJS assume they are all there.
It will run what it can from your preload and asynchronously load the rest.
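For instance (a sketch with made-up module names), a combined file can predefine some modules while others still trigger network requests when first required:
// predefined in the combined file:
define("a", [], function () { return "a"; });
define("b", ["a"], function (a) { return a + "b"; });

// "b" resolves from the bundle; "c" is fetched from the server on demand
require(["b", "c"], function (b, c) {
    // runs once c.js has also been loaded and evaluated
});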
Related
I'm looking into what appears to be a case of javascript loading out of order in a legacy application. The application uses Require.js to load several modules, and one of our company's modules is executing prior to its dependencies being loaded.
My experience with Require.js and AMDs is very limited, and in researching I noted that in some areas dependencies are prefixed with an order! string, such as:
define(['order!jquery', ...
Whereas in other areas the prefix isn't used:
define(['jquery', ...
So far I can't find documentation of this directive. What's its effect?
Full information copied from here
Normally RequireJS loads and evaluates scripts in an undetermined order. However, there are some traditional scripts that depend on being loaded in a specific order. For those cases you can use the order plugin. Download the plugin and put it in the same directory as your app's main JS file. Example usage:
require(["order!one.js", "order!two.js", "order!three.js"], function () {
//This callback is called after the three scripts finish loading.
});
Scripts loaded by the order plugin will be fetched asynchronously, but evaluated in the order they are passed to require, so it should still perform better than using script tags in the head of an HTML document.
The order plugin is best used with traditional scripts. It is not needed for scripts that use define() to define modules. It is possible to mix and match "order!" dependencies with regular dependencies, but only the "order!" ones will be evaluated in relative order to each other.
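For example (with hypothetical file names), only the two order! dependencies below have a guaranteed relative order; "jquery" may be evaluated before, between, or after them:
require(["order!legacy-one.js", "jquery", "order!legacy-two.js"], function () {
    // legacy-one.js is guaranteed to have been evaluated before legacy-two.js
});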
Notes:
The order! plugin only works with JavaScript files that are cacheable by the browser. If the JS file has headers that do not allow the browser to cache the file, then the order of scripts will not be maintained.
Do not use the order! plugin to load other plugin-loaded resources. For instance, 'order!cs!my/coffeescript/module' is not recommended. You will get errors in some versions of IE and WebKit. This is due to the workarounds the order plugin needs to do for those browsers to ensure ordered execution.
I am wondering if anyone knows the internals of RequireJS: why can it load JS asynchronously? I know JavaScript has no threads, so how does RequireJS achieve async loading?
How does RequireJS work?
Each module is contained inside a define call, which declares the module's dependencies. From those declarations, RequireJS builds a kind of dependency tree, ordering the modules from the ones without dependencies to the ones with the most dependencies.
A module with only one direct dependency can still end up at the top of a deep chain: its dependency may depend on another module, which depends on 2-3 other modules, and so on.
define(['some/dep'], function(someDep){ /* module code */ });
In that order, RequireJS creates a <script> tag with the URL of the module file and inserts that tag at the end of the <head>. The browser loads the JavaScript files and runs them in the order in which they appear in the HTML.
Then, once every dependency a module needs has been defined, that module's factory function is called with each (previously defined) dependency injected into it, and its result is stored.
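The loading mechanism itself can be sketched like this - not RequireJS's actual source, just the underlying technique of injecting a script tag and reacting to its load event (the URL is made up):
function loadScript(url, onLoad) {
    var script = document.createElement("script");
    script.src = url;        // the browser fetches this in the background
    script.async = true;
    script.onload = onLoad;  // fires once the file has been evaluated
    document.head.appendChild(script);
}

loadScript("js/some/dep.js", function () {
    // the module file has run; its define() call has registered the module
});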
How can it be async without threads?
It's async, but not necessarily parallel. Loading scripts can happen in parallel, as the browser (Chrome, at least) makes multiple connections to the server to fetch several files at once, but that has nothing to do with JavaScript itself.
The async nature of JavaScript comes from the event loop.
Each async callback is put in an event queue, and when the synchronous call stack has completely finished executing, the next event callback from the queue is called.
It's easier to grasp when you see it, which you can do in the Timeline tab of Chrome's dev tools.
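Here is that behavior in miniature - the setTimeout callback is queued and only runs after the synchronous code has finished:
setTimeout(function () {
    console.log("async callback");  // logged second, from the event queue
}, 0);
console.log("synchronous code");    // logged first, from the call stack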
The common_templates file is not a RequireJS file - rather, it's a file that defines a global variable.
common_templates needs hogan. They are both requested more or less at the same time, but a race condition is in effect: common_templates sometimes wins, so the code fails with "hogan not loaded yet".
require(['module1', 'hogan', 'common_templates'], function (Module) {
    Module.do_stuff(); // this module also requires hogan and common_templates to be loaded
});
Other than nested require calls, is there a built-in way to tell require to block until hogan is fully downloaded?
nested:
require(['hogan'], function (Hogan) {
    require(['common_templates'], function () {
        require(['module1'], function (Module) {
            Module.do_stuff();
        });
    });
});
This approach seems a bit hacky. Is there a built-in way to work around these race conditions?
If common_templates is not an AMD module (doesn't contain a define([deps]) call), then you need to configure it as a shim:
require.config({
    shim: {
        common_templates: {
            deps: ["hogan"]
        }
    }
});
Now, require(['module1', 'hogan', 'common_templates']) and require(['module1', 'common_templates']) should work.
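If consumers also need the global that common_templates defines handed to them as a module value, the shim can name that global via exports (the name CommonTemplates is an assumption - substitute whatever global the file actually creates):
require.config({
    shim: {
        common_templates: {
            deps: ["hogan"],
            exports: "CommonTemplates"  // the global to use as this module's value
        }
    }
});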
I had this exact same issue. I needed to essentially 'block' my program flow to ensure that an initial list of dependencies were loaded before a second list of dependencies, and then finally my main application code. Here is how I solved it. This is the Require.js call I made in my index.html file:
<script src="/js/lib/require_2.1.22.js"></script>
<script>
    //debugger;
    //Load common code that includes config, then load the app
    //logic for this page. Do the requirejs calls here instead of
    //a separate file so after a build there are only 2 HTTP
    //requests instead of three.
    requirejs(['/js/common_libs.js'], function (common) {
        //debugger;
        //Ensure the AdminLTE code and its dependencies get loaded prior to loading the Backbone App.
        requirejs(['/js/lib/adminlte.js'], function (common) {
            //debugger;
            //The main file for the Backbone.js application.
            requirejs(['/js/app/main_app.js']);
        });
    });
</script>
The common_libs.js file contains the requirejs.config({shim: ...}) stuff. The adminlte.js library does its own dependency checking and will complain on the console if it does not detect its dependencies. My problem was that it was being loaded asynchronously along with its dependencies, which was causing a race condition.
I was able to wrap the existing code in adminlte.js like this:
//My experiments with Require.js to prevent race conditions.
define([
    'jQuery-2.1.4.min',
    'bootstrap.3.3.6',
    'jquery.slimscroll.min'
], function ($, Bootstrap, SlimScroll) {
    //Existing code goes here
    ...
    //I knew to return this value by looking at the AdminLTE code. Your mileage may vary.
    return $.AdminLTE;
});
That allowed me to load this library separately with its dependencies. The code in adminlte.js is only executed after its dependencies are loaded. Then, and only after that is complete, will main_app.js be loaded, along with its dependencies.
Structuring your code this way allows you to explicitly load your dependencies in batches.
Editing for clarity:
The dependencies are loaded in stages. To clarify the example above:
In the first stage, jQuery, Bootstrap, and the SlimScroll library are loaded, then the adminlte.js file is executed.
In the second stage, all other dependencies are loaded, then the main_app.js file is executed. main_app.js has its own define([], ...) call that lists the rest of the dependencies it needs loaded.
This is much more efficient than writing a nested require() call for each dependency. And, as far as I can tell from the requirejs.org website, this is the 'proper' way to load serial dependencies.
Coding Update:
I also had to wrap the code in the jquery.slimscroll library inside a define() statement in order to explicitly call out that this library depends on jQuery. Otherwise it was setting up another chance for a race condition.
This does not make sense. You said "common_templates requires hogan", but if common_templates requires hogan, then hogan will already be loaded when the common_templates code starts.
Make sure common_templates.js is defined like this:
define(['hogan'], function () {
    //common_templates stuff here
});
I'm struggling to get RequireJS to work properly. The page is running fine, but I think I'm doing things in an oh-so-wrong way.
For example, on page xzy I'm adding the following JavaScript at the end of the page (the JS must stay on the page for now, so no external JS files are possible):
<script type="text/javascript" language="javascript">
//<![CDATA[
(function () {
    require([
        'async!http://maps.google.com/maps/api/js?v=3&sensor=false',
        'maps/jquery.ui.map.full.min.js',
        'maps/jquery.ui.map.extensions.min'
    ], function () {
        // ... do stuff with Google Maps
    });
}());
//]]>
</script>
Doing this makes google.maps and the $().gmap method globally available, which they probably shouldn't be.
Questions:
Should I convert this into a requireJS module? Why?
If so, will the module be available on other pages as well or do I just "re-define" on page 123 and the dependency files will already have been cached?
And finally - will I have to convert the code inside my require call into module.methods, which I then call via module_name.method_name(pass_some_parameters)?
Just looking at the JS:
http://maps.google.com/maps/api/js?v=3&sensor=false
You can see that window.google is a global. There's not much you can do about that without Google releasing an AMD version.
Your decision about whether to create a module should firstly be a question of the readability/maintainability of the JS code. Modules are (or should be) readable, reusable chunks of code/reusable abstractions that the rest of your code can consume. You should also derive testing benefits from this - each module should be easier to test in isolation.
You can end up with many more JS files if you choose a modular approach, and you might think this leads to performance issues - i.e. multiple HTTP requests. But this is mitigated by using the RequireJS Optimiser to optimise your modules to a single file.
If you convert to a module, yes you can require it from other pages, and if your HTTP caching headers are set up, then the browser may choose to use a cached version, thus saving you a HTTP request (same caching heuristics apply if you've optimised every module into a single file).
If you re-define (I assume you mean copy and paste the code block), then those dependencies listed in the call to require should all be cached by the browser, and therefore instantly available (depending on your web server and its HTTP caching headers).
Finally, yes, you may have to refactor the code a bit to expose the new module's API. If that means exposing a single object with methods, then that's what you should do. In my experience this process almost inevitably leads to better code, as you've had to think more about what the module's purpose is, and that often leads to breaking the coupling between pieces of code.
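As a sketch of what that refactoring might look like (the module id maps/page-map and its init method are made up for illustration), the page's require block becomes a module exposing a small API:
// maps/page-map.js
define([
    'async!http://maps.google.com/maps/api/js?v=3&sensor=false',
    'maps/jquery.ui.map.full.min',
    'maps/jquery.ui.map.extensions.min'
], function () {
    return {
        // initialize a map inside the element matched by selector
        // (assumes jQuery is available globally, as on the original page)
        init: function (selector) {
            $(selector).gmap();
        }
    };
});
Each page then only needs require(['maps/page-map'], function (pageMap) { pageMap.init('#map_canvas'); });, and the map logic lives in one testable place.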
Sorry for being a little lazy and not trying it all out myself, but I thought a nice answer on Stack Overflow might help some other guys too. I'm pondering whether or not to use RequireJS to load my modules. Currently I'm doing that on my own, so I have some questions about RequireJS.
How does RequireJS deal with multiple references (does it cache files/modules)?
More precisely, if you have calls like require(["some/module", "a.js", "b.js"], function () {...}); and you reference a.js or b.js again in later require or define calls, how does RequireJS handle those? My guess is that it will entirely ignore those additional references - is that correct? If so, is it possible to force RequireJS to reload a script?
Does RequireJS always transfer files over the wire, or can you load modules statically?
What I normally do is concatenate all of my JS files (modules included), except for those which need to be loaded depending on run-time conditions. As far as I read the RequireJS doc, you can define your own names for modules. So my question is: can you load a module which is already present in the script, without transferring it over the wire?
As far as I understood the doc, names are created automatically for modules, based on their path location and filename, so this would make no sense for my requirement here.
requirejs.undef() should do the trick
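A sketch of that reload workflow - note that requirejs.undef() only removes the loader's internal registration; any references other modules already hold to the old instance are unaffected:
requirejs.undef("a.js");
require(["a.js"], function (a) {
    // a freshly fetched and re-evaluated copy of the module
});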
Normally, a module will only be loaded once by require.js. require.js will always resolve dependencies and load the modules in the right order so that you don't have to care about that. Subsequent calls to require for the same module will yield it immediately.
It is not possible to reload a module. If you really need to load the same module more than once (which would unfortunately indicate that something is wrong with your module's design), you can have a look at the Multiversion support.
I am not sure I understand what you mean by "load modules statically". But if I am guessing right, you want to load several modules as one and use them separately. This is possible:
Typically in your modules you will be doing something like:
define(['moduleA', 'moduleB', 'moduleC'], function (a, b, c) {
    ...
    return exports;
});
where exports can be more or less anything: a function, an object, whatever. So you can also do something like:
define(['moduleA', 'moduleB', 'moduleC'], function (a, b, c) {
    ...
    return {moduleA: a, moduleB: b, moduleC: c};
});
exporting them all together.
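Consuming that aggregate module then looks like this (the module id bundle is hypothetical):
require(["bundle"], function (bundle) {
    // each original module is reachable as a property of the aggregate;
    // doSomething stands in for whatever API moduleA actually exposes
    bundle.moduleA.doSomething();
});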
Note however that you should really have a look at the optimization tool. It can group related modules together.
Finally, the automatic naming is a misunderstanding; you can be explicit about the names of your modules.
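For example, a named define fixes the module id explicitly instead of deriving it from the file path, which is what makes preloaded modules addressable inside a concatenated script:
define("app/greeter", [], function () {
    return {
        greet: function (who) { return "Hello, " + who; }
    };
});

require(["app/greeter"], function (greeter) {
    console.log(greeter.greet("world"));  // resolved without any network request
});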