RequireJS domReady plugin vs jQuery $(document).ready()? - javascript

I am using RequireJS and need to initialize something on DOM ready. Now, RequireJS provides the domReady plugin, but we already have jQuery's $(document).ready(), which is available to me since I have required jQuery.
So I have got two options:
Use the domReady plugin:
require(['domReady'], function (domReady) {
    domReady(function () {
        // Do my stuff here...
    });
});
Use $(document).ready():
$(document).ready(function() {
    // Do my stuff here...
});
Which one should I choose, and why?
Both options seem to work as expected. I am not confident in jQuery's, because RequireJS is doing its magic: since RequireJS adds scripts dynamically, I'm worried that DOM ready may occur before all of the dynamically-requested scripts are loaded. On the other hand, RequireJS would be loading additional JS just for domReady when I already have jQuery required.
Questions
Why does RequireJS provide a domReady plugin when we already have jQuery's $(document).ready()? I don't see any advantage in including another dependency.
If it's just to feed a need, then why not provide one for cross-browser AJAX?
As far as I know, a module that requires domReady won't be executed until the document is ready, and you could achieve the same by requiring jQuery as well:
require(['jQuery'], function ($) {
    $(document).ready(function () {
        // Do my stuff here...
    });
});
To be more clear on my question: what's the difference between requiring domReady or jQuery?

It seems like all the key points were already hit, but a few details fell through the cracks. Mainly:
domReady
It is both a plugin and a module. If you include it in the requirements array with a trailing !, your module won't execute until it's "safe" to interact with the DOM:
define(['domReady!'], function () {
    console.info('The DOM is ready before I happen');
});
Note that loading and executing are different; you want all your files to load as soon as possible, it's the execution of the contents that is time sensitive.
If you omit the !, then it's just a normal module that happens to be a function, which can take a callback that won't execute before the DOM is safe to interact with:
define(['domReady'], function (domReady) {
    domReady(function () {
        console.info('The DOM is ready before I happen');
    });
    console.info('The DOM might not be ready before I happen');
});
Advantage when using domReady as a plugin
Code that depends on a module that in turn depends on domReady! has a very significant advantage: It does not need to wait for the DOM to be ready!
Say that we have a block of code, A, that depends on a module, B, that depends on domReady!. Module B will not finish loading before the DOM is ready. In turn, A will not run before B has loaded.
If you were to use domReady as a regular module in B, it would also be necessary for A to depend on domReady, as well as wrapping its code inside a domReady() function call.
Furthermore, this means that domReady! enjoys that same advantage over $(document).ready().
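A minimal sketch of that arrangement (the file names a.js and b.js are just illustrative):
// b.js - finishes loading only once the DOM is ready
define(['domReady!'], function () {
    return {
        setup: function () { /* safe to touch the DOM here */ }
    };
});

// a.js - never mentions domReady, yet can touch the DOM right away,
// because it cannot execute before b.js has finished loading
define(['b'], function (b) {
    b.setup();
    document.body.className += ' ready'; // safe: the DOM is ready by now
});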
Re the differences between domReady and $(document).ready()
Both sniff out whether/when the DOM is ready in essentially the same way.
Re fear of jQuery firing at the wrong time
jQuery will fire any ready callback even if the DOM loads before jQuery does (your code shouldn't care which happens first).

An attempt at answering your main question:
Why does RequireJS provide a domReady plugin when we already have jQuery's $(document).ready()?
They do two different things, really. RequireJS' domReady dependency signifies that this module requires the DOM to be completely loaded before it can be run (and can therefore be found in any number of modules in your application if you so desire), while $(document).ready() instead fires off its callback functions when the DOM is done loading.
The difference may seem subtle, but think of this: I have a module that needs to be coupled to the DOM in some way, so I can either depend on domReady and couple it at module definition time, or put down a $(document).ready() at the end of it with a callback to an init function for the module. I'd call the first approach cleaner.
Meanwhile, if I have an event that needs to happen right as the DOM is ready, the $(document).ready() event would be the go-to, since that does not in particular depend on RequireJS being done loading modules, provided the dependencies of the code you're calling it from are met.
It's also worth considering that you do not necessarily use RequireJS with jQuery. Any library module that needs DOM access (but does not rely on jQuery) would then still be useful by using domReady.

Answering your bullets in order of appearance:
- They both do accomplish the same thing.
- If you have reservations about jQuery for whatever reason, then use domReady.
- Correct, so just use jQuery.
- Because not everyone uses jQuery.
- I agree, just use jQuery.
- Plugins by definition 'feed a need'.
- Cross-browser AJAX isn't a thing. Cross-domain? There probably is one, and if there isn't, then there is no need to feed.
- Ok.
When it comes down to it, you are overthinking this. It's a mechanism to execute JavaScript on DOM ready. If you didn't have jQuery, I would advocate the domReady plugin. Since you have jQuery, don't load more scripts to do something that is already available.
Clarity Update
The domReady plugin collects functions to call when the document is 'ready'. If it is already loaded then they execute immediately.
jQuery collects functions and binds a deferred object to the DOM being 'ready'. When the DOM is ready, the deferred object will be resolved and the functions will fire. If the DOM is already 'ready', the deferred will already be resolved, so the function will execute immediately.
This means that effectively they do exactly the same thing.
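In jQuery 3+, that deferred is even exposed as a promise, so the "already resolved" behavior is easy to observe (a sketch using the documented $.ready thenable):
$.when($.ready).then(function () {
    // Runs at DOM ready, or immediately if the DOM was already ready
    console.log('DOM is ready');
});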

After some experimenting with RequireJS and multiple modules, I suggest using domReady.
I noticed that the function passed to $(document).ready(...) is not called when multiple modules are loaded by RequireJS. I suspect the DOM becomes ready before all of the RequireJS code has executed, so jQuery's ready event fires before the user-defined function (in the main module's code) gets bound to it.
require(['jquery',
         'underscore',
         'text!some_template.html',
         './my_module_1',
         './my_module_2',
         'domReady',
         'other_dependency_1',
         'other_dependency_2'
], function($, _, someTemplate, myModule1, myModule2, domReady) {
    $(document).ready(function() {
        console.info('This might never be executed.');
        console.info('Dom might get ready before requirejs would load modules.');
    });
    domReady(function () {
        console.info('This runs when the dom gets ready and modules are loaded.');
    });
});

I do this as part of my main entry point, so that all of my JavaScript is guaranteed that the DOM is ready and jQuery is loaded. I'm not sure how great this is, so I welcome any feedback, but here's my main.js:
require(['domReady!'], function(domReady){
    console.log('dom is ready');
    require(['jquery', 'bootstrap'], function(){
        console.log('jquery loaded');
        require(['app'], function(app){
            console.log('app loaded');
        });
    });
});

Related

Understanding JavaScript code snippet

This code is from http://twitter.github.com/
(function ($, undefined) {
    // more code ...
    $.getJSON("https://api.github.com/orgs/twitter/members?callback=?", function (result) {
        var members = result.data;
        $(function () {
            $("#num-members").text(members.length);
        });
    });
    // more code ...
})(jQuery);
First, things I understand:
All the code is wrapped in an IIFE
They are using the GitHub API to get the members
The URL includes the string '?callback=?' so the request is treated as JSONP.
What I don't understand is: why are they using $(function() { ... }) inside the function that is executed when the request succeeds?
Is this code equivalent?
$(function() {
    $.getJSON("https://api.github.com/orgs/twitter/members?callback=?", function (result) {
        var members = result.data;
        $("#num-members").text(members.length);
    });
});
Maybe I'm wrong, but what I think is that the second code snippet waits for the document to be loaded and then requests the members ... so there is no parallelism? In the first code snippet the request is done in parallel with the document loading. Please correct me if I'm wrong.
The $ function, if it is passed a function as its argument (it is a horribly overloaded function), will call that function when the DOM is ready.
Having it there stops the code inside (which tries to modify the DOM) from running before the DOM is complete.
If the DOM is already complete before $ is called, then the function will be called immediately.
Note that the HTTP request sent by getJSON might get a response before or after the browser has finished loading and parsing the DOM of the original document.
This allows the request for the data to be sent without waiting for the DOM to be ready while still protecting against premature modification.
Is this code equivalent?
No. That waits for the DOM to be ready before it even sends the request for the data.
Maybe I'm wrong but what I think is that the second code snippet waits for the document to be loaded and then request the members
You're not wrong. That is exactly what happens. The first snippet is most likely used so the JSONP request can be made/returned while waiting for the DOM to be ready. They are just making the best use of the time available.
The chances are the DOM will be ready by the time the AJAX request is complete, but to be on the safe side there is no harm wrapping it in a ready event handler (if the DOM is already ready, jQuery executes the callback immediately).
As you know, this library makes use of jQuery. Now, I know how much we all love jQuery, but what if I want to use another library, such as MooTools or Prototype, that redefines the $ character we all know and love? Using the second example you gave, it breaks the code, because the author is trying to use properties of the $ that likely no longer exist because $ != jQuery.
But in the Twitter snippet, $ is a local variable, an argument to the IIFE, and the jQuery object, which is far less likely to be overwritten, is passed in as that argument. So now, anyone wishing to use this function/library can go ahead and use it without fear that it will break if they combine it with another library using the $.
In summary, it's all about namespacing, to prevent another library from overwriting the $ and breaking your function definition.
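The same idea extends to jQuery.noConflict(), which hands the $ back to whichever library owned it before; a short sketch:
var jq = jQuery.noConflict(); // restore the previous owner of $, keep a reference to jQuery
(function ($) {
    // Inside this IIFE, $ is jQuery again, regardless of what owns the global $
    $(function () {
        $("#num-members").text(0);
    });
})(jq);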

$(window).load fires too quickly in IE with asynchronous js libs

I have a serious problem with asynchronously fetching some JS libraries and executing them in $(window).load in IE.
It all works in other browsers, of course.
The problem is that I'm doing something like this:
<script type="text/javascript">
    var scr1 = document.createElement('script');
    scr1.type = 'text/javascript';
    scr1.src = 'some_lib.js';
    $('BODY').prepend(scr1);
</script>
Just before </body>, and I use a $(window).load handler in the HTML above it to operate on some plugins from some_lib.js. But it all happens too fast in IE, probably because of that asynchronous lib inclusion, and I get an error that the method is not available for the element.
Is there any chance of modifying the $(window).load method so I could still use it the same way in every browser?
Any code that you have in the window.load() call must be placed in a function (called onLoad in this example).
Every time you have a script that you dynamically load, increment a counter. Also include something to decrement that counter...
scr1.onload = function() { counter--; onLoad(); };
Then in 'onLoad' have the first line...
if (counter > 0) return;
That means that onLoad will fire at window.load and after every script is loaded, but will only execute when it's all loaded.
It's scrappy, but it will solve your problem.
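Put together, the approach might look something like this (a sketch; counter and onLoad are the names assumed above, and older IE may need onreadystatechange instead of onload):
var counter = 0;

function onLoad() {
    if (counter > 0) return; // some scripts are still loading
    // ... everything that used to live in $(window).load() goes here ...
}

// For each dynamically inserted script, bump the counter first:
var scr1 = document.createElement('script');
scr1.type = 'text/javascript';
scr1.src = 'some_lib.js';
counter++;
scr1.onload = function () { counter--; onLoad(); };
$('BODY').prepend(scr1);

// Still attach to window load as well, for the non-dynamic case:
$(window).load(onLoad);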
You haven't really described the reason you need to load these libraries asynchronously. Third-party libraries often have "on script load" functionality that you can define before the script is loaded. If you need to load multiple libraries before you can execute your code, you may have to either:
1. fire up some code every time a library is loaded that tests whether all required libraries are loaded, and then fire off your code;
2. for every library, create a jQuery promise/deferred that gets resolved when that library is loaded, and use $.when(promises).done(function/code) to test and run the code whenever a particular set is loaded; or
3. rewrite to use RequireJS.
If these libraries are YOUR code, you may have to add start-up code to your libraries anyway; it might be a good time to learn RequireJS.
I wish I could recommend further, but learning the basics of RequireJS has always been on my todo list and hasn't been done; I just know of people here using it successfully. If that seems like too much trouble, I'd consider some variant of option 2. If jQuery isn't in the picture, eh... you may be stuck with option 1 or 3.
Edit:
Of course, that's not to say that jQuery has the only promise library; I just often recommend using promises in some form for these kinds of things.
Archer's technique looks interesting; I just don't know how reliable it is (it might be quite reliable, I would just like to see proof/documentation). You could combine it with option 2 quite well, if you want to short-cut execution for some things while leaving others to be dealt with asynchronously, and if those script onload methods really work as expected.
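For what it's worth, option 2 might look roughly like this (a sketch; loadScript is a hypothetical helper, not a jQuery API):
function loadScript(src) {
    var d = $.Deferred();
    var s = document.createElement('script');
    s.src = src;
    s.onload = function () { d.resolve(src); };
    s.onerror = function () { d.reject(src); };
    document.body.appendChild(s);
    return d.promise();
}

// Run code once a particular set of libraries has loaded:
$.when(loadScript('lib_one.js'), loadScript('lib_two.js')).done(function () {
    // both scripts have loaded and executed by the time we get here
});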

Patterns or techniques for ensuring script availability?

As a project I have been working on has grown, so has the frequency of situations where all scripts on a page are not available when other code tries to access them. Though this happens most often after code is updated (e.g. not cached) I've had it come up more and more in testing, when it never used to happen.
I've addressed this, partially, by using a function to wait for a module to become available (see this question) and this addresses the concern, mostly, but I'm not totally thrilled with the implementation, and am looking for a more industrial strength pattern to deal with this. These are possible solutions I've come up with:
1) Load scripts on demand with something like ensure - not ideal. Requires actual script name dependency information to be included in each script, not just module/object name, to do this. Still have to take some action before using a resource to ensure it's available.
2) Manage script loading order. If this would even work (e.g. I don't think that simply putting script A before script B guarantees it will be available since they can be loaded concurrently), it would be a pain, since you don't know a dependency until you've loaded the thing that depends on it. Would require a lot of work to set up on a site that has lots of pages that use different resources (and I have no intention of loading everything used everywhere on the site on every page).
3) Wait for everything to be loaded on a given page before allowing user interaction. Far from ideal for obvious reasons. Still doesn't address dependencies that happen in initialization code.
4) Expand upon my current solution. It currently works like this (pseudocode, but it shows the basic logic):
// Depends on Module2
Module1 = (function () {
    var self = {};
    self.initialized = false;

    // waitFor waits until a global named 'dependency' is available, then calls the callback
    self.init = function () {
        waitFor('Module2', function () {
            self.initialized = true;
        });
    };

    // waitForInitialization registers a callback to run once self.initialized === true
    // This function requires the dependency:
    self.func = self.waitForInitialization(function () {
        Module2.doStuff();
    });

    // UI-initiated function requires the dependency
    self.uiFunc = function () {
        if (!self.initialized) {
            showPleaseWaitDialog();
            self.waitForInitialization(function () {
                dismissPleaseWaitDialog();
                self.uiFuncImpl();
            });
        } else {
            self.uiFuncImpl();
        }
    };

    self.uiFuncImpl = function () {
        Module2.doStuff();
    };

    return self;
}());
I can think of ways to create a prototype that would deal with the dependency issue more transparently than my code above, and fully intend to do that if I have to, but is this truly the best solution? What do others do? What are considered best practices?
2) Script Load Order - scripts will always be executed in the order they are placed in the DOM, so while they might load concurrently they will execute in an orderly fashion (faced this same problem on a large project I worked on).
?) If script load order is not an ideal solution for you, you could look into the Promise model.
??) If Promises and Load Order won't work for you, you could listen for a namespaced event that each module fires when it's initialized; that way, if the object exists it can be used, and if not, its initialization can be listened for.
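A sketch of that last idea, assuming jQuery for the event plumbing (the event name is made up for illustration):
// Each module fires a namespaced event once it has finished initializing:
$(document).trigger('init.Module2');

// Consumers use the object if it already exists, otherwise listen for the event:
if (window.Module2) {
    window.Module2.doStuff();
} else {
    $(document).on('init.Module2', function () {
        window.Module2.doStuff();
    });
}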

Understanding when and how to use Require.JS

I've just begun to work with Require.JS and I'm a bit unclear on the appropriate cases in which it should be used, as well as the correct way to use it in those cases.
Here's how I currently have things set up with Require.JS. I have two functions, functionA() and functionB(). Both of these functions require an additional function, functionC() to work properly.
I only want to load functionC() when necessary, i.e. when functionA() or functionB() is going to be called. So I have the following files:
functionC.js
function functionC(){
    //do stuff
}
functionA.js
function functionA(){
    define(['functionC'], function(){
        //functionC() is loaded because it is listed as a dependency, so we can proceed
        //do some functionA() stuff
    });
}
functionB.js
function functionB(){
    define(['functionC'], function(){
        //functionC() is loaded because it is listed as a dependency, so we can proceed
        //do some functionB() stuff
    });
}
So, is this set up correctly? And if I end up calling both functionA() and functionB() on the same page, is extra work being done since they both load the functionC.js file? If so, is that a problem? And if so, is there a way to set it up so that they first check to see if functionC.js has been loaded yet, and only load it if it hasn't been? Finally, is this an appropriate use of Require.JS?
define() should only be used to define a module. For the above example, where a piece of code should be dynamically loaded, using require() is more appropriate:
functionA.js
function functionA(){
    require(['functionC'], function(functionC){
        //use the functionC argument in here to call functionC
    });
}
Some notes:
require([]) is asynchronous, so if the caller of functionA is expecting a return value from that function, there will likely be errors. It is best if functionA accepts a callback that is called when functionA is done with its work (see the sketch after these notes).
The above code will call require() for every call to functionA; however, after the first call, there is no penalty taken to load functionC.js, it is only loaded once. The first time require() gets called, it will load functionC.js, but the rest of the time, RequireJS knows it is already loaded, so it will call the function(functionC){} function without requesting functionC.js again.
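For example, a callback-taking functionA might look like this (a sketch; the onDone parameter name is made up):
function functionA(onDone) {
    require(['functionC'], function (functionC) {
        var result = functionC(); // do functionA's work using functionC
        onDone(result);           // hand the result back asynchronously
    });
}

// Caller:
functionA(function (result) {
    console.log('functionA finished with', result);
});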
You can find details about RequireJS and JavaScript modularity here: JavaScript modularity with RequireJS (from spaghetti code to ravioli code)

Is JavaScript multithreaded?

Here's my issue - I need to dynamically download several scripts using jQuery.getScript() and execute certain JavaScript code after all the scripts were loaded, so my plan was to do something like this:
function GetScripts(scripts, callback)
{
    var len = scripts.length;
    for (var i in scripts)
    {
        jQuery.getScript(scripts[i], function()
        {
            len--;
            // executing callback function if this is the last script that loaded
            if (len == 0)
                callback();
        });
    }
}
This will only work reliably if we assume that script.onload events for each script fire and execute sequentially and synchronously, so there would never be a situation when two or more of the event handlers would pass check for (len == 0) and execute callback method.
So my question - is that assumption correct and if not, what's the way to achieve what I am trying to do?
No, JavaScript is not multi-threaded. It is event driven and your assumption of the events firing sequentially (assuming they load sequentially) is what you will see. Your current implementation appears correct. I believe jQuery's .getScript() injects a new <script> tag, which should also force them to load in the correct order.
Currently JavaScript is not multithreaded, but things will change in the near future. There is a new thing in HTML5 called a Worker. It allows you to do some work in the background.
But it's currently not supported by all browsers.
The JavaScript (ECMAScript) specification does not define any threading or synchronization mechanisms.
Moreover, the JavaScript engines in our browsers are deliberately single-threaded, in part because allowing more than one UI thread to operate concurrently would open an enormous can of worms. So your assumption and implementation are correct.
As a sidenote, another commenter alluded to the fact that any JavaScript engine vendor could add threading and synchronization features, or a vendor could enable users to implement those features themselves, as described in this article: Multi-threaded JavaScript?
JavaScript is absolutely not multithreaded - you have a guarantee that any handler you use will not be interrupted by another event. Any other events, like mouse clicks, XMLHttpRequest returns, and timers will queue up while your code is executing, and run one after another.
No, all the browsers give you only one thread for JavaScript.
To be clear, the browser JS implementation is not multithreaded.
The language, JS, can be multi-threaded.
The question does not apply here however.
What applies is that getScript() is asynchronous (it returns immediately and gets queued); however, the browser will execute DOM-attached <script> content sequentially, so your dependent JS code will see them loaded sequentially. This is a browser feature and is not dependent on JS threading or the getScript() call.
If getScript() retrieved scripts with XMLHttpRequest, setTimeout(), websockets, or any other async call, then your scripts would not be guaranteed to execute in order. However, your callback would still get called after all scripts execute, since the execution context of your 'len' variable is in a closure which persists its context through asynchronous invocations of your function.
JS in general is single-threaded. However, HTML5 Web Workers introduce multi-threading. Read more at http://www.html5rocks.com/en/tutorials/workers/basics/
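A minimal Worker sketch (worker.js is a hypothetical file; workers get no DOM access):
// main.js
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    console.log('sum from worker:', e.data);
};
worker.postMessage(10000000);

// worker.js
self.onmessage = function (e) {
    var sum = 0;
    for (var i = 0; i < e.data; i++) sum += i;
    self.postMessage(sum); // runs off the main thread
};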
Thought it might be interesting to try this out with a "forced", delayed script delivery ...
- added two available scripts from Google
- added delayjs.php as the 2nd array element; delayjs.php sleeps for 5 seconds before delivering an empty js object
- added a callback that "verifies" the existence of the expected objects from the script files
- added a few js commands that are executed on the line after the GetScripts() call, to "test" sequential js commands
The result with the script load is as expected; the callback is triggered only after the last script has loaded. What surprised me was that the js commands that followed the GetScripts() call triggered without the need to wait for the last script to load. I was under the impression that no js commands would be executed while the browser was waiting on a js script to load ...
var scripts = [];
scripts.push('http://ajax.googleapis.com/ajax/libs/prototype/1.6.1.0/prototype.js');
scripts.push('http://localhost/delayjs.php');
scripts.push('http://ajax.googleapis.com/ajax/libs/scriptaculous/1.8.3/scriptaculous.js');

function logem() {
    console.log(typeof Prototype);
    console.log(typeof Scriptaculous);
    console.log(typeof delayedjs);
}

GetScripts(scripts, logem);
console.log('Try to do something before GetScripts finishes.\n');
$('#testdiv').text('test content');
<?php
sleep(5);
echo 'var delayedjs = {};';
You can probably get some kind of multithreadedness if you create a number of frames in an HTML document, and run a script in each of them, each calling a function in the main frame that should make sense of the results of those functions.
