Is there a way to load javascript scripts concurrently? - javascript

Probably not. But it would still be nice to have. It's kind of hard to believe that modern standards don't support it.
Update: I would like to load scripts concurrently, as opposed to asynchronous script execution. So that if there are 10 scripts to load (9 small, 1 big), the big script won't stall the download of the smaller ones.
Update 2: I am loading the scripts by adding script DOM elements via JavaScript.

In HTML5, you have the async attribute.
HTML5's async Script Attribute

If you're referencing them via <script> tags in the delivered HTML, then it's up to the user agent to decide how to load the resources, but every modern browser can fetch multiple resources concurrently. Interestingly, they may choose not to fetch multiple JavaScript files at once; this Google best-practices document alludes to why, though the exact behaviour will depend on the browser.
If you're programmatically loading them via logic in other JavaScript files, then probably not: JavaScript execution is inherently single-threaded, so no two scripts can execute at once to fire off the requests. (The fetching itself, however, is done by the browser, and dynamically inserted scripts can download in parallel once their elements are in the document.)
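Loading scripts the way the question's second update describes can be sketched like this. Note that loadScripts is a hypothetical helper, not part of any library; modern browsers fetch dynamically inserted script elements in parallel, and with async set, each one executes as soon as it arrives, so one big file doesn't hold up the small ones:

```javascript
// Hypothetical helper: inserts one <script> element per URL.
// The browser fetches them in parallel; async = true means each
// script executes in arrival order, not insertion order.
function loadScripts(urls, doc) {
  doc = doc || document; // allow passing a document (useful for testing)
  return urls.map(function (url) {
    var script = doc.createElement('script');
    script.src = url;
    script.async = true; // don't let a big script block smaller ones
    doc.head.appendChild(script);
    return script;
  });
}
```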

Related

Javascript - inline vs external script - what's the difference?

I have a few snippets of JavaScript scattered about my pages - many are contained in my own .js files, but some of the stuff I've found online sits directly on the page.
I'm not too familiar with how JavaScript interacts with a page - is there a difference between adding the script inline and adding a reference to the external file?
There is little difference between using one or the other. The real difference comes from the advantages and disadvantages each one has.
Inline scripts
Are loaded in the same page, so no extra request needs to be triggered.
Are executed immediately.
The async and defer attributes have no effect.
Can be helpful when you are using server-side dynamic rendering.
External scripts
Give better separation of concerns and maintainability.
The async and defer attributes do have an effect, so if these attributes are present the script will change its default behavior. This is not possible with inline scripts.
Once an external script is downloaded, the browser stores it in the cache, so if another page references it no additional download is required.
Can be used to load client code on demand and reduce overall download time and size.
External script files
Much easier to analyse, so you can read and debug them more efficiently. This makes life much easier for us as programmers.
Download time is reduced because the external file is cached, so it does not have to be downloaded again with every page.
Instead of writing the same script numerous times, an external file can be referenced and executed anywhere in the code.
External files can decrease page-rendering speed, as the browser has to stop parsing and download the external file. This adds a network round trip, which slows everything down. Also, because external files are cached, it can be tough to get clients to pick them up again after they have been updated.
Inline code
Inline code reduces the number of HTTP requests, improving the performance of the webpage. This is because the code is loaded in the same page, so no extra request is needed.
Inline script is executed immediately.
However, inline code is much harder to read and analyse, as it just looks like a lump of code chucked together. It is hard work having to find the problem when debugging, making life as a programmer tough.
Hope this helps you understand a bit more :)
Looking at the <script> tag documentation, you can see that you can use the async and defer attributes only with external scripts, which might have an effect on scripts that do not use event listeners as entry points.
Other than that, inlining renders the browser unable to cache the script on its own, so if you use the same script on different pages, the browser cache cannot kick in. So it might have an effect on performance and/or bandwidth usage.
And, of course, splitting code up into files is one way of organizing it.
Generally there is no difference, as indicated in the comments. But if the snippet is embedded in the middle of the HTML in the page and is not a function, it is executed immediately. Such script segments may differ in behavior when moved to a separate JS file if enough care is not taken.
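The timing difference can be made concrete with a sketch (the status id and initStatus name are made up for illustration). An inline snippet placed after an element can safely touch that element while the page is still being parsed; moved verbatim into a deferred external file, the same assumption is better made explicit:

```javascript
// Inline, mid-page, this works because the element above the script
// already exists while the document is still being parsed:
//   <div id="status"></div>
//   <script>document.getElementById('status').textContent = 'ready';</script>
//
// Moved to an external file, wrap the same code in a function and call
// it at a known-safe time (e.g. on DOMContentLoaded, or via defer):
function initStatus(doc) {
  var el = doc.getElementById('status');
  if (el) el.textContent = 'ready'; // guard: element may not exist yet
  return el;
}
```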

Bind JavaScript scripts to their respective web pages

I'm working on a large web application which has about 10K lines of JavaScript code (not counting third-party libraries). To speed up page loading, it has been decided to automatically concatenate every script file into one large script that gets loaded (and cached) on the client the first time the application is accessed. This poses a problem, because previously each page had its own script which contained all the JavaScript it required in (essentially) the same function.
Now if an error occurs in one of the scripts, it is really hard to tell where it came from, since everything is rolled into the same script which is added to each page, as opposed to the explicit script declarations in each page that were used before.
Is there a JavaScript pattern for solving this issue? I'm thinking of something similar to the AngularJS modules that can be bound to certain containers inside a web application's pages.
However, I would like a simple, custom solution, as we're short on time and can't integrate a framework into our application. It should apply certain scripts (modules) only to their respective pages, and it should allow developers to explicitly declare any other scripts (modules) that certain scripts rely on.
Also, implementing an exception handling system to notify users (in the Firebug console, for example) from which module an exception originated (if the page's module relies on other modules) would be great.
Is there a common means of solving such issues in JavaScript (without relying on frameworks)?
A possible solution to your problem could be the use of JS source maps. Many concat/minify/uglify tools support this feature directly, and most modern browsers are capable of interpreting them.
You would still serve a single JS file containing all your code (even if this is most likely not the best way to handle such a large amount of code - that depends largely on your overall architecture).
But your browser's developer tools are now able to show you the original file name and line number of an error/console output/etc.
You would most probably not serve the source-map file in production.
A good starting point for JS source maps is this wiki.
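The per-page module binding the question asks for can also be done without a framework. A minimal sketch (all names here - defineModule, runModule, the module names in the usage - are made up): modules declare their dependencies, each page initialises only its own module, and exceptions are reported with the module they came from, as requested:

```javascript
// Minimal custom module registry (illustrative, not a framework).
var modules = {};

function defineModule(name, deps, init) {
  modules[name] = { deps: deps, init: init };
}

function runModule(name, seen) {
  seen = seen || {};
  if (seen[name]) return; // each module is initialised at most once
  seen[name] = true;
  var mod = modules[name];
  if (!mod) throw new Error('Unknown module: ' + name);
  // Initialise declared dependencies first.
  mod.deps.forEach(function (dep) { runModule(dep, seen); });
  try {
    mod.init();
  } catch (e) {
    // Tell the console which module the exception originated from.
    console.error('Error in module "' + name + '":', e.message);
    throw e;
  }
}
```

All defineModule calls can live in the single concatenated file; each page then runs only its own module via a tiny inline snippet, e.g. runModule('orders-page').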

Why can't I load external scripts from outside servers while Modernizr.load can?

I keep having this doubt in my mind. I want to test whether a URL exists before loading the script from that URL, but the way I'm trying to do it fails. I'm using XMLHttpRequest, and as many know, when you use this method to GET a file from a server that is not the same one as the script that executes the GET, the request is not allowed by Access-Control-Allow-Origin.
So how come the Modernizr.load() method can load the scripts, while I cannot even see if there's actually something there?
Because Modernizr.load(), like #dm03514 mentions, loads the script not through XMLHttpRequest but by inserting a <script> tag, which doesn't have the cross-domain restriction. It then tries to check whether the script loaded correctly, but that's not an easy task and it may not be possible in all browsers. For more detail, see this compilation of different browsers' support for the various options available for checking whether scripts/CSS loaded successfully: http://pieisgood.org/test/script-link-events/
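The script-element approach can be sketched like this (loadScript is a hypothetical helper; onload/onerror fire in modern browsers, while old versions of IE needed onreadystatechange instead, which is part of why cross-browser success detection is hard):

```javascript
// Load a (possibly cross-origin) script via a <script> element and
// detect success or failure with onload/onerror.
function loadScript(url, onSuccess, onFailure, doc) {
  doc = doc || document;
  var script = doc.createElement('script');
  script.src = url;            // <script src> has no same-origin restriction
  script.onload = onSuccess;   // fetched and executed successfully
  script.onerror = onFailure;  // network/HTTP failure (no readable details)
  doc.head.appendChild(script);
  return script;
}
```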
As for why XMLHttpRequest fails, you can read more about cross-domain restrictions at MDN: https://developer.mozilla.org/en-US/docs/HTTP_access_control
Some motivations for using script loaders are:
Loading scripts based on conditions, like yepnope and YUI do.
Loading scripts asynchronously for performance reasons (script tags block the rendering of the page).
Dependency injection (loading resources that other scripts need; this is what requirejs does).
Loading scripts when certain events happen (load new functionality when a user clicks on a tab).
Also, when you use script loaders, you usually load everything through them, including your application code, so that your application code has access to all its dependencies. The require.js model (google "AMD modules") is a great way of organizing your app. It allows you to write small modules that do specific tasks and reuse them, instead of one big file that does everything.
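The AMD shape that require.js popularised can be illustrated with a toy. This is not the real require.js implementation (which also fetches each module file asynchronously), and the amd* names are made up so the sketch does not clash with Node's own require:

```javascript
// Toy illustration of AMD-style modules: define(name, deps, factory)
// registers a module; requiring it resolves dependencies first and
// caches each factory's result.
var amdRegistry = {};

function amdDefine(name, deps, factory) {
  amdRegistry[name] = { deps: deps, factory: factory, exports: undefined };
}

function amdResolve(name) {
  var mod = amdRegistry[name];
  if (!mod) throw new Error('Unknown module: ' + name);
  if (mod.exports === undefined) {
    var args = mod.deps.map(function (d) { return amdResolve(d); });
    mod.exports = mod.factory.apply(null, args); // cache the result
  }
  return mod.exports;
}

function amdRequire(deps, callback) {
  callback.apply(null, deps.map(function (d) { return amdResolve(d); }));
}
```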

Should javascript code always be loaded in the head of an html document?

Is there a blanket rule in terms of how JavaScript should be loaded? I'm seeing people say that it should go at the end of the page now.
Thoughts?
The thought behind putting it at the end of the document is to ensure that the entire contents of the document have been downloaded prior to any attempts to reference elements in it. If the JavaScript were loaded first, it is possible that code could go looking for elements before the document was ready.
jQuery addresses this issue with the ready function:
Binds a function to be executed whenever the DOM is ready to be traversed and manipulated. This is probably the most important function included in the event module, as it can greatly improve the response times of your web applications.
There are also performance considerations that suggest including JavaScript files at the bottom of the page - this is because downloading a JavaScript file may block all other HTTP requests (for images, CSS, etc.) until it is complete.
Well, there are two cases (at least), depending on what you want to achieve. If you need the functionality incorporated in the script or scripts - function libraries, for example - to be available before or during page loading, then you should load the JavaScript code in the head tag.
If you need to do something that requires resources which only become available after the page is loaded (DOM resources, that is), then you should load the script(s) at the bottom of the page (this can also be solved using the onDOMReady events that are already available in most JavaScript frameworks).
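The onDOMReady approach can also be done without a framework. A sketch (onDomReady is a made-up helper name), equivalent in spirit to jQuery's $(document).ready():

```javascript
// Run the callback once the DOM is parsed, wherever the <script>
// tag happens to sit in the page.
function onDomReady(callback, doc) {
  doc = doc || document;
  if (doc.readyState === 'loading') {
    doc.addEventListener('DOMContentLoaded', callback);
  } else {
    callback(); // DOM already parsed, run right away
  }
}
```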
There could also be performance issues that dictate your decision. For instance, there are situations where loading the script(s) in the head tag would slow down page rendering; if you can provide basic functionality with the page rendered before the scripts are fully functional, then again the script(s) should be loaded at the bottom.
It is more or less a decision based on what you need to do.
If you need to use JavaScript to run some special logic while the document is still loading, you can put minimal code at the top. But practically all usage of JavaScript in today's rich web applications is needed after the document is ready, hence the <script> tags can safely be put after </body>.
Here is a very relevant read: http://developer.yahoo.com/performance/rules.html
There is no rule: if you want the JS to be loaded first, you put it in the head. If you want it to be the last thing the browser loads, you put it at the bottom.
For websites that only become usable with JS, you probably want to put it at the top, in the head. For websites that degrade nicely, you'll follow Yahoo!'s recommendations and let the page render, then load the script.
N.B.: this has nothing to do with executing the script before or after the DOM is loaded. That issue is not a real one; most of the time you use onload or a $.ready equivalent. This is about when the file is actually loaded, not when it is executed.
I like to minify my code into a single file when I deploy to production. So, during development, I create a single file for each JavaScript class and load each one in the head (after the CSS files). Since I wait for an event from the browser indicating the DOM is ready, I place my JS files in the head of the only HTML/JSP page.

Speed up web site loading

I am looking for the best way to speed up the load time of my js.
The problem is that I am working with a very large site that uses the jQuery framework. Because the site is also loading Facebook Connect, AddThis sharing, Google Analytics and other tracking code, jQuery is delayed by a few seconds, certain elements like the calendar just appear suddenly, and my users are complaining that things take too long.
I did a test in Google Chrome and the average load time is 4 s, which is too much.
I am already doing minification, and jQuery/jQuery UI are being loaded from Google. What's the best way to approach this?
Make fewer HTTP calls by combining images, scripts and CSS; using a Content Delivery Network for your static images and CSS might also help!
You are not likely to be able to do much about the load time of the external scripts, so what you can do is change the order in which things happen in the page, so that the external scripts are loaded after you have initialised the page.
Scripts are loaded and executed in a serial manner, so if you change their order in the source code, you also change the order they are loaded.
Instead of using the ready event in jQuery, you can put your initialising code inline in the page, after all the content but before the closing body tag. That way the elements you want to access are loaded by the time the script runs, and you can put external scripts below the initialising code to make them load afterwards.
Small technical changes (such as serving the JS from Google, minifying, etc.) will only get you so far.
Seems you simply have lots of dynamic stuff going on in your page. Have you thought of an asynchronous way of building your content? One option might be to place placeholders instead of the external content and load it asynchronously, so that when all the scripts are loaded and ready, all you need to do is throw the markup into the placeholders.
This will create a better user experience, instead of the user waiting 10 seconds for the entire page, it will start loading incrementally after 2 seconds (and still fully load after 10).
In addition to Yuval's answer, here are some options that might or might not bring you a speed gain:
The load time of external libraries is something beyond your control. Try to include them as late as possible, or better still, dynamically after the page has loaded. This way your page won't stall if Google Analytics or Facebook has another hiccup.
It is not necessarily faster to load jQuery from Google. Consider putting jQuery, jQuery UI and as much of your own JS as reasonable into a single file, minify and gzip it, and let the server serve the gzipped version where possible. Note here that the gain in speed depends largely on what your users cache and where they cache it. If they already have jQuery from Google in their cache, this technique might make the page load slower.
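The first option, loading third-party scripts dynamically after the page has loaded, might look like this (a sketch; loadAfterPageLoad is a made-up helper and the URLs are placeholders):

```javascript
// Delay non-critical third-party scripts (analytics, widgets) until
// after the page's own load event, so a slow third party cannot
// stall the initial render.
function loadAfterPageLoad(urls, win) {
  win = win || window;
  win.addEventListener('load', function () {
    urls.forEach(function (url) {
      var script = win.document.createElement('script');
      script.src = url;
      script.async = true;
      win.document.head.appendChild(script); // fetched after first render
    });
  });
}
```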
The bottom line is that after some optimization you're left with experimenting. You must find out what your average user has in her cache, whether the page is accessed directly via deep links, or whether you can smuggle some JS or CSS (or even images) into her cache via a previous "landing page".
Make sure you deliver your content in gzip/deflate-compressed form. Combine multiple JavaScript files into one file, which helps to reduce the number of HTTP requests.
P.S. Here's a test tool to check if compression is configured:
http://www.gidnetwork.com/tools/gzip-test.php
