I'm currently doing some optimization work on a large web project. I'm already doing JavaScript file combining, minification and compression. But I'm confused on one point.
For a number of non-technical reasons, my users are about 50% each on IE7 and IE8. After doing some research, I'm getting the impression that IE7 loads JavaScript files sequentially while IE8 loads them in parallel. I understand that going forward this will not be an issue with more modern browsers (IE9+, FF, Chrome, etc.).
Is this an accurate statement? If yes, then what is best practice for loading the files?
That statement is correct, but you should remember that even modern browsers will make only a limited number of connections to the same server. So when your page, scripts, CSS and images are all on the same server, the browser may load only 2 or 4 of those at a time. Therefore it may be a good idea to serve scripts from a subdomain or a different domain to trick the browser into loading the scripts alongside the images.
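As a rough illustration (the hostnames here are made up), splitting assets across hostnames gives the browser extra connections to work with:

    <!-- page served from www.example.com; static assets come from other hosts -->
    <script src="http://static.example.com/js/site.js"></script>
    <img src="http://images.example.com/banner.png" alt="banner">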
An even simpler solution is to merge all scripts into one script. You can do this on the fly or cache the result. You can even minify the scripts (which means comments and whitespace are stripped and variable names are shortened). You shouldn't minify and combine the original source files themselves, but you can cache the combined/minified output so it doesn't need to be regenerated on every request.
If you do this, you reduce traffic, and your browser will only need one request for the file, eliminating the overhead of multiple sequential requests.
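A minimal sketch of the combine-and-cache idea in Node.js; the file names are made up and there is no minification step here, just concatenation:

    // combine.js - concatenate the scripts once, reuse the cached result afterwards
    var fs = require('fs');

    var files = ['jquery.plugins.js', 'app.js', 'tracking.js']; // hypothetical sources
    var cached = null;

    function getCombinedScript() {
        if (cached === null) {
            cached = files.map(function (name) {
                return fs.readFileSync(name, 'utf8');
            }).join(';\n'); // the ';' guards against files missing a trailing semicolon
        }
        return cached;
    }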
See this MSDN blog article which shows some other tricks for script loading.
So I'm injecting some HTML into a page (an extension written with angular.js), and on pages with Google Maps frames, like Airbnb, loading slows down. I don't have any idea how I'm affecting that. Any ideas?
When you load JavaScript from a third party you should do it asynchronously. You might want to load your own scripts asynchronously too, but for this article let's focus on third parties.
There are two reasons for this:
If the third-party goes down or is slow, your page won't be held up trying to load that resource.
It can speed up page loads.
At Wufoo, we just switched over to an asynchronous embed snippet. Users who build forms with Wufoo and want to embed them on their site are now recommended to use it. We did it for exactly the reasons above. It's the responsible thing to do for a web service that asks people to link to resources on that service's site.
There is a little terminology involved here that will help us understand the umbrella "asynchronous" term.
"Parser blocking" - The browser reads your HTML and when it comes to a it downloads that entire resource before moving on with the parsing. This definitely slows down page loads, especially if the script is in the head or above any other visual elements. This is true in older browsers as well as modern browsers if you don't use the async attribute (more on that later). From the MDN docs: "In older browsers that don't support the async attribute, parser-inserted scripts block the parser..."
To prevent problematic parser blocking, scripts can be "script inserted" (i.e. inserted into the page with JavaScript), which forces them to execute asynchronously (except in Opera or pre-4.0 Firefox).
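A typical script-insertion snippet looks something like this (the URL is just a placeholder for whatever third-party script you're loading):

    (function () {
        var s = document.createElement('script');
        s.src = 'https://thirdparty.example.com/widget.js'; // placeholder URL
        s.async = true; // harmless where script-inserted scripts are already async
        var first = document.getElementsByTagName('script')[0];
        first.parentNode.insertBefore(s, first);
    })();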
"Resource blocking" - While a script is being downloaded, it can prevent other resources from downloading at the same time as it. IE 6 and 7 do this, only allowing one script to be downloaded at a time and nothing else. IE 8 and Safari 4 allow multiple scripts to download in parallel, but block any other resources (reference).
Ideally we fight against both of these problems and speed up page loading, both actual and perceived.
https://css-tricks.com/thinking-async/
First some backstory:
We have a website that includes a Google Map in the usual way:
<script src="https://maps.googleapis.com/maps/api/js?v=...."></script>
Then there is some of our JavaScript code that initializes the map. Suddenly, yesterday, pages started to load but then froze up entirely. In Chrome this resulted in having to force quit and restart. Firefox was smarter and allowed the user to stop script execution.
Now after some debugging, I found out that a previous developer had included the experimental version of the Google Maps API:
https://maps.googleapis.com/maps/api/js?v=3.exp
So it's likely that something has changed on Google's servers (which is completely understandable). This has uncovered a bug on our end, and both in combination caused the script to hang and freeze up the website and the browser.
Now ok, bug is found and fixed, no big harm done.
And now the actual question:
Is it possible to somehow sandbox these external script references so that they cannot crash my main site. I am loading a decent amount of external javascript files (tracking, analytics, maps, social) from their own servers.
However such code could change at all times and could have bugs that freeze my site. How can I protect my site? Is there a way to maybe define maximum allowable execution time?
I'm open to all kinds of suggestions.
It actually doesn't matter where the scripts are coming from, whether an external source or your own server. Either way they are run in the client's browser, and that makes it quite difficult to achieve the sandbox behaviour you want.
You can get a sandbox inside your DOM by using an iframe with the sandbox attribute. This way the content of the iframe is independent of the DOM of your actual website, and you can include scripts independently as well. This is beneficial mainly for security; I'm not sure how it works out for overall stability when a script has a bug like an endless loop, but imho it's worth a try.
For further explanation see: https://www.html5rocks.com/en/tutorials/security/sandboxed-iframes/
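For example, a sandboxed iframe that is allowed to run scripts but gets a unique origin, so its code cannot reach your page's DOM (the widget URL is made up):

    <iframe sandbox="allow-scripts"
            src="https://widgets.example.com/map-widget.html">
    </iframe>

Note that this isolates the third-party code rather than limiting its execution time; a script stuck in an endless loop inside the iframe can still burn CPU, it just can't take your own DOM down with it.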
Also, I know CSS and image files can be downloaded in parallel. But can JavaScript files be downloaded in parallel? Thanks.
When the HTML parser sees a script tag, barring your using a couple of special attributes, it comes to a screeching halt, downloads the JavaScript file, and then hands the contents off to the JavaScript interpreter. Consequently, script files cannot be downloaded in parallel either with each other or with the main HTML...unless you use the defer or async attributes and the browser supports them. Details in this other answer here on StackOverflow.
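For example (the paths are placeholders), both of these let the parser carry on and let the downloads overlap:

    <script src="/js/analytics.js" async></script>  <!-- runs as soon as it arrives; order not guaranteed -->
    <script src="/js/app.js" defer></script>        <!-- runs after parsing finishes, in document order -->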
Note that even if you have multiple resources that can be downloaded in parallel, there's no guarantee that they will be. Browsers (and servers) put limits on the number of simultaneous connections between the same two endpoints. With modern browsers the limit is usually at least four (up from two), but browsers may dial things down on dial-up connections and on mobile devices; and of course, they're entirely free to use only a single connection. It's implementation-specific.
Scripts are loaded in parallel in modern browsers and in IE8. Firefox and Safari will download as many as six scripts at once. IE8, however, seems not to have a limit set; my tests showed as many as 18 scripts loading at once.
However, whether it's one script or six, everything else is blocked. So the browser will not load one script and one style sheet at the same time.
You can get a good idea of the flow by looking at Firebug's Net tab. The bars you see on the right form a waterfall chart, which gives you a reasonably accurate picture of when things load and how long they take.
You can learn more about page loading in this presentation: Optimize Your Website to Load Faster. The material on resource loading starts on slide 33.
Having read up recently on Yahoo's web optimisation tips and used YSlow, I've implemented a few of their ideas on one of my sites, http://www.gwynfryncottages.com; you can see the combined file here: http://www.gwynfryncottages.com/js/gw-custom.js.
This technique seems to work perfectly on most occasions and really does speed up the site, but I do notice a significantly higher number of errors where the scripts don't load or don't load completely while I'm working on the site. So, three questions:
is combining scripts this way a good idea at all in terms of reliability?
is there any way to measure the number of errors i.e. the number of times the script failed to load?
is there any way to 'pre-load' the javascript or ensure that the number of loading errors is reduced?
Of course it's good. You will not only decrease HTTP requests but also cut down on delays in downloading other resources.
Try using Minify: http://code.google.com/p/minify/ - I've been using it and I have no complaints.
I can assure you that combining files WON'T cause any errors, as a combined script is the same as 10 non-combined scripts; they all load in the same way (in order, left to right, top to bottom). Double-check the way you're combining them.
Script execution stops at serious errors. If you have multiple scripts, the others will still run; if you packed everything into one big file, a lot more code won't get executed. So combining scripts is bad for reliability, but can be good for other purposes (mainly load time).
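You can see that behaviour with a trivial example (nothing here is from your site, it's just an illustration):

    // imagine this is the single combined file
    console.log('part A runs');
    missingFunction();                // uncaught ReferenceError thrown here
    console.log('part B never runs'); // skipped: execution of this file stops at the error

    // if part A and part B were two separate script files, part B would still run,
    // because each script tag is executed independently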
All browsers have some sort of javascript console which will show you the number of errors. Most have some sort of developer tool too (Firebug in Firefox, Dragonfly in Opera etc).
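If you want to count failures for real visitors rather than just watch your own console, here is one rough sketch (the /log-error endpoint is hypothetical): install a window.onerror handler before anything else runs to catch runtime errors, and put an onerror handler on the script tag itself to catch failed loads:

    <script>
        // install the handler before any other script runs
        window.onerror = function (message, url, line) {
            new Image().src = '/log-error?type=runtime&m=' + encodeURIComponent(message);
            return false; // let the browser still show the error in its console
        };
    </script>
    <script src="/js/gw-custom.js"
            onerror="new Image().src = '/log-error?type=load-failed';"></script>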
I'm not sure what you mean by preloading. Since a javascript file can affect the rest of the page in various ways, browsers will fully load and execute a script tag before continuing to parse the page (which is why scripts can slow page loading down so much).
I can't see the load function in your code which is being called on your body tag! I'd try to steer clear of adding JS to your HTML file; it can be attached dynamically, which will probably cause you less hassle along the way as well as being easier to maintain.
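For example, instead of <body onload="load()"> you can attach the handler from the script itself once load() is defined (the function name is just taken from your page):

    // at the end of gw-custom.js, after load() has been defined
    if (window.addEventListener) {
        window.addEventListener('load', load, false);
    } else if (window.attachEvent) {
        window.attachEvent('onload', load); // older IE
    }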
I'd say the thing to look out for is making sure you're not trying to call something before it's defined (maybe your separate JS files were defined in a different order from how they appear in the single JS file).
Firebug for firefox is a good development tool, if you've not found it already. Webkit, Opera and IE also have various other dev tools.
Combining JavaScript files is always the best way to go, unless it's not logically sane to do so (downloading jQuery from Google Code instead of hosting it yourself is a good example).
I always combine as many files as I can (JavaScript, CSS, images (CSS sprites), etc.), also in development, and I never experience any problems. It's way faster thanks to fewer HTTP connections, which should not be underestimated.
Regarding counting the errors, I don't exactly see what you mean. But debugging tools like the one built into Google Chrome, or Firebug for Firefox, are good tools for debugging your JavaScript code and show lists of the errors that occur.
As for preloading: yes, it can be done, though it gets messy and illogical. I can't think of any case where going to the trouble of preloading the JavaScript would be a better solution than simply making it work right out of the box, with no error checking needed.
About the error you are experiencing, the only one that my Chrome points out is this:
Uncaught ReferenceError: load is not defined
... which seems to be the onload handler load() set on line 55 of your HTML document, where the body tag starts.
I was fine tuning a page that is heavy on jquery and stumbled across this website:
http://www.componenthouse.com/extra/jquery-analysis.html
When I click on the "Run Test" button the first time, the numbers are way higher than subsequent clicks. Is this because JS is cached by the browser? Can someone explain how this works internally? Can a user choose to not cache the JS?
External JavaScript files are cached and, of course, an HTML page containing script tags can be cached too.
What you see may be a result of HTML caching or some browser optimization. You should try different browsers, closing and re-opening the browser, and clearing the browser's cache.
The numbers are (significantly) different for me on the second time in Firefox 3.5. OTOH, they are fairly consistent(ly slow) in IE 8. Firefox 3.5's JavaScript interpreter compiles the JS to executable code. So it does make sense that the first time is slower; the code hasn't been JITted yet.
The performance boost you're seeing is likely due to your javascript interpreter. Most newer web browsers use a JIT-compiling javascript engine so code paths taken multiple times can be optimized.
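A rough way to see the warm-up effect yourself (the numbers will vary wildly by browser and machine):

    function sum(n) {
        var total = 0;
        for (var i = 0; i < n; i++) {
            total += i;
        }
        return total;
    }

    console.time('first run');
    sum(5000000);
    console.timeEnd('first run');

    console.time('second run'); // often faster: the hot loop has been JIT-compiled by now
    sum(5000000);
    console.timeEnd('second run');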
Read this blog post on how Safari's javascript engine achieved many of its speed-ups.
Whether or not JavaScript code is cached, execution performance isn't affected. What you are seeing is jQuery caching the results for the selector queries so they don't take as long on subsequent runs.