I am looking for the best way to speed up the load time of my js.
The problem is that I am working with a very large site that uses the jQuery framework. Because the site is also loading Facebook Connect, AddThis sharing, Google Analytics and another tracking script, jQuery is delayed a few seconds, certain elements like the calendar only appear after a noticeable delay, and my users are complaining that things take too long.
I did a test in Google Chrome and the average load time is 4s, which is too much.
I am already doing minification, and jQuery UI / jQuery are being loaded from Google. What's the best way to approach this?
Making fewer HTTP calls by combining images, scripts and CSS, and also using a Content Delivery Network for your static images and CSS, might help!
You are not likely to be able to do much more about the load time of the external scripts, so what you can do is to change the order that things happen in the page so that the external scripts are loaded after you have initialised the page.
Scripts are loaded and executed in a serial manner, so if you change their order in the source code, you also change the order they are loaded.
Instead of using the ready event in jQuery, you can put your initialising code inline in the page, after all the content but before the body closing tag. That way the elements that you want to access are loaded when the script runs, and you can put external scripts below the initialising code to make them load after.
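A minimal sketch of that layout (assuming jQuery and jQuery UI are still loaded in the head; the element ID, widget call and third-party URLs below are only placeholders):

<body>
    <div id="calendar"></div>

    <!-- your own initialisation runs as soon as the markup above exists -->
    <script>
        $('#calendar').datepicker(); // hypothetical widget initialisation
    </script>

    <!-- third-party scripts come last, so they can no longer delay your UI -->
    <script src="//connect.facebook.net/en_US/all.js"></script>
    <script src="//www.google-analytics.com/ga.js"></script>
</body>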
Small technical changes (such as serving the JS from Google, minifying, etc.) will only get you so far.
It seems you simply have lots of dynamic stuff going on in your page. Have you thought of building your content asynchronously? One option might be to put placeholders in place of the external content and load it asynchronously, so that when all the scripts are loaded and ready, all you need to do is throw the markup into the placeholder.
This will create a better user experience: instead of the user waiting 10 seconds for the entire page, it will start loading incrementally after 2 seconds (and still fully load after 10).
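For example (a rough sketch; the URL and element IDs are invented for illustration):

// in the markup: <div id="calendar-placeholder">Loading calendar...</div>
$(document).ready(function () {
    // fetch the heavy markup asynchronously and drop it into the placeholder
    $('#calendar-placeholder').load('/widgets/calendar.html', function () {
        // initialise the widget once the markup is actually in the page
        $('#calendar-placeholder .calendar').datepicker();
    });
});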
In addition to Yuval's answer some options that might or might not bring you a speed gain:
the load time of external libraries is something beyond your control. Try to include them as late as possible, or better still, dynamically after the page has loaded (see the sketch after these options). This way your page won't stall if Google Analytics or Facebook have another hiccup.
It is not necessarily faster to load jQuery from Google. Consider putting jQuery, jQuery UI and as many of your own JS as reasonable in a single file, minify and gzip it and let the server serve the gzipped version where possible. Note here, that the gain in speed depends largely on what your users cache and where they cache it. If they already have jQuery from Google in their cache, this technique might make page load slower.
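A common way to defer a third-party script until after the load event looks roughly like this (the URL is only an example, and a real page would combine this with any existing onload handler):

// inject Google Analytics (or any third-party script) only after the page has finished loading
window.onload = function () {
    var script = document.createElement('script');
    script.async = true;
    script.src = '//www.google-analytics.com/ga.js';
    document.getElementsByTagName('head')[0].appendChild(script);
};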
The bottom line is that after some optimization you're left to experiment. You must find out what your average user has in her cache, whether the page is accessed directly via deep links, or whether you can smuggle some JS or CSS (or even images) into her cache via a previous "landing page".
Make sure you deliver your content in gzip/deflate compressed format. Combine multiple javascript files into one file, which helps to reduce the number of HTTP requests.
P.S. Here's a test tool to check if compression is configured:
http://www.gidnetwork.com/tools/gzip-test.php
I have noticed in chrome that if I load an image as a base64 string and then scroll through that part of the page it will slow down.
I have also noticed that when I navigate out of a tab with my Javascript in it and then move back to that tab it will be slow for a few seconds as though V8 is recompiling the js.
There are three options I can think of but I don't know which is best:
load a tiny loading page first and handle subsequent loading elegantly
load one huge js or css file with everything (jquery + my code + etc)
clump certain codes together (use jquery cdn but group my code together)
What is the best way to get your JS loaded as quickly and elegantly as possible?
Generally, loading more files incurs more overhead in HTTP than combining them into fewer files. There are ways to combine files for all kinds of content:
For images, use CSS sprites.
For javascript, compile your client-side code and libraries into one file, and minify to reduce size.
For css, you can do something similar to the above. hem compiles stylus into one css file, for example, and this can help organizationally as well.
Additionally, when you concatenate Javascript and CSS, your webserver or reverse proxy can send them in compressed form for faster page loads. This will be more efficient for larger files as there is more to gain from compression.
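As a rough illustration, a concatenation build step can be as simple as this (a Node.js sketch; the file names are placeholders, and you would normally run a minifier on the result):

// build.js - naive concatenation of client-side scripts into one bundle
var fs = require('fs');

var files = ['js/jquery.js', 'js/jquery-ui.js', 'js/app.js'];
var bundle = files.map(function (file) {
    return fs.readFileSync(file, 'utf8');
}).join(';\n'); // the semicolon guards against files that omit their trailing one

fs.writeFileSync('js/bundle.js', bundle);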
There are way too many maybes for this to have any guaranteed solutions, but here you go:
1) load CSS at the top -- load it all there, if you're doing a site with multiple pages.
If you're building a one-page application (where you're running galleries and twitter feeds and articles, etc on the same page, and you can open and close different sections), then you can consider loading widget-specific CSS, at the time you're loading your widget (if it's not needed at startup).
Do NOT use #import in your CSS, if you want it to load quickly (you do).
2) load the vast majority of your JS at the bottom of the page.
There is practically nothing that can't be lazy-loaded, or at least initialized at the bottom of the page after the DOM is ready. If there really is something, serve those as separate files at the top of the page, and consider how you might rewrite your code to depend on them less.
3) be careful with timers -- especially setInterval... you can get your page's performance into a lot of trouble with poorly-managed timers.
4) be even more careful with event-handlers on things like window-scroll, resize, mouse-move or key-down. These events fire many, many times a second, so if you've written fancy programs which depend on them, you need to rethink how you fire the program (ie: don't run it every single time the handler goes off -- see the throttling sketch below).
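One way to keep such handlers cheap is to throttle them, roughly like this (the 100 ms interval and the handler body are arbitrary, and updateStickyHeader is a hypothetical function):

// run the expensive work at most once every 100 ms, no matter how often scroll fires
var scrollTimer = null;
$(window).on('scroll', function () {
    if (scrollTimer) { return; }          // a run is already scheduled
    scrollTimer = setTimeout(function () {
        scrollTimer = null;
        updateStickyHeader();             // hypothetical expensive handler
    }, 100);
});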
5) serving JS files is a trade-off:
Compiling JS takes a while. So if you're loading 40,000 lines in one file, your browser is going to pause for a little while, as it compiles all of that.
If you serve 18 separate files, then you have to make 18 different server calls.
That's not cool, either.
So a good balance is to concatenate files together that you KNOW you're going to need for that page, and then lazy-load anything which is optional on the page (like a widget for adding a comment, or the lightbox widget, etc).
And either lazy-load them after all of the main products are up and running, OR load them at the last possible second (like when a user hits the "add comment" button).
If you need to have 40,000 lines loaded in your app, as soon as it starts, then take the hit, or decide what order you can load each one in, and provide "loading" indicators (which you should be doing on lazy-load always) for each widget until it's ready (loading the JS one at a time).
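To illustrate the "last possible second" option above, a lazy load on demand might look something like this (the button ID, script path and init function are invented for the example):

// don't fetch the comment widget until somebody actually wants to comment
$('#add-comment').one('click', function () {
    $(this).text('Loading...');
    $.getScript('js/comment-widget.js', function () {
        initCommentWidget();   // hypothetical entry point defined in comment-widget.js
    });
});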
These are guidelines for getting around general performance issues.
Specifics are hard to answer even when you have the site directly in front of you.
Use the Chrome dev console for profiling information and network performance, and rendering performance, et cetera.
There is a very popular concept called concatenation. The idea is to have as few HTTP requests to your server as possible, because each request may mean a new connection, for which a DNS lookup happens, then a handshake is negotiated, and only after a few more protocol-based steps does the server send the requested file as the response.
You can check http-archive for a list of performance best-practices.
So yeah, you should combine all JS files into one (there are certain exceptions, like js at head and js in footer)
This is the answer for your question-title and points 2 & 3.
As for the other part, I am not clear about the scenario you are talking of.
I recently had the same problem, and then I developed and released a JS library (MIT licence) to do this. Basically, you can put all your stuff (js, images, css ...) into a standard tar archive (which you can create server side), and the library reads it and allows you to easily use the files.
You'll find it here : https://github.com/sebcap26/FileLoader.js
It works with all recent browsers and IE >= 10.
The number of files to load has an impact on the load speed of the whole site. I would recommend packing all the functionality required for the website to display properly into a single javascript file.
I'm writing my first HTML5 + jquery.mobile web app.
The app is basically a menu which redirects to internal pages (data-role="page") defined in the same index.html. I do not write pages as external files to avoid reloading and rewriting the - substantially - same <head>: I suppose it's faster to jump to an internal tag than loading a new page...
Now, I have a page which needs some specific jquery plugins and some specific css. No other page needs these plugins or css.
Of course I could load these js/css in the main <head> section, but this approach would slow the first page loading, uselessly.
I could solve the problem with CSS, with:
$('head:first').append('<link rel="stylesheet" type="text/css" href="' + file + '" />');
I could even solve the problem with JS, but only for 'standard' JavaScript, with something like:
<script>
$(document).ready(function() {
    $('#page-availability').live('pageinit', function () {
        $.getScript("js/jqm-datebox.core.js");
        $.getScript("js/jqm-datebox.mode.calbox.js");
        $.getScript("js/jquery.mobile.datebox.i18n.en.utf8.js");
        $('#datepicker').data({
            "mode": "calbox",
            ...
        });
        ...
    });
    ...
});
</script>
Unfortunately this approach does not seem to work with jQuery plugins (Firebug croaks: "TypeError: a.mobile.datebox is undefined"...): it looks like they are not evaluated... (even though they are there, before the end of the <head> section, when viewing the "Generated Source"...).
I'm using Firefox (15) to debug, but I suppose this isn't the point...
Any hint?
The one page approach can be good for mobile if:
You don't have to load too much extra content in order to support all the content the user might show from that one page.
You don't have to load too much code to support all the behaviors.
The typical user actually does go to several different virtual pages so the scheme saves them load time and makes things quicker on subsequent virtual page loads.
Done well, the user gets OK performance on loading the first page and very quick performance when going to the other "embedded" pages that don't have to load new content over the network.
The one page approach is not so good if:
The initial load time is just more than it's worth because of the volume of stuff that must be loaded.
You have to dynamically load content for the sub-pages anyway.
You have SEO issues because the search engine can't really find/properly index all your virtual pages.
So, in the end, it's a real tradeoff and depends very much on how big things are, how many things you're loading and what the actual performance comes out to be. A compact mobile site can serve server-loaded page views from one page to the next pretty quickly if the pages are kept very lightweight and there are very few requests that must be satisfied for each page.
In general, you want to pursue these types of optimizations:
Compress/minify all javascript.
Reduce the number of separate items that must be loaded (stylesheets, javascript files, images).
Reduce the number of sequential things that must be loaded (load one, wait for it to load, load another). Mobile is bad at round-trips and loading lots of resources. It's OK at loading a few things.
Make it easy for the browser to cache javascript files. Use a few common javascript files that each serve the needs of many pages. Loading a little more at the start and then allowing the javascript file to be loaded from cache on all future pages loads is way, way better if the user will be visiting many successive pages on your site. The same is true for external CSS files.
Be very, very careful with lots of images, even small ones. Lots of HTTP requests to load a page is bad for load time on mobile, and every image you request is an HTTP request (unless it comes from the browser cache).
Make sure your server is configured to maximize browser caching for things that can be effectively cached.
Other things to be aware of:
By default dynamic loading of script files is asynchronous and unordered. If your script files must execute in a specific order, then you will have to either not load them dynamically or you will have to write code (or use a library) that serializes their execution in the desired order.
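If the scripts must run in order, chaining the callbacks is one straightforward way to serialize them (shown here with the plugin files from the question):

// each file is only requested after the previous one has loaded and executed
$.getScript('js/jqm-datebox.core.js', function () {
    $.getScript('js/jqm-datebox.mode.calbox.js', function () {
        $.getScript('js/jquery.mobile.datebox.i18n.en.utf8.js', function () {
            // everything is loaded in the right order; safe to initialise the datebox here
        });
    });
});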
$.getScript is a shorthand AJAX function; it takes a callback as the second parameter.
Check out the docs:
http://dochub.io/#jquery/jquery.getscript
You could concatenate those scripts and then do your stuff in the callback.
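For example, if the three datebox files were concatenated into a single (hypothetical) js/jqm-datebox.bundle.js, the initialisation could safely move into the callback:

// inside your 'pageinit' handler
$.getScript('js/jqm-datebox.bundle.js', function () {
    // the concatenated plugin code has loaded and executed by the time this runs
    $('#datepicker').data({
        "mode": "calbox"
    });
});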
This is not so dissimilar to old Flash asset loading issues.
My strategy for that? Load only what's necessary for the initial page view. When it's loaded and the page / app is viewable by the user, progressively load all other assets.
If the assets were particularly heavy, then I would disable the link to that specific page until its required assets were loaded.
In this case, you might disable the link to the particular page at the outset, initiate the load of its assets, and when they are ready, enable the link.
Not sure if you're having any syntax issues, but you can certainly just inject a new script element into the head with the correct source, and it will instigate a download (like you are doing with css. But you probably know that ;D )
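Roughly, that could look like this (the link ID and script path are illustrative):

// keep the link inert until the page's heavy assets have arrived
var ready = false;
$('#gallery-link').on('click', function (event) {
    if (!ready) { event.preventDefault(); }   // ignore clicks until the assets are loaded
});

var script = document.createElement('script');
script.src = 'js/gallery.js';                 // hypothetical heavy asset
script.onload = function () { ready = true; };
document.getElementsByTagName('head')[0].appendChild(script);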
Cheers
I would just combine/minify and compress all the JS in one file and always load that. This is something (with correct caching) which is only downloaded once so you don't have to worry about performance much.
Of course I could load these js/css in the main <head> section
I often just add it just before the </body> tag. Also note that besides the fact that .live() is deprecated, it is also slow as hell. So don't use it, but use .on() instead.
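For the handler in the question that means something like:

// .live('pageinit', ...) becomes a delegated .on() handler bound to the document
$(document).on('pageinit', '#page-availability', function () {
    // ... same initialisation code as before ...
});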
I'm currently developing a website with ASP.NET and I always check its performance through Firebug...
Now, my question is,
is it better to put all jQuery references in the Masterpage?
(Reference all first)
or is it better to put a specific jQuery reference only on the specific Content Page that needs it?
(Reference specific only)
Thank you!
I would think it's best to put your stylesheets in the master page, as long as they are site-wide styles. You should also think about compressing them into one download, so you reduce HTTP requests.
Maintenance-wise, I would put each reference where it belongs, depending on the scope of your jQuery-referencing object.
For instance, if it's something to do with, say, the main navigation, that's present on all rendered pages, and therefore certainly in your master page, then you had better have your $("#navBar") close to your <div id="navBar">...</div>, i.e. on your MasterPage.
If on the other hand it's something related to a specific content page, let's say that shiny carousel (and its specific jQuery plugin) you need on your homepage, you had better have $('#carousel').carousel(2); close to your <div id="carousel">, i.e. on your HomePage.aspx content page.
And while you're at dispatching stuff to where it belongs, if you can tell for sure that carousel plugin is only required on your homepage, you had better include the plugin on your HomePage.aspx content page only.
Not only will you ease your maintenance, but you will also get benefits performance-wise, as you will be more likely to initialize variables only when they are used, therefore putting a little less memory overhead on the browser. The same goes for loading plugin-related resources (you may not want each and every page load bloated because your master page links to everything required somewhere on your site).
Generally speaking, I would encourage you to identify any libraries or plugins which you would be using on multiple pages and include them in a place which will automatically put them within those pages.
For maintainability, I usually mash/minify all of my third-party libraries (such as jQuery, jQuery-UI, Backbone, etc) into a single JS file, along with any plugins for them which I know I will be using throughout the site. The downside to doing this is that you may have one very large JS file which loads the first time the user loads a page - the upside: the browser caches that file, and the user doesn't have to download it again.
The general rule of thumb is: minimize the number of bytes that the user has to download, and minimize the number of HTTP requests which the user has to make throughout your site. So - by compressing these kinds of files into a single download, and letting it exist on every page with the same URL - you can have a single request which generally gets a 304 response and no download. This is far better than having 5 different plugins which are loaded on different pages, each of which makes a separate HTTP call - even if those calls all receive 304 responses.
Referencing jQuery separately on each Content Page is not the proper approach.
Reference jQuery once on the Master Page and use it on all Content Pages. That is the proper, optimized approach.
I am creating a one page web app with ExtJS.
Isn't the best way to decrease the load time of a web app to inject the JS, CSS and HTML into the initial HTML file sent to the browser, instead of just including script and css tags that load the files from the server one at a time, since that will reduce multiple HTTP requests to only one?
You may like the concept of httpcombiner.ashx.
http://archive.msdn.microsoft.com/HttpCombiner
This tool can also compress and cache your js and css
If you want to cut down on initial load time, one of the best ways is to take advantage of the browser cache. Suggest you look at using a hosted ExtJS library, such as from Google Ajax APIs. There is a great chance a prospective visitor will already have it cached.
This is just one tip of many.
This webpage outlines some best practices when it comes to lowering perceived webpage loading time.
http://developer.yahoo.com/performance/rules.html
In addition to using the condensers pavan suggested, you can use Google's closure compiler to minimize javascript files.
http://closure-compiler.appspot.com/home
Well, there is a big difference between load time and perceived load time. One of the best ways to reduce load time is to use server-side compression. However, progressive loading appears faster to the user.
Therefore the initial response should only contain a minimal set of style sheets (which lets the browser render later-arriving content already styled) and the layout. Then you could have an onLoad callback to some AJAX loader which loads the additional components.
Most importantly, do not forget to size your image containers. One of the most annoying things is when you mis-click a link just because an image started loading and changed the layout.
Is there a way to download javascript without executing it? I want to decrease my page load times so am trying to "lazy load" as much javascript onto the page while the user is idle. However I don't want the javascript to execute, I just want it to be in the browser cache.
Should I use an object tag? I noticed that I can use a LINK tag but that makes the browser think it's css which has a negative impact on my ui perf / responsiveness.
As long as you have all your code in functions or classes and nothing in global scope, nothing will execute.
You can then start your script from the window load event:
window.onload = function () { /* your initialisation here */ };
This will let the whole page load before running any scripts.
You could also add the script references via JavaScript to make sure they load after any images on the page.
Just append a script element to the head using JavaScript and it will load.
These pages has examples for this:
http://unixpapa.com/js/dyna.html
http://www.javascriptkit.com/javatutors/loadjavascriptcss.shtml
This way, if you have a slow connection or a server that is overloaded, the visible elements will be loaded first by the browser.
As far as I know, there is no cross-browser compliant way to get around JavaScript loading in serial. If your javascript does something when it is loaded, you need to refactor your code. For instance, you don't write your jQuery commands/actions/code in the jQuery library script; you link the jQuery library and then put your jQuery commands into a separate file. You should do the same thing with your custom libraries. If this isn't possible, you have a big problem with the architecture of your code.
Also, make sure you stick non-executing JS at the bottom of the page near the </body> tag. This will allow everything else to load first, so that the bulky JS libraries don't slow down things like CSS and images.
The best-practices way to deal with external javascript is to have it load after everything else on the page by putting it at the bottom of the page. Then everything that can be rendered will be displayed, and the javascript at the bottom of the page will be loaded, compiled and cached afterwards. Of course this only works if the javascript is a library of functions that don't need to be executed mid-page; in that case, you are stuck with serial javascript loading, compiling and execution regardless.
Require.JS is a great library for automatically managing when your javascript loads.
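In its simplest form, RequireJS loads what a piece of code declares it needs. A minimal sketch (the paths and the calendar module are placeholders):

<script data-main="js/main" src="js/require.js"></script>

and in js/main.js:

require(['app/calendar'], function (calendar) {
    // the dependency is fetched and executed before this callback runs
    calendar.init();   // hypothetical module defined with define() in js/app/calendar.js
});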
You could load the file using the XMLHttpRequest object in JavaScript (aka AJAX), and then of course just discard the result ^^.
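A bare-bones version of that idea (the path is whatever script you want warmed up in the cache):

// request the script so the browser caches it, but never execute or eval() it
var xhr = new XMLHttpRequest();
xhr.open('GET', 'js/heavy-library.js', true);
xhr.send();
// the response is simply ignored; a later <script src="js/heavy-library.js">
// can then be served from the browser cache (assuming suitable cache headers)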