Decrease initial web app load time - javascript

I am creating a one page web app with ExtJS.
Isn't the best way to decrease the load time of a web app to inline the JS, CSS and HTML in the initial HTML file sent to the browser, rather than including script and css tags that make the browser fetch each file from the server one at a time? That would reduce multiple HTTP requests to a single one.

You may like the concept of httpcombiner.ashx.
http://archive.msdn.microsoft.com/HttpCombiner
This tool can also compress and cache your js and css.

If you want to cut down on initial load time, one of the best ways is to take advantage of the browser cache. I suggest you look at using a hosted ExtJS library, such as the one from the Google Ajax APIs. There is a good chance a prospective visitor will already have it cached.
This is just one tip of many.
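The idea in markup looks like the line below (the jQuery URL is a real Google-hosted one, used here only to illustrate the pattern; the hosted ExtJS / Ext Core path on the same service would be analogous):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.8.3/jquery.min.js"></script>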

This webpage outlines some best practices when it comes to lowering perceived webpage loading time.
http://developer.yahoo.com/performance/rules.html
In addition to using the combiner pavan suggested, you can use Google's Closure Compiler to minify your javascript files.
http://closure-compiler.appspot.com/home

Well, there is a big difference between actual load time and perceived load time. One of the best ways to reduce actual load time is to use server-side compression; progressive loading, however, feels faster to the user.
Therefore the initial response should contain only a minimal set of stylesheets (so the browser can render later-arriving content already styled) and the layout. Then you can use an onLoad callback to an AJAX loader which loads the additional components.
Most importantly, do not forget to size your image containers. One of the most annoying things is mis-clicking a link just because an image finished loading and shifted the layout.
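As a rough sketch of that idea (the file name and the split between "core" and "extra" are placeholders, not part of the answer above):

window.addEventListener('load', function () {
    // the initial HTML shipped only the core stylesheet and layout;
    // everything non-critical is pulled in once the page has rendered
    var script = document.createElement('script');
    script.src = '/js/extra-components.js'; // hypothetical bundle of non-critical widgets
    document.body.appendChild(script);
});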

Related

Is it possible to cache an HTML page in a way that it can be requested from multiple URIs?

I'm in the process of building a single-page web application with friendly URLs. As an example, for a chat you could request /chat/roomname or /chat/roomname2 to connect to a different chatroom. Since this is an SPA however, both of those would lead to the same HTML contents.
Is it possible to tell the browser to cache both pages as a single page (as in, going to /chat/roomname would also cache /chat/roomname2 in the browser), or something that would give a similar effect? This way the HTML contents could be large and only have to be loaded once.
Alternatively I could do /#/chat/roomname or similar, though I'd prefer not to if the above is possible.
Separate paths are separate things to the browser. Even the same page with different query parameters is treated as different (hence the "cache busting" technique). However...
You can:
Off-load all scripts to external files so the browser caches them separately from the page.
Do the same with CSS as well.
Keep the page to a bare minimum.
If the resources (scripts, styles, images) won't be modified for a long period of time, you can set a longer expiry for them.
Instead of loading the entire layout using HTML, you can use AJAX to fly in templates. That way, those layout requests get cached as well. Have JS assemble them when they're needed.
In the end, your pages will be devoid of all the JS, CSS and markup that can be flown in and assembled with JS, which makes each page lighter than usual. You can take it a step further by minifying the scripts and styles, compressing the images and compressing the HTML.
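A minimal sketch of the "fly in templates" idea, assuming jQuery is available (the URL, container id and initializer are hypothetical):

// fetch the markup once; the browser caches it like any other static resource
$.get('/templates/chat-room.html', function (html) {
    $('#chat-container').html(html); // inject the cached template
    initChatRoom();                  // hypothetical per-room initializer
});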

Is it better to load many small JavaScript files or one large JavaScript file?

I have noticed in chrome that if I load an image as a base64 string and then scroll through that part of the page it will slow down.
I have also noticed that when I navigate out of a tab with my Javascript in it and then move back to that tab it will be slow for a few seconds as though V8 is recompiling the js.
There are three options I can think of but I don't know which is best:
load a tiny loading page first and handle subsequent loading gracefully
load one huge js or css file with everything (jquery + my code + etc)
clump certain codes together (use jquery cdn but group my code together)
What is the best way to get your js loaded as quickly and cleanly as possible?
Generally, loading more files incurs more overhead in HTTP than combining them into fewer files. There are ways to combine files for all kinds of content:
For images, use CSS sprites.
For javascript, compile your client-side code and libraries into one file, and minify to reduce size.
For css, you can do something similar to the above. hem compiles stylus into one css file, for example, and this can help organizationally as well.
Additionally, when you concatenate Javascript and CSS, your webserver or reverse proxy can send them in compressed form for faster page loads. This will be more efficient for larger files as there is more to gain from compression.
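As an illustration of the concatenation step (the file names are placeholders; in practice a build tool or the hem pipeline mentioned above would do this for you):

// simple Node.js sketch: join several source files into one bundle
var fs = require('fs');

var sources = ['lib/jquery.js', 'src/app.js', 'src/widgets.js'];
var bundle = sources.map(function (f) {
    return fs.readFileSync(f, 'utf8');
}).join('\n;\n'); // the stray semicolon guards against files that omit their own

fs.writeFileSync('dist/bundle.js', bundle);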
There are way too many maybes for this to have any guaranteed solutions, but here you go:
1) load CSS at the top -- load it all there, if you're doing a site with multiple pages.
If you're building a one-page application (where you're running galleries and twitter feeds and articles, etc on the same page, and you can open and close different sections), then you can consider loading widget-specific CSS, at the time you're loading your widget (if it's not needed at startup).
Do NOT use @import in your CSS if you want it to load quickly (you do).
2) load the vast majority of your JS at the bottom of the page.
There is practically nothing that can't be lazy-loaded, or at least initialized at the bottom of the page after the DOM is ready. If something really can't be, serve it as a separate file at the top of the page and consider how you might rewrite your code to depend on it less.
3) be careful with timers -- especially setInterval. You can get your page's performance into a lot of trouble with poorly-managed timers.
4) be even more careful with event handlers on things like window scroll, resize, mouse-move or key-down. These fire many, many times a second, so if you've written fancy programs that depend on them, you need to rethink how you trigger the work (i.e., don't run the expensive part every time the handler fires; see the throttle sketch at the end of this answer).
5) serving JS files is a trade-off:
Compiling JS takes a while. So if you're loading 40,000 lines in one file, your browser is going to pause for a little while, as it compiles all of that.
If you serve 18 separate files, then you have to make 18 different server calls.
That's not cool, either.
So a good balance is to concatenate files together that you KNOW you're going to need for that page, and then lazy-load anything which is optional on the page (like a widget for adding a comment, or the lightbox widget, etc).
And either lazy-load them after all of the main products are up and running, OR load them at the last possible second (like when a user hits the "add comment" button).
If you need to have 40,000 lines loaded in your app as soon as it starts, then take the hit, or decide what order you can load each part in, and provide "loading" indicators (which you should always be doing for lazy-loading anyway) for each widget until it's ready, loading the JS one file at a time.
These are guidelines for getting around general performance issues.
Specifics are hard to answer even when you have the site directly in front of you.
Use the Chrome dev console for profiling information, network performance, rendering performance, et cetera.
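For point 4 above, a simple throttle sketch (the 100 ms interval is an arbitrary choice, not something prescribed in the answer):

function throttle(fn, wait) {
    var last = 0;
    return function () {
        var now = Date.now();
        if (now - last >= wait) {
            last = now;
            fn.apply(this, arguments);
        }
    };
}

// the expensive work now runs at most once every 100 ms,
// no matter how often the scroll event fires
window.addEventListener('scroll', throttle(function () {
    // expensive layout / position work goes here
}, 100));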
Well there is a very popular concept called concatenation. The idea is to have as few HTTP requests to your server as possible. Because each request means a new connection, for which DNS lookup happens, then handshake is negotiated and then after a few more protocol-based steps, the server sends the requested file as the response.
You can check http-archive for a list of performance best-practices.
So yeah, you should combine all JS files into one (with certain exceptions, such as keeping separate bundles for js in the head and js in the footer).
This is the answer for your question-title and points 2 & 3.
As for the other part, I am not clear about the scenario you are talking of.
I recently had the same problem, so I developed and released a JS library (MIT licence) to do this. Basically, you put all your assets (js, images, css ...) into a standard tar archive (which you can create server-side), and the library reads it and lets you use the files easily.
You'll find it here : https://github.com/sebcap26/FileLoader.js
It works with all recent browsers and IE >= 10.
The number of files to load has an impact on the load speed of the whole site. I would recommend packing all the functionality required for the website to display properly into a single javascript file.

html5: a good loading approach?

I'm writing my first HTML5 + jquery.mobile web app.
The app is basically a menu which redirects to internal pages (data-role="page") defined in the same index.html. I don't write the pages as external files, to avoid reloading and rewriting the substantially identical <head>: I suppose it's faster to jump to an internal anchor than to load a new page...
Now, I have a page which needs some specific jquery plugins and some specific css. No other page needs these plugins or css.
Of course I could load these js/css in the main <head> section, but this approach would slow the first page loading, uselessly.
I could solve the problem with CSS, with:
$('head:first').append('<link rel="stylesheet" type="text/css" href="' + file + '" />');
I could even solve the problem with JS, but only for 'standard' JavaScript, with something like:
<script>
$(document).ready(function () {
    $('#page-availability').live('pageinit', function () {
        $.getScript("js/jqm-datebox.core.js");
        $.getScript("js/jqm-datebox.mode.calbox.js");
        $.getScript("js/jquery.mobile.datebox.i18n.en.utf8.js");
        $('#datepicker').data({
            "mode": "calbox",
            ...
        });
        ...
    });
    ...
});
</script>
Unfortunately this approach does not seem to work with jQuery plugins (Firebug croaks: "TypeError: a.mobile.datebox is undefined"...): it looks like they are not evaluated (even though they are there, before the end of the <head> section, when viewing the "Generated Source"...).
I'm using Firefox (15) to debug, but I suppose this isn't the point...
Any hint?
The one page approach can be good for mobile if:
You don't have to load too much extra content in order to support all the content the user might show from that one page.
You don't have to load too much code to support all the behaviors.
The typical user actually does go to several different virtual pages so the scheme saves them load time and makes things quicker on subsequent virtual page loads.
Done well, the user gets OK performance on loading the first page and very quick performance when going to the other "embedded" pages that don't have to load new content over the network.
The one page approach is not so good if:
The initial load time is just more than it's worth because of the volume of stuff that must be loaded.
You have to dynamically load content for the sub-pages anyway.
You have SEO issues because the search engine can't really find/properly index all your virtual pages.
So, in the end, it's a real tradeoff and depends very much on how big things are, how many things you're loading and what the actual performance comes out to be. A compact mobile site can serve server-loaded page views from one page to the next pretty quickly if the pages are kept very lightweight and there are very few requests that must be satisfied for each page.
In general, you want to pursue these types of optimizations:
Compress/minify all javascript.
Reduce the number of separate items that must be loaded (stylesheets, javascript files, images).
Reduce the number of sequential things that must be loaded (load one, wait for it to load, load another). Mobile is bad at round-trips and loading lots of resources. It's OK at loading a few things.
Make it easy for the browser to cache javascript files. Use a few common javascript files that each serve the needs of many pages. Loading a little more at the start and then allowing the javascript file to be loaded from cache on all future pages loads is way, way better if the user will be visiting many successive pages on your site. The same is true for external CSS files.
Be very, very careful with lots of images, even small ones. Lots of http requests to load a page is bad for load time on mobile, and every image you request is an http request (unless it comes from the browser cache).
Make sure your server is configured to maximize browser caching for things that can be effectively cached.
Other things to be aware of:
By default dynamic loading of script files is asynchronous and unordered. If your script files must execute in a specific order, then you will have to either not load them dynamically or you will have to write code (or use a library) that serializes their execution in the desired order.
$.getScript is a shorthand AJAX function; it takes a callback as its second parameter.
Check out the docs:
http://dochub.io/#jquery/jquery.getscript
You could concatenate those scripts and then do your stuff in the callback.
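For instance, something along these lines, reusing the file names from the question (the nesting guarantees the datebox core has executed before its plugins load):

$.getScript("js/jqm-datebox.core.js", function () {
    $.getScript("js/jqm-datebox.mode.calbox.js", function () {
        $.getScript("js/jquery.mobile.datebox.i18n.en.utf8.js", function () {
            // everything has been evaluated in order; the widget can be configured now
            $('#datepicker').data("mode", "calbox");
        });
    });
});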
This is not so dissimilar to old Flash asset loading issues.
My strategy for that? Load only what's necessary for the initial page view. When it's loaded and the page / app is viewable by the user, progressively load all other assets.
If the assets were particularly heavy, then I would disable the link to that specific page until its required assets were loaded.
In this case, you might disable the link to the particular page at the outset, initiate the load of its assets, and when they are ready, enable the link.
Not sure if you're having any syntax issues, but you can certainly just inject a new script element into the head with the correct source, and it will trigger a download (like you are doing with the css. But you probably know that ;D )
Cheers
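A rough sketch of the "disable the link until its assets are loaded" idea (the element id and script path are made up for the example):

var link = document.getElementById('heavy-page-link');
link.classList.add('disabled'); // keep the user out until the assets are ready

var script = document.createElement('script');
script.src = 'js/heavy-page-widgets.js'; // hypothetical per-page bundle
script.onload = function () {
    link.classList.remove('disabled'); // assets loaded, re-enable the link
};
document.head.appendChild(script);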
I would just combine/minify and compress all the JS in one file and always load that. This is something (with correct caching) which is only downloaded once so you don't have to worry about performance much.
Of course I could load these js/css in the main <head> section
I often just add it just before the </body> tag. Also note that, besides being deprecated, .live() is slow as hell. So don't use it; use .on() instead.
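For example, the deprecated .live() call from the question can be written with the delegated form of .on():

$(document).on('pageinit', '#page-availability', function () {
    // initialize the datebox widgets here
});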

Is there an advantage to dynamically loading/unloading javascript and css stylesheets?

Background:
I'm putting together a site that will use ajax as a primary method of changing the content. That way the main frame and images will not have to be constantly reloaded with every page change. The main frame has its own site.css stylesheet.
Question 1:
Is it worthwhile to put all the stylesheet information into a single stylesheet? I think that this would make the website less modular. Every time a page or piece of content is added/removed, the css would have to be updated (assuming the content requires different style information).
Question 1.1:
Same question but with javascript.
Question 2:
If it is worthwhile (as I think it is) to have multiple stylesheets, is it beneficial to unload a stylesheet when it's not in use? For example, I load the profile.php page so I dynamically load profile.css. The user then changes to the settings.php page, so I unload profile.css and load settings.css. Is this constant loading/unloading going to tank performance, or even save on website size?
Question 2.1
Same question as above but applied to javascript functions.
Once your javascript or css file is downloaded to the user's machine, it is cached by their browser, so you don't incur the additional cost of another HTTP request. Lazy loading the scripts and stylesheets can make sense, but there is no sense in unloading these web assets once they have already been sent to the client.
It is good to use some sort of mechanism to compile your scripts and stylesheets to minimize the initial http requests to one per asset type. Having only one stylesheet and one javascript file would be an architectural nightmare in many cases, but that doesn't mean that you can't still present it to the browser that way. I use .NET, so I'm not sure how you handle this in PHP, but I'm sure the functionality is out there.
Answer 1:
This is all about balance. You should keep different CSS and JS files, but combine them all into one before deploying (one CSS file and one JS file). Basically there's no need to develop with one big file because there are tools that can compile them for you.
The tricky part (depending on the complexity of your site) is figuring out which CSS and JS code gets used frequently enough to be called on every page. If you have code that only gets used on one page, you should probably dynamically load it...unless your whole site only has 5 pages. Like I said, balance.
Answer 2 and 2.1:
No, not worthwhile. As soon as you load the CSS and JS files, they get cached on the user's machine. So unloading them is pointless.
This smells somewhat of premature optimization to me. Let's look at SO's loadtimes:
all.css - 46.8 Kb - load time 87 ms
(The original answer included screenshots showing the load times and relative sizes of the other items on this particular SO page, measured on a four-year-old laptop over a fiber optic connection; they are not reproduced here.)
I think the takehome is that if you build the rest of your site well, you shouldn't have to worry about optimizing those last few milliseconds until they really count.
Each time you need another css or js file, your browser requests it. The more requests you make, the more time it takes your page to load.
Question 1
Yes, because, as said before, once downloaded the css gets cached. That's why, in general, it's better not to define style info inline. You can handle this with a simple PHP script that declares the content type as css and then serves your individual css files as one.
header("content-type: text/css");
$file = file_get_contents('your.css');
$file .= file_get_contents(...)
Then just refer to it as normal in a link tag.
<link rel="stylesheet" href="yourCSS.php" >
Question 1.1
Same answer as before just use this instead:
header("content-type: application/javascript");
Question 2
No. Once loaded, it will always be accessible in the memory cache. Google's GWT framework actually compiles all of your external content and transfers it as one large monolithic bundle, because you can only make 2 concurrent HTTP requests per domain. That's also why Yahoo's speed guide recommends that you place your css at the top of the page and your javascript at the bottom of the body tag.
Question 2.1
No. But do try to keep your external files small. Always minify production code.
Answer 1
While there may be some benefit to loading your CSS rules on an as-needed basis, I would advise against it, simply because it works against having a unified layout for your site or web application. Your design should be standardized throughout all of your pages, and whatever "modules" you load via Ajax requests should use the same layout as the primary page. jQuery UI is a good example: whether you use a widget or not, its style is downloaded nevertheless.
However, if you need to apply specific styles to the bits of HTML that you retrieve from your requests, you can simply put those rules inside a <style> tag appended to the <head> section, add the CSS file to your <head>, or even set the rules on the style attribute of your HTML elements.
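For example, with jQuery (the file name is illustrative):

$('<link>', { rel: 'stylesheet', href: 'css/profile.css' }).appendTo('head');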
Answer 1.1
If you load Javascript code on an as-needed basis, that code should not be loaded twice, unless you really need to reload it for some obscure reason...
But in any case, neither CSS nor Javascript should be "unloaded", there is no point to that.
Answer 2 and 2.1
If you had asked this question a few years back, the answer would probably have been that loading a single stylesheet and/or Javascript file is better, but this is not really true anymore. With today's connection speeds and computer performance, browsers have become very efficient, and the trouble is usually not worth the performance gain. Moreover, if these resources are static (the URLs don't change), they are cached and reused, however many there are. Users usually don't mind waiting if the wait is expected (see section 2.10), or if it's the first time.

Speed up web site loading

I am looking for the best way to speed up the load time of my js.
The problem is that I am working with a very large site that uses the jQuery framework, and because the site is also loading Facebook Connect, AddThis sharing, Google Analytics and another tracking code, jQuery is delayed a few seconds, certain elements like the calendar just appear, and my users are complaining that things take too long.
I did a test in Google Chrome and the average load time is 4s, which is too much.
I am already doing minification, and jQuery UI / jQuery is being loaded from Google. What's the best way to approach this?
Making fewer http calls by combining images, scripts and css, and also using a Content Delivery Network for your static images and css, might help!
You are not likely to be able to do much more about the load time of the external scripts, so what you can do is to change the order that things happen in the page so that the external scripts are loaded after you have initialised the page.
Scripts are loaded and executed in a serial manner, so if you change their order in the source code, you also change the order they are loaded.
Instead of using the ready event in jQuery, you can put your initialising code inline in the page, after all the content but before the body closing tag. That way the elements that you want to access are loaded when the script runs, and you can put external scripts below the initialising code to make them load after.
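A rough sketch of that ordering, with placeholder script names (the third-party file names are not the real ones):

<body>
    <!-- ...page content... -->

    <script>
        initCalendar(); // hypothetical inline initialisation; the markup above already exists
    </script>

    <!-- external / third-party scripts load after the page is already usable -->
    <script src="facebook-connect.js"></script>
    <script src="analytics.js"></script>
</body>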
Small technical changes (such as serving the JS from Google, minifying, etc.) will only get you so far.
It seems you simply have lots of dynamic stuff going on in your page. Have you thought of an asynchronous way of building your content? One option might be to place placeholders instead of the external content and load it asynchronously, so that when all the scripts are loaded and ready, all you need to do is throw the markup into the placeholders.
This will create a better user experience, instead of the user waiting 10 seconds for the entire page, it will start loading incrementally after 2 seconds (and still fully load after 10).
In addition to Yuval's answer, some options that might or might not bring you a speed gain:
the load time of external libraries is something beyond your control. Try to include them as late as possible, and better still, dynamically after the page has loaded. This way your page won't stall if Google Analytics or Facebook has another hiccup.
It is not necessarily faster to load jQuery from Google. Consider putting jQuery, jQuery UI and as much of your own JS as reasonable into a single file, minify and gzip it, and let the server serve the gzipped version where possible. Note that the gain in speed depends largely on what your users have cached and where they cache it. If they already have jQuery from Google in their cache, this technique might make the page load slower.
The bottom line is that after some optimization you're left to experiment. You must find out what your average user has in her cache, whether the page is accessed directly via deep links, and whether you can smuggle some JS or CSS (or even images) into her cache via a previous "landing page".
Make sure you deliver your content in gzip/deflate compressed form. Combine multiple javascript files into one file, which helps to reduce the number of http requests.
P.S. Here's a test tool to check if compression is configured:
http://www.gidnetwork.com/tools/gzip-test.php
