I am trying to find an answer as to how to improve front-end performance for web applications. Say I have multiple CSS/JS files being referenced.
The browser will make an HTTP call for each of those CSS/JS files. My questions are:
Does it happen in parallel or one after the other? Is it the same for both CSS/JS?
Is the behaviour (parallel or one after the other) browser-specific?
Is the use of the async attribute on the script tag the standard or accepted way for asynchronous download?
Are there any limitations to the number of HTTP calls that can be made for a single page? Is it browser-specific?
Does using AMD frameworks like RequireJS solve any of the performance issues, or are they only meant for single-page app development?
Apart from that, references to any other general front-end performance improvement tips would be great.
Does it happen in parallel or one after the other? Is it the same for both CSS/JS?
Is the behaviour (parallel or one after the other) browser-specific?
Browsers download the content of a website in parallel using multiple connections. The number of those connections depends on the browser and its user settings; most modern browsers allow around six parallel connections per hostname.
Is the use of the async attribute on the script tag the standard or accepted way for asynchronous download?
The async attribute is standard (HTML5) and tells the browser that the script may be downloaded without blocking the HTML parser and executed as soon as it is available; it does not give the download any higher priority.
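For illustration, both attributes are standard HTML and widely supported (the file names here are hypothetical):

<!-- Fetched without blocking the HTML parser; runs as soon as it arrives,
     so execution order is not guaranteed: -->
<script async src="analytics.js"></script>
<!-- Fetched without blocking the parser; runs after parsing finishes,
     in document order: -->
<script defer src="app.js"></script>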
Are there any limitations to the number of HTTP calls that can be made for a single page? Is it browser-specific?
There is no hard limit, although obviously the more requests you have, the longer the page will take to download because of the connection limit.
Does using AMD frameworks like RequireJS solve any of the performance issues, or are they only meant for single-page app development?
Those frameworks can be used on any website, with any structure. Their benefit comes from delaying the download of JS until it is actually required by the page. This means that other UI elements, such as images and video can be downloaded first which makes the page load appear quicker for the end user.
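As a rough sketch of the AMD approach (the file, path and element names are hypothetical):

<!-- index.html: RequireJS bootstraps the entry module js/app.js -->
<script data-main="js/app" src="js/require.js"></script>

// js/app.js
require.config({ paths: { jquery: 'lib/jquery.min' } });
require(['jquery'], function ($) {
    // jquery is only fetched once this module runs, instead of up front
    $('#status').text('ready');
});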
If you link your CSS/JS files in index.html, the requests will be made in parallel, not serially.
I'm not sure about this, but I would guess it's parallel for all browsers, unless you link one CSS file in index.html and then pull in the other one with @import from inside that CSS file.
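For example (file names hypothetical), two link tags download in parallel, whereas @import forces the second file to wait:

<!-- a.css and b.css are requested in parallel: -->
<link rel="stylesheet" href="a.css">
<link rel="stylesheet" href="b.css">

<!-- b.css is requested only after a.css has downloaded and been parsed,
     because a.css contains: @import url("b.css"); -->
<link rel="stylesheet" href="a.css">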
For asynchronous downloading you need to use RequireJS or a similar module loader. The async attribute concerns execution only, not the request itself.
There is no limit on the number of HTTP requests a page can make.
Using RequireJS is a good option. With RequireJS you can also use r.js, its build tool, to create a build that combines your multiple CSS and JS files into a single file.
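As a minimal sketch, an r.js build file can be as small as this (module and output names are hypothetical); it is run with node r.js -o build.js:

// build.js
({
    baseUrl: "js",
    name: "app",             // entry module, i.e. js/app.js
    out: "js/app-built.js",  // single optimized output file
    optimize: "uglify"       // minify the result (UglifyJS is the default)
})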
Does it happen in parallel or one after the other? Is it the same for both CSS/JS?
Answer:
When a user visits a website for the first time, the browser caches content such as JS and CSS. Parallel requests vary from browser to browser; each browser has its own default limit on parallel connections. The behaviour is the same for JS and CSS, and even for your Ajax requests.
Is the behaviour (parallel or one after the other) browser-specific?
Answer:
Yes, it is browser-specific.
Is the use of the async attribute on the script tag the standard or accepted way for asynchronous download?
Answer:
There is no single standard way; whether to use it depends on your requirements, and the async attribute is not about asynchronous downloading. How content is downloaded depends on the browser's default or user settings.
Are there any limitations to the number of HTTP calls that can be made for a single page? Is it browser-specific?
Answer:
There is no limit on the number of HTTP calls to a server; however, the browser will schedule them in its own way, according to its default or user settings.
Related
Page is loading slower than expected. I checked the timeline with firebug, and I see a lot of image blocking:
http://i.imgur.com/tenTNVH.png
I guess I am doing something wrong. (I know I have jQuery twice here and will eliminate that mistake.) But more generally, is there any way to load images in parallel with JS?
The reason this is happening is not that images are blocked by JS, but that the browser has a limited number of parallel connections to the same server (typically noted as about 6-7).
If you look at your timeline closely, you will see that limit: no more than 7 files are downloaded at the same time, and the next one starts as soon as one of the current downloads finishes.
In the past there were nasty tricks to avoid that limitation, like placing your images on subdomains so they load in parallel just as if from another server, but there are better ways to improve loading performance. The most effective in terms of effort/result are:
Use a JS concatenation/minification toolchain; having all the JS in one or two files leaves your connection pool available for other downloads (see the sketch after this list). In your case you have 3 versions of jQuery and 2 of jQuery UI. Do you really need all of them? Having two files instead of 5 will reduce blocking significantly, especially taking into account that the files are unminified and big.
Use CDNs for third-party libraries. There are free public ones, like Google's CDN: <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>. This also has the advantage that the file is most probably already in the browser's cache.
Combine your images into sprites. This is a good fit if you have many small images that are UI-related rather than content. There are plenty of techniques for achieving it.
Enable SPDY on your server if it is available. This can improve download speed not by removing blocking but by removing connection overhead.
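As a back-of-the-envelope illustration of the concatenation step (file names are hypothetical, and in practice you would use a proper minifier such as UglifyJS or the r.js optimizer rather than a hand-rolled script):

// concat.js - run with: node concat.js
var fs = require('fs');
var files = ['jquery.min.js', 'jqueryui.min.js', 'app.js'];
var bundle = files.map(function (f) {
    return fs.readFileSync(f, 'utf8');
}).join(';\n'); // the semicolon guards against files missing a trailing one
fs.writeFileSync('bundle.js', bundle);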
Blocking generally occurs when there are more parallel requests to the same host than the browser allows (around 6 per host in Chrome; it varies per browser).
To rectify this, you have several options:
For images: split up your media content across different hosts (subdomains from which you can serve the same content).
For JS and CSS: minify and concatenate files on the server beforehand, so that fewer requests are needed to retrieve them.
For icons etc., try combining them into sprites if possible.
There's a nice article about it here: http://gtmetrix.com/parallelize-downloads-across-hostnames.html
Web browsers want to make the web faster.
I know Google has its hosted libraries. But why not integrate them into the browser directly?
The problem nowadays is that if you navigate from one page that has jQuery to another page with jQuery, and the script URL is different, the same JS is cached separately for each URL. So loading takes longer when navigating between pages that use the same libraries.
Can't they make something that stores the most common libraries in the browser, so that when a page loads jQuery or jquery.min the browser looks for it locally first?
Pros
- Faster navigation on the web.
- One fewer HTTP request whenever the browser finds the library locally.
Cons
Some problems that could occur with this involve versioning. Since most files have names like jquery.min.js, we can't simply load them based on the name alone; on the other hand, some URLs include the version, like /1.11.0/jquery.min.js, so the browser could try to work out the version from the URL. If the browser couldn't determine the version, it would simply load the file as usual.
What do you think? Any suggestions on how this could work? Any other cons?
Edit 1: I'm aware of CDNs. I'm suggesting something slightly faster than a CDN, saving one HTTP request in the process.
This problem can be avoided by using commonly used CDNs, as you mentioned.
http://cdnjs.com/
However, I think integrating them into the browser could introduce a real versioning problem. Just think of how long it was between versions of IE. If you had to wait that long for new versions of libraries to be downloaded and cached, it would be a disaster.
Also, you would have to download a large variety of libraries to have your bases covered.
Downloading libraries is typically not very slow; it's the time taken to parse and execute them that is the bigger cost, especially on mobile.
Here is a great post about this topic
http://flippinawesome.org/2014/03/10/is-jquery-too-big-for-mobile/
I'm hoping someone with more experience with global-scale web applications could clarify some questions, assumptions and possible misunderstandings I have.
Let's take a hypothetical site (heavy amount of client-side / dynamic components) which has hundreds of thousands of users globally and the sources are being served from one location (let's say central Europe).
If the application depends on popular JavaScript libraries, would it be better to take it from the Google CDN and compile it into one single minified JS file (along with all application-specific JavaScript) or load it separately from the Google CDN?
Assetic VS headjs: Does it make more sense to load one single JS file or load all the scripts in parallel (executing in order of dependencies)?
My assumptions (please correct me):
Compiling all application-specific/local JS code into one file, using CDNs like Google's for popular libraries, etc. but loading all of these via headjs in parallel seems optimal, but I'm not sure. Server-side compiling of third party JS and application-specific JS into one file seems to almost defeat the purpose of using the CDN since the library is probably cached somewhere along the line for the user anyway.
Besides caching, it's probably faster to download a third party library from Google's CDN than the central server hosting the application anyway.
If a new version of a popular JS library is released with a big performance boost, is tested with the application and then implemented:
If all JS is compiled into one file then every user will have to re-download this file even though the application code hasn't changed.
If third-party scripts are loaded from CDNs, then the user only has to download the new version from the CDN (or from a cache somewhere).
Are any of the following legitimate worries in a situation like the one described?
Some users (or browsers) can only have a certain number of connections to one hostname at once, so retrieving some scripts from a third-party CDN would result in faster loading times overall.
Some users may be using the application in a restricted environment, so the domain of the application may be white-listed but not the CDNs' domains. (If this is a realistic concern, is it at all possible to try to load from the CDN and fall back to the central server on failure?)
Compiling all application-specific/local JS code into one file
Since some of our key goals are to reduce the number of HTTP requests and minimize request overhead, this is a very widely adopted best practice.
The main case where we might consider not doing this is in situations where there is a high chance of frequent cache invalidation, i.e. when we make changes to our code. There will always be tradeoffs here: serving a single file is very likely to increase the rate of cache invalidation, while serving many separate files will probably cause a slower start for users with an empty cache.
For this reason, inlining the occasional bit of page-specific JavaScript isn't as evil as some say. In general though, concatenating and minifying your JS into one file is a great first step.
using CDNs like Google's for popular libraries, etc.
If we're talking about libraries where the code we're using is fairly immutable, i.e. unlikely to be subject to cache invalidation, I might be slightly more in favour of saving HTTP requests by wrapping them into your monolithic local JS file. This would be particularly true for a large code base heavily based on, for example, a particular jQuery version. In cases like this bumping the library version is almost certain to involve significant changes to your client app code too, negating the advantage of keeping them separate.
Still, mixing request domains is an important win, since we don't want to be throttled excessively by the maximum connections per domain cap. Of course, a subdomain can serve just as well for this, but Google's domain has the advantage of being cookieless, and is probably already in the client's DNS cache.
but loading all of these via headjs in parallel seems optimal
While there are advantages to the emerging host of JavaScript "loaders", we should keep in mind that using them does negatively impact page start, since the browser needs to go and fetch our loader before the loader can request the rest of our assets. Put another way, for a user with an empty cache a full round-trip to the server is required before any real loading can begin. Again, a "compile" step can come to the rescue - see require.js for a great hybrid implementation.
The best way of ensuring that your scripts do not block UI painting remains to place them at the end of your HTML. If you'd rather place them elsewhere, the async or defer attributes now offer you that flexibility. All modern browsers request assets in parallel, so unless you need to support particular flavours of legacy client this shouldn't be a major consideration. The Browserscope network table is a great reference for this kind of thing. IE8 is predictably the main offender, still blocking image and iFrame requests until scripts are loaded. Even back at 3.6 Firefox was fully parallelising everything but iFrames.
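A minimal illustration of that placement (the file name is hypothetical):

<body>
    <!-- ...page content, images, etc.... -->

    <!-- scripts last, so they don't block rendering of the content above -->
    <script src="app.js"></script>
</body>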
Some users may be using the application in a restricted environment, so the domain of the application may be white-listed but not the CDNs' domains. (If this is a realistic concern, is it at all possible to try to load from the CDN and fall back to the central server on failure?)
Working out if the client machine can access a remote host is always going to incur serious performance penalties, since we have to wait for it to fail to connect before we can load our reserve copy. I would be much more inclined to host these assets locally.
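For reference, the usual fallback pattern looks like this (the local path is hypothetical); note that the local copy is only requested after the CDN request has already failed, which is exactly the penalty described above:

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.11.0/jquery.min.js"></script>
<script>
    // if the CDN copy didn't load, write a script tag pointing at the local copy
    window.jQuery || document.write('<script src="/js/jquery-1.11.0.min.js"><\/script>');
</script>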
Many small JS files are better than a few large ones for many reasons, including changes/dependencies/requirements.
JavaScript/CSS/HTML and any other static content is handled very efficiently by any current web server (Apache/IIS and many others); most of the time one web server is more than capable of serving hundreds or thousands of requests per second, and in any case this static content is likely to be cached somewhere between the client and your server(s).
Using any external repository (not controlled by you) for code that you want to use in a production environment is a no-no (for me and many others): you don't want a sudden, catastrophic and irrecoverable failure of your whole site's JavaScript functionality just because somebody somewhere pressed commit without thinking or checking.
Compiling all application-specific/local JS code into one file, using CDNs like Google's for popular libraries, etc. but loading all of these via headjs in parallel seems optimal...
I'd say this is basically right. Do not combine multiple external libraries into one file, since—as it seems you're aware—this will negate the majority case of users' browsers having cached the (individual) resources already.
For your own application-specific JS code, one consideration you might want to make is how often this will be updated. For instance if there is a core of functionality that will change infrequently but some smaller components that might change regularly, it might make sense to only compile (by which I assume you mean minify/compress) the core into one file while continuing to serve the smaller parts piecemeal.
Your decision should also account for the size of your JS assets. If—and this is unlikely, but possible—you are serving a very large amount of JavaScript, concatenating it all into one file could be counterproductive as some clients (such as mobile devices) have very tight restrictions on what they will cache. In which case you would be better off serving a handful of smaller assets.
These are just random tidbits for you to be aware of. The main point I wanted to make was that your first instinct (quoted above) is likely the right approach.
So if I'm using the YUI loader (on Yahoo's servers) to serve JavaScript, and I tell it to use combinations so several YUI Widgets can be downloaded by the browser in a single request, doesn't that make it much more difficult for the browser to cache the JavaScript?
Say I have a site with two pages, and the first one uses the YUI Calendar, Dialog, and Tree widgets, and the browser gets them all in one combined request from YUI's servers.
And the next page only uses the YUI Calendar and Dialog, but not Tree. Doesn't that mean it's technically a different request to Yahoo's servers now, with a different query string? Meaning those two widgets will be downloaded again, even though they were just used on the first page?
In that case, is it better to have one request to the combo server which will result in one single request of (in many situations) uncachable JavaScript? Or several requests for individual YUI components which can be cached?
(YSlow doesn't seem to mention anything about this question.)
It is most likely better to use the combo service. On the first visit to your page, users would otherwise be penalized by the HTTP overhead and the processing overhead associated with receiving each file separately. There is also the issue of browser limits on concurrent connections, so async or not, the non-combo-handled files would result in far worse page load time. While you would benefit from those individually cached files on subsequent pages, in all likelihood each page is going to have other module requests, which would amount to more HTTP requests (remember that one additional module could mean more than one module request after dependencies are accounted for). So it amounts to optimized network IO for the first page, then larger payloads with minimal HTTP overhead on subsequent pages, versus very unoptimized network IO for the first page, then less content with more HTTP overhead on subsequent pages.
If you have a site with many js-enabled pages or you otherwise attempt to mitigate the initial module loading issues, there may be justification in avoiding combo. But really, if it is a matter of shaving every last ms from the loading step, A) you may be missing more fruitful optimizations, and B) the only real way to answer your question is by profiling.
Another approach to address the situation you identify is to configure the YUI instance at construction with some custom rollup modules that represent common groupings of modules you use. This can be a pretty involved undertaking and would naturally introduce a maintenance step.
So in summary, there's probably less to worry about than you think, and allowing default behavior using combo is easy. Any definitive answer would be case by case, and based on profiling your app.
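A rough sketch of the custom rollup idea mentioned above (group, path and module names are hypothetical; check the YUI Loader documentation for the exact metadata your version expects):

YUI({
    combine: true,   // keep using the combo service for core modules
    groups: {
        app: {
            combine: true,
            base: '/js/',
            modules: {
                // hypothetical rollup of the widgets most of your pages share
                'app-common': { use: ['calendar', 'panel', 'tree'] }
            }
        }
    }
}).use('app-common', function (Y) {
    // the rolled-up widgets are attached and ready here
});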
I think you probably want to statically load everything you actually use and minify everything into one file in your production version.
There's info about statically loading on the YUI page but it basically boils down to using script tags for all your files like you would with normal javascript and a use call that looks like this:
YUI().use('*', function(Y) {
// Any modules that were already loaded on the page statically will now be
// attached and ready to use. YUI will not automatically load any modules
// that weren't already on the page.
});
I haven't been heavily involved in setting up minification before, but the tool used on the last project I worked on was UglifyJS, I think.
We're looking at splitting our .js so it is served from two domains, with the intent that this would enable concurrent loading.
Question: Can we a) use subdomains for that purpose and b) will that concurrent loading also hold true over https?
For instance, we'd like to request two files as such:
https://www.example.com/firstfile.js
https://subdomain.example.com/secondfile.js
Doable? Alternatives?
As far as I am aware, it won't work. Scripts are set up to block parallel downloads. The reason is that parallel loading of scripts can cause race conditions in your JavaScript. Minification or on-demand loading are your best options.
I think you have to consider network latency (a kind of lost time that adds up with every call's round trip). Latency is what kills the responsiveness of HTTP calls.
Personally, I follow the trend of reducing the number of HTTP calls.
I merge all my files into one (+ minify + gzip).
The problem caused by scripts is that they block parallel downloads. The HTTP/1.1 specification suggests that browsers download no more than two components in parallel per hostname. If you serve your images from multiple hostnames, you can get more than two downloads to occur in parallel. While a script is downloading, however, the browser won't start any other downloads, even on different hostnames. (source)
Sounds problematic.
An alternative presented in the book "High Performance Web Sites" or "Even Faster Web Sites" (both of which I recommend you read) suggests loading the JavaScript files dynamically, using a JavaScript function that appends script elements to the document.
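In code, that technique looks roughly like this (using the second file from the question):

// create a script element and append it to the head; the file downloads
// without blocking other downloads or rendering
var script = document.createElement('script');
script.src = 'https://subdomain.example.com/secondfile.js';
document.getElementsByTagName('head')[0].appendChild(script);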
You might want to do some research on the topic, but it is a good practice worth considering.
a) Yes. Use document.domain to avoid Same Origin Policy issues.
b) I don't know, but I can't think of any reason why it shouldn't.
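For a), setting it is a one-liner on both pages; note that document.domain can only be set to a suffix of the current hostname:

// run on pages served from www.example.com and subdomain.example.com
// so that scripts in one can access the DOM of the other
document.domain = 'example.com';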