For things like jQuery etc., is it better to load them from a CDN, or to minify them into one file together with the rest of your JS?
CDN - it's likely the file will already be cached on the user's machine, so you save that download entirely. Not to mention it will usually load faster from a CDN than from your site anyway - the overhead of the one extra connection to grab that file is de minimis.
All your code should definitely be combined & minified. For the libraries, it's a bit trickier. CDNs are good in theory, but some studies have shown that they are not always as efficient as they could be, for various reasons.
That means that if you have a 50% cache-miss rate on the CDN, the overhead of the extra DNS lookup and extra connection can actually slow you down more than it helps.
The most important thing anyway is that you should version your minified/combined JS file: give it a unique URL for every version of the code you deploy. That way you can set the Expires header to +10 years and make sure anyone who downloads it only downloads it once.
Also don't forget to enable gzip (mod_deflate in Apache); that will typically compress the transfer to 1/5th-1/10th of its original size.
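As a rough sketch of those two ideas together (the path and the expiry duration are placeholders; this assumes mod_expires and mod_deflate are enabled):
# far-future caching for versioned JS, gzip for text assets
<IfModule mod_expires.c>
    ExpiresActive On
    # safe because every deploy gets a new URL, e.g. /js/app.3f2a1c.min.js
    ExpiresByType application/javascript "access plus 10 years"
</IfModule>
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE application/javascript text/css text/html
</IfModule>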
Using a CDN is great, as the JS file may already be cached on the user's computer from another site that pulls it from the same CDN.
But there will also be jQuery plugins, your own site's validation, and other functions that are spread across different JS files; for those, minify + combine is the good approach.
For our own convenience we separate the code into different files. Browsers limit how many concurrent requests they send to the same server; a CDN is outside your domain, so requests to it don't count against that limit and the library loads fast. Your own JS files, though, need to be combined to reduce the number of requests the browser makes, so your page loads faster.
Personally, I use PHP to combine and minify.
In the HTML:
<script src="js.php"></script>
and in js.php:
<?php
header('Content-Type: text/javascript');
include 'js/plugin.js';
include 'js/validation.js';
You can use output buffering to minify the result and to send the content gzipped to the browser.
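A sketch of that last point: ob_gzhandler is PHP's standard gzip output callback, so starting the buffer with it before the includes compresses the combined script (a custom minifier callback could be layered on top, which is not shown here):
<?php
// gzip the combined output if the browser sends Accept-Encoding: gzip
ob_start('ob_gzhandler');
header('Content-Type: text/javascript');
include 'js/plugin.js';
include 'js/validation.js';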
Related
I am compressing all of my files on the go, so it's easier to update them rather than having to decompress first. I have around 10 JS files, around 2,000 lines in total, maybe more. Would it be better to put them all in one file and compress that, and would it speed up my website, or should I just leave them as individual files and compress each one?
I'm assuming this is for web development.
If all of your scripts were created by you, and you suspect each script will be needed for every page on your site, you should concat / compress them. The first load will take longer, but the scripts will be cached.
If all of your scripts were authored by you, but each page does not necessarily need all 10 of your scripts, you should consider lazy loading them on demand.
If any of those scripts were not authored by you and can be found on a CDN (like jQuery), then link the scripts to a CDN, as there is a chance users will already have them cached. For the remaining scripts, decide if you should lazy load or concat / compress them all.
What you shouldn't do, however, is load all 10 of your scripts individually on each page. That would just have the user's browser send more requests than needed.
It's all about trade-offs, and there isn't a 100% correct answer. Good luck :)
--edit--
You said "on the go". If the content of your scripts changes, then you wouldn't want them to be cached. In that case, lazy-loading would probably be the answer.
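A rough sketch of lazy loading with jQuery (the selector, file name and initReports function are made-up placeholders):
// load a page-specific script only the first time the feature is used
$('#reports-tab').one('click', function () {
    $.getScript('/js/reports.js', function () {
        initReports(); // hypothetical init function defined in reports.js
    });
});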
--edit 2--
If you haven't already, you should really take a look at using Grunt to concat and minify your js files during development. If you decide to go that route, take a look at grunt-contrib-watch.
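A minimal Gruntfile.js sketch along those lines (assumes grunt-contrib-concat, grunt-contrib-uglify and grunt-contrib-watch are installed; paths are placeholders):
module.exports = function (grunt) {
    grunt.initConfig({
        // concatenate all source files into one, then minify the result
        concat: { dist: { src: ['src/**/*.js'], dest: 'build/app.js' } },
        uglify: { dist: { src: 'build/app.js', dest: 'build/app.min.js' } },
        // rebuild automatically whenever a source file changes
        watch:  { scripts: { files: ['src/**/*.js'], tasks: ['concat', 'uglify'] } }
    });
    grunt.loadNpmTasks('grunt-contrib-concat');
    grunt.loadNpmTasks('grunt-contrib-uglify');
    grunt.loadNpmTasks('grunt-contrib-watch');
    grunt.registerTask('default', ['concat', 'uglify']);
};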
It is often a good idea to keep the number of files the browser needs to request to a minimum.
The browser may have to open one connection per HTTP request, but if you put all your JavaScript code in one big file it will only have to do one request and thus only one connection needs to be opened to fetch your js code.
It depends on how much of that 2000 lines you are using for different pages. If the file is like a library of which most is used in your distinct pages, then it might actually make little to no difference in terms of loading speed. But if only a part of that file is being used in each page, I would assume separating would be smart as less will be needed to load per page.
To improve the performance of our website, we have split the code into modules (separate JS and CSS files for every feature). Now we load only the core files during the page load, and load the remaining files only when the user requests the feature. Instead of sending the JS and CSS requests separately, we are thinking of loading one concatenated file (the JS file and the CSS file together) to save a request. Is there a way to do this?
Note: we have already tried the following techniques
http://blogs.msdn.com/b/shivap/archive/2007/05/01/combine-css-with-js-and-make-it-into-a-single-download.aspx
Loading it via Ajax and splitting them via JS.
Consider that a carefully designed site will be able to download JS files and CSS files in parallel. So once you combine all CSS into a single CSS file and all JS into a single JS file, and you make sure the two can be downloaded in parallel (e.g. using JS defer), you should get better end-to-end performance than downloading a single file with JS+CSS concatenated (and this also avoids the time spent processing the Ajax response).
BTW, all of the above can be done automatically and dynamically by mod_pagespeed (if you're on Apache or Nginx). Also, the PageSpeed documentation gives some useful information on such issues.
If all of the above is not enough, you may want to look into CDNs (Cloudflare is a good free starting point) and make sure that resource caching is optimally configured (mod_pagespeed partially does this for you).
I have an HTML file with JS (jQuery) and CSS. I want a converter that takes all of the files, minifies them and puts everything into a single index.html, for example. Google seems to be doing this: they have no external files, not even images; everything is in one file, and I'm sure it is pre-compiled before release.
Also is this a good idea?
This is not a good idea, in general.
Splitting out your CSS and JavaScript files means that they can be cached independently. You will likely be using a common CSS and JavaScript across many pages. If you don't allow those to be cached, and instead store them in each page, then the user is effectively downloading a new copy of those files for every page they visit.
Now, it is a good idea to serve minified versions of these files. Also make sure to add gzip or deflate transfer encoding so that they are compressed. Text compresses nicely... usually around a ratio of 1/8.
(I should note that there has been one occasion where I have loaded everything into a single file. I was working on a single-page web application for the Nintendo Wii, which had no caching capability at all. This is about the only instance where putting everything into a single file made sense. Even then, it is only worth the effort if you automate it server-side.)
I don't recommend concatenating CSS with JS.
Just put your css at the top of the page and js at the bottom.
To minify your CSS and JS you can use gruntjs.
Also I recommend you to read this article: Front-end performance for web designers and front-end developers
If your intention is to load the pages faster:
For images: try to use image sprites or serve images from different domains, because browsers can download resources from several domains in parallel rather than queuing them all on one domain.
For scripts as well as CSS: use minifiers that strip whitespace and reduce the size (if you are on a web host, your host may already be compressing the scripts for you using gzip etc.).
For landing pages like index pages: if you have only a few styles, try inserting them inline inside a <style></style> tag; this will make the page load very fast. Facebook mobile does it that way.
If it weren't a good idea, Google wouldn't be using it!
If you put everything in a single file, you'll get fewer HTTP requests when the browser checks whether a newer version of the file is available.
You also get rid of the problem of some resources not being refreshed, which is a headache for 'normal' developers but a disaster in AJAX applications.
I don't know of any publicly available tool that does it all; Google surely has its own. Note also that, in GWT for example, much of this embedding was done by the compiler.
What you can do is to search for:
CSS image embedder - for encoding images into CSS
CSS and JS minifier - for building single CSS/JS and minimizing it
And you need some simple tool that will embed it into HTML.
I usually have jQuery code that is page-specific along with a handful of functions that many pages share. One approach is to make separate files for organization, but I'm thinking that putting all the script in one file and adding comments for readability would also work. Then when the site goes live I can minify and obfuscate if needed.
I think the question comes down to limiting http requests or limiting file size. Is one of these a bad habit?
You can have it both ways. Develop with as many individual .js files as you need. Then use a build/deployment process that assembles the files into one larger one, then pushes them through something like Google's Closure Compiler. Compression can be handled transparently by your web server if configured properly.
Of course, this implies a structured development and deployment workflow -- e.g., with files to be assembled/compiled in a specific directory, separated from files that should be served as-is.
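A sketch of what the build step might look like (the file names are placeholders; the flags are standard Closure Compiler options):
# concatenate the assembled sources and compile them in one pass
java -jar compiler.jar \
    --compilation_level SIMPLE_OPTIMIZATIONS \
    --js src/shared.js \
    --js src/page.js \
    --js_output_file build/app.min.js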
References:
Closure Compiler
Apache Ant
Automating the Closure Compiler with Ant
If you can put all the scripts in one file which is minified then that's what you should do first.
Also, if your web server sends out gzipped content, the actual script transfer will be small, and the script will be cached on the client. Since TCP transfers start out slow and increase in speed, limiting the number of requests is the best way to speed up the overall loading of a page.
This is the same reason you see sites concatenating images into one larger image, and using CSS to display the correct part of it.
I have a couple of questions that are somewhat related so I'm posting them all on a single question on SO...
Question 1:
I'm currently doing this Facebook application where I'm using jQuery UI Tabs; there are only 4 tabs, 2 of which are loaded through Ajax. The main page is index.html, which is where the tabs code is placed, and for the 2 tabs loaded through Ajax I have two different files, tab1.html and tab2.html.
Currently, the jQuery tabs initialization and Facebook JavaScript initialization are done on index.html. Both tab1.html and tab2.html have JavaScript code that belongs to those pages. For instance, tab2.html has a form and some JS (with jQuery) code to validate it; this code is irrelevant to tab1.html, just as the JS code on tab1.html is irrelevant to tab2.html.
My question is, should I keep doing this, or should I aggregate all the JS/jQuery code from index.html, tab1.html and tab2.html into a single global.js file and include it in index.html?
I thought of doing this, but there would be irrelevant code loaded if the user never opens tab1 or tab2. The benefit of using a single global.js file is that I could pack/minify the file, which I couldn't do if I included each code block in its respective tabX.html file.
Question 2:
As I'm using jQuery, I'm also using lots of plugins (actually only 3 for now, but that number can grow). Some of them provide a minified JS file and I use those when available; when they don't, I use the normal versions, of course.
There's also the requests problem. If I have lots of plugins, say 10, that will be 10 requests for those plugins. There is also the fact that some plugins are used in tab1.html but not in tab2.html and vice versa.
How should I load all the plugins, minified/packed, in a single web request? Should I do that manually before publishing my app (packing and merging them into a single file), or could I use the PHP version of Dean Edwards's Packer and pack/merge all plugins on the fly? Would this be a good approach?
Question 3:
If the answer on Q1 was something like "merge all code in a single global.js file", should I include the global.js file in the packing/merging script I described above on Q2?
Doing this would simplify everything. I could have my development environment properly organized with all .js files, for the plugins and the global.js in the appropriate folders without bothering with anything else. The packing/merging should take care of the rest (pull the files from the respective folders, send the respective JS headers and output one single packed .js file).
The one thing that's confusing me the most is that not all plugins are used for every tab, and not all code is for every tab either. Still, a chunk of the code is global to every tab and to the index. Packing everything also simplifies things because: a) I don't have to worry about adding the needed code to each tabX.html file and can simply look at them as HTML templates and nothing else; b) I don't have to bother including the necessary plugins where I need them, as I'm currently using $.getScript() from jQuery to load the plugins when, and only when, I need them, but I'm not sure this is a good approach and the code feels dirty and ugly like this.
Question 1:
Pack them all into a single .js file. This will make maintenance easier, and the tiny bit of overhead from the user loading a little JS they potentially may not use does not matter. I would also let Google load the jQuery library for you and then have all of your JS code in a single separate file.
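For instance, something along these lines (the jQuery version and the file name are placeholders):
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
<script src="/js/global.min.js"></script>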
Question 2:
As these plugins don't really change I would manually combine them. Closure Compiler is good at this. When minifying use the highest setting that does not give any warnings.
Question 3:
Yes you will want to minify the global.js
When the browser downloads global.js, it's cached for an amount of time. Thus, when you reference global.js again on a different page, it's not re-downloaded; the browser uses its local copy first. You do a little more work on the initial download, but from then on it should be quicker.
Generally best practices related to javascript for speeding up website loads are:
Minify all javascript and put all of it into a single file (make as much of your javascript external as possible).
Put javascript at the bottom of the document.
Force the web server to assign an expiration date in the future and use a timestamped query string to invalidate old versions of JavaScript files; this will prevent unnecessary requests for your JavaScript if it has not changed. (i.e.: in httpd.conf: ExpiresByType application/x-javascript "access plus 1 year"; in your document: <script type="text/javascript" src="/allmy.js?v=1285877202"></script>)
Configure your web server to gzip all text files.
The main reason why you should keep too much javascript away from tab pages is because it will kill user experience. When a user clicks on a tab for the first time it will grab all the components needed on the fly which makes it kinda sluggish.
Your question is only semi-specific, as we don't know a lot of things about your site, like exact file sizes and how the modules are really used.
The general idea would be to find balance between modularity and speed.
When you're combining modules together these are the general ideas you should consider:
how often does this module change?
how often is this module used?
how big is this module (filesize)?
Then take the most-used, most stable code and merge it into one file, and include the remaining site-specific functionality on the tab pages.
Also, make sure to load JavaScript asynchronously, so it won't block rendering of the page (and tabs).
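For example, either attribute keeps the script from blocking rendering (the file names are placeholders):
<script src="/js/site.js" async></script>
<!-- or use defer if execution order relative to other scripts matters -->
<script src="/js/tabs.js" defer></script>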
Another combined answer:
if adding all the JS together in a packed/minified version generates no more than 30k of file size, you're better off combining it. A single extra connection for a file (assuming it's not cached) is worth 10-20k of extra JS download. This has to do with browsers opening and closing connections vs. streaming an extra 20k on an established connection. The threshold also depends on your user distribution. If you have a lot of dial-up or low-bandwidth users, your threshold will be smaller.
I typically recommend combining and loading as 1 file unless the library is very obscure and requires a very edge case for it to be triggered on a page. Ex: hover triggers functionality Y, but it's on a feedback widget that gets less than 1% of traffic - don't bother combining.
Minifying and packing is a little overrated these days. With the vast majority of browsers supporting gzip, compressing the file over the wire has virtually the same effect as minifying/packing it, though there is a small cost on the browser to unpack it. Having said that, it's still good practice to minify/pack the code, since not all browsers support gzip, you may not want the file to be gzip-enabled, etc.
I've used online packers against 3rd party module and it works fairly well. However, there are times when it can cause an issue so make sure to test your manually packed version before deploying.
Alternate:
If you feel that your users will stay on your index page for longer than 10 seconds, you could preload the additional libraries separately using the JS Loader Prototype pattern.
Steve Souder's Even Faster Websites is a book you should look into.
Firstly, one experiences slowdowns because whenever an external script is linked, the browser waits for the script to download, parse and execute; only after that does it resume processing the rest of the page. To avoid such slowdowns, look at downloading the scripts in parallel. A few techniques: fetch the scripts via Ajax if they are on the same domain, or use a Script DOM element or Script in iframe if the scripts are on external domains.
Q1: For me, modularising all the content is the better option with respect to further development if the page content has to change constantly. Responsiveness is very important for the end user. A small global.js will help in getting the app up and running; in parallel, the tabX.html files can be downloaded.
Q2: The jQuery plugins rarely change. The plugins for the tabX.html pages can be downloaded in parallel and cached locally, so when a tabX.html is loaded the required plugins need not be fetched again. So all the plugins required by the main page should be in one single file, and the ones used by the tabX.html pages should be in different files.
Q3: It's a personal choice here. Do you want it to be developer-friendly or user-friendly? I bank on user-friendliness; making responsive and efficient apps is our job. The advantage of packing everything into a single file is ease of development, and ugly code begets beautiful apps :). But users are speed-aholics. For example, when Google changed from 10 results per page to 20, they saw a considerable drop in search queries. So my opinion is not to pack all of them into one file, and instead to load each in parallel.
Some of the techniques, and relevant links for testing each:
XHR eval /ajax : http://stevesouders.com/cuzillion/?ex=10009
XHR Injection : http://stevesouders.com/cuzillion/?ex=10015
Script in Iframe : http://stevesouders.com/cuzillion/?ex=10012
Script DOM element : http://stevesouders.com/cuzillion/?ex=10010
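For reference, the "Script DOM element" technique boils down to something like this (the URL is a placeholder):
// create a script element and append it; the download won't block other resources
var script = document.createElement('script');
script.src = 'http://anydomain.com/plugins.js';
document.getElementsByTagName('head')[0].appendChild(script);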
Question 1:
The best practice would be to place all JS files in a single "global" file. This minimizes your HTTP requests. Let's say you have 5 plug-ins; this would mean you need to make 5 requests, whereas if you combine them into one, you only need to request it once. This might be a little bit heavy on the first load, but the next time around the file will be cached by the browser, so no worries about the size. HOWEVER, be careful about the sequence of the scripts when combining them (i.e. the jQuery script should be placed in the JS file before jQuery UI's).
http://articles.sitepoint.com/article/web-site-optimization-steps/4
http://code.google.com/speed/page-speed/docs/rtt.html
Question 2:
You can do it manually or automatically. Dean Edwards's Packer is a good choice. If you're using ASP.NET, you can check MB Compression Handler; if you're using Apache with PHP, perhaps you can change your .htaccess configuration to gzip it.
Question 3:
It'd be better if you pack the "global" JavaScript file as well. This saves bandwidth and load time. You get the point: combining all the JS files you need for the site saves you from including individual scripts.