jquery-1.4.2.min.js is 71.8KB
Same file compressed through this tool, with gzip enabled, becomes 32.9 KB
Which is better? If the latter, why doesn't jQuery provide a packed file too, instead of just the uncompressed and minified versions?
My question: one is minified and gzip-enabled, the other is minified, packed, and gzip-enabled. Which should I use? If the one that's 32 KB, I wonder why jQuery doesn't provide a minified+packed version. Is there any particular reason why not?
Thanks
It's not an either/or question: use both. Serve the minified file over a gzip stream to the browser for the best, quickest delivery possible.
Most web servers and almost every current browser support gzip. You're serving the minified file, with internal variables shortened and so on, but then delivering a zipped version of that file to the client. By doing this you're delivering the minimum amount of JavaScript for the client to execute, in the smallest payload, so a quicker download for your user.
Also, remember to set cache headers so the client isn't re-fetching the file... and there are other performance tips along these lines that you should read :)
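As a sketch of what that looks like on Apache (the directives assume mod_deflate and mod_expires are loaded; adjust the MIME types to your setup):

```apache
# Compress text-based responses on the fly (mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Far-future cache headers for static assets (mod_expires)
ExpiresActive On
ExpiresByType application/javascript "access plus 1 year"
ExpiresByType text/css "access plus 1 year"
```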
Gzip encoding is handled on the fly by web servers. It isn't a feature of the file uploaded to the server, so it wouldn't make sense to provide the file in that format for download.
Gzip encoding and minification are not mutually exclusive.
Perhaps you mean the version packed with Dean Edwards' Packer? That does yield a smaller download, but it requires some processing on the client side in order to decompress it.
Related
I am trying to understand JavaScript minification and compression processes and have a couple of questions about them:
Since minification makes the code difficult to debug, is it possible to do on-demand de-minification on the client side to cover up for cases where you actually need to debug and investigate something on the website?
I remember reading somewhere that one can enable compression of all resources (like images, CSS, JavaScript etc.) by setting some options in the Apache Web Server. Is there any difference between the JavaScript compression done at the Apache level and the one done using tools like YUI Compressor?
Can someone help me understand the above?
The kind of case where I would actually need to de-minify my JavaScript files is, let's say, a JavaScript error happening at line no. X. With a minified file it would be very tough to know which block of code caused that error in production, as the lines are all wrapped up in the minified file. How do you guys investigate and debug in such circumstances? Another user also mentioned this debugging problem in the question Packed/minified javascript failing in IE6 - how to debug? (slightly specific to IE6 though).
You shouldn't be debugging minified code. Ideally, the development process goes like this:
You build and debug the site locally. You have full versions of your JavaScript files, stylesheets and everything else.
You deploy a version to production machine. You minify and gzip a copy of your assets and push that to the server. Your local development copy is left untouched.
If there's a bug in production, you modify your local version, minify, compress, upload.
Repeat until PROFIT
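The minify-and-compress step in that deploy can be sketched like this. This is a hypothetical example: the regex-based stripping below is a crude stand-in for a real minifier such as YUI Compressor (it will mangle string literals containing `//`, for instance), and the gzipped bytes are what the server would send with `Content-Encoding: gzip`:

```python
import gzip
import re

def naive_minify(js):
    """Crude stand-in for a real minifier: strip comments, collapse whitespace."""
    js = re.sub(r"/\*.*?\*/", "", js, flags=re.DOTALL)  # block comments
    js = re.sub(r"//[^\n]*", "", js)                    # line comments
    return re.sub(r"\s+", " ", js).strip()              # collapse whitespace

def build_asset(source):
    """Minify, then gzip: the bytes you'd serve in production."""
    minified = naive_minify(source)
    return gzip.compress(minified.encode("utf-8"))

source = """
// add two numbers
function add(first, second) {
    return first + second;   /* simple */
}
"""
payload = build_asset(source)
print(len(source), len(payload))
```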
Chrome Dev Tools can de-obfuscate (and de-minify) JavaScript code if you want to debug production code (useful when trying to replicate a bug on a live environment that you may not be seeing in dev).
Typically developers will develop against the uncompressed script file, compress right before deploying.
If you have to go back and debug your script files, you'd just open up the regular, uncompressed file, do your work, compress, and deploy. If you mean debug something while your website is in production, then no, you can't un-minify your script file on demand.
And yes, Apache, and even IIS, can gzip compress scripts and images automatically for you.
Since minification makes the code difficult to debug, is it possible to do on-demand de-minification on the client side to cover up for cases where you actually need to debug and investigate something on the website?
Sort of. Minified JavaScript has the same structure; it just does things like deleting extra spaces and shortening variable names. So you can easily make the code readable again, either manually or with a script, but you can't recover the variable names, so the code will still be harder to work with. If you have the original code, definitely don't get rid of it; save the minified code separately.
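As a hypothetical illustration (a toy script, not a real formatter): restoring line breaks is mechanical, but nothing brings the original names back:

```python
import re

def prettify(minified_js):
    """Re-insert line breaks after statement and block boundaries.
    Naive: it ignores string and regex literals, so it is only a sketch."""
    return re.sub(r"([;{}])", r"\1\n", minified_js).strip()

print(prettify("function f(a,b){var c=a+b;return c;}"))
# The layout becomes readable again, but f, a, b and c keep their
# minified one-letter names.
```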
I remember reading somewhere that one can enable compression of all resources (like images, CSS, JavaScript etc.) by setting some options in the Apache Web Server.
Yes, it's called gzip compression. It's not unique to Apache, but you would need to configure your server to enable it.
Is there any difference in the JavaScript compression done at the Apache level and the one done using tools like YUI Compressor?
Yes. YUI Compressor is a minifier: its output is valid JavaScript. Server-side compression is more analogous to zipping a file: the browser must decode it before the script can be used. Using the two together yields the smallest file size.
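A quick sketch of that claim in Python, using a hand-made stand-in snippet (the actual numbers depend on the file, but the ordering is typical):

```python
import gzip

# Repetitive, comment-heavy source standing in for a real script file.
original = ("// compute the total\n"
            "function computeTotal(items) {\n"
            "    var total = 0;\n"
            "    for (var i = 0; i < items.length; i++) { total += items[i]; }\n"
            "    return total;\n"
            "}\n") * 20

# The same code "minified" by hand: comments gone, names shortened.
minified = "function computeTotal(a){var t=0;for(var i=0;i<a.length;i++){t+=a[i];}return t;}" * 20

sizes = {
    "original":      len(original),
    "minified":      len(minified),
    "original+gzip": len(gzip.compress(original.encode("utf-8"))),
    "minified+gzip": len(gzip.compress(minified.encode("utf-8"))),
}
print(sizes)  # minified+gzip comes out smallest of the four
```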
I prefer working with a local unminified copy of the JS file, and when I deploy the site, I minify all the JS files into one. That way it is easy to debug and improve the code. However, there are tools to revert minification; have a look at this SO post on reverting minification of JavaScript.
Have a look at GZIP compression. This blog describes how to enable GZIP in Apache and how to verify that your server actually is compressing the files.
is it possible to do on-demand de-minification on client-side
Some browsers have a "pretty code" view that automatically formats source code. See Firebug's CSS tab.
Is there any difference in the javascript compression done at Apache level and, the one done using tools like YUI Compressor?
YUI Compressor is actually a minifier. Apache compression is like zipping up the file before it is sent to the client, so the actual file sent is smaller than the file on disk. They are two different technologies that are independent of each other.
We have a website, say 'abc.com', which uses a lot of JavaScript and CSS hosted on another server, 'xyz.com'. We upload the JS and CSS to that server and it gives us a URL, which we reference in our code.
Now I ran YSlow on my website, and it complains that these JavaScript and CSS files can be compressed. When I inspect the response headers using Firebug, the Content-Encoding of the response is set to 'GZip'.
My question is: how do we enable compression for these JavaScript and CSS files hosted on the other server? Is there something we can do on our side?
Any suggestions are welcome.
You will have to use tools like YUI Compressor to compress (minify) your JS and CSS files before uploading them to the server.
EDIT:
Please check this link on how to enable gzipping of your JS and CSS files. But I doubt it is possible for you to do this, since the files are hosted on a third-party server (unless you are managing it).
JavaScript and CSS compression goes beyond typical all-purpose compression algorithms like gzip.
There are domain-specific solutions for compressing JavaScript and CSS.
See:
http://developer.yahoo.com/yui/compressor/
http://code.google.com/closure/compiler/
https://github.com/mishoo/UglifyJS
To clarify the terminology used by YSlow (and similar tools like Google's PageSpeed):
Compression reduces response times by reducing the size of the HTTP response. Gzip is the most popular and effective compression method currently available and generally reduces the response size by about 70%. Approximately 90% of today's Internet traffic travels through browsers that claim to support gzip.
Minification removes unnecessary characters from a file to reduce its size, thereby improving load times. When a file is minified, comments and unneeded white space characters (space, newline, and tab) are removed. This improves response time since the size of the download files is reduced.
Some good references that cover both compression and minification:
Yahoo: http://developer.yahoo.com/performance/rules.html
Google: http://code.google.com/speed/page-speed/docs/payload.html
Stoyan Stefanov: http://www.phpied.com/reducing-tpayload/ (examples for Apache+PHP, but can apply to any web server)
As robert mentioned in his answer, enabling compression on the other server would be a configuration change on that side. If you wanted to minify the JS/CSS components, you could do that with a minification tool prior to hosting on the other server.
For things like jQuery etc., is it better to use a CDN, or to minify it into one file together with the other JS?
CDN: it's likely the file will already be cached on the user's machine, and thus you'll save the download entirely. Not to mention it will load faster from a CDN than from your site anyway; the overhead of the one extra connection to grab that file is de minimis.
All your code should definitely be combined & minified. For the libraries, it's a bit trickier. CDNs are good in theory, but some studies have shown that they were not actually as efficient as they could be because of various reasons.
That means that if you have a 50% miss rate on your CDN, the overhead of the extra DNS resolution and extra connection can actually slow you down more than it helps.
The most important thing, in any case, is that you should version your minified/combined JS file: give it a unique URL for every version of the code you deploy. That way you can set Expires headers to +10 years and make sure that anyone who downloads it only downloads it once.
Also don't forget to enable gzip (mod_deflate in Apache); that will typically compress the transfer to 1/5th-1/10th of its original size.
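One common way to get that unique URL per deploy, sketched here with a hypothetical naming scheme, is to embed a hash of the file's contents in its name:

```python
import hashlib

def versioned_name(filename, contents):
    """Build a content-addressed name, e.g. app.min.js -> app.1a2b3c4d.min.js,
    so every change to the code produces a brand-new URL."""
    digest = hashlib.md5(contents).hexdigest()[:8]
    stem, ext = filename.split(".", 1)
    return "{}.{}.{}".format(stem, digest, ext)

name = versioned_name("app.min.js", b"function add(a,b){return a+b;}")
print(name)
```

Reference the hashed name in your HTML; since an old URL is never reused, the file itself can safely carry a far-future Expires header.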
Using a CDN is great, as the JS file may already be cached on the user's computer from the CDN.
But there might be some jQuery plugins, plus your own site's validation and other functions, that are split across different JS files; for those, minifying + combining is a good approach.
For our own convenience we separate the code into different files, but when the browser tries to load content it has a limit on how many concurrent requests it will send to the same server. A CDN is outside your domain, so its file is requested without counting against that limit and loads fast. You need to combine your own JS files to reduce the number of requests the browser makes, so your page loads faster.
For me, I use PHP to combine and minify.
In the HTML:
<script src="js.php"></script>
and in js.php:
<?php
ob_start('ob_gzhandler'); // gzip the output when the browser supports it
header('Content-type: text/javascript');
include 'js/plugin.js';
include 'js/validation.js';
You can also use output buffering to minify the combined content before it is sent.
As far as I know, the min version of a .js (JavaScript) file is obtained by removing unnecessary blank spaces and comments, in order to reduce the file size.
My questions are:
How can I convert a min.js file into a clear, easily readable .js file?
Besides size (and speed), are there any other advantages of the min.js file?
Can the JS files be encrypted?
Can JS be infected? I think the answer is yes, so the question is: how do I protect the .js files from infection?
The first question is the most important one, and it's the one I'm mainly looking for help on.
TY
To convert a minified file into an editable source, simply open it in any IDE that supports auto-formatting and auto-format it. I use NetBeans to do this.
If the client caches the minified file, it also has fewer bytes to fetch and process. Size and speed are the main advantages of a minified file, and they are already great advantages in preparing for a future that demands ever greater amounts of data transfer. By the way, it also saves you some bandwidth on your server, and therefore money.
I don't see the need of encryption. See How to disable or encrypt "View Source" for my site
JavaScript files cannot be edited unless it is done on the server. The security of your JavaScript files depends on 1) server protection and 2) data protection. Data should not be exploitable. Of course, since JavaScript is executed on the client side, it would be meaningless for a client user to attack himself; however, Twitter has shown multiple JavaScript exploits. You need to constantly test and check your code against XSS, CSRF and other attacks. That is to say: if your JavaScript file has a loophole, it was the developer (you) who created it.
Multiple minifiers exist that can also compress JS; see http://dean.edwards.name/weblog/2007/04/packer3 for one of the most widely used. Others exist as well; see also the JSMin library, http://www.crockford.com/javascript/jsmin.html.
The main advantage is the size gain. You should also aggregate your JS files when you have several, as this saves a lot of I/O (fewer HTTP requests) between the server and the client. This is probably even more important than minifying.
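The aggregation step itself is simple; here is a sketch (the file names are hypothetical):

```python
def combine(files):
    """Concatenate script sources into one bundle, in order."""
    parts = []
    for name, src in files.items():
        src = src.rstrip()
        if not src.endswith(";"):
            src += ";"  # guard against scripts that omit a final semicolon
        parts.append(src)
    return "\n".join(parts)

bundle = combine({
    "js/plugin.js": "var plugin = {}",
    "js/validation.js": "function validate(form) { return true; }",
})
print(bundle)
```

Serving one bundled (and then minified and gzipped) file replaces several HTTP requests with a single one.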
I can't answer you about encryption. Client security will mainly depend on its browser.
EDIT: OK, my first answer wasn't for the first question; I've merged the two.
I have written an ashx handler which merges JavaScript and CSS content and removes whitespace.
Using the VS2010 ASP.NET Development Server, everything works fine.
But in IIS7, text/javascript content is not compressed (I'm using Fiddler to monitor it).
I don't have this problem with text/css content, and both content types are handled by the same ashx file.
What are your compression settings in IIS? Do you have dynamic script compression enabled?
I found it much easier to implement my own filter for compression than rely on IIS.
Also, on a side note, you are aware that if you call your CSS through an ASHX file all paths in the CSS will be relative to the ASHX and not the CSS file?
Have you enabled dynamic content compression in IIS? (Since it comes from code, an HTTP handler, it is dynamic content.)
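If not, it can be switched on in web.config, roughly like this (a sketch: the Dynamic Content Compression feature must be installed, and the httpCompression section may need to live in applicationHost.config depending on your setup):

```xml
<configuration>
  <system.webServer>
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
    <httpCompression>
      <dynamicTypes>
        <add mimeType="text/javascript" enabled="true" />
        <add mimeType="text/css" enabled="true" />
      </dynamicTypes>
    </httpCompression>
  </system.webServer>
</configuration>
```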
But it seems to me, from testing something rather similar, that IIS doesn't always compress dynamic content (sometimes Fiddler showed compression, sometimes not), and it wasn't clear why (or sufficiently important for me to dig into it).
Also note that when using Fiddler for this, you need to make sure you are not using the default Fiddler view/options, which decompress responses for display.