We have a website, say 'abc.com', which uses a lot of JavaScript and CSS hosted on another server, 'xyz.com'. We upload the JS and CSS files to this server and it gives us a URL, which we then reference in our code.
Now I ran YSlow on my website, and it complains that these JavaScript and CSS files can be compressed. When I inspect the response headers using Firebug, the Content-Encoding of the response is not set to gzip.
My question is: how do we enable compression for these JavaScript and CSS files hosted on the other server? Is there something we can do on our side?
Any suggestions are welcome.
You will have to use a tool like YUI Compressor to minify your JS and CSS files before uploading them to the server.
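A typical command-line invocation looks like this (the jar version number is only illustrative; use whatever release you downloaded):

    java -jar yuicompressor-2.4.8.jar script.js -o script-min.js --charset utf-8
    java -jar yuicompressor-2.4.8.jar styles.css -o styles-min.css --charset utf-8

The tool works out whether it is minifying JavaScript or CSS from the file extension.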
EDIT:
Please check this link on how to enable gzip for your JS and CSS files. But I doubt you can do this yourself, since the files are hosted on a third-party server (unless you are the one managing it).
JavaScript and CSS compression goes beyond typical all-purpose compression algorithms like gzip.
There are domain-specific solutions for compressing JavaScript and CSS.
See:
http://developer.yahoo.com/yui/compressor/
http://code.google.com/closure/compiler/
https://github.com/mishoo/UglifyJS
To clarify the terminology used by YSlow (and similar tools like Google's PageSpeed):
Compression reduces response times by reducing the size of the HTTP response. Gzip is the most popular and effective compression method currently available and generally reduces the response size by about 70%. Approximately 90% of today's Internet traffic travels through browsers that claim to support gzip.
Minification removes unnecessary characters from a file to reduce its size, thereby improving load times. When a file is minified, comments and unneeded white space characters (space, newline, and tab) are removed. This improves response time since the size of the downloaded files is reduced.
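A tiny made-up example of what minification strips:

    // Before minification: comments and whitespace intact
    function calculateTotal(price, quantity) {
        // multiply unit price by quantity
        return price * quantity;
    }

    // After minification: the same code, with comments and
    // unneeded whitespace removed
    function calculateTotal(price,quantity){return price*quantity}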
Some good references that cover both compression and minification:
Yahoo: http://developer.yahoo.com/performance/rules.html
Google: http://code.google.com/speed/page-speed/docs/payload.html
Stoyan Stefanov: http://www.phpied.com/reducing-tpayload/ (examples for Apache+PHP, but can apply to any web server)
As robert mentioned in his answer, enabling compression on the other server would be a configuration change on that side. If you wanted to minify the JS/CSS components, you could do that with a minification tool prior to hosting on the other server.
Related
I'm working on an AngularJS project whose loading time is too high (>15 sec). It has too many JS and CSS files; how do I optimise? The page loads everything at once on first load, and I want to make the JS and CSS load on demand.
Where should I start optimising? Please suggest some good guidelines and workflows.
Some points to consider:
Make sure your CSS, JavaScript and HTML files are being served with gzip encoding (see the Apache sketch after this list). This will dramatically reduce the amount of data that needs to be transferred on page load.
Ensure your JS and CSS are minified, which eliminates whitespace and reduces the size of the files.
Try to remove dependencies on large, bulky libraries. jQuery is huge, and often a more lightweight library can be used instead.
Take a look at Google's PageSpeed module for Apache and NGINX. It can perform several magic tasks, such as combining and minifying your JavaScript and CSS into single files. https://developers.google.com/speed/pagespeed/module/
If you use a compatible web server, consider enabling HTTP/2 (or SPDY on older servers).
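For the gzip point above, a minimal Apache sketch using mod_deflate (directives go in the server config or an .htaccess file; adjust the MIME types to match your assets):

    <IfModule mod_deflate.c>
        # compress text-based assets before sending them to the client
        AddOutputFilterByType DEFLATE text/html text/css application/javascript text/javascript
    </IfModule>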
Content Delivery Networks
Some people promote the use of public Content Delivery Networks (CDNs, such as MaxCDN) to serve your JavaScript files. This has several potential advantages:
With popular libraries, it's likely the files will already be cached in the browsers of people who are visiting your site.
CDNs are generally globally distributed, which should make them fast and responsive worldwide.
However, I personally disagree, and I make sure that all my 3rd party JavaScript libraries are served locally. Here's why:
If a CDN is offline or having issues, your site will be broken (though this can be mitigated somewhat with fallback loading; see the sketch after this list)
If a CDN is overloaded or saturated, your site will also be slow
It adds at least one additional DNS lookup on page load, which is another point of failure
If the CDN gets compromised, someone could place malicious JavaScript on your site (this can be avoided by using the "integrity" checksum, also shown in the sketch below)
CDN-served files cannot be inlined, combined and optimised by Google PageSpeed
CDN-served files cannot be pipelined with the rest of your site
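A hedged sketch of the two mitigations mentioned above; the CDN URL and the integrity hash are placeholders, not real values:

    <!-- subresource integrity: the browser rejects the file if its hash doesn't match -->
    <script src="https://cdn.example.com/jquery.min.js"
            integrity="sha384-..." crossorigin="anonymous"></script>
    <!-- fallback loading: if the CDN copy failed, load a local copy instead -->
    <script>window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');</script>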
I have an HTML file with JS (jQuery) and CSS. I want a converter that takes all the files, minifies them, and puts everything into a single index.html, for example. Google seems to be using this approach: they have no external files, not even images; everything is in one file, and I'm sure it is pre-compiled before release.
Also, is this a good idea?
This is not a good idea, in general.
Splitting out your CSS and JavaScript files means that they can be cached independently. You will likely be using a common CSS and JavaScript across many pages. If you don't allow those to be cached, and instead store them in each page, then the user is effectively downloading a new copy of those files for every page they visit.
Now, it is a good idea to serve minified versions of these files. Also make sure to add gzip or deflate transfer encoding so that they are compressed. Text compresses nicely... usually around a ratio of 1/8.
(I should note that there has been one occasion where I have loaded everything into a single file. I was working on a single-page web application for the Nintendo Wii, which had no caching capability at all. This is about the only instance where putting everything into a single file made sense. Even then, it is only worth the effort if you automate it server-side.)
I don't recommend concatenating CSS with JS.
Just put your CSS at the top of the page and your JS at the bottom.
To minify your CSS and JS you can use a build tool such as Grunt (gruntjs).
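A minimal Gruntfile sketch, assuming the grunt-contrib-uglify and grunt-contrib-cssmin plugins are installed (the file paths are made up):

    // Gruntfile.js
    module.exports = function (grunt) {
        grunt.initConfig({
            uglify: {
                build: { files: { 'dist/app.min.js': ['src/app.js'] } }
            },
            cssmin: {
                build: { files: { 'dist/app.min.css': ['src/app.css'] } }
            }
        });
        grunt.loadNpmTasks('grunt-contrib-uglify');
        grunt.loadNpmTasks('grunt-contrib-cssmin');
        // running "grunt" now minifies both the JS and the CSS
        grunt.registerTask('default', ['uglify', 'cssmin']);
    };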
I also recommend reading this article: Front-end performance for web designers and front-end developers
If your intention is to load the pages faster:
For images: try to use image sprites, or serve images from different domains, because browsers can download resources from several domains in parallel rather than queuing them on a single domain.
For scripts as well as CSS: use minifiers that strip whitespace to reduce the size (if you are on shared web hosting, your host may already be compressing the scripts for you using gzip etc.)
For landing pages like index pages: if you have few styles, try inlining them inside a <style></style> tag (see the snippet below); this will make the page load very fast. Facebook mobile does it that way.
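An illustrative snippet of that inlining (the rule itself is made up):

    <head>
        <style>
            /* critical CSS inlined in the page: no extra HTTP request needed */
            body { margin: 0; font-family: sans-serif; }
        </style>
    </head>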
If it weren't a good idea, Google wouldn't be using it!
If you put everything in a single file, you'll get fewer HTTP requests when the browser checks whether a newer version of a file is available.
You also get rid of the problem of some resources not being refreshed, which is a headache for 'normal' developers but a disaster in AJAX applications.
I don't know of any publicly available tool that does it all; surely Google has its own. Note also that in GWT, for example, much of this embedding is done by the compiler.
What you can do is search for:
CSS image embedder - for encoding images into the CSS as data URIs (see the sketch after this list)
CSS and JS minifier - for building a single CSS/JS file and minimizing it
And you need some simple tool that will embed it all into the HTML.
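As an illustration of the image-embedding idea, a CSS sketch with a truncated placeholder payload:

    /* the image lives inside the stylesheet as a base64 data URI,
       so no separate HTTP request is made for it */
    .logo {
        background-image: url('data:image/png;base64,iVBORw0K...');
    }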
I am trying to understand JavaScript minification and compression processes and have a couple of questions about them:
Since minification makes the code difficult to debug, is it possible to do on-demand de-minification on the client side to cover cases where you actually need to debug and investigate something on the website?
I remember reading somewhere that one can enable compression of all resources (like images, CSS, JavaScript etc.) by setting some options in the Apache Web Server. Is there any difference between the JavaScript compression done at the Apache level and the one done using tools like YUI Compressor?
Can someone help me understand the above?
The kind of case where I would actually need to de-minify my JavaScript files is, say, when a JavaScript error happens at line X. With a minified file it would be very tough to know which block of code caused the error in production, as the lines are all wrapped up together. How do you investigate and debug in such circumstances? Another user also mentioned this debugging problem in the question Packed/minified javascript failing in IE6 - how to debug? (slightly specific to IE6, though).
You shouldn't be debugging minified code. Ideally, the development process goes like this:
You build and debug the site locally. You have the full versions of your JavaScript, stylesheets and everything else.
You deploy a version to production machine. You minify and gzip a copy of your assets and push that to the server. Your local development copy is left untouched.
If there's a bug in production, you modify your local version, minify, compress, upload.
Repeat until PROFIT
Chrome Dev Tools can de-obfuscate (and de-minify) JavaScript code if you want to debug production code (useful when trying to replicate a bug in a live environment that you may not be seeing in dev).
Typically developers will develop against the uncompressed script file and compress it right before deploying.
If you have to go back and debug your script files, you'd just open up the regular, uncompressed file, do your work, compress, and deploy. If you mean debugging something while your website is in production, then no, you can't un-minify your script file on demand.
And yes, Apache, and even IIS, can gzip compress scripts and images automatically for you.
Since minification makes the code difficult to debug, is it possible to do on-demand de-minification on the client side to cover cases where you actually need to debug and investigate something on the website?
Sort of. Minified JavaScript has the same structure; minification just does things like deleting extra whitespace and shortening variable names. So you can easily make the code readable again, either manually or with a script, but you can't recover the variable names, so the code will still be harder to work with. If you have the original code, definitely don't get rid of it; save the minified code separately.
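A tiny made-up example of what that means in practice:

    // minified: the structure is intact, but the names are short
    function f(a,b){return a.filter(function(c){return c.price>b})}

    // after automatic re-formatting: a readable layout again, yet the
    // original names (items? threshold?) are gone for good
    function f(a, b) {
        return a.filter(function (c) {
            return c.price > b;
        });
    }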
I remember reading somewhere that one can enable compression of all resources (like images, CSS, JavaScript etc.) by setting some options in the Apache Web Server.
Yes, it's called gzip compression. It's not unique to Apache, but you would need to configure your server to enable it.
Is there any difference between the JavaScript compression done at the Apache level and the one done using tools like YUI Compressor?
Yes. YUI Compressor is a minifier - the output is valid JavaScript. Server-side compression is more analogous to zipping a file - the browser must decode it before it can be used. Using the two together yields the smallest file size.
I prefer working with a local unminified copy of the JS file, and when I deploy the site, I minify all JS files into one. That way it is easy to debug and improve the code. However, there are tools to reverse minification. Have a look at this SO post on reverting minification of JavaScript.
Have a look at GZIP compression - this blog describes how to enable GZIP in Apache and how to verify that your server actually is compressing the files.
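One quick way to verify it yourself is to request a file with an Accept-Encoding header and inspect the response headers (the URL is a placeholder):

    curl -sI -H "Accept-Encoding: gzip" http://example.com/app.js
    # a compressed response includes a header like:
    #   Content-Encoding: gzip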
is it possible to do on-demand de-minification on client-side
Some browsers have a "pretty code" view that automatically formats source code. See Firebug's CSS tab.
Is there any difference between the JavaScript compression done at the Apache level and the one done using tools like YUI Compressor?
YUI Compressor is actually a minifier. Apache compression is like zipping up the file before it is sent to the client, so the actual file that is sent is smaller than the file on disk. They are two different technologies that are independent of each other.
As far as I know, the min version of a .js (JavaScript) file is obtained by removing the unnecessary blank spaces and comments, in order to reduce the file size.
My questions are:
How can I convert a min.js file into a clear, easily readable .js file?
Besides size (and speed), are there any other advantages of the min.js file?
Can the JS files be encrypted?
Can JS be infected? I think the answer is yes, so the question is: how do I protect the .js files from infection?
The first question is the most important one and the one I'm looking for help on.
TY
To convert a minified file into editable source, simply open any IDE that supports auto-formatting and auto-format it. I use NetBeans to do this.
If you do client-side caching of the minified file, the client needs to process fewer bytes. Size and speed are the main advantages of a minified file, and they are already great advantages in preparing for a future that requires transferring great loads of data. It also saves you some bandwidth on your server, and therefore money.
I don't see the need for encryption. See How to disable or encrypt "View Source" for my site
JavaScript files cannot be edited unless it is done on the server. The security of your JavaScript files depends on your 1) server protection and 2) data protection. Data should not be exploitable. Of course, since JavaScript is executed on the client side, it would be meaningless for the client user to attack him/herself. However, Twitter has shown multiple JavaScript exploits. You need to constantly test and check your code against XSS, CSRF and other attacks. That is to say: if your JavaScript file has a loophole, it was the developer, you, who created it.
Multiple minifiers exist that are also able to compress JS; see http://dean.edwards.name/weblog/2007/04/packer3 for one of the most widely used. Others exist as well; see also the JSMin library: http://www.crockford.com/javascript/jsmin.html
The main advantage is the size gain. When you have multiple JS files, you should also aggregate them into one; this saves a lot of I/O (fewer HTTP requests) between the server and the client, and is probably even more important than minifying.
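A minimal Node sketch of such aggregation (the file names are made up):

    // concatenate several JS files into a single bundle to cut
    // the number of HTTP requests down to one
    const fs = require('fs');
    const files = ['jquery.js', 'plugins.js', 'app.js'];
    // the ';' between files guards against missing semicolons at file ends
    const bundle = files.map(f => fs.readFileSync(f, 'utf8')).join(';\n');
    fs.writeFileSync('bundle.js', bundle);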
I can't answer you about encryption. Client security will mainly depend on the browser.
EDIT: OK, my first answer did not address the first question; I've merged both into point 2.
jquery-1.4.2.min.js is 71.8 KB
Same file compressed through this tool, with gzip enabled, becomes 32.9 KB
Which is better? If the latter, why doesn't jQuery provide a packed file too, instead of just the uncompressed and min versions?
My question: one is minified and gzip-enabled; the other is minified, packed and gzip-enabled. Which should I use? If the one that's 32 KB, I wonder why jQuery doesn't provide a minified+packed version instead; is there any particular reason why?
Thanks
It's not an either/or question: use both. Serve the minified file over a gzip stream to the browser for the best and quickest delivery possible.
Most web servers and almost every current browser support gzip. You're serving the minified file, with internal variables shortened etc., but then delivering a zipped version of it to the client. By doing this you're delivering the minimum amount of JavaScript for the client to execute along with the smallest payload... so a quicker download for your user.
Also, remember to set cache headers so the client isn't re-fetching the file... and there are other performance tips to go along with this that you should read :)
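For the cache headers, a hedged Apache sketch using mod_expires (the lifetimes are only examples; adjust to taste):

    <IfModule mod_expires.c>
        ExpiresActive On
        # let browsers keep static assets for a year instead of re-fetching them
        ExpiresByType application/javascript "access plus 1 year"
        ExpiresByType text/css "access plus 1 year"
    </IfModule>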
Gzip encoding is handled on the fly by web servers. It isn't a feature of the file uploaded to the server, so it wouldn't make sense to provide the file in that format for download.
Gzip encoding and minification are not mutually exclusive.
Perhaps you mean the version packed with Dean Edwards' packer? It indeed yields a smaller download, but requires some processing on the client side in order to decompress it.