After searching around on Google for a while I have not had any luck finding guidance on my question.
I want to be able to load up a website using JavaScript and AJAX in order to reduce the number of requests the client needs to make to the server. My goal is to embed/encode data within an image so that the client only needs to request that one image through an AJAX call, decode it to recover the JS, CSS, and other files needed, and then insert those files into the DOM.
If I can get the above to work, I would have a lot of flexibility in how my webapp is loaded and could notify the user how close the webapp is to being ready for viewing.
Currently my problem is that I cannot figure out how to encode the data within an image.
Even if this is not the right way to go about serving up a webapp, my curiosity is getting the best of me and I would really like to do this.
Any guidance or pointers would be greatly appreciated!
Also: I am learning Python, so if you know of a Python module I could play with, that would be cool. Currently I'm playing with the pypng module to see if this can be done.
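For the browser-side half (decoding the image and injecting what it contains), a rough sketch could look like the following. It assumes a made-up packing scheme in which the first four RGB bytes hold the payload length and the remaining RGB bytes hold the raw text (Latin-1 only); the image has to come from the same origin (or be CORS-enabled), otherwise getImageData() throws a security error, and the PNG should be fully opaque so the browser does not premultiply and alter the RGB values.

```js
// Hypothetical decoder: read text packed into the RGB channels of a PNG.
function decodeImagePayload(img) {
  var canvas = document.createElement('canvas');
  canvas.width = img.width;
  canvas.height = img.height;
  var ctx = canvas.getContext('2d');
  ctx.drawImage(img, 0, 0);

  var data = ctx.getImageData(0, 0, img.width, img.height).data; // RGBA bytes

  // Collect the RGB channels, skipping the alpha byte.
  var bytes = [];
  for (var i = 0; i < data.length; i += 4) {
    bytes.push(data[i], data[i + 1], data[i + 2]);
  }

  // Assumed convention: the first four bytes are the payload length (big-endian).
  var len = bytes[0] * 0x1000000 + bytes[1] * 0x10000 + bytes[2] * 0x100 + bytes[3];
  var text = '';
  for (var j = 4; j < 4 + len; j++) {
    text += String.fromCharCode(bytes[j]);
  }
  return text;
}

// Usage sketch: fetch the image, decode it, and inject the recovered JS.
var img = new Image();
img.onload = function () {
  var script = document.createElement('script');
  script.text = decodeImagePayload(img);
  document.head.appendChild(script);
};
img.src = 'payload.png'; // hypothetical file produced by the Python encoder
```

Whatever packing scheme you invent on the Python/pypng side simply has to mirror this byte layout.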
To be frank: don't do that.
The brightest minds on earth use other methods to keep the number of requests and the response time down. The most common technique for minimizing the number of requests is called bundling. In short, you just copy'n'paste all your JS files after each other into one big JS file and all the CSS files into one big CSS file. This way you only need to download two files, one JS and one CSS. Doing better than that is usually not worth the trouble.
To further keep response times down you usually minify your JS and CSS files. This is a process in which all whitespace, comments, etc. are removed and internal variable names are made as short as possible.
Finally, you can serve both the JS and CSS files gzipped to further reduce the transfer size.
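As a rough illustration of the bundling and gzipping steps, a minimal Node.js build script might look like this (the directory layout and file names are placeholders):

```js
// build.js -- hypothetical build step: bundle all JS files and pre-gzip the result.
var fs = require('fs');
var path = require('path');
var zlib = require('zlib');

var srcDir = 'js'; // assumed source directory

var files = fs.readdirSync(srcDir)
  .filter(function (f) { return f.slice(-3) === '.js'; })
  .sort(); // load order matters; sort, or list the files explicitly

// "Copy'n'paste" all files after each other, separated by a semicolon
// so a missing trailing semicolon in one file cannot break the next.
var bundle = files.map(function (f) {
  return fs.readFileSync(path.join(srcDir, f), 'utf8');
}).join('\n;\n');

fs.writeFileSync('bundle.js', bundle);

// Pre-compress so the server can send it with Content-Encoding: gzip.
fs.writeFileSync('bundle.js.gz', zlib.gzipSync(bundle));
```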
There are many tools out there that do both bundling and minification for you. Google around and pick one that fits the rest of your tooling.
Related
I am compressing all of my files on the go, so it's easier to update them rather than having to decompress first. I have around 10 JS files, around 2,000 lines in total, maybe more. Would it be better to put them all in one file and compress that, and would it speed up my website, or should I just leave them as individual files and compress each one?
I'm assuming this is for web development.
If all of your scripts were created by you, and you suspect each script will be needed for every page on your site, you should concat / compress them. The first load will take longer, but the scripts will be cached.
If all of your scripts were authored by you, but each page does not necessarily need all 10 of your scripts, you should consider lazy loading them on demand.
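Lazy loading can be as simple as injecting a script element when the feature is first needed; here is a minimal sketch (the file name, element id, and renderReport function are all hypothetical):

```js
// Load a script on demand and run a callback once it has executed.
function loadScript(src, onLoad) {
  var s = document.createElement('script');
  s.src = src;
  s.async = true;
  s.onload = onLoad;
  document.head.appendChild(s);
}

// Example: only fetch the heavy charting code when the user opens the report view.
document.getElementById('show-report').addEventListener('click', function () {
  loadScript('/js/charts.js', function () {
    renderReport(); // assumed to be defined by charts.js
  });
});
```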
If any of those scripts were not authored by you and can be found on a CDN (like jQuery), then link the scripts to a CDN, as there is a chance users will already have them cached. For the remaining scripts, decide if you should lazy load or concat / compress them all.
What you shouldn't do, however, is load all 10 of your scripts individually on each page. That would just have the user's browser send more requests than needed.
It's all about trade-offs, and there isn't a 100% correct answer. Good luck :)
--edit--
You said "on the go". If the content of your scripts change, then you wouldn't want them to be cached. In that case, lazy-loading would probably be the answer.
--edit 2--
If you haven't already, you should really take a look at using Grunt to concat and minify your js files during development. If you decide to go that route, take a look at grunt-contrib-watch.
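A minimal Gruntfile along those lines might look like the following; the paths are placeholders, and it assumes grunt-contrib-concat, grunt-contrib-uglify, and grunt-contrib-watch are installed as dev dependencies.

```js
// Gruntfile.js -- concat and minify on every change to the source files.
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      dist: {
        src: ['src/js/*.js'],   // placeholder source glob
        dest: 'dist/app.js'
      }
    },
    uglify: {
      dist: {
        files: { 'dist/app.min.js': ['dist/app.js'] }
      }
    },
    watch: {
      scripts: {
        files: ['src/js/*.js'],
        tasks: ['concat', 'uglify']
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.loadNpmTasks('grunt-contrib-watch');

  grunt.registerTask('default', ['concat', 'uglify']);
};
```

Running `grunt watch` during development then rebuilds the bundle whenever a source file changes.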
It is often a good idea to keep the number of files the browser needs to request to a minimum.
The browser may have to open one connection per HTTP request, but if you put all your JavaScript code in one big file it will only have to make one request, and thus only one connection needs to be opened to fetch your JS code.
It depends on how much of those 2,000 lines you are using on each page. If the file is like a library, most of which is used on each of your pages, then it might make little to no difference in loading speed. But if only part of that file is used on each page, I would assume splitting it up is smarter, as less will need to be loaded per page.
I have a web application that is currently split into 40+ JavaScript files. When I run the application, some subset of those files needs to be downloaded by the browser. Obviously, given that browsers use only ~6 parallel connections per host to download files, this is not the optimal solution. One optimization idea that came to my mind was to embed all those JavaScript files (except external ones) inside the served .aspx page, so that the browser just gets one big HTML file and does not need to make any further round trips to the server. The HTML page alone may contain user-specific data that will be different on every request. The scripts, however, do not change between requests. In the typical use case the page (complete with scripts) is 180 KB (scripts not minified) or 130 KB (minified).
Now the question: does this approach have any drawbacks performance-wise (network, browsers' JavaScript engines)? Do you know of any big applications doing something like that? Note that I am not interested in arguments about, e.g., maintainability, as the individual scripts will still be available as separate files during development. The same question applies to CSS files (even though this is less of an issue in my app).
One bit of information that may be important here: the application is one big multipage form that does not require postbacks to go between the pages, validate form, submit the form, etc. However, the application in which it is embedded may have multiple such forms.
In general, it is a very good idea to concatenate JavaScript and CSS files together. I'm just not so sure about your concept. My biggest concern and question here would be: can that .aspx file potentially change in any way through (dynamic) code?
That would make it impossible for the browser to cache the file, which would be a horrible scenario.
The great thing about concatenating files is that we get one big download (which is still a lot faster than downloading several files with separate requests and their HTTP overhead), and the browser can cache that file afterwards.
There are some great build tools and scripts available; Apache Ant is one I can really recommend. You should have a look at the HTML5 Boilerplate, which makes heavy use of Ant.
I've launched a redesign of our website and I'm using quite a bit of Javascript for the first time.
I've learned that I should combine all my JavaScript and CSS into one file (each, obviously). I know I can combine the CSS without problems, but I'm not sure about the JavaScript.
I have to load:
jquery.min.js <-- I load the top two from ajax.googleapis.com, is that a good idea?
jquery-ui.min.js
javascript for Facebook
some for google plus button
same for twitter
some for google analytics
then some inline stuff to hide divs which javascript users shouldn't see and that type of thing.
you can see it here: traditionalirishgifts.com
So can I just copy and paste the contents of all these files into one big file, find some way to minify it (haven't looked into that fully yet), load this one file right at the bottom of my page just before the closing body tag, and bingo?
I'd use this tool: http://jscompress.com/
JSCompress.com is an online javascript compressor that allows you to compress and minify your javascript files. Compressed javascript files are ideal for production environments since they typically reduce the size of the file by 30-90%. Most of the filesize reduction is achieved by removing comments and extra whitespace characters that are not needed by web browsers or visitors.
You should always be able to merge all your external JavaScript files into one. You can use a server-side compressor to cache it and serve it as one file. It does put some constraints on the files, like which file should load first, etc. Also, if there is a syntax error anywhere, the whole bundle will fail to run.
Keep in mind that 3rd-party code, like code from Google, can't be mixed in. Usually there is some kind of authentication going on (or an API key in the URL). If you try to cache that code, it will stop working after a while, so you do need to keep those scripts separate.
Question
If you use a single javascript file to hold all scripts, where do you put scripts that are for just one page?
Background
This may be a matter of opinion or "best practice" but I'm interested in others' opinions:
I'm using the html5 Boilerplate on a project. They recommend you place all javascript in a single file script.js for speed and consistency. Seems reasonable.
However, I have a bit of geolocation script that's only relevant to a single page and not the others. Should I break convention and just put this script on that page, below my calls to the JavaScript libraries it depends on? Or just put calls to the relevant functions (located in the script.js file) below the links to the libraries they depend on?
Thanks!
The good folks at html5 boilerplate recommend putting all of your javascript in script.js so that the browser will only have to load that one file (along with the others that h5bp uses) and to allow caching of that file.
The idea is not to get caught up in the "recommended" way, and to think about things related to your own applications.
This geolocation file is only going to be used on this one page, right? It will never be used anywhere else.
The script.js file will be used on multiple pages.
Well, then it wouldn't make sense to put a whole script that will only be needed on one page in the script.js file. You should make the file external and load it separately on the page where it is needed. This will keep you from bloating the script.js file with functionality that may never get used by a given user.
However, if your "whole script" for the geolocation functionality is pretty small, then include it in script.js. If it doesn't add to the speed of the download for that file, then it makes sense to include it there.
The gist of all of this is: what is the best trade-off for my application?
These things we know to be true:
cached js files are good
fewer files to download are good
smaller files to download are good
maintenance is important
Once you think of these things in terms of your application, the decision making becomes a bit easier. And remember, decisions that trade off milliseconds are not going to make much of a difference in your user's "perception" of how fast your page is.
The browser will only download the .js files once (unless something is happening to discourage the browser from caching). So if you expect all of your users to hit the one page that uses geolocation sometime during their session, then you might as well give it to them early. If you expect maybe a tiny percent of your users to eventually hit the geolocation page, then maybe you might want to split them.
Split it out into a separate .js file so that it can be cached. Then reference both external .js files from your page.
I think you should put it in a separate file. Putting all the scripts in one single file could cause unexpected behavior and conflicts. I like to have one script file for the javascript that all pages will use containing plugins, helper functions, formatting functions etc. And then create one separate js file for everything that is relevant just for each page.
If you still want to have just one js file in the browser you could take advantage of one of those utilities that combine multiple js files into one.
Multiple sites reference combining JavaScript and CSS files to improve web page performance, including examples of using ANT build scripts to concatenate the files prior to deployment.
I've searched, and haven't found any information on how to automate updating the references to those files in HTML and other documents. I am looking to avoid hacking together something error-prone, and want to learn from others who have automated builds in the past.
Are there automated tools in the wild to complete this task that I'm not seeing? Are there recommended processes to update the script and link tags in HTML? Can these solutions be integrated with ANT or similar build tools?
There sure is and it's a smart thing to do.
I found a PHP solution; I don't know if that's okay for you, but even if it isn't you can still read its source (it's not difficult) and learn a lot. The solution works like this:
Rewrite your requests like this: from css/main.css and css/skin.css to css/main.css,skin.css (of course you can put many more).
Use Apache's mod_rewrite to redirect this request to a script (in this case combine.php) that combines all the files into a single one.
The script combines all the files and sends the combined file. Then it saves it to a cache folder.
Next time around it checks if there is an up-to-date version of the cache and serves that one. If the latest file modification time has changed, it discards the cache.
The solution works great, and it even makes use of HTTP cache headers and emits an ETag, which you should do anyway.
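The same idea sketched in Node.js rather than PHP, purely for illustration (the URL scheme and directory are made up, the on-disk cache folder is left out, the ETag is just the latest modification time, and error handling for missing files is omitted):

```js
// combine.js -- serve /css/main.css,skin.css as one concatenated response.
var http = require('http');
var fs = require('fs');
var path = require('path');

var root = 'css'; // assumed directory holding the individual files

http.createServer(function (req, res) {
  // e.g. GET /css/main.css,skin.css  ->  ['main.css', 'skin.css']
  var names = req.url.replace(/^\/css\//, '').split(',');
  var files = names.map(function (n) { return path.join(root, path.basename(n)); });

  // Use the newest modification time as a simple cache validator.
  var latest = Math.max.apply(null, files.map(function (f) {
    return fs.statSync(f).mtime.getTime();
  }));
  var etag = '"' + latest + '"';

  if (req.headers['if-none-match'] === etag) {
    res.writeHead(304); // the client's cached copy is still up to date
    return res.end();
  }

  var body = files.map(function (f) {
    return fs.readFileSync(f, 'utf8');
  }).join('\n');

  res.writeHead(200, { 'Content-Type': 'text/css', 'ETag': etag });
  res.end(body);
}).listen(8080);
```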
You are correct this is a great way to speed up page loading. It will even work in conjunction with a CDN, which the other poster recommended.
Here is a small script that will pack multiple files in to one for deployment. It supports both JS and CSS, and will even "minify" them by removing whitespace, etc. Just hook this in to your build and deploy scripts.
juicer: http://cjohansen.no/en/ruby/juicer_a_css_and_javascript_packaging_tool
What's even better, it will follow JS and CSS import statements, so you only need to point your HTML files to the loader file and it will work in both development and production. (Assuming you replace the loader file with the combined file on deployment.)
There are others, including some run-time solutions. But it sounds like you have a build process in place anyway.
As far as updating the HTML goes, if you still need it: automated deployments are very popular in the Ruby world, and you may find some standalone utilities that help even for non-Ruby projects (as above). I think this would be best handled by your own project's template language, though, with a static resource revision id or the like.
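For the "static resource revision id" idea, one common approach is to hash the bundle at build time and have your templates append the hash to the asset URL; a rough Node.js sketch (the file name is a placeholder):

```js
// revision.js -- compute a short content hash to append to asset URLs.
var fs = require('fs');
var crypto = require('crypto');

function assetUrl(file) {
  var hash = crypto.createHash('md5')
    .update(fs.readFileSync(file))
    .digest('hex')
    .slice(0, 8);
  return '/' + file + '?v=' + hash; // e.g. /bundle.js?v=3f2a1c9d
}

// Your template layer would then emit something like:
//   <script src="/bundle.js?v=3f2a1c9d"></script>
console.log(assetUrl('bundle.js'));
```

Because the query string changes whenever the file content changes, you can serve the bundle with far-future cache headers and still have clients pick up new versions immediately.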
Good luck, and let us know what you find.
I think what you really want is a CDN (Content Delivery Network).
Read about it here:
http://developer.yahoo.com/performance/rules.html
http://en.wikipedia.org/wiki/Content_delivery_network