I noticed that programmers use two different ways of including a .js file.
1- this way, where you must have the js file locally:
<script src="lib/jquery.js" type="text/javascript"></script>
2- and this way, where you don't need a local copy of the js file:
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.js" type="text/javascript"></script>
and I want to know which way is better to use.
The first option uses local files; the second option uses a CDN.
A CDN is a network of fast servers that hosts commonly used files. It is really useful for saving bandwidth and speeding up the download of your site.
However, as was mentioned, you will have problems if the end user doesn't have internet access.
Basically, if you expect your application to always be used online, a CDN is a great option. If you are developing an app that could be used offline (like a CRM for a company), then it is better to serve local files.
If the CDN is down, then your website will be broken. But it is more likely that your website goes down than the CDN.
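If you do go the CDN route, a common middle ground is to fall back to a local copy when the CDN can't be reached. A minimal sketch (assuming you also keep a copy at lib/jquery.js, as in the question):
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.js" type="text/javascript"></script>
<script type="text/javascript">
// If the CDN request failed, window.jQuery is undefined, so write in the local copy instead.
window.jQuery || document.write('<script src="lib/jquery.js" type="text/javascript"><\/script>');
</script>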
Depends.
Method #1 means you have a local copy of the file -- you don't need to rely on an existing path to the internet (from an intranet behind a firewall, spotty internet service, etc). You take care of any caching, and making sure the file exists.
Method #2 may give you a fast planet-wide content-delivery-network (CDN).
I have, and will continue to use both methods... but #2 is easier.
Related
Is there any way to "edit" a "server side" javascript file in the browser, saving the js edits on the client side so they replace the server side scripts?
Basically I want to edit the javascripts on the server. Obviously I can't save them on the server, so they need to be saved on the client side (my computer) and the browser needs to load my scripts instead.
It shouldn't be hard to do at all but I've not been able to find any way to accomplish this.
Edit:
I want to modify the javascript from a site I do not own or have write access to. e.g.,
The HTML page uses some javascript file on the server. I want to modify this javascript file (the actual file).
I can download and save the javascript file, BUT the html page will always use the one on the server because that is what is in the script tag. I need to modify the script tag of the html page to point to the local javascript file BEFORE the html page's scripts are executed (else the javascript from the server will be used).
Here, for example, is a script tag from SE:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
It uses a non-local javascript file. I need to replace this line with my own line before any javascript is executed. It would look like:
<script type="text/javascript" src="file://C:/temp/myjquery.min.js"></script>
or whatever. (This way, I can modify the jquery file and have the page execute my own version instead of the one on the server.)
I could, of course, download the html file and modify it, BUT then php code may not work, among other things (for example, relative links will be broken).
This is usually very easy in Opera: just view the source, edit what you want, and use the special "Tools > Advanced > Reload from cache" command instead of a normal reload. Voila, you'll be running the site with your modified scripts.
(There are some exceptions related to specific no-caching techniques some sites use, so it won't work 100% for all files - but it certainly should work for anything served from googleapis.com.)
I think what you're looking for is something like LiveReload
It allows you to edit css files and have the browser apply the changes without refreshing the browser.
The windows version is in alpha right now but the Mac version works quite well for CSS.
I don't know if it does Javascript but I think it might.
You could also try the Chrome DevTools. They're built into Chrome and do just what you want with javascript and css.
No problem, you want to use bookmarklets for this. Indeed it is easy; just remember to use an anonymous self-executing function: javascript:(function(){ //commands })();
In the sane good old days one could even place this javascript directly into the address bar, but nowadays some browser builders (like firefox, which we coders USED to trust in the old days) are being 'good boys' and listening to facebook's 'demands' to kill normal standard functionality in favor of their lack of comprehension of closures... But alas.
Of course you could also create a bookmark to fix firefox's insanity, again reclaiming power for the user :)
Every time you visit the site, you click your bookmarklet. Done.
One can even make it 'memory resident' for as long as you are on the same page (if you really want to). Naturally power is with the user/visitor AS IT SHOULD BE, not with the webmaster (who already publicly shared whatever info).
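For example, a bookmarklet following that pattern could pull your own copy in on top of whatever the page already loaded (just a sketch; the path is made up, and browsers may refuse file:// URLs from an http page, in which case serve your file from a local web server instead):
javascript:(function(){
  // Append a script element pointing at your local copy; it runs as soon as it loads.
  var s = document.createElement('script');
  s.src = 'file:///C:/temp/myjquery.min.js';
  document.getElementsByTagName('head')[0].appendChild(s);
})();
Collapse it to a single line when you save it as the bookmark's address.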
You might also look into Greasemonkey on firefox and comparable solutions.
Good luck
Build a string on the server side that contains all your javascript code, and write it out from there.
I was thinking about creating a script that would do the following:
Get all javascript files from the JS directory used on the server
Combine all scripts into one - that would mean only one request instead of multiple
Minify combined script
Cache the file
Let's say that the order in which the files need to be loaded is written in a config file somewhere.
Now when I load myexamplepage.com I actually use jQuery, backbone, mootools, prototype and a few other libraries, but instead of asking the server for these multiple files, I call myexamplepage.com/js/getjs and what I get is one combined and minified JS file. That way I eliminate those additional requests to the server. From what I've read about speeding up websites, the more requests you make to the server, the slower your site becomes.
Since I'm pretty new to the programming world, I know that many things I think of already exist, and I don't think this is an exception.
So please list anything you know of that does exactly this or something similar to what I described. (Note that you shouldn't need to run minifiers or third party software every time your scripts change; you keep the original file structure and only use a helper class.)
P.S. I think the same method could be used for CSS files as well.
I'm using PHP and Apache.
Rather than having the server do this on-the-fly, I'd recommend doing it in advance: Just concatenate the scripts and run them through a non-destructive minifier, like jsmin or Google Closure Compiler in "simple" mode.
This also gives you the opportunity to put a version number on that file, and to give it a long cache life, so that users don't have to re-download it each time they come to the page. For example: Suppose the content of your page changes frequently enough that you set the cache headers on the page to say it expires every day. Naturally, your JavaScript doesn't change every day. So your page.html can include a file called all-my-js-v4.js which has a long cache life (like, a year). If you update your JavaScript, create a new all-in-one file called all-my-js-v5.js and update page.html to include that instead. The next time the user sees page.html, they'll request the updated file; but until then, they can use their cached copy.
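If you'd rather script that step yourself than lean on an existing build tool, the concatenation part is tiny. A rough sketch in Node-style JavaScript (the file list and output name are made up, and you would still run the result through jsmin or Closure Compiler afterwards):
// build-js.js - concatenate the scripts in a fixed order; minify the output separately.
var fs = require('fs');

// Hypothetical load order; read this from a config file if you prefer.
var files = ['lib/jquery.js', 'lib/plugins.js', 'app/main.js'];

var combined = files.map(function (name) {
    // A leading semicolon guards against source files that omit their trailing one.
    return ';/* ' + name + ' */\n' + fs.readFileSync(name, 'utf8');
}).join('\n');

fs.writeFileSync('all-my-js-v5.js', combined, 'utf8');
console.log('Wrote all-my-js-v5.js - now minify it and reference it from page.html.');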
If you really want to do this on-the-fly, if you're using apache, you could use mod_pagespeed.
If you're using .NET, I can recommend Combres. It does combination and minification of JavaScript and CSS files.
I know this is an old question, but you may be interested in this project: https://github.com/OpenNTF/JavascriptAggregator
Assuming you use AMD modules for your javascript, this project will create highly cacheable layers on demand. It has other features you may be interested in as well.
Question
If you use a single javascript file to hold all scripts, where do you put scripts that are for just one page?
Background
This may be a matter of opinion or "best practice" but I'm interested in others' opinions:
I'm using the html5 Boilerplate on a project. They recommend you place all javascript in a single file script.js for speed and consistency. Seems reasonable.
However, I have a bit of geolocation script that's only relevant to a single page, and not others. Should I break convention and just put this script on the page, below my calls to the javascript libraries it depends on? Or just put calls to the relevant functions (located in the script.js file) below the links to the libraries they depend on?
Thanks!
The good folks at html5 boilerplate recommend putting all of your javascript in script.js so that the browser will only have to load that one file (along with the others that h5bp uses) and to allow caching of that file.
The idea is not to get caught up in the "recommended" way, and to think about things related to your own applications.
This geolocation file is only going to be used on this one page, right? It will never be used anywhere else.
The script.js file will be used on multiple pages.
Well, then it wouldn't make sense to put a "whole script" that will only be needed on one page in the script.js file. You should make the file external and call it separately on the page where it is needed. This will keep you from bloating the script.js file with functionality that may never get used by that user.
However, if your "whole script" for the geolocation functionality is pretty small, then include it in script.js. If it doesn't add much to the download time for that file, it makes sense to include it there.
The gist of all of this is: what is the best trade-off for my application?
These things we know to be true:
cached js files are good
fewer files to download are good
smaller files to download are good
maintenance is important
Once you think of these things in terms of your application, the decision making becomes a bit easier. And remember, decisions that trade off milliseconds are not going to make much of a difference in your user's "perception" of how fast your page is.
The browser will only download the .js files once (unless something is happening to discourage the browser from caching). So if you expect all of your users to hit the one page that uses geolocation sometime during their session, then you might as well give it to them early. If you expect maybe a tiny percent of your users to eventually hit the geolocation page, then maybe you might want to split them.
Split it out into a separate .js file so that it can be cached. Then reference both external .js files from your page.
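For example (geolocation.js is just a placeholder name), every page includes script.js, and only the one page adds the second file:
<script src="js/script.js" type="text/javascript"></script>
<!-- only on the page that actually uses geolocation: -->
<script src="js/geolocation.js" type="text/javascript"></script>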
I think you should put it in a separate file. Putting all the scripts in one single file could cause unexpected behavior and conflicts. I like to have one script file for the javascript that all pages will use containing plugins, helper functions, formatting functions etc. And then create one separate js file for everything that is relevant just for each page.
If you still want to have just one js file in the browser you could take advantage of one of those utilities that combine multiple js files into one.
Multiple sites reference combining JavaScript and CSS files to improve web page performance, including examples of using ANT build scripts to concatenate the files prior to deployment.
I've searched, and haven't found any information on how to automate updating references to those files in HTML and other documents. I am looking to avoid hacking together something error prone, and want to learn from others who have automated builds in the past.
Are there automated tools in the wild to complete this task that I'm not seeing? Are there recommended processes to update the script and link tags in HTML? Can these solutions be integrated with ANT or similar build tools?
There sure are, and it's a smart thing to do.
I found a PHP solution; I don't know if that's okay for you, but if it isn't you can still read its source (it's not difficult) and learn a lot. The solution works like this:
Rewrite your requests like this: from css/main.css and css/skin.css to css/main.css,skin.css (of course you can put many more).
Use apache's mod_rewrite to redirect this request to a script (in our case combine.php) that will combine all the files into a single one.
The script combines all the files and sends the combined file. Then it saves it to a cache folder.
Next time around it checks if there is an up-to-date version of the cache and serves that one. If the latest file modification time has changed, it discards the cache.
The solution works great and it even makes use of HTTP cache headers and spits out ETags, which you should do anyway.
You are correct, this is a great way to speed up page loading. It will even work in conjunction with a CDN, which the other poster recommended.
Here is a small script that will pack multiple files in to one for deployment. It supports both JS and CSS, and will even "minify" them by removing whitespace, etc. Just hook this in to your build and deploy scripts.
juicer: http://cjohansen.no/en/ruby/juicer_a_css_and_javascript_packaging_tool
What's even better, it will follow JS and CSS import statements, so you only need to point your HTML files to the loader file and it will work in both development and production. (Assuming you replace the loader file with the combined file on deployment.)
There are others, including some run-time solutions. But it sounds like you have a build process in place anyway.
As far as HTML updating goes, if you still need it: automated deployments are very popular in the Ruby world, and you may find some standalone utilities to help even for non-ruby projects (as above). Methinks this would be best handled by your own project's template language, though (with a static resource revision id, or such).
Good luck, and let us know what you find.
I think what you really want is a CDN (Content Delivery Network).
Read about it here:
http://developer.yahoo.com/performance/rules.html
http://en.wikipedia.org/wiki/Content_delivery_network
I have a couple of questions that are somewhat related so I'm posting them all on a single question on SO...
Question 1:
I'm currently building a Facebook application where I'm using jQuery UI Tabs; there are only 4 tabs, 2 of which are loaded through Ajax. The main page is index.html; this is where the tabs code is placed, and for the 2 tabs loaded through Ajax I have two different files, tab1.html and tab2.html.
Currently, the jQuery tabs initialization and Facebook JavaScript initialization are done on index.html. Both tab1.html and tab2.html have JavaScript code that belongs to those pages. For instance, tab2.html has a form and there's some JS (with jQuery) code to validate the form; this code is irrelevant to tab1.html, just as the JS code on tab1.html is irrelevant to tab2.html.
My question is, should I keep doing this or maybe aggregate all the JS/jQuery code in index.html, tab1.html and tab2.html in a single global.js file and then include it in index.html?
I thought of doing this, but then irrelevant code would be loaded if the user never opens tab1 or tab2. The benefit of using a single global.js file is that I could pack/minify the file, which I couldn't do if I included each code block in each respective tabX.html file.
Question 2:
As I'm using jQuery, I'm also using lots of plugins (actually only 3 for now, but that number can grow). Some of them provide a minified JS and I use those when available; when they are not, I use the normal versions, of course.
There's also the requests problem. If I have lots of plugins, say 10, that means 10 requests for those plugins. And there is also the fact that some plugins are used in tab1.html but not in tab2.html and vice versa.
How should I load all the plugins, minified/packed, in a single web request? Should I do that manually before publishing my app (packing and merging them into a single file), or could I use the PHP version of Dean Edwards's Packer and pack/merge all plugins on the fly? Would this be a good approach?
Question 3:
If the answer to Q1 was something like "merge all code into a single global.js file", should I include the global.js file in the packing/merging script I described above in Q2?
Doing this would simplify everything. I could have my development environment properly organized with all .js files, for the plugins and the global.js in the appropriate folders without bothering with anything else. The packing/merging should take care of the rest (pull the files from the respective folders, send the respective JS headers and output one single packed .js file).
The one thing that's confusing me the most is that not all plugins are used for every tab, and not all code is for every tab either. Still, a chunk of the code is global to every tab and the index. This also simplifies everything as: a) I don't have to worry about adding the needed code to each tabX.html file and can simply look at them as HTML templates and nothing else; b) I don't have to bother including the necessary plugins where I need them, as I'm currently using $.getScript() from jQuery to load the plugins when and only when I need them, but I'm not sure this is a good approach and the code feels dirty and ugly like this.
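For reference, what I'm doing now with $.getScript() looks roughly like this (the plugin path and the setUpForm() function are made-up names):
// Loaded on demand the first time the user opens tab2.
var validationLoaded = false;
function initTab2() {
    if (validationLoaded) {
        setUpForm();
        return;
    }
    $.getScript('js/jquery.validate.js', function () {
        validationLoaded = true;
        setUpForm(); // wire up the form validation once the plugin is available
    });
}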
Question 1:
Pack them all into a single .js file. This will make maintenance easier, and the tiny bit of overhead for the user loading a little js they potentially may not use does not matter. I would also let Google load the jQuery library for you and then have all of your js code in a single separate file.
Question 2:
As these plugins don't really change, I would manually combine them. Closure Compiler is good at this. When minifying, use the highest setting that does not give any warnings.
Question 3:
Yes, you will want to minify global.js.
When the browser downloads global.js, it's cached for an amount of time. Thus when you reference global.js again on a different page, it's not re-downloaded; the browser uses your local copy first. So you do a little bit more work on the initial download, but from then on it should be quicker.
Generally best practices related to javascript for speeding up website loads are:
Minify all javascript and put all of it into a single file (make as much of your javascript external as possible).
Put javascript at the bottom of the document.
Force the web server to assign an expiration date in the future and use a timestamped query string to invalidate old versions of javascript files; this will prevent unnecessary requests for your javascript if it has not changed. (i.e.: in httpd.conf ExpiresByType application/x-javascript "access plus 1 year", in your document: <script type="text/javascript" src="/allmy.js?v=1285877202"></script>)
Configure your web server to gzip all text files.
The main reason why you should keep too much javascript away from tab pages is that it will kill the user experience. When a user clicks on a tab for the first time, it will grab all the components needed on the fly, which makes it feel sluggish.
Your question is only semi-specific, as we don't know a lot of things about your site, like exact file sizes and how the modules are really used.
The general idea would be to find balance between modularity and speed.
When you're combining modules together these are the general ideas you should consider:
how often does this module change?
how often is this module used?
how big is this module (filesize)?
Then take the most used, stable code and merge it into one file. The rest of the site-specific functionality should be included on the tab pages.
Also, make sure to load javascript asynchronously as it won't block rendering of the page (and tabs).
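For example (file names are placeholders), modern browsers let you do this with the async or defer attributes on the script tag:
<!-- async: download in parallel, execute as soon as it arrives (order not guaranteed) -->
<script src="js/tracking.js" async></script>
<!-- defer: download in parallel, execute in order once the document is parsed -->
<script src="js/global.js" defer></script>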
Another combined answer:
If adding all the JS together in a packed/minified version generates no more than 30k of file size, you're better off combining it. A single extra connection for a file (assuming it's not cached) costs about as much as 10-20k of extra JS download. This has to do with browsers opening and closing connections vs streaming an extra 20k on an established connection. The threshold also depends on your user distribution: if you have a lot of dial-up or low-bandwidth users, your threshold will be smaller.
I typically recommend combining and loading as 1 file unless the library is very obscure and only a rare edge case triggers it on a page. Ex: hover triggers functionality Y, but it's on a feedback widget that gets less than 1% of traffic - don't bother combining.
Minifying and packing are a little overrated these days. With the vast majority of browsers supporting gzip, the compression gzip provides over the wire has virtually the same effect as min/pack. However, there is a small cost for the browser to unpack it. Having said that, it's still good practice to min/pack the code, since not all browsers support it, you may not want the file to be gzip enabled, etc.
I've used online packers against 3rd party modules and it works fairly well. However, there are times when it can cause an issue, so make sure to test your manually packed version before deploying.
Alternate:
If you feel that your users will rest on your index page for longer than 10 seconds, you could pre-load the additional libraries separately using the Js Loader Prototype pattern.
Steve Souders' Even Faster Websites is a book you should look into.
Firstly, one experiences slowdowns because whenever an external script is linked, the browser waits for the script to download, parse and execute; only after this does it resume processing the rest of the page. So to avoid such slowdowns, one can look at downloading the scripts in parallel. A few techniques are: Ajax the scripts in if they are on the same domain, or use a Script DOM element or a script in an iframe if the scripts are on external domains.
Q1: For me, modularising all the content is a better option with respect to further development if the page content has to be changed constantly. Responsiveness is very important for the end user. A small global.js will help in getting the app up and running; in parallel, one can download the tabX.html files.
Q2: The jQuery plugins rarely change. The plugins for the tabX.html pages can be downloaded in parallel and cached locally, so when a tabX.html is loaded the required plugins need not be fetched again. So all the plugins required by the main page should be in one single file and the ones used by the tabX.html pages should be in different files.
Q3: It's a personal choice here. Do you want it to be developer friendly or user friendly? I bank on user friendliness. Making responsive and efficient apps is our job! The main advantage of packing everything into a single file is ease of development. Well, ugly code begets beautiful apps :). Users are speed-aholics. For example, when Google changed from 10 results per page to 20, they saw a considerable drop in search queries. So my opinion is not to pack all of them into one, but to load each in parallel.
Some of the techniques and relevant links for testing each:
XHR eval /ajax : http://stevesouders.com/cuzillion/?ex=10009
XHR Injection (sketched after this list) : http://stevesouders.com/cuzillion/?ex=10015
Script in Iframe : http://stevesouders.com/cuzillion/?ex=10012
Script DOM element : http://stevesouders.com/cuzillion/?ex=10010
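A minimal sketch of the XHR Injection approach listed above (same-origin scripts only; the URL is a placeholder):
// Fetch the script in parallel with other downloads, then inject it so it executes.
function injectScript(url) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            var script = document.createElement('script');
            script.text = xhr.responseText; // executes when appended
            document.getElementsByTagName('head')[0].appendChild(script);
        }
    };
    xhr.send(null);
}

injectScript('/js/tab-plugins.js');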
Question 1:
The best practice would be to place all js files in a single "global" file. This minimizes your HTTP requests. Let's say you have 5 plug-ins; this would mean you need to make 5 requests, whereas if you combine them into one, you only need to request it once. This might be a little heavy on the first load, but the next time around the file will be cached by the browser, so no worries about the size. HOWEVER, be careful about the sequence of the scripts when combining them (i.e., the jQuery script should be placed in the js file before jQuery UI's).
http://articles.sitepoint.com/article/web-site-optimization-steps/4
http://code.google.com/speed/page-speed/docs/rtt.html
Question 2:
You can do it manually or automatically. Dean Edwards's Packer is a good choice. If you're using ASP.NET, you can check MB Compression Handler; if you're using Apache with PHP, perhaps you can change the configuration of your htaccess to gzip it.
Question 3:
It'd be better if you pack the "global" javascript file as well. This could save bandwidth and load time. You get the point: combining all the js files you need for the site saves you from including individual scripts.