Referencing JavaScript libraries locally or externally?

We are currently developing an ASP.NET MVC application which will be deployed on a corporate intranet, with a slightly-modified customer facing version available on the public internet.
We're making use of a number of external javascript libraries (e.g. jQuery) and a discussion has come up regarding referencing the libraries - should we reference them from an external source (e.g. via the Google load jQuery method) or keep our own version locally and reference from there?
The project manager is a little concerned about having a 'dependency' on Google (or whoever) if we reference from there, and thinks that having our own copy of the library makes us more independent. On the other hand, I have heard there are a number of advantages to letting someone else host the library - for example, they handle versioning for us, Google aren't going anywhere anytime soon...
(for the purpose of the discussion assume the intranet we're hosting on has external access - obviously if it turns out it doesn't the decision is very much made for us!)
So. Does this matter? And if so, what should we do and why?
(I appreciate this is subjective - but it would be very useful to get advice from anyone with experience or thoughts on the matter. Not sure if this is a candidate for community wiki or not, let me know if I should have put it there and I'll know for future!)
Thanks :)

Is the application critical to your internal users? If the 'internet is down', will your business suffer because internal users cannot access the application? It's really more about risk than anything else.
Besides, if there is a change - breaking or otherwise - would you not want to manage the 'upgrade' yourself?

Usually the reason for using a hosted library is performance. A browser will only download a limited number of files per host, so using a hosted library means those files come from a different host and therefore download in parallel with your other files.
The second reason is that those files are usually compressed and served with the correct cache headers. They are also usually stored on a CDN, which means your users download the file from the host closest to them.
But all those reasons are not so important in an intranet environment.

You can do something like this:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script type="text/javascript">
if (window.$) {
    // jQuery loaded from Google; your jQuery code can go here
} else {
    // the Google copy failed to load, so fall back to a local copy
    var elmScript = document.createElement('script');
    elmScript.src = 'jQuery.js'; // path to your local jQuery
    elmScript.type = 'text/javascript';
    document.getElementsByTagName('head')[0].appendChild(elmScript);
}
</script>

Notwithstanding the complexity issues that arise from having your code scattered over multiple sites, some of which you don't control, there is also a slight performance penalty to pay for pulling content from multiple domains (e.g. an extra DNS lookup). For a public web site, where a lot of users will be first-timers, there may be some benefit to having the code cached already, but otherwise I don't really see the point.

The main (only?) reason to do this would be for performance, as @Kau-Boy has already mentioned. jQuery is a relatively small library (for what it gives you) and it will be cached locally by your external users' browsers/proxies/ISPs. So it doesn't really give you anything except a small latency gain.
If you find yourself adding many external JavaScript files then you should perhaps reconsider your overall design strategy. Choose a good general-purpose library (I personally consider jQuery to be unbeatable - this is the subjective part) and learn it inside out. You shouldn't have to add many more libraries apart from the odd plugin, which should be included on a page-by-page basis anyway.
Some really good answers from the others on here by the way.

Related

Is there any data on the advantage of loading jQuery from a CDN (HTML5BP) vs concatenating and minifying it with the other scripts?

I'm using HTML5 Boilerplate, and saw that they're separating jQuery from main.js and plugins.js. I understand that the logic behind this is that a lot of sites use jQuery, and if you use a CDN there is a big chance that the user will already have it cached.
But since my use of HTML5 BP utilizes a build step, I could also concatenate and minify jQuery with all my other scripts. So, should I:
- Separate jQuery and preferably load it from a CDN, or
- Concatenate and minify jQuery with all my other scripts?
From what I understand HTML5 BP doesn't consider the concat/minify option since their boilerplate doesn't include a build step. But since it is an option to me; is there any hard data about what is the best option here, or what the real world differences in performance are?
There was a large discussion of this (and other related issues) in a pull request last year. You should check out that thread. Lots of detailed discussion.
The reason H5BP uses that set-up is that it's the best default solution. If a user of H5BP doesn't do anything else to their set-up, they're guaranteed to get a geographically optimized version of jQuery served to their users from a CDN source with the highest possible chance of hitting the cache lottery.
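For reference, that default set-up is roughly the following two lines (the exact version number and local path vary between H5BP releases and are only illustrative here): the CDN copy is requested first, and the local copy is only written in if the CDN request failed.
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.9.1.min.js"><\/script>')</script>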
Beyond a default solution:
If you've got a global audience and you're going to serve your static assets from a CDN, then it would probably be slightly faster to serve a single file from that CDN.
If you've got a performance tuned static assets server and you've got a local audience, then you might be faster serving a single file from a static assets server even if it's not on a CDN.
Really, though, testing your set-up against your own geographical audience is the best way to go. It's the only way to answer this question for your site or application.
Pros:
- Performance is a little better, because users may already have it cached from other sites.
Cons:
- If the CDN is down, your site is broken.
- If all your files are concatenated into one, you will have only one request on your domain, which will always be better (IMHO) than a separate request to an external domain.
I personally will always choose to concatenate.
But in any case, don't over-optimize. If your site is not big, choose whichever you prefer.
(And I think that if you test both solutions, the results will not be extremely different.)

Getting jQuery from Google

There are some tutorials which suggest using the jQuery path hosted by Google, e.g.:
<script type="text/javascript"
src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
Is that safe to use in our projects?
Aren't we making ourselves dependent, since we are not sure it will still be there after a year or beyond?
The reason I have asked this question is that there are some people who argue in favor of it.
From the documentation:
"Google works directly with the key stakeholders for each library effort and accepts the latest stable versions as they are released. Once we host a release of a given library, we are committed to hosting that release indefinitely."
It seems pretty low-risk to me. And more likely to be already in the user's cache. And served with the proper gzip and caching headers. It also won't eat up an HTTP request to your domain on browsers that only allow two concurrent downloads per domain (e.g. IE6 and IE7).
I have an article for you that explains the pros and cons of using this method: Here
I really doubt that Google will put this up for people to use and then all of a sudden take it down and cause issues for thousands of websites. It's not like they will lose their domain or run out of bandwidth. The only issue I think you should worry about is whether the end users of your sites can access Google. Personally, I just host the file on my own server anyway.
Short answer: yes, and I agree that if that include doesn't work it is probably a sign of a much bigger problem. My general rule of thumb is that for all public-facing apps I use that include, whereas for internal apps (which theoretically could be used without a connection to the outside world) I include a local copy instead.
There will always be a chance that it will not be there after a year, the same as Gmail, Google Docs, google.com...
For jQuery alone, I do not see a reason to use the Google source, as the file is small and the impact on your server and bandwidth will not be much. But jQuery UI may be worth loading from Google's source.
It's pretty 'safe', as the other guys mentioned. You probably ease a bit of load off your own server too. Even SO itself uses it.
But to be safe, always have a fallback plan and have a local copy, just in case.
There's not really much risk involved if you think about it. Suppose Google ceases to exist in a year (chuckle); it wouldn't take you more than a couple of minutes to replace the google.load command in your common file with a reference to your own local jQuery copy.
The worst-case scenario is that, in the improbable event of Google's demise, your hover effects stop working for 5 minutes :)
A similar question: Where do you include the jQuery library from? Google JSAPI? CDN?
Because of the answers from that question, I have started using:
<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
I have it running on quite a number of sites. The only issue I've had is that some firewalls start blocking a site if there are too many requests (or at least that is my guess), which happens on higher-traffic sites all used from one location.

Where should a JS library reside (which host)?

A JS library like jQuery can be linked directly from another site (e.g. Google). Usually I use
<script type="text/javascript" src="/js/jQuery.min.js"></script>
But I can use
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
or similar.
I like to take full control over my site, so I use the first way. But using Google or another host has some advantages (e.g. decreased latency, increased parallelism, better caching).
Both have advantages and disadvantages.
What should I use? What do you use, and why?
Please let me know your opinion.
Thank you
I think that it depends on the audience of your website.
If your site is public facing and people are going to be accessing it primarily or exclusively from the internet, then you will benefit from lower bandwidth utilization, faster responses, and caching, since the likelihood that the file has previously been loaded from another site is high.
If your site is internal, on an intranet, you may run into issues where people do not have internet access, and you're also going to be wasting bandwidth, since you're sending everyone out over the internet to fetch a file you could host locally.
I use Google where possible for performance reasons, but I also check in a local copy in case I need to work on the site when I am offline, e.g., on an airplane, or at a remote location with no internet access.
Don't forget that if you use a copy from Google (or whoever), you have to guard against the possibility that they might move or change the file, or that their server might be down.
If your site needs a specific javascript library, then you should download it and serve it up yourself. If your income depends on that file, the last thing you want is to rely on another site to provide it.

Calculating JavaScript Library Load Times

I am trying to compare the performance of several JavaScript libraries. Although measuring transaction times helps to determine the working performance of a library, it doesn't account for the time required to download and initialise the individual libraries. I'm looking for suggestions on the best method of determining load time, other than using tools such as Firebug, etc. I would like to be able to set up a controlled environment where a page could be loaded n times, while capturing the start and end times. Should the JavaScript library contents be included in the page rather than as an include file, or is there a better way?
Reading this article by John Resig on JavaScript Benchmark Quality before you start anything may help you out.
After that, I would suggest that you might try requesting the JavaScript from your server and timing how long the eval(responseJS); call takes. That way, you are only timing how long the library takes to initialise, rather than that plus the time it takes to download from the server.
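A minimal sketch of that approach, assuming the library is served from your own origin at /js/library.js (a hypothetical path) and using a synchronous XHR purely to keep the measurement simple:
<script type="text/javascript">
// Download the library source first, so network time is excluded from the measurement.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/js/library.js', false); // synchronous, for simplicity
xhr.send(null);
var responseJS = xhr.responseText;

// Time only the parse/initialisation step.
var start = new Date().getTime();
eval(responseJS);
var elapsed = new Date().getTime() - start;
alert('Library initialisation took ' + elapsed + ' ms');
</script>
Loading the page n times and averaging the results helps smooth out timer resolution.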
The libraries should always be an external file, included via a script tag, either on their own or with the site's own scripting rolled in. Minified and packed files will be smaller to download. Delivery via a CDN is optimal as well, as the CDN will have it cached. Many of the popular frameworks are available over Google's CDN.
You must also account for not only the library, but the application using the library. The quality of the JS in the libraries is (typically) top-notch, but what about the quality of the code tapping into those libraries, or even of plugins which may not be developed by the library authors? You also have to look at which browser is being used. As much as we hate it, most of these cross-browser libraries are optimized for best performance in Internet Explorer, because it retains 85+% market share.
The performance of any library is really a trade-off: deciding what is acceptable in order to get your application to do whatever it is you want it to do.

How to protect a site from API outages?

Watching the effects of today's Google outage across the web got me thinking about how to prevent this in the future. This may be a stupid question, but is there a good way to include external APIs (e.g. Google's AJAX libraries) so that if the API is unavailable, the including page can still soldier on without it? Is it generally a bad idea to rely on libraries hosted on an external server?
It's unavoidable to use a script tag to load cross-domain JavaScript files (and it will simply hang or time out if the host goes down). However, in your code, check that the API object actually exists before using it, to avoid errors:
E.g. instead of:
<script type="text/javascript">
google.load("maps", "2");
// Use Maps API
</script>
use:
<script type="text/javascript">
if (window.google) {
    google.load("maps", "2");
    // Use Maps API
} else {
    // Fallback
}
</script>
I don't think rare outages are worth rejecting an external API wholesale.
You will want to design your application to degrade gracefully (as others have stated), and there is actually a design pattern that can be useful in doing so. The Proxy Pattern, when implemented correctly, can be used as a gatekeeper to check whether a service is available (among many other uses) and return to the application either the live data, cached data, or an indication that the service is not available.
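A minimal sketch of that idea, where remoteWeather and its getForecast() callback API are hypothetical names used only for illustration:
var weatherProxy = {
    cache: null,
    getForecast: function (callback) {
        // Gatekeeper: only call the real service if its API object actually loaded.
        if (window.remoteWeather) {
            var self = this;
            remoteWeather.getForecast(function (data) {
                self.cache = data;   // remember the last good response
                callback(data);
            });
        } else if (this.cache) {
            callback(this.cache);    // service unavailable: fall back to cached data
        } else {
            callback(null);          // nothing available: let the caller degrade
        }
    }
};
The rest of the application only ever talks to weatherProxy, so it never needs to know whether the external service was reachable.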
The best general answer I can give is: degrade gracefully and avoid throwing errors. If the service can become unavailable, expect that and do the best job you can.
But I don't think this is a question that can be answered generically. It depends on what your site does, what external libraries/APIs you are using, etc.
You could do some sort of caching to still serve up pages with older data. If allowed, you could run the API engine on your own server. Or you could just throw up status messages to users.
It's not a bad idea to rely on external APIs, but one of the major drawbacks is that you have little control over it. If it goes away? Welcome to a big problem. Outages? Not much you can do but wait.
I think it's a great idea to use external libraries, since it saves bandwidth for me (read: $$). But it's pretty easy to protect against this kind of API outage: keep a copy on your server and, in your JavaScript, check whether the API has been successfully loaded. If not, load the one on your server.
I was thinking about jQuery and YUI here. The other guys are right about the problems when using actual services like mapping.
One possibility for mitigating the problem (will only work if your site is dynamically generated):
Set up a cronjob that runs every 10 minutes / hour / whatever, depending on how much you care. Have it attempt to download the external file(s) that you are including, one attempt for each external host that you depend on. Have it set a flag in the database representing whether each individual external host is currently available.
When your pages are being generated, check the external-host flags, and print the source attribute either pointing to the external host if it's up, or a local copy if it's down.
For bonus points, have the successfully downloaded file from the cronjob become the local copy. Then when one does go down, your local copy represents the most-current version from the external host anyway.
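A rough sketch of the cronjob half in Node, where the URL, flag file, and paths are assumptions for illustration (the flag could just as easily be a row in your database):
var http = require('http');
var fs = require('fs');

// Try to download the external file; record whether the host is up,
// and keep the downloaded copy as the local fallback when it succeeds.
http.get('http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js', function (res) {
    var up = res.statusCode === 200;
    fs.writeFileSync('/var/www/flags/google-cdn.up', up ? '1' : '0');
    if (up) {
        res.pipe(fs.createWriteStream('/var/www/js/jquery.min.js'));
    }
}).on('error', function () {
    fs.writeFileSync('/var/www/flags/google-cdn.up', '0');
});
Your page-generation code then just reads the flag and prints either the external src or the local one.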
A lot of the time you need to access third-party libraries over the web. The question you need to ask yourself is how much you need this and whether you can cache any of it.
If your uptime needs to be as close to 100% as possible, then maybe you should look at how much you rely on these third parties.
If all you are obtaining is the weather once an hour, then you can probably cache that so that things carry on regardless. If you are asking a third party for data that is only valid for that millisecond, then you probably need error handling to cover the fact that it may not be there.
The answer to the question is entirely based upon the specifics of your situation.
It'd certainly be fairly easy to include a switch in your application so you can toggle between Google and your local web server for serving YUI, jQuery, or a similar library, letting you change provider at will.
