Getting jQuery from Google - javascript

Some tutorials suggest using the jQuery path hosted by Google, e.g.:
<script type="text/javascript"
src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
Is that safe to use in our projects?
Doesn't it make us dependent, since we cannot be sure the file will still be there in a year or beyond?
I ask because some people argue in favor of this approach.

From the documentation:
"Google works directly with the key stakeholders for each library effort and accepts the latest stable versions as they are released. Once we host a release of a given library, we are committed to hosting that release indefinitely."
It seems pretty low-risk to me. The file is more likely to already be in the user's cache, and it is served with proper gzip and caching headers. It also won't use up an HTTP connection to your domain on browsers that only allow two simultaneous connections per domain (e.g. IE6 and IE7).

I have an article for you that explains the pros and cons of using this method: Here
I really doubt that Google will put this up for people to use and then suddenly take it down, causing issues for thousands of websites. It's not as if they will lose their domain or run out of bandwidth. The only issue I think you should worry about is whether the end users of your sites can access Google. Personally, I just host the file on my own server anyway.

Short answer is yes, and I agree that if that include doesn't work, it is probably a sign of a much bigger problem. My general rule of thumb is that for all public-facing apps I use that include, whereas for internal apps (which could theoretically be used without a connection to the outside world) I include a local copy instead.

There will always be a chance that it will not be there after a year, the same as Gmail, Google Docs, google.com...
For jQuery alone, I do not see a reason to use Google's copy, as the file is small and the impact on your server and bandwidth will not be significant. But jQuery UI may be worth serving from Google.

It's pretty 'safe', as the other guys mentioned, and you probably ease a bit of load off your own server too. Even SO itself uses it.
But to be safe, always have a fallback plan and keep a local copy, just in case.
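A minimal sketch of such a fallback (the local path /js/jquery.min.js is an assumption; point it at wherever your copy lives):
<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<script type="text/javascript">
// If the CDN failed, window.jQuery is undefined; document.write a local copy.
window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>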

There's not really much risk involved if you think about it. Suppose Google ceases to exist in a year (chuckle); it would take you only a couple of minutes to replace the google.load command in your common file with a reference to your own local jQuery copy.
The worst-case scenario is that, in the improbable event of Google's demise, your hover effects stop working for five minutes :)

A similar question: Where do you include the jQuery library from? Google JSAPI? CDN?
Because of the answers from that question, I have started using:
<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
I have it running on quite a number of sites. The only issue I've had is that some firewalls start blocking a site if there are too many requests (or at least that is my guess), which can happen when several high-traffic sites are all used from one location.

Related

PureChat and Mod Deflate

I'm working for a web customer who read on a blog that Google's PageSpeed Insights score matters. So now they want a high PageSpeed Insights rating "because it's good for Google". While I think this is absurd, I have no choice but to comply.
The problem is, this customer uses an external JavaScript file provided by PureChat. This script is not compressed in any way, leading PageSpeed Insights to complain that it is not gzipped or deflated.
Is there a way to force PureChat to be served deflated?
The problem with doing that is that PureChat most likely updates their scripts quite often, meaning that if you used things like far-future Expires headers, the customer's cache could keep running an older version, which could contain harmful or exploitable bugs. So, as you said, I really would NOT recommend doing this.
Another problem I find a number of people complain about is Google Analytics affecting their PageSpeed ranking, even though it comes from Google itself. That can be fixed with a cron job that keeps a local copy of the script, and you could do something similar by loading the PureChat JavaScript locally.
See the article here: http://diywpblog.com/leverage-browser-cache-optimize-google-analytics/ It describes how to do this for Google Analytics; you could use the same approach for PureChat, I imagine.
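As a rough illustration, a small Node script run from cron could fetch the remote file and store it locally, so your own server serves it with mod_deflate compression and your cache headers (a minimal sketch; the PureChat URL and the local path are assumptions, substitute the real ones from your embed code):
// fetch-purechat.js - run from cron, e.g. hourly: 0 * * * * node /path/to/fetch-purechat.js
var https = require('https');
var fs = require('fs');

// Hypothetical values; use the actual script URL from your PureChat embed code.
var REMOTE_URL = 'https://app.purechat.com/VisitorWidget/WidgetScript';
var LOCAL_PATH = '/var/www/static/js/purechat.js';

https.get(REMOTE_URL, function (res) {
    var chunks = [];
    res.on('data', function (chunk) { chunks.push(chunk); });
    res.on('end', function () {
        // Only overwrite the local copy on a successful response.
        if (res.statusCode === 200) {
            fs.writeFileSync(LOCAL_PATH, Buffer.concat(chunks));
        }
    });
}).on('error', function (err) {
    // Keep the old local copy if the fetch fails.
    console.error('Fetch failed:', err.message);
});
Then reference the local copy in the page instead of the PureChat URL and let your web server handle the compression. Frequent cron runs keep the risk of serving a stale (and potentially buggy) version low, which addresses the caching concern above.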

Referencing javascript libraries locally or externally?

We are currently developing an ASP.NET MVC application which will be deployed on a corporate intranet, with a slightly modified customer-facing version available on the public internet.
We're making use of a number of external JavaScript libraries (e.g. jQuery), and a discussion has come up regarding referencing the libraries - should we reference them from an external source (e.g. via the Google load jQuery method) or keep our own version locally and reference from there?
The project manager is a little concerned about having a 'dependency' on Google (or whoever) if we reference from there, and thinks that having our own copy of the library makes us more independent. On the other hand, I have heard there are a number of advantages to letting someone else host the library - for example, they handle versioning for us, Google aren't going anywhere anytime soon...
(for the purpose of the discussion assume the intranet we're hosting on has external access - obviously if it turns out it doesn't the decision is very much made for us!)
So. Does this matter? And if so, what should we do and why?
(I appreciate this is subjective - but it would be very useful to get advice from anyone with experience or thoughts on the matter. Not sure if this is a candidate for community wiki or not; let me know if I should have put it there and I'll know for the future!)
Thanks :)
Is the application critical to your internal users? If 'the internet is down', will your business suffer because internal users cannot access the application? It's really more about risk than anything else.
Besides, if there is a change - breaking or otherwise - would you not want to manage the 'upgrade' yourself?
Usually the reason for using a hosted library is performance. A browser will only download a limited number of files per host, so using a hosted library means those files load from a different host and therefore in parallel with your other files.
The second reason is that those files are usually compressed and have their cache headers set correctly. They are also typically stored on a CDN, which means your users download the file from the host closest to them.
But all those reasons are not so important in an intranet environment.
You can do something like:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script type="text/javascript">
if (window.$) {
    // jQuery loaded from the CDN; your jQuery code goes here
} else {
    // CDN unavailable: inject a local copy instead
    var elmScript = document.createElement('script');
    elmScript.src = 'jQuery.js'; // path to your local jQuery
    elmScript.type = 'text/javascript';
    document.getElementsByTagName('head')[0].appendChild(elmScript);
}
</script>
Notwithstanding the complexity issues that arise from having your code scattered over multiple sites, some of which you don't control, there is also a slight performance penalty for pulling content from multiple domains (e.g. an extra DNS lookup). For a public website, where many users will be first-time visitors, there may be some benefit to having the code cached already, but otherwise I don't really see the point.
The main (only?) reason to do this would be performance, as Kau-Boy has already mentioned. jQuery is a relatively small library (for what it gives you) and it will be cached locally by your external users' browsers/proxies/ISPs, so it doesn't really give you anything except a small latency gain.
If you find yourself adding many external Javascript files then you should perhaps reconsider your overall design strategy. Choose a good general purpose library (I personally consider jQuery to be unbeatable - this is the subjective part) and learn it inside-out. You shouldn't have to add many more libraries apart from the odd plugin, which should be included on a page-by-page basis anyway.
Some really good answers from the others on here by the way.

Is there a publicly available CDN that hosts JSON2?

It's well known that Google and Microsoft host several common javascript libraries on their CDNs (content distribution networks). Unfortunately neither seems to host JSON2.js.
I'm aware that I could upload a copy of JSON2.js to my server and serve it myself, but CDNs offer a number of advantages that I would like to benefit from.
So with that in mind, are there any publicly available CDNs that host JSON2? If not, any idea why? Is there some sort of copyright reason?
Check out cdnjs.com:
http://cdnjs.com/libraries/json2/
It might also be worth investigating JSON3:
http://cdnjs.com/libraries/json3/
UPDATE: Some of the information was out of date; changed to better links.
json2.js can be found on Yandex CDN servers.
Full version: http://yandex.st/json2/2011-10-19/json2.js
Minified: http://yandex.st/json2/2011-10-19/json2.min.js
HTTPS also works.
I think it's probably too early to expect the big CDNs to start doing this. When enough sites use a library, the benefits become clear: greater availability, more frequent use, fewer client requests, increased performance for the end user. If only a few sites use it, the chance of the client already having a copy in their cache is low, and all the performance boosts are lost. So all that's left is MS and Google offsetting your bandwidth charges, which is not their intention. Thus, the solution is to get more developers to use the library.
Plus, the library is tiny. The code is still only 3.5KB using conservative minification. For comparison, jQuery is 24KB and ext-core is 29KB. I'd personally recommend folding the library into your own site's base JS and getting your performance boost there, at least until there's wider acceptance.
Also, funnily enough, I'd have expected the JSON library to be hosted at Yahoo too, but I can't find it. I mean, Crockford works there.
Thomas from cdnjs.com here, with two quick reasons why there is no minified version:
1) The script may not function as the author intended under the method of minification we choose.
2) As a security step, we ensure that all file checksums match the original author's hosted files, so community-submitted updates cannot contain malformed minified code.
So for now that leaves us hosting Crockford's un-minified version:
https://github.com/douglascrockford/JSON-js/raw/master/json2.js
There is now.
Douglas Crockford recently put JSON2 on GitHub; this URL will always link to the most recent version.
Edit:
It's not a good idea to use this method; see my comment below.

Where should JS library reside (host)

A JS library like jQuery can be linked directly from another site (e.g. Google). Usually I use
<script type="text/javascript" src="/js/jQuery.min.js"></script>
But I could use
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
or similar.
I like to keep full control over my site, so I use the first way. But using Google or another host has some advantages (decreased latency, increased parallelism, better caching).
Both have advantages and disadvantages.
Which should I use? Which do you use, and why?
Please let me know your opinion.
Thank you
I think it depends on the audience of your website.
If your site is public-facing and people will access it primarily or exclusively from the internet, then you will benefit from lower bandwidth utilization, faster responses, and caching, since the likelihood of the file having already been referenced and loaded from another site is high.
If your site is internal, on an intranet, you may run into issues if people do not have internet access, and you are also wasting bandwidth by sending everyone out over the internet to fetch a file you could host locally.
I use Google where possible for performance reasons, but I also check in a local copy in case I need to work on the site when I am offline, e.g., on an airplane, or at a remote location with no internet access.
Don't forget that if you use a copy from Google (or whoever), you have to guard against the possibility that they might move or change the file, or that their server might be down.
If your site needs a specific javascript library, then you should download it and serve it up yourself. If your income depends on that file, the last thing you want is to rely on another site to provide it.

How to protect a site from API outages?

Watching the effects of today's Google outage across the web got me thinking about how to prevent this in the future. This may be a stupid question, but is there a good way to include external APIs (e.g., Google's AJAX libraries) so that if the API is unavailable, the including page can still soldier on without it? Is it generally a bad idea to use libraries hosted on an external server?
It's unavoidable to use a script tag to load cross-domain JavaScript files (and it will hang until timeout if the host goes down). However, in your code, check that the API objects exist before using them, to avoid errors:
E.g. instead of:
<script type="text/javascript">
google.load("maps", "2");
// Use Maps API
</script>
use:
<script type="text/javascript">
// window.google is undefined if the loader script failed to load
if (window.google) {
    google.load("maps", "2");
    // Use Maps API
} else {
    // Fallback
}
</script>
I don't think rare outages are worth rejecting an external API wholesale.
You will want to design your application to degrade gracefully (as others have stated), and there is actually a design pattern that can help: the Proxy Pattern. Implemented correctly, a proxy can act as a gatekeeper that checks whether a service is available (among many other uses) and returns to the application either the live data, cached data, or notice that the service is not available.
The best general answer I can give is: degrade beautifully and gracefully, and avoid surfacing errors. If the service can become unavailable, expect that and do the best job you can.
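A minimal sketch of that idea (the service, its URL, and the caching are hypothetical; this is one way to structure the gatekeeper, not a definitive implementation):
// A proxy that guards access to a remote service.
// getData() hands back live data when the service responds,
// the last cached response when it doesn't, and null when neither exists.
function ServiceProxy(serviceUrl) {
    this.serviceUrl = serviceUrl;
    this.cache = null; // last successful response
}

ServiceProxy.prototype.getData = function (callback) {
    var self = this;
    var xhr = new XMLHttpRequest();
    xhr.open('GET', this.serviceUrl, true);
    xhr.onload = function () {
        if (xhr.status === 200) {
            self.cache = xhr.responseText; // refresh the cache
            callback(xhr.responseText, 'live');
        } else {
            callback(self.cache, self.cache ? 'cached' : 'unavailable');
        }
    };
    xhr.onerror = function () {
        // Service unreachable: fall back to cached data if we have any.
        callback(self.cache, self.cache ? 'cached' : 'unavailable');
    };
    xhr.send();
};

// Usage: the application never talks to the service directly.
var weather = new ServiceProxy('/api/weather'); // hypothetical endpoint
weather.getData(function (data, source) {
    if (source === 'unavailable') { /* degrade gracefully */ }
});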
But I don't think this is a question that can be answered generically. It depends on what your site does, what external libraries/APIs you are using, and so on.
You could do some sort of caching to still serve up pages with older data. If allowed, you could run the API engine on your own server. Or you could just throw up status messages to users.
It's not a bad idea to rely on external APIs, but one of the major drawbacks is that you have little control over it. If it goes away? Welcome to a big problem. Outages? Not much you can do but wait.
I think it's a great idea to use external libraries, since it saves bandwidth for me (read: $$). And it's pretty easy to protect against this kind of API outage: keep a copy on your server, and in your JavaScript check whether the API loaded successfully. If not, load the copy on your server.
I was thinking of jQuery and YUI here. The other guys are right about the problems with actual services like mapping.
One possibility for mitigating the problem (this will only work if your site is dynamically generated):
Set up a cronjob that runs every 10 minutes / hour / whatever, depending on how much you care. Have it attempt to download the external file(s) that you include, one attempt for each external host that you depend on, and have it set a flag in the database recording whether each individual external host is currently available.
When your pages are being generated, check the external-host flags, and print the source attribute either pointing to the external host if it's up, or a local copy if it's down.
For bonus points, have the successfully downloaded file from the cronjob become the local copy. Then when one does go down, your local copy represents the most-current version from the external host anyway.
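A minimal sketch of the cron half in Node (the host list is an example, and setFlag() here just logs; in a real setup it would write the flag to your database for the page-generation code to read):
// check-hosts.js - run from cron; records which external hosts are reachable.
var https = require('https');

var HOSTS = [ // external files your pages include
    'https://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js'
];

HOSTS.forEach(function (url) {
    https.get(url, function (res) {
        // Treat any 2xx response as "host is up".
        setFlag(url, res.statusCode >= 200 && res.statusCode < 300);
        res.resume(); // discard the body
    }).on('error', function () {
        setFlag(url, false);
    });
});

// Stand-in persistence helper: replace with a database write.
function setFlag(url, isUp) {
    console.log((isUp ? 'UP   ' : 'DOWN ') + url);
}
When each page is generated, read the flag back and print either the external URL or the path to your local copy in the script tag's src attribute.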
A lot of the time you need to access third-party libraries over the web. The question you need to ask yourself is how much you need them and whether you can cache any of it.
If your uptime needs to be as close to 100% as possible, then maybe you should look at how much you rely on these third parties.
If all you are obtaining is the weather once an hour, you can probably cache that so that things carry on regardless. If you are asking a third party for data that is only valid for that millisecond, you probably need error handling to cover the fact that it may not be there.
The answer to the question is entirely based upon the specifics of your situation.
It'd certainly be fairly easy to include a switch in your application that toggles between Google and your local web server for serving YUI, jQuery, or a similar library, so that you can change provider.
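For instance, a minimal sketch of such a switch (the USE_CDN flag and both paths are assumptions; in a real app the flag would come from your configuration):
// Pick the script source from a single config flag.
var USE_CDN = true; // flip to false to serve the local copy instead

var JQUERY_SRC = USE_CDN
    ? '//ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js'
    : '/js/jquery.min.js'; // hypothetical local path

document.write('<script src="' + JQUERY_SRC + '"><\/script>');
In a server-rendered app the same toggle would live in server configuration and decide which src attribute gets written into the page.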
