It's well known that Google and Microsoft host several common JavaScript libraries on their CDNs (content distribution networks). Unfortunately, neither seems to host JSON2.js.
I'm aware that I could upload a copy of JSON2.js to my server and serve it myself, but CDNs offer a number of advantages that I would like to take advantage of.
So with that in mind, are there any publicly available CDNs that host JSON2? If not, any idea why? Is there some sort of copyright reason?
Check out cdnjs.com:
http://cdnjs.com/libraries/json2/
It might also be worth investigating JSON3:
http://cdnjs.com/libraries/json3/
UPDATE: Some of the information was out of date, so I've swapped in better links.
json2.js can be found on Yandex CDN servers.
Full version: http://yandex.st/json2/2011-10-19/json2.js
Minified: http://yandex.st/json2/2011-10-19/json2.min.js
HTTPS also works.
I think it's probably too early to expect the big CDNs to start doing this. When enough sites are using a library, the benefits become clear: greater availability, more frequent use, fewer client requests, better performance for the end user. If only a few sites are using it, the chance of a client already having a copy in their cache is low, and all of the performance boosts are lost. So all that's left is MS and Google offsetting your bandwidth charges, which is not their intention. Thus, the solution is to get more developers to use the library.
Plus, the library is tiny. The code is still only 3.5KB using conservative minification. For comparison, jQuery is 24KB and ext-core is 29KB. I'd personally recommend folding the library into your own site's base JS and getting your performance boost there, at least until there's wider acceptance.
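In case it helps, here's a minimal sketch of what folding it in could look like as a Node-based build step; the file names are just examples, not anything your project necessarily uses:
var fs = require('fs');

// Concatenate json2.js into the site's base script (file names are examples).
var bundle = ['json2.js', 'site-base.js']
    .map(function (file) { return fs.readFileSync(file, 'utf8'); })
    .join('\n;\n'); // the stray semicolon guards against files that don't end with one

fs.writeFileSync('site-bundle.js', bundle);
Run that (or the equivalent in whatever build tool you already have) before minifying the bundle.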
Also, funnily enough, I'd have expected the JSON library to be hosted at Yahoo as well, but I can't find it. I mean, Crockford works there.
Thomas from cdnjs.com here with two quick reasons why there is no minified version.
1) The script might not function as the author intended using the minification method we choose.
2) As a security step, we ensure that every file's checksum matches the original author's hosted file, so community-submitted updates cannot contain malformed minified code.
So for now, that leaves us hosting Crockford's un-minified version:
https://github.com/douglascrockford/JSON-js/raw/master/json2.js
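For what it's worth, that checksum comparison boils down to something like this minimal Node sketch (the file paths are made up for illustration; this is not cdnjs's actual tooling):
var crypto = require('crypto');
var fs = require('fs');

// Hash a file so it can be compared against the author's hosted copy.
function sha256(path) {
    return crypto.createHash('sha256').update(fs.readFileSync(path)).digest('hex');
}

// 'submitted/json2.js' and 'upstream/json2.js' are placeholder paths.
if (sha256('submitted/json2.js') !== sha256('upstream/json2.js')) {
    throw new Error('Checksum mismatch: submitted file differs from the author\'s copy');
}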
There is now.
Douglas Crockford recently put JSON2 on GitHub; this URL will always link to the most recent version.
Edit:
It's not a good idea to use this method; see my comment below.
I'm working for a web customer who read on a blog that Google's PageSpeed Insights score matters. So now they want a high PageSpeed Insights rating "because it's good for Google". While I think this is absurd, I don't have a choice but to comply.
The problem is that this customer is using an external JavaScript file provided by PureChat. The script is not compressed in any way, leading PageSpeed Insights to complain that it is not gzipped or deflated.
Is there a way to force PureChat to be served deflated?
The problem with doing that is that PureChat most likely updates their scripts quite often, meaning that if you were to use things like far-future Expires headers, the customer's cache could still be running an older version, which could contain harmful or exploitable bugs. So, as you said, I really would NOT recommend doing this.
Another problem I find a number of people complain about is Google Analytics affecting their PageSpeed score, even though, ironically, that script comes from Google itself. It can be fixed using a cron job, and you could do something similar by loading the JavaScript locally.
See the article here: http://diywpblog.com/leverage-browser-cache-optimize-google-analytics/ It describes how to do this for Google Analytics; you could use the same approach for PureChat, I imagine.
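The core of that approach is just a scheduled job that re-downloads the vendor script to your own server, where it can then be served gzipped with proper cache headers. A minimal Node sketch, meant to be run from cron; the URL and output path are placeholders, not PureChat's actual endpoint:
var https = require('https');
var fs = require('fs');

// Placeholder URL; substitute the actual script the vendor serves.
var SOURCE = 'https://example.com/vendor/widget.js';
var DEST = '/var/www/site/js/vendor-widget.js'; // local path your pages reference

https.get(SOURCE, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
        // Only overwrite the local copy on a successful download.
        if (res.statusCode === 200) {
            fs.writeFileSync(DEST, body);
        }
    });
});
The trade-off, as noted above, is that you now own the update cycle for that script.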
I'm using HTML5 Boilerplate, and saw that it separates jQuery from main.js and plugins.js. I understand that the logic behind this is that a lot of sites use jQuery, and if you use a CDN there is a big chance that the user will already have it cached.
But since my use of HTML5 BP includes a build step, I could also concatenate and minify jQuery with all my other scripts. So:
Separate jQuery, preferably load it from a CDN, or
Concat and minify jQuery with all other scripts
From what I understand, HTML5 BP doesn't consider the concat/minify option since the boilerplate doesn't include a build step. But since it is an option for me: is there any hard data about which option is best here, or about what the real-world differences in performance are?
There was a large discussion of this (and other related issues) in a pull request last year. You should check out that thread. Lots of detailed discussion.
The reason H5BP uses that set-up is that it's the best default solution. If a user of H5BP doesn't do anything else to their set-up, they're guaranteed to get a geographically optimized version of jQuery served to their users from a CDN source with the highest possible chance of hitting the cache lottery.
Beyond a default solution:
If you've got a global audience and you're going to serve your static assets from a CDN, then it would probably be slightly faster to serve a single file from that CDN.
If you've got a performance tuned static assets server and you've got a local audience, then you might be faster serving a single file from a static assets server even if it's not on a CDN.
Really, though, testing your set-up against your own geographical audience is the best way to go. It's the only way to answer this question for your site or application.
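If you want a quick-and-dirty way to compare the two set-ups in front of your own audience, a sketch like this can log rough load times (the URLs and paths are examples; swap in your own CDN copy and your concatenated bundle):
// Load a script and report how long it took; crude, but enough for a rough comparison.
// (onload won't fire in very old IE, but this is only for gathering ballpark numbers.)
function timeScript(url, label) {
    var start = new Date().getTime();
    var s = document.createElement('script');
    s.src = url;
    s.onload = function () {
        console.log(label + ': ' + (new Date().getTime() - start) + ' ms');
    };
    document.getElementsByTagName('head')[0].appendChild(s);
}

timeScript('//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js', 'jQuery from CDN');
timeScript('/js/site-bundle.min.js', 'concatenated bundle'); // example path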
Pros:
- Performance is a little better, because users may already have it cached from other sites.
Cons:
- If the CDN is down, your site is broken.
- If all your files are concatenated into one, you will have only one request on your domain, and that will always be better (IMHO) than a separate request to an external domain.
I personally will always choose to concatenate.
But in any case, don't over-optimize. If your site is not big, choose whichever you prefer.
(And I think that if you test both solutions, the results will not be extremely different.)
We are currently developing an ASP.NET MVC application which will be deployed on a corporate intranet, with a slightly modified customer-facing version available on the public internet.
We're making use of a number of external JavaScript libraries (e.g. jQuery), and a discussion has come up about how to reference them: should we reference them from an external source (e.g. via the Google-hosted jQuery), or keep our own copy locally and reference it from there?
The project manager is a little concerned about having a 'dependency' on Google (or whoever) if we reference from there, and thinks that having our own copy of the library makes us more independent. On the other hand, I have heard there are a number of advantages to letting someone else host the library: for example, they handle versioning for us, and Google isn't going anywhere anytime soon...
(for the purpose of the discussion assume the intranet we're hosting on has external access - obviously if it turns out it doesn't the decision is very much made for us!)
So. Does this matter? And if so, what should we do and why?
(I appreciate this is subjective, but it would be very useful to get advice from anyone with experience or thoughts on the matter. Not sure if this is a candidate for community wiki or not; let me know if I should have put it there and I'll know for the future!)
Thanks :)
Is the application critical to your internal users? If the 'internet is down', will your business suffer because internal users cannot access the applications? It's really more about risk than anything else.
Besides, if there is a change, breaking or otherwise, would you not want to manage the 'upgrade' yourself?
Usually the reason for using a hosted library is performance. A browser will only download a limited number of files per host, so using a hosted library means those files come from a different host and therefore download in parallel with your other files.
The second reason is that those files are usually compressed and have their cache headers set correctly. And those files are usually served from a CDN, which means your users will download them from the host closest to them.
But all of those reasons are not so important in an intranet environment.
You can do something like:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script type="text/javascript">
if ( !window.jQuery ) {
    // The CDN copy failed to load, so fall back to a local copy.
    var elmScript = document.createElement('script');
    elmScript.src = 'jQuery.js'; // path to your local copy of jQuery
    elmScript.type = 'text/javascript';
    document.getElementsByTagName('head')[0].appendChild( elmScript );
}
</script>
Notwithstanding the complexity issues that arise from having your code scattered over multiple sites, some of which you don't control, there is also a slight performance penalty to pay for pulling content from multiple domains (e.g. an extra DNS lookup). For a public web site, where a lot of users will be first-time visitors, there may be some benefit to having the code cached already, but otherwise I don't really see the point.
The main (only?) reason to do this would be performance, as @Kau-Boy has already mentioned. jQuery is a relatively small library (for what it gives you) and it will be cached locally by your external users' browsers/proxies/ISPs. So it doesn't really give you anything except a small latency gain.
If you find yourself adding many external JavaScript files, then you should perhaps reconsider your overall design strategy. Choose a good general-purpose library (I personally consider jQuery to be unbeatable - this is the subjective part) and learn it inside out. You shouldn't have to add many more libraries apart from the odd plugin, which should be included on a page-by-page basis anyway.
Some really good answers from the others on here by the way.
There are some tutorials which suggest using the jQuery path hosted by Google, e.g.:
<script type="text/javascript"
src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
Is that safe to use in our projects?
Aren't we making ourselves dependent, since we can't be sure it will still be there a year from now or beyond?
The reason I ask is that some people argue in favor of doing it.
From the documentation:
Google works directly with the key stakeholders for each library effort and accepts the latest stable versions as they are released. Once we host a release of a given library, we are committed to hosting that release indefinitely.
It seems pretty low-risk to me. It's more likely to already be in the user's cache, and it's served with the proper gzip and caching headers. It also won't eat up an HTTP request to your domain on browsers that only allow two simultaneous downloads per domain (e.g. IE6 and IE7).
I have an article for you that explains the benefits and drawbacks of using this method: Here
I really doubt that Google will put this up for people to use and then all of a sudden take it down and cause issues for thousands of websites or more. It's not like they will lose their domain or run out of bandwidth. The only thing I think you should worry about is whether the end users of your sites can access Google. Personally, I just host the file on my own server anyway.
The short answer is yes, and I agree that if that include doesn't work, it is probably a sign of a much bigger problem. My general rule of thumb is that for all public-facing apps I use that include, whereas for internal apps (which theoretically could be used without a connection to the outside world) I reference a local copy instead.
There will always be a chance that it won't be there after a year, the same as with Gmail, Google Docs, google.com...
For jQuery alone, I do not see a reason to use Google's copy, as the file is small and the impact on your server and bandwidth will not be much. But jQuery UI may be worth serving from Google's source.
It's pretty 'safe', as the other guys mentioned. You probably take a bit of load off your own server too. Even SO itself uses it.
But to be safe, always have a fallback plan and have a local copy, just in case.
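A common pattern for that fallback (the local path here is just an example) is to test for jQuery right after the CDN script tag and only write out the local copy if the CDN load failed:
<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<script type="text/javascript">
    // If the CDN failed, window.jQuery is undefined, so load the local copy instead.
    window.jQuery || document.write('<script src="/js/jquery-1.3.2.min.js"><\/script>');
</script>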
There's not really much risk involved if you think about it. Suppose Google ceases to exist in a year (chuckle); it wouldn't take you more than a couple of minutes to replace the google.load command in your common file with a reference to your own local jQuery copy.
The worst-case scenario is that, in the improbable event of Google's demise, your hover effects stop working for five minutes :)
A similar question: Where do you include the jQuery library from? Google JSAPI? CDN?
Because of the answers from that question, I have started using:
<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
I have it running on quite a number of sites. The only issue I've had is that some firewalls start blocking a site if there are too many requests (or at least that's my guess), which happens on higher-traffic sites that are all used from one location.
I am trying to compare the performance of several JavaScript libraries. Although measuring transaction times helps to determine the working performance of a library, it doesn't account for the time required to download and initialize the individual libraries. I'm looking for suggestions on the best method of determining load time, other than using tools such as Firebug, etc. I would like to be able to set up a controlled environment where a page could be loaded n times, while capturing the start and end times. Should the JavaScript library contents be included in the page rather than as an include file, or is there a better way?
Reading this article by John Resig on JavaScript Benchmark Quality before you start anything may help you out.
After that, I would suggest that you try requesting the JavaScript from your server and timing how long eval(responseJS); takes. That way, you are only timing how long the library takes to load, rather than that plus the time it takes to download from the server.
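A rough sketch of that approach, assuming the library file is served from your own domain (the path is an example):
var xhr = new XMLHttpRequest();
xhr.open('GET', '/js/library-under-test.js', true); // example path on your own server
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        var responseJS = xhr.responseText;
        var start = new Date().getTime();
        eval(responseJS); // time only the parse/initialization of the library
        var end = new Date().getTime();
        console.log('Library took ' + (end - start) + ' ms to initialize');
    }
};
xhr.send();
Run it n times and average the results to smooth out timer resolution and JIT warm-up effects.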
The libraries should always be in an external file, included via a script tag, either on their own or with the site's own scripts rolled in. Minified and packed files will have a smaller download size. Delivery via a CDN is optimal as well, as the CDN will have it cached. Many of the popular frameworks are available over Google's CDN.
You must also account not only for the library, but for the application using the library. The quality of the JS in the libraries is (typically) top notch, but what about the quality of the code tapping into those libraries, or even the code of plugins which may not be developed by the library authors? You also have to look at which browser is being used. As much as we hate it, most of these cross-browser libraries are optimized for best performance in Internet Explorer, because it retains 85+% market share.
The performance of any library is really a trade-off: you have to decide what is acceptable in order to get your application to do whatever it is you want it to do.