A JS library like jQuery can be linked directly from another site (e.g. Google). Usually I use
<script type="text/javascript" src="/js/jQuery.min.js"></script>
But I can use
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
or similar.
I like to keep full control over my site, so I use the first way. But using Google or another host has some advantages (e.g. decreased latency, increased parallelism, better caching).
Both have advantages and disadvantages.
Which should I use? What do you use, and why?
Please let me know your opinion.
Thank you
I think that it depends on the audience of your website.
If your site is public facing and people will access it primarily or exclusively over the internet, then you will benefit from lower bandwidth utilization, faster responses, and caching, since the likelihood that the file has already been referenced and loaded from another site is high.
If your site is internal, on an intranet, you may run into issues if people do not have internet access, and you're also going to waste bandwidth by sending everyone out over the internet to fetch a file you could host locally.
I use Google where possible for performance reasons, but I also check in a local copy in case I need to work on the site when I am offline, e.g., on an airplane, or at a remote location with no internet access.
Don't forget that if you use a copy from Google (or whoever), you have to guard against the possibility that they might move or change the file, or that their server might be down.
If your site needs a specific javascript library, then you should download it and serve it up yourself. If your income depends on that file, the last thing you want is to rely on another site to provide it.
Related
For a modern website, what is the best way to load javascript libraries (in my case jQuery and D3)?
Let's assume:
everyone accesses the site using HTTP/2
self hosting means hosting on GitHub (i.e. for bl.ocks)
referencing could mean:
Google for jQuery and cdnjs or D3.org for D3
cdnjs for jQuery and D3
cdnjs for jQuery and D3.org for D3
Since everyone is using HTTP/2, the parallelism argument no longer applies (right?).
In order to maximize the chance of a cache hit, I would assume Google is the best bet for jQuery, but they do not provide D3, so I would have to use cdnjs or D3.org for that. Is there an advantage to using cdnjs for both?
EDIT: Let me say about the audience that it is global, so ideally a solution would work well from e.g. Africa and China. The latter is important here, because China blocks access to Google servers, meaning a local fallback would be needed.
The audience is also not limited to D3 designers / bl.ocks users (in case that would be relevant to the cache hit chances).
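For concreteness, the kind of local fallback I have in mind is roughly the following (the CDN URLs are only examples, and the local paths are placeholders for wherever the self-hosted copies would end up):
<!-- CDN copies first; write in a local copy only if the CDN is unreachable (e.g. from China) -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
<script>window.jQuery || document.write("<script src='/js/jquery.min.js'>\x3C/script>")</script>
<script src="https://d3js.org/d3.v4.min.js"></script>
<script>window.d3 || document.write("<script src='/js/d3.min.js'>\x3C/script>")</script>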
Using a CDN version might mean it's already cached, so you save a tiny amount by not downloading another copy of it. However, if it's not cached, it can actually slow down your site, because you need to make a connection to the CDN site: a DNS lookup, a TCP 3-way handshake, dealing with TCP slow start (meaning the connection is initially slow), a TLS setup (assuming it's over HTTPS), and finally the request for the resource. After that you never use that CDN for anything else, so all of that setup cost is wasted.
This cost is doubled up for two different CDNs.
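If you do end up referencing a third-party CDN anyway, a resource hint lets the browser start the DNS/TCP/TLS setup early, which softens (but does not remove) that cost. A minimal sketch, with the host name purely as an example:
<!-- start connecting to the CDN host before the script tag is reached -->
<link rel="preconnect" href="https://ajax.googleapis.com">
<!-- older browsers may only support the weaker dns-prefetch hint -->
<link rel="dns-prefetch" href="https://ajax.googleapis.com">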
Personally, if it's just one or two libraries, I self-host precisely because of those setup costs. Even for HTTP/1.1.
If you really want the benefits of a CDN, then consider putting the whole site behind a CDN like Cloudflare rather than just loading one or two libraries from one. That might not be a bad idea for the global audience you describe.
We use an external service (Monetate) to serve JS to our site so that we can make ad hoc presentation-layer updates without going through a full site re-deploy, which in our case is a time-consuming, monolithic process we can only afford to do about once per month.
However, users who use adblockers in the browser do not see some of these presentation-layer updates. This can negatively affect their experience of the site as we sometimes include time-sensitive promotions that those users may not be aware of.
To work around this, I was thinking of duplicating the JavaScript file that Monetate serves and hosting it on infrastructure separate from the site. That way, if we needed to make updates to it, we could do so as needed without doing a full site re-deploy.
However, I'm wondering if there is some way to work around the blocking of the Monetate JS file and somehow execute the remote Monetate JS file from our own JS code in such a way that adblockers would not be able to block it. That would avoid the need to duplicate the file.
If that file is blocked by adblockers, chances are that it is used to serve ads. In fact, your description of time-sensitive promotions sounds an awful lot like ads, just not for an external provider, but for your own site.
Since adblockers usually match on the URL, the easiest solution would indeed be to rehost this file, if possible under a different name. Instead of hosting a static copy, you can also implement a simple proxy with the equivalent of <?php readfile('http://monetate.com/file.js'); or Apache's mod_rewrite. While this will increase load times and can fail if the remote host goes down, it means the client will always get the newest version of the file.
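If your stack happens to be Node rather than PHP, the same proxy idea is only a few lines. This is just a rough sketch with a placeholder route and upstream URL (not anything Monetate documents):
// Sketch: relay the vendor script under a first-party URL so filter lists
// matching the vendor's domain never see it (route and URL are placeholders)
const express = require('express');
const https = require('https');

const app = express();

app.get('/assets/site-extras.js', (req, res) => {
  https.get('https://example-vendor.com/file.js', upstream => {
    res.set('Content-Type', 'application/javascript');
    upstream.pipe(res); // stream the upstream response through
  }).on('error', () => res.status(502).end()); // fail soft if the vendor is down
});

app.listen(3000);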
Apart from using a different URL, there is no client-side solution: adblockers run in the browser (or as an extension of it), and you cannot modify that code, for good reasons.
Beware that adblockers may decide to block your URL too, if the script is indeed used to serve ads.
Monetate is probably blacklisted in Adblock, so there's nothing you can do about that.
I think that self-hosting the Monetate script would mean keeping it updated by checking for new versions from time to time (it could become a pain to maintain).
A good solution in my opinion is to inform your users about that limitation with a clear message.
Or, you can get in touch with Monetate and ask for a solution.
I've heard all the arguments in favour of using a CDN like Google's to host JavaScript libraries like jQuery and Prototype for my web application: it's faster, saves bandwidth, permits parallel loading of scripts, and so on. But I recently came across the following comment in Douglas Crockford's json2.js script:
USE YOUR OWN COPY. IT IS EXTREMELY UNWISE TO LOAD CODE FROM SERVERS YOU DO NOT CONTROL.
I'm curious what his argument might be behind this assertion, and whether it's specifically targeted at users of public CDNs like Google's, or something else?
Assuming he's talking about professionally hosted CDNs like Google, then the best bet is to do this:
<!-- Grab Google CDN's jQuery, with a protocol relative URL; fall back to local if necessary -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.5.1/jquery.js"></script>
<script>window.jQuery || document.write("<script src='js/libs/jquery-1.5.1.min.js'>\x3C/script>")</script>
(taken from http://html5boilerplate.com/)
That way, you get all the benefits, without the risk of your website breaking if Google's CDN goes down.
But, he said:
USE YOUR OWN COPY. IT IS EXTREMELY UNWISE TO LOAD CODE FROM SERVERS YOU DO NOT CONTROL.
I don't actually think he's talking about CDNs. I think he's just saying "don't hotlink scripts from random websites".
You wouldn't want to do this because the website might change where the script is located, or even change the script. A CDN would never do this.
Basically, it's a matter of trust. You need to trust the host to not change anything in the hosted file and you need to trust in the availability of the file. Can you be absolutely sure that the URL will not change? Are you comfortable with the fact that any downtime of their servers results in downtime of your application?
The reason is that if the server you depend on goes down and yours doesn't, the experience of your site suffers. There are ways to have a fallback in place, so that if jQuery or some other script doesn't load, you can use a copy you host as a backup.
The other time you shouldn't use it is in an intranet application scenario, where bandwidth is typically not an issue.
A way to create a fallback from Jon Galloway: http://weblogs.asp.net/jgalloway/archive/2010/01/21/using-cdn-hosted-jquery-with-a-local-fall-back-copy.aspx
<script type="text/javascript" src="http://ajax.microsoft.com/ajax/jquery/jquery-1.3.2.min.js"></script>
<script type="text/javascript">
if (typeof jQuery == 'undefined')
{
document.write(unescape("%3Cscript src='/Scripts/jquery-1.3.2.min.js' type='text/javascript'%3E%3C/script%3E"));
}
</script>
If a public server's JS is compromised (availability-, security- or bug-wise), then the visitors to your site will be affected and will likely blame you. On the other hand, what are the chances of Google's CDN being compromised compared to some smaller company's server? You also lose out on all the caching advantages a CDN gives you when you host locally.
jQuery is open source. If you've made a modification to the internals, then obviously you can't host it off another person's server. In general, loading scripts from other people's servers is a security risk: they could change the script without ever telling you, and now you're linking it into your pages.
It's a matter of trust: do you trust whichever CDN you pick to stay secure and never serve a malicious script at the location of the script you want?
While some of these other answers are certainly valid, we have a slightly different/additional reason.
We have a process that, on first request, evaluates which static content is required for any given page. In the background, this static content (JS, CSS) is merged and minified into a single file (one for JS, one for CSS), and all future requests are then served that single file instead of multiple files.
While we could, in theory, exclude files that are available on a CDN and use the CDN for those, it's actually easier not to (we'd have to add code to handle the exclusions), and in some cases it's faster than using a CDN.
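As a rough illustration of the bundling idea only (this is not our actual code; the file names are made up and the minification step is left out):
// Build one bundle per page on first request and cache it (sketch only)
const fs = require('fs');
const path = require('path');

const bundleCache = new Map();

function getBundle(scriptPaths) {
  const key = scriptPaths.join('|');
  if (!bundleCache.has(key)) {
    const merged = scriptPaths
      .map(p => fs.readFileSync(path.resolve(p), 'utf8'))
      .join(';\n'); // a real version would also run a minifier over the result
    bundleCache.set(key, merged);
  }
  return bundleCache.get(key);
}

// e.g. getBundle(['js/jquery.js', 'js/plugins.js', 'js/page-home.js']);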
In addition to all the other answers:
You need to worry about serving your pages over SSL (i.e. HTTPS) while your JS is loaded over plain HTTP from a different source. Browsers can complain (sometimes in an alarming way) about mixing secure and insecure items.
In addition, people browsing with the NoScript extension (or similar) need to allow JS to run from multiple different sources. That's not a big deal if you are using a major CDN (chances are they'll have allowed it at some point in the past), but you then need to worry that they are allowing only SOME of your JS.
Modern answer: yes, availability
Other people's servers (regardless of a public CDN or some random nondescript site) might go down, breaking your app's availability.
The CDN might also be compromised, causing your app to execute harmful code, but this issue can be mitigated with Subresource Integrity (SRI).
If you host it on your own server that you control, it would become unavailable at the same time your entire app becomes unavailable, rather than at some arbitrary time under someone else's control.
Using a public CDN has tradeoffs and might be worth it in some cases (for example, to save bandwidth).
<!-- best -->
<script src="your_own_server/framework.js"></script>
<!-- second-best (using public CDN) -->
<script src="https://public-cdn.example/framework.js">
integrity="sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"
crossorigin="anonymous"></script>
<!-- do not use -->
<script src="https://random-server-without-cors.example/framework.js"></script>
We are currently developing an ASP.NET MVC application which will be deployed on a corporate intranet, with a slightly-modified customer facing version available on the public internet.
We're making use of a number of external javascript libraries (e.g. jQuery) and a discussion has come up regarding referencing the libraries - should we reference them from an external source (e.g. via the Google load jQuery method) or keep our own version locally and reference from there?
The project manager is a little concerned about having a 'dependency' on Google (or whoever) if we reference from there, and thinks that having our own copy of the library makes us more independent. On the other hand, I have heard there are a number of advantages to letting someone else host the library - for example, they handle versioning for us, Google aren't going anywhere anytime soon...
(for the purpose of the discussion assume the intranet we're hosting on has external access - obviously if it turns out it doesn't the decision is very much made for us!)
So. Does this matter? And if so, what should we do and why?
(I appreciate this is subjective - but it would be very useful to get advice from anyone with experience or thoughts on the matter. Not sure if this is a candidate for community wiki or not, let me know if I should have put it there and I'll know for future!)
Thanks :)
Is the application critical to your internal users? If the 'internet is down', will your business suffer because internal users cannot access the application? It's really more about risk than anything else.
Besides, if there is a change, breaking or otherwise, would you not want to manage the 'upgrade' yourself?
Usually the reason for using a hosted library is performance. A browser will only download a limited number of files per host, so using a hosted library loads those files from a different host and therefore in parallel with your other files.
The second reason is that those files are usually compressed and their cache headers are set correctly. They are also usually stored on a CDN, which means your users download the file from the host closest to them.
But none of those reasons matter much in an intranet environment.
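If you self-host on the intranet, you can still recreate most of the compression and caching benefit yourself. A minimal sketch, shown with Node/Express only to make the idea concrete (an IIS/ASP.NET setup has equivalent configuration):
// Sketch: serve self-hosted libraries gzipped and with long cache lifetimes
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression()); // gzip responses

// cache library files aggressively; bump the file name when you upgrade versions
app.use('/js', express.static('js', { maxAge: '365d', immutable: true }));

app.listen(3000);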
You can do something like:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script type="text/javascript">
if (window.$) {
    // jQuery loaded from the CDN; code that depends on it can go here
} else {
    // the CDN copy failed; load the local copy instead
    var elmScript = document.createElement('script');
    elmScript.src = 'jQuery.js'; // path to your local jQuery
    elmScript.type = 'text/javascript';
    document.getElementsByTagName('head')[0].appendChild(elmScript);
}
</script>
Notwithstanding the complexity issues that arise from having your code scattered over multiple sites, some of which you don't control, there is also a slight performance penalty for pulling content from multiple domains (e.g. an extra DNS lookup). For a public web site, where a lot of users will be first-timers, there may be some benefit to having the code cached already, but otherwise I don't really see the point.
The main (only?) reason to do this would be performance, as #Kau-Boy has already mentioned. jQuery is a relatively small library (for what it gives you), and it will be cached locally by your external users' browsers/proxies/ISPs anyway, so it doesn't really give you anything except a small latency gain.
If you find yourself adding many external Javascript files then you should perhaps reconsider your overall design strategy. Choose a good general purpose library (I personally consider jQuery to be unbeatable - this is the subjective part) and learn it inside-out. You shouldn't have to add many more libraries apart from the odd plugin, which should be included on a page-by-page basis anyway.
Some really good answers from the others on here by the way.
There are some tutorials that suggest using the jQuery path hosted by Google, e.g.:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
Is that safe to use in our projects?
Doesn't that make us dependent, since we are not sure it will still be there after a year or beyond?
The reason I have asked this question is that there are some people who argue in favour of it.
From the documentation:
Google works directly with the key stake holders for each library effort and accepts the latest stable versions as they are released. Once we host a release of a given library, we are committed to hosting that release indefinitely.
It seems pretty low-risk to me. The file is also more likely to already be in the user's cache, and it is served with the proper gzip and caching headers. It also won't eat up an HTTP request to your domain on browsers that only allow two concurrent downloads per domain (e.g. IE6 and IE7).
I have an article for you that explains the pros and cons of using this method: Here
I really doubt that Google will put this up for people to use and then all of a sudden take it down and cause issues for thousands or more websites. It's not as if they will lose their domain or run out of bandwidth. The only issue I think you should worry about is whether the end users of your sites can access Google. Personally, I just host the file on my own server anyway.
The short answer is yes, and I agree that if that include doesn't work, it is probably a sign of a much bigger problem. My general rule of thumb is that for all public-facing apps I use the hosted include, whereas for internal apps (which theoretically could be used without a connection to the outside world) I include a local copy instead.
There will always be a chance that it will not be there after a year, the same as Gmail, Google Docs, google.com...
For jQuery alone, I do not see a reason to use the Google source: the file is small, and the impact on your server and bandwidth will not be much. But jQuery UI may be worth loading from Google's source.
It's pretty 'safe', as the others have mentioned. You probably ease a bit of the load on your own server too. Even SO itself is using it.
But to be safe, always have a fallback plan and have a local copy, just in case.
There's not really much risk involved if you think about it. Suppose Google ceases to exist in a year (chuckle); it wouldn't take you more than a couple of minutes to replace the google.load command in your common file with a reference to your own local jQuery copy.
The worst-case scenario is that, in the improbable event of Google's demise, your hover effects stop working for five minutes :)
A similar question: Where do you include the jQuery library from? Google JSAPI? CDN?
Because of the answers from that question, I have started using:
<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
I have it running on quite a number of sites. The only issue I've had is that some firewalls start blocking a site if there are too many requests (or at least that is my guess), which is the case on higher-traffic sites used heavily from one location.