I have had this question for a while and am surprised that I have yet to come across a good, complete answer to it.
The question is essentially this:
When it comes to loading JS files, in what situations should you load them from the web (e.g. a CDN) if available versus serving them yourself? Which approach typically gives the lowest latency?
E.g.
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"></script>
vs.
<script src="js/jquery-1-11-3.min.js"></script>
Complete answer: Both.
Loading it off of the web will benefit you in a couple of ways:
1) There is a limit to the maximum number of simultaneous HTTP connections a browser will open, and this limit is per domain. So reaching out to Google's servers for jQuery will not prevent you from loading your CSS/images from your own domain.
2) It's very likely that the user will already have that file cached, so at worst they will get an HTTP 304 Not Modified response and not have to download the file again.
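For illustration, a revalidation of an already-cached copy looks roughly like this (the ETag value is made up):

GET /ajax/libs/jquery/1.11.3/jquery.min.js HTTP/1.1
Host: ajax.googleapis.com
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
ETag: "abc123"

The file body is not resent; the browser reuses its cached copy. In practice, CDN libraries are usually served with long cache lifetimes, so the browser often skips even this revalidation request.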
Now, with that said, sometimes the CDN will be down, or network issues will otherwise stop that file from loading. When that happens, you need a fallback, which you can set up like so:
<script>
if (typeof jQuery == 'undefined') {
document.write(unescape("%3Cscript src='/js/jquery-1.11.3.min.js' type='text/javascript'%3E%3C/script%3E"));
}
</script>
Put this tag AFTER the script tag that loads jQuery from the CDN. If the CDN load failed, jQuery will be undefined and the snippet will load it from the local source (which should be the same version you requested from the CDN). If the load from the CDN worked, this block is skipped.
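The same fallback can also be written without the escaped document.write string; a sketch, using the same illustrative path and version as above:

<script>
if (typeof jQuery == 'undefined') {
    // CDN load failed or was blocked; fall back to the copy on our own server
    var s = document.createElement('script');
    s.src = '/js/jquery-1.11.3.min.js';
    document.getElementsByTagName('head')[0].appendChild(s);
}
</script>

Be aware that a script appended this way loads asynchronously, so unlike the document.write version it does not guarantee jQuery is ready for the very next script tag on the page.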
Pros of including from your own server:
Technically, Google's servers can be down, and then your website won't load correctly.
Some people don't trust Google and have it blocked, scripts blocked, etc. They wouldn't want the file to be included from Google directly either.
Connections to Google can have higher latency. If your audience is in your own country and you have a good provider, your connection can be faster than Google's.
Cons:
higher web-server traffic
more connections
higher CPU impact
You have to decide for yourself which one is better for you. For smaller sites I'd go for the locally stored file.
Related
I'm guessing this is impossible, but maybe someone has thought of something smart already:
Would it be possible to load jQuery using the version that's in the browser cache (from some major CDN, say) if it's cached, and otherwise load it from my site? This way, users would never make a request to the CDN because of reaching our site, but the site would stay fairly fast for most users.
Ideally one would do <script src="./jquery.min.js" same-as-cached="https://thecdn/jquery.min.js" integrity="sha-…"/>, but since AFAIK nothing like that exists, is there a way to get the script from cache without making a request?
Is there any way to check on a client whether jQuery was loaded before from a CDN? I mean code like this:
if (jQuery.isLoadedFromCDN)
    // do nothing
else
    // load from internal resource
NOTE that I don't want to check that jQuery is loaded, but specifically if it was loaded from a CDN.
The context is that I have an internal LAN web app that uses jQuery. Loading jQuery from the LAN is definitely faster, but if it was already downloaded from a CDN before (which is probably the case) then I just want to use that; otherwise I want to get it from our internal resource, not the CDN. The bandwidth saving is definitely not huge, but I am more curious to know whether that's technically possible.
I'm guessing you're checking to see if jQuery was loaded from the Google Libraries API?
Your browser will cache jQuery whether it's from the CDN or from your LAN. If it's already in the cache from a previous retrieval from the CDN, it'll load faster than from your LAN. If it's not already in your cache from either a previous visit to your site, or another site using the same CDN, it'll only need to load once: subsequent visits will load from the cache.
Splitting the URL for jQuery between the CDN and your LAN will just cause two copies to get cached. Let the browser cache do what it was meant to do. :)
This should do it:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.js"></script>
<script>!window.jQuery && document.write(unescape('%3Cscript src="js/libs/jquery-1.4.4.js"%3E%3C/script%3E'))</script>
You need to adapt the path for the fallback, which should be hosted on your own domain.
To my knowledge, that's not really possible.
Two things come to mind, however, that can help you:
Use ETags: identify your resources on the web-server side. This way, your browser does that work for you: it identifies whether a resource is already loaded, regardless of the domain it was loaded from.
If you really want to know which domain jQuery was loaded from, you can do dummy AJAX requests. You can only do AJAX requests to the same domain your library was loaded from (except of course when doing a JSONP request). So if your jQuery was loaded from your LAN resource, you will get replies to AJAX requests to your LAN web server, and that rules out the CDN.
Hope it helps,
Bart
Is there any difference between linking a stylesheet or script on our website from EXTERNAL servers and uploading it to our own server and linking to it? What are the benefits of uploading it to our server?
Usually the external ones are served by a CDN server, which means your users probably already have them cached (for example jQuery or another popular library).
If you store something on your own server, your server has to handle more data: it sends your scripts and CSS along with every request. That adds extra work for your server, which may mean extra costs or slower responses.
The advantage of linking to external servers is less load on your own server. Popular frameworks like jQuery are also available from servers all over the world, so if your website is hosted in Europe and a user is based in the USA, he will not have to wait for the file to be delivered across the world; he will most likely get it from a location closer to him. Less traffic, less load on your server, and faster response times.
Yes, there are several differences:
If you include a stylesheet or script from an external source, it can add to the page load time.
If you include a stylesheet or script from an external source, it may fail to load at all because of a firewall (I have faced this situation).
If the files change in the future, those changes will also affect your site.
There is a security issue: whoever controls the external file can add code that tracks your data.
So I always recommend downloading these files and serving them from your own server instead of including stylesheets or scripts from an external source.
I've seen a couple of pages load their JavaScript files into a page via new Image().src = '';
<script type="text/javascript">
new Image().src = "XXXX\/js\/jquery-1.7.2.min.js";
</script>
I was just wondering the benefits or purpose of this.
It's a quick and dirty way of initiating an HTTP request (as the comments on the question suggest).
There may be a minor advantage gained by initiating the download at the top of the page and then including <script src='the-same-file.js'></script> at the bottom of the page so that the file can be loaded from the browser cache.
This might allow the latency of the download to be parallelized with a parsing task. For example, the download initiated in the head might run while the body is still being parsed.
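A sketch of the pattern being described (the path is illustrative; whether the second request is actually satisfied from cache depends on the response's cache headers):

<head>
<script>
// warm the HTTP cache early using the Image trick
new Image().src = "/js/jquery-1.7.2.min.js";
</script>
</head>
<body>
...
<!-- later: ideally served from the browser cache -->
<script src="/js/jquery-1.7.2.min.js"></script>
</body>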
Why not just reference the file in the head using the src attribute?
If neither [the defer or async] attribute is present, then the script is fetched and
executed immediately, before the user agent continues parsing the
page.
Source (suggested reading)
In other words, this method attempts to allow the browser to download the file without incurring the blocking behavior until later in the process.
However
If this is really necessary, I would consider the defer attribute which is intended for such purposes rather than the new Image() hack.
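A minimal sketch of that alternative (the path is illustrative):

<script defer src="/js/jquery-1.7.2.min.js"></script>

The browser starts fetching the file as soon as it sees the tag but defers execution until the document has been parsed, which achieves the same download-early, run-later goal without a second request.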
This "optimization" could backfire depending on cache headers. You could end up making two HTTP requests or even downloading the file twice.
"In the Wild"
A quick investigation of several major sites (Google search, Gmail, Twitter, Facebook, Netflix) shows that this technique is not used to fetch JavaScript files and is used very sparingly overall.
For example, Facebook appears to use it not for caching/performance but for tracking purposes when the site is (potentially maliciously) loaded into a frameset. By creating an Image instance and setting the source, they initiate an HTTP request to a page which tracks the clickjacking attempt.
This is an isolated case; under normal circumstances this script will never be run.
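As a rough illustration of that pattern (not Facebook's actual code; the endpoint is made up):

<script>
// if this page has been loaded inside someone else's frame...
if (window.top !== window.self) {
    // ...fire a beacon request so the attempt can be logged server-side
    new Image().src = "/log/framing-attempt?from=" + encodeURIComponent(document.referrer);
}
</script>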
I have a website that has grown somewhat large and is built on a super-restrictive platform (SBI). There you have to follow their file structure, put everything in an appropriate folder, and then upload each and every file through their interface manually. I have a cool HTML5 template and some JavaScript with a lot of little files and images, so it was just way easier to upload all this stuff to my OTHER DOMAIN hosted by Hostgator using FileZilla and then just refer the CSS and JS files from my SBI site to their location on my Hostgator domain.
Are there any potential issues with this method?
The reason I am asking is that yesterday I came across Google's article on serving resources from a consistent URL: https://developers.google.com/speed/docs/best-practices/payload#duplicate_resources However, I might be misunderstanding what it means. When I test my actual URL with Google's PageSpeed Insights (https://developers.google.com/speed/pagespeed/insights) it advises me to serve resources from a consistent URL, but in the details it doesn't complain about my CSS and JS files; it complains about Facebook only, like this:
Suggestions for this page:
The following resources have identical contents, but are served from different URLs. Serve these resources from a consistent URL to save 1 request(s) and 24.3KiB.
http://static.ak.facebook.com/.../xd_arbiter.php?...
https://s-static.ak.facebook.com/.../xd_arbiter.php?...
I appreciate you reading this. Thanks in advance!
Serving static content from a different domain is common practice; I don't see any issues there. It's as safe and reliable as the server you are using to serve it.
The Facebook warning could mean you are loading the same FB API script twice, or it may just be some black magic done by the FB devs.
You should not have any problems with hosting your files on a different site. Your users may experience a slightly slower page load because their machine has to do more DNS lookups; on the other hand, most web browsers only download a maximum of 2 files from a host simultaneously, so doubling your hosts can double your simultaneous downloads. That warning about Facebook is because the same script is being downloaded twice from two different places, which is not ideal, but I'm not familiar with the Facebook API so I'm not sure if that can be helped.