Load libraries, scripts, etc. for a website from external or internal sources? - javascript

I just started my adventure with frontend development, most likely web design, and I've been struggling with one technical question that I couldn't yet find a reasonable answer to.
There are so many libraries you can load or download to make your web development faster, hence my question.
Is it better to link to these libraries (e.g. Bootstrap, jQuery, Angular, Google fonts and so on) from their official external source, or to download them, upload them to your own server, and link to the local file (internal source)?
My intuition tells me that downloading them, uploading them to my server, and linking to them there would make the whole website load quicker. Is that reasoning correct?

Pros of hosting and linking to external resources (be they JS libraries, images or whatever):
Spreading the load: your server doesn't have to serve all content; it can concentrate on its main functionality and serve the application itself.
Spreading the HTTP connections: since applications work more and more asynchronously, it is good to use the maximum number of parallel HTTP connections per site/subdomain to deliver application data, while loading all necessary additional resources from other servers.
As Rafael mentioned above, CDNs scale very well and seldom go offline.
Cons:
Even with fast Internet connections, there is a high chance that resources will be served faster when they are located on the same intranet. That's why some companies run their own "micro-CDNs" inside their local networks, combining the advantages of multiple servers with local availability.
External dependency: as soon as the Internet connection becomes unavailable or a proxy server goes down, all external resources become unavailable, leaving the application in a broken state.

Sometimes it may actually be faster to link from an external source. That's because the browser caches recently accessed data, and many sites use Bootstrap, jQuery and the like, so there's a decent chance the file is already cached; this is less likely with less popular libraries.
Keep in mind, though, that since you're downloading from external sources, you're at the mercy of their servers: if one goes offline for whatever reason, your page won't work correctly. CDNs are designed not to go offline for that very reason, but it's good to be aware of the risk. Also, if you're offline while working on your page, you won't be able to load those resources during development.
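The usual mitigation for the CDN-outage risk mentioned above is a local fallback: load the library from the CDN first, then check whether its global appeared and, if not, write a script tag pointing at a copy on your own server. A minimal sketch of that check (the local path is a hypothetical example):

```javascript
// After the CDN <script> tag has run, test whether the expected global
// (e.g. window.jQuery) exists; if not, produce a tag for the local copy.
function fallbackTag(cdnLoaded, localSrc) {
  // Returns markup to document.write() when the CDN failed, '' otherwise.
  return cdnLoaded ? '' : '<script src="' + localSrc + '"><\/script>';
}

// In the page, right after the CDN script tag for jQuery:
// document.write(fallbackTag(!!window.jQuery, '/js/vendor/jquery.min.js'));
```

This way the external source is the fast path and your server is only hit when the CDN is unreachable.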

If you are developing an application, it is generally safer to host these files locally, so that you do not have to depend on the third-party server hosting the CDN.
Performance-wise, using a CDN can be beneficial because the libraries you require may already be cached in the user's browser, saving the time needed to fetch them; a locally hosted copy, by contrast, must be downloaded from your server on the first visit and takes up space there.
https://halfelf.org/2015/cdn-vs-local/
https://www.sitepoint.com/7-reasons-not-to-use-a-cdn/

I agree with Rafael's answer above, but wanted to note a few benefits of serving these libraries locally that it omitted.
It is still considered best practice (until HTTP/2 becomes widespread) to minimize the number of downloads your site makes by concatenating many files into a single one. So, if you are using three JavaScript libraries/frameworks (for instance Angular, jQuery and Moment.js), with a CDN that means three separate script elements pulling down three separate .js files. If you host them locally, however, your build process can include a step that bundles the three libraries into a single file called "vendor.js" or something along those lines. This has the added bonus of simplifying dependency loading to some degree, since you can concatenate them in a specific order should the need arise.
Finally, although it is a little advanced if you are just getting started, if you are considering hosting your library files with your project it is definitely worth looking into Bower (https://bower.io/docs/api/): a Node-based build tool that lets you define which packages your project needs and install them with a single command. It is particularly useful for keeping unnecessary library files out of your version control. Good luck!

Related

Is it faster to load if all webpage resources are compiled into a single HTML file?

What if I had a compilation step for my website that turned all external scripts and styles into a single HTML file with embedded <script> and <style> tags? Would this improve page load times due to not having to send extra GETs for the external files? If so, why isn't this done more often?
Impossible to say in general, because it is very situational.
If you're pulling resources from many different servers, these requests can slow your page loading down (especially with some bad DNS on the visiting side).
Requesting many different files may also slow down page load even if they're from the same origin/server.
Keep in mind that not everyone has gigabit internet (some aren't even at the megabit level). So putting everything directly into your HTML file (inlining it, or using data URIs) will definitely reduce network overhead at first (fewer requests, fewer headers, etc.).
In addition (and making the previous point even worse), this also breaks many other features often used to reduce page loading times. For example, resources can't be cached, neither locally nor on a proxy, and are always transferred in full. This can be costly for both the visitor and the hosting party.
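To make the request-overhead tradeoff concrete, here is a toy cost model. It assumes serial requests with a fixed round-trip cost and a shared bandwidth, which is deliberately simplistic (real browsers parallelize and pipeline), and all numbers are made up for illustration:

```javascript
// Toy model: total load time = per-request round trips + transfer time.
// requests: number of HTTP requests; totalKB: combined payload size;
// rttMs: round-trip time per request; kbPerMs: bandwidth.
function loadTimeMs(requests, totalKB, rttMs, kbPerMs) {
  return requests * rttMs + totalKB / kbPerMs;
}

// 15 separate files vs. 1 bundle, same 300 KB total, 80 ms RTT, 1 KB/ms:
// loadTimeMs(15, 300, 80, 1) → 1500
// loadTimeMs(1, 300, 80, 1)  → 380
```

The model overstates the gap, but it shows why request count dominates for many small files, while caching (which the model ignores) pulls in the opposite direction.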
So, if loading times are an issue for you, the best approach is often a middle ground:
If you're using third-party scripts, e.g. jQuery, load them from a public CDN that other pages use as well. If you're lucky, your visitor's browser will have a cached copy and won't make the request at all.
Your own scripts should be condensed and ideally minified into a single file (with tools such as Browserify, webpack, etc.). This should not include frequently changing parts, as those would force you to transfer even more data more often.
If you've got scripts or resources that are really only part of the current visitor's experience (logged-in status, colors picked in user preferences, etc.), it's fine to put them directly into the parent HTML file, provided that file is customized anyway and delivering them as separate files wouldn't work or would cause more overhead. A perfect example of this is CSRF tokens. Don't do this, though, if you can deliver a static HTML file that is filled/updated by JavaScript.
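The last point can be sketched as a tiny server-side templating step: per-visitor data goes into an inline script in the HTML shell instead of a separate, uncacheable file. The placeholder global and token value below are hypothetical:

```javascript
// Inject per-visitor data (e.g. a CSRF token) directly into the HTML,
// so it never needs its own (uncacheable) request.
function renderPage(shellHtml, csrfToken) {
  const inline =
    '<script>window.CSRF_TOKEN = ' + JSON.stringify(csrfToken) + ';<\/script>';
  // Insert just before </head> so it's available to later scripts.
  return shellHtml.replace('</head>', inline + '\n</head>');
}
```

The static shell itself stays cacheable only if it is served unpersonalized, which is why the answer recommends JavaScript-filled static HTML where possible.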
Yes, it will improve page load time, but this method is still rarely used, for these reasons:
Debugging becomes more difficult.
Updating the page later also won't be as easy.
Separate .css and .js files avoid both issues.
And yes, for faster page loads you can use a build system such as Grunt, Gulp or Brunch for better performance.

Is there any data on the advantage of loading jQuery from a CDN (HTML5BP) vs concatenating and minifying it with the other scripts?

I'm using HTML5 Boilerplate, and saw that it separates jQuery from main.js and plugins.js. I understand the logic behind this: a lot of sites use jQuery, and if you use a CDN there is a good chance the user will already have it cached.
But since my use of HTML5 BP includes a build step, I could also concatenate and minify jQuery with all my other scripts. So:
Separate jQuery, preferably load it from a CDN, or
Concat and minify jQuery with all other scripts
From what I understand HTML5 BP doesn't consider the concat/minify option since their boilerplate doesn't include a build step. But since it is an option to me; is there any hard data about what is the best option here, or what the real world differences in performance are?
There was a large discussion of this (and other related issues) in a pull request last year. You should check out that thread. Lots of detailed discussion.
The reason H5BP uses that set-up is that it's the best default solution. If a user of H5BP doesn't do anything else to their set-up, they're guaranteed to get a geographically optimized version of jQuery served to their users from a CDN source with the highest possible chance of hitting the cache lottery.
Beyond a default solution:
If you've got a global audience and you're going to serve your static assets from a CDN, then it would probably be slightly faster to serve a single file from that CDN.
If you've got a performance tuned static assets server and you've got a local audience, then you might be faster serving a single file from a static assets server even if it's not on a CDN.
Really, though, testing your set-up against your own geographical audience is the best way to go. It's the only way to answer this question for your site or application.
Pros:
-Performance is a little better, because users may already have it cached from other sites.
Cons:
-If the CDN is down, your site is broken.
-If all your files are concatenated into one, you have only one request to your own domain, which (IMHO) will always beat a separate request to an external domain.
I personally would always choose to concatenate.
In any case, though, don't over-optimize: if your site is not big, choose whatever you prefer.
(And I suspect that if you test both solutions, the results will not be dramatically different.)

Should I embed third-party javascript libraries in optimized javascript build?

I'm using the excellent requirejs optimizer to compress the code of a web application.
The application uses a lot of third-party libs. I have several options:
Let the user download all the third-party libs separately from my server
Let the user download all the third-party libs from a CDN, if available
Use requirejs to produce a 'compressed' version of all those libs in a single file
Now, I know that caching and/or a CDN would help with how long it takes to fetch each individual library; however, if I have 15 libs, I'm still going to end up with 15 HTTP requests, which is all the more annoying when the actual code for my application ends up being served in one or two relatively small files.
So what are the pros and cons of each method? Also, I suppose I would actually be 'redistributing' (in the sense of common FOSS licenses) the libraries if I were to bundle them inside my app, rather than pointing to a CDN?
Any experience / ideas welcome.
Thanks.
You could take a look at the Why should I use Google's CDN for jQuery? question for why a CDN is the better solution:
It increases the parallelism available. (Most browsers will only download 3 or 4 files at a time from any given site.)
It increases the chance of a cache hit. (As more sites follow this practice, more users already have the file ready.)
It ensures that the payload will be as small as possible. (Google can pre-compress the file in a wide array of formats, like GZIP or DEFLATE. This makes the time-to-download very small, because the file is super-compressed and isn't compressed on the fly.)
It reduces the amount of bandwidth used by your server. (Google is basically offering free bandwidth.)
It ensures that the user will get a geographically close response. (Google has servers all over the world, further decreasing the latency.)
(Optional) They will automatically keep your scripts up to date. (If you like to "fly by the seat of your pants," you can always use the latest version of any script that they offer. This could fix security holes, but generally just breaks your stuff.)
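For the requirejs setup in the question, the two approaches can even be combined: RequireJS supports "paths fallbacks", where an array of locations is tried in order, so you get the CDN's caching benefits with a local copy as insurance. A sketch, where the jQuery version and the local path 'lib/jquery' are illustrative assumptions:

```javascript
// RequireJS "paths fallbacks": the CDN is tried first; if that request
// fails, the next entry (a copy on your own server) is used instead.
// Note RequireJS path entries omit the .js extension.
const requireConfig = {
  paths: {
    jquery: [
      'https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min',
      'lib/jquery' // local fallback, used only if the CDN fails
    ]
  }
};

// In the browser: requirejs.config(requireConfig);
```

This addresses the "what if the CDN is down" con without giving up the cache-hit upside.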

How to optimally serve and load JavaScript files?

I'm hoping someone with more experience with global-scale web applications could clarify some questions, assumptions and possible misunderstandings I have.
Let's take a hypothetical site (heavy amount of client-side / dynamic components) which has hundreds of thousands of users globally and the sources are being served from one location (let's say central Europe).
If the application depends on popular JavaScript libraries, would it be better to compile them into one single minified JS file (along with all application-specific JavaScript), or to load them separately from the Google CDN?
Assetic vs. headjs: does it make more sense to load one single JS file, or to load all the scripts in parallel (executing in order of dependencies)?
My assumptions (please correct me):
Compiling all application-specific/local JS code into one file, using CDNs like Google's for popular libraries, etc. but loading all of these via headjs in parallel seems optimal, but I'm not sure. Server-side compiling of third party JS and application-specific JS into one file seems to almost defeat the purpose of using the CDN since the library is probably cached somewhere along the line for the user anyway.
Besides caching, it's probably faster to download a third-party library from Google's CDN than from the central server hosting the application anyway.
If a new version of a popular JS library is released with a big performance boost, is tested with the application and then implemented:
If all JS is compiled into one file then every user will have to re-download this file even though the application code hasn't changed.
If third-party scripts are loaded from CDNs, then the user only has to download the new version from the CDN (or from a cache somewhere).
Are any of the following legitimate worries in a situation like the one described?
Some users (or browsers) can only have a certain number of connections to one hostname at once, so retrieving some scripts from a third-party CDN would result in overall faster loading times.
Some users may be using the application in a restricted environment, where the domain of the application is white-listed but the CDNs' domains are not. (If this is a realistic concern, is it at all possible to try to load from the CDN and fall back to loading from the central server on failure?)
Compiling all application-specific/local JS code into one file
Since some of our key goals are to reduce the number of HTTP requests and minimize request overhead, this is a very widely adopted best practice.
The main case where we might consider not doing this is in situations where there is a high chance of frequent cache invalidation, i.e. when we make changes to our code. There will always be tradeoffs here: serving a single file is very likely to increase the rate of cache invalidation, while serving many separate files will probably cause a slower start for users with an empty cache.
For this reason, inlining the occasional bit of page-specific JavaScript isn't as evil as some say. In general though, concatenating and minifying your JS into one file is a great first step.
using CDNs like Google's for popular libraries, etc.
If we're talking about libraries where the code we're using is fairly immutable, i.e. unlikely to be subject to cache invalidation, I might be slightly more in favour of saving HTTP requests by wrapping them into your monolithic local JS file. This would be particularly true for a large code base heavily based on, for example, a particular jQuery version. In cases like this bumping the library version is almost certain to involve significant changes to your client app code too, negating the advantage of keeping them separate.
Still, mixing request domains is an important win, since we don't want to be throttled excessively by the maximum connections per domain cap. Of course, a subdomain can serve just as well for this, but Google's domain has the advantage of being cookieless, and is probably already in the client's DNS cache.
but loading all of these via headjs in parallel seems optimal
While there are advantages to the emerging host of JavaScript "loaders", we should keep in mind that using them does negatively impact page start, since the browser needs to go and fetch our loader before the loader can request the rest of our assets. Put another way, for a user with an empty cache a full round-trip to the server is required before any real loading can begin. Again, a "compile" step can come to the rescue - see require.js for a great hybrid implementation.
The best way of ensuring that your scripts do not block UI painting remains to place them at the end of your HTML. If you'd rather place them elsewhere, the async or defer attributes now offer you that flexibility. All modern browsers request assets in parallel, so unless you need to support particular flavours of legacy client this shouldn't be a major consideration. The Browserscope network table is a great reference for this kind of thing. IE8 is predictably the main offender, still blocking image and iFrame requests until scripts are loaded. Even back at 3.6 Firefox was fully parallelising everything but iFrames.
Some users may be using the application in a restricted environment, where the domain of the application is white-listed but the CDNs' domains are not. (If this is a realistic concern, is it at all possible to try to load from the CDN and fall back to loading from the central server on failure?)
Working out if the client machine can access a remote host is always going to incur serious performance penalties, since we have to wait for it to fail to connect before we can load our reserve copy. I would be much more inclined to host these assets locally.
Many small JS files are better than a few large ones for many reasons, including changes/dependencies/requirements.
JavaScript/CSS/HTML and any other static content is handled very efficiently by any current web server (Apache/IIS and many others); most of the time a single web server is more than capable of serving hundreds or thousands of requests per second, and in any case this static content is likely to be cached somewhere between the client and your server(s).
Using any external repository (not controlled by you) for code that you want to use in a production environment is a no-no (for me and many others): you don't want a sudden, catastrophic and irrecoverable failure of your whole site's JavaScript functionality just because somebody somewhere pressed commit without thinking or checking.
Compiling all application-specific/local JS code into one file, using CDNs like Google's for popular libraries, etc. but loading all of these via headjs in parallel seems optimal...
I'd say this is basically right. Do not combine multiple external libraries into one file, since—as it seems you're aware—this will negate the majority case of users' browsers having cached the (individual) resources already.
For your own application-specific JS code, one consideration you might want to make is how often this will be updated. For instance if there is a core of functionality that will change infrequently but some smaller components that might change regularly, it might make sense to only compile (by which I assume you mean minify/compress) the core into one file while continuing to serve the smaller parts piecemeal.
Your decision should also account for the size of your JS assets. If—and this is unlikely, but possible—you are serving a very large amount of JavaScript, concatenating it all into one file could be counterproductive as some clients (such as mobile devices) have very tight restrictions on what they will cache. In which case you would be better off serving a handful of smaller assets.
These are just random tidbits for you to be aware of. The main point I wanted to make was that your first instinct (quoted above) is likely the right approach.

Link to external source or store locally

When using third-party libraries such as jquery, yui-reset, swfobject and so on, do you link to the hosted versions, or download and host your own versions?
Pros and cons either way?
Hosted versions are apparently the way to go. For three main reasons (edit: I've added a fourth reason, but it's sort of a moot point):
Google/jQuery/etc servers are likely to be faster than your own
A lot of these servers use content distribution so it will be served from a server geographically close to the requester
If every site used the hosted versions, users are more likely to have the files cached in their browsers, so a trip to the server might not even be necessary
They are possibly more reliable than your own server (however if your own server goes down, this point is moot, as you probably won't be able to serve the main page, so there won't be a request to the js file anyway)
Cons would be
You have no control over the uptime/reliability of the servers (though they are more likely more reliable than your own)
Cannot make any custom mods/patches to these files (though most good frameworks allow you to extend them without needing to modify the original code)
If the hosted file does not allow you to specify the version as part of the filename (eg "jquery-1.3.2.js" rather than "jquery.js"), you probably don't want to use the hosted version, as any updates could possibly break your code
I would say the pros generally outweigh the cons.
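On the version point above: pinning the exact version in the CDN URL is what protects you from a remote update breaking your code. A trivial helper to make that explicit (the URL template mirrors Google Hosted Libraries' layout and is shown for illustration):

```javascript
// Build a version-pinned CDN URL rather than an unversioned "latest" one,
// so a CDN-side upgrade can never change the file you ship against.
function pinnedJqueryUrl(version) {
  return 'https://ajax.googleapis.com/ajax/libs/jquery/' +
    version + '/jquery.min.js';
}

// pinnedJqueryUrl('1.3.2')
//   → 'https://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js'
```

If a host only offers an unversioned "jquery.js", that is exactly the case where self-hosting is the safer choice.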
These are all JavaScript libraries, so you want to put a copy on your own server.
If the hosted file somehow served a different version, you would not have tested against that newer version and it might break your code.
I always download and host them locally, simply because I'm worried about their server downtime; there's no real guarantee that their servers will stay up for the rest of time. There is usually a note in the script about who it belongs to anyway.
I guess the only con would be if the person who made the script didn't actually want it downloaded, but I don't see that ever happening.
Plus the request times are much faster: instead of requesting a script hosted at Google, you just request it from your own server.
For production, use hosted.
For development, use local, because if you're offline, your dev site is broken.
