I am trying to compare the performance of several JavaScript libraries. Although measuring transaction times helps to determine the working performance of a library, it doesn't account for the time required to download and initialize the individual libraries. I'm looking for suggestions for the best method of determining load time, other than using tools such as Firebug. I would like to be able to set up a controlled environment where a page could be loaded n times while capturing the start and end times. Should the JavaScript library contents be included in the page itself rather than as an include file, or is there a better way?
Reading this article by John Resig on JavaScript Benchmark Quality before you start anything may help you out.
After that, I would suggest requesting the JavaScript from your server, then timing how long eval(responseJS) takes. That way, you are only timing how long the library takes to initialize, rather than that plus the time it takes to download from the server.
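For instance, a minimal sketch of that idea (the library path is a placeholder for a file on your own server); since the download completes before the timer starts, only parse/execute time is measured:

```js
// Sketch: fetch the library source first, then time only the eval() step.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/js/mylibrary.js', true); // placeholder path; download is NOT timed
xhr.onload = function () {
  var responseJS = xhr.responseText;
  var start = performance.now();
  eval(responseJS); // time only parsing and executing the library
  console.log('eval() took ' + (performance.now() - start).toFixed(2) + ' ms');
};
xhr.send();
```

Run it n times and average the results to smooth out variance between runs.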
The libraries should always be an external file, included via a script tag, either on their own or with the site's scripting rolled in as well. Minified and packed files have a smaller footprint. Delivery via a CDN is optimal too, as the CDN will have it cached; many of the popular frameworks are available over Google's CDN.
You must also account not only for the library, but for the application using it. The quality of the JS in the libraries is (typically) top notch, but what about the quality of the code tapping into those libraries, or the code of plugins that may not have been developed by the library's authors? You also have to look at which browser is being used. As much as we hate it, most of these cross-browser libraries are optimized for best performance in Internet Explorer, because it retains an 85+% market share.
The performance of any library is really a trade-off: deciding what is acceptable in order to get your application to do whatever it is you want it to do.
Related
I just started my adventure with front-end development, mostly web design. I've been struggling with one technical question and haven't yet found a reasonable answer.
There are so many libraries you can load or download to make your web development faster. Hence my question.
Is it better to link to these libraries (e.g. Bootstrap, jQuery, Angular, fonts from Google and so on) from their official external source, or to download them, upload them to your server, and link to the file's location on your server (an internal source)?
My intuition tells me that downloading them, uploading them to my server, and linking to them there would make the whole website load quicker. Is that good thinking?
Pros of linking to externally hosted resources (be they JS libraries, images or whatever):
Spreading the load: your server doesn't have to serve all the content; it can concentrate on its main functionality and serve just the application itself.
Spreading the HTTP connections: with more and more asynchronously working applications, it is a good thing to reserve the limited parallel HTTP connections per site/subdomain for application data and to load additional resources from other servers.
As Rafael mentioned above, CDNs scale very well and seldom go offline.
Cons
Even with fast internet connections there is a high chance that resources will be served faster when they are located on the same intranet. That's why some companies run their own "micro-CDNs" inside their local networks, combining the advantages of multiple servers with local availability.
External dependency: as soon as an internet connection becomes unavailable or a proxy server goes down, all external resources become unavailable, leaving the application in a broken state.
Sometimes it may actually be faster to link from an external source. That's because the browser caches recently accessed data, and many sites use Bootstrap, jQuery and the like. This is less likely to happen with less popular libraries.
Keep in mind, though, that since you're downloading from external sources, you're at the mercy of their servers. If for one reason or another a CDN goes offline, your page won't work correctly. CDNs are not supposed to go offline for that very reason, but it's good to be aware of. Also, if you're working on your page while offline, you won't be able to fetch those resources during development.
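If you do link to a CDN, one widely used safeguard is to check for the library's global after the CDN script tag and fall back to a copy on your own server. A minimal sketch (the local path is a placeholder):

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>
<script>
  // If the CDN request failed, window.jQuery is undefined: load the local copy.
  window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>
```

This keeps the caching benefit of the CDN in the common case while protecting the page against a CDN outage.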
If you are developing an application where security matters, it is better to host these files locally, so that you do not have to depend on the third-party server hosting the CDN.
Talking about performance, using a CDN might be beneficial because the libraries you require might already be cached in the browser, so the time to fetch the file is saved. But if the file is only available locally, loading it will definitely take time (and space on your server).
https://halfelf.org/2015/cdn-vs-local/
https://www.sitepoint.com/7-reasons-not-to-use-a-cdn/
I agree with Rafael's answer above, but wanted to note a few benefits of serving up these libraries locally that he or she omitted.
It is still considered best practice (until HTTP/2 becomes widespread) to minimize the number of downloads your site makes by concatenating many files into a single file. So, if you are using three JavaScript libraries/frameworks (for instance Angular, jQuery and Moment.js) via a CDN, that is three separate script elements pulling down three separate .js files. If you host them locally, however, your build process can include a step that bundles the three libraries into a single file called "vendor.js" or something along those lines, as sketched below. This has the added bonus of simplifying dependency loading to some degree, since you can concatenate them in a specific order should the need arise.
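Such a bundling step could be as simple as this Node script (a sketch; the file paths are placeholders, and it assumes the dist/ directory already exists):

```js
// build-vendor.js – concatenate vendor libraries into a single file.
var fs = require('fs');

// Concatenation order matters when libraries depend on one another
// (e.g. jQuery plugins must come after jQuery itself).
var libs = [
  'vendor/jquery.min.js',
  'vendor/angular.min.js',
  'vendor/moment.min.js'
];

var bundle = libs.map(function (path) {
  return fs.readFileSync(path, 'utf8');
}).join(';\n'); // the ';' guards against files missing a trailing semicolon

fs.writeFileSync('dist/vendor.js', bundle);
console.log('Wrote dist/vendor.js, ' + bundle.length + ' bytes');
```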
Finally, although it is a little advanced if you are just getting started, if you are considering hosting your library files with your project it would definitely be worth looking into Bower (https://bower.io/docs/api/). It is a Node-based package manager that lets you define which packages your project needs and install them with a single command, which is particularly useful for keeping library files out of your version control. Good luck!
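For illustration, a minimal bower.json might look something like this (the package names and versions are only examples):

```json
{
  "name": "my-app",
  "private": true,
  "dependencies": {
    "jquery": "^3.3.1",
    "bootstrap": "^3.3.7"
  }
}
```

Running `bower install` then pulls everything into a `bower_components/` directory that you can leave out of version control.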
I'm using the excellent requirejs optimizer to compress the code of a web application.
The application uses a lot of third-party libs. I have several options:
Let the user download all the third party libs separately from my server
Let the user download all the third party libs from a CDN, if available
Use requirejs to produce a 'compressed' version of all those libs in a single file
Now, I know that caching and/or a CDN would help with how long it takes to fetch each individual library; however, if I have 15 libs, I'm still going to end up with 15 HTTP requests, which is all the more annoying given that the actual code for my application ends up being served in one or two relatively small files.
So what are the pros and cons of each method? Also, I suppose I would actually be 'redistributing' the libraries (in the sense of common FOSS licenses) if I were to bundle them inside my app, rather than pointing to a CDN?
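For reference, option 3 with the requirejs optimizer usually boils down to a build profile along these lines (a sketch; the module names and paths are placeholders):

```js
// build.js – run with: node r.js -o build.js
({
  baseUrl: 'js',
  name: 'main',              // the application's entry module
  paths: {
    jquery: 'libs/jquery.min',
    backbone: 'libs/backbone.min'
  },
  out: 'dist/main-built.js'  // one minified file containing app + libs
})
```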
Any experience / ideas welcome.
Thanks.
You could take a look at the question Why should I use Google's CDN for jQuery? for reasons why a CDN is the better solution:
It increases the parallelism available. (Most browsers will only download 3 or 4 files at a time from any given site.)
It increases the chance that there will be a cache hit. (As more sites follow this practice, more users already have the file ready.)
It ensures that the payload will be as small as possible. (Google can pre-compress the file in a wide array of formats, like GZIP or DEFLATE. This makes the time-to-download very small, because it is super compressed and it isn't compressed on the fly.)
It reduces the amount of bandwidth used by your server. (Google is basically offering free bandwidth.)
It ensures that the user will get a geographically close response. (Google has servers all over the world, further decreasing the latency.)
(Optional) They will automatically keep your scripts up to date. (If you like to "fly by the seat of your pants," you can always use the latest version of any script that they offer. These could fix security holes, but generally just break your stuff.)
I'm hoping someone with more experience with global-scale web applications could clarify some questions, assumptions and possible misunderstandings I have.
Let's take a hypothetical site (heavy amount of client-side / dynamic components) which has hundreds of thousands of users globally and the sources are being served from one location (let's say central Europe).
If the application depends on popular JavaScript libraries, would it be better to compile them into one single minified JS file (along with all application-specific JavaScript) or to load them separately from the Google CDN?
Assetic vs. headjs: does it make more sense to load one single JS file or to load all the scripts in parallel (executing them in order of dependencies)?
My assumptions (please correct me):
Compiling all application-specific/local JS code into one file, using CDNs like Google's for popular libraries, etc., but loading all of these via headjs in parallel seems optimal, but I'm not sure. Server-side compiling of third-party JS together with application-specific JS into one file seems to almost defeat the purpose of using the CDN, since the library is probably already cached somewhere along the line for the user anyway.
Besides caching, it's probably faster to download a third-party library from Google's CDN than from the central server hosting the application anyway.
If a new version of a popular JS library is released with a big performance boost, is tested with the application and then implemented:
If all JS is compiled into one file then every user will have to re-download this file even though the application code hasn't changed.
If third-party scripts are loaded from CDNs, then the user only has to download the new version from the CDN (or from a cache somewhere).
Are any of the following legitimate worries in a situation like the one described?
Some users (or browsers) can only have a certain number of connections to one hostname at once, so retrieving some scripts from a third-party CDN would result in overall faster loading times.
Some users may be using the application in a restricted environment, so the application's domain may be white-listed but not the CDNs' domains. (If this is possibly a realistic concern, is it at all possible to try to load from the CDN and fall back to the central server on failure?)
Compiling all application-specific/local JS code into one file
Since some of our key goals are to reduce the number of HTTP requests and minimize request overhead, this is a very widely adopted best practice.
The main case where we might consider not doing this is in situations where there is a high chance of frequent cache invalidation, i.e. when we make changes to our code. There will always be tradeoffs here: serving a single file is very likely to increase the rate of cache invalidation, while serving many separate files will probably cause a slower start for users with an empty cache.
For this reason, inlining the occasional bit of page-specific JavaScript isn't as evil as some say. In general though, concatenating and minifying your JS into one file is a great first step.
using CDNs like Google's for popular libraries, etc.
If we're talking about libraries where the code we're using is fairly immutable, i.e. unlikely to be subject to cache invalidation, I might be slightly more in favour of saving HTTP requests by wrapping them into your monolithic local JS file. This would be particularly true for a large code base heavily based on, for example, a particular jQuery version. In cases like this bumping the library version is almost certain to involve significant changes to your client app code too, negating the advantage of keeping them separate.
Still, mixing request domains is an important win, since we don't want to be throttled excessively by the maximum connections per domain cap. Of course, a subdomain can serve just as well for this, but Google's domain has the advantage of being cookieless, and is probably already in the client's DNS cache.
but loading all of these via headjs in parallel seems optimal
While there are advantages to the emerging host of JavaScript "loaders", we should keep in mind that using them does negatively impact page start, since the browser needs to go and fetch our loader before the loader can request the rest of our assets. Put another way, for a user with an empty cache a full round-trip to the server is required before any real loading can begin. Again, a "compile" step can come to the rescue - see require.js for a great hybrid implementation.
The best way of ensuring that your scripts do not block UI painting remains to place them at the end of your HTML. If you'd rather place them elsewhere, the async or defer attributes now offer you that flexibility. All modern browsers request assets in parallel, so unless you need to support particular flavours of legacy client this shouldn't be a major consideration. The Browserscope network table is a great reference for this kind of thing. IE8 is predictably the main offender, still blocking image and iFrame requests until scripts are loaded. Even back at 3.6 Firefox was fully parallelising everything but iFrames.
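For instance (a small sketch; the paths are placeholders):

```html
<!-- "defer" downloads in parallel, preserves order, runs after parsing. -->
<script src="/js/app.js" defer></script>
<!-- "async" downloads in parallel and runs as soon as it arrives, in any order. -->
<script src="/js/analytics.js" async></script>
```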
Some users may be using the application in a restricted environment, so the application's domain may be white-listed but not the CDNs' domains. (If this is possibly a realistic concern, is it at all possible to try to load from the CDN and fall back to the central server on failure?)
Working out whether the client machine can access a remote host will always incur a serious performance penalty, since we have to wait for the connection to fail before we can load our reserve copy. I would be much more inclined to host these assets locally.
Many small JS files are better than a few large ones for many reasons, including changes/dependencies/requirements.
JavaScript/CSS/HTML and any other static content is handled very efficiently by any current web server (Apache/IIS and many others); most of the time a single web server is more than capable of serving hundreds or thousands of requests per second, and in any case this static content is likely to be cached somewhere between the client and your server(s).
Using any external repository (not controlled by you) for code you want to use in a production environment is a no-no for me and many others: you don't want a sudden, catastrophic and irrecoverable failure of your whole site's JavaScript functionality just because somebody somewhere pressed commit without thinking or checking.
Compiling all application-specific/local JS code into one file, using CDNs like Google's for popular libraries, etc. but loading all of these via headjs in parallel seems optimal...
I'd say this is basically right. Do not combine multiple external libraries into one file, since—as it seems you're aware—this will negate the majority case of users' browsers having cached the (individual) resources already.
For your own application-specific JS code, one consideration you might want to make is how often this will be updated. For instance if there is a core of functionality that will change infrequently but some smaller components that might change regularly, it might make sense to only compile (by which I assume you mean minify/compress) the core into one file while continuing to serve the smaller parts piecemeal.
Your decision should also account for the size of your JS assets. If—and this is unlikely, but possible—you are serving a very large amount of JavaScript, concatenating it all into one file could be counterproductive as some clients (such as mobile devices) have very tight restrictions on what they will cache. In which case you would be better off serving a handful of smaller assets.
These are just random tidbits for you to be aware of. The main point I wanted to make was that your first instinct (quoted above) is likely the right approach.
I have been using JavaScript for a while now and recently began using jQuery, which I will admit I am a fan of.
<script type='text/javascript' src='../locationOfJquery/jquery.js'></script> allows use of the library in script tags on that page. What I want to know is whether just including the script tag slows down page load time at all, even if there is no jQuery code on the page, and also whether there are any other major downsides to using jQuery.
Put the script tags at the bottom of the page. That way they will not slow down processing of the DOM before the onload event fires.
Use the minified version of jQuery, which is about as small as a small image/icon.
If visitors visit more than one page in your site, it will also usually be cached after their first visit. It may also already be pre-cached (or served from a more-local server) if you use a content delivery network (e.g. Google's). Good first impressions are critical.
To further answer smaller questions you had:
If there is no jQuery code on the page, jQuery must still be parsed. You can see how long it takes your computer to parse jQuery by using a profiling tool such as Chrome's.
There are frameworks which optimize your javascript on a per-page basis, but those have to trade off the ability to cache a script versus the gains in faster parsing. You almost certainly shouldn't worry about them. jQuery is very lightweight compared to other frameworks.
Numbers:
For example, on Chrome, when loading the Stack Overflow website and requesting the jQuery library from the Google CDN, the results were:
0.027 ms aggregate time spent downloading jQuery (perhaps cached)
35.992 ms aggregate time spent evaluating jQuery and performing any default DOM/CSS operations
This is all relative, of course. I bet when you loaded this page you did not notice any lag, because the entire page took about 630 ms to load.
The client will have to download the jQuery script (which is quite small). To optimize further, you can use the hosted "Content Delivery Network" versions from Google or Microsoft. Also remember to use the minified version, which downloads faster.
This article states the reasons why.
You shouldn't include it if you are not using it.
32 KB is a small price to pay, but it's better to have no request and zero extra bytes to download.
Also, and more importantly, you may run into conflicts with other frameworks if you are using any.
So I have been playing around with a home project that includes a lot of JS. I have been using Script# to write my own library, etc. Personally, I wouldn't write a lot of JS if I didn't have a tool like Script# or GWT to help maintain it.
So far it includes these external libraries:
– ASP.NET AJAX
– ExtJS
– Google Maps
– Google Visualisations
– My own library to wrap the above libraries and add extra functionality...
So that works out to be a heap of JS. It runs fine on my PC. However, I have little faith in JS/browsers, and I am concerned that loading too much JS will cause the browser to die or perform poorly.
Is this a valid concern?
Does anyone have any experience with loading a lot of JS into the browser that resulted in performance issues? I know there are a lot of variables here, for example browser type (I assume IE is worse than others), the client PC's RAM, etc., but it would be good to get other people's experiences. I would hate to invest a lot of time into JS only to find that I am painting myself into a corner.
The more I use Script#, the more client classes I have as I move more processing onto the client. At what point does this start becoming an issue? I'm sure the browser could easily handle 100 MS Ajax classes, but what would be too far for a browser?
NOTE: I am not concerned about the actual JS file sizes, but more about the runtime environment that gets loaded.
There is nothing wrong with having a large number of JS files or big JS files. The project I am currently working on has more than 60 core framework libraries, and each of its 30 modules has an average of 5 to 6 JS files.
So the only concern is designing your website to make use of JS best practices and optimization techniques, like:
Minify the JS using the YUI Compressor or another compression library to address download size.
Enable proper caching in your web server to reduce file downloads.
Put your JavaScript at the bottom of the page, or make it a separate file.
Make your AJAX responses cacheable.
And finally, design your page to handle on-demand script loading (see the sketch after this list).
- Microsoft Doloto is a good example of this one; download it here
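A minimal sketch of the on-demand idea (the element id and file path are placeholders): inject a script only the first time it is actually needed.

```js
var loadedScripts = {};
function loadScriptOnce(src, callback) {
  if (loadedScripts[src]) { callback(); return; }
  var s = document.createElement('script');
  s.src = src;
  s.onload = function () {
    loadedScripts[src] = true; // don't fetch the same file twice
    callback();
  };
  document.head.appendChild(s);
}

// e.g. load the charting code only when the user opens the report tab
document.getElementById('reportTab').onclick = function () {
  loadScriptOnce('/js/charts.js', function () {
    // charting functions are now available
  });
};
```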
And check out High Performance Web Sites and the more recent Even Faster Web Sites by Steve Souders. They are must-reads for web developers; these books address all the common problems web developers face today.
With modern browsers routinely occupying 250 MB of RAM or more, script caching, and optimized JavaScript engines, keeping the script library resident would probably add negligible load in most reasonable scenarios.
The biggest bottleneck would probably be the initial load time of the scripts: downloading and parsing them. But once that's done, the scripts are cached and the per-page initialization isn't very noticeable.
I highly doubt a browser would ever crash running your JS scripts, but it can become really slow and may not perform the way you want. Most people are more concerned about how fast it runs, not whether it will run!
I agree with jspcal, you should be able to load quite a lot of javascript with no problems. The javascript engines in all the modern browsers are a lot faster than they were a few years ago. The initial load will be the biggest issue. If possible I'd suggest lazy loading scripts that aren't needed for the page to render.
Also, Steve Souders has a lot of great material about improving page load times, such as this article, which gives several techniques for loading scripts without blocking.
http://www.stevesouders.com/blog/2009/04/27/loading-scripts-without-blocking/
If you're really concerned about performance, then I would take a look at your target audience. If you think you'll have a relatively high number of IE6 users, test it out in IE6, on an older machine if possible. IE Tester is great for this.