Are there any downsides to including the jQuery library on my page? - javascript

I have been using JavaScript for a while now and recently began using jQuery, which I will admit I am a fan of.
<script type='text/javascript' src='../locationOfJquery/jquery.js'></script> allows use of the library in the script tags on that page. What I want to know is whether just including the script tag slows down page load time at all, even if there is no jQuery code on the page, and also whether there are any other major downsides to using jQuery.

Put the script tags at the bottom of the page. That way they will not slow down processing of the DOM before the onload events fire.
Use the minified version of jQuery, which is about as small as a small image/icon.
If visitors visit more than one page in your site, it will also usually be cached after their first visit. It may also already be pre-cached (or served from a more-local server) if you use a content delivery network (e.g. Google's). Good first impressions are critical.
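As a minimal sketch of those tips combined (Google's CDN URL pattern is real; the version number here is just illustrative):

    <body>
      ...page content...
      <!-- loaded last, so DOM parsing isn't blocked while the script downloads and evaluates -->
      <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
    </body>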
To further answer smaller questions you had:
If there is no jQuery code on the page, jQuery must still be parsed. You can see how long it takes your computer to parse jQuery by using a profiling tool such as Chrome's.
There are frameworks which optimize your JavaScript on a per-page basis, but those have to trade off the ability to cache a script against the gains in faster parsing. You almost certainly shouldn't worry about them. jQuery is very lightweight compared to other frameworks.
Numbers:
For example, on Chrome, when loading the Stack Overflow website and requesting the jQuery library from the Google CDN, the results were:
0.027ms aggregate time spent downloading jQuery (perhaps cached)
35.992ms aggregate time spent evaluating jQuery and performing any default DOM/CSS operations
This is all relative of course. I bet when you loaded this page you did not notice any lag because the entire page took about 630ms to load.

The client will have to download the jQuery script (which is quite small). To optimize further, you can use the hosted "Content Delivery Network" versions from Google or Microsoft. Also remember to use the minified version, which downloads faster.
This article states the reasons why.

You shouldn't include it if you are not using it.
32k is a small price to pay, but it's better still to have no request and 0k extra to download.
Also, and more importantly, you may run into conflicts with other frameworks if you are using any.
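If you do need jQuery alongside another framework that also claims the $ symbol (Prototype, for instance), jQuery's noConflict mode is the usual escape hatch; a minimal sketch:

    <script src="prototype.js"></script>
    <script src="jquery.js"></script>
    <script>
      // hand the $ symbol back to the other framework; keep using the jQuery name
      jQuery.noConflict();
      jQuery(document).ready(function ($) {
        // inside this callback, $ safely refers to jQuery again
        $("#example").hide();
      });
    </script>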

Related

JS libraries built into the browser

Web browsers want to make the web faster.
I know Google has its hosted libraries. But why not integrate them into the browser directly?
The problem nowadays is that if you navigate from one page that has jQuery to another page with jQuery, then since the URL of the library file is different, that same JS is cached separately for each URL. So loading takes longer while navigating between pages that use the same libraries.
Can't they make something that stores the best-known libraries in the browser, so that when you load jquery or jquery-min it searches for it in the browser first?
Pros
- Faster navigation on the web.
- One fewer HTTP request whenever the browser finds the library locally.
Cons
Some problems that could occur are versioning issues. Since most files have names like jquery.min.js, the browser can't simply match them by name alone; on the other hand, some URLs include the version, e.g. /1.11.0/jquery.min.js, so the browser could try to work out the version from the URL. If the browser couldn't determine the version, then it would simply load the file as usual.
What do you think? Any suggestions on how this could work? Any other cons?
Edit1: I'm aware of CDNs. I'm only suggesting a way slightly faster than CDNs, saving one HTTP request in the process.
This problem can be avoided by using commonly used CDNs, as you mentioned.
http://cdnjs.com/
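The key point is that when many sites reference the exact same versioned URL, the browser's ordinary cache already behaves much like the proposal; a sketch using cdnjs's real URL pattern (the version chosen is just illustrative):

    <!-- every site using this exact URL shares one cached copy in the visitor's browser -->
    <script src="http://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.0/jquery.min.js"></script>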
However, I think integrating them into the browser could introduce a real versioning problem. Just think of how long the gaps between versions of IE were. If you had to wait that long for the browser to ship and cache new versions of libraries, it would be a disaster.
Also you would have to download a large variety of libraries to have your bases covered.
Downloading libraries is typically not very slow; it's the time to parse and execute them that takes longer, especially on mobile.
Here is a great post about this topic
http://flippinawesome.org/2014/03/10/is-jquery-too-big-for-mobile/

embedded vs linked JS / CSS

I am familiar with the benefits of linked CSS vs embedded and inline for maintainability and modularity. I have, however, read that in certain mobile applications of web dev, it can be beneficial (faster performance) to embed or even inline your CSS.
I would avoid any inline JS, and to a lesser extent CSS, but I have noticed that on many sites, including plenty of Google pages, JS is embedded right in the header of pages.
A colleague of mine insists on always linking to external JS files. I think it makes more sense to embed JS if the function is specific to one page or varies slightly per page, to save the processing overhead of a linked script.
The one thing the other answers didn't touch on is developer efficiency. If it's easier to put it inline, and there's no immediate performance requirement/concern, then do that. There is real business value to "easy", and it trumps eventual or non-existent performance concerns. Don't prematurely optimize.
The advantage of linked JS files is that they can be cached by the browser and loaded from local disk or memory on subsequent pages.
The advantage of inline JS is that you might have fewer network requests per page.
The best compromise is usually a small number of linked JS files (one or two) that consist of a minified combination of all your JS, so they are combined into as few files as possible and kept as small as possible.
The benefits of local caching far exceed the cost of the extra parsing of a little JS that might not be used on some pages.
Embedded JS that makes sense (even when most of your JS is in linked files) is the setting of a few JS variables that contain state specific to your page. That would typically be embedded into the head section of the page as it's generated dynamically at your server, different for every page and usually not cacheable. But this data should typically be small and page-specific.
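A minimal sketch of that split (the file path and variable names are hypothetical):

    <head>
      <!-- shared, cacheable logic lives in the linked file -->
      <script src="/js/site.min.js"></script>
      <script>
        // small, page-specific, server-generated state; not worth caching
        var pageState = { userId: 42, section: "reports" };
      </script>
    </head>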
Linking a script incurs a small penalty in the form of an extra request to the server. If you keep it inline this request is not made and depending on the situation you may get a faster loading page. It makes sense to inline your code if:
it is very small
it is dynamically generated, since then you won't get the benefits of caching anyway
In the case of Google and Facebook, you're most likely seeing inline JavaScript because it's being generated by server-side code.
Other answers have already mentioned the advantages of caching with external JS files. I would almost always go that way for any library or framework type functionality that is likely to be used by at least two pages. Avoid duplication when you can.
I thought I would comment on "inline" vs "embedded".
To me, "inline" means mixing the JS in with the HTML, perhaps with several separate <script> blocks that may or may not refer to each other, almost certainly with a number of event handlers set directly with HTML element properties like <div onclick="..."/>. I would discourage this in most circumstances, but I wouldn't get too hung up about it for occasional uses. Sometimes it's simply less hassle and pretending otherwise just wastes time that you could spend on more important issues.
I would define "embedded" as having (preferably) a single <script> block in the head or at the end of the body, with event handlers assigned within that block using document ready or onload function(s). I see nothing wrong with this for functions specific to one page, in fact I tend to prefer this over an external file for that purpose if it's only a small amount of script and I don't care about caching it client-side. Also if the page is generated dynamically and something in the JavaScript needs to be generated on the server it is generally much easier to do it if the script is on the same page.
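As a concrete sketch of that "embedded" style (the element ID is hypothetical):

    <script>
      window.onload = function () {
        // handlers assigned here instead of inline onclick attributes
        document.getElementById("saveButton").onclick = function () {
          // page-specific behaviour that isn't worth an external file
        };
      };
    </script>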
One last note on external files: during development watch out for IE's tendency to "over cache". Sometimes while testing I've made some small changes to an external file or two and pulled my hair out wondering why it didn't work only to eventually realise that IE was still using an old cached version. (On the one hand of course this is my fault, but on the other I know a lot of people who have fallen victim to this from time to time.)
All the answers above are very good. But I would like to add my own, based on 15 years of web dev experience.
ALWAYS use linked resources for both CSS and ECMAScript rather than inline code, as linked content is cached by browsers in most cases and used across potentially thousands of pages over hours and days of a user's interaction with a given domain online. The advantages are as follows:
The bandwidth savings using linked scripts are HUGE, as you simply deliver less script over the wire across a user experience that might draw on the cache for weeks.
There is also a better CSS cascade, as embedded and inline styles override linked styles by weight, which causes confusion for designers.
It avoids duplicate scripts, which happen a lot with inline scripts.
You reuse the same libraries over many pages, with cache control on the client now possible using versioning and date-based querystrings (see the sketch after this list).
Linked resources tell the browser to preload all resources BEFORE initializing scripts or styles in the DOM. I've seen issues related to this where users pressed buttons before scripts had pre-processed the date-times in time sheet apps, causing major problems.
You have more control over the script and CSS libraries used by all developers in a project, avoiding polluting your app with hundreds of poorly vetted custom scripts in pages.
It's very easy to update libraries for your users, as well as to version linked libraries.
External script libraries from Google or others are now accessible. You can even reuse your own linked libraries and CSS using linked resources.
Best of all, there are processing speed gains using cached client-side resources. Cached resources outperform on-demand resources every time!
Linked scripts also enforce style and layout consistency instead of custom layout shifts per page. If you use HTML layouts consistently, you can simulate flash-free page views, because cached CSS is used by the DOM across web pages to render pages faster.
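A minimal sketch of the versioned, date-based querystring technique mentioned above (the paths and version stamps are hypothetical):

    <!-- bump the querystring to force clients past their cached copy on release day -->
    <link rel="stylesheet" href="/css/site.css?v=20140310">
    <script src="/js/site.min.js?v=20140310"></script>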
Once you pull in linked resources on the first domain request/response, the user's experience is fast, and server-side page delivery means the DOM and HTML layouts will not shift or refresh despite numerous page views or links to pages across the site. We often then added limited custom page-level embedded style and script resources on top of the cached linked stack of libraries, if needed for a narrow range of customizations; global variables and custom CSS can then override linked values. This allows you to maintain websites much more easily, without page-by-page guesswork as to which libraries are missing or already used. I've added custom linked jQuery or other libraries in sub-sections to gain more speed this way, which means you can use linked scripts to manage custom subgroups of website sections as well.
GOOGLE ANGULAR
What you are seeing in Google's web presence is often an implementation of Angular's very complex ES5 and ES6 modular script and cache systems, which use fully JavaScripted DOM manipulation in single-page applications, with the scripts and embedded CSS in page views now exclusively managed by Angular (2+). They use elaborate modules to load components and modules on demand and lazily, with HTML/CSS templates preloaded into memory and pulled from the server behind the scenes to speed delivery of news and other pages they manage.
The FLAW in all that is that they demand browsers stream HUGE megabytes of ECMAScript preloaded with HTML and CSS embedded into these web pages behind the scenes as the user interacts with the cached pages and modules. The problem is they have HUGE stacks of the same CSS and scripts that get injected into multiple modules and then into parts of the DOM, which is sloppy and wasteful. They argue there is no need now for server-side delivery or caching when they can easily manage all that via inline style and script content downloaded through hidden XMLHttpRequest Web API calls to and from the server. Why download all that, then rebuild and store inline pages in memory constantly, when a much smaller file linked from the page would suffice?
Honestly, this is the sloppiest approach to cache management of styles, content, and CSS I have seen yet in web dev frameworks, as it still demands huge megabytes of script just to parse a simple news page with a few lines of text. Someone at Google didn't really think that framework through, lol. It's wasteful of bandwidth and processing in the browser, if you ask me, and overkill. It's typical of the over-engineering at these bloated vendors.
That is why I always argue for linked CSS and scripts. Less code and more content is why these frameworks were invented. Linked and cached code means the SIMPLER, OLDER models have worked better, using fast delivery of smaller markup pages that cache tiny kilobytes of linked ECMAScript and CSS libraries. It means less code is used to display content. The browser's relationship with the server is now so fast and efficient compared to years ago that the initial caching of these smaller linked files directly from the server (rather than giant inline pages of duplicate scripts yanked down via Web API in Angular on every page view) means linked resources are delivered much faster over the initial visit to a typical web domain.
It's only recently that the 'script kiddies' have forgotten all this and have started going backwards to a failed way of using local embedded and inline styles and scripts, which we stopped using 20 years ago for a reason. It is a very poor choice and shows inexperience with the web and its markup and content model on the part of many new developers today.
Stokely

Does jQuery make the browser slow?

I'm using jQuery in my web page. Will this make the web page load its contents slowly?
Also, is using jQuery an advantage over plain JavaScript or not, and why?
Inspect your page with Firebug and YSlow to see where the bottleneck actually is.
Generally speaking, JS does make your page load more slowly. However, on modern machines and with modern Internet connection speeds this delay is not even noticeable. That said, excessive use of JS can potentially make your page operate more slowly.
"I'm using jQuery in my web page, will this make the web page load its contents slowly?"
Depends on the situation, but usually no. If you are planning to use some JavaScript/JQuery to enhance your interface, build in a fading effect or two, have a lookup dropdown or some AJAX calls, don't worry too much. You are very likely to be fine.
Only if you have HUGE web pages (tens of thousands of elements), need to make multi-megabyte AJAX requests, or have a completely JavaScript-driven UI that tends to work slowly, is it time to look for optimizations.
Check out questions on jQuery and performance on SO to get information on specific situations (lots of selectors, performance comparisons between $() and document.getElementById, etc.); a quick sketch of that comparison follows.
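A rough comparison sketch (the element IDs and classes are hypothetical):

    // native API: a single fast ID lookup
    var el = document.getElementById("content");
    // jQuery ID selector: the same lookup underneath, plus wrapper-object overhead
    var $el = $("#content");
    // descendant selectors must walk the DOM, so they cost noticeably more
    var $items = $("div.list span.item");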
One thing to be careful with is jQuery (and, of course, other frameworks') plugins that apply manipulations to the whole document when the page loads. A very good example is the source code formatter here on SO. If you look closely, you will notice that when a page loads, there is a tiny pause where the source code is not yet formatted; the formatting is applied using JavaScript. If you use too much of this kind of thing, your page is likely to render slowly on older machines.
In general, if you're unsure, always test your pages not only on many browsers, but on older machines, too.
Use the Google Ajax libs hosted version and it'll probably be cached by the time someone gets to your page.
I've found that optimizing the jQuery JavaScript source with Google Closure Compiler has a noticeable effect on both the load time of the JavaScript and the overall response time. It's quite impressive.
There is only a slight overhead when loading the page by adding an additional resource such as a .js file.
For details of a specific loading procedure, use Google Webmaster Tools to get recommendations.
jQuery itself can slow down the browser (not the loading time of the page) when used without caution or on extremely crowded pages, but I wouldn't worry about that in most cases. The recent release of jQuery 1.4 has improved performance even further.
At 23KB minified and gzipped, it will not make a noticeable difference in the load speed of your pages (especially once it's cached in your browser). It will also not make a noticeable difference in the interactivity of your page.
You will save a lot of time not having to debug cross browser compatibility issues.
Yes, jQuery or any other file that loads in your page adds overhead.
For me that overhead is effectively "0", given the effort jQuery saves.
Try programming without jQuery to see my point of view.
jQuery has my vote for president!

Calculating JavaScript Library Load Times

I am trying to compare the performance of several JavaScript libraries. Although measuring transaction times helps to determine the working performance of a library, it doesn't account for the time required to download and initialize the individual libraries. I'm looking for suggestions for the best method of determining the load time, other than using tools such as Firebug, etc. I would like to be able to set up a controlled environment where a page could be loaded n times, while capturing the start and end times. Should the JavaScript library contents be included in the page rather than as an include file, or is there a better way?
Reading this article by John Resig on JavaScript benchmark quality before you start anything may help you out.
After that, I would suggest that you try requesting the JavaScript from your server and timing how long eval(responseJS); takes. That way, you are only timing how long the library takes to load, rather than that plus the time it takes to download from the server.
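A minimal sketch of that approach (the library path is hypothetical; the request is synchronous only because this is a controlled test rig):

    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/js/jquery.min.js", false);   // synchronous fetch, test rig only
    xhr.send();
    var start = new Date().getTime();
    eval(xhr.responseText);                        // measures parse + execution only
    var elapsed = new Date().getTime() - start;
    console.log("eval took " + elapsed + " ms");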
The libraries should always be in an external file, included via a script tag, either on their own or with the site's own scripting rolled in. Minified and packed files will be smaller attachments. Delivery via a CDN is optimal as well, as the CDN will have it cached; many of the popular frameworks are available over Google's CDN.
You must also account not only for the library, but for the application using the library. The quality of the JS in the libraries is (typically) top notch, but what about the quality of the code tapping into those libraries, or even the code of plugins which may not be developed by the library authors? You also have to look at which browser is being used. As much as we hate it, most of these cross-browser libraries are optimized for best performance in Internet Explorer, because it retains 85+% market share.
The performance of any library is really a trade-off: deciding what is acceptable in order to get your application to do whatever it is that you want it to do.

What are advantages of using google.load('jQuery', ...) vs direct inclusion of hosted script URL?

Google hosts some popular JavaScript libraries at:
http://code.google.com/apis/ajaxlibs/
According to Google:
The most powerful way to load the libraries is by using google.load() ...
What are the real advantages of using
google.load("jquery", "1.2.6")
vs.
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>
?
Aside from the benefit of Google being able to bundle multiple files together in one request, there is no perk to using google.load. In fact, if you know all the libraries that you want to use (say, just jQuery 1.2.6), you're possibly making the user's browser perform one unneeded HTTP connection. Since the whole point of using Google's hosting is to reduce bandwidth consumption and response time, the best decision, if you're just using one library, is to call that library directly.
Also, if your site will be using any SSL certificates, you want to plan for this by calling the script via Google's HTTPS connection. There's no downside to calling an https script from an http page, but calling an http script from an https page will cause more obscure debugging problems than you would want to think about.
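If you do load the script tag directly, one common way to sidestep the http/https mismatch is a protocol-relative URL, so the script is fetched over whichever scheme the page itself uses; a sketch:

    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>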
It allows you to dynamically load the libraries in your code, wherever you want.
Because it lets you switch directly to a new version of the library in the JavaScript, without forcing you to rebuild/change templates all across your site.
It lets Google change the URL (but they can't, since the URL method is already established).
In theory, if you do several google.load()s, Google can bundle them into one file, but I don't think that is implemented.
I find it's very useful for testing different libraries and different methods, particularly if you're not used to them and want to see their differences side by side, without having to download them. It appears that one of the primary reasons to do it is that it is asynchronous, versus the synchronous script call. You also get some neat stuff that is included directly in the Google loader, like client location: you can get the user's latitude and longitude from it. Not necessarily useful, but it may be helpful if you're planning to have targeted advertising or something of the like.
Not to mention that dynamic loading is always useful. Particularly to smooth out the initial site load. Keeping the initial "site load time" down to as little as possible is something every web designer is fighting an uphill battle on.
You might want to load a library only under special conditions.
Additionally, the google.load method can speed up the initial page display. Otherwise, if you include plain script tags in your HTML code, page rendering will freeze until the file has been loaded.
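A minimal sketch of that google.load pattern (the callback body is hypothetical):

    <script src="http://www.google.com/jsapi"></script>
    <script>
      google.load("jquery", "1.2.6");
      google.setOnLoadCallback(function () {
        // the loader fires this once the library is ready, so jQuery is safe to use here
        $("#status").text("jQuery loaded");
      });
    </script>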
Personally, I'm interested in whether there's a caching benefit for browsers that will already have loaded that library as well. Seems like if someone browses to google and loads the right jQuery lib and then browses to my site and loads the right jQuery lib... ...both might well use the same cached jQuery. That's just a speculative possibility, though.
Edit: Yep, at the very least when using the direct script tags to the location, the JavaScript library will be cached if someone has already called for the library from Google (e.g. if it was included by another site somewhere).
If you were to write a boatload of JavaScript that only used the library when a particular event happens, you could wait until the event happens to download the library, which avoids unnecessary HTTP requests for those who don't actually end up triggering the event. However, in the case of libraries like Prototype + Scriptaculous, which download over 300KB of JavaScript code, this isn't practical.
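A sketch of that on-demand pattern without the Google loader (the element ID and init function are hypothetical):

    function loadScript(url, callback) {
      // inject a script tag only when the feature is actually needed
      var s = document.createElement("script");
      s.src = url;
      s.onload = callback;  // simplified; older IE also needs onreadystatechange
      document.getElementsByTagName("head")[0].appendChild(s);
    }
    document.getElementById("mapButton").onclick = function () {
      loadScript("http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js", function () {
        initMapWidget();  // hypothetical initializer that depends on the library
      });
    };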
