I'm using jQuery in my web page. Will this make the page load its content slowly?
Is using jQuery an advantage over plain JavaScript or not, and why?
Inspect your page with Firebug and YSlow to see where the bottleneck actually is.
Generally speaking, JS does make your page load slower. However, on modern machines and with modern Internet connection speeds the delay is not even noticeable. That said, excessive use of JS can make your page operate slower.
I'm using jQuery in my web page, will this make the web page load its contents slowly?
Depends on the situation, but usually no. If you are planning to use some JavaScript/JQuery to enhance your interface, build in a fading effect or two, have a lookup dropdown or some AJAX calls, don't worry too much. You are very likely to be fine.
It's only time to look for optimizations if you have HUGE web pages (tens of thousands of elements), need to make multi-megabyte AJAX requests, or have a completely JavaScript-driven UI that tends to work slowly.
Check out questions on JQuery and performance on SO to get information on specific situations (lots of selectors, performance comparisons between $() and document.getElementById, etc.)
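To get a feel for the difference yourself, a quick micro-benchmark along these lines works (the element id "content" is just an example):

var i, el;
console.time('jQuery $("#content")');
for (i = 0; i < 10000; i++) { el = $('#content'); }
console.timeEnd('jQuery $("#content")');
console.time('document.getElementById');
for (i = 0; i < 10000; i++) { el = document.getElementById('content'); }
console.timeEnd('document.getElementById');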
One thing to be careful with is jQuery (and other frameworks') plugins that apply manipulations to the whole document when the page loads. A very good example is the source code formatter here on SO. If you look closely, you will notice a tiny pause when a page loads during which the source code is not yet formatted; the formatting is applied with JavaScript. If you use too much of this kind of thing, your page is likely to render slowly on older machines.
In general, if you're unsure, always test your pages not only on many browsers, but on older machines, too.
Use the Google Ajax libs hosted version and it'll probably be cached by the time someone gets to your page.
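For example (the version number is just illustrative - pick whatever your site needs):

<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>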
I've found that optimizing the jQuery JavaScript source with Google Closure Compiler has a noticeable effect on both the load time of the JavaScript and the overall response time. It's quite impressive.
Adding an additional resource such as a .js file only adds a slight overhead when loading the page.
For details on your specific page's loading behaviour, use Google Webmaster Tools to get recommendations.
jQuery itself can slow down the browser (not the loading time of the page) when used without caution or on extremely crowded pages, but I wouldn't worry about that in most cases. The recent release of jQuery 1.4 has improved performance even further.
At 23KB minified and gzipped, it will not make a noticeable difference in the load speed of your pages (especially once it's cached in your browser). It will also not make a noticeable difference in the interactivity of your page.
You will save a lot of time not having to debug cross browser compatibility issues.
Yes, jQuery, like any other file that loads in your page, adds some overhead.
For me that overhead is effectively "0", given the effort jQuery saves.
Try programming without jQuery to see my point of view.
jQuery has my vote for president!
Web browsers want to make the web faster. I know Google has its hosted libraries. But why not integrate them into the browser directly?
The problem nowadays is that when you navigate from one page that uses jQuery to another that also uses it but loads it from a different URL, the same library is cached separately per resource URL, so it is downloaded again. Loading takes longer while navigating between pages that use the same libraries.
Couldn't they make something that stores the best-known libraries in the browser, so that when a page loads jQuery or jquery.min the browser looks for it locally first?
Pros
-Faster navigation on the web.
-One HTTP request fewer whenever the browser finds the library locally.
Cons
One problem that can occur with this is versioning. Since most files are just named jquery.min.js, we can't match them by name alone; on the other hand, some URLs include the version, like /1.11.0/jquery.min.js, so the browser could try to work out the version from the URL. If the browser couldn't determine the version, it would simply load the file as usual.
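A rough sketch of that version-guessing idea (the URL patterns are only examples, not a spec):

function guessVersion(url) {
    // look for a semantic version anywhere in the URL,
    // e.g. /1.11.0/jquery.min.js or jquery-1.11.0.min.js
    var match = url.match(/(\d+\.\d+\.\d+)/);
    return match ? match[1] : null; // null = version unknown, fetch normally
}
guessVersion('//ajax.googleapis.com/ajax/libs/jquery/1.11.0/jquery.min.js'); // "1.11.0"
guessVersion('/js/jquery.min.js'); // null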
What do you think? Any suggestions on how this could work? Any other cons?
Edit 1: I'm aware of CDNs. I'm only suggesting something slightly faster than a CDN, saving one HTTP request in the process.
This problem can be avoided by using commonly used CDNs, as you mentioned.
http://cdnjs.com/
However, I think integrating them into the browser could introduce a real versioning problem. Just think of how long it used to be between versions of IE. If you had to wait that long to get new versions of libraries, it would be a disaster.
Also you would have to download a large variety of libraries to have your bases covered.
Downloading libraries is typically not very slow; it's the time to parse and execute them that takes longer on mobile.
Here is a great post about this topic
http://flippinawesome.org/2014/03/10/is-jquery-too-big-for-mobile/
To combine all modules into a single resource, we wrote each module into a separate script tag and hid the code inside a comment block (/* */). When the resource first loads, none of the code is parsed since it is commented out. To load a module, find the DOM element for the corresponding script tag, strip out the comment block, and eval() the code....
On an iPhone 2.2 device, 200k of JavaScript held within a block comment adds 240ms during page load, whereas 200k of JavaScript that is parsed during page load added 2600 ms. That's more than a 10x reduction in startup latency by eliminating 200k of unneeded JavaScript during page load!
http://googlecode.blogspot.co.uk/2009/09/gmail-for-mobile-html5-series-reducing.html
https://developers.google.com/speed/docs/best-practices/mobile
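A minimal sketch of the comment-block trick the quote describes (ids and the module itself are made up):

<script type="text/javascript" id="module-editor">
/*
function initEditor() {
    // module code goes here - downloaded at page load, but never parsed
}
*/
</script>

function loadModule(id) {
    var src = document.getElementById(id).innerHTML;
    // strip the comment markers, then parse and run the code;
    // indirect eval so the module lands in global scope
    (0, eval)(src.replace(/^\s*\/\*/, '').replace(/\*\/\s*$/, ''));
}
loadModule('module-editor'); // pays the parse cost only now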
The Gmail article is more than three years old, and there have been great advances in mobile performance since then, namely things like iOS's Nitro and JIT compilation coming to mobile. Are there still performance gains to be had from using eval?
It's not the same technology issue as it was before, since JavaScript engines have become so performant. Rather, there are other considerations, in terms of being more app-like.
There are tricks now that are different in approach, such as using web workers for AJAX requests to free up the main thread, utilizing the GPU with CSS transformations and requestAnimationFrame, or even asm.js. Using localStorage/sessionStorage and the Application Cache is another approach along those lines: you can get a lot of client-side caching up front, avoid fetching anything beyond the content JSON / image data URLs / videos, and load/execute things into memory as needed from those caches.
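For example, a hedged sketch of the localStorage idea (the URL and cache key are made up; versioning and error handling omitted):

function loadScriptCached(url, key) {
    var cached = localStorage.getItem(key);
    if (cached) {
        window.eval(cached); // run from the local cache, no network request
        return;
    }
    $.get(url, function (source) {
        localStorage.setItem(key, source); // cache for next time
        window.eval(source);
    }, 'text');
}
loadScriptCached('/js/app-module.js', 'app-module-v1');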
It's a different time, in other words, and your question is interesting but not focused on the areas where you can really make a difference in web-app performance.
I have been using JavaScript for a while now and recently began using jQuery, which I will admit I am a fan of.
<script type='text/javascript' src='../locationOfJquery/jquery.js'></script> allows use of the library in script tags on that page. What I want to know is whether just including the script tag slows down page load time at all, even if there is no jQuery code on the page, and also whether there are any other major downsides to using jQuery.
Put the script tags at the bottom of the page. That way they will not block processing of the DOM while they download and run.
Use the minified version of jQuery, which is about as small as a small image/icon.
If visitors visit more than one page in your site, it will also usually be cached after their first visit. It may also already be pre-cached (or served from a more-local server) if you use a content delivery network (e.g. Google's). Good first impressions are critical.
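Putting those tips together, the usual pattern looks something like this (URLs and version are placeholders):

<body>
    <!-- page content first, so it renders before any script downloads -->
    ...
    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
    <script src="/js/site.js"></script>
</body>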
To further answer smaller questions you had:
If there is no jQuery code on the page, jQuery must still be parsed. You can see how long it takes your computer to parse jQuery by using a profiling tool such as Chrome's.
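You can also get a rough number without a profiler by timing the script tag directly (this measures fetch + parse + execute, so reload with a warm cache to isolate the parse cost):

<script>var t0 = new Date().getTime();</script>
<script src="/js/jquery.min.js"></script>
<script>console.log('jQuery took ' + (new Date().getTime() - t0) + ' ms');</script>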
There are frameworks which optimize your JavaScript on a per-page basis, but those have to trade off the ability to cache a script against the gains in faster parsing. You almost certainly shouldn't worry about them. jQuery is very lightweight compared to other frameworks.
Numbers:
For example, on Chrome, loading the Stack Overflow website and requesting the jQuery library from the Google CDN, the results were:
0.027 ms aggregate time spent downloading jQuery (perhaps cached)
35.992 ms aggregate time spent evaluating jQuery and performing any default DOM/CSS operations
This is all relative of course. I bet when you loaded this page you did not notice any lag because the entire page took about 630ms to load.
The client will have to download the jQuery script (which is quite small). To optimize further, you can use the hosted "Content Delivery Network" versions from Google or Microsoft. Also remember to use the minified version, which downloads faster.
This article states the reasons why.
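A common refinement is to load from the CDN but fall back to a local copy if it is unreachable (paths are examples):

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
<script>
    // if the CDN failed, window.jQuery is undefined - load the local copy
    window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>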
You shouldn't include it if you are not using it.
32k is a small price to pay, but it's better still to have no extra request and 0k extra to download.
Also, and more importantly, you may run into conflicts with other frameworks if you are using any.
I make heavy use of the excellent jTemplates plugin for a small web app.
Currently I load all of the templates into the DOM on the initial page load.
Over time, as the app has grown, I have accumulated more and more templates - currently about 100kb worth.
Because my app is entirely AJAX-based, there is never a need to refresh the page after the initial load. There is a couple-second delay at the beginning while all of the templates load into the DOM, but after that the app behaves very responsively.
I am wondering: In this situation, is there any significant advantage to using jTemplates processTemplateURL method to lazy load templates as needed, as opposed to just bulk loading all of the templates on the initial page load?
(I don't mind the extra 2 or 3 seconds the initial page load takes - so I guess I am wondering -- besides the initial page load delay, is there any reason not to load a large amount of html template data into the DOM? Does having a larger amount of data in the DOM affect performance in any way?)
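For concreteness, the two approaches look roughly like this - a sketch based on my reading of the jTemplates API, with made-up template names and data:

// bulk: the template is already in the DOM from the initial page load
$('#output').setTemplateElement('tpl-order-list');
$('#output').processTemplate(orderData);

// lazy: fetch the template over AJAX only when it is first needed
$('#output').processTemplateURL('/templates/order-list.tpl', orderData);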
Thanks (in advance) for your help.
According to Yahoo's Best Practices for Speeding Up Your Web Site article, they recommend not having more than 500-700 elements in the DOM.
The number of DOM elements is easy to test, just type in Firebug's console:
document.getElementsByTagName('*').length
Read more: http://developer.yahoo.com/performance/rules.html
Think of a jar that contains 100 marbles, 10 of which are red. It is easy to spot and pick the 10 red marbles out of 100, but if the jar contained 1000 marbles, it would take more time to find them. It's the same with DOM elements: the more you have, the slower your selections will be, and that will affect performance.
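The practical upshot for jQuery (the container id is made up) is to give the engine a smaller jar to search:

$('.red');                       // searches every element in the document
$('#marble-jar').find('.red');   // searches only inside one container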
You really should optimize your DOM in order to save memory and enhance speed. However, the key is to avoid premature optimizations.
What is your target platform? What browser(s) are your users most likely to be using?
For example, if you are targeting primarily desktop PCs and your users are running modern browsers, then you should probably prefer clarity and simplicity in your code.
If you are targeting desktop PCs but must support IE6, say, then having too many DOM elements will impact your performance and you should think along the lines of optimization.
However, if you are targeting modern browsers but areas with poor bandwidth (e.g. on a cruise ship, etc.), then bandwidth considerations may outweigh your DOM considerations.
If you are targeting iPhones, iPads, etc., then memory is a scarce resource (as is CPU), so you definitely should optimize the DOM. In addition, on mobile devices you're going to give more weight to optimizing the AJAX payload, due to bandwidth issues, than to anything else, and yet more weight to reducing the number of AJAX calls than to saving on DOM elements. For example, you may want to load more DOM elements up front in order to reduce the number of AJAX calls - again, only you can decide on the right balance.
So the answer really is: it depends. In a fast-connection, modern-browser environment there is no real need to optimize prematurely unless your DOM gets really huge. In a slow-connection or mobile environment, weight bandwidth optimizations above DOM optimizations, but do optimize the DOM node count as well.
Having a whole bunch of extra DOM elements will not only affect your initial load time, it will also affect more or less everything on the page. JavaScript DOM queries will run slower, inserts will run slower, CSS will apply slower, and so on: since the entire tag soup is parsed into a DOM tree by the browser, any traversal of that tree is going to be affected. To measure how much it will slow your page down, you can use a tool like dynaTrace AJAX and run it once with your current code and once with no templates. If you notice a large difference between those two runs in terms of JavaScript execution or rendering, you should probably lazy load (though 100kb is not that much, so you might not see a significant difference).
I'm trying to wrap my head around jQuery Mobile. My aim is to build a very fast application with a look and feel as close as possible to a native app (at least for modern devices).
I understand there are two ways of navigating between pages:
Loading each page as a separate page and linking to other pages with regular html anchors.
Putting all (or many) pages in one single web page and navigating between them by means of JavaScript ($.mobile.changePage and similar API functions).
The first approach should work on all browsers, but performs quite poorly since there is a delay between each page transition.
The second looks like it should be much faster, so I would definitely prefer this approach. But how would that work for mobile device browsers without JavaScript support? It certainly seems to violate jQuery Mobile's aim of providing a gracefully degraded experience for C-grade browsers.
It looks to me like I need to implement my app twice, once optimized for browsers with JavaScript support, once for browsers without. Using <noscript> may be another option, but that looks even messier.
What's the recommended way to approach this dilemma? Is there anything I have not noticed?
Thanks,
Adrian
First of all: your point 2 is wrong.
See "Local, internal linked pages" in here and read it carefully. A link with href="#pageelementid" will work just fine AND will work in any HTML4-capable browser too (it might require <a name="pageelementid"> in some cases, I'm not sure anymore), with the only difference being that all the pages are visible at once.
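A minimal sketch of that internal-page markup (ids and text are examples):

<div data-role="page" id="home">
    <div data-role="content">
        <a href="#about">Go to About</a>
    </div>
</div>
<div data-role="page" id="about">
    <div data-role="content">About us...</div>
</div>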
Second: if you use approach number 1, it will look quite nice too. It does load, yes, but in JavaScript-enabled browsers it's loaded with AJAX, so there's no nasty blink between pages - and a "loading" popup shows up as well.
jQuery Mobile is supposed to let you create an application with pure and simple HTML, without any JS. JQM itself takes on the enhancement of the page so that it looks good and uses AJAX. Try to create an application that would work in every browser possible (my inspiration: lynx) and use JQM markup for it. Any JavaScript you are willing to write should work as an enhancement - making the app better, rather than making it work at all.
good luck with that!
The current thinking on supporting lesser browsers is not to degrade gracefully, but to progressively enhance. If you build the website from the ground up to work without JavaScript and then enhance it afterwards, you pretty much know the site will work (rather than having to fix it or build a secondary site).
As regards the two options you've specified, number one would be my preference as a mobile user: I may have limited bandwidth, and a lot of people have a restricted download allowance per month.
Lumping all the pages into one large file may seem like a good idea (everything is downloaded already), but you may well run into memory limitations on certain phones. And what if all a visitor wants to do is visit two pages - why should they be forced to download the entire website to do so?