A total newbie. This seems like a basic question, but one that I can't find the answer to:
I have a few lines of jQuery to show and hide a responsive menu on half a dozen pages. I understand I need to "link" to the jQuery library:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
Will this slow my site down? Will a visitor have to download the whole jQuery library for my two lines of code to work? If that's the case, I might revert to plain JavaScript.
Loading a script file is never free, so you have to decide whether it's worth it for the functionality provided.
Once you've decided to do it (if you do), the question is whether to use a CDN or host it locally.
Pros of a CDN:
It may already be in browser cache because the same file is used elsewhere.
CDNs use edge caching, highly-optimized servers and server configurations, etc. to make delivery as fast as possible. While it may be possible for you to have a server that's just as fast, it's quite likely that you don't.
Cons of a CDN:
If the CDN is down, the link doesn't work even though your site is up and running.
It's not under your control.
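A common way to hedge against that first point is a local fallback: try the CDN first, and if the global it defines is missing, write a script tag for a copy on your own server (the local path below is hypothetical):

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
    <script>
    // If the CDN failed, window.jQuery is undefined; load the local copy instead.
    window.jQuery || document.write('<script src="/js/jquery-3.3.1.min.js"><\/script>');
    </script>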
Generally, yes. There are certain conditions which can affect the download speed, such as content blocking, caching and async loading.
We can calculate how long it will take to download a file if we know the internet connection speed and the file size.
That minified jQuery CDN file is 84.8 KB.
And the average global internet connection speed is around 7.2 Mbps.
So...
84.8 KB × 8 ≈ 678 kilobits, and 678 ÷ 7,200 kilobits/s ≈ 0.094 s, so it will take roughly 94 ms to download a file of that size.
It depends on what you consider a site slowdown. Are a few milliseconds a huge slowdown?
You can speed up the process slightly by hosting the jQuery library on your own server (this eliminates the extra request to the Google CDN and saves time). Another way to speed things up is to use the slim, minified build of jQuery.
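For example, the slim build drops the ajax and effects modules, so it's a smaller download; check that the APIs you actually call are included before switching (the version below is just an example, served from the official jQuery CDN):

    <!-- Slim, minified build: no ajax or effects modules -->
    <script src="https://code.jquery.com/jquery-3.3.1.slim.min.js"></script>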
Now if you have only 2 lines of code, there may be no particular need to include jQuery just for that, but think about whether you would use jQuery for more things in the future (think long-term).
In the end, including just jQuery will not slow down your response times by much (a few milliseconds at most).
Hope this helps!
The CDN will not noticeably slow down your webpage, since you are fetching the minified file. Even if you host the file yourself, it will still take some time to load from storage. And since you are developing a web site/web application, your users will be connected to the network anyway. If you are not showing any jQuery-driven content when the page first loads, you can make the library load in the background by adding the async attribute to the script tag (see the sketch below).
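For example (a hedged sketch: the menu selectors are hypothetical, and the init function must be defined before the async script finishes, because its onload handler calls it):

    <script>
    // Hypothetical init; runs only once jQuery has loaded.
    function initMenu() {
        jQuery('.menu-toggle').on('click', function () {
            jQuery('.menu').toggle();
        });
    }
    </script>
    <!-- async: fetch jQuery in the background without blocking HTML parsing -->
    <script async
        src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"
        onload="initMenu()"></script>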
You can check how much time this file takes to load by going to (in Chrome)
Developer Console -> Network.
It will show a list of requests and the time each takes to finish. Different filters are also available; select All to see all network requests and their timings.
Related
I am trying to improve the performance of my web application.
It is a Java-based web application deployed on an Amazon cloud server with JBoss and Apache.
There is one page in the application that takes 13-14 seconds to open. There is so much functionality that 100+ HTTP requests are executed at page load time, and the JS & CSS files take too much time to load.
So I moved all of the JavaScript code from my JSP page into new JS files and minified the JS & CSS files. Still, there is not much difference in the page load time.
There is Dojo data on this page as well, which takes time to load.
Is there any other approach I should try to tune this page?
Can something be done at the JBoss or Apache level?
Apply caching for images and other static assets.
Use a CDN for any external libraries you use (like jQuery).
Use a loader like RequireJS to optimize your CSS and JS files. You can concatenate (merge multiple JS files into one) your code; this reduces the number of requests the browser makes when it sees a JS or CSS dependency. (As @Ken Franqueiro mentions in the comment section, Dojo already has a build mechanism for this.)
Optimize your images and use appropriate dimensions. Do not use a full-blown image if you just intend to show it in a 10x10 container. Use sprites if possible.
Show a message/loader to give the user some sense of progress. This will minimize user restlessness.
If some data takes too long to load, show the page first and load the data afterwards (see the sketch below). This, too, gives the user a sense of progress.
If the response is very big, you can compress your response data. Be careful, though, that the browsers your application supports can handle the compressed response by default, or add a custom mechanism to decompress the data.
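A minimal sketch of that deferred-data pattern with jQuery (the endpoint and element ID are hypothetical):

    <div id="report">Loading...</div>
    <script>
    // Render the page immediately; fetch the slow data once the DOM is ready.
    $(function () {
        $.get('/api/slow-report', function (html) { // hypothetical endpoint
            $('#report').html(html);                // replace the loader text
        });
    });
    </script>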
Use profiling tools like the Chrome Developer Tools or Firebug for Mozilla.
Take a snapshot of your network traffic and check where the bottleneck is.
In Chrome you press F12 and then select the Network tab.
I am developing a social media website of the scale of LinkedIn, and I am struggling with a few things related to JS. As the project goes on, the amount of JavaScript/jQuery keeps increasing. I am guessing that by the end of the project, the JS required on each page will average 1.5 MB in file size, which is a bad idea. Functions like image upload, comments, likes and updates appear on almost every page.
My question here is: how do I optimize the jQuery? Any tricks or concepts I can use on this project that will lower the file size while still supporting these functions (there are many more than just the ones above)?
Here I have already implemented:
1. Compress/minify the jQuery code with a PHP plugin
2. Use jQuery.getScript() to load a file only when it's required
3. Divide one large JS file into a few separate files so they finish loading faster
4. HTML optimization (Google PageSpeed)
I've heard of people using YUI Compressor, which 'has a better compression ratio' than most other tools. Not sure what you're using for compression at the moment, but this could provide a slight improvement. http://yui.github.io/yuicompressor/
Use a CDN (Content Delivery Network) to deliver jQuery. The browser will most probably cache the library, so each page that loads won't actually need to download it again. If I'm not mistaken, this is one of the main reasons CDNs are used.
Otherwise, you should rethink your dependency/development strategy, since you might not need jQuery on every page after all. At least not all of it.
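Building on point 2 of what you already do, a minimal sketch of loading a feature module only the first time it's needed (the file name, button ID and init function are all hypothetical):

    // Load the commenting module only when the user first opens the comment box.
    var commentsLoaded = false;
    $('#comment-button').on('click', function () {
        if (commentsLoaded) { return; }
        $.getScript('/js/comments.min.js', function () { // hypothetical file
            commentsLoaded = true;
            initComments(); // hypothetical entry point defined in comments.min.js
        });
    });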
I have noticed in Chrome that if I load an image as a base64 string and then scroll through that part of the page, it slows down.
I have also noticed that when I navigate out of a tab with my JavaScript in it and then move back to that tab, it is slow for a few seconds, as though V8 is recompiling the JS.
There are three options I can think of but I don't know which is best:
load a tiny loading page first and handle subsequent loading gracefully
load one huge js or css file with everything (jquery + my code + etc)
clump certain codes together (use jquery cdn but group my code together)
What is the best way to get your JS loaded as quickly and gracefully as possible?
Generally, loading more files incurs more overhead in HTTP than combining them into fewer files. There are ways to combine files for all kinds of content:
For images, use CSS sprites (see the sketch below).
For JavaScript, compile your client-side code and libraries into one file, and minify it to reduce size.
For CSS, you can do something similar. hem compiles Stylus into one CSS file, for example, and this can help organizationally as well.
Additionally, when you concatenate JavaScript and CSS, your web server or reverse proxy can send them in compressed form for faster page loads. This is more efficient for larger files, as there is more to gain from compression.
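For the sprites point, a minimal CSS sketch (the image path and offsets are made up): both icons come out of one image, so the browser makes one request instead of two.

    /* One image holds every icon; shift the background to show each one. */
    .icon        { width: 16px; height: 16px; background: url('/img/sprites.png') no-repeat; }
    .icon-home   { background-position: 0 0; }      /* first 16x16 tile */
    .icon-search { background-position: -16px 0; }  /* second 16x16 tile */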
There are way too many maybes for this to have any guaranteed solutions, but here you go:
1) load CSS at the top -- load it all there, if you're doing a site with multiple pages.
If you're building a one-page application (where you're running galleries and twitter feeds and articles, etc on the same page, and you can open and close different sections), then you can consider loading widget-specific CSS, at the time you're loading your widget (if it's not needed at startup).
Do NOT use @import in your CSS if you want it to load quickly (you do).
2) load the vast majority of your JS at the bottom of the page.
There is practically nothing that can't be lazy-loaded, or at least initialized at the bottom of the page after the DOM is ready. If something really must load early, serve it as a separate file at the top of the page, and consider how you might rewrite your code to depend on it less.
3) be careful with timers -- especially setInterval... ...you can get your page's performance into a lot of trouble with poorly-managed timers.
4) be even more careful with event handlers on things like window scroll, resize, mouse-move or key-down. These things fire many, many times a second, so if you've written fancy programs which depend on them, you need to rethink how you fire the program (ie: don't run it every time the handler fires; see the throttle sketch after this list).
5) serving JS files is a trade-off:
Compiling JS takes a while. So if you're loading 40,000 lines in one file, your browser is going to pause for a little while, as it compiles all of that.
If you serve 18 separate files, then you have to make 18 different server calls.
That's not cool, either.
So a good balance is to concatenate files together that you KNOW you're going to need for that page, and then lazy-load anything which is optional on the page (like a widget for adding a comment, or the lightbox widget, etc).
And either lazy-load them after all of the main products are up and running, OR load them at the last possible second (like when a user hits the "add comment" button).
If you need to have 40,000 lines loaded in your app, as soon as it starts, then take the hit, or decide what order you can load each one in, and provide "loading" indicators (which you should be doing on lazy-load always) for each widget until it's ready (loading the JS one at a time).
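For point 4, a common fix is to throttle the handler so the expensive work runs at most a few times per second. A minimal sketch in plain JS (updateStickyHeader is a hypothetical expensive handler):

    // Run fn at most once every `wait` milliseconds, however often the event fires.
    function throttle(fn, wait) {
        var last = 0;
        return function () {
            var now = Date.now();
            if (now - last >= wait) {
                last = now;
                fn.apply(this, arguments);
            }
        };
    }

    // scroll can fire dozens of times a second; this caps the work at ~10/s.
    window.addEventListener('scroll', throttle(function () {
        updateStickyHeader(); // hypothetical expensive handler
    }, 100));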
These are guidelines for getting around general performance issues.
Specifics are hard to answer even when you have the site directly in front of you.
Use the Chrome dev console for profiling information and network performance, and rendering performance, et cetera.
Well, there is a very popular concept called concatenation. The idea is to make as few HTTP requests to your server as possible, because each request means a new connection: a DNS lookup happens, then a handshake is negotiated, and only after a few more protocol steps does the server send the requested file as the response.
You can check http-archive for a list of performance best-practices.
So yeah, you should combine all JS files into one (there are certain exceptions, like js at head and js in footer)
This answers your question title and points 2 & 3.
As for the other part, I am not clear on the scenario you are describing.
I recently had the same problem, and then I developed and released a JS library (MIT licence) to do this. Basically, you can put all your stuff (js, images, css ...) into a standard tar archive (which you can create server side), and the library reads it and allows you to easily use the files.
You'll find it here : https://github.com/sebcap26/FileLoader.js
It works with all recent browsers and IE >= 10.
The number of files to load has an impact on the load speed of the whole site. I would recommend packing all the functionality the website needs to display properly into a single JavaScript file.
I have a couple of questions that are somewhat related so I'm posting them all on a single question on SO...
Question 1:
I'm currently doing this Facebook application where I'm using jQuery UI Tabs. There are only 4 tabs, 2 of which are loaded through Ajax. The main page is index.html; this is where the tabs code is placed, and for the 2 tabs loaded through Ajax I have two different files, tab1.html and tab2.html.
Currently, the jQuery tabs initialization and the Facebook JavaScript initialization are done in index.html. Both tab1.html and tab2.html have JavaScript code that belongs to those pages. For instance, tab2.html has a form and some JS (with jQuery) code to validate it; this code is irrelevant to tab1.html, just as the JS code on tab1.html is irrelevant to tab2.html.
My question is: should I keep doing this, or should I aggregate all the JS/jQuery code from index.html, tab1.html and tab2.html in a single global.js file and include that in index.html?
I thought of doing this, but then irrelevant code is loaded if the user never opens tab1 or tab2. The benefit of a single global.js file is that I could pack/minify it, which I couldn't do if I included each code block in its respective tabX.html file.
Question 2:
As I'm using jQuery, I'm also using lots of plugins (actually only 3 for now, but that number can grow). Some of them provide a minified JS file, and I use those when available; when they are not, I use the normal versions, of course.
There's also the requests problem. If I have lots of plugins, say 10, that means 10 requests for those plugins. And some plugins are used in tab1.html but not in tab2.html, and vice versa.
How should I load all the plugins, minified/packed, in a single web request? Should I do that manually before publishing my app (packing and merging them into a single file), or could I use the PHP version of Dean Edwards's Packer to pack/merge all plugins on the fly? Would that be a good approach?
Question 3:
If the answer on Q1 was something like "merge all code in a single global.js file", should I include the global.js file in the packing/merging script I described above on Q2?
Doing this would simplify everything. I could keep my development environment properly organized, with all the .js files for the plugins and global.js in their appropriate folders, without bothering with anything else. The packing/merging step should take care of the rest (pull the files from the respective folders, send the respective JS headers and output one single packed .js file).
The one thing that confuses me the most is that not all plugins are used on every tab, and not all code is for every tab either. Still, a chunk of the code is global to every tab and to the index. Merging would simplify everything: a) I wouldn't have to worry about adding the needed code to each tabX.html file and could simply treat them as HTML templates and nothing else; b) I wouldn't have to bother with including the necessary plugins where I need them. Currently I use $.getScript() from jQuery to load the plugins when, and only when, I need them, but I'm not sure this is a good approach and the code feels dirty and ugly like this.
Question 1:
Pack them all into a single .js file. This will make maintenance easier, and the tiny bit of overhead for the user loading a little JS they may potentially never use does not matter. I would also let Google load the jQuery library for you, and then have all of your JS code in a single separate file.
Question 2:
As these plugins don't really change I would manually combine them. Closure Compiler is good at this. When minifying use the highest setting that does not give any warnings.
Question 3:
Yes you will want to minify the global.js
When the browser downloads global.js, it's cached for an amount of time, so when a different page references the same global.js, it's not re-downloaded; the browser uses its local copy first. You do a little more work on the initial download, but from then on it should be quicker.
Generally best practices related to javascript for speeding up website loads are:
Minify all javascript and put all of it into a single file (make as much of your javascript external as possible).
Put javascript at the bottom of the document.
Force the web server to assign an expiration date in the future, and use a timestamped query string to invalidate old versions of JavaScript files; this prevents unnecessary requests for your JavaScript if it has not changed (sketched below). For example, in httpd.conf: ExpiresByType application/x-javascript "access plus 1 year", and in your document: <script type="text/javascript" src="/allmy.js?v=1285877202"></script>
Configure your web server to gzip all text files.
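A hedged sketch of those last two points as Apache directives (assumes mod_expires and mod_deflate are enabled; adjust the MIME types to your setup):

    # Far-future expiry for static JS/CSS; bust the cache with ?v= timestamps
    ExpiresActive On
    ExpiresByType application/x-javascript "access plus 1 year"
    ExpiresByType text/css "access plus 1 year"

    # Compress text responses on the fly
    AddOutputFilterByType DEFLATE text/html text/css application/x-javascript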
The main reason why you should keep too much JavaScript away from the tab pages is that it kills the user experience: when a user clicks a tab for the first time, it grabs all the components it needs on the fly, which makes it feel sluggish.
Your question is only semi-specific, as we don't know a lot of things about your site, such as exact file sizes or how the modules are really used.
The general idea would be to find balance between modularity and speed.
When you're combining modules together these are the general ideas you should consider:
how often does this module change?
how often is this module used?
how big is this module (filesize)?
Then take the most used, most stable code and merge it into one file, and include the rest of the page-specific functionality on the tab pages.
Also, make sure to load javascript asynchronously as it won't block rendering of the page (and tabs).
Another combined answer:
If adding all the JS together in packed/minified form generates no more than 30 KB of file size, you're better off combining it. A single extra connection for a file (assuming it's not cached) is worth 10-20 KB of extra JS download. This has to do with browsers opening and closing connections vs streaming an extra 20 KB over an established connection. The threshold also depends on your user distribution: if you have a lot of dial-up or low-bandwidth users, your threshold will be smaller.
I typically recommend combining and loading as 1 file unless the library is very obscure and requires a very edge case for it to be triggered on a page. Ex: Hover triggers functionality Y but it's on a feedback widget that gets less than 1% of traffic- don't bother combining.
Minifying and packing are a little overrated these days. With the vast majority of browsers supporting gzip, the compression gzip applies to the file over the wire has virtually the same effect as minifying/packing, though there is a small cost on the browser to unpack it. Having said that, it's still good practice to minify/pack the code, since not all browsers support gzip, you may not want the file to be gzip-enabled, etc.
I've used online packers against 3rd-party modules and it works fairly well. However, there are times when it can cause an issue, so make sure to test your manually packed version before deploying.
Alternate:
If you feel that your users will rest on your index page for longer than 10 seconds, you could pre-load the additional libraries separately using the JS Loader Prototype pattern.
Steve Souders's Even Faster Websites is a book you should look into.
Firstly, one experiences slowdowns because whenever an external script is linked, the browser waits for the script to download, parse and execute; only after this does it resume processing the rest of the page. To avoid such slowdowns, look at downloading the scripts in parallel. A few techniques: Ajax the scripts in if they are on the same domain, or use a Script DOM element or a script in an iframe if they are on external domains.
Q1: For me, modularising all the content is the better option with respect to further development, if the page content has to change constantly. Responsiveness is very important for the end user. A small global.js will help in getting the app up and running; in parallel, the tabX.html files can be downloaded.
Q2: As the jQuery plugins rarely change, the plugins for the tabX.html pages can be downloaded in parallel and cached locally, so that when a tabX.html is loaded the required plugins need not be fetched again. So all the plugins required by the main page should be in one single file, and the ones used by the tabX.html pages should be in different files.
Q3: It's a personal choice here. Do you want it to be developer-friendly or user-friendly? I bank on user-friendliness. Making responsive and efficient apps is our job! The main advantage of packing everything into a single file is ease of development. Well, ugly code begets beautiful apps :). Users are speed-aholics. For example, when Google changed from 10 results per page to 20, they saw a considerable drop in search queries. So my opinion is not to pack all of them into one, and to load each in parallel.
Some of the techniques, with links for testing each:
XHR eval /ajax : http://stevesouders.com/cuzillion/?ex=10009
XHR Injection : http://stevesouders.com/cuzillion/?ex=10015
Script in Iframe : http://stevesouders.com/cuzillion/?ex=10012
Script DOM element : http://stevesouders.com/cuzillion/?ex=10010
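A minimal sketch of the Script DOM element technique from that list (the file name and init function are hypothetical): the script downloads in parallel and doesn't block rendering the way a plain script tag in the head does.

    // Create a script element and append it to the head; the browser fetches
    // it in parallel without blocking the HTML parser.
    var script = document.createElement('script');
    script.src = '/js/widgets.min.js';   // hypothetical file
    script.onload = function () {
        initWidgets();                   // hypothetical entry point in that file
    };
    document.getElementsByTagName('head')[0].appendChild(script);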
Question 1:
The best practice would be to place all JS files in a single "global" file. This minimizes your HTTP requests: say you have 5 plug-ins, that would mean 5 requests, whereas if you combine them into one, you only need to make one request. This might be a little heavy on the first load, but from then on the file will be cached by the browser, so no worries about the size. HOWEVER, be careful about the sequence of the scripts when combining them (e.g., the jQuery script should be placed in the combined JS file before jQuery UI's).
http://articles.sitepoint.com/article/web-site-optimization-steps/4
http://code.google.com/speed/page-speed/docs/rtt.html
Question 2:
You can do it manually or automatically. Dean Edwards's Packer is a good choice. If you're using ASP.NET, you can check MB Compression Handler; if you're using Apache with PHP, perhaps you can change your htaccess configuration to gzip it.
Question 3:
It'd be better if you pack the "global" JavaScript file as well. This saves bandwidth and load time. You get the point: combining all the JS files the site needs saves you the time of including individual scripts.
I am looking for the best way to speed up the load time of my js.
The problem is that I am working with a very large site that uses the jQuery framework, and because the site also loads Facebook Connect, AddThis sharing, Google Analytics and another tracking code, jQuery is delayed a few seconds, certain elements like the calendar just appear late, and my users are complaining that things take too long.
I did a test in Google Chrome and the average load time is 4s, which is too much.
I am already doing minification, and jQuery UI/jQuery is being loaded from Google. What's the best way to approach this?
Making fewer HTTP calls by combining images, scripts and CSS, and using a Content Delivery Network for your static images and CSS, might help!
You are not likely to be able to do much more about the load time of the external scripts, so what you can do is to change the order that things happen in the page so that the external scripts are loaded after you have initialised the page.
Scripts are loaded and executed in a serial manner, so if you change their order in the source code, you also change the order they are loaded.
Instead of using the ready event in jQuery, you can put your initialising code inline in the page, after all the content but before the closing body tag. That way the elements you want to access are loaded by the time the script runs, and you can put the external scripts below the initialising code to make them load after (see the sketch below).
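A sketch of that ordering (the init code and tracker URLs are just illustrative):

    <body>
        <!-- ... page content ... -->

        <!-- Inline init: the elements above already exist, no ready event needed -->
        <script>
        document.getElementById('calendar').className = 'loaded'; // hypothetical init
        </script>

        <!-- External trackers load last, so they can't delay your own init -->
        <script src="https://connect.facebook.net/en_US/all.js"></script>
        <script src="https://www.google-analytics.com/ga.js"></script>
    </body>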
Small technical changes (such as serving the JSs from Google, minifying, etc..) will only get you so far.
It seems you simply have lots of dynamic stuff going on in your page. Have you thought of an asynchronous way of building your content? One option might be to place placeholders instead of the external content and load it asynchronously, so that when all the scripts are loaded and ready, all you need to do is throw the markup into the placeholders.
This will create a better user experience, instead of the user waiting 10 seconds for the entire page, it will start loading incrementally after 2 seconds (and still fully load after 10).
In addition to Yuval's answer some options that might or might not bring you a speed gain:
the load time of external libraries is something beyond your control. Try to include them as late as possible, or better still, dynamically after the page has loaded. This way your page won't stall if Google Analytics or Facebook has another hiccup.
It is not necessarily faster to load jQuery from Google. Consider putting jQuery, jQuery UI and as much of your own JS as reasonable in a single file, minify and gzip it, and let the server serve the gzipped version where possible. Note that the gain in speed depends largely on what your users cache and where they cache it: if they already have jQuery from Google in their cache, this technique might make the page load slower.
The bottomline is, that after some optimization you're out for experimenting. You must find out, what your average user has in her cache, if the page is accessed directly via deep links or if you can smuggle some JS or CSS (or even images) into her cache via a previous "landing page".
Make sure you deliver your content in gzip/deflate compressed form. Combine multiple JavaScript files into one file, which helps to reduce the number of HTTP requests.
P.S. Here's a test tool to check if compression is configured:
http://www.gidnetwork.com/tools/gzip-test.php