Do multiple JavaScript sections in different places affect page loading time?
For example:
<html><script></script><body><script></script></body><script></script></html>
Also, does it make any difference whether you reference external files or write small inline script blocks in the HTML?
And lastly, is it good or bad to run a lot of code on document load? Will evaluating all those scripts on load affect page load time?
Obviously, loading many different external JavaScript files will increase the final load time.
You can place all the <script> tags at the end of your document, right before the closing </body> tag, so that the page content loads before the scripts. If possible, minify and combine your JS files to decrease the number of HTTP requests as well.
Page rendering will start as soon as all CSS is loaded and the DOM has been parsed, so this will make it seem like it loads faster.
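A minimal sketch of that layout (the file names are placeholders):
<html>
<head>
    <link rel="stylesheet" href="styles.css">
</head>
<body>
    <!-- page content renders first -->
    <p>...</p>
    <!-- combined, minified script loads last -->
    <script src="combined.min.js"></script>
</body>
</html>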
Try following some recommendations, like the Best Practices for Speeding Up Your Web Site, if you are concerned about page speed.
Having JavaScript snippets inline in your HTML is not a good idea. Keep your HTML clean, and use JavaScript with the appropriate selectors if you want to alter the DOM by adding or removing elements.
Best practice is to load and run JavaScript at the bottom of the page (in the footer).
This way, all the other assets of the page (HTML, CSS, images) are loaded first.
Only then is the JavaScript code loaded and executed.
Regarding speed, the best approach is to put all your JS code into one .js file (if possible) and minify it.
This way you cut loading time and HTTP requests, giving your site a speed boost.
I have multiple external JS files from the same provider that I insert into the website to embed Instagram (it creates a gallery for each insert). Unfortunately, this JS significantly hurts Google PageSpeed: I am talking about a reduction of 20-23 points on mobile and desktop.
Unfortunately, this product doesn't offer any other way to embed; it is a normal <script> tag. I have tried async and defer, but to no avail, as Google is clever and considers all the scripts on the page when calculating Time to Interactive, which is delayed. Obviously the SEO is taking a hit as the website's performance suffers.
The question: is there a way I can load all these scripts without influencing DCL (DOMContentLoaded) and TTI? Any help is appreciated. It is sufficient that these scripts are loaded when the DOM element is in view.
Since you can't split them up into several chunks, I would suggest you load these files dynamically, on demand (when they're needed).
This answer shows how to load a script dynamically.
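For the "load when in view" requirement, a minimal sketch using IntersectionObserver (the selector and script URL are placeholders for whatever your provider gives you):
var observer = new IntersectionObserver(function (entries, obs) {
    entries.forEach(function (entry) {
        if (!entry.isIntersecting) return;
        var script = document.createElement('script');
        script.src = 'https://provider.example/embed.js'; // placeholder URL
        script.async = true;
        document.head.appendChild(script);
        obs.unobserve(entry.target); // inject the script only once
    });
});
observer.observe(document.querySelector('#instagram-gallery')); // placeholder selector
Since the script is only injected when the gallery scrolls into view, it no longer competes with the initial page load; how much DCL and TTI actually improve depends on your page, so measure it.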
I'm writing my first HTML5 + jquery.mobile web app.
The app is basically a menu which redirects to internal pages (data-role="page") defined in the same index.html. I don't write the pages as external files, to avoid reloading and rewriting the substantially identical <head>: I suppose it's faster to jump to an internal anchor than to load a new page...
Now, I have a page which needs some specific jquery plugins and some specific css. No other page needs these plugins or css.
Of course I could load these js/css in the main <head> section, but this approach would slow the first page loading, uselessly.
I could solve the problem with CSS, with:
$('head:first').append('<link rel="stylesheet" type="text/css" href="' + file + '" />');
I could even solve the problem with JS, but only for 'standard' JavaScript, with something like:
<script>
$(document).ready(function() {
    $('#page-availability').live('pageinit', function () {
        $.getScript("js/jqm-datebox.core.js");
        $.getScript("js/jqm-datebox.mode.calbox.js");
        $.getScript("js/jquery.mobile.datebox.i18n.en.utf8.js");
        $('#datepicker').data({
            "mode": "calbox",
            ...
        });
        ...
    });
    ...
});
</script>
Unfortunately, this approach doesn't seem to work with jQuery plugins (Firebug croaks: "TypeError: a.mobile.datebox is undefined"): it looks like they are not evaluated, even though they are there, before the end of the <head> section, when viewing the "Generated Source".
I'm using Firefox (15) to debug, but I suppose this isn't the point...
Any hint?
The one page approach can be good for mobile if:
You don't have to load too much extra content in order to support all the content the user might show from that one page.
You don't have to load too much code to support all the behaviors.
The typical user actually does go to several different virtual pages so the scheme saves them load time and makes things quicker on subsequent virtual page loads.
Done well, the user gets OK performance on loading the first page and very quick performance when going to the other "embedded" pages that don't have to load new content over the network.
The one page approach is not so good if:
The initial load time is just more than it's worth because of the volume of stuff that must be loaded.
You have to dynamically load content for the sub-pages anyway.
You have SEO issues because the search engine can't really find/properly index all your virtual pages.
So, in the end, it's a real tradeoff and depends very much on how big things are, how many things you're loading and what the actual performance comes out to be. A compact mobile site can serve server-loaded page views from one page to the next pretty quickly if the pages are kept very lightweight and there are very few requests that must be satisfied for each page.
In general, you want to pursue these types of optimizations:
Compress/minify all javascript.
Reduce the number of separate items that must be loaded (stylesheets, javascript files, images).
Reduce the number of sequential things that must be loaded (load one, wait for it to load, load another). Mobile is bad at round-trips and loading lots of resources. It's OK at loading a few things.
Make it easy for the browser to cache javascript files. Use a few common javascript files that each serve the needs of many pages. Loading a little more at the start and then allowing the javascript file to be loaded from cache on all future pages loads is way, way better if the user will be visiting many successive pages on your site. The same is true for external CSS files.
Be very, very careful with lots of images, even small ones. Lots of HTTP requests to load a page is bad for load time on mobile, and every image you request is an HTTP request (unless it comes from the browser cache).
Make sure your server is configured to maximize browser caching for things that can be effectively cached.
Other things to be aware of:
By default dynamic loading of script files is asynchronous and unordered. If your script files must execute in a specific order, then you will have to either not load them dynamically or you will have to write code (or use a library) that serializes their execution in the desired order.
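For example, a minimal sketch (assuming jQuery 1.8+, where $.getScript returns a chainable promise) that loads the question's three plugin files strictly in order:
$.getScript("js/jqm-datebox.core.js")
    .then(function () { return $.getScript("js/jqm-datebox.mode.calbox.js"); })
    .then(function () { return $.getScript("js/jquery.mobile.datebox.i18n.en.utf8.js"); })
    .then(function () {
        // all three scripts have loaded and executed, in order
        $('#datepicker').data("mode", "calbox");
    });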
$.getScript is a shorthand AJAX function; it takes a callback as the second parameter.
Check out the docs:
http://dochub.io/#jquery/jquery.getscript
You could concatenate those scripts and then do your stuff in the callback.
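Something along these lines (the combined file name is a placeholder):
$.getScript("js/jqm-datebox.combined.js", function () {
    // this callback runs only after the script has loaded and executed
    $('#datepicker').data("mode", "calbox");
});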
This is not so dissimilar to old Flash asset loading issues.
My strategy for that? Load only what's necessary for the initial page view. Once it's loaded and the page/app is viewable by the user, progressively load all other assets.
If the assets were particularly heavy, then I would disable the link to that specific page until its required assets were loaded.
In this case, you might disable the link to the particular page at the outset, initiate the load of its assets, and when they are ready, enable the link.
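A minimal sketch of that pattern, assuming jQuery (the link id and bundle name are placeholders):
// block the link until the page's assets have loaded
var $link = $('#availability-link').addClass('disabled').on('click.block', false);
$.getScript('js/page-availability.bundle.js', function () {
    $link.removeClass('disabled').off('click.block'); // assets ready: enable the link
});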
Not sure if you're having any syntax issues, but you can certainly just inject a new script element into the head with the correct source, and it will trigger a download (like you are doing with the CSS, but you probably know that ;D).
Cheers
I would just combine/minify and compress all the JS into one file and always load that. With correct caching, it is only downloaded once, so you don't have to worry much about performance.
Of course I could load these js/css in the main <head> section
I often just add it right before the </body> tag. Also note that besides being deprecated, .live() is also slow as hell. Don't use it; use .on() instead.
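For the question's handler, the delegated .on() equivalent would look something like:
$(document).on('pageinit', '#page-availability', function () {
    // same handler body as before
});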
Is there a way to download javascript without executing it? I want to decrease my page load times so am trying to "lazy load" as much javascript onto the page while the user is idle. However I don't want the javascript to execute, I just want it to be in the browser cache.
Should I use an <object> tag? I noticed that I can use a <link> tag, but that makes the browser think it's CSS, which has a negative impact on my UI performance/responsiveness.
As long as you have all code in functions or classes, and nothing in the global scope, nothing will execute.
You can then start your script with:
window.onload = function () { /* your initialisation here */ };
This lets the whole page load before running any scripts.
You could also add script references via JavaScript, to make sure they load after any images on the page. Just append a script element to the head using JavaScript and the file will load.
These pages have examples of this:
http://unixpapa.com/js/dyna.html
http://www.javascriptkit.com/javatutors/loadjavascriptcss.shtml
This way, if you have a slow connection or a server that is overloaded, the visible elements will be loaded first by the browser.
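The basic pattern from those pages looks like this (the path is a placeholder):
var script = document.createElement('script');
script.src = '/js/lazy-stuff.js'; // placeholder path
document.getElementsByTagName('head')[0].appendChild(script);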
As far as I know, there is no cross-browser compliant way to get around JavaScript loading in serial. If your javascript does something when it is loaded, you need to refactor your code. For instance, you don't write your jQuery commands/actions/code in the jQuery library script; you link the jQuery library and then put your jQuery commands into a separate file. You should do the same thing with your custom libraries. If this isn't possible, you have a big problem with the architecture of your code.
Also, make sure you stick non-executing JS at the bottom of the page near the </body> tag. This will allow everything else to load first, so that the bulky JS libraries don't slow down things like CSS and images.
The best-practices way to deal with external JavaScript is to have it load after everything else on the page, by putting it at the bottom of the page. Then everything that can be rendered is displayed first, and afterwards the JavaScript at the bottom of the page is loaded, compiled, and cached. Of course, this only works if the JavaScript is a library of functions that don't need to execute mid-page; in that case, you are stuck with serial JavaScript loading, compiling, and execution regardless.
Require.JS is a great library for automatically managing when your javascript loads.
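A minimal sketch of what that looks like (the module names and paths are placeholders, and a configured RequireJS setup is assumed):
// RequireJS loads the dependencies asynchronously,
// then runs the callback once they are all available.
require(['jquery', 'widgets/calendar'], function ($, calendar) {
    calendar.init($('#calendar')); // placeholder module and element
});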
You could load the file using the XMLHttpRequest object in JavaScript (a.k.a. AJAX), and then of course just discard the result. ^^
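A minimal sketch, assuming your server sends caching headers so the file actually stays in the browser cache (the path is a placeholder):
var xhr = new XMLHttpRequest();
xhr.open('GET', '/js/heavy-library.js', true); // placeholder path
xhr.send(); // the response is ignored; the file is now (hopefully) cached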
I've been doing a little JavaScript (well, more like jQuery) for a while now and one thing I've always been confused about is where I should put my scripts, in the <head> tag or in the <body> tag.
If anyone could clarify this issue, that'd be great. An example of what should go where would be perfect.
Best practices from Google and Yahoo say that, for performance, JavaScript should always be put in an external file and linked to your page with a <script> tag located at the bottom of the HTML, just before the closing of the <body> element.
This allows the browser to render the entire page right away, instead of stopping to evaluate javascript.
You mentioned three places:
In <script> tags;
In the HTML; and
In an external file.
Let me address each of those.
Best practice is to have common JavaScript in one or more external files, and the fewer files the better, since each JS file loaded blocks loading of the page until that file is loaded.
The word "common" is extremely important. What that means is you don't want to put page-specific Javascript code in that external file for caching reasons. Let's say you have a site with 1000 pages. Each page has JS code specific to it. That could either be 1000 different files or one really big file that executes a lot of unnecessary code (eg looking for IDs that aren't on that particular page but are on one of the 999 others). Neither of these outcomes is good.
The first gives you little caching boost. The second can have horrific page load times.
So what you do is put all common functions in one JS file where that JS file only contains functions. In each HTML page you call the JS functions needed for that page.
Ideally your JS files are cached effectively too. Best practice is to use a far futures HTTP Expires header and a version number so the JS file is only loaded once by each browser no matter how many pages they visit. When you change the file you change the version number and it forces a reload. Using mtime (last modified time of the JS file) is a common scheme, giving URLs like:
<script type="text/javascript" src="/js/script.js?1233454455"></script>
where that mtime is automatically generated. Your Web server is configured to serve JS files with an appropriate Expires header.
So that mixes external files and in-page scripts in (imho) the best way possible.
The last place you mentioned was inline in the HTML markup. Here it depends somewhat on what JS libraries and frameworks you use. I'm a huge fan of jQuery, which encourages unobtrusive JavaScript. That means you (hopefully) don't put any JavaScript in your markup at all. So instead of:
<a href="#" onclick="doStuff(); return false;">do stuff</a>
you do:
<a href="#" id="dostuff">do stuff</a>
with Javascript:
$(function() {
    $("#dostuff").click(doStuff);
});
I am looking for the best way to speed up the load time of my js.
The problem is that I am working with a very large site that uses the jQuery framework, and because the site is also loading Facebook Connect, AddThis sharing, Google Analytics, and another tracking code, jQuery is delayed a few seconds, certain elements like the calendar just suddenly appear, and my users are complaining that things take too long.
I did a test in Google Chrome and the average load time is 4s, which is too much.
I am already minifying, and jQuery / jQuery UI is being loaded from Google. What's the best way to approach this?
Make fewer HTTP calls by combining images, scripts, and CSS; using a Content Delivery Network for your static images and CSS might also help!
You are not likely to be able to do much more about the load time of the external scripts, so what you can do is to change the order that things happen in the page so that the external scripts are loaded after you have initialised the page.
Scripts are loaded and executed in a serial manner, so if you change their order in the source code, you also change the order they are loaded.
Instead of using the ready event in jQuery, you can put your initialising code inline in the page, after all the content but before the body closing tag. That way the elements that you want to access are loaded when the script runs, and you can put external scripts below the initialising code to make them load after.
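Roughly like this (initCalendar is a placeholder for your own initialising code):
<body>
    <!-- ...page content... -->
    <script>
        // the elements above are already parsed, so initialise right away
        initCalendar(); // placeholder
    </script>
    <!-- third-party scripts (analytics, sharing widgets, etc.) come last -->
    <script src="third-party-widget.js"></script>
</body>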
Small technical changes (such as serving the JSs from Google, minifying, etc..) will only get you so far.
Seems you simply have lots of dynamic stuff going on in your page. Have you thought of an asynchronous way of building your content? One option might be to place placeholders instead of the external content and load it asynchronously, so that when all the scripts are loaded and ready, all you need to do is throw the markup into the placeholder.
This will create a better user experience, instead of the user waiting 10 seconds for the entire page, it will start loading incrementally after 2 seconds (and still fully load after 10).
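A minimal sketch of the placeholder idea, assuming jQuery (the ids and URL are placeholders):
$(function () {
    // the page shell is already visible; now fill the placeholder asynchronously
    $.get('/fragments/calendar.html', function (html) {
        $('#calendar-placeholder').html(html);
    });
});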
In addition to Yuval's answer some options that might or might not bring you a speed gain:
The load time of external libraries is something beyond your control. Try to include them as late as possible, and better still, dynamically after the page has loaded. This way your page won't stall if Google Analytics or Facebook have another hiccup.
It is not necessarily faster to load jQuery from Google. Consider putting jQuery, jQuery UI, and as much of your own JS as reasonable into a single file, minify and gzip it, and let the server serve the gzipped version where possible. Note, though, that the gain in speed depends largely on what your users cache and where they cache it. If they already have jQuery from Google in their cache, this technique might make the page load slower.
The bottom line is that after some optimization you're left with experimenting. You must find out what your average user has in her cache, whether the page is accessed directly via deep links, and whether you can smuggle some JS or CSS (or even images) into her cache via a previous "landing page".
Make sure you deliver your content gzip/deflate-compressed. Combine multiple JavaScript files into one file, which helps reduce the number of HTTP requests.
P.S. Here's a test tool to check if compression is configured:
http://www.gidnetwork.com/tools/gzip-test.php