Also, I know CSS and image files can be downloaded in parallel. But can JavaScript files be downloaded in parallel? Thanks.
When the HTML parser sees a script tag, barring your using a couple of special attributes, it comes to a screeching halt, downloads the JavaScript file, and then hands the contents off to the JavaScript interpreter. Consequently, script files cannot be downloaded in parallel either with each other or with the main HTML...unless you use the defer or async attributes and the browser supports them. Details in this other answer here on StackOverflow.
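As a quick illustration, here is a minimal sketch of the two attributes (the file names are just placeholders):

```html
<!-- A plain script: the parser stops here until app.js has downloaded and run -->
<script src="app.js"></script>

<!-- async: downloads in parallel with parsing and runs as soon as it arrives;
     order relative to other async scripts is not guaranteed -->
<script src="analytics.js" async></script>

<!-- defer: downloads in parallel with parsing and runs after parsing finishes,
     in document order -->
<script src="widgets.js" defer></script>
```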
Note that even if you have multiple resources that can be downloaded in parallel, there's no guarantee that they will be. Browsers (and servers) put limits on the number of simultaneous connections between the same two endpoints. (With modern browsers the limit is usually at least four, up from two, but browsers may dial things down on dial-up connections and on mobile devices; and of course, they're entirely free to use only a single connection; it's implementation-specific.)
Scripts are loaded in parallel in modern browsers and IE8. Firefox and Safari will download as many as six scripts at once. IE8, however, seems not to have a limit set; my tests showed as many as 18 scripts loading at once.
However, whether it's one script or six, everything else is blocked. So the browser will not load one script and one style sheet at the same time.
You can get a good idea of the flow by looking at the Net tab in Firebug. Those bars you see on the right are called a waterfall chart, and it gives you a reasonably accurate representation of when things load and how long they take.
You can learn more about page loading in this presentation: Optimize Your Website to Load Faster. The stuff on resource loading starts on slide 33.
So I'm injecting some HTML into a page (an extension written with angular.js), and on pages with Google Maps frames, like Airbnb, the loading slows down. I don't have any idea how I'm affecting that. Any ideas?
When you load JavaScript from a third party you should do it asynchronously. You might want to load your own scripts asynchronously too, but for this article let's focus on third parties.
There are two reasons for this:
If the third party goes down or is slow, your page won't be held up trying to load that resource.
It can speed up page loads.
At Wufoo, we just switched over to an asynchronous embed snippet. Users who build forms with Wufoo and want to embed them on their site are now recommended to use it. We did it for exactly those reasons above. It's the responsible thing to do for a web service that asks people to link to resources on that service's site.
There is a little terminology involved here that will help us understand the umbrella "asynchronous" term.
"Parser blocking" - The browser reads your HTML and when it comes to a it downloads that entire resource before moving on with the parsing. This definitely slows down page loads, especially if the script is in the head or above any other visual elements. This is true in older browsers as well as modern browsers if you don't use the async attribute (more on that later). From the MDN docs: "In older browsers that don't support the async attribute, parser-inserted scripts block the parser..."
To prevent problematic parser blocking, scripts can be "script inserted" (i.e. insert another script with JavaScript) which then forces them to execute asynchronously (except in Opera or pre 4.0 Firefox).
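A minimal sketch of that script-insertion technique (the URL is just a placeholder):

```js
// Create a script element and insert it; script-inserted scripts
// download without blocking the HTML parser.
var s = document.createElement('script');
s.src = 'https://example.com/third-party.js'; // placeholder URL
s.async = true; // explicit hint for engines that would otherwise preserve insertion order
var first = document.getElementsByTagName('script')[0];
first.parentNode.insertBefore(s, first);
```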
"Resource blocking" - While a script is being downloaded, it can prevent other resources from downloading at the same time as it. IE 6 and 7 do this, only allowing one script to be downloaded at a time and nothing else. IE 8 and Safari 4 allow multiple scripts to download in parallel, but block any other resources (reference).
Ideally we fight against both of these problems and speed up page loading, both actual and perceived.
https://css-tricks.com/thinking-async/
One of my users is intermittently getting a dialog in IE 8 that says:
A script on this page is causing Internet Explorer to run slowly
This problem has been reported numerous times in the MSDN forums and other places on the web. For example:
http://social.msdn.microsoft.com/Forums/ie/en-US/0fdb7550-0a59-4553-a90e-1958475f2ce5/how-to-debug-a-script-on-this-page-is-causing-internet-explorer-to-run-slowly?forum=iewebdevelopment
http://social.msdn.microsoft.com/Forums/ie/en-US/dafffdd8-a390-4a3c-a904-867acaec298e/error-message-a-script-on-this-page-is-causing-internet-explorer-to-run-slowly-for-ie8?forum=whatforum
http://social.msdn.microsoft.com/Forums/ie/en-US/e9d16d2a-60e1-4d6f-8f80-0d981eab3a2e/a-script-on-this-page-is-causing-your-web-browser-to-run-slowly?forum=bingmapsajax
So, this question is a duplicate of those and many others. But it is an intentional duplicate, because I don't think any of those questions were answered in a way that helps a user (developer) to determine precisely what in his scenario causes the dialog to appear.
I know that based on this page:
http://support.microsoft.com/kb/175500
the dialog appears when a certain number of statements have executed since a new script begins execution (through a variety of means). By default the number of statements is 5,000,000 but this is configurable via a registry entry.
The general prescription for this problem is a combination of:
Write less code. Unfortunately, this is not always possible.
Use web workers. This is not an option for IE before IE 10, nor with some mobile browsers.
Use setTimeout, setInterval, event handlers, etc. to break the script(s) up. This is a legitimate strategy across all browsers (a sketch follows this list).
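Here is a minimal sketch of the setTimeout approach, assuming the expensive work can be expressed as processing an array of items (processItem and the chunk size are placeholders):

```js
// Process a large array in chunks so no single run of the script executes
// enough statements to trip the slow-script dialog.
function processInChunks(items, processItem, chunkSize, done) {
  var i = 0;
  function nextChunk() {
    var end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) {
      processItem(items[i]);
    }
    if (i < items.length) {
      setTimeout(nextChunk, 0); // yield to the browser, then continue
    } else if (done) {
      done();
    }
  }
  nextChunk();
}

// Hypothetical usage: processInChunks(rows, renderRow, 500, function () { alert('done'); });
```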
So, I understand what the problem is in general terms, and I understand what the options are to solve the problem, again in general terms. The question is, how to determine what area(s) of the code are causing the dialog to appear for a specific user? Often this problem occurs with a very large code base (including third party libraries), so a manual review of the code base without some tooling support is not feasible.
Most browsers, including IE8 and later, have profiling tools that enable the developer to determine JavaScript CPU usage. With the exception of IE, browsers decide a script is long running not based on the number of statements executed but rather on the amount of time the script has spent executing. To this end, the profilers available in (or as add-ons to) the browsers can identify the percentage and usually the raw time spent executing a function, both inclusively and exclusively, and the results can be sorted accordingly. IE's profiler will also give you a count of how often a function is called.

None of these profilers tell you how many statements within a function were executed during profiling; they tell you only how long was spent in the function and how many times the function was called. This doesn't help with IE's slow-script dialog logic, which is based on the number of statements (not time) that were executed.

There is sometimes a correlation between the time spent executing a function and the number of statements, but it is not a reliable relationship, as different types of statements can take very different lengths of time to execute (e.g. many native JavaScript functions are faster than calls which update the DOM; both have the same statement count, but the former executes faster than the latter, making the %/raw time figures not very useful here).
One approach I've used that is of some value: when the slow script dialog appears, start the debugger in IE (if not already started) and select the break-on-next-statement command in the debugger. Then click the dialog button in the browser that allows the slow script to continue executing. At this point the debugger is invoked, and the developer can examine the call stack to determine what is executing at the time the slow script dialog appears. This is okay, but it's a very manual approach, and there's no guarantee there aren't multiple scripts running that can invoke the dialog at different times.
An interesting idea I've seen suggested is to use a JavaScript code coverage tool to instrument the code base. There are multiple JavaScript code coverage options out there, but the ease of use of a browser extension that could dynamically instrument the code seems like an ideal solution. (Another interesting idea is to use a proxy server, for example http://siliconforks.com/jscoverage/manual.html; 'jscoverage --server --proxy', but I couldn't get this to work on virtually any website, except the Silicon Forks website itself.) There's a proof-of-concept extension available for Chrome (http://googletesting.blogspot.ca/2011/10/scriptcover-makes-javascript-coverage.html) that I think could be a great start for helping resolve the slow script problem -- if such an extension could be made for IE.
So, to reiterate, my main question is: what tools/processes can one use to debug/analyze a slow script dialog appearing on a user's machine? A subquestion would be: does anyone know of any JavaScript code analysis tools that could be repurposed to help in diagnosing the slow script dialog in IE, and that take minimal deployment effort? Can an IE extension be written, in theory, that does the sort of code coverage the Google Script Cover extension does?
Thank you,
Notre
In the old times JavaScript was not used that much. Pages used to be more static. In those days PCs were less powerful and it was more popular to run complex tasks on the server. JavaScript was only used for some animations.
Microsoft (my way or no way) took their time to acknowledge heavily JavaScript-ed content. (They were also fiddling with their JScript in IE8.) Until IE9 they considered that running more than 5,000,000 instructions in a script was a potential mistake.
I'm not aware of any tool that retrieves the instruction count. It's a feature built into the browser.
But most browsers can retrieve the execution time of a function. I know that the same number of instructions can take a different amount of time on different computers, but you can set up a benchmark.
Run a script with 5,000,000 instructions on the machine and see how long it takes to run. Then use that time to benchmark your other JavaScript. It's not 100% accurate, but it can become close once you fiddle with it.
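A rough sketch of that benchmark, assuming a simple loop is representative enough for your purposes (it only gives a ballpark figure for the machine at hand):

```js
// Time roughly five million simple statements to estimate how long
// IE's default statement budget lasts on this machine.
var start = new Date().getTime();
var x = 0;
for (var i = 0; i < 5000000; i++) {
  x += i; // one statement per iteration, plus loop bookkeeping
}
var elapsed = new Date().getTime() - start;
alert('~5,000,000 statements took ' + elapsed + ' ms');
```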
Since the old IE developer tools are quite poor, you can use a third-party one. A usage example here:
deep-tracing-of-internet-explorer
Anyway, the “A script on this page is causing your computer to run slow ...” dialog is only a problem in IE4 to IE8. Since those browsers are obsolete, so should this question be.
I'm currently doing some optimization work on a large web project. I'm already doing JavaScript file combining, minification and compression. But I'm confused on one point.
For a number of non-technical reasons, my users are about 50% each IE7 and IE8. After doing some research, I'm getting the impression that IE7 loads the JavaScript files sequentially and IE8 loads them in parallel. I understand that going forward that this will not be an issue with more modern browsers (IE9+, FF, Chrome, etc).
Is this an accurate statement? If yes, then what is best practice for loading the files?
That statement is correct, but you should remember that even modern browsers will make only a limited number of connections to the same server. So when your page, scripts, CSS and images are all on the same server, the browser may load only 2 or 4 of those at a time. Therefore it may be a good idea to add a subdomain, or a different domain, for scripts to trick the browser and make it load the scripts alongside the images.
An even simpler solution is to merge all scripts into one script. You can do this 'on the fly' or cache the result. You can even minify the scripts (which means comments and whitespace are stripped and variable names are shortened). You shouldn't minify and combine the original source files themselves, but you can cache the combined/minified output so it won't need to be regenerated with each request.
If you do this, you reduce traffic, and your browser will only need one request for the file, eliminating the overhead of multiple sequential requests.
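For example, a minimal build-time sketch using Node (the file names are placeholders, and a real setup would also run the result through a minifier such as UglifyJS):

```js
// build-combined.js: concatenate individual scripts into one cached file.
var fs = require('fs');

var files = ['jquery.plugins.js', 'site.js', 'forms.js']; // placeholder names
var combined = files.map(function (name) {
  return fs.readFileSync(name, 'utf8');
}).join(';\n'); // the ';' guards against files that omit a trailing semicolon

fs.writeFileSync('all.combined.js', combined);
// Serve all.combined.js with far-future cache headers so one request covers everything.
```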
See this MSDN blog article which shows some other tricks for script loading.
Having read up recently on Yahoo's web optimisation tips and using YSlow, I've implemented a few of their ideas on one of my sites, http://www.gwynfryncottages.com. You can see the file here: http://www.gwynfryncottages.com/js/gw-custom.js.
This technique seems to work perfectly on most occasions and really does speed up the site, but I do notice a significantly higher number of errors where the JavaScript doesn't load or doesn't load completely while I'm working on the site. So, three questions:
is combining scripts this way a good idea at all in terms of reliability?
is there any way to measure the number of errors i.e. the number of times the script failed to load?
is there any way to 'pre-load' the javascript or ensure that the number of loading errors is reduced?
Of course it's good. You will not only decrease HTTP requests but you will cut down delays in downloading other resources.
Try using minify: http://code.google.com/p/minify/, I've been using it and I've no complaints.
I can assure you that combining files WON'T cause any errors, as a combined script is the same as 10 non-combined scripts; they all load in the same way (in an ordered way, left to right, top to bottom). Double-check the way you're combining them.
Script execution stops at serious errors. If you have multiple scripts, the others will still run; if you packed everything into one big file, a lot more code won't get executed. So combining scripts is bad for reliability, but can be good for other purposes (mainly load time).
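A contrived sketch of that reliability point (the log lines are just stand-ins for real code):

```js
// combined.js: three files concatenated into one script block
console.log('file 1 ran');        // from file 1
missingFunction();                // from file 2: throws a ReferenceError...
console.log('file 3 never ran');  // from file 3: ...so this line is skipped

// Loaded as three separate <script src> files instead, file 1 and file 3
// would both run; only file 2 would fail.
```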
All browsers have some sort of javascript console which will show you the number of errors. Most have some sort of developer tool too (Firebug in Firefox, Dragonfly in Opera etc).
I'm not sure what you mean by preloading. Since a javascript file can affect the rest of the page in various ways, browsers will fully load and execute a script tag before continuing to parse the page (which is why scripts can slow page loading down so much).
I can't see the load function in your code which is being called on your body tag! I'd try and steer clear of adding JS to your HTML file; it can be added dynamically and will probably cause you less hassle along the way, as well as being easier to maintain.
I'd say that the things you need to look out for are making sure that you're not trying to call something before it's defined (maybe your separate JS files were defined in a different order to how they appear in the single JS file).
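A sketch of that ordering pitfall (the names are made up):

```js
// Order matters once everything lives in one file. Function *expressions*
// are not hoisted, so a call placed above the assignment will fail.

// -- originally helpers.js --
var formatPrice = function (n) {
  return '$' + n.toFixed(2);
};

// -- originally page.js --
alert(formatPrice(42)); // works because the helpers.js content comes first

// If the combining step had pasted page.js above helpers.js instead, the call
// would throw "formatPrice is not a function", even though the same two files
// worked when included as separate script tags in the right order.
```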
Firebug for firefox is a good development tool, if you've not found it already. Webkit, Opera and IE also have various other dev tools.
Combining JavaScript files is always the best way to go, unless it's not logically sane to do so (downloading jQuery from Google Code instead of hosting it yourself is a good example).
I always combine as many files as I can (JavaScript, CSS, images (CSS sprites), etc.), also in development, and I never experience any problems. It's way faster thanks to fewer HTTP connections, which should not in any case be underestimated.
Regarding counting the errors, I don't exactly see what you mean. But debugging tools like the one built into Google Chrome, or Firebug for Firefox, are good tools for debugging your JavaScript code and show lists of the errors that occur.
And on the preloading question: yes, it can be done, though it'll get nasty and illogical. I can't think of any case whatsoever where it would be worth the trouble to preload the JavaScript, compared to just making it work right out of the box, no error checking needed.
About the error you are experiencing, the only one that my Chrome points out is this:
Uncaught ReferenceError: load is not defined
... which seems to be the onload method "load()" set on line 55 of your HTML document when the body tag is started.
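If that's the case, one way to fix it, assuming load() is meant to run once the page has finished loading, is to drop the onload attribute from the body tag and attach the handler from the script instead (a sketch, not your actual code):

```js
// Instead of <body onload="load()">, wire the handler up in the script itself.
function load() {
  // ...existing initialisation code goes here...
}

if (window.addEventListener) {
  window.addEventListener('load', load, false);
} else {
  window.attachEvent('onload', load); // older IE
}
```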
I was fine tuning a page that is heavy on jquery and stumbled across this website:
http://www.componenthouse.com/extra/jquery-analysis.html
When I click on the "Run Test" button the first time, the numbers are way higher than subsequent clicks. Is this because JS is cached by the browser? Can someone explain how this works internally? Can a user choose to not cache the JS?
External JavaScript files are cached and, of course, an HTML page containing script tags can be cached too.
What you see may be a result of html caching or some browser optimization. You should try different browsers, closing and re-opening your browser and clearing the cache of the browser.
The numbers are (significantly) different for me on the second time in Firefox 3.5. OTOH, they are fairly consistent(ly slow) in IE 8. Firefox 3.5's JavaScript interpreter compiles the JS to executable code. So it does make sense that the first time is slower; the code hasn't been JITted yet.
The performance boost you're seeing is likely due to your javascript interpreter. Most newer web browsers use a JIT-compiling javascript engine so code paths taken multiple times can be optimized.
Read this blog post on how Safari's javascript engine achieved many of its speed-ups.
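A rough sketch of how you can see that warm-up effect yourself (the workload is arbitrary and the numbers will vary a lot by engine):

```js
// Run the same work several times and compare timings; JIT-compiling engines
// typically get faster after the first pass or two.
function work() {
  var total = 0;
  for (var i = 0; i < 1000000; i++) {
    total += Math.sqrt(i);
  }
  return total;
}

for (var run = 1; run <= 5; run++) {
  var start = new Date().getTime();
  work();
  console.log('run ' + run + ': ' + (new Date().getTime() - start) + ' ms');
}
```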
Whether or not the JavaScript code is cached doesn't affect execution performance. What you are seeing is jQuery caching the results for the selector queries so they don't take as long on subsequent runs.
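Whether or not the library does any of that internally, you can get the same effect explicitly by caching the jQuery result yourself (a sketch, assuming jQuery is loaded and the selector is the expensive part):

```js
// Run the expensive selector once and reuse the wrapped set afterwards.
var $rows; // cached jQuery object
function getRows() {
  if (!$rows) {
    $rows = $('table tr'); // the selector query runs only on the first call
  }
  return $rows;
}

getRows().addClass('measured');    // first call: runs the selector
getRows().removeClass('measured'); // later calls: reuse the cached set
```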