I'm told that document.write should be avoided in web pages because it hurts page performance. But what is the exact reason?
document.write() itself doesn't seem to be very harmful to page performance in most browsers. In fact, I ran some tests at DHTML Kitchen and found that in Firefox, Opera and Chrome, document.write() was actually faster on the first load, and comparable in speed to standard HTML on subsequent refreshes. Internet Explorer 8 was the exception, but it was actually faster than the other browsers at rendering the HTML (surprisingly).
As Guffa's answer points out, and what I was building up to, the actual performance issues come from inline scripts themselves. Content rendering can only continue when an inline script has finished executing, so if you have a complex routine inside an inline script you can noticeably halt your page's loading for the end user. That's why waiting for onload/DOMReady and using DOM manipulation is preferred.
It's especially unwise to use document.write() after the document has finished loading. In most browsers, using document.write() after document load also implies document.open(), which will wipe the current HTML off the screen and create a new document.
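For example, a minimal sketch of that pitfall (the button and text are made up for illustration):

<!DOCTYPE html>
<html>
<body>
<p>Some existing content.</p>
<!-- Clicking after the page has loaded implicitly calls document.open(),
     so everything above is wiped and replaced by "Boom". -->
<button onclick="document.write('Boom')">Write after load</button>
</body>
</html>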
That doesn't mean that document.write() doesn't have its uses; it's just that most developers use it for the wrong reasons. Real problems with document.write() include:
You can't use it in documents served as XHTML (for browsers that correctly parse XHTML as XHTML).
Overwrites the entire page when used after DOM parsing has completed.
Adds content to the page that isn't accessible to browsers with JavaScript disabled (although <noscript> is sometimes a valid workaround here).
More difficult to maintain than static HTML.
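As a rough sketch of the DOM-manipulation alternative mentioned earlier (the paragraph content is just an example):

document.addEventListener('DOMContentLoaded', function () {
    // Instead of document.write('<p>mystuff</p>'), create and append a node
    // once the DOM is ready, so nothing blocks parsing.
    var p = document.createElement('p');
    p.innerHTML = 'mystuff';
    document.body.appendChild(p);
});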
If you have scripts that run in the middle of the page, the browser has to wait for the script to finish before it can continue to parse the rest of the page.
To make your page appear fast, you want the browser to parse the page as soon as possible so that it can be displayed to the user, and after that you can apply the extra functionality that your scripts add.
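One common way to do that, sketched below (app.js is a placeholder file name):

<body>
<p>Content the browser can parse and display immediately...</p>
<!-- Placed at the end of <body>, this script no longer blocks parsing
     of the content above it; the defer attribute has a similar effect
     for scripts in the <head>. -->
<script src="app.js"></script>
</body>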
I think there are some reasons why it should be avoided.
But what you mean is that if you have, somewhere in your HTML code, a
<script>
document.write('mystuff')
</script>
one of the problems is that before the browser can display your website, it has to load the JavaScript interpreter and run that script.
If you start your JavaScript only from body.onload, the browser can display the whole website to the user first, and then run your scripts...
Therefore the subjective loading time is faster :-)
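A sketch of that approach (initMyStuff and the element id are made-up names):

<body onload="initMyStuff()">
<p id="target">Rendered immediately.</p>
<script>
// Runs only after the whole page has loaded and been displayed.
function initMyStuff() {
    document.getElementById('target').innerHTML = 'mystuff';
}
</script>
</body>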
I'm injecting some HTML into a page (an extension written with angular.js), and on pages with Google Maps frames, like Airbnb, the loading slows down. I don't have any idea how I'm affecting that. Any ideas?
When you load JavaScript from a third party you should do it asynchronously. You might want to load your own scripts asynchronously too, but for this article let's focus on third parties.
There are two reasons for this:
If the third party goes down or is slow, your page won't be held up trying to load that resource.
It can speed up page loads.
At Wufoo, we just switched over to an asynchronous embed snippet. Users who build forms with Wufoo and want to embed them on their site are now recommended to use it. We did it for exactly those reasons above. It's the responsible thing to do for a web service that asks people to link to resources on that service's site.
There is a little terminology involved here that will help us understand the umbrella "asynchronous" term.
"Parser blocking" - The browser reads your HTML and when it comes to a it downloads that entire resource before moving on with the parsing. This definitely slows down page loads, especially if the script is in the head or above any other visual elements. This is true in older browsers as well as modern browsers if you don't use the async attribute (more on that later). From the MDN docs: "In older browsers that don't support the async attribute, parser-inserted scripts block the parser..."
To prevent problematic parser blocking, scripts can be "script inserted" (i.e. one script inserts another via the DOM), which forces them to execute asynchronously (except in Opera or pre-4.0 Firefox).
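A sketch of the classic script-inserted pattern (the URL is a placeholder):

<script>
// Creating the script element from JavaScript means it downloads
// asynchronously and never blocks the HTML parser.
var s = document.createElement('script');
s.src = 'https://thirdparty.example.com/widget.js';
s.async = true; // explicit, though script-inserted scripts default to async
document.getElementsByTagName('head')[0].appendChild(s);
</script>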
"Resource blocking" - While a script is being downloaded, it can prevent other resources from downloading at the same time as it. IE 6 and 7 do this, only allowing one script to be downloaded at a time and nothing else. IE 8 and Safari 4 allow multiple scripts to download in parallel, but block any other resources (reference).
Ideally we fight against both of these problems and speed up page loading, both actual and perceived.
https://css-tricks.com/thinking-async/
I'm currently doing some optimization work on a large web project. I'm already doing JavaScript file combining, minification and compression. But I'm confused on one point.
For a number of non-technical reasons, my users are about 50% each IE7 and IE8. After doing some research, I'm getting the impression that IE7 loads the JavaScript files sequentially and IE8 loads them in parallel. I understand that going forward that this will not be an issue with more modern browsers (IE9+, FF, Chrome, etc).
Is this an accurate statement? If yes, then what is best practice for loading the files?
That statement is correct, but you should remember that even modern browsers will make only a limited number of connections to the same server. So when your page, scripts, CSS and images are all on the same server, the browser may load only 2 or 4 of those at a time. Therefore it may be a good idea to add a subdomain, or a different domain, for scripts to trick the browser and make it load the scripts alongside the images.
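A sketch of that trick (the hostnames are made up):

<!-- Both hostnames can point at the same server; the browser treats them
     as separate connection pools, so the script and the image can
     download in parallel. -->
<script src="http://static.example.com/combined.js"></script>
<img src="http://www.example.com/header.jpg" alt="">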
An even simpler solution is to merge all scripts into one script. You can do this 'on the fly' or cache it. You can even minify the scripts (which means comments and whitespace are stripped and variable names are shortened). Don't minify and combine the original scripts in place; cache the combined/minified output instead, so it won't need to be regenerated with each request.
If you do this, you reduce traffic, and your browser will only need one request for the file, eliminating the overhead of multiple sequential requests.
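As a minimal sketch, here is roughly what 'combine on the fly and cache' can look like server-side (Node.js, with made-up file names):

var fs = require('fs');
var sources = ['jquery.js', 'plugins.js', 'site.js']; // made-up file names
var cached = null;
function getCombinedScript() {
    if (cached === null) {
        // Concatenate once; every later request is served the cached result.
        // Joining with ';' guards against files that lack a trailing semicolon.
        cached = sources.map(function (f) {
            return fs.readFileSync(f, 'utf8');
        }).join(';\n');
    }
    return cached;
}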
See this MSDN blog article which shows some other tricks for script loading.
To make it more specific, I mostly care about the SpiderMonkey interpreter in Firefox.
So suppose I want to speed up the loading of a particular website in my browser, or else speed up the loading of all websites that include some popular script, e.g. jQuery. Presumably the scripts involved don't change between page reloads. Will SpiderMonkey understand that much and avoid full recompilation?
If SpiderMonkey wouldn't, will any other interpreter? Or is this basically a potential new feature which nobody cares about since computers are fast as is?
This is not an optimization Gecko does yet, but it's one we're looking into doing for sure. There are some complications to doing it, unfortunately.
Having read up recently on Yahoo's web optimisation tips and used YSlow, I've implemented a few of their ideas on one of my sites, http://www.gwynfryncottages.com; you can see the file here: http://www.gwynfryncottages.com/js/gw-custom.js.
This technique seems to work perfectly on most occasions, and really does speed up the site, but I do notice a significantly higher number of errors where the JavaScript doesn't load, or doesn't load completely, while I'm working on the site. So, three questions:-
is combining scripts this way a good idea at all in terms of reliability?
is there any way to measure the number of errors i.e. the number of times the script failed to load?
is there any way to 'pre-load' the javascript or ensure that the number of loading errors is reduced?
Of course it's good. You will not only decrease the number of HTTP requests, but you will also cut down delays in downloading other resources.
Try using minify: http://code.google.com/p/minify/, I've been using it and I've no complaints.
I can assure you that combining files WON'T cause any errors by itself, as a combined script is the same as 10 non-combined scripts; they all load in the same way (in order, left to right, top to bottom). Double-check the way you're combining them.
Script execution stops at serious errors. If you have multiple scripts, the others will still run; if you packed everything into one big file, a lot more code won't get executed. So combining scripts is bad for reliability, but can be good for other purposes (mainly load time).
All browsers have some sort of javascript console which will show you the number of errors. Most have some sort of developer tool too (Firebug in Firefox, Dragonfly in Opera etc).
I'm not sure what you mean by preloading. Since a javascript file can affect the rest of the page in various ways, browsers will fully load and execute a script tag before continuing to parse the page (which is why scripts can slow page loading down so much).
I can't see the load function in your code which is being called on your body tag! I'd try and steer clear of adding JS to your HTML file; it can be added dynamically, and will probably cause you less hassle along the way as well as being easier to maintain.
I'd say that the thing you need to look out for is making sure that you're not trying to call something before it's defined (maybe your separate JS files were loaded in a different order to how they appear in the single JS file).
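A sketch of the kind of ordering bug to look for (the file names and setup function are made up):

// From b.js: if this ends up before a.js in the combined file, it throws
// "setup is not a function", because the assignment below hasn't run yet.
setup();

// From a.js: a function expression assigned to a variable is not hoisted
// as a callable value, so order still matters after combining.
var setup = function () { /* ... */ };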
Firebug for Firefox is a good development tool, if you've not found it already. WebKit, Opera and IE also have various other dev tools.
Combining JavaScript files is always the best way to go, unless it's not logically sane to do so (downloading jQuery from Google Code instead of hosting it yourself is a good example).
I always combine as many files as I can (JavaScript, CSS, images (CSS sprites), etc.), even in development, and I never experience any problems. It's way faster because it needs fewer HTTP connections, which should not in any case be underestimated.
Regarding counting the errors, I don't exactly see what you mean. But debugging tools like the one built into Google Chrome, or Firebug for Firefox, are good for debugging your JavaScript code, and they show lists of the errors that occur.
And as for preloading: yes, it can be done, though it gets nasty and illogical. However, I can't think of any case whatsoever where going to the trouble of preloading the JavaScript would be a better solution than just making it work right out of the box, no error checking needed.
About the error you are experiencing, the only one that my Chrome points out is this:
Uncaught ReferenceError: load is not defined
... which seems to be the onload handler "load()" set on line 55 of your HTML document, where the body tag starts.
I was fine-tuning a page that is heavy on jQuery and stumbled across this website:
http://www.componenthouse.com/extra/jquery-analysis.html
When I click on the "Run Test" button the first time, the numbers are way higher than subsequent clicks. Is this because JS is cached by the browser? Can someone explain how this works internally? Can a user choose to not cache the JS?
External JavaScript files are cached and, of course, an HTML page containing script tags can be cached too.
What you see may be a result of HTML caching or some browser optimization. You should try different browsers, closing and re-opening the browser, and clearing its cache.
The numbers are (significantly) different for me on the second time in Firefox 3.5. OTOH, they are fairly consistent(ly slow) in IE 8. Firefox 3.5's JavaScript interpreter compiles the JS to executable code. So it does make sense that the first time is slower; the code hasn't been JITted yet.
The performance boost you're seeing is likely due to your javascript interpreter. Most newer web browsers use a JIT-compiling javascript engine so code paths taken multiple times can be optimized.
Read this blog post on how Safari's javascript engine achieved many of its speed-ups.
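A rough sketch of how you can observe that warm-up yourself (timings will vary by browser and machine):

function hotLoop() {
    var sum = 0;
    for (var i = 0; i < 1000000; i++) { sum += i; }
    return sum;
}
// The first run is typically the slowest; later runs benefit from the
// engine having already compiled/optimized the hot code.
for (var run = 1; run <= 3; run++) {
    var start = new Date().getTime();
    hotLoop();
    console.log('run ' + run + ': ' + (new Date().getTime() - start) + ' ms');
}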
Whether or not the JavaScript code is cached doesn't affect execution performance. What you are seeing is jQuery caching the results of the selector queries so they don't take as long on subsequent runs.
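Relatedly, caching a selector result manually in your own code follows the same idea; a sketch (the selector is made up):

// Running the selector once and reusing the result avoids repeated
// DOM queries, which are the expensive part.
var $rows = $('#results tr');
$rows.addClass('highlight');
$rows.css('color', '#c00');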