IE7 and IE8 randomly not able to load external script - javascript

I am dynamically adding <link> elements to the head once the DOM is ready. However, I'm getting inconsistent results in IE8 and IE7 (all other browsers are fine).
Every few times the page is loaded (cached or uncached), IE 7/8 will drop a handful of CSS rules from the stylesheets: 1 or 2 of my dynamic stylesheets will not load. It's always the same 1 or 2 stylesheets that IE tends to ignore, even though the Developer Toolbar shows them as added to the head!
The stylesheets themselves show up as <link> elements in the final DOM, but some of their rules are not applied (although every few reloads they are applied, without any issue).
In my position, I do not have the luxury of writing code from the <head> (CMS restriction) - I can only dynamically insert from the body, which may be the issue.
UPDATED: This is the code I am using (located within the <body>) to insert stylesheets:
document.observe('dom:loaded', function() { // Using Prototype.js
    // Add stylesheets
    // addStylesheet('cite.css', 'head');        // Contains no webfont/@font-face rules
    // addStylesheet('type.css', 'head');        // Contains webfont family name references*
    // addStylesheet('flex.css', 'head');        // Responsive rules with @media queries
    // addStylesheet('anm8.css', 'head');        // Some minor positional CSS for home page
    // addStylesheet('gothic-cite.css', 'head'); // *Contains @font-face config
    // addStylesheet('stag-cite.css', 'head');   // *Contains @font-face config
    addStylesheet('all.css', 'head');            // Contains ALL content from above in 1 file

    function addStylesheet(cssname, pos2)
    {
        var th2 = document.getElementsByTagName(pos2)[0];
        var s2 = document.createElement('link');
        s2.setAttribute('type', 'text/css');
        s2.setAttribute('href', cssname);
        s2.setAttribute('media', 'screen');
        s2.setAttribute('rel', 'stylesheet');
        th2.appendChild(s2);
    }
});
As suggested, even when I combined all rules into one stylesheet (which I hate doing), IE 7/8 continues to flip-flop as described with some rules, and the page appears differently.
As a further check, I also removed all @font-face and referenced font-family: "webfont-name" rules from the stylesheets, and the same behavior continued. Therefore, we can rule out webfonts as the issue.
You can see the anomalies by visiting the following with IE8 and refreshing/clicking the nav several times. It seems completely random as to when IE8 is dropping those styles. However, in the natively-built control page, all styles load correctly, every time.
Live Page (with problems)
https://www.eiseverywhere.com/ehome/index.php?eventid=31648&tabid=50283
PHP-based CMS prints out XHTML on page load (template content mixed w/user content)
Prototype.js is loaded and initialized by default on page load
CMS proprietary scripts.js file is parsed on page load
My scripts run when the DOM is loaded, essentially replacing the CMS fluff-HTML in body.innerHTML with just the HTML I want, then adding stylesheets to the <head>.
For lte IE 8, CSS extension plugins (selectivizr.js, html5.js, and ie-media-queries.js) are loaded within the <body> via conditional comments. Not sure if they wait for DOM:loaded...
The CMS WYSIWYG editor converts all carriage-returns to empty <p> tags, resulting in elements like <section> being contained inside broken <p> tags, and extra <p></p> tags being created where whitespace is expected. Only lt IE 8 seems to choke on this, however, so I added the following CSS rules to remedy this:
:not(.ie7) p { display: none; }
.ie7 p { display: inline; }
article p { display: block !important; }
I should note that the external stylesheets here are being pulled from the same domain, but each time they are re-uploaded, a new MD5-based URL is generated for the file. I'm not sure if previous revisions to the file (or previous files) are still available by their previous URLs. This isn't likely to be the problem though, since the newly created all.css stylesheet is still dropping rules that have been in the file from the start.
Control Page (works flawlessly)
http://client.clevelanddesign.com/CD/IDG/CITE/home.html
Pure XHTML document - no PHP.
jQuery is used, rather than Prototype, for IE8 and below.
All resources (stylesheets) are present in <head> at page load - no dynamic insertion
For lte IE 8, CSS extension plugins (selectivizr.js, html5.js, and ie-media-queries.js) are initialized natively.
Rephrased question:
Which of these differences do you think may be causing IE 7/8 to flip-flop on styles when loaded on the Live page? I personally suspect either a race-condition issue, or that Prototype.js and the other CMS scripts are mucking things up (unfortunately no way to purge those from the page though).
PS: I've already tried using IE's createStyleSheet() function, to no avail.
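For reference, a combined approach (a sketch under stated assumptions, not a guaranteed fix for the flip-flop above) would try IE's proprietary document.createStyleSheet() first and fall back to a standard <link> element elsewhere:

```javascript
// Sketch: use IE's proprietary API when present, otherwise append a
// standard <link>. IE caps createStyleSheet() at 31 sheets, so the
// try/catch falls through to <link> insertion if that limit is hit.
function addStylesheetCompat(href) {
  if (document.createStyleSheet) {
    try {
      return document.createStyleSheet(href); // IE <= 10
    } catch (e) {
      // 31-stylesheet limit reached; fall through
    }
  }
  var link = document.createElement('link');
  link.rel = 'stylesheet';
  link.type = 'text/css';
  link.href = href;
  document.getElementsByTagName('head')[0].appendChild(link);
  return link;
}
```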
UPDATE - Screenshots of working/not working in IE8
IE8: DOM code when loaded correctly:
IE8: DOM code when NOT loaded correctly:

I've nailed down exactly what is happening, but still do not know the cause of the flip-flop:
selectivizr.js is not loading correctly every few page loads.
All of the rules that use CSS3 selectors need that script to be applied in IE 7/8. Therefore when IE 7/8 does not load selectivizr.js correctly, those rules are ignored. Those rules certainly include the webfont references, as well as the errant <p> display properties.
To remind you all, these helper JS scripts are being loaded normally (from within the <body>) with the initial page load, before my script replaces the <body> contents (including that script reference). Therefore, there's a chance it's initializing twice (can anyone confirm this?)
The trouble is, on the control website, selectivizr.js always loads correctly in IE 7/8. There are also no known incompatibilities between the CSS3 helper js and the Media Query help js files (when initialized correctly).
I removed selectivizr.js from the page and the page loaded exactly the same way after 20+ refreshes. Good to get consistency back, bad that I've lost my CSS3 rules in IE 7/8.
Apparently this is how the js plugin in question works:
In accordance with the W3C specs, a web browser should discard style rules it doesn't understand. This presents a problem: we need access to the CSS3 selectors in the style sheet but IE throws them away. To avoid this issue each style sheet is downloaded using an XMLHttpRequest. This allows the script to bypass the browser's internal CSS parser and gain access to the raw CSS file.
Source: http://www.css3.info/css3-pseudo-selectors-emulation-in-internet-explorer/
I can try any suggested CSS3-selector plugins that you all may have; maybe one will load more reliably, or have less overhead and thus less room for lag-related issues. Any alternatives?
Or, perhaps I should add it to the <head> or elsewhere in the <body> after the DOM is ready the second time (after my script replaces the body contents). None of these options worked - they had the same, if not worse, outcome.
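For completeness, the usual pattern for injecting a script exactly once and detecting when it has finished loading in IE 7/8 looks roughly like this (a sketch; selectivizr itself runs automatically once fetched, so the callback is only illustrative):

```javascript
// Sketch: track requested URLs so a script is injected at most once,
// and fire a callback when it finishes. IE 7/8 signal completion via
// onreadystatechange rather than onload, so both are wired up.
var loadedScripts = {};
function loadScriptOnce(src, callback) {
  if (loadedScripts[src]) {
    if (callback) callback(); // already requested: don't inject again
    return;
  }
  loadedScripts[src] = true;
  var script = document.createElement('script');
  script.type = 'text/javascript';
  script.src = src;
  script.onload = script.onreadystatechange = function () {
    var rs = script.readyState;
    if (!rs || rs === 'loaded' || rs === 'complete') {
      script.onload = script.onreadystatechange = null; // fire once
      if (callback) callback();
    }
  };
  document.getElementsByTagName('head')[0].appendChild(script);
}
```

Guarding on the URL like this would also rule out the double-initialization suspicion above, since a second call for the same file becomes a no-op.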

First off let me say I have worked on multiple initiatives where the teams have started down the path of dynamically generating the DOM via Javascript, including remote-loading of scripts through CORS.
After many months of effort on 3 different projects (and different approaches used in each), we finally had to face the fact that IE7 and IE8 are incapable of properly or consistently dynamically loading and processing external scripts or CSS.
My recommendation is to consolidate / combine any scripts on the PHP / server side and serve up as a single file that can be cached on the client side.
As an additional note, IE is not completely to blame. There are huge complexities involved with downloading, processing, and rendering scripts / css in the correct orders and programming this process such that it works well in every environment (webkit + mozilla + IE9+) requires near-expert-level knowledge and very thorough testing.
In your case, one example of bad "flow" is the fact that when I viewed your page specifically, it briefly shows the non-CSS-applied page (yucky!) before the screen "updates" and CSS gets pulled in and applied. Bad bad bad.
Other issues I noticed are the large number of HTTP requests in general. Each requires a DNS lookup, a cache/expires check (and other work dictated by headers), and the subsequent download of the response. On desktops this is not all that noticeable, but on mobile devices, tablets, and even some slower or bogged-down PCs it is especially noticeable.
If you're building a web app in today's browsing environment and have only a small team, it's probably best to either:
Serve up CSS as a single, cacheable file from a CDN, and pages in pre-parsed, pre-iterated, pre-rendered HTML chunks, minimizing the client-side JS processing (only binding elements post-load), or
Go with a pre-existing client-side framework such as Sencha, SproutCore, YUI etc. - they have built out the framework for you and fixed all the bugs already.
Two things have to happen before I change my view: IE8 has to disappear from general use (drops below 10%), and the "average" mobile device needs to have 2 physical processor cores. Right now only the expensive / high-end models have dual-core processors.
Also of note, the fastest mobile processors even with JIT JS compilers are still 10x slower than a typical desktop in JS performance - which when compared directly to a desktop, would compete head-to-head with a Pentium 4 or old AMD Athlon 64.

Related

How to combine good Google PageSpeed results with putting CSS at the bottom of the page?

I am trying to get good PageSpeed results from Google for my page.
I got good results by putting CSS and JS at the bottom of the page.
But I have a problem: my page renders without CSS first, then gets rendered normally once the CSS is loaded (it produces a flash of unstyled content).
I tried to solve this by setting display: none on the body, then adding a jQuery document.ready handler that sets display back to normal, but my Google PageSpeed results went down again.
Is there a solution/tip to get good PageSpeed results with good rendering of the page?
Unfortunately with HTTP/1 we are forced to bundle all our css rule-sets into one file to prevent multiple resource requests, this won't be the case with HTTP/2.
Speed is definitely something you would want to improve in a website, but the important point here is how fast useable content is in front of the visitors. The resources you use will eventually increase in size, this shouldn't be proportional to the time the user waits to be able to use the page. Focus on perceived performance.
What is the current problem with CSS files located in the head tag?
A: They block rendering until the file is loaded.
What can you do about it?
There is a specification that involves the preload keyword used in the link tag to load css files asynchronously.
This specification defines the preload keyword that may be used with
link elements. This keyword provides a declarative fetch primitive
that initiates an early fetch and separates fetching from resource
execution.
Source: w3
This, however, is still not fully supported by browsers. (Browser support here).
A solution is to use loadCSS which is basically a polyfill.
The new <link rel="preload"> standard enables us to load stylesheets
asynchronously, without blocking rendering, and loadCSS provides a
JavaScript polyfill for that feature to allow it to work across
browsers, as well as providing its own JavaScript method for loading
stylesheets.
Finally, the technique that is commonly proposed is the following:
Load a stylesheet with the critical CSS rule-sets needed to display information to the user, such as layout formatting. This one is included as you normally would, in the head tag with <link>.
Load the stylesheet with the CSS rule-sets that are not critical to the initial rendering of the page with loadCSS.
Notes:
If you go down this path make sure to check tools like
webpagetest.org to test perceived performance.
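In the spirit of loadCSS (this is a simplified sketch, not the library itself, and the file name is illustrative), the async load works by fetching the file under a media query that matches nothing, then flipping it once the file has arrived:

```javascript
// Sketch: a <link> with a non-matching media query downloads without
// blocking render; switching media afterwards makes the sheet apply.
function loadCSSAsync(href) {
  var link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = href;
  link.media = 'only x'; // bogus media type: fetch without blocking
  link.onload = function () {
    link.media = 'all'; // apply the sheet once it has loaded
  };
  document.getElementsByTagName('head')[0].appendChild(link);
  return link;
}
```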

Eliminate render-blocking JavaScript and CSS - advice needed

Google PageSpeed Insights is flagging this as something I should fix - I've read the guidance on Optimising CSS Delivery at https://developers.google.com/speed/docs/insights/OptimizeCSSDelivery but I'm confused at what the best practice is, and also on which resources are render blocking and which aren't?
Is Google suggesting removing stylesheet links from the page head and replacing with inline styles to make something render, then using Javascript to trigger an external stylesheet to load when window.onload fires? Won't this just delay process of arriving at the 'correctly rendered' page - isn't it better for the browser to start downloading the CSS as soon as possible?
Yes, that's pretty much what the page you reference is recommending. Put the minimal amount of CSS (as long as it's a small amount) directly in the HTML markup within a <style> tag. Then include the complete set of styles at the end of the document. (In the example, it's not actually loaded via JavaScript per se; rather, the link to the external style sheet is placed in a <noscript> tag. That's a bit of a hack, but it gets the job done. You could also request the stylesheet via AJAX and inject it using JavaScript directly.)
This approach only works if you can isolate the minimal CSS needed for your page and that amount of CSS is reasonably small. If, for example, you're building a single page web app, then many of your CSS rules might be for parts of the app other than the initial view. In that case, those extra rules can be put in the external style sheet. Or maybe you have a set of rules strictly for pop-up dialog boxes. Those rules can be postponed as well.
If you can't really separate your rules into those that are needed initially and those that aren't, and if your minimal rule set is large, you can't take advantage of this approach.
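The deferred half of this pattern can be sketched as follows, assuming the minimal critical rules are already inline in a <style> tag in the head (the main.css name is a placeholder):

```javascript
// Sketch: the critical rules sit inline in <head>; the full stylesheet
// is only attached after window load, so it never blocks first paint.
function deferFullStylesheet(href) {
  function add() {
    var link = document.createElement('link');
    link.rel = 'stylesheet';
    link.href = href;
    document.getElementsByTagName('head')[0].appendChild(link);
  }
  if (window.addEventListener) {
    window.addEventListener('load', add, false);
  } else {
    window.attachEvent('onload', add); // IE8 and below
  }
}

// Usage: deferFullStylesheet('main.css');
```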

Downloaded aspx website does not display well

I have downloaded an .aspx webpage and saved it as HTML. I open it in IE and Chrome and it takes time to load + some parts are missing. All the text is there, but the onmouseover is not working properly and some CSS is not displaying correctly. Was the content not downloaded completely? i.e. is it missing some JavaScript, CSS or else?
I have done what you describe on many occasions for the purposes of putting together a prototype of new functionality in an existing application.
You will likely need to do a couple of things:
Ensure the paths to your JS and CSS resources are right (removing the unnecessary JS files, if any)
Also, you will likely need to update the paths in your CSS to any image resources in your page

Is it practically good to put JS files at the bottom of webpage?

It is widely recommended that JS files should be put at the bottom of the page, to allow the HTML to load first. In this case, visitors will see something while waiting for the full page load. However, I think this is disadvantageous, for these reasons:
1. Modern design depends heavily on JS. This means that before the JS loads, the page will look ugly.
2. If the connection is interrupted during the load (so the JS never loads at all), visitors will miss some of the website's features (possibly very important ones), and they will not realize that the load is the problem (and that they should reload the page).
3. If the server-side script dies (due to an error) at the very end, just before the footer (e.g. in PHP), visitors will lose the whole page's JS functionality; but if the JS is loaded at the top, they will only miss the footer or half the page.
4. If the JS is loaded first, the browser will load other assets (like images) in parallel; but if it is loaded last, it may increase the load time, because JS files are large (e.g. jQuery and jQuery UI), and by then all the tiny assets (like images) have already been loaded and we are loading one large file, last in line.
UPDATE: 5. Since the jQuery library must be loaded before code that uses it, if you load jQuery in the footer (e.g. footer.php), you cannot add custom jQuery code on individual pages (within the body).
Please correct me if I'm wrong! Is putting JS files in the footer still beneficial?
Edit: I am adding another point in response to the cotton I'm seeing in people's ears on this topic.
Additional point #5. If you are seriously concerned about handling behavior on JS-fail and by that I mean, people browsing with JS turned off, what you should be doing is embracing the notion of progressive enhancement. For instance, you could design an accordion menu to act as a flyout-menu on-hover by default, yes with CSS only, and then remove that behavior by changing key classes when JS is enabled. That way users have access to the links without JS if they should turn it off but they get the enhanced behavior when JS is working.
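The class-swap that drives this kind of progressive enhancement can be sketched in a few lines (the no-js/js class names are illustrative conventions, not from the original answer):

```javascript
// Sketch: the stylesheet defines the CSS-only hover/flyout behavior
// under a "no-js" class on <html>; running this as early as possible
// swaps it for "js", so the enhanced, script-driven rules take over.
function enableJsStyles(doc) {
  doc.documentElement.className =
    doc.documentElement.className.replace(/\bno-js\b/, 'js');
}

// Usage (inline in <head>, before styles are applied):
// enableJsStyles(document);
```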
But what you should not be trying to handle is the absence of entire JS files on pages that are supposed to have them because the page was mangled on the back-end. Handling the unexpected is one thing, but handling the failure to finish building an HTML file before it's served should not ever be considered an acceptable scenario in production, especially if you have actual back end code in your templating language (which you shouldn't) waiting to spill out and give would-be hackers something potentially interesting to look at. Broken pages should be served as error messages.
================================
Dead wrong. Any time you use JS to tweak the initial static look of your page, you're doing it wrong. Maintain that separation of concerns and your pages will be much more flexible. Using JS to tweak the STATIC styles of your pages isn't modern, it's bass-ackwards and you can tell the jQuery mobile guys I said as much. If you absolutely must support IE6 and IE7 and IE8 tell your client how much it's going to cost them to cut out rounded gradient corners all over the place if they refuse to accept anything as an alternative to absolute graceful degradation for 5% of their client-base.
If your pages, with no JS beforehand are taking that long to load, you have other problems that need to be addressed. How many resources are you loading? What ungodly pre-processing is your PHP up to? I call back end or design shenanigans.
Are you saying it's half-acceptable to have half a page with working JS rather than completely unacceptable? Don't let go of that client, whoever they are.
jQuery, when minimized is about the size of a medium-sized JPEG.
Note: It is not entirely unacceptable to have some JS up top. Some specialized code like analytics stuff or canvas normalizers requires it. But anything that doesn't need to be up top should be at the bottom. Every time JS is parsed, the entire page load and flow calculation process stalls out. Pushing your JS to the bottom improves perceived page load times, and should also provide evidence of whether somebody on your team needs a swift kick in the butt to figure out why their code is tanking, or what could be done with the 25-megabyte PNG-24s they just shrunk down rather than reformatted.
Script tags in the header block all other requests. If you have let's say 10 tags like this:
<script src="http://images.apple.com/home/scripts/promotracker.js"></script>
...they will be executed sequentially. No other files will concurrently be downloaded. Hence they increase page load time.
Check out HeadJS here as a sample solution.
You need to think in terms of "do I need the DOM to be ready before I execute some JavaScript". Putting script tags at the bottom of the page essentially guarantees that the DOM is ready. If you link your styling in the header and properly style the page, you shouldn't get the "ugliness". Secondly, if parts of the page depend on JavaScript working on DOM objects before they can be displayed, I would use Ajax calls to prevent this problem as well. The best of both worlds: the page loads with what it can, and then Ajax calls fetch the HTML to populate the other parts of the page.
The reason putting JS at the bottom of the page, or loading it asynchronously, is recommended is that JS slows the page load down.
In some browsers, downloading the JS blocks other parallel downloads, and in all browsers executing the JS blocks the UI thread and hence rendering (in some, parsing blocks too).
Putting it at the bottom, or loading it asynchronously, attempts to delay the issue until it has less impact on the visitor's page-load experience.
Don't forget that no matter how beautiful your page is, if it takes too long to load people won't wait, and 2-3 seconds is where it starts to cause problems.
Modern design can probably depend less on JS than it ever has; yes, we still need polyfills, but as browsers get better we can do more with CSS.
This might be true for things like Backbone.js apps, but if the lack of JS breaks the site then I'd argue the design should be different.
If the server-side script dies there are perhaps other issues to worry about, there's no guarantee there's enough of the page to be useful anyway.
NO! JS blocks the UI thread, and in some cases downloads, so the images will be delayed. Also, as the JS is using connections, there are fewer connections available for parallel downloads.
Have a look at @samsaffron's article on loading jQuery late - http://samsaffron.com/archive/2012/02/17/stop-paying-your-jquery-tax
If you can delay the JS load you should
I have occasionally recommended putting Javascript at the bottom of a page (just before </body>) but only in circumstances where a novice programmer couldn't really cope with window.onload or <body onload...> and that was the easiest adaptation of their code to get it to work.
I agree with your practical disadvantages, which need to be balanced against Michael's note of the effect on load time. In most cases [I submit] loading scripts in the <head> of the page wins.
Everybody's needs are different, let's go through your list:
1) I've never had an issue with the layout or styling of the page because of javascript. If you have your HTML & CSS in order the missing javascript will be close to invisible.
You can hide elements in the CSS and display them with JavaScript when they're ready, e.g. with jQuery's .show() method.
here's an example:
<!DOCTYPE html>
<html>
<head>
    <style>
        div { background:#def3ca; margin:3px; width:80px;
              display:none; float:left; text-align:center; }
    </style>
    <script src="http://code.jquery.com/jquery-latest.js"></script>
</head>
<body>
    <button id="showr">Show</button><button id="hidr">Hide</button>
    <div>Hello 3,</div><div>how</div><div>are</div><div>you?</div>
    <script>
        $("#showr").click(function () {
            $("div").first().show("fast", function showNext() {
                $(this).next("div").show("fast", showNext);
            });
        });
        $("#hidr").click(function () {
            $("div").hide(1000);
        });
    </script>
</body>
</html>
If you still have problems you can split up your javascript into ones your site relies on and other scripts and put some in the header and some in the footer.
2) That's user error, you can't control that, but you could check if the needed functionality is there and attempt to reload it. Most plugins offer some sort of confirmation if they're running or not, so you could run a test and try to reload them.
You can also delay loading of files until the user needs them, like waiting for them to focus on a form to load validation scripts or scroll past a certain point to load the code for things below "the fold"
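That on-demand idea might look like this (the form id, event wiring, and script URL are all illustrative):

```javascript
// Sketch: attach validation code only when the user first interacts
// with the form. focusin bubbles (unlike focus), so one listener on
// the form element covers all of its fields.
function lazyLoadOnFocus(formId, scriptSrc) {
  var form = document.getElementById(formId);
  var done = false;
  function onFirstFocus() {
    if (done) return; // load the script at most once
    done = true;
    var s = document.createElement('script');
    s.src = scriptSrc;
    document.getElementsByTagName('head')[0].appendChild(s);
  }
  if (form.addEventListener) {
    form.addEventListener('focusin', onFirstFocus, false);
  } else {
    form.attachEvent('onfocusin', onFirstFocus); // IE8 and below
  }
}

// Usage: lazyLoadOnFocus('signup-form', 'validation.js');
```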
3) If the page dies you're going to get a half-blank page anyhow. With PHP 5 you can do better error handling with exceptions:
if (!$result = mysql_query('SELECT foo FROM bar', $db)) {
    throw new Exception('You fail: ' . mysql_error($db));
}
and this
try {
    // Code that might throw an exception
    throw new Exception('Invalid URL.');
} catch (FirstExceptionClass $exception) {
    // Code that handles this exception
} catch (SecondExceptionClass $exception) {
    // you get the idea ;)
}
4) If you minify your scripts they shouldn't be much larger than images. jQuery is 32KB minified & gzipped. jQuery UI's script is 51KB. That's not too bad; most plugins should be even smaller than that.
So I suggest you should do what you have to do to get the results you want, but search for best practices that reduce errors and excess code. There's always a better way to skin a cat...
I'm not really that familiar with putting the scripts in the footer, but you may want to look into the various ways of telling the page to only run the JavaScript AFTER the page is fully loaded.
This opens up a couple options - you could have JS dynamically load external scripts only after the page is ready.
You can also hide some or all of the page content, and then just make it visible after the page is ready. Just make sure you hide it with JS, so that non-js browsers can still see it.
See these:
https://www.google.com/search?q=javascript+page+ready
http://api.jquery.com/ready/
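A minimal sketch of the hide-then-reveal approach mentioned above (the element id is illustrative; the hiding is done from JS, so non-JS browsers never hide anything):

```javascript
// Sketch: hide the element from JS (so non-JS browsers always see it),
// then reveal it once the window load event fires.
function hideUntilLoaded(id) {
  var el = document.getElementById(id);
  el.style.visibility = 'hidden';
  function reveal() { el.style.visibility = 'visible'; }
  if (window.addEventListener) {
    window.addEventListener('load', reveal, false);
  } else {
    window.attachEvent('onload', reveal); // IE8 and below
  }
}
```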

When is a stylesheet added to document.styleSheets

I'm trying to dynamically add a css stylesheet rule using javascript, something like example 2
here.
It works most of the time, but there seems to be a race condition that makes it fail sometimes in (at least) Chrome (15.0.874 and 17.0.933). It happens infrequently when the cache is empty (or has been cleared).
Here's what I've been able to narrow it down to. First I load an external stylesheet by appending it to <head> and then I create a new stylesheet (where I would add rules). I then print the length of document.styleSheets (immediately and after 1 second).
$(function() {
    // it doesn't happen if this line is missing.
    $("head").append('<link rel="stylesheet" type="text/css" ' +
                     'href="/css/normalize.css" />');

    var stylesheet = document.createElement("style");
    stylesheet.setAttribute("type", "text/css");
    document.getElementsByTagName('head')[0].appendChild(stylesheet);

    var b = $('body');
    b.append(document.styleSheets.length).append('<br/>');
    setTimeout(function() {
        b.append(document.styleSheets.length).append('<br/>');
    }, 1000);
});
(play with it at http://jsfiddle.net/amirshim/gAYzY/13/)
When the cache is clear, it sometimes prints 2 then 4 (jsfiddle adds its own 2 CSS files), meaning it doesn't add either of the stylesheets to document.styleSheets immediately... but probably waits for the external file to load.
Is this expected?
If so, is Example 2 on MDN (and many others out there) broken? Since line 27:
var s = document.styleSheets[document.styleSheets.length - 1];
might evaluate with document.styleSheets.length == 0
Note that this doesn't happen when I don't load the external CSS file first.
If JavaScript is in the page below the CSS (which it almost always is), the HTML parser must hold off JS execution until the JS and CSS are fully loaded and parsed, because the JS might request styling information (though Chrome only blocks when the script actually does so).
This effectively makes the loading of external CSS blocking in almost all cases.
When you insert stylesheets later via JavaScript, or there is no JS in the page (or the JS is loaded non-blocking), CSS loads asynchronously, meaning it is loaded and parsed without blocking the parsing of the DOM.
Therefore the document.styleSheets count only gets updated after the sheet is inside the DOM, and that only happens after it is fully loaded and parsed.
In this situation there might be some timing differences involved.
Considering most browsers have only a limited number of pipes through which they load data (some have only two, like IE6; most have 6; and some even have 12, like IE9), the loading of the stylesheet is added to the end of the queue.
The browser is still loading things when you call your function on DOMReady, which can result in the stylesheet not being fully loaded and parsed even one second later, and so not affecting document.styleSheets.length.
And all the stylesheet examples I've run into on the web assume the DOM is fully parsed and loaded.
Off-DOM stylesheets don't even allow rules to be inserted or checked, because they can contain @import rules, which have to be loaded externally; so for browsers it's pretty hard to determine when it's safe to interact with the sheet unless it is fully loaded and parsed.
Off-DOM stylesheets do expose an empty sheet property but won't let you interact with it until the sheet has been added to the DOM.
I've always found it better to insert a stylesheet dynamically and make all changes in that one sheet, leaving document.styleSheets alone.
This has the great advantage that when you override styles with the same specificity, you won't run into trouble from inserting into the wrong sheet.
Since document.styleSheets is a live list, document.styleSheets[2] can point to a different sheet every time you call the function (unless stored in a var).
So I tend to use a dynamically inserted sheet and only operate on that one.
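That "one dynamic sheet" approach can be sketched like this (insertRule for standards browsers, addRule for old IE):

```javascript
// Sketch: lazily create a single <style> element and add every dynamic
// rule to that one sheet, instead of indexing into document.styleSheets.
var dynamicSheet = null;
function addRule(selector, declarations) {
  if (!dynamicSheet) {
    var el = document.createElement('style');
    el.type = 'text/css';
    document.getElementsByTagName('head')[0].appendChild(el);
    dynamicSheet = el.sheet || el.styleSheet; // standard vs. old IE
  }
  if (dynamicSheet.insertRule) {
    dynamicSheet.insertRule(selector + '{' + declarations + '}',
                            dynamicSheet.cssRules.length);
  } else {
    dynamicSheet.addRule(selector, declarations); // IE8 and below
  }
}

// e.g. addRule('article p', 'display: block');
```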
This should help: 'When is a stylesheet really loaded'
As for 'Example 2': it will break if a stylesheet is still loading when you call addStylesheetRules(), at least in Chrome.
