I'm trying to dynamically add a CSS stylesheet rule using JavaScript, something like Example 2 here.
It works most of the time, but there seems to be a race condition that makes it fail sometimes in (at least) Chrome (15.0.874 and 17.0.933). It happens infrequently when the cache is empty (or has been cleared).
Here's what I've been able to narrow it down to. First I load an external stylesheet by appending it to <head> and then I create a new stylesheet (where I would add rules). I then print the length of document.styleSheets (immediately and after 1 second).
$(function() {
  // It doesn't happen if this line is missing.
  $("head").append('<link rel="stylesheet" type="text/css" ' +
                   'href="/css/normalize.css" />');

  var stylesheet = document.createElement("style");
  stylesheet.setAttribute("type", "text/css");
  document.getElementsByTagName('head')[0].appendChild(stylesheet);

  var b = $('body');
  b.append(document.styleSheets.length).append('<br/>');

  setTimeout(function() {
    b.append(document.styleSheets.length).append('<br/>');
  }, 1000);
});
(play with it at http://jsfiddle.net/amirshim/gAYzY/13/)
When the cache is clear, it sometimes prints 2 then 4 (jsfiddle adds its own 2 CSS files), meaning it doesn't add either of the stylesheets to document.styleSheets immediately... but probably waits for the external file to load.
Is this expected?
If so, is Example 2 on MDN (and many others out there) broken? Since line 27:
var s = document.styleSheets[document.styleSheets.length - 1];
might evaluate with document.styleSheets.length == 0
Note that this doesn't happen when I don't load the external CSS file first.
If JavaScript is in the page below the CSS (which it almost always is), the HTML parser must hold off JS execution until both the JS and the CSS are fully loaded and parsed, because the JS might request styling information (Chrome only does so when the script actually requests it, though).
This effectively makes the loading of external CSS blocking in almost all cases.
When you insert stylesheets later via JavaScript, or there is no JS in the page (or the JS is loaded non-blocking), CSS loads asynchronously, meaning it is loaded and parsed without blocking the parsing of the DOM.
Therefore the document.styleSheets count only gets updated after the sheet is in the DOM, and that only happens after it is fully loaded and parsed.
In this situation there might be some timing differences involved.
Considering most browsers only have a limited number of connections through which they load data (some have only two, like IE6; most have 6; and some even have 12, like IE9), the loading of the stylesheet is added to the end of the load queue.
The browser is still loading things, because you call your function on DOMReady. This can result in the stylesheet not being fully loaded and parsed even one second later, so it does not affect document.styleSheets.length.
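For what it's worth, here is a minimal sketch of waiting for the link's load event instead of a fixed timeout (it assumes the same /css/normalize.css path as the question; load events on <link> are not fired reliably by every older browser):

var link = document.createElement('link');
link.rel = 'stylesheet';
link.type = 'text/css';
link.href = '/css/normalize.css';
// Only inspect document.styleSheets once the sheet has actually loaded.
link.onload = function () {
    console.log('sheets after load: ' + document.styleSheets.length);
};
link.onerror = function () {
    console.log('stylesheet failed to load');
};
document.getElementsByTagName('head')[0].appendChild(link);
// Depending on the browser, this count may not yet include the new sheet.
console.log('sheets immediately: ' + document.styleSheets.length);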
And all the stylesheet examples I've run into on the web assume the DOM is fully parsed and loaded.
Off-DOM stylesheets don't even allow rules to be inserted or checked, because they can contain @import rules which have to be loaded externally, so it's pretty hard for browsers to determine when it's safe to interact with the sheet unless it is fully loaded and parsed.
Off-DOM stylesheets do expose an empty sheet property, but won't let you interact with it until the sheet has been added to the DOM.
I've always found it better to insert a stylesheet dynamically, execute all changes in that one sheet, and leave document.styleSheets alone.
This has the great advantage that when you override styles with the same specificity, you won't run into trouble because of insertion into the wrong sheet.
Since document.styleSheets is a live collection, document.styleSheets[ 2 ] can point to a different sheet every time you access it (unless it is stored in a var).
So I tend to use a dynamically inserted sheet and only operate on that one.
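For example, a rough sketch of that pattern, keeping a reference to the one sheet via the element's own sheet property instead of indexing document.styleSheets (the styleSheet/addRule branches are the old-IE fallbacks):

var styleEl = document.createElement('style');
styleEl.type = 'text/css';
document.getElementsByTagName('head')[0].appendChild(styleEl);
// Keep a reference to this one sheet and only ever operate on it.
var mySheet = styleEl.sheet || styleEl.styleSheet; // styleSheet is the old-IE property
function addRule(selector, rules) {
    if (mySheet.insertRule) {
        mySheet.insertRule(selector + ' { ' + rules + ' }', mySheet.cssRules.length);
    } else if (mySheet.addRule) {
        mySheet.addRule(selector, rules); // old-IE fallback
    }
}
addRule('body', 'background: #eee;');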
This should help: 'When is a stylesheet really loaded'
As for 'Example 2': it will break if a stylesheet is still loading when you call addStylesheetRules(), at least in Chrome.
Does injecting all the styles into the head, like JSS does, have some benefit for performance?
If you are asking whether adding JavaScript and CSS inside the head rather than in the body gives some form of performance improvement, it does not. It only matters whether the files are loaded before or after the body is created. Some utilities even require you to put script tags after a certain line of HTML inside the body so that the library can see the DOM element it needs.
Everything gets loaded before the user sees the page, whether that is in the head or in the body. The only way to have things loaded truly dynamically, that I know of, is to use JavaScript to add script tags or iframes that will be loaded at a later time.
So, no there is no performance improvement.
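For reference, a minimal sketch of the "add script tags later via JavaScript" approach mentioned above (the file name is just a hypothetical placeholder):

var script = document.createElement('script');
script.src = '/js/lazy-feature.js'; // hypothetical script loaded on demand
script.onload = function () {
    // The script has been fetched and executed at this point.
};
// Dynamically inserted scripts don't block the HTML parser.
document.getElementsByTagName('head')[0].appendChild(script);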
Do you mean critical path for SSR?
I am trying to get good PageSpeed results with Google for my page.
I got good results by putting the CSS and JS at the bottom of the page.
But I ran into a problem: my page renders without CSS, then gets rendered normally once the CSS is loaded (it produces a flash of unstyled content).
I tried to solve this by putting display: none on the body,
then adding a jQuery document.ready handler that sets the display back to normal, but my Google PageSpeed results went down again.
Is there a solution/tip to get good PageSpeed results with good rendering of the page?
Unfortunately, with HTTP/1 we are forced to bundle all our CSS rule-sets into one file to prevent multiple resource requests; this won't be the case with HTTP/2.
Speed is definitely something you want to improve in a website, but the important point here is how fast usable content gets in front of the visitors. The resources you use will eventually grow in size, but this shouldn't be proportional to the time the user waits to be able to use the page. Focus on perceived performance.
What is the current problem with CSS files located in the head tag?
A: They block rendering until the file is loaded.
What can you do about it?
There is a specification that involves the preload keyword used in the link tag to load css files asynchronously.
This specification defines the preload keyword that may be used with
link elements. This keyword provides a declarative fetch primitive
that initiates an early fetch and separates fetching from resource
execution.
Source: w3
This, however, is still not fully supported by browsers. (Browser support here).
A solution is to use loadCSS which is basically a polyfill.
The new <link rel="preload"> standard enables us to load stylesheets
asynchronously, without blocking rendering, and loadCSS provides a
JavaScript polyfill for that feature to allow it to work across
browsers, as well as providing its own JavaScript method for loading
stylesheets.
Finally, the technique that is commonly proposed is the following:
Load a stylesheet with the critical CSS rule-sets needed to display information to the user, such as layout formatting; include this as you normally would, in the head tag with <link>.
Load the stylesheet with the CSS rule-sets that are not critical to the initial rendering of the page with loadCSS (a sketch of this step follows below).
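A rough sketch of that second step without the library, assuming a plain dynamically inserted <link> with the classic media-swap trick is acceptable (loadCSS itself handles more browser quirks than this):

// Load non-critical CSS without blocking the initial render.
function loadDeferredCss(href) {
    var link = document.createElement('link');
    link.rel = 'stylesheet';
    link.href = href;
    // A non-matching media query lets the file download without blocking rendering...
    link.media = 'only x';
    link.onload = function () {
        // ...and once it has loaded, switch it on for all media.
        link.media = 'all';
    };
    document.getElementsByTagName('head')[0].appendChild(link);
}
loadDeferredCss('/css/non-critical.css'); // hypothetical path to the non-critical rules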
Notes:
If you go down this path make sure to check tools like
webpagetest.org to test perceived performance.
I have jQuery added at the bottom of the page. However, when I run my site on pagespeed insights (Mobile), I get the error:
Eliminate render-blocking JavaScript and CSS in above-the-fold content
Your page has 2 blocking script resources and 1 blocking CSS
resources.
This causes a delay in rendering your page. None of the
above-the-fold content on your page could be rendered without waiting
for the following resources to load.
Try to defer or asynchronously
load blocking resources, or inline the critical portions of those
resources directly in the HTML.
See: http://learnyourbubble.com and https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Flearnyourbubble.com&tab=mobile
However, the jQuery is added at the bottom of the page. So it should be below the fold.
How can I remove this error?
It has to do with your font files.
Look at requests 19 and 20 in the waterfall. These font files are considered CSS.
Notice how first paint (green vertical line) does not happen until after the font files are loaded?
Then notice the 15 JS files were loaded prior to the font (CSS) files.
That is what Google is talking about.
Having 16 JS files is beyond excessive.
Try this: Disable JavaScript in your Browser. Notice the only change is in the menu header. Is 16 JS files worth that? I think not.
This article should explain a lot of what's happening: https://varvy.com/pagespeed/critical-render-path.html
In short, the problem is that Chrome needs to load your jQuery and your Foundation JavaScript before giving the initial render of the page. This is why it's blocking. Just because the JavaScript comes after the HTML does not mean the HTML can be displayed yet: the rendering of the DOM still blocks while jQuery/Foundation is being loaded, because Chrome thinks they are vital to the page being displayed correctly. PageSpeed complains about these in particular because they are large.
To alleviate this problem there are a few things you can do, some of them detailed in the article above, some of them here: https://developers.google.com/speed/docs/insights/BlockingJS. The easiest way to tell Chrome that these scripts are not vital and can be loaded "below the fold" is to add a defer or async attribute to them.
I see an error calling foundation(); I will assume you have removed it to rule it out, but it could be that this same call happens prior to load. Try to always enclose your code like:
jQuery(function($) {
  // DOM ready
});
Have you tried loading async?
Make JavaScript Asynchronous: By default JavaScript blocks DOM
construction and thus delays the time to first render. To prevent
JavaScript from blocking the parser we recommend using the HTML async
attribute on external scripts. For example:
<script async src="my.js">
See Parser Blocking vs. Asynchronous JavaScript to learn more about
asynchronous scripts. Note that asynchronous scripts are not
guaranteed to execute in specified order and should not use
document.write. Scripts that depend on execution order or need to
access or modify the DOM or CSSOM of the page may need to be rewritten
to account for these constraints.
Please try the defer attribute; it's working for me.
<script src="filename.js" defer>
I'm making some simple changes using JavaScript to HTML elements that already exist when the page is served (such as changing background images of div elements, adding IDs, etc.). This of course works fine in every browser apart from IE8, where the change doesn't appear to be reflected in the DOM, so when I parse the DOM after the JS has run it can't find the elements I'm looking for. The page is built up of 2 JavaScript files in the header: one is an external third-party script which I do not have control over but which is the one adding the IDs and background images. The second is mine, which is called after the first and parses the document looking for the specific elements with the new IDs. Both are external scripts and are not inline in the HTML source.
From what I can tell its either:
a race condition: two external JavaScript files are running, one changing the buttons and adding the IDs and the other parsing the DOM looking for specific elements, and because they are running at the same time the second never finds the elements
IE8 does not properly refresh the DOM after changes have been made
My JS is called after the first JS in the head so you would assume that the blocking would not cause the race condition and the elements would be available before my JS runs
Things I've tried:
I've tried adding a class to the body to force a refresh of the DOM before my code runs
I've used the IE8 developer tools and the IDs and elements are not present, but if I refresh a few times they magically appear (the page has already fully loaded at this point and I can interact with it fully)
Any ideas?
Thanks!
One thing to keep in mind is that when you are creating multiple <script> tags, you are not guaranteed of the order in which they load, and depending on how they are built, they will generally begin processing as soon as they are loaded.
So, if you are including one local file and one file to a CDN, such as:
<script type="text/javascript"
src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js">
</script>
<script type="text/javascript" src="/js/my_script.js"></script>
You have to take into account that the CDN file is often far faster for delivery than your hosted file. In the above case, this may be a good thing - because jQuery being loaded on the page before your script is probably ideal - but if you are loading a different 3rd party script, which may rely on certain elements being present in the DOM that your script is responsible for creating, your script may not create them in time.
Imagine this scenario:
<script type="text/javascript" src="https://someurl/somelib.js">
// This script parses the DOM and applies alterations to certain items
</script>
<script type="text/javascript" src="/js/my_script.js">
// This script creates the DOM elements the other script is supposed to alter
</script>
Your page will only work if the local file, /js/my_script.js, loads first - which is unlikely because the other file is being served from a dedicated CDN.
This is even worse when both files are served locally, such as:
<script type="text/javascript" src="/js/my_relied_upon_script.js"></script>
<script type="text/javascript" src="/js/my_reliant_script.js"></script>
In this case, it all depends on how your local web server happens to handle the HTTP requests in order to determine what happens in what order.
So - on to the solution:
1) Make your scripts all wait for the document's onready event to fire. Because this event only occurs once the document is fully loaded (including any other HTTP requests necessary to fully load its elements, such as scripts, images, etc.) - you can be guaranteed that the scripts will at least wait until the full DOM is loaded.
2) Make subordinate scripts wait for trigger events.
With jQuery, an example might be something like the following:
// Script #1
$(document).bind('ready', function () {
  $('#NeedsBackground').css({ background: 'url(/gfx/bg.png)' });
  var $wrapper = $('<div />').addClass('wrapper');
  $('#NeedsWrapper').wrap($wrapper);

  // Here's the magic that enforces loading.
  $(document).trigger('Script1Finished');
});

// Script #2
$(document).bind('Script1Finished', function () {
  $('.wrapper').css({ border: '1px solid #000' });
});
Now - bear in mind that the above transformations are fairly terrible, and not something you'd want to do (such as inlining CSS and such, generally) - but they give an example. Because Script #2 requires that the .wrapper elements exist before running, you need to ensure that it happens AFTER Script #1 occurs.
In this case, we're accomplishing that by triggering a custom event on the document, which we can then respond to - and we are only firing that event after the DOM has been put in the proper state.
I am dynamically adding <link> elements to the head once the DOM is ready. However, I'm getting inconsistent results in IE8 and IE7 (all other browsers are fine).
Every few times the page is loaded (cached or uncached), IE 7/8 will drop a handful of CSS rules in the stylesheets. 1 or 2 of my dynamic stylesheets will not load. It's always the same 1 or 2 stylesheets that IE tends to ignore, even though the Developer Toolbar shows them as added to the head!
The stylesheets themselves show up as <link> elements in the final DOM, but some of their rules are not applied (although every few reloads they are applied, without any issue).
In my position, I do not have the luxury of writing code from the <head> (CMS restriction) - I can only dynamically insert from the body, which may be the issue.
UPDATED: This is the code I am using (located within the <body>) to insert stylesheets:
document.observe('dom:loaded', function() { // Using Prototype.js
  // Add stylesheets
  // addStylesheet('cite.css', 'head');        // Contains no webfont/@font-face rules
  // addStylesheet('type.css', 'head');        // Contains webfont family name references*
  // addStylesheet('flex.css', 'head');        // Responsive rules with @media queries
  // addStylesheet('anm8.css', 'head');        // Some minor positional CSS for home page
  // addStylesheet('gothic-cite.css', 'head'); // *Contains @font-face config
  // addStylesheet('stag-cite.css', 'head');   // *Contains @font-face config
  addStylesheet('all.css', 'head');            // Contains ALL content from above in 1 file

  function addStylesheet(cssname, pos2) {
    var th2 = document.getElementsByTagName(pos2)[0];
    var s2 = document.createElement('link');
    s2.setAttribute('type', 'text/css');
    s2.setAttribute('href', cssname);
    s2.setAttribute('media', 'screen');
    s2.setAttribute('rel', 'stylesheet');
    th2.appendChild(s2);
  }
});
As suggested, even when I combined all rules into one stylesheet (which I hate doing), IE 7/8 continues to flip-flop as described with some rules, and the page appears differently.
As a further check, I also removed all @font-face and referenced font-family: "webfont-name" rules from the stylesheets, and the same behavior continued. Therefore, we can rule out webfonts being the issue.
You can see the anomalies by visiting the following with IE8 and refreshing/clicking the nav several times. It seems completely random as to when IE8 is dropping those styles. However, in the natively-built control page, all styles load correctly, every time.
Live Page (with problems)
https://www.eiseverywhere.com/ehome/index.php?eventid=31648&tabid=50283
PHP-based CMS prints out XHTML on page load (template content mixed w/user content)
Prototype.js is loaded and initialized by default on page load
CMS proprietary scripts.js file is parsed on page load
My scripts run when the DOM is loaded, essentially replacing body.innerHTML CMS fluff-HTML with just the HTML I want, then adding stylesheets to <head>.
For lte IE 8, CSS extension plugins (selectivizr.js, html5.js, and ie-media-queries.js) are loaded within the <body> via conditional comments. Not sure if they wait for DOM:loaded...
The CMS WYSIWYG editor converts all carriage-returns to empty <p> tags, resulting in elements like <section> being contained inside broken <p> tags, and extra <p></p> tags being created where whitespace is expected. Only lt IE 8 seems to choke on this, however, so I added the following CSS rules to remedy this:
:not(.ie7) p { display: none; }
.ie7 p { display: inline; }
article p { display: block !important; }
I should note that the external stylesheets here are being pulled from the same domain, but each time they are re-uploaded, a new MD5-based URL is generated for the file. I'm not sure if previous revisions to the file (or previous files) are still available by their previous URLs. This isn't likely to be the problem though, since the newly created all.css stylesheet is still dropping rules that have been in the file from the start.
Control Page (works flawlessly)
http://client.clevelanddesign.com/CD/IDG/CITE/home.html
Pure XHTML document - no PHP.
jQuery is used, rather than Prototype, for IE8 and below.
All resources (stylesheets) are present in <head> at page load - no dynamic insertion
For lte IE 8, CSS extension plugins (selectivizr.js, html5.js, and ie-media-queries.js) are initialized natively.
Rephrased question:
Which of these differences do you think may be causing IE 7/8 to flip-flop on styles when loaded on the Live page? I personally suspect either a race-condition issue, or that Prototype.js and the other CMS scripts are mucking things up (unfortunately no way to purge those from the page though).
PS: I've already tried using IE's createStyleSheet() function, to no avail.
UPDATE - Screenshots of working/not working in IE8
IE8: DOM code when loaded correctly:
IE8: DOM code when NOT loaded correctly:
I've nailed down exactly what is happening, but still do not know the cause of the flip-flop:
selectivizr.js is not loading correctly every few page loads.
All of the rules that use CSS3 selectors need that script to be applied in IE 7/8. Therefore when IE 7/8 does not load selectivizr.js correctly, those rules are ignored. Those rules certainly include the webfont references, as well as the errant <p> display properties.
To remind you all, these helper JS scripts are being loaded normally (from within the <body>) with the initial page load, before my script replaces the <body> contents (including that script reference). Therefore, there's a chance it's initializing twice (can anyone confirm this?)
The trouble is, on the control website, selectivizr.js always loads correctly in IE 7/8. There are also no known incompatibilities between the CSS3 helper js and the Media Query help js files (when initialized correctly).
I removed selectivizr.js from the page and the page loaded exactly the same way after 20+ refreshes. Good to get consistency back, bad that I've lost my CSS3 rules in IE 7/8.
Apparently this is how the js plugin in question works:
In accordance with the W3C specs, a web browser should discard style
rules it doesn’t understand. This presents a problem — we need access
to the CSS3 selectors in the style sheet but IE throws them away. To
avoid this issue each style sheet is downloaded using a
XMLHttpRequest. This allows the script to bypass the browsers internal
CSS parser and gain access to the raw CSS file.
Source: http://www.css3.info/css3-pseudo-selectors-emulation-in-internet-explorer/
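In other words, something along these lines, a simplified sketch of the XHR step described above (the real plugin then parses the raw text and emulates the unsupported selectors itself; 'all.css' is the sheet from the question):

// Fetch the raw stylesheet text so IE's parser can't discard unknown selectors.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'all.css', true); // must be same-origin for XHR in IE 7/8
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        var rawCss = xhr.responseText;
        // The plugin would now parse rawCss and apply the CSS3 selector emulation.
    }
};
xhr.send(null);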
I can try any suggested CSS3-selector plugins that you all may have; maybe one will load more reliably, or have less overhead and thus less room for lag-related issues. Any alternatives?
Or, perhaps I should add it after the DOM is ready the second time (after my script replaces the body contents) to the <head> or elsewhere in the <body>. None of these options worked - they had the same if not worse outcome.
First off let me say I have worked on multiple initiatives where the teams have started down the path of dynamically generating the DOM via Javascript, including remote-loading of scripts through CORS.
After many months of effort on 3 different projects (and different approaches used in each), we finally had to face the fact that IE7 and IE8 are incapable of properly or consistently dynamically loading and processing external scripts or CSS.
My recommendation is to consolidate / combine any scripts on the PHP / server side and serve up as a single file that can be cached on the client side.
As an additional note, IE is not completely to blame. There are huge complexities involved with downloading, processing, and rendering scripts / css in the correct orders and programming this process such that it works well in every environment (webkit + mozilla + IE9+) requires near-expert-level knowledge and very thorough testing.
In your case, one example of bad "flow" is the fact that when I viewed your page specifically, it briefly shows the non-CSS-applied page (yucky!) before the screen "updates" and CSS gets pulled in and applied. Bad bad bad.
Another issue I noticed is the large number of HTTP requests in general. Each requires a DNS lookup, cache / expires check (and other stuff dictated by headers), and subsequent download of the response. On desktops this is not all that noticeable, but on mobile devices, tablets and even some slower / bogged-down PCs it is especially noticeable.
If you're building a web app in today's browsing environment and have only a small team, it's probably best to either:
Serve up CSS as a single, cacheable file from a CDN, and pages in pre-parsed, pre-iterated, pre-rendered HTML chunks, minimizing the client-side JS processing (only binding elements post-load), or
Go with a pre-existing client-side framework such as Sencha, SproutCore, YUI etc. - they have built out the framework for you and fixed all the bugs already.
Two things have to happen before I change my view: IE8 has to disappear from general use (drops below 10%), and the "average" mobile device needs to have 2 physical processor cores. Right now only the expensive / high-end models have dual-core processors.
Also of note, the fastest mobile processors even with JIT JS compilers are still 10x slower than a typical desktop in JS performance - which when compared directly to a desktop, would compete head-to-head with a Pentium 4 or old AMD Athlon 64.