Many aspects of my site are dynamic. I am using jQuery.
I have a div which, once the DOM is ready, is populated using load().
Then, if a button is clicked, that content is replaced with other content via another load() call.
This kind of setup is common across my site. My homepage is essentially lots of dynamically loaded, refreshed, and changeable content.
What are the repercussions of this for SEO?
I've seen sites where each page is loaded using load() and then displayed using the animation functions... It looks awesome!
People have posed this question before, but no one has answered it properly.
So, any ideas? jQuery and SEO?
Thanks
EDIT
Very interesting points. I don't want to overdo my site with JavaScript, just use it where necessary to make things look good. My homepage, however, is one place of concern.
So when the DOM is ready, it loads content into a div. On clicking a tab, this content is changed, i.e. no JS, no content.
The beauty here for me is that there is no duplicated code. Is the suggestion here that I should simply 'print' some default content, then have the tabs link to pages (with the same content) if JS is disabled, i.e. sacrifice a little duplicate code for SEO?
As far as degrading goes, my only other place of concern is tabs on the same page. I have 3 divs, all containing content. On this page two divs are hidden until a tab is clicked. I used this method before I started playing with JS. Would it perhaps be best to load() these tabs, then have the tab buttons link to the pages the content is pulled from?
Thanks
None of the content loaded via JavaScript will be crawled.
The common and correct approach is to use Progressive Enhancement: all links should be normal <a href="..."> links to actual pages, so that your site "makes sense" to a search spider; then a click() handler overrides the normal behavior with load(), so normal users with JavaScript enabled see the "enhanced" version of your site.
If your content is navigable when JavaScript is turned off, you'll be a good ways toward being visible to search engines.
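A minimal sketch of that idea (the URL, IDs, and class names here are placeholders):
<div id="content">Default content rendered by the server.</div>
<a href="/tabs/news.html" class="tab">News</a>

<script>
// Progressive enhancement: the link works (and is crawlable) without
// JavaScript; with JavaScript, we load the target into the div instead.
$('a.tab').click(function (e) {
    e.preventDefault();
    $('#content').load(this.href);
});
</script>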
Note that search engine crawlers won't be submitting any forms on your site, so if any navigation between your site's content pages is done through form elements, that content will not be reachable by search engines.
Here are guidelines on how to get Google to crawl content loaded with Ajax: http://code.google.com/web/ajaxcrawling/docs/getting-started.html
I use jQuery load() for asynchronous page loads. It greatly improves the user experience, but it is not SEO-friendly. Here's the only solution I have found so far:
On the first load I do not use jQuery load(), and instead try to write a cookie with JavaScript: document.cookie = 'checkjs=on';
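A minimal sketch of that first-load script (the path=/ is my addition, to make the cookie visible site-wide):
<script>
// First visit: record that JavaScript works, so the server can rely
// on jQuery load() for subsequent requests.
document.cookie = 'checkjs=on; path=/';
</script>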
On the next page load, if the PHP script finds this cookie, it means JavaScript is enabled and jQuery load() can be used. If there's no such cookie, then JavaScript is off (probably a spider came), so jQuery load() is not used.
if (empty($_COOKIE['checkjs']) || $_COOKIE['checkjs'] != 'on') {
    echo 'js is off, hello Google!';
} else {
    echo 'js is on, can use jquery load';
}
This way I can be sure that most users benefit from asynchronous loading of page blocks, except on the very first load. And spiders get all the content too.
In your case you could just load the same page with a new parameter that makes another tab active. The spider is going to be happy.
It is widely recommended that JS files be put at the bottom of the page to allow the HTML to load first. That way, visitors see something while waiting for the full page load. However, I think this is disadvantageous, for these reasons:
Modern design depends heavily on JS. This means that before the JS loads, the page will look ugly.
If the connection is interrupted during the load (so the JS is never loaded at all), visitors will miss some of the website's features (possibly very important ones), and they will not realize that the problem is the incomplete load (i.e. that they should reload the page).
If the server-side script dies (due to an error) at the very end, just before the footer (e.g. in PHP), visitors will miss the whole page's JS functionality; but if the JS is loaded at the top, they will only miss the footer or half the page.
If the JS is loaded first, the browser will load other assets (like images) in parallel; but if the JS is loaded last, it may increase the total load time, because JS files are large (e.g. jQuery and jQuery UI) and all the small assets (like images) have already loaded, leaving us downloading one large file, last in line.
UPDATE: 5. Since the jQuery library must be loaded before any code that uses it, if you load jQuery in the footer (e.g. footer.php), you cannot add custom jQuery code within the body of individual pages.
Please correct me if I'm wrong! Is putting JS files in the footer still beneficial?
Edit: I am adding another point in response to the cotton I'm seeing in people's ears on this topic.
Additional point #5: If you are seriously concerned about handling behavior on JS-fail (by that I mean people browsing with JS turned off), what you should be doing is embracing the notion of progressive enhancement. For instance, you could design an accordion menu to act as a flyout menu on hover by default (yes, with CSS only), and then remove that behavior by changing key classes when JS is enabled. That way users still have access to the links if they turn JS off, but they get the enhanced behavior when JS is working.
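A minimal sketch of that class-swapping idea (the class names here are hypothetical):
<script>
// This only runs when JavaScript is available: swap the CSS-only
// flyout behavior for the enhanced accordion behavior.
$('.main-menu').removeClass('css-flyout').addClass('js-accordion');
</script>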
But what you should not be trying to handle is the absence of entire JS files on pages that are supposed to have them because the page was mangled on the back-end. Handling the unexpected is one thing, but handling the failure to finish building an HTML file before it's served should not ever be considered an acceptable scenario in production, especially if you have actual back end code in your templating language (which you shouldn't) waiting to spill out and give would-be hackers something potentially interesting to look at. Broken pages should be served as error messages.
================================
Dead wrong. Any time you use JS to tweak the initial static look of your page, you're doing it wrong. Maintain that separation of concerns and your pages will be much more flexible. Using JS to tweak the STATIC styles of your pages isn't modern, it's bass-ackwards, and you can tell the jQuery Mobile guys I said as much. If you absolutely must support IE6, IE7, and IE8, tell your client how much it's going to cost them to cut out rounded gradient corners all over the place if they refuse to accept anything as an alternative to absolute graceful degradation for 5% of their client base.
If your pages, with no JS beforehand, are taking that long to load, you have other problems that need to be addressed. How many resources are you loading? What ungodly pre-processing is your PHP up to? I call back-end or design shenanigans.
Are you saying it's half-acceptable to have half a page with working JS rather than completely unacceptable? Don't let go of that client, whoever they are.
jQuery, when minified, is about the size of a medium-sized JPEG.
Note: it is not entirely unacceptable to have some JS up top. Some specialized code, like analytics snippets or canvas normalizers, requires it. But anything that doesn't need to be there should be at the bottom. Every time JS is parsed, the entire page-load and flow-calculation process stalls. Pushing your JS to the bottom improves perceived page load times, and should also provide evidence that somebody on your team needs a swift kick in the butt to figure out why their code is tanking, or what could be done about the 25-megabyte PNG-24s that they just shrunk down rather than reformatted.
Script tags in the header block all other requests. If you have, let's say, 10 tags like this:
<script src="http://images.apple.com/home/scripts/promotracker.js"></script>
...they will be executed sequentially, and no other files will be downloaded concurrently. Hence they increase page load time.
Check out HeadJS as a sample solution.
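If you'd rather not pull in a library, a minimal sketch of the same idea is to inject the script dynamically, so it downloads without blocking the other requests:
<script>
// A dynamically injected script does not block parallel downloads.
var s = document.createElement('script');
s.src = 'http://images.apple.com/home/scripts/promotracker.js';
s.async = true;
document.getElementsByTagName('head')[0].appendChild(s);
</script>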
You need to think in terms of "do I need the DOM to be ready before I execute some JavaScript?". Putting script tags at the bottom of the page essentially guarantees that the DOM is ready. If you link your styling in the header and properly style the page, you shouldn't get the "ugliness". Secondly, if parts of the page depend on JavaScript working on DOM objects, I would use Ajax calls to populate them, which avoids this problem as well. The best of both worlds: the page loads with what it can, and then Ajax calls fetch the HTML to populate the other parts of the page.
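For instance (a sketch; the fragment URLs and IDs are placeholders):
<script>
$(document).ready(function () {
    // The main page has already rendered; fill in the slower
    // sections with separate Ajax calls.
    $('#sidebar').load('/fragments/sidebar.html');
    $('#news').load('/fragments/news.html');
});
</script>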
The reason putting JS at the bottom of the page, or loading it asynchronously, is recommended is that JS slows the page load down.
In some browsers, downloading the JS blocks other parallel downloads; and in all browsers, executing the JS blocks the UI thread and hence rendering (in some, parsing blocks too).
Putting it at the bottom, or loading it asynchronously, delays the issue until it has less impact on the visitor's page-load experience.
Don't forget that no matter how beautiful your page is, if it takes too long to load people won't wait, and 2-3 seconds is where it starts to cause problems.
Modern design probably depends less on JS than it ever has: yes, we still need polyfills, but as browsers get better we can do more with CSS.
This might be true for things like Backbone.js apps, but if the lack of JS breaks the site then I'd argue the design should be different.
If the server-side script dies, there are perhaps other things to worry about; there's no guarantee there's enough of the page to be useful anyway.
NO! JS blocks the UI thread and, in some cases, downloads, so the images will be delayed. Also, while the JS is using connections, there are fewer connections available for parallel downloads.
Have a look at Sam Saffron's article on loading jQuery late: http://samsaffron.com/archive/2012/02/17/stop-paying-your-jquery-tax
If you can delay the JS load, you should.
I have occasionally recommended putting JavaScript at the bottom of a page (just before </body>), but only in circumstances where a novice programmer couldn't really cope with window.onload or <body onload...>, and that was the easiest adaptation of their code to get it to work.
I agree with your practical disadvantages, which need to be balanced against Michael's note about the effect on load time. In most cases, I submit, loading scripts in the <head> of the page wins.
Everybody's needs are different; let's go through your list:
1) I've never had an issue with the layout or styling of a page because of JavaScript. If you have your HTML & CSS in order, the missing JavaScript will be close to invisible.
You can hide elements in the CSS and display them with JavaScript when they're ready, for instance with jQuery's .show() method.
Here's an example:
<!DOCTYPE html>
<html>
<head>
<style>
div { background:#def3ca; margin:3px; width:80px;
display:none; float:left; text-align:center; }
</style>
<script src="http://code.jquery.com/jquery-latest.js"></script>
</head>
<body>
<button id="showr">Show</button><button id="hidr">Hide</button>
<div>Hello 3,</div><div>how</div><div>are</div><div>you?</div>
<script>
$("#showr").click(function () {
$("div").first().show("fast", function showNext() {
$(this).next("div").show("fast", showNext);
});
});
$("#hidr").click(function () {
$("div").hide(1000);
});
</script>
</body>
</html>
If you still have problems, you can split your JavaScript into the scripts your site relies on and the rest, and put some in the header and some in the footer.
2) That's user error; you can't control that, but you could check whether the needed functionality is there and attempt to reload it. Most plugins offer some sort of confirmation of whether they're running or not, so you could run a test and try to reload them.
You can also delay loading files until the user needs them, like waiting for them to focus on a form before loading validation scripts, or to scroll past a certain point before loading the code for things below "the fold".
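A sketch of that lazy-loading idea with jQuery (the form ID and script URL are placeholders):
<script>
// Load the validation script only when the user first focuses
// the form; the flag guarantees it is fetched at most once.
var validationLoaded = false;
$('#signup-form input').on('focus', function () {
    if (!validationLoaded) {
        validationLoaded = true;
        $.getScript('/js/validation.js');
    }
});
</script>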
3) If the page dies, you're going to get a half-blank page anyhow. With PHP 5 you can do better error handling with exceptions:
if (!$result = mysql_query('SELECT foo FROM bar', $db)) {
throw new Exception('You fail: ' . mysql_error($db));
}
and catch them like this:
try
{
// Code that might throw an exception
throw new Exception('Invalid URL.');
}
catch (FirstExceptionClass $exception)
{
// Code that handles this exception
}
catch (SecondExceptionClass $exception)
{
// you get the idea what i mean ;)
}
4) If you minify your scripts, they shouldn't be much larger than images. jQuery is 32KB minified and gzipped. jQuery UI's script is 51KB. That's not too bad; most plugins should be even smaller than that.
So I suggest you do what you have to do to get the results you want, but look for best practices that reduce errors and excess code. There's always a better way to skin a cat...
I'm not really that familiar with putting the scripts in the footer, but you may want to look into the various ways of telling the page to run the JavaScript only AFTER the page is fully loaded.
This opens up a couple options - you could have JS dynamically load external scripts only after the page is ready.
You can also hide some or all of the page content, and then just make it visible after the page is ready. Just make sure you hide it with JS, so that non-JS browsers can still see it.
See these:
https://www.google.com/search?q=javascript+page+ready
http://api.jquery.com/ready/
When archiving web pages, dynamic content has to be treated differently.
How do I detect whether a page uses any JavaScript?
This will end up in a browser extension, so it probably doesn't need to exclude itself from the findings.
Simply checking for <script> tags should be fine.
if (document.querySelectorAll("script").length) {
//there are scripts on this page
}
You could scan the entire page for onclick, etc handlers in HTML tags, but that would be slow for a big page.
That's actually relatively simple -- does it have any <script> tags? Then it probably has dynamic content. Additionally, you may wish to check for <object> tags, as occasionally embedded objects will modify the page as well (though I suppose their presence alone should make the page count as 'dynamic').
With JavaScript it isn't possible if JavaScript is turned off in the browser. You must do some browser check in your server-side code.
I found some good cons here:
The noscript element only detects whether the browser has JavaScript enabled or not. If JavaScript is disabled in the Firewall rather than in the browser then the JavaScript will not run and the content of the noscript element will not be displayed.
Many scripts are dependent on a specific feature or features of the language being supported in order for them to be able to run (for example document.getElementById). Where the required features are not supported the JavaScript is unable to run but since JavaScript itself is supported the noscript content will not be displayed.
The most useful place to use the noscript element is in the head of the page where it would be able to selectively determine what stylesheet and meta elements get applied to the page as the page is loading rather than having to wait until the page is loaded. Unfortunately the noscript element is only valid within the body of the page and so cannot be used in the head.
The noscript element is a block level element and therefore can only be used to display entire blocks of content when JavaScript is disabled. It cannot be used inline.
Ideally, web pages should use HTML for the content, CSS for the appearance, and JavaScript for the behavior. Using the noscript element is applying a behavior from within the HTML rather than applying it from JavaScript.
Source: http://javascript.about.com/od/reference/a/noscriptnomore.htm
I very much agree with the last point. Is there a way to make and add an external <noscript> file? Should we place <noscript> in the <head>?
It's better to have the default be non-JavaScript, and then let JavaScript code overwrite it with a JavaScript-enabled page. It doesn't have to be much. It can just be a display:none; block, which is then set to display:block; by JavaScript, and vice versa for the non-JS page.
After pondering for many days and changing my code back and forth, I think I have a clearer picture now, and would like to share my two cents' worth on the subject before I forget.
<div id='noscript'>show non-js content</div>
<script>document.getElementById('noscript').style.display='none';</script>
<script id='required-script'>/* show js content */</script>
vs
<noscript>show non-js content</noscript>
<script id='required-script'>/* show js content */</script>
Depending on the situation, there are three cases for consideration:
Case 1 - If the required script is inline

JavaScript disabled:
- Content in the <noscript> element appears immediately; the non-js content is shown.
- Content in the <div> element appears immediately; the non-js content is shown.

JavaScript enabled:
- Content in the <noscript> element does not appear at all; the js content is shown.
- Content in the <div> element may momentarily appear before being hidden; the js content is shown.

For this case, using the <noscript> element is advantageous.
Case 2 - If the required script is from an external (third-party) source, but the hiding of the <div> element is done with an inline script

JavaScript disabled:
- Content in the <noscript> element appears immediately; the non-js content is shown.
- Content in the <div> element appears immediately; the non-js content is shown.

JavaScript enabled but the required script is blocked:
- Content in the <noscript> element does not appear at all; nothing is shown!
- Content in the <div> element may momentarily appear before being hidden; nothing is shown!

JavaScript enabled and the required script is received:
- Content in the <noscript> element does not appear at all; the js content is shown.
- Content in the <div> element may momentarily appear before being hidden; the js content is shown.

For this case, using the <noscript> element is advantageous.
Case 3 - If the required script hides the <div> element

JavaScript disabled:
- Content in the <noscript> element appears immediately; the non-js content is shown.
- Content in the <div> element appears immediately; the non-js content is shown.

JavaScript enabled but the required script is blocked:
- Content in the <noscript> element does not appear at all; nothing is shown!
- Content in the <div> element appears; the non-js content is shown.

JavaScript enabled and the required script is received:
- Content in the <noscript> element does not appear at all; the js content is shown.
- Content in the <div> element may momentarily appear before being hidden; the js content is shown.

For this case, using the <div> element is advantageous.
In summary
Use the <noscript> element if rendering of the HTML content depends on third-party scripts or if the required script is inline. Otherwise, use the <div> element and make sure that the required script contains:
document.getElementById('noscript').style.display='none';
Although Tor Valamo has an elegant answer to this problem, there is an issue which may cause you to opt out of using this technique.
The problem is (usually) IE. It tends to load and execute the JS a bit more slowly than other browsers, causing it sometimes to flash the "Please enable your JavaScript" div for a split second before it loads the JS and hides the div.
It is annoying, and to get around it you can implement the "classic" <noscript> redirect approach:
<head>
<noscript><meta http-equiv="refresh" content="0; URL=/NO_SCRIPT_URL/ROUTE_HERE"/></noscript>
</head>
This is the most solid technique that I've come across with regards to this little nasty.
One useful application for noscript that I've seen is for a progressively-enhanced async loading of heavy content (especially "below the fold"). Big images, iframes, etc. can be wrapped in noscript in the HTML source, and then the unwrapped elements can be appended to the page using JavaScript after the DOM is ready. This unblocks the page and can make for a much quicker initial loading experience, especially if your interface relies on JS/JQ interactions applied after the document is ready (2 seconds vs. 6 seconds for a portfolio page I consulted on).
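A sketch of that pattern (the .lazy class name is a placeholder): the heavy element ships wrapped in <noscript>, and once the DOM is ready its markup is unwrapped and inserted:
<noscript class="lazy"><img src="big-photo.jpg" width="1200" height="800" alt=""></noscript>

<script>
$(document).ready(function () {
    // With JS enabled, a <noscript> element's content is plain text,
    // so inserting that text as HTML renders the wrapped element.
    $('noscript.lazy').each(function () {
        $(this).replaceWith(this.textContent);
    });
});
</script>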
These days it seems almost every browser runs Javascript, but you can never know who is going to be accessing your site. These days even screen readers and web crawlers use Javascript, and sometimes make AJAX requests if they have to.
That said, if you're going to fall back to no-Javascript, there is a much better way than a <noscript> tag. Simply do this in the HEAD of your document:
<script type="text/javascript">
document.getElementsByTagName('html')[0].className += ' Q_js'; // better than noscript
</script>
With this technique, you can easily refer to the Q_js class in your CSS to hide things. With the <noscript> tag, the best you can hope for is to include an additional CSS file to override previous CSS. This becomes important when some elements with static content are supposed to be hidden right away (without any flicker) until JavaScript can make them more dynamic.
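For example, paired CSS might look like this (the .static-fallback class is hypothetical):
/* Visible by default; hidden as soon as JS has added the Q_js class */
.Q_js .static-fallback { display: none; }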
In short, the technique I suggested addresses all your cons 1-5, and I believe it's strictly better than using <noscript>.
In the (hopefully near) future you will be able to use the CSS scripting media feature:
@media (scripting: none) {
  /* styles for when JS is disabled */
}
I create a full-height, full-width, position:fixed div in all pages, with some id.
<div id='noscript_div' style='position:fixed;z-index:20000000;height:100%;width:100%;line-height:100%;'>enable JS buddy</div>
$(document).ready(function () {
    $('#noscript_div').hide();
});
I am not an expert, but this worked for me.
I am sorry, but this approach only suits the case where you want the user to always have JavaScript enabled.
The simple idea is that, these days, your website can adapt to the absence of JavaScript on slow devices by using the noscript tag as a container for the entire content of the website. Your HTML should be prepared for no JavaScript, and all controls must still work when JavaScript is off: users with only basic HTML controls should be able to do everything they could do when JavaScript was active. So <noscript></noscript> can be the switch that delivers the same content another way, with the same results (solving the user's problem is, after all, the reason they open your URL). Whether JavaScript is present or not, the website's functionality can be "the same" in either case. On slow devices with little CPU and RAM (e.g. a Samsung Neo Mini phone), this method can run a website without delays on a low-bandwidth connection.
Try running this page, which adapts automatically to the JS on/off cases:
<!DOCTYPE html>
<html>
<head>
<title>noscript can change the Internet forever</title>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"></script>
</head>
<body>
<script>
$(document).ready(function(){
    // With JS enabled, the content of <noscript> is plain text, so
    // re-inserting that text as markup renders the same content in
    // the "enhanced" version of the page.
    $('noscript').replaceWith(function() {
        return this.textContent || this.innerText;
    });
    $("p#javascripton").css("background-color", "yellow");
    $("p").click(function(){
        $(this).hide();
    });
});
</script>
<noscript>
<p id="javascripton">
With JavaScript on (a PC, laptop, or high-end tablet) this paragraph is unwrapped, styled yellow, and clickable: the complete website with all its features. With JavaScript off (say, a low-powered phone whose user has disabled it to save CPU, RAM, and traffic) the very same content is still delivered, just without the enhancements. Run this page twice, once with the browser's JavaScript enabled and once with it disabled (or with a plugin such as NoScript toggled), to see both behaviours.
</p>
</noscript>
</body>
</html>
And to say more on this: yes, noscript was invented to act as a trigger when JS is disabled, but you can work around that feature, as above, to change how your site functions and delivers its content.
Like all things, use the right tool for the job.
If you are using the Google Maps API, you have a static image via an <img> tag, and it gets replaced with a dynamic JS map. Google has recently started charging for everything, so with the above example it's going to cost you twice: once for the static map and once for the dynamic one. The static map is only relevant if JS is disabled. Therefore, to save paying double, it seems to me the best solution is to wrap the <img> tag for the static map in a <noscript> tag.
I hate how you can actually see webpages load. I think it'd be much more appealing to wait until the page is fully loaded and ready to be displayed, including all scripts and images, and then have the browser display it. So I have two questions:
How can I do this?
Is this common practice? If not, why?
This is a very bad idea for all of the reasons given, and more. That said, here's how you do it using jQuery:
<body>
<div id="msg" style="font-size:largest;">
<!-- you can set whatever style you want on this -->
Loading, please wait...
</div>
<div id="body" style="display:none;">
<!-- everything else -->
</div>
<script type="text/javascript">
$(document).ready(function() {
$('#body').show();
$('#msg').hide();
});
</script>
</body>
If the user has JavaScript disabled, they never see the page. If the page never finishes loading, they never see the page. If the page takes too long to load, they may assume something went wrong and just go elsewhere instead of waiting at "please wait...".
I think this is a really bad idea. Users like to see progress, plain and simple. Keeping the page in one state for a few seconds and then instantly displaying the loaded page will make users feel like nothing is happening, and you are likely to lose visits.
One option is to show a loading status on your page while stuff processes in the background, but this is normally reserved for when the site is actually doing processing on user input.
http://www.webdeveloper.com/forum/showthread.php?t=180958
The bottom line: you at least need to show some visual activity while the page is loading, and I think having the page load in little pieces at a time is not all that bad (assuming you aren't doing something that seriously slows down the load time).
There is certainly a valid use for this. One is to prevent people from clicking on links/causing JavaScript events to occur until all the page elements and JavaScript have loaded.
In IE, you could use page transitions, which mean the page doesn't display until it's fully loaded:
<meta http-equiv="Page-Enter" content="blendTrans(Duration=.01)" />
<meta http-equiv="Page-Exit" content="blendTrans(Duration=.01)" />
Notice the short duration. It's just enough to make sure the page doesn't display until it's fully loaded.
In Firefox and other browsers, the solution I've used is to create a DIV that is the size of the page and white, then at the very end of the page put in JavaScript that hides it. Another way would be to use jQuery and hide it as well. Not as painless as the IE solution, but both work well.
Here's a solution using jQuery:
<script type="text/javascript">
$('#container').css('opacity', 0);
$(window).load(function() {
$('#container').css('opacity', 1);
});
</script>
I put this script just after my </body> tag. Just replace "#container" with a selector for the DOM element(s) you want to hide. I tried several variations of this (including .hide()/.show(), and .fadeOut()/.fadeIn()), and just setting the opacity seems to have the fewest ill effects (flicker, changing page height, etc.). You can also replace css('opacity', 0) with fadeTo(100, 1) for a smoother transition. (No, fadeIn() won't work, at least not under jQuery 1.3.2.)
Now the caveats: I implemented the above because I'm using TypeKit and there's an annoying flicker when you refresh the page and the fonts take a few hundred milliseconds to load. So I don't want any text to appear on the screen until TypeKit has loaded. But obviously you're in big trouble if you use the code above and something on your page fails to load. There are two obvious ways that it could be improved:
A maximum time limit (say, 1 second) after which everything appears whether the page is loaded or not
Some kind of loading indicator (say, something from http://www.ajaxload.info/)
I won't bother implementing the loading indicator here, but the time limit is easy. Just add this to the script above:
$(document).ready(function() {
    setTimeout(function() {
        $("#container").css("opacity", 1);
    }, 1000);
});
So now, worst-case scenario, your page will take an extra second to appear.
Immediately following your <body> tag add something like this...
<style> body {opacity:0;}</style>
And for the very first thing in your <head> add something like...
<script>
window.onload = function() { setTimeout(function() { document.body.style.opacity = "1"; }, 500); };
</script>
As far as whether this is good practice or bad, it depends on your visitors and how long the wait takes.
The question that is still left open, and that I am not seeing any answers to here, is how to be sure the page has stabilized. For example, if you are loading fonts, the page may reflow a bit until all the fonts are loaded and displayed. I would like to know if there is an event that tells me the page is done rendering.
Also make sure the server buffers the page and does not immediately (while building) stream it to the client browser.
Since you have not specified your technology stack:
PHP: look into ob_start
ASP.NET: make sure Response.BufferOutput = True (it is by default)
obligatory: "use jQuery"
I've seen pages that put a black or white div covering everything on top of the page, then remove it when the page finishes loading (you could use .ready in jQuery). That being said, it was one of the most annoying web pages I've ever seen, and I would advise against it.
In PHP-Fusion Open Source CMS (http://www.php-fusion.co.uk), we do it this way at the core -
<?php
ob_start();
// Your PHP codes here
?>
YOUR HTML HERE
<?php
$html_output = ob_get_contents();
ob_end_clean();
echo $html_output;
?>
You won't be able to see anything loading one by one. The only loader will be your browser tab's spinner, and everything is displayed in an instant once it has all loaded. Give it a try.
This method is fully compliant in html files.
You can hide everything using some css:
#some_div
{
display: none;
}
and then in JavaScript assign a function to window.onload to remove that div.
jQuery makes things like this very easy.
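A minimal sketch of that with jQuery, reusing the #some_div example above:
$(window).on('load', function () {
    // Reveal the hidden div once everything has finished loading.
    $('#some_div').show();
});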
In addition to Trevor Burnham's answer, if you want to deal with disabled JavaScript and defer CSS loading:
HTML5
<html class="no-js">
<head>...</head>
<body>
<header>...</header>
<main>...</main>
<footer>...</footer>
</body>
</html>
CSS
/* at the beginning of the page */
.js main, .js footer {
    opacity: 0;
}
JAVASCRIPT
//at the beginning of the page before loading jquery
var h = document.querySelector("html");
h.className += ' ' + 'js';
h.className = h.className.replace(
new RegExp('( |^)' + 'no-js' + '( |$)', 'g'), ' ').trim();
JQUERY
//somewhere at the end of the page after loading jquery
$(window).load(function() {
$('main').css('opacity',1);
$('footer').css('opacity',1);
});
RESOURCES
CSS delivery optimization: How to defer css loading?
What is the purpose of the HTML "no-js" class?
How to get the <html> tag HTML with JavaScript / jQuery?
How to add/remove a class in JavaScript?
While I agree with the others that you should not want this, I'll just briefly explain what you can do to make a small difference without going all the way and actually blocking content that is already there -- maybe this will be enough to keep both you and your visitors happy.
The browser starts loading a page and will process externally located CSS and JS later, especially if the CSS/JS is linked in the "correct" place. (I think the advice is to load JS as late as possible, and to use external CSS loaded in the header.) Now, if you have some portion of your CSS and/or JS that you would like applied as soon as possible, simply include it in the page itself. This goes against the advice of performance tools like YSlow, but it will probably increase the chance of your page showing up the way you want it to be shown. Use this only when really needed!
You could start by having your site's main index page contain only a message saying "Loading". From there you could use Ajax to fetch the contents of the next page and embed it into the current one, removing the "Loading" message on completion.
You might be able to get away with just including a loading-message container at the top of your page, which is 100% width/height, and then removing that div when loading completes.
The latter may not work perfectly in both situations, and will depend on how the browser renders content.
I'm not entirely sure if it's a good idea. For simple static sites I would say not. However, I have seen a lot of heavy JavaScript sites lately from design companies that use complex animation and are a single page. These sites use a loading message to wait for all scripts/HTML/CSS to load so that the page functions as expected.
Don't use display:none. If you do, you will see images resize/reposition when you call show(). Use visibility:hidden instead; everything will be laid out correctly, it just won't be visible until you tell it to be.
Hope this code will help
<html>
<head>
<style type="text/css">
.js #flash {display: none;}
</style>
<script type="text/javascript">
document.documentElement.className = 'js';
</script>
</head>
<body>
<!-- the rest of your code goes here -->
<script type="text/javascript" src="/scripts/jquery.js"></script>
<script type="text/javascript">
// Stuff to do as soon as the body finishes loading.
// No need for $(document).ready() here.
</script>
</body>
</html>
Put text at the top of the page. While the user reads it, the rest of the page can load and it will be ready by the time the user scrolls down.
I am, frankly, a bit disturbed at many of the answers here. I'd say all of them are terrible. Although I share the skeptical reaction of the various top respondents, many answers give "solutions" that won't display anything at all to a user who has JavaScript disabled, and many others rely on a customized on-page loading notice, while signaling to the browser that the page is already loaded.
As a user, I hate both of these outcomes, so as a web-developer, I'd say these are both "non-solutions". You never want to anger your userbase and the solutions given here will anger a lot of users. I especially hate these approaches because if the user opens a webpage in the background in a new tab, the browser will display the page as loaded but the user might click over to it to find that it isn't loaded.
Independently of your question here, best practice is to make as much of your site work without JavaScript as possible, and best practice is to use the browser's built-in loading signals and never signal to the browser that the page is loaded before it actually is. So really, the only good way to do this is to make your page load so fast that there is never any moment of the user waiting.
The best way to achieve what you want is to avoid using JavaScript to load elements of the page, and then optimize the page intensely. Here are the components of this approach:
Have JavaScript on the page if you like, but don't use it to load or otherwise modify any DOM elements after the initial request is fulfilled by the server. Use JavaScript to modify elements of the page only later, such as if triggered by user input, or perhaps to refresh an element after some time, but not in any way related to the page's initial loading. I.e. use JavaScript for what it was designed for (to make webpages interactive) and don't use it to do what HTML was designed for (to make the webpage in the first place.)
Avoid the use of any heavy JavaScript libraries and include as little JavaScript as possible. Never include JavaScript files generically, i.e. only include specific files / libraries in specific pages where you need them.
Specify the width and height of any images in the page code itself, so that the browser knows the exact layout before the image has loaded (see the sketch after this list). This reduces any "choppiness" as the page loads, i.e. elements moving around as the browser resizes the boxes in which images of unspecified size are contained.
Ensure that image files are in the exact dimensions being displayed on the page and are not being downsized by the browser. This minimizes file size and also minimizes CPU work the user's computer needs to do to resize images, both of which can affect load time.
Optimize the compression of images: use a good lossy format like JPG, and lower the quality setting as far as you can without visibly affecting the image. Use lossless formats like PNG only where necessary, and ideally keep them small in dimension so the file size is also small.
Focus the intensity of your optimization efforts on any elements that load "above the fold" on a typical page, as these are what is going to affect what the user sees. Users rarely scroll down instantly, so if elements lower down on the page load a bit slower, almost no one will notice. But still optimize these lower elements reasonably because they also affect server load, bandwidth, and user CPU load.
If you use any elements at all in your page that are potentially very slow to load due to reasons beyond your control, such as content pulled from another server (ads, social media widgets, integrations with other websites, etc.), compartmentalize these in an element of fixed size, and ideally place it below the fold.
Avoid auto-ads, page-modifying AI (like Ezoic), or any other external add-ins that necessarily break or undermine one or more of these recommendations. For example, auto-ads are terrible because they rely on loading an external resource, they usually have heavy JavaScript libraries, and they also modify the page layout. Even the best-designed auto-ads will completely undermine all your other optimization efforts.
If you are running a company with multiple developers, quickly jettison any developers who are not fully committed to a lightweight, fast-loading web design. Ideally, don't ever hire such people to begin with. A lot of people get really vested in a certain philosophy or style of development that is at odds with lightweight design. The world would be a better place if these people were in a different line of work, rather than designing webpages.
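As a small illustration of the image-sizing points above (the file name is a placeholder):
<!-- Width and height are declared so the browser can reserve the layout
     box before the image arrives; the file itself is already exactly
     this size, so the browser never has to rescale it. -->
<img src="/img/product-640x480.jpg" width="640" height="480" alt="Product photo">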
So you've optimized your page.
This produces the outcome that, if the user clicks the link directly, they'll see the content above the fold fully loaded immediately or nearly-immediately, worst-case-scenario being that a couple images fill in in a second or two. By the time they scroll down, everything else will already be loaded. Any truly-slow-to-load content, such as Google Analytics tracking or other third-party services, will not be central to the appearance of the webpage itself, so the user will see a fully-loaded page even if there are still a few invisible elements loading behind the scenes.
On the other hand, if the user loads the link in a background tab, it will display as loading to the browser, showing the animated symbol in the tab, until it is truly fully loaded. Once it displays as loaded in the tab, if they click it, it will be fully loaded.
In addition, you will have made the page load really fast, which is a good thing in and of itself.
This is a win-win. The user sees a full-loaded page nearly instantly, there is almost never any waiting while looking at a half-displayed page, the loading symbol works as expected when loading a tab in the background, and on top of this you've netted a ton of side-benefits like reduced bandwidth and server CPU load, not to mention lessening the load on the user's CPU as well. (Many users HATE when your page cranks their CPU, and rightfully so.)
So yeah, your choice what to do, but there is only one real solution here and it is lightweight, efficient web design.