JS slider website - Google crawling - JavaScript

I'm currently building a website with a js slider.
Basically all the pages of the site (slides in this case) are contained in one big HTML page.
These pages/slides are organized within <div> tags, like <div id="slide1">, <div id="slide2">, etc.
As you slide through the pages/slides, the URL is updated, like www.mysite.com/#!slide1, then www.mysite.com/#!slide2, etc.
Is it possible to tell the crawling bots that each div should be treated as a separate page?
The slides are not loaded in ajax.

When you're serving your files with PHP or something similar, you can try this:
create links for all slides that a search engine can see (e.g. in a sitemap)
when a link is requested, deliver only that slide's content as HTML
store all other slides in JavaScript
before moving to another slide, append its content
Extra advantage:
even with JavaScript disabled, a user can go through all the slides
Disadvantages:
when your slides contain a lot of content, the JavaScript overhead can get heavy
it will work best if only one slide is visible at a time (or just append all slides on load)
Maybe this is a workable approach; a rough sketch follows below.
To enhance your URLs you could also use the HTML5 History API, but this won't work in any Internet Explorer before IE10.
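A minimal jQuery sketch of the idea; the slide markup, the #slider container, and the URL scheme are assumptions, not part of the original answer:

// Slides not delivered by the server are kept in JavaScript (markup assumed).
var slides = {
    slide2: '<div id="slide2"><p>Slide 2 content</p></div>',
    slide3: '<div id="slide3"><p>Slide 3 content</p></div>'
};

function goToSlide(id) {
    // Append the slide's markup only if it isn't in the DOM yet.
    if (!document.getElementById(id) && slides[id]) {
        $('#slider').append(slides[id]);
    }
    // Prefer the HTML5 History API (IE10+); fall back to hashbang URLs.
    if (window.history && window.history.pushState) {
        window.history.pushState(null, '', '/' + id);
    } else {
        window.location.hash = '!' + id;
    }
    // trigger the slide animation here
}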

Include code from jQuery load() onto page source code [duplicate]

Many aspects of my site are dynamic. I am using jquery.
I have a div which, once the DOM is ready, is populated using load().
Then if a button is clicked, using load() once again, this value is replaced by another value.
This kind of setup is common across my site. My homepage is essentially lots of dynamically loaded, refreshed, and changeable content.
What are the repercussions of this for SEO?
I've seen sites where each page is loaded using load() and then displayed using the animation functions... It looks awesome!
People have posed this question before, but no one has answered it properly.
So, any ideas? jQuery and SEO?
Thanks
EDIT
Very interesting points. I don't want to overdo my site with JavaScript, just use it where necessary to make things look good; my homepage, however, is one place of concern.
So when the DOM is ready, it loads content into a div. On clicking a tab, this content is changed. I.e. no JS, no content.
The beauty here for me is that there is no duplicated code. Is the suggestion here that I should simply 'print' some default content, then have the tabs link to pages (with the same content) if JS is disabled? I.e. sacrifice a little duplicated code for SEO?
As far as degrading goes, my only other place of concern is tabs on the same page. I have 3 divs, all containing content. On this page two divs are hidden until a tab is clicked. I used this method first, before I started playing with JS. Would it perhaps be best to load() these tabs, then have the tab buttons link to where the content is pulled from?
Thanks
None of the content loaded via JavaScript will be crawled.
The common and correct approach is to use Progressive Enhancement: all links should be normal <a href="..."> elements pointing to actual pages, so that your site "makes sense" to a search spider; a click() handler then overrides the normal navigation with load(), so users with JavaScript enabled see the "enhanced" version of your site.
If your content is navigable with JavaScript turned off, you'll be well on your way to being visible to search engines.
Note that search engine crawlers won't be submitting any forms on your site, so if you have any <form> or <select> elements that are meant to navigate between your site's content pages, that content will not be navigable by search engines.
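As a rough sketch of that pattern (the selectors and markup are illustrative, not from the answer): the links point at real, crawlable pages, and a click handler hijacks them for JS users:

$('a.nav-link').on('click', function (e) {
    e.preventDefault();                     // stop the full page navigation
    var url = $(this).attr('href');         // the real page a crawler would follow
    $('#content').load(url + ' #content');  // fetch just that page's #content
});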
Here are guidelines on how to get Google to crawl content loaded with AJAX: http://code.google.com/web/ajaxcrawling/docs/getting-started.html
I use jQuery load() for asynchronous page loads. It greatly improves the user experience, but it's not SEO-friendly. Here's the only solution I have found so far:
On first load I do not use jQuery load(), and I try to write a cookie with JavaScript: document.cookie = 'checkjs=on';
On the next page load, if the PHP script finds this cookie, it means JavaScript is enabled and jQuery load() can be used. If there's no such cookie, JavaScript is off (probably a spider came), so jQuery load() is not used.
if (!isset($_COOKIE['checkjs']) || $_COOKIE['checkjs'] != 'on') {
    echo 'js is off, hello Google!';
} else {
    echo 'js is on, can use jquery load';
}
This way I can be sure that most users benefit from asynchronous page-block loading, except for the very first load. And spiders get all the content too.
In your case you could just load the same page with a new parameter that makes another tab active. The spider will be happy.

Is it possible to direct which elements of a page are painted first by the browser?

I wanted to know if there was any way to control browser painting; for example, I'd like to load elements at the top of the page first so users can see content straight away. The elements at the bottom of the page can load last, as the user will not see them until they scroll down.
I'm looking to optimize my site, which currently has a 6 second load time, and I'd like to get it down to 1 second. This is mostly being caused by JS and images. I know that reducing both of these means I won't need to worry about directing the painting, but out of interest I just wanted to know if it was possible?
Apologies if my understanding of browser painting is very basic
It's not that difficult: all you need is AJAX. Load the initial markup and then load the rest of the page via AJAX.
Just load the page with the little markup you initially want to show the user. Then, as the user scrolls down, you can make AJAX calls to get XML, JSON, or HTML files and render them on your page, for example:
$(window).on("scroll", function() {
    var $document = $(document);
    var $window = $(this);
    // Fire when the user is within 400px of the bottom of the page
    if ($document.scrollTop() >= $document.height() - $window.height() - 400) {
        // make ajax call here and load the data
    }
});
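For example, the AJAX call inside that if block might look like the following; the URL, container, and guard flag are assumptions, not part of the original answer:

var loading = false;
function loadNextSection() {
    if (loading) return;  // the scroll event fires often; load one section at a time
    loading = true;
    $.get('/fragments/next-section.html', function (html) {
        $('#content').append(html);  // render the fetched markup
        loading = false;
    });
}

Calling loadNextSection() inside the if block above would append the next chunk of content as the user approaches the bottom of the page.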
After looking into this further, I found this article:
http://www.feedthebot.com/pagespeed/prioritize-visible-content.html
It provides a good way of directing which parts of the page are rendered first. By separating your content into above-the-fold and below-the-fold content, you can decide what needs to be delivered first, i.e. your main content rather than sidebar ads. Using inline styles to display your above-the-fold content will make it appear very quickly, since it won't need to wait for an external request.
But this is only good for simple CSS; if pages require complex CSS, then it's better to use an external file because:
"When you use external CSS files the entire file is cached (remembered) by the browser so it doesn't have to take the same steps over and over when a user goes to another page on your website. When you inline your CSS, this does not occur and the CSS is read and acted upon again when a visitor goes to another page of your website. This does not matter if your CSS is small and simple. If your CSS is large and complex, as they often tend to be, then you may want to consider that the caching of your CSS is a better choice."
http://www.feedthebot.com/pagespeed/inline-small-css.html

How to avoid loading all images when an HTML page is loaded in the browser?

My web page includes a lot of img tags, but when it is initially displayed, most of the images are hidden. I want to load the images only when the user shows an intention to view them; otherwise the page could generate too much network traffic.
I know I could insert the img tags into the DOM on the fly with JavaScript. But that way I lose the benefit of search engines indexing these images, and I want the search engine bots to see them.
Is there a way to keep the DOM structure unchanged while loading the images only when needed?
You could try lazy loading:
Lazy Load delays the loading of images on long web pages. Images outside the viewport are not loaded until the user scrolls to them. This is the opposite of image preloading.
demo: http://www.appelsiini.net/projects/lazyload/enabled_timeout.html
http://www.appelsiini.net/projects/lazyload
https://github.com/tuupola/jquery_lazyload
http://luis-almeida.github.io/unveil/
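Typical usage of the jQuery Lazy Load plugin linked above looks roughly like this; the class name is an assumption, and older plugin versions read the real URL from a data-original attribute:

// Markup (assumed): <img class="lazy" data-original="/img/photo.jpg" width="640" height="480">
$('img.lazy').lazyload();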
What you could do is put all the images in a <noscript> tag, so browsers without JavaScript, and thus search engines, can see them.
You can then add the images in using JavaScript manually, for those who do have it.
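A minimal sketch of that <noscript> approach (the class name is illustrative): when scripting is enabled the browser treats <noscript> contents as plain text, so the images inside are never fetched until you insert them yourself:

// Markup (assumed): <noscript class="lazy-img"><img src="/img/photo.jpg" alt="Photo"></noscript>
$('noscript.lazy-img').each(function () {
    // The raw markup inside <noscript> is available as text; parse and insert it.
    $(this).replaceWith($(this).text());
});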

Best practice for injecting a header or toolbar into a page?

Our webapp allows customers to view historical snapshots of pages on their site. We want to inject a header into the top of the page (something like the Digg or LinkedIn toolbar) that contains data like snapshot time, URL, and various other metrics.
We want to present these pages as close as possible to their original state.
So what is the best way to add a header into a page whilst preserving it as best possible?
Potential approaches we have looked at:
Sticking the cached page in an iframe. However, a surprising number of sites contain frame-breaking code, and we don't want to do anything hacky like trying to stop this.
Adding an absolutely/fixed positioned div to the top of the page with a high z-index. The problems with this approach are that a) some of your styling may get overwritten, b) JavaScript that runs on DOM load can mess with your HTML/CSS (Plone-powered sites add classes and styles to all tables, for example), and c) varying DOCTYPEs, or the lack of one, can break our CSS (yes IE, looking at you).
Adding an absolutely positioned iframe to the top of the page with a high z-index. This gets around any of our HTML/CSS being clobbered or amended. However, again we have DOCTYPE issues: we'd like it fixed in position, and IE7 doesn't support that in quirks mode.
Any thoughts? Thanks
Why would you want to use a banner with a height of 100px? I see some other possibilities:
Can't you use a link to a popup or page with more information?
Or make it slide out when you hover over it.
That way it will not obscure a large percentage of the site.
If you control the links that lead to an archived version, you could put in a proxy URL. Let that URL open the right HTML in a frame, much like Google's cache:
show a list of links that look like pagearchive.html?version=43234324
let pagearchive.html be an HTML page with an iframe that starts 100px from the top. The version=43234324 part lets you open the right URL in the frame.
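A hedged sketch of the script inside pagearchive.html; the snapshot URL scheme and the iframe id are assumptions:

// Read ?version=43234324 from the query string...
var version = new URLSearchParams(window.location.search).get('version');
// ...and point the iframe (positioned 100px from the top) at that snapshot.
document.getElementById('archive-frame').src =
    '/snapshots/' + encodeURIComponent(version) + '.html';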

Ajax entire page - display only one div and retain CSS and other header material?

A client wants a merch shop on their site, and has set one up. I could iframe the whole merch page in, but frankly the merch site is an eyesore, and their site has a very particular feel to it. So I'm considering using an AJAX GET to grab the whole page, then JavaScript to display only the div with the merchandise in it. However, there are a lot of JavaScript includes (etc.) on the merch site that I'd need to make sure are still present for the div to work correctly.
Any feeling on if this would work or not? Would the displayed div take its stylesheet and scripts from the AJAX'd page? Can I put the div in an iframe instead?
Opinions?
It sounds like an ugly solution. Isn't it better to do this server-side instead? For example, let a PHP script read in the page and do whatever magic it takes to display it.
Using AJAX to load entire pages is ugly for a couple of reasons, including:
It breaks the URLs (can be worked around but requires extra work)
It's hard for search engines to crawl your site
It breaks some GUI elements in the browser, such as loading visualisations
Looks like you can use the jQuery load() function: http://docs.jquery.com/Ajax/load
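For example, a sketch of pulling just the merch <div> with load(); the URL and selectors are assumptions:

// Fetch the merch page and insert only its #products div into our page.
$('#merch-container').load('/store/index.html #products');

Note that load() is subject to the same-origin policy, so if the merch shop lives on another domain you'd need a server-side proxy (as the answer above suggests) to fetch the page first.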
