Let's say I want to create a website that contains one page. All the content is dynamic and generated using JavaScript with DOM replacement. The good thing about this is that it creates a better user experience, especially for applications that contain catalogues (online stores, galleries, etc). The problem now comes with linking. Let's say I'm browsing the site and I feel like sharing that particular thing I'm looking at with someone, but the problem is the link is always the same since it's JavaScript that's doing the magic. So the question comes: how can I create a fully JavaScript run website while maintaining the ability to link?
Now there's hash linking, but I'm failing miserably. I've tried overriding all <a> tags, changing the hash, and preventing the default action like so
$("a").click( function(){
window.location.hash = $(this).attr("id");
processHash();
return false;
});
Yet, it will randomly scroll my body for no reason.
I'd like some insights on the restrictions of linking in a fully dynamic website. Thanks.
Here is one simple thing you can do:
window.onload = function () {
    processHash();
};
Or it can be done with jQuery's $(function () {...});.
What happens here is that when a page such as http://www.example.com/#some-link is loaded, the page content loads first, and then your link-handling function processHash() does its work.
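Put together, a minimal sketch might look like this; the processHash() body and the "-content" id convention are assumptions, since the original function wasn't posted:

function processHash() {
    // hypothetical handler: read the hash and show the matching content block
    var id = window.location.hash.substring(1);   // e.g. "some-link"
    if (id) {
        $(".page-section").hide();                // hide every section
        $("#" + id + "-content").show();          // show the one the hash points to
    }
}

// run it once on load (covers people arriving via a shared #some-link URL) ...
$(function () {
    processHash();
});

// ... and on every hash change, so back/forward keep working
$(window).bind("hashchange", processHash);

(As an aside, if the new hash matches the id of an element already on the page, the browser will scroll to it, which may be the "random" scrolling mentioned in the question.)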
Not even the new and shiny jQuery Mobile library is 100% AJAX, but it's close (obviously only in a very modern browser). Check out this doc site done in jQuery Mobile: http://jquerymobile.com/test/
If you dig into the docs a little you'll see how they use hash linking together with the framework and the HTML5 data-role="page" attribute.
Each <div data-role="page"> is an independent page, if I remember right.
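For reference, a bare-bones sketch of that multi-page layout (ids and content are placeholders):

<body>
    <!-- first "page": shown when the document loads -->
    <div data-role="page" id="home">
        <a href="#catalogue">Go to the catalogue</a>
    </div>

    <!-- second "page": jQuery Mobile swaps to it when the hash becomes #catalogue -->
    <div data-role="page" id="catalogue">
        <a href="#home">Back home</a>
    </div>
</body>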
Related
Many aspects of my site are dynamic. I am using jQuery.
I have a div which, once the DOM is ready, is populated using load().
Then if a button is clicked, using load() once again, this value is replaced by another value.
This kind of setup is common across my site. My homepage is essentially lots of dynamically loaded, refreshed, and changeable content.
What are the repercussions of this for SEO?
I've seen sites where each page is loaded using load() and then displayed using the animation functions... It looks awesome!
People have posed this question before, but no one has answered it properly.
So any ideas? jQuery and SEO?
Thanks
EDIT
Very interesting points. I don't want to overdo my site with JavaScript, just use it where necessary to make it look good. My homepage, however, is one place of concern.
So when the DOM is ready, it loads content into a div. On clicking a tab, this content is changed, i.e. no JS, no content.
The beauty here for me is that there is no duplicated code. Is the suggestion that I should simply 'print' some default content, then have the tabs link to pages (with the same content) if JS is disabled, i.e. sacrifice a little duplicated code for SEO?
As far as degrading goes, my only other place of concern is tabs on the same page. I have 3 divs, all containing content. On this page two divs are hidden until a tab is clicked. I used this method before I started playing with JS. Would it perhaps be best to load() these tabs, then have the tab buttons link to where the content is pulled from?
Thanks
None of the content loaded via JavaScript will be crawled.
The common and correct approach is to use Progressive Enhancement: all links should be normal <a href="..."> to actual pages so that your site "makes sense" to a search spider; and the click() event overrides the normal functionality with load() so normal users with JavaScript enabled will see the "enhanced" version of your site.
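A rough sketch of that pattern (the class name, container id, and URL are placeholders):

<!-- a real link: crawlers and non-JS users get a normal page load -->
<a class="ajax-nav" href="/products/shoes.html">Shoes</a>

<script>
    $(function () {
        // the enhanced version: users with JavaScript get an in-place update instead
        $("a.ajax-nav").click(function (e) {
            e.preventDefault(); // stop the normal navigation
            $("#content").load($(this).attr("href") + " #content > *");
        });
    });
</script>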
If your content is navigable when JavaScript is turned off, you'll be a good ways toward being visible to search engines.
Note that search engine crawlers won't be submitting any forms on your site, so if you have any <form> or <select> elements that are meant to navigate between your site's content pages, that content is not navigable by search engines.
Here are guidelines on how to get Google to crawl content loaded with AJAX: http://code.google.com/web/ajaxcrawling/docs/getting-started.html
I use jQuery load() for asynchronous page loads. It greatly improves the user experience, but it's not SEO-friendly. Here's the only solution I have found so far:
On the first load I do not use jQuery load(), and I try to write a cookie with JavaScript: document.cookie = 'checkjs=on';
On the next page load, if the PHP script finds this cookie it means that JavaScript is enabled and jQuery load() can be used. If there's no such cookie then JavaScript is off (probably a spider came by), so jQuery load() is not used.
if (!isset($_COOKIE['checkjs']) || $_COOKIE['checkjs'] != 'on') {
    echo 'js is off, hello Google!';
} else {
    echo 'js is on, can use jquery load';
}
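The client side of that check is a single line on the first load (a sketch; the path=/ flag is my addition so the cookie is visible site-wide):

// first page view: no jQuery load() yet, just record that JavaScript works
document.cookie = 'checkjs=on; path=/';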
This way I can be sure that most users benefit from asynchronous page-block loading, except for the very first load. And spiders get all the content too.
In your case you could just load the same page with a new parameter that makes another tab active. The spider is going to be happy.
I am making a website that has multiple different pages under multiple directories.
The way this is currently set up is that there is a main page A.
On Page A there is a menu that has links to other pages. These pages are loaded into a div on page A using AJAX.
I'm currently trying to make it so users can bookmark the page with the content they selected from the menu already loaded. I've done this by making a GET variable and passing the page "directory/name". The issue now is making it so the user can press the back button in their browser and get the same behaviour as on a page that isn't loaded this way.
Is there some way that back functionality can be used to fetch and load the previous pages? I know the easiest way of doing this is to forget about loading the pages, which would fix pretty well all of the problems, but at this point I'm curious about a workaround.
Javascript is being used but the jQuery library is not.
Cheers.
EDIT: Adding basic code functionality.
<div id="content">
<p>Hello</p>
</div>
<p onclick="load('dir/file')">Link1</p>
<p onclick="load('dir/file')">Link2</p>
<script>
    window.onbeforeunload = function () {
        load(getVar("page"));
    };

    function getVar(name) {
        name = RegExp('[?&]' + name.replace(/([[\]])/, '\\$1') + '=([^&#]*)');
        return (window.location.href.match(name) || ['', ''])[1];
    }

    function load(val) {
        loadPage(val + ".html", "content"); // This just does an AJAX call and puts the content into the element named by the second argument.
        window.history.pushState({}, "IBM", "https://labweb.torolab.ibm.com/groups/websphere/WASL3/l3_new/index.php?page=" + val);
    }
</script>
I think that's the basic functionality of it.
I think what you're asking about is hash-tag browsing.
jquery javascript: adding browser history back with hashtag?
This example uses jQuery, but all you have to do is attach an event that updates the page when the link is clicked; it sounds like you have something like that already.
Change the URL in the browser without loading the new page using JavaScript
This example gives you a better idea of what you would do with just JavaScript.
Now, this will not work when they load the page from a bookmark, because there is no browser history, but it will work when moving between your pages.
<a href="#hashtag" class="loadpage" data-page="directory/name">
I am working on a blog website at http://d361.azurewebsites.net/Blog. I have used a template for a full-page flip from http://tympanus.net/codrops/2012/12/11/fullscreen-pageflip-layou, which uses JavaScript and jQuery, other than the obvious.
I have 2 problems.
I need the Disqus plugin on every article of the blog, but since the whole website is basically one single webpage, I have not been able to implement it. Furthermore, anchor tags are not working.
Currently the Disqus plugin is working for the first article only.
I am also using social sharing buttons on the site. Again, they share only the main website link, i.e. d361.azurewebsites.net/Blog, and not the actual articles. I tried to use anchor tags but it is not working.
Kindly help me out here. As you must have already guessed, I don't know much beyond HTML and CSS.
Disqus won't work with single-page applications out of the box, you have to use our AJAX protocol to reload the thread with the new information. The process is documented here: http://help.disqus.com/customer/portal/articles/472107-using-disqus-on-ajax-sites
Whenever the page content is changing, you would call DISQUS.reset like this:
DISQUS.reset({
    reload: true,
    config: function () {
        this.page.identifier = "new_disqus_identifier";
        this.page.url = "http://example.com/#!new-url";
    }
});
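For a hash-driven single page like the blog above, that call would typically go wherever the visible article changes; a sketch (the identifier and URL scheme are assumptions):

window.addEventListener('hashchange', function () {
    var article = window.location.hash.replace(/^#!?/, ''); // e.g. "my-first-post"
    DISQUS.reset({
        reload: true,
        config: function () {
            this.page.identifier = article;
            this.page.url = 'http://d361.azurewebsites.net/Blog#!' + article;
        }
    });
});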
I have a jQuery Mobile page with JavaScript inside. The problem is the JavaScript doesn't work unless the page is refreshed. Here is my code:
jQuery(function ($) {
    var url = window.location.search.substring(1);
    $('#mydiv').load('real_news.asp?' + url);
});
To understand this problem you need to understand how jQuery Mobile works.
Your first problem is the point where you are trying to initialize JavaScript. From your previous answers I can see you are using several HTML/ASP pages, and all of your JavaScript is initialized from the page <head>. This is the main problem. Only the first HTML file should have JavaScript placed into the <head> content. When jQuery Mobile loads other pages into the DOM it loads only the <div> with a data-role="page" attribute. Everything else, including the <head>, will be discarded.
This is because the currently loaded page already has a <head>; there's no point in loading another page's <head> content. This goes even further: if you have several pages in a second HTML file, only the first one is going to be loaded.
I won't try to reinvent the wheel here, so here are links to my other two answers discussing this problem. Several solutions can be found there:
Why I have to put all the script to index.html in jquery mobile (or in this blog article)
Link fails to work unless refreshing
There's more than enough information there to give you an idea of what to do.
The basic solutions to this problem are:
Put all of your JavaScript into the first HTML/ASP file.
Move your JavaScript into the <body>; to be more precise, move it into the <div> with data-role="page". As I already pointed out, this is the only part of a page that is going to be loaded (see the sketch after this list).
Use rel="external" when switching between pages, because it will trigger a full page refresh. Basically, you tell jQuery Mobile that the page will act as a normal web application.
As Archer pointed out, you should use page events to initialize your code. But let me tell you more about this problem. Unlike with classic web pages, when working with jQuery Mobile, document ready will usually trigger before the page is fully loaded/enhanced inside the DOM.
That is why page events were created. There are several of them, but if you want your code to execute only once (as in the case of document ready) you should use the pageinit event. In any other case use pagebeforeshow or pageshow.
If you want to find out more about page events and why they should be used instead of document ready, take a look at this article on my personal blog. Or find it here.
Your question isn't exactly overflowing with pointers and tips, so I'm going with the thing that immediately sprung to mind when I saw it.
Document ready does not fire on page change with jQuery Mobile, due to "hijax", their method of "ajaxifying" all the links. Try this instead...
$(document).on("pageshow", function() {
var url = window.location.search.substring(1);
$('#mydiv').load('real_news.asp?' + url);
});
Try pageinit like this
$(document).delegate("body", "pageinit", function () { // Use body or the page wrapper id / class
    var url = window.location.search.substring(1);
    $('#mydiv').load('real_news.asp?' + url);
});
It seems like nothing ever worked for me. I tried many different fixes, but I made the site so messy that even the position of certain JavaScript files wouldn't make it work. Enough talk, here is what I came up with.
<!-- put this in the <head>, above all other JavaScript -->
<script type="text/javascript">
    $(document).ready(function () {
        // stops the AJAX load, thereby forcing a full page refresh
        $("a,button,form").attr('data-ajax', 'false');

        // re-enables the AJAX load where it's wanted (e.g. so popups or dialogs keep working)
        $("a[data-rel],a[data-dialog],a[data-transition]").attr('data-ajax', 'true');
    });
</script>
Is it possible to load full web pages with AJAX and how would I go about it?
I'm thinking that I can create individual pages as I normally would, and then use AJAX somehow to get that page, and present it where the user currently is. Is that the correct assumption to make?
Basically I'm aiming to make a more dynamic site, so when the user clicks an option it will scroll down and reveal the requested info, without a noticeable page redirect.
Any advice would be great.
Thanks.
jQuery's .load(url) method loads HTML directly into an element. So if you changed every <a> tag to trigger a .load() on your top-level element, you could do this. It would be a bit like using frames, but targeting a div instead of a frame.
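A sketch of what that would look like (the container id is a placeholder):

$(function () {
    // route every link through AJAX instead of a normal navigation
    $(document).on("click", "a", function (e) {
        e.preventDefault();
        $("#main").load($(this).attr("href"));
    });
});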
Of course it would break lots of things, like the back button, form handling, etc., unless you put a lot of work in.
So, like the doctor who, when told "It hurts when I do this", replied "well, don't do that then", the answer is probably "don't do that".
One possible way is to fetch the HTML and then write it into a div.
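For example (a sketch; the URL and target div are placeholders):

$.get("pages/about.html", function (html) {
    $("#content").html(html); // write the fetched markup into the placeholder div
});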
Yes it is possible. You could create a page/site from pure JavaScript fetching all elements from a web service or similar handler. It's a nightmare to maintain though, and you run into all sorts of problems depending on your needs. I did it as an exercise in learning jQuery, AJAX and a few other things. I found form submissions became tricky. While the data is posted to a web service with AJAX, managing the state of the page became very convoluted, and it only got nightmarish as the needs of the site grew.
I also found that in order to accomplish this, you have to make a choice to refresh the entire interface during transitions or just the section being changed. Refreshing the entire interface is cumbersome and for "rapid" users, the AJAX may not be able to keep up. It also causes collisions on the web service requests. If your page has 4 separate sections being updated, it is not uncommon for a web service request to be "lost" in the middle leaving a section without an update.
So the answer to your question is "yes". Reading your question closely, I would keep the scope of your requests to individual display pages without much functionality. The simpler you keep it, the easier it is to maintain and use.
I know this is an old post, but this is a nice little script that can do the job. It can add AJAX content loading to an existing non-AJAX site. Requires jQuery.
Script
<script type="text/javascript">
//Your navigation bar, can be "document" or body
var $navigation = $(".side");
//Your main content that will be replaces
var body = ".page";
var $body = $(body);
$navigation.delegate("a", "click", function() {
window.location.hash = $(this).attr("href");
return false;
});
$(window).bind('hashchange', function() {
var newHash = window.location.hash.substring(1);
if(newHash) {
$body.fadeOut(200, function() {
$body.hide().load(newHash + " " + body, function() {
$body.fadeIn(200, function() {
});
});
});
};
});
$(window).trigger('hashchange');
</script>
Details
All the links under $navigation will have a click event added to them that updates the window URL hash. The window listens for the hash change and uses the hash value to make an AJAX request that reloads the $body HTML.
Advantages
History (back & forward) navigation will work;
The same site will work with browsers that support JavaScript and browsers that don't;
If you copy-paste the URL, the script will load the correct page;
Because we are using the delegate function, any links that are added as a result of the AJAX load will also have the click event added to them.
Disadvantage
You can no longer use anchors on your site.
For more information and an example see: http://css-tricks.com/video-screencasts/85-best-practices-dynamic-content/