I am making a website that has multiple pages under multiple directories.
The way it is currently set up, there is a main page A.
On page A there is a menu with links to other pages. These pages are loaded into a div on page A using AJAX.
I'm currently trying to make it so users can bookmark the page with the content they selected from the menu already loaded. I've done this by putting the page's "directory/name" into a GET variable. The issue now is making the browser's back button give the same functionality it would on a page that isn't loaded this way.
Is there some way the back button can be used to get the previous pages and load them? I know the easiest way out is to forget about loading the pages this way, which would fix pretty well all of these problems, but at this point I'm curious about a workaround.
JavaScript is being used, but the jQuery library is not.
Cheers.
EDIT: Adding basic code functionality.
<div id="content">
<p>Hello</p>
</div>
<p onclick="load('dir/file')">Link1</p>
<p onclick="load('dir/file')">Link2</p>
<script>
// Attempt to restore the selected page when the user navigates.
window.onbeforeunload = function () {
    load(getVar("page"));
};

// Read a GET variable out of the current URL.
function getVar(name) {
    name = RegExp('[?&]' + name.replace(/([[\]])/g, '\\$1') + '=([^&#]*)');
    return (window.location.href.match(name) || ['', ''])[1];
}

// Fetch the page via AJAX and record it in the URL so it can be bookmarked.
function load(val) {
    loadPage(val + ".html", "content"); // Does an AJAX call and puts the content into the element named by the second value.
    window.history.pushState({}, "IBM", "https://labweb.torolab.ibm.com/groups/websphere/WASL3/l3_new/index.php?page=" + val);
}
</script>
I think that's the basic functionality of it.
I think what you're asking about is hash tag browsing.
jquery javascript: adding browser history back with hashtag?
That example uses jQuery, but all you have to do is fire an event when the link is clicked to update the page; it sounds like you have something like that already.
Change the URL in the browser without loading the new page using JavaScript
That example gives you a better idea of what you would do with just JavaScript.
Note that this will not work when they first load the page from a bookmark, because there is no browser history yet, but it will work when navigating between your pages.
<a href="#hashtag" class="loadpage" data-page="directory/name">
Many aspects of my site are dynamic. I am using jQuery.
I have a div which, once the DOM is ready, is populated using load().
Then, if a button is clicked, this value is replaced by another value, using load() once again.
This kind of setup is common across my site. My homepage is essentially lots of dynamically loaded, refreshed, and changeable content.
What are the repercussions of this for SEO?
I've seen sites where each page is loaded using load() and then displayed using the animation functions... It looks awesome!
People have posed this question before, but no one has answered it properly.
So, any ideas? jQuery and SEO?
Thanks
EDIT
Very interesting points. I don't want to overdo my site with JavaScript, just use it where necessary to make it look good; my homepage, however, is one place of concern.
So when the DOM is ready, it loads content into a div. On clicking a tab, this content is changed. I.e. no JS, no content.
The beauty here for me is that there is no duplicated code. Is the suggestion that I should simply 'print' some default content, then have the tabs link to pages (with the same content) if JS is disabled? I.e. sacrifice a little duplicate code for SEO?
As far as degrading gracefully goes, my only other place of concern is tabs on the same page. I have 3 divs, all containing content. On this page, two divs are hidden until a tab is clicked. I used this method before I started playing with JS. Would it perhaps be best to load() these tabs, then have the tab buttons link to the pages the content is pulled from?
Thanks
None of the content loaded via JavaScript will be crawled.
The common and correct approach is to use Progressive Enhancement: all links should be normal <a href="..."> links to actual pages, so that your site "makes sense" to a search spider; then a click event handler overrides the normal behavior with load(), so normal users with JavaScript enabled see the "enhanced" version of your site.
If your content is navigable when JavaScript is turned off, you'll be a good ways toward being visible to search engines.
Note that search engine crawlers won't be submitting any forms on your site, so if you have any form elements that are meant to navigate between your site's content pages, that content will not be navigable by search engines.
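A minimal sketch of that pattern (the a.nav selector and the #content container are assumptions):
// Links are plain <a href="/about.html" class="nav">About</a> elements,
// so crawlers and no-JS visitors follow them normally.
$(function () {
    $('a.nav').click(function (e) {
        e.preventDefault();            // JS users stay on the current page...
        $('#content').load(this.href); // ...and fetch the target via AJAX instead
    });
});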
Here are guidelines on how to get Google to crawl content loaded with AJAX: http://code.google.com/web/ajaxcrawling/docs/getting-started.html
I use jQuery load() for asynchronous page loading. It greatly improves the user experience, but it is not SEO-friendly. Here's the only solution I have found so far:
On the first load I do not use jQuery load(), and I try to write a cookie with JavaScript: document.cookie = 'checkjs=on';
On the next page load, if the PHP script finds this cookie, it means JavaScript is enabled and jQuery load() can be used. If there's no such cookie, then JavaScript is off (probably a spider came), so jQuery load() is not used.
if (!$_COOKIE['checkjs'] || $_COOKIE['checkjs'] != 'on') { echo 'js is off, hello Google!'; } else { echo 'js is on, can use jquery load'; }
This way I can be sure that most users benefit from asynchronous loading of page blocks, except for the very first load. And spiders get all the content too.
In your case you could just load the same page with a new parameter that makes another tab active. The spider is going to be happy.
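For completeness, the client-side half of this check might look as follows; the path and max-age are my additions, since the one-liner above omits them:
// On the first (non-AJAX) page load, record that JavaScript works so the
// PHP check above can switch to jQuery load() on subsequent requests.
document.cookie = 'checkjs=on; path=/; max-age=' + 60 * 60 * 24 * 30; // 30 days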
I'm going to be very specific.
I have a frontend...
http://www.eroticahub.site (not porn)
If you have javascript, it becomes... http://www.eroticahub.site/#!body=home [renders with jquery/ajax load]
If you don't have javascript, it remains...
http://www.eroticahub.site/
Then you click "Privacy" at the bottom.
If you have javascript, it loads the file /body/privacy.html into the main div and you get...
http://www.eroticahub.site/#!body=privacy [renders with jquery/ajax load]
If you don't have javascript, you just get... http://www.eroticahub.site/body/privacy_body.html
^ I'm just fetching the file that jquery/ajax is inserting into the template.
This isn't a very good solution. I want a page that never does a full refresh/reload but that is fully indexed by every major search engine.
Is it perhaps possible to make a command like this:
for each link in page:
    if user_has_javascript:
        return page_with_javascript
    else:
        return serverside_render(page_with_javascript)
That way, any user who doesn't have JavaScript (web crawlers included) will get a pure HTML/CSS version of the page. I'm planning on using Ruby for my backend. Does anyone have a clean solution to this problem?
First, make everything work with regular URLs and no JavaScript. You want your JS to be unobtrusive, so build it on top of a working, plain HTML + server-side solution.
Next, write JavaScript that fetches the data it needs from the server and updates the document to match another of the pages.
That JavaScript should use pushState to change the URL to match the URL of the page from the server that you are generating locally with JavaScript.
NB: pushState replaces hashbang URIs. It is a standard designed for the use case you described (while hashbangs were an ugly hack).
Bind that JavaScript to your link click / form submit / etc event.
Add a listener for the popstate event so that when the user clicks Back you can restore the page to its previous state.
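A minimal vanilla-JavaScript sketch of that flow; the a.ajax-nav class, the #content container, and the loadInto() helper are assumptions for illustration:
// Intercept clicks on ordinary links and load their content in place.
document.addEventListener('click', function (e) {
    var link = e.target.closest('a.ajax-nav'); // plain links that still work without JS
    if (!link) return;
    e.preventDefault();
    loadInto(link.href);
    history.pushState({}, '', link.href); // URL now matches the server-rendered page
});

// Restore the matching content when the user presses Back/Forward.
window.addEventListener('popstate', function () {
    loadInto(location.href);
});

// Fetch the server-rendered page and swap in just the part we need.
function loadInto(url) {
    fetch(url)
        .then(function (res) { return res.text(); })
        .then(function (html) {
            var doc = new DOMParser().parseFromString(html, 'text/html');
            document.getElementById('content').innerHTML =
                doc.getElementById('content').innerHTML;
        });
}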
Okay. Let's say a user goes straight to... eroticahub.site/privacy and then clicks a link to go to eroticahub.site/legal. The link looks like this:
<a href="eroticahub.site/legal.html" onclick="window.location.hash = 'legal.html'; return false;">
Link
</a>
So if the user has no JavaScript, they go to eroticahub.site/legal.html and request a whole new page from the server; if they do have JavaScript, they go to eroticahub.site#legal.html and do not request a whole new page from the server.
The # will trigger a hashchange event, which will call a function with a big switch statement containing (window.location.hash === "#legal.html"). This condition will trigger the loading of the snippets/legal.html HTML into the web page using jQuery/AJAX.
If the link goes to eroticahub.site/legal.html, the backend will deliver the same template it did for eroticahub.site/privacy.html, but with a different middle section containing the words from snippets/legal.html.
If the user has JavaScript, the middle section is rendered the same as if the user does not have JavaScript. It is only when the user clicks a link that the distinction must be made as to whether they have JavaScript. The AJAX would have to load the dynamic content (#legal) on top of (replacing) the static content in the content div of eroticahub.site/privacy, and this would in turn be replaced by more HTML in the exact same div. A convention would have to be maintained, such that:
<a href="eroticahub.site/legal.html" onclick="window.location.hash = 'legal.html'; return false;">
Link
</a>
<a href="eroticahub.site/privacy.html" onclick="window.location.hash = 'privacy.html'; return false;">
Link
</a>
<a href="eroticahub.site/user_content/stories.html" onclick="window.location.hash = 'user_content/stories.html'; return false;">
Link
</a>
etc.
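A sketch of the hashchange dispatcher described above; the #content selector is an assumption, while the snippets/ paths and hash values follow the convention from this thread:
$(window).on('hashchange', function () {
    switch (window.location.hash) {
        case '#legal.html':
            $('#content').load('snippets/legal.html');
            break;
        case '#privacy.html':
            $('#content').load('snippets/privacy.html');
            break;
        case '#user_content/stories.html':
            $('#content').load('snippets/user_content/stories.html');
            break;
    }
});

// Handle a bookmarked or directly entered hash on first load.
$(window).trigger('hashchange');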
I have a jQuery Mobile page with JavaScript inside. The problem is the JavaScript doesn't work unless the page is refreshed. Here is my code:
jQuery(function($) {
var url = window.location.search.substring(1);
$('#mydiv').load('real_news.asp?' + url);
});
To understand this problem you need to understand how jQuery Mobile works.
Your first problem is the point where you are trying to initialize your JavaScript. From your previous answers I can see you are using several HTML/ASP pages, and all of your JavaScript is initialized from the page <head>. This is the main problem. Only the first HTML file should have JavaScript placed in its <head> content. When jQuery Mobile loads other pages into the DOM, it loads only the <div> with a data-role="page" attribute. Everything else, including the <head>, will be discarded.
This is because the currently loaded page already has a <head>; there is no point in loading another page's <head> content. This goes even further: if you have several pages in a second HTML file, only the first one is going to be loaded.
I won't reinvent the wheel here, so here are links to my other two answers discussing this problem. Several solutions can be found there:
Why I have to put all the script to index.html in jquery mobile (or in this blog article)
Link fails to work unless refreshing
There's more than enough information there to give you an idea of what to do.
The basic solutions to this problem are:
Put all of your JavaScript into the first HTML/ASP file.
Move your JavaScript into the <body>; to be more precise, move it into the <div> with data-role="page". As I already pointed out, this is the only part of a page that is going to be loaded.
Use rel="external" when switching between pages, because it will trigger a full page refresh. Basically, you are telling jQuery Mobile that the page should act as part of a normal web application.
As Archer pointed out, you should use page events to initialize your code. But let me tell you more about this problem. Unlike classic web pages, when working with jQuery Mobile, document ready will usually trigger before the page is fully loaded/enhanced inside the DOM.
That is why page events were created. There are several of them, but if you want your code to execute only once (as with document ready) you should use the pageinit event. In any other case, use pagebeforeshow or pageshow.
If you want to find out more about page events and why they should be used instead of document ready take a look at this article on my personal blog. Or find it here.
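A minimal sketch of the pageinit approach, using the code from the question; the #index page id is an assumption:
$(document).on('pageinit', '#index', function () {
    // Runs once, when jQuery Mobile first initializes this page in the DOM.
    var url = window.location.search.substring(1);
    $('#mydiv').load('real_news.asp?' + url);
});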
Your question isn't exactly overflowing with pointers and tips, so I'm going with the thing that immediately sprung to mind when I saw it.
Document ready does not fire on page change with jQuery Mobile, due to "hijax", their method of "ajaxifying" all the links. Try this instead...
$(document).on("pageshow", function() {
var url = window.location.search.substring(1);
$('#mydiv').load('real_news.asp?' + url);
});
Try pageinit like this
$(document).delegate("body", "pageinit", function() { // Use body or page wrapper id / class
var url = window.location.search.substring(1);
$('#mydiv').load('real_news.asp?' + url);
});
It seemed like nothing ever worked for me. I tried many different fixes, but I had made the site too messy, so that even repositioning certain JavaScript files wouldn't make it work. Enough talk; here is what I came up with.
Write this in the <head>, at the top, above all other scripts:
<script type="text/javascript">
$(document).ready(function() {
    // Stops AJAX loading, thereby forcing a full page refresh.
    $("a,button,form").attr('data-ajax', 'false');
    // Re-enables AJAX loading where it is needed (in case you want popups or dialogs to work).
    $("a[data-rel],a[data-dialog],a[data-transition]").attr('data-ajax', 'true');
});
</script>
Ok, this is my problem... I have two pages. On the first page I have a header (h2) element to which I want to apply a border style. On the second page there is only one link. Clicking the link on the second page to navigate to the first page should create a border around the h2 tag on the first page.
Any thoughts on how to do this? Is it possible to do it with regular JavaScript or jQuery?
Thanks!
No. JavaScript is client-side, and for this you would require the server to remember whether the link was clicked, possibly by recording it in a database or session variable.
That is, of course, if you're using the traditional model of a website, rather than already loading pages using pure JS.
It would be a pretty stupid way of doing it, but it is possible to do it client-side. I would seriously recommend doing it server-side, though.
On page 2, link back to page 1 with a hash:
<a href="page1.html#border">Go back to page one and add a border</a>
And on page 1, check if there's a hash:
if (window.location.hash === '#border') {
$('h2').addClass('withBorder');
}
I think if you are looking at this kind of scenario, you can achieve it by passing a hash in the URL:
I passed a hash named 'second' in the second page's URL and put this script on the second page:
$(function(){
    var url = window.location;
    var hash = url.hash.substr(1);
    if (hash == "second") {
        $('h2').css('border', 'solid 1px red');
    }
});
Check this out if it helps.
Well, there is a way you could do this with JavaScript, although it's tricky, and server-side is a LOT easier. You would need to use some JavaScript to load different pages without refreshing the entire DOM. I do this with something called pjax. The way it works is to have each page act as a container that loads all subsequent pages via AJAX. Because there is never a full page reload, any style changes you make on one page get carried over to other pages (this does not survive an actual browser refresh).
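For reference, wiring up the jquery-pjax plugin looks roughly like this; the #pjax-container id is just the library's conventional example:
// Requires jQuery plus the jquery-pjax plugin (https://github.com/defunkt/jquery-pjax).
// Every <a> link now loads its target into #pjax-container via AJAX,
// updating the URL with pushState instead of doing a full page reload.
$(document).pjax('a', '#pjax-container');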
Let's say I want to create a website that contains one page. All the content is dynamic and generated using JavaScript with DOM replacement. The good thing about this is that it creates a better user experience, especially for applications that contain catalogues (online stores, galleries, etc). The problem now comes with linking. Let's say I'm browsing the site and I feel like sharing that particular thing I'm looking at with someone, but the problem is the link is always the same since it's JavaScript that's doing the magic. So the question comes: how can I create a fully JavaScript run website while maintaining the ability to link?
Now, there's hash linking, but I'm failing miserably. I've tried overriding all <a> tags, changing the hash, and preventing the default action, like so:
$("a").click( function(){
window.location.hash = $(this).attr("id");
processHash();
return false;
});
Yet, it will randomly scroll my body for no reason.
I'd like some insights on the restrictions of linking in a fully dynamic website. Thanks.
Here is one simple thing you can do:
window.onload = function () {
processHash();
}
or, using jQuery: $(function () {...});
What happens here is that when a page such as http://www.example.com/#some-link is loaded, the page content loads first, and then your link-handling function processHash() does its work.
Not even the new and shiny jQuery Mobile library is 100% AJAX, but it's close (in a very modern browser, obviously). Check out this doc site done in jQuery Mobile: http://jquerymobile.com/test/
If you dig into the docs a little, you'll see how they use hash linking with the framework and the HTML5 data-role="page" attribute.
Each <div data-role="page">is an independent page, if I remember right</div>