AJAX - Losing the Javascript data

I have been struggling with AJAX on my website, which happens to be a WordPress site.
What I am trying to do is refresh only the content of my blog. That is, my header, footer and sidebar shouldn't be refreshed when I navigate through my website.
It sounded easy to me when I first started, but I was wrong. I've been looking around for a way around the problem and found this, but it did not help... So, here is my terrible issue:
There are JavaScript scripts involved in my "refreshed content", and innerHTML does not keep the JS. Only the HTML is carried over... As a result, my plugins aren't working anymore.
So, I have been looking for a way to keep the JS content.
I hope I have been clear in describing my problems and pray for you guys to be able to help me :)
Here is my website : www.construction.urbaineparis.com
If you need more details, I will be very willing to give you the code you need to help.
Here is a part of the source that I believe contains the issue.
//start changing the page content.
jQuery('#' + AAPL_content).fadeOut("slow", function() {
    //See the below - NEVER TRUST jQuery to sort ALL your problems - this breaks Ie7 + 8 :o
    //jQuery('#' + AAPL_content).html(AAPL_loading_code);
    //Nothing like good old pure JavaScript...
    document.getElementById(AAPL_content).innerHTML = AAPL_loading_code;
    jQuery('#' + AAPL_content).fadeIn("slow", function() {
        jQuery.ajax({
            type: "GET",
            url: url,
            data: getData,
            cache: false,
            dataType: "html",
            success: function(data) {
                AAPL_isLoad = false;
                //get title attribute
                datax = data.split('<title>');
                titlesx = data.split('</title>');
                if (datax.length == 2 || titlesx.length == 2) {
                    data = data.split('<title>')[1];
                    titles = data.split('</title>')[0];
                    //set the title?
                    //after several months, I think this is the solution to fix & issues
                    jQuery(document).attr('title', (jQuery("<div/>").html(titles).text()));
                } else {
                    if (AAPL_warnings == true) {
                        alert("WARNING: \nYou seem to have more than one <title> tag on the page, this is going to cause some major problems so page title changing is disabled.");
                    }
                }
                //Google analytics?
                if (AAPL_track_analytics == true) {
                    if (typeof _gaq != "undefined") {
                        if (typeof getData == "undefined") {
                            getData = "";
                        } else {
                            getData = "?" + getData;
                        }
                        _gaq.push(['_trackPageview', path + getData]);
                    } else {
                        if (AAPL_warnings == true) {
                            alert("WARNING: \nAnalytics does not seem to be initialized! Could not track this page for google.");
                        }
                    }
                }
                ///////////////////////////////////////////
                // WE HAVE AN ADMIN PAGE NOW - GO THERE //
                ///////////////////////////////////////////
                try {
                    AAPL_data_code(data);
                } catch (err) {
                    if (AAPL_warnings == true) {
                        txt = "ERROR: \nThere was an error with data_code.\n";
                        txt += "Error description: " + err.message;
                        alert(txt);
                    }
                }
                //get content
                data = data.split('id="' + AAPL_content + '"')[1];
                data = data.substring(data.indexOf('>') + 1);
                var depth = 1;
                var output = '';
                while (depth > 0) {
                    temp = data.split('</div>')[0];
                    //count occurrences
                    i = 0;
                    pos = temp.indexOf("<div");
                    while (pos != -1) {
                        i++;
                        pos = temp.indexOf("<div", pos + 1);
                    }
                    //end count
                    depth = depth + i - 1;
                    output = output + data.split('</div>')[0] + '</div>';
                    data = data.substring(data.indexOf('</div>') + 6);
                }
                //put the resulting html back into the page!
                //See the below - NEVER TRUST jQuery to sort ALL your problems - this breaks Ie7 + 8 :o
                //jQuery('#' + AAPL_content).html(output);
                //Nothing like good old pure JavaScript...
                document.getElementById(AAPL_content).innerHTML = output;

Change
document.getElementById(AAPL_content).innerHTML = AAPL_loading_code;
to
$("#"+AAPL_content).html(AAPL_loading_code);
jQuery takes care of executing scripts that are in the HTML, which .innerHTML does not do.
I doubt this really breaks in IE 7, as your comment says, unless you're using jQuery 2.x (they've dropped support for old IE versions).
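To illustrate the difference, here is a minimal standalone sketch (the #demo element and the markup string are my own illustration, not from the question):

// A fragment that contains both markup and an inline script.
var fragment = '<p>Hello</p><script>console.log("script ran");<\/script>';

// Plain DOM assignment inserts the nodes but never executes the script.
document.getElementById('demo').innerHTML = fragment;   // logs nothing

// jQuery strips out the <script> blocks and evaluates them after inserting the markup.
jQuery('#demo').html(fragment);                          // logs "script ran"

So as long as the fetched page fragment carries its <script> tags along, swapping innerHTML for jQuery's .html() should let those scripts run again.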

Related

Page Navigation Script not fully working in Blogger

I picked up this great JavaScript from MS-Potilas of http://yabtb.blogspot.com/, which appends my next and previous blog post titles to the prev and next icons by pulling the title information from the blog feed. If that somehow fails, it has a backup, which is to pull the information from the URLs and turn that into the title in its PseudoTitle mode.
Thing is, it only works for about the newest half of my blog posts. After that, it switches into PseudoTitle mode.
Here's what I don't understand. It's supposed to work for 500 posts. My site only has 350+. So why does it seem to work properly for only the newest 100+ posts?
Also, is there something I can do so that I can increase the number of posts that this script will work for after I go past 500 posts?
Thanks a lot for your help.
Here's the script;
<script type='text/javascript'>
// Post titles to Older Post and Newer Post links (without stats skew)
// by MS-potilas 2012. See http://yabtb.blogspot.com/
//<![CDATA[
var urlToNavTitle = {};

function getTitlesForNav(json) {
  for (var i = 0; i < json.feed.entry.length; i++) {
    var entry = json.feed.entry[i];
    var href = "";
    for (var k = 0; k < entry.link.length; k++) {
      if (entry.link[k].rel == 'alternate') {
        href = entry.link[k].href;
        break;
      }
    }
    if (href != "") urlToNavTitle[href] = entry.title.$t;
  }
}

document.write('<script type="text/javascript" src="//'+window.location.hostname+'/feeds/posts/summary?redirect=false&max-results=500&alt=json-in-script&callback=getTitlesForNav"/>');

function urlToPseudoTitle(href) {
  var title = href.match(/\/([^\/_]+)(_.*)?\.html/);
  if (title) {
    title = title[1].replace(/-/g, " ");
    title = title[0].toUpperCase() + title.slice(1);
    if (title.length > 28) title = title.replace(/ [^ ]+$/, "...")
  }
  return title;
}

$(window).load(function() {
  window.setTimeout(function() {
    var href = $("a.blog-pager-newer-link").attr("href");
    if (href) {
      href = href.replace(/\http\:[^/]+\//, "https");
      var title = urlToNavTitle[href];
      if (!title) title = urlToPseudoTitle(href);
      if (title) $("a.blog-pager-newer-link").html("<< Newer<br />" + title);
    }
    href = $("a.blog-pager-older-link").attr("href");
    if (href) {
      href = href.replace(/\http\:[^/]+\//, "https");
      var title = urlToNavTitle[href];
      if (!title) title = urlToPseudoTitle(href);
      if (title) $("a.blog-pager-older-link").html("Older >><br />" + title);
    }
  }, 500);
});
//]]>
</script>
Seems I managed to figure it out.
Apparently, even though the script says max-results=500, the script is really only pulling 150 posts. I don't know why that is.
So I just added more retrieval scripts like this to cover the rest.
document.write('<script type="text/javascript" src="//'+window.location.hostname+'/feeds/posts/summary?redirect=false&max-results=150&start-index=151&alt=json-in-script&callback=getTitlesForNav"/>');

function urlToPseudoTitle(href) {
  var title = href.match(/\/([^\/_]+)(_.*)?\.html/);
  if (title) {
    title = title[1].replace(/-/g, " ");
    title = title[0].toUpperCase() + title.slice(1);
    if (title.length > 28) title = title.replace(/ [^ ]+$/, "...")
  }
  return title;
}
Many thanks to Adam over at http://too-clever-by-half.blogspot.com/ for providing the solution to the &start-index=151 extension.
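For reference, those per-chunk loaders can also be generated in a loop instead of pasted one by one. A minimal sketch, assuming the feed keeps honoring max-results/start-index in chunks of 150 and that getTitlesForNav is already defined as above:

// Load the summary feed in chunks of 150 until roughly 600 posts are covered.
// Each chunk calls back into getTitlesForNav(), exactly like the single call above.
var chunkSize = 150, maxPosts = 600;
for (var start = 1; start <= maxPosts; start += chunkSize) {
  document.write('<script type="text/javascript" src="//' + window.location.hostname +
    '/feeds/posts/summary?redirect=false&max-results=' + chunkSize +
    '&start-index=' + start +
    '&alt=json-in-script&callback=getTitlesForNav"><\/script>');
}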

DOM update slower in Chrome than Firefox - looks like Chrome has some rendering issues

I have written some code which dumps a large number of nodes into the DOM. When I load it in Firefox it renders in 2-3 seconds, but in Chrome (ver. 33) it freezes the UI and rendering takes a long time (8-10 seconds).
$.ajax({
    xhr: function () {
        var xhr = new window.XMLHttpRequest();
        xhr.addEventListener("progress", function (evt) {
            if (evt.lengthComputable) {
                var percentComplete = evt.loaded / evt.total * 100;
                $("#fetchProgress").attr("value", percentComplete);
            }
        }, false);
        return xhr;
    },
    type: 'GET',
    url: "/GetSomething",
    data: {},
    success: function (data) {
        //process and dump to DOM
        var fileLines = data.split('\n');
        var htmlString = '';
        for (var i = 0; i < fileLines.length; i++) {
            htmlString += '<span>' + (i + 1) + '. ' + fileLines[i] + '</span>';
            if ((i % 1000) == 0) {
                $("#textPlace").append(htmlString);
                htmlString = '';
            }
        }
        fileLines = null;
        $("#textPlace").append(htmlString);
    }
});
I learned from the internet that Chrome has some rendering bugs and tried the hacks from this URL:
"Chrome Bug - window.scroll method intercepts DOM Rendering"
It started to work, but now it is not working again. Please suggest something.
Any help is appreciated. Tank-size thanks in advance :)
If I understand your code, you have an array which you want to tie together with spans. You can remove the for() loop from your code (and the modulus check inside it, which is slow), saving a lot of time:
htmlString = '<span>'+ fileLines.join("</span><span>") +'</span>';
That will not display the i number, but you could switch to li's and use the numbers instead of bullets.
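A minimal sketch of that list variant (my own illustration, not from the original answer), letting an ordered list supply the numbering:

// Build one big string with a single join, then touch the DOM once.
// The <ol> numbers the entries, so the (i + 1) prefix is no longer needed.
var listHtml = '<ol><li>' + fileLines.join('</li><li>') + '</li></ol>';
$("#textPlace").append(listHtml);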
This might work too:
var fileLines = '<span>' + data.replace(/\n/g, '</span><span>') + '</span>';
This is a little more messy (note the regex with the g flag - .replace() with a plain string only replaces the first newline; it can also end in </span><span></span>, so you need to trim trailing \n's to fix that, which is easy to do), but it doesn't have to turn the data into an array, which should speed things up.
See if you can avoid the repeated append inside this block:
if ((i % 1000) == 0) {
    $("#textPlace").append(htmlString);
    htmlString = '';
}
and only have one append at the very end of your code. You want to limit the DOM manipulation - repeated appends hurt browser performance by triggering multiple reflows.
Google Dev: Speeding up JavaScript: Working with the DOM
Taken from the link above, you can work with something like this instead:
var anchor, fragment = document.createDocumentFragment();
for (var i = 0; i < 10; i++) {
    anchor = document.createElement('a');
    anchor.innerHTML = 'test';
    fragment.appendChild(anchor);
}
element.appendChild(fragment);
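Adapted to the snippet in the question (a sketch, assuming #textPlace is still the target container and fileLines is the split response data), the same idea looks roughly like this:

// Build every <span> inside a detached DocumentFragment, then append once.
var fragment = document.createDocumentFragment();
for (var i = 0; i < fileLines.length; i++) {
    var span = document.createElement('span');
    span.textContent = (i + 1) + '. ' + fileLines[i];
    fragment.appendChild(span);
}
// A single DOM insertion means a single reflow instead of thousands.
document.getElementById('textPlace').appendChild(fragment);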

jQuery Ajax call caching content

Ok I will try to make it simple.
1) I have 3 links that execute an Ajax Request and update a div with some content.
The DIV
<div id="content-to-update"></div>
The 3 links that update #content-to-update
example 1
example 2
example 3
Each link updates the div #content-to-update, passing one parameter named CODE.
The div #content-to-update is then updated by the code below.
var loading = false;
$(window).scroll(function () {
    var winTop = $(window).scrollTop();
    var docHeight = $(document).height();
    var winHeight = $(window).height();
    //if the user reaches the bottom of the page
    if (!loading && (winTop / (docHeight - winHeight)) > 0.95) {
        loading = true;
        //the CODE parameter is different on each call from the links that I
        //talked about earlier.
        $.get("/items/next/?list_name=" + CODE, function(data){
            //executing some javascript to display the next items
        }).done(function() {
            loading = false;
        });
    }
});
The problem is that it seems the browser keeps all the different versions of the updated div.
It's like the old content is not erased before the new content is added.
If I click on the first link and scroll, I get the right items. OK!
Then if I click on the second link, when I scroll I get the items twice (duplicated - it calls the code from the previous AJAX call).
Then if I click on the third link, when I scroll I get the items three times (it calls the code from the two previous AJAX calls).
When I use the Chrome debugger I see that it goes first into the code that received the parameter EXAMPLE_1, then into the code that received the parameter EXAMPLE_2, etc.
But this code should have been overridden by the call from the EXAMPLE_2 link.
It is difficult to explain; I don't know if anyone understands what I'm trying to explain, but I'm giving it a try :) and again, sorry for my English.
Thanks
I'm a bit picky about POST and GET, so even though Wayne is technically correct, the fact that you are retrieving data makes your use of GET the right way of doing it.
The way around caching is either by using jQuery's ajax method and setting cache to false, like so:
$.ajax({
    url: "/items/next/?list_name=" + CODE,
    type: 'GET',
    success: function(data) {
        $('#content-to-update').html(data);
    },
    cache: false,
    error: function(e) {
        alert("Server failure! Is the server turned off?");
    }
});
You can also trick the browser by adding a random string to the end of the URL, which is what I usually do. Something like this:
$.get("/items/next/?list_name=" + CODE + '&cache_buster=' + (new Date()).getTime().toString(), function(data){
//executing some javascript to display next items
}).done(function() {
loading = false;
})
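As a side note (not part of the original answer): jQuery can also apply that timestamp trick for you globally. cache: false is what appends the extra _=<timestamp> parameter to each request, and $.ajaxSetup makes it the default:

// Ask jQuery to cache-bust every request it makes from now on.
$.ajaxSetup({ cache: false });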
If you are using .html() to set the content, the error is definitely somewhere else. Ensure that you are not appending the new content to the div, which seems like what you are doing.
Also, your functions should act independently of one another. Your current process seems to support that, but your problem seems to suggest otherwise.
Try the suggestions first and if they don't work, post more code.
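If the handlers really are stacking up (each link click binding one more $(window).scroll callback, which would explain the duplicated items), here is a hedged sketch of one way to avoid that, using a jQuery event namespace so the previous handler is removed before the new one is bound (the function name is illustrative, and loading is assumed to be the same global flag as above):

function bindInfiniteScroll(code) {
    // Remove any previously bound handler for this feature before adding a new one.
    $(window).off('scroll.items').on('scroll.items', function () {
        var winTop = $(window).scrollTop();
        var docHeight = $(document).height();
        var winHeight = $(window).height();
        if (!loading && (winTop / (docHeight - winHeight)) > 0.95) {
            loading = true;
            $.get("/items/next/?list_name=" + code, function (data) {
                // render the next items
            }).done(function () {
                loading = false;
            });
        }
    });
}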
Update
Try this:
var loading = false;

function executeSomeAjax(CODE){
    $(window).scroll(function () {
        var winTop = $(window).scrollTop();
        var docHeight = $(document).height();
        var winHeight = $(window).height();
        //if the user reaches the bottom of the page
        if (!loading && (winTop / (docHeight - winHeight)) > 0.95) {
            loading = true;
            //the CODE parameter is different on each call from the links that I
            //talked about earlier.
            $.get("/items/next/?list_name=" + CODE, function(data){
                //executing some javascript to display the next items
            }).done(function() {
                loading = false;
            });
        }
    });
}
As you can see, the variable loading is now a global variable. I suspect that it was a local variable in your original function and, as a result, was reset to false every time the function ran.
Making it a global variable should resolve your issue.
Hope this helps.
UPDATE
OK, this is the final working code. Thanks to everybody for helping me out!
I think the problem was coming from low memory on my computer. The code you see below was used yesterday and was not working.
Since I rebooted the computer this morning, everything works like a charm. I have 4 GB of memory, and working with Grails 2.2.2 and IntelliJ IDEA I am often left with about 100 MB free, so I guess this had a side effect. I can't see any other explanation.
I hope this can help anyone who reads this post.
var loading = false;

function nextProject(){
    $('.logo').html('<img src="/images/ajax-loader-transparent.gif">');
    $.ajax({
        type: 'GET',
        url: "/project/next/",
        data: "list_name=" + CODE,
        beforeSend: function(){
            console.log("loading : " + loading)
        }
    }).done(function(data) {
        if(data != ""){
            var arrayOfObjects = eval(data);
            for(var i=0; i < arrayOfObjects.length; i++){
                TrackManager.newTrack(btoa(arrayOfObjects[i].base64Params));
                var projectMgr = new ProjectManager(arrayOfObjects[i].id);
                projectMgr.socialShare();
                <sec:ifNotLoggedIn >
                projectMgr.runDeny();
                </sec:ifNotLoggedIn>
                <sec:ifLoggedIn >
                projectMgr.runGranted(arrayOfObjects[i].likeUp, arrayOfObjects[i].inPlayList );
                </sec:ifLoggedIn>
                INC++;
            }
            loading = false;
            $('.logo').html('<img src="/images/soundshare_logo_32.png">');
            console.log(INC + "/" + PROJECT_COUNT );
        }
    }).fail(function(){
        console.error("Ajax error!")
    });
}

$(window).scroll(function(){
    var winTop = $(window).scrollTop();
    var docHeight = $(document).height();
    var winHeight = $(window).height();
    if ((winTop / (docHeight - winHeight)) > 0.95) {
        if(INC < PROJECT_COUNT){
            if(!loading){
                loading = true;
                nextProject()
            }
        }
    }
});

DOM manipulation slow in Chrome (hiding / showing elements)

I've put together a small test at http://jsfiddle.net/Hwqb3/3/ this morning. This is on the back of a larger project with pagination. I have tried this with native JS and jQuery. The test uses jQuery.
A quick search on SO says that Chrome handles things poorly if background-size is set, but this is not the case here. No trace of background-size in the source, and inspecting elements shows no background-size being set / inherited.
Ignore the initial page load while 5,000 elements are added to the list. It is only a few seconds, and it is just so there are some elements to test with.
In Firefox 18.0.1, moving between pages is almost instant, and in IE9 there is maybe a 0.1s delay between the mouse click and the paged results refreshing; however, in Chrome (24.0.1312.57 m) the delay is a noticeable 1-2 seconds.
I spent the majority of my night last night poring over my code to see if I could find the cause before writing this test. This is bare bones and still has the issue.
I can only assume that Chrome is handling the element.style.display=''; poorly. Without that (even looping through the 5,000 elements to display='none') the thing is snappy.
Any ideas? Client wants pagination on a result set of around 4,000 - 7,500, but doesn't want page reloads and doesn't understand that they should apply filters to whittle that list down to <100, as no one is ever going to page through 200 - 375 pages looking for something specific.
Last resort is AJAX calls, which may be slightly quicker on Chrome. Untested yet though.
Thanks in advance.
Code from jsfiddle, excluding the jQuery CDN link
HTML:
First
Previous
Next
Last
<br>
<ul id='list'>
</ul>
JS:
window.onload = function() {
    window.list = $('#list'), window.max = 20, window.page = 0, window.pages = 0, window.elements;
    var i = 0;
    while (i < 5000) {
        i++;
        list.append("<li>" + i + "</li>");
    }
    jump('first');
};

function jump(operation) {
    window.elements = list.find('li');
    window.pages = Math.ceil(window.elements.length / window.max);
    if (operation == 'first') {
        window.page = 0;
    }
    else if (operation == 'last') {
        window.page = (window.pages - 1);
    }
    else if (operation == '+1') {
        window.page = (window.page + 1);
        if (window.page >= window.pages) {
            window.page = (window.pages - 1);
        }
    }
    else if (operation == '-1') {
        window.page = (window.page - 1);
        if (window.page < 0) {
            window.page = 0;
        }
    }
    var showing = 0, total = 0;
    window.elements.each(function() {
        var show = false, self = $(this);
        if (showing < window.max) {
            if (total >= (window.page * window.max) && total < ((window.page * window.max) + window.max)) {
                self[0].style.display = '';
                showing++;
                show = true;
            }
        }
        if (!show) {
            self[0].style.display = 'none';
        }
        total++;
    });
}
check this
window.onload = function() {
    window.list = $('#list'),
    window.max = 20,
    window.page = 0,
    window.pages = 0,
    window.elements;
    var i = 0;
    var html = '';
    while (i < 5000) {
        i++;
        html += '<li>' + i + '</li>';
    }
    list.append(html);
    window.elements = list.find('li');
    window.pages = Math.ceil(window.elements.length / window.max);
    jump('first');
};

function jump(operation) {
    if (operation == 'first')
        window.page = 0;
    else if (operation == 'last')
        window.page = window.pages - 1;
    else if (operation == '+1')
        (window.page + 1 >= window.pages) ? window.page = window.pages - 1 : window.page++;
    else if (operation == '-1')
        (window.page - 1 < 0) ? window.page = 0 : window.page--;
    var index = page * window.max;
    window.elements.hide().slice(index, index + window.max).show();
}
http://jsfiddle.net/Hwqb3/16/
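For context, the HTML pasted in the question lost its anchor markup during extraction; the four pager labels simply invoke jump(). A hypothetical wiring using jQuery delegation, with made-up data-op attributes on the labels:

// Hypothetical: assumes the pager labels are wrapped in elements carrying
// data-op attributes, e.g. <span data-op="first">First</span>.
$(document).on('click', '[data-op]', function () {
    jump($(this).data('op'));   // 'first', '-1', '+1' or 'last'
});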

Can't change an iframe's onload listener in IE9

I have the following code within an external javascript file.
jQuery(function ($) {
    //////////////////////UPCOMING EVENTS JSON SERVER START///////////////////////////
    var eventList = $("#eventList"); //cache the element
    $.getJSON("/JsonControl/Events.json", function (jsonObj) {
        val = "";
        for (var i = 0; i < jsonObj.events.length; ++i) {
            val += "<p>" + jsonObj.events[i].dateMonth + "/" + jsonObj.events[i].dateNumber +
                "/" + jsonObj.events[i].dateYear + " - <span id='EL" + i + "' class='link' " +
                "onclick=plotEvent(" + i + ")>" + jsonObj.events[i].title + "</span></p>";
        }
        eventList.html(val);
    });
    //////////////////////UPCOMING EVENTS JSON SERVER END/////////////////////////////
});

function plotEvent(index)
{
    $.ajax({
        url: "/JsonControl/Events.json",
        dataType: 'json',
        async: false,
        success: function (jsonObj)
        {
            var eventBox = window.frameElement;
            alert("This alert fires in all browsers, including IE9")
            eventBox.onload = function ()
            {
                alert("This alert doesn't fire in IE9.")
                window.frameElement.onload = null; // unset it so it only fires once
                eventBox = eventBox.contentDocument || eventBox.contentWindow.document;
                eventBox.getElementById("title").innerHTML = (jsonObj.events[index].title);
                eventBox.getElementById("content").innerHTML = (jsonObj.events[index].explanation);
                eventBox.getElementById("dateHolder").innerHTML = (jsonObj.events[index].dateMonth + "-" + jsonObj.events[index].dateNumber + "-" + jsonObj.events[index].dateYear);
            };
            eventBox.src = "/Event htms/Event.htm";
        }
    });
}
The page that loads this script is in the iframe itself. A very similar function, called from a different external JS file on the main page outside of the iframe (for a different but similar purpose), works in all browsers just fine. The only difference is that with this code I have to target the onload of the iframe from within the iframe, instead of just grabbing the iframe by id. I then attempt to change the onload of said iframe for use with the next internal iframe page (which is why I need to preserve the JSON array index [i] when dynamically writing the first iframe page's innerHTML).
Sorry if that was a bit wordy and/or confusing, but suffice it to say that with the code pasted above I have no problems... except in IE (tried in IE9). I have tried dozens of examples and supposed solutions, but nothing has worked.
Here's what I mean when I say 'it doesn't work in IE9':
This part of the code within plotEvent() doesn't fire:
eventBox.onload = function ()
{
    alert("This alert doesn't fire in IE9.")
    window.frameElement.onload = null; // unset it so it only fires once
    eventBox = eventBox.contentDocument || eventBox.contentWindow.document;
    eventBox.getElementById("title").innerHTML = (jsonObj.events[index].title);
    eventBox.getElementById("content").innerHTML = (jsonObj.events[index].explanation);
    eventBox.getElementById("dateHolder").innerHTML = (jsonObj.events[index].dateMonth + "-" + jsonObj.events[index].dateNumber + "-" + jsonObj.events[index].dateYear);
};
Is there any solution to this problem, or is this sort of thing why iframes aren't used more often (that is, that IE doesn't fully support them)?
Try eventBox.contentWindow.onload or maybe $(eventBox).load(function)
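Here is a sketch of the second suggestion, meant to stand in for the eventBox.onload assignment inside the success callback above (note that jQuery's .load(handler) shortcut was deprecated in 1.8; .on('load', ...) / .one('load', ...) is the equivalent):

var eventBox = window.frameElement;
// Bind the load handler through jQuery instead of assigning .onload directly;
// .one() also takes care of the "fire only once" requirement.
$(eventBox).one("load", function () {
    var doc = eventBox.contentDocument || eventBox.contentWindow.document;
    doc.getElementById("title").innerHTML = jsonObj.events[index].title;
    // ...same element updates as in the question...
});
eventBox.src = "/Event htms/Event.htm";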
