I'm trying to fetch posts dynamically using AJAX and jQuery by checking whether the user is close to the bottom of the page. The server side is written in Python on GAE.
Listening for scroll:
this.config.window.on('scroll',this.loadContent);
1. Check the distance from the bottom.
2. Send an AJAX request with the number of current posts in order to retrieve the next 10.
3. results.check === 'true' means that the server has no further posts to send.
loadContent: function(){
    // 1
    if($(document).height() - $(window).height() - $(window).scrollTop() < 1000) {
        var posts = $('.troll').children('div').length;
        var data = 'loadmore=True&offset=' + posts; // 2
        $.ajax({
            url: '/',
            type: 'POST',
            data: data,
            dataType: 'json',
            success: function(results){
                if (results.check === 'true'){ // 3
                    $(window).unbind('scroll');
                    return;
                }
                Post.insert10Values(results);
            }
        });
    }
},
insert10Values: function(results){
    var update = Handlebars.compile($('#troll10').html()),
        troll10update = update(results);
    $('div.troll').append( troll10update );
}
The problem is that when scrolling fast, two or more requests are sent to the server and I get duplicate entries. I want to rate-limit the requests on the client side.
Set a flag loading = false. Before you send a request, check the flag: if it's false, set it to true and proceed with the request; otherwise ignore the event. When the results arrive, show them and set the flag back to false.
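A minimal sketch of that approach applied to the handler above (assuming the same markup and the Post.insert10Values helper from the question; the flag lives in an outer variable so every scroll event sees it):

var loading = false; // true while a request is in flight

$(window).on('scroll', function(){
    if ($(document).height() - $(window).height() - $(window).scrollTop() < 1000) {
        if (loading) return; // a request is already running, ignore this event
        loading = true;
        var posts = $('.troll').children('div').length;
        $.ajax({
            url: '/',
            type: 'POST',
            data: 'loadmore=True&offset=' + posts,
            dataType: 'json',
            success: function(results){
                if (results.check === 'true'){ // server has no further posts
                    $(window).unbind('scroll');
                    return;
                }
                Post.insert10Values(results);
            },
            complete: function(){
                loading = false; // allow the next request once this one finishes (success or error)
            }
        });
    }
});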
Part of your problem is that the scroll event triggers many times per second.
You can throttle any function call by doing something like this:
var scrollTimer = false;
var delay = 500; /* half a second */
$(window).on('scroll', function(){
    if (scrollTimer) {
        clearTimeout(scrollTimer);
    }
    scrollTimer = setTimeout(function(){
        /* run your code here */
    }, delay);
});
As for the AJAX, you could store the time of the last AJAX call and require a minimum difference between now and the stored time before making a new call:
var lastAJAX = Date.now(), AJAXMin = 5000; /* 5 seconds */
function checkAJAXCalls(){
    var now = Date.now(), diff = now - lastAJAX;
    if (diff >= AJAXMin) {
        lastAJAX = now;
        return true;
    } else {
        return false;
    }
}
Then run if(checkAJAXCalls()) prior to making a request. The concept could be modified to update lastAJAX in the success callback of $.ajax instead.
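For example, inside the timer callback above (just a sketch; loadContent() is a hypothetical name standing in for whatever function actually performs the $.ajax call):

scrollTimer = setTimeout(function(){
    if (checkAJAXCalls()) { // at least AJAXMin ms have passed since the last request
        loadContent();      // hypothetical: the function that fires the $.ajax call
    }
}, delay);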
jQuery.ajax has an option called beforeSend. It is executed right before your AJAX call. You can use it to check whether another request is in progress and cancel the call if there is one: if you return false from the beforeSend function, the AJAX call will not be fired, so you won't get any duplicate content.
$.ajax({
    url: '/',
    type: 'POST',
    data: data,
    dataType: 'json',
    beforeSend: function() {
        if (window.nextPageProcess) {
            return false; // a request is already running, cancel this one
        } else {
            window.nextPageProcess = 1;
        }
    },
    success: function(results){
        if (results.check === 'true'){ //3
            $(window).unbind('scroll');
            return;
        }
        Post.insert10Values(results);
        window.nextPageProcess = 0; // reset the flag so the next page can be requested
    }
});
Related
In the code below I am making an API call to my backend node.js app using setTimeout(), which re-runs my AJAX call every 5 seconds. Inside my AJAX success I display divContent1 and divContent2 based on a certain condition, which should execute at least once. After that, only divContent2 should be visible on each setTimeout() call.
index.html
<script type="text/javascript">
$(document).ready(function(){
$.ajax({
url: "http://localhost:8070/api/route1",
type: 'POST',
dataType:'json',
success: function(res) {
//Some Task
}
});
$("#myButton").click(function(){
const route2 = function() {
$.ajax({
url: "http://localhost:8070/api/route2",
type: "POST",
dataType: "json",
data: { var1: val1 },
success: function (res) {
// Various tasks
if(res.flag){
$("#divContent1").hide();
$("#divContent2").show();
}
else{
$("#divContent1").show();
}
//Functions that handle div content data
},
beforeSend: function() {
$("#divContent1").hide();
$("#divContent2").hide();
},
complete: function() {
setTimeout(route2,5000);
},
});
};
$(function(){
route2();
})
});
});
</script>
The setTimeout() calls the entire route2 function, which handles all the display and insertion of div content. However, the requirement is to display only divContent2 from the second call onwards.
I'm looking for a solution to this.
You're calling route2 recursively with setTimeout(route2, 5000) under complete, so this will run indefinitely, because complete fires each time an AJAX call finishes (whether it succeeds or fails). What you can do is create a timer and clear it after the second execution, something like this:
var ctr = 0, timer = 0;
const route2 = function() {
    $.ajax({
        ...
        success: function (res) {
            // Write your logic based on ctr
        },
        complete: function() {
            if (ctr > 0) {
                clearTimeout(timer);
            } else {
                timer = setTimeout(route2, 5000);
                ctr = ctr + 1;
            }
        },
    });
};
Will an external variable be enough? Just define it in the outer context and set/check it to choose the behavior:
// before declaring button click handler
var requestDoneAtLeastOnce = false;
// ...
// somewhere in success handler
success: function (res) {
if (!requestDoneAtLeastOnce) {
requestDoneAtLeastOnce = true;
// do something that belongs only to handling the first response
}
else {
// this is at least the second request, the other set of commands belongs here
}
}
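Applied to the divContent markup from the question, the two branches might look roughly like this (a sketch, reusing the res.flag condition from the original success handler):

if (!requestDoneAtLeastOnce) {
    requestDoneAtLeastOnce = true;
    // first response: pick the div exactly as before
    if (res.flag) {
        $("#divContent1").hide();
        $("#divContent2").show();
    } else {
        $("#divContent1").show();
    }
} else {
    // every later response: only divContent2 is ever shown
    $("#divContent1").hide();
    $("#divContent2").show();
}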
Is there any way to check whether the event has completed and the element is free to perform another action?
For example, I want to do:
$('#button-cancel').on('click', function() {
// send ajax call
});
/****************************************
extra code
*******************************************/
$('#button-cancel').on('click', function() {
if(ajax call is completed) {
//do some thing
}
});
I don't want to send the AJAX call in the second onclick handler, as it has already been sent; I just want to check whether the AJAX call is done and then do something.
You can introduce a helper variable:
// introduce variable
var wasAjaxRun = false;
$('#button-cancel').on('click', function() {
// in ajax complete event you change the value of variable:
$.ajax({
url: "yoururl"
// other parameters
}).done(function() {
// your other handling logic
wasAjaxRun = true;
});
});
$('#button-cancel').on('click', function() {
if(wasAjaxRun === true) {
//do some thing
}
});
EDIT: I just noticed that you have both event handlers attached to the same button. In that case my initial answer would not work, because the first event handler would be executed every time you click the button.
It is not very clear from the description what you want your first event handler to do. I assume you want to use some data: if you already have it, you use it immediately (as in the second handler); if you don't, you make the AJAX call to get it (as in the first handler).
For such a scenario you could use a single event handler with some conditions:
var isAjaxRunning = false; // true only if AJAX call is in progress
var dataYouNeed; // stores the data that you need
$('#button-cancel').on('click', function() {
if(isAjaxRunning){
return; // if AJAX is in progress there is nothing we can do
}
// check if you already have the data, this assumes you data cannot be falsey
if(dataYouNeed){
// You already have the data
// perform the logic you had in your second event handler
}
else { // no data, you need to get it using AJAX
isAjaxRunning = true; // set the flag to prevent multiple AJAX calls
$.ajax({
url: "yoururl"
}).done(function(result) {
dataYouNeed = result;
}).always(function(){
isAjaxRunning = false;
});
}
});
You should be able to provide handlers for the AJAX success and error results, e.g.:
$.ajax({
type: "post", url: "/SomeController/SomeAction",
success: function (data, text) {
//...
},
error: function (request, status, error) {
alert(request.responseText);
}
});
You can disable the button as soon as the click handler is entered and enable it again in the AJAX success or error callback:
$('#button-cancel').on('click', function() {
    var $btn = $(this).prop('disabled', true); // disable the button
    $.ajax({ url: 'yoururl' })
        .done(function() {
            // do something
        })
        .always(function() {
            $btn.prop('disabled', false); // enable it back once the call has finished
        });
});
This is an edited, more complete version of dotnetums's answer, which looks like it will only work once.
// introduce variable
var ajaxIsRunning = false;
$('#button').on('click', function() {
    // check the state of the variable; if a call is running, quit
    if (ajaxIsRunning) return alert("please wait, ajax is running..");
    // else mark it as true
    ajaxIsRunning = true;
    // in the ajax done callback you change the value of the variable:
    $.ajax({
        url: "yoururl"
    }).done(function() {
        // set it back to false so the button can be used again
        ajaxIsRunning = false;
    });
});
You just need to set a flag indicating that an AJAX call is underway, then clear it when the call returns.
var ajaxProcessing = false;

$('#button-cancel').on('click', function(){
    processAjaxCall();
});

function processAjaxCall() {
    if (ajaxProcessing) return;
    ajaxProcessing = true; // set the flag
    $.ajax({
        url: 'http://stackoverflow.com/questions/36506931/javascript-how-to-check-if-operation-has-been-completed-on-this-event'
    })
    .done(function(resp){
        // do something
        alert('success');
    })
    .fail(function(){
        // handle error
        alert('error');
    })
    .always(function(){
        ajaxProcessing = false; // clear the flag
    });
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<button id="button-cancel">Cancel</button>
What you can do is call a function once the AJAX call has completed, for example from its done() callback:
$.ajax({
    url: "yoururl"
}).done(function() {
    checkDone(); // runs only after the AJAX call has completed
});

function checkDone() {
    alert("Done");
}
This topic is covered in a few other questions, but I had some difficulty applying the suggested approaches to this use case. I have a checkbox list where a user can select n sub-sites to publish their post to. Since this list could grow to 100+, I need an efficient way to perform an expensive task on each one. It's okay if it takes a while, as long as I'm providing visual feedback, so I planned to apply an "in progress" style to each checkbox item as it's being worked on, then move to the next item in the list once it has been successfully published. Also note: I'm working with the WordPress wp_ajax_ hook, but the PHP side of things is working well; this question is focused on the JS solution.
This code is working right now (console.logs left in for debugging), but I've seen multiple warnings against using async: false. How can I achieve a waterfall AJAX loop in a more efficient way?
//Starts when user clicks a button
$("a#as_network_syndicate").click( function(e) {
e.preventDefault(); //stop the button from loading the page
//Get the checklist values that are checked (option value = site_id)
$('.as-network-list').first().find('input[type="checkbox"]').each(function(){
if($(this).is(':checked')){
var blog_id = $(this).val();
console.log(blog_id+' started');
$(this).parent().addClass('synd-in-progress'); //add visual feedback of 'in-progress'
var process = as_process_syndication_to_blog(blog_id);
console.log('finished'+blog_id);
$(this).parent().removeClass('synd-in-progress');
}
});
});
function as_process_syndication_to_blog(blog_id){
var data = {
"post_id": $('#as-syndicate_data-attr').attr("data-post_id"), //these values are stored in hidden html elements
"nonce": $('#as-syndicate_data-attr').attr("data-nonce"),
"blog_id": blog_id
};
var result = as_syndicate_to_blog(data);
console.log('end 2nd func');
return true;
}
function as_syndicate_to_blog(data){
$.ajax({
type : "post",
dataType : "json",
async: false,
url : ASpub.ajaxurl, //reference localized script to trigger wp_ajax PHP function
data : {action: "as_syndicate_post", post_id : data.post_id, nonce: data.nonce, blog_id: data.blog_id},
success: function(response) {
if(response.type == "success") {
console.log(response);
return response;
} else {
}
},
error: function () {
}
});
}
Indeed, doing synchronous AJAX requests is bad because they block the browser during the whole AJAX call. This means that the user cannot interact with your page during this time. In your case, if you're doing, say, 30 AJAX calls which each take 0.5 seconds, the browser will be blocked for 15 whole seconds; that's a lot.
In any case, you could do something following this pattern:
// some huge list
var allOptions = [];

function doIntensiveWork (option, callback) {
    // do whatever you want,
    // then call 'callback' when the work is done
    callback();
}

function processNextOption () {
    if (allOptions.length === 0) {
        // list is empty, so you're done
        return;
    }

    // get the next item
    var option = allOptions.shift();

    // process this item, and call "processNextOption" when done.
    // if "doIntensiveWork" is asynchronous (using AJAX for example),
    // this is fine as is:
    doIntensiveWork(option, processNextOption);

    // but if "doIntensiveWork" is synchronous, you should let the
    // browser breathe a bit; use this call INSTEAD of the one above:
    // doIntensiveWork(option, function () {
    //     setTimeout(processNextOption, 0);
    // });
}
processNextOption();
Note: as Karl-André Gagnon said, you should avoid making many AJAX requests with this technique. Try combining them if you can; it will be better and faster.
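Combining them could be as simple as collecting every checked blog ID first and sending a single request (a sketch only; it assumes the server-side as_syndicate_post action is changed to accept an array of IDs under a hypothetical blog_ids parameter, which is not shown in the question):

// gather all checked blog IDs up front
var blog_ids = $('.as-network-list').first()
    .find('input[type="checkbox"]:checked')
    .map(function(){ return $(this).val(); })
    .get();

$.ajax({
    type: "post",
    dataType: "json",
    url: ASpub.ajaxurl,
    data: {
        action: "as_syndicate_post",
        post_id: $('#as-syndicate_data-attr').attr("data-post_id"),
        nonce: $('#as-syndicate_data-attr').attr("data-nonce"),
        blog_ids: blog_ids // hypothetical parameter: the PHP side would loop over these
    },
    success: function(response) {
        console.log(response);
    }
});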
If you can't pass the whole block to the server to be processed in bulk, you could use a jQuery queue. This is using your sample code as a base:
var $container = $('.as-network-list').first();
$container.find('input[type="checkbox"]:checked').each(function(){
var $input = $(this);
$container.queue('publish', function(next) {
var blog_id = $input.val(),
$parent = $input.parent();
console.log(blog_id+' started');
$parent.addClass('synd-in-progress'); //add visual feedback of 'in-progress'
as_process_syndication_to_blog(blog_id).done(function(response) {
console.log(response);
console.log('finished'+blog_id);
$parent.removeClass('synd-in-progress');
next();
});
});
});
$container.dequeue('publish');
function as_process_syndication_to_blog(blog_id){
var data = {
"post_id": $('#as-syndicate_data-attr').attr("data-post_id"), //these values are stored in hidden html elements
"nonce": $('#as-syndicate_data-attr').attr("data-nonce"),
"blog_id": blog_id
};
return as_syndicate_to_blog(data).done(function(){ console.log('end 2nd func'); });
}
function as_syndicate_to_blog(data){
return $.ajax({
type : "post",
dataType : "json",
url : ASpub.ajaxurl, //reference localized script to trigger wp_ajax PHP function
data : {action: "as_syndicate_post", post_id : data.post_id, nonce: data.nonce, blog_id: data.blog_id}
});
}
I don't have a test environment for this so you may need to tweak it for your use case.
Note: simplified example.
I've got a page with 1000 table rows. For each row, I need to "do some work" on the server via an AJAX call, then in the callback, update that table row to say "done".
Initially I tried just firing off the 1000 AJAX requests inside the .each selector, but the browser was locking up.
So I changed it to try to use an internal AJAX counter, so that only 50 are ever fired off at a time.
Here's the code:
$('#do').click(function () {
var maxAjaxRequests = 50;
var ajaxRequests = 0;
var doneCounter = 0;
var toDo = $('#mytable tr').length;
$.each($('#mytable > tr'), function (i, v) {
while (doneCounter < toDo) {
if (ajaxRequests <= maxAjaxRequests) {
ajaxRequests++;
doAsyncStuff($(this), function () {
ajaxRequests--;
doneCounter++;
});
} else {
setTimeout(function() {
}, 1000);
}
}
});
});
function doAsyncStuff(tr, completeCallback) {
$.ajax({
url: '/somewhere',
type: 'POST',
dataType: 'json',
data: null,
contentType: 'application/json; charset=utf-8',
complete: function () {
completeCallback();
},
success: function (json) {
// update ui.
},
error: function (xmlHttpRequest, textStatus, errorThrown) {
// update ui.
}
});
}
But the browser is still locking up. It never goes into the $.ajax complete callback, even though I can see the request coming back successfully (via Fiddler). Therefore it's just sleeping, looping, sleeping, etc., because the callback never runs.
I've got a feeling that the entire doAsyncStuff function needs to be asynchronous?
Any ideas on what I am doing wrong (or how I can do this better)?
You are running a while loop inside the .each callback function, so there are far more AJAX requests than 1000; in the worst case 1000*1000.
You could delay each AJAX request by a different amount of time.
$('#do').click(function () {
$('#mytable > tr').each(function (i, v) {
var $this = $(this);
setTimeout(function () {
doAsyncStuff($this, function () {
console.log('complete!');
});
}, i * 10);
});
});
The browser gets locked because of the while: you are creating an endless loop.
The while loop runs over and over waiting for doneCounter to be increased, but the JavaScript engine cannot execute the AJAX success callback since it is stuck in the while.
var callQueue = [];
$('#mytable > tr').each(function(key, elem){ callQueue.push($(this)); });

var asyncPageLoad = function(){
    if (callQueue.length === 0) return; // all rows processed, stop recursing
    var tr = callQueue.shift(); // take the next row off the queue
    $.ajax({
        url: '/somewhere',
        type: 'POST',
        dataType: 'json',
        data: null,
        contentType: 'application/json; charset=utf-8',
        complete: function () {
            asyncPageLoad(); // only start the next request once this one has finished
        },
        success: function (json) {
            // update ui.
        },
        error: function (xmlHttpRequest, textStatus, errorThrown) {
            // update ui.
        }
    });
};
asyncPageLoad();
This will make the requests one by one. If you want, simply use a for() loop to kick off maybe 5 calls at once, and increase the number if the browser copes fine.
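A small sketch of that idea: start a handful of "workers", each of which pulls the next row off the queue when its own request finishes (this reuses the callQueue/asyncPageLoad above, which already stops when the queue is empty):

var concurrentCalls = 5; // tune this depending on how well the browser copes
for (var i = 0; i < concurrentCalls; i++) {
    asyncPageLoad(); // each worker keeps chaining itself until callQueue is empty
}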
Actually, I prefer to send a new request when the current request is done. I used this method to dump DB tables (in this work); maybe it gives you an idea.
See this link, check all the check boxes and click the Dump! button. You can find the source code here (see the dumpAll function).
Please look at this code - how could I kill, update or restart an AJAX call (not the content that the AJAX call loads) after the content has already been called?
I mean, $('#posting_main') is loaded on click and animated - how can I stop the AJAX call and load another $('#posting_main') on another click?
$(document).ready(function() {
$("#img_x_ok").click(function(e){
e.preventDefault();
var post_text = $.trim($("#main_text_area").val());
var data_text = 'post_text='+ post_text;
if (post_text === "") return;
var xhr = $.ajax({
type: "POST",
url: "comm_main_post.php",
data: data_text,
cache: false,
success: function (data){
//content
$("#posting_main").fadeIn();
$("#posting_main").load("pull_comm.php");
$("#main_text_area").attr("value", "");
$("#posting_main").animate({
marginTop: "+=130px",
}, 1000 );
}
}); //ajax close
}); }); //both functions close
You can abort the current request with:
xhr.abort();
After having done that, you can run another $.ajax(...) to make a second request.
You could implement it like the following. Note that indenting code makes it a lot more readable!
$(document).ready(function() {
var xhr; // by placing it outside the click handler, you don't create
// a new xhr each time. Rather, you can access the previous xhr
// and overwrite it this way
$("#img_x_ok").click(function(e){
e.preventDefault();
var post_text = $.trim($("#main_text_area").val());
var data_text = 'post_text='+ post_text;
if (post_text === "") return;
if(xhr) xhr.abort(); // abort current xhr if there is one
xhr = $.ajax({
type: "POST",
url: "comm_main_post.php",
data: data_text,
cache: false,
success: function (data){
//content
$("#posting_main").fadeIn();
$("#posting_main").load("pull_comm.php");
$("#main_text_area").attr("value", "");
$("#posting_main").animate({
marginTop: "+=130px",
}, 1000 );
}
});
});
});
I am not sure I fully understand your question, however:
xhr.abort() will kill the AJAX request. After calling abort(), you could modify and resend the request, if desired.
$("#posting_main").stop() will stop the fadeIn animation. (And I think you might need to follow that with $("#posting_main").hide() to be sure it isn't left partially visible.)