Stopping a recurring ajax call - javascript

I'm making an app with lots of products, each with a button. My plan is that when the user clicks one of the buttons, a JavaScript (jQuery) function starts making an AJAX request every second, using the id of the button as a parameter to identify the product.
The function I'm planning on using is the 2nd answer here Execute an Ajax request every second
The idea is to check the status of the product (which can change constantly) every second while the user is interested in it.
When the user clicks the button again I want to stop checking that particular product's status, but I can't figure out how to do this. The user might have clicked 3 buttons, so there are 3 AJAX requests firing every second, each with a different product id. How can I stop the recurring request that carries the id of the product whose button the user clicked?

I did something similar recently, where I had an interval running every few seconds and, when some event occurred, I stopped the process altogether. I'm assuming you're using Javascript, so something like the below. You can check "someVar" in the timer call if you wish, up to you...
var someVar = false, intervalId;

$(function () {
    // Attach event handler to button
    $('#ButtonId').click(function (e) {
        if (!someVar) {
            e.preventDefault();
            someVar = true;
            // Start the interval
            intervalId = setInterval(CheckStatus, 3000);
        } else {
            // Stop the interval!
            clearInterval(intervalId);
            someVar = false;
        }
    });
});

function CheckStatus() {
    // I used MVC, but you can use whatever you need to generate the URL
    var checkUrl = '#Url.Action("CheckStatus", "SomeController", new { intervalId = "_intervalId_" })'
        .replace('_intervalId_', intervalId)
        .replace(/&amp;/g, "&");
    $.ajax({
        url: checkUrl,
        type: 'GET',
        contentType: 'application/json; charset=utf-8',
        success: function (data) {
            // do something with the status - data.Status perhaps?
        },
        error: function () {
            // handle error
        }
    });
}
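
Since the question mentions polling several products at once, one way to extend the same idea is to keep a map from product id to interval id and clear only the entry for the button that was clicked again. A rough sketch only; the .product-button selector, data-product-id attribute, and /product-status URL are assumptions for illustration:

var pollIntervals = {}; // productId -> interval id

$('.product-button').click(function () {
    var productId = $(this).data('product-id'); // hypothetical data attribute

    if (pollIntervals[productId]) {
        // Second click: stop polling just this product
        clearInterval(pollIntervals[productId]);
        delete pollIntervals[productId];
        return;
    }

    // First click: start polling this product every second
    pollIntervals[productId] = setInterval(function () {
        $.ajax({
            url: '/product-status', // assumed endpoint
            type: 'GET',
            data: { id: productId },
            success: function (data) {
                // update the UI for this product
            }
        });
    }, 1000);
});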

Related

How can JavaScript wait for a click function to complete, and only repeat the function after the user clicks again?

Below is my HTML. I'm making a simple text-based game where the player wants to sell their items.
<input name="gg1" id="gg1" value="10">
Sale
<script>
function fastbtn(sid){
    var aa = jQuery("#gg"+sid).val();
    if(aa > 0){
        aa--;
        jQuery("#gg"+sid).attr("value", aa);
        ajaxget('plugin.php?id=game&do=store&submit=true&timestamp=12345&gg1[1]=1&ggqty[1]=1&formhash={FORMHASH}&fastbuy=true','bbb'); // this is my ajax function -> ajaxget(requesturl, id to return the result to);
    }
}
</script>
For example, the value is 10, but when the user clicks 10 times quickly, the value drops to 0 while my server side only processes 8 of the requests.
Is there any way to make the number of clicks match what the server processes?
Ideally, could the second click of fastbtn(sid) wait until the ajaxget from the first click has completed before it runs?
This happens because each AJAX request takes some finite time. Clicking rapidly sends many requests that the server has not yet processed, while you decrement aa immediately on every click, which creates the disparity.
There are many ways to solve this. The easiest is to simply disable the button while an AJAX request is pending, which limits the number of active requests at any given time to 1.
But since you are using an anchor tag, disabling isn't possible, so you can use a flag instead (isPending):
At load, initialize the flag to say "there is no pending request, you can proceed" (false).
When we start the AJAX call, set the flag to say "sorry, busy right now, try again later" (true).
Once the AJAX call completes, reset the flag to false.
So every time the user clicks, we first check "are we clear?"; if not, we simply return from fastbtn() without doing anything.
Assuming you're using jQuery's $.get() function for AJAX, you can do something like this:
<script>
var isPending = false;

function fastbtn(sid, t){
    if(isPending)
        return;
    var aa = jQuery("#gg"+sid).val();
    if(aa > 0){
        aa--;
        jQuery("#gg"+sid).attr("value", aa);
        ajaxget('plugin.php?id=game&do=store&submit=true&timestamp=12345&gg1[1]=1&ggqty[1]=1&formhash={FORMHASH}&fastbuy=true','bbb'); // this is my ajax function -> ajaxget(requesturl, id to return the result to);
    }
    //...
}

function ajaxget(url){
    isPending = true;
    $.get(url, function(data, status){
        // <-- only fires on success
    })
    .done(function() { // <-- only fires on success
    })
    .fail(function() { // <-- only fires on error
    })
    .always(function() { // <-- this always fires
        isPending = false;
    });
}
</script>
Another way could be to put aa-- in the AJAX success callback, or in done(), so the aa value changes only when the server has recorded the change and sent its confirmation.
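
For example, a minimal sketch of that second approach, with the decrement moved into done(); the ajaxget() signature is adjusted for illustration and the long query string is abbreviated:

function fastbtn(sid){
    if(isPending)
        return;
    var aa = jQuery("#gg"+sid).val();
    if(aa > 0){
        // do NOT decrement here; wait for the server's confirmation
        ajaxget('plugin.php?id=game&do=store&...&fastbuy=true', sid);
    }
}

function ajaxget(url, sid){
    isPending = true;
    $.get(url)
        .done(function(data) {
            // server has recorded the sale, now update the counter
            var aa = jQuery("#gg"+sid).val();
            jQuery("#gg"+sid).attr("value", aa - 1);
        })
        .always(function() {
            isPending = false;
        });
}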

Wait for form.submit() / POST to complete

I'm stuck in a really bizarre situation here. It's complicated to explain but I'll try my best.
Detailed explanation of the issue:
On every top Nav click (green donuts/circles), or on the Next button, I must submit the form if it exists and is valid. If it's not valid, form.valid() triggers the validation errors and return false stops any further propagation. This setup was working flawlessly until I noticed a strange behavior that doesn't happen consistently. The form on my 3rd tab, specifically, is quite data heavy. When I hit the Next button it should go through the same process: check for an existing form, and if valid, submit. Submit calls the POST action method, and when the POST completes it GETs the view for the next tab. It works like this 5 times out of 10, but at other times the GET executes before the POST, which causes the next page to load with incomplete data. When I put in breakpoints to debug, I see the GET for the next tab executing before the POST of the current tab.
UI Explained:
I have a UI with 4 navigation <a> buttons on top - in the center there's always a form - and at the bottom I have Previous & Next buttons.
Forms are constructed in MVC using Ajax.BeginForm
For each Nav link <a> element on top, I have a JavaScript function
var LoadTabs = function (e, arg) {
    // This is to validate a form if one of the top links is clicked and the form has incomplete fields...
    if (arg !== "prev" && arg !== "next") {
        if (!window.ValidateForm(false)) return false;
    }
    var url = $(this).attr('data'); // this contains the link to a GET action method
    if (typeof url != "undefined") {
        $.ajax(url, { context: { param: arg } }).done(function (data) {
            $('#partialViewContainer').html(data);
        });
    }
}
This function above binds to each top link on page load.
$('.navLinks').on('click', LoadTabs);
My Next & Previous buttons basically trigger the click event, i.e. the LoadTabs function.
$('button').on('click', function () {
    if (this.id === "btnMoveToNextTab") {
        if (!window.ValidateForm(true)) return false;
        $.ajax({
            url: url,
            context: { param: 'next' },
            method: "GET",
            data: data,
            success: function(response) {
                if (typeof response == 'object') {
                    if (response.moveAhead) {
                        MoveNext();
                    }
                } else {
                    $('#mainView').html(response);
                }
                ScrollUp(0);
            }
        });
    }
    if (this.id === "btnMoveToPreviousTab") {
        MoveBack();
    }
    return false;
});
MoveNext() Implementation is as below:
function MoveNext() {
    var listItem = $('#progressbarInd > .active').next('li');
    listItem.find('.navLink').trigger('click', ['next']);
    ScrollUp(0);
}
The problem is that, for some reason, when Nav link 3 is active and I hit the Next button, instead of posting the form first via form.submit(), nav 4 gets triggered, so the GET for nav 4 runs before the form POST of nav 3.
My ValidateForm method basically just checks whether the form exists and is valid, then submits; otherwise it returns false. It's as below:
function ValidateForm(submit) {
    var form = $('form');
    // if the form doesn't exist on the page - return true and continue
    if (typeof form[0] === "undefined") return true;
    // now check for any validation errors
    if (submit) {
        if (!$(form).valid()) {
            return false;
        } else {
            $(form).submit();
        }
    } else {
        return true;
    }
    return true;
}
My speculation is that form.submit does get triggered as it should, but since the submit takes a little longer to finish, execution continues with the next code block in the button's onclick handler.
I first thought this was a server-side issue, since in the POST I'm saving a big chunk of data with a few loops, so any process-heavy block I wrapped like this:
var saveTask = Task.Factory.StartNew(() => ControllerHelper.SomeMethod(db, model));
Task.WaitAll(saveTask);
WaitAll pauses execution until SomeMethod finishes. I'm not sure how I can "lock" a process in JavaScript and wait for it to finish executing. I think that if I could somehow make ValidateForm wait until form.submit() has finished processing, via a callback perhaps...
Please, if anyone can point me in the right direction, I'd greatly appreciate the help. If you need more information, please let me know and I'd be happy to provide it!
AJAX is async, and your form submit, which uses Ajax.BeginForm(), is using AJAX. When you click your 'Next' button, which triggers the $('button').on('click', function () { code, this is what happens:
1. You call the ValidateForm() function (and assuming the form is valid),
2. your $(form).submit(); line of code starts making an AJAX POST.
3. The code progresses to the final return true; line while that AJAX call is still executing.
4. Because the ValidateForm() function returned true, the $.ajax GET call now starts, but at that point the AJAX POST inside ValidateForm() may not have finished executing, causing your GET method to return invalid data.
You need to change your code so that the GET call is made only once the POST call has completed. And since you're using the $.ajax() methods throughout your code, and $.ajax() gives you more flexibility, it seems unnecessary to use Ajax.BeginForm() (and the extra overhead of including the jquery.unobtrusive-ajax.js script). You should instead handle the form's .submit() event (if you do not want the 'Next' button to be a submit button in the form, you can just trigger the .submit() event in the button's .click() handler):
$(document).on('submit', 'form', function(e) {
    e.preventDefault(); // cancel the default submit
    var form = $(this);
    if (!form.valid()) {
        return; // will display the validation errors
    }
    .... // get the relevant urls to the GET and POST methods etc
    $.post(postUrl, form.serialize(), function(data) {
        .... // not clear if your [HttpPost] method returns anything
    }).done(function() {
        $.get(getUrl, someData, function(response) {
            .... // Update the DOM with the next form?
            .... // Re-parse the validator for client side validation
        });
    }).fail(function() {
        .... // code that you might run if the code in the [HttpPost] method fails
    });
});
You should also consider returning the appropriate 'next' view directly from the [HttpPost] method, so that you don't then need to make a second call back to the server to get it.
It is also worth reading the Deferred Object documentation and the use of $.when(), .then(), etc.
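
As a rough illustration of that deferred chaining, a sketch only; postUrl, getUrl and someData are placeholders carried over from the snippet above, and #mainView is the container from the question's code:

// Sequence the requests: the GET only starts after the POST has resolved
$.post(postUrl, form.serialize())
    .then(function () {
        // runs only after the POST has completed successfully
        return $.get(getUrl, someData);
    })
    .then(function (response) {
        // runs after the GET has completed
        $('#mainView').html(response);
    })
    .fail(function () {
        // runs if either request fails
    });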

Executing current ajax call and aborting all previous ones

I am trying to build an auto-complete UI. There is an input whose keyup handler makes an AJAX call to the server to fetch the most relevant data. But if the user types a word that is, say, 10 characters long, one AJAX call is made per keyup and my dialogue box refreshes 10 times.
I have tried using abort() on the AJAX call. When I abort the previous AJAX call, that call is not made, but it still works through all 10 calls before settling on the last one, which makes for a very bad user experience.
So is there a way to execute just the current ajax call without any delay from the previous ones?
A part of my code:
var request_autocomplete = jQuery.ajax({});
$('.review_autocomplete').keyup(function() {
    request_autocomplete.abort();
    request_autocomplete = jQuery.ajax({
        // DO something
    });
});
OP, there are two parts to this. The first is your abort, which it seems that you already have.
The second is to introduce forgiveness into the process. You want to fire when the user stops typing, and not on every key press.
You need to use both keyup and keydown. On keyup, set a timeout to fire your submit; give it perhaps 700ms. On keydown, clear the timeout.
var request_autocomplete = jQuery.ajax({});
var forgiveness;

// first your AJAX routine as a function
var myServiceCall = function() {
    request_autocomplete.abort();
    request_autocomplete = jQuery.ajax({
        // DO something
    });
};

// keyup: schedule the service call for 700ms from now
$('.review_autocomplete').keyup(function() {
    forgiveness = window.setTimeout(myServiceCall, 700);
});

// keydown: cancel the pending call while the user is still typing
$('.review_autocomplete').keydown(function() {
    window.clearTimeout(forgiveness);
});
What this does is set a timeout every time a key goes up, but each time a key goes down it cancels that timeout. This keeps your service call from firing until the user has stopped typing, or paused long enough. The end result is that you will wind up aborting a much smaller percentage of your calls.
You could implement what you asked in your question by skipping, for example, 3 keystrokes between calls, as below:
var calls = 0;
$('.review_autocomplete').keyup(function() {
    if (calls > 3) {
        request_autocomplete.abort();
        request_autocomplete = jQuery.ajax({
            // DO something
        });
        calls = 0;
    }
    calls++;
});
But this way is not recommended, because when the user wants to type "sample", the AJAX call fires at "p" once they have typed "samp", and when they type the remaining "l" and "e" nothing happens!
If you are using jQuery UI Autocomplete, you can use:
minLength, so you only call the AJAX request once the user has typed at least, say, 3 characters;
delay (between the last keystroke and the AJAX call; usually 200-300ms should do);
and an AjaxQueue.
After a quick search on this issue I found this link, which shows another way to prevent multiple AJAX calls for autocomplete by using a cache.
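
If the jQuery UI Autocomplete widget is an option, a minimal sketch of those two settings; the /search endpoint is an assumption, and the server is assumed to return a JSON array of suggestions:

$('.review_autocomplete').autocomplete({
    minLength: 3, // don't query until at least 3 characters are typed
    delay: 300,   // wait 300ms after the last keystroke before querying
    source: function (request, response) {
        // request.term holds the current input value
        $.getJSON('/search', { term: request.term }, response); // assumed endpoint
    }
});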
You could use a globalTimeout variable that you reset with setTimeout() and clearTimeout().
var globalTimeout;
$('.review_autocomplete').keydown(function(){
    if (globalTimeout) clearTimeout(globalTimeout);
}).keyup(function(){
    globalTimeout = setTimeout(function(){
        $.ajax({ /* you know the drill */ });
    }, 10);
});
This way the timeout is cleared whenever your client presses a key down, and set again as soon as the client releases a key on keyup, so $.ajax() will only be called once there's no key activity for, in this case, 10 milliseconds. I admit this won't stop an $.ajax() call that has already been made, but it probably won't matter, because the calls happen quickly and because this example prevents future $.ajax() calls as long as the client keeps typing.
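
If the request that has already been fired does matter, you could combine this timeout with the abort() approach from the question by keeping a reference to the last jqXHR. A rough sketch:

var globalTimeout, pendingRequest;

$('.review_autocomplete').keydown(function () {
    if (globalTimeout) clearTimeout(globalTimeout);
}).keyup(function () {
    globalTimeout = setTimeout(function () {
        // cancel any request that is still in flight before starting a new one
        if (pendingRequest) pendingRequest.abort();
        pendingRequest = $.ajax({ /* you know the drill */ });
    }, 300);
});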
Try
var count = {
    "start": 0,
    // future, by margin of `count.timeout`
    "complete": 0,
    // if no `keyup` events occur
    // within the span of `count.timeout`,
    // `request_autocomplete()` is called;
    // approximately `2` seconds below, adjustable
    "timeout": 2
};

$('.review_autocomplete')
    .focus()
    .on("keyup", function (e) {
        elem = $(this);
        window.clearInterval(window.s);
        window.s = null;
        var time = function () {
            var t = Math.round($.now() / 1000);
            count.start = t;
            count.complete = t + count.timeout;
        };
        time();
        var request_autocomplete = function () {
            return jQuery.ajax({
                url: "/echo/json/",
                type: "POST",
                dataType: "json",
                data: {
                    json: JSON.stringify({
                        "data": elem.val()
                    })
                }
                // DO something
            }).done(function (data) {
                window.clearInterval(s);
                console.log("request complete", data);
                $("body").append("<br /><em>" + data.data + "</em>");
                elem.val("");
                count.start = count.complete = 0;
                console.log(count.start, count.complete);
            });
        };
        window.s = setInterval(function () {
            if (Math.round($.now() / 1000) > count.complete) {
                request_autocomplete();
                console.log("requesting data");
            }
            // increased to `1000` from `501`
        }, 1000);
    });
jsfiddle http://jsfiddle.net/guest271314/73yrndwy/

error callback function is fired even if request is successful

I have a script that uses AJAX to send mail. I have tested it by checking the email account that receives the mail and, sure enough, the AJAX request is successful. I also checked the console window of my Firefox browser and it too shows a success message. But my problem is that instead of the done callback function, the error callback function is fired. You may wonder why I'm still using the error function instead of fail. The reason is that when I tried using the fail function, it didn't trigger the alert box I set inside it, so I went back to using the error function, since at least it triggers the alert box I made.
Here is the script:
<script type="text/javascript">
    var submitButton = $('#submit'); // Variable to cache button element
    var alertBox1 = $('.success');   // Variable to cache success alert element
    var alertBox2 = $('.alert');     // Variable to cache fail alert element
    var closeButton1 = $('.close1'); // Variable to cache close button element
    var closeButton2 = $('.close2'); // Variable to cache close button element

    $( function(){
        $( '#contactform' ).submit( function(e){
            e.preventDefault();
            console.log( 'hello' );
            var formData = $( this ).serialize();
            console.log( formData );
            $.ajax({
                type: 'POST',
                url: 'send.php',
                data: formData,
                dataType: 'json',
                done: function(){
                    $(submitButton).fadeOut(500); // Fades out submit button when it's clicked
                    setTimeout(function() { // Delays the next effect
                        $(alertBox1).fadeIn(500); // Fades in success alert
                    }, 500);
                },
                error: function(){
                    $(submitButton).fadeOut(500); // Fades out submit button when it's clicked
                    setTimeout(function() { // Delays the next effect
                        $(alertBox2).fadeIn(500); // Fades in fail alert
                    }, 500);
                }
            });
        });

        $(closeButton1).click(function() { // Initiates the reset function
            $(alertBox1).fadeOut(500); // Fades out success message
            setTimeout(function() { // Delays the next effect
                $('input, textarea').not('input[type=submit]').val(''); // Resets the input fields
                $(submitButton).fadeIn(500); // Fades back in the submit button
            }, 500);
            return false; // This stops the success alert from being removed as we just want to hide it
        });

        $(closeButton2).click(function() { // Initiates the reset function
            $(alertBox2).fadeOut(500); // Fades out fail message
            setTimeout(function() { // Delays the next effect
                $('input, textarea').not('input[type=submit]').val(''); // Resets the input fields
                $(submitButton).fadeIn(500); // Fades back in the submit button
            }, 500);
            return false; // This stops the fail alert from being removed as we just want to hide it
        });
    });
</script>
What could be causing this? Just to reiterate, I tried using fail instead of the error callback, since that was one of the answers I found on the Internet and because I know for a fact that the error function is deprecated. But for the reason mentioned above, I had no choice but to go back to using it.
If you refer to the documentation, you cannot use done inside the $.ajax() options as a callback. Either use success, or chain .done() onto the end of the ajax call.
$.ajax({
    // url, data etc
    success: function() {
        // success handler
    },
    error: function(){
        // error handler
    }
});
(OR)
$.ajax({
    // ajax related codes
}).done(function(){
    // callback
});
Also, if you aren't really returning JSON from the server, remove dataType: 'json' from the ajax call.
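
Applied to the script in the question, a minimal sketch of the chained form might look like this, keeping the same send.php URL and the alert-box variables from the question's code:

$.ajax({
    type: 'POST',
    url: 'send.php',
    data: formData,
    dataType: 'json'
}).done(function () {
    // runs on success
    $(submitButton).fadeOut(500);
    setTimeout(function () {
        $(alertBox1).fadeIn(500);
    }, 500);
}).fail(function () {
    // runs on error (including a 200 response whose body isn't valid JSON when dataType is 'json')
    $(submitButton).fadeOut(500);
    setTimeout(function () {
        $(alertBox2).fadeIn(500);
    }, 500);
});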

Stopping Javascript timers

I'm trying to dynamically create PDFs on a webserver using PHP/wkhtmltopdf, which involves sending the PDF-generation process to the background in order to prevent the page timing out.
To check whether the job has completed successfully, I've used JavaScript (which I suck at), and more specifically jQuery/AJAX, to continuously query the server to see whether wkhtmltopdf's process has ended. If it's still running, the PHP script returns nothing and simply exits. If the process has ended successfully, an HTML link to the PDF is generated and then dumped into a <div></div>.
All the server-side code works flawlessly, but I'm stuck on the JavaScript component. The code below kind of works, but instead of the timer stopping after a PDF has been generated, it continues to query the server. How do I get it to stop?
$('#pdfmodal').on('shown', function () {
    pdf(); // fire PDF generation process function
    (function worker() {
        $.ajax({
            url: 'pdfpidcheck.php',
            success: function(data) {
                if (data == '') {
                    // Schedule the next request if nothing returned (i.e. still running)
                    setTimeout(worker, 5000);
                } else {
                    // dump link to pdf
                    $('.pdfmodal').html(data);
                }
            }
        });
    })();
});
To stop a timer, you just remember the returned value from setTimeout() and call clearTimeout() on it.
var id = setTimeout(fn, 5000);
// then some time later
clearTimeout(id);
In the code you've shown us, this should not be an issue unless you are calling worker() from some place other than what you've shown, or unless the .on() handler gets called a second time while a PDF is being created. Your current code doesn't look like it knows how to handle two PDFs being created at the same time, or a second event triggering while the first one is still processing.
You could protect against multiple timers running like this:
$('#pdfmodal').on('shown', function () {
    var modal = $(this);
    pdf(); // fire PDF generation process function
    (function worker() {
        $.ajax({
            url: 'pdfpidcheck.php',
            success: function(data) {
                var timer = modal.data("timer");
                if (data == '') {
                    // make sure we never have more than one timer running
                    if (timer) clearTimeout(timer);
                    // Schedule the next request if nothing returned
                    // (i.e. server process still running)
                    timer = setTimeout(worker, 5000);
                    // save timer for later use
                    modal.data("timer", timer);
                } else {
                    // clean up timer data
                    if (timer) clearTimeout(timer);
                    modal.removeData("timer");
                    // dump link to pdf
                    $('.pdfmodal').html(data);
                }
            }
        });
    })();
});