I am using jQuery.post a lot in my project, and every time the browser sends a POST request I want to show a preloader while the request is in progress and then stop it when I get the reply from the server:
var params = {'action':'get_customer_info', 'customerID':4};

preloader.start();
$.post('ajax/url', params, function(response){
    preloader.stop();
    responseHandler(response);
});
Instead of adding the preloader.start() and preloader.stop() lines every time I call jQuery's post, I'd like to bind/trigger events before the jQuery.post as well as on the success/fail handlers.
I know how to bind and trigger events in general, but I'm not sure how I would do this with the $.post and handlers.
How do I do this?
You could set up the global ajax events; that way the preloader shows on every ajax request:
$(document).ajaxSend(function() {
    preloader.start();
}).ajaxComplete(function() {
    preloader.stop();
});
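If several requests can overlap, you might prefer ajaxStart/ajaxStop instead, which fire only when the first request begins and after the last pending one finishes, so the preloader isn't toggled once per call; a minimal sketch:

$(document).ajaxStart(function() {
    // the first request has started and nothing else was pending
    preloader.start();
}).ajaxStop(function() {
    // the last pending request has finished
    preloader.stop();
});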
Or you could create your own post function
function post(url, params) {
    preloader.start();
    return $.post(url, params, function(response){
        preloader.stop();
    });
}
To be used like
post('ajax/url', params).done(function(response) {
    responseHandler(response);
});
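Note that the custom post function above only stops the preloader in the success callback; a small variant (just a sketch) that also stops it when the request fails, using .always():

function post(url, params) {
    preloader.start();
    return $.post(url, params)
        .always(function() {
            // runs on success and on failure, so the preloader never gets stuck
            preloader.stop();
        });
}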
Related
I've built a JavaScript application which has a graph with some dropdown filters that users are able to change. The dropdowns all have event listeners which submit a server request to get the data (via a jQuery ajax call) and then graph the data. The issue arises when the user uses the keyboard to quickly go through many different elements of the dropdown.
The server call takes roughly a second so if they quickly scroll through say 20, this can lead to a buildup. 20 different requests to the server are created, and then there's not even a guarantee that the last piece of code to be executed on server request success will be the most current filter.
So my question is what is the best way when a filter is changed to kill all other asynchronous processes? Here's the relevant code:
$("#filter").change(function(d) {
getData();
} //end of if
});
function getData() {
    ...
    $.ajax({
        url: myUrl,
        type: "GET",
        dataType: "json",
        success: function(d) {
            [do some stuff]
        } //end of success function
    }); //end of ajax
} //end of getData
Save the ajax call you wish to abort into some variable, then you can abort the last call before firing a new one.
This is a very common practice when an ajax call might fire many times before the previous ones have had a chance to finish.
function getData() {
    // abort the last request before creating a new one
    $filter.data('REQ') && $filter.data('REQ').abort();
    return $.ajax({
        url: myUrl,
        type: "GET",
        dataType: "json",
        success: function(d) {
            [do some stuff]
        }
    });
}
var $filter = $("#filter");
$filter.on('change', function(d) {
    $filter.data('REQ', getData());
});
Of course this is a very simplified piece of code and you should rewrite it in a more structured way, but it gives you an idea of how to cache the last ajax request so that you can abort it before sending a new one.
By the way, your title has nothing to do with the question. You are asking how to handle a sequence of ajax calls, not about events, so you should change the title to fit the problem.
Update regarding what @t.niese said in the comments:
Throttling the requests on the client side is also a good thing to do, since the server cannot really know whether the client has aborted the request, and a resource-demanding request should be throttled.
But I would suggest throttling the requests on the server side rather than only on the client side, if possible, because client-side throttling can be bypassed and is not 100% reliable, and it "costs" about the same to do it on the server side.
Can a http server detect that a client has cancelled their request?
I would drop all attempts to initiate a new request by calling getData while a current request is running, and only send the last getData attempt as soon as the current request is finished. This ensures that the server load won't become unnecessarily high, because at most one request runs at a time.
var currentRequest;
var resubmitRequest;

function getData() {
    // only start a new request if no current request is running
    if (!currentRequest) {
        resubmitRequest = false;
        currentRequest = Promise.resolve($.ajax({
            url: myUrl,
            type: "GET",
            dataType: "json"
        })); //end of ajax
        currentRequest
            .then((d) => {
                [do some stuff]
            })
            .finally(() => {
                // the current request has finished, so allow a new one to start
                currentRequest = undefined;
                // if another attempt to request data happened while this
                // request was running then call getData again
                if (resubmitRequest) {
                    getData()
                }
            })
    } else {
        // store the information that the data has to be requested
        // another time after the currently running request finished
        resubmitRequest = true;
    }
    return currentRequest;
} //end of getData
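Wired up to the filter, this could look like the following (a sketch, reusing the #filter element from the question):

$("#filter").on("change", function() {
    // either starts a request right away or schedules a resubmit
    getData();
});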
You can simply delay it, for example by 500 milliseconds, and then run only the 'last' change and do that ajax call:
$("#filter").change(function(d) {
if( $.active > 0 ) return false;
// do not run new ajax if last one is not finished
clearTimeout( window.timer );
// if new 'change' evt is raised in less then 500 mils
// then clear it and run 'last' one
window.timer = setTimeout( function(){
getData();
}, 500 );
});
In case the user changes it while an ajax call is in progress, return false :)
With little effort you can code your own handler for these cases:
function handler(ms, fn) {
    var eventId;
    return function () {
        // if there is an event programmed, kill it
        clearTimeout(eventId);
        // and program the new one (replace the event)
        eventId = setTimeout(fn, ms);
        // if no new event arrives within the given time,
        // setTimeout is going to execute the function
    }
}
// execute getData if no event is fired in a 1000ms interval (the user stopped typing)
// getData is the same that you have
var fn = handler(1000, getData);
$("#filter").change(fn);
I have various functions using the $.ajax and $.post syntax which call server-side functions. But when the session has expired and the page is not refreshed, my ajax code won't work. In this case I want to redirect the page to the login controller.
As this is an ajax call, my redirection code is not working.
Is there any code or JavaScript/jQuery function which gets executed before any other jQuery ajax or post function?
I am using PHP(Yii framework) on server side.
Please let me know. Thank you.
You can use the beforeSend ajax event, where you can check your session and, if it's expired, do something else:
$.ajax({
    beforeSend: function(){
        // Handle the beforeSend event
    },
    complete: function(){
        // Handle the complete event
    }
    // ......
});
Check this for more info: http://api.jquery.com/Ajax_Events/
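To apply this to every $.ajax and $.post call without repeating it, one option is $.ajaxSetup. This is only a sketch; sessionExpiresAt is a hypothetical client-side timestamp that your Yii layout would have to expose:

$.ajaxSetup({
    beforeSend: function(xhr) {
        // sessionExpiresAt is an assumed client-side copy of the session lifetime
        if (Date.now() > sessionExpiresAt) {
            window.location = '/login'; // back to the login controller
            return false;               // returning false cancels the request
        }
    }
});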
jQuery provides a set of AJAX events you can listen during request lifecycle. In your case you can subscribe to ajaxError event triggered on document to react when requests failed as unauthorized:
$(document).on('ajaxError', function(el, xhr) {
if (xhr.status == 401) {
alert('Unauthorized');
}
});
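Since the goal is to send the user back to the login page, the handler could redirect instead of alerting. A sketch, assuming the server answers requests with an expired session with HTTP 401 and the login form lives at /login:

$(document).on('ajaxError', function(event, xhr) {
    if (xhr.status == 401) {
        window.location = '/login'; // assumed login route
    }
});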
This code can solve your problem.
$(document).bind("ajaxSend", function(){
//DO Someting..
});
Note that beforeSend is a local event and ajaxSend is a global event.
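To illustrate the difference (a sketch; the URL is a placeholder): the global ajaxSend handler fires for every request on the page, while beforeSend belongs to a single call:

// global: runs before every ajax request started anywhere on the page
$(document).on('ajaxSend', function(event, xhr, settings) {
    console.log('about to send', settings.url);
});

// local: runs only for this particular request
$.ajax({
    url: '/some/url',
    beforeSend: function(xhr) {
        // returning false here would cancel just this one request
    }
});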
/***
 * Global ajax handler which gets executed before each $.ajax and $.post call;
 * on the server side, check whether the session is set or destroyed
 ***/
$(document).on('ajaxStart', function()
{
    $.post( BASEURL+"/login/chkLogin",
        {},
        function(data)
        {
            if (data == 'login') {
                window.location = BASE_URL+'/login'; //Load the login page
            }
            //No else block needed: if all is okay,
            //the normal ajax action is processed
        });
});
This is what I wanted; it works perfectly for my use case. Thank you for all the answers and comments, they were a great reference.
Use the ajaxComplete event.
$(document).ajaxComplete(function() {
    window.location.href = "/login";
});
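As written, that redirects after every completed request; in practice you would check the response first. A sketch, reusing the convention from the answer above where the server replies with the text 'login' once the session has expired:

$(document).ajaxComplete(function(event, xhr) {
    // assumption: the server sends the plain text 'login' when the session is gone
    if (xhr.responseText === 'login') {
        window.location.href = "/login";
    }
});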
Previously I used the onchange event to call a JavaScript method (inside this method an AJAX call passes the selected option to the server). But this leads to multiple AJAX calls to the server when multiple options are selected. Is there a way to call the JavaScript method after making a multiple selection in the select box, so that I can prevent multiple AJAX calls to the server?
Here the in-process ajax request will be aborted and only the last request will be considered:
$(document).ready(function(){
    $("#searchInput").on("change", function(){
        ajaxSelect( $("#searchInput").val() );
    });
});

var request;
function ajaxSelect(selectedKey) {
    /* if a request is in process, kill it */
    if (request) {
        request.abort();
    }
    request = $.ajax({
        type: "get",
        url: "http://example.com/ajaxRequestHandler/",
        data: "action=selectmulti&selectedKey=" + selectedKey
    }).done(function() {
        /* process the response */
        /* response received, reset the variable */
        request = null;
    });
}
One way could be to clearly define the following event:
after making multiple selection in select box
You can think of using other events, like onblur or onmouseout on the select box, to send the ajax request if that suits your need.
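A sketch of the blur variant, reusing the #searchInput element and ajaxSelect() function from the answer above:

// fire a single ajax call only when the select box loses focus,
// i.e. after the user has finished picking options
$("#searchInput").on("blur", function() {
    ajaxSelect( $(this).val() );
});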
Oftentimes I find myself designing apps that make AJAX calls to the server, outside APIs, HTTP requests, etc. The problem is that while these async calls are happening, the user still has the ability to click on items that make the same AJAX call or interrupt the flow of the app, etc. I've experimented with various hacks to prevent this, but I'm wondering what the most accepted way of doing this is.
To make this more concrete, let's say I have a button element that makes an AJAX call and a form element that alters some of the data my app uses for the AJAX call. What is the best way to design the button and form functions so that they do not work while the button's AJAX call is in process?
The best way to accomplish what you want is to route the AJAX calls through a function, so you can check within that function whether a request is active. Here's an example assuming you're using jQuery:
var active_ajax_call = false;

function get_ajax(url, senddata) {
    if (active_ajax_call == false) {
        active_ajax_call = true;
        $.ajax({
            type: "POST",
            url: url,
            data: senddata
        }).done(function (data) {
            console.log(data);
        }).always(function () {
            // reset the flag on success and on failure,
            // so a failed request doesn't block all future calls
            active_ajax_call = false;
        });
    }
}

get_ajax("some_url", {name: "John", location: "Amsterdam"});
And of course present the website user with a nice ajax loader or something so they know data is being pulled.
In the handler for the button, disable the button (visually and functionally), then do the AJAX call. When the AJAX call returns, re-enable the button.
This is how the major sites do it (e.g. PayPal).
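A minimal sketch of that pattern, assuming a hypothetical #submit-btn button and /api/save endpoint:

$("#submit-btn").on("click", function() {
    var $btn = $(this);
    $btn.prop("disabled", true);          // block further clicks while the call runs
    $.post("/api/save", { /* form data */ })
        .always(function() {
            $btn.prop("disabled", false); // re-enable once the call has finished
        });
});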
I've been pulling my hair out with this function. It's a function within a function, which is why I think it's not returning anything; here's the code:
function getEventImageNormal(data) {
    $.getJSON("https://graph.facebook.com/fql?access_token=" + access_token + "&q=SELECT pic FROM event WHERE eid=" + data, function(data){
        console.log(data.data[0].pic);
        return data.data[0].pic;
    });
}
The correct item, the URL of the image, is being logged to the console, but it is not being returned?
If anyone is wondering why I'm not using https://graph.facebook.com/object_id/picture to get the events image, it's because this functionality is currently not working and the only method is to use FQL for event images.
By default getJSON() performs an asynchronous call.
You can call a function right within the success callback handler to treat the response.
$.getJSON("https://graph.facebook.com/fql?access_token=" + access_token + "&q=SELECT pic FROM event WHERE eid=" + data, function(data){
console.log(data.data[0].pic);
getResponse(data);
});
function getResponse(data) {
// handle your data here.
}
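Another common option (a sketch; eventId here stands for whatever event id you pass in) is to return the jqXHR from getEventImageNormal and attach the handler where the value is needed:

function getEventImageNormal(eid) {
    // return the request object so the caller can attach .done()
    return $.getJSON("https://graph.facebook.com/fql?access_token=" + access_token +
                     "&q=SELECT pic FROM event WHERE eid=" + eid);
}

getEventImageNormal(eventId).done(function(data) {
    console.log(data.data[0].pic); // the picture URL is available here
});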
You need to add a callback=? parameter to your URL to let the remote server and jQuery know it is a JSONP call. If plain JSON is sent instead, it is against cross-domain security policies and the browser won't accept it into the DOM.
This is in addition to processing the data within the success callback of the request, due to the asynchronous nature of ajax.
var data = /* your value */;
$.getJSON("https://graph.facebook.com/fql?callback=?&access_token=" + access_token + "&q=SELECT pic FROM event WHERE eid=" + data, function(data){
    /* process data here */
});