Ajax: does setting timeout always override the browser's timeout? - javascript

It seems to be possible to set a timeout value when doing an Ajax request in plain JavaScript. See How to detect timeout on an AJAX (XmlHttpRequest) call in the browser?
It is also possible when using jQuery's Ajax implementation, and I assume in other similar frameworks. See Set timeout for ajax (jQuery)
Browsers seem to have rather vague specifications regarding their default timeout. See Browser Timeouts
Hence one might think: "hey, I'm going to set a timeout on my ajax request so all the users will have the same timeout".
But then the next question follows: would it actually override the browser's timeout in all cases?
When I say "all" cases, I mean, for instance, the case where the browser's timeout value is smaller than your Ajax request's timeout value.
I suspect it does not.
And I also suspect it is best practice to always have a timeout error handler, so that whatever happens you can display a relevant message that will save hours of work for your support team and money for your company. See Determine if $.ajax error is a timeout
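For illustration, here is a rough sketch of the kind of handler I have in mind (the URL, timeout value, and messages are just placeholders):
$.ajax({
    url: "/api/slow-endpoint",   // placeholder URL
    timeout: 8000,               // arbitrary 8-second client-side timeout
    success: function(response) {
        // normal handling
    },
    error: function(jqXHR, textStatus) {
        if (textStatus === "timeout") {
            // a message your support team can recognise immediately
            alert("The request timed out. Please try again or contact support.");
        } else {
            alert("The request failed: " + textStatus);
        }
    }
});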
Thanks in advance

It is an interesting question. I ran some experiments in Chrome 59.0 and Firefox 54.0, using a service with a 10-minute delay as the backend.
Setting the timeout on the client to 10 minutes, I got an error response with text status "error" after 300 seconds (5 minutes) in both browsers, so at least for these two browsers it is not possible to override the internal timeout value. I am assuming the same behaviour for the remaining browsers on the market.
My test script (similar results for vanilla JavaScript):
var st = new Date();
$.ajax({
    url: "https://mysitewith10minresponse.com/foobar",
    type: "GET",
    dataType: "json",
    timeout: 600000,
    success: function(response) { console.log(response); },
    error: function(jqXHR, textStatus, errorThrown) {
        st = (new Date() - st) / 1000;
        alert("Text Status " + textStatus + ", diff: " + st + " seconds");
    }
});
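A plain XMLHttpRequest version of the same experiment might look roughly like this (a sketch only, using the same placeholder URL; which handler fires when the browser gives up can vary):
var start = Date.now();
var xhr = new XMLHttpRequest();
xhr.open("GET", "https://mysitewith10minresponse.com/foobar");
xhr.timeout = 600000; // ask for a 10-minute timeout
xhr.ontimeout = function() {
    console.log("timeout after " + (Date.now() - start) / 1000 + " seconds");
};
xhr.onerror = function() {
    console.log("error after " + (Date.now() - start) / 1000 + " seconds");
};
xhr.onload = function() {
    console.log("loaded after " + (Date.now() - start) / 1000 + " seconds");
};
xhr.send();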

Related

Javascript ajax request callback without waiting for response

I know we can make a JavaScript Ajax request to some server, and it either receives the response or gives a timeout error after some time.
Consider the scenario where we don't want to wait for the response; rather, the server would send a response (or we could say it would be another request, from server to client) asynchronously at any time after getting the request, and a JavaScript callback function would then be called with that response.
I am looking for ideas on how to go about this, ideally supporting all modern browsers and, if possible, not relying on any 3rd-party plugin except maybe jQuery.
The main feature of Ajax is that it IS asynchronous by default, and your program will continue to run without waiting for the response. So unless I'm misreading your question, that is what you need.
If you use jQuery, you pass in a callback function that will execute only when the server sends back a response. You can specify a timeout in the settings, though I'm not sure what the maximum time you can provide is without getting a timeout error. But it will be several seconds, at least.
You can even specify different callbacks for success and failure, as follows (adapted from the jQuery ajax API, but with a timeout of 5 seconds added):
var request = $.ajax({
    url: "http://www.some.url/",
    method: "GET",
    data: { some: "stuff" },
    dataType: "html",
    timeout: 5000
});
request.done(function(data) {
    console.log("SUCCESS: " + data);
});
request.fail(function() {
    console.log("Request failed");
});
I came across this question after 4 years. I don't remember in what context I asked this, but for anyone who has the same query:
HTTP is a request/response protocol, which means the client sends a request and the server responds to that request with some message/data. That's the end of the story for that request.
In order for the server to trigger something on the client side, we have to use something that keeps the connection to the server open, rather than ending the communication after getting the response. Socket.IO is a bidirectional, event-driven library that solves this problem.
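A minimal Socket.IO sketch of that idea (assuming the socket.io client library is loaded; the server URL and event names are made up for illustration):
// keep a persistent connection open so the server can push data
// whenever it is ready, then run a callback with that data
var socket = io("https://example.com");       // placeholder server URL

socket.emit("start-job", { id: 42 });         // send the "request" and move on

socket.on("job-finished", function(data) {    // server pushes the result later, at any time
    console.log("got async result:", data);
});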
To update a cart (PHP session storage and reserving the stock of items in the database) on my online shop, I simply set a timeout of 100ms on the call and remove the success/error callbacks.
$.ajax({
    url: 'http://www.some.url/',
    method: 'GET',
    data: {
        some: 'stuff'
    },
    dataType: 'html',
    timeout: 100
});
Note: It doesn't matter if some requests don't arrive, because when the order is saved, an update of the whole cart is sent with a callback.
If your query needs an acknowledgement, don't use this solution!
I believe your question is similar to this
by Paul Tomblin. I use the answer provided by gdoron, which is also marked as the best solution, and also the comment by AS7K.
$.ajax({
    url: "theURL",
    data: theData
});
NB: No async parameter provided.

Timeout in function with ajax jquery

Good afternoon, people!
I have the following code in jQuery/Ajax:
$.ajax({
    url: '../pujar',
    dataType: 'json',
    type: 'get',
    cache: true
});
This code works correctly when I call the PHP, but now I don't know how to use the timeout with Ajax.
In another piece of code I use the following structure and I don't have any problem with it.
setInterval(function() {
    $.ajax({
        url: '../ajaxpujas',
        dataType: 'json',
        type: 'get',
        cache: true,
        success: json
    });

    function json(data) {
        $("#tbodyid").empty();
        $(data).each(function(index, value) {
            var table = '<tr><td>' + value.users.name + '</td><td>' + value.id + '</td></tr>';
            $('#tbodyid').append(table);
        });
    }
}, 1000);
When I try to use this code, it doesn't work correctly. I need it to reload every second.
$.ajax({
    url: '../pujar',
    dataType: 'json',
    type: 'get',
    cache: true,
    timeout: 1000
});
Docs
Set a timeout (in milliseconds) for the request. This will override any global timeout set with $.ajaxSetup(). The timeout period starts at the point the $.ajax call is made; if several other requests are in progress and the browser has no connections available, it is possible for a request to time out before it can be sent. In jQuery 1.4.x and below, the XMLHttpRequest object will be in an invalid state if the request times out; accessing any object members may throw an exception. In Firefox 3.0+ only, script and JSONP requests cannot be cancelled by a timeout; the script will run even if it arrives after the timeout period.
The timeout in $.ajax() sets a timeout (in milliseconds) for the request to complete; if for any reason the request is not completed within that time frame, the request will abort.
You have to use
setInterval(function() {
    $.ajax({
        url: '../pujar',
        dataType: 'json',
        type: 'get',
        cache: true,
        success: function(data) {
        }
    });
}, 1000);
The "timeout" that you are using in the AJAX request is not the same as setTimeout in javascript. AJAX timeout actually specifies the time in which the request should get timed out.
As per jquery's documentation
timeout
Type: Number
Set a timeout (in milliseconds) for the request. This will override any global timeout set with $.ajaxSetup(). The timeout period starts at the point the $.ajax call is made; if several other requests are in progress and the browser has no connections available, it is possible for a request to time out before it can be sent.
Hence you are actually setting a timeout for your request (i.e. if the source doesn't respond within 1000ms, consider it a timeout failure); it does not make the request repeat, so you still have to reload it every second yourself.
What you are trying to do with setInterval will work, though I would recommend calling setTimeout recursively instead of using setInterval, for better performance (and, I would guess, the intended effect).
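For example, a rough sketch of that recursive setTimeout pattern, reusing the URL from the question (the table rendering is left as a comment):
function poll() {
    $.ajax({
        url: '../pujar',
        dataType: 'json',
        type: 'get',
        cache: true,
        success: function(data) {
            // rebuild #tbodyid here, as in your json() function
        },
        complete: function() {
            // schedule the next request only after this one has finished,
            // so slow responses never pile up
            setTimeout(poll, 1000);
        }
    });
}
poll();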

Chrome not handling jquery ajax query

I have the following query in jQuery. It reads the "publish" address of an Nginx subscribe/publish pair set up using Nginx's long-polling module.
function requestNextBroadcast() {
    // never stops - every reply triggers the next.
    // and silent errors restart via the long timeout.
    getxhr = $.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id=" + channel,
        timeout: 46000, // must be longer than max heartbeat to only trigger after a silent error.
        error: function(jqXHR, textStatus, errorThrown) {
            alert("Background failed " + textStatus); // should never happen
            getxhr.abort();
            requestNextBroadcast(); // try again
        },
        success: function(reply, textStatus, jqXHR) {
            handleRequest(reply); // this is the normal result.
            requestNextBroadcast();
        }
    });
}
The code is part of a chat room. Every message sent is replied to with a null reply (with 200/OK), but the data is published. This is the code that reads the subscribe address as the data comes back.
Using a timeout, everyone in the chat room sends a simple message every 30 to 40 seconds, even if they don't type anything, so there is plenty of data for this code to read - at least 2, and possibly more, messages per 40 seconds.
The code is 100% rock solid in IE and Firefox. But about one read in 5 fails in Chrome.
When Chrome fails, it is with the 46-second timeout.
The log shows one /activity network request outstanding at any one time.
I've been crawling over this code for 3 days now, trying various ideas. And every time IE and Firefox work fine and Chrome fails.
One suggestion I have seen is to make the call synchronous - but that is clearly impossible, because it would lock up the user interface for too long.
Edit - I have a partial solution: The code is now this
function requestNextBroadcast() {
    // never stops - every reply triggers the next.
    // and silent errors restart via the long timeout.
    getxhr = jQuery.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id=" + channel,
        timeout: <?php echo $delay; ?>,
        error: function(jqXHR, textStatus, errorThrown) {
            window.status = "GET error " + textStatus;
            setTimeout(requestNextBroadcast, 20); // try again
        },
        success: function(reply, textStatus, jqXHR) {
            handleRequest(reply); // this is the normal result.
            setTimeout(requestNextBroadcast, 20);
        }
    });
}
The result is that sometimes the reply is delayed until the $delay (15000) expires, and then the queued messages arrive too quickly to follow. I have been unable to make it drop messages with this new arrangement (only tested with network optimisation off).
I very much doubt that the delays are due to networking problems - all machines are VMs within my one real machine, and there are no other users on my local LAN.
Edit 2 (Friday 2:30 BST) - Changed the code to use promises - and the POST of actions started to show the same symptoms, but the receive side started to work fine! (????!!!???).
This is the POST routine - it handles a sequence of requests, ensuring only one is outstanding at a time.
function issuePostNow() {
    // reset heartbeat to dropout to send setTyping(false) in 30 to 40 seconds.
    clearTimeout(dropoutat);
    dropoutat = setTimeout(function() { sendTyping(false); },
        30000 + 10000 * Math.random());
    // and do send
    var url = "handlechat.php?";
    if (postQueue.length > 0) {
        postData = postQueue[0];
        var postxhr = jQuery.ajax({
            type: 'POST',
            url: url,
            data: postData,
            timeout: 5000
        });
        postxhr.done(function(txt) {
            postQueue.shift(); // remove this task
            if ((txt != null) && (txt.length > 0)) {
                alert("Error: unexpected post reply of: " + txt);
            }
            issuePostNow();
        });
        postxhr.fail(function() {
            alert(window.status = "POST error " + postxhr.statusText);
            issuePostNow();
        });
    }
}
About one action in 8, the call to handlechat.php times out and the alert appears. Once the alert has been OKed, all the queued-up messages arrive.
I also noticed that the handlechat call stalled before it wrote the message that others would see. I'm wondering if it could be some strange handling of session data by PHP. I know PHP carefully queues up calls so that session data is not corrupted, so I have been careful to use different browsers or different machines. There are only 2 PHP worker threads; however, PHP is NOT used in the handling of /activity or in the serving of static content.
I have also thought it might be a shortage of nginx workers or PHP processes, so I have raised those. It is now more difficult to get things to fail - but still possible. My guess is the /activity call now fails one in 30 times, and does not drop messages at all.
And thanks guys for your input.
Summary of findings.
1) It is a bug in Chrome that has been in the code for a while.
2) With luck the bug can be made to appear as a POST that is not sent, and, when it times out it leaves Chrome in such a state that a repeat POST will succeed.
3) The variable used to store the return from $.ajax() can be local or global. The new (promises) and the old format calls both trigger the bug.
4) I have not found a work around or way to avoid the bug.
Ian
I had a very similar issue with Chrome. I am making an Ajax call to get the time from a server every second. Obviously the Ajax call must be asynchronous, because it will freeze up the interface on a timeout if it's not. But once one of the Ajax calls failed, each subsequent one did as well. I first tried setting the timeout to 100ms and that worked well in IE and FF, but not in Chrome. My best solution was setting the type to POST, which solved the bug with Chrome for me:
setInterval(function(){
    $.ajax({
        url: 'getTime.php',
        type: 'POST',
        async: true,
        timeout: 100,
        success: function() { console.log("success"); },
        error: function() { console.log("error"); }
    });
}, 1000);
Update:
I believe the actual underlying problem here is Chrome's caching. It seems that when one request fails, that failure is cached, and subsequent requests are never made because Chrome gets the cached failure before initiating them. You can see this if you go to the Network tab in Chrome's developer tools and examine each request being made. Before a failure, Ajax requests to getTime.php are made every second, but after one failure, subsequent requests are never initiated. Therefore, the following solution worked for me:
setInterval(function(){
    $.ajax({
        url: 'getTime.php',
        cache: false,
        async: true,
        timeout: 100,
        success: function() { console.log("success"); },
        error: function() { console.log("error"); }
    });
}, 1000);
The change here is that I am disabling caching for this Ajax query, but in order to do so, the type option must be either GET or HEAD; that's why I removed type: 'POST' (GET is the default).
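As an aside, if POST had to be kept, one possible alternative (untested here) is to bust the cache manually by appending a throwaway timestamp parameter to the URL, which is roughly what cache: false does for GET requests:
setInterval(function(){
    $.ajax({
        // a unique URL on every call, so a cached failure can never be reused
        url: 'getTime.php?_=' + Date.now(),
        type: 'POST',
        async: true,
        timeout: 100,
        success: function() { console.log("success"); },
        error: function() { console.log("error"); }
    });
}, 1000);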
Try moving your polling function into a Web Worker to prevent freezing up in Chrome.
Otherwise you could try using the ajax .done() of the jQuery object. That one always works for me in Chrome.
I feel like getxhr should be prefixed with "var". Don't you want a completely separate and new request each time, rather than overwriting the old one in the middle of success/failure handling? That could explain why the behaviour "improves" when you add the setTimeout. I could also be missing something ;)
Comments won't format code, so reposting as a 2nd answer:
I think Michael Dibbets is on to something with $.ajax.done -- the Deferred pattern pushes processing to the next turn of the event loop, which I think is the behavior that's needed here. see: http://www.bitstorm.org/weblog/2012-1/Deferred_and_promise_in_jQuery.html or http://joseoncode.com/2011/09/26/a-walkthrough-jquery-deferred-and-promise/
I'd try something like:
function requestNextBroadcast() {
    // never stops - every reply triggers the next.
    // and silent errors restart via the long timeout.
    getxhr = jQuery.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id=" + channel,
        timeout: <?php echo $delay; ?>
    });
    getxhr.done(function(reply) {
        handleRequest(reply);
    });
    getxhr.fail(function(e) {
        window.status = "GET error " + e;
    });
    getxhr.always(function() {
        requestNextBroadcast();
    });
}
Note: I'm having a hard time finding documentation on the callback arguments for Promise.done & Promise.fail :(
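For what it's worth, the jqXHR done/fail callbacks appear to take the same arguments as the success and error options, so the handlers above could presumably also be written as:
getxhr.done(function(reply, textStatus, jqXHR) {
    // same arguments as the success: option
    handleRequest(reply);
});
getxhr.fail(function(jqXHR, textStatus, errorThrown) {
    // same arguments as the error: option
    window.status = "GET error " + textStatus;
});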
Perhaps it can be worked around by changing the push module settings (there are a few) - could you please post them?
Off the top of my head:
setting it to interval polling would solve it, though somewhat inelegantly
the concurrency settings might have some effect
message storage might be used to avoid missing data
I would also use something like Charles to see what exactly happens on the network/application layers.

How to set maximum execution time for ajax post with jQuery?

Is there a way to specify the maximum execution time of an Ajax post to the server, so that if the server doesn't respond, it keeps trying for 10 seconds and then continues with the rest of the code?
function doajaxPost() {
    var returned_value = "";
    // ############# I NEED THIS CODE TO TRY TO POST THE DATA TO THE SERVER AND KEEP
    // ############# TRYING FOR 10 SECONDS AND THEN CONTINUE WITH THE REST OF THE CODE.
    jQuery.ajax({
        url: 'ajaxhandler.php',
        success: function(result) {
            returned_value = result;
        },
        async: false
    });
    // ###################################################
    alert(returned_value);
    // some other code
    // .
    // .
    // .
}
Use timeout:
jQuery.ajax({
    url: 'ajaxhandler.php',
    success: function(result) {
        returned_value = result;
    },
    timeout: 10000,
    async: false
});
However, alert(returned_value); will execute just after your call (won't wait for the call to finish).
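If the goal is for the alert (and the code after it) to run only once the request has either finished or given up after 10 seconds, a sketch of moving that code into the callbacks instead of relying on async: false:
jQuery.ajax({
    url: 'ajaxhandler.php',
    timeout: 10000,                    // stop waiting after 10 seconds
    success: function(result) {
        alert(result);                 // runs only once the server has answered
        // ... rest of the code that needs the result
    },
    error: function(jqXHR, textStatus) {
        alert('Request failed: ' + textStatus);   // e.g. "timeout"
        // ... rest of the code for the failure case
    }
});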
The jQuery API documentation explains how to set a "timeout".
http://api.jquery.com/jQuery.ajax/
While other answers here are correct, learning to check the documentation for yourself is more valuable than knowing just this answer.
You can set a timeout value for your Ajax request.
timeout
Set a timeout (in milliseconds) for the request. This will override any global timeout set with $.ajaxSetup(). The timeout period starts at the point the $.ajax call is made; if several other requests are in progress and the browser has no connections available, it is possible for a request to time out before it can be sent. In jQuery 1.4.x and below, the XMLHttpRequest object will be in an invalid state if the request times out; accessing any object members may throw an exception. In Firefox 3.0+ only, script and JSONP requests cannot be cancelled by a timeout; the script will run even if it arrives after the timeout period.
Here is an example:
$.ajax({
    url: "ajaxhandler.php",
    ...
    timeout: 10000,
    ...
});
Hopefully, this will help others like me who aren't completely fluent in JavaScript or, more to the point, aren't completely fluent in reading jQuery documentation.
I admit, I looked at the jQuery.Ajax docs, and easily enough found the section that talks about setting a timeout:
timeout
Type: Number
Set a timeout (in milliseconds) for the request. A value of 0 means there will be no timeout. This will override any global timeout set with $.ajaxSetup(). The timeout period starts at the point the $.ajax call is made; if several other requests are in progress and the browser has no connections available, it is possible for a request to time out before it can be sent. In jQuery 1.4.x and below, the XMLHttpRequest object will be in an invalid state if the request times out; accessing any object members may throw an exception. In Firefox 3.0+ only, script and JSONP requests cannot be cancelled by a timeout; the script will run even if it arrives after the timeout period.
But lacking an example, I was still clueless how to specify the timeout value. The secret sauce is at the top of the article, where it explains that the settings are passed as a set of key/value pairs.
In JavaScript, this translates to this sort of syntax:
$.ajax({
    url: "https://example.com/my-endpoint",
    ...
    timeout: 0,
    ...
});
In the above example (which specifies a timeout of 0 to disable timeouts for this request), each key/value pair (as mentioned in the documentation) appears in the code as [key]: [value]

jQuery AJAX fires error callback on window unload - how do I filter out unload and only catch real errors?

If I navigate away from a page in the middle of an $.ajax() request it fires the error callback. I've tested in Safari and FF with both GET and POST requests.
One potential solution would be to abort all AJAX requests on page unload, but the error handler is called before unload, so this doesn't seem possible.
I want to be able to handle REAL errors such as 500s gracefully on the client side with a polite alert or a modal dialog, but I don't want this handling to be called when a user navigates away from the page.
How do I do this?
--
(Also strange: when navigating away from a page, the error handler says that the textStatus parameter is "error", the same as it reports when receiving a 500/bad request.)
In the error callback of $.ajax you have three input arguments:
function(XMLHttpRequest, textStatus, errorThrown) {
    this; // options for this ajax request
}
You can check the xhr.status directly to get the HTTP response code, for example:
$.ajax({
    url: "test.html",
    cache: false,
    success: function(html) {
        $("#results").append(html);
    },
    error: function(xhr, textStatus) {
        if (xhr.status == 500) {
            alert('Server error: ' + textStatus);
        }
    }
});
Edit:
To tell the difference between a connection broken by the browser and the case where the server is down (jasonmerino's comment):
On unload the xhr.readyState should be 0, whereas for a non-responsive server the xhr.readyState should be 4.
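Following that distinction, a sketch of the same handler with the readyState check added (illustrative only):
$.ajax({
    url: "test.html",
    success: function(html) {
        $("#results").append(html);
    },
    error: function(xhr, textStatus) {
        if (xhr.readyState === 0) {
            // nothing was received: most likely the page is unloading, so stay quiet
            return;
        }
        // readyState 4: the request completed but with an error status, e.g. a 500
        alert('Server error: ' + textStatus);
    }
});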
This is a tough one to handle correctly in all situations. Unfortunately, in many popular browsers the xhr.status is the same (0) whether the AJAX call was cancelled by navigation or the server is down/unresponsive, so that technique rarely works.
Here's a set of highly "practical" hacks that I've accumulated that work fairly well in the majority of circumstances, but still aren't bullet-proof. The idea is to try to catch the navigation events and set a flag which is checked in the AJAX error handler, like this:
var global_is_navigating = false;
$(window).on('beforeunload', function() {
    // Note: this event doesn't fire in mobile safari
    global_is_navigating = true;
});
$("a").on('click', function() {
    // Giant hack that can be helpful with mobile safari
    if ($(this).attr('href')) {
        global_is_navigating = true;
    }
});
$(document).ajaxError(function(evt, xhr, settings) {
    // default AJAX error handler for page
    if (global_is_navigating) {
        // AJAX call cancelled by navigation. Not a real error
        return;
    }
    // process actual AJAX error here.
});
(I'd add this as a comment to the main answer but haven't built up enough points to do so yet!)
I'm also seeing this in FF4 and Chrome (9.0.597.107). Probably elsewhere too, but that's bad enough for me to want to fix it!
One of the things that's odd about this situation is that the returned XMLHttpRequest.status === 0.
That seems like a reliable way to detect this situation and, in my particular case, abort the custom error handling that displays to the user:
error: function(XMLHttpRequest, textStatus, errorThrown) {
    if (XMLHttpRequest.status === 0) return;
    // error handling here
}
Also worth mentioning: on the assumption it may be a problem in the JSON parse of whatever the browser is giving back to the $.ajax() call, I also tried swapping out the native JSON.stringify for the Douglas Crockford version ( https://github.com/douglascrockford/JSON-js ), but that made no difference.
The error callback should get a reference to the XHR object; check the status code to see whether it's a server error or not.
If you make use of jQuery functions such as $.get, $.post, $.ajax..., then you can use the text_status parameter to check what type of failure it is:
request = $.get('test.html', function(data) {
    // whatever
}).fail(function(xhr, text_status, error_thrown) {
    if (text_status !== 'abort') {
        console.warn("Error!!");
    }
});
From the jQuery docs:
jqXHR.fail(function( jqXHR, textStatus, errorThrown ) {});
An alternative construct to the error callback option, the .fail() method replaces the deprecated .error() method. Refer to deferred.fail() for implementation details.
Just wait a while before showing the error. If a request fails due to page unload, the timeout won't fire and the error won't be shown. My experiments have shown that 100ms is enough for desktop Safari and Firefox (desktop and Android), and 500ms is enough for iOS Safari. The durations are fuzzy, so you should increase the numbers where possible.
const request = new XMLHttpRequest();
// Firefox calls onabort, Safari calls onerror
request.onabort = request.onerror = () => {
    // A short comprehensive solution
    setTimeout(() => {
        alert('The request has failed completely');
    }, 500);

    // A more gentle solution
    const isWebKit = 'ApplePayError' in window;
    const isDesktopSafari = !('standalone' in navigator);
    const isGecko = 'mozInnerScreenX' in window;
    setTimeout(
        () => {
            alert('The request has failed completely');
        },
        isWebKit ? (isDesktopSafari ? 100 : 500) : (isGecko ? 100 : 0)
    );
};
request.open('get', 'http://example.com', true);
request.send();
I've checked these solutions only in modern browsers (Chrome 87, Firefox 83, IE 11, Edge, Safari 14).
