jQuery Ajax POST parameters sometimes not sent on IE - javascript

The problem I am having is that when I use jquery ajax post, with very low frequency (< 2%), the post parameters never make it to the server. I do see the post request in the access log. It seems to happen only on IE (I've observed it on 7, 8, and 9 in the logs).
When I switch the call from type "post" to type "get" the issue goes away.
Has anyone else ever seen this odd behavior on IE? Thanks!
I have seen this for various ajax calls, but here is a typical one:
var data = {
    "guess": "m1",
    "eas": "hello world"
};
$.ajax({
    url: "http://myco.com/ajaxcall.action",
    data: data,
    type: 'post',
    dataType: 'json',
    success: function (data) {},
    error: function () {}
});
Update: passing "cache: false" does not fix the issue.

I have spent the last week tracking down a similar problem in my own application (which uses Dojo, not jQuery). From your description and the frequency of occurrence, I would say it's the same issue.
When HTTP persistent connections are used between browser and server (the default behavior), an HTTP connection can be closed down by the server at any time. This creates a very small timing hole when the browser starts to send a new request at the same time the server closes the connection. Most browsers will use a different connection or open a new connection and resend the request. This is the behavior suggested in RFC 2616 section 8.1.4:
A client, server, or proxy MAY close the transport connection at any
time. For example, a client might have started to send a new request
at the same time that the server has decided to close the "idle"
connection. From the server's point of view, the connection is being
closed while it was idle, but from the client's point of view, a
request is in progress.
This means that clients, servers, and proxies MUST be able to recover
from asynchronous close events. Client software SHOULD reopen the
transport connection and retransmit the aborted sequence of requests
without user interaction so long as the request sequence is
idempotent (see section 9.1.2).
Internet Explorer does try to resend the request when this happens, but when the request happens to be a POST, it mangles it: it sends the headers (with Content-Length) but no actual data. That is a malformed request and should always lead to an HTTP error (usually after some timeout waiting for the data that never comes).
This bug is documented by Microsoft as KB 895954 (see http://support.microsoft.com/kb/895954). Microsoft first recognized this bug in IE 6. They provided a hotfix, and appear to have shipped the hotfix with every version of IE since then including IE 9. There are two problems with the fix:
The hotfix is not activated by default. You have to create a really weird key using regedit to activate the fix: HKEY_LOCAL_MACHINE\Software\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_SKIP_POST_RETRY_ON_INTERNETWRITEFILE_KB895954.
The fix doesn't really fix the problem. The "fixed" behavior is that when the connection is closed when trying to send a request, it does not even try to resend it. It simply passes the error along to the javascript application.
It appears that you have to add error handlers in your code and re-post the request yourself if it fails. I am looking into this solution for my application. My concern is that I'm not sure how to tell if the error I get is caused by a failed attempt to send the query, or some error sent back from the server as a result of the query (in which case I don't want to resend it).
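One hedged way to make that distinction (a sketch, not something from the KB article): jQuery hands the jqXHR object to the error callback, and a jqXHR whose status is 0 never received any response from the server, while a non-zero status is a genuine server reply that should not be blindly re-posted.

```javascript
// Sketch: classify the argument jQuery passes to an error callback.
// status 0 means no response ever arrived (the connection was dropped
// before or during the send, or the request was aborted), so resending
// is reasonable; a non-zero status is a real reply from the server.
function isTransportFailure(jqXHR) {
    return !!jqXHR && jqXHR.status === 0;
}

// Intended use inside the error handler:
// error: function (jqXHR) {
//     if (isTransportFailure(jqXHR)) { /* safe to re-post */ }
// }
```

Note that status 0 also covers explicit aborts and cross-origin failures, so in practice this check only tells you that the server never answered, not why.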
I wrote a C program to simulate a web server and explicitly close a connection to see how the browser handles it. I have found that IE reproduces the errant behavior 100% of the time, while Firefox, Safari and Chrome recover by properly resending the POST on another connection 100% of the time. Perhaps the answer is, "don't use IE."

As a direct answer to your question: yes, we have just come across this issue and could not find a reasonable explanation. It only affects IE, and with a very low frequency; it took a long while to conclude that it is a sporadic jQuery Ajax bug in IE. We had to 'fix' the issue by returning a failure from the server under this condition and re-posting the data after a 1-second delay!
Hacky as hell but seemed to be the only way.
There was definitely no clash with DOM elements etc. and no logical reason for this to happen, the page can be updated many times by the user successfully with intermittent fails.
Must be a bug.
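That workaround can also live entirely on the client. Below is a minimal sketch (all names here are hypothetical; `ajax` stands in for jQuery's `$.ajax`, and the scheduler is injectable only so the logic can be exercised without real timers): re-issue the POST after a delay when the error looks like a dropped connection (status 0, i.e. no server response), up to a bounded number of retries.

```javascript
// Sketch: wrap an ajax-like function so transport-level failures are
// retried after delayMs, at most maxRetries times. Errors with a
// non-zero status are real server replies and are passed straight
// through to the caller's own error handler.
function makeRetryingPost(ajax, delayMs, maxRetries, schedule) {
    schedule = schedule || function (fn, ms) { setTimeout(fn, ms); };
    return function post(settings, attempt) {
        attempt = attempt || 0;
        var merged = {};
        for (var k in settings) { merged[k] = settings[k]; }
        merged.error = function (jqXHR) {
            if (jqXHR && jqXHR.status === 0 && attempt < maxRetries) {
                // No response from the server: resend after a delay.
                schedule(function () { post(settings, attempt + 1); }, delayMs);
            } else if (settings.error) {
                settings.error(jqXHR); // genuine server error: don't retry
            }
        };
        ajax(merged);
    };
}
```

For the call in the question, usage might look like: `var post = makeRetryingPost($.ajax, 1000, 1); post({ url: "http://myco.com/ajaxcall.action", type: "post", data: data });`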

I think you have to prevent caching in Internet Explorer. Try setting the cache option to false.
Example:
$.ajax({
    url: "http://myco.com/ajaxcall.action",
    data: data,
    type: 'post',
    dataType: 'json',
    success: function (data) {},
    error: function () {},
    cache: false
});

In IE, the parameters sent to the PHP script may arrive as GET data:
$.ajax({
    url: "path/to/ajax.php",
    type: "POST",
    data: {
        var1: "value1",
        var2: true,
        var3: 123123
    },
    cache: false,
    success: function (data) {
        alert(data);
    }
});
Then, on the PHP side, read from $_REQUEST instead of $_POST, so the values are picked up either way:
$var1 = $_REQUEST["var1"]; // value1
$var2 = $_REQUEST["var2"]; // true
$var3 = $_REQUEST["var3"]; // 123123
This approach can be used for compatibility with IE7.

Related

XMLHttpRequest takes very long only in Internet Explorer

We have recently noticed an issue where a common page on our site temporarily freezes when navigated to from Internet Explorer with the message "This web page is not responding due to a long running script".
After investigating, I can see that it is caused by an AJAX XMLHttpRequest that takes 30-45 seconds to complete. Normally when there are performance issues with our AJAX calls, the long wait is for the server's response; here, it is creating and sending the request that takes so long. There are no issues with this request in Google Chrome at all, where it takes only about 200 ms. These results are consistent on every page refresh. Note also that this is neither a large request nor a large response: the request body is actually empty, and the response is quite small.
I figured since the problem appears to be client-side, there must be something off with our scripts, but we use the same generic function for all of our AJAX calls and don't have this problem with anything else:
JSONRequest: function (url, type, data, success, error) {
    var customError = function (er) {
        console.log(er);
    };
    // Fall back to the default logger unless a real handler was passed in
    if (typeof error !== 'undefined' && error !== null) {
        customError = error;
    }
    $.ajax({
        url: url,
        type: type,
        cache: false,
        data: JSON.stringify(data),
        contentType: 'application/json; charset=utf-8',
        success: success,
        error: customError
    });
},
I am quite puzzled here. Is this simply a matter of "Avoid IE", or is there something I am missing? 30+ seconds to create a small request and send it seems absurdly long, especially when it is fast in Chrome. What gives?
*Note that I am testing with IE 11.
For some reason, a line of jQuery that unchecks all of the checkboxes in a list was causing the issue:
$('.list-check-buttons.check-button.only-button input:checkbox').prop('checked', false)
I verified this by stepping through in the Chrome debugger, and removing this line resolved the issue for the most part. (It turns out we didn't need the line: the same logic was also being handled in another place which, strangely enough, was not causing performance issues.)

Server stops responding to requests after too many in a session?

I have a web app that uses frequent $.ajax() calls to transmit data to and from the server. This runs locally between a virtual machine host and client.
The problem I'm having is that it seems to cut out after making a certain number of consecutive calls in a session (no exact number has been determined); this can be seconds or minutes.
I tried assigning my $.ajax() calls to objects so they could be deleted, e.g.:
myApp.ajaxRegistry.myAjax = $.ajax({
    url: '/path/to/server',
    error: function () {
        delete myApp.ajaxRegistry.myAjax;
    },
    success: function () {
        delete myApp.ajaxRegistry.myAjax;
    }
});
I thought that may have improved it, but it could just be coincidence. It still fails frequently.
I've monitored the server access log when these failures occur, and I can see that the browser is not even making the request. There are no JavaScript errors in the browser console.
EDIT
The browser's network logger indicates that it is making the request, but the server is not responding (according to Apache's access log). After a few minutes, it starts responding again, so I suspect a configuration issue on the server.
It might also be worth noting that the virtual machine server frequently loses time (some sort of annoying VirtualBox "feature"), so I wonder if that might be related.
UPDATE
I think my hunch about the server time may have been right. I finally managed to get ntp to work properly on the VM and I haven't encountered this problem for a few weeks now.
Just to have the answer in a separate post: the server time needs to be accurate (at least in this context) or the AJAX requests get confused.

Using Ajax CORS IE 11 in IE 5 Document Mode

I am in quite an interesting situation. I work for a middle sized company and we have IE 11 for our browser but for a particular application I am programming, its document mode is set to IE 5. I am making calls to a web service located within the company Intranet (the particular URL is https) and the website where I'm making the call is http (not sure if I can set document.domain or anything here). I am currently using ajax and I'm able to successfully communicate with service and get code back using the code below. Although, each time I do get the message that says, "the information you're accessing is coming from another source." If I hit yes, then I get information back from the server, if I hit no, then the data does not come through. I tried to call this same method from a different page on the same website, and instead get the access denied error, without even having the prompt. That one really has me confused.
I've read through many different articles on the subject of CORS at this point, and I know there are a few things to try to avoid this message. Credentials are omitted below. I am using the jQuery 1.9.1 minimized version.
$.support.cors = true;
$.ajax({
    type: "POST",
    crossDomain: true,
    beforeSend: function (request) {
        request.setRequestHeader("Authorization", "");
        return true;
    },
    url: webServiceURL,
    dataType: "xml",
    async: true,
    data: soapMessage,
    contentType: "text/xml; charset=\"utf-8\"",
    success: OnSuccess,
    error: OnError
});

function OnSuccess(data, status) {
    alert(status);
    alert(data);
    var documentValArr = parseXMLRrec(data);
    insertDataIntoTable(documentValArr);
}

function OnError(request, status, error) {
    alert(request.responseText);
    alert(error);
    alert(status);
}
The options I am considering:
1. Enable the IE setting that allows access to data sources across domains. I suppose this could be set for that specific URL in the Intranet zone, but it still feels wrong to me, and I'm not sure how the security group at work would feel about it either.
2. Add the URL I'm trying to access to the trusted Intranet sites. I think this would fix the issue I'm seeing, and it is something I definitely want to try. (Notably, when testing on my local workstation and calling the web service through IE 11 in doc 5 mode, I do not get the message.)
3. Work with the group that owns the server where the service lives, and see if they can somehow whitelist the URLs that may access the service. This would be involved, because there are quite a lot of production sites.
4. Because I'm using IE 11 in doc 5 mode, I could try the XDomainRequest object, but I am not sure it would be a worthwhile effort if the earlier options work out.
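For reference, a hedged sketch of what the XDomainRequest route might look like (the wrapper name is mine, not a real API). XDR in IE8/9 only supports GET and POST, forbids custom headers (so no Authorization header), sends the body as text/plain, and requires the target URL to use the same scheme as the hosting page, which may rule it out for an https service called from an http page:

```javascript
// Hypothetical wrapper around IE's XDomainRequest. All event handlers
// are assigned before open/send because IE has been reported to abort
// XDR requests that leave any of them unset.
function sendViaXDR(url, body, onLoad, onError) {
    var xdr = new XDomainRequest();
    xdr.onload = function () { onLoad(xdr.responseText); };
    xdr.onerror = onError;
    xdr.ontimeout = onError;
    xdr.onprogress = function () {};
    xdr.open('POST', url);
    xdr.send(body);
}
```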
So there is my interesting scenario. I would very much like some feedback from people on the situation, what they would do, and what kinds of solutions they would go for. Thank you very much for all input and ideas!

Ajax request fails on window unload

I have a scenario where I need to make a cross-domain request upon leaving a page. The endpoint has all the proper CORS headers configured.
This is the request:
$.ajax({
    type: "POST",
    url: URL,
    data: {
        // some stuff
    },
    async: false
});
When tested as-is, it works great. When I try to trigger this request from an unload handler:
$(window).unload(function() {
// same as above
});
it intermittently fails. On the occasion that it fails, I get this stack trace:
Error: A network error occurred.
at Object.x.ajaxTransport.x.support.cors.e.crossDomain.send (http://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.3/jquery.min.js:6:9344)
at Function.x.extend.ajax (http://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.3/jquery.min.js:6:4804)
at http://my.web.site.com/main.js:129:11
at x.event.dispatch (http://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.3/jquery.min.js:5:10006)
at x.event.add.y.handle (http://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.3/jquery.min.js:5:6789)
With a jQuery DOMException and error code 19.
What's going on here? It looks like there's some race condition here that I'm unaware of, but other than setting the request to be synchronous, I'm not sure what else I can do.
You simply can't rely on unload handlers, whether jQuery's implementation or the native onbeforeunload. (Maybe onbeforeunload will work better than jQuery's implementation.)
There are so many ways a user can leave the page (crash, ALT+F4, close tab, ...).
Most browsers just don't wait long enough, so your async requests will fail in most cases.
You should probably reconsider your design.
I ended up using the beforeunload event instead:
$(window).on('beforeunload', function () {
    // same code
});
which works just fine on all modern browsers. The beforeunload event behavior is consistent, and as long as the request is sent out synchronously, the event execution - and thus the window unloading - is paused.
This is the exact event that services such as Gmail, Facebook, and even Stack Overflow use to confirm that a user genuinely wants to leave a page while in the middle of an action.
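Put together, the accepted approach looks roughly like this (a sketch with hypothetical names; the jQuery object and the URL/payload are passed in only to make the wiring explicit and testable):

```javascript
// Sketch: install a beforeunload handler that sends a synchronous POST.
// The synchronous flag keeps the page alive until send() returns; the
// handler returns nothing, so no confirmation dialog is shown.
function installUnloadPost($, postUrl, payload) {
    $(window).on('beforeunload', function () {
        $.ajax({ type: 'POST', url: postUrl, data: payload, async: false });
    });
}
```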

Can beforeunload/unload be used to send XmlHttpRequests reliably

Recently, I had an urgent requirement to notify my server that a specific page of my webapp is about to be closed. "Easy peasy," I thought; beforeunload has been available for quite a while. HTML5 even refreshed the spec (or so I thought), giving us the option to return a string value from a beforeunload event handler, which gives the user a chance to intercept.
See the MDN page about onbeforeunload
However, as it turned out, there isn't any "official" specification available to date that describes the behavior of beforeunload. The only official document I found was on WHATWG, which is of course just a proposal for the W3C.
See WHATWG
So far so good. We are able to create a synchronous XHR request within a beforeunload event handler. "Most" browsers give that request a timeframe of about 1-2 seconds to complete; after that, it is killed. Standard asynchronous requests are killed immediately. That said, I cannot even tell where I know this from; looking at it now, it seems like gossip and word of mouth. Even though it works in Firefox and Chrome, we cannot rely on that, can we?
Is there any ongoing discussion/proposal on WHATWG about beforeunload ?
Any other official resources about the event I might have not found ?
And most important to me here: how reliably can we send data via sync XHR there?
Take a look at navigator.sendBeacon(), which allows you to reliably send data to a server even when the page is unloading. It's currently a draft specification, supported in Firefox 31 and Chrome 39 (behind a flag since 37), and in Opera 24 behind a flag.
You could "sort of" polyfill it using something like the following:
navigator.sendBeacon = navigator.sendBeacon || function (url, data) {
    var xhr = new XMLHttpRequest();
    // Send synchronously to have the best chance of the data getting
    // through to the server
    xhr.open('POST', url, false);
    xhr.send(data);
};
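A usage sketch (the endpoint and payload are hypothetical): with the polyfill in place, the same call works whether or not the browser supports sendBeacon natively.

```javascript
// Fire-and-forget: queue the data and let the page close. A native
// sendBeacon finishes the transfer in the background; the synchronous
// polyfill instead blocks briefly until the data is sent.
function reportPageClose() {
    navigator.sendBeacon('/analytics/leave', JSON.stringify({ closedAt: Date.now() }));
}

// Typically wired up as: window.addEventListener('unload', reportPageClose);
```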
Further reading:
HTML5 Rocks article
The thing to keep in mind is that beforeunload started as an extension by Internet Explorer. Automatically, that makes it a second-class citizen on the web. There is no specification, and browser implementation varies. For example, Firefox only partially implements it by not displaying the string, only a generic message.
Additionally, even when fully implemented, it does not protect against all possible unload scenarios, e.g., the user has terminated the process, the browser has crashed, or the computer has been turned off. Even ignoring these extreme scenarios, I suspect that it might be possible to configure your browser to ignore such requests.
My feeling is that you shouldn't rely on this message to save you. If this web app is internal, I would suggest training them to use the Save or Close or whatever buttons instead of just closing the tab. If it's external, maybe look into automatic saving as the user does their thing?
Sync XHR is a top source of browser hangs, accounting for nearly 10% of hangs: http://blogs.msdn.com/b/ieinternals/archive/2011/08/03/do-not-use-xmlhttprequest-in-synchronous-mode-unless-you-like-to-hang.aspx
In IE, even sync XHR can be "interrupted" if the request requires Windows authentication roundtrips, or if there's a POST body to be sent. You may find that only the headers of the first unauthenticated request are sent.
