Asynchronous ajax request locking browser - javascript

This is a simple snippet of code that launches an asynchronous Ajax request.
The processing time of the request is deliberately long (10 seconds or more).
Why does the browser prevent my users from clicking on an href link while the async request is being processed?
(tried with Firefox and Chrome)
The async request is made normally and the 'Ready' message is displayed immediately in the console.
Snippet:
new Ajax.Request('index.php', {
    method: 'post',
    asynchronous: true,
    parameters: { 'sleep': 10 },
    onSuccess: function(transport) { console.log('Success'); },
    onFailure: function() { console.log('Error'); }
});
console.log('Ready');

PHP is the cause of the problem here. When you call session_start(), PHP locks the session file so that there is no concurrent writing to it, and gives the running script full access to the session variables (reading and writing). Any other request that needs the same session, including the page your user clicks through to, has to wait until that lock is released.
So you need to call session_write_close() as soon as possible.

Related

Asp.Net Service method Is Not Executing or Calling Inner Method or Statement on BeforeUnload event

I have a beforeunload event in JS which hits the .asmx service method shown below.
.js event
$(window).on("beforeunload", function () {
var d, str;
str = '{Id:"' + $('#hdnId').val() + '"}';
d = str;
$.ajax({
type: "POST", //GET or POST or PUT or DELETE verb
url: "../POC.asmx/fUpdateTimeInterval",
data: d,
contentType: "application/json; charset=utf-8",
dataType: "json", //Expected data format from server
async: true,
beforeSend: function () {
// BlockUI();
},
success: function (data, Type, xhr) {//On Successfull service call
},
error: function (XMLHttpRequest, textStatus, errorThrown) {
alert(errorThrown);
},
complete: function () {
},
failure: function () {
}
});
});
.asmx (Web Service)
[WebMethod(true)]
public int fUpdateTimeInterval(String Id)
{
    // Id arrives as a string in the JSON payload; the DLL method expects an Int32.
    return new MY_POC.DLL.Survey().fUpdateTimeInterval(Convert.ToInt32(Id));
}
The above service will then call the below mentioned method defined in DLL class file.
public int fUpdateTimeInterval(Int32 Id)
{
    List<SqlParameter> objParam = new List<SqlParameter>()
    {
        new SqlParameter{ ParameterName = "@Id", Direction = ParameterDirection.Input, DbType = DbType.Int32, Value = Id },
    };
    MY_POC.DLL.SqlHelper.ExecuteNonQuery("MY_UpdateTimeInterval", System.Data.CommandType.StoredProcedure, objParam.ToArray());
    return 0;
}
Now the problem is: when the page loads in the browser for the first time, I get the current auto ID of the inserted row. If I refresh the browser, the beforeunload event fires and updates the row for the received ID only. But if I close the tab or the browser, the debugger hits the service method and stops right after the opening brace; it does not execute further and does not show any error.
After execution stops, I get a message in the Visual Studio Output window, but no error is shown.
It sounds like execution of the request is being aborted because the browser is closing the connection.
You should consider using the Beacon API. It's well supported by almost all browsers and it's made for this purpose. From Mozilla's documentation:
The main use case for the Beacon API is to send analytics such as
client-side events or session data to the server. Historically,
websites have used XMLHttpRequest for this, but browsers do not
guarantee to send these asynchronous requests in some circumstances
(for example, if the page is about to be unloaded). To combat this,
websites have resorted to various techniques, such as making the
request synchronous, that have a bad effect on responsiveness. Because
beacon requests are both asynchronous and guaranteed to be sent, they
combine good performance characteristics and reliability.
You can also make your Ajax request synchronous to prevent the connection from closing but that will have an impact on your GUI as it will block until the request completes.
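For illustration, here is a minimal sketch of the beforeunload handler rewritten to use a beacon. It assumes the same ../POC.asmx/fUpdateTimeInterval endpoint as above; whether the ASMX JSON deserializer accepts the beacon's body exactly as sent is an assumption, not something verified here.
$(window).on("beforeunload", function () {
    // Build the same JSON payload the $.ajax call used.
    var payload = JSON.stringify({ Id: $('#hdnId').val() });
    // Wrap it in a Blob so the beacon is sent with a JSON content type (same-origin).
    var blob = new Blob([payload], { type: "application/json; charset=utf-8" });
    // Queues the POST and lets the page unload without waiting for a response.
    navigator.sendBeacon("../POC.asmx/fUpdateTimeInterval", blob);
});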

Javascript ajax request callback without waiting for response

I know we can make a JavaScript Ajax request to some server, and it either receives the response or gives a timeout error after some time.
Consider instead a scenario where we don't want to wait for the response; rather, the server would send a response (or, put differently, another request from server to client) asynchronously at any time after receiving our request, and then a JavaScript callback function would be invoked with that response.
I am looking for ideas on how to go about this, mainly supporting all modern browsers and, if possible, not relying on any 3rd-party plugin except maybe jQuery.
The main feature of Ajax is that it IS asynchronous by default, and your program will continue to run without waiting for the response. So unless I'm misreading your question, it is what you need.
If you use jQuery, you pass in a callback function that will execute only when the server sends back a response. You can specify a timeout in the settings; I'm not sure what the maximum value is before you get a timeout error, but it will be several seconds at least.
You can even specify different callbacks for success and failure, as follows (adapted from the jQuery ajax API, with a timeout of 5 seconds added):
var request = $.ajax({
    url: "http://www.some.url/",
    method: "GET",
    data: { some: stuff },
    dataType: "html",
    timeout: 5000
});
request.done(function( data ) {
    console.log( "SUCCESS: " + data );
});
request.fail(function() {
    console.log( "Request failed" );
});
I came across this question after 4 years. I don't remember in what context I asked it, but for anyone who has the same query:
HTTP is a request/response protocol, which means the client sends a request and the server responds to that request with some message/data. That's the end of the story for that request.
In order for the server to trigger something on the client side, we have to use something that keeps the connection to the server open, rather than ending the communication after getting the response. Socket.IO is a bidirectional, event-driven library that solves this problem.
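A minimal client-side sketch of that idea follows; the server URL and the event names ('startJob', 'result') are hypothetical and depend entirely on what your Socket.IO server actually emits.
// Assumes the Socket.IO client library is loaded and a Socket.IO server is running.
var socket = io("https://example.com");

// Fire the request and return immediately - no waiting, no timeout handling here.
socket.emit("startJob", { some: "stuff" });

// The server pushes a 'result' event whenever it has something for this client,
// which triggers the callback - effectively a request from server to client.
socket.on("result", function (data) {
    console.log("Server sent:", data);
});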
To update a cart on my online shop (PHP session storage, plus reserving the stock of items in the database), I simply set a timeout of 100 ms on the call and remove the success/error callbacks.
$.ajax({
    url: 'http://www.some.url/',
    method: 'GET',
    data: {
        some: 'stuff'
    },
    dataType: 'html',
    timeout: 100
});
Note: it doesn't matter if some requests never arrive, because when the order is saved, an update of the whole cart is sent with a callback.
If your request needs an acknowledgement, don't use this solution!
I believe your question is similar to this one by Paul Tomblin. I use the answer provided by gdoron, which is also marked as the best solution, and also the comment by AS7K.
$.ajax({
    url: "theURL",
    data: theData
});
NB: No async parameter provided.

Page waits for AJAX before changing location

This question might seem a bit odd; the problem arose when the page went through web tests.
The page uses an AJAX call (async set to true) to gather some data. For some reason it won't swap pages before the AJAX call has returned. Consider the following code:
console.log("firing ajax call");
$.ajax({
    type: "POST",
    url: "requestedService",
    data: { mode: "requestedMethod" },
    cache: false,
    dataType: "json",
    success: function() { console.log("ajax response received"); },
    error: null,
    complete: null
});
console.log("changing window location");
window.location = "http://www.google.com"
The location only changes after AJAX returns the response. I have tested the call; it is in fact asynchronous, and the page isn't blocked. It should just load the new page even if the AJAX call hasn't completed, but it doesn't. I can see the page is trying to load, but it only happens once I get the response. Any ideas?
The console output is:
firing ajax call
changing window location
ajax response received
This seems to work fine for me. The location is changed before the code in the async handler executes. Maybe you should post some real code and not a simplified version, so that we can help better.
Here is a demonstration that works as you expect: http://jsfiddle.net/BSg9P/
$(document).ready(function() {
    var result;
    $("#btn").on('click', function(sender, args) {
        setInterval(function() {
            result = "some result";
            console.log("Just returned a result");
        }, 5000);
        window.location = "http://www.google.com";
    });
});
And here is a screenshot of the result: http://screencast.com/t/VbxMCxxyIbB
I have clicked the button 2 times, and you can see in the JS console that the message about the location change is printed before the result each time. (The error is related to CORS, if it was the same domain, it would navigate).
Bit late but maybe someone else will have the same issue.
This answer by #todd-menier might help: https://stackoverflow.com/questions/941889#answer-970843
So the issue might be server-side. For example, if you're using PHP sessions, by default the user's session is locked while the server is processing the AJAX request, so the request for the new page can't be processed by the server until the AJAX call has completed and released the lock. You can release the lock early (with session_write_close()) if your AJAX processing code doesn't need it, so that the next page load can happen simultaneously.

Chrome not handling jquery ajax query

I have the following request in jQuery. It reads the "publish" address of an Nginx subscribe/publish pair set up using Nginx's long-polling module.
function requestNextBroadcast() {
    // never stops - every reply triggers next.
    // and silent errors restart via long timeout.
    getxhr = $.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id=" + channel,
        timeout: 46000, // must be longer than max heartbeat to only trigger after silent error.
        error: function(jqXHR, textStatus, errorThrown) {
            alert("Background failed " + textStatus); // should never happen
            getxhr.abort();
            requestNextBroadcast(); // try again
        },
        success: function(reply, textStatus, jqXHR) {
            handleRequest(reply); // this is the normal result.
            requestNextBroadcast();
        }
    });
}
The code is part of a chat room. Every message sent is answered with a null reply (200/OK), but the data is published. This is the code that reads the subscribe address as the data comes back.
Using a timeout, all the people in the chat room send a simple message every 30 to 40 seconds, even if they don't type anything, so there is plenty of data for this code to read - at least 2 and possibly more messages per 40 seconds.
The code is 100% rock solid in IE and Firefox. But about one read in 5 fails in Chrome.
When Chrome fails, it is with the 46-second timeout.
The log shows one /activity network request outstanding at any one time.
I've been crawling over this code for 3 days now, trying various ideas. And every time IE and Firefox work fine and Chrome fails.
One suggestion I have seen is to make the call synchronous - but that is clearly impossible because it would lock up the user interface for too long.
Edit - I have a partial solution. The code is now this:
function requestNextBroadcast() {
    // never stops - every reply triggers next.
    // and silent errors restart via long timeout.
    getxhr = jQuery.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id=" + channel,
        timeout: <?php echo $delay; ?>,
        error: function(jqXHR, textStatus, errorThrown) {
            window.status = "GET error " + textStatus;
            setTimeout(requestNextBroadcast, 20); // try again
        },
        success: function(reply, textStatus, jqXHR) {
            handleRequest(reply); // this is the normal result.
            setTimeout(requestNextBroadcast, 20);
        }
    });
}
The result is that sometimes the reply is delayed until the $delay (15000 ms) expires, and then the queued messages arrive too quickly to follow. I have been unable to make it drop messages (only tested with network optimisation off) with this new arrangement.
I very much doubt that the delays are due to networking problems - all machines are VMs within my one real machine, and there are no other users of my local LAN.
Edit 2 (Friday 2:30 BST) - Changed the code to use promises - and the POST of actions started to show the same symptoms, but the receive side started to work fine! (????!!!???).
This is the POST routine - it is handling a sequence of requests, to ensure only one at a time is outstanding.
function issuePostNow() {
    // reset heartbeat to dropout to send setTyping(false) in 30 to 40 seconds.
    clearTimeout(dropoutat);
    dropoutat = setTimeout(function() { sendTyping(false); },
                           30000 + 10000 * Math.random());
    // and do send
    var url = "handlechat.php?";
    if (postQueue.length > 0) {
        postData = postQueue[0];
        var postxhr = jQuery.ajax({
            type: 'POST',
            url: url,
            data: postData,
            timeout: 5000
        });
        postxhr.done(function(txt) {
            postQueue.shift(); // remove this task
            if ((txt != null) && (txt.length > 0)) {
                alert("Error: unexpected post reply of: " + txt);
            }
            issuePostNow();
        });
        postxhr.fail(function() {
            alert(window.status = "POST error " + postxhr.statusText);
            issuePostNow();
        });
    }
}
About one action in 8, the call to handlechat.php will time out and the alert appears. Once the alert has been OKed, all the queued-up messages arrive.
I also noticed that the handlechat call stalled before it wrote the message that others would see. I'm wondering if it could be some strange handling of session data by PHP. I know PHP carefully queues up calls so that session data is not corrupted, so I have been careful to use different browsers or different machines. There are only 2 PHP worker threads; however, PHP is NOT used in the handling of /activity or in the serving of static content.
I have also thought it might be a shortage of nginx workers or PHP processors, so I have raised those. It is now more difficult to get things to fail - but still possible. My guess is the /activity call now fails one time in 30, and does not drop messages at all.
And thanks guys for your input.
Summary of findings.
1) It is a bug in Chrome that has been in the code for a while.
2) With luck the bug can be made to appear as a POST that is not sent, and, when it times out it leaves Chrome in such a state that a repeat POST will succeed.
3) The variable used to store the return from $.ajax() can be local or global. The new (promises) and the old format calls both trigger the bug.
4) I have not found a workaround or a way to avoid the bug.
Ian
I had a very similar issue with Chrome. I am making an Ajax call to get the time from a server every second. Obviously the Ajax call must be asynchronous, because otherwise it would freeze up the interface on a timeout. But once one of the Ajax calls fails, each subsequent one fails as well. I first tried setting a timeout of 100 ms, which worked well in IE and FF but not in Chrome. My best solution was setting the type to POST, which solved the bug with Chrome for me:
setInterval(function(){
    $.ajax({
        url: 'getTime.php',
        type: 'POST',
        async: true,
        timeout: 100,
        success: function() { console.log("success"); },
        error: function() { console.log("error"); }
    });
}, 1000);
Update:
I believe the actual underlying problem here is Chrome's way of caching. It seems that when one request fails, that failure is cached, and therefore subsequent requests are never made because Chrome will get the cached failure before initiating subsequent requests. This can be seen if you go to Chrome's developer tools and go to the Network tab and examine each request being made. Before a failure, ajax requests to getTime.php are made every second, but after 1 failure, subsequent requests are never initiated. Therefore, the following solution worked for me:
setInterval(function(){
    $.ajax({
        url: 'getTime.php',
        cache: false,
        async: true,
        timeout: 100,
        success: function() { console.log("success"); },
        error: function() { console.log("error"); }
    });
}, 1000);
The change here is that I am disabling caching for this Ajax query; to do so, the type option must be either GET or HEAD, which is why I removed type: 'POST' (GET is the default).
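For reference, cache: false is roughly equivalent to appending a throwaway timestamp parameter to the URL yourself, so every request has a unique URL that Chrome cannot serve from its cache. This is a sketch of the idea, not the exact internal implementation:
setInterval(function(){
    $.ajax({
        // The extra "_" parameter changes on every call, which defeats caching
        // in the same way jQuery's cache: false option does for GET requests.
        url: 'getTime.php?_=' + new Date().getTime(),
        async: true,
        timeout: 100,
        success: function() { console.log("success"); },
        error: function() { console.log("error"); }
    });
}, 1000);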
Try moving your polling function into a web worker to prevent the freezing up in Chrome.
Otherwise you could try using the ajax .done() of the jQuery object; that one always works for me in Chrome.
I feel like getxhr should be prefixed with "var". Don't you want a completely separate and new request each time, rather than overwriting the old one in the middle of success/failure handling? That could explain why the behavior "improves" when you add the setTimeout. I could also be missing something ;)
Comments won't format code, so reposting as a 2nd answer:
I think Michael Dibbets is on to something with $.ajax.done -- the Deferred pattern pushes processing to the next turn of the event loop, which I think is the behavior that's needed here. see: http://www.bitstorm.org/weblog/2012-1/Deferred_and_promise_in_jQuery.html or http://joseoncode.com/2011/09/26/a-walkthrough-jquery-deferred-and-promise/
I'd try something like:
function requestNextBroadcast() {
    // never stops - every reply triggers next.
    // and silent errors restart via long timeout.
    getxhr = jQuery.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id=" + channel,
        timeout: <?php echo $delay; ?>
    });
    getxhr.done(function(reply) {
        handleRequest(reply);
    });
    getxhr.fail(function(e) {
        window.status = "GET error " + e;
    });
    getxhr.always(function() {
        requestNextBroadcast();
    });
}
Note: I'm having a hard time finding documentation on the callback arguments for Promise.done & Promise.fail :(
Perhaps it can be worked around by changing the push module settings (there are a few) - could you please post them?
Off the top of my head:
setting it to interval polling would solve it, though not elegantly
the concurrency settings might have some effect
message storage might be used to avoid missing data
I would also use something like Charles to see what exactly happens on the network/application layers

EXT JS Session Timeout

ExtJS - I would like to know how to check the JSON response for a session timeout: for example, if a user is idle for, say, 20 minutes or so, how do I tell whether his session has expired or not?
There is no standard way of handling session timeouts in ExtJS. ExtJS is a client-side library, used to create the user interface/front-end layer of an application, while session management takes place on the server side.
ExtJS Ajax requests implement a callback mechanism. It means that a certain Javascript function is assigned as the callback function, which is called when the Ajax request has finished (either successfully or unsuccessfully). Here's an example taken from ExtJS API Documentation - see parameters success and failure that define the callback functions:
// Basic request
Ext.Ajax.request({
    url: 'foo.php',
    success: someFn,
    failure: otherFn,
    headers: {
        'my-header': 'foo'
    },
    params: { foo: 'bar' }
});
So, in the case of session timeout, you could (for example) construct a JSON response, which would contain some error code (defined by you), and an error message to be shown to the user. The callback function should then check if this error is returned from the server, and take necessary actions (show error message, redirect to login page, etc.) when that happens.
Note that in the above case, from ExtJS viewpoint, the Ajax request would actually be successful. When the HTTP request fails altogether (HTTP errors like 403 and such), the Ajax request is considered unsuccessful. This is important because it is usually possible to define different callback functions for successful and unsuccessful requests (as in the above sample code).
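As a sketch of that approach: the response shape and the 'SESSION_EXPIRED' error code below are assumptions about your own server-side design, not anything built into ExtJS.
Ext.Ajax.request({
    url: 'foo.php',
    success: function (response) {
        // The server is assumed to wrap timeouts in a JSON body such as
        // { "success": false, "errorCode": "SESSION_EXPIRED", "message": "..." }
        var data = Ext.decode(response.responseText);
        if (data && data.errorCode === 'SESSION_EXPIRED') {
            Ext.MessageBox.alert('Session expired', data.message, function () {
                window.location = '/login'; // send the user back to the login page
            });
        } else {
            // normal processing for a live session
        }
    },
    failure: function (response) {
        // HTTP-level failures (403, 500, ...) end up here
    }
});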
You can mock the session timeout...
var keepaliveHandler = new Ext.util.DelayedTask(function(){
    Ext.Ajax.request({
        url: '/keepalive',
        method: 'GET',
        success: function(response, options){
            // dummy server call each 60 seconds
            keepaliveHandler.delay(60000);
        }
    });
});
var timeoutHandler = new Ext.util.DelayedTask(function(){
    // invalidate session
    Ext.Ajax.request({
        url: '/logout',
        method: 'GET',
        success: function(response, options){
            Ext.MessageBox.show({
                title: MessagesMap.getMessage('session.closed'),
                msg: MessagesMap.getMessage('session.closed.message'),
                buttons: Ext.MessageBox.OK,
                fn: function() {
                    window.location.pathname = '/';
                },
                icon: Ext.MessageBox.WARNING
            });
        }
    });
});
if (Ext.ux.SystemProperties.isLogged) {
    keepaliveHandler.delay(60000);
    timeoutHandler.delay(Ext.ux.SystemProperties.timeout);
    // check for mouse movements
    document.body.onmousemove = function(e) {
        timeoutHandler.delay(Ext.ux.SystemProperties.timeout);
    };
}
