Ajax call not working until user interaction in iOS - javascript

I am developing an application with PhoneGap and jQuery and I am facing a problem when performing Ajax requests on iOS. I make the request, the PHP script on my server receives the data and echoes the correct answer, but my app does not 'get' the response until I interact with the screen somehow (scrolling, for example) or wait a very long time (over a minute). The problem has gotten worse: at first it only appeared after a few requests, and now the very first Ajax call shows it. I also noticed that when I strip out all (or almost all) of the JavaScript and/or CSS, the problem disappears, as if it were something to do with the phone's memory (weird). When I make the request with async: false, the problem also disappears. It happens on an iPhone 4; the same code was tested on Android and on desktop (Chrome and Mozilla Firefox) and worked fine.
The weirdest thing is that when I interact with the screen, the answer appears almost instantly, as if the response were already there but not showing up for some reason.
PS: the error alert never appears.
jQuery.ajax({
    type: 'GET',
    url: 'url',
    crossDomain: true,
    data: {
        data: data
    },
    error: function() {
        alert('error');
    }
}).done(function(data) {
    alert(data);
});

http://api.jquery.com/jQuery.ajax/
If you want a function to be triggered when the response arrives, I would use complete instead of done.
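For illustration, a minimal sketch of that suggestion, reusing the placeholder url and data values from the question:
jQuery.ajax({
    type: 'GET',
    url: 'url',                 // placeholder URL from the question
    crossDomain: true,
    data: { data: data },       // data as in the question
    error: function() {
        alert('error');
    },
    // complete fires after either success or error, once the request has finished
    complete: function(jqXHR, textStatus) {
        alert('request finished with status: ' + textStatus);
    }
});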

My guess is that you are not cancelling the click action and the page is refreshing:
$(".yourSelector").on("click", function (evt) {
    evt.preventDefault();
    // your ajax call
});
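Putting the two together, a hedged sketch of what the handler might look like with the request from the question (the selector, url and data values are placeholders):
$(".yourSelector").on("click", function (evt) {
    // stop the default click action so the page does not navigate or refresh
    evt.preventDefault();

    jQuery.ajax({
        type: 'GET',
        url: 'url',             // placeholder URL from the question
        crossDomain: true,
        data: { data: data }    // data as in the question
    }).done(function(response) {
        alert(response);
    }).fail(function() {
        alert('error');
    });
});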

Related

Why is my async jQuery ajax call blocking?

I have read many questions similar to mine from long ago but have yet to find the answer to my problem, so apologies if this sounds familiar.
I have a Laravel/PHP web app which loads in an Excel file of transactions. These are processed as either success or failure. In development it takes about two seconds per transaction, and a typical file has about 40 transactions. I now want to use the Bootstrap progress bar to give the user feedback on how far along the processing is.
I have a page with a button that fires the import; file selection and so on has already happened, so I can just call the backend URL (audit.import) with the correct parameters and the upload will happen and work. What I have done is create a URL that returns the status of the upload from the server (loadprogress). The plan is that loadprogress will be called via Ajax and setTimeout in order to poll the backend; once all records have succeeded or failed, the poll can end.
The problem is that the loadprogress poll fires regularly right up until I press the button to start the main file load. Then it fails to fire again until the main file load has completed, which defeats the planned use of the progress meter.
My javascript looks like this,
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>
<script type="text/javascript">
    $(document).ready(function() {
        var fullname = '<?php echo $fullname; ?>';

        $("#ajaxButton").click(function(event) {
            $.ajaxSetup({
                headers: {
                    'X-CSRF-TOKEN': $('meta[name="csrf-token"]').attr('content')
                }
            });
            $.ajax({
                url: '/audit.import/1/' + fullname,
                type: 'POST',
                async: true,
            }).always(function(xhr, status) {
                console.log("Import complete with status of " + status);
            });
            console.log("sent async call to perform audit");
        });

        (function loadProgress() {
            $.ajax({
                type: 'GET',
                url: '/loadprogress',
            }).done(function(result) {
                console.log(result);
            }).then(function() {
                setTimeout(loadProgress, 100);
            });
        })();
    });
</script>
I am on a Mac in Safari, but have tried Chrome on the Mac with the same results.
Any assistance would be welcomed.
Thanks.
It was indeed a server-side problem. I was running on a single-threaded test server, which means there was no second thread for my second request, hence the block. By default the test server that comes with a Laravel install (php artisan serve) is single-threaded, a fact I had missed when setting up my test environment.

Page waits for AJAX before changing location

This question might seem a bit odd; the problem arose when the page went through web tests.
The page uses an AJAX call (async set to true) to gather some data. For some reason it won't swap pages before the AJAX call has returned - consider the following code:
console.log("firing ajax call");
$.ajax({
type: "POST",
url: "requestedService",
data: mode : "requestedMethod",
cache: false,
dataType: "json",
success: function() { console.log("ajax response received") },
error: null,
complete: null,
});
console.log("changing window location");
window.location = "http://www.google.com"
The location only changes after AJAX returns the response. I have tested the call, it is in fact asynchronous, the page isn't blocked. It should just load the new page even if the AJAX call hasn't completed, but doesn't. I can see the page is trying to load, but it only happens once I get the response. Any ideas?
The console output is:
firing ajax call
changing window location
ajax response received
This seems to work fine for me. The location is changed before the code in the async handler executes. Maybe you should post some real code and not a simplified version, so that we can help better.
Here is a demonstration that works as you expect: http://jsfiddle.net/BSg9P/
$(document).ready(function() {
    var result;
    $("#btn").on('click', function(sender, args) {
        setInterval(function() {
            result = "some result";
            console.log("Just returned a result");
        }, 5000);
        window.location = "http://www.google.com";
    });
});
And here is a screenshot of the result: http://screencast.com/t/VbxMCxxyIbB
I have clicked the button 2 times, and you can see in the JS console that the message about the location change is printed before the result each time. (The error is related to CORS, if it was the same domain, it would navigate).
Bit late but maybe someone else will have the same issue.
This answer by #todd-menier might help: https://stackoverflow.com/questions/941889#answer-970843
So the issue might be server-side. For example, if you're using PHP sessions, by default the user's session is locked while the server is processing the Ajax request, so the next request (for the new page) can't be processed until the Ajax request has completed and released the lock. You can release the lock early (in PHP, with session_write_close()) if your Ajax processing code no longer needs the session, so the next page load can happen simultaneously.

Loading gif image is not showing in IE and chrome

I am using a jQuery Ajax call to send a synchronous request to the server and want to display a loading image while it runs.
The loading image is visible in Firefox but not in IE and Chrome. When I debugged, I found that in IE, once the JavaScript starts running, the browser stops rendering changes (it halts DOM updates) and only shows all the changes when the call has completed. Since I show the loading image when the Ajax call starts and remove it when the call completes, IE never displays the image at all.
However, when I use an alert box, the loading image does show in IE, because the alert stops the script until I respond to it.
Is there any way to pause JavaScript execution for a moment so that IE gets a chance to render the loading image?
I have already tried jQuery's ajaxSend, ajaxComplete, ajaxStart and ajaxStop, and JavaScript timer events, but have not found a solution.
Any help is precious, thanks in advance.
In the context of XHR, synchronous just means that the browser freezes until the request is complete. If what you want is to make sure only one request is running at a given time, then use your own flag to indicate that a request is in progress (a sketch follows below).
By definition, the synchronous flag of a request means that all other activity must stop. I'm surprised that it even works in Firefox; the last time I tried it, it didn't, but that was a long time ago. So forget about it, there's no way to make it work in all browsers. Even if you delay the request using a setTimeout, at best you'll get a frozen browser with a non-animated gif, and users don't like it when you freeze their browser, especially if the request might take more than a fraction of a second.
Don't ever depend on the browser for security or correctness-related features. If your application breaks when a client makes two requests in parallel, then you have a bug on the server side; there's nothing to prevent a malicious user from making parallel requests using tools other than the normal UI.
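A minimal sketch of that "own flag" idea, assuming a hypothetical #loading element and a placeholder URL:
var requestInProgress = false;      // our own flag, instead of a synchronous call

function sendRequest() {
    if (requestInProgress) {
        return;                     // a request is already running, skip this one
    }
    requestInProgress = true;
    $('#loading').show();           // hypothetical loading indicator

    $.ajax({
        type: 'GET',
        url: '/some/endpoint',      // placeholder URL
        async: true
    }).always(function() {
        requestInProgress = false;
        $('#loading').hide();
    });
}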
Your problem is probably the 'synchronized' part in your opening post.
Open the connection asynchronously. That stops the browser from locking up, and it will work as you expect (set async = true on your XMLHttpRequest / ActiveX object).
Try showing the loading image at the start of your jQuery Ajax call and hiding it in the success event, or you can use setTimeout as well (a sketch of the first idea follows below).
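A minimal sketch of that idea, assuming a hypothetical #loading element and a placeholder URL; note that it only helps if the call is asynchronous:
$('#loading').show();               // show the spinner before the call starts

$.ajax({
    type: 'GET',
    url: '/some/endpoint',          // placeholder URL
    async: true                     // must be asynchronous for the gif to animate
}).done(function(data) {
    // handle the result here
}).always(function() {
    $('#loading').hide();           // hide the spinner whether it succeeded or failed
});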
I also faced a similar problem while working with Ajax: I applied some styles to the UI during the Ajax call, but they were not applied, and, as you said, if you put in an alert it works as expected until OK is clicked on the alert.
I can't guess why it happens, but you can use jQuery to show or hide that Loading... div.
Hope it works.
I had a similar problem and sorted it out by using an UpdatePanel in ASP.NET. If you are using PHP or any other technology, you will have to use an Ajax call; if you make a synchronous call, the loading image will not be shown.
I have had a similar situation to deal with: if you have to make a synchronous call, the browser will suspend DOM updates, so unless absolutely necessary, stick with async calls.
There is a workaround for updating the DOM and showing an element before starting the Ajax call: use jQuery animate() and make the Ajax call in the animation-complete callback. The following code works for me:
// show the loading animation div
$('#loading').show();
// call dummy animate on element, call ajax in the on-finish handler
$('#loading').animate({
    opacity: 1
}, 500, function() {
    // call ajax here
    var dataString = getDataString() + p + '&al=1';
    $.ajax({
        type: 'GET',
        async: false,
        url: 'uldateList.php',
        data: dataString,
        dataType: 'json',
        success: function(result) {
            // do something with result ...
            // hide loading animation
            $('#loading').hide();
        }
    });
});
You can try executing the synchronous Ajax call in the image's load callback.
Something like:
var img = new Image();
img.src = "loading.gif";
img.onload = function() {
    /* ajax synch call */
};
Then append img to the DOM.
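For illustration, a hedged sketch of that suggestion with a synchronous jQuery call (the URL and the #loading container are placeholders):
var img = new Image();
img.src = "loading.gif";
img.onload = function() {
    // the gif has loaded, so attach it before the blocking call starts
    $('#loading').append(img).show();   // hypothetical container element

    $.ajax({
        type: 'GET',
        url: '/some/endpoint',          // placeholder URL
        async: false                    // synchronous call, freezes the UI until it returns
    });

    $('#loading').hide();
};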
Hi, try modifying the Ajax call like this; it works for me (the third argument true makes the request asynchronous):
xmlHttp.open('POST', url, true);
xmlHttp.onreadystatechange = myHandlerFunction;
xmlHttp.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
xmlHttp.setRequestHeader("Accept-Charset", "charset=UTF-8");
xmlHttp.send(query);
I used Sergiu Dumitriu's information as a base. I set async to true and added:
beforeSend: function() {
    $('#Waiting').jqmShow();
},
complete: function() {
    $('#Waiting').jqmHide();
}
And it worked for me. Basically I created my own async: false behaviour.
In your $.ajax call you need to add async: true. Following code works for me:
$.ajax({
    url: 'ajax_url',
    data: 'sending_data',
    type: 'POST',
    cache: false,
    async: true,
    beforeSend: function() {
        $('#id_of_element_where_loading_needed').html('html_of_loading_gif');
    },
    success: function(data) {
        $('#id_of_element_where_result_will_be_shown').html(data.body);
    }
});

Chrome not handling jquery ajax query

I have the following query in jQuery. It is reading the "publish" address of an Nginx subscribe/publish pair set up using Nginx's long-polling module.
function requestNextBroadcast() {
    // never stops - every reply triggers next.
    // and silent errors restart via long timeout.
    getxhr = $.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id=" + channel,
        timeout: 46000, // must be longer than max heartbeat to only trigger after silent error.
        error: function(jqXHR, textStatus, errorThrown) {
            alert("Background failed " + textStatus); // should never happen
            getxhr.abort();
            requestNextBroadcast(); // try again
        },
        success: function(reply, textStatus, jqXHR) {
            handleRequest(reply); // this is the normal result.
            requestNextBroadcast();
        }
    });
}
The code is part of a chat room. Every message sent is replied to with a null (200/OK) reply, but the data is published. This is the code that reads the subscribe address as the data comes back.
Using a timeout, everyone in the chat room sends a simple message every 30 to 40 seconds, even if they don't type anything, so there is plenty of data for this code to read: at least two, and possibly more, messages per 40 seconds.
The code is 100% rock solid in IE and Firefox, but about one read in five fails in Chrome.
When Chrome fails, it is with the 46-second timeout.
The log shows one /activity network request outstanding at any one time.
I've been crawling over this code for three days now, trying various ideas, and every time IE and Firefox work fine and Chrome fails.
One suggestion I have seen is to make the call synchronous, but that is clearly impossible because it would lock up the user interface for too long.
Edit: I have a partial solution. The code is now this:
function requestNextBroadcast() {
    // never stops - every reply triggers next.
    // and silent errors restart via long timeout.
    getxhr = jQuery.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id=" + channel,
        timeout: <?php echo $delay; ?>,
        error: function(jqXHR, textStatus, errorThrown) {
            window.status = "GET error " + textStatus;
            setTimeout(requestNextBroadcast, 20); // try again
        },
        success: function(reply, textStatus, jqXHR) {
            handleRequest(reply); // this is the normal result.
            setTimeout(requestNextBroadcast, 20);
        }
    });
}
The result is that sometimes the reply is delayed until the $delay (15000) expires, and then the queued messages arrive too quickly to follow. With this new arrangement I have been unable to make it drop messages (only tested with network optimisation off).
I very much doubt that the delays are due to networking problems: all machines are VMs within my one real machine, and there are no other users of my local LAN.
Edit 2 (Friday 2:30 BST) - Changed the code to use promises - and the POST of actions started to show the same symptoms, but the receive side started to work fine! (????!!!???).
This is the POST routine - it is handling a sequence of requests, to ensure only one at a time is outstanding.
function issuePostNow() {
    // reset heartbeat to dropout to send setTyping(false) in 30 to 40 seconds.
    clearTimeout(dropoutat);
    dropoutat = setTimeout(function() { sendTyping(false); },
        30000 + 10000 * Math.random());
    // and do send
    var url = "handlechat.php?";
    if (postQueue.length > 0) {
        postData = postQueue[0];
        var postxhr = jQuery.ajax({
            type: 'POST',
            url: url,
            data: postData,
            timeout: 5000
        });
        postxhr.done(function(txt) {
            postQueue.shift(); // remove this task
            if ((txt != null) && (txt.length > 0)) {
                alert("Error: unexpected post reply of: " + txt);
            }
            issuePostNow();
        });
        postxhr.fail(function() {
            alert(window.status = "POST error " + postxhr.statusText);
            issuePostNow();
        });
    }
}
About one action in eight, the call to handlechat.php will time out and the alert appears. Once the alert has been OKed, all the queued-up messages arrive.
I also noticed that the handlechat call stalled before it wrote the message that others would see. I'm wondering if it could be some strange handling of session data by PHP. I know PHP carefully queues up calls so that session data is not corrupted, so I have been careful to use different browsers or different machines. There are only two PHP worker threads; however, PHP is NOT used in the handling of /activity or in serving static content.
I have also thought it might be a shortage of nginx workers or PHP processors, so I have raised those. It is now more difficult to get things to fail, but still possible. My guess is that the /activity call now fails one time in 30, and does not drop messages at all.
And thanks guys for your input.
Summary of findings.
1) It is a bug in Chrome that has been in the code for a while.
2) With luck the bug can be made to appear as a POST that is not sent, and when it times out, it leaves Chrome in such a state that a repeat POST will succeed.
3) The variable used to store the return from $.ajax() can be local or global. The new (promises) and the old format calls both trigger the bug.
4) I have not found a workaround or a way to avoid the bug.
Ian
I had a very similar issue with Chrome. I am making an Ajax call in order to get the time from a server every second. Obviously the Ajax call must be asynchronous, because it will freeze up the interface on a timeout if it's not. But once one of the Ajax calls fails, each subsequent one fails as well. I first tried setting the timeout to 100 ms, and that worked well in IE and FF but not in Chrome. My best solution was setting the type to POST, which solved the bug in Chrome for me:
setInterval(function() {
    $.ajax({
        url: 'getTime.php',
        type: 'POST',
        async: true,
        timeout: 100,
        success: function() { console.log("success"); },
        error: function() { console.log("error"); }
    });
}, 1000);
Update:
I believe the actual underlying problem here is Chrome's way of caching. It seems that when one request fails, that failure is cached, and therefore subsequent requests are never made because Chrome will get the cached failure before initiating subsequent requests. This can be seen if you go to Chrome's developer tools and go to the Network tab and examine each request being made. Before a failure, ajax requests to getTime.php are made every second, but after 1 failure, subsequent requests are never initiated. Therefore, the following solution worked for me:
setInterval(function() {
    $.ajax({
        url: 'getTime.php',
        cache: false,
        async: true,
        timeout: 100,
        success: function() { console.log("success"); },
        error: function() { console.log("error"); }
    });
}, 1000);
The change here is that I am disabling caching for this Ajax query. For cache: false to work, the request type must be either GET or HEAD, which is why I removed type: 'POST' (GET is the default).
Try moving your polling function into a Web Worker to prevent the freezing in Chrome (a sketch follows below).
Otherwise you could try using the .done() of the jQuery Ajax object; that one always works for me in Chrome.
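A minimal sketch of the Web Worker idea, assuming a hypothetical poller.js file and the /activity endpoint from the question; the worker polls with plain XMLHttpRequest (jQuery is not available inside a worker) and posts each reply back to the page:
// poller.js - hypothetical worker file
var channelId;

function poll() {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/activity?id=' + encodeURIComponent(channelId), true);
    xhr.onload = function() {
        postMessage(xhr.responseText);  // hand the reply back to the page
        setTimeout(poll, 20);           // and poll again straight away
    };
    xhr.onerror = function() {
        setTimeout(poll, 20);           // silent error: just try again
    };
    xhr.send();
}

onmessage = function(e) {
    channelId = e.data;                 // the first message carries the channel id
    poll();
};
On the page side it might be wired up like this:
// on the page
var worker = new Worker('poller.js');
worker.onmessage = function(e) {
    handleRequest(e.data);              // same handler as in the question
};
worker.postMessage(channel);            // start polling for this channel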
I feel like getxhr should be prefixed with "var". Don't you want a completely separate & new request each time rather than overwriting the old one in the middle of success/failure handling? Could explain why the behavior "improves" when you add the setTimeout. I could also be missing something ;)
Comments won't format code, so reposting as a 2nd answer:
I think Michael Dibbets is on to something with $.ajax.done -- the Deferred pattern pushes processing to the next turn of the event loop, which I think is the behavior that's needed here. see: http://www.bitstorm.org/weblog/2012-1/Deferred_and_promise_in_jQuery.html or http://joseoncode.com/2011/09/26/a-walkthrough-jquery-deferred-and-promise/
I'd try something like:
function requestNextBroadcast() {
    // never stops - every reply triggers next.
    // and silent errors restart via long timeout.
    getxhr = jQuery.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id=" + channel,
        timeout: <?php echo $delay; ?>
    });
    getxhr.done(function(reply) {
        handleRequest(reply);
    });
    getxhr.fail(function(e) {
        window.status = "GET error " + e;
    });
    getxhr.always(function() {
        requestNextBroadcast();
    });
}
Note: I'm having a hard time finding documentation on the callback arguments for Promise.done & Promise.fail :(
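For reference, the jqXHR callback signatures documented for jQuery.ajax are roughly these:
getxhr.done(function(data, textStatus, jqXHR) {
    // success: data is the parsed response body
});
getxhr.fail(function(jqXHR, textStatus, errorThrown) {
    // failure: textStatus is e.g. "timeout", "error", "abort" or "parsererror"
});
getxhr.always(function(dataOrJqXHR, textStatus, jqXHROrErrorThrown) {
    // runs in both cases; the argument meaning depends on success or failure
});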
Perhaps it can be worked around by changing the push module settings (there are a few) - could you please post these?
Off the top of my head:
switching it to interval polling would solve it, albeit in an ugly way
the concurrency settings might have some effect
message storage might be used to avoid missing data
I would also use something like Charles to see what exactly happens at the network/application layers

Problem with jQuery.ajax with 'delete' method in ie

I have a page where the user can edit various content using buttons and selects that trigger Ajax calls. In particular, one action causes a URL to be called remotely, with some data and a 'put' request, which (as I'm using a RESTful Rails backend) triggers my update action. I also have a delete button which calls the same URL but with a 'delete' request. The 'update' Ajax call works in all browsers, but the 'delete' one doesn't work in IE. I've got a vague memory of encountering something like this before... can anyone shed any light? Here are my Ajax calls:
//update action - works in all browsers
jQuery.ajax({
    async: true,
    data: data,
    dataType: 'script',
    type: 'put',
    url: "/quizzes/" + quizId + "/quiz_questions/" + quizQuestionId,
    success: function(msg) {
        initializeQuizQuestions();
        setPublishButtonStatus();
    }
});
//delete action - fails in ie
function deleteQuizQuestion(quizQuestionId, quizId) {
    // send ajax call to back end to change the difficulty of the quiz question
    // back end will then refresh the relevant parts of the page (progress bars, flashes, quiz status)
    jQuery.ajax({
        async: true,
        dataType: 'script',
        type: 'delete',
        url: "/quizzes/" + quizId + "/quiz_questions/" + quizQuestionId,
        success: function(msg) {
            alert("success");
            initializeQuizQuestions();
            setSelectStatus(quizQuestionId, true);
            jQuery("tr[id*='quiz_question_" + quizQuestionId + "']").removeClass('selected');
        },
        error: function(msg) {
            alert("error:" + msg);
        }
    });
}
I put the alerts in success and error in the delete Ajax just to see what happens, and the 'error' part of the Ajax call is triggered, but with no call being made to the back end (I know this from watching my back-end server logs). So it fails before it even makes the call. I can't work out why; the 'msg' I get back from the error block is blank.
Any ideas, anyone? Is this a known problem? I've tested it in IE6 and IE8 and it doesn't work in either.
Thanks - max
EDIT - the solution - thanks to Nick Craver for pointing me in the right direction.
Rails (and maybe other frameworks?) has a subterfuge for the unsupported PUT and DELETE requests: a POST request with the parameter "_method" (note the underscore) set to 'put' or 'delete' will be treated as if the actual request type were that string. So, in my case, I made this change (note the 'data' option):
jQuery.ajax({
    async: true,
    data: { "_method": "delete" },
    dataType: 'script',
    type: 'post',
    url: "/quizzes/" + quizId + "/quiz_questions/" + quizQuestionId,
    success: function(msg) {
        alert("success");
        initializeQuizQuestions();
        setSelectStatus(quizQuestionId, true);
        jQuery("tr[id*='quiz_question_" + quizQuestionId + "']").removeClass('selected');
    },
    error: function(msg) {
        alert("error:" + msg);
    }
});
Rails will now treat this as if it were a delete request, preserving the REST system. The reason my PUT example worked was just because in this particular case IE was happy to send a PUT request, but it officially does not support them so it's best to do this for PUT requests as well as DELETE requests.
IE7 and IE8 do not support the DELETE and PUT methods. I had a problem where IE7/8 would not follow a 302 redirect correctly: IE would use the DELETE or PUT method for the location it was supposed to redirect to (instead of a GET).
To ensure that IE7 and 8 work properly, I would use a POST with the parameters:
data: {'_method': 'delete'}
Take a look at your type attribute: type: 'delete'.
jQuery documentation on type:
The type of request to make ("POST" or "GET"), default is "GET". Note: Other HTTP request methods, such as PUT and DELETE, can also be used here, but they are not supported by all browsers.
I would instead try and include this with your data and look for it on the server-side, like this:
data: {'action': 'delete'},
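Put together with the delete call from the question, that suggestion would look roughly like this (the server side would then branch on the action parameter instead of on the HTTP verb):
jQuery.ajax({
    async: true,
    data: { 'action': 'delete' },   // checked on the server instead of using the DELETE verb
    dataType: 'script',
    type: 'post',
    url: "/quizzes/" + quizId + "/quiz_questions/" + quizQuestionId,
    success: function(msg) {
        alert("success");
    },
    error: function(msg) {
        alert("error:" + msg);
    }
});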
