I'm working on a UI5 application. I want to block the UI for users during ajax requests. I added the following code:
var dialog = new sap.m.BusyDialog({
    text: this.languageModel.getModelProperty("BUSY_DIALOG_FETCHING_DATA"),
    title: this.languageModel.getModelProperty("BUSY_DIALOG_WAIT")
});
var that = this;
jQuery.ajaxSetup({
    beforeSend: function() {
        dialog = dialog ? dialog : new sap.m.BusyDialog({
            text: that.languageModel.getModelProperty("BUSY_DIALOG_FETCHING_DATA"),
            title: that.languageModel.getModelProperty("BUSY_DIALOG_WAIT")
        });
        dialog.open();
    },
    complete: function() {
        console.log("close");
        dialog.close();
    }
});
In the console I get the expected output, and while debugging I can see the dialogs, but in normal mode (without any breakpoints) the busy dialogs don't appear at all. Any clue?
Why not use the busy indicator?
var oController = this;
oController.getView().setBusy(true);
$.ajax({
    // your ajax call options here
    success: function(data) {
        // do whatever needs to be done after success
        oController.getView().setBusy(false);
    },
    error: function(jqXHR, textStatus, errorThrown) {
        oController.getView().setBusy(false);
    }
});
You're probably sending some requests synchronously, so the browser isn't able to update the DOM [1] while the dialog is trying to open.
When calling low-level APIs (such as jQuery.ajax()):
NEVER set async: false.
NEVER call jQuery.sap.sjax as it's deprecated due to sending sync XHRs only.
Migrate to asynchronous APIs. Follow the best practices mentioned below:
Is Your Application Ready for Asynchronous Loading?
Use Asynchronous Loading
In case of OData: DO NOT use the deprecated sap.ui.model.odata.ODataModel! It sends many requests with sync XHRs by default. Consider using v2.ODataModel or v4.ODataModel depending on the OData service.
[1] Sending requests synchronously blocks the main-thread (which is the UI-thread at the same time) of the browser which results in a bad UX. It's also deprecated by the web platform.
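For illustration, a minimal asynchronous variant of the setup from the question might look like the sketch below. This is only a hedged sketch, not an official UI5 pattern: the global ajaxStart/ajaxStop handlers, the reused languageModel helper and the /some/endpoint URL are assumptions made for the example.
// Hedged sketch: keep requests asynchronous (the jQuery default) and drive
// the BusyDialog from jQuery's global handlers instead of per-request hooks.
var dialog = new sap.m.BusyDialog({
    text: this.languageModel.getModelProperty("BUSY_DIALOG_FETCHING_DATA"),
    title: this.languageModel.getModelProperty("BUSY_DIALOG_WAIT")
});

jQuery(document).ajaxStart(function () {
    dialog.open();   // fires when the first pending request starts
});
jQuery(document).ajaxStop(function () {
    dialog.close();  // fires once all pending requests have completed
});

// The request itself stays asynchronous; async: false is never set.
jQuery.ajax({
    url: "/some/endpoint",   // hypothetical URL
    dataType: "json"
}).done(function (data) {
    // update the model / view with data here
});
Because the request is asynchronous, the browser gets a chance to repaint and the dialog actually becomes visible.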
I have looked at the following thread
jQuery Ajax - Status Code 0?
However, I could not find a definitive answer, and I am having serious trouble trying to find the source of my issue, so I am posting here in the hope that someone can point me in the right direction.
In my code I am performing an Angular HTTP post which just sends basic JSON data; then, within the on-success callback, I am using jQuery AJAX to upload files to the same server. (I know I should not be mixing jQuery and Angular, however I can't change this for the moment.)
It looks something like this:
var deferred = $q.defer();

// first post
$http.post(url, payload, {params: params, headers: headers})
    .then(function(response) {
        uploadFiles(response, deferred);
        // I am also sending google analytics events here
    }, function(error) {
        // do error stuff
    });

return deferred.promise;

// upload files function
function uploadFiles(response, deferred) {
    $.ajax({
        type: 'POST',
        processData: false,
        contentType: false,
        data: data, // this is new FormData() with files appended to it
        url: 'the-endpoint-for-the-upload',
        dataType: 'json',
        success: function(data) {
            // do success stuff here
            deferred.resolve(data);
        },
        error: function(jqXHR, textStatus, errorThrown) {
            var message = {};
            if (jqXHR.status === 0) {
                message.jqXHRStatusIsZero = "true";
            }
            if (jqXHR.readyState === 0) {
                message.jqXHRReadyStateIsZero = "true";
            }
            if (jqXHR.status === '') {
                message.jqXHRStatusIsEmptyString = "true";
            }
            if (jqXHR.status) {
                message.jqXHRStatus = jqXHR.status;
            }
            if (jqXHR.readyState) {
                message.jqXHRReadyState = jqXHR.readyState;
            }
            if (jqXHR.responseText) {
                message.jqXHR = jqXHR.responseText;
            }
            if (textStatus) {
                message.textStatus = textStatus;
            }
            if (errorThrown) {
                message.errorThrown = errorThrown;
            }
            message.error = 'HTTP file upload failed';
            logError(message);
            deferred.resolve(message);
        }
    });
}
This is not my exact code, but it is almost identical.
The issue is that it works almost all of the time, but maybe three or four requests in every few hundred will fail. By fail I mean the error handler of the file upload function is called and the files are not uploaded.
I get jqXHRStatus 0 and jqXHRReadyState 0 when this occurs.
The only way I am able to replicate the issue is by hitting refresh in the browser while the request is being processed; however, users have advised they are not doing this (although I have yet to 100% confirm this).
Is there perhaps a serious flaw in my code which I am not seeing? Maybe passing the deferred variable around isn't good practice? Or is the ajax request being cancelled in another way that I am not considering? Could sending Google Analytics events at the same time be interfering?
Any advice would be great and please let me know if you would like more information on the issue.
This means the request has been cancelled.
There could be many different reasons for that, but be aware that it could also be due to a browser bug or issue, so I believe (IMHO) there is no way to prevent this kind of issue entirely.
Suppose, for example, you get a 503 (Service Unavailable) response. What would you do in such a case? That is also a sporadic, unpredictable issue. Just live with it and try to repost your data.
Without reinventing the wheel, I suggest you implement:
Retrying ajax calls using the deferred API
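A minimal sketch of such a retry wrapper might look like this; the names retryingAjax and maxRetries are hypothetical, not a library API, and retrying only on status 0 is an assumption made for the example.
// Hedged sketch: wrap $.ajax in a deferred so a cancelled request (status 0)
// is retried a few times before the error is reported.
function retryingAjax(options, maxRetries) {
    var deferred = $.Deferred();

    function attempt(retriesLeft) {
        $.ajax(options)
            .done(function (data) { deferred.resolve(data); })
            .fail(function (jqXHR, textStatus, errorThrown) {
                if (retriesLeft > 0 && jqXHR.status === 0) {
                    attempt(retriesLeft - 1);   // request was cancelled, try again
                } else {
                    deferred.reject(jqXHR, textStatus, errorThrown);
                }
            });
    }

    attempt(maxRetries);
    return deferred.promise();
}

// Usage: retry the upload up to two more times before giving up.
retryingAjax({ url: 'the-endpoint-for-the-upload', type: 'POST' }, 2)
    .done(function (data) { /* success stuff */ })
    .fail(function () { /* log the error */ });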
My guess is that your code is executing before the call actually returns, i.e. the upload fires before anything has come back, and it gives a 0 error. This would make sense since the error is intermittent: most of the time it returns fine because the backend executes fast enough, but sometimes it doesn't because the request takes especially long or something else happens. JavaScript never really stops execution; especially when passing between Angular and jQuery with multiple ajax requests, it wouldn't be surprising if the second ajax call executed before your Angular post had actually completed. That's also why a refresh replicates the error: it clears your variables.
Some things you can do to test this:
On the backend, add a delay of a few seconds before anything is returned. This will probably make your code fail more consistently.
Set breakpoints in the JavaScript and see when they are hit and what values the variables contain.
Good luck!
I saw that it's possible to do ajax requests inside a web worker, but I want to do the ajax call via jQuery (outside the worker, of course) and then pass the result of the callback to the worker.
I ran some tests and this works, but I want to know if there's something wrong with it (memory leaks, incompatibility, instability):
$.ajax({
    type: 'GET',
    url: 'http://192.168.0.2/json.php',
    data: requestDataObj,
    dataType: 'json'
}).success(function(jsonResult) {
    var jSonWorker = new Worker('http://localhost/webmail/responsive/js/workers.js');
    jSonWorker.postMessage(jsonResult);
    jSonWorker.onmessage = function(event) {
        alert(event.data);
    };
});
As you can see, I pass the jsonResult to the worker, where I do something with it and post a message back to the main thread. Anything wrong with this?
The only problem I see is that you're assuming every browser has support for window.Worker, which is not the case.
If blocking the application is feasible (say the computation you're expecting the worker to do is light), you could run the entire worker code inside the ajax callback [1].
Thus
...success(function (res) {
if (window.Worker) {
var worker = new Worker(...);
worker.onmessage = function (e) { /* ... */ };
worker.postMessage(res);
}
else {
// Write here the same code the worker
// is supposed to execute "on" res.
}
});
Of course, you will lose the performance improvement you gained from using two threads.
[1] As done here by #afshinm.
I know how to view network traffic in the browser's dev tools and to have XMLHttpRequests displayed in the console, but is there a window property that displays all the network traffic?
(function ($) {
    // The ajaxSend() handler is a global callback which is fired
    // every time a jQuery AJAX request is about to be sent.
    $(document).ajaxSend(function(event, xhr, settings){
        if (iswantedurl(settings.url.pathname)){
            // ga('send','pageview',settings.url.pathname);
        }
    });
})(jQuery);
Original code from: How to extend Google Analytics to track AJAX etc (as per H5BP docs)
With jQuery.
And if you're without it, you could just extend XMLHttpRequest.prototype.open to fire your custom event.
Pure JavaScript:
var originalOpen = XMLHttpRequest.prototype.open;
XMLHttpRequest.prototype.open = function(method, url, async, user, password) {
    // Add my code to record all the requests
    if (!window.XMLLog) window.XMLLog = [];
    window.XMLLog.push(this);
    // Don't break the original behaviour
    return originalOpen.apply(this, arguments);
};
jQuery:
$(document).ajaxSend(function(event, xhr, settings){
    if (!window.XMLLog) window.XMLLog = [];
    window.XMLLog.push({event: event, xhr: xhr, settings: settings});
});
I am using a jQuery ajax call to send a synchronous request to the server, and I want to display a loading image for that time.
The loading image is visible in Firefox but not in IE and Chrome. When I debugged, I found that in IE, while the JavaScript is running, the browser stops showing changes because the DOM is halted, and only when the call is completed does it show all the changes. Since I show the loading image when the ajax call starts and remove it after the call completes, IE does not display the image at all.
But when I use an alert box, IE does show the loading image, because the alert stops the thread until we respond to it.
Is there any method to pause JavaScript execution for a millisecond so that IE's execution halts and the loading image is shown?
I have already tried jQuery's ajaxSend, ajaxComplete, ajaxStart and ajaxStop, and JavaScript's timer events, but did not find a solution.
Any help is precious, thanks in advance.
In the context of XHR, synchronous just means that the browser freezes until the request is complete. If what you want is to make sure only one request is running at a given time, then use your own flag to indicate that a request is in progress.
By definition, the synchronous flag of a request means that any other activity must stop. I'm surprised that it even works in Firefox; the last time I tried it, it didn't work, but that was a long time ago. So forget about it, there's no way to make it work in all browsers. Even if you delay the request using a setTimeout, at best you'll get a frozen browser with a non-animated gif. And users don't like it when you freeze their browser, especially if the request might take more than a fraction of a second.
Don't ever depend on the browser for security or correctness-related features. If your application breaks when a client makes two requests in parallel, then you have a bug on the server side. There's nothing that prevents a malicious user from making parallel requests using tools other than the normal UI.
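A minimal sketch of such a flag might look like this; the requestInProgress name, the #loading selector and the endpoint URL are placeholders used only for illustration.
// Hedged sketch: guard against parallel requests with a simple flag
// instead of blocking the browser with async: false.
var requestInProgress = false;

function loadData() {
    if (requestInProgress) {
        return;                  // a request is already running, ignore this trigger
    }
    requestInProgress = true;
    $('#loading').show();        // show the loading image

    $.ajax({
        url: 'your-endpoint.php',   // placeholder endpoint
        dataType: 'json'
    }).done(function (result) {
        // handle result here
    }).always(function () {
        requestInProgress = false;
        $('#loading').hide();    // hide it whether the request succeeded or failed
    });
}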
Your problem is probably the 'synchronized' part in your opening post.
Open the connection asynchronously. That stops the browser from locking up, and it will work as you expect. (Set async = true on your XMLHttpRequest / ActiveX object.)
Try to show the loading image at the start of your jQuery ajax call and hide it in the success event,
or
you can also use setTimeout, as in the sketch below.
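Here is a hedged sketch of the setTimeout approach; the #loading selector and the endpoint URL are placeholders, and the 50 ms delay is an arbitrary value chosen only to give the browser a chance to repaint.
// Hedged sketch: show the loading image first, then defer the (synchronous)
// call with setTimeout so the browser can repaint before it is blocked.
$('#loading').show();
setTimeout(function() {
    $.ajax({
        url: 'your-endpoint.php',   // placeholder URL
        async: false,               // only if you really must stay synchronous
        success: function(result) {
            // handle result here
        }
    });
    $('#loading').hide();           // remove the image once the call has returned
}, 50);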
I also faced a similar problem while working with ajax: I applied some styles to the UI during the ajax call, but they were not applied, and, as you said, if we add an alert it works as expected until OK is clicked on the alert.
I can't guess why it's happening, but you can use jQuery to show or hide that Loading... div.
Hope it will work.
I had a similar problem and sorted it out by using the UpdatePanel in ASP.NET. If you are using PHP or any other technology, then you have to use an ajax call. If you make a synchronous call, the loading image will not be shown.
I have had a similar situation to deal with: if you have to make a synchronous call, the browser will suspend DOM manipulation. So unless it is absolutely necessary, stick with async calls.
There is a workaround to manipulate the DOM and show an element before starting the ajax call: use jQuery animate(), and make the ajax call in the animation-complete callback. The following code works for me:
// show the loading animation div
$('#loading').show();
// call a dummy animate on the element, make the ajax call in the animation-complete handler
$('#loading').animate({
    opacity: 1
}, 500, function() {
    // call ajax here
    var dataString = getDataString() + p + '&al=1';
    $.ajax({
        type: 'GET',
        async: false,
        url: 'uldateList.php',
        data: dataString,
        dataType: 'json',
        success: function(result){
            // Do something with result
            ...
            // hide loading animation
            $('#loading').hide();
        }
    });
});
You can try to execute the synchronous ajax call in the image's load callback.
Something like:
var img = new Image();
img.src = "loading.gif";
img.onload = function() {
/* ajax synch call */
}
Then append img to DOM.
Hi, try to modify the ajax call like this; it works for me:
xmlHttp.open('POST', url, true);
xmlHttp.onreadystatechange = myHandlerFunction;
xmlHttp.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
xmlHttp.setRequestHeader("Accept-Charset", "charset=UTF-8");
xmlHttp.send(query);
I used Sergiu Dumitriu's information as a base. I set my async to true and added:
beforeSend: function() {
    $('#Waiting').jqmShow();
},
complete: function() {
    $('#Waiting').jqmHide();
}
And it worked for me. Basically, I created my own async: false behaviour.
In your $.ajax call you need to add async: true. The following code works for me:
$.ajax({
    url: 'ajax_url',
    data: 'sending_data',
    type: 'POST',
    cache: false,
    async: true,
    beforeSend: function() {
        $('#id_of_element_where_loading_needed').html('html_of_loading_gif');
    },
    success: function(data) {
        $('#id_of_element_where_result_will_be_shown').html(data.body);
    }
});
I'm making a mailing list script that takes advantage of ajax (async=false) to send emails in chunks.
Basically the cycle is this:
var i = 0;
for (i; i < num_rows; i = i + mxt) {
    if ($("#panic").val() == '1') {
        break;
    }
    perc = (i * 100) / num_rows;
    startThread(i, perc);
}
The panic value is set by a button; the problem is that during the loop (which works) I can't interact with the page.
What am I doing wrong?
Thank you
EDIT:
function startThread(i, perc) {
    l_a = i;
    l_b = mxt;
    headers = '&mail_from=' + mail_from + '&mail_from_name=' + mail_from_name + '&mail_subject=' + mail_subject;
    $.ajax({
        type: "POST",
        url: "ajax/thread.php",
        data: "l_a=" + l_a + "&l_b=" + l_b + headers,
        success: function(html) { $("#progressbar").progressbar({value: perc}); },
        async: false
    });
}
Your startThread() function name is misleading, because JavaScript in web browsers is not only single-threaded, but also shares that thread with the page rendering.
Since you're using async = false, the $.ajax call becomes a blocking call, and this blocks the page rendering thread, making the UI unresponsive.
Quoting the jQuery documentation (emphasis added):
async
Default: true
By default, all requests are sent asynchronously (i.e. this is set to true by default). If you need synchronous requests, set this option to false. Cross-domain requests and dataType: "jsonp" requests do not support synchronous operation. Note that synchronous requests may temporarily lock the browser, disabling any actions while the request is active.
Possible solutions:
Piggyback your data in one JSON object, and send just one $.ajax request. If possible use async=true.
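For example, here is a hedged sketch of how the loop from the question could stay responsive if each chunk is sent asynchronously and the next one is started from the success callback; the sendChunk helper is a hypothetical name, and num_rows, mxt and headers are assumed to exist exactly as in the question.
// Hedged sketch: send the chunks asynchronously, one after another,
// so the UI (including the #panic button) stays responsive.
function sendChunk(i) {
    if (i >= num_rows || $("#panic").val() == '1') {
        return;                                   // finished or aborted by the user
    }
    var perc = (i * 100) / num_rows;
    $.ajax({
        type: "POST",
        url: "ajax/thread.php",
        data: "l_a=" + i + "&l_b=" + mxt + headers
    }).done(function () {
        $("#progressbar").progressbar({value: perc});
        sendChunk(i + mxt);                       // start the next chunk
    });
}

sendChunk(0);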