Ajax async: false freezes web page during loading on a slow network

I use jQuery AJAX in my web page with the async: false option, as shown below. My client's network is very slow. When I try to load the web page from the server, the page loads slowly and all the controls freeze. Does "async: false" matter here? Here is my code:
function ajaxRequestWithNoArguments(url) {
    return $.ajax({
        url: urlForPhp + '/' + url,
        data: '',
        dataType: 'JSON',
        async: false,
        method: 'POST'
    });
}

When I try to load the web page from the server, the page loads slowly and all the controls freeze. Does "async: false" matter?
Yes, this is exactly why you should not use async: false; it is meant for very specific cases, and it sounds like you don't need it. Making the request synchronous means that the browser will pause program execution (and freeze the entire UI) until the request is done and the data is loaded back and processed. You don't want that in most cases, which is why you should use the default async: true.
function ajaxRequestWithNoArguments(url) {
    return $.ajax({
        url: urlForPhp + '/' + url,
        data: '',
        dataType: 'JSON',
        method: 'POST'
    });
}
Returning a promise object is a convenient way to deal with an asynchronous function. In this case you would use ajaxRequestWithNoArguments as follows:
ajaxRequestWithNoArguments('/some/url').then(function(response) {
    console.log('Data loaded', response);
});
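If you also want to react to a failed request, a minimal sketch (using the same ajaxRequestWithNoArguments helper from above) could chain a .fail() handler onto the returned jqXHR promise:
ajaxRequestWithNoArguments('/some/url')
    .then(function(response) {
        // runs once the server has responded successfully
        console.log('Data loaded', response);
    })
    .fail(function(jqXHR, textStatus) {
        // runs on network errors, timeouts, or non-2xx responses
        console.error('Request failed:', textStatus);
    });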

function OpenAjax(link, form_id)
{
    document.getElementById("page-wrapper").innerHTML = "";
    $.ajaxSetup({ async: true });
    $("#page-wrapper").load(link, function(){
        $("#" + form_id).validator();
    });
}
This is my code. I had the same issue, and setting async to true fixed it. When it is set to true, another problem may occur: your JavaScript code continues to run, so if you have code that depends on the response text you must tell jQuery to run it after the response arrives, as in my example:
$("#" + form_id).validator();
This code runs after the response. But if I write my code this way:
function OpenAjax(link, form_id)
{
    document.getElementById("page-wrapper").innerHTML = "";
    $.ajaxSetup({ async: true });
    $("#page-wrapper").load(link, function(){
        //Code moved from this line
    });
    //Here
    $("#" + form_id).validator();
}
$("#" + form_id).validator(); - code will work before Ajax response

Related

Best Practice for closing browser window without waiting for ajax response

I want to open a window, make an ajax call, and close the window.
I want to do it as fast as possible, and I was wondering whether I have to wait for the ajax response before closing the window.
Currently I'm doing it like this:
$.ajax({
    url: requestURL,
    type: 'GET',
    dataType: "json",
    cache: false,
    timeout: 30000,
    always: function () {
        closeWindow();
    }
});
However, I was wondering whether the ajax call will reach the server 100% of the time on all browsers if I do it like this:
$.ajax({
    url: requestURL,
    type: 'GET',
    dataType: "json",
    cache: false,
    timeout: 30000,
    always: function () {
    }
});
closeWindow();
// THIS HAS BEEN CONFIRMED NOT TO WORK AND MISSES OUT SOME REQUESTS
The closeWindow() implementation is irrelevant.
The full use case is as follows:
I send a user a link on WhatsApp/Telegram/Messenger.
The user clicks the link.
The browser opens -> issues an ajax call -> closes the window.
EDIT
To clarify, I don't care what the server's response to the call is. I just want to make sure that the browser issued the HTTP GET to the server, and then close the window.
EDIT 2
AJAX is not a must; in fact, vanilla JS would be preferable.
You can do it like you said: send the ajax request and close the window. Since it's an async method, it will make the call. If you want to be absolutely SURE, write a little PHP script, with a short sleep at the top, that writes something to a file, so you can verify the call was made; but it will make it.
Edit: or use this method and close the browser in the done function; it will take about 1 ms.
The following snippet closes your tab after 1 second, without waiting for the ajax response:
$.ajax({
    url: requestURL,
    type: 'GET',
    dataType: "json",
    cache: false
});
setTimeout(function(){
    close();
}, 1000);
I believe that if you want to know when your request was launched you could use .ajaxStart as explained here.
$(document).ajaxStart(function() {
    console.log('Ajax call started');
    closeWindow();
});
Or you could try some raw js like the solution explained here.
I guess the answer was at the back of my head: use an Image object to send the request and close the window right after:
var img = new Image();
img.src = urlWithParamters;
img.onload = function () {
    closeWindow();
};
img.onerror = function () {
    closeWindow();
};
// safety:
setTimeout(closeWindow, 5000);
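Another vanilla-JS option is fetch() with the keepalive flag, which asks the browser to complete the GET even if the window is closed right away. This is only a sketch assuming the same requestURL and closeWindow() helper as above; check that keepalive is supported by your users' browsers:
// keepalive tells the browser to finish the request even after the page unloads
fetch(requestURL, { method: 'GET', keepalive: true })
    .catch(function () { /* the response is irrelevant here, ignore errors */ });
closeWindow();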
Since you are using jQuery, you can use the "done" callback and put your function inside it, so the window will close if the request was successful.
$.ajax({
    url: requestURL,
    type: 'GET',
    dataType: "json",
    cache: false
}).done(function() {
    closeWindow();
});
You can also use the "fail" callback option to manage potential errors.

High CPU Usage and Synchronous XMLHttpRequest Warning With setInterval() and AJAX

I want to update a shell command's output every second using AJAX.
However, Chrome's CPU usage is too high and the output seems to update much faster than once per second.
Here is the HTML document:
<script src='jquery-2.2.4.js'></script>
<script>
setInterval(function() {
    $.ajax({
        url: "test.php",
        success: function(data) {
            $("body").html(data);
        },
        async: true
    });
}, 1000);
</script>
</body>
And here is the shell command I'm actually using:
system("dir C:");
It would be better to use a setTimeout that is scheduled again after every successful ajax completion.
You could also set up an error handler in the $.ajax call, because a network failure might happen, and call setTimeout(function(){ myajaxfunction(); }, 1000); there as well.
var myajaxfunction = function() {
    $.ajax({
        url: "test.php",
        success: function(data) {
            $("body").html(data);
            setTimeout(function(){ myajaxfunction(); }, 1000);
        },
        async: true
    });
};
myajaxfunction();
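A sketch of the same function with the error handler the answer mentions added, so polling resumes after a network failure; this variant is an assumption, not part of the original answer:
var myajaxfunction = function() {
    $.ajax({
        url: "test.php",
        success: function(data) {
            $("body").html(data);
            setTimeout(myajaxfunction, 1000); // next poll only after this one finished
        },
        error: function() {
            setTimeout(myajaxfunction, 1000); // retry after a network failure instead of stopping
        }
    });
};
myajaxfunction();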
I found the solution. The AJAX request URL was the same URL as the page making the request, which caused an infinite recursive loop.
So what I did was request another PHP page which contains only the data I actually need.
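A sketch of that separation on the JavaScript side; command.php is a hypothetical name for a page that prints only the command output, and #output is a hypothetical dedicated element, so the page containing this script is never requested or overwritten again:
setTimeout(function update() {
    $.get("command.php", function(data) {  // hypothetical data-only endpoint
        $("#output").text(data);           // replace a dedicated element, not the whole <body>
        setTimeout(update, 1000);          // schedule the next update only after this one finished
    });
}, 1000);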

Mockjax dynamic mock stops working with `dataType="Script"`

I have a dynamic mock set up using Mockjax, and it works for most of my ajax requests, but it fails when the dataType is set to "script", letting the request fall through to the regular Ajax handler.
// gets mocked
$.ajax({
    type: "GET",
    url: "http://myurl.com/myfile.js?_=1395314460347"
})
// does not get mocked!
$.ajax({
    type: "GET",
    dataType: "script",
    url: "http://myurl.com/myfile.js?_=1395314460347"
})
How can I configure dynamic mocks in mockjax to intercept requests with the dataType set?
UPDATE: example code for the mockjax definition
I am creating a dynamic mock, so I am defining it via a function, not a plain object, something like this...
$.mockjax(function(settings) {
    // settings.url == '/restful/<service>'
    var service = settings.url.match(/\/restful\/(.*)$/);
    if ( service ) {
        return {
            proxy: '/mocks/' + service[1] + '.json',
            // handle `dataType: 'script'`
            dataType: 'application/javascript'
        };
    }
    return;
});
This appears to be a bug in how Mockjax handles cross-domain script requests. It does nothing special to detect the cross-domain request (as it does with JSONP), so when it passes the request back to the original $.ajax method, jQuery never uses the mocked-up XHR object that Mockjax provided.
So in essence, Mockjax intercepts the request, then passes it right back to jQuery, and it fails on you.
I opened an issue here so this can be fixed: https://github.com/appendto/jquery-mockjax/issues/136
In the meantime you have two choices. If you want to quickly patch Mockjax, add this line at around line 471:
origSettings.crossDomain = false;
That section will look like this when you are done:
mockHandler.cache = requestSettings.cache;
mockHandler.timeout = requestSettings.timeout;
mockHandler.global = requestSettings.global;
origSettings.crossDomain = false;
copyUrlParameters(mockHandler, origSettings);
The other alternative (which I recommend against) is adding crossDomain: false to your actual AJAX request. I don't recommend this because you would need to remove that line when you remove your mocks later.
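For completeness, a sketch of what that workaround would look like on the second request from the question (remember to delete the extra line when the mocks are removed):
$.ajax({
    type: "GET",
    dataType: "script",
    crossDomain: false, // workaround only: lets jQuery use the XHR object that Mockjax mocked
    url: "http://myurl.com/myfile.js?_=1395314460347"
});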
Thanks @Nicholas Cloud for pinging me and bringing this issue to my attention.
Are you setting the dataType property in your mocked endpoints?
See: https://github.com/appendto/jquery-mockjax#data-types
If you are, have you tried setting the mock dataType to application/javascript?
$.mockjax({
    type: "GET",
    dataType: "application/javascript",
    url: "myfile.js?_=1395314460347",
    responseText: "(function () { alert('hello world!'); }());"
});
$.ajax({
    type: "GET",
    dataType: "script",
    url: "myfile.js?_=1395314460347"
});

Why does my page lock while doing an ajax request?

The idea is to generate a heap of thumbnails without getting a server time-out, so I do it one by one with ajax using jQuery.
I have a JavaScript loop running through a set of filenames, asking the server to generate an image for each file. In this case it's a matter of about 300+ files.
I run through my files and use a separate function to do the ajax request. The top loop is supposed to display the file currently being processed, but the page appears hung while the top loop is running. Why? (As you can see, I tried waiting a second before calling ajax, but that did not do the trick.)
function mkImgAll() {
    $.ajaxSetup({ async: true });
    var files = $('.files');
    debugClear();
    for(i = 0; i < files.length; i++) {
        var id = files[i].id;
        var file = files[i].value;
        debugSay(id + ' ' + file); // <- This does not display
        sleep(1000);               // until the
        mkImg(id, file);           // loop has finished.
    }
    $.ajaxSetup({ async: ajaxAsyncDefault });
}
function mkImg(id, file){
    $('#ajaxWaiting').show(1);
    $.ajax({
        type : 'POST',
        url : 'includes/ajax.php',
        dataType : 'json',
        async: false,
        data: {
            'proc' : 'makeOneThumb',
            'id' : id,
            'file' : file
        },
Btw, the debugSay function does this:
function debugSay(say) {
    if(debug) {
        $("#debugMessage").append("<xmp>" + say + "</xmp>");
    }
}
$.ajax({
    ....
    async: false,
Your request isn't asynchronous. You should set async to true; the use of async: false has been deprecated since jQuery 1.8!
I would try to call the debug with setTimeout...
Long shot, but you could try this:
if (debug) {
    var dummy = $("#debugMessage").append("<xmp>" + say + "</xmp>").get(0).offsetLeft;
}
That should force the page to refresh before the browser gets locked into the sleep(). By the way, it's better to use setTimeout() instead of sleep() for that, unless your loop really has to pause there.
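A non-blocking way to achieve what the question's loop attempts is to chain the requests, so each thumbnail is requested only after the previous one has finished and the debug message gets a chance to paint. This is a sketch under the assumption that mkImg() is changed to return its $.ajax() promise instead of using async: false; debugClear(), debugSay() and the .files elements come from the question:
function mkImgAll() {
    var files = $('.files');
    debugClear();
    var chain = $.Deferred().resolve(); // start with an already-resolved promise
    files.each(function(i, el) {
        chain = chain.then(function() {
            debugSay(el.id + ' ' + el.value); // paints, because the UI thread is never blocked
            return mkImg(el.id, el.value);    // mkImg() must return the jqXHR from $.ajax()
        });
    });
}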

Simple ajax call seems to be blocking

Really simple question. I am trying to test a RESTful web service that I am developing, and have this simple ajax call (using jQuery):
<script type="text/javascript">
$(document).ready(function() {
var url = '/index.php/gettest/reallyLongRequest';
$.ajax({
url: url,
dataType:'text',
success:function(data) { $('#result').html(data);},
error:function(xhr,err,e) { alert ("Error: " + err);}
});
});
</script>
This runs when the page loads. While it's running, the page blocks; i.e., I can see the hourglass next to the mouse pointer and no other user actions can be handled. (Btw, this particular GET request intentionally takes a very long time to return.)
Why is this? A(asynchronous)JAX, right? Obviously I am making a beginner's mistake. Any ideas, please?
When I attempt this using plain JavaScript (no library) it works as expected. Does this have something to do with jQuery's handling of the xhr onreadystatechange?
Thank you for looking.
EDIT: Multiple people have suggested setting async: true, which happens to be the default in jQuery, and as such has no effect.
EDIT: As previously mentioned, if I use plain JavaScript and start this with a timer, e.g. window.setInterval(function() { startLongPoll(); }, 5000),
it updates as expected, without appearing to block. Ideas, anyone?
Here is an example of what I did to solve the problem:
jQuery(document).ready(function() {
    setTimeout(function () {
        $.getJSON("veryLongRequest", function(json) {
            alert("JSON Result: " + json[0].id);
        });
    }, 500); // You may need to adjust this to a longer delay.
});
Note: I am using the shorthand jQuery method getJSON, which is a wrapper for the ajax call with dataType set to "json". However, this solution will work for all ajax requests.
Referenced:
Stop the browser "throbber of doom" while loading comet/server push iframe
I think that this should default to true, but try adding async: true to your ajax json parameter.
Does the code below work as expected?
<script type="text/javascript">
//$(document).ready(function() {
    var url = '/index.php/gettest/reallyLongRequest';
    $.ajax({
        url: url,
        dataType: 'text',
        success: function(data) { $('#result').html(data); },
        error: function(xhr, err, e) { alert("Error: " + err); }
    });
//});
</script>
You may want to try adding async: true:
<script type="text/javascript">
$(document).ready(function() {
var url = '/index.php/gettest/reallyLongRequest';
$.ajax({
url: url,
async:true,
dataType:'text',
success:function(data) { $('#result').html(data);},
error:function(xhr,err,e) { alert ("Error: " + err);}
});
});
</script>
