JavaScript: call a function or inline code?

I had a weird experience.
In the success callback of the Ajax call I did a lot of computation and DOM processing, and everything ran as smoothly as it could.
I then moved all of the code from the success callback into a separate JavaScript function, which was in turn invoked from the success callback of the Ajax call.
Now I see a lag of 1-2 seconds in the execution of the function. Is it possible that inline code is faster than a function call?
EDIT
The sample code:
$.ajax({
    url: '/apps/project/controller/load_data',
    method: 'get',
    dataType: "json",
    data: {},
    success: function(data) {
        // Parse JSON (huge data) and insert into the DOM
    }
});
The second approach I tried:
$.ajax({
    url: '/apps/project/controller/load_data',
    method: 'get',
    dataType: "json",
    data: {},
    success: function(data) {
        populate_timeline(data);
    }
});

function populate_timeline(json) {
    // Parse JSON (huge data) and insert into the DOM
}

One suggestion would be to not compound your problems with an anonymous pass-through. You should simply be able to write success: populate_timeline, since functions are first-class objects in JavaScript. You may have to ensure that populate_timeline is declared before it is referenced in the Ajax options; I don't know how all your code is laid out or called.
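As a sketch of that direct-reference style (loadData below is only a stand-in for the $.ajax call, since the point here is just how the callback is passed):

```javascript
// Stand-in for $.ajax: any function that accepts a success callback
// behaves the same way for this purpose (loadData is illustrative only).
function loadData(options) {
    // Simulate a parsed JSON response being handed to the callback.
    options.success({ items: [1, 2, 3] });
}

function populate_timeline(json) {
    // Parse JSON (huge data) and insert into the DOM; the return value
    // here just stands in for the real DOM work.
    return json.items.length;
}

// Instead of success: function(data) { populate_timeline(data); }
// the function reference can be passed directly:
loadData({ success: populate_timeline });
```

The wrapper function adds one extra call per response but, as noted below, that overhead alone is unlikely to explain a 1-2 second lag.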
I was optimizing a script recently and found that inlining a single function call had very little effect on performance. That was code performing canvas animations with a pretty short setInterval time, so the function call was being made many, many times a second.
Have you gone back and made sure that moving the previously inlined code into its own function is the only thing you've changed? It's easy to make other changes without thinking about it. Also, if you are running this code on your local machine for development purposes, make sure it isn't simply the Ajax call being slower rather than the function call. Maybe some other CPU-heavy process is running now that wasn't running earlier and is slowing the Ajax response?

Related

How JQuery Ajax works?

I am trying to send data from jQuery Ajax to a Generic Handler that calculates something and returns a result. The Ajax request is made inside a for loop on the jQuery end. The code looks something like this:
function send(Handler, ids) {
    var URL = "http://" + window.location.host + "/Handlers/" + Handler + ".ashx";
    var i;
    for (i = 0; i < 10; i++) {
        var cur = $('.' + ids[i]);
        $.ajax({
            url: URL,
            type: "POST",
            data: JSON.stringify({
                Data: cur
            }),
            dataType: "json",
            cache: false,
            beforeSend: //my before send code,
            success: //my success code,
            error: //my error code
        });
    }
    alert('Done!');
}
I placed 3 breakpoints in Visual Studio 2012: one at the $.ajax({ line, one at the alert('Done!'); line, and a third at the first line of the Generic Handler.
Now, when I execute this code, the Ajax works nicely in an asynchronous way. But when it stops at the first breakpoint and I resume, instead of reaching the Generic Handler's breakpoint it continues the loop and comes back to the first breakpoint. Only after it reaches the second breakpoint does it stop at the Generic Handler's breakpoint, again and again for each value.
So, does that mean Ajax first collects all the requests and then, after the for loop, executes them together?
JavaScript is single-threaded and non-blocking. This means the first iteration won't wait for its Ajax call to complete; the loop moves on and starts the second iteration, and so on.
So no, it doesn't execute them all together. It definitely starts the Ajax calls in the order of the loop, but there is no way to tell which will finish first. It might make all the Ajax calls and then receive an answer (not necessarily the answer to the first request), or answers might start arriving in the middle of the loop.
If I am understanding you correctly, you just answered your own question. Ajax is working asynchronously meaning the for loop starts and fires out ajax requests, and continues the loop (the ajax request DOES NOT block)
Therefore it is very likely that the js is performing a loop of code before the request reaches your url (as this has to create a network call)
That said, what are you doing in your beforeSend method? maybe this is making it take enough time that it can perform all iterations of the loop before sending the first request?
To answer your question, no it shouldn't be waiting for the for loop to finish in order to send off the requests, it should be initiating the process as soon as you have made the call
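The ordering can be seen with a small stand-in for the async call (fakeAjax below uses setTimeout purely to illustrate; the same ordering applies to $.ajax when async is true):

```javascript
// fakeAjax is a stand-in for an asynchronous request; setTimeout is
// used here only to simulate a response arriving later.
var order = [];

function fakeAjax(i) {
    setTimeout(function () {
        // This runs later, after the whole loop has finished.
        order.push("response " + i);
    }, 0);
}

for (var i = 0; i < 3; i++) {
    fakeAjax(i);
    order.push("request " + i); // runs immediately, inside the loop
}
order.push("loop done");
// Every "request" entry and "loop done" are recorded before any
// "response" entry: the loop never waits for a response.
```

This matches the breakpoint behavior described above: the loop runs to completion first, and the handler's work only happens once the current script (and debugger pause) yields back to the event loop.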

Would an AJAX function called at regular intervals slow down the application?

In a particular page of my application I call an AJAX function continuously, like below:
<script type="text/javascript">
$(document).ready(function() {
    setInterval(function() {
        $.ajax({
            url: 'clmcontrol_livematchupdate',
            type: 'post',
            dataType: 'json',
            success: function(data) {
                $('#lblbattingteam').html(data.battingnow);
                $('#lblscore').html(data.score);
                $('#lblwickets').html(data.wickets);
                $('#lblovers').html(data.overs);
                $('#lblballs').html(data.balls);
                $('#lblextras').html(data.extras);
                $('#lblrr').html(data.runrate);
                $('#lblbowlingteam').html(data.bowlingnow);
                $('#lblbowler').html(data.currentbowler);
                $('#lblbowlerovers').html(data.bowlerovers);
                $('#lblbowlerballs').html(data.bowlerballs);
                $('#lblrunsgiven').html(data.runsgiven);
                $('#lblextrasgiven').html(data.extrasgiven);
                $('#lblwicketstaken').html(data.wicketstaken);
                $('#lblecon').html(data.econ);
            }
        });
    }, 4000);
});
</script>
Anyhow, on the first attempts the application ran well and the values updated as I expected, but after a few more attempts the values struggled to update, and eventually updates stopped happening altogether. Is it because the continuous Ajax calls slow the system down?
It's better not to use setInterval() here, because if the first request hasn't completed when the next one starts, you can end up with multiple requests that consume shared resources and starve each other. You can avoid this problem by waiting to schedule the next request until the last one has completed.
Just try:
(function ajaxInterval() {
    $.ajax({
        url: 'clmcontrol_livematchupdate',
        type: 'post',
        dataType: 'json',
        success: function(data) {
            $('#lblbattingteam').html(data.battingnow);
            $('#lblscore').html(data.score);
            $('#lblwickets').html(data.wickets);
            $('#lblovers').html(data.overs);
            $('#lblballs').html(data.balls);
            $('#lblextras').html(data.extras);
            $('#lblrr').html(data.runrate);
            $('#lblbowlingteam').html(data.bowlingnow);
            $('#lblbowler').html(data.currentbowler);
            $('#lblbowlerovers').html(data.bowlerovers);
            $('#lblbowlerballs').html(data.bowlerballs);
            $('#lblrunsgiven').html(data.runsgiven);
            $('#lblextrasgiven').html(data.extrasgiven);
            $('#lblwicketstaken').html(data.wicketstaken);
            $('#lblecon').html(data.econ);
        },
        complete: function() {
            // Schedule the next request when the current one has completed
            setTimeout(ajaxInterval, 4000);
        }
    });
})();
There is a potential issue here that would be obvious if you checked your network calls in a debugger. Due to the non-blocking async behavior of the Ajax call, you can end up making simultaneous Ajax calls. Depending on your browser, you are only allowed to make so many calls at the same time, so they will queue up. In these circumstances there are also no guarantees of execution order.
In your situation I would set async: false in the Ajax options. You are already avoiding interface blocking by executing inside the setInterval callback. Since setInterval just applies a timer between method calls, you will never have more than one Ajax call in flight at a given time (which is a likely culprit of your issue).

How to handle delays caused by multiple timeouts (containing Ajax calls)

Suppose I have multiple (sometimes more than 12) Ajax calls, each firing every 2 seconds or more. The data gathered through the calls is set into UI elements (like progress bars). The result is that I get a delay on SCROLL while the timers are working. This delay is understandable, but how can I handle it?
NOTE: The call destinations are services that return data with minimal time spent. What makes the scroll suffer is the use of multiple setTimeout() and setInterval() calls. To get more familiar with my code, see below:
function FillData(accessUrl, name) {
    var add = accessUrl;
    $.support.cors = true;
    if (add) {
        $.ajax({
            type: 'GET',
            url: accessUrl,
            crossDomain: true,
            contentType: 'application/json; charset=utf-8',
            dataType: 'json',
            success: function (data) {
                Update(name, data);
            },
            error: function (xhr, status, error) {
                LogResponseErrors(status, error, name);
            }
        });
        setTimeout(function () { FillData(accessUrl, name); }, interval);
        // The method is called once with different parameters, and then
        // it reschedules itself automatically with setTimeout
    }
    else {
        freezeFrame(name);
    }
}
The tags explain what I used.
Any useful answer will be appreciated.
From what I understand of your question, you have a delay when handling your Ajax responses and you want to remove it.
JavaScript is single-threaded. Therefore, if a function takes a long time to complete, it can dominate the thread and make the UI unresponsive. To deal with this, you have two options:
Optimize your code so that the function does not take long.
Use setTimeout to break your function into smaller pieces. For example: if your function executes a loop over 100 items, you could break it into 10 runs of 10 items each.
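A minimal sketch of that slicing idea (processInChunks and its parameter names are illustrative, not from the question):

```javascript
// Run handleItem over items in slices of chunkSize, yielding back to
// the event loop between slices so the UI can respond in the meantime.
function processInChunks(items, chunkSize, handleItem, onDone) {
    var index = 0;
    function runChunk() {
        // Process at most chunkSize items in this tick.
        var end = Math.min(index + chunkSize, items.length);
        for (; index < end; index++) {
            handleItem(items[index]);
        }
        if (index < items.length) {
            setTimeout(runChunk, 0); // let the browser handle events first
        } else if (onDone) {
            onDone();
        }
    }
    runChunk();
}
```

With 100 items and a chunk size of 10, the loop body runs in ten slices, so scroll and click events can be processed between them.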
Update (based on the updated question):
It seems that the loop never stops when you use setTimeout like this. You should have something like:
counter++;
if (counter <= 12)
    setTimeout(function () { FillData(accessUrl, name); }, interval);
Due to the timing mismatch between the Ajax calls and your setTimeout, at some points there are a lot of events piled up in the queue waiting to execute, which causes the performance problem. Try putting your setTimeout inside your success or complete callback.
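As a sketch of that reordering (pollAfterComplete, sendRequest, and shouldContinue are illustrative names; sendRequest stands in for the $.ajax call inside FillData):

```javascript
// Reschedule from the completion callback rather than unconditionally:
// the next call is queued only after the current request has finished,
// so requests can never pile up in the event queue.
function pollAfterComplete(sendRequest, interval, shouldContinue) {
    sendRequest(function onComplete() {
        if (shouldContinue()) {
            setTimeout(function () {
                pollAfterComplete(sendRequest, interval, shouldContinue);
            }, interval);
        }
    });
}
```

In the question's code, this corresponds to moving the setTimeout(function () { FillData(accessUrl, name); }, interval); line into the success or complete option of the $.ajax call.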

jQuery Ajax async: false causes click event issue

I have a script that runs through a multi-level array and each time calls a new ajax GET command to a php file with part of that array as the data.
Pretty basic...
for (var x = 0; x < cities.length; x++) {
    for (var u = 0; u < links.length; u++) {
        $.ajax({
            url: "dontneedtoknow.php?city=" + cities[x] + "&link=" + links[u],
            type: 'GET',
            async: false,
            cache: false,
            timeout: 30000,
            error: function() {
                return true;
            },
            success: function(data) {
                // just appending data to the page
            }
        });
    }
}
I'd like to be able to have click() events and the ability to STOP this for loop, but while this loop is running I can't do ANYTHING because of the async: false.
I need async: false because, for a reason of my own, I want the data appended as each request completes.
I have tried .live() but that doesn't seem to work...
Ideas?
When async is false, the entire browser* will be hung. You cannot do anything during a synchronous Ajax call other than waiting for the call to finish.
If you want to be able to stop the loop, you must use asynchronous calls.
See also:
What does "async: false" do in jQuery.ajax()?
IE7 hangs when using (to much) ajax calls with async: false
How to make all AJAX calls sequential?
That last link especially might be useful (if I understand what you're trying to accomplish here).
*unless you're in Chrome (then it's just the current page)
Why make that many calls to the server? That seems very inefficient to me.
Is there a reason you can not change the service you are calling to receive a list of items and return it? It would involve one Ajax call and the server side code can make sure the data is processed in order.
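If the calls do need to stay separate, an alternative to async: false is to keep the requests asynchronous but chain them, so each one starts only when the previous one completes: the results still append in order, the page stays responsive, and a click handler can stop the chain. runSequentially and stopped are illustrative names; each task would wrap one of the $.ajax calls from the loop:

```javascript
// Illustrative sequential queue: tasks is an array of functions, each
// taking a "done" callback to invoke when its (async) request finishes.
var stopped = false; // a click handler can set this to true to abort

function runSequentially(tasks, onAllDone) {
    function next(i) {
        if (stopped || i >= tasks.length) {
            if (onAllDone) onAllDone();
            return;
        }
        // Start task i; only when it reports completion is i + 1 started,
        // so results are appended in order without blocking the browser.
        tasks[i](function () {
            next(i + 1);
        });
    }
    next(0);
}
```

Each task would call its done callback from the Ajax complete handler, giving the in-order appending the asker wants without hanging the UI.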

Does this JS Code work as expected?

Every 3 seconds I make an AJAX POST request to get the status of a process. This works great.
When the process reaches 100%, a callback function executes (indicated below) to add new elements to the page, and it then cancels the setTimeout that was used to continuously fetch the progress every 3 seconds. However, my users tell me it sometimes fails to cancel: the new elements are not added to the page and it gets stuck showing "100%".
I have tested this again and again and it never gets stuck for me. The code also looks OK, but my JavaScript skills are not great, so I was hoping someone could point out whether there is potential for this problem to occur.
I have commented the code; apologies, it's very long. I have tried to reduce it.
function convertNow(validURL) {
    startTime = setTimeout('getStatus();', 6000);
    // AJAX CALL TO RUN PROCESS
    $.ajax({
        type: "GET",
        url: "main.php",
        data: 'url=' + validURL + '&filename=' + fileNameTxt,
        success: function(msg) {
            // ON SUCCESS CLEAR SETTIMEOUT AND SHOW ELEMENTS (text)
            clearTimeout(continueTime);
            clearTimeout(startTime);
            $("#loading").hide("slow");
            $("#done").html("Done");
        } // function
    }); // ajax
} // function convertNow

function getStatus() {
    // AJAX CALL TO GET STATUS OF PROCESS
    $.ajax({
        type: "POST",
        url: "fileReader.php",
        data: 'textFile=' + fileNameTxt,
        success: function(response) {
            textFileResponse = response.split(" ");
            $("#done").html("Processing...");
        }
    }); // ajax
    clearTimeout(continueTime);
    if (textFileResponse[0] == '100.0%') {
        clearTimeout(continueTime);
    }
    else {
        clearTimeout(startTime);
        continueTime = setTimeout('getStatus();', 3000);
    }
}
There's probably a parsing error in the textFileResponse[0] == '100.0%' comparison in some edge cases, with the value in the response not exactly equaling 100.0% (maybe there's extra whitespace, or there are minor differences between platforms, etc.). That would cause the code to fall through to the else {} block, and your getStatus function would be queued up again.
EDIT: Given the thread in the comments, it's equally likely that there's a race condition between the two blocks of Ajax code (just putting this here for the benefit of readers). END EDIT
What you probably want, in addition to resolving the parsing, is to use setInterval() with a single timer instead of both a startTime and a continueTime timer. setTimeout executes only once, whereas setInterval repeats every x milliseconds, so you'd only need one. To cancel a setInterval, use clearInterval.
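A sketch of that single-timer approach (startPolling, checkStatus, and onFinished are illustrative names; checkStatus stands in for the POST to fileReader.php and passes the progress string to a callback):

```javascript
// One setInterval polls for progress; clearInterval stops it as soon
// as the process reports completion, so there is only one timer to
// cancel and no startTime/continueTime bookkeeping.
function startPolling(checkStatus, onFinished, everyMs) {
    var timer = setInterval(function () {
        checkStatus(function (progress) {
            // Trim before comparing, guarding against stray whitespace.
            if (progress.trim() === "100.0%") {
                clearInterval(timer); // one timer, one cancel
                onFinished();
            }
        });
    }, everyMs);
    return timer;
}
```

Doing the comparison inside the success callback (as here) also removes the race in the original code, where textFileResponse was checked before the POST had returned.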
