Timing a set of functions with asynchronous subroutines - javascript

I have two functions periodically called via setInterval. The goal is to defer Function B until Function A is done (and vice versa). Currently, Function A starts and completes some of its subroutines, but does not reach the end before Function B begins.
I've tried passing Function B as an argument to Function A. I am not sure if that was sufficient to create a callback. I also tried jQuery's $.when(setInterval(functionA, 10000)).then(setInterval(functionB, 5000)).
How do I ask JavaScript to wait for functions/blocks of code to finish? Thank you in advance.
Edit: Below is code very similar to my original. Sorry for not being concise.
Function A, getFruits(): There is a remote JSON that changes on its own (fruits.json). getFruits() does two things: 1) it empties an array, [allFruits] (just in case); 2) it adds the names of all fruits currently in the remote JSON to [allFruits]. [allFruits] is then an in-memory copy of the remote JSON. Before this question, I only called getFruits() once, at startup; in other words, I did not use setInterval for getFruits().
Function B, checkFruits(): checkFruits() periodically (setInterval(checkFruits, 5000)) compares [allFruits] to the remote version. If any fruit was added to the remote version, checkFruits() appends those fruits' names to [allFruits]; it also runs useful code (i.e. it pushes the new names to an array, [queue]).
For this implementation, it is important to create an initial list so only new (post-startup) fruit trigger the useful code of checkFruits(). Moreover, it is important only to add (never subtract) names from [allFruits] within a session. This is to prevent a new fruit from triggering the useful code more than once per session.
Problem: Now I want to make getFruits() (Function A) periodic. Because getFruits() empties [allFruits], the names that built up can again trigger the useful code (but only once between invocations of getFruits()). However, when I use setInterval(getFruits, 10000), there are times (in this example, always) when getFruits() overlaps with checkFruits(). When that happens, I notice only part of getFruits() finishes before checkFruits() starts. The console.log() messages appear in this order: 'getFruits() start:', 'checkFruits():', 'getFruits() end:'. Furthermore, my useful code is run before getFruits() finishes (this is what is really undesired), and [allFruits] gets duplicates. This would not occur if getFruits() completely finished before checkFruits() jumped in.
debugging = true;
var debug = function() {
    if (debugging) {
        console.log.apply(console, arguments);
    }
};
var allFruits = [];
var queue = [];
var getFruits = function() {
    allFruits = []; // Empty the list
    debug('getFruits() start:', 'allFruits =', allFruits, 'queue =', queue);
    $.ajax({
        url: 'fruits.json',
        dataType: 'json',
        success: function(data) {
            data.fruits.forEach(function(element) {
                allFruits.push(element.name);
            });
            debug('getFruits() end:', 'data =', data, 'allFruits =', allFruits, 'queue =', queue);
        },
    });
};
var checkFruits = function() {
    $.ajax({
        url: 'fruits.json',
        dataType: 'json',
        success: function(data) {
            data.fruits.forEach(function(element) {
                if (allFruits.indexOf(element.name) === -1) {
                    queue.push(['fruit', element.name]);
                    allFruits.push(element.name);
                }
            });
            debug('checkFruits():', 'data =', data, 'allFruits =', allFruits, 'queue =', queue);
        }
    });
};
getFruits();
setInterval(checkFruits, 5000);
// setInterval(getFruits, 10000); // When I try this, checkFruits() does not wait for getFruits() to finish.
In this example, fruits.json stands in for my actual remote resource. fruits.json can simply be the following:
{"fruits":[{"name":"apple","color":"red"},{"name":"banana","color":"yellow"},{"name":"tangerine","color":"orange"}]}
Again, the actual remote JSON changes independently.

What you have here are two methods that each do asynchronous work. Here are some good Stack Overflow posts on what that means:
Easy to understand definition of "asynchronous event"?
Does async programming mean multi-threading?
Are JavaScript functions asynchronous?
We have no idea how long it will take for an asynchronous call to finish. In your case, each AJAX request could take up to a few seconds depending on network speed, so regardless of when these two methods are executed you CANNOT know which one will finish first. So what to do? Well, generally when you write/use an asynchronous method (like $.ajax) you give it a callback that will be executed when the asynchronous work is finished. And you have done this in the form of the success callback.
And here is the good news: JavaScript runs each callback to completion before starting the next one. This means the "useful code" in the success callback that needs to run when a request finishes will complete (so long as none of it is itself async) before the "useful code" in the other success callback is executed at all. And this works no matter which request finishes first; each success callback will always wait for the other.
So I think what was confusing you was your debug statements. If you add the following statements to your code, the execution flow may make more sense:
debugging = true;
var debug = function() {
    if (debugging) {
        console.log.apply(console, arguments);
    }
};
var allFruits = [];
var queue = [];
var getFruits = function() {
    debug("getFruits: make request");
    $.ajax({
        url: 'fruits.json',
        dataType: 'json',
        success: function(data) {
            debug("getFruits: start processing");
            allFruits = []; // Empty the list
            data.fruits.forEach(function(element) {
                allFruits.push(element.name);
            });
            debug('getFruits: finished processing');
        },
    });
    debug("getFruits: request sent, now we wait for a response.");
};
var checkFruits = function() {
    debug("checkFruits: make request");
    $.ajax({
        url: 'fruits.json',
        dataType: 'json',
        success: function(data) {
            debug("checkFruits: start processing");
            data.fruits.forEach(function(element) {
                if (allFruits.indexOf(element.name) === -1) {
                    queue.push(['fruit', element.name]);
                    allFruits.push(element.name);
                }
            });
            debug("checkFruits: finished processing");
        }
    });
    debug("checkFruits: request sent, now we wait for a response.");
};
getFruits();
setInterval(checkFruits, 5000);
// setInterval(getFruits, 10000); // When I try this, checkFruits() does not wait for getFruits() to finish.
After thinking about it, I believe the only reason things were not behaving as expected is that you were emptying the allFruits array outside of the success callback. If you move it inside, as I have done above, everything should work fine.
Now, I don't know why you need to re-initialize the data, since each time you make the request you're getting the latest information, but let's roll with it. Since both methods make the same request, let's consolidate that into a single method. No need to duplicate code ;). And since all of your examples have getFruits running half as often as checkFruits, we can easily add a counter to accomplish the same sequence of events, like so:
debugging = true;
var debug = function() {
    if (debugging) {
        console.log.apply(console, arguments);
    }
};
var allFruits = [];
var queue = [];
var count = 0;
var doOneThing = function(data) {
    //do stuff
};
var doAnotherThing = function(data) {
    //do other stuff
};
var requestFruits = function() {
    $.ajax({
        url: 'fruits.json',
        dataType: 'json',
        success: function(data) {
            // if count is even... i.e., every other time.
            if (count % 2 === 0) {
                doOneThing(data);
            }
            count++; // increment every time so the even/odd check alternates
            // do this every time
            doAnotherThing(data);
        },
    });
};
setInterval(requestFruits, 5000);
Hope this helps. Cheers.

Your last code example first executes setInterval(functionA, 10000), and as soon as the deferred execution of functionA is set up, executes setInterval(functionB, 5000). That means functionB will be called roughly every 5 seconds and functionA roughly every 10 seconds; the two schedules are independent, so neither waits for the other.
edit to reflect your additional information:
setInterval(function(){
    functionA();
    functionB();
}, 10000);
setTimeout(function(){
    setInterval(functionB, 10000);
}, 5000);

This is a crude answer. I sense that callbacks can achieve this, but I am not sure how to code them, especially involving setInterval.
I create two global variables, getFruitsIsBusy = false and checkFruitsIsBusy = false. I create an IF for both getFruits() and checkFruits(). Here is getFruits():
var getFruits = function() {
    if (checkFruitsIsBusy) { // New
        setTimeout(getFruits, 100); // New
        return; // New
    } else { // New
        getFruitsIsBusy = true; // New
        allFruits = []; // Empty the list
        debug('getFruits() start:', 'allFruits =', allFruits, 'queue =', queue);
        $.ajax({
            url: 'fruits.json',
            dataType: 'json',
            success: function(data) {
                data.fruits.forEach(function(element) {
                    allFruits.push(element.name);
                });
                getFruitsIsBusy = false; // New; in the success function
                debug('getFruits() end:', 'data =', data, 'allFruits =', allFruits, 'queue =', queue);
            },
        });
    }
};
If also using this paradigm for checkFruits(), it seems both functions will wait for each other to finish.
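Alternatively, here is a minimal sketch of the callback idea I mentioned: give getFruits() an optional completion callback that fires inside its success handler, and only invoke checkFruits() from there. The getFruitsThen name is illustrative, and note this collapses the two timers into a single 10-second cycle:
var getFruitsThen = function(done) {
    $.ajax({
        url: 'fruits.json',
        dataType: 'json',
        success: function(data) {
            allFruits = data.fruits.map(function(element) {
                return element.name; // rebuild the list in one step, inside the callback
            });
            if (done) done(); // signal completion only once the list is fully rebuilt
        }
    });
};
// checkFruits() can never see a half-rebuilt allFruits, because it only
// runs from inside getFruits' success callback:
setInterval(function() {
    getFruitsThen(checkFruits);
}, 10000);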

Based on an analysis of the timing of two functions (A and B), consider the following solution (Chionglo, 2016):
Keep state information for each of function A and function B. The state of each function should be set within each of the respective functions.
Create a wrapper function for each of function A and function B. The wrapper function calls on the respective function, and then checks for the state of the respective function.
a. The check in wrapper function A: if function A has reached its final state, clear the interval associated with wrapper function A and schedule an interval for wrapper function B.
b. The check in wrapper function B: if function B has reached its final state, clear the interval associated with wrapper function B.
To begin the process, schedule an interval for wrapper function A.
Sample code:
var ac = Math.round(4*Math.random())+4;
var bc = Math.round(6*Math.random())+6;
var ai;
var Astate = false;
var Bstate = false;
function A() {
    // Do your thing for A here.
    // The following changes the "state of A" and then determines if the final state has been reached.
    ac -= 1;
    if (ac<1) Astate = true;
    else Astate = false;
}
function B() {
    // Do your thing for B here.
    // The following changes the "state of B" and then determines if the final state has been reached.
    bc -= 1;
    if (bc<1) Bstate = true;
    else Bstate = false;
}
ai = setInterval(processA, 1000); // pass the function itself rather than a string to evaluate
function processA() {
    A();
    if (Astate) {
        clearInterval(ai);
        ai = setInterval(processB, 500);
    }
}
function processB() {
    B();
    if (Bstate) {
        clearInterval(ai);
        ai = undefined;
    }
}
Reference
Chionglo, J. F. (2016). An analysis for timing a set of functions. Available at http://www.aespen.ca/AEnswers/1458200332.pdf.

Related

ajax function in for loop not updating DOM even with setTimeout

I am writing a function that uses ajax to get instructions from a back end server while the page is loading. My ajax code fetches each instruction by number and prints it, via the variable response.stepText, into this <p id="loadText"></p> element using jQuery, and also to the console. Here is my ajax function:
function ajaxFetch(s) {
    var success = false;
    $.ajax({
        type: "POST",
        url: "post.php",
        data: {
            step: s
        },
        async: false,
        dataType: 'JSON',
        success: function (response) {
            $("#loadText").text(response.stepText);
            console.log(response.stepText);
            success = true;
        }
    });
    return success;
}
I am trying to use another function to loop through the steps no matter how many there are, but here are the problems I keep running into:
ajaxFetch() is not updating the DOM until last execution
tried setTimeout() and not updating DOM
for loop looping through ajaxFetch() too quickly
response.stepText prints in the console on time, but does not update DOM on time
Here is a sample loop I have tried:
function uploadSteps(maxStep) {
    for (var x = 1; x <= maxStep; x++){
        setTimeout(ajaxFetch(x), 20);
    }
}
Sorry this is so long and thanks in advance.
By the time your for-loop completes (say, 20 iterations), the ajax calls in ajaxFetch would only have received responses for the first few, and what you see in the end is the response to the last ajax call. You can use this link to understand how async calls work in JavaScript:
https://rowanmanning.com/posts/javascript-for-beginners-async/
So the answer is: you need to wait until the first ajax call completes and then call the method again with a timeout of 20ms, like this:
var globalMaxSteps = 1;
var startIndex = 1;
function ajaxFetch(s) {
    $.ajax({
        type: "POST",
        url: "post.php",
        data: {
            step: s
        },
        async: false,
        dataType: 'JSON',
        success: function (response) {
            $("#loadText").text(response.stepText);
            console.log(response.stepText);
            startIndex++;
            if (startIndex <= globalMaxSteps) {
                setTimeout(function(){
                    ajaxFetch(startIndex);
                }, 20);
            } else {
                console.log("All Iterations complete");
            }
        }
    });
}
function uploadSteps(maxStep) {
    startIndex = 1;
    globalMaxSteps = maxStep;
    setTimeout(function(){
        ajaxFetch(startIndex);
    }, 20);
}
First, we need to fix mistakes in the uploadSteps function:
function uploadSteps(maxStep) {
    // here change `var x` to `let x` to avoid problems
    // like here - https://stackoverflow.com/q/750486/5811984
    for (let x = 1; x <= maxStep; x++){
        setTimeout(function() {
            // notice how here ajaxFetch(x) is wrapped into a function,
            // otherwise it gets called right away
            ajaxFetch(x);
        }, 20);
    }
}
Now here's another problem - all the setTimeouts are scheduled with the same 20ms delay, which means all of them will fire at roughly the same time, about 20ms after uploadSteps() was called.
Let's see what happens when maxStep=3 (assuming your CPU is very fast because that is irrelevant for understanding the problem):
Time passed | what happens
--------------------------
0ms | setTimeout(ajaxFetch(1), 20) is called
0ms | setTimeout(ajaxFetch(2), 20) is called
0ms | setTimeout(ajaxFetch(3), 20) is called
20ms | ajaxFetch(1) is called
20ms | ajaxFetch(2) is called
20ms | ajaxFetch(3) is called
So as you see, all the ajaxFetch calls happen at the same time, and I am assuming that's not exactly what you need. What you might be looking for is this:
Time passed | what happens
--------------------------
0ms | setTimeout(ajaxFetch(1), 20) is called
0ms | setTimeout(ajaxFetch(2), 40) is called
0ms | setTimeout(ajaxFetch(3), 60) is called
20ms | ajaxFetch(1) is called
40ms | ajaxFetch(2) is called
60ms | ajaxFetch(3) is called
Which can be implemented with a slight change to your code
function uploadSteps(maxStep) {
    for (let x = 1; x <= maxStep; x++){
        setTimeout(function() {
            ajaxFetch(x);
        }, 20 * x); // change delay from 20 -> 20 * x
    }
}
Also it looks like you don't need to return anything from ajaxFetch(), so it's better to make it async so it does not block the code execution:
function ajaxFetch(s) {
    $.ajax({
        type: "POST",
        url: "post.php",
        data: {
            step: s
        },
        // async: false, -- remove this, it's true by default
        dataType: 'JSON',
        success: function (response) {
            $("#loadText").text(response.stepText);
            console.log(response.stepText);
        }
    });
}
Even if you actually do need to return something from ajaxFetch(), it's better to keep it async and use callbacks/promises. jQuery actually strongly discourages using async: false in any case.
If the reason you added setTimeouts is to make sure all the steps are executed in order, then this is not the right way to do it. The problems are:
Let's say it took 100ms for the server to respond to the first request and 10ms for the second one. Even with the 20ms delay, the second request would finish first. And just increasing the delay is not the solution either, because if your server responds much faster than the delay, you are introducing an unnecessary wait for the user.
It's better to add a callback to ajaxFetch() that is called when the ajax fetching is done, and then call the next ajaxFetch() only after you receive that callback, as sketched below.
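Here is a minimal sketch of that callback chain (the onDone parameter and the recursive structure are illustrative, not from the original question):
function ajaxFetch(s, onDone) {
    $.ajax({
        type: "POST",
        url: "post.php",
        data: { step: s },
        dataType: 'JSON', // async by default
        success: function (response) {
            $("#loadText").text(response.stepText);
            if (onDone) onDone(); // only fires after this step's DOM update
        }
    });
}
function uploadSteps(maxStep, x) {
    x = x || 1;
    if (x > maxStep) return; // all steps done
    ajaxFetch(x, function() {
        uploadSteps(maxStep, x + 1); // request the next step only after this one completed
    });
}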

Prevent multiple AJAX calls to the same url within timeframe

I have an app which wants to get info about every marker on a map.
Each marker has a class, such as "car" or "pedestrian".
The app makes (via jQuery) a getJSON call to "http://myserver/info/".
However, since multiple markers may have the same class, the server could end up getting hit with many requests.
Accordingly, I'd like to pool requests which occur within a specified time frame (maybe 5 seconds or so) so that only one request is made, but each calling instance of getJSON is unaware of it.
My thought is to wrap getJSON in another function which stores the URLS in a hashmap/dictionary and stores up promises for each requester. When data is returned, the promises are fulfilled.
I ask, is there a standard way of doing this (debouncing an AJAX request, as it were)?
I created something (in 25 minutes ^^) that might help you; it's a Timeout manager:
var requestsPool = {
    requests: {}, //list of urls
    timeout: 5000, //In milliseconds
    add: function(url) {
        if (requestsPool.exists(url)) return false; //check if url is already present in the pool
        requestsPool.requests[url] = setTimeout(function(u) {
            requestsPool.remove(u);
        }.bind(this, url), requestsPool.timeout); //Defining the timeout
        return true;
    },
    exists: function(url) {
        return requestsPool.requests[url]; //Return the Timeout ID if present or undefined
    },
    remove: function(url) {
        return delete requestsPool.requests[url]; //return true almost always #link https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/delete
    },
    cancel: function(url) {
        clearTimeout(requestsPool.requests[url]); //cancel the timeout
        return requestsPool.remove(url); //remove the url from the pool
    }
};
$(anchor).click(function() {
    if (requestsPool.exists(anchor.href)) {
        // If cooldown is present
    } else {
        $.getJSON(anchor.href, function(data) {
            requestsPool.add(anchor.href);
        });
    }
});
My thought is to wrap getJSON in another function which stores the URLS in a hashmap/dictionary and stores up promises for each requester
Yes, that's a good idea. It might look like this:
var debouncedGet = (function() {
    var pool = {};
    return function get(url) {
        if (!pool[url]) {
            pool[url] = $.getJSON(url);
            setTimeout(function() {
                pool[url] = null;
            }, 5000); // you might want to move this into a `pool[url].always(…)` callback
                      // so the timer starts when the request returned
        }
        return pool[url];
    };
}());
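For reference, here is a minimal sketch of the variant suggested in that comment, where the 5-second window only starts once the request has settled (success or failure):
var debouncedGet = (function() {
    var pool = {};
    return function get(url) {
        if (!pool[url]) {
            pool[url] = $.getJSON(url);
            pool[url].always(function() {
                // start the cooldown only when the request has returned
                setTimeout(function() {
                    pool[url] = null;
                }, 5000);
            });
        }
        return pool[url];
    };
}());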
Here's my bid:
(function(window,$,undefined){
    'use strict';
    var cache = {},
        timeout = 5e3;
    // Use like traditional $.getJSON
    $.getJSON = function(url,data,callback){
        if ($.isFunction(data)){
            callback = data;
            data = undefined;
        }
        // Establish a cache key so we can re-reference existing
        // requests to subsequent ones (within the timeout window).
        var cacheKey = url;
        if (cache[cacheKey]){
            // This is an existing request; simply add the callback
            // onto the promise and return it.
            return cache[cacheKey].done(callback);
        } else {
            // This is a new request. Build up a new request,
            // attach the callback to the promise, and also add
            // a couple cleanup methods for disposing the cache
            // when appropriate.
            cache[cacheKey] = $.ajax($.extend({
                url: url,
                type: 'get',
                dataType: 'json',
                data: data,
            }, $.isPlainObject(url) && url))
            .done(callback)
            .always(function(){
                delete cache[cacheKey];
            });
            setTimeout(function(){
                // TODO: Probably want to store a reference to
                // this timeout and clear it in the .always (to
                // avoid race condition between .always firing
                // and new request coming in but not returning yet)
                cache[cacheKey] && delete cache[cacheKey];
            }, timeout);
            return cache[cacheKey];
        }
    };
})(window,jQuery);
And, FWIW, a jsFiddle: http://jsfiddle.net/ajtbdxt7/

Execute a forEach like a waterfall in async

I'm trying to retrieve longitude and latitude from a list of addresses with the Google API via a Node.js script. The call itself works fine, but since I have around 100 addresses to submit and I use async.forEach on the array, the calls are made too fast and I get the error "You have exceeded your rate-limit for this API."
I found that the number of calls is limited to 2500 every 24h and a maximum of 10 per second. While I'm OK with the 2500 a day, I make my calls way too fast for the per-second rate limit.
I now have to write a function that will delay the calls enough not to reach the limit. Here is a sample of my code:
async.forEach(final_json, function(item, callback) {
    var path = '/maps/api/geocode/json?address='+encodeURIComponent(item.main_address)+'&sensor=false';
    console.log(path);
    var options = {
        host: 'maps.googleapis.com',
        port: 80,
        path: path,
        method: 'GET',
        headers: {
            'Content-Type': 'application/json'
        }
    };
    // a function I have that makes the http GET
    rest.getJSON(options, function(statusCode, res) {
        console.log(res);
        callback();
    });
}, function() {
    // do something once all the calls have been made
});
How would you proceed to achieve this? I tried putting my rest.getJSON inside a 100ms setTimeout, but the forEach iterates through all the rows so fast that it starts all the setTimeouts almost at the same time, so it doesn't change anything...
async.waterfall looks like it would do the trick, but the thing is I don't know exactly how many rows I will have, so I can't hardcode all the function calls. And to be honest, it would make my code really ugly.
The idea is that you can create a rateLimited function that acts much like a throttled or debounced function, except any calls that don't execute immediately get queued and run in order as the rate limit time period expires.
Basically, it creates parallel 1 second intervals that self-manage via timer rescheduling, but only up to perSecondLimit intervals are allowed.
function rateLimit(perSecondLimit, fn) {
    var callsInLastSecond = 0;
    var queue = [];
    return function limited() {
        if (callsInLastSecond >= perSecondLimit) {
            queue.push([this, arguments]);
            return;
        }
        callsInLastSecond++;
        setTimeout(function() {
            callsInLastSecond--;
            var parms;
            if (parms = queue.shift()) {
                limited.apply(parms[0], parms[1]);
            }
        }, 1010);
        fn.apply(this, arguments);
    };
}
Usage:
function thisFunctionWillBeCalledTooFast() {}
var limitedVersion = rateLimit(10, thisFunctionWillBeCalledTooFast);
// 10 calls will be launched immediately, then as the timer expires
// for each of those calls a new call will be launched in its place.
for (var i = 0; i < 100; i++) {
    limitedVersion();
}
Here's how I would hack it (Note: arr is your array of locations):
function populate(arr, callback, pos) {
    if (typeof pos == "undefined")
        pos = 0;
    var path = '/maps/api/geocode/json?address='+encodeURIComponent(arr[pos].main_address)+'&sensor=false';
    console.log(path);
    var options = {
        host: 'maps.googleapis.com',
        port: 80,
        path: path,
        method: 'GET',
        headers: {
            'Content-Type': 'application/json'
        }
    };
    // a function I have that makes the http GET
    rest.getJSON(options, function(statusCode, res) {
        console.log(res);
    });
    pos++;
    if (pos < arr.length)
        setTimeout(function(){
            populate(arr, callback, pos);
        }, 110); //a little wiggle room since setTimeout isn't exact
    else
        callback();
}
You could add a rate limiting function, but, IMHO, it introduces unnecessary complexity. All you really want to do is call the function every tenth of a second or so until you're done with your list, so do that.
It's certainly not as extensible as the alternative, but I'm a fan of simplicity.

jQuery Deferred/Promises dynamic array not executing callbacks in correct order

Grateful for any insight into what I'm misunderstanding here. My requirement is as follows:
I have an array of URLs. I want to fire off an AJAX request for each URL simultaneously, and as soon as the first request completes, call the first callback. Then, if and when the second request completes, call that callback, and so on.
Option 1:
for (var i = 0; i < myUrlArray.length; i++) {
    $.ajax({
        url: myUrlArray[i]
    }).done(function(response) {
        // Do something with response
    });
}
Obviously this doesn't work, as there is no guarantee the responses will complete in the correct order.
Option 2:
var promises = [];
for (var i = 0; i < myUrlArray.length; i++) {
    promises.push($.ajax({
        url: myUrlArray[i]
    }));
}
$.when.apply($, promises).then(function() {
    // Do something with each response
});
This should work, but the downside is that it waits until all AJAX requests have completed, before firing any of the callbacks.
Ideally, I should be able to call the first callback as soon as it's complete, then chain the second callback to execute whenever that response is received (or immediately if it's already resolved), then the third, and so on.
The array length is completely variable and could contain any number of requests at any given time, so just hard coding the callback chain isn't an option.
My attempt:
var promises = [];
for (var i = 0; i < myUrlArray.length; i++) {
    promises.push($.ajax({
        url: myUrlArray[i] // Add each AJAX Deferred to the promises array
    }));
}
(function handleAJAX() {
    var promise;
    if (promises.length) {
        promise = promises.shift(); // Grab the first one in the stack
        promise.then(function(response) { // Set up 'done' callback
            // Do something with response
            if (promises.length) {
                handleAJAX(); // Move onto the next one
            }
        });
    }
}());
The problem is that the callbacks execute in a completely random order! For example, if I add 'home.html', 'page2.html', 'page3.html' to the array, the order of responses won't necessarily be 'home.html', 'page2.html', 'page3.html'.
I'm obviously fundamentally misunderstanding something about the way promises work. Any help gratefully appreciated!
Cheers
EDIT
OK, now I'm even more confused. I made this JSFiddle with one array using Alnitak's answer and another using JoeFletch's answer and neither of them work as I would expect! Can anyone see what is going on here?
EDIT 2
Got it working! Based on JoeFletch's answer below, I adapted the solution as follows:
var i, responseArr = [];
for (i = 0; i < myUrlArray.length; i++) {
    responseArr.push('0'); // <-- Add 'unprocessed' flag for each pending request
    (function(ii) {
        $.ajax({
            url: myUrlArray[ii]
        }).done(function(response) {
            responseArr[ii] = response; // <-- Store response in array
        }).fail(function(xhr, status, error) {
            responseArr[ii] = 'ERROR';
        }).always(function(response) {
            for (var iii = 0; iii < responseArr.length; iii++) { // <-- Loop through entire response array from the beginning
                if (responseArr[iii] === '0') {
                    return; // As soon as we hit an 'unprocessed' request, exit loop
                }
                else if (responseArr[iii] !== 'done') {
                    $('#target').append(responseArr[iii]); // <-- Do actual callback DOM append stuff
                    responseArr[iii] = 'done'; // <-- Set 'complete' flag for this request
                }
            }
        });
    }(i)); // <-- pass current value of i into closure to encapsulate
}
TL;DR: I don't understand jQuery promises, got it working without them. :)
Don't forget that you don't need to register the callbacks straight away.
I think this would work, the main difference with your code being that I've used .done rather than .then and refactored a few lines.
var promises = myUrlArray.map(function(url) {
    return $.ajax({url: url});
});
(function serialize() {
    var def = promises.shift();
    if (def) {
        def.done(function() {
            callback.apply(null, arguments); // `callback` here is your per-response handler
            serialize();
        });
    }
})();
Here's my attempt at solving this. I updated my answer to include error handling for a failed .ajax call. I also moved some code to the complete method of the .ajax call.
var urlArr = ["url1", "url2"];
var responseArr = [];
for(var i = 0; i < length; i++) {
responseArr.push("0");//0 meaning unprocessed to the DOM
}
$.each(urlArr, function(i, url){
$.ajax({
url: url,
success: function(data){
responseArr[i] = data;
},
error: function (xhr, status, error) {
responseArr[i] = "Failed Response";//enter whatever you want to place here to notify the end user
},
complete: function() {
$.each(responseArr, function(i, element){
if (responseArr[i] == "0") {
return;
}
else if (responseArr[i] != "done")
{
//do something with the response
responseArr[i] = "done";
}
});
}
});
})
Asynchronous requests aren't guaranteed to finish in the same order that they are sent; some may take longer than others depending on server load and the amount of data being transferred.
The only options are either to wait until they are all done, to send only one at a time, or to just deal with them possibly being handled out of order.
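That said, you can get the best of both worlds: fire all the requests in parallel and still run the handlers in array order by chaining the done-handlers onto a sequence. A minimal sketch (assuming jQuery 1.8+ promise chaining; handleResponse stands in for your per-URL callback, and error handling is omitted):
var sequence = $.when(); // an already-resolved starting point
myUrlArray.forEach(function(url) {
    var request = $.ajax({ url: url }); // every request starts immediately, in parallel
    sequence = sequence.then(function() {
        return request.done(handleResponse); // but handlers run strictly in array order
    });
});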

Extjs: two parallel ajax calls

My code creates two ajax calls at the same time (I assume the parallelism is more efficient). I want to load a table if both calls succeed. What's the proper way of doing this?
var succeeded = {};
function callBackOne(){
    succeeded.one = true;
    // your other stuff
    if (succeeded.two) { bothHaveSucceeded(); }
}
function callBackTwo(){
    succeeded.two = true;
    // your other stuff
    if (succeeded.one) { bothHaveSucceeded(); }
}
I'd use a delayed task personally:
var success = {
one: false,
two: false
};
// Task
var task = new Ext.util.DelayedTask(function(){
// Check for success
if (success.one && success.two) {
// Callback
doCallback();
} else {
task.delay(500);
}
});
task.delay(500);
// First
Ext.Ajax.request({
...
success: function() {
success.one = true;
}
...
});
// Second
Ext.Ajax.request({
...
success: function() {
success.two = true;
}
...
});
The task acts like a thread: it will check the status of the requests, sleeping 500ms between checks, until they both complete.
Old question, but well, as I stumbled upon it...
I'd use the excellent async library by Caolan, particularly here you'll want to use async.parallel.
The examples written on the GitHub doc are worth a read.
https://github.com/caolan/async#parallel
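A minimal sketch of what async.parallel might look like here (the URLs and the loadTable name are illustrative, and I'm assuming the async library is already loaded on the page):
async.parallel({
    first: function(cb) {
        Ext.Ajax.request({
            url: '/first.json', // illustrative URL
            success: function(response) { cb(null, response); },
            failure: function(response) { cb(response); } // report the failure to async
        });
    },
    second: function(cb) {
        Ext.Ajax.request({
            url: '/second.json', // illustrative URL
            success: function(response) { cb(null, response); },
            failure: function(response) { cb(response); }
        });
    }
}, function(err, results) {
    // runs once both tasks have called back (or as soon as one errors)
    if (!err) {
        loadTable(results.first, results.second); // loadTable stands in for your table-loading code
    }
});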
Share an integer variable that each callback checks:
// count variable
var numReturns = 0;
// same callback used for each Ajax request:
function callback() {
    numReturns++;
    if (numReturns === 2) {
        progress();
    }
}
If you need different callbacks, have each callback fire an event which does the same thing.
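For instance, a minimal sketch of that variant, where each request gets its own callback and both funnel into a shared completion check (handleFirst, handleSecond, loadTable and the URLs are placeholders):
var numReturns = 0;
function checkDone() {
    numReturns++;
    if (numReturns === 2) {
        loadTable(); // both requests have succeeded
    }
}
Ext.Ajax.request({
    url: '/first.json',
    success: function(response) {
        handleFirst(response); // request-specific work
        checkDone();
    }
});
Ext.Ajax.request({
    url: '/second.json',
    success: function(response) {
        handleSecond(response); // request-specific work
        checkDone();
    }
});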
