I'm new to Deferreds and Promises.
Here is my [simplified] code, which is defined within a JavaScript object:
myFunction: function(d, cb)
{
return $.ajax('/myURL', {
contentType: 'application/json',
data: d,
dataType: 'json',
type: 'POST'
}).then(cb, cb);
},
flush: function(myArray)
{
return this.myFunction(myArray, myCallback);
}
The above works fine. I can call flush(someArray), and some time later I get the result of the ajax request.
QUESTION:
I want to modify the flush function so that it first breaks the array into chunks (i.e. smaller arrays), and then calls myFunction on each of those chunks. It must then return, in one go, the aggregated data (preferably in an array) from each of the ajax calls that were made.
I am starting to modify flush() along the following lines, but I know it's not quite right. Please can someone complete/fill in the gaps for me, or suggest a re-structuring that would work well?
Thanks.
flush: function(myArray)
{
var chunk = 2;
var i, a;
var j = myArray.length;
var myArrayChunks = [];
for (i=0; i<j; i+=chunk)
{
a = myArray.slice(i, i+chunk);
myArrayChunks.push(a);
}
var myDeferreds = [];
for (i=0; i<myArrayChunks.length; i++)
{
// Here, I need to create a deferred object that will run: myFunction(myArrayChunks[i], myCallback)
// How do I do that?
var f = // The deferred object that will run: myFunction(myArrayChunks[i], myCallback)
myDeferreds.push(f);
}
return $.when.apply($, myDeferreds).then(function(){
// Here, I need to get the aggregated data that is returned by each of the deferreds. How do I do that?
console.log("FLUSH COMPLETE!");
});
}
The async library linked below allows you to run a series of async/deferred requests, and passes on the results of each async function to a final callback, which aggregates a collection of the results.
In particular, check out the parallel method, which will execute all of your async requests simultaneously, but there is no guarantee which order they will run in. If you are concerned about the order in which your async requests execute, check out the series and eachSeries methods.
parallel:
https://github.com/caolan/async#parallel
series / eachSeries:
https://github.com/caolan/async#seriestasks-callback
Both methods aggregate your results into a final results object, which contains all of the results passed on from each async call you make.
NOTE: to use jQuery's deferred functionality, you would need to call .resolve() in the "final" callback of the async.parallel, async.each, or async.eachSeries methods.
Here's an example of the parallel method:
async.parallel([
function(callback){
// some request
$.ajax(/* details */).done(function(data) {
callback(null, data);
});
},
function(callback){
// some request
$.ajax(/* details */).done(function(data) {
callback(null, data);
});
}
],
// "final" callback, invoked after all above functions have
// called their respective callback() functions
function(err, results){
if(err) {
// handle error
} else {
// results contains aggregated results from all
// async calls (2nd parameter in callback(errorParam, resultsParam)
console.log('all async methods finished!', results);
}
});
Here's a way to pass in an array and make an async call for each array element. NOTE that every async call within the async.each method must call callback() when the async request is resolved, or callback(err) in your async error handler if there is an error. If you pass an array of N elements to the async.each method, the final callback will be invoked once all N calls have invoked callback().
async.each(array, function(element, callback) {
$.ajax({ /* details */ data: element }).done(function(data) {
// call `callback` when you're finished up
callback();
});
},
// "final" callback, invoked after each async call is resolved and
// invokes the callback() function
function(err){
if( err ) {
// handle errors
} else {
console.log('All async methods flushed!');
}
});
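If you do need to bridge back to jQuery's deferred world, as noted above, one way is to wrap async.parallel in a jQuery Deferred and resolve it from the final callback. A minimal sketch (runTasks and tasks are hypothetical names; tasks is assumed to be an array of async-style task functions):
function runTasks(tasks) {
    return $.Deferred(function (def) {
        async.parallel(tasks, function (err, results) {
            if (err) {
                def.reject(err);        // propagate the first error
            } else {
                def.resolve(results);   // aggregated results from all tasks
            }
        });
    }).promise();
}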
I love this library, and once you start using it it'll change your life :]. Best of luck!
Since you already have a promise returned from your ajax function, I'd suggest you use promises instead of plain callbacks. Here's a way to do that:
myFunction: function(d) {
return $.ajax('/myURL', {
contentType: 'application/json',
data: d,
dataType: 'json',
type: 'POST'
});
},
flush: function(myArray, chunkSize) {
chunkSize = chunkSize || 2;
var index = 0;
var results = [];
var self = this;
return jQuery.Deferred(function(def) {
function next() {
var start = index;
var arrayChunk, promises = [];
index += chunkSize;
if (start < myArray.length) {
arrayChunk = myArray.slice(start, start + chunkSize);
// create chunkSize array of promises
arrayChunk.forEach(function(item) {
promises.push(self.myFunction(item));
});
$.when.apply($, promises).then(function() {
// results are in arguments[0][0], arguments[1][0], etc...
for (var i = 0; i < arguments.length; i++) {
results.push(arguments[i][0]);
}
// next iteration
next();
}, def.reject)
} else {
def.resolve(results);
}
}
// start first iteration
next();
}).promise();
}
obj.flush(myArray).then(function(results) {
// array of results here
}, function(jqXHR, textStatus, errorThrown) {
// error here
});
Here's another way to do it by creating a version of $.ajax() which I call $.ajaxChunk() that takes an array of data and does the chunking for you.
// Send ajax calls in chunks from an array with no more than X in flight at the same time
// Pass in array of data where each item in dataArray is sent separately
// in an ajax call
// Pass settings.chunkSize to specify the chunk size, defaults to 2 if not present
// Returns a promise
// The resolved value of promise is an array of results
// The rejected value of the promise is whatever jQuery result failed first
$.ajaxChunk = function(dataArray, url, settings) {
settings = settings || {};
var chunkSize = settings.chunkSize || 2;
var index = 0;
var results = [];
return jQuery.Deferred(function(def) {
function next() {
var start = index;
var arrayChunk, promises = [];
index += chunkSize;
if (start < dataArray.length) {
arrayChunk = dataArray.slice(start, start + chunkSize);
// create chunkSize array of promises
arrayChunk.forEach(function(item) {
// make unique copy of settings object for each ajax call
var localSettings = $.extend({}, settings);
localSettings.data = item;
promises.push($.ajax(url, localSettings));
});
$.when.apply($, promises).then(function() {
// results are in arguments[0][0], arguments[1][0], etc...
for (var i = 0; i < arguments.length; i++) {
results.push(arguments[i][0]);
}
next();
}, def.reject)
} else {
def.resolve(results);
}
}
// start first iteration
next();
}).promise();
}
And, sample usage:
$.ajaxChunk(arrayOfData, '/myURL', {
contentType: 'application/json',
dataType: 'json',
type: 'POST',
chunkSize: 2
}).then(function(results) {
// array of results here
}, function(jqXHR, textStatus, errorThrown) {
// error here
})
If the real requirement here is that you don't have more than X ajax calls in process at the same time, then there's a more efficient and faster (end-to-end time) way to do it than chunking. Instead, you keep track of exactly how many ajax calls are "in flight" at any time and, as soon as one finishes, you start the next one. This is a bit more efficient than chunking, where you send the whole chunk, then wait for the whole chunk to finish. I've written a jQuery helper that implements this:
$.ajaxAll = function(dataArray, url, settings, maxInFlight) {
maxInFlight = maxInFlight || 1;
var results = new Array(dataArray.length);
settings = settings || {};
var index = 0;
var inFlight = 0;
return jQuery.Deferred(function(def) {
function runMore() {
while (inFlight < maxInFlight && index < dataArray.length) {
(function(i) {
var localSettings = $.extend({}, settings);
localSettings.data = dataArray[index++];
++inFlight;
$.ajax(url, localSettings).then(function(data, textStatus, jqXHR) {
--inFlight;
results[i] = data;
runMore();
}, def.reject);
})(index);
}
// if we are all done here
if (inFlight === 0 && index >= dataArray.length) {
def.resolve(results);
}
}
// start first iteration
runMore();
}).promise();
}
Note: If you pass 1 for the maxInFlight argument, then this runs the ajax calls in series one after the other. Results are always returned in order.
And, sample usage:
$.ajaxAll(arrayOfData, '/myURL', {
contentType: 'application/json',
dataType: 'json',
type: 'POST'
}, 2).then(function(results) {
// array of results here
}, function(jqXHR, textStatus, errorThrown) {
// error here
})
Thanks to all for great advice.
I used a combination of the suggested techniques in my solution.
The key thing was to make an array of promises, and push onto it the required calls (each with its own array chunk passed as a parameter) to the function that makes the ajax request. One thing I hadn't previously realised is that this calls the ajaxCall() function at that very moment, and that's ok because it returns a promise that is pushed onto the array.
After this, the 'when.apply' line does the trick in waiting until all the ajax promises are fulfilled. The arguments of the 'then' function are used to collate all the results required (obviously, the exact mechanism for that depends on the format of your returned arguments). The results are then sent to theResultsHandler(), which takes the place of the original callback in the code I first posted in my question.
Hope this is useful to other Promise-novices!
The ajax-calling function is:
ajaxCall: function(d) {
return $.ajax('/myURL', {
contentType: 'application/json',
data: d,
dataType: 'json',
type: 'POST'
});
},
And inside the flush() function...
var promises = [];
var i, j;
for (i=0; i<batchChunks.length; i++)
{
promises.push(self.ajaxCall(batchChunks[i]));
}
var results = [];
return $.when.apply($, promises).then(function(){
console.log("arguments = " + JSON.stringify(arguments));
for (i = 0; i < arguments.length; i++)
{
for (j = 0; j < arguments[i][0].length; j++)
{
results.push(arguments[i][0][j]);
}
}
return self.theResultsHandler(results);
});
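For completeness, a caller can consume the promise returned by flush() in the usual way (a sketch; obj and someBigArray are placeholder names, and the resolved value is whatever theResultsHandler(results) returns):
obj.flush(someBigArray).then(function (handled) {
    // handled is the return value of theResultsHandler(results)
}, function (jqXHR, textStatus, errorThrown) {
    // the first failed ajax call lands here
});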
Related
I have an ajax query followed by some functions, and I use the .then() promise callback to execute them in order:
var pictures = [];
var venues = **an array of venues**
$.get(url).then(functionA, functionFail).then(functionB);
But functionA, the first success callback, includes a loop that fires off 'n' ajax requests:
for(var i=0; i<n; i++) {
var venue = venues[i];
var new_url = **some_url**
$.ajax({url: new_url, async: false}).done(function(data) {
var pics = data.response.photos.items;
pictures.push(pics[0]);
pictures.push(pics[1]);
}).fail(function() {
console.log('Failed!');
});
}
These looped ajax requests fill up the global pictures array. The pictures array is then used by functionB, but because of the async nature of the requests, functionB executes right away, before the array has been fully filled.
I tried to make the requests synchronous with async: false but it's not completely effective (it leaves out the very last request of the loop).
How can I make sure that functionB is only executed after all the ajax requests have finished? I don't want to use timeouts but if nothing else I'll fall back on that.
As you're using jQuery, it looks like jQuery.when() can take multiple results and then call done once they're all resolved.
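A minimal sketch of that idea, reusing the question's placeholder URL: collect each $.ajax() promise from the loop into an array and hand the whole array to $.when:
var requests = [];
for (var i = 0; i < n; i++) {
    var new_url = **some_url**;   // same placeholder as in the question
    requests.push($.ajax({ url: new_url }));
}
$.when.apply($, requests).done(function () {
    // one [data, textStatus, jqXHR] triple per request, in request order
    Array.prototype.forEach.call(arguments, function (args) {
        var pics = args[0].response.photos.items;
        pictures.push(pics[0]);
        pictures.push(pics[1]);
    });
    functionB();
}).fail(functionFail);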
Not sure if this is the best answer, but it's one of them! Just count the number of requests that have completed, and when all of them have, execute your function.
var completed_requests = 0,
returned_data = [];
for (var i=0; i<n; i++) {
var venue = venues[i];
var new_url = **some_url**
var done = function(data_array) {
var pics = [];
data_array.forEach(function(data) {
pics = pics.concat(data.response.photos.items);
});
};
$.ajax({url: new_url}).done(function(data) {
completed_requests++;
returned_data.push(data);
if (completed_requests == n) {
done(returned_data);
}
}).fail(function() {
console.log('Failed!');
});
}
My example also saves data from all requests until you need it.
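One caveat, not from the answer above: returned_data is filled in completion order. If you need the results in request order, you can store each response by its index instead (a sketch reusing the question's placeholder URL and the done(data_array) helper defined above):
var completed_requests = 0,
    returned_data = new Array(n);
for (var i = 0; i < n; i++) {
    (function (idx) {
        var new_url = **some_url**;
        $.ajax({ url: new_url }).done(function (data) {
            returned_data[idx] = data;            // keep results in request order
            if (++completed_requests === n) {
                done(returned_data);
            }
        });
    })(i);
}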
You can use Promise.all. Code for functionA:
var requests = []
for(var i=0; i<n; i++) {
var venue = venues[i];
var new_url = **some_url**
requests.push(new Promise((resolve, reject) => {
$.ajax({url: new_url}).done(function(data) {
var pics = data.response.photos.items;
resolve(pics)
}).fail(function(err) {
reject(err)
console.log('Failed!');
});
}))
}
return Promise.all(requests)
When all the requests complete successfully, their resolved values will be collected into an array and passed to functionB.
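Assuming functionA returns that Promise.all(requests), the chain from the question then reads as follows (a sketch; note this relies on jQuery 3.0+, whose .then() is Promises/A+ compliant and will wait on a returned native Promise):
$.get(url)
    .then(functionA, functionFail)       // functionA returns Promise.all(requests)
    .then(function (picsArrays) {
        // picsArrays[i] is the pics array resolved for venue i
        functionB(picsArrays);
    });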
You can handle the looping yourself, handling only one item in venues at a time. Then, upon the completion of one, call a function to handle the next one, and when venues is empty call your functionB.
var pictures = [];
var venues = **an array of venues**
$.get(url).then(functionA, functionFail);
function functionA() {
var venue = venues.shift();
var new_url = **some_url**;
$.ajax({
url: new_url,
async: true
}).done(function(data) {
var pics = data.response.photos.items;
pictures.push(pics[0]);
pictures.push(pics[1]);
if(venues.length !== 0) {
functionA();
}
else {
functionB();
}
}).fail(function() {
console.log('Failed!');
});
}
I have this API call where I make sure the data returns in the same order I send it. However, I realized that's not really what I want; I want to make sure the data is sent and taken care of one at a time.
data[n] has returned before data[n+1] is sent.
The reason for this is:
If I do it as seen below, the server still gets the requests in a random order, and therefore saves the data in my DB in a random order (or, well, not random; heavier data just gets processed more slowly).
var promiseArray = [];
for (var i = 0; i < data.length; i++) {
var dataPromise = $http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
.then (function (response) {
//return data for chaining
return response.data;
});
promiseArray.push(dataPromise);
}
$q.all(promiseArray).then(function (dataArray) {
//succes
}).catch (function (errorResponse) {
//error
});
How can I make sure the data is sent, processed, and returned one at a time, in a smooth way?
You could do something like this:
var i = -1;
processNextdata();
function processNextdata() {
i++;
if(angular.isUndefined(data[i]))
return;
$http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
.then(processNextdata)
}
Update:
Callback after every result:
var i = -1;
processNextdata();
function processNextdata() {
i++;
if(angular.isUndefined(data[i]))
return;
$http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
.then(function(result) {
// do something with a single result
return processNextdata();
}, errorCallback);
}
Callback after everything is done:
var i = -1, resultData = [];
processNextdata()
.then(function(result) {
console.log(result);
}, errorCallback);
function processNextdata() {
i++;
if(angular.isUndefined(data[i]))
return resultData;
return $http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
.then(function(result) {
resultData.push(result.data);
return processNextdata();
}, $q.reject);
}
When using the Promise.all([...]) method, the documentation shows the following:
The Promise.all(iterable) method returns a promise that resolves when all of the promises in the iterable argument have resolved, or rejects with the reason of the first passed promise that rejects.
What this tells us is that there is no particular order in which the operations complete; the promises run in parallel with one another and can settle in any order.
In your case, there is an expected order that you want your promises to run in, so using Promise.all([...]) won't satisfy your requirements.
What you can do instead is execute individual promises, then if you have some that can run in parallel use the Promise.all([...]) method.
I would create a method that takes a request as an argument, then returns the generated promise:
function makeRequest (req) {
return new Promise(function (resolve, reject) {
// `request` here is assumed to be the request HTTP client library
request({
url: url
, port: <port>
, body: req
, json: <true/false>
, method: '<POST/GET>'
, headers: {
}
}, function (error, response, body) {
if (error) {
reject(error);
} else {
resolve(body);
}
});
});
}
You can then call this function and store the result:
var response = makeRequest(myRequest);
Alternatively, you could create an array of your requests and then call the function:
var requests = [request1, request2, ..., requestN];
var responses = [];
for (var i = 0; i < requests.length; i++) {
responses.push(makeRequest(requests[i]));
}
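To know when everything has finished, and to keep results in the same order as the requests, you can then aggregate the stored promises with Promise.all (a sketch following on from the loop above):
Promise.all(responses).then(function (bodies) {
    // bodies[i] is the resolved body for requests[i]
    console.log('all requests finished', bodies);
}).catch(function (err) {
    // the first rejection ends up here
    console.error(err);
});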
I have an array that can hold an unknown number of elements. Each element is used to send data with an ajax call. I am looping through with a for loop, gathering the data from each successful call and pushing it into an empty array. At the end of this unknown number of calls, I then need to use the newly gathered array in my view. The code at the bottom that uses newDataArray runs before the loop's calls are done, and therefore the array is still empty. How do I finish all the calls and then do what is at the bottom?
If it helps, I am doing this in React with the Flux pattern. But the same issue may be done not in React. Here is a mock sample of what I am trying to do:
JS
case 'execute-calls':
//This is the new array to push to
var newDataArray = [];
//Url to call
var url = 'http://dev.markitondemand.com/Api/v2/Quote/jsonp';
for(let i = 0; i < payload.data.length; i++){
//given array of data that needs to be sent with call
let symb = { symbol: payload.data[i]};
$.ajax({
data: symb,
url: url,
dataType: "jsonp",
})
.done(function(data){
let updatedData = {
//...data that is stored from response
};
newDataArray.push(updatedData);
})
.fail(function(error){
//console.log(error);
});
}
//This will be updating the state object which is above the switch cases
//However this is ran before the end of the loops so newDataArray is empty
var updateTicker = {
updatedTicker: true,
updatedTickerSymbols: newDataArray
};
assign(stockData,updateTicker);
getStockData.emitChange();
break;
You can make use of the fact that $.ajax() actually returns a deferred object, and use it to create an array of deferreds. e.g.
var symbols = [1, 2, 3, 4];
var deferreds = symbols.map(function (symbol) {
return $.ajax({
url: 'http://dev.markitondemand.com/MODApis/Api/v2/Quote/jsonp',
data: { symbol: symbol },
dataType: 'jsonp'
});
});
You can resolve multiple deferreds at once with $.when(). There is a complication however, $.when() expects a list of parameters rather than array. We can solve this by using Function#apply.
To add to the complication, the callback function is also called with a list of arguments. Since we don't know how many arguments there are, we'll use the arguments pseudo-array. And since arguments isn't an actual array, we'll loop through it by using Function#call on Array#prototype.
$.when.apply($, deferreds).done(function () {
Array.prototype.forEach.call(arguments, function (response) {
console.log(response[0].Message);
});
}).fail(function (jqXHR, textStatus, error) {
console.error(error);
});
[UPDATED to include fail() call]
If you're using ES6 this is much more elegant:
$.when(...deferreds).done((...responses) => {
responses.forEach((response) => {
console.log(response[0].Message);
});
});
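If you also want the error handling in the ES6 form, the same .fail() chain applies (a sketch):
$.when(...deferreds).done((...responses) => {
    responses.forEach((response) => {
        console.log(response[0].Message);
    });
}).fail((jqXHR, textStatus, error) => {
    console.error(error);
});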
Whenever you are dealing with ajax calls and have to do some operations at the end of all the async calls, a better choice would be to use callback functions.
Modifying your code to use the callback:
function AsyncLoopHandler(index) {
if (index >= payload.data.length) {
// all the indexes have finished ajax calls do your next step here
var updateTicker = {
updatedTicker: true,
updatedTickerSymbols: newDataArray
};
assign(stockData, updateTicker);
getStockData.emitChange();
}
else {
//given array of data that needs to be sent with call
let symb = { symbol: payload.data[index] };
$.ajax({
data: symb,
url: url,
dataType: "jsonp",
})
.done(function (data) {
let updatedData = {
//...data that is stored from response
};
newDataArray.push(updatedData);
AsyncLoopHandler(index + 1); // call the function again with the next index
})
.fail(function (error) {
//console.log(error);
});
}
}
Now, to start this recursive function, just call it with index 0.
AsyncLoopHandler(0);
So all the ajax calls will be executed one after the other, as if they were synchronous requests, and the if check will see whether all the indexes are complete and then run your logic. Let me know if this helps.
I suggest using promises; the logic would look like this:
var urls = [x, x, x, x];
var results = [];
var qs = $.map(urls, function(url){
// each mapped element is a Q promise, so Q.all can wait on them
var deferred = Q.defer();
$.ajax({
url: url,
success: function(){
results.push(url);
deferred.resolve();
},
error: function(){
deferred.resolve();
}
});
return deferred.promise;
});
Q.all(qs).then(function(){
console.log(results);
});
Or use yield and co, from the newer standard:
https://github.com/kriskowal/q
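A rough sketch of the generator approach with co, assuming the co library is loaded (jqXHR objects are thenable, so they can be yielded directly; each request starts only after the previous one resolves):
co(function* () {
    var results = [];
    for (var i = 0; i < urls.length; i++) {
        results.push(yield $.ajax({ url: urls[i] }));
    }
    console.log(results);
});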
Here's a contrived example of what's going on: http://jsfiddle.net/adamjford/YNGcm/20/
HTML:
<a href="#">Click me!</a>
<div></div>
JavaScript:
function getSomeDeferredStuff() {
var deferreds = [];
var i = 1;
for (i = 1; i <= 10; i++) {
var count = i;
deferreds.push(
$.post('/echo/html/', {
html: "<p>Task #" + count + " complete.",
delay: count
}).success(function(data) {
$("div").append(data);
}));
}
return deferreds;
}
$(function() {
$("a").click(function() {
var deferreds = getSomeDeferredStuff();
$.when(deferreds).done(function() {
$("div").append("<p>All done!</p>");
});
});
});
I want "All done!" to appear after all of the deferred tasks have completed, but $.when() doesn't appear to know how to handle an array of Deferred objects. "All done!" is happening first because the array is not a Deferred object, so jQuery goes ahead and assumes it's just done.
I know one could pass the objects into the function like $.when(deferred1, deferred2, ..., deferredX) but it's unknown how many Deferred objects there will be at execution in the actual problem I'm trying to solve.
To pass an array of values to any function that normally expects them to be separate parameters, use Function.prototype.apply, so in this case you need:
$.when.apply($, my_array).then( ___ );
See http://jsfiddle.net/YNGcm/21/
In ES6, you can use the ... spread operator instead:
$.when(...my_array).then( ___ );
In either case, since it's unlikely that you'll know in advance how many formal parameters the .then handler will require, that handler would need to process the arguments array in order to retrieve the result of each promise.
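For example, something like this (a sketch, assuming each deferred in my_array came from $.ajax, so each entry in arguments is a [data, textStatus, jqXHR] triple):
$.when.apply($, my_array).then(function () {
    var results = Array.prototype.map.call(arguments, function (arg) {
        return arg[0];   // keep just the response data from each triple
    });
    console.log(results);
});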
The workarounds above (thanks!) don't properly address the problem of getting back the objects provided to the deferred's resolve() method because jQuery calls the done() and fail() callbacks with individual parameters, not an array. That means we have to use the arguments pseudo-array to get all the resolved/rejected objects returned by the array of deferreds, which is ugly:
$.when.apply($,deferreds).then(function() {
var objects = arguments; // The array of resolved objects as a pseudo-array
...
});
Since we passed in an array of deferreds, it would be nice to get back an array of results. It would also be nice to get back an actual array instead of a pseudo-array so we can use methods like Array.sort().
Here is a solution inspired by when.js's when.all() method that addresses these problems:
// Put somewhere in your scripting environment
if (typeof jQuery.when.all === 'undefined') {
jQuery.when.all = function (deferreds) {
return $.Deferred(function (def) {
$.when.apply(jQuery, deferreds).then(
// the calling function will receive an array of length N, where N is the number of
// deferred objects passed to when.all that succeeded. each element in that array will
// itself be an array of 3 objects, corresponding to the arguments passed to jqXHR.done:
// ( data, textStatus, jqXHR )
function () {
var arrayThis, arrayArguments;
if (Array.isArray(this)) {
arrayThis = this;
arrayArguments = arguments;
}
else {
arrayThis = [this];
arrayArguments = [arguments];
}
def.resolveWith(arrayThis, [Array.prototype.slice.call(arrayArguments)]);
},
// the calling function will receive an array of length N, where N is the number of
// deferred objects passed to when.all that failed. each element in that array will
// itself be an array of 3 objects, corresponding to the arguments passed to jqXHR.fail:
// ( jqXHR, textStatus, errorThrown )
function () {
var arrayThis, arrayArguments;
if (Array.isArray(this)) {
arrayThis = this;
arrayArguments = arguments;
}
else {
arrayThis = [this];
arrayArguments = [arguments];
}
def.rejectWith(arrayThis, [Array.prototype.slice.call(arrayArguments)]);
});
});
}
}
Now you can simply pass in an array of deferreds/promises and get back an array of resolved/rejected objects in your callback, like so:
$.when.all(deferreds).then(function(objects) {
console.log("Resolved objects:", objects);
});
You can apply the when method to your array:
var arr = [ /* Deferred objects */ ];
$.when.apply($, arr);
How do you work with an array of jQuery Deferreds?
When calling multiple parallel AJAX calls, you have two options for handling the respective responses.
Use synchronous AJAX calls, one after another (not recommended).
Use an array of promises and $.when, which accepts promises; its .done callback gets called when all the promises have returned successfully with their respective responses.
Example
function ajaxRequest(capitalCity) {
return $.ajax({
url: 'https://restcountries.eu/rest/v1/capital/'+capitalCity,
success: function(response) {
},
error: function(response) {
console.log("Error")
}
});
}
$(function(){
var capitalCities = ['Delhi', 'Beijing', 'Washington', 'Tokyo', 'London'];
$('#capitals').text(capitalCities);
function getCountryCapitals(){ //do multiple parallel ajax requests
var promises = [];
for(var i=0,l=capitalCities.length; i<l; i++){
var promise = ajaxRequest(capitalCities[i]);
promises.push(promise);
}
$.when.apply($, promises)
.done(fillCountryCapitals);
}
function fillCountryCapitals(){
var countries = [];
var responses = arguments;
for(i in responses){
console.dir(responses[i]);
countries.push(responses[i][0][0].nativeName)
}
$('#countries').text(countries);
}
getCountryCapitals()
})
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<div>
<h4>Capital Cities : </h4> <span id="capitals"></span>
<h4>Respective Country's Native Names : </h4> <span id="countries"></span>
</div>
As a simple alternative, that does not require $.when.apply or an array, you can use the following pattern to generate a single promise for multiple parallel promises:
promise = $.when(promise, anotherPromise);
e.g.
function GetSomeDeferredStuff() {
// Start with an empty resolved promise (or undefined does the same!)
var promise;
var i = 1;
for (i = 1; i <= 5; i++) {
var count = i;
promise = $.when(promise,
$.ajax({
type: "POST",
url: '/echo/html/',
data: {
html: "<p>Task #" + count + " complete.",
delay: count / 2
},
success: function (data) {
$("div").append(data);
}
}));
}
return promise;
}
$(function () {
$("a").click(function () {
var promise = GetSomeDeferredStuff();
promise.then(function () {
$("div").append("<p>All done!</p>");
});
});
});
Notes:
I figured this one out after seeing someone chain promises sequentially, using promise = promise.then(newpromise)
The downside is it creates extra promise objects behind the scenes and any parameters passed at the end are not very useful (as they are nested inside additional objects). For what you want though it is short and simple.
The upside is it requires no array or array management.
I want to propose another option, using $.each:
We can declare the ajax function like this:
function ajaxFn(someData) {
this.someData = someData;
var that = this;
return function () {
var promise = $.Deferred();
$.ajax({
method: "POST",
url: "url",
data: that.someData,
success: function(data) {
promise.resolve(data);
},
error: function(data) {
promise.reject(data);
}
})
return promise;
}
}
Part of code where we creating array of functions with ajax to send:
var arrayOfFn = [];
for (var i = 0; i < someDataArray.length; i++) {
var ajaxFnForArray = new ajaxFn(someDataArray[i]);
arrayOfFn.push(ajaxFnForArray);
}
And calling functions with sending ajax:
$.when(
$.each(arrayOfFn, function(index, value) {
value.call()
})
).then(function() {
alert("Cheer!");
}
)
If you're transpiling and have access to ES6, you can use spread syntax, which applies each item of an iterable as a discrete argument, just the way $.when() needs it.
$.when(...deferreds).done(() => {
// do stuff
});
MDN Link - Spread Syntax
I had a very similar case where I was posting in an each loop and then setting the html markup in some fields from numbers received from the ajax. I then needed to do a sum of the (now-updated) values of these fields and place it in a total field.
Thus the problem was that I was trying to do a sum on all of the numbers, but no data had arrived back yet from the async ajax calls. I needed to complete this functionality in a few functions to be able to reuse the code. My outer function awaits the data before I go and do some stuff with the fully updated DOM.
// 1st
function Outer() {
var deferreds = GetAllData();
$.when.apply($, deferreds).done(function () {
// now you can do whatever you want with the updated page
});
}
// 2nd
function GetAllData() {
var deferreds = [];
$('.calculatedField').each(function (data) {
deferreds.push(GetIndividualData($(this)));
});
return deferreds;
}
// 3rd
function GetIndividualData(item) {
var def = new $.Deferred();
$.post('#Url.Action("GetData")', function (data) {
item.html(data.valueFromAjax);
def.resolve(data);
});
return def;
}
If you're using angularJS or some variant of the Q promise library, then you have a .all() method that solves this exact problem.
var savePromises = [];
angular.forEach(models, function(model){
savePromises.push(
model.saveToServer()
)
});
$q.all(savePromises).then(
function success(results){...},
function failed(results){...}
);
see the full API:
https://github.com/kriskowal/q/wiki/API-Reference#promiseall
https://docs.angularjs.org/api/ng/service/$q
Grateful for any insight into what I'm misunderstanding here. My requirement is as follows:
I have an array of URLs. I want to fire off an AJAX request for each URL simultaneously, and as soon as the first request completes, call the first callback. Then, if and when the second request completes, call that callback, and so on.
Option 1:
for (var i = 0; i < myUrlArray.length; i++) {
$.ajax({
url: myUrlArray[i]
}).done(function(response) {
// Do something with response
});
}
Obviously this doesn't work, as there is no guarantee the responses will complete in the correct order.
Option 2:
var promises = [];
for (var i = 0; i < myUrlArray.length; i++) {
promises.push($.ajax({
url: myUrlArray[i]
}));
}
$.when.apply($, promises).then(function() {
// Do something with each response
});
This should work, but the downside is that it waits until all AJAX requests have completed, before firing any of the callbacks.
Ideally, I should be able to call the first callback as soon as it's complete, then chain the second callback to execute whenever that response is received (or immediately if it's already resolved), then the third, and so on.
The array length is completely variable and could contain any number of requests at any given time, so just hard coding the callback chain isn't an option.
My attempt:
var promises = [];
for (var i = 0; i < myUrlArray.length; i++) {
promises.push($.ajax({
url: myUrlArray[i] // Add each AJAX Deferred to the promises array
}));
}
(function handleAJAX() {
var promise;
if (promises.length) {
promise = promises.shift(); // Grab the first one in the stack
promise.then(function(response) { // Set up 'done' callback
// Do something with response
if (promises.length) {
handleAJAX(); // Move onto the next one
}
});
}
}());
The problem is that the callbacks execute in a completely random order! For example, if I add 'home.html', 'page2.html', 'page3.html' to the array, the order of responses won't necessarily be 'home.html', 'page2.html', 'page3.html'.
I'm obviously fundamentally misunderstanding something about the way promises work. Any help gratefully appreciated!
Cheers
EDIT
OK, now I'm even more confused. I made this JSFiddle with one array using Alnitak's answer and another using JoeFletch's answer and neither of them work as I would expect! Can anyone see what is going on here?
EDIT 2
Got it working! Based on JoeFletch's answer below, I adapted the solution as follows:
var i, responseArr = [];
for (i = 0; i < myUrlArray.length; i++) {
responseArr.push('0'); // <-- Add 'unprocessed' flag for each pending request
(function(ii) {
$.ajax({
url: myUrlArray[ii]
}).done(function(response) {
responseArr[ii] = response; // <-- Store response in array
}).fail(function(xhr, status, error) {
responseArr[ii] = 'ERROR';
}).always(function(response) {
for (var iii = 0; iii < responseArr.length; iii++) { // <-- Loop through entire response array from the beginning
if (responseArr[iii] === '0') {
return; // As soon as we hit an 'unprocessed' request, exit loop
}
else if (responseArr[iii] !== 'done') {
$('#target').append(responseArr[iii]); // <-- Do actual callback DOM append stuff
responseArr[iii] = 'done'; // <-- Set 'complete' flag for this request
}
}
});
}(i)); // <-- pass current value of i into closure to encapsulate
}
TL;DR: I don't understand jQuery promises, got it working without them. :)
Don't forget that you don't need to register the callbacks straight away.
I think this would work; the main difference from your code is that I've used .done rather than .then and refactored a few lines.
var promises = myUrlArray.map(function(url) {
return $.ajax({url: url});
});
(function serialize() {
var def = promises.shift();
if (def) {
def.done(function() {
callback.apply(null, arguments);
serialize();
});
}
})();
Here's my attempt at solving this. I updated my answer to include error handling for a failed .ajax call. I also moved some code to the complete method of the .ajax call.
var urlArr = ["url1", "url2"];
var responseArr = [];
for(var i = 0; i < urlArr.length; i++) {
responseArr.push("0");//0 meaning unprocessed to the DOM
}
$.each(urlArr, function(i, url){
$.ajax({
url: url,
success: function(data){
responseArr[i] = data;
},
error: function (xhr, status, error) {
responseArr[i] = "Failed Response";//enter whatever you want to place here to notify the end user
},
complete: function() {
$.each(responseArr, function(i, element){
if (responseArr[i] == "0") {
return;
}
else if (responseArr[i] != "done")
{
//do something with the response
responseArr[i] = "done";
}
});
}
});
})
Asynchronous requests aren't guaranteed to finish in the same order that they are sent. Some may take longer than others, depending on server load and the amount of data being transferred.
The only options are either to wait until they are all done, only send one at a time, or just deal with them being called possibly out of order.
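For the "one at a time" option, one common pattern (a sketch, not taken from the answers above) is to chain the requests with reduce so that each one starts only after the previous has finished:
myUrlArray.reduce(function (prev, url) {
    return prev.then(function () {
        return $.ajax({ url: url }).done(function (response) {
            // handle this response before the next request is sent
        });
    });
}, $.Deferred().resolve().promise());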