Hey all. I have what appears to be a trivial problem. I have the following JavaScript:
$(function() {
    var r = GetResults();
    for (var i = 0; i < r.length; i++) {
        // Do stuff with r
    }
});

function GetResults() {
    $.getJSON("/controller/method/", null, function(data) {
        return data;
    });
}
Because I'm calling the method asynchronously, the script continues executing, and when it reaches the for loop, r obviously doesn't have a value yet. My question is: when I have a method that performs an asynchronous operation, and I depend on the data it returns in the main block, how do I halt execution until the data is returned? Something like:
var r = GetResults(param, function() {
});
where the function is a callback function. I cannot move the for loop processing into the callback function of the JSON request, because I am reusing the functionality of GetResults several times throughout the page, unless I want to duplicate the code. Any ideas?
Move your "do stuff with r" block into your $.getJSON callback. You can't do stuff with r until it has been delivered, and the first opportunity you'll have to use it is in the callback... so do it then.
$(function() {
    GetResults();
});

function GetResults() {
    $.getJSON("/controller/method/", null, function(data) {
        for (var i = 0; i < data.length; i++) {
            // Do stuff with data
        }
    });
}
I've run into something similar before. You'll have to run the ajax call synchronously.
Here is my working example:
$.ajax({
    type: "POST",
    url: "/services/GetResources",
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    data: '{resourceFileName:"mapedit",culture:"' + $("#lang-name").val() + '"}',
    cache: true,
    async: false, // to set local variable
    success: function(data) {
        localizations = data.d;
    }
});
Ajax already gives you a callback; you are supposed to use it:
function dostuff(data) {
    for (var i = 0; i < data.length; i++) {
        // Do stuff with data
    }
}

$(document).ready(function() {
    $.getJSON("/controller/method/", null, dostuff);
});
You could do this:
$(function() {
    PerformCall();
});

function PerformCall() {
    $.getJSON("/controller/method/", null, function(data) {
        for (var i = 0; i < data.length; i++) {
            // Do stuff with data
        }
    });
}
The short answer is that you can't block on an asynchronous operation... which is, of course, the meaning of "asynchronous".
Instead, you need to change your code to use a callback to trigger the action based on the data returned from the $.getJSON(...) call. Something like the following should work:
$(function() {
    GetResults();
});

function GetResults() {
    $.getJSON("/controller/method/", null, function(data) {
        for (var i = 0; i < data.length; i++) {
            // Do stuff with data
        }
    });
}
Given your updated requirements ...

I cannot move the for loop processing into the callback function of the JSON request because I am reusing the functionality of GetResults several times throughout the page, unless I want to duplicate the code. Any ideas?
... you could modify GetResults() to accept a function as a parameter, which you would then execute as your $.getJSON callback (air code warning):
$(function() {
    GetResults(function(data) {
        for (var i = 0; i < data.length; i++) {
            // Do stuff with data
        }
    });
});

function GetResults(callback) {
    $.getJSON("/controller/method/", null, callback);
}
As you can see from the general tide of answers, you're best off not trying to fight the asynchronous jQuery programming model. :)
This is not possible.
Either you make your function synchronous or you change the design of your code to support the asynchronous operation.
You can have a callback with parameters; that should work nicely:
$(function() {
    GetResults(function(data) {
        for (var i = 0; i < data.length; i++) {
            // Do stuff with data
        }
    });
});

function GetResults(func) {
    $.getJSON("/controller/method/", null, func);
}
Move the data processing into the callback:
$(function() {
    GetResults();
});

function GetResults() {
    $.getJSON("/controller/method/", null, function(data) {
        for (var i = 0; i < data.length; i++) {
            // Do stuff with data
        }
    });
}
Related
I have an ajax query followed by some functions, and I use the .then() promise callback to execute them in order:
var pictures = [];
var venues = **an array of venues**
$.get(url).then(functionA, functionFail).then(functionB);
But functionA, the first success callback, includes a loop that fires off 'n' ajax requests:
for (var i = 0; i < n; i++) {
    var venue = venues[i];
    var new_url = **some_url**
    $.ajax({url: new_url, async: false}).done(function(data) {
        var pics = data.response.photos.items;
        pictures.push(pics[0]);
        pictures.push(pics[1]);
    }).fail(function() {
        console.log('Failed!');
    });
}
These looped ajax requests fill up the global pictures array. The pictures array is then used by functionB, but because of the async nature, functionB executes right away, before the array has been fully filled.
I tried to make the requests synchronous with async: false but it's not completely effective (it leaves out the very last request of the loop).
How can I make sure that functionB is only executed after all the ajax requests have finished? I don't want to use timeouts but if nothing else I'll fall back on that.
As you're using jQuery, it looks like jQuery.when() can take multiple promises and then call done once they're all resolved.
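jQuery isn't available outside the browser to demo $.when() directly, but native Promise.all behaves the same way for this purpose. In this sketch, fakeRequest is a made-up stand-in for each looped $.ajax call:

```javascript
// Stand-in for an ajax call: resolves with its label after a random delay,
// so the requests complete in unpredictable order.
function fakeRequest(label) {
    return new Promise(function (resolve) {
        setTimeout(function () { resolve(label); }, Math.random() * 50);
    });
}

var venues = ['venue1', 'venue2', 'venue3'];

// Fire all requests at once; Promise.all (like $.when) waits for every one
// to resolve, and the results array preserves the original request order.
var allPictures = Promise.all(venues.map(fakeRequest)).then(function (results) {
    console.log(results); // always ['venue1', 'venue2', 'venue3']
    return results;
});
```

As with $.when, the combined promise resolves only after every request has finished, and the results come back in request order regardless of completion order.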
Not sure if this is the best answer, but it's one of them! Just count the number of completed requests, and when all of them are done, execute your function.
var completed_requests = 0,
    returned_data = [];

var done = function(data_array) {
    var pics = [];
    data_array.forEach(function(data) {
        pics = pics.concat(data.response.photos.items);
    });
};

for (var i = 0; i < n; i++) {
    var venue = venues[i];
    var new_url = **some_url**
    $.ajax({url: new_url}).done(function(data) {
        completed_requests++;
        returned_data.push(data);
        if (completed_requests == n) {
            done(returned_data);
        }
    }).fail(function() {
        console.log('Failed!');
    });
}
My example also saves data from all requests until you need it.
You can use Promise.all. The code for function A:
var requests = [];
for (var i = 0; i < n; i++) {
    var venue = venues[i];
    var new_url = **some_url**
    requests.push(new Promise((resolve, reject) => {
        $.ajax({url: new_url}).done(function(data) {
            var pics = data.response.photos.items;
            resolve(pics);
        }).fail(function(err) {
            reject(err);
            console.log('Failed!');
        });
    }));
}
return Promise.all(requests);
When all the requests succeed, their resolved values are collected into an array and passed to functionB.
You can handle the looping yourself, processing only one item in venues at a time. Upon the completion of each one, call the function again to handle the next, and when venues is empty, call your functionB.
var pictures = [];
var venues = **an array of venues**
$.get(url).then(functionA, functionFail);

function functionA() {
    var venue = venues.shift();
    var new_url = **some_url**;
    $.ajax({
        url: new_url,
        async: true
    }).done(function(data) {
        var pics = data.response.photos.items;
        pictures.push(pics[0]);
        pictures.push(pics[1]);
        if (venues.length !== 0) {
            functionA();
        } else {
            functionB();
        }
    }).fail(function() {
        console.log('Failed!');
    });
}
I have a for loop which calls an async function. I need it to invoke a callback at the end of the for loop, but only when all of the async functions have returned their results. I have tried this:
for (var i = 0; i < vaccinsCount; i++) {
    getVaccinAddress(i, address, provider, function(result) {
        if (result.success) {
            console.log("result:" + result.values);
            vaccines.push(result.values);
        } else {
            callback({success: false, message: result.message});
        }
    });
}
callback({success: true, values: vaccines});
Instead, what is happening is that the code enters the for loop, calls the async function, and then exits straight away. How could I get around this?
getVaccinAddress is the Async Function that does a server call.
EDIT
I am using NodeJS, so the solution is apparently to use bluebird; however, I have no idea how to implement this with bluebird.
I highly recommend using promises in this case.
It's a good way to manage your asynchronous calls:
https://davidwalsh.name/promises
If you are using promises your code would look something like this:
var promises = [];
for (var i = 0; i < vaccinsCount; i++) {
    promises.push(getVaccinAddress(i, address, provider));
    // getVaccinAddress will need to return a promise
}
Promise.all(promises).then(function(result) {
    console.log('success');
}).catch(function(err) {
    console.log(err);
});
You can call callback() when vaccines.length is equal to vaccinsCount:
for (var i = 0; i < vaccinsCount; i++) {
    (function(i) {
        getVaccinAddress(i, address, provider, function(result) {
            if (result.success) {
                console.log("result:" + result.values);
                vaccines.push(result.values);
                if (vaccines.length === vaccinsCount) {
                    // call `callback()` here
                }
            }
        });
    })(i);
}
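The counting approach can be exercised without a server. In this sketch, fakeGetAddress and loadAll are hypothetical stand-ins for getVaccinAddress and the surrounding loop:

```javascript
var vaccinsCount = 4;

// Stand-in for getVaccinAddress: an async "server call" that completes
// after a random delay, so results arrive in arbitrary order.
function fakeGetAddress(i, done) {
    setTimeout(function () {
        done({ success: true, values: 'address-' + i });
    }, Math.random() * 30);
}

// Fire all calls, and invoke `callback` once every one has reported back:
// the last result to arrive is the one that hits the length check.
function loadAll(callback) {
    var vaccines = [];
    for (var i = 0; i < vaccinsCount; i++) {
        fakeGetAddress(i, function (result) {
            vaccines.push(result.values);
            if (vaccines.length === vaccinsCount) {
                callback({ success: true, values: vaccines });
            }
        });
    }
}

loadAll(function (result) {
    console.log(result.values.length); // 4
});
```

Note that this only counts successes; if any call can fail, the counter has to be incremented in the failure path too, or the aggregate callback will never fire.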
I'm new to Deferreds and Promises.
Here is my [simplified] code, which is defined within a JavaScript object:
myFunction: function(d, cb) {
    return $.ajax('/myURL', {
        contentType: 'application/json',
        data: d,
        dataType: 'json',
        type: 'POST'
    }).then(cb, cb);
},

flush: function(myArray) {
    return myFunction(myArray, myCallback);
}
The above works fine. I can call flush(someArray), and some time later I get the result of the ajax request.
QUESTION:
I want to modify the flush function so that it first breaks the array into chunks (i.e. smaller arrays), and then calls myFunction on each of those chunks. It must then return, in one go, the aggregated data (preferably in an array) from each of the ajax calls that were made.
I am starting to modify flush() along the following lines, but I know it's not quite right. Please can someone complete/fill in the gaps for me, or suggest a re-structuring that would work well?
Thanks.
flush: function(myArray) {
    var chunk = 2;
    var i, a;
    var j = myArray.length;
    var myArrayChunks = [];
    for (i = 0; i < j; i += chunk) {
        a = myArray.slice(i, i + chunk);
        myArrayChunks.push(a);
    }
    var myDeferreds = [];
    for (i = 0; i < myArrayChunks.length; i++) {
        // Here, I need to create a deferred object that will run: myFunction(myArrayChunks[i], myCallback)
        // How do I do that?
        var f = // The deferred object that will run: myFunction(myArrayChunks[i], myCallback)
        myDeferreds.push(f);
    }
    return $.when.apply($, myDeferreds).then(function() {
        // Here, I need to get the aggregated data that is returned by each of the deferreds. How do I do that?
        console.log("FLUSH COMPLETE!");
    });
}
The async library linked below allows you to run a series of async/deferred requests, and it passes the results of each async function on to a final callback, which aggregates a collection of the results.
In particular, check out the parallel method, which will execute all of your async requests simultaneously, though there is no guarantee which order they will run in. If you are concerned about the order in which your async requests execute, check out the series and eachSeries methods.
parallel:
https://github.com/caolan/async#parallel
series / eachSeries:
https://github.com/caolan/async#seriestasks-callback
Both methods aggregate your results into a final results object, which contains all of the results passed on from each async call you make.
NOTE: to use jQuery's deferred functionality, you would need to call .resolve() in the "final" callback of the async.parallel, async.each, or async.eachSeries methods.
Here's an example of the parallel method:
async.parallel([
    function(callback) {
        // some request
        $.ajax(/*details*/, function(data) {
            callback(null, data);
        });
    },
    function(callback) {
        // some request
        $.ajax(/*details*/, function(data) {
            callback(null, data);
        });
    }
],
// "final" callback, invoked after all above functions have
// called their respective callback() functions
function(err, results) {
    if (err) {
        // handle error
    } else {
        // results contains aggregated results from all
        // async calls (2nd parameter in callback(errorParam, resultsParam))
        console.log('all async methods finished!', results);
    }
});
Here's a way to pass in an array and make an async call with each array element. NOTE that every async call within the async.each method must call callback() when the async request is resolved, or callback(err) if there is an error. If you pass an array of N elements to the async.each method, the final callback will be invoked when all N callback() methods have been invoked.
async.each(array, function(element, callback) {
    $.ajax(/* details */, {data: element}, function(data) {
        // call `callback` when you're finished up
        callback();
    });
},
// "final" callback, invoked after each async call is resolved and
// invokes the callback() function
function(err) {
    if (err) {
        // handle errors
    } else {
        console.log('All async methods flushed!');
    }
});
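To see why the error-first callback style works, here is a minimal plain-JavaScript sketch of async.parallel-style behaviour. This is an illustration under stated assumptions, not the library's actual implementation:

```javascript
// Minimal sketch of parallel aggregation: run every task at once,
// collect results in task order, and short-circuit on the first error.
function parallel(tasks, finalCallback) {
    var results = new Array(tasks.length);
    var pending = tasks.length;
    var failed = false;
    if (pending === 0) return finalCallback(null, results);
    tasks.forEach(function (task, i) {
        task(function (err, data) {
            if (failed) return;          // an earlier task already reported an error
            if (err) {
                failed = true;
                return finalCallback(err);
            }
            results[i] = data;           // slot by index, so order is preserved
            if (--pending === 0) finalCallback(null, results);
        });
    });
}

// Two fake "requests" that complete on different timers.
parallel([
    function (cb) { setTimeout(function () { cb(null, 'first'); }, 20); },
    function (cb) { setTimeout(function () { cb(null, 'second'); }, 5); }
], function (err, results) {
    console.log(err, results); // null [ 'first', 'second' ]
});
```

Even though the second task finishes before the first, the results land in task order because each task writes into its own index.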
I love this library, and once you start using it it'll change your life :]. Best of luck!
Since you already have a promise returned from your ajax function, I'd suggest you use promises instead of plain callbacks. Here's a way to do that:
myFunction: function(d) {
    return $.ajax('/myURL', {
        contentType: 'application/json',
        data: d,
        dataType: 'json',
        type: 'POST'
    });
},

flush: function(myArray, chunkSize) {
    chunkSize = chunkSize || 2;
    var index = 0;
    var results = [];
    var self = this;
    return jQuery.Deferred(function(def) {
        function next() {
            var start = index;
            var arrayChunk, promises = [];
            index += chunkSize;
            if (start < myArray.length) {
                arrayChunk = myArray.slice(start, index);
                // create chunkSize array of promises
                arrayChunk.forEach(function(item) {
                    promises.push(self.myFunction(item));
                });
                $.when.apply($, promises).then(function() {
                    // results are in arguments[0][0], arguments[1][0], etc...
                    for (var i = 0; i < arguments.length; i++) {
                        results.push(arguments[i][0]);
                    }
                    // next iteration
                    next();
                }, def.reject);
            } else {
                def.resolve(results);
            }
        }
        // start first iteration
        next();
    }).promise();
}
obj.flush(myArray).then(function(results) {
    // array of results here
}, function(jqXHR, textStatus, errorThrown) {
    // error here
});
Here's another way to do it by creating a version of $.ajax() which I call $.ajaxChunk() that takes an array of data and does the chunking for you.
// Send ajax calls in chunks from an array with no more than X in flight at the same time
// Pass in array of data where each item in dataArray is sent separately
// in an ajax call
// Pass settings.chunkSize to specify the chunk size, defaults to 2 if not present
// Returns a promise
// The resolved value of promise is an array of results
// The rejected value of the promise is whatever jQuery result failed first
$.ajaxChunk = function(dataArray, url, settings) {
    settings = settings || {};
    var chunkSize = settings.chunkSize || 2;
    var index = 0;
    var results = [];
    return jQuery.Deferred(function(def) {
        function next() {
            var start = index;
            var arrayChunk, promises = [];
            index += chunkSize;
            if (start < dataArray.length) {
                arrayChunk = dataArray.slice(start, index);
                // create chunkSize array of promises
                arrayChunk.forEach(function(item) {
                    // make unique copy of settings object for each ajax call
                    var localSettings = $.extend({}, settings);
                    localSettings.data = item;
                    promises.push($.ajax(url, localSettings));
                });
                $.when.apply($, promises).then(function() {
                    // results are in arguments[0][0], arguments[1][0], etc...
                    for (var i = 0; i < arguments.length; i++) {
                        results.push(arguments[i][0]);
                    }
                    next();
                }, def.reject);
            } else {
                def.resolve(results);
            }
        }
        // start first iteration
        next();
    }).promise();
}
And, sample usage:
$.ajaxChunk(arrayOfData, '/myURL', {
    contentType: 'application/json',
    dataType: 'json',
    type: 'POST',
    chunkSize: 2
}).then(function(results) {
    // array of results here
}, function(jqXHR, textStatus, errorThrown) {
    // error here
});
If the real requirement here is that you don't have more than X ajax calls in process at the same time, then there's a more efficient and faster (end-to-end time) way to do it than chunking. Instead, you keep track of exactly how many ajax calls are "in flight" at any time, and as soon as one finishes, you start the next one. This is a bit more efficient than chunking, where you send the whole chunk and then wait for the whole chunk to finish. I've written a jQuery helper that implements this:
$.ajaxAll = function(dataArray, url, settings, maxInFlight) {
    maxInFlight = maxInFlight || 1;
    var results = new Array(dataArray.length);
    settings = settings || {};
    var index = 0;
    var inFlight = 0;
    return jQuery.Deferred(function(def) {
        function runMore() {
            while (inFlight < maxInFlight && index < dataArray.length) {
                (function(i) {
                    var localSettings = $.extend({}, settings);
                    localSettings.data = dataArray[index++];
                    ++inFlight;
                    $.ajax(url, localSettings).then(function(data, textStatus, jqXHR) {
                        --inFlight;
                        results[i] = data;
                        runMore();
                    }, def.reject);
                })(index);
            }
            // if we are all done here
            if (inFlight === 0 && index >= dataArray.length) {
                def.resolve(results);
            }
        }
        // start first iteration
        runMore();
    }).promise();
}
Note: If you pass 1 for the maxInFlight argument, then this runs the ajax calls in series one after the other. Results are always returned in order.
And, sample usage:
$.ajaxAll(arrayOfData, '/myURL', {
    contentType: 'application/json',
    dataType: 'json',
    type: 'POST'
}, 2).then(function(results) {
    // array of results here
}, function(jqXHR, textStatus, errorThrown) {
    // error here
});
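The same max-in-flight bookkeeping also works with native promises. In this sketch, fakeRequest and allLimited are made-up names standing in for $.ajax and the $.ajaxAll helper above:

```javascript
// Stand-in for $.ajax: resolves with a result after a short random delay.
function fakeRequest(item) {
    return new Promise(function (resolve) {
        setTimeout(function () { resolve('result-' + item); }, Math.random() * 30);
    });
}

// Run fakeRequest over dataArray with at most maxInFlight pending at once;
// results come back in input order regardless of completion order.
function allLimited(dataArray, maxInFlight) {
    var results = new Array(dataArray.length);
    var index = 0;
    var inFlight = 0;
    return new Promise(function (resolve, reject) {
        function runMore() {
            // launch requests until we hit the cap or run out of data
            while (inFlight < maxInFlight && index < dataArray.length) {
                (function (i) {
                    inFlight++;
                    fakeRequest(dataArray[i]).then(function (data) {
                        inFlight--;
                        results[i] = data; // slot by index: order preserved
                        runMore();
                    }, reject);
                })(index++);
            }
            if (inFlight === 0 && index >= dataArray.length) {
                resolve(results);
            }
        }
        runMore();
    });
}

var done = allLimited(['a', 'b', 'c', 'd'], 2).then(function (results) {
    console.log(results); // ['result-a', 'result-b', 'result-c', 'result-d']
    return results;
});
```

Passing 1 for maxInFlight degenerates to strictly serial requests, exactly as noted for $.ajaxAll above.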
Thanks to all for great advice.
I used a combination of the suggested techniques in my solution.
The key thing was to make an array of promises and push onto it the required calls (each with its own array chunk passed as a parameter) to the function that makes the ajax request. One thing I hadn't previously realised is that this calls the ajaxCall() function at that very moment, and that's OK, because it returns a promise that is pushed onto the array.
After this, the 'when.apply' line does the trick of waiting until all the ajax promises are fulfilled. The arguments of the 'then' function are used to collate all the results required (obviously, the exact mechanism for that depends on the format of your returned arguments). The results are then sent to theResultsHandler(), which takes the place of the original callback in the code I first posted in my question.
Hope this is useful to other Promise-novices!
The ajax-calling function is:
ajaxCall: function(d) {
    return $.ajax('/myURL', {
        contentType: 'application/json',
        data: d,
        dataType: 'json',
        type: 'POST'
    });
},
And inside the flush() function...
var promises = [];
var i, j;
for (i = 0; i < batchChunks.length; i++) {
    promises.push(self.ajaxCall(batchChunks[i]));
}
var results = [];
return $.when.apply($, promises).then(function() {
    console.log("arguments = " + JSON.stringify(arguments));
    for (i = 0; i < arguments.length; i++) {
        for (j = 0; j < arguments[i][0].length; j++) {
            results.push(arguments[i][0][j]);
        }
    }
    return self.theResultsHandler(results);
});
I have this code:
for (var i = 0; i < $total_files; i++) {
    $.ajax({
        type: 'POST',
        url: 'uploading.php',
        context: $(this),
        dataType: 'json',
        cache: false,
        contentType: false,
        processData: false,
        data: data_string,
        success: function(datas) {
            //does something
        },
        error: function(e) {
            alert('error, try again');
        }
    });
}
It uploads the images very well, but the problem is that I can't find a way to upload them one by one. I tried setting the async option to false, but that freezes the web browser until all the images are uploaded, which is not what I want. I want to emulate the "async: false" behaviour somehow, but without freezing the web browser.
How can I do this?
You can create an array of promises so that once all promises are resolved you can run your all done code.
var promises = [];
for (var i = 0; i < $total_files; i++) {
    /* $.ajax returns a promise */
    var request = $.ajax({
        /* your ajax config */
    });
    promises.push(request);
}
$.when.apply(null, promises).done(function() {
    alert('All done');
});
For jQuery 3.x+ and modern browsers that support native Promise, Promise.all could be used this way:
var promises = [];
for (var i = 0; i < $total_files; i++) {
    // $.ajax returns a promise
    promises.push($.ajax({
        /* your ajax config */
    }));
}
Promise.all(promises)
    .then(responseList => {
        console.dir(responseList);
    });
If your files are already stored in a list then you could use map instead of a loop.
var fileList = [/* ... list of files ... */];
Promise.all(fileList.map(file => $.ajax({
    /* your ajax config */
})))
    .then(responseList => {
        console.dir(responseList);
    });
Populate an array with the config for each call, and start the next item only when the previous one is done.
You could try something like that:
window.syncUpload = {
    queue: [],
    upload: function(imagesCount) {
        var $total_files = imagesCount, data_string = "";
        /* Populate the queue array with all the ajax configs you are going to need */
        for (var i = 0; i < $total_files; i++) {
            this.queue.push({
                type: 'POST',
                url: 'uploading.php',
                context: $(this),
                dataType: 'json',
                cache: false,
                contentType: false,
                processData: false,
                data: data_string,
                success: function(datas) {
                    //does something
                },
                error: function(e) {
                    alert('error, try again');
                },
                /* When the ajax finishes it fires the complete event, so we
                   call the next image to be uploaded. */
                complete: function() {
                    this[0].uploadNext();
                }
            });
        }
        this.uploadNext();
    },
    uploadNext: function() {
        var queue = this.queue;
        /* If there's something left in the array, send it */
        if (queue.length > 0) {
            /* Create ajax call and remove item from array */
            $.ajax(queue.shift());
        }
    }
};
Just call it using
syncUpload.upload(NUMBER_OF_IMAGES);
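With native promises, the same one-at-a-time behaviour can be sketched as a reduce over the queue. Here, fakeUpload is a hypothetical stand-in for the $.ajax upload call:

```javascript
// Stand-in for the $.ajax upload of a single image.
function fakeUpload(image) {
    return new Promise(function (resolve) {
        setTimeout(function () { resolve('uploaded ' + image); }, 10);
    });
}

var images = ['a.png', 'b.png', 'c.png'];
var order = [];

// Chain the uploads so that each one starts only after the previous
// one has finished, without ever blocking the browser.
var sequential = images.reduce(function (chain, image) {
    return chain.then(function () {
        return fakeUpload(image).then(function (msg) {
            order.push(msg);
        });
    });
}, Promise.resolve());

sequential.then(function () {
    console.log(order); // ['uploaded a.png', 'uploaded b.png', 'uploaded c.png']
});
```

Because each step returns a promise into the chain, the next upload is not started until the previous one resolves, which is exactly what "async: false" was being (wrongly) used for.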
I would try jQuery.when, so you can still make asynchronous calls but defer the final action, something like:
jQuery(document).ready(function ($) {
    $.when(
        //for (var i = 0; i < $total_files; i++) {
        $.ajax({
            // ajax code
        })
        //}
    ).done(function () {
        // perform after ajax loop is done
    });
}); // ready
EDIT: the ajax iteration should be done outside $.when and pushed into an array, as proposed in charlietfl's answer. You may use an (asynchronous) ajax call and defer it inside $.when though; see the JSFIDDLE.
In one statement with jQuery:
$.when.apply(null, $.map(/* input Array|jQuery */, function (n, i) {
    return $.get(/* URL */, function (data) {
        /* Do something */
    });
})).done(function () {
    /* Called after all ajax is done */
});
Grateful for any insight into what I'm misunderstanding here. My requirement is as follows:
I have an array of URLs. I want to fire off an AJAX request for each URL simultaneously, and as soon as the first request completes, call the first callback. Then, if and when the second request completes, call that callback, and so on.
Option 1:
for (var i = 0; i < myUrlArray.length; i++) {
    $.ajax({
        url: myUrlArray[i]
    }).done(function(response) {
        // Do something with response
    });
}
Obviously this doesn't work, as there is no guarantee the responses will complete in the correct order.
Option 2:
var promises = [];
for (var i = 0; i < myUrlArray.length; i++) {
    promises.push($.ajax({
        url: myUrlArray[i]
    }));
}
$.when.apply($, promises).then(function() {
    // Do something with each response
});
This should work, but the downside is that it waits until all AJAX requests have completed, before firing any of the callbacks.
Ideally, I should be able to call the first callback as soon as it's complete, then chain the second callback to execute whenever that response is received (or immediately if it's already resolved), then the third, and so on.
The array length is completely variable and could contain any number of requests at any given time, so just hard coding the callback chain isn't an option.
My attempt:
var promises = [];
for (var i = 0; i < myUrlArray.length; i++) {
    promises.push($.ajax({
        url: myUrlArray[i] // Add each AJAX Deferred to the promises array
    }));
}
(function handleAJAX() {
    var promise;
    if (promises.length) {
        promise = promises.shift(); // Grab the first one in the stack
        promise.then(function(response) { // Set up 'done' callback
            // Do something with response
            if (promises.length) {
                handleAJAX(); // Move onto the next one
            }
        });
    }
}());
The problem is that the callbacks execute in a completely random order! For example, if I add 'home.html', 'page2.html', 'page3.html' to the array, the order of responses won't necessarily be 'home.html', 'page2.html', 'page3.html'.
I'm obviously fundamentally misunderstanding something about the way promises work. Any help gratefully appreciated!
Cheers
EDIT
OK, now I'm even more confused. I made this JSFiddle with one array using Alnitak's answer and another using JoeFletch's answer, and neither of them works as I would expect! Can anyone see what is going on here?
EDIT 2
Got it working! Based on JoeFletch's answer below, I adapted the solution as follows:
var i, responseArr = [];
for (i = 0; i < myUrlArray.length; i++) {
    responseArr.push('0'); // <-- Add 'unprocessed' flag for each pending request
    (function(ii) {
        $.ajax({
            url: myUrlArray[ii]
        }).done(function(response) {
            responseArr[ii] = response; // <-- Store response in array
        }).fail(function(xhr, status, error) {
            responseArr[ii] = 'ERROR';
        }).always(function(response) {
            for (var iii = 0; iii < responseArr.length; iii++) { // <-- Loop through entire response array from the beginning
                if (responseArr[iii] === '0') {
                    return; // As soon as we hit an 'unprocessed' request, exit loop
                } else if (responseArr[iii] !== 'done') {
                    $('#target').append(responseArr[iii]); // <-- Do actual callback DOM append stuff
                    responseArr[iii] = 'done'; // <-- Set 'complete' flag for this request
                }
            }
        });
    }(i)); // <-- pass current value of i into closure to encapsulate
}
TL;DR: I don't understand jQuery promises, got it working without them. :)
Don't forget that you don't need to register the callbacks straight away.
I think this would work; the main difference from your code is that I've used .done rather than .then and refactored a few lines.
var promises = myUrlArray.map(function(url) {
    return $.ajax({url: url});
});

(function serialize() {
    var def = promises.shift();
    if (def) {
        def.done(function() {
            callback.apply(null, arguments);
            serialize();
        });
    }
})();
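The same fire-everything-then-consume-in-order idea can be sketched with native promises. Here, fakeFetch is a made-up stand-in for $.ajax that deliberately completes in random order, yet the handler still sees responses in array order:

```javascript
// Stand-in for $.ajax: resolves with a body after a random delay,
// so completion order is unpredictable.
function fakeFetch(url) {
    return new Promise(function (resolve) {
        setTimeout(function () { resolve('body of ' + url); }, Math.random() * 30);
    });
}

var urls = ['home.html', 'page2.html', 'page3.html'];
var handled = [];

// Fire every request immediately...
var promises = urls.map(fakeFetch);

// ...then walk the promise array in order: each step waits for that
// slot's response, even if a later request finished first.
var inOrder = promises.reduce(function (chain, p) {
    return chain.then(function () {
        return p.then(function (response) {
            handled.push(response);
        });
    });
}, Promise.resolve());

inOrder.then(function () {
    console.log(handled); // bodies in the original url order
});
```

The key point is that all requests run concurrently; only the handling of their responses is serialized, which is what the question asked for.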
Here's my attempt at solving this. I updated my answer to include error handling for a failed .ajax call. I also moved some code to the complete method of the .ajax call.
var urlArr = ["url1", "url2"];
var responseArr = [];
for (var i = 0; i < urlArr.length; i++) {
    responseArr.push("0"); // 0 meaning unprocessed to the DOM
}

$.each(urlArr, function(i, url) {
    $.ajax({
        url: url,
        success: function(data) {
            responseArr[i] = data;
        },
        error: function(xhr, status, error) {
            responseArr[i] = "Failed Response"; // enter whatever you want to place here to notify the end user
        },
        complete: function() {
            $.each(responseArr, function(i, element) {
                if (responseArr[i] == "0") {
                    return false; // stop at the first unprocessed request
                } else if (responseArr[i] != "done") {
                    //do something with the response
                    responseArr[i] = "done";
                }
            });
        }
    });
});
Asynchronous requests aren't guaranteed to finish in the same order that they are sent; some may take longer than others, depending on server load and the amount of data being transferred.
The only options are to wait until they are all done, to send only one at a time, or to just deal with them possibly being called out of order.