I have an array that I'm passing to a payload that will be posted to an API. The array holds field names that the API takes individually (not as an array), so I created a for loop to iterate through the array and add the field names to the payload dynamically. But when I make the call, I get data for only the last field name. If I have, say, a total of 6 items in the array, I get data for just the last field.
function getData(payload, index, field) {
    var deferred = $q.defer();
    for (var i = 0; i < field.length; i++) {
        if (field[i]) {
            console.log("terms logged", field[i]);
            var termsData = {
                user_selection: payload,
                index_info: index,
                field: field[i]
            };
            console.log("terms data", termsData);
        }
    }
    $http({
        url: 'API',
        method: "POST",
        data: $.param(termsData),
        headers: {'Content-Type': 'application/x-www-form-urlencoded'}
    }).then(function (response) {
        var data = response.data;
        console.log("response data", data);
        deferred.resolve(data);
    });
    return deferred.promise;
}
Do I need to repeat the loop after the initial call? Since it's in a for loop, I assumed the calls would be made one after another until the condition is met.
There are a couple of errors here. First, the $http call sits outside the for loop, so it runs only once, after the loop has finished. By that point termsData has been overwritten on every iteration and holds only the last field, which is why only the last term is sent. The request needs to be made inside the loop, once per term.
What also should be fixed is that you have only one deferred object for what should be multiple calls. There should be a deferred object for each call. Below is an example.
function getData(payload, index, field) {
    for (var i = 0; i < field.length; i++) {
        if (field[i]) {
            console.log("terms logged", field[i]);
            var termsData = {
                user_selection: payload,
                index_info: index,
                field: field[i]
            };
            postTerm(termsData);
        }
    }
}
function postTerm(term) {
    var deferred = $q.defer();
    console.log("terms data", term);
    $http({
        url: 'API',
        method: "POST",
        data: $.param(term),
        headers: {
            'Content-Type': 'application/x-www-form-urlencoded'
        }
    }).then(function(response) {
        var data = response.data;
        console.log("response data", data);
        deferred.resolve(data);
    });
    return deferred.promise;
}
I believe what you are looking for is a way to chain promises within a loop. This can be achieved by storing the promises inside an array, like this:
var promises = [];
for(...) {
    var promise = $http(...); // $http calls return a promise
    promises.push(promise);

    // Or, if you prefer to use $q
    var deferred = $q.defer();
    $http(...).success(function() {
        deferred.resolve();
    });
    promises.push(deferred.promise);
}
$q.all(promises).then(function() {
    // This will be executed when all promises inside the array have been resolved
});
That said, I do not recommend making this many requests. If possible, change your backend so it can receive an array of objects instead.
Here is some documentation about $q: https://docs.angularjs.org/api/ng/service/$q
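For illustration, here is a sketch of what the batched alternative could look like, assuming the backend were changed to accept an array. The '/api/terms/batch' endpoint and the buildBatchPayload helper are hypothetical names, not part of the original code:

```javascript
// Hypothetical sketch: build one payload carrying every field,
// so a single POST replaces N separate requests.
function buildBatchPayload(payload, index, fields) {
    return {
        items: fields.map(function (f) {
            return { user_selection: payload, index_info: index, field: f };
        })
    };
}

// One call instead of a loop of calls (endpoint name is made up):
// $http({ url: '/api/terms/batch', method: 'POST',
//         data: buildBatchPayload(payload, index, fields) });
```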
I'm not familiar enough with AJAX calls to know how this does or doesn't work.
If I have an array A and I'm using A.filter() on it, how will AJAX calls within the filter callback work? The array is being used to populate a template, all synchronously I believe.
// Event calls function to filter list on page.
// Function calls filterArray(arrayList, objFilters)
async_fetch: function(string)
{
    // Utilizes $.ajax() to retrieve a JSON array
    var deferred = $.Deferred();
    $.ajax({
        url: ...,
        dataType: "json",
        success: function(data) {
            deferred.resolve(data);
        },
        error: function(data)
        {
            //...
            deferred.reject(data);
        }
    });
    return deferred.promise();
};
filterArray: function(list, filters)
{
    var filteredList = list.filter(function(item) {
        for(var key in filters) {
            // Actions for each of multiple filters to compare against...
            if(key == 'FILTER_X') {
                var data = async_fetch(item.name);
                // Use data to arrive at a determination, where true means
                // I don't want to include this item in the filtered list
                if(determination)
                    return false;
            }
        }
    });
    return filteredList;
};
// Results of filterArray() are passed along to a template within Backbone
// to redraw a segment of HTML on the page.
Will the call to filter() just wait synchronously for the AJAX call to finish? Or will the list get filtered and returned anyway, so that the AJAX callback has to hook into the filtered list and essentially finish the filtering later?
Should I just build a version of async_fetch() that isn't async?
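For what it's worth, filter() never waits: its callback must return a boolean synchronously, and a Promise returned from the callback is always truthy. A minimal demonstration:

```javascript
// filter() sees only the Promise the callback returns, and a
// Promise is always truthy, so every item passes the filter and
// nothing waits for the async work to finish.
var items = [1, 2, 3];
var kept = items.filter(function (n) {
    return Promise.resolve(n > 2); // stands in for an async check
});
console.log(kept.length); // 3, not 1: all items were kept
```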
You will need to .then() or .done() the call, e.g.
....
async_fetch(item.name).then(function(data) {
    if (data.determination)
        // do something
});
....
You can resolve the promise after filtering the data, as in the example below. Hope this helps.
$(document).ready(function() {
    function async_fetch(string)
    {
        // Utilizes $.ajax() to retrieve a JSON array
        var deferred = $.Deferred();
        $.ajax({
            url: string, // your URL
            dataType: "json",
            success: function(data) {
                var filteredData = filterArray(data);
                deferred.resolve(filteredData);
            },
            error: function(data)
            {
                //...
                deferred.reject(data);
            }
        });
        return deferred;
    };
    function filterArray(data)
    {
        var filteredList = data.filter(function(item) {
            // filter whatever you want
        });
        return filteredList;
    }
    async_fetch(url).then(function(response) {
        // now you will get the filtered data
        console.log(response);
    });
});
You can do this with the help of async/await as below:
const filterArray = async function (list, filters) {
    // filter() cannot wait on an async callback, so first resolve
    // all the determinations in parallel, then filter synchronously
    const decisions = await Promise.all(list.map(async function (item) {
        for (var key in filters) {
            // Actions for each of multiple filters to compare against...
            if (key == 'FILTER_X') {
                return arriveAtDetermination(item.name);
            }
            else {
                // other filters
            }
        }
        return true;
    }));
    return list.filter((item, i) => decisions[i]);
};
async function arriveAtDetermination(name) {
    let data = await async_fetch(name);
    return data.determination ? true : false; // your logic
}
// Now you can filter like
filterArray(list, filters).then(result => {
    console.log(result);
}).catch(err => {
    console.log(err);
});
I have the following Ajax call:
for (var i = 0; i < localStorage.length; i++) {
    stored = localStorage.getItem(i.toString());
    id = stored.split(':');
    if (id[0].toString() == "legislator") {
        bioguide_id = id[1];
        $http({
            method: 'GET',
            url: 'getData.php',
            params: {fav_legislator_value: bioguide_id}
        }).then(function (response) {
            fav_legislators.push(response.data.results[0]);
        });
    }
}
I need all the Ajax calls to complete, with all results pushed to the array, before I can start processing again. I have tried many ways of making this a function but nothing works. My whole script gets executed and only then are the values pushed to the array, which is of no use to me. How do I wait until all the calls are completed and the array is completely populated?
Store all the promise objects in an array, and use $q.all to combine them into a single promise which will be resolved only when all of them have been resolved.
E.g.
var promises = [];
for (var i = 0; i < localStorage.length; i++) {
    stored = localStorage.getItem(i.toString());
    id = stored.split(':');
    if (id[0].toString() == "legislator") {
        bioguide_id = id[1];
        var req = $http({
            method: 'GET',
            url: 'getData.php',
            params: {fav_legislator_value: bioguide_id}
        }).then(function (response) {
            fav_legislators.push(response.data.results[0]);
        });
        promises.push(req);
    }
}
$q.all(promises).then(function () {
    // All results are now in `fav_legislators`.
});
I am new to Deferreds and Promises.
Here is my [simplified] code, which is defined within a JavaScript object:
myFunction: function(d, cb)
{
    return $.ajax('/myURL', {
        contentType: 'application/json',
        data: d,
        dataType: 'json',
        type: 'POST'
    }).then(cb, cb);
},
flush: function(myArray)
{
    return myFunction(myArray, myCallback);
}
The above works fine. I can call flush(someArray), and some time later I get the result of the ajax request.
QUESTION:
I want to modify the flush function so that it first breaks the array into chunks (i.e. smaller arrays), and then calls myFunction on each of those chunks. It must then return, in one go, the aggregated data (preferably in an array) from each of the ajax calls that were made.
I am starting to modify flush() along the following lines, but I know it's not quite right. Please can someone complete/fill in the gaps for me, or suggest a re-structuring that would work well?
Thanks.
flush: function(myArray)
{
    var chunk = 2;
    var i, a;
    var j = myArray.length;
    var myArrayChunks = [];
    for (i = 0; i < j; i += chunk)
    {
        a = myArray.slice(i, i + chunk);
        myArrayChunks.push(a);
    }
    var myDeferreds = [];
    for (i = 0; i < myArrayChunks.length; i++)
    {
        // Here, I need to create a deferred object that will run: myFunction(myArrayChunks[i], myCallback)
        // How do I do that?
        var f = // The deferred object that will run: myFunction(myArrayChunks[i], myCallback)
        myDeferreds.push(f);
    }
    return $.when.apply($, myDeferreds).then(function() {
        // Here, I need to get the aggregated data that is returned by each of the deferreds. How do I do that?
        console.log("FLUSH COMPLETE!");
    });
}
The async library I've pasted below allows you to run a series of async/deferred requests, and passes on the results of each async function to a final callback, which aggregates a collection of the results.
In particular, check out the parallel method, which will execute all of your async requests simultaneously, though there is no guarantee which order they will complete in. If you are concerned about the order in which your async requests execute, check out the series and eachSeries methods.
parallel:
https://github.com/caolan/async#parallel
series / eachSeries:
https://github.com/caolan/async#seriestasks-callback
Both methods aggregate your results into a final results object, which contains all of the results passed on from each async call you make.
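As a rough illustration of the aggregation described above (a sketch of the idea, not the library's actual source), async.parallel's behavior can be approximated in plain JavaScript:

```javascript
// Sketch of async.parallel-style aggregation: run every task,
// collect each result by its index, and invoke the final callback
// once all tasks have reported back (or immediately on first error).
function parallel(tasks, finalCallback) {
    var results = new Array(tasks.length);
    var remaining = tasks.length;
    var failed = false;
    tasks.forEach(function (task, i) {
        task(function (err, result) {
            if (failed) return;
            if (err) { failed = true; return finalCallback(err); }
            results[i] = result;
            if (--remaining === 0) finalCallback(null, results);
        });
    });
}
```

Note that results land in their task's slot regardless of completion order, which is why the final results array is ordered even though the tasks run concurrently.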
NOTE: to use jQuery's deferred functionality, you would need to call .resolve() in the "final" callback of the async.parallel, async.each, or async.eachSeries methods.
Here's an example of the parallel method:
async.parallel([
    function(callback) {
        // some request
        $.ajax({ /* details */ }).done(function(data) {
            callback(null, data);
        });
    },
    function(callback) {
        // some request
        $.ajax({ /* details */ }).done(function(data) {
            callback(null, data);
        });
    }
],
// "final" callback, invoked after all above functions have
// called their respective callback() functions
function(err, results) {
    if (err) {
        // handle error
    } else {
        // results contains aggregated results from all
        // async calls (2nd parameter in callback(errorParam, resultsParam))
        console.log('all async methods finished!', results);
    }
});
Here's a way to pass in an array and make async methods with each array element. NOTE that every async call within the async.each method must call callback() when the async request is resolved, or callback(err) in your async error method if there an error. If you pass in an array of N elements to the async.each method, the final callback will be invoked when all N async resolve callback() methods have been invoked.
async.each(array, function(element, callback) {
    $.ajax({ data: element /* plus other details */ }).done(function(data) {
        // call `callback` when you're finished up
        callback();
    });
},
// "final" callback, invoked after each async call is resolved and
// invokes the callback() function
function(err) {
    if (err) {
        // handle errors
    } else {
        console.log('All async methods flushed!');
    }
});
I love this library, and once you start using it it'll change your life :]. Best of luck!
Since you already have a promise returned from your ajax function, I'd suggest you use promises instead of plain callbacks. Here's a way to do that:
myFunction: function(d) {
    return $.ajax('/myURL', {
        contentType: 'application/json',
        data: d,
        dataType: 'json',
        type: 'POST'
    });
},
flush: function(myArray, chunkSize) {
    chunkSize = chunkSize || 2;
    var index = 0;
    var results = [];
    var self = this;
    return jQuery.Deferred(function(def) {
        function next() {
            var start = index;
            var arrayChunk, promises = [];
            index += chunkSize;
            if (start < myArray.length) {
                arrayChunk = myArray.slice(start, start + chunkSize);
                // create chunkSize array of promises
                arrayChunk.forEach(function(item) {
                    promises.push(self.myFunction(item));
                });
                $.when.apply($, promises).then(function() {
                    // results are in arguments[0][0], arguments[1][0], etc...
                    for (var i = 0; i < arguments.length; i++) {
                        results.push(arguments[i][0]);
                    }
                    // next iteration
                    next();
                }, def.reject);
            } else {
                def.resolve(results);
            }
        }
        // start first iteration
        next();
    }).promise();
}
obj.flush(myArray).then(function(results) {
    // array of results here
}, function(jqXHR, textStatus, errorThrown) {
    // error here
});
Here's another way to do it by creating a version of $.ajax() which I call $.ajaxChunk() that takes an array of data and does the chunking for you.
// Send ajax calls in chunks from an array with no more than X in flight at the same time
// Pass in array of data where each item in dataArray is sent separately
// in an ajax call
// Pass settings.chunkSize to specify the chunk size, defaults to 2 if not present
// Returns a promise
// The resolved value of promise is an array of results
// The rejected value of the promise is whatever jQuery result failed first
$.ajaxChunk = function(dataArray, url, settings) {
    settings = settings || {};
    var chunkSize = settings.chunkSize || 2;
    var index = 0;
    var results = [];
    return jQuery.Deferred(function(def) {
        function next() {
            var start = index;
            var arrayChunk, promises = [];
            index += chunkSize;
            if (start < dataArray.length) {
                arrayChunk = dataArray.slice(start, start + chunkSize);
                // create chunkSize array of promises
                arrayChunk.forEach(function(item) {
                    // make unique copy of settings object for each ajax call
                    var localSettings = $.extend({}, settings);
                    localSettings.data = item;
                    promises.push($.ajax(url, localSettings));
                });
                $.when.apply($, promises).then(function() {
                    // results are in arguments[0][0], arguments[1][0], etc...
                    for (var i = 0; i < arguments.length; i++) {
                        results.push(arguments[i][0]);
                    }
                    next();
                }, def.reject);
            } else {
                def.resolve(results);
            }
        }
        // start first iteration
        next();
    }).promise();
}
And, sample usage:
$.ajaxChunk(arrayOfData, '/myURL', {
    contentType: 'application/json',
    dataType: 'json',
    type: 'POST',
    chunkSize: 2
}).then(function(results) {
    // array of results here
}, function(jqXHR, textStatus, errorThrown) {
    // error here
});
If the real requirement here is that you don't have more than X ajax calls in flight at the same time, then there's a more efficient and faster (end-to-end time) way to do it than chunking. Instead, you keep track of exactly how many ajax calls are "in flight" at any time, and as soon as one finishes, you start the next one. This is a bit more efficient than chunking, where you send the whole chunk and then wait for the whole chunk to finish. I've written a jQuery helper that implements this:
$.ajaxAll = function(dataArray, url, settings, maxInFlight) {
    maxInFlight = maxInFlight || 1;
    var results = new Array(dataArray.length);
    settings = settings || {};
    var index = 0;
    var inFlight = 0;
    return jQuery.Deferred(function(def) {
        function runMore() {
            while (inFlight < maxInFlight && index < dataArray.length) {
                (function(i) {
                    var localSettings = $.extend({}, settings);
                    localSettings.data = dataArray[index++];
                    ++inFlight;
                    $.ajax(url, localSettings).then(function(data, textStatus, jqXHR) {
                        --inFlight;
                        results[i] = data;
                        runMore();
                    }, def.reject);
                })(index);
            }
            // if we are all done here
            if (inFlight === 0 && index >= dataArray.length) {
                def.resolve(results);
            }
        }
        // start first iteration
        runMore();
    }).promise();
}
Note: If you pass 1 for the maxInFlight argument, then this runs the ajax calls in series one after the other. Results are always returned in order.
And, sample usage:
$.ajaxAll(arrayOfData, '/myURL', {
    contentType: 'application/json',
    dataType: 'json',
    type: 'POST'
}, 2).then(function(results) {
    // array of results here
}, function(jqXHR, textStatus, errorThrown) {
    // error here
});
Thanks to all for great advice.
I used a combination of the suggested techniques in my solution.
The key thing was to make an array of promises, and to push onto it the required calls (each with its own array chunk passed as a parameter) to the function that makes the ajax request. One thing I hadn't previously realised is that this calls the ajaxCall() function at that very moment. That's OK, because it returns a promise that is pushed onto the array.
After this, the 'when.apply' line does the trick of waiting until all the ajax promises are fulfilled. The arguments of the 'then' function are used to collate all the results (obviously, the exact mechanism depends on the format of your returned arguments). The results are then sent to theResultsHandler(), which takes the place of the original callback in the code I first posted in my question.
Hope this is useful to other Promise-novices!
The ajax-calling function is:
ajaxCall: function(d) {
    return $.ajax('/myURL', {
        contentType: 'application/json',
        data: d,
        dataType: 'json',
        type: 'POST'
    });
},
And inside the flush() function...
var promises = [];
var i, j;
for (i = 0; i < batchChunks.length; i++)
{
    promises.push(self.ajaxCall(batchChunks[i]));
}
var results = [];
return $.when.apply($, promises).then(function() {
    console.log("arguments = " + JSON.stringify(arguments));
    for (i = 0; i < arguments.length; i++)
    {
        for (j = 0; j < arguments[i][0].length; j++)
        {
            results.push(arguments[i][0][j]);
        }
    }
    return self.theResultsHandler(results);
});
I have this API call where I make sure the data returns in the same order I send it. However, I realized that's not really what I want; I want to make sure the data is sent and handled one item at a time:
data[n] has returned before data[n+1] is sent.
The reason for this:
If I do it as seen below, the server still receives the requests in a random order, and therefore saves the data in my DB in a random order (or, well, not random: heavier data gets processed more slowly).
var promiseArray = [];
for (var i = 0; i < data.length; i++) {
    var dataPromise = $http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
        .then(function (response) {
            // return data for chaining
            return response.data;
        });
    promiseArray.push(dataPromise);
}
$q.all(promiseArray).then(function (dataArray) {
    // success
}).catch(function (errorResponse) {
    // error
});
How can I make sure the data is sent, processed, and returned one item at a time, in a smooth way?
You could do something like this:
var i = -1;
processNextdata();

function processNextdata() {
    i++;
    if (angular.isUndefined(data[i]))
        return;
    $http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
        .then(processNextdata);
}
Update:
Callback after every result:
var i = -1;
processNextdata();

function processNextdata() {
    i++;
    if (angular.isUndefined(data[i]))
        return;
    $http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
        .then(function(result) {
            // do something with a single result
            return processNextdata();
        }, errorCallback);
}
Callback after everything is done:
var i = -1, resultData = [];
processNextdata()
    .then(function(result) {
        console.log(result);
    }, errorCallback);

function processNextdata() {
    i++;
    if (angular.isUndefined(data[i]))
        return $q.when(resultData);
    return $http.post('/api/bla/blabla', $httpParamSerializer(data[i]))
        .then(function(result) {
            resultData.push(result.data);
            return processNextdata();
        }, $q.reject);
}
When using the Promise.all([...]) method, the documentation shows the following:
The Promise.all(iterable) method returns a promise that resolves when all of the promises in the iterable argument have resolved, or rejects with the reason of the first passed promise that rejects.
What this tells us is that there is no guaranteed order of operations: the promises run in parallel to one another and can complete in any order.
In your case, there is an expected order that you want your promises to run in, so using Promise.all([...]) won't satisfy your requirements.
What you can do instead is execute individual promises, then if you have some that can run in parallel use the Promise.all([...]) method.
I would create a method that takes a request as an argument, then returns the generated promise (named makeRequest here so it doesn't shadow the request library it calls):
function makeRequest (req) {
    return new Promise(function (resolve, reject) {
        request({
            url: url
            , port: <port>
            , body: req
            , json: <true/false>
            , method: '<POST/GET>'
            , headers: {
            }
        }, function (error, response, body) {
            if (error) {
                reject(error);
            } else {
                resolve(body);
            }
        });
    });
}
You can then call this function and store the resulting promise:
var response = makeRequest(myRequest);
Alternatively, you could create an array of your requests and then call the function:
var requests = [request1, request2, ..., requestN];
var responses = [];
for (var i = 0; i < requests.length; i++) {
    responses.push(makeRequest(requests[i]));
}
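To run such requests strictly one after another, the promises can be chained with reduce. This is a sketch, assuming some function (called makeRequest here for illustration) that returns a Promise for each request:

```javascript
// Run requests strictly in series: each request starts only after
// the previous one has resolved, and the responses are collected
// in order. `makeRequest` is any function returning a Promise.
function runInSeries(inputs, makeRequest) {
    return inputs.reduce(function (chain, input) {
        return chain.then(function (responses) {
            return makeRequest(input).then(function (body) {
                responses.push(body);
                return responses;
            });
        });
    }, Promise.resolve([]));
}
```

Groups of requests that may run concurrently could each be handled with Promise.all inside one step of the chain, combining the serial and parallel approaches.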
I have an array that can hold an unknown number of indexes in it. Each index is used to send data with an ajax call. I am looping through with a for loop, gathering the data from each successful call and pushing it into an empty array. At the end of the unknown number of calls I need to use that newly gathered array in my view. The code that uses newDataArray executes at the bottom before the loop's calls are done, so the array is still empty. How do I wait for all the calls to finish before doing what is at the bottom?
If it helps, I am doing this in React with the Flux pattern, but the same issue could occur outside React. Here is a mock sample of what I am trying to do:
JS
case 'execute-calls':
    // This is the new array to push to
    var newDataArray = [];
    // Url to call
    var url = 'http://dev.markitondemand.com/Api/v2/Quote/jsonp';
    for (let i = 0; i < payload.data.length; i++) {
        // given array of data that needs to be sent with call
        let symb = { symbol: payload.data[i] };
        $.ajax({
            data: symb,
            url: url,
            dataType: "jsonp",
        })
        .done(function(data) {
            let updatedData = {
                //...data that is stored from response
            };
            newDataArray.push(updatedData);
        })
        .fail(function(error) {
            //console.log(error);
        });
    }
    // This will be updating the state object which is above the switch cases
    // However this runs before the calls finish, so newDataArray is empty
    var updateTicker = {
        updatedTicker: true,
        updatedTickerSymbols: newDataArray
    };
    assign(stockData, updateTicker);
    getStockData.emitChange();
    break;
You can make use of the fact that $.ajax() actually returns a deferred object, and use it to create an array of deferreds. e.g.
var symbols = [1, 2, 3, 4];
var deferreds = symbols.map(function (symbol) {
    return $.ajax({
        url: 'http://dev.markitondemand.com/MODApis/Api/v2/Quote/jsonp',
        data: { symbol: symbol },
        dataType: 'jsonp'
    });
});
You can resolve multiple deferreds at once with $.when(). There is a complication however, $.when() expects a list of parameters rather than array. We can solve this by using Function#apply.
To add to the complication, the callback function is also called with a list of arguments. Since we don't know how many arguments there are, we'll use the arguments pseudo-array. And since arguments isn't an actual array, we'll loop through it by using Function#call on Array#prototype.
$.when.apply($, deferreds).done(function () {
    Array.prototype.forEach.call(arguments, function (response) {
        console.log(response[0].Message);
    });
}).fail(function (jqXHR, textStatus, error) {
    console.error(error);
});
[UPDATED to include fail() call]
If you're using ES6 this is much more elegant:
$.when(...deferreds).done((...responses) => {
    responses.forEach((response) => {
        console.log(response[0].Message);
    });
});
Whenever you are dealing with ajax calls and have to do some operations at the end of all the async calls, the better choice would be to use callback functions.
Modifying your code to use a callback:
function AsyncLoopHandler(index) {
    if (index >= payload.data.length) {
        // all the indexes have finished their ajax calls; do your next step here
        var updateTicker = {
            updatedTicker: true,
            updatedTickerSymbols: newDataArray
        };
        assign(stockData, updateTicker);
        getStockData.emitChange();
    }
    else {
        // given array of data that needs to be sent with call
        let symb = { symbol: payload.data[index] };
        $.ajax({
            data: symb,
            url: url,
            dataType: "jsonp",
        })
        .done(function (data) {
            let updatedData = {
                //...data that is stored from response
            };
            newDataArray.push(updatedData);
            AsyncLoopHandler(index + 1); // call the function again with the next index
        })
        .fail(function (error) {
            //console.log(error);
        });
    }
}
Now start this recursive function by passing it index 0.
AsyncLoopHandler(0);
So all the ajax calls will be executed one after the other, as if they were synchronous requests, and the if check will detect when all the indexes are complete and then run your logic. Let me know if this helps.
I suggest using promises; the logic would look like this:
var urls = [x, x, x, x];
var results = [];
var qs = $.map(urls, function (url) {
    var deferred = Q.defer();
    $.ajax({
        url: url,
        success: function () {
            results.push(url);
            deferred.resolve();
        },
        error: function () {
            deferred.resolve();
        }
    });
    return deferred.promise;
});
Q.all(qs).then(function () {
    console.log(results);
});
Or use yield and co in the newer standard.
https://github.com/kriskowal/q
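In modern JavaScript the same sequential pattern reads more directly with async/await. This sketch assumes a fetchUrl function (a hypothetical name) that returns a Promise, e.g. wrapping $.ajax or fetch:

```javascript
// Sequential requests with async/await: each `await` pauses the
// loop until the current request settles, so requests never
// overlap and results are collected in order.
async function fetchAllInOrder(urls, fetchUrl) {
    const results = [];
    for (const url of urls) {
        results.push(await fetchUrl(url));
    }
    return results;
}
```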