Is it possible/can you loop ajax GET request each time? - javascript

I've looked everywhere, and I'm not quite sure whether it's possible, or how, to loop an ajax request, cycling through the values in an array.
It would need to make an ajax request with one of the data values (serial) as array[0], finish that request, then make the next request with array[1], and so on.
My code:
$.ajax({
    url: 'example.com',
    type: 'GET',
    data: {
        message: message,
        user: user,
        serial: i
    },
    success: function(response) {
        alert("Sent");
    },
    error: function(XMLHttpRequest, textStatus, errorThrown) {
        alert("Fail");
    }
});
So this will work for one defined serial, but how would it work when serial (the variable 'i') is an array, holding many serials?
Also, it shouldn't send the whole array; it needs to cycle through, sending one value at a time.
Any help is most appreciated at this point.

Create a recursive function that makes an ajax call. When the ajax call ends, the function calls itself (recursion), passing in an updated index into the array for the next ajax call.
/**
 * Recursive function that makes an ajax call for the {index} element of {array}
 * @param {Array} array the array to loop through
 * @param {Number} index current index
 */
function Caller(array, index) {
    if (array.length <= index) { return; }
    $.ajax({
        url: 'example.com',
        type: 'GET',
        data: {
            message: message,
            user: user,
            serial: array[index]
        },
        success: function(response) {
            alert("Sent");
        },
        error: function(XMLHttpRequest, textStatus, errorThrown) {
            alert("Fail");
        },
        complete: function() {
            Caller(array, index + 1);
        }
    });
}
The recursive function calls itself in the complete callback (which is triggered after the call completes, whether it was a success or an error).
By doing it this way you work through the array, sending each ajax request only after the previous one has finished.
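The pattern generalizes beyond $.ajax. Here is a minimal, self-contained sketch of the same idea, using a hypothetical stand-in `send` function in place of the real request, to show that the index only advances after each call completes:

```javascript
// Stand-in for the real ajax call: invokes its callback asynchronously,
// the way $.ajax's complete callback would.
function send(serial, done) {
    setTimeout(function () { done(serial); }, 0);
}

// Recursive driver: issues one request at a time, advancing the
// index only after the previous call has completed.
function caller(array, index, results, onAllDone) {
    if (index >= array.length) { return onAllDone(results); }
    send(array[index], function (result) {
        results.push(result);
        caller(array, index + 1, results, onAllDone);
    });
}

caller(['A1', 'B2', 'C3'], 0, [], function (results) {
    console.log(results); // serials handled one at a time, in order
});
```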

Try to use forEach():
array.forEach(element => {
    $.ajax({
        ...
        data: { ..., serial: element, ... },
        ...
    });
});
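One caveat worth noting: forEach fires all the requests at once rather than one after another. If the calls must run strictly in sequence, a reduce-built promise chain is a common alternative; the sketch below uses a hypothetical stand-in `sendOne` in place of the real $.ajax call:

```javascript
// Stand-in for one $.ajax call; resolves asynchronously and records
// the order in which calls complete.
function sendOne(serial, log) {
    return new Promise(function (resolve) {
        setTimeout(function () { log.push(serial); resolve(); }, 0);
    });
}

// Chain one promise per element: each request starts only after
// the previous one has resolved.
function sendAllInOrder(serials) {
    var log = [];
    return serials.reduce(function (chain, serial) {
        return chain.then(function () { return sendOne(serial, log); });
    }, Promise.resolve()).then(function () { return log; });
}

sendAllInOrder(['A1', 'B2', 'C3']).then(function (log) {
    console.log(log); // completed strictly in order
});
```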

It's 2018, so there are multiple nice ways of doing this.
You can use Promises ($.ajax actually returns one) and async/await to perform XHR requests serially.
You can keep your callback-style code and use a small utility function to abstract the async iteration in a nice readable way that you can reuse over and over.
I'll cover both cases.
Async Iteration with Promises and async/await
Since jQuery 1.5, $.ajax returns a Promise. So if you're using a modern browser you can just await it.
This is by far the most elegant and terse way, since the code looks like synchronous code and is therefore far more readable. Be aware that while the code looks synchronous, it is in fact non-blocking.
const getPosts = async (pages) => {
    const posts = []
    for (const page of pages) {
        const post = await $.ajax({
            url: 'https://jsonplaceholder.typicode.com/posts/' + page
        })
        posts.push(post)
    }
    return posts
}
getPosts([1, 2, 3, 4, 5]).then(posts => {
    console.log(posts)
}).catch(err => {
    console.error(err)
})
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
Async Iteration with callbacks
This is the "traditional" way of doing asynchronous operations, using callbacks. This style doesn't require a modern browser at all, since at its base level it just passes functions around to achieve non-blocking behaviour.
However, this type of code is far harder to work with; you need to rely on utility functions that wrap common operations (looping, mapping etc.) to work with such code effectively.
I've written a small utility function that lets you:
Iterate over the elements of an Array.
Specify a callback function that gets called for each iteration. The currently iterated Array element is passed to it, together with a next argument that you need to call in order to proceed to the next iteration. Calling next pushes the result into a final result Array and proceeds to the next iteration.
Specify a final callback function that gets called when all the iterations have finished.
If I'm not mistaken, this is identical in operation to the async.mapSeries method of the popular async module.
In the following example, I'm passing an Array of posts to fetch from a REST API.
When all the iterations are complete, the posts argument in the final callback contains an Array with 5 posts.
It takes advantage of error-first callbacks, a common pattern to gracefully propagate errors up the callback chain if something goes awry in your async operations.
// async.mapSeries utility function
const async = {
    mapSeries: function(arr, onIteration, onDone, { i = 0, acc = [] } = {}) {
        arr.length
            ? onIteration(arr[i], (err, result) => {
                if (err) return onDone(err)
                acc.push(result)
                acc.length < arr.length
                    ? this.mapSeries(arr, onIteration, onDone, { i: ++i, acc })
                    : onDone(null, acc)
            })
            : onDone(null, arr)
    }
}

// Usage
async.mapSeries([1, 2, 3, 4, 5], (page, next) => {
    $.ajax({
        url: 'https://jsonplaceholder.typicode.com/posts/' + page,
        success: response => {
            next(null, response)
        },
        error: (XMLHttpRequest, textStatus, err) => {
            next(err)
        }
    })
}, (err, posts) => {
    if (err) return console.error('Error:', err)
    console.log(posts)
})
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>

Related

Javascript - What is happening to the callbacks in async parallel in this example?

I'm trying to teach myself JavaScript by working through Mozilla's Express tutorial, and I came across a piece of code that confuses me.
Each function in the object that is being passed as the first argument to async.parallel is being passed a callback argument. I'm learning about callbacks and how they work. Normally when I see a callback, it's invoked later on inside the function it's passed into, something like callback() or callback(null, result), but I don't see that here. Any idea why that's the case?
Just as a heads up, the count method (from the Mongoose api) accepts two arguments, the second one being a callback.
exports.index = function(req, res) {
    async.parallel({
        book_count: function(callback) {
            Book.count(callback);
        },
        book_instance_count: function(callback) {
            BookInstance.count(callback);
        },
        book_instance_available_count: function(callback) {
            BookInstance.count({status: 'Available'}, callback);
        },
        author_count: function(callback) {
            Author.count(callback);
        },
        genre_count: function(callback) {
            Genre.count(callback);
        },
    }, function(err, results) {
        res.render('index', { title: 'Local Library Home', error: err, data: results });
    });
};
In the docs, you can read up on async.parallel.
The reason you don't see callback(null, result) is because the callback is being passed directly to mongoose. It is important to note that it is mongoose that invokes the callback function, not your code.
For example:
book_count: function(callback) {
    Book.count(callback);
},
is the same as writing:
book_count: function(callback) {
    Book.count(function(error, result) {
        callback(error, result);
    });
},
As you can see, the second example only adds a "wrapper" function, which is not really needed. It is much more readable to just pass the callback along to mongoose (which follows the same convention of accepting error as the first argument and result as the second).
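To see why passing the callback straight through works, here is a stripped-down sketch of what async.parallel does internally (a simplified, hypothetical implementation for illustration, not the real async source): it manufactures the callback itself and counts results until every task has reported back.

```javascript
// Simplified sketch of async.parallel (not the real implementation).
function parallel(tasks, finalCallback) {
    var results = {};
    var pending = Object.keys(tasks).length;
    var failed = false;
    Object.keys(tasks).forEach(function (name) {
        // `parallel` creates the callback and hands it to each task;
        // whoever the task forwards it to (e.g. mongoose) invokes it.
        tasks[name](function (err, result) {
            if (failed) return;
            if (err) { failed = true; return finalCallback(err); }
            results[name] = result;
            if (--pending === 0) finalCallback(null, results);
        });
    });
}

// Usage: each task just forwards the callback, exactly like Book.count(callback).
parallel({
    a: function (callback) { setTimeout(function () { callback(null, 1); }, 0); },
    b: function (callback) { setTimeout(function () { callback(null, 2); }, 0); }
}, function (err, results) {
    console.log(err, results);
});
```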

nodejs recursively call same api and write to excel file sequentially

I need to call an API recursively using request-promise; after getting each result from the API I need to write it to an Excel file. A sample API response is given below:
{
    "totalRecords": 9524,
    "size": 20,
    "currentPage": 1,
    "totalPages": 477,
    "result": [
        { "name": "john doe", "dob": "1999-11-11" },
        { "name": "john1 doe1", "dob": "1989-12-12" }
    ]
}
Now I want to call this API n times, where n equals totalPages. After each call I want to write the response result to the Excel file:
first write page 1's result, then append page 2's result, and so on.
I have written some sample code given below
function callAPI(pageNo) {
    var options = {
        url: "http://example.com/getData?pageNo=" + pageNo,
        method: 'GET',
        headers: {
            'Content-Type': 'application/json'
        },
        json: true
    };
    return request(options);
}
callAPI(1).then(function (res) {
    // Write res.result to excel file
}).catch(function (err) {
    // Handle error here
})
But I'm having trouble calling the API recursively while keeping the writes sequential: write page 1's result to the Excel file first, then append page 2's result, and so on.
Any code sample showing how to achieve this in Node.js?
You want to do something like this:
function getAllPages() {
    function getNextPage(pageNo) {
        return callAPI(pageNo).then(response => {
            let needNextPage = true;
            if (pageNo === 1) {
                // write to file
            } else {
                // append to file
            }
            if (needNextPage) {
                return getNextPage(pageNo + 1);
            } else {
                return undefined;
            }
        });
    }
    return getNextPage(1);
}
Obviously, change that needNextPage to false to stop the recursion when you're done.
So you want to do 477 requests in sequence? How long do you want to wait for this to finish? Even in parallel, this would still take too long for me.
Best: write an API that can return a batch of pages at once, reducing the number of requests to the backend. Maybe something like http://example.com/getData?pages=1-100, letting it return an Array; maybe like:
[
    {
        "totalRecords": 9524,
        "currentPage": 1,
        "totalPages": 477,
        "result": [...]
    },
    {
        "totalRecords": 9524,
        "currentPage": 2,
        "totalPages": 477,
        "result": [...]
    },
    ...
]
or more compact
{
    "totalRecords": 9524,
    "totalPages": 477,
    "pages": [
        { "currentPage": 1, "result": [...] },
        { "currentPage": 2, "result": [...] },
        ...
    ]
}
Sidenote: writing the size of the result array into the JSON is unnecessary. This value can easily be determined from data.result.length.
But back to your question
Imo, all you need to run in sequence is adding the pages to the sheet. The requests themselves can be done in parallel, which already saves you a lot of overall runtime for the whole task.
callApi(1).then(firstPage => {
    let {currentPage, totalPages} = firstPage;
    // `previous` ensures that the promises resolve in sequence,
    // even if some later request finishes sooner than earlier ones.
    let previous = Promise.resolve(firstPage).then(writePageToExcel);
    while (++currentPage <= totalPages) {
        // make the next request in parallel
        let p = callApi(currentPage);
        // execute `writePageToExcel` in sequence
        // as soon as all previous ones have finished
        previous = previous.then(() => p.then(writePageToExcel));
    }
    return previous;
})
.then(() => console.log("work done"));
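The key trick in this approach is that `previous` serializes only the writes, while the requests themselves run concurrently. A self-contained sketch (with hypothetical stand-in `callApi` and `writePageToExcel` functions) shows that the writes come out in page order even when later requests finish first:

```javascript
// Stand-in API call: later pages deliberately resolve sooner,
// to simulate out-of-order network completion.
function callApi(page) {
    return new Promise(function (resolve) {
        setTimeout(function () { resolve({ currentPage: page }); }, (4 - page) * 10);
    });
}

var written = [];
function writePageToExcel(page) {
    written.push(page.currentPage); // stand-in for the real file append
}

// Fire all requests immediately; serialize only the writes.
var previous = Promise.resolve();
var requests = [1, 2, 3].map(callApi);
requests.forEach(function (p) {
    previous = previous.then(function () { return p.then(writePageToExcel); });
});

previous.then(function () {
    console.log(written); // in page order despite out-of-order completion
});
```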
or you wait for all pages to be loaded, before you write them to excel
callApi(1).then(firstPage => {
    let {currentPage, totalPages} = firstPage;
    let promises = [firstPage];
    while (++currentPage <= totalPages)
        promises.push(callApi(currentPage));
    // wait for all requests to finish
    return Promise.all(promises);
})
// write all pages to excel
.then(writePagesToExcel)
.then(() => console.log("work done"));
or you could batch the requests
callApi(1).then(firstPage => {
    const batchSize = 16;
    let {currentPage, totalPages} = firstPage;
    return Promise.resolve([firstPage])
        .then(writePagesToExcel)
        .then(function nextBatch() {
            if (currentPage > totalPages) return;
            // load a batch of pages in parallel
            let batch = [];
            for (let i = 0; i < batchSize && ++currentPage <= totalPages; ++i) {
                batch[i] = callApi(currentPage);
            }
            // when the batch is done ...
            return Promise.all(batch)
                // ... write it to the excel sheet ...
                .then(writePagesToExcel)
                // ... and process the next batch
                .then(nextBatch);
        });
})
.then(() => console.log("work done"));
But don't forget to add error handling. Since I'm not sure how you'd want to handle errors with the approaches I've posted, I haven't included it here.
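A minimal way to bolt error handling onto any of the variants above is a single .catch at the end of the chain; anything that rejects anywhere in the pipeline lands there. Sketch with a hypothetical stand-in `callApi` that fails for one page:

```javascript
// Stand-in call that fails for one page.
function callApi(page) {
    return page === 2
        ? Promise.reject(new Error('page 2 failed'))
        : Promise.resolve({ currentPage: page });
}

Promise.all([1, 2, 3].map(callApi))
    .then(function (pages) { console.log('all pages loaded:', pages.length); })
    .catch(function (err) {
        // One rejected request is enough to land here.
        console.error('aborting:', err.message);
    });
```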
Edit:
Can you please modify the batch-requests version? I'm getting some errors. Where you assign totalPages, it doesn't look right; why should totalPages come from firstPage?
let {currentPage, totalPages} = firstPage;
//is just a shorthand for
let currentPage = firstPage.currentPage, totalPages = firstPage.totalPages;
//what JS version are you targeting?
This first request, callApi(1).then(firstPage => ...), is primarily there to determine currentPage and totalPages, since you provide these properties in the returned JSON. Once I know those two values, I can initiate as many requests in parallel as I want, and I don't have to wait for any one of them to finish to know what index I am at or whether there are more pages to load.
And why are you writing return Promise.resolve([ firstPage ])?
To save myself some trouble and checking, as I don't know anything about how you'd implement writePagesToExcel.
I return Promise.resolve(...) so I can do .then(writePagesToExcel). This solves two problems for me:
I don't have to care whether writePagesToExcel returns synchronously or returns a promise, and I can always follow up with another .then(...).
I don't need to care whether writePagesToExcel may throw. In case of any error, it all ends up in the promise chain and can be taken care of there.
So ultimately I save myself a few checks by simply wrapping firstPage back up in a promise and continuing with .then(...). Considering the amount of data you're processing here, imo this isn't too much overhead to get rid of some potential pitfalls.
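The two guarantees described above are easy to verify: wrapping a value with Promise.resolve(value).then(fn) normalizes fn regardless of how it behaves. The hypothetical handlers below just illustrate the three cases (sync return, promise return, throw):

```javascript
function normalize(fn, value) {
    // Behaves the same whether fn returns a value, returns a promise, or throws.
    return Promise.resolve(value).then(fn);
}

normalize(function (v) { return v + 1; }, 1)                  // sync return
    .then(function (r) { console.log('sync:', r); });

normalize(function (v) { return Promise.resolve(v * 2); }, 2) // returns a promise
    .then(function (r) { console.log('async:', r); });

normalize(function () { throw new Error('boom'); }, 3)        // throws
    .catch(function (e) { console.log('caught:', e.message); });
```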
Why are you passing an array to resolve?
To stay consistent across the examples. In this example, I named the function that processes the data writePagesToExcel (plural), which should indicate that it deals with multiple pages (an array of them); I thought that would be clear in context.
Since I still need this separate call at the beginning to get firstPage, and I didn't want to complicate the logic in nextBatch just to concat this first page with the first batch, I treat [firstPage] as a separate "batch", write it to Excel, and continue with nextBatch.
function callAPI(pageNo) {
    var options = {
        url: "http://example.com/getData?pageNo=" + pageNo,
        method: 'GET',
        headers: {
            'Content-Type': 'application/json'
        },
        json: true
    };
    return request(options);
}

function writeToExcel(res) { console.log(res); } // returns a promise.

callAPI(1).then(function (res) {
    if (!res) return;
    return writeToExcel(res).then(function () {
        var chain = Promise.resolve();
        // Chain one call per remaining page so the writes stay in order.
        for (let page = res.currentPage + 1; page <= res.totalPages; page++) {
            chain = chain.then(function () {
                return callAPI(page).then(function (pageRes) {
                    if (pageRes) return writeToExcel(pageRes);
                });
            });
        }
        return chain;
    });
}).catch(function (err) {
    // Handle error here
});

Using the error parameter to separate failed processes in async.mapLimit

I'm using async.mapLimit to run some concurrent operations over an array, with a limit of 10:
async.mapLimit(files, 10, function(file, callback) {
    // ... etc ...
}, function(error, files) {
    // ... etc ...
});
Inside the main function, I'm executing an async operation with child_process, and if everything happens as it should, I just call the callback:
callback(null, files);
But when something bad happens, I also NEED to call the callback, passing the file, because I don't want to end everything; I just assign the file an error property and call the callback:
file.error = error;
callback(null, file);
So, when the second async.mapLimit callback is fired, I have an array of files:
, function(error, files) {
console.log(files);
});
output:
[
{
name: 'file_2',
error: 'something'...
},
{
name: 'file_1',
...etc
}
]
So I need to separate the files that failed, by doing:
var failedFiles = [];
var okFiles = [];
files.forEach(function(file) {
    if (file.error)
        failedFiles.push(file);
    else
        okFiles.push(file);
});
I would like to know whether it's possible to return the files that failed as an array, and access them via the error parameter of the second async.mapLimit callback.
Thanks in advance :).
async.mapLimit() will stop immediately when an iteration "returns" an error, so it's not possible to do what you want.
As an alternative, instead of using async.mapLimit() you could use async.eachLimit() and push the file objects into the respective array inside the iterator function:
var failedFiles = [];
var okFiles = [];
async.eachLimit(files, 10, function(file, callback) {
    if (SOME_ERROR) {
        failedFiles.push(file);
    } else {
        okFiles.push(file);
    }
    callback();
}, function(err) {
    // ...
});
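If this code is ever migrated to promises, Promise.allSettled gives the same ok/failed separation without hand-rolled arrays. A sketch with a hypothetical stand-in `processFile` in place of the child_process work (note: unlike eachLimit, allSettled runs everything concurrently with no limit of 10, and it needs Node 12.9+ or a modern browser):

```javascript
// Stand-in for the per-file child_process work.
function processFile(file) {
    return file.bad
        ? Promise.reject(Object.assign(new Error('failed'), { file: file }))
        : Promise.resolve(file);
}

function partition(files) {
    return Promise.allSettled(files.map(processFile)).then(function (outcomes) {
        return {
            okFiles: outcomes.filter(o => o.status === 'fulfilled').map(o => o.value),
            failedFiles: outcomes.filter(o => o.status === 'rejected').map(o => o.reason.file)
        };
    });
}

partition([{ name: 'a' }, { name: 'b', bad: true }]).then(function (r) {
    console.log(r.okFiles.length, 'ok,', r.failedFiles.length, 'failed');
});
```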

How do I get a plain array back from vuejs with a component?

I am using a call to my database to retrieve some results and push them onto an array. However, when I console.log(this.activeBeers) I don't get an array back, but an object instead. How can I get a plain array back instead of an object?
Vue.component('beers', {
    template: '#beers-template',
    data: function() {
        return {
            activeBeers: []
        }
    },
    ready: function() {
        function getActiveBeers(array, ajax) {
            ajax.get('/getbeers/' + $('input#bar-id').val()).then(function (response) {
                $.each(response.data, function(key, value) {
                    array.push(value.id);
                });
            }, function (response) {
                console.log('error getting beers from the pivot table');
            });
            return array;
        }
        console.log(this.activeBeers = getActiveBeers(this.activeBeers, this.$http));
    },
    props: ['beers']
});
AJAX is done asynchronously, so you won't be able to just return a value that you do not have yet.
You should console.log your stuff after the $.each to see what you received.
As the other answers pointed out, your getActiveBeers() call is returning before the callback that fills the array gets executed.
The reason your array is an object is because Vue wraps/extends arrays in the underlying data so that it can intercept and react to any mutating methods, like push, pop, sort, etc.
You can log this.activeBeers at the beginning of your ready function to see that it's an object.
By the way, if you want to log the unwrapped/plain array of activeBeers, you can use your component's $log method:
this.$log(this.activeBeers);
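If $log isn't available in your setup, a JSON round trip is a quick, framework-agnostic way to get a plain copy of an observed array for logging. It drops functions and non-JSON values, so use it for inspection only:

```javascript
// Strip any framework wrappers/getters by serializing and re-parsing.
function plainCopy(value) {
    return JSON.parse(JSON.stringify(value));
}

var observed = [1, 2, 3]; // imagine this is a Vue-wrapped array
console.log(plainCopy(observed)); // a plain Array again
```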
The other answer is correct: getActiveBeers sends an HTTP request and then immediately returns the array; it doesn't wait for the ajax request to come back. You need to handle the updating of activeBeers in the success function of the ajax request. You can use the .bind() function to make sure that this in your success function refers to the Vue component; that way you can push the ids directly into your activeBeers array.
Vue.component('beers', {
    template: '#beers-template',
    data: function() {
        return {
            activeBeers: []
        }
    },
    ready: function() {
        this.getActiveBeers();
    },
    methods: {
        getActiveBeers: function() {
            this.$http.get('/getbeers/' + $('input#bar-id').val()).then(function (response) {
                $.each(response.data, function(key, value) {
                    this.activeBeers.push(value.id);
                }.bind(this));
                console.log(this.activeBeers);
            }.bind(this), function (response) {
                console.log('error getting beers from the pivot table');
            });
        }
    },
    props: ['beers']
});

Can I act on and then forward the results of a AngularJS $http call without using $q?

I have functions like the getData function below.
I understand that $http returns a promise. In my current setup I am using $q so that I can do some processing of the results and then return another promise:
var getData = function (controller) {
    var defer = $q.defer();
    $http.get('/api/' + controller + '/GetData')
        .success(function (data) {
            var dataPlus = [{ id: 0, name: '*' }].concat(data);
            defer.resolve({
                data: data,
                dataPlus: dataPlus
            });
        })
        .error(function (error) {
            defer.reject({
                data: error
            });
        });
    return defer.promise;
}
Is there any way I can do this without using AngularJS's $q (or any other $q implementation), or is the code above the only way? Note that I am not looking for a solution where I pass onSuccess and onError callbacks into getData as parameters.
Thanks
As you say, $http.get already returns a promise. One of the best things about promises is that they compose nicely: each then handler runs in sequence and receives the previous handler's return value. (Note that you need then for this; the legacy success/error callbacks ignore return values and don't chain.)
var getData = function (controller) {
    return $http.get('/api/' + controller + '/GetData')
        .then(function (response) {
            var dataPlus = [{ id: 0, name: '*' }].concat(response.data);
            return {
                data: response.data,
                dataPlus: dataPlus
            };
        }, function (response) {
            // Re-throw so the rejection keeps propagating down the chain.
            throw { data: response.data };
        });
};
This means that getData(controller).then(function (obj) { console.log(obj); }); will print the object returned by your first then handler.
If you want, you can keep composing it, adding more functionality. Let's say you want to always log results and errors.
var loggingGetData = function (controller) {
    return getData(controller).then(function (obj) {
        console.log(obj);
        return obj;
    }, function (err) {
        console.log(err);
        throw err; // keep the rejection propagating
    });
};
You can then use your logging getData like so:
loggingGetData(controller).then(function (obj) {
    var data = obj.data;
    var dataPlus = obj.dataPlus;
    // do stuff with the results from the http request
});
If the $http request resolves, the result will first go through your initial success handler and then through the logging one, finally ending up in the final function here.
If it rejects, it will go through the initial error handler, then the error handler defined by loggingGetData, and print to the console. You could keep adding promises this way and build really advanced things.
You can try:
Using an interceptor, which provides a response method. However, I don't like that approach, as it moves the code handling the response to another place, making it harder to understand and debug.
Using $q would be the best option in that case, IMO.
Another (better?) option is a locally augmented transformResponse transformer for the $http.get() call; then just return the $http promise.
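The transformResponse route works because $http applies an array of transform functions to the raw response before handing it to your handlers. A commonly used helper (hypothetical name) appends your own transform to the existing defaults; the composition itself is plain JavaScript, so it can be shown framework-free:

```javascript
// Append a custom transform to an existing transform (or array of them).
function appendTransform(defaults, transform) {
    // $http accepts a single function or an array; normalize to an array.
    var chain = Array.isArray(defaults) ? defaults.slice() : [defaults];
    chain.push(transform);
    return chain;
}

// Plain-JS illustration of how the chain would be run over a raw response.
function runTransforms(chain, data) {
    return chain.reduce(function (acc, fn) { return fn(acc); }, data);
}

var transforms = appendTransform(
    [function (raw) { return JSON.parse(raw); }], // stand-in for the default JSON transform
    function (data) {
        return { data: data, dataPlus: [{ id: 0, name: '*' }].concat(data) };
    });

console.log(runTransforms(transforms, '[{"id":1,"name":"a"}]'));
```

In the real $http.get call you would pass something like appendTransform($http.defaults.transformResponse, yourTransform) as the config's transformResponse option.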
