In my AngularJS application I have an array of parameters (some IDs, for example) which should be used as parameters for a queue of ajax calls. The problem is that the array might contain more than 600 items, and if I just fire the ajax calls recursively in a forEach loop, the browser page eventually stops responding before all of the requests are resolved, since each response updates the view. Is there a technique that would allow sending the ajax requests, say, 5 at a time asynchronously, and only when those are finished, proceed with the next 5?
I think the best solution would be to change the endpoint to accept an array of Ids, but I guess that is not an option. You can use promises to limit the number of simultaneous requests:
function chunkedAjax(idArray, index, limit) {
    index = index || 0;
    limit = limit || 5;
    if (index >= idArray.length) {
        return;
    }
    // slice takes (start, end), so grab the next `limit` ids starting at `index`.
    var chunk = idArray.slice(index, index + limit);
    var promises = [];
    angular.forEach(chunk, function(id) {
        // makeAjaxCall must return a promise (see below).
        promises.push(makeAjaxCall(id));
    });
    // Only when the whole chunk has resolved, recurse into the next chunk.
    $q.all(promises).then(function() {
        chunkedAjax(idArray, index + limit, limit);
    });
}
This is a recursive function, so be warned. I would debug this heavily before putting it into production.
You will also likely need to modify your makeAjaxCall function to return a promise if it does not already, or pass a deferred object into it that it can resolve when the ajax call completes.
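For example, a minimal makeAjaxCall built on $http already returns a promise (the endpoint URL here is just an assumption for illustration):

function makeAjaxCall(id) {
    // Hypothetical endpoint; $http.get() returns a promise out of the box.
    return $http.get('/api/items/' + id);
}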
Take a look at $q.all(). This lets you execute a callback once multiple requests have finished, so you can recursively execute a limited number of requests until all items are processed.
Yes!
Take a look at this module: https://github.com/caolan/async
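As a minimal sketch of how that module could apply here, its eachLimit runs at most a fixed number of calls at a time (this assumes the async library is loaded, i.e. a global `async` in the browser or require('async') in Node, and that makeAjaxCall returns a promise as above):

// Process the whole id array, at most 5 calls in flight at a time.
async.eachLimit(idArray, 5, function (id, done) {
    // Bridge the promise into async's Node-style callback.
    makeAjaxCall(id).then(function () { done(); }, done);
}, function (err) {
    // All ids processed, or err is set if one of them failed.
});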
Related
With node.js I want to http.get a number of remote urls in a way that only 10 (or n) runs at a time.
I also want to retry a request if an exception occurs locally (m times), but when the status code returns an error (5XX, 4XX, etc.) the request counts as valid.
This is really hard for me to wrap my head around.
Problems:
Cannot try-catch http.get as it is async.
Need a way to retry a request on failure.
I need some kind of semaphore that keeps track of the currently active request count.
When all requests finished I want to get the list of all request urls and response status codes in a list which I want to sort/group/manipulate, so I need to wait for all requests to finish.
It seems like promises are recommended for every async problem, but I end up nesting too many of them and it quickly becomes indecipherable.
There are lots of ways to approach the 10 requests running at a time.
Async Library - Use the async library with the .parallelLimit() method where you can specify the number of requests you want running at one time.
Bluebird Promise Library - Use the Bluebird promise library and the request library to wrap your http.get() into something that can return a promise and then use Promise.map() with a concurrency option set to 10.
Manually coded - Code your requests manually to start up 10 and then each time one completes, start another one.
In all cases, you will have to write some retry code manually, and as with all retry code, you will have to decide very carefully which types of errors you retry, how soon you retry them, how much you back off between retry attempts, and when you eventually give up (all things you have not specified).
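For the manually coded option, a minimal sketch of "start 10, then launch another as each one finishes" could look like this, with retries omitted for brevity (makeRequest is a placeholder for a promise-returning wrapper around http.get):

function runPool(urls, limit, makeRequest) {
    var index = 0;
    var results = new Array(urls.length);
    function next() {
        if (index >= urls.length) return Promise.resolve();
        var i = index++;
        return makeRequest(urls[i]).then(function (res) {
            results[i] = res;
        }, function () {
            results[i] = null; // record failures as null so the pool keeps draining
        }).then(next); // as soon as one request settles, pull the next url
    }
    // Start `limit` workers; each chains itself onto the next unclaimed url.
    var workers = [];
    for (var k = 0; k < Math.min(limit, urls.length); k++) {
        workers.push(next());
    }
    return Promise.all(workers).then(function () { return results; });
}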
Other related answers:
How to make millions of parallel http requests from nodejs app?
Million requests, 10 at a time - manually coded example
My preferred method is with Bluebird and promises. Including retry and in-order result collection, that could look something like this:
const request = require('request');
const Promise = require('bluebird');
const get = Promise.promisify(request.get);
let remoteUrls = [...]; // large array of URLs
const maxRetryCnt = 3;
const retryDelay = 500;
Promise.map(remoteUrls, function(url) {
    let retryCnt = 0;
    function run() {
        return get(url).then(function(result) {
            // do whatever you want with the result here
            return result;
        }).catch(function(err) {
            // decide what your retry strategy is here;
            // catch all errors here so other URLs continue to execute
            if (isRetryable(err) && retryCnt < maxRetryCnt) { // isRetryable() is a placeholder for your own check
                ++retryCnt;
                // try again after a short delay
                // chain onto previous promise so Promise.map() is still
                // respecting our concurrency value
                return Promise.delay(retryDelay).then(run);
            }
            // make the value null if no retries succeeded
            return null;
        });
    }
    return run();
}, {concurrency: 10}).then(function(allResults) {
    // everything done here and allResults contains results with null for err URLs
});
The simple way is to use the async library; it has a .parallelLimit() method that does exactly what you need.
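A minimal sketch with parallelLimit, leaving out the retry logic (the url/status bookkeeping is just one way to collect the results the question asks for):

var http = require('http');
var async = require('async');

// Build one task per url; parallelLimit runs at most 10 of them at a time.
var tasks = remoteUrls.map(function (url) {
    return function (done) {
        http.get(url, function (res) {
            res.resume(); // drain the response body
            done(null, { url: url, status: res.statusCode });
        }).on('error', done);
    };
});

async.parallelLimit(tasks, 10, function (err, results) {
    // results is in the same order as remoteUrls
});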
Consider the following:
A web application that can have up to 100 concurrent requests per second
Each incoming request currently makes a http request to an endpoint to get some data (which could take up to 5 seconds)
I want to only make the http request once, i.e. I don't want to make concurrent calls to the same endpoint as it will return the same data
The idea is that only the first request will make the http call to get the data.
While this call is 'in flight', subsequent requests will not make the same call and will instead 'wait' for the first in-flight request to complete.
When the initial http request for data has responded, it must respond to all calls with the data.
I am using Bluebird promises for the async function that performs the http request.
I would like to create/use some sort of generic method/class that wraps the business logic promise method. This generic method/class will know when to invoke the actual business logic function, when to wait for the in-flight call to finish, and then resolve all waiting calls when it has a response.
I'm hoping there is already a node module that can do this, but can't think of what this type of utility would be called.
Something similar to lodash throttle/debounce, but not quite the same thing.
I could write it myself if it doesn't exist, but I'm struggling to come up with a sensible name for this.
Any help would be appreciated.
You can implement promise caching, like:
var req = require(''); // fill in a promise-returning HTTP library here
var caches = {};

module.exports = function request(url) {
    // Hand back the cached (possibly still pending) promise if we have one.
    if (caches[url]) return caches[url];
    var promise = req(url);
    return (caches[url] = promise);
};
EDIT:
Let me explain in more detail:
This is not about caching the responses, but about caching the promises. Node.js is single threaded, which means there are no concurrent function calls; even though everything is async, only one piece of code runs at any point in time. So somebody will be the first to call the function with the url y.com/foo; there will be no promise in the cache, so the function fires the GET request, then caches and returns that promise. When somebody calls the function with the same url immediately afterwards, no further request is fired; instead, the very first promise for this url is returned, and the consumer can subscribe to its done/fail callbacks.
When the response is ready and the promise is fulfilled, anybody making the request with the same url will again get the cached promise back, which has already settled.
Promise caching is a good technique to prevent duplicate async tasks.
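As a hypothetical usage of the module above (the require path is an assumption), two near-simultaneous calls for the same url share a single GET:

var request = require('./promise-cache'); // the module above

request('http://y.com/foo').then(function (res) { /* first caller */ });
request('http://y.com/foo').then(function (res) { /* second caller, same GET */ });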
A web application can only have about 6 concurrent requests per host because that's a hard browser limitation. Older IE could only do 2. So no matter what you do, this is a hard limit.
In general, you should solve the multiplexing on the server side.
Now - to your actual question - the sort of caching you're asking for is incredibly simple to do with promises.
function once(fn) {
    var val = null; // cache starts as empty
    return () => val || (val = fn()); // return value or set it.
}
var get = once(getData);
get();
get(); // same call, even though the value didn't change.
Now, you might want to add an expiry policy:
function once(fn, timeout) {
    var val = null, timer = null; // cache starts as empty
    return () => val || (val = fn().tap(invalidate)); // return value and invalidate
    function invalidate() {
        clearTimeout(timer); // remove timer.
        timer = setTimeout(() => val = null, timeout);
    }
}
var get = once(getData, 10000);
You might also want to uncache the result if it fails:
function once(fn, timeout) {
    var val = null, timer = null; // cache starts as empty
    return () => val ||
        (val = fn().catch(e => {
            val = null; // uncache on failure
            return Promise.reject(e); // keep the rejection visible to callers
        }).tap(invalidate));
    function invalidate() {
        clearTimeout(timer); // remove timer.
        timer = setTimeout(() => val = null, timeout);
    }
}
Since the original functionality is one line of code, there isn't a helper for it.
You can use a promise to prevent duplicate requests at the same time.
Example written in Node.js; you can use this pattern in the browser as well:
const rp = require('request-promise');
var wait = null;

function getUser(req, rep, next) {
    function userSuccess() {
        wait = null; // clear the in-flight marker
    }
    function userErr() {
        wait = null; // clear it on error too, so the next call can retry
    }
    if (wait) {
        console.log('a wait'); // a request is already in flight; reuse it
    } else {
        wait = rp.get({ url: config.API_FLIX + "/menu" });
    }
    wait.then(userSuccess).catch(userErr);
}
I'm really new to AngularJS, and actually relatively new to programming altogether. Basically, I want to request the JSON api from Jenkins for the list of running jobs in 2 different folders. Inside that data there is a url to each individual job, which I want to get the data for as well. So I need to do another $http.get request for each job, passing the url (which is a value inside the data of the first request) as the parameter.
Initially, I had one request inside another, inside a couple of loops iterating over the folders and, within each folder, the jobs.
After doing some research, I realized that due to $http requests being async, and for loops being sync, that method was not going to work. So I have a service, which, using $q, collects the promise of the first request, but I don't know how to use the data from the first request as a parameter for the second request. Can someone help, please?
So assume you have two calls, a and b. They both return a list of Jenkins jobs. You can group these using an array of promises and then use $q.all(promises) to group their responses:
var jenkinsPromises = [];

// You could loop over this part if you have an array of calls, for example.
// folderOneUrl and folderTwoUrl are placeholders for your two Jenkins folder endpoints.
jenkinsPromises.push($http.get(folderOneUrl)); // call folder one
jenkinsPromises.push($http.get(folderTwoUrl)); // call folder two

// Now wait for all calls to finish and iterate over their responses.
$q.all(jenkinsPromises).then(function (jenkinsResponses) {
    // jenkinsResponses is an array of objects,
    // each object being a response from Jenkins
});
In the above example, jenkinsResponses holds the combined result of the, let's say, 'first layer' calls to Jenkins. This array contains response objects which, as you say, contain the urls you need to make calls to.
Using the same practice as above, you can then group those calls in a promise array and use $q.all() again to group their responses.
$q.all(jenkinsPromises).then(function (jenkinsResponses) {
    var i, j, current, urlPromises = [];
    for (i = 0, j = jenkinsResponses.length; i < j; i++) {
        current = jenkinsResponses[i];
        // Push the call to the urlPromises array.
        // Note that this is a dummy call.
        urlPromises.push($http.get(current.url));
    }
    $q.all(urlPromises).then(function (urlResponses) {
        // urlResponses contains the results of all calls to the urls.
    });
});
You should use async's waterfall (https://github.com/caolan/async) for sequential async requests.
Example:
async.waterfall([
    function(callback) {
        callback(null, 'one', 'two');
    },
    function(arg1, arg2, callback) {
        // arg1 now equals 'one' and arg2 now equals 'two'
        callback(null, 'three');
    },
    function(arg1, callback) {
        // arg1 now equals 'three'
        callback(null, 'done');
    }
], function (err, result) {
    // result now equals 'done'
});
If I'm understanding right, this is the same thing I had to learn in the last few months. Basically, this:
requestFunction(...)
    .then(function(response) {
        nextRequest(...).then(...);
    });
The catch and finally methods are often implemented for this sort of control flow, which is built around promises. It's worth looking into; I personally can't stand AngularJS, but understanding it is important to my day job, and this is pretty fundamental to it.
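A minimal sketch of that flow with catch and finally added (the handler bodies and parameter names are placeholders):

requestFunction(params)
    .then(function (response) {
        return nextRequest(response.id); // chain the dependent call
    })
    .catch(function (err) {
        // runs if either request rejects
    })
    .finally(function () {
        // runs either way, e.g. to hide a loading spinner
    });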
If I understand correctly, you need to loop twice. So in your first loop you make your first request, and in the callback of that request you can put the code for your second loop.
e.g:
for (var i = 0; i < cpt1; i++) {
    request1.then(function(response1) {
        for (var j = 0; j < cpt2; j++) {
            request2.then(function(response2) {
            });
        }
    });
}
Is there a way to wait on a promise so that you can get the actual result from it and return that instead of returning the promise itself? I'm thinking of something similar to how the C# await keyword works with Tasks.
Here is an example of why I'd like to have a method like canAccess() that returns true or false instead of a promise so that it can be used in an if statement. The method canAccess() would make an AJAX call using $http or $resource and then somehow wait for the promise to get resolved.
The code would look something like this:
$scope.canAccess = function(page) {
    var resource = $resource('/api/access/:page');
    var result = resource.get({page: page});
    // how to await this and not return the promise but the real value
    return result.canAccess;
}
Is there any way to do this?
In general that's a bad idea. Let me tell you why. JavaScript in a browser is basically a single threaded beast. Come to think of it, it's single threaded in Node.js too. So anything you do to not "return" at the point you start waiting for the remote request to succeed or fail will likely involve some sort of looping to delay execution of the code after the request. Something like this:
var semaphore = false;
var superImportantInfo = null;

// Make a remote request.
$http.get('some wonderful URL for a service').then(function (results) {
    superImportantInfo = results;
    semaphore = true;
});

while (!semaphore) {
    // We're just waiting.
}

// Code we're trying to avoid running until we know the results of the URL call.
console.log('The thing I want for lunch is... ' + superImportantInfo);
But if you try that in a browser and the call takes a long time, the browser will think your JavaScript code is stuck in a loop and pop up a message in the user's face giving the user the chance to stop your code. Worse, the callback can never run while the loop holds the single thread, so the loop would never exit anyway. JavaScript therefore structures it like so:
// Make a remote request.
$http.get('some wonderful URL for a service').then(function (results) {
    // Code we're trying to avoid running until we know the results of the URL call.
    console.log('The thing I want for lunch is... ' + results);
});

// Continue on with other code which does not need the super important info, or
// simply end our JavaScript altogether. The code inside the callback will be
// executed later.
The idea being that the code in the callback will be triggered by an event whenever the service call returns. Because event driven is how JavaScript likes it. Timers in JavaScript are events, user actions are events, HTTP/HTTPS calls to send and receive data generate events too. And you're expected to structure your code to respond to those events when they come.
Can you not structure your code such that it treats canAccess as false until the remote service call returns and perhaps finds out that it really is true after all? I do that all the time in AngularJS code when I don't yet know the ultimate set of permissions I should show to the user, or when I haven't received all of the data to display in the page at first. I have defaults which show until the real data comes back, and then the page adjusts to its new form based on the new data. The two-way binding of AngularJS makes that really quite easy.
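A minimal sketch of that defaults-first approach, reusing the endpoint from the question (and assuming page is already in scope):

$scope.canAccess = false; // safe default until the real answer arrives

$http.get('/api/access/' + page).then(function (response) {
    // Two-way binding re-renders the view when this flips.
    $scope.canAccess = response.data.canAccess;
});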
Use a .get() callback function to ensure you get a resolved resource.
Helpful links:
Official docs
How to add call back for $resource methods in AngularJS
You can't - there aren't any features in Angular, Q (promises), or JavaScript (at this point in time) that let you do that.
You will when ES7 happens (with await).
You can if you use another framework or a transpiler (as suggested in the article linked - Traceur transpiler or Spawn).
You can if you roll your own implementation!
My approach was to create a function with plain old JavaScript, as follows:
var globalRequestSync = function (pUrl, pVerbo, pCallBack) {
    var httpRequest = new XMLHttpRequest();
    httpRequest.onreadystatechange = function () {
        if (httpRequest.readyState == 4 && httpRequest.status == 200) {
            pCallBack(httpRequest.responseText);
        }
    };
    httpRequest.open(pVerbo, pUrl, false); // third argument false = synchronous request
    httpRequest.send(null);
};
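For comparison, a rough sketch of what the await mentioned above would eventually allow, assuming async/await support and that $resource results expose their promise via $promise; note the async function still hands a promise back to its caller, so this does not make the call truly synchronous:

$scope.canAccess = async function (page) {
    var resource = $resource('/api/access/:page');
    // await suspends this function (not the thread) until the promise settles.
    var result = await resource.get({ page: page }).$promise;
    return result.canAccess;
};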
I recently had this problem and made a utility called 'syncPromises'. It basically works by sending what I called an "instruction list": an array of functions to be called in order. You'll need to call the first then() to kick things off, then dynamically attach a new .then() when the response comes back with the next item in the instruction list, so you'll need to keep track of the index.
// instructionList is an array of functions, each returning a promise.
function syncPromises(instructionList) {
    var i = 0,
        defer = $q.defer();
    function next(i) {
        // Each function in the instructionList needs to return a promise
        instructionList[i]().then(function () {
            if (++i < instructionList.length) {
                next(i); // kick off the next instruction
            } else {
                defer.resolve(); // all instructions have completed
            }
        });
    }
    next(i);
    return defer.promise;
}
I found this gave us the most flexibility.
You can automatically push operations to build an instruction list, and you're also able to append as many .then() response handlers as you like in the calling function. You can also chain multiple syncPromises calls, which will all happen in order.
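Hypothetical usage, assuming stepOne, stepTwo and stepThree are functions that each return a promise:

syncPromises([stepOne, stepTwo, stepThree]).then(function () {
    // all three have completed, one after another
});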
I'm trying to implement the following scenario using jQuery deferred, without much luck.
What parts of the deferred api would you use, and how would you structure your calls to achieve the following:
1st: ajax callA to serviceA retrieves a list of Ids
wait until this call returns
then n ajax calls to serviceB, each call using an Id from the list returned by callA
wait until all serviceB calls have returned
then a final ajax call to serviceC
You could do it like this (more or less pseudocode):
(function() {
    // new scope
    var data = []; // the ids coming back from serviceA
    var deferredA = callToServiceA(data); // has to add the ids to data
    deferredA.done(function() { // if callToServiceA successful...
        var deferredBs = [];
        for (var i = 0; i < data.length; i++) {
            deferredBs.push(callToServiceB(data[i]));
        }
        // When every serviceB call has resolved, fire the final call.
        $.when.apply($, deferredBs).then(callToServiceC);
    });
}());
The callToServiceX function should return the promise object returned by $.ajax.
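For instance, a minimal callToServiceB might look like this (the url and parameter name are assumptions):

function callToServiceB(id) {
    // $.ajax returns a jqXHR, which implements the promise interface.
    return $.ajax({ url: '/serviceB', data: { id: id } });
}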
There might be a "cleaner" solution than having data in a shared scope, with resolve, but the setup would be a bit more difficult (and not necessarily more readable).