Invoking http.get sequentially when the list length is unknown - javascript

Let's suppose I have the following:
var myurls = ['http://server1.com', 'http://server2.com', 'http://server3.com', /* etc. */];
Each URL is a "fallback" and should be used only if the previous one cannot be reached. In other words, this list specifies a priority. Let's also assume that the list can be of any length - I don't know it in advance and must iterate.
How do I go about writing a function, let's say "reachability", that loops through this array and returns the first reachable server?
I can't use $q.all, as it runs the requests in parallel. I can't run a while loop with an $http.get inside, because the result may come later and in the meantime my UI will freeze.
Please note I am not using jQuery. I am using Ionic, which ships with a version of jqLite.
Various examples I've seen talk about chaining the calls in .then, which is fine if you know the number of URLs beforehand, but I don't.
Thanks.

Just reduce over the array:
myurls.reduce(
  (p, url) => p.catch(() => http.get(url).then(() => url)),
  Promise.reject()
);
Flow explained:
It's based on the more common pattern of using reduce to build a promise chain: [func1, func2].reduce((p, f) => p.then(f), Promise.resolve()); is equivalent to Promise.resolve().then(func1).then(func2) (the last argument of reduce is the initial value).
In your case, since you're retrying on failure, you want to build a retry (or rejection) chain, so we must start with Promise.reject() instead. The result is equivalent to Promise.reject().catch(func1).catch(func2)....
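For illustration, here is a self-contained sketch of that rejection chain with a stubbed get function (the stub, which pretends only the third server is reachable, and the URLs are made up for the demo):

// Stub: only 'http://server3.com' succeeds
const get = url =>
  url === 'http://server3.com'
    ? Promise.resolve('ok')
    : Promise.reject(new Error('unreachable'));

const myurls = ['http://server1.com', 'http://server2.com', 'http://server3.com'];

myurls
  .reduce((p, url) => p.catch(() => get(url).then(() => url)), Promise.reject())
  .then(first => console.log('first reachable:', first))   // 'http://server3.com'
  .catch(() => console.log('no server reachable'));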

I guess recursion and chaining could suit your needs:
var findFirstReachableUrl = function (urls) {
  if (urls.length > 0) {
    return $http.get(urls[0]).then(function () {
      return urls[0];
    }, function () {
      return findFirstReachableUrl(urls.slice(1));
    });
  } else {
    return $q.reject("No reachable URL");
  }
};
Call:
findFirstReachableUrl(myurls).then(function (firstReachableUrl) {
  // OK: do something with firstReachableUrl
}, function () {
  // KO: no url could be reached
});
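For reference, the same loop reads naturally with async/await, assuming a promise-returning get such as $http.get (a sketch, not Angular-specific):

async function findFirstReachableUrl(urls) {
  for (const url of urls) {
    try {
      await get(url); // any promise-returning GET, e.g. $http.get
      return url;     // first success wins
    } catch (e) {
      // unreachable: fall through and try the next url
    }
  }
  throw new Error("No reachable URL");
}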

Related

Execute Promises (or Deferreds) one after the other and be able to interrupt at any time

So, I'm having a problem. I need to make potentially hundreds of http calls and they must be done one after the other. I also need to be able to interrupt the whole process at any time.
The solution I'm using right now is this one, but it doesn't allow me to properly interrupt the "chain" (I use jQuery and Deferreds because I haven't learned to replace them with Promises yet):
function doAnHttpCall() {
  return $.post('http://the.domain/theendpoint', { theParam: this });
}

var anArrayOfParams = ['param1', 'param2', 'param3'];
var p = $.Deferred();
setTimeout(p.resolve.bind(p), 0);

var promise = anArrayOfParams.reduce(function(prev, cur) {
  return prev.then(doAnHttpCall.bind(cur));
}, p)
.done(function() {
  console.log('all done.');
});
(I've found this solution here)
The problem here is that the only way to sort-of break out of the reduce chain is to modify the array you're looping over, check for a particular value, and return immediately when you find it instead of executing the "doAnHttpCall" method (in this case). That would still make me loop over potentially hundreds of elements of the array instead of just interrupting the process, which is very ugly.
There must be a better way to do this. Or do I really need to use a function that calls itself with the next element to process when an http call has finished? It sounds "bad-practice-y".
Thanks for the help.
You would use async/await:
const request = param => $.post('http://the.domain/theendpoint', { param });

(async () => {
  for (const param of ['param1', 'param2', 'param3']) {
    let result = await request(param);
    if (result === 'particular value') break;
  }
  console.log('all done!');
})();
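If you also need to interrupt the loop from outside (not only based on a result value), one option is a shared flag checked between calls; cancelled and cancel here are made-up names for the sketch:

let cancelled = false;
const cancel = () => { cancelled = true; }; // e.g. wire this to a button click

(async () => {
  for (const param of ['param1', 'param2', 'param3']) {
    if (cancelled) break;   // stop before issuing the next call
    await request(param);
  }
  console.log('stopped or done');
})();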

JS - Run the promises in order with $q.when(), how do I add finally?

I am generating objects from an external source with the intention of processing them callback-style. Unfortunately these callbacks can chain way beyond the stack limit of a browser, and I have to keep in mind the worst-case limit of 500 for IE.
I have rewritten my code using $q.when
$scope.combinations.forEach(function(combination) {
  chain = chain.then(function() {
    return generateConfiguration(combination);
  });
});
generateConfiguration is a function that returns a promise. It all works just fine, but what I want to do is add finally() at the end of the chain.
What I've done so far is to have a tracker inside generateConfiguration that recognizes the last combination and triggers what should have been triggered by finally.
Is there a cleaner way to do it?
It depends on what you want to do.
You're creating a promise chain, so if you want a single finally for the chain, just add it at the end after building it with that forEach:
// ...your forEach here, then:
chain.finally(/*...*/);
If you want a finally on each promise from generateConfiguration, either give yourself a wrapper function to do that, or do it inside the forEach callback.
$scope.combinations.forEach(function(combination) {
  chain = chain.then(function() {
    return generateConfiguration(combination).finally(/*...*/); // <==
  });
});
Side note: There's a more idiomatic way to build that chain, via reduce:
var chain = $scope.combinations.reduce(function(chain, combination) {
  return chain.then(function() {
    return generateConfiguration(combination);
  });
}, Promise.resolve());
Adding a final finally (option #1 above):
var chain = $scope.combinations.reduce(function(chain, combination) {
  return chain.then(function() {
    return generateConfiguration(combination);
  });
}, Promise.resolve())
.finally(function() {
  // ...do stuff here...
});
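Since this is an Angular app, you may prefer to seed the chain with $q instead of the global Promise so resolution stays inside the digest cycle; a sketch (assuming Angular 1.4+, where $q.resolve is an alias for $q.when):

var chain = $scope.combinations.reduce(function(chain, combination) {
  return chain.then(function() {
    return generateConfiguration(combination);
  });
}, $q.when()); // $q.when() with no argument yields an already-resolved promise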
Yes, there is: inject $q (i.e. the promise framework) and rewrite to
$q.all($scope.combinations.map(function(combination) {
  return generateConfiguration(combination);
})).finally(function(res) {
  /* ... */
});
That way, you'll also fire off all the generateConfiguration calls in parallel (at least as many as possible - there's a limit on e.g. concurrent XHRs). Anyway: $q.all(arrayOfPromises) might be what you've been looking for.

How to render result of an array of asynchronous forEach 'find' functions?

This is my simple task: find images by an array of ids and render the images into a template.
router.get('/gallery', function(req, res) {
  var images = [];
  imagesIds.forEach(function(eachImageId) {
    Images.findById(eachImageId).exec(function(findImageErr, foundImage) {
      if (foundImage) {
        images.push(foundImage);
      }
    });
  });
  res.render('gallery', {
    images: images
  });
});
The problem is that res.render does not wait for the findById calls to finish, so the images array is always empty ([]).
I tried to use a generator but did not know how to achieve this.
If someone can explain it without a library (like q), that would be better, because I want to understand deeply how generators deal with this problem.
Generators allow you to write synchronous-looking functions, because they can stop their execution and resume it later.
I guess you have already read some articles like this one and know how to define generator functions and use them.
Your asynchronous code can be represented as a simple iteration with the magic yield keyword. The generator function runs and stops at each yield until you resume it with next().
function* loadImages(imagesIds) {
  var images = [], image;
  for (const imageId of imagesIds) {
    image = yield loadSingleImage(imageId);
    if (image) {          // skip ids that were not found (resolved as null)
      images.push(image);
    }
  }
  return images;
}
Because of the loop, each next() call advances the function one step until all of imagesIds have been walked. Finally the return statement executes and you get images back.
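To make the stop/resume behaviour concrete, here is a sketch of driving the generator by hand (assuming loadSingleImage, defined below, is available; the ids are made up):

const it = loadImages(['id1', 'id2']);
const step1 = it.next();           // runs up to the first yield; step1.value is a promise
step1.value.then(function (img) {
  const step2 = it.next(img);      // resumes the generator; img becomes the yield's result
  // ...repeat until stepN.done is true, then stepN.value is the images array
});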
Now we need to describe the image loading. Our generator function needs to know when the current image has loaded so that it can start loading the next one. All modern JavaScript runtimes (Node.js and the latest browsers) have native Promise support, so we define a function that returns a promise which is eventually resolved with the image if it was found, and with null otherwise (so a missing image does not abort the whole sequence):
function loadSingleImage(imageId) {
  return new Promise((resolve, reject) => {
    Images.findById(imageId).exec((findImageErr, foundImage) => {
      // Resolve with null rather than rejecting, so missing images
      // are simply skipped instead of breaking the chain.
      resolve(foundImage || null);
    });
  });
}
Well, we have two functions: one that loads a single image and one that puts them together. Now we need a dispatcher to pass control from one function to the other. Since you don't want to use libraries, we have to implement such a helper ourselves.
It is a smaller version of the spawn function, simpler and easier to understand because we don't need to handle errors; missing images are just ignored (loadSingleImage resolves them as null).
function spawn(generator) {
  function continuer(value) {
    var result = generator.next(value);
    if (!result.done) {
      return Promise.resolve(result.value).then(continuer);
    } else {
      return result.value;
    }
  }
  return continuer();
}
This function repeatedly resumes our generator inside continuer until result.done becomes true. At that point the generation has finished and we can return its final value.
And finally, putting it all together, you get the following code for the gallery loading.
router.get('/gallery', function(req, res) {
  var imageGenerator = loadImages(imagesIds);
  spawn(imageGenerator).then(function(images) {
    res.render('gallery', {
      images: images
    });
  });
});
Now you have somewhat pseudo-synchronous code in the loadImages function, and I hope it helps you understand how generators work.
Also note that all images are loaded sequentially: we wait for the asynchronous result of each loadSingleImage call before pushing it into the array and moving on to the next imageId. This can cause performance issues if you are going to use this approach in production.
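If strict sequencing is not required, a sketch of a parallel alternative using the loadSingleImage above would be:

function loadImagesParallel(imagesIds) {
  // Start every lookup at once; Promise.all preserves the original order
  return Promise.all(imagesIds.map(loadSingleImage))
    .then(images => images.filter(Boolean)); // drop ids that were not found (resolved as null)
}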
Related links:
Mozilla Hacks – ES6 In Depth: Generators
2ality – ES6 generators in depth
Jake Archibald – ES7 async functions
It can be done without a 3rd party library, as you asked, but it would be cumbersome...
Either way, the bottom line is to do the work inside the callback function function(findImageErr, foundImage) {...}.
1) Without a 3rd party library, you need to render only after all images have been accounted for:
var images = [];
var results = 0;
imagesIds.forEach(function(eachImageId) {
  Images.findById(eachImageId).exec(function(findImageErr, foundImage) {
    results++;
    if (foundImage)
      images.push(foundImage);
    // Render once every query has called back, found or not.
    // (Note: if imagesIds is empty, this never fires; guard that case separately.)
    if (results == imagesIds.length)
      res.render('gallery', { images: images });
  });
});
2) I strongly recommend a 3rd party library that does the same. I'm currently using async, but I might migrate to promises in the future.
I'm currently using async, but I might migrate to promises in the future.
async.map(
  imageIds,
  function(eachImageId, next) {
    Images.findById(eachImageId).exec(function(findImageErr, foundImage) {
      // don't report errors to async, because it will abort
      next(null, foundImage);
    });
  },
  function(err, images) {
    images = _.compact(images); // remove null images, i'm using lodash
    res.render('gallery', { images: images });
  }
);
Edit: following your readability remark, note that if you create a wrapper function for findById(...).exec(...) that ignores errors and just reports missing images as null (call it findIgnoreError(imageId, callback)), then you could write:
async.map(
  imageIds,
  findIgnoreError,
  function(err, images) {
    images = _.compact(images); // remove null images, i'm using lodash
    res.render('gallery', { images: images });
  }
);
In other words, it becomes a bit more readable once the reader starts to think in functions. It says: go over those imageIds in parallel, run findIgnoreError on each imageId, and the final section says what to do with the accumulated results...
Instead of querying Mongo (or any DB) N times, I would just fire a single query using $in:
Images.find({ _id: { $in: imagesIds } }, function(err, images) {
  if (err) return next(err);
  res.render('gallery', { images: images });
});
This also reduces the number of I/Os, and you won't have to write additional code to coordinate the res.render call.


Caching a promise object in AngularJS service

I want to implement dynamic loading of a static resource in AngularJS using promises. The problem: I have a couple of components on the page which might (or might not, depending on which are displayed, hence "dynamic") need to get a static resource from the server. Once loaded, it can be cached for the whole application lifetime.
I have implemented this mechanism, but I'm new to Angular and promises, and I want to make sure this is the right solution / approach.
var data = null;
var deferredLoadData = null;

function loadDataPromise() {
  if (deferredLoadData !== null)
    return deferredLoadData.promise;

  deferredLoadData = $q.defer();
  $http.get("data.json").then(function (res) {
    data = res.data;
    return deferredLoadData.resolve();
  }, function (res) {
    return deferredLoadData.reject();
  });
  return deferredLoadData.promise;
}
So, only one request is made, and all subsequent calls to loadDataPromise() get back the promise created by the first call. It seems to work both for a request that is in progress and for one that finished some time ago.
But is it a good solution to cache promises?
Is this the right approach?
Yes. Memoising functions that return promises is a common technique to avoid repeating asynchronous (and usually expensive) tasks. The promise makes the caching easy because one does not need to distinguish between ongoing and finished operations; both are represented as (the same) promise for the result value.
Is this the right solution?
No. That global data variable and the resolution with undefined are not how promises are intended to work. Instead, fulfill the promise with the result data! It also makes coding a lot easier:
var dataPromise = null;

function getData() {
  if (dataPromise == null)
    dataPromise = $http.get("data.json").then(function (res) {
      return res.data;
    });
  return dataPromise;
}
Then, instead of loadDataPromise().then(function() { /* use global */ data }) it is simply getData().then(function(data) { … }).
To further improve the pattern, you might want to hide dataPromise in a closure scope, and notice that you will need a lookup for different promises when getData takes a parameter (like the url).
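A sketch of that parameterised variant, keyed by URL (the promiseCache name is made up):

var promiseCache = {}; // url -> promise

function getData(url) {
  if (!promiseCache[url]) {
    promiseCache[url] = $http.get(url).then(function (res) {
      return res.data;
    });
  }
  return promiseCache[url];
}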
For this task I created a service called defer-cache-service which removes all this boilerplate code. It is written in TypeScript, but you can grab the compiled JS file. GitHub source code.
Example:
function loadCached() {
  return deferCacheService.getDeferred('cache.key1', function () {
    return $http.get("data.json");
  });
}
and consume it:
loadCached().then(function(data) {
  //...
});
One important thing to note: if two or more parts of the application call the same loadDataPromise at the same time, you must add this check
if (defer && defer.promise.$$state.status === 0) {
  return defer.promise;
}
otherwise you will be doing duplicate calls to the backend.
This design pattern caches whatever is returned the first time it runs, and returns the cached result every time it's called again.
const asyncTask = (cache => {
  return function() {
    // when called the first time, put the promise in the "cache" variable
    if (!cache) {
      cache = new Promise(function(resolve, reject) {
        setTimeout(() => {
          resolve('foo');
        }, 2000);
      });
    }
    return cache;
  };
})();

asyncTask().then(console.log);
asyncTask().then(console.log);
Explanation:
Simply wrap your function in a self-invoking function that returns a function (your original async function). The purpose of the wrapper is to provide an enclosing scope for the local variable cache, so that this variable is only accessible from the returned function and holds the same value every time asyncTask is called (other than the very first time).
