Let's say there is an array of numbers. Periodically, with setInterval, you need to take and remove the first 5 elements, and based on those 5 numbers you need to do some async work as follows:
1. for each of the five numbers, get something from the DB
2. for each record in the result set from the DB, do another async operation
Error handling should be done in such a way that if any error occurs along the way, that number is returned to the array, ensuring the whole thing (1. and 2.) is repeated for that number.
Getting something from the DB based on one number is totally independent of doing the same with another number, and the same applies to the fetched records. So in both 1. and 2., Promise.all sounds fantastic, but the fact that Promise.all rejects if any passed promise rejects doesn't play nicely with the described scenario. Basically, if all calls to the DB resolve with records except the last one, which rejects, I'm not able to continue with the successful ones and their records.
So, not being able to solve this with native Promise support, I looked into Bluebird, with which I have very little experience, which is the reason I'm posting this question. Going through the API reference, it seems to me this could easily be solved with .reflect(). What I have tried:
setInterval(() => {
  const firstFive = numbers.splice(0, 5)
  Promise.all(firstFive.map((number) => firstAsync(number).reflect()))
    .each((result, index) => {
      if (result.isRejected()) {
        return numbers.unshift(firstFive[index])
      }
      return Promise.all(result.value().map((record) => secondAsync(record).reflect()))
        .each((result, index) => {
          if (result.isRejected()) {
            return numbers.unshift(firstFive[index])
          }
        })
    })
}, 500)
Am I using .reflect() in the intended way? Besides that, I'm blindly guessing here that .each((result, index) => {...}) is syntactic sugar for something like .then((results) => results.forEach((result, index) => {...})), isn't it?
I wonder why you are using Promise.all at all. You say that the five numbers are totally independent of each other, and it doesn't look like you needed to wait for all five of them to do anything.
So just handle them separately:
function doSomething(number) {
  return firstAsync(number)
    .then(secondAsync)
    .catch(e => {
      // something went wrong:
      numbers.unshift(number); // do it again in next batch
    });
}

setInterval(() => {
  const firstFive = numbers.splice(0, 5);
  firstFive.forEach(doSomething);
}, 500);
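If you ever do need to know when a whole batch has finished (for example, to log it or to avoid starting new work before the previous batch is done), you can still collect the promises. Since doSomething catches its own errors and therefore never rejects, plain Promise.all is safe here; a minimal sketch:

// Sketch only: doSomething never rejects (its .catch handles failures),
// so Promise.all resolves once every item in the batch has been processed.
setInterval(() => {
  const firstFive = numbers.splice(0, 5);
  Promise.all(firstFive.map(doSomething))
    .then(() => console.log('batch finished'));
}, 500);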
Related
I am having trouble with a fetch request; the thing is that for some reason it's returning the elements in random order. The function is the following:
function create_post_div(element, isLogged, currentUser, isProfile) {
  console.log(element.id);
  fetch(`likes/${element.id}`)
    .then((response) => response.json())
    .then((like_element) => {
      console.log(like_element.user_liked, element.id);
    });
}
For simplicity's sake I didn't include the whole function, but the problem can be seen with the console logs. The first console.log should print element.id, which is an integer, and the second should print like_element.user_liked, which is a boolean, along with the same element.id.
The create_post_div function has to run for each of the elements received from another function (let's say a list [1, 2, 3]). When I run it, the first console.log prints the numbers in the correct order, but the console.log inside the fetch callback prints them in random order. Another thing I noticed is that instead of the console output being something like:
1
true, 1
2
false, 2
3
true, 3
It logs like this:
1
2
3
false, 2
true, 3
true, 1
It first logs all of the ids and only afterwards logs the likes with their ids. Again, the ids in the first log are always in order, and the second log is always unordered.
When you use fetch you create a Promise, an operation that will take an undefined amount of time to complete. JavaScript doesn't wait for the promise to complete; it keeps running all the other code you have written.
Once the fetch operation is completed, the code that you put inside the .then is executed. That is why it's called asynchronous code: it's executed after an undefined amount of time.
Rather than fighting this and trying to force the code to run synchronously with async / await as suggested, I would recommend rewiring your brain to understand and embrace this pattern, unless it's absolutely necessary by design, which should be a very rare situation.
You're just dealing with the beauty of asynchronous operations.
See the last example on the TutorialsPoint NodeJS Course. This code:
const fs = require("fs");

fs.readFile("input.txt",
  (err, data) => err ? console.log(err.stack) : console.log(data.toString())
);

console.log("Program Ended");
outputs:
Program Ended
Some text.
In your case, you:
1. Call create_post_div, which logs the id and starts fetching.
2. Call create_post_div again, which logs the id again and starts fetching again.
3. Call create_post_div a third time, which logs the id again and starts fetching again.
Then, for each fetch (whose completion time is unpredictable, as you can see), the data is logged as it is received.
Until the data has been received, your code doesn't log anything; it only does so once the data arrives.
When you call fetch().then, the then method comes from a Promise. And a JavaScript Promise promises to return / do something in the future, but not right now.
If you want more explanation, I will leave you with a link to a similar question in the comments.
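If keeping the output in input order matters to you, one option (a sketch only, which assumes create_post_div is changed to return its fetch promise) is to collect the promises and log after Promise.all resolves, because Promise.all keeps its results in the same order as its input array:

// Sketch: create_post_div is assumed to be modified to return the fetch promise
function create_post_div(element, isLogged, currentUser, isProfile) {
  return fetch(`likes/${element.id}`)
    .then((response) => response.json())
    .then((like_element) => ({ id: element.id, liked: like_element.user_liked }));
}

// `posts` stands in for the list of elements from the question; the other
// arguments are whatever the caller already has in scope.
Promise.all(posts.map((el) => create_post_div(el, isLogged, currentUser, isProfile)))
  .then((results) => {
    // results are in the same order as `posts`, no matter which fetch finished first
    results.forEach((r) => console.log(r.liked, r.id));
  });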
fetch is asynchronous, so the console.log calls in the .then() callbacks can run in any order, as you observed.
Most of the time this is fine; it's usually the other parts of your code that should be changed to adopt this async pattern.
However, if you want to force it to run in a sequential order, you can do it with a reduce.
Here is an example:
const elements = [1, 2, 3];

function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

// (top-level await as used below needs an ES module; otherwise wrap the calls in an async function)

// this prints 3,2,1
await Promise.all(
  elements.map(async (i) => {
    await sleep(10 - i);
    console.log(i);
  })
);

// and this prints 1,2,3
await elements.reduce(async (prev, i) => {
  await prev;
  await sleep(10 - i);
  console.log(i);
}, undefined);
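The same sequential behaviour can also be written with a plain loop and await; a short sketch (like the snippet above, it has to run inside an async function or an ES module with top-level await):

// also prints 1,2,3
for (const i of elements) {
  await sleep(10 - i);
  console.log(i);
}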
I have been reading up on methods to implement a polling function and found a great article at https://davidwalsh.name/javascript-polling. Using setTimeout rather than setInterval to poll makes a lot of sense, especially with an API that I have no control over and that has shown varying response times.
So I tried to implement such a solution in my own code in order to challenge my understanding of callbacks, promises and the event loop. To avoid any anti-patterns I followed the guidance outlined in the post Is this a "Deferred Antipattern"?, and to ensure promise resolution before a .then() I followed promise resolve before inner promise resolved, and this is where I am getting stuck. I have put some code together to simulate the scenario so I can highlight the issues.
My hypothetical scenario is this:
I have an API call to a server which responds with a userID. I then use that userID to make a request to another database server, which returns a set of data after some machine learning processing that can take several minutes.
Due to the latency, the task is put onto a task queue, and once it is complete a NoSQL database entry is updated from isComplete: false to isComplete: true. This means we then need to poll the database every n seconds until we get a response indicating isComplete: true, and then stop polling. I understand there are a number of solutions for polling an API, but I have yet to see one involving promises, conditional polling, and avoiding the anti-patterns mentioned in the previously linked post. If I have missed anything and this is a repeat, I apologize in advance.
So far the process is outlined by the code below:
let state = false;

const userIdApi = () => {
  return new Promise((res, rej) => {
    console.log("userIdApi");
    const userId = "uid123";
    setTimeout(() => res(userId), 2000)
  })
}

const startProcessingTaskApi = userIdApi().then(result => {
  return new Promise((res, rej) => {
    console.log("startProcessingTaskApi");
    const msg = "Task submitted";
    setTimeout(() => res(msg), 2000)
  })
})

const pollDatabase = (userId) => {
  return new Promise((res, rej) => {
    console.log("Polling database with " + userId)
    setTimeout(() => res(true), 2000)
  })
}
Promise.all([userIdApi(), startProcessingTaskApi])
  .then(([resultsuserIdApi, resultsStartProcessingTaskApi]) => {
    const id = setTimeout(function poll(resultsuserIdApi) {
      console.log(resultsuserIdApi)
      return pollDatabase(resultsuserIdApi)
        .then(res => {
          state = res
          if (state === true) {
            clearTimeout(id);
            return;
          }
          setTimeout(poll, 2000, resultsuserIdApi);
        })
    }, 2000)
  })
I have a question that relates to this code, as it is failing to carry out the polling the way I need:
I saw in the accepted answer to the post How do I access previous promise results in a .then() chain? that one should "break the chain" to avoid huge chains of .then() statements. I followed that guidance and it seemed to do the trick (before adding the polling); however, when I console.logged every line, it turned out that userIdApi is executed twice: once where it is used in the startProcessingTaskApi definition, and then again on the Promise.all line.
Is this a known occurrence? It makes sense why it happens; I am just wondering whether it is fine to send two requests executing the same promise, or if there is a way to prevent the first request from happening and restrict the execution to the Promise.all statement.
I am fairly new to JavaScript, having come from Python, so any pointers on where I may be missing some knowledge to get this seemingly simple task working would be greatly appreciated.
I think you're almost there; it seems you're just struggling with the asynchronous nature of JavaScript. Using promises is definitely the way to go here, and understanding how to chain them together is key to implementing your use case.
I would start by implementing a single method that wraps setTimeout to simplify things down.
function delay(millis) {
  return new Promise((resolve) => setTimeout(resolve, millis));
}
Then you can re-implement the "API" methods using the delay function.
const userIdApi = () => {
  return delay(2000).then(() => "uid123");
};

// Make userId an argument to this method (like pollDatabase) so we don't need to get it twice.
const startProcessingTaskApi = (userId) => {
  return delay(2000).then(() => "Task submitted");
};

const pollDatabase = (userId) => {
  return delay(2000).then(() => true);
};
You can continue polling the database by simply chaining another promise in the chain when your condition is not met.
function pollUntilComplete(userId) {
  return pollDatabase(userId).then((result) => {
    if (!result) {
      // Result is not ready yet, chain another database polling promise.
      return pollUntilComplete(userId);
    }
  });
}
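In a real deployment you would probably also want a pause between attempts and a limit on how long to keep polling. Here is a hedged variant of the same function; the maxAttempts parameter and the 2000 ms retry delay are invented for illustration, not part of the original answer:

// Sketch: wait between polls and give up after a fixed number of attempts.
function pollUntilComplete(userId, maxAttempts = 30) {
  if (maxAttempts <= 0) {
    return Promise.reject(new Error('Polling timed out'));
  }
  return pollDatabase(userId).then((result) => {
    if (!result) {
      // Not ready yet: wait, then chain another polling attempt.
      return delay(2000).then(() => pollUntilComplete(userId, maxAttempts - 1));
    }
  });
}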
Then you can put everything together to implement your use case.
userIdApi().then((userId) => {
  // Add task processing to the promise chain.
  return startProcessingTaskApi(userId).then(() => {
    // Add database polling to the promise chain.
    return pollUntilComplete(userId);
  });
}).then(() => {
  // Everything is done at this point.
  console.log('done');
}).catch((err) => {
  // An error occurred at some point in the promise chain.
  console.error(err);
});
This becomes a lot easier if you're able to actually use the async and await keywords.
Using the same delay function as in Jake's answer:
async function doItAll(userID) {
  await startProcessingTaskApi(userID);
  while (true) {
    if (await pollDatabase(userID)) break;
  }
}
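A minimal usage sketch, wiring in the userIdApi from the answer above (the logging is just for illustration):

userIdApi()
  .then((userID) => doItAll(userID))
  .then(() => console.log('done'))
  .catch((err) => console.error(err));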
I am currently struggling to wrap my head around Angular (2+), the HttpClient and Observables.
I come from a promise async/await background, and what I would like to achieve in Angular is the equivalent of:
//(...) Some boilerplate to showcase how to avoid callback hell with promises and async/await
async function getDataFromRemoteServer() {
  this.result = await httpGet(`/api/point/id`);
  this.dependentKey = someComplexSyncTransformation(this.result);
  this.dependentResult = await httpGet(`/api/point/id/dependent/keys/${this.dependentKey}`);
  this.deeplyNestedResult = await httpGet(`/api/point/id/dependent/keys/${this.dependentResult.someValue}`);
}
The best I could come up with in Angular is:
import { HttpClient } from '@angular/common/http';

//(...) boilerplate to set the component up.
constructor(private http: HttpClient) {}

// somewhere in a component.
getDataFromRemoteServer() {
  this.http.get(`/api/point/id`).subscribe(result => {
    this.result = result;
    this.dependentKey = someComplexSyncTransformation(this.result);
    this.http.get(`/api/point/id/dependent/keys/${this.dependentKey}`).subscribe(dependentResult => {
      this.dependentResult = dependentResult;
      this.http.get(`/api/point/id/dependent/keys/${this.dependentResult.someValue}`).subscribe(deeplyNestedResult => {
        this.deeplyNestedResult = deeplyNestedResult;
      });
    });
  });
}
//...
As you might have noticed, I am entering the Pyramid of Doom with this approach, which I would like to avoid.
So how could I write the Angular snippet in a way that avoids this?
Thx!
Ps: I am aware of the fact that you can call .toPromise on the result of the .get call.
But let's just assume I want to go the total Observable way, for now.
When working with observables, you won't call subscribe very often. Instead, you'll use the various operators to combine observables together, forming a pipeline of operations.
To take the output of one observable and turn it into another, the basic operator is map. This is similar to how you can .map an array to produce another array. For a simple example, here's doubling all the values of an observable:
const myObservable = of(1, 2, 3).pipe(
  map(val => val * 2)
);
// myObservable is an observable which will emit 2, 4, 6
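For completeness, subscribing to it would then log the mapped values; a tiny usage sketch:

// logs 2, then 4, then 6
myObservable.subscribe(val => console.log(val));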
Mapping is also what you do to take an observable for one http request, and then make another http request. However, we will need one additional piece, so the following code is not quite right:
const myObservable = http.get('someUrl').pipe(
  map(result => http.get('someOtherUrl?id=' + result.id))
);
The problem with this code is that it creates an observable that spits out other observables. A 2-dimensional observable if you like. We need to flatten this down so that we have an observable that spits out the results of the second http.get. There are a few different ways to do the flattening, depending on what order we want the results to be in if multiple observables are emitting multiple values. This is not much of an issue in your case since each of these http observables will only emit one item. But for reference, here are the options:
mergeMap will let all the observables run in whatever order, and outputs in whatever order the values arrive. This has its uses, but can also result in race conditions
switchMap will switch to the latest observable, and cancel old ones that may be in progress. This can eliminate race conditions and ensure you have only the latest data.
concatMap will finish the entirety of the first observable before moving on to the second. This can also eliminate race conditions, but won't cancel old work.
Like I said, it doesn't matter much in your case, but I'd recommend using switchMap. So my little example above would become:
const myObservable = http.get('someUrl').pipe(
  switchMap(result => http.get('someOtherUrl?id=' + result.id))
);
Now here's how I can use those tools with your code. In this code example, I'm not saving all the this.result, this.dependentKey, etc.:
getDataFromRemoteServer() {
  return this.http.get(`/api/point/id`).pipe(
    map(result => someComplexSyncTransformation(result)),
    switchMap(dependentKey => this.http.get(`/api/point/id/dependent/keys/${dependentKey}`)),
    switchMap(dependentResult => this.http.get(`/api/point/id/dependent/keys/${dependentResult.someValue}`))
  );
}
// to be used like:
getDataFromRemoteServer()
  .subscribe(deeplyNestedResult => {
    // do whatever with deeplyNestedResult
  });
If it's important to you to save those values, then I'd recommend using the tap operator to highlight the fact that you're generating side effects. tap will run some code whenever the observable emits a value, but will not mess with the value:
getDataFromRemoteServer() {
  return this.http.get(`/api/point/id`).pipe(
    tap(result => this.result = result),
    map(result => someComplexSyncTransformation(result)),
    tap(dependentKey => this.dependentKey = dependentKey),
    // ... etc
  );
}
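For reference, the operators used above would typically be imported like this; the exact paths are an assumption about the project's RxJS version (RxJS 6 style shown, while RxJS 7+ also allows importing operators directly from 'rxjs'):

// RxJS 6 style imports (assumption about the RxJS version in use)
import { of } from 'rxjs';
import { map, switchMap, tap } from 'rxjs/operators';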
I would like to do a synchronous loop in a part of my code.
The function saveInDatabase checks whether the item title (a string) already exists in the database. That's why it can't be resolved in parallel; otherwise the condition would never apply (and duplicates would be created).
Promise.all(arr.map(item => {
  saveInDatabase(item).then((myResult) => ... );
}));
I tried to encapsulate this function into separate promises, and also tried npm packages (synchronous.js, sync), but it seems they do not fit with my code.
Maybe this solution is completely silly.
Do you think it's a better idea to replace Promise.all with a synchronous loop (forEach, for example)? The problem is that I need the results of each iteration...
I'm using Node 6.11.2. Could you give me some tips to handle that ?
Thank you in advance.
Without using await (which is not available in node.js v6.11.2, but would make this simpler), a classic pattern for serializing a bunch of async operations that each return a promise is a reduce() loop like this:
arr.reduce(function(p, item) {
    return p.then(function() {
        return saveInDatabase(item).then((myResult) => ... );
    });
}, Promise.resolve()).then(function() {
    // all done here
}).catch(function(err) {
    // error here
});
If you want to save all the results, you can use your .then(myResult => ...) handler to .push() the result into an array which you can access when done.
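For example, a hedged sketch of that results-collecting variant (still assuming saveInDatabase returns a promise):

const results = [];
arr.reduce(function(p, item) {
    return p.then(function() {
        return saveInDatabase(item).then(function(myResult) {
            results.push(myResult);
        });
    });
}, Promise.resolve()).then(function() {
    // results[] now holds one entry per item, in the original order
}).catch(function(err) {
    // error here
});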
This will serialize all the calls to saveInDatabase(item) so that it waits for the first one to be done before calling the second one, waits for the second one to be done before calling the third one, and so on.
The default implementation here will stop if saveInDatabase(item) rejects. If you want to keep going even when it gives an error (you don't say in your question), then you can add a .catch() to it to turn the rejected promise into a fulfilled promise.
In node.js v7+, you can use await in a regular for loop:
async function myFunc() {
    let results = [];
    for (let item of arr) {
        let r = await saveInDatabase(item).then((myResult) => ... );
        results.push(r);
    }
    return results;
}

myFunc().then(function(results) {
    // all done here
}).catch(function(err) {
    // error here
});
If you could run all the requests in parallel, then you could do that like this:
Promise.all(arr.map(item => {
    return saveInDatabase(item).then((myResult) => ... );
})).then(function(results) {
    // all done here
}).catch(function(err) {
    // error here
});
In any of these, if you don't want it to stop upon a rejection, then add a .catch() to your saveInDatabase() promise chain to turn the rejection into a resolved promise with some known value or error value you can detect.
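As an illustration of that last point, a hedged sketch of swallowing per-item failures so the parallel version keeps going (the sentinel object shape used here is invented, not part of the answer):

Promise.all(arr.map(item => {
    // Convert each rejection into a known marker value so Promise.all still resolves.
    return saveInDatabase(item).catch(function(err) {
        return { failed: true, item: item, error: err };
    });
})).then(function(results) {
    // results contains real values or { failed: true, ... } markers, in order
});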
Let's suppose I have the following:
var myurls = ['http://server1.com', 'http://server2.com', 'http:sever2.com', etc ]
Each url is a "fallback" and should be used only if the previous one cannot be reached. In other words, this list specifies a priority. Let's also assume that this list can be of any length; I don't know it in advance and must iterate.
How do I go about writing a function, lets say "reachability" that loops through this array and returns the first reachable server?
I can't do $http.all, as it is parallel. I can't run a while loop with an $http.get inside, because the result may come later and in the meantime my UI will freeze.
Please note I am not using jQuery. I am using Ionic, which has a version of jQuery-lite in it.
Various examples I've seen talk about chaining them in .then, which is fine if you know the number of URLs beforehand, but I don't.
thanks
Just reduce over the array:
myurls.reduce(
  (p, url) => p.catch(() => http.get(url).then(() => url)),
  Promise.reject()
);
Flow explained:
It's based on the perhaps more common pattern of using reduce to build a promise chain, like so: [func1, func2].reduce((p, f) => p.then(f), Promise.resolve()) is equivalent to Promise.resolve().then(func1).then(func2) (the last argument of reduce is the initial value).
In your case, since you're retrying on failure, you want to build a retry (or reject) chain, so we must start with Promise.reject() instead: Promise.reject().catch(func1).catch(func2).
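Putting it together, a usage sketch (assuming, as in the snippet above, that http.get rejects when a server cannot be reached):

myurls
  .reduce((p, url) => p.catch(() => http.get(url).then(() => url)), Promise.reject())
  .then(firstReachableUrl => {
    // first server that responded
  })
  .catch(() => {
    // every URL failed
  });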
I guess recursion and chaining could suit your needs:
var findFirstReachableUrl = function (urls) {
  if (urls.length > 0) {
    return $http.get(urls[0]).then(function () {
      return urls[0];
    }, function () {
      return findFirstReachableUrl(urls.slice(1));
    });
  } else {
    return $q.reject("No reachable URL");
  }
}
Call:
findFirstReachableUrl(myurls).then(function (firstReachableUrl) {
  // OK: do something with firstReachableUrl
}, function () {
  // KO: no url could be reached
});