I have an RxJS sequence being consumed in the normal manner...
However, in the observable 'onNext' handler, some operations complete synchronously, while others require async callbacks that must finish before the next item in the input sequence is processed.
I'm a little confused about how to do this. Any ideas? Thanks!
someObservable.subscribe(
    function onNext(item) {
        if (item == 'do-something-async-and-wait-for-completion') {
            setTimeout(function () {
                console.log('okay, we can continue');
            }, 5000);
        } else {
            // do something synchronously and keep on going immediately
            console.log('ready to go!!!');
        }
    },
    function onError(error) {
        console.log('error');
    },
    function onComplete() {
        console.log('complete');
    }
);
Each operation you want to perform can be modeled as an observable. Even the synchronous operation can be modeled this way. Then you can use map to convert your sequence into a sequence of sequences, then use concatAll to flatten the sequence.
someObservable
    .map(function (item) {
        if (item === "do-something-async") {
            // create an Observable that will do the async action when it is subscribed
            // return Rx.Observable.timer(5000);
            // or maybe an ajax call? Use `defer` so that the call does not
            // start until concatAll() actually subscribes.
            return Rx.Observable.defer(function () { return Rx.Observable.ajaxAsObservable(...); });
        } else {
            // do something synchronous but model it as an async operation (using Observable.return)
            // Use defer so that the sync operation is not carried out until
            // concatAll() reaches this item.
            return Rx.Observable.defer(function () {
                return Rx.Observable.return(someSyncAction(item));
            });
        }
    })
    .concatAll() // consume each inner observable in sequence
    .subscribe(function (result) {
    }, function (error) {
        console.log("error", error);
    }, function () {
        console.log("complete");
    });
To reply to some of your comments... at some point you need to force some expectations on the stream of functions. In most languages, when dealing with functions that may be async, the function signature is async and the actual async-vs-sync nature of the function is hidden as an implementation detail. This is true whether you are using JavaScript promises, Rx observables, C# Tasks, C++ futures, etc. The functions end up returning a promise/observable/task/future, and if the function is actually synchronous, the object it returns is simply already completed.
Having said that, since this is JavaScript, you can cheat:
var makeObservable = function (func) {
    return Rx.Observable.defer(function () {
        // execute the function and then examine the returned value.
        // if the returned value is *not* an Rx.Observable, then
        // wrap it using Observable.return
        var result = func();
        return result instanceof Rx.Observable ? result : Rx.Observable.return(result);
    });
};
someObservable
    .map(makeObservable)
    .concatAll()
    .subscribe(function (result) {
    }, function (error) {
        console.log("error", error);
    }, function () {
        console.log("complete");
    });
First of all, move your async operations out of subscribe; it is not made for async operations.
What you can use instead is mergeMap (alias flatMap) or concatMap. (I mention both, but concatMap is just mergeMap with the concurrent parameter set to 1.) Setting a different concurrent parameter is useful when you want to limit the number of concurrent queries but still run a few at a time.
source.concatMap(item => {
    if (item == 'do-something-async-and-wait-for-completion') {
        return Rx.Observable.timer(5000)
            .mapTo(item)
            .do(e => console.log('okay, we can continue'));
    } else {
        // do something synchronously and keep on going immediately
        return Rx.Observable.of(item)
            .do(e => console.log('ready to go!!!'));
    }
}).subscribe();
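For comparison, here is a minimal sketch (not from the original answer) of the same pipeline using mergeMap with its numeric concurrency argument, so up to two items are processed at a time while the rest wait:

// Sketch only: same handlers as above, but allowing 2 items in flight at once.
// mergeMap accepts a concurrency limit as its second (numeric) argument.
source.mergeMap(item => {
    if (item == 'do-something-async-and-wait-for-completion') {
        return Rx.Observable.timer(5000)
            .mapTo(item)
            .do(e => console.log('okay, we can continue'));
    }
    return Rx.Observable.of(item)
        .do(e => console.log('ready to go!!!'));
}, 2).subscribe();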
I will also show how you can rate limit your calls. A word of advice: only rate limit at the point where you actually need it, such as when calling an external API that allows only a certain number of requests per second or per minute. Otherwise it is better to just limit the number of concurrent operations and let the system move at maximum speed.
We start with the following snippet:
let concurrent;  // number of items allowed in flight at once
let delay;       // minimum time (ms) each item occupies its slot

source.mergeMap(item =>
    selector(item, delay)
, concurrent)
Next, we need to pick values for concurrent and delay, and implement selector. concurrent and delay are closely related. For example, if we want to run 10 items per second, we can use concurrent = 10 and delay = 1000 (milliseconds), but also concurrent = 5 and delay = 500, or concurrent = 4 and delay = 400. The number of items per second will always be concurrent / (delay / 1000).
Now let's implement selector. We have a couple of options. We can set a minimal execution time for selector, we can add a constant delay to it, we can emit the results as soon as they are available, we can emit the result only after the minimal delay has passed, etc. It is even possible to add a timeout by using the timeout operator. Convenient.
Set minimal time, send result early:
function selector(item, delay) {
    return Rx.Observable.of(item)
        .delay(1000) // replace this with your actual call.
        .merge(Rx.Observable.timer(delay).ignoreElements());
}
Set minimal time, send result late:
function selector(item, delay) {
    return Rx.Observable.of(item)
        .delay(1000) // replace this with your actual call.
        .zip(Rx.Observable.timer(delay), (item, _) => item);
}
Add time, send result early:
function selector(item, delay) {
    return Rx.Observable.of(item)
        .delay(1000) // replace this with your actual call.
        .concat(Rx.Observable.timer(delay).ignoreElements());
}
Add time, send result late:
function selector(item, delay) {
    return Rx.Observable.of(item)
        .delay(1000) // replace this with your actual call.
        .delay(delay);
}
Another simple example of doing manual async operations.
Be aware that this is not good reactive practice! If you only want to wait 1000 ms, use Rx.Observable.timer or the delay operator.
someObservable.flatMap(response => {
    return Rx.Observable.create(observer => {
        setTimeout(() => {
            observer.next('the returned value');
            observer.complete();
        }, 1000);
    });
}).subscribe();
Now, replace setTimeout with your async function, such as Image.onload or FileReader.onload.
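For example, here is a minimal sketch (the loadImage helper and the response.url field are illustrative assumptions, not from the original answer) that wraps Image.onload in the same way:

// Sketch only: wrap an image load in an observable that emits the loaded
// image and then completes; errors are forwarded to the observer.
function loadImage(url) {
    return Rx.Observable.create(observer => {
        const img = new Image();
        img.onload = () => {
            observer.next(img);
            observer.complete();
        };
        img.onerror = err => observer.error(err);
        img.src = url;
    });
}

someObservable.flatMap(response => loadImage(response.url)).subscribe();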
Related
I am calling two APIs with two different functions, each with its own setTimeout. Is it possible to handle their delays in such a way that they finish in 10 s instead of 15 s?
function delay(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

async function getXml() {
    let ans;
    await delay(10000);
    const { data } = await axios.get('https://gist.githubusercontent.com/SwayamShah97/a3619c5828ac8ed8085c4ae295a855d9/raw/e4e372552e042bd8bd9e8ab87da93eb030114f86/people.xml');
    xml2js.parseString(data, (err, result) => {
        if (err) {
            throw err;
        }
        ans = result;
    });
    return ans;
}

async function getPeople() {
    await delay(5000);
    const { data } = await axios.get('https://gist.githubusercontent.com/SwayamShah97/0f2cb53ddfae54eceea083d4aa8d0d65/raw/d7d89c672057cf7d33e10e558e001f33a10868b2/people.json');
    return data; // this will be the array of people objects
}
Is there a way to run this code in only 10 s, so that both APIs get called within a 10-second period?
Forcing either one or both axios.get() functions to complete below some time limit (other than failing on a timeout), isn't doable, since you don't control the transport.
One thing you can do is force one or more functions to complete at or after some time threshold, with something like this...
function padToTime(promise, interval) {
    // delay returns a promise that resolves after an interval
    const delay = interval => new Promise(resolve => setTimeout(resolve, interval));
    // caller can provide a singular or an array of promises, avoiding the extra .all
    let promises = Array.isArray(promise) ? promise : [promise];
    return Promise.all([...promises, delay(interval)])
        .then(results => results.slice(0, -1));
}
EDIT: good idea from @VLAZ to append an extra promise that forces the minimal time (then slice its result off later).
The caller can say:
async function getXml() {
    // op function, no calls to "delay"
}

async function getPeople() {
    // op function, no calls to "delay"
}

// run these both together, have them take *no fewer than* 10s
padToTime([getXml(), getPeople()], 10000).then(results => {
    // we'll get here in no fewer than 10sec with the promise all results
});
How do I check that every element passes the test when the test is async and can throw an error?
const exists = async (element) => { /* can throw an error */ };
const allExists = [12, 5, 8, 130, 44].every(exists);
You can't use synchronous methods like every with functions that do asynchronous work, because the synchronous method won't wait for the asynchronous result. It's possible to write an async every, though:
async function every(array, callback) {
    for (const element of array) {
        const result = await callback(element);
        if (!result) {
            return false;
        }
    }
    return true;
}
Note that because it relies on asynchronous information, it too delivers its result asynchronously. You have to await it (or use .then/.catch, etc.).
That version works in series, only calling the callback for the second entry when the callback for the first has finished its work. That lets you short-circuit (not call the callback for later entries when you already know the answer), but may make it take longer in overall time. Another approach is to do all of the calls in parallel, then check their results:
Promise.all(array.map(callback))
    .then(flags => flags.every(Boolean))
    .then(result => {
        // ...`result` will be `true` or `false`...
    });
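For illustration, a small usage sketch (assuming the every helper above and the asker's exists test). Note that in both versions a rejection from exists propagates, so wrap the call in try/catch if an error should count as a failed test:

async function check() {
    // Series version: stops calling `exists` as soon as one check returns false.
    const inSeries = await every([12, 5, 8, 130, 44], exists);

    // Parallel version: start all checks at once, then combine the flags.
    const flags = await Promise.all([12, 5, 8, 130, 44].map(exists));
    const inParallel = flags.every(Boolean);

    console.log(inSeries, inParallel);
}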
I am new to RxJS, but I need to use this asynchronous regime to fulfill the following task:
I have many calculation requests, say 10K, and I want to execute them in pre-determined batches of 1K each. Only when the current batch is done will I move on to the next one. The function will be:
calcBatch(dataset)
The input dataset will come from an array of datasets: datasets = [...]
The synchronous loop would look like:
datasets.forEach(dataset => {
    while (!calculation_is_done) {
        wait();
    }
    calcBatch(dataset);
});

function calcBatch(dataset) {
    calculation_is_done = false;
    /* calculation */
    calculation_is_done = true;
}
Now, switching to an asynchronous regime, how should I construct the flow? What I am thinking is that calcBatch will return a promise or an observable when the work is done. Then, in the loop, a subscriber will listen on that promise or observable, and once it resolves, calcBatch will be called for the next batch.
The calculation needs to be processed in batches because the backend (HTTP) cannot handle the full set of calculations at once.
I think bufferCount and concatMap might be what you need:
// Assumes RxJS 6+ imports: `from` from 'rxjs', `bufferCount` and `concatMap` from 'rxjs/operators'.
function calcBatch(dataset) {
    console.log('Processing batch:', dataset);
    // Fake async request
    return new Promise(res => {
        setTimeout(() => res(dataset), 2000);
    });
}

from(datasets)
    .pipe(
        bufferCount(1000),
        concatMap(dataset => calcBatch(dataset))
    )
    .subscribe(dataset => {
        console.log('Batch done:', dataset);
    });
Note: calcBatch needs to return a promise or an observable.
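For instance, a minimal sketch (the '/calc' endpoint and the use of RxJS's ajax helper are assumptions, not part of the original answer) of calcBatch returning an observable instead of a promise:

// Sketch only: assumes `import { ajax } from 'rxjs/ajax'` (RxJS 6+)
// and a hypothetical '/calc' endpoint that accepts one batch per request.
function calcBatch(dataset) {
    return ajax.post('/calc', { items: dataset }, { 'Content-Type': 'application/json' });
}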
I am looking at https://www.promisejs.org/patterns/ and it mentions that Promise.resolve can be used if you need a value in the form of a promise, like:
var value = 10;
var promiseForValue = Promise.resolve(value);
What would be the use of a value in promise form though since it would run synchronously anyway?
If I had:
var value = 10;
var promiseForValue = Promise.resolve(value);
promiseForValue.then(resp => {
    myFunction(resp)
})
wouldn't just using value directly, without wrapping it in a Promise, achieve the same thing:
var value = 10;
myFunction(10);
Say you write a function that sometimes fetches something from a server but at other times returns immediately; you will probably want that function to always return a promise:
function myThingy() {
    if (someCondition) {
        return fetch('https://foo');
    } else {
        return Promise.resolve(true);
    }
}
It's also useful if you receive some value that may or may not be a promise. You can wrap it in a promise, and now you are sure it's a promise:
const myValue = someStrangeFunction();
// Guarantee that myValue is a promise
Promise.resolve(myValue).then( ... );
In your examples, yes, there's no point in calling Promise.resolve(value). The use case is when you do want to wrap an already existing value in a Promise, for example to maintain the same API from a function. Say I have a function that conditionally does something that would return a promise; the caller of that function shouldn't be the one figuring out what the function returned, the function itself should just make that uniform. For example:
const conditionallyDoAsyncWork = (something) => {
    if (something == somethingElse) {
        return Promise.resolve(false)
    }
    return fetch(`/foo/${something}`)
        .then((res) => res.json())
}
Then users of this function don't need to check if what they got back was a Promise or not:
const doSomethingWithData = () => {
    conditionallyDoAsyncWork(someValue)
        .then((result) => result && processData(result))
}
As a side note, using async/await syntax both hides that and makes it a bit easier to read, because any value you return from an async function is automatically wrapped in a Promise:
const conditionallyDoAsyncWork = async (something) => {
    if (something == somethingElse) {
        return false
    }
    const res = await fetch(`/foo/${something}`)
    return res.json()
}

const doSomethingWithData = async () => {
    const result = await conditionallyDoAsyncWork(someValue)
    if (result) processData(result)
}
Another use case: a dead-simple async queue using Promise.resolve() as the starting point.
let current = Promise.resolve();

function enqueue(fn) {
    current = current.then(fn);
}

enqueue(async () => { console.log("async task") });
Edit, in response to OP's question.
Explanation
Let me break it down for you step by step.
enqueue(task) adds the task function as a callback to promise.then, and replaces the original current promise reference with the newly returned thenPromise.
current = Promise.resolve()
thenPromise = current.then(task)
current = thenPromise
As per the promise spec, if the task function in turn returns yet another promise (call it task() -> taskPromise), then thenPromise will only resolve when taskPromise resolves. thenPromise is practically equivalent to taskPromise; it's just a wrapper. Let's rewrite the above code as:
current = Promise.resolve()
taskPromise = current.then(task)
current = taskPromise
So if you go like:
enqueue(task_1)
enqueue(task_2)
enqueue(task_3)
it expands into
current = Promise.resolve()
task_1_promise = current.then(task_1)
task_2_promise = task_1_promise.then(task_2)
task_3_promise = task_2_promise.then(task_3)
current = task_3_promise
This effectively forms a linked-list-like structure of promises that executes the task callbacks in sequential order.
Usage
Let's study a concrete scenario. Imagine you need to handle websocket messages in sequential order.
Let's say you need to do some heavy computation upon receiving messages, so you decide to send it off to a worker thread pool. Then you write the processed result to another message queue (MQ).
But here's the requirement: the MQ expects the write order of messages to match the order in which they arrive from the websocket stream. What do you do?
Suppose you cannot pause the websocket stream; you can only handle the messages locally, as quickly as possible.
Take One:
websocket.on('message', (msg) => {
    sendToWorkerThreadPool(msg).then(result => {
        writeToMessageQueue(result)
    })
})
This may violate the requirement, because sendToWorkerThreadPool may not return results in the original order: it's a pool, and some threads may return faster if their workload is lighter.
Take Two:
websocket.on('message', (msg) => {
    const task = () => sendToWorkerThreadPool(msg).then(result => {
        writeToMessageQueue(result)
    })
    enqueue(task)
})
This time we enqueue (defer) the whole process, so we can ensure the task execution order stays sequential. But there's a drawback: we lose the benefit of using a thread pool, because each call to sendToWorkerThreadPool only fires after the previous one completes. This model is equivalent to using a single worker thread.
Take Three:
websocket.on('message', (msg) => {
    const promise = sendToWorkerThreadPool(msg)
    const task = () => promise.then(result => {
        writeToMessageQueue(result)
    })
    enqueue(task)
})
The improvement over take two is that we call sendToWorkerThreadPool immediately, without deferring, but we still enqueue/defer the writeToMessageQueue part. This way we make full use of the thread pool for computation while still ensuring the sequential write order to the MQ.
I rest my case.
I'm having trouble wrapping my head around promises. I'm using the Google Earth API to do a 'tour' of addresses. A tour is just an animation that lasts about a minute, and when one completes, the next should start.
Here's my function that does a tour:
var tourAddress = function (address) {
    return tourService.getLatLong(address).then(function (coords) {
        return tourService.getKmlForCoords(coords).then(function (kml) {
            _ge.getTourPlayer().setTour(kml);
            _ge.getTourPlayer().play();
            var counter = 0;
            var d = $q.defer();
            var waitForTour = function () {
                if (counter < _ge.getTourPlayer().getDuration()) {
                    ++counter;
                    setTimeout(waitForTour, 1000);
                } else {
                    d.resolve();
                }
            };
            waitForTour();
            return d.promise;
        });
    });
};
This seems to work pretty well. It starts the animation and returns a promise that resolves when the animation is complete. Now I have an array of addresses, and I want to do a tour for each of them:
$scope.addresses.forEach(function (item) {
    tourAddress(item.address).then(function () {
        $log.log(item.address + " complete");
    });
});
When I do this, they all execute at the same time (Google Earth does the animation for the last address), and they all complete at the same time. How do I chain these so each one fires after the previous one completes?
UPDATE
I used @phtrivier's great help to achieve it:
$scope.addresses.reduce(function (curr, next) {
    return curr.then(function () {
        return tourAddress(next.address);
    });
}, Promise.resolve()).then(function () {
    $log.log('all complete');
});
You're right, the requests are done immediately, because calling tourAddress(item.address) does a request and returns a Promise resolved when the request is done.
Since you're calling tourAddress in a loop, many Promises are generated, but they don't depend on each other.
What you want is to call tourAddress, take the returned Promise, wait for it to be resolved, and then call tourAddress again with another address, and so on.
Manually, you would have to write something like:
tourAddress(addresses[0])
    .then(function () {
        return tourAddress(addresses[1]);
    })
    .then(function () {
        return tourAddress(addresses[2]);
    })
    // ...etc...
If you want to do that automatically (you're not the first one: How can I execute array of promises in sequential order?), you could try reducing the list of addresses to a single Promise that will run the whole chain.
_.reduce(addresses, function (memo, address) {
    return memo.then(function () {
        // This returns a Promise
        return tourAddress(address);
    });
}, Promise.resolve());
(This is pseudo-code that uses underscore and bluebird, but it should be adaptable.)
It's because you're doing things asynchronously, so they'll all run at the same time. This answer should help you rewrite your loop so each address runs after the previous one completes:
Asynchronous Loop of jQuery Deferreds (promises)
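Not from the original answers, but the same sequential pattern can also be written with async/await (assuming tourAddress keeps returning a promise, as above):

// Sketch only: run the tours one after another using a plain loop.
async function tourAll(addresses) {
    for (const item of addresses) {
        await tourAddress(item.address); // wait for each tour before starting the next
        $log.log(item.address + ' complete');
    }
    $log.log('all complete');
}

tourAll($scope.addresses);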