Unsubscribing from an observable when another observable is unsubscribed - javascript

Suppose I have a list of items that is queried from an API depending on a parameter that can be changed in the UI. When changing the value of this parameter, I dispatch an action:
this.store$.dispatch(new ChangeParameterAction(newParameterValue));
Now, on the receiving end, I want to trigger a new API call on every parameter change. I do this by subscribing to the store and then switching to the API observable. Then, I dispatch the result back into the store:
/** Statement 1 **/
this.store$.select(selectParameter).pipe(
  switchMap(parameter => this.doApiCall$(parameter))
).subscribe(apiResult => {
  this.store$.dispatch(new ResultGotAction(apiResult));
});
My UI is receiving the items by subscribing to
/** Statement 2 **/
this.store$.select(selectResults);
Now my question is: how can I join these two statements together so that the subscription from Statement 1 only exists for as long as the UI showing the results is active (i.e. not destroyed)? The UI always subscribes to the result of Statement 2, but Statement 1 is subscribed separately, so it never gets unsubscribed.
I've tried merging both observables, ignoring the elements from Statement 1, and then subscribing to the merged observable. But that feels like a very unreadable way to do such a basic task. I think there must be a better way, but I can't find one. Hope you can help!

I can't say for certain that this runs as-is, but I would move the dispatch of ResultGotAction into a tap operator and then switch over to this.store$.select(selectResults).
For example:
this.store$.select(selectParameter).pipe(
  switchMap(parameter => this.doApiCall$(parameter)),
  tap(apiResult => this.store$.dispatch(new ResultGotAction(apiResult))),
  switchMapTo(this.store$.select(selectResults))
);
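If this works as intended, the component can assign this whole chain to a single field and let the template's async pipe manage the one subscription, so the API subscription from Statement 1 lives exactly as long as the view. A minimal sketch, assuming a results$ field and the async pipe (neither is part of the original question):
// In the component: hold the combined chain from above in one field.
this.results$ = this.store$.select(selectParameter).pipe(
  switchMap(parameter => this.doApiCall$(parameter)),
  tap(apiResult => this.store$.dispatch(new ResultGotAction(apiResult))),
  switchMapTo(this.store$.select(selectResults))
);
// In the template, the async pipe subscribes once and unsubscribes when the view is destroyed,
// which also tears down the inner API subscription:
// <ng-container *ngIf="results$ | async as results"> ... </ng-container>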

Related

How to wait and execute other code until multiple subscribed observables are finished - javascript

I have a piece of code that contains multiple HTTP requests which are subscribed to.
Here is an example of what one such subscribe call looks like:
this.http.post<any>(url, body).subscribe({
  next: data => {
    console.log(data);
  }
});
But depending on user input, not all of them are used (so sometimes some of them are never reached).
After all the necessary subscriptions are done, a function has to be called. Let's name that function subscribesDone().
How do I execute that function only after all the subscribes are done (while some of the subscribe functions may not be used, depending on the situation)?
This isn't a very "proper" answer, but it's a way to do it until others give better answers:
Make an array of booleans, one for every HTTP call, all set to true. When a call is made, set its array slot to false, and set it back to true when the call returns. On every return, after setting the correct slot back to true, check whether all of the booleans are true. If they are, call the final function. A rough sketch of this is below.
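A sketch of that idea (the names pending, startCall and subscribesDone are placeholders; wire it up to your own requests):
import { Observable } from 'rxjs';

declare function subscribesDone(): void; // defined elsewhere, per the question

// One flag per possible HTTP call; true means "not currently waiting on this call".
const pending: boolean[] = [true, true, true];

function startCall(index: number, request$: Observable<unknown>): void {
  pending[index] = false;                   // this call is now in flight
  request$.subscribe({
    next: data => console.log(data),
    complete: () => {
      pending[index] = true;                // the call has returned
      if (pending.every(done => done)) {    // calls that were never started stay true
        subscribesDone();
      }
    }
  });
}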

How to run an array of requests with rxjs like forkJoin and combineLatest but without having to wait for ALL to complete before seeing the results?

So imagine you have an array of URLs:
urls: string[]
You make a collection of requests (in this example I am using Angular's HttpClient.get, which returns an Observable):
const requests = urls.map((url, index) => this.http.get<Film>(url));
Now I want to execute these requests concurrently, but not wait for all the responses before seeing anything. In other words, if I have something like films$: Observable<Film[]>, I want films$ to update gradually every time a response arrives.
Now to simulate this, you can update the requests above into something like this
const requests = urls.map((url, index) => this.http.get<Film>(url).pipe(delay((index + 1) * 1000)));
With the above array of Observables you should get the data from each request one by one, since the responses don't arrive at the same time. Note that this is just faking the different arrival times of the data from the individual requests; the requests themselves should still be made concurrently.
The goal is to update the elements in films$ every time a value is emitted by any of the requests.
So before, I had something like this, back when I misunderstood how combineLatest works:
let films$: Observable<Film[]> = of([]);
const requests = urls.map(url => this.http.get<Film>(url).pipe(
  take(1),
  // Without this handling, the respective observable does not emit a value, and you need ALL of the Observables to emit a value before combineLatest gives you results.
  // rxjs EMPTY short circuits the function as documented. Handle the null elements in the template with *ngIf.
  catchError(() => of(null))
));
// Expect a value like [{...film1}, null, {...film2}] for when one of the URLs is invalid, for example.
films$ = combineLatest(requests);
I was expecting the above code to update films$ gradually, overlooking this part of the documentation:
To ensure the output array always has the same length, combineLatest will actually wait for all input Observables to emit at least once, before it starts emitting results.
Which is not what I was looking for.
If there is an rxjs operator or function that can achieve what I am looking for, I can have a cleaner template by simply using the async pipe, without having to handle null values and failed requests.
I have also tried
this.films$ = from(urls).pipe(mergeMap(url => this.http.get<Film>(url)));
and
this.films$ = from(requests).pipe(mergeAll());
which isn't right, because the returned value type is Observable<Film> instead of the Observable<Film[]> that I can use in the template with *ngFor="let film of films$ | async". Instead, if I subscribe to it, it's as if I'm listening to a socket for one record and getting updates in real time (the individual responses coming in). I could manually subscribe to either of the two and Array.push into a separate films: Film[] property, for example, but that defeats the purpose (using an Observable in the template with the async pipe).
The scan operator will work nicely for you here:
const makeRequest = url => this.http.get<Film>(url).pipe(
  catchError(() => EMPTY)
);

films$: Observable<Film[]> = from(urls).pipe(
  mergeMap(url => makeRequest(url)),
  scan((films, film) => films.concat(film), [])
);
Flow:
from emits the urls one at a time
mergeMap subscribes to makeRequest(url) and emits its result into the stream
scan accumulates the results into an array and emits it each time a new result is received
To preserve order, I would probably use combineLatest since it emits an array in the same order as the input observables. We can start each observable with startWith(undefined), then filter out the undefined items:
const requests = urls.map(url => this.http.get<Film>(url).pipe(startWith(undefined)));

films$: Observable<Film[]> = combineLatest(requests).pipe(
  map(films => films.filter(f => !!f))
);
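Either version of films$ can then be rendered directly with the async pipe, which was the stated goal. An illustrative component sketch using the scan approach (the component, the URLs, and the template markup are assumptions; Film stands in for the question's own interface):
import { Component } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { EMPTY, Observable, from } from 'rxjs';
import { catchError, mergeMap, scan } from 'rxjs/operators';

// Simplified stand-in for the question's Film interface:
interface Film { title: string; }

@Component({
  selector: 'app-films',
  // The async pipe subscribes for us and re-renders each time scan emits a longer array.
  template: `<div *ngFor="let film of films$ | async">{{ film.title }}</div>`
})
export class FilmsComponent {
  films$: Observable<Film[]>;

  constructor(private http: HttpClient) {
    const urls: string[] = ['/api/films/1', '/api/films/2', '/api/films/3']; // placeholder URLs
    this.films$ = from(urls).pipe(
      mergeMap(url => this.http.get<Film>(url).pipe(catchError(() => EMPTY))),
      scan((films: Film[], film: Film) => films.concat(film), [] as Film[])
    );
  }
}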

Using RxJS for an unknown number of consecutive HTTP requests

I need to fetch a large number of data points from our API.
However, these can't all be fetched at once, as the response time would be too long, so I want to break it up into multiple requests. The response looks something like this:
{
  "href": "www.website.com/data?skip=0&limit=150",
  "nextHref": "www.website.com/data?skip=150&limit=150",
  "maxCount": 704,
  "skip": 0,
  "count": 150,
  "limit": 150,
  "results": [...]
}
So, ultimately I need to continually call the nextHref until we actually reach the last one.
After each request, I want to take the results and concatenate them into a list of data, which will be updated on the UI.
I am relatively new to the world of Observables, but I would like to create a solution with RxJS. Does anyone have an idea of how to implement this?
The part that gets me the most is that I don't know in advance how many requests I will have to make. It just needs to keep looping until it's done.
It looks like you can determine the number of calls to make after the first response is received. So, we can make the first call, and build an observable that returns the results of the "first call" along with the results of all subsequent calls.
We can use scan to accumulate the results into a single array.
const results$ = makeApiCall(0, 150).pipe(
  switchMap(firstResponse => {
    const pageCount = Math.ceil(firstResponse.maxCount / firstResponse.limit);
    const pageOffsets = Array(pageCount - 1).fill(0).map((_, i) => (i + 1) * firstResponse.limit);
    return concat(
      of(firstResponse),
      from(pageOffsets).pipe(
        mergeMap(offset => makeApiCall(offset, firstResponse.limit), MAX_CONCURRENT_CALLS)
      )
    );
  }),
  scan((acc, cur) => acc.concat(cur.results), [])
);
Here's a breakdown of what this does:
we first call makeApiCall() so we can determine how many other calls need to be made
from creates an observable that emits our array of offsets one at a time
mergeMap will execute our subsequent calls to makeApiCall() with the passed in offsets and emit the results. Notice you can provide a "concurrency" limit, to control how many calls are made at a time.
concat is used to return an observable that emits the first response, followed by the results of the subsequent calls
switchMap subscribes to this inner observable and emits the results
scan is used to accumulate the results into a single array
Here's a working StackBlitz demo.
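For reference, makeApiCall() is just whatever wrapper you have around your HTTP client. A hypothetical Angular version matching the response shape from the question might look like this (the URL, the ApiResponse interface, and the MAX_CONCURRENT_CALLS value are assumptions):
// Assumed response shape, based on the example in the question:
interface ApiResponse {
  href: string;
  nextHref: string;
  maxCount: number;
  skip: number;
  count: number;
  limit: number;
  results: unknown[];
}

const MAX_CONCURRENT_CALLS = 4; // cap on parallel requests; tune as needed

// Hypothetical wrapper around Angular's HttpClient (injected as this.http):
makeApiCall(skip: number, limit: number): Observable<ApiResponse> {
  return this.http.get<ApiResponse>('www.website.com/data', {
    params: { skip: skip.toString(), limit: limit.toString() }
  });
}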

RXJS Subject, return the same value on every first subscribe

I know I need to use startWith, but I'm still trying to figure out how to use it. If I just do Subject.create().startWith("Some Value"), it turns the Subject into an Observable, and I can't use next to emit.
So multiple subscribers should be able to subscribe to it, and I should be able to call next on the Subject. I'm going through the docs of Subject.create(), but it's going slowly.
Edit:
I got it to work after using the accepted solution. The reason it wasn't working before is that I put the .next call inside the subscription.
Eg:
observable.subscribe((res) => {
  // do something
  s.next('another res');
});
This creates an infinite loop, and I think RxJS prevented it? Anyway, I put the next in there for debugging purposes. I moved it outside of that subscribe block, and now an initial result emits; then, when next is called, whatever was inside subscribe emits again.
You should avoid using Subject.create() and just use new Subject(). See: Subject vs AnonymousSubject
Just keep a reference to the Subject instance and another reference to the Observable chain you need:
let s = new Subject();
let observable = s.startWith("Some initial message");
observable.subscribe(...);
s.next('whatever');
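Note that calling startWith as an instance method is RxJS 5 syntax. With RxJS 6+ pipeable operators the same idea looks like this (a sketch with the same behavior):
import { Subject } from 'rxjs';
import { startWith } from 'rxjs/operators';

const s = new Subject<string>();
// Every new subscriber first gets the initial message, then the live values pushed through the Subject.
const observable = s.pipe(startWith('Some initial message'));

observable.subscribe(value => console.log(value)); // logs "Some initial message" immediately
s.next('whatever');                                // then logs "whatever"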

How to avoid race conditions when fetching data with Redux?

We have an action that fetches an object asynchronously, let's call it getPostDetails, which takes as a parameter the id of the post to fetch. The user is presented with a list of posts and can click on one to get some details.
If a user clicks on "Post #1", we dispatch a GET_POST action which might look something like this.
const getPostDetails = (id) => ({
  type: c.GET_POST_DETAILS,
  promise: (http) => http.get(`http://example.com/posts/#${id}`),
  returnKey: 'facebookData'
})
This is picked up by a middleware, which adds a success handler to the promise, which will call an action like
GET_POST__OK with the deserialized JSON object. The reducer sees this object and applies it to a store. A typical
__OK reducer looks like this.
[c.GET_ALL__OK]: (state, response) => assign(state, {
  currentPost: response.postDetails
})
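For context, a hypothetical version of the middleware described above might look roughly like this (the question doesn't show the real one; the naming and the currying over http are assumptions):
// Applied as applyMiddleware(promiseMiddleware(http)).
const promiseMiddleware = (http) => (store) => (next) => (action) => {
  if (!action.promise) {
    return next(action);
  }
  next({ type: action.type });                                   // e.g. GET_POST_DETAILS
  return action.promise(http).then(
    (response) => next({ type: `${action.type}__OK`, response }),
    (error) => next({ type: `${action.type}__ERR`, error })
  );
};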
Later down the line we have a component that looks at currentPost and displays the details for the current post.
However, we have a race condition. If a user submits two GET_POST_DETAILS actions one right after the other, there is no guarantee what order we receive the __OK actions in; if the second HTTP request finishes before the first, the state will become incorrect.
Action => Result
---------------------------------------------------------------------------------
|T| User Clicks Post #1 => GET_POST for #1 dispatched => Http Request #1 pending
|i| User Clicks Post #2 => GET_POST for #2 dispatched => Http Request #2 pending
|m| Http Request #2 Resolves => Results for #2 added to state
|e| Http Request #1 Resolves => Results for #1 added to state
V
How can we make sure the last item the user clicked always will take priority?
The problem is due to suboptimal state organization.
In a Redux app, state keys like currentPost are usually an anti-pattern. If you have to “reset” the state every time you navigate to another page, you'll lose one of the main benefits of Redux (or Flux): caching. For example, you can no longer navigate back instantly if any navigation resets the state and refetches the data.
A better way to store this information would be to separate postsById and currentPostId:
{
  currentPostId: 1,
  postsById: {
    1: { ... },
    2: { ... },
    3: { ... }
  }
}
Now you can fetch as many posts at the same time as you like, and independently merge them into the postsById cache without worrying whether the fetched post is the current one.
Inside your component, you would always read state.postsById[state.currentPostId], or better, export getCurrentPost(state) selector from the reducer file so that the component doesn’t depend on specific state shape.
Now there are no race conditions and you have a cache of posts so you don’t need to refetch when you go back. Later if you want the current post to be controlled from the URL bar, you can remove currentPostId from Redux state completely, and instead read it from your router—the rest of the logic would stay the same.
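A minimal sketch of the corresponding reducers and selector (the action names and response shape are assumptions based on the question's snippets):
// postsById caches every post ever fetched; a late response just merges in and never clobbers another post.
const postsById = (state = {}, action) => {
  switch (action.type) {
    case c.GET_POST_DETAILS__OK:
      return { ...state, [action.response.postDetails.id]: action.response.postDetails };
    default:
      return state;
  }
};

// currentPostId only tracks which post the user last clicked.
const currentPostId = (state = null, action) => {
  switch (action.type) {
    case c.GET_POST_DETAILS:
      return action.id;
    default:
      return state;
  }
};

// Components read the current post through a selector instead of knowing the state shape.
export const getCurrentPost = (state) => state.postsById[state.currentPostId];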
While this isn’t strictly the same, I happen to have another example with a similar problem. Check out the code before and the code after. It’s not quite the same as your question, but hopefully it shows how state organization can help avoid race conditions and inconsistent props.
I also recorded a free video series that explains these topics so you might want to check it out.
Dan's solution is probably a better one, but an alternative solution is to abort the first request when the second one begins.
You can do this by splitting your action creator into an async one which can read from the store and dispatch other actions, which redux-thunk allows you to do.
The first thing your async action creator should do is check the store for an existing promise, and abort it if there is one. If not, it can make the request, and dispatch a 'request begins' action, which contains the promise object, which is stored for next time.
That way, only the most recently created promise will resolve. When one does, you can dispatch a success action with the received data. You can also dispatch an error action if the promise is rejected for some reason.
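A rough sketch of that idea with redux-thunk, using fetch with an AbortController to do the actual aborting (all names are illustrative; the original answer describes storing the promise itself in the store, which is the same idea):
// Reference to the controller for the in-flight request (could also live in the store, as described above).
let inFlightController = null;

const getPostDetails = (id) => async (dispatch) => {
  if (inFlightController) {
    inFlightController.abort();          // cancel the previous request so only the latest can resolve
  }
  inFlightController = new AbortController();

  dispatch({ type: 'GET_POST_DETAILS', id });
  try {
    const response = await fetch(`http://example.com/posts/${id}`, { signal: inFlightController.signal });
    const postDetails = await response.json();
    dispatch({ type: 'GET_POST_DETAILS__OK', postDetails });
  } catch (err) {
    if (err.name !== 'AbortError') {
      dispatch({ type: 'GET_POST_DETAILS__ERROR', error: err });
    }
  }
};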
