I have an Angular component which subscribes to its route params. Whenever the route params change, it reads the id from the params and calls a function to look up the record for that id. The function returns a promise. Upon resolving, the component sets a property on itself for data binding.
In some cases, it seems that the responses are getting out of order, and the wrong value is set. Here's what it looks like:
ngOnInit() {
  this.route.params.subscribe((params) => {
    const id = params['id'];
    console.log('processing', id);
    this.service.getPerson(id).then((person) => {
      console.log('processed id', id);
      this.person = person; // sometimes this happens out of order
    });
  });
}
So, let's say I have a textbox where I filter by name, and as I type, navigation events occur in which id becomes the first result for that name. Sometimes I see this:
processing 1
processing 2
processed 2
processed 1 <---- PROBLEM
What is the most elegant way to solve this? I have considered converting the promise to an observable and trying to cancel it, but that seems heavy-handed. I am trying to figure out if there's a way to serialize subscriptions, so that the second 'processing' doesn't begin until the first one completes.
You'll want to use either concatMap or switchMap. concatMap waits for the previous inner observable to complete before starting the next one. switchMap also preserves order, but it unsubscribes from the previous inner observable if it is still in flight when the next value arrives.
import { from } from 'rxjs'; // fromPromise in RxJS 5.5
import { concatMap, map, tap } from 'rxjs/operators';

ngOnInit() {
  this.route.params.pipe(
    map(params => params['id']),
    tap(id => console.log('processing', id)),
    concatMap(id => from(this.service.getPerson(id)).pipe(
      tap(person => console.log('processed id', id))
    )),
    tap(person => this.person = person)
  ).subscribe();
}
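Since the scenario here is "latest navigation wins" (typing in the filter box), the switchMap variant the answer describes would look like this. A sketch; note that switching does not cancel the underlying promise, it only discards its result:

ngOnInit() {
  this.route.params.pipe(
    map(params => params['id']),
    tap(id => console.log('processing', id)),
    // switchMap (imported from 'rxjs/operators' alongside the others) drops the
    // in-flight lookup when a new id arrives; the promise itself still runs,
    // its result is simply ignored
    switchMap(id => from(this.service.getPerson(id))),
    tap(person => this.person = person)
  ).subscribe();
}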
You could add a 'request id' to each call that gets echoed back to you. Then, in the callback, you can check whether the request id is higher than the last one you processed.
It's not ideal for small calls with single-integer responses, but I hid mine in the headers and used an interceptor/filter to echo that header back to the client on every request.
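A minimal sketch of the request-id idea applied to the original component (the counter fields and loadPerson are illustrative, not from the answer):

private lastRequestId = 0;
private lastAppliedId = 0;

// called from the route params subscription, e.g. this.loadPerson(params['id'])
loadPerson(id: string): void {
  const requestId = ++this.lastRequestId;
  this.service.getPerson(id).then(person => {
    // apply a response only if nothing newer has been applied already
    if (requestId > this.lastAppliedId) {
      this.lastAppliedId = requestId;
      this.person = person;
    }
  });
}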
So imagine you have an array of URLs:
urls: string[]
You make a collection of requests (in this example I am using Angular's HttpClient.get, which returns an Observable):
const requests = urls.map((url, index) => this.http.get<Film>(url));
Now I want to execute these requests concurrently, but not wait for all of the responses before showing anything. In other words, if I have something like films$: Observable<Film[]>, I want films$ to update gradually every time a response arrives.
To simulate this, you can update the requests above into something like this:
const requests = urls.map((url, index) => this.http.get<Film>(url).pipe(delay((index + 1) * 1000)));
With the above array of Observables you should get data from each request one by one, since the responses no longer arrive at the same time. Note that this merely fakes different arrival times; the requests themselves are still made concurrently.
The goal is to update the elements in films$ every time a value is emitted by any of the requests.
So before, when I still misunderstood how combineLatest works, I had something like this:
let films$: Observable<Film[]> = of([]);
const requests = urls.map(url => this.http.get<Film>(url)
  .pipe(
    take(1),
    // Without this handling, the respective observable never emits a value, and
    // combineLatest needs ALL of the observables to emit before it gives you results.
    // rxjs EMPTY would short-circuit as documented, so emit null instead and handle
    // the null elements in the template with *ngIf.
    catchError(() => of(null))
  ));
// Expect a value like [{...film1}, null, {...film2}] when, for example, one of the URLs is invalid.
films$ = combineLatest(requests);
I was expecting the above code to update films$ gradually, but I had overlooked a part of the documentation:
To ensure the output array always has the same length, combineLatest will actually wait for all input Observables to emit at least once, before it starts emitting results.
Which is not what I was looking for.
If there is an rxjs operator or function that can achieve what I am looking for, I could have a cleaner template by simply using the async pipe, without having to handle null values and failed requests.
I have also tried
this.films$ = from(urls).pipe(mergeMap(url => this.http.get<Film>(url)));
and
this.films$ = from(requests).pipe(mergeAll());
which isn't right, because the returned value type is Observable<Film> instead of the Observable<Film[]> that I can use on the template with *ngFor="let film of films$ | async". Instead, if I subscribe to it, it's as if I'm listening to a socket for one record and getting realtime updates (the individual responses coming in). I could manually subscribe to either of the two and push each result onto a separate films: Film[] property, for example, but that defeats the purpose (using an Observable on the template with the async pipe).
The scan operator will work nicely for you here:
// EMPTY (from 'rxjs') completes without emitting, so failed requests are skipped
const makeRequest = (url: string) => this.http.get<Film>(url).pipe(
  catchError(() => EMPTY)
);

films$: Observable<Film[]> = from(urls).pipe(
  mergeMap(url => makeRequest(url)),
  scan((films, film) => films.concat(film), [] as Film[])
);
Flow:
from emits the urls one at a time
mergeMap subscribes to makeRequest for each url and emits its result into the stream
scan accumulates the results into an array and emits each time a new result arrives
To preserve order, I would probably use combineLatest since it emits an array in the same order as the input observables. We can start each observable with startWith(undefined), then filter out the undefined items:
const requests = urls.map(url => this.http.get<Film>(url).pipe(startWith(undefined)));
films$: Observable<Film[]> = combineLatest(requests).pipe(
  map(films => films.filter(f => !!f))
);
Suppose I have a list of items that is queried from an API depending on a parameter that can be changed in the UI. When changing the value of this parameter, I dispatch an action:
this.store$.dispatch(new ChangeParameterAction(newParameterValue));
Now, on the receiving end, I want to trigger a new API call on every parameter change. I do this by subscribing to the store and then switching to the API observable. Then, I dispatch the result back into the store:
/** Statement 1 **/
this.store$.select(selectParameter).pipe(
  switchMap(parameter => this.doApiCall$(parameter))
).subscribe(apiResult => {
  this.store$.dispatch(new ResultGotAction(apiResult));
});
My UI is receiving the items by subscribing to
/** Statement 2 **/
this.store$.select(selectResults);
Now my question is: how can I join these two statements together so that the subscription for Statement 1 exists only for as long as the UI showing the results is active (and not destroyed)? The UI will always subscribe to Statement 2, but as written, Statement 1 is subscribed independently and never unsubscribed.
I've tried merging both observables and ignoring the elements from Statement 1, then subscribing to the merged observable. But this looks like a very unreadable way of doing such a basic task. I think there must be a better way, but I can't find one. Hope you can help!
I can't tell for certain whether this runs correctly, but I would move the dispatch of ResultGotAction into a tap operator and then switch to this.store$.select(selectResults).
For example:
this.store$.select(selectParameter).pipe(
  switchMap(parameter => this.doApiCall$(parameter)),
  tap(apiResult => this.store$.dispatch(new ResultGotAction(apiResult))),
  switchMapTo(this.store$.select(selectResults))
);
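To actually tie the lifetime to the view, one option (a sketch, not part of the original answer; Store and AppState are assumed NgRx types) is to build the stream in the component and let the template's async pipe own the subscription, so it is disposed when the component is destroyed:

export class ResultsComponent {
  results$: Observable<any>;

  constructor(private store$: Store<AppState>) {
    this.results$ = this.store$.select(selectParameter).pipe(
      switchMap(parameter => this.doApiCall$(parameter)),
      tap(apiResult => this.store$.dispatch(new ResultGotAction(apiResult))),
      switchMapTo(this.store$.select(selectResults))
    );
  }

  // the template subscribes and unsubscribes via the async pipe:
  // <li *ngFor="let result of results$ | async">...</li>
}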
I have three observable sources in my code that emit values of the same type.
const setTitle$ = params$.do(
  params => this.titleService.setTitle(`${params[1].appname} - ${this.pagename}`)
).switchMap(
  () => Observable.of(true)
);

const openDocument$ = params$.switchMap(
  params => this.openDocument(params[0].id)
);

const saveDocument$ = params$.switchMap(
  params => this.saveDocument(params[0].id)
);
When I use them in race like this:
setTitle$.race(
  openDocument$,
  saveDocument$
).subscribe();
only setTitle$ works, and when I subscribe manually to the other two sources, like this:
const openDocument$ = params$.switchMap(
  params => this.openDocument(params[0].id)
).subscribe();

const saveDocument$ = params$.switchMap(
  params => this.saveDocument(params[0].id)
).subscribe();
then they work too. Help me understand why this happens, and how to make all the sources work with race, merge, etc.
From the documentation, the .race() operator does this:
The observable to emit first is used.
That is why only one source is used: race mirrors whichever of the three observables emits first and ignores the others.
What you are looking for is .forkJoin() or .combineLatest().
If you want all the observables to execute in parallel and wait for ALL of them to come back as one observable, use .forkJoin():
Observable
  .forkJoin([setTitle$, openDocument$, saveDocument$])
  .subscribe(([setTitle, openDocument, saveDocument]) => {
    // do something with your results.
    // all three observables must complete; if any of them hasn't completed yet,
    // forkJoin waits for it before emitting
  });
If, however, you want to listen to every emission of all the observables regardless of when they are emitted, use .combineLatest():
Observable
  .combineLatest(setTitle$, openDocument$, saveDocument$)
  .subscribe(([setTitle, openDocument, saveDocument]) => {
    // do something with your results.
    // emits once every observable has emitted at least once, and again
    // whenever any of them emits a new value
  });
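The question also mentions merge. For completeness (this is not part of the answer above): merge subscribes to all of the sources at once and forwards every emission, which fits streams that exist only for their side effects, in the same RxJS 5 style:

Observable
  .merge(setTitle$, openDocument$, saveDocument$)
  .subscribe();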
The problem was with the shared params source.
const params$ = this.route.params.map(
  routeParams => {
    return {
      id: <string>routeParams['id']
    };
  }
).combineLatest(
  this.config.getConfig()
).share();
I had shared it with the share operator. But in the article linked in the first comment on my question, I found this:
When using multiple async pipes on streams with default values, the .share() operator might cause problems:
The share() will publish the first value of the stream on the first subscription. The first async pipe will trigger that subscription and get that initial value. The second async pipe however will subscribe after that value has already been emitted and therefore miss that value.
The solution for this problem is the .shareReplay(1) operator, which will keep track of the previous value of the stream. That way all the async pipes will get the last value.
I replaced share() with shareReplay(1) and all sources began emitting values.
const params$ = this.route.params.map(
  routeParams => {
    return {
      id: <string>routeParams['id']
    };
  }
).combineLatest(
  this.config.getConfig()
).shareReplay(1);
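For anyone who wants to see the difference in isolation, here is a minimal standalone sketch (written in current RxJS pipe syntax, not taken from the article):

import { interval } from 'rxjs';
import { take, share, shareReplay } from 'rxjs/operators';

// the source emits 0, 1, 2 at 100ms, 200ms, 300ms
const shared$ = interval(100).pipe(take(3), share());
shared$.subscribe(v => console.log('first:', v));                         // 0, 1, 2
setTimeout(() => shared$.subscribe(v => console.log('late:', v)), 150);   // 1, 2 (misses 0)

const replayed$ = interval(100).pipe(take(3), shareReplay(1));
replayed$.subscribe(v => console.log('first:', v));                       // 0, 1, 2
setTimeout(() => replayed$.subscribe(v => console.log('late:', v)), 150); // 0 (replayed), then 1, 2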
Thanks to everyone for help!
I might be off on the process, but here goes:
I have an Angular 2 service. The source of the data for this service is going to be localstorage... later optionally updated when a DB call using http returns. Because I'll be wanting to update the data returned as the various sources come back, it appears I want to use observables. For now, I'm just trying to get the concept down, so I've skipped the localstorage aspect... but I'm including the 'backstory' so it makes (some) sense as to why I'm wanting to do this in multiple methods.
My thought was I would have a "getHTTPEvents()" method that would return an observable whose payload is the events from the DB (the theory being that at some point in the future I'd also have a 'getLSEvents()' method that would piggyback in there).
To mock that up, I have this code:
private eventsUrl = 'app/mock-events.json';

getHTTPEvents(): Observable<Array<any>> {
  return this._http.get(this.eventsUrl)
    .map(response => response.json()['events'])
    .catch(this.handleError); // handleError is a logging method
}
My goal is to create a method that allows filtering on the returned events, yet still returns an observable to consumers of the service. That is where my problem is. With that goal, I have a public method which will be called by users of the service (I attempted to use the pattern from https://coryrylan.com/blog/angular-2-observable-data-services):
public getEvents(key: string, value: string): Observable<Array<any>> {
  var allEventsObserve: Observable<Array<any>> = this.getHTTPEvents();
  var filteredEventsObserve: Observable<Array<any>>;

  allEventsObserve
    .subscribe(
      events => {
        for (var i = 0; i < events.length; i++) {
          if (events[i][key] == value) {
            console.log('MATCH!!!' + events[i][key]); // THIS WORKS!
            // what do I need to return here? I want to return an observable
            // so the service consumer can get updates
            return new Observable(observer => filteredEventsObserve = observer);
          }
        }
        return allEventsObserve;
      },
      error => console.error("Error retrieving all events for filtering: " + error));
}
The above doesn't work. I've watched lots of videos and read lots of tutorials about observables, but nothing I can find goes more in depth than creating and using the http observable.
I further tried this method of making the new observable:
var newObs = Observable.create(function (observer) {
  observer.next(events[i]);
  observer.complete(events[i]);
});
And while at least that compiles, I'm not sure how to 'return' it at the right time... as I can't 'create' it outside the allEventsObserve.subscribe method (because 'events' doesn't exist there) and can't (it seems) 'return' it from within the subscribe. I'm also not entirely sure how I'd then 'trigger' the next...?
Do I need to modify the data within allEventsObserve and somehow simply still return that? Do I make a new observable (as attempted above) with the right payload - and if so, how do I trigger it? etc... I've checked here: How to declare an observable on angular2 but can't seem to follow how the 'second' observable gets triggered. Perhaps I have the entire paradigm wrong?
It appears that you're misunderstanding what an RxJS operator (like map, filter, etc) actually returns, and I think correcting that will make the solution clear.
Consider this short example:
allEventsObserve
  .map(events => {
    return 'this was an event';
  })
Granted, it's a pretty useless example since all of the data from events is lost, but let's ignore that for now. The result of the code above is not an array of strings or anything else; it's actually another Observable. This Observable will just emit the string 'this was an event' for each array of events emitted by allEventsObserve. This is what allows us to chain operators on observables -- each operator in the chain returns a new Observable that emits items that have been modified in some way by the previous operator.
allEventsObserve
  .map(events => {
    return 'this was an event';
  })
  .filter(events => typeof events !== 'undefined')
allEventsObserve is obviously an Observable, allEventsObserve.map() evaluates to an Observable, and so does allEventsObserve.map().filter().
So, since you're expecting your function to return an Observable, you don't want to call subscribe just yet, as doing so would return something that isn't really an Observable.
With that in mind, your code can be rewritten in the following way:
public getEvents(key: string, value: string): Observable<any> {
  var allEventsObserve: Observable<Array<any>> = this.getHTTPEvents();
  return allEventsObserve
    .map(events => {
      var match = events.filter(event => event[key] == value);
      if (match.length == 0) {
        throw 'no matching event found';
      } else {
        return match[0]; // note: emits the single matched event, not an array
      }
    })
    .catch(e => {
      console.log(e);
      return Observable.throw(e); // catch must return an Observable
    });
}
Since getEvents returns an Observable, somewhere else in your code you would do something like getEvents().subscribe(events => processEvents()) to interact with them. This code also assumes that this.getHTTPEvents() returns an Observable.
Also, notice that I changed your for loop to a call to filter, which operates on arrays. events in this case is a plain-old JavaScript Array, so the filter that is getting called is not the same filter as the RxJS operator filter.
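For example, a consumer of the service might look like this (eventService is an illustrative field name, not from the question):

this.eventService.getEvents('type', 'meeting')
  .subscribe(
    event => console.log('matched event:', event),
    error => console.error('lookup failed:', error)
  );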
We have an action that fetches an object asynchronously, let's call it getPostDetails, which takes as a parameter the id of the post to fetch. The user is presented with a list of posts and can click on one to get some details.
If a user clicks on "Post #1", we dispatch a GET_POST action which might look something like this.
const getPostDetails = (id) => ({
  type: c.GET_POST_DETAILS,
  promise: (http) => http.get(`http://example.com/posts/#${id}`),
  returnKey: 'facebookData'
})
This is picked up by a middleware, which adds a success handler to the promise that dispatches an action like GET_POST__OK with the deserialized JSON object. The reducer sees this object and applies it to the store. A typical __OK reducer looks like this:
[c.GET_ALL__OK]: (state, response) => assign(state, {
  currentPost: response.postDetails
})
Later down the line we have a component that looks at currentPost and displays the details for the current post.
However, we have a race condition. If a user dispatches two GET_POST_DETAILS actions one right after the other, there is no guarantee what order we receive the __OK actions in; if the second HTTP request finishes before the first, the state will become incorrect.
Action => Result
---------------------------------------------------------------------------------
|T| User Clicks Post #1 => GET_POST for #1 dispatched => Http Request #1 pending
|i| User Clicks Post #2 => GET_POST for #2 dispatched => Http Request #2 pending
|m| Http Request #2 Resolves => Results for #2 added to state
|e| Http Request #1 Resolves => Results for #1 added to state
V
How can we make sure the last item the user clicked always will take priority?
The problem is due to suboptimal state organization.
In a Redux app, state keys like currentPost are usually an anti-pattern. If you have to “reset” the state every time you navigate to another page, you'll lose one of the main benefits of Redux (or Flux): caching. For example, you can no longer navigate back instantly if any navigation resets the state and refetches the data.
A better way to store this information would be to separate postsById and currentPostId:
{
  currentPostId: 1,
  postsById: {
    1: { ... },
    2: { ... },
    3: { ... }
  }
}
Now you can fetch as many posts at the same time as you like, and independently merge them into the postsById cache without worrying whether the fetched post is the current one.
Inside your component, you would always read state.postsById[state.currentPostId], or better, export a getCurrentPost(state) selector from the reducer file so that the component doesn't depend on the specific state shape.
Now there are no race conditions and you have a cache of posts so you don’t need to refetch when you go back. Later if you want the current post to be controlled from the URL bar, you can remove currentPostId from Redux state completely, and instead read it from your router—the rest of the logic would stay the same.
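A minimal sketch of that shape as a reducer plus selector (the action names reuse the ones from the question; the action payload shapes and everything else here are illustrative assumptions):

const initialState = { currentPostId: null, postsById: {} };

function postsReducer(state = initialState, action) {
  switch (action.type) {
    case c.GET_POST_DETAILS:
      // remember which post the user asked for last; the fetch itself happens elsewhere
      return { ...state, currentPostId: action.id };
    case c.GET_POST__OK:
      // merge the fetched post into the cache; arrival order no longer matters
      return {
        ...state,
        postsById: { ...state.postsById, [action.post.id]: action.post }
      };
    default:
      return state;
  }
}

// selector, so components don't depend on the state shape
const getCurrentPost = (state) =>
  state.currentPostId != null ? state.postsById[state.currentPostId] : undefined;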
While this isn’t strictly the same, I happen to have another example with a similar problem. Check out the code before and the code after. It’s not quite the same as your question, but hopefully it shows how state organization can help avoid race conditions and inconsistent props.
I also recorded a free video series that explains these topics so you might want to check it out.
Dan's solution is probably a better one, but an alternative solution is to abort the first request when the second one begins.
You can do this by turning your action creator into an async one that can read from the store and dispatch other actions, which redux-thunk allows you to do.
The first thing your async action creator should do is check the store for an existing promise and abort it if there is one. If not, it can make the request and dispatch a 'request begins' action containing the promise object, which is stored for next time.
That way, only the most recently created promise will resolve. When one does, you can dispatch a success action with the received data. You can also dispatch an error action if the promise is rejected for some reason.
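A sketch of that idea with redux-thunk, using fetch with an AbortController so the stale request is actually cancelled (all names here are illustrative; the controller is kept in module scope rather than the store for brevity, and GET_POST__FAIL is a hypothetical failure action):

let inFlight: AbortController | null = null;

const getPostDetails = (id: number) => async (dispatch: any) => {
  // abort the previous request, if one is still pending
  if (inFlight) inFlight.abort();
  const controller = new AbortController();
  inFlight = controller;

  dispatch({ type: 'GET_POST_DETAILS', id });
  try {
    const response = await fetch(`http://example.com/posts/${id}`, { signal: controller.signal });
    const post = await response.json();
    dispatch({ type: 'GET_POST__OK', post });
  } catch (e: any) {
    // aborted requests are expected; only report real failures
    if (e.name !== 'AbortError') dispatch({ type: 'GET_POST__FAIL', error: e });
  } finally {
    if (inFlight === controller) inFlight = null;
  }
};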