How to reset an RxJS scan operator based on another Observable - javascript

I have a component which triggers an onScrollEnd event when the last item in a virtual list is rendered. This event triggers a new API request to fetch the next page and merges the results with the previous ones using the scan operator.
This component also has a search field which triggers an onSearch event.
How do I clear the previous accumulated results from the scan operator when a search event is triggered? Or do I need to refactor my logic here?
const loading$ = new BehaviorSubject(false);
const offset$ = new BehaviorSubject(0);
const search$ = new BehaviorSubject(null);

const options$: Observable<any[]> = merge(offset$, search$).pipe(
  // 1. Start the loading indicator.
  tap(() => loading$.next(true)),
  // 2. Fetch new items based on the offset.
  switchMap(([offset, searchterm]) => userService.getUsers(offset, searchterm)),
  // 3. Stop the loading indicator.
  tap(() => loading$.next(false)),
  // 4. Complete the Observable when there is no 'next' link.
  takeWhile((response) => response.links.next),
  // 5. Map the response.
  map(({ data }) =>
    data.map((user) => ({
      label: user.name,
      value: user.id
    }))
  ),
  // 6. Accumulate the new options with the previous options.
  scan((acc, curr) => {
    // TODO: Don't merge on search$.next
    return [...acc, ...curr];
  })
);

// Fetch next page
onScrollEnd: (offset: number) => offset$.next(offset);

// Fetch search results
onSearch: (term) => {
  search$.next(term);
};

To manipulate the state of a scan you can write higher-order functions that take the old state and the new update. Combine them with the merge operator. This way you stick to a clean stream-oriented solution without any side effects.
const { Subject, merge } = rxjs;
const { scan, map } = rxjs.operators;

const add$ = new Subject();
const clear$ = new Subject();

const add = (value) => (state) => [...state, value];
const clear = () => (state) => [];

const result$ = merge(
  add$.pipe(map(add)),
  clear$.pipe(map(clear))
).pipe(
  scan((state, innerFn) => innerFn(state), [])
);

result$.subscribe(result => console.log(...result));

add$.next(1);
add$.next(2);
clear$.next();
add$.next(3);
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/6.5.3/rxjs.umd.min.js"></script>
This method can easily be extended and/or adapted for other state use cases in RxJS.
Example (remove last item)
const removeLast$ = new Subject();
const removeLast = () => (state) => state.slice(0, -1);

merge(
  ..
  removeLast$.pipe(map(removeLast)),
  ..
)

I think you could achieve what you want just by restructuring your chain (I'm omitting tap calls that trigger loading for simplicity):
search$.pipe(
  switchMap(searchterm =>
    concat(
      userService.getUsers(0, searchterm),
      offset$.pipe(concatMap(offset => userService.getUsers(offset, searchterm)))
    ).pipe(
      map(({ data }) =>
        data.map((user) => ({
          label: user.name,
          value: user.id
        }))
      ),
      scan((acc, curr) => [...acc, ...curr], [])
    )
  )
);
Every emission from search$ will create a new inner Observable with its own scan that will start with an empty accumulator.
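If you also want the loading indicator from the question, one way to add it back in (just a sketch, reusing the loading$, search$, offset$ and userService identifiers from the question) is to set it before each request and clear it after each response:
import { concat } from 'rxjs';
import { concatMap, map, scan, switchMap, tap } from 'rxjs/operators';

// Sketch only: the restructured chain with the question's loading$ taps added back.
const options$ = search$.pipe(
  tap(() => loading$.next(true)), // a new search starts loading
  switchMap(searchterm =>
    concat(
      userService.getUsers(0, searchterm),
      offset$.pipe(
        tap(() => loading$.next(true)), // each scroll-triggered page starts loading
        concatMap(offset => userService.getUsers(offset, searchterm))
      )
    ).pipe(
      tap(() => loading$.next(false)), // a response arrived, stop loading
      map(({ data }) => data.map(user => ({ label: user.name, value: user.id }))),
      scan((acc, curr) => [...acc, ...curr], [])
    )
  )
);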

Found a working solution: I check the current offset by using withLatestFrom before the scan operator and reset the accumulator if needed based on this value.
Stackblitz demo
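Since that approach is only described in prose here, a rough sketch of what it might look like (an assumption on my part; the linked Stackblitz is the authoritative version). It assumes the onSearch handler also pushes offset$.next(0) so a new term starts paging from 0 again:
import { combineLatest } from 'rxjs';
import { map, scan, switchMap, tap, withLatestFrom } from 'rxjs/operators';

// Rough reconstruction, reusing the subjects and service from the question.
const options$ = combineLatest([offset$, search$]).pipe(
  tap(() => loading$.next(true)),
  switchMap(([offset, searchterm]) => userService.getUsers(offset, searchterm)),
  tap(() => loading$.next(false)),
  map(({ data }) => data.map(user => ({ label: user.name, value: user.id }))),
  // Look at the current offset right before accumulating...
  withLatestFrom(offset$),
  // ...and start a fresh accumulator when the offset is back at 0 (i.e. a new search).
  scan((acc, [options, offset]) => (offset === 0 ? options : [...acc, ...options]), [])
);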

This is an interesting stream. Thinking about it, offset$ and search$ are really 2 separate streams, though, with different logic, and so should be merged at the very end and not the beginning.
Also, it seems to me that searching should reset the offset to 0, and I don't see that in the current logic.
So here's my idea:
const offsettedOptions$ = offset$.pipe(
  tap(() => loading$.next(true)),
  withLatestFrom(search$),
  concatMap(([offset, searchterm]) => userService.getUsers(offset, searchterm)),
  tap(() => loading$.next(false)),
  map(({ data }) =>
    data.map((user) => ({
      label: user.name,
      value: user.id
    }))
  ),
  scan((acc, curr) => [...acc, ...curr], [])
);

const searchedOptions$ = search$.pipe(
  tap(() => loading$.next(true)),
  concatMap(searchterm => userService.getUsers(0, searchterm)),
  tap(() => loading$.next(false)),
  map(({ data }) =>
    data.map((user) => ({
      label: user.name,
      value: user.id
    }))
  )
);

const options$ = merge(offsettedOptions$, searchedOptions$);
See if that works or would make sense. I may be missing some context.
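If searching should indeed reset the offset to 0, a small sketch of how the component handlers from the question could do that:
// Sketch only, reusing the handler names from the question: a new search also
// resets offset$ so offsettedOptions$ starts paging from the top again.
onScrollEnd: (offset: number) => offset$.next(offset);
onSearch: (term) => {
  offset$.next(0);
  search$.next(term);
};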

I know it's old, but I just needed to do the same thing and have another solution to throw into the mix.
There are really just two actions the user can trigger:
const search$ = new Subject<string>();
const offset$ = new Subject<number>();
We don't really care about offset$ until search$ emits, and at that point, we want it to be 0 to start over. I would write it like this:
const results$ = search$.pipe( // Search emits
  switchMap((searchTerm) => {
    return offset$.pipe( // Start watching offset
      startWith(0), // We want a value right away, so set it to 0
      switchMap((offset) => {
        return userService.getUsers(offset, searchTerm); // get the stuff
      })
    );
  })
);
At this point we are resetting the offset every time search$ emits, and any time offset$ emits we make a fresh api call fetching the desired resources. We need the collection to reset if search$ emits, so I believe the right place is inside switchMap wrapping the offset$ pipe.
const results$ = search$.pipe( // Search emits
  switchMap((searchTerm) => {
    return offset$.pipe( // Start watching offset
      startWith(0), // We want a value right away, so set it to 0
      switchMap((offset) => {
        return userService.getUsers(offset, searchTerm); // get the stuff
      }),
      takeWhile((response) => response.links.next), // stop when we know there are no more.
      // Turn the data into a useful shape
      map(({ data }) =>
        data.map((user) => ({
          label: user.name,
          value: user.id
        }))
      ),
      // Append the new data to the existing list
      scan((list, response) => {
        return [ // merge
          ...list,
          ...response
        ];
      }, [])
    );
  })
);
The great part here is that the scan is reset on every new search$ emission.
The final bit: I would move loading$ out of tap and declare it separately. The final code should look something like this:
const search$ = new Subject<string>();
const offset$ = new Subject<number>();

let results$: Observable<{ label: string; value: string }[]>;
results$ = search$.pipe( // Search emits
  switchMap((searchTerm) => {
    return offset$.pipe( // Start watching offset
      startWith(0), // We want a value right away, so set it to 0
      switchMap((offset) => {
        return userService.getUsers(offset, searchTerm); // get the stuff
      }),
      takeWhile((response) => response.links.next), // stop when we know there are no more.
      // Turn the data into a useful shape
      map(({ data }) =>
        data.map((user) => ({
          label: user.name,
          value: user.id
        }))
      ),
      // Append the new data to the existing list
      scan((list, response) => {
        return [ // merge
          ...list,
          ...response
        ];
      }, [])
    );
  })
);

const loading$ = merge(
  search$.pipe(mapTo(true)), // set to true whenever search emits
  offset$.pipe(mapTo(true)), // set to true when offset emits
  results$.pipe(mapTo(false)) // set to false when we get new results
);

results$.subscribe((results) => {
  console.log(results);
});
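For completeness, a sketch of how the component events from the question might feed these subjects. Note that with plain Subjects (rather than BehaviorSubjects) nothing is fetched until the first search term is pushed:
// Sketch, reusing the event names from the question.
onSearch: (term: string) => search$.next(term); // starts a fresh list; the offset restarts at 0 via startWith(0)
onScrollEnd: (offset: number) => offset$.next(offset); // appends the next page for the current term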

Related

RxJS - Keep a cache of the list and update the existing ones

I have a WebSocket connection that does 2 things:
On the first 'DATA' event it sends the full list (i.e. 5 items).
On each subsequent 'DATA' event it sends information about the updated items only.
I want to take that stream, process it, keep a cache of the items, and do the following:
Keep the existing list.
If a new event arrives and the item is in the list, update it based on an id (this should be generic enough).
If it doesn't exist, add it to the list.
This is what I have done so far, which isn't much. I am appending the items every time. Any help would be appreciated:
function createCachedList$<T extends WSMessage<T>>(observable$: Observable<T>) {
  const INITIAL_STATE: any[] = [];

  const [fromDataPackets$, fromNonDataPackets$] = partition(
    observable$,
    (value) => value.type === WSMessageType.DATA
  );

  const pickDataPacket = fromDataPackets$.pipe(
    map((value: any) => value?.data),
    scan((prevState, currState: any[]) => {
      const nextState = R.uniq([...prevState, ...currState]);
      return [...prevState, ...nextState];
    }, INITIAL_STATE),
    tap((data: any) => console.log('Data:', data)),
    map((data: any) => ({ type: WSMessageType.DATA, data }))
  );

  return merge(pickDataPacket, fromNonDataPackets$);
}

export default createCachedList$;
Your code seems OK; scan is the operator I would use.
You probably need to elaborate the logic within scan a bit. Something like this could help:
scan((prevState, currState: any[]) => {
  currState.forEach(m => {
    const item = prevState.find(s => s.id === m.id);
    if (item) {
      Object.assign(item, m);
    } else {
      prevState.push(m);
    }
  });
  return prevState;
}, INITIAL_STATE),
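If you would rather not mutate the accumulator (the version above updates items in place with Object.assign and pushes onto prevState), an immutable sketch of the same upsert-by-id logic, assuming the items carry an id field as above:
scan((prevState: any[], currState: any[]) => {
  // Upsert each incoming item by id without mutating the previous state.
  return currState.reduce((state, incoming) => {
    const index = state.findIndex((existing) => existing.id === incoming.id);
    if (index === -1) {
      return [...state, incoming]; // new item: append it
    }
    const next = state.slice();
    next[index] = { ...state[index], ...incoming }; // existing item: merge the update
    return next;
  }, prevState);
}, INITIAL_STATE),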

RxJS: do something on first emit from multiple subscriptions

Is there a clean way to do something on the first emit from multiple subscriptions?
e.g.:
this.subscription1 = this.service.getData1().subscribe(data => {
  this.data1 = data;
  console.log('1');
});

this.subscription2 = this.service.getData2().subscribe(data => {
  this.data2 = data;
  console.log('2');
});

// Do something after first emit from subscription1 AND subscription2
doSomething();
...
doSomething() {
  console.log('Hello world !');
}
Output goal:
1
2
Hello world !
1
2
1
1
2
1
2
2
...
There have been multiple times where I also needed such an isFirst operator that runs some predicate only for the first emission. I've slapped together a quick custom operator that uses a single state variable, first, to decide whether the emission is indeed the first one and runs the predicate using the tap operator.
Since it uses tap internally, it does not modify the source emission in any way. It only runs the passed predicate when the emission is indeed the first.
Try the following
isFirst() operator
export const isFirst = (predicate: any) => {
  let first = true;
  return <T>(source: Observable<T>) => {
    return source.pipe(
      tap({
        next: _ => {
          if (first) {
            predicate();
            first = false;
          }
        }
      })
    );
  };
};
To combine multiple streams so that the result is triggered when any of the sources emits, you could use the RxJS combineLatest function.
Example
import { Component } from "@angular/core";
import { timer, Observable, Subject, combineLatest } from "rxjs";
import { tap, takeUntil } from "rxjs/operators";

@Component({
  selector: "my-app",
  template: `<button (mouseup)="stop$.next()">Stop</button>`
})
export class AppComponent {
  stop$ = new Subject<any>();

  constructor() {
    combineLatest(timer(2000, 1000), timer(3000, 500))
      .pipe(
        isFirst(_ => {
          console.log("first");
        }),
        takeUntil(this.stop$)
      )
      .subscribe({
        next: r => console.log("inside subscription:", r)
      });
  }
}
Working example: Stackblitz
In your case it might look something like
this.subscription = combineLatest(
  this.service.getData1().pipe(
    tap({
      next: data => {
        this.data1 = data;
        console.log('1');
      }
    })
  ),
  this.service.getData2().pipe(
    tap({
      next: data => {
        this.data2 = data;
        console.log('2');
      }
    })
  )
).pipe(
  isFirst(_ => {
    console.log("first");
  })
).subscribe({
  next: r => console.log("inside subscription:", r)
});
The easiest strategy is to have a 3rd Observable that will perform this action.
See below example
const Observable1$ = timer(1000, 2000).pipe(
  map(() => 1),
  tap(console.log)
);

const Observable2$ = timer(1700, 1700).pipe(
  map(() => 2),
  tap(console.log)
);

const Observable3$ = combineLatest([Observable1$, Observable2$]).pipe(
  take(1),
  map(() => "Hello World!"),
  tap(console.log)
);

Observable1$.subscribe();
Observable2$.subscribe();
Observable3$.subscribe();
The console output is as below. Since there are two subscribers to Observable1$ (i.e. Observable1$ and Observable3$), and likewise two subscribers to Observable2$ (i.e. Observable2$ and Observable3$), we see the console logs 1 1 2 2 'Hello World!' ...
Here is the link to the stackblitz
In the above we notice that we get two subscriptions, hence two console logs for each. To solve this, we can use Subjects to generate new Observables and combine those instead:
const track1Subject$ = new Subject();
const track1$ = track1Subject$.asObservable();

const track2Subject$ = new Subject();
const track2$ = track2Subject$.asObservable();

const Observable1$ = timer(1000, 2000).pipe(
  map(() => 1),
  tap(console.log),
  tap(() => track1Subject$.next()),
  take(5)
);

const Observable2$ = timer(1700, 1700).pipe(
  map(() => 2),
  tap(console.log),
  tap(() => track2Subject$.next()),
  take(5)
);

const Observable3$ = combineLatest([track1$, track2$]).pipe(
  take(1),
  map(() => "Hello World!"),
  tap(console.log)
);

Observable1$.subscribe();
Observable2$.subscribe();
Observable3$.subscribe();
See Link to final solution
With some further restrictions, this problem becomes easier. Unfortunately, operators like combineLatest and zip add extra structure to your data. I'll provide a solution with zip below, but it doesn't extend at all (if you want to add more logic downstream of your zip, you're out of luck in many cases).
General solution.
Assuming, however, that getData1 and getData2 are completely orthogonal (how they emit and how they are consumed by your app are not related in any predictable way), a solution to this will require multiple subscriptions or a custom operator tasked with keeping track of emissions.
It's almost certainly the case that you can do something more elegant than this, but this is the most general solution I could think of that meets your very general criteria.
Here, I merge the service calls, tag each call, and pass through emissions until each call has emitted at least once.
merge(
  this.service.getData1().pipe(
    tap(_ => console.log('1')),
    map(payload => ({ fromData: 1, payload }))
  ),
  this.service.getData2().pipe(
    tap(_ => console.log('2')),
    map(payload => ({ fromData: 2, payload }))
  )
).pipe(
  // Custom operator
  s => defer(() => {
    let fromData1 = false;
    let fromData2 = false;
    let done = false;
    return s.pipe(
      tap(({ fromData }) => {
        if (done) return;
        if (fromData === 1) fromData1 = true;
        if (fromData === 2) fromData2 = true;
        if (fromData1 && fromData2) {
          done = true;
          doSomething();
        }
      })
    );
  })
).subscribe(({ fromData, payload }) => {
  if (fromData === 1) this.data1 = payload;
  if (fromData === 2) this.data2 = payload;
});
In the subscription, we have to separate out the two calls again. Since you're setting a global variable, you could throw that logic as a side effect in the tap operator for each call. This should have similar results.
merge(
  this.service.getData1().pipe(
    tap(datum => {
      console.log('1');
      this.data1 = datum;
    }),
    map(payload => ({ fromData: 1, payload }))
  ),
  ...
The zip Solution
This solution is much shorter to write but does come with some drawbacks.
zip(
  this.service.getData1().pipe(
    tap(datum => {
      console.log('1');
      this.data1 = datum;
    })
  ),
  this.service.getData2().pipe(
    tap(datum => {
      console.log('2');
      this.data2 = datum;
    })
  )
).pipe(
  map((payload, index) => {
    if (index === 0) doSomething();
    return payload;
  })
).subscribe();
What is passed into your subscription is the service calls paired off. Here, you absolutely must set a global variable as a side effect of the original service call. The option of doing so in the subscription is lost (unless you want them set as pairs).

Recursion and Observables in RxJS

I am performing pagination inside an Observable stream.
The pagination is implemented with a cursor and a total count using recursion.
I am able to emit every page using the following code, observer.next(searches). I would like to use only observables and no promises, but I cannot express the recursion using RxJS operators.
Any suggestions?
const search = id =>
  new Observable(observer => { recursePages(id, observer, 0) });

const recursePages = (id, observer, processed, searchAfter) => {
  httpService.post(
    "http://service.com/search",
    {
      size: 50,
      ...searchAfter ? { search_after: searchAfter } : null,
      id,
    })
    .toPromise() // httpService.post returns an Observable<AxiosResponse>
    .then(res => {
      const body = res.data;
      const searches = body.data.hits.map(search => ({ data: search.data, cursor: search.id }));
      observer.next(searches);
      const totalProcessed = processed + searches.length;
      if (totalProcessed < body.data.total) {
        return recursePages(id, observer, totalProcessed, searches[searches.length - 1].cursor);
      }
      observer.complete();
    });
}

// General Observer
incomingMessages.pipe(
  flatMap(msg => search(JSON.parse(msg.content.toString()))),
  concatAll(),
).subscribe(console.log);
These methods will recursively gather all the pages and emit them in an array. The pages can then be streamed with from as shown:
// break this out to clean up functions
const performSearch = (id, searchAfter?) => {
  return httpService.post(
    "http://service.com/search",
    {
      size: 50,
      ...searchAfter ? { search_after: searchAfter } : null,
      id,
    });
}

// main recursion
const _search = (id, processed, searchAfter?) => {
  return performSearch(id, searchAfter).pipe( // get page
    switchMap(res => {
      const body = res.data;
      const searches = body.data.hits.map(search => ({ data: search.data, cursor: search.id }));
      const totalProcessed = processed + searches.length;
      if (totalProcessed < body.total) {
        // if not done, recurse and get next page
        return _search(id, totalProcessed, searches[searches.length - 1].cursor).pipe(
          // attach recursed pages
          map(nextPages => [searches].concat(nextPages))
        );
      }
      // if we're done just return the page
      return of([searches]);
    })
  );
}

// entry point
// switch into from to emit pages one by one
const search = id => _search(id, 0).pipe(switchMap(pages => from(pages)));
If what you really need is for the pages to emit one by one before they're all fetched (for instance, so you can show page 1 as soon as it's available rather than waiting on page 2+), then that can be done with some tweaking. Let me know.
EDIT: this method will emit one by one
const _search = (id, processed, searchAfter?) => {
  return performSearch(id, searchAfter).pipe( // get page
    switchMap(res => {
      const body = res.data;
      const searches = body.data.hits.map(search => ({ data: search.data, cursor: search.id }));
      const totalProcessed = processed + searches.length;
      if (totalProcessed < body.total) {
        // if not done, concat current page with recursive call for next page
        return concat(
          of(searches),
          _search(id, totalProcessed, searches[searches.length - 1].cursor)
        );
      }
      // if we're done just return the page
      return of(searches);
    })
  );
}

const search = id => _search(id, 0);
You end up with an observable structure like:
concat(
  post$(page1),
  concat(
    post$(page2),
    concat(
      post$(page3),
      post$(page4)
    )
  )
)
and since nested concat() operations reduce to a flattened structure, this structure would reduce to:
concat(post$(page1), post$(page2), post$(page3), post$(page4))
which is what you're after and the requests run sequentially.
It also seems like expand might do the trick, as per @NickL's comment, something like:
search = (id) => {
  let totalProcessed = 0;
  return performSearch(id).pipe(
    expand(res => {
      const body = res.data;
      const searches = body.data.hits.map(search => ({ data: search.data, cursor: search.id }));
      totalProcessed += searches.length;
      if (totalProcessed < body.data.total) {
        // not done, keep expanding
        return performSearch(id, searches[searches.length - 1].cursor);
      }
      return EMPTY; // break with EMPTY
    })
  );
}
Though I've never used expand before and this is based on some very limited testing of it, I am pretty certain this works.
Both of these methods could use the reduce (or scan) operator to gather results if you ever wanted to:
search(id).pipe(reduce((all, page) => all.concat(page), []))
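The practical difference, as a small sketch: reduce emits a single combined array only once the paging completes, while scan re-emits the growing array after every page, which suits progressive rendering:
// reduce: one emission containing all pages, after the last page has arrived
search(id).pipe(reduce((all, page) => all.concat(page), []));

// scan: an emission after each page, containing everything received so far
search(id).pipe(scan((all, page) => all.concat(page), []));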
This is the solution I used, combining the expand and reduce operators:
searchUsers(cursor?: string) {
  return from(
    this.slackService.app.client.users.list({
      token: this.configService.get('SLACK_BOT_TOKEN'),
      limit: 1,
      ...(cursor && { cursor }),
    }),
  );
}
Usage
.......
this.searchUsers()
  .pipe(
    expand((res) => {
      if (!!res.response_metadata.next_cursor) {
        return this.searchUsers(res.response_metadata.next_cursor);
      }
      return EMPTY;
    }),
    reduce((acc, val) => {
      return [...acc, ...val.members];
    }, []),
  )
  .subscribe((users) => {
    console.log(JSON.stringify(users));
  });
....

Ngrx effect parallel http call

I have an effect that should call two different APIs (API1 and API2).
Here's the effect
$LoadKpiMission = createEffect(() =>
  this.actions$.pipe(
    ofType<any>(EKpiActions.GetMissionsByStation),
    mergeMap(action =>
      this.apiCallsService.getKpi(action.payload, '2016-04-18').pipe(
        map(trips => ({ type: EKpiActions.GetMissionsSuccess, payload: trips })),
        catchError(() => EMPTY)
      )
    )
  )
);
Here's the structure of the service
getKpi(station: number, date: string) {
  let Kpi = `http://192.168.208.25:8998/api/scheduling/circulation_by_date_and_station?orig=${station}&date=${date}`;
  return this.http.get<ISchedules>(API1).pipe(
    map(data => {
      return this.formatDataToKpi1(data);
    })
  );
}
However, I have to retrieve additional data from API2 and merge it with the data returned from API1.
I should do that inside the formatDataToKpi1 function.
I would like to know how to run the requests in parallel, pass the returned responses to formatDataToKpi1, do the processing there, and then return the result to the effect?
You can make use of the RxJS forkJoin function.
As stated in the documentation,
When all observables complete, emit the last emitted value from each.
This way, when the observables from both requests have completed, their results are returned together and you can carry out the subsequent operations.
$LoadKpiMission = createEffect(() =>
  this.actions$.pipe(
    ofType<any>(EKpiActions.GetMissionsByStation),
    mergeMap(action => {
      const getKpi = this.apiCallsService.getKpi(action.payload, '2016-04-18');
      const getKpi2 = this.apiCallsService.getKpi2();
      return forkJoin(getKpi, getKpi2).pipe(
        map(([res1, res2]) => {
          // do the rest here
        })
      );
    })
  )
);
EDIT: Looks like I initially misunderstood your question - I was a bit confused by the variable names.
getKpi(station: number, date: string) {
  let Kpi = `http://192.168.208.25:8998/api/scheduling/circulation_by_date_and_station?orig=${station}&date=${date}`;
  const api1 = this.http.get<ISchedules>(API1);
  const api2 = this.http.get<ISchedules>(API2);

  return forkJoin(api1, api2).pipe(
    map(data => {
      return this.formatDataToKpi1(data);
    })
  );
}
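One detail to keep in mind: with forkJoin the map now receives both responses as an array, so formatDataToKpi1 has to accept that shape. A hypothetical sketch (the actual merging logic depends on what the KPI view needs):
// Hypothetical signature: the forkJoin above emits [api1Response, api2Response].
private formatDataToKpi1([api1Data, api2Data]: [ISchedules, ISchedules]) {
  // combine and format the two payloads here as needed
  return { /* ... */ };
}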

RxJS with redux - emitting action when finished (with mergeMap)

This is my current epic that I've written:
const storiesURL = 'https://hacker-news.firebaseio.com/v0/topstories.json?print=pretty';
const topStoryURL = id => `https://hacker-news.firebaseio.com/v0/item/${id}.json?print=pretty`;
const maxAtOnce = 15;

const fetchStoriesEpic = action$ => action$.pipe(
  ofType(actions.FETCH_STORIES),
  debounceTime(100),
  switchMap(() => ajax.getJSON(storiesURL)), // 1. After that I have an array of ids
  map(ids => ids.slice(0, maxAtOnce).map(topStoryURL)), // 2. After that I have an array of 15 of the latest ids
  mergeMap(urls => from(urls)), // 3. After that I have those URLs
  mergeMap(url => ajax.getJSON(url), null, 3), // 4. After that I have those responses
  scan((acc, val) => [...acc, val], []), // 5. Here I accumulate the responses into an array of stories
  map(stories => actions.fetchStoriesFullfilled(stories)), // 6. Here I dispatch an action, passing the stories one at a time as they arrive
  // ignoreElements(),
);
On FETCH_STORIES I set my loading attribute in the state to true. I would like to set it to false when the requests finish: not after each single one of them, but when they have all finished (in this case 15 requests).
How can I achieve this?
BTW - is this a common pattern? Do you know any resources where I can find RxJS patterns for async actions (that I will actually use)?
I think you can simplify your stream. For example:
const { of, forkJoin, concat } = rxjs;
const { switchMap, map } = rxjs.operators;
const { ajax } = rxjs.ajax;

/**
 * test action creators
 */
const actions = {
  fetchStoriesStartLoading: () => ({ type: 'fetchStoriesStartLoading' }),
  fetchStoriesDoneLoading: () => ({ type: 'fetchStoriesDoneLoading' }),
  fetchStoriesFulfilled: (stories) => ({ type: 'fetchStoriesFulfilled', stories }),
};

/**
 * configuration
 */
const storiesURL = 'https://hacker-news.firebaseio.com/v0/topstories.json?print=pretty';
const topStoryURL = id => `https://hacker-news.firebaseio.com/v0/item/${id}.json?print=pretty`;
const maxAtOnce = 15;

/**
 * this stream loads the configured number of stories
 */
const stories$ = ajax.getJSON(storiesURL).pipe(
  // switch to loading individual requests, forkJoin will wait until all are resolved
  switchMap(ids => forkJoin(
    // use only max number of ids
    ids.slice(0, maxAtOnce)
      // and map those to requests
      .map(id => ajax.getJSON(topStoryURL(id))))),
);

/**
 * this stream wraps the stories$ stream with loading/fulfilled indicators
 */
const storiesWithLoadingIndicator$ = concat(
  // signal start loading
  of(actions.fetchStoriesStartLoading()),
  // load stories
  stories$.pipe(
    // signal fulfilled
    map(actions.fetchStoriesFulfilled)
  ),
  // signal done loading
  of(actions.fetchStoriesDoneLoading()),
);

/**
 * test
 */
storiesWithLoadingIndicator$.subscribe(console.log);

/**
 * as an epic
 */
const fetchStoriesEpic = action$ => action$.pipe(
  ofType(actions.FETCH_STORIES),
  debounceTime(100),
  switchMapTo(storiesWithLoadingIndicator$)
);
<script src="https://unpkg.com/rxjs#6.2.1/bundles/rxjs.umd.min.js"></script>
Not tested but I think the following should work:
const fetchStoriesEpic = action$ => action$.pipe(
  ofType(actions.FETCH_STORIES),
  debounceTime(100),
  switchMap(() => ajax.getJSON(storiesURL)),
  map(ids => ids.slice(0, maxAtOnce).map(topStoryURL)),
  mergeMap(urls => from(urls).pipe(
    mergeMap(url => ajax.getJSON(url), null, 3),
    // instead of scan use reduce to wait for completion of all requests
    reduce((acc, val) => [...acc, val], []),
  )),
  // once they're all done, then return
  map(stories => actions.fetchStoriesFullfilled(stories)),
);
