I'm having trouble coming up with this stream.
What I'm looking for is something like debounceTime but with priority.
So if I have events with the shape { type: 'a', priority: 2 }, these events need to be debounced by a few seconds, but instead of the last event being emitted, the event with the highest priority should be emitted.
input stream:
------(a|1)--(b|3)---(c|2)-----------------------(a|1)-----------------
output stream:
-----------------------------------(b|3)---------------------(a|1)-----
I've tried looking at other operators like window and filtering through the result for the highest event, but it's not ideal because window works on a fixed cadence, whereas I want the timer to start on the first event, like debouncing does.
You have to store and update the item with the highest priority, map every emission to this highest value, and then pass that through debounceTime.
let highest = null;
source$.pipe(
// keep the highest-priority item seen since the last emission
map(v => highest = highest && highest.priority > v.priority ? highest : v),
debounceTime(2000),
// reset once the debounced value has been emitted
tap(() => highest = null)
);
You can create your own operator that does this with the help of defer. defer makes sure that every subscriber gets its own highest variable, as every subscriber will get its own new Observable created by calling the given factory function.
function debounceTimeHighest<T>(dueTime: number, getHighest: (curr: T, high: T) => T): MonoTypeOperatorFunction<T> {
return (source: Observable<T>) => defer(() => {
let highest: T = null;
return source.pipe(
map(item => highest = highest ? getHighest(item, highest) : item),
debounceTime(dueTime),
tap(() => highest = null)
);
});
}
// Usage
source$.pipe(
debounceTimeHighest(2000, (v1, v2) => v1.priority >= v2.priority ? v1 : v2)
)
The code above is TypeScript. If you want plain JavaScript, just remove the types.
https://stackblitz.com/edit/rxjs-hitqxk
I'll offer the following solution, based on using scan to offer up the highest-priority emission so far for consideration by debounceTime(). Note that scan needs to start fresh after every successful debounce, so I use the window() operator to split up the emissions, starting a new observable window after every emission by debounceTime().
Here is the CodeSandbox
And here is some simplified code from the CodeSandbox showing the important bits:
const resetScan$ = new Subject();
source$.pipe(
window(resetScan$),
mergeMap(win$ => win$.pipe(
scan((acc, cur) => acc.priority >= cur.priority ? acc : cur )
)),
debounceTime(debounceDelay),
tap(() => resetScan$.next())
);
You can combine the debounceTime, buffer, and map operators to achieve what you need. I have developed this small example for it.
https://stackblitz.com/edit/typescript-lwzt4k
/*
Collect the clicks that occur; after 1000 ms of silence, emit the array of clicks
*/
clicks$.pipe(
buffer(clicks$.pipe(debounceTime(1000))),
// sort the buffered clicks by priority and emit the highest one
map((clickArray) => {
document.querySelector('#emittedObjects').innerHTML = (`<div>${JSON.stringify(clickArray)}</div>`);
const sortedArray = clickArray.sort((a, b) => {
return a.priority < b.priority ? 1 : -1;
});
const output = sortedArray.length > 0 ? sortedArray[0] : null;
document.querySelector('#mappedOutput').innerHTML = JSON.stringify(output);
return output;
})
)
.subscribe((obj) => {
const str = obj ? JSON.stringify(obj) : 'NULL';
document.querySelector('#throttledOutput').innerHTML = `<div>THROTTLED: ${str}</div>`;
});
I'm building a Vue application for quizzes, and I want to display all the previous results of the person who has taken the quiz. For that I fetch the results from my backend and then pass them to the "view" component with a computed property:
computed: {
allResults() {
return this.$store.state.allResults;
},
I also want to sort out the best results and the most recent results and display them separately. In order to do that I have the following methods:
bestResults() {
let orderedArray = this.allResults;
orderedArray.sort((a, b) =>
a.score < b.score ? 1 : a.score > b.score ? -1 : 0
);
let half = Math.round(orderedArray.length / 2);
let bestResults = orderedArray.slice(0, half);
return bestResults;
},
recentResults() {
let recentResults = this.allResults.slice(0, 5);
return recentResults;
}
This works; however, it sorts the allResults array so that scores go from highest to lowest, which is what I want in bestResults(). This is a problem because recentResults should be based on date, with the most recent result on top.
Well, you first sort the array in bestResults(), and then recentResults() operates on that same, already-sorted array: Array.prototype.sort mutates the array it is called on.
As a solution, you can create a new array with the same elements (here via the spread operator) and sort that, which leaves the original array untouched:
bestResults() {
let orderedArray = [...this.allResults];
orderedArray.sort((a, b) =>
a.score < b.score ? 1 : a.score > b.score ? -1 : 0
);
let half = Math.round(orderedArray.length / 2);
let bestResults = orderedArray.slice(0, half);
return bestResults;
},
recentResults() {
let recentResults = this.allResults.slice(0, 5);
return recentResults;
}
A user is typing values in a form and an event is emitted every time a user edits a particular field, with the value being the field they edited.
For example, a user typing 3 times into the description field, followed by 2 times into the name field, would look like
"description" => "description" => "description" => "name" => "name" => ...
I want to buffer unique values and emit them as an array when the user stops typing for x amount of seconds. A value may reappear again in a different buffer window.
Essentially this is to track which fields were updated when the user stopped typing and communicate with the server to save the edited values.
I have this so far, which emits every 3000 ms; also, it doesn't prevent duplicates when buffering, so instead we "deduplicate" the array afterwards.
this.update$
.bufferTime(3000)
.filter(buffer => buffer.length > 0)
.map(buffer => [...new Set(buffer)])
.subscribe(x => console.log(x));
So it should listen until a value is emitted and then buffer unique values until no more values have been emitted for x seconds, then emit the buffer and repeat. How can one achieve this?
This could be an alternate version:
const { Observable } = Rx;
const log = (prefix) => (...args) => { console.log(prefix, ...args); };
const inputs = document.querySelectorAll('input');
const updates$ = Observable
.fromEvent(inputs, 'input')
.pluck('target', 'id')
;
// wait x ms after last update
const flush$ = updates$.debounceTime(3000);
const buffered$ = updates$
// use `distinct` without `keySelector`, but reset with flush$
.distinct(null, flush$)
// flush the buffer using flush$ as `notifier`
.buffer(flush$)
;
buffered$.subscribe(log('buffered$ =>'));
<script src="https://unpkg.com/@reactivex/rxjs@^5/dist/global/Rx.min.js"></script>
<div><input type="text" placeholder="foo" id="input.foo"></div>
<div><input type="text" placeholder="bar" id="input.bar"></div>
<div><input type="text" placeholder="baz" id="input.baz"></div>
Perhaps my question wasn't clear enough; anyhow, I've managed to solve it like so (in case it helps someone else):
To have the buffer emit only when the stream has been silent for 3 seconds, I start a new timer every time the user types something (an event is emitted on update$) and use switchMap to cancel the previous one.
this.update$
.buffer(this.update$.switchMap(x => Observable.timer(3000)))
.filter(buffer => buffer.length > 0)
.map(buffer => [...new Set(buffer)])
.subscribe(console.log);
Then, to make the buffer itself unique rather than having to deduplicate it manually, I had to create a custom operator, uniqueBuffer.
this.update$
.uniqueBuffer(this.update$.switchMap(x => Observable.timer(3000)))
.filter(buffer => buffer.length > 0)
.subscribe(console.log);
function uniqueBuffer(emitObservable) {
const source = this;
return Observable.create(subscriber => {
// collect values from the source into a Set (keeps them unique)
const buffer = new Set();
const sourceSub = source.subscribe(value => {
buffer.add(value);
});
// on every notification, emit the buffer as an array and reset it
const emitSub = emitObservable.subscribe(() => {
subscriber.next([...buffer]);
buffer.clear();
});
// teardown: unsubscribe from both streams
return () => {
sourceSub.unsubscribe();
emitSub.unsubscribe();
};
});
}
Observable.prototype.uniqueBuffer = uniqueBuffer;
I'm trying to build the JavaScript equivalent of Java's IntStream.range(0, 5).forEach(System.err::println); and arrived at
const IntStream = (function () {
function range(start, end, numbers = []) {
if (start === end) {
return numbers
}
return range(start + 1, end, numbers.concat(start))
}
return {
range
}
})()
IntStream.range(0, 5).forEach(number => console.log(number))
All the stream magic of Java is built into a normal JavaScript array. Why can't an ArrayList in Java do all the same things as a Stream, or is there a purpose I haven't figured out yet?
Array higher-order functions eagerly process the whole array at each step.
const isOdd = v => v % 2 == 1;
const multiply = by => v => v * by;
const arrRange = IntStream.range(10, 20);
const arrOdd = arrRange.filter(isOdd);
const arrOddM3 = arrOdd.map(multiply(3));
Here all the bindings are distinct arrays created by each of the steps. Even when you chain them, the intermediate arrays are always made, and the whole array at each step needs to be finished before the next can begin.
const arrOddM3 = IntStream.range(10, 20).filter(isOdd).map(multiply(3));
arrOddM3; // ==> [33, 39, 45, 51, 57]
Streams are different since they only compute values when they are accessed. A stream version would look very similar.
const streamOddM3 = Stream.range(10, Infinity).filter(isOdd).map(multiply(3));
streamOddM3; // ==> Stream
Notice I have changed the end to go to infinity. I can do that because at most it calculates the very first value, and some implementations don't do any calculations at all until you ask for the values. To force the calculations you can take some values and get them returned as an array:
streamOddM3.take(3); // ==> [33, 39, 45]
Here is a Stream implementation loosely based on the one from the SICP videos, which works similarly to Java's streams.
class EmptyStream {
map() {
return this;
}
filter() {
return this;
}
take() {
return [];
}
}
class Stream extends EmptyStream {
constructor(value, next) {
super();
this._next = next;
this.value = value;
}
/**
* This prevents the value from being computed more than once.
* @returns {EmptyStream|Stream}
*/
next() {
if( ! (this._next instanceof EmptyStream) ) {
this._next = this._next();
}
return this._next;
}
map(fn) {
return new Stream(fn(this.value), () => this.next().map(fn));
}
filter(fn) {
return fn(this.value) ?
new Stream(this.value, () => this.next().filter(fn)) :
this.next().filter(fn);
}
take(n) {
return n == 0 ? [] : [this.value, ...this.next().take(n && n - 1)];
}
static range(from, to, step = 1) {
if (to !== undefined && ( step > 0 && from > to || step < 0 && from < to )) {
return Stream.emptyStream;
}
return new Stream(from, () => Stream.range(from + step, to, step));
}
}
Stream.emptyStream = new EmptyStream();
There are alternatives to Stream that might work in their place.
In JavaScript you have generators (aka coroutines) and you can make a map and filter generator function that takes a generator source and becomes a new generator with that transformation. Since it is already in the language it might be a better match than Streams but I haven't studied it enough to make a generator example of the above.
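For what it's worth, here is a rough sketch of what that generator version could look like; this is my own illustration, not part of the original answer, and the helper names (range, filter, map, take) are just chosen to mirror the Stream example above:

```javascript
// Lazy map/filter over any iterable, using plain ES2015 generator functions.
// Nothing is computed until take() pulls values through the chain.
function* range(from, to = Infinity, step = 1) {
  for (let i = from; i < to; i += step) yield i;
}

function* filter(fn, iter) {
  for (const v of iter) if (fn(v)) yield v;
}

function* map(fn, iter) {
  for (const v of iter) yield fn(v);
}

// take() is the only eager step: it pulls exactly n values into an array.
function take(n, iter) {
  const out = [];
  for (const v of iter) {
    out.push(v);
    if (out.length === n) break;
  }
  return out;
}

const isOdd = v => v % 2 == 1;
const multiply = by => v => v * by;

// Safe over an infinite range, because only 3 values are ever computed.
const streamOddM3 = map(multiply(3), filter(isOdd, range(10, Infinity)));
console.log(take(3, streamOddM3)); // [33, 39, 45]
```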
In Clojure you have transducers that allows you to compose steps so that an eventual list making only happens for the elements that makes it to the final result. They are easily implemented in JavaScript.
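As a hedged sketch of that claim (my own illustration; the names mapT, filterT, and push are made up for this example), a minimal transducer setup in JavaScript can look like this:

```javascript
// A transducer wraps a reducer in another reducer, so map/filter steps
// compose into a single pass: only elements that survive every step
// ever touch the output array.
const mapT = fn => next => (acc, v) => next(acc, fn(v));
const filterT = fn => next => (acc, v) => (fn(v) ? next(acc, v) : acc);
const compose = (...fns) => x => fns.reduceRight((a, f) => f(a), x);

const isOdd = v => v % 2 == 1;
const multiply = by => v => v * by;

// Reads left to right: filter first, then map.
const xform = compose(filterT(isOdd), mapT(multiply(3)));
// The final reducer that actually builds the list.
const push = (acc, v) => (acc.push(v), acc);

const result = [10, 11, 12, 13, 14].reduce(xform(push), []);
console.log(result); // [33, 39]
```

No intermediate arrays are created: 10, 12, and 14 are rejected by the filter step before the map step ever sees them.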
There's a big difference between Streams and JavaScript arrays:
[1,2,3,4]
.filter(el => {
console.log(el);
return el%2 === 0;
})
.forEach( el => console.log(el));
The result in JavaScript will be:
1 2 3 4 2 4
For a Stream it will be:
1 2 2 3 4 4
So as you can see, JavaScript finishes each operation over the whole collection, producing an intermediate collection, before the next operation iterates it. In a Stream, each element traverses the whole pipeline before the next one is taken: if a collection is passed to a Stream, one element after another is passed through the stream.
A possible Stream implementation would be:
class Stream {
constructor(){
this.queue = [];
}
//the modifying methods
forEach(func){
this.queue.push(["forEach",func]);
return this;
}
filter(func){
this.queue.push(["filter",func]);
return this;
}
map(func){
this.queue.push(["map",func]);
return this;
}
subStream(v){
this.forEach(d => v.get(d));
return this;
}
//data methods
get(value, cb = () => {}){
for( let [type,func] of this.queue ){
switch(type){
case "forEach":
func(value);
break;
case "map":
value = func(value);
break;
case "filter":
if(! func(value)) return;
}
}
cb(value);
}
range(start,end){
const result = [];
Array.from({length:end-start})
.forEach((_,i)=> this.get(i+start, r => result.push(r)));
return result;
}
}
Usecase:
const nums = new Stream();
const even = new Stream();
even.filter(n => !(n%2) ).forEach(n => console.log(n));
const odd = new Stream();
odd.filter(n => (n%2) ).forEach(n => console.log(n));
nums
.subStream(even)
.subStream(odd)
.range(0,100);
No, they are not the same, because of how they process the data.
With JavaScript's array methods, each operation over the collection must finish before the next operation in the pipeline begins. (LINQ in C# is actually closer to Java streams here, since its queries use deferred execution.)
In streams it's different. For example:
Arrays.asList(1,2,3).stream()
.filter((Integer x)-> x>1)
.map((Integer x)->x*10)
.forEach(System.out::println);
source collection: 1, 2 ,3
filter(1) -> You are not OK. Element 1 will not pass to the next operation
in the pipeline. Now deal with element 2.
filter(2) -> You are OK. element 2 pass to the next operation.
map(2) -> create new element 20 and put it in the new stream.
forEach(20) -> print 20. End dealing with element 2 in the source collection.
Now deal with element 3.
filter(3) -> You are OK. element 3 pass to the next operation
map(3) -> create new element 30 and put it in the new stream.
forEach(30) -> print 30. No more elements in the source collection.
Finish executing the stream.
output:
20
30
One outcome of this approach is that some operations in the pipeline may not process every element, because some elements are filtered out along the way.
This explanation was taken from: Streams In Depth by Stav Alfi
Hello, I'm trying to figure out if there is an equivalent to the RxJS operator zip in xstream, or at least a way to get the same behaviour. In case anyone needs clarification on the difference, the marble diagrams below will show it.
zip in rxjs
|---1---2---3-----------5->
|-a------b------c---d----->
"zip"
|-1a----2b------3c-----5d->
whereas 'combineLatest' aka 'combine' in xstream does
|---1---2----------4---5->
|----a---b---c---d------->
"combine"
|-1a----2a-2b-2c-2d-4d-5d>
Any help is appreciated as I'm very new to programming with streams. Thank you in advance!
I also needed a zip operator for xstream, so I created my own from existing operators. It takes an arbitrary number of streams for zipping.
function zip(...streams) {
// Wrap the events on each stream with a label
// so that we can separate them into buckets later.
const streamsLabeled = streams
.map((stream$, idx) => stream$.map(event => ({label: idx + 1, event: event})));
return (event$) => {
// Label the source stream's events too (label 0),
// so they land in their own bucket.
const eventLabeled$ = event$.map(event => ({label: 0, event: event}));
const labeledStreams = [eventLabeled$, ...streamsLabeled];
// Create the buckets used to store stream events
const buckets = labeledStreams.map((stream, idx) => idx)
.reduce((buckets, label) => ({...buckets, [label]: []}), {});
// Initial value for the fold operation
const accumulator = {buckets, tuple: []};
// Merge all the streams together and accumulate them
return xs.merge(...labeledStreams).fold((acc, event) => {
// Buffer the events into seperate buckets
acc.buckets[event.label].push(event);
// Does the first value of all the buckets have something in it?
// If so, then there is a complete tuple.
const tupleComplete = Object.keys(acc.buckets)
.map(key => acc.buckets[key][0])
.reduce((allFilled, value) => allFilled && value !== undefined, true);
// Save completed tuple and remove it from the buckets
if (tupleComplete) {
acc.tuple = [...Object.keys(acc.buckets).map(key => acc.buckets[key][0].event)];
Object.keys(acc.buckets).map(key => acc.buckets[key].shift());
} else {
// Clear tuple since all columns weren't filled
acc.tuple = [];
}
return {...acc};
}, accumulator)
// Only emit when we have a complete tuple
.filter(buffer => buffer.tuple.length !== 0)
// Just return the complete tuple
.map(buffer => buffer.tuple);
};
}
This can be used with compose.
foo$.compose(zip(bar$)).map(([foo, bar]) => doSomething(foo, bar))
I have a function that returns something like Observable<[number, Array<DataItem>]>. Is it possible to write a function that returns Observable<[number, Array<PageWithDataItems>]> using some Observable operators, given a function chunk (which chunks the DataItem array according to page size) and a simple constructor that creates a PageWithDataItems from a chunked DataItem array?
What I have is some code that subscribes to Observable<[number, Array<DataItem>]> and then creates a new Observable, but I am hoping it is possible to do the same with map, mapTo, switchMap, or similar. I am a bit lost in all the Observable functions, so any help is appreciated.
I am not entirely sure what you are going for here, but I gave it a shot:
// stream would be your data... just random chunks of numbers as an example here.
const stream = Rx.Observable.range(0, 480).bufferWithCount(100).select(d => [Math.random() * 100, d]);
class DataChunk<T> {
constructor(public data: Array<T>) { }
}
const pageSize = 10;
stream
// I do not understand what the 'number' in your [number, Array<DataItem>]
// represents. But it is the 'someNumber' item here..
.map(d => ({someNumber: <number>d[0], data: <number[]>d[1]}))
.map(d => ({
someNumber: d.someNumber,
pages: Ix.Enumerable
.fromArray(d.data)
.select((item, idx) => ({ pageNr : idx % pageSize, item: item }))
.groupBy(i => i.pageNr)
.select(pageItems => new DataChunk(pageItems.select(i => i.item).toArray()))
.toArray()
}))
.subscribe(dataInfo => {
// here each dataInfo sent down the stream will have been split up in to chunks
// of pageSize
log('Data received: ');
log(' someNumber: ' + dataInfo.someNumber);
log(' page count: ' + dataInfo.pages.length);
});
Working example on jsfiddle.
I used IxJS to do the chunking. It works similarly to RxJS but operates on collections (e.g. arrays) rather than streams of events like RxJS does. I hope this was close to what you wanted; your question is not entirely clear.
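If pulling in IxJS just for the chunking feels heavy, the paging can also be done with a plain reduce inside map. This is a sketch of my own, not from the answer above; chunk is a hypothetical helper:

```javascript
// Split an array into pages of `size` items using a plain reduce.
const chunk = (arr, size) =>
  arr.reduce((pages, item, i) =>
    i % size === 0
      ? [...pages, [item]]                                        // start a new page
      : [...pages.slice(0, -1), [...pages[pages.length - 1], item]], // append to last page
  []);

console.log(chunk([1, 2, 3, 4, 5], 2)); // [[1, 2], [3, 4], [5]]

// Inside the pipeline this would replace the IxJS grouping, e.g.:
// stream.map(([someNumber, data]) => [someNumber, chunk(data, pageSize)])
```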