RxJS zip operator equivalent in xstream?

Hello, I'm trying to figure out if there is an equivalent to the RxJS zip operator in xstream, or at least a way to get the same behaviour. In case anyone needs clarification on the difference, the marble diagrams below show it.
zip in rxjs
|---1---2---3-----------5->
|-a------b------c---d----->
"zip"
|-1a----2b------3c-----5d->
whereas 'combineLatest' aka 'combine' in xstream does
|---1---2----------4---5->
|----a---b---c---d------->
"combine"
|-1a----2a-2b-2c-2d-4d-5d>
Any help is appreciated as I'm very new to programming with streams. Thank you in advance!

I also needed a zip operator for xstream. So I created my own from existing operators. It takes an arbitrary number of streams for zipping.
import xs from 'xstream';

function zip(...streams) {
  // Wrap the events on each stream with a label
  // so that we can separate them into buckets later.
  const streamsLabeled = streams
    .map((stream$, idx) => stream$.map(event => ({label: idx + 1, event: event})));

  return (event$) => {
    // Label the events of the composed-upon stream as well (label 0).
    const eventLabeled$ = event$.map(event => ({label: 0, event: event}));
    const labeledStreams = [eventLabeled$, ...streamsLabeled];

    // Create the buckets used to store stream events
    const buckets = labeledStreams.map((stream, idx) => idx)
      .reduce((buckets, label) => ({...buckets, [label]: []}), {});

    // Initial value for the fold operation
    const accumulator = {buckets, tuple: []};

    // Merge all the streams together and accumulate them
    return xs.merge(...labeledStreams).fold((acc, event) => {
      // Buffer the events into separate buckets
      acc.buckets[event.label].push(event);

      // Does the first slot of every bucket have something in it?
      // If so, then there is a complete tuple.
      const tupleComplete = Object.keys(acc.buckets)
        .map(key => acc.buckets[key][0])
        .every(value => value !== undefined);

      // Save the completed tuple and remove its events from the buckets
      if (tupleComplete) {
        acc.tuple = Object.keys(acc.buckets).map(key => acc.buckets[key][0].event);
        Object.keys(acc.buckets).forEach(key => acc.buckets[key].shift());
      } else {
        // Clear the tuple since not all buckets were filled
        acc.tuple = [];
      }
      return {...acc};
    }, accumulator)
      // Only emit when we have a complete tuple
      .filter(buffer => buffer.tuple.length !== 0)
      // Just return the complete tuple
      .map(buffer => buffer.tuple);
  };
}
This can be used with compose.
foo$.compose(zip(bar$)).map(([foo, bar]) => doSomething(foo, bar))
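For a quick sanity check, here is a minimal sketch of how the operator behaves (the timings are purely illustrative; xs.periodic is xstream's built-in interval source):

// A sketch exercising the zip above; pairs arrive at the slower stream's pace.
import xs from 'xstream';

const num$ = xs.periodic(1000);                                   // 0, 1, 2, ...
const letter$ = xs.periodic(300).map(i => 'abcdefghij'[i % 10]);  // a, b, c, ...

num$.compose(zip(letter$)).addListener({
  next: ([num, letter]) => console.log(num, letter), // => 0 'a', 1 'b', 2 'c', ...
  error: err => console.error(err),
  complete: () => {},
});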

Related

GroupBy Array elements from CSV file and help reduce code

Each CSV file that is imported has the same data structure.
I need to sum the ['Net Charge Amount'] by each ['Service Type'].
I am currently doing this by assigning each unique ['Service Type'] to its own array. My current script is probably overkill, but it is very easy to follow; however, I am looking for a more compact way of doing this, otherwise the script could get very long.
const fs = require('fs')
const { parse } = require('csv-parse')
// Arrays for each service type
const GroundShipments = []
const HomeDeliveryShipments = []
const SmartPostShipments = []
const Shipments = []
The [Shipments] array will hold all data, and I would assume this is the array we want to work with.
// functions for each service type
function isGround(shipment) {
  return shipment['Service Type'] === 'Ground'
}
function isHomeDelivery(data) {
  return data['Service Type'] === 'Home Delivery'
}
function isSmartpost(shipment) {
  return shipment['Service Type'] === 'SmartPost'
}
function isShipment(shipment) {
  return shipment['Service Type'] === 'Ground' ||
    shipment['Service Type'] === 'Home Delivery' ||
    shipment['Service Type'] === 'SmartPost'
}
// Import csv file / perform business rules by service type
// output sum total by each service type
fs.createReadStream('repco.csv')
  .pipe(parse({
    columns: true
  }))
  .on('data', (data) => {
    // push data to the proper service type array
    // Ground
    if (isGround(data)) {
      GroundShipments.push(data)
    }
    // Home Delivery
    if (isHomeDelivery(data)) {
      HomeDeliveryShipments.push(data)
    }
    // SmartPost
    if (isSmartpost(data)) {
      SmartPostShipments.push(data)
    }
    // All shipment types: Ground, Home Delivery, and SmartPost
    if (isShipment(data)) {
      Shipments.push(data)
    }
  })
  .on('error', (err) => {
    console.log(err)
  })
  .on('end', () => {
    // sum data by service type
    // Ground only
    const sumGround = GroundShipments.reduce((acc, data) =>
      acc + parseFloat(data['Net Charge Amount']), 0)
    // Home Delivery only
    const sumHomeDelivery = HomeDeliveryShipments.reduce((acc, data) =>
      acc + parseFloat(data['Net Charge Amount']), 0)
    // SmartPost only
    const sumSmartPost = SmartPostShipments.reduce((acc, data) =>
      acc + parseFloat(data['Net Charge Amount']), 0)
    // All services
    const sumAllShipments = Shipments.reduce((acc, data) =>
      acc + parseFloat(data['Net Charge Amount']), 0)
    // output sum by service type to console
    console.log(`${GroundShipments.length} Ground shipments: ${sumGround}`)
    console.log(`${HomeDeliveryShipments.length} Home Delivery shipments: ${sumHomeDelivery}`)
    console.log(`${SmartPostShipments.length} SmartPost shipments: ${sumSmartPost}`)
    console.log(`${Shipments.length} All shipments: ${sumAllShipments}`)
  })
Here is the console output (screenshot): https://i.stack.imgur.com/FltTU.png
Instead of separating each ['Service Type'] into its own array and function, I would like a single [Shipments] array from which to output each unique ['Service Type'] and the sum total of its ['Net Charge Amount'].
The two keys to simplifying this are:
separating the CSV parsing from the data processing
using a groupBy function
First, you should parse the CSV into a simple JS array. Then you can use regular JS utility functions to operate on the data, such as a groupBy function, which can be found in the lodash and ramda libraries. It will probably be added to vanilla JS as an array grouping method eventually, but that's a while from now.
I was looking for a sample problem to play with my own JS evaluation framework, so I answered your question there:
You can explore the underlying val yourself: https://www.val.town/stevekrouse.exampleGroupByShppingCSV
There are a couple of things about my answer that wouldn't make sense in a normal NodeJS codebase, but that I had to do to make it work in val.town (async/await, using a custom groupBy method instead of importing one). If you'd like help getting it to work in your application, just let me know.
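In outline, the groupBy approach could look something like the sketch below (lodash is an assumption; ramda's groupBy or a hand-rolled one works the same way, and summarize is a hypothetical helper; the column names come from the question):

// A sketch of the groupBy approach, assuming lodash.
const groupBy = require('lodash/groupBy');

function summarize(shipments) {
  // Bucket the parsed CSV rows by their service type.
  const byType = groupBy(shipments, s => s['Service Type']);
  // One summary row per service type: count and summed charge.
  return Object.entries(byType).map(([type, rows]) => ({
    type,
    count: rows.length,
    sum: rows.reduce((acc, r) => acc + parseFloat(r['Net Charge Amount']), 0),
  }));
}

// After parsing the CSV rows into an array `shipments`:
// summarize(shipments).forEach(({type, count, sum}) =>
//   console.log(`${count} ${type} shipments: ${sum}`));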
A solution would be to use a Map instance to keep track of the stats of different service types.
For each shipment find the associated stats (based on service type), or create a new stats object { count: 0, sum: 0 }. Then increment the count, and add the amount to the sum.
When all data has been iterated (on end), you can loop through serviceTypeStats and log the values. You can also use this loop to calculate the total by adding up the count and sum of each service type group.
const serviceTypeStats = new Map();

// ...
.on('data', (shipment) => {
  const serviceType = shipment['Service Type'];
  const amount = parseFloat(shipment['Net Charge Amount']);

  if (!serviceTypeStats.has(serviceType)) {
    serviceTypeStats.set(serviceType, { count: 0, sum: 0 });
  }
  const stats = serviceTypeStats.get(serviceType);
  stats.count += 1;
  stats.sum += amount;
})
// ...
.on('end', () => {
  const total = { count: 0, sum: 0 };
  for (const [serviceType, stats] of serviceTypeStats) {
    total.count += stats.count;
    total.sum += stats.sum;
    console.log(`${stats.count} ${serviceType}: ${stats.sum}`);
  }
  console.log(`${total.count} All shipments: ${total.sum}`);
})
If you want to loop keys in a specific order you can define the order in an array, or sort the keys of the Map instance.
// pre-defined order
const serviceTypeOrder = ["Ground", "Home Delivery", "SmartPost"];

// or alphabetic order (case insensitive)
const serviceTypeOrder = Array.from(serviceTypeStats.keys());
serviceTypeOrder.sort((a, b) => a.localeCompare(b, undefined, { sensitivity: "base" }));

// ...
for (const serviceType of serviceTypeOrder) {
  const stats = serviceTypeStats.get(serviceType);
  // ...
}

RxJS Debounce with priority

I'm having trouble coming up with this stream.
What I'm looking for is something like debounceTime but with priority.
Say I have events with the shape { type: 'a', priority: 2 }. These events need to be debounced by a few seconds, but instead of the last event being emitted, the event with the highest priority should be emitted.
input stream:
------(a|1)--(b|3)---(c|2)-----------------------(a|1)-----------------
output stream:
-----------------------------------(b|3)---------------------(a|1)-----
I've tried looking at other operators like window and filtering through the result for the last event, but it's not ideal because window works on a fixed cadence, whereas I want the timer to start on the first event, like debouncing does.
You can store and update the item with the highest priority, map every emission to this current highest value, and then pass it to debounceTime.
let highest = null;

source$.pipe(
  map(v => highest = highest && highest.priority > v.priority ? highest : v),
  debounceTime(2000),
  tap(() => highest = null)
);
You can create your own operator that does this with the help of defer. defer makes sure that every subscriber gets its own highest variable, as every subscriber will get its own new Observable created by calling the given factory function.
function debounceTimeHighest<T>(dueTime: number, getHighest: (curr: T, high: T) => T): MonoTypeOperatorFunction<T> {
  return (source: Observable<T>) => defer(() => {
    let highest: T = null;
    return source.pipe(
      map(item => highest = highest ? getHighest(item, highest) : item),
      debounceTime(dueTime),
      tap(() => highest = null)
    );
  });
}

// Usage
source$.pipe(
  debounceTimeHighest(2000, (v1, v2) => v1.priority >= v2.priority ? v1 : v2)
)
The code above is TypeScript; if you want plain JavaScript, just remove the type annotations.
https://stackblitz.com/edit/rxjs-hitqxk
I'll offer the following solution, based around using scan to offer up the highest-priority emission seen so far for consideration by debounceTime(). Note that scan needs to start fresh after every successful debounce, so I use the window() operator to split up the emissions, starting a new observable window after every emission by debounceTime().
Here is the CodeSandbox
And here is some simplified code from the CodeSandbox showing the important bits:
const resetScan$ = new Subject();

source$.pipe(
  window(resetScan$),
  mergeMap(win$ => win$.pipe(
    scan((acc, cur) => acc.priority >= cur.priority ? acc : cur)
  )),
  debounceTime(debounceDelay),
  tap(() => resetScan$.next())
);
You can combine the debounceTime, buffer, and map operators to achieve what you need. I have put together this small example:
https://stackblitz.com/edit/typescript-lwzt4k
/*
  Collect events as they occur; once the stream has been quiet for
  1000 ms, emit the buffered array of events.
*/
clicks$.pipe(
  buffer(clicks$.pipe(debounceTime(1000))),
  // sort the buffered events by priority (highest first) and emit the winner
  map((clickArray) => {
    document.querySelector('#emittedObjects').innerHTML = (`<div>${JSON.stringify(clickArray)}</div>`);
    const sortedArray = clickArray.sort((a, b) => {
      return a.priority < b.priority ? 1 : -1;
    });
    const output = sortedArray.length > 0 ? sortedArray[0] : null;
    document.querySelector('#mappedOutput').innerHTML = JSON.stringify(output);
    return output;
  })
)
.subscribe((obj) => {
  const str = obj ? JSON.stringify(obj) : 'NULL';
  document.querySelector('#throttledOutput').innerHTML = `<div>THROTTLED: ${str}</div>`;
});
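Stripped of the DOM wiring, the essential operator chain of this answer might look like the following (a sketch; source$ and the 2000 ms window are assumptions matching the question):

const { buffer, debounceTime, map } = require('rxjs/operators');

// source$ is assumed to emit { type, priority } objects as in the question.
const highestPerBurst$ = source$.pipe(
  buffer(source$.pipe(debounceTime(2000))),
  // buffer only fires after at least one emission, so reduce without a seed is safe
  map(events => events.reduce((hi, e) => (e.priority > hi.priority ? e : hi)))
);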

RxJS - Buffer unique values and emit when no other values have been emitted for x seconds?

A user is typing values in a form and an event is emitted every time a user edits a particular field, with the value being the field they edited.
For example a user typing 3 times into the description field, followed by two times in the name field, would look like
"description" => "description" => "description" => "name" => "name" => ...
I want to buffer unique values and emit them as an array when the user stops typing for x seconds. A value may appear again in a different buffer window.
Essentially this is to track which fields were updated when the user stopped typing and communicate with the server to save the edited values.
I have this so far, which emits every 3000 ms; it also doesn't prevent duplicates while buffering, but instead deduplicates the array afterwards.
this.update$
  .bufferTime(3000)
  .filter(buffer => buffer.length > 0)
  .map(buffer => [...new Set(buffer)])
  .subscribe(x => console.log(x));
So it should listen until a value is emitted and then buffer unique values until no more values have been emitted for x seconds, then emit the buffer and repeat. How can one achieve this?
This could be an alternate version:
const { Observable } = Rx;
const log = (prefix) => (...args) => { console.log(prefix, ...args); };

const inputs = document.querySelectorAll('input');
const updates$ = Observable
  .fromEvent(inputs, 'input')
  .pluck('target', 'id');

// wait x ms after last update
const flush$ = updates$.debounceTime(3000);

const buffered$ = updates$
  // use `distinct` without `keySelector`, but reset with flush$
  .distinct(null, flush$)
  // flush the buffer using flush$ as `notifier`
  .buffer(flush$);

buffered$.subscribe(log('buffered$ =>'));
<script src="https://unpkg.com/@reactivex/rxjs@^5/dist/global/Rx.min.js"></script>
<div><input type="text" placeholder="foo" id="input.foo"></div>
<div><input type="text" placeholder="bar" id="input.bar"></div>
<div><input type="text" placeholder="baz" id="input.baz"></div>
Perhaps my question wasn't clear enough; anyhow, I've managed to solve it like so, in case it helps someone else:
To have the buffer emit only when the stream was silent for 3 seconds, I start a new timer every time a user types something (event emitted on update$), and use switchMap to cancel the previous one.
this.update$
  .buffer(this.update$.switchMap(x => Observable.timer(3000)))
  .filter(buffer => buffer.length > 0)
  .map(buffer => [...new Set(buffer)])
  .subscribe(console.log);
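As an aside, the switchMap-over-timer construction is essentially hand-rolled debouncing, so the same trigger could arguably be written with debounceTime (a sketch, equivalent under the same RxJS 5 setup):

this.update$
  .buffer(this.update$.debounceTime(3000))
  .filter(buffer => buffer.length > 0)
  .map(buffer => [...new Set(buffer)])
  .subscribe(console.log);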
Then to get the buffer to be unique itself rather than having to manually deduplicate it, I had to create a custom operator uniqueBuffer.
this.update$
  .uniqueBuffer(this.update$.switchMap(x => Observable.timer(3000)))
  .filter(buffer => buffer.length > 0)
  .subscribe(console.log);
function uniqueBuffer(emitObservable) {
  const source = this;
  return Observable.create(subscriber => {
    const uniqueBuffer = new Set();
    const sourceSubscription = source.subscribe(value => {
      uniqueBuffer.add(value);
    });
    const emitSubscription = emitObservable.subscribe(() => {
      subscriber.next([...uniqueBuffer]);
      uniqueBuffer.clear();
    });
    // teardown so neither inner subscription leaks
    return () => {
      sourceSubscription.unsubscribe();
      emitSubscription.unsubscribe();
    };
  });
}

Observable.prototype.uniqueBuffer = uniqueBuffer;
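For anyone on RxJS 6+, the same idea could be written as a pipeable operator instead of patching the prototype (a sketch, not from the original answer; uniqueBufferWhen is a hypothetical name):

const { Observable } = require('rxjs');

function uniqueBufferWhen(notifier$) {
  return source$ =>
    new Observable(subscriber => {
      const buffer = new Set();
      const sourceSub = source$.subscribe(value => buffer.add(value));
      const notifierSub = notifier$.subscribe(() => {
        subscriber.next([...buffer]);
        buffer.clear();
      });
      // unsubscribe from both inputs when the consumer unsubscribes
      return () => {
        sourceSub.unsubscribe();
        notifierSub.unsubscribe();
      };
    });
}

// Usage: update$.pipe(uniqueBufferWhen(update$.pipe(debounceTime(3000))))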

Are Java's Streams like JavaScript's Arrays? [closed]

Closed 5 years ago. This question needs details or clarity and is not currently accepting answers.
I'm trying to build the JavaScript equivalent of Java's IntStream.range(0, 5).forEach(System.err::println); and came up with
const IntStream = (function () {
  function range(start, end, numbers = []) {
    if (start === end) {
      return numbers
    }
    return range(start + 1, end, numbers.concat(start))
  }
  return {
    range
  }
})()

IntStream.range(0, 5).forEach(number => console.log(number))
All the stream magic of Java is built into a normal JavaScript array. Why can't an ArrayList in Java do all the same things as a Stream, or is there a purpose I haven't figured out yet?
Array higher-order functions eagerly process the whole array at each step.
const isOdd = v => v % 2 == 1;
const multiply = by => v => v * by;
const arrRange = IntStream.range(10, 20);
const arrOdd = arrRange.filter(isOdd);
const arrOddM3 = arrOdd.map(multiply(3));
Here all the bindings are distinct arrays created by each of the steps. Even when you chain them, the intermediate arrays are still created, and the whole array at each step needs to be finished before the next can begin.
const arrOddM3 = IntStream.range(10, 20).filter(isOdd).map(multiply(3));
arrOddM3; // ==> [33, 39, 45, 51, 57]
Streams are different since they only compute values when they are accessed. A stream version would look very similar.
const streamOddM3 = Stream.range(10, Infinity).filter(isOdd).map(multiply(3));
streamOddM3; // ==> Stream
Notice I have changed the end to go to infinity. I can do that because at most it calculates the very first value, and some implementations don't do any calculations at all until you ask for the values. To force the calculations you can take some values and get them returned as an array:
streamOddM3.take(3); // ==> [33, 39, 45]
Here is a Stream implementation loosely based on the one from the SICP videos, which works similarly to Java's streams.
class EmptyStream {
  map() {
    return this;
  }
  filter() {
    return this;
  }
  take() {
    return [];
  }
}

class Stream extends EmptyStream {
  constructor(value, next) {
    super();
    this._next = next;
    this.value = value;
  }

  /**
   * This prevents the value from being computed more than once.
   * @returns {EmptyStream|Stream}
   */
  next() {
    if (!(this._next instanceof EmptyStream)) {
      this._next = this._next();
    }
    return this._next;
  }

  map(fn) {
    return new Stream(fn(this.value), () => this.next().map(fn));
  }

  filter(fn) {
    return fn(this.value) ?
      new Stream(this.value, () => this.next().filter(fn)) :
      this.next().filter(fn);
  }

  take(n) {
    return n == 0 ? [] : [this.value, ...this.next().take(n && n - 1)];
  }

  static range(from, to, step = 1) {
    if (to !== undefined && (step > 0 && from > to || step < 0 && from < to)) {
      return Stream.emptyStream;
    }
    return new Stream(from, () => Stream.range(from + step, to, step));
  }
}

Stream.emptyStream = new EmptyStream();
There are alternatives to Stream that might work in their place.
In JavaScript you have generators (aka coroutines), and you can write map and filter generator functions that take a generator source and return a new generator with that transformation applied. Since they are already in the language they might be a better match than Streams, but I haven't studied them enough to make a generator example of the above.
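For illustration, a minimal sketch of what such generator-based helpers could look like (hypothetical code, not from the answer above; it reuses isOdd and multiply from the array example):

// Hypothetical lazy helpers built on generators.
function* range(from, to = Infinity, step = 1) {
  for (let i = from; i < to; i += step) yield i;
}

function* filterGen(source, fn) {
  for (const v of source) if (fn(v)) yield v;
}

function* mapGen(source, fn) {
  for (const v of source) yield fn(v);
}

// Pull at most n values out of a (possibly infinite) generator.
function take(source, n) {
  const out = [];
  for (const v of source) {
    if (out.length === n) break;
    out.push(v);
  }
  return out;
}

take(mapGen(filterGen(range(10), isOdd), multiply(3)), 3); // ==> [33, 39, 45]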
In Clojure you have transducers, which let you compose steps so that the eventual list construction only happens for the elements that make it into the final result. They are easily implemented in JavaScript.
There's a big difference between Streams and JavaScript arrays:
[1, 2, 3, 4]
  .filter(el => {
    console.log(el);
    return el % 2 === 0;
  })
  .forEach(el => console.log(el));
In JavaScript the logged output will be:
1, 2, 3, 4 (from filter), then 2, 4 (from forEach)
For a Stream it will be:
1, 2 (filter), 2 (forEach), 3, 4 (filter), 4 (forEach)
So as you can see, JavaScript builds each intermediate collection in full and then iterates it, whereas an element passed into a Stream traverses the whole pipeline on its own: if a collection is fed to a Stream, the elements are passed through one after another.
A possible Stream implementation would be:
class Stream {
  constructor() {
    this.queue = [];
  }

  // the modifying methods
  forEach(func) {
    this.queue.push(["forEach", func]);
    return this;
  }
  filter(func) {
    this.queue.push(["filter", func]);
    return this;
  }
  map(func) {
    this.queue.push(["map", func]);
    return this;
  }
  subStream(v) {
    this.forEach(d => v.get(d));
    return this;
  }

  // data methods
  // cb defaults to a no-op so calls without a callback (e.g. via subStream) don't throw
  get(value, cb = () => {}) {
    for (let [type, func] of this.queue) {
      switch (type) {
        case "forEach":
          func(value);
          break;
        case "map":
          value = func(value);
          break;
        case "filter":
          if (!func(value)) return;
      }
    }
    cb(value);
  }
  range(start, end) {
    const result = [];
    Array.from({ length: end - start })
      .forEach((_, i) => this.get(i + start, r => result.push(r)));
    return result;
  }
}
Use case:
const nums = new Stream();

const even = new Stream();
even.filter(n => !(n % 2)).forEach(n => console.log(n));

const odd = new Stream();
odd.filter(n => (n % 2)).forEach(n => console.log(n));

nums
  .subStream(even)
  .subStream(odd)
  .range(0, 100);
No, they are not the same, because of how they process the data.
In LINQ (C#) or JavaScript, each operation on a collection must finish before the next operation in the pipeline is called.
In streams it's different. For example:
Arrays.asList(1, 2, 3).stream()
    .filter((Integer x) -> x > 1)
    .map((Integer x) -> x * 10)
    .forEach(System.out::println);
source collection: 1, 2, 3

filter(1)   -> Not OK. Element 1 does not pass to the next operation
               in the pipeline. Now deal with element 2.
filter(2)   -> OK. Element 2 passes to the next operation.
map(2)      -> Create the new element 20 and put it in the new stream.
forEach(20) -> Print 20. Done with element 2 of the source collection;
               now deal with element 3.
filter(3)   -> OK. Element 3 passes to the next operation.
map(3)      -> Create the new element 30 and put it in the new stream.
forEach(30) -> Print 30. No more elements in the source collection;
               finish executing the stream.

output:
20
30
One outcome of this approach is that some operations in the pipeline won't see every element, because some elements are filtered out along the way.
This explanation was taken from Streams In Depth by Stav Alfi.

Observable function to returned a chunked array

I have a function that returns something like Observable<[number, Array<DataItem>]>. Is it possible to write a function that returns Observable<[number, Array<PageWithDataItems>]> using some Observable functions, given a function chunk (which chunks the DataItem array according to page size) and a simple constructor that creates a PageWithDataItems from a chunked DataItem array?
What I have is some code that subscribes to Observable<[number, Array<DataItem>]> and then creates a new Observable, but I am hoping it would be possible to do the same with map, mapTo, switchMap, or similar. I am a bit lost in all the Observable functions, so any help is appreciated.
I am not entirely sure what you are going for here, but I gave it a shot:
// stream would be your data... just random chunks of numbers as an example here.
const stream = Rx.Observable.range(0, 480).bufferWithCount(100).select(d => [Math.random() * 100, d]);

class DataChunk<T> {
  constructor(public data: Array<T>) { }
}

const pageSize = 10;

stream
  // I do not understand what the 'number' in your [number, Array<DataItem>]
  // represents, but it is the 'someNumber' item here.
  .map(d => ({ someNumber: <number>d[0], data: <number[]>d[1] }))
  .map(d => ({
    someNumber: d.someNumber,
    pages: Ix.Enumerable
      .fromArray(d.data)
      // contiguous pages of pageSize items; note floor(idx / pageSize),
      // since idx % pageSize would deal items out round-robin instead
      .select((item, idx) => ({ pageNr: Math.floor(idx / pageSize), item: item }))
      .groupBy(i => i.pageNr)
      .select(pageItems => new DataChunk(pageItems.select(i => i.item).toArray()))
      .toArray()
  }))
  .subscribe(dataInfo => {
    // here each dataInfo sent down the stream has been split up into chunks
    // of pageSize
    log('Data received: ');
    log('  someNumber: ' + dataInfo.someNumber);
    log('  page count: ' + dataInfo.pages.length);
  });
Working example on jsfiddle.
I used IxJS to do the chunking. It works similarly to RxJS but operates on collections (e.g. arrays) rather than streams of events like RxJS. I hope this is close to what you wanted; your question is not entirely clear.
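If pulling in IxJS just for the chunking feels heavy, a plain helper inside map would do the same job (a sketch; chunk is a hypothetical helper, and the stream shape is the one from the answer above):

// Hypothetical chunk helper: split an array into contiguous pages of `size`.
function chunk(arr, size) {
  const pages = [];
  for (let i = 0; i < arr.length; i += size) {
    pages.push(arr.slice(i, i + size));
  }
  return pages;
}

stream
  .map(d => ({
    someNumber: d[0],
    pages: chunk(d[1], pageSize).map(page => new DataChunk(page))
  }))
  .subscribe(dataInfo => log('page count: ' + dataInfo.pages.length));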
