Ramda: How can I make this imperative reducer more declarative? - javascript

I have the following reducer function:
The first argument to each reducer is the aggregated value and the second argument is the next value. The reducer function below keeps passing the same reaction argument while aggregating the state$ value; each reducer yields a new aggregated value.
/**
* Applies all the reducers to create a state object.
*/
function reactionReducer(reaction: ReactionObject): ReactionObject {
let state$ = reactionDescriptionReducer({}, reaction);
state$ = reactionDisabledReducer(state$, reaction);
state$ = reactionIconReducer(state$, reaction);
state$ = reactionOrderReducer(state$, reaction);
state$ = reactionStyleReducer(state$, reaction);
state$ = reactionTitleReducer(state$, reaction);
state$ = reactionTooltipReducer(state$, reaction);
state$ = reactionVisibleReducer(state$, reaction);
return state$;
}
const state = reactionReducer(value);
The above works, but the function is hard-coded with its list of reducers. It seems like I should be able to do something like this with RamdaJS:
const state = R.????({}, value, [reactionDescriptionReducer,
reactionDisabledReducer,
reactionIconReducer,
reactionOrderReducer,
reactionStyleReducer,
reactionTitleReducer,
reactionTooltipReducer,
reactionVisibleReducer]);
I am new to RamdaJS so forgive me if this is a noob question.
How can I execute a chain of reducers using just RamdaJS?

The helper and constructs a new reducer, (r, x) => ..., by combining the two input reducers, f and g -
const and = (f, g) =>
(r, x) => g (f (r, x), x)
all uses and to construct a new reducer from an arbitrary number of reducers -
const identity = x =>
x
const all = (f = identity, ...more) =>
more .reduce (and, f)
Define myReducer using all -
const myReducer =
all
( reactionDisabledReducer
, reactionIconReducer
, reactionOrderReducer
// ...
)
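With just the three reducers listed, the reducer that all builds here is equivalent to writing this out by hand (shown only to illustrate the combination) -
const myReducerExpanded = (r, x) =>
  reactionOrderReducer(
    reactionIconReducer(
      reactionDisabledReducer(r, x),
      x),
    x)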
Given a mocked implementation of these three reducers -
const reactionDisabledReducer = (s, x) =>
x < 0
? { ...s, disabled: true }
: s
const reactionIconReducer = (s, x) =>
({ ...s, icon: `${x}.png` })
const reactionOrderReducer = (s, x) =>
x > 10
? { ...s, error: "over 10" }
: s
Run myReducer to see the outputs
const initState =
{ foo: "bar" }
myReducer (initState, 10)
// { foo: 'bar', icon: '10.png' }
myReducer (initState, -1)
// { foo: 'bar', disabled: true, icon: '-1.png' }
myReducer (initState, 100)
// { foo: 'bar', icon: '100.png', error: 'over 10' }
You can choose whatever names you like for and and all. I could see them as part of a reducer module, like reducer.and and reducer.all
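If you did want to group them, here is a minimal sketch of such a hypothetical reducer module (the name reducer and its shape are just an assumption) -
const reducer = {
  and: (f, g) => (r, x) => g(f(r, x), x),
  all: (f = x => x, ...more) => more.reduce(reducer.and, f)
}
// usage: reducer.all(reactionDisabledReducer, reactionIconReducer, reactionOrderReducer)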

One option of utilising Ramda here would be to make use of the fact that it supports passing functions as a monad instance to R.chain (otherwise known as the Reader monad).
This lets you sequence a number of functions together that share some common environment - in your case, reaction.
We can make use of R.pipeWith(R.chain) to compose a series of these functions that take some input (e.g. your state$ threading through each function) and return a function that takes the environment, producing a result to pass on to the next function in the pipeline.
// Some mock functions to demonstrate
const reactionDescriptionReducer = ({...state}, reaction) =>
({ description: reaction, ...state })
const reactionDisabledReducer = ({...state}, reaction) =>
({ disabled: reaction, ...state })
const reactionIconReducer = ({...state}, reaction) =>
({ icon: reaction, ...state })
// effectively `R.pipeK`
const kleisli = R.pipeWith(R.chain)
// we need the functions going into chain to be curried
const curried = f => a => b => f(a, b)
// finally, compose the series of functions together
const reactReducer = kleisli([
curried(reactionDescriptionReducer),
curried(reactionDisabledReducer),
curried(reactionIconReducer)
])({})
// and if all goes well...
console.log(
reactReducer("someCommonReactionValue")
)
<script src="//cdnjs.cloudflare.com/ajax/libs/ramda/0.26.1/ramda.min.js"></script>

My first attempt would not involve Ramda at all, just a simple:
const makeReducer = (...fns) => (x) => fns .reduce ( (s, fn) => fn (s, x), {} )
const fn = makeReducer (
(state$, reaction) => ({...state$, foo: `<<-${reaction.foo}->>`}),
(state$, reaction) => ({...state$, bar: `=*=${reaction.bar}=*=`}),
(state$, reaction) => ({...state$, baz: `-=-${reaction.baz}-=-`})
)
console .log (
fn ( {foo: 'a', bar: 'b', baz: 'c'} )
) //~> {foo: '<<-a->>', bar: '=*=b=*=', baz: '-=-c-=-'}
While you could choose to use Ramda's reduce and flip, it doesn't seem as though they'll add much here.
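For comparison, a rough sketch of that Ramda variant, assuming Ramda is in scope as R (R.flip would only matter if the reducers took their arguments in the opposite order):
const makeReducerR = (...fns) => (x) =>
  R.reduce((s, fn) => fn(s, x), {}, fns)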


RxJS observable of results from chain of functions

I have a function getC which depends on the output of getB which depends on the output of getA. I want to create an observable stream, so that when I subscribe to it I get the output of each function, as shown in the snippet.
Is there an operator in RxJS that will give me the same behaviour as in the snippet? Or is nesting switchMap, concat and of the only way?
const { concat, from, of } = rxjs
const { switchMap } = rxjs.operators
const getA = async() => new Promise(resolve => setTimeout(() => resolve({
a: 'A'
}), 500));
const getB = async value => new Promise(resolve => setTimeout(() => resolve({
a: value.a,
b: 'B',
}), 500));
const getC = async value => new Promise(resolve => setTimeout(() => resolve({
a: value.a,
b: value.b,
c: 'C',
}), 500));
const stream = from(getA()).pipe(
switchMap(a => concat(
of(a),
from(getB(a)).pipe(
switchMap(b => concat(
of(b),
getC(b)
)
)
)
))
);
stream.subscribe(value => {
console.log(value);
});
<script src="https://unpkg.com/rxjs#7.3.0/dist/bundles/rxjs.umd.min.js"></script>
If you know your dependencies in advance you can hardcode them in the different streams:
c$ requires the latest emission from b$
b$ requires the latest emission from a$
Run the streams in "order" whilst respecting their dependencies so you end up with three emissions, each (except for the first) depending on the previous one.
const a$ = of('A').pipe(delay(600));
const b$ = of('B').pipe(delay(400), combineLatestWith(a$), map(([b, a]) => a + b));
const c$ = of('C').pipe(delay(200), combineLatestWith(b$), map(([c, b]) => b + c));
concat(a$, b$, c$).subscribe(x => console.log(x));
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/7.3.0/rxjs.umd.min.js" integrity="sha512-y3JTS47nnpKORJX8Jn1Rlm+QgRIIZHtu3hWxal0e81avPrqUH48yk+aCi+gprT0RMAcpYa0WCkapxe+bpBHD6g==" crossorigin="anonymous" referrerpolicy="no-referrer"></script>
<script>
const {of, concat} = rxjs;
const {delay, combineLatestWith, map} = rxjs.operators;
</script>
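For reference, this logs the three accumulated values in order:
// A
// AB
// ABC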
You can do some refactoring to make your function recursive
const getValue = (propertyName, value = null) => new Promise(resolve =>
resolve(value ? {...value, [propertyName]: propertyName.toUpperCase()} : {[propertyName]: propertyName.toUpperCase()})
);
Then you can use a magic rxjs operator, expand:
import { from, EMPTY } from 'rxjs';
import { expand } from 'rxjs/operators';
from(getValue('a')).pipe(
expand((previousData) => {
if(!previousData.hasOwnProperty('b')) {
return from(getValue('b', previousData))
} else if(!previousData.hasOwnProperty('c')) {
return from(getValue('c', previousData))
} else {
return EMPTY
}
}
)
).subscribe(console.log);
Here is working code: https://stackblitz.com/edit/typescript-czpotj
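If the chain grows, the same expand pattern can be driven by a list of keys; a hypothetical generalisation, reusing getValue and the imports from above:
const keys = ['a', 'b', 'c'];
from(getValue(keys[0])).pipe(
  expand(previousData => {
    // find the first key that is not yet present on the accumulated value
    const next = keys.find(k => !previousData.hasOwnProperty(k));
    return next ? from(getValue(next, previousData)) : EMPTY;
  })
).subscribe(console.log);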

Javascript groupBy implementation only produces 1 group

I am trying to implement my own groupBy method. Everything I see says this should work, but I only get one group when I use it with an array, even though the grouping step itself looks fine. What am I missing:
const merge = (array) => array.reduce((a, b) => Object.keys(a).map(key => {
return {
[key]: a[key].concat(b[key] || [])
};
}).reduce(((a,b) => Object.assign({},a,b))))
Array.prototype.groupBy = function (grouper) {
const groups = this.map(e => {
return {
[grouper(e)]: [e]
};
})
console.log("Groups:\n",JSON.stringify(groups))
return merge(groups)
}
const one = {
1: [1,2,3],
0: [4,5,6]
}
const two = {
1: [7,8,9],
0: [10,11,12]
}
const three = {
1: [13],
0: [16]
}
const array1 = merge([one,two,three])
console.log("case1:\n",JSON.stringify(array1,null,4))
const array2 = [1,2,3,4,5,6,7,9,10].groupBy(e => e % 2)
console.log("case2:\n",JSON.stringify(array2,null,4))
Outputs below, expected is 'case1':
case1:
{
"0": [
4,
5,
6,
10,
11,
12,
16
],
"1": [
1,
2,
3,
7,
8,
9,
13
]
}
Groups:
[{"1":[1]},{"0":[2]},{"1":[3]},{"0":[4]},{"1":[5]},{"0":[6]},{"1":[7]},{"1":[9]},{"0":[10]}]
case2:
{
"1": [
1,
3,
5,
7,
9
]
}
The first reduce in your merge method has a dependency on the keys of the first object in the array.
objs.reduce((a, b) => Object
.keys(a)
// ^-- Takes only the keys from `a`
.map(key => ({ [key]: a[key].concat(b[key] || []) })
// ^^^^^^-- only merges in those keys from `b`
)
To see the issue in action, take away the 0 or 1 key from your one object.
To fix it without deviating from your current approach too much, you could make sure you take both keys from a and b:
objs.reduce((a, b) => Object
.keys(Object.assign({}, a, b))
// etc...
)
It still feels a bit wasteful to first map to key-value-pair type objects and then merge those.
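For contrast, here is a single-pass sketch that skips the intermediate per-element objects entirely (just an illustration; a fix that stays closer to your approach follows below):
const groupByDirect = (arr, grouper) =>
  arr.reduce((acc, e) => {
    const key = grouper(e)
    acc[key] = (acc[key] || []).concat([e])
    return acc
  }, {})
// groupByDirect([1,2,3,4,5,6,7,9,10], e => e % 2)
// => { "0": [2,4,6,10], "1": [1,3,5,7,9] }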
Final solution (removes another bug):
Array.prototype.groupBy = function (grouper) {
const keysOf = (...objs) => Object.keys(Object.assign({}, ...objs))
const groups = this.map(e => {
return {
[grouper(e)]: [e]
};
})
const merge = (array) => array.reduce((a, b) =>
keysOf(a, b).map(key => {
return {
[key]: (a[key] || []).concat(b[key] || [])
};
}).reduce((a, b) => Object.assign({}, a, b)))
return merge(groups)
}
const array2 = [1,2,3,4,5,6,7,9,10].groupBy(e => e % 2)
console.log("case2:\n",JSON.stringify(array2,null,2))
@user3297291 points out the issue. I would recommend a different merge altogether. First we write merge2 helper which destructively merges b into a -
function merge2 (a, b)
{ for (const [k, v] of Object.entries(b))
if (a[k])
a[k] = [ ...a[k], ...v]
else
a[k] = v
return a
}
Now you can write merge to accept any number of objects. Since it initialises the reduce with a fresh {}, no input objects will be mutated -
const merge = (...all) =>
all.reduce(merge2, {})
Now groupBy works the way you wrote it, simply applying the mapped elements to merge -
const groupBy = (arr, f) =>
merge(...arr.map(v => ({ [f(v)]: [v] })))
const result =
groupBy([1,2,3,4,5,6,7,9,10], e => e % 2)
console.log(JSON.stringify(result))
{"0":[2,4,6,10],"1":[1,3,5,7,9]}
If you want to make merge2 using a pure functional expression, you can write it as -
const merge2 = (a, b) =>
Object
.entries(b)
.reduce
( (r, [k, v]) =>
r[k]
? Object.assign(r, { [k]: [...r[k], ...v] })
: Object.assign(r, { [k]: v })
, a
)
You could skip the whole merge song and dance and write groupBy in a more direct way -
const call = (f, v) =>
f(v)
const groupBy = (arr, f) =>
arr.reduce
( (r, v) =>
call
( k =>
({ ...r, [k]: r[k] ? [...r[k], v] : [v] })
, f(v)
)
, {}
)
const result =
groupBy([1,2,3,4,5,6,7,9,10], e => e % 2)
console.log(JSON.stringify(result))
{"0":[2,4,6,10],"1":[1,3,5,7,9]}
Another option is to use Map as it was designed, and convert to an Object after -
const call = (f, v) =>
f(v)
const groupBy = (arr, f) =>
call
( m =>
Object.fromEntries (m.entries())
, arr.reduce
( (r, v) =>
call
( k =>
r.set
( k
, r.has(k)
? r.get(k).concat([v])
: [v]
)
, f(v)
)
, new Map
)
)
const result =
groupBy([1,2,3,4,5,6,7,9,10], e => e % 2)
console.log(JSON.stringify(result))

Javascript - Using compose with reduce

I am learning functional programming with JavaScript. I have learned that reduce takes two parameters, the accumulator and the current value, and that if we don't supply an initial value, the first element is used as the accumulator. But I can't understand how the purchaseItem function is working in the code below. Can anyone please explain?
const user = {
name: 'Lachi',
active: true,
cart: [],
purchases: []
}
let history = []
const compose = (f, g) => (...args) => f(g(...args))
console.log(purchaseItem(
emptyCart,
buyItem,
applyTaxToItems,
addItemToCart
)(user, {name: 'laptop', price: 200}))
function purchaseItem(...fns) {
console.log(fns)
return fns.reduce(compose)
}
function addItemToCart (user, item) {
history.push(user)
const updatedCart = user.cart.concat(item)
return Object.assign({}, user, { cart: updatedCart })
}
function applyTaxToItems(user) {
history.push(user)
const {cart} = user
const taxRate = 1.3
const updatedCart = cart.map(item => {
return {
name: item.name,
price: item.price * taxRate
}
})
return Object.assign({}, user, { cart: updatedCart })
}
function buyItem(user) {
history.push(user)
return Object.assign({}, user, { purchases: user.cart })
}
function emptyCart(user) {
history.push(user)
return Object.assign({}, user, {cart: []})
}
Maybe it helps if you take a minimal working example and visualize the output structure:
const comp = (f, g) => x => f(g(x));
const inc = x => `inc(${x})`;
const sqr = x => `sqr(${x})`;
const id = x => `id(${x})`;
const main = [sqr, inc, inc, inc].reduce(comp, id);
console.log(main(0)); // id(sqr(inc(inc(inc(0)))))
Please note that we need id to allow reducing an empty array.
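A quick illustration of why that seed matters:
console.log([inc].reduce(comp, id)('0'))  // id(inc(0))
console.log([].reduce(comp, id)('0'))     // id(0) – an empty list of functions degrades to id
// without the seed, [].reduce(comp) would throw "Reduce of empty array with no initial value"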
It's a way of creating a pipeline of functions whereby the output from one function is used as the parameter of the next, so we end up with a composed function that is effectively
(...args) =>
emptyCart(
buyItem(
applyTaxToItems(
addItemToCart(...args)
)
)
)
Writing the reduce out in longhand might help in understanding:
fns.reduce((acc, currentFn) => compose(acc, currentFn))
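Since no initial value is supplied, the first function (emptyCart) becomes the starting accumulator and the fold proceeds like this:
// pass 1: compose(emptyCart, buyItem)      -> (...args) => emptyCart(buyItem(...args))
// pass 2: compose(pass1, applyTaxToItems)  -> (...args) => emptyCart(buyItem(applyTaxToItems(...args)))
// pass 3: compose(pass2, addItemToCart)    -> (...args) => emptyCart(buyItem(applyTaxToItems(addItemToCart(...args))))
So the arguments (user, {name: 'laptop', price: 200}) reach addItemToCart first, and each result then flows left through the chain.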

convert object of array to object javascript

I have an object that includes arrays of arrays, like below:
objArray= { hh:[['a','b'],['c','d']],jj:[['x','y'],['z','w']]}
and what I want is to change each inner array to an object:
convertedObject
{
hh:[
{id:'a', name:'b'},
{id:'c', name:'d'}
],
jj:[
{id:'x', name:'y'},
{id:'z', name:'w'}
],
}
I wrote 2 functions to do this, but they need a little change:
function convertToObjects(arr) {
return Object.values(arr).map(e => {
return { id: e[0], name: e[1] };
});
}
function convertChoicesToObjects(choicesArray) {
return Object.keys(choicesArray).map((key) => (convertToObjects(choicesArray[key])));
}
const convertedObject=convertChoicesToObjects(objArray)
My function output is:
{
0:[
{id:'a', name:'b'},
{id:'c', name:'d'}
],
1:[
{id:'x', name:'y'},
{id:'z', name:'w'}
],
}
Iterate over the keys and use map:
const objArray = {"hh": [["a", "b"], ["c", "d"]], "jj": [["x", "y"], ["z", "w"]]};
const output = {};
Object.keys(objArray).forEach(key => {
output[key] = objArray[key].map(item => ({"id": item[0], "name": item[1]}));
});
console.log(output);
You could use the map and forEach methods.
objArray= { a:[['a','b'],['c','d']],b:[['x','y'],['z','w']]}
Object.keys(objArray).forEach((key) => {
objArray[key] = objArray[key].map(([id, name]) => ({id, name}));
});
console.log(objArray);
The output can be achieved using a simple for...in loop and the .map() method of arrays:
const input = {
a: [['a', 'b'], ['c', 'd']],
b: [['x', 'y'],['z', 'w']]
};
const transform = (input) => {
const output = {};
for (const key in input) {
output[key] = input[key].map(([id, name]) => ({id, name}));
}
return output;
};
console.log(transform(input));
You can use reduce()
const objArray = { a:[['a','b'],['c','d']],b:[['x','y'],['z','w']]};
const data = Object.keys(objArray).reduce((prev, key) => {
prev[key] = objArray[key].reduce((res, arr) => {
res.push({id: arr[0], name: arr[1] });
return res;
}, []);
return prev;
}, {});
console.log(data);
You could build a new object with Object.fromEntries.
This approach uses another array for the wanted keys of the objects.
var data = { a: [['a', 'b'],['c','d']],b:[['x','y'],['z','w']]},
keys = ['id', 'name'],
result = Object.fromEntries(
Object
.entries(data)
.map(([k, v]) => [
k,
v.map(a => Object.fromEntries(keys.map((k, i) => [k, a[i]])))
])
);
console.log(result);
You can write programs like little stories using prgm -
const myConvert = (o = {}) =>
prgm // given input object, o
( o // starting with o
, Object.entries // get its entries
, map (convert1) // then map over them using convert1
, Object.fromEntries // then make a new object
)
const convert1 = ([ key, values ]) =>
prgm // given input key and values
( values // starting with values
, map (([ id, name ]) => ({ id, name })) // map over arrays creating objs
, r => [ key, r ] // then create a key/result pair
)
const input =
{ a: [ ['a','b']
, ['c','d']
]
, b: [ ['x','y']
, ['z','w']
]
}
console.log
( myConvert (input)
)
// => { ... }
To make this possible, we need -
const prgm = (x, ...fs) =>
fs .reduce ((r, f) => f (r), x)
const map = f => xs =>
xs .map (x => f (x))
But perhaps a better name for myConvert is objectMap. To make it generic, we will make the conversion function convert1 a parameter. And since there's no need to modify the keys of the input object, we will only call the conversion function on the object's values -
const identity = x =>
x
const objectMap = (f = identity) => (o = {}) =>
prgm // given mapper, f, and object, o
( o // starting with o
, Object.entries // get its entries
, map (([ k, v ]) => [ k, f (v) ]) // transform each v using f
, Object.fromEntries // then make a new object
)
Now using generic function objectMap, we can write myConvert as a specialization. This isolates the unique essence of your transformation and detangles it from the rest of your program -
const myConvert =
objectMap // using generic objectMap
( map (([ id, name ]) => ({ id, name })) // convert arrays to objects
)
const input =
{ a: [ ['a','b']
, ['c','d']
]
, b: [ ['x','y']
, ['z','w']
]
}
console.log
( myConvert (input)
)
// => { ... }
Hopefully this shows the power of thinking about your programs from different perspectives.

How do I break from compose or pipe and return all current data?

I am using a pipe for input validation for login, and I would like to break out and return the current data if possible. Is it even possible to break and return data from a reduce?
My current code sample:
const pipe = (...fns) => fns.reduce((f, g) => (obj) => g(f(obj)))
pipe(
(obj) => {
console.log('fn1', obj)
return { ...obj, ...(!!obj.name || { error: ['NAME_IS_FALSEY'] })}
},
(obj) => {
// if ((obj || {}).error ) return obj
console.log('[fn2]', obj)
return {
...obj,
...(
!!obj.password ||
obj.error ?
{ error: [...obj.error, 'PASSWORD_IS_FALSEY'] } :
{ error: 'PASSWORD_IS_FALSEY' }
)
}
},
(obj) => console.log('[fn3 etc...]', obj)
)({
name: '',
password: '',
})
Maybe I can wrap everything in a new Promise and resolve midway through the reducer?
Here's an example of a recursive pipe that breaks the call chain as soon as the current value does not meet a predicate:
const pipeWhile = pred => (f, ...fs) => x =>
pred(x) && f
? pipeWhile (pred) (...fs) (f(x))
: x;
It might be easier to read and maintain when written as a regular while or for loop, as sketched below.
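For comparison, here is a sketch of the same behaviour as a plain loop (not the answer's code, just an illustration):
const pipeWhileLoop = pred => (...fns) => x => {
  let result = x;
  for (const f of fns) {
    if (!pred(result)) break;   // stop as soon as the predicate fails
    result = f(result);
  }
  return result;
};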
Here it is used with your provided example:
const pipeWhile = pred => (f, ...fs) => x =>
pred(x) && f
? pipeWhile (pred) (...fs) (f(x))
: x;
const noError = x => !x.hasOwnProperty("error");
const rule = (error, pred) => x => pred(x)
? x : { ...x, error: [error] }
const validation = pipeWhile(noError)(
rule("NAME_IS_FALSEY", obj => !!obj.name),
rule("PASSWORD_IS_FALSEY", obj => !!obj.password)
);
console.log(
validation({
name: '',
password: '',
}),
validation({
name: 'Jane',
password: '',
}),
validation({
name: 'Jane',
password: 'PA$$W0RD',
})
)
If each step in your pipe returns a Maybe object, you can use the pipeK function from Ramda to create a Kleisli-composition based pipe, which runs until one of the steps returns a Nothing. You can get a Maybe implementation from the sanctuary-maybe package.
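A rough sketch of that idea, substituting a tiny hand-rolled Maybe for sanctuary-maybe and R.pipeWith(R.chain) for the older R.pipeK (assuming Ramda is in scope as R):
// minimal Maybe stand-in; both method names let R.chain dispatch to it
const Just = value => ({ value, chain: f => f(value), 'fantasy-land/chain': f => f(value) })
const Nothing = { chain: _ => Nothing, 'fantasy-land/chain': _ => Nothing }
const checkName = obj => obj.name ? Just(obj) : Nothing
const checkPassword = obj => obj.password ? Just(obj) : Nothing
const validate = R.pipeWith(R.chain)([checkName, checkPassword])
validate({ name: 'Jane', password: '' })          // Nothing – later steps are skipped
validate({ name: 'Jane', password: 'PA$$W0RD' })  // Just({ name: 'Jane', password: 'PA$$W0RD' })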
