Trouble using javascript partial.lenses to obtain properties within an array - javascript

I'm trying to figure out how to create a lens that will give me an array of nested property values from within an array of objects. Here's a simple example:
const L = require('partial.lenses');
const data = [
  {
    r: [
      {
        d: {
          one: 1,
          two: 2
        }
      }
    ]
  },
  {
    r: [
      {
        d: {
          three: 3,
          four: 4
        }
      }
    ]
  }
];
const lens = L.compose (
  L.elems,
  L.prop ('r'),
  L.elems,
  L.prop ('d')
);
const result = L.get (lens, data);
console.log (result);
I want:
[{ one: 1, two: 2 }, { three: 3, four: 4 }]
But get:
{ one: 1, two: 2 }
I'm sure this is trivial, but I can't get it quite right. Once my lens correctly selects the array of 'd's, I want to use L.modify to get the data back with all of the 'd' objects replaced by a string. I think I know how to do that once my lens is correct.
Thanks

Use L.collect instead of L.get: L.get returns only the first matching value (like [].find), while L.collect returns all matching values (like [].filter).
https://github.com/calmm-js/partial.lenses#l-collect
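For reference, a minimal sketch of both steps with the lens from the question (the 'replaced' placeholder string and variable names are mine):
const result = L.collect (lens, data);
// => [ { one: 1, two: 2 }, { three: 3, four: 4 } ]

// L.modify maps a function over every focused 'd' object:
const replaced = L.modify (lens, () => 'replaced', data);
// => [ { r: [ { d: 'replaced' } ] }, { r: [ { d: 'replaced' } ] } ]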

Related

How to filter object entries to keep only those whose keys start with a string, using ramda?

Having the following array:
const vegsAndFruits = [
  {
    "fruit_banana": 10,
    "fruit_apple": 1,
    "veg_tomato": 3,
    "fruit_watermelon": 11
  },
  {
    "veg_carrot": 3,
    "veg_garlic": 11,
    "veg_potato": 0,
    "fruit_apricot": 22
  },
  {
    "veg_eggplant": 2,
    "veg_cabbage": 1,
    "fruit_strawberry": 100,
    "fruit_orange": 30
  }
]
I want to filter it to return the same array but to keep only the properties that start with "fruit".
Expected output
const expectedOutput = [
  {
    "fruit_banana": 10,
    "fruit_apple": 1,
    "fruit_watermelon": 11
  },
  {
    "fruit_apricot": 22
  },
  {
    "fruit_strawberry": 100,
    "fruit_orange": 30
  }
]
My attempt
I thought the solution would come from combining Ramda's R.startsWith() and R.pickBy(), but the following doesn't work as I expected:
const R = require("ramda")
R.map(R.pickBy(R.startsWith(["fruit"])),vegsAndFruits)
which returns
// [ {}, {}, {} ]
What am I missing here?
The R.pickBy predicate is called with the value (1st) and key (2nd) parameters. Since you need the key, use R.nthArg to create a function that returns the 2nd param:
const { map, pickBy, pipe, nthArg, startsWith } = R
const fn = map(pickBy(pipe(
  nthArg(1),
  startsWith('fruit')
)))
const vegsAndFruits = [{"fruit_banana":10,"fruit_apple":1,"veg_tomato":3,"fruit_watermelon":11},{"veg_carrot":3,"veg_garlic":11,"veg_potato":0,"fruit_apricot":22},{"veg_eggplant":2,"veg_cabbage":1,"fruit_strawberry":100,"fruit_orange":30}]
const result = fn(vegsAndFruits)
console.log(result)
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.28.0/ramda.min.js" integrity="sha512-t0vPcE8ynwIFovsylwUuLPIbdhDj6fav2prN9fEu/VYBupsmrmk9x43Hvnt+Mgn2h5YPSJOk7PMo9zIeGedD1A==" crossorigin="anonymous" referrerpolicy="no-referrer"></script>
The R.pickBy predicate takes two parameters, the value and the key. You can run R.map(R.pickBy((_, key) => key.startsWith('fruit')), vegsAndFruits) to get the result you want.
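Spelled out, a small sketch of that arrow-function variant applied to the question's data (the onlyFruits name is mine):
const onlyFruits = R.map(
  R.pickBy((_, key) => key.startsWith('fruit')),
  vegsAndFruits
);
console.log(onlyFruits);
// [ { fruit_banana: 10, fruit_apple: 1, fruit_watermelon: 11 },
//   { fruit_apricot: 22 },
//   { fruit_strawberry: 100, fruit_orange: 30 } ]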

Removing Duplicates in Arrays

I am trying to write a function that removes duplicates from an array. It should handle strings, objects, and integers as well. So far my code handles strings but nothing else. How can I make this function universal, so that it handles numbers, arrays, objects, and mixed types?
let unique = (a) => a.filter((el, i, self) => self.indexOf(el) === i);
In this function I have unique() filtering to a new array, using each element's index in the array to check whether it is a duplicate. Any help would be appreciated.
I think the first thing you should do is sort the array passed to the function. Sorting puts the elements in order, so duplicates end up next to each other; for example, sorting [ 1, 3, 4, 'a', 'c', 'a' ] gives [ 1, 3, 4, 'a', 'a', 'c' ]. The next step is to filter the sorted array.
const unique = a => {
  if (!Array.isArray(a))
    throw new Error(`${a} is not an array`);
  let val = a.sort().filter((value, idx, array) =>
    array[++idx] != value
  );
  return val;
}
let array = [1, 5, 3, 2, "d", "q", "b", "d"];
unique(array); // [1, 2, 3, 5, "b", "d", "q"]

let obj = { foo: "bar" };
let arraySize = array.length;
array[arraySize++] = obj;
array[arraySize++] = "foo";
array[arraySize++] = "baz";
array[arraySize++] = obj;
unique(array); // [1, 2, 3, 5, {…}, "b", "baz", "d", "foo", "q"]
It also works for mixed types, but if you pass in an array whose elements include arrays or objects, this code will not treat structurally equal literals as duplicates:
unique(["a", 1, 3, "a", 3, 3, { foo: "baz" }, { foo: "baz" }]); // the duplicate { foo: "baz" } is not removed, because the two literals are different objects in memory
You should also note that this code does not return the elements in the order they were passed in; that is a result of the sort call.
Try using a set. In Java, without generics, you can write the function as:
Set returnUnique(Object array[]) {
    Set set = new HashSet();
    for (Object obj : array) {
        set.add(obj);
    }
    return set;
}
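For the JavaScript case in the question, the same idea with a native Set might look like the sketch below. Note that a Set, like indexOf, compares objects by reference, so structurally equal object literals still count as distinct; unlike the sort-based version, though, it preserves the original order:
const unique = (a) => [...new Set(a)];

unique([1, 5, 3, 2, "d", "q", "b", "d"]); // [1, 5, 3, 2, "d", "q", "b"]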

Get list of values from several paths in json object

I have an object with several nested layers of arrays and subobjects, from which I need to extract the values from some paths. Is there some library or native function which can help me do that? I'm already using Lodash and jQuery, but have a hard time figuring out how to simplify this problem.
Example:
{
  a: [
    {
      b: 0,
      c: 1
    },
    {
      b: 1,
      c: 2
    }
  ]
}
Now I would like to get a list of all a[0..n].b.
My actual object is much larger and has 3 layers of arrays and a path like syn[0].sem[0].pdtb3_relation[0].sense, so I'd rather not write 3 nested for loops if a library function exists.
You can use forEach() to iterate through the array.
var o = {
  a: [
    {
      b: 0,
      c: 1
    },
    {
      b: 1,
      c: 2
    }
  ]
}
Object.keys(o).forEach(a => o[a].forEach(y => console.log(y.b)));
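If you want the values collected into an array rather than just logged, here is a minimal sketch using map/flatMap (the syn/sem/pdtb3_relation/sense names are taken from the question; obj stands in for the real object):
// All a[0..n].b values from the example object:
const bs = o.a.map(item => item.b); // [0, 1]

// For the deeper path syn[*].sem[*].pdtb3_relation[*].sense,
// nested flatMap avoids writing three explicit loops:
const senses = obj.syn.flatMap(s =>
  s.sem.flatMap(sem =>
    sem.pdtb3_relation.map(rel => rel.sense)
  )
);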

How do you group a stream, then process groups separately based on the group's key?

First of all, I'm fairly new to streams, so I'm still getting to grips with some common patterns.
In many libraries we can split a stream into a stream of streams using .groupBy(keySelectorFn). For example, this stream is split into streams based on the value of 'a' in each object (pseudo-code, not based on any particular library):
var groups = Stream.of(
  { a: 1, b: 0 },
  { a: 1, b: 1 },
  { a: 2, b: 2 },
  { a: 1, b: 3 }
)
.groupBy(get('a'));
Say I want to process groups differently based on the value of 'a' of that group:
groups.map(function(group) {
  if (..?) {
    // Run the group through some process
  }
  return group;
});
I can't see how to get the value of 'a' without consuming the first element of each group (and if the first element of a group is consumed the group is no longer intact).
This seems to me a fairly common thing that I want to do with streams. Am I taking the wrong approach?
--- EDIT ---
Here's a more specific example of a problem that I'm stuck on:
var groups = Stream.of(
  { a: 1, b: 0 },
  { a: 1, b: 1 },
  { a: 2, b: 0 },
  { a: 2, b: 1 },
  { a: 2, b: 2 },
  { a: 1, b: 2 }
)
.groupBy(get('a'));
How do I select the first object where a === 1 and the first 2 objects where a === 2, and pass any other objects straight through? This seems logical to me:
groups.chain(function(group) {
  return group.key === 1 ? group.take(1) :
         group.key === 2 ? group.take(2) :
         group;
});
But group.key does not exist (and even if it did it would seem a bit... smelly).
groupBy will give you a stream of streams (each value in the stream is itself a stream). Using fold, you can reduce each group (which is a stream) to a single value, applying whatever conditionals you need, and flatMap then merges those results back into a single stream. Here is a simple example that processes groups of objects: it groups the objects by property "a", folds each group into a single object with type and val properties based on an arithmetic operation on b, and flattens these final objects into a single stream:
var groupStream = Bacon.fromArray([
  { a: 1, b: 0 },
  { a: 1, b: 1 },
  { a: 2, b: 2 },
  { a: 1, b: 3 }
]);

// -----[stream of objects where a=1]-------[stream of objects where a=2]---->
var groups = groupStream.groupBy(function (k) { return k.a; });

// v is a stream of a=1 objects or a=2 objects
groups.flatMap(function (v) {
  // fold (reduce) over the values in v (a stream)
  return v.fold({ type: '', val: 0 }, function (acc, i) {
    if (i.a == 1) {
      return { type: 'one', val: acc.val + i.b };
    }
    if (i.a == 2) {
      return { type: 'two', val: acc.val + i.b };
    }
  });
}).onValue(function (v) {
  console.log(v);
});
Here is the jsbin: http://jsbin.com/haliqerita/edit?js,console
Hope that helps.
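On the edited question (taking a different number of items per group): if I remember the Bacon.js API correctly, groupBy also accepts an optional second "limit" function that is called with each grouped stream plus the value that started the group, which gives you the key without consuming the group. A rough sketch, worth checking against the docs:
var limited = groupStream
  .groupBy(
    function (v) { return v.a; },
    function (group, firstValue) {
      // firstValue.a is the group's key
      return firstValue.a === 1 ? group.take(1) :
             firstValue.a === 2 ? group.take(2) :
             group;
    }
  )
  .flatMap(function (g) { return g; });

limited.onValue(function (v) { console.log(v); });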

Firebase.set() adds extra nodes

When running the following query using the root directory of my Firebase, extra nodes are added.
Query:
fire.set({
  "users": [
    {
      "0": [
        {
          "email": "foobar#gmail.com",
          "snake": [
            {
              "highScore": "15"
            }
          ]
        }
      ]
    }
  ]
});
Result
I get the same result when formatting the data in a json file and importing it directly using the Firebase web interface. Did I miss something in the documentation perhaps?
The issue here is that any time you use array syntax, i.e. [ ... ], you're creating an "array" in Firebase, which we do by just creating an object with numeric keys (i.e. 0, 1, 2, ...).
So if you do:
ref.set({ a: 5 });
The resulting object will be:
{ a: 5 }
But if you instead do:
ref.set([{a: 5}, {b: 6}]);
You'll get:
{
  '0': { a: 5 },
  '1': { b: 6 }
}
So if you just remove the square brackets from the data you're setting, e.g.:
fire.set({
  "users": {
    "0": {
      "email": "foobar#gmail.com",
      "snake": {
        "highScore": "15"
      }
    }
  }
});
The resulting data in the web interface should match your data exactly.
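For comparison, roughly what the original array-based call would store, which is where the extra nodes come from (every [ ... ] level becomes an object keyed by 0, 1, 2, ...):
{
  "users": {
    "0": {
      "0": {
        "0": {
          "email": "foobar#gmail.com",
          "snake": {
            "0": { "highScore": "15" }
          }
        }
      }
    }
  }
}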
