I have an object with several nested layers of arrays and subobjects, from which I need to extract the values at certain paths. Is there a library or native function that can help me do that? I'm already using Lodash and jQuery, but I'm having a hard time figuring out how to simplify this problem.
Example:
{
  a: [
    { b: 0, c: 1 },
    { b: 1, c: 2 }
  ]
}
Now I would like to get a list of all a[0..n].b.
My actual object is much larger and has 3 layers of arrays and a path like syn[0].sem[0].pdtb3_relation[0].sense, so I'd rather not write 3 nested for loops if a library function exists.
You can use forEach() to iterate through the array.
var o = {
  a: [
    { b: 0, c: 1 },
    { b: 1, c: 2 }
  ]
};
Object.keys(o).forEach(a => o[a].forEach(y => console.log(y.b)));
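For the deeper structure mentioned in the question, native Array.prototype.flatMap (ES2019) collects the values without writing three nested for loops. This is a sketch assuming the three-level shape described in the question (syn[*].sem[*].pdtb3_relation[*].sense); the sample values are made up for illustration:

```javascript
// Collect every `sense` value across three nested array levels.
// The shape is taken from the question; the data itself is invented.
const doc = {
  syn: [
    { sem: [{ pdtb3_relation: [{ sense: 'Expansion' }] }] },
    { sem: [{ pdtb3_relation: [{ sense: 'Contingency' }, { sense: 'Temporal' }] }] }
  ]
};

const senses = doc.syn.flatMap(s =>
  s.sem.flatMap(m =>
    m.pdtb3_relation.map(r => r.sense)));

console.log(senses); // [ 'Expansion', 'Contingency', 'Temporal' ]
```

Since you already use Lodash, `_.flatMap` works the same way and also runs in environments without native flatMap support.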
Object.assign(...as) appears to change the input parameter. Example:
const as = [{a:1}, {b:2}, {c:3}];
const aObj = Object.assign(...as);
I spread an array of object literals as the arguments of the assign function.
I omitted console.log statements. Here's the stdout from node 13.7:
as before assign: [ { a: 1 }, { b: 2 }, { c: 3 } ]
aObj: { a: 1, b: 2, c: 3 }
as after assign: [ { a: 1, b: 2, c: 3 }, { b: 2 }, { c: 3 } ]
The reader may notice that the first element of as has been changed entirely.
Making the elements of a new array bs immutable (using Object.freeze)
const bs = [{a:1}, {b:2}, {c:3}];
[0, 1, 2].map(k => Object.freeze(bs[k]));
const bObj = Object.assign(...bs);
leads to an error:
TypeError: Cannot add property b, object is not extensible
at Function.assign (<anonymous>)
Which indicates the argument is indeed being changed.
What really confounds me is that even binding my array, cs, inside another function (I think this is called a closure in JS)
const cs = [{a:1}, {b:2}, {c:3}];
const f = (xs) => Object.assign(...xs);
const g = () => f(cs);
const cObj = g();
returns:
cs before assign: [ { a: 1 }, { b: 2 }, { c: 3 } ]
cObj: { a: 1, b: 2, c: 3 }
cs after assign: [ { a: 1, b: 2, c: 3 }, { b: 2 }, { c: 3 } ]
What went wrong here? And how may one safely use Object.assign without wrecking its first argument?
Object.assign is not a pure function; it writes over its first argument, target.
Here is its entry on MDN:
Object.assign(target, ...sources)
Parameters
target
The target object — what to apply the sources’ properties to, which is returned after it is modified.
sources
The source object(s) — objects containing the properties you want to apply.
Return value
The target object.
The key phrase is "[the target] is returned after it is modified". To avoid mutating your data, pass an empty object literal {} as the first argument:
const aObj = Object.assign({}, ...as);
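A quick check that the empty-target form leaves the sources untouched (variable names here are illustrative, not from the question):

```javascript
// With {} as the target, the sources are only read, never written to.
const parts = [{ a: 1 }, { b: 2 }, { c: 3 }];
const merged = Object.assign({}, ...parts);

console.log(merged);   // { a: 1, b: 2, c: 3 }
console.log(parts[0]); // { a: 1 } (unchanged)
```

Object spread (`{ ...x, ...y }`) is a non-mutating alternative when merging a fixed number of objects, but for merging the elements of an array, `Object.assign({}, ...arr)` remains the usual idiom.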
I have a JavaScript object that looks like this:
const data = {
  x: 1,
  y: 2,
  z: 3,
  a: 4,
  b: 5,
  c: 6
};
We have a signing service in our Angular 6 application which stringifies this object, hashes the string, then attaches a signature to it. Then it saves it to a Firestore database. The database orders the properties alphabetically, so the object ends up looking like this:
{
a: 4,
b: 5,
c: 6,
x: 1,
y: 2,
z: 3
}
When we retrieve this object from the database and try to validate the signature, it fails. It fails because when you stringify this object, the alphabetical order of the properties results in a different string compared to when we signed it. This results in a different hash which doesn’t match with the original signature.
Our current solution to this problem is that we write out the properties in alphabetical order in the code, but we'd like to make this foolproof (e.g. if another developer comes along and adds a property at the bottom, say d, not realizing it's supposed to be alphabetical). I'm told by a colleague that there is some way of telling JavaScript to order the properties according to its own algorithm. If we could do that, then we'd order the properties according to that algorithm before stringifying, hashing, and signing, and then when we retrieve the object from the database, do the same thing: order the properties according to JavaScript's algorithm, stringify, hash, and validate.
Does anyone know what this JavaScript ordering is and how to do it?
There isn't a way for JS to naturally order an object; you're going to have to tinker with it yourself.
The easiest way that I can think of to do this would be to convert the object to an array of entries and sort from there:
Object.entries(test).sort((a, b) => a[1] - b[1])
This returns the following array:
[ [ 'x', 1 ],
[ 'y', 2 ],
[ 'z', 3 ],
[ 'a', 4 ],
[ 'b', 5 ],
[ 'c', 6 ] ]
If you want it back in an object,
Object.assign({}, ...Object.entries(test).sort((a, b) => a[1] - b[1]).map(([key, value]) => ({[key]: value})) )
returns
{ x: 1, y: 2, z: 3, a: 4, b: 5, c: 6 }
Create a custom stringify function that handles putting the object in the correct order.
const data = {
  a: 4,
  b: 5,
  c: 6,
  x: 1,
  y: 2,
  z: 3
};
function customStringify(d) {
  return '{' + Object
    .entries(d)
    .sort(([, v1], [, v2]) => v1 - v2)
    .map(([k, v]) => `${k}:${v}`)
    .join(",") + '}';
}
const res = customStringify(data);
console.log(res);
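An alternative that stays within standard JSON: JSON.stringify accepts an array of keys as its second argument and emits properties in that order, so passing the sorted key list yields a canonical string regardless of insertion order. A minimal sketch (the function name is mine); note that for nested objects the key list would have to cover every level, which is fine for the flat object in the question:

```javascript
// Produce a canonical JSON string by passing the sorted keys
// as JSON.stringify's replacer array.
function canonicalStringify(obj) {
  return JSON.stringify(obj, Object.keys(obj).sort());
}

const signedForm = canonicalStringify({ x: 1, y: 2, z: 3, a: 4, b: 5, c: 6 });
const fromDb     = canonicalStringify({ a: 4, b: 5, c: 6, x: 1, y: 2, z: 3 });

console.log(signedForm === fromDb); // true
console.log(signedForm);            // {"a":4,"b":5,"c":6,"x":1,"y":2,"z":3}
```

Because both sides stringify through the same sorted key list, the hash computed before saving and after retrieval will match.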
I'm trying to figure out how to create a lens that'll give me an array of the key values from within an array. Here's a simple example:
const L = require('partial.lenses');
const data = [
  { r: [ { d: { one: 1, two: 2 } } ] },
  { r: [ { d: { three: 3, four: 4 } } ] }
];
const lens = L.compose(
  L.elems,
  L.prop('r'),
  L.elems,
  L.prop('d')
);
const result = L.get (lens, data);
console.log (result);
I want:
[{ one: 1, two: 2 }, { three: 3, four: 4 }]
But get:
{ one: 1, two: 2 }
I'm sure this is trivial, but I can't get it quite right. Once my lens correctly selects the array of 'd's, I want to use L.modify to replace all of the 'd' objects with a string. I think I know how to do that once my lens is correct.
Thanks
Use L.collect instead of L.get. L.get returns the first matching entity, while L.collect returns all matching entities, similar to [].find vs [].filter.
https://github.com/calmm-js/partial.lenses#l-collect
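If it helps to see the selection without the lens machinery, the same traversal over this particular data can be expressed with native flatMap; this mirrors what L.collect does for the lens in the question:

```javascript
// Native equivalent of collecting every r[*].d from the question's data.
const data = [
  { r: [ { d: { one: 1, two: 2 } } ] },
  { r: [ { d: { three: 3, four: 4 } } ] }
];

const result = data.flatMap(item => item.r.map(entry => entry.d));

console.log(result); // [ { one: 1, two: 2 }, { three: 3, four: 4 } ]
```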
First of all, I'm fairly new to streams, so I'm still getting to grips with some common patterns.
In many libraries we can split a stream into a stream of streams using .groupBy(keySelectorFn). For example, this stream is split into streams based on the value of 'a' in each object (pseudo-code, not based on any particular library):
var groups = Stream.of(
{ a: 1, b: 0 },
{ a: 1, b: 1 },
{ a: 2, b: 2 },
{ a: 1, b: 3 }
)
.groupBy(get('a'));
Say I want to process groups differently based on the value of 'a' of that group:
groups.map(function(group) {
if (..?) {
// Run the group through some process
}
return group;
});
I can't see how to get the value of 'a' without consuming the first element of each group (and if the first element of a group is consumed the group is no longer intact).
This seems to me a fairly common thing that I want to do with streams. Am I taking the wrong approach?
--- EDIT ---
Here's a more specific example of a problem that I'm stuck on:
var groups = Stream.of(
{ a: 1, b: 0 },
{ a: 1, b: 1 },
{ a: 2, b: 0 },
{ a: 2, b: 1 },
{ a: 2, b: 2 },
{ a: 1, b: 2 }
)
.groupBy(get('a'));
How would I select the first object where a === 1 and the first two objects where a === 2, and pass any other objects straight through? This seems logical to me:
groups.chain(function(group) {
return group.key === 1 ?
group.take(1) :
group.key === 2 ?
group.take(2) :
group ;
});
But group.key does not exist (and even if it did it would seem a bit... smelly).
groupBy will give you a stream of streams (each value in the stream is itself a stream). Using fold, you can reduce each group (which is a stream) into a single value, applying conditionals as needed. flatMap then merges all of the results back into a single stream. Here is a simple example that processes groups of objects: it groups the objects according to property "a", folds each group into a single object containing type and val properties based on an arithmetic operation, and flattens these final objects into a single stream:
var groupStream = Bacon.fromArray([
{ a: 1, b: 0 },
{ a: 1, b: 1 },
{ a: 2, b: 2 },
{ a: 1, b: 3 }
]);
// -----[stream of objects where a=1]-------[stream of objects where a=2]---->
var groups = groupStream.groupBy(function(k){ return k.a; });
// v is a stream of a=1 objects or a=2 objects
groups.flatMap(function(v) {
  // fold (reduce) over the values in v (a stream)
  return v.fold({ type: '', val: 0 }, function(acc, i) {
    if (i.a == 1) {
      return { type: 'one', val: acc.val + i.b };
    }
    if (i.a == 2) {
      return { type: 'two', val: acc.val + i.b };
    }
  });
}).onValue(function(v) {
  console.log(v);
});
Here is the jsbin: http://jsbin.com/haliqerita/edit?js,console
Hope that helps.
When running the following query using the root directory of my Firebase, extra nodes are added.
Query:
fire.set({
  "users": [
    {
      "0": [
        {
          "email": "foobar#gmail.com",
          "snake": [
            { "highScore": "15" }
          ]
        }
      ]
    }
  ]
});
Result: (screenshot of the database showing extra numeric-index nodes inserted at each level)
I get the same result when formatting the data in a json file and importing it directly using the Firebase web interface. Did I miss something in the documentation perhaps?
The issue here is that any time you use array syntax, i.e. [ ... ], you're creating an "array" in Firebase, which we do by just creating an object with numeric keys (i.e. 0, 1, 2, ...).
So if you do:
ref.set({ a: 5 });
The resulting object will be:
{ a: 5 }
But if you instead do:
ref.set([{a: 5}, {b: 6}]);
You'll get:
{
  '0': { a: 5 },
  '1': { b: 6 }
}
So if you just remove the square brackets from the data you're setting, e.g.:
fire.set({
  "users": {
    "0": {
      "email": "foobar#gmail.com",
      "snake": {
        "highScore": "15"
      }
    }
  }
});
The resulting data in the web interface should match your data exactly.
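The numeric-key shape Firebase stores arrays under can be illustrated in plain JavaScript: spreading an array into an object literal produces exactly that representation, since array indices become string keys:

```javascript
// Spreading an array into an object literal turns the indices into
// numeric string keys, the same shape Firebase stores arrays under.
const asArray  = [{ a: 5 }, { b: 6 }];
const asObject = { ...asArray };

console.log(asObject); // { '0': { a: 5 }, '1': { b: 6 } }
```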