Suppose I have an n-ary tree structure (in JSON) like this:
[
{
"text": "Some title",
"children": [
{
"text": "Some title",
"children": [
...
]
},
...
]
}
]
I know neither how many children the nodes will have nor the tree's depth.
What I would like to do is rename the property text to name across all nodes.
I've tried this, with a recursive function func:
function func(tree) {
if (!tree) return;
for (let node of tree) {
node.name = node.text
delete node.text;
return func(node.children);
}
}
But it didn't work. How would I do that?
I would say the main problem with your code is that the node variable holds the value of the corresponding array item rather than a reference to it, so the mutations you attempt are never applied to the original array, only to a temporary variable that is reassigned on each loop iteration.
If you prefer to mutate the original array and feel comfortable using for-loops for that purpose, you'd be much better off using a for..in loop to access array items by their keys:
const src = [
{
text: "Some title",
children: [
{
text: "Some title",
children: []
},
]
}
],
func = tree => {
for(const nodeIdx in tree){
const {text:name, children} = tree[nodeIdx]
func(children)
tree[nodeIdx] = {name, children}
}
}
func(src)
console.log(src)
However, I would avoid mutating the source data and return a new array instead (e.g. with Array.prototype.map()):
const src = [
{
text: "Some title",
children: [
{
text: "Some title",
children: []
},
]
}
],
func = tree =>
tree.map(({text:name,children}) => ({
name,
...(children && {children: func(children)})
}))
console.log(func(src))
You would use the in operator here.
for (let node in tree) {
node.name = node.text
delete node.text;
return func(node.children);
}
My object is something like:
let items =
[
{
"creationTimeStamp": "2022-05-31T17:04:28.000Z",
"modifiedTimeStamp": "2022-05-31T17:04:28.000Z",
"locations": [
{
"id": "5ao",
"name": "Store1"
}
],
"typeId": "Lead"
}
]
I am trying to push the following object into the locations property:
{
"id": "9a0",
"name": "Store2"
}
I have tried doing
items1 = [];
for (var i = 0; i < items.length; i++) {
items1.id = "9a0";
items1.name = "Store2";
//9 is some static index value added
Object.assign({9 : items1}, items[i].locations);
}
If I do console.log(Object.assign({9 : items1}, items[i].locations)) I can see 2 arrays inside it, but my items' locations property is still the same.
My expectation is as below:
[
{
"creationTimeStamp": "2022-05-31T17:04:28.000Z",
"modifiedTimeStamp": "2022-05-31T17:04:28.000Z",
"locations": [
{
"id": "5ao",
"name": "Store1"
},
{
"id": "9a0",
"name": "Store2"
}
],
"typeId": "Lead"
}
]
I also tried to use items[i].locations.push(items1) but then got:
TypeError: Cannot add property 9, object is not extensible
I also tried to assign a new array to items[i].locations, but then got:
TypeError: Cannot assign to read only property 'locations' of object '#<Object>'
What can I do to get the desired result?
You seem to expect that the second argument given to Object.assign will be mutated. But it is the first argument that is mutated. That means your .locations is not mutated. Moreover, in comments you indicate that locations cannot be extended and that the property is read-only.
So that means you'll need a complete new object.
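As a minimal illustration of how Object.assign behaves (this is standard behaviour, not specific to your data): it copies properties into its first argument and returns that same object, leaving the sources untouched.
const target = { a: 1 };
const source = { b: 2 };
const result = Object.assign(target, source);

console.log(result === target); // true: the first argument was mutated
console.log(target);            // { a: 1, b: 2 }
console.log(source);            // { b: 2 }: sources are never modified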
Some other remarks:
Don't initialise items1 as an array, since it is supposed to be a plain object.
Declare a variable with const, let or var and avoid implicit global declaration.
It is safer to declare the items1 object inside the loop, so you create a new object each time and don't mutate the same object. For your example code it makes no difference, but it can lead to unexpected behaviour.
As you don't need i for anything other than items[i], and you actually need a completely new structure, use .map instead.
So:
items = items.map(item => {
let obj = {
id: "9a0",
name: "Store2"
};
return {...item, locations: item.locations.concat(obj) };
});
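As a quick sanity check, here is a sketch assuming your source objects are frozen (which would explain the "not extensible" and "read only" errors you saw; the frozenItems/updated names are just for illustration). The copy-based approach still works because it never writes to the originals:
const frozenItems = Object.freeze([
  Object.freeze({
    typeId: "Lead",
    locations: Object.freeze([{ id: "5ao", name: "Store1" }])
  })
]);

const updated = frozenItems.map(item => ({
  ...item,
  locations: item.locations.concat({ id: "9a0", name: "Store2" })
}));

console.log(updated[0].locations.length);     // 2
console.log(frozenItems[0].locations.length); // 1: the original stays untouched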
I always think in terms of functions, and of immutability-by-default, so my approach might look like this, with addLocationToAll built atop a simpler addLocation. The code is fairly simple:
const addLocation = (newLoc) => ({locations, ...rest}) =>
({...rest, locations: locations .concat (newLoc)})
const addLocationToAll = (newLoc) => (items) =>
items .map (addLocation (newLoc))
const items = [{creationTimeStamp: "2022-05-31T17:04:28.000Z", modifiedTimeStamp: "2022-05-31T17:04:28.000Z", locations: [{id: "5ao", name: "Store1"}], typeId:"Lead"}]
const newLoc = {id: "9a0", name: "Store2"}
console .log (addLocationToAll (newLoc) (items))
items is an array, so you must access the first position of the array, which is the object in question.
From that object you extract the locations attribute, and since it is an array, you use the push function to insert the new object:
items[0]
// ->
// {
// creationTimeStamp: '2022-05-31T17:04:28.000Z',
// modifiedTimeStamp: '2022-05-31T17:04:28.000Z',
// locations: [ { id: '5ao', name: 'Store1' } ],
// typeId: 'Lead'
// }
I try this:
items[0].locations.push({"id": "9a0", "name": "Store2" })
And now:
items[0]
//->
// {
// creationTimeStamp: '2022-05-31T17:04:28.000Z',
// modifiedTimeStamp: '2022-05-31T17:04:28.000Z',
// locations: [ { id: '5ao', name: 'Store1' }, { id: '9a0', name: 'Store2' }],
// typeId: 'Lead'
// }
I have several arrays of strings that I want to generate an object out of. For example, given:
let graph = {}
let a = ["Vehicle", "Car", "Sport"]
let b = ["Vehicle", "Car", "Van"]
let c = ["Vehicle", "Truck", "4x4"]
I want to make a function that I can pass a into and it would update graph to be:
{
name: "Vehicle",
children: [
{
name: "Car",
children: [
"Sport"
]
}
]
}
I then pass b into the function, and since "Vehicle" > "Car" already exists in graph, it just pushes "Van" into that node's children. Then when c is passed, it pushes a new child onto Vehicle's children. I am having trouble because with a plain loop I can't account for the fact that the input can be of any length (not just 3). How can I loop through the depth of an object like this?
As I said in the comment box, your expected result isn't a valid node tree, because the third nested node should not contain an array. Anyway, here is my answer:
const nodes = new Map([["graph", { children: [], name: 'graph' }]]); // 'graph' is the root node
const entries = [["Vehicle", "Car", "Sport"], ["Vehicle", "Car", "Van"], ["Vehicle", "Truck", "4x4"]];

// Yields one [parentId, nodeId, name, deep] tuple per segment of an entry.
function* createNodeIds(entry, parentId, deep = 0) {
  const name = entry.shift();
  const nodeId = parentId + '.' + name;
  yield [parentId, nodeId, name, ++deep];
  while (entry.length) {
    yield* createNodeIds(entry, nodeId, deep);
  }
}

for (const entry of entries) {
  for (const [parentId, nodeId, name, deep] of createNodeIds(entry, 'graph')) {
    if (!nodes.has(nodeId)) {
      const node = { name, children: [] };
      nodes.set(nodeId, node);
      // Beyond the second level, push just the name string to match your expected output.
      nodes.get(parentId).children.push(deep > 2 ? name : node);
    }
  }
}

console.log(nodes.get('graph.Vehicle'));
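For reference, with the entries above the logged subtree should come out roughly like this (a sketch of the expected shape; leaf names end up as bare strings because of the deep > 2 check):
// {
//   name: 'Vehicle',
//   children: [
//     { name: 'Car', children: ['Sport', 'Van'] },
//     { name: 'Truck', children: ['4x4'] }
//   ]
// }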
I want to loop through 600+ array items in an object and find one particular item based on certain criteria. The array in the object is called "operations" and its items are arrays themselves.
My goal is to get the index of the operations array item which contains the deeply nested string "Go".
In the sample below this would be the first element. My problem is that I can check if an array element contains "call" and "draw" but I don't know how to test for the nested dictionary "foobar". I only have basic JavaScript available, no special libraries.
let json = {
"head": {},
"operations": [
[
"call",
"w40",
"draw",
{
"parent": "w39",
"style": [
"PUSH"
],
"index": 0,
"text": "Modify"
}
],
[
"call",
"w83.gc",
"draw",
{
"foobar": [
["beginPath"],
[
"rect",
0,
0,
245,
80
],
["fill"],
[
"fillText",
"Go",
123,
24
],
[
"drawImage",
"rwt-resources/c8af.png",
]
]
}
],
[
"create",
"w39",
"rwt.widgets.Menu",
{
"parent": "w35",
"style": [
"POP_UP"
]
}
],
[
"call",
"w39",
"draw",
{
"parent": "w35",
"style": [
"POP_UP"
]
}
]
]
};
let index = "";
let operationList = json.operations;
for (i = 0; i < operationList.length; i++) {
if (operationList[i].includes('call') && operationList[i].includes('draw')) //missing another check if the dictionary "foobar" exists in this element )
{
index = i;
}
}
document.write(index)
I'll preface by saying that this data structure is going to be tough to manage in general. I would suggest a scheme where an operation is an object with well-defined properties, rather than just an "array of stuff".
That said, you can use recursion to search the array.
If any value in the array is another array, continue with the next level of recursion
If any value is an object, search its values
// Inline plain-object check (replaces the is-plain-object package, since no libraries are available)
const isPlainObject = (value) =>
  value !== null && typeof value === 'object' && !Array.isArray(value);
const containsTerm = (value, term) => {
// if value is an object, search its values
if (isPlainObject(value)) {
value = Object.values(value);
}
// if value is an array, search within it
if (Array.isArray(value)) {
return value.find((element) => {
return containsTerm(element, term);
});
}
// otherwise, value is a primitive, so check if it matches
return value === term;
};
const index = json.operations.findIndex((operation) => {
return containsTerm(operation, 'Go');
});
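With the sample json from the question used unchanged, the nested "Go" sits in the second operation (the draw call on "w83.gc"), so the lookup is expected to resolve to index 1:
console.log(index); // 1 (zero-based), assuming the sample data above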
I have the JSON below with 4 fields, where the 4th one is a list of JS objects:
[
{
"srNumber": 1,
"Name": "prod name",
"includeInAutoSupplies": true,
"childObjectList": [
{
"cValue": "cValue Name 1",
"Name": "testName1",
},
{
"cValue": "cValue Name 2",
"Name": "testName2",
},
{
"cValue": "cValue Name 3",
"Name": "testName3",
},
]
}
]
It is difficult to explain in words, but I need to convert the above array to the one shown below, where testName1, 2, 3 are the values of the child objects' Name field and cValue Name 1, 2, 3 are the values of cValue in the child object array.
[
{
"srNumber": 1,
"Name": "prod name",
"includeInAutoSupplies": true,
"testName1": "cValue Name 1"
"testName2": "cValue Name 2"
"testName3": "cValue Name 3"
}
]
I cannot change the JSON structure, as this is how I am getting it, and I need to turn it into a single array of flat objects (instead of an array with a nested array property) so that I can export it to a .csv file.
Functional approach:
const transform = list => list.map(
entry => Object.fromEntries([
...Object.entries(entry).filter(([k]) => k !== 'childObjectList'),
...entry.childObjectList.map(({ Name, cValue }) => [Name, cValue])
])
)
Explanation: Object.entries returns an array [[k1, v1], [k2, v2]] from { k1: v1, k2: v2 } and Object.fromEntries does the opposite. This way, we map all entries in your list to their transformed selves, consisting of a combination of all existing properties except childObjectList and the contents of childObjectList converted to separate properties.
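As a tiny illustration of that round trip (this is just the standard Object.entries/Object.fromEntries behaviour, not specific to your data):
const obj = { k1: "v1", k2: "v2" };
const pairs = Object.entries(obj);       // [["k1", "v1"], ["k2", "v2"]]
console.log(Object.fromEntries(pairs));  // { k1: "v1", k2: "v2" }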
Imperative approach:
function transform (list) {
for (const entry of list) {
for (const { Name, cValue } of entry.childObjectList) {
entry[Name] = cValue
}
delete entry.childObjectList
}
return list
}
Note that this one mutates list and its children. If this is undesired, you could change it like this:
function transform (list) {
const newList = []
for (const { ...entry } of list) {
for (const { Name, cValue } of entry.childObjectList) {
entry[Name] = cValue
}
delete entry.childObjectList
newList.push(entry)
}
return newList
}
Explanation: We simply loop over the elements in your list and modify them (or create modified copies) in such a way that the childObjectList property is removed and instead each child is added as separate property.
In all three cases, if you then call transform(theArray), you get your desired result.
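For example, a usage sketch with the sample data from the question (the input variable name is just for illustration):
const input = [
  {
    srNumber: 1,
    Name: "prod name",
    includeInAutoSupplies: true,
    childObjectList: [
      { cValue: "cValue Name 1", Name: "testName1" },
      { cValue: "cValue Name 2", Name: "testName2" },
      { cValue: "cValue Name 3", Name: "testName3" }
    ]
  }
];

console.log(transform(input));
// [ { srNumber: 1, Name: "prod name", includeInAutoSupplies: true,
//     testName1: "cValue Name 1", testName2: "cValue Name 2", testName3: "cValue Name 3" } ]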
I have an array of objects like this:
[
{ name: "Group 1", value: "Foo" },
{ name: "Group 2", value: "Bar" },
{ name: "Group 1", value: "Baz" }
]
I'd like to use the Partial Lenses library to transform these groups into keys of an object holding the corresponding group's items, like this:
{
"Group 1": [
{ name: "Group 1", value: "Foo" },
{ name: "Group 1", value: "Baz" }
],
"Group 2": [
{ name: "Group 2", value: "Bar" }
]
}
My current approach is like this, assuming I have the source data in a variable called data:
const grouped = L.collect([L.groupBy('name'), L.entries], data)
const setKey = [L.elems, 0]
const getName = [L.elems, 1, 0, 'name']
const correctPairs = L.disperse(setKey, L.collectTotal(getName, grouped), grouped)
L.get(L.inverse(L.keyed), correctPairs)
I don't like that I need to use the grouped and correctPairs variables to hold data, as I probably should be able to do the transformation directly in the composition. Could you help me to compose the same functionality in a more meaningful way?
Here's a Partial Lenses Playground with the above code.
I assume the goal is to actually create an isomorphism through which one can view such an array as an object of arrays and also perform updates, like a bidirectional version of e.g. Ramda's R.groupBy function.
Indeed, one approach would be to just use Ramda's R.groupBy to implement a new primitive isomorphism using L.iso. Something like this:
const objectBy = keyL => L.iso(
R.cond([[R.is(Array), R.groupBy(L.get(keyL))]]),
R.cond([[R.is(Object), L.collect([L.values, L.elems])]])
)
The conditionals are needed to allow for the possibility that the data is not of the expected type and to map the result to undefined in case it isn't.
Here is a playground with the above Ramda-based objectBy implementation.
Using only the current version of Partial Lenses, one way to compose a similar objectBy combinator would be as follows:
const objectBy = keyL => [
L.groupBy(keyL),
L.array(L.unzipWith1(L.iso(x => [L.get(keyL, x), x], L.get(1)))),
L.inverse(L.keyed)
]
Perhaps the interesting part in the above is the middle part, which converts an array of arrays into an array of key-array pairs (or the other way around). L.unzipWith1 checks that all the keys within a group match, and if they don't, that group will be mapped to undefined and filtered out by L.array. If desired, it is possible to get stricter behaviour by using L.arrays.
Here is a playground with the above composed objectBy implementation.
You don't need any library. Use a generic function that returns a reducer; that way you can group any collection by any key. In the example below I used this to group by name, but also by value.
const groupBy = key => (result,current) => {
let item = Object.assign({},current);
// optional
// delete item[key];
if (typeof result[current[key]] == 'undefined'){
result[current[key]] = [item];
}else{
result[current[key]].push(item);
}
return result;
};
const data = [{ name: "Group 1", value: "Foo" },{ name: "Group 2", value: "Bar" },{ name: "Group 1", value: "Baz" }];
const grouped = data.reduce(groupBy('name'),{});
console.log(grouped);
const groupedByValue = data.reduce(groupBy('value'),{});
console.log(groupedByValue);
You can use Array.reduce
let arr = [{ name: "Group 1", value: "Foo" },{ name: "Group 2", value: "Bar" },{ name: "Group 1", value: "Baz" }];
let obj = arr.reduce((a,c) => Object.assign(a, {[c.name]: (a[c.name] || []).concat(c)}), {});
console.log(obj);
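For reference, the logged obj should match the desired shape from the question:
// {
//   "Group 1": [ { name: "Group 1", value: "Foo" }, { name: "Group 1", value: "Baz" } ],
//   "Group 2": [ { name: "Group 2", value: "Bar" } ]
// }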