I am trying to find out the best / most efficient or most functional way to compare / merge / manipulate two arrays (lists) simultaneously in JS.
The example I give below is a simple example of the overall concept. In my current project, I deal with some very crazy list mapping, filtering, etc. with very large lists of objects.
As delineated below, my first idea (version 1) for comparing lists would be to run through the first list (i.e. map), and in the anonymous/callback function, filter the second list to meet the criteria needed for the comparison (matching ids, for example). This obviously works, as per version 1 below.
I had a question performance-wise: with this method, on every iteration/call of map, the entire second list gets filtered just to find the one item that matches.
Also, the filter passes over every other item in list2, items which will eventually be matched to something in list1 anyway. Meaning (since that sentence probably did not make sense):
list1.map    list2.filter
id:1         [id:3, id:2, id:1]  <- matches the third item the filter checks
id:2         [id:3, id:2, id:1]  <- matches the second item
id:3         [id:3, id:2, id:1]  <- matches the first item
Ideally, on the first iteration of map (list1 id:1), when the filter encounters list2 id:3 (its first item), it would just match it to list1 id:3 right then.
Thinking with the above concept (matching to a later id when it is encountered earlier), I came up with version 2.
This makes list2 into a dictionary, and then looks up the value in any sequence by key.
const list1 = [
{id: '1',init:'init1'},
{id: '2',init:'init2'},
{id: '3',init:'init3'}
];
const list2 = [
{id: '2',data:'data2'},
{id: '3',data:'data3'},
{id: '4',data:'data4'}
];
/* ---------
* version 1
*/
const mergedV1 = list1.map(n => (
{...n,...list2.filter(f => f.id===n.id)[0]}
));
/* [
{"id": "1", "init": "init1"},
{"id": "2", "init": "init2", "data": "data2"},
{"id": "3", "init": "init3", "data": "data3"}
] */
/* ---------
* version 2
*/
const dictList2 = list2.reduce((dict,item) => (dict[item.id]=item,dict),{});
// does not handle duplicate ids but I think that's
// outside the context of this question.
const mergedV2 = list1.map(n => ({...n,...dictList2[n.id]}));
/* [
{"id": "1", "init": "init1"},
{"id": "2", "init": "init2", "data": "data2"},
{"id": "3", "init": "init3", "data": "data3"}
] */
JSON.stringify(mergedV1) === JSON.stringify(mergedV2);
// true
// and just for fun
const sqlLeftOuterJoinInJS = list1 => list2 => on => {
  const dict = list2.reduce((dict, item) => (
    dict[item[on]] = item, dict
  ), {});
  return list1.map(n => ({ ...n, ...dict[n[on]] }));
};
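Just as a usage sketch (not part of the original snippet), calling the curried helper with the same lists would look like:
const mergedV3 = sqlLeftOuterJoinInJS(list1)(list2)('id');
// same result shape as mergedV1 / mergedV2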
Obviously the above examples are pretty simple (merging two lists, each list having a length of 3). There are more complex instances that I am working with.
I don't know if there are some smarter (and ideally functional) techniques out there that I should be using.
You could take a closure over the wanted key for the group and a Map for collecting all objects.
function merge(key) {
var map = new Map;
return function (r, a) {
a.forEach(o => {
if (!map.has(o[key])) r.push(map.set(o[key], {}).get(o[key]));
Object.assign(map.get(o[key]), o);
});
return r;
};
}
const
list1 = [{ id: '1', init: 'init1' }, { id: '2', init: 'init2' }, { id: '3', init: 'init3' }],
list2 = [{ id: '2', data: 'data2' }, { id: '3', data: 'data3' }, { id: '4', data: 'data4' }],
result = [list1, list2].reduce(merge('id'), []);
console.log(result);
Using filter for search is a misstep. Your instinct in version 2 is much better. Map and Set provide much faster lookup times.
Here's a decomposed approach. It should be pretty fast, but maybe not as fast as Nina's. She is a speed demon >_<
const merge = (...lists) =>
Array .from
( lists
.reduce (merge1, new Map)
.values ()
)
const merge1 = (cache, list) =>
list .reduce
( (cache, l) =>
cache .has (l.id)
? update (cache, l.id, l)
: insert (cache, l.id, l)
, cache
)
const insert = (cache, key, value) =>
cache .set (key, value)
const update = (cache, key, value) =>
cache .set
( key
, { ...cache .get (key)
, ...value
}
)
const list1 =
[{ id: '1', init: 'init1' }, { id: '2', init: 'init2' }, { id: '3', init: 'init3' }]
const list2 =
[{ id: '2', data: 'data2' }, { id: '3', data: 'data3' }, { id: '4', data: 'data4' }]
console .log (merge (list1, list2))
I'm offering this for completeness, as I think Nina and @user633183 have likely offered more efficient solutions.
If you wish to stick with your initial filter example, which is at most N*M lookups, and your arrays are mutable, you could consider shrinking the search set as you traverse through it. In the old days, shrinking the array had a huge impact on performance.
The general pattern today is to use a Map (or dict) as indicated in other answers, as it is both easy to understand and generally efficient.
Find and Resize
const list1 = [
{id: '1',init:'init1'},
{id: '2',init:'init2'},
{id: '3',init:'init3'}
];
const list2 = [
{id: '2',data:'data2'},
{id: '3',data:'data3'},
{id: '4',data:'data4'}
];
// combine by ID
let merged = list1.reduce((acc, obj)=>{
acc.push(obj);
// find index by ID
let foundIdx = list2.findIndex( el => el.id==obj.id );
// if found, store and remove from search
if ( foundIdx >= 0 ){
obj.data = list2[foundIdx].data;
list2.splice( foundIdx, 1 ); // shrink lookup array
}
return acc;
},[]);
// store remaining (if you want); i.e. {id:4,data:'data4'}
merged = merged.concat(list2)
console.log(merged);
I'm not sure whether I should mark this question as a duplicate because you phrased it differently. Anyway, here's my answer to that question copied verbatim. What you want is an equijoin:
const equijoin = (xs, ys, primary, foreign, sel) => {
const ix = xs.reduce((ix, row) => // loop through m items
ix.set(row[primary], row), // populate index for primary table
new Map); // create an index for primary table
return ys.map(row => // loop through n items
sel(ix.get(row[foreign]), // get corresponding row from primary
row)); // select only the columns you need
};
You can use it as follows:
const equijoin = (xs, ys, primary, foreign, sel) => {
const ix = xs.reduce((ix, row) => ix.set(row[primary], row), new Map);
return ys.map(row => sel(ix.get(row[foreign]), row));
};
const list1 = [
{ id: "1", init: "init1" },
{ id: "2", init: "init2" },
{ id: "3", init: "init3" }
];
const list2 = [
{ id: "2", data: "data2" },
{ id: "3", data: "data3" },
{ id: "4", data: "data4" }
];
const result = equijoin(list2, list1, "id", "id",
(row2, row1) => ({ ...row1, ...row2 }));
console.log(result);
It takes O(m + n) time to compute the answer using equijoin. However, if you already have an index then it'll only take O(n) time. Hence, if you plan to do multiple equijoins using the same tables then it might be worthwhile to abstract out the index.
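For illustration only (createIndex and joinWithIndex are made-up names, not part of the snippet above), abstracting out the index could look something like this:
const createIndex = (xs, primary) =>
  xs.reduce((ix, row) => ix.set(row[primary], row), new Map);
const joinWithIndex = (ix, ys, foreign, sel) =>
  ys.map(row => sel(ix.get(row[foreign]), row));
// build the index over list2 once, then reuse it for several joins
const ix = createIndex(list2, "id");
const joined = joinWithIndex(ix, list1, "id", (row2, row1) => ({ ...row1, ...row2 }));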
Related: What is the most efficient way to group by multiple object properties? It groups by category and then by group.
I have detailed my implementation below, using a single array reduce, which I then want to push into an array (see the expected output). I am not sure how to conditionally push to the sources array which is created within the reduce.
This should work without lodash groupBy or any other third-party library. I was also thinking of filtering by a key I create, e.g.
const key = `${category}-${group}`;
const items = [
{
"name": "Halloumi",
"group": "Cheese",
"category": "Dairy"
},
{
"name": "Mozzarella",
"group": "Cheese",
"category": "Dairy"
}
];
// my current implementation
const groupedItems = items.reduce((map, item) => {
const { category, name, group } = item;
if (map.has(category)) {
map.get(category).push({
name,
group,
});
} else {
map.set(category, [
{
name,
group,
},
]);
}
return map;
}, new Map());
console.log(Object.fromEntries(groupedItems));
// Another attempt
const groupedItems2 = items.reduce((map, item) => {
const { category, group, name } = item;
acc[category] = acc[category] || { category, themes: [] };
const source = {
id,
name
};
const totalSources = [source].length;
const uniqueSources = [...new Set([source])].length;
acc[category].themes.push({
theme,
totalSources,
uniqueSources,
sources: [source],
});
return acc;
return map;
}, {});
console.log(Object.entries(groupedItems2));
// expected output
[
{
"category": "Dairy",
"groups": [
{
"group": "Cheese",
"totalSources": 2,
"sources": [
{
"name": "Halloumi"
},
{
"name": "Mozzarella"
}
]
}
]
}
]
You need a two-level map.
There are many ways to do that. Here is one:
const items = [{"name": "Halloumi","group": "Cheese","category": "Dairy"},{"name": "Mozzarella","group": "Cheese","category": "Dairy"}];
const dict = {};
for (const {name, group, category} of items) {
((dict[category] ??= {})[group] ??= []).push({name});
}
const categories = Object.entries(dict).map(([category, groups]) => ({
category,
groups: Object.entries(groups).map(([group, sources]) => ({
group,
totalSources: sources.length,
sources
}))
}));
console.log(categories);
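As a side note, if the logical nullish assignment operator (??=) is not available in your environment, that push line can be written out longhand, for example:
// equivalent without ??= (pre-ES2021 environments)
if (!dict[category]) dict[category] = {};
if (!dict[category][group]) dict[category][group] = [];
dict[category][group].push({ name });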
I was interested in a more generic approach to this problem. It might or might not be useful for you, but I think it offers another way to think about such problems. The idea is that we have a function which takes a configuration like this:
const config = {
prop: 'category',
childName: 'groups',
children: {
prop: 'group',
childName: 'sources',
totalName: 'totalSources',
children: {prop: 'name'}
}
}
It returns a new function which takes an array of flat objects and nests them, giving totals, and changing property names as necessary. Here is an implementation, tested only on this problem (with substantially more data added to test various scenarios):
const regroup = ({prop, newName = prop, childName, totalName, children}) => (xs) =>
childName && children
? Object .entries (groupBy (x => x [prop]) (xs))
.map (([k, vs]) => [k, vs .map (omit ([prop]))])
.map (([k, vs, kids = regroup (children) (vs)]) => ({
[newName]: k,
... (totalName ? {[totalName]: kids .length} : {}),
[childName]: kids
})
)
: prop && newName
? xs .map (({[prop]: n, ...rest}) => ({[newName]: n, ...rest}))
: [ ...xs]
const groupBy = (fn, k) => (xs) => xs .reduce (
(a, x) => ((k = fn (x)), (a [k] = a [k] || []), (a [k] .push (x)), a)
, {}
)
const omit = (names) => (o) => Object .fromEntries (
Object .entries (o) .filter (([k, v]) => ! names.includes (k))
)
const config = {
prop: 'category',
childName: 'groups',
children: {
prop: 'group',
childName: 'sources',
totalName: 'totalSources',
children: {prop: 'name'}
}
}
const groupFoods = regroup (config)
const items = [{name: "Halloumi", group: "Cheese", category: "Dairy"}, {name: "Mozzarella", group: "Cheese", category: "Dairy"}, {name: "Whole", group: "Milk", category: "Dairy"}, {name: "Skim", group: "Milk", category: "Dairy"}, {name: "Potatos", group: "Root", category: "Vegetable"}, {name: "Turnips", group: "Root", category: "Vegetable"}, {name: "Strawberry", group: "Berry", category: "Fruit"}, {name: "Baguette", group: "Yeast", category: "Bread"}]
console .log (JSON.stringify (groupFoods (items), null, 4))
Note the two helper functions. omit clones an object, removing certain property names. So, for instance, omit (['age', 'id']) ({id: 101, first: 'Fred', last: 'Flintstone', age: 27}) returns {first: 'Fred', last: 'Flintstone'}. groupBy accepts a function which returns a key for a value, and returns a function which takes an array of values and returns an object with the keys found, associated with an array of the values that generate each key. For instance, groupBy (n => n % 10) ([1, 3, 4, 21, 501, 23, 43, 25, 64]) //=> {"1": [1, 21, 501], "3": [3, 23, 43], "4": [4, 64], "5": [25]}. These are genuinely reusable functions that you might keep in your personal utility library.
The main function simply builds our output using the configuration values supplied. If we need to nest, then we recur on the children node of the configuration.
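As a purely speculative example of that reuse (not from the original answer), a single-level grouping by category alone would only need a shallower config:
const byCategory = regroup ({
  prop: 'category',
  childName: 'items',
  totalName: 'totalItems',
  children: {prop: 'name'}
})
console .log (byCategory (items))
// e.g. [{category: 'Dairy', totalItems: 4, items: [{name: 'Halloumi', group: 'Cheese'}, ...]}, ...]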
I have an array of objects. Each object can also contain an array of objects, and so on to an arbitrary depth.
var myArray = [
  {
    id: 'foo',
    items: []
  },
  {
    id: 'bar',
    items: [
      {
        id: 'blah',
        items: [...etc...]
      }
    ]
  }
]
I'd like to read, add, and remove objects in the nested arrays using an array of indices.
So a function to remove the value myArray[1][3][2] from myArray would be called with this array of indexes as a parameter: [1, 3, 2]
I've found that you can use reduce() to return a value like so:
indices.reduce((acc, cur) => Array.isArray(acc) ? acc[cur] : acc.items[cur], myArray)
but cannot work out how to remove or add a value using the same idea. Any help is much appreciated.
You could create a function which takes similar arguments as the splice function. Pass the nested array, the indices path, the total number of items to be deleted and collect all the new items to be added at the end using rest parameters.
function deepSplice(array, indices, deleteCount, ...toBeInserted) {
const last = indices.pop();
const finalItems = indices.reduce((acc, i) => acc[i].items, array);
finalItems.splice(last, deleteCount, ...toBeInserted);
return array
}
Remove the last index from the indices array.
reduce the indices array to get the nested items array in every loop to get the final items array you want to do the insert/delete operation on.
Use splice on the last index to insert/delete based on the argument passed.
If you just want to insert, pass deleteCount = 0. And if you just want to remove, skip the last argument.
Here's a snippet:
const myArray = [
{ id: "0", items: [] },
{
id: "1",
items: [
{
id: "1.0",
items: [
{ id: "1.0.0", items: [] },
{ id: "1.0.1", items: [] }]
},
{ id: "1.1", items: [] }
]
}
];
function deepSplice(array, indices, deleteCount, ...toBeInserted) {
const last = indices.pop();
const finalItems = indices.reduce((acc, i) => acc[i].items, array);
finalItems.splice(last, deleteCount, ...toBeInserted);
return array
}
console.log(
// removes "1.0.1" item and inserts a new object there
deepSplice(myArray, [1,0,1], 1, { id: 'newlyInserted'})
)
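As mentioned above, insert-only and delete-only calls only differ in the deleteCount and rest arguments; a couple of illustrative calls against the same myArray shape:
// insert-only: deleteCount of 0, nothing is removed
deepSplice(myArray, [0, 0], 0, { id: '0.0', items: [] });
// delete-only: omit the rest arguments, nothing is inserted
deepSplice(myArray, [1, 1], 1);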
The easiest way is to reduce over all of the indices except the last one, keeping that last index for whatever operation you want.
For deleting, you need this last index to splice the array, and the same goes for updating.
In this case, you can return the parent object, by using an object with an items property (wrapping the given array) as the start value for reducing.
This gives access to the parent object, whose items array can then be used for any further operation.
const lastIndex = indices.pop();
const parent = indices.reduce((r, index) => r.items[index], { items: myArray });
// further use
parent.items.splice(lastIndex, 1); // delete
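The same parent reference works for reading and inserting as well, for example (newObject being a placeholder):
parent.items[lastIndex];                      // read
parent.items.splice(lastIndex, 0, newObject); // insert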
Here is a solution using object-scan.
The main advantage of using object-scan is that you get more control to adjust your functions (e.g. if you wanted to get multiple entries or do fuzzy key matching etc). However there is increased complexity, so it's a trade-off and depends on your requirements.
// const objectScan = require('object-scan');
const myArray = [{ id: '0', items: [] }, { id: '1', items: [ { id: '1.0', items: [{ id: '1.0.0' }, { id: '1.0.1' }] }, { id: '1.1', items: [] } ] }];
const indicesToNeedle = (indices) => indices.map((idx) => `[${idx}]`).join('.items');
const get = (array, indices) => objectScan(
[indicesToNeedle(indices)],
{ abort: true, rtn: 'value' }
)(array);
const splice = (array, indices, deleteCount, ...toBeInserted) => objectScan(
[indicesToNeedle(indices)],
{
abort: true,
rtn: 'bool', // returns true iff spliced
filterFn: ({ parent, property }) => {
parent.splice(property, deleteCount, ...toBeInserted);
return true;
}
}
)(array);
console.log(get(myArray, [1, 0, 1]));
// => { id: '1.0.1' }
console.log(get(myArray, [1, 0, 2]));
// => undefined
// removes "1.0.1" item and inserts two objects there
console.log(splice(myArray, [1, 0, 1], 1, { id: '1.0.1-new' }, { id: '1.0.2-new' }));
// => true
console.log(myArray);
// => [ { id: '0', items: [] }, { id: '1', items: [ { id: '1.0', items: [ { id: '1.0.0' }, { id: '1.0.1-new' }, { id: '1.0.2-new' } ] }, { id: '1.1', items: [] } ] } ]
<script src="https://bundle.run/object-scan#13.8.0"></script>
Disclaimer: I'm the author of object-scan
// This is a large array of objects, e.g.:
let totalArray = [
  {
    "id": "rec01dTDP9T4ZtHL4",
    "fields": { "user_id": 170180717, "user_name": "abcdefg", "event_id": 516575 }
  }
];
let uniqueArray = [];
let dupeArray = [];
let itemIndex = 0
totalArray.forEach(x => {
if(!uniqueArray.some(y => JSON.stringify(y) === JSON.stringify(x))){
uniqueArray.push(x)
} else(dupeArray.push(x))
})
node.warn(totalArray);
node.warn(uniqueArray);
node.warn(dupeArray);
return msg;
I need my code to identify duplicates in the array by the user_id key inside the objects. Right now, my code works to identify identical objects in the array, but I need it to identify dupes based on a key value inside the objects instead. How do I do this? I am struggling to figure out how to point the forEach loop at the key value instead of the entire object.
Right now, my code works to identify identical objects in the array, but I need it to identify dupes based on a key value inside the objects instead. How do I do this?
Don’t compare the JSON representation of the whole objects then, but only their user_id property specifically.
totalArray.forEach(x => {
if(!uniqueArray.some(y => y.fields.user_id === x.fields.user_id)){
uniqueArray.push(x)
} else(dupeArray.push(x))
})
You could take a Set and push to either uniques or duplicates.
var array = [
{ id: 1, data: 0 },
{ id: 2, data: 1 },
{ id: 2, data: 2 },
{ id: 3, data: 3 },
{ id: 3, data: 4 },
{ id: 3, data: 5 },
],
uniques = [],
duplicates = [];
array.forEach(
(s => o => s.has(o.id) ? duplicates.push(o) : (s.add(o.id), uniques.push(o)))
(new Set)
);
console.log(uniques);
console.log(duplicates);
One way is to keep a list of ids you found so far and act accordingly:
const totalArray = [
{ id: 1, val: 10 },
{ id: 2, val: 20 },
{ id: 3, val: 30 },
{ id: 2, val: 15 },
{ id: 1, val: 50 }
]
const uniqueArray = []
const dupeArray = []
const ids = {}
totalArray.forEach( x => {
if (ids[x.id]) {
dupeArray.push(x)
} else {
uniqueArray.push(x)
ids[x.id] = true
}
})
for (const obj of uniqueArray) console.log("unique:",JSON.stringify(obj))
for (const obj of dupeArray) console.log("dupes: ",JSON.stringify(obj))
I'm trying to query an array of objects in JavaScript, and return objects that match a specific filter criteria.
I've managed - thanks to help from others - to filter a simple object, but now I need to apply the same thing to a more complex object.
// Simple object query:
var recipes = {
'soup': {'ingredients': ['carrot', 'pepper', 'tomato']},
'pie': {'ingredients': ['carrot', 'steak', 'potato']},
'stew': {'ingredients': ['steak', 'pepper', 'tomato']}
};
var shoppingList = ['carrot', 'steak', 'tomato', 'pepper']
var result = Object.entries(recipes)//1. get the key-value pairs
.filter(([key, {ingredients}]) => ingredients.every(t => shoppingList.includes(t))) //2. filter them
.map(([key]) => key) //3. get the keys only
console.log(result);
// More complex object:
var itemsTest = [
{
uid: 1,
items: [
{ item: { uid: "a" } },
{ item: { uid: "b" } },
{ item: { uid: "g" } }
]
},
{
uid: 2,
items: [
{ item: { uid: "b" } },
{ item: { uid: "q" } },
{ item: { uid: "f" } }
]
}
];
var filter = ["b", "q", "f"]
// Expect filter to return {uid: 2, items}
The recipes filter works great. But now that I have a more complex array of objects, it seems the same approach isn't possible.
I want to filter itemsTest according to the uid of each item in the items array. I'd be happy to use lodash, if it makes life easier.
I tried to flatten the array of objects using Object.entries(), to no avail.
var flattened = objectMap(itemsList, function(value) {
return Object.entries(value);
});
var result = flattened.filter(([key, { uid }]) =>
uid.every(t => filter.includes(t))
)
I also tried a simplified approach, filtering on one value using Array.prototype.filter(), which doesn't work either:
var newArray = flattened.filter(function(el) {
return el.uid <= 2
});
console.log(newArray)
Any help understanding how to navigate an object like this would be great.
You can use Array.find() (or Array.filter()) to iterate the array of objects. Then use Array.every(), and for each item check if its uid is included in the filter array.
const itemsTest = [{"uid":1,"items":[{"item":{"uid":"a"}},{"item":{"uid":"b"}},{"item":{"uid":"g"}}]},{"uid":2,"items":[{"item":{"uid":"b"}},{"item":{"uid":"q"}},{"item":{"uid":"f"}}]}];
const filter = ["b", "q", "f"];
const result = itemsTest.find(({ items }) => // use filter instead of find to get multiple items
items.every(o => filter.includes(o.item.uid))
);
console.log(result);
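For completeness, the filter variant mentioned in the comment above (returning every matching object instead of only the first) is only a small change:
const allMatches = itemsTest.filter(({ items }) =>
  items.every(o => filter.includes(o.item.uid))
);
console.log(allMatches); // [{ uid: 2, items: [...] }]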
I have 2 array of objects:
const arr1 = [{'id':'1', 'value':'yes'}, {'id':'2', 'value':'no'}];
const arr2 = [{'id':'2', 'value':'yes'}];
So, if I try and merge these 2 arrays the result should be:
arrTemp = [{'id':'1', 'value':'yes'}, {'id':'2', 'value':'yes'}];
Basically, it should work similar to Object.assign(), but no matter what I try it does not work. Could anyone please help me in this ?
I modified the data structure. Is it possible to merge them now and get the output?
Thanks
This is how you can get the job done with ES6 spread, reduce and Object.values.
const arr1 = [{
'id': '1',
'value': 'yes'
}, {
'id': '2',
'value': 'no'
}];
const arr2 = [{
'id': '2',
'value': 'yes'
}];
const result = Object.values([...arr1, ...arr2].reduce((result, {
id,
...rest
}) => {
result[id] = {
...(result[id] || {}),
id,
...rest
};
return result;
}, {}));
console.log(result);
const result = Object.entries(Object.assign({}, ...arr1,...arr2)).map(([key, value]) => ({[key]:value}));
You could spread (...) the arrays into one resulting object (via Object.assign) and then map its entries to an array again.
You could work with a valid ES6 data structure like a map for example:
const map1 = { 1: { string: 'yes' }, 2: { string: 'no' } }
const map2 = { 2: { string: 'yes' }, 3: { string: 'no' } }
const merged = { ...map1, ...map2 }
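With those example objects, the spread produces:
// merged is { 1: { string: 'yes' }, 2: { string: 'yes' }, 3: { string: 'no' } }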
This overrides the entries of the first object with those of the second one, or simply combines them where the keys don't collide.
Just try it out in your browser; it's a lot easier and performs better, since you never have to use something like findById(), which is an expensive operation.
In JavaScript, arrays are simply objects indexed by numbers starting from 0.
So when you use Object.assign on arr1 and arr2, you will override the first item in arr1 with the first item in arr2, because they are both indexed under the key 0.
your result will be:
[
{ '2': 'yes' },
{ '2': 'no' }
]
(or in object syntax:)
{
0: { '2': 'yes' },
1: { '2': 'no' }
}
Instead of using arrays, you could create an object indexed by the number string (which is how you seem to be thinking of the array in any case).
So you could change your original data structure to make the job easier:
const arr1 = {
'1': 'yes',
'2': 'no'
};
const arr2 = {
'2': 'yes'
};
const result = Object.assign(arr1, arr2);
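Logging it should then show the merged object:
console.log(result); // { '1': 'yes', '2': 'yes' }
Note that Object.assign(arr1, arr2) also mutates arr1; Object.assign({}, arr1, arr2) would leave the inputs untouched.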
You could take a Map as a reference to the newly assigned objects in the result array: first build a new array with copies of the objects, and then iterate the second array and update the objects that share the same key.
var array1 = [{ 1: 'yes' }, { 2: 'no' }],
array2 = [{ 2: 'yes' }],
getKey = o => Object.keys(o)[0],
map = new Map,
result = array1.map(o => (k => map.set(k, Object.assign({}, o)).get(k))(getKey(o)));
array2.forEach(o => Object.assign(map.get(getKey(o)), o));
console.log(result);
Array reduce could come in handy in this case. See the example below:
[...arr1, ...arr2].reduce((acc, item) => {
  const updated = acc.find(a => a.id === item.id)
  if (!updated) {
    acc.push(item)
  } else {
    const index = acc.indexOf(updated)
    acc[index] = { ...acc[index], ...item } // later entries override earlier ones
  }
  return acc
}, [])
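A possible usage sketch (mergeById is just an illustrative wrapper name), with the arr1/arr2 from the question:
const mergeById = (a, b) => [...a, ...b].reduce((acc, item) => {
  const updated = acc.find(x => x.id === item.id)
  if (!updated) {
    acc.push(item)
  } else {
    const index = acc.indexOf(updated)
    acc[index] = { ...acc[index], ...item }
  }
  return acc
}, [])
console.log(mergeById(arr1, arr2))
// [{ id: '1', value: 'yes' }, { id: '2', value: 'yes' }]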
A simple way to append one array of objects onto an existing one:
const arr1 = [{ "name":"John", "age":30, "car":"toyata" }];
const arr2 = [{ "name":"Ales", "age":40, "car":"Nissan" }];
Array.prototype.push.apply(arr1, arr2);
Result:
console.log(arr1)
// [{ name: 'John', age: 30, car: 'toyata' }, { name: 'Ales', age: 40, car: 'Nissan' }]
For anyone finding this answer at a later point in time: there are a couple of ways you might want this to behave exactly, but you could filter out of the first array all elements that are overridden by the second array, and then combine the result with the second array.
const arr3 = [...arr1.filter(item1 => !arr2.find(item2 => item1.id === item2.id)), ...arr2]
Alternatively, you could update the elements in the first array, and then filter them from the second array instead.
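A rough sketch of that alternative (update matches in arr1 first, then append only the unmatched items from arr2):
const updated = arr1.map(item1 => {
  const match = arr2.find(item2 => item2.id === item1.id);
  return match ? { ...item1, ...match } : item1;
});
const arr3 = [...updated, ...arr2.filter(item2 => !arr1.some(item1 => item1.id === item2.id))];
// [{ id: '1', value: 'yes' }, { id: '2', value: 'yes' }]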
You cannot use Array.prototype.map here, because arr1 and arr2 both contain an entry under the same key '2'.
You should use something like this:
for (var i = 0, l = arr1.length; i < l; i++) {
var key = Object.keys(arr1[i]);
if (!arr2[key]) { arr2[key] = []; }
arr2[key].push(arr1[i][key]);
}
Regards