I have an array of objects which will serve as the columns for a table, each with a unique key
const list = [{
key: "Name",
textColor: "red"
},{
key: "Age",
textColor: "green"
},{
key: "Occupation",
textColor: "yellow"
}]
And I have a list specifying the order of the columns in the table
const newOrder = ["Occupation", "Name", "Age"]
Now, how can I rearrange the list according to newOrder without using nested loops? Also, these are all dynamic, so it's not just about the three columns mentioned above.
Expected Output
const list = [{
key: "Occupation",
textColor: "yellow"
},{
key: "Name",
textColor: "red"
},{
key: "Age",
textColor: "green"
}]
Your list can be reformatted to a regular JavaScript object in which key is the property name and textColor is the value:
const toObject = kvps => Object.fromEntries(kvps.map(kvp => [ kvp.key, kvp.textColor ]));
With a given array of keys, you can pick values from that object like so:
const fromObject = (order, obj) => order.map(key => ({ key, textColor: obj[key] }));
Chain the two together, and you can reorder any list of key value pairs:
const list = [{
key: "Name",
textColor: "red"
},{
key: "Age",
textColor: "green"
},{
key: "Occupation",
textColor: "yellow"
}]
const toObject = kvps => Object.fromEntries(kvps.map(kvp => [ kvp.key, kvp.textColor ]));
const fromObject = (order, obj) => order.map(key => ({ key, textColor: obj[key] }));
const reorder = (order, kvps) => fromObject(order, toObject(kvps));
const newList = reorder(["Occupation", "Name", "Age"], list);
console.log(
newList
)
Edit: if the sizes of your list and order arrays are small, you probably want to go with the much easier-to-read approach suggested by Jon Webb in one of the other answers. 🙂 I tried to keep my solution at O(n + m) complexity rather than O(n * m) (n = list size, m = order size), but it's probably not worth the added complexity.
You can iterate over the "order" list, finding the corresponding item in the original list and pushing it to a new list in that order:
const orderList = (list, order) => {
const newList = [];
for (const key of order) {
const item = list.find((obj) => obj.key === key);
if (item) newList.push(item);
}
return newList;
}
You can use the sort method on the array. The sort method sorts in place, so you should copy your array if you don't want to mutate the original.
The array sort method takes a compare function that receives two elements for comparison, a and b. It should return a number, and the elements will be sorted depending on that number:
If > 0 then b is before a
If < 0 then b is after a
If 0 then keep as is
By using indexOf on the newOrder array, you can get the index of each key. An index of 0 should come before an index of 1, which should come before an index of 2, and so on. So if the index of a.key is 2 and the index of b.key is 0, we should return a value greater than 0, since b should come before a.
In my implementation below I'm cloning the original list ([...list]) so as not to mutate it accidentally. You could just as well do list.sort(...) if you don't need or care about mutating.
const list = [{
key: "Name",
textColor: "red"
},{
key: "Age",
textColor: "green"
},{
key: "Occupation",
textColor: "yellow"
}]
const newOrder = ["Occupation", "Name", "Age"]
function sort(list, order) {
return [...list].sort((a, b) => {
const keyIndexA = order.indexOf(a.key);
const keyIndexB = order.indexOf(b.key);
if (keyIndexA < keyIndexB) return -1;
if (keyIndexA > keyIndexB) return 1;
return 0;
});
}
console.log(sort(list, newOrder));
You can just use a regular sort:
const list = [{key: "Name",textColor: "red"},{key: "Age",textColor: "green"},{key: "Occupation",textColor: "yellow"}];
const newOrder = ["Occupation", "Name", "Age"];
const result = list.sort(({key: a}, {key: b}) => newOrder.indexOf(a) - newOrder.indexOf(b));
console.log(result);
You can loop through the newOrder array, find the object whose key matches the current item, and push it to a new array:
const orderedList = [];
newOrder.forEach(order => {
const match = list.find(({key}) => key === order);
if (match) orderedList.push(match); // skip keys that have no matching column
})
You can use orderBy from the lodash library.
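For example, a minimal sketch, assuming lodash is loaded as _ and using an iteratee that maps each column to its position in newOrder:
// Sketch only; assumes lodash is available as _
const sorted = _.orderBy(list, [item => newOrder.indexOf(item.key)], ['asc']);
console.log(sorted); // Occupation, Name, Age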
I have an array like so with a single object inside:
FirstArray = [{
"category": "None",
"ARFE": 553.5,
"BV": 900,
"RF rfeer": 0,
.....
}]
I want to convert it so that every key-value pair (where the value is a number) in the object is in its own object like the following:
NewArray = [{
name: "ARFE",
value: 553.5
}, {
name: "BV",
value: 900
}, {
name: "RF rfeer",
value: 0
}, .....]
Here, each original key is assigned to a new key called name, and its original value is assigned to a new key called value. Those pairs are then put into their own objects inside the array.
Note that "category": "None" is not its own object in the array since "None" is non-numerical.
It's also important to note that there could be many key-value pairs, so it's not just limited to the items above (e.g., "ARFE": 553.5, etc.)
What I have so far:
I know you can separate a single object into multiple objects:
NewArray = Object.entries(FirstArray).reduce((prev, [og, nw]) => {
let [name, value] = og.match(/\D+|\d+$/g)
prev[value] = { ...(prev[value] || {}), [name]: nw }
return prev;
}, {})
I also know that you can create a new object with new keys like so:
NewArray = Object.assign(
...Object.entries(FirstArray).map(([key, value]) => ({ [key]: name }))
);
However, I'm having trouble putting everything together. How would I be able to achieve NewArray from FirstArray?
You were pretty close. All you needed to do was specify the name:
const data = {
"category": "None",
"ARFE": 553.5,
"BV": 900,
"RF rfeer": 0
};
const result = Object
.entries(data)
.filter(([_, value]) => typeof value === 'number')
.map(([key, value]) => ({ name: key, value }));
console.log(result);
Also, if you don't want { "name": "category", "value": "None" } to be included in the result, you can just filter it out:
const result = Object
.entries(data)
.filter(([ key ]) => key !== 'category')
.map(([key, value]) => ({ name: key, value }));
Object.entries on an array makes no sense here; use it on the object:
const FirstArray = [{
"category": "None",
"ARFE": 553.5,
"BV": 900,
"RF rfeer": 0,
}]
const newObject = Object.entries(FirstArray[0]).reduce((array, [key, value]) => {
return [...array, {
name: key,
value
}]
}, [])
console.log(newObject)
reduce is not the right way to go. Simply use map:
Object.entries(FirstArray[0])
.filter(x => !isNaN(x[1])) // filter out non-numeric values
.map(([name, value]) => ({name, value}))
I have two arrays as listed below. I'm trying to create a new array of objects by using the field key in array_1 and the values in array_2.
const result = []
array_1 = [{ name: "Color" , field: "color"}, {name: "Shape", field: "shape" }, { name: "Whatever", field: "whatever" }]
array_2 = [["green", "rectangular", "whatever1"], ["yellow", "circle", "whatever2"]]
The result should be:
console.log(result)
// [{color:"green", shape:"rectangular", whatever: "whatever1"},
// { color:"yellow", shape: "circle", whatever:"whatever2"}]
This was my final attempt:
const rowObj = {}
const result = array.map((subarray) => subarray.map((cell, index) => {
console.log(cell,index)
rowObj[columns[index].field] = cell
return rowObj
}))
Basically, I was overwriting the same object.
Thanks,
One way to do it is to map() over array_2 and, in each iteration:
Create a new object
Iterate over the array_1 to fill the newly created object. You can use the index parameter of the forEach() method's callback function to get the field property from the objects inside array_1.
and then return that object from the callback function of the map() method.
const array_1 = [
{ name: 'Color', field: 'color' },
{ name: 'Shape', field: 'shape' },
{ name: 'Whatever', field: 'whatever' },
];
const array_2 = [
['green', 'rectangular', 'whatever1'],
['yellow', 'circle', 'whatever2'],
];
const result = array_2.map(arr => {
const o = {};
arr.forEach((str, idx) => {
o[array_1[idx].field] = str;
});
return o;
});
console.log(result);
You can use array.map to iterate both arrays and take advantage of Object.fromEntries to build new objects based on the order of array elements:
array_1 = [{ name: "Color" , field: "color"}, {name: "Shape", field: "shape" }, { name: "Whatever", field: "whatever" }]
array_2 = [["green", "rectangular", "whatever1"], ["yellow", "circle", "whatever2"]]
let result = array_2.map(
x => Object.fromEntries(
array_1.map((y,i) => ([y.field, x[i]]))))
console.log(result);
Explanation
You could create a function that creates a constructor based on the descriptions of your object's fields like this:
function createConstructor(fieldsDescriptor) {
return function(fields) {
fieldsDescriptor.forEach((descriptor, index) => {
this[descriptor.field] = fields[index]
})
}
}
Then you could, for example, make a SampleConstructor that creates objects based on the field names of array_1:
const SampleConstructor = createConstructor(array_1)
And then, for each entry in array_2 you could apply your SampleConstructor:
const result = array_2.map(fields => new SampleConstructor(fields))
Motivation
Creating a dedicated constructor adds some clear semantics to your app, shows readers what you are doing and also stores constructor information in the created objects at runtime.
When you later want to know which constructor made which objects you can just call object.constructor and use this information to determine what kind of objects they are.
For example calling result[0].constructor == SampleConstructor will be true because SampleConstructor is the constructor that created the first result.
Demo
Here is a full demo
const array_1 = [{ name: "Color" , field: "color"}, {name: "Shape", field: "shape" }, { name: "Whatever", field: "whatever" }]
const array_2 = [["green", "rectangular", "whatever1"], ["yellow", "circle", "whatever2"]]
function createConstructor(fieldsDescriptor) {
return function(fields) {
fieldsDescriptor.forEach((descriptor, index) => {
this[descriptor.field] = fields[index]
})
}
}
const SampleConstructor = createConstructor(array_1)
const results = array_2.map(fields => new SampleConstructor(fields))
console.log(results)
const EmptyConstructor = createConstructor([])
console.log(results[0].constructor == SampleConstructor)
console.log(results[0].constructor == EmptyConstructor)
You can try this
array_1 = [
{ name: 'Color', field: 'color' },
{ name: 'Shape', field: 'shape' },
{ name: 'Whatever', field: 'whatever' }
];
array_2 = [
['green', 'rectangular', 'whatever1'],
['yellow', 'circle', 'whatever2']
];
const keys = array_1.map(item => item.field);
const output = [];
array_2.forEach(item => {
const temp = {};
for (let i = 0; i < keys.length; i++) {
const key = keys[i];
const value = item[i];
temp[key] = value;
}
output.push(temp);
});
console.log(output);
I'm trying to add objects into an array. I don't know how to traverse the data and add the objects correctly.
I have data array:
const data = [
{
1: "Apple",
2: "Xiaomi"
}
];
const list = [];
data.forEach(function(key, value) {
console.log("key", key);
})
console.log(list)
I want the result to be as follows:
list: [{
{
value: 1,
title: 'Apple'
},
{
value: 2,
title: 'Xiaomi'
}
}]
Your expected output is invalid. You can first retrieve all the values from the object with Object.values(). Then use Array.prototype.map() to form the array in the structure you want.
Try the following way:
const data = [
{
1: "Apple",
2: "Xiaomi"
}
];
const list = Object.values(data[0]).map((el,i) => ({value: i+1, title: el})) ;
console.log(list);
You can use the existing keys of the object with Object.entries() in the following way:
const data = [
{
1: "Apple",
2: "Xiaomi"
}
];
const list = Object.entries(data[0]).map(item => ({value: item[0], title: item[1]}));
console.log(list);
I'll go ahead and make the assumption that data is an object of key/value pairs and you want to transform it to an array of objects.
// Assuming you have an object with key/value pairs.
const data = {
1: "Apple",
2: "Xiaomi"
};
// Convert the data object into an array by iterating over data's keys.
const list = Object.keys(data).map((key) => {
return {
value: key,
title: data[key]
}
});
console.log(list)
Output:
[
{
value: '1',
title: 'Apple'
},
{
value: '2',
title: 'Xiaomi'
}
]
If you actually need value to be numbers instead of strings, you can do it this way:
const list = Object.keys(data).map((key) => {
return {
value: Number(key),
title: data[key]
}
});
And if you are OK with using a more modern version of JavaScript (ECMAScript 2017) this works nicely:
const data = {
1: "Apple",
2: "Xiaomi"
};
// Using Object.entries gives you the key and value together.
const list = Object.entries(data).map(([value, title]) => {
return { value, title }
});
You could do something like this:
const data = ['Apple', 'Xiaomi'];
const result = data.map((item, index) => ({value: index, title: item}));
console.log(result);
If the idea is to turn key names into values and those are not necessarily autoincremented numbers you might want to look at Object.entries():
const data = {1: "Apple", 2: "Xiaomi"};
const res = Object.entries(data).map(entry => ({value: entry[0], title: entry[1]}));
console.log(res);
Input -
[
{color:'red', shape:'circle', size:'medium'},
{color:'red', shape:'circle', size:'small'}
]
Output -
[
{color:'red', shape:'circle', size:['medium','small']}
]
How can this be achieved in JavaScript?
If you just want to group based on color and shape, you could use reduce. Create an accumulator object with each unique combination of those 2 properties, separated by a |, as the key, and the object you want in the output as its value. Then use Object.values() to get those objects as an array.
const input = [
{color:'red', shape:'circle', size :'medium'},
{color:'red', shape:'circle', size:'small'},
{color:'blue', shape:'square', size:'small'}
];
const merged = input.reduce((acc, { color, shape, size }) => {
const key = color + "|" + shape;
acc[key] = acc[key] || { color, shape, size: [] };
acc[key].size.push(size);
return acc
}, {})
console.log(Object.values(merged))
This is what the merged/accumulator looks like:
{
"red|circle": {
"color": "red",
"shape": "circle",
"size": [
"medium",
"small"
]
},
"blue|square": {
"color": "blue",
"shape": "square",
"size": [
"small"
]
}
}
You can make it dynamic by creating an array of keys you'd want to group by:
const input = [
{ color: 'red', shape: 'circle', size: 'medium' },
{ color: 'red', shape: 'circle', size: 'small' },
{ color: 'blue', shape: 'square', size: 'small' }
];
const groupKeys = ['color', 'shape'];
const merged = input.reduce((acc, o) => {
const key = groupKeys.map(k => o[k]).join("|");
if (!acc[key]) {
acc[key] = groupKeys.reduce((r, k) => ({ ...r, [k]: o[k] }), {});
acc[key].size = []
}
acc[key].size.push(o.size)
return acc
}, {})
console.log(Object.values(merged))
You can create a general function which takes an array of objects and array of keys to match as its parameters.
You can do that in the following steps:
First, use reduce() on the array of objects and set the accumulator to an empty array [].
Get the other props (the unique props) by using filter() on Object.keys().
Then, in each iteration, find the element of the accumulator array whose given props all match the current object.
If the element is found, use forEach() on the other keys and push the values to the corresponding arrays.
If the element is not found, push a copy of the object and set each of the other keys to an empty array.
Finally, return the result of reduce().
const arr = [
{color:'red', shape:'circle', size:'medium'},
{color:'red', shape:'circle', size:'small'}
]
function groupByProps(arr,props){
const res = arr.reduce((ac,a) => {
let ind = ac.findIndex(b => props.every(k => a[k] === b[k]));
let others = Object.keys(a).filter(x => !props.includes(x));
if(ind === -1){
ac.push({...a});
others.forEach(x => ac[ac.length - 1][x] = []);
ind = ac.length - 1
}
others.forEach(x => ac[ind][x].push(a[x]));
return ac;
},[])
return res;
}
const res = groupByProps(arr,['color','shape'])
console.log(res)
Here is a simple groupBy function which accepts an array of objects and an array of props to group on:
let data = [
{color:'red', shape:'circle', size:'medium'},
{color:'red', shape:'circle', size:'small'}
]
let groupBy = (arr, props) => Object.values(arr.reduce((r,c) => {
let key = props.reduce((a,k) => `${a}${c[k]}`, '')
let otherKeys = Object.keys(c).filter(k => !props.includes(k))
r[key] = r[key] || {...c, ...otherKeys.reduce((a,k) => (a[k] = [], a),{})}
otherKeys.forEach(k => r[key][k].push(c[k]))
return r
}, {}))
console.log(groupBy(data, ['color','shape']))
The idea is to use Array.reduce and create a string key from the passed-in props. For the other fields, create an array and keep pushing values to it on each iteration.
I am trying to find out the best / most efficient or most functional way to compare / merge / manipulate two arrays (lists) simultaneously in JS.
The example I give below is a simple example of the overall concept. In my current project, I deal with some very crazy list mapping, filtering, etc. with very large lists of objects.
As delineated below, my first idea (version1) for comparing lists would be to run through the first list (i.e. map), and, in the anonymous/callback function, filter the second list to meet the criteria needed for the compare (matching ids, for example). This obviously works, as per version1 below.
I have a question performance-wise: with this method, on every iteration/call of map, the entire 2nd list gets filtered just to find the one item that matches the filter.
Also, the filter passes over every other item in list2 which should be matched to an item in list1. Meaning (as that sentence probably did not make sense):
list1.map list2.filter
id:1 [id:3,id:2,id:1]
^-match
id:2 [id:3,id:2,id:1]
^-match
id:3 [id:3,id:2,id:1]
^-match
Ideally, on the first iteration of map (list1 id:1), when the filter encounters list2 id:3 (the first item), it would just match it to list1 id:3 right away.
Thinking along the lines of the above concept (matching to a later id when it is encountered earlier), I came up with version2.
This makes list2 into a dictionary, and then looks up the value in any sequence by key.
const list1 = [
{id: '1',init:'init1'},
{id: '2',init:'init2'},
{id: '3',init:'init3'}
];
const list2 = [
{id: '2',data:'data2'},
{id: '3',data:'data3'},
{id: '4',data:'data4'}
];
/* ---------
* version 1
*/
const mergedV1 = list1.map(n => (
{...n,...list2.filter(f => f.id===n.id)[0]}
));
/* [
{"id": "1", "init": "init1"},
{"id": "2", "init": "init2", "data": "data2"},
{"id": "3", "init": "init3", "data": "data3"}
] */
/* ---------
* version 2
*/
const dictList2 = list2.reduce((dict,item) => (dict[item.id]=item,dict),{});
// does not handle duplicate ids but I think that's
// outside the context of this question.
const mergedV2 = list1.map(n => ({...n,...dictList2[n.id]}));
/* [
{"id": "1", "init": "init1"},
{"id": "2", "init": "init2", "data": "data2"},
{"id": "3", "init": "init3", "data": "data3"}
] */
JSON.stringify(mergedV1) === JSON.stringify(mergedV2);
// true
// and just for fun
const sqlLeftOuterJoinInJS = list1 => list2 => on => {
const dict = list2.reduce((dict,item) => (
dict[item[on]]=item,dict
),{});
return list1.map(n => ({...n,...dict[n[on]]}
))};
Obviously the above examples are pretty simple (merging two lists, each list having a length of 3). There are more complex instances that I am working with.
I don't know if there are some smarter (and ideally functional) techniques out there that I should be using.
You could take a closure over the wanted key for the group and a Map for collecting all objects.
function merge(key) {
var map = new Map;
return function (r, a) {
a.forEach(o => {
if (!map.has(o[key])) r.push(map.set(o[key], {}).get(o[key]));
Object.assign(map.get(o[key]), o);
});
return r;
};
}
const
list1 = [{ id: '1', init: 'init1' }, { id: '2', init: 'init2' }, { id: '3', init: 'init3' }],
list2 = [{ id: '2', data: 'data2' }, { id: '3', data: 'data3' }, { id: '4', data: 'data4' }],
result = [list1, list2].reduce(merge('id'), []);
console.log(result);
Using filter for search is a misstep. Your instinct in version 2 is much better. Map and Set provide much faster lookup times.
Here's a decomposed approach. It should be pretty fast, but maybe not as fast as Nina's. She is a speed demon >_<
const merge = (...lists) =>
Array .from
( lists
.reduce (merge1, new Map)
.values ()
)
const merge1 = (cache, list) =>
list .reduce
( (cache, l) =>
cache .has (l.id)
? update (cache, l.id, l)
: insert (cache, l.id, l)
, cache
)
const insert = (cache, key, value) =>
cache .set (key, value)
const update = (cache, key, value) =>
cache .set
( key
, { ...cache .get (key)
, ...value
}
)
const list1 =
[{ id: '1', init: 'init1' }, { id: '2', init: 'init2' }, { id: '3', init: 'init3' }]
const list2 =
[{ id: '2', data: 'data2' }, { id: '3', data: 'data3' }, { id: '4', data: 'data4' }]
console .log (merge (list1, list2))
I'm offering this for completeness, as I think Nina and #user633183 have most likely offered more efficient solutions.
If you wish to stick to your initial filter example, which is at most an N*M lookup, and your arrays are mutable, you could consider shrinking the set as you traverse it. In the old days, shrinking the array had a huge impact on performance.
The general pattern today is to use a Map (or dict) as indicated in other answers, as it is both easy to understand and generally efficient.
Find and Resize
const list1 = [
{id: '1',init:'init1'},
{id: '2',init:'init2'},
{id: '3',init:'init3'}
];
const list2 = [
{id: '2',data:'data2'},
{id: '3',data:'data3'},
{id: '4',data:'data4'}
];
// combine by ID
let merged = list1.reduce((acc, obj)=>{
acc.push(obj);
// find index by ID
let foundIdx = list2.findIndex( el => el.id==obj.id );
// if found, store and remove from search
if ( foundIdx >= 0 ){
obj.data = list2[foundIdx].data;
list2.splice( foundIdx, 1 ); // shrink lookup array
}
return acc;
},[]);
// store remaining (if you want); i.e. {id:4,data:'data4'}
merged = merged.concat(list2)
console.log(merged);
I'm not sure whether I should mark this question as a duplicate because you phrased it differently. Anyway, here's my answer to that question copied verbatim. What you want is an equijoin:
const equijoin = (xs, ys, primary, foreign, sel) => {
const ix = xs.reduce((ix, row) => // loop through m items
ix.set(row[primary], row), // populate index for primary table
new Map); // create an index for primary table
return ys.map(row => // loop through n items
sel(ix.get(row[foreign]), // get corresponding row from primary
row)); // select only the columns you need
};
You can use it as follows:
const equijoin = (xs, ys, primary, foreign, sel) => {
const ix = xs.reduce((ix, row) => ix.set(row[primary], row), new Map);
return ys.map(row => sel(ix.get(row[foreign]), row));
};
const list1 = [
{ id: "1", init: "init1" },
{ id: "2", init: "init2" },
{ id: "3", init: "init3" }
];
const list2 = [
{ id: "2", data: "data2" },
{ id: "3", data: "data3" },
{ id: "4", data: "data4" }
];
const result = equijoin(list2, list1, "id", "id",
(row2, row1) => ({ ...row1, ...row2 }));
console.log(result);
It takes O(m + n) time to compute the answer using equijoin. However, if you already have an index then it'll only take O(n) time. Hence, if you plan to do multiple equijoins using the same tables then it might be worthwhile to abstract out the index.
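For instance, here is a minimal sketch of abstracting the index out (the helper names makeIndex and joinWithIndex are just illustrative, not from any library):
// Build the index once in O(m); reuse it for any number of joins.
const makeIndex = (xs, primary) =>
xs.reduce((ix, row) => ix.set(row[primary], row), new Map);
// Each subsequent join is then only O(n), since Map lookups are constant time.
const joinWithIndex = (ix, ys, foreign, sel) =>
ys.map(row => sel(ix.get(row[foreign]), row));
const indexById = makeIndex(list2, "id");
const joined = joinWithIndex(indexById, list1, "id",
(row2, row1) => ({ ...row1, ...row2 }));
console.log(joined); // same result as above, without rebuilding the index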