What is the most efficient way to group by multiple object properties? In this case the grouping is by category and then by group.
I have detailed my current implementation, which uses a single array reduce; the result should end up as an array matching the expected output below. I am not sure how to conditionally push to the sources array that is created within the reduce, without using lodash groupBy or another third-party library. I was also thinking of filtering by a key I create, e.g.
const key = `${category}-${group}`;
const items = [
{
"name": "Halloumi",
"group": "Cheese",
"category": "Dairy"
},
{
"name": "Mozzarella",
"group": "Cheese",
"category": "Dairy"
}
];
// my current implementation
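// groups by category only; the per-group nesting and the sources array are still missing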
const groupedItems = items.reduce((map, item) => {
const { category, name, group } = item;
if (map.has(category)) {
map.get(category).push({
name,
group,
});
} else {
map.set(category, [
{
name,
group,
},
]);
}
return map;
}, new Map());
console.log(Object.fromEntries(groupedItems));
// Another attempt
const groupedItems2 = items.reduce((acc, item) => {
  const { category, group, name } = item;
  acc[category] = acc[category] || { category, groups: [] };
  const source = { name };
  // these are always 1 because they only ever see the current item
  const totalSources = [source].length;
  const uniqueSources = [...new Set([source])].length;
  // this pushes a new group object on every iteration instead of
  // appending to an existing group's sources array, which is the
  // part I can't work out
  acc[category].groups.push({
    group,
    totalSources,
    uniqueSources,
    sources: [source],
  });
  return acc;
}, {});
console.log(Object.entries(groupedItems2));
// expected output
[
{
"category": "Dairy",
"groups": [
{
"group": "Cheese",
"totalSources": 2,
"sources": [
{
"name": "Halloumi"
},
{
"name": "Mozzarella"
}
]
}
]
}
]
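For reference, here is a rough sketch of the composite-key idea mentioned above (just a sketch, not something I have settled on); it groups correctly under a `${category}-${group}` key but would still need a second pass to build the nested output:
const byCompositeKey = items.reduce((map, { category, group, name }) => {
  const key = `${category}-${group}`;
  const entry = map.get(key) || { category, group, sources: [] };
  entry.sources.push({ name });
  return map.set(key, entry);
}, new Map());
// Map { 'Dairy-Cheese' => { category: 'Dairy', group: 'Cheese', sources: [{ name: 'Halloumi' }, { name: 'Mozzarella' }] } }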
You need a two-level map.
There are many ways to do that. Here is one:
const items = [{"name": "Halloumi","group": "Cheese","category": "Dairy"},{"name": "Mozzarella","group": "Cheese","category": "Dairy"}];
const dict = {};
for (const {name, group, category} of items) {
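// ??= assigns only when the current value is null or undefined, so the nested object/array is created on demand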
((dict[category] ??= {})[group] ??= []).push({name});
}
const categories = Object.entries(dict).map(([category, groups]) => ({
category,
groups: Object.entries(groups).map(([group, sources]) => ({
group,
totalSources: sources.length,
sources
}))
}));
console.log(categories);
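The ??= (logical nullish assignment) used above requires ES2021 or newer. If that is a concern, a rough equivalent of that loop without it would look like this (same dict shape, just more verbose):
for (const { name, group, category } of items) {
  if (!dict[category]) dict[category] = {};
  if (!dict[category][group]) dict[category][group] = [];
  dict[category][group].push({ name });
}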
I was interested in a more generic approach to this problem. It might or might not be useful for you, but I think it offers another way to think about such problems. The idea is that we have a function which takes a configuration like this:
const config = {
prop: 'category',
childName: 'groups',
children: {
prop: 'group',
childName: 'sources',
totalName: 'totalSources',
children: {prop: 'name'}
}
}
It returns a new function which takes an array of flat objects and nests them, giving totals, and changing property names as necessary. Here is an implementation, tested only on this problem (with substantially more data added to test various scenarios):
const regroup = ({prop, newName = prop, childName, totalName, children}) => (xs) =>
childName && children
? Object .entries (groupBy (x => x [prop]) (xs))
.map (([k, vs]) => [k, vs .map (omit ([prop]))])
.map (([k, vs, kids = regroup (children) (vs)]) => ({
[newName]: k,
... (totalName ? {[totalName]: kids .length} : {}),
[childName]: kids
})
)
: prop && newName
? xs .map (({[prop]: n, ...rest}) => ({[newName]: n, ...rest}))
: [ ...xs]
const groupBy = (fn, k) => (xs) => xs .reduce (
(a, x) => ((k = fn (x)), (a [k] = a [k] || []), (a [k] .push (x)), a)
, {}
)
const omit = (names) => (o) => Object .fromEntries (
Object .entries (o) .filter (([k, v]) => ! names.includes (k))
)
const config = {
prop: 'category',
childName: 'groups',
children: {
prop: 'group',
childName: 'sources',
totalName: 'totalSources',
children: {prop: 'name'}
}
}
const groupFoods = regroup (config)
const items = [{name: "Halloumi", group: "Cheese", category: "Dairy"}, {name: "Mozzarella", group: "Cheese", category: "Dairy"}, {name: "Whole", group: "Milk", category: "Dairy"}, {name: "Skim", group: "Milk", category: "Dairy"}, {name: "Potatos", group: "Root", category: "Vegetable"}, {name: "Turnips", group: "Root", category: "Vegetable"}, {name: "Strawberry", group: "Berry", category: "Fruit"}, {name: "Baguette", group: "Yeast", category: "Bread"}]
console .log (JSON.stringify (groupFoods (items), null, 4))
Note the two helper functions. omit clones an object, removing certain property names; for instance, omit (['age', 'id']) ({id: 101, first: 'Fred', last: 'Flintstone', age: 27}) returns {first: 'Fred', last: 'Flintstone'}. groupBy accepts a function which returns a key for a value, and returns a new function which takes an array of values and returns an object mapping each key found to the array of values that generate that key. For instance, groupBy (n => n % 10) ([1, 3, 4, 21, 501, 23, 43, 25, 64]) //=> {"1": [1, 21, 501], "3": [3, 23, 43], "4": [4, 64], "5": [25]}. These are genuinely reusable functions that you might keep in your personal utility library.
The main function simply builds our output using the configuration values supplied. If we need to nest, then we recur on the children node of the configuration.
I'm trying to duplicate objects based on two properties that have multiple values differentiated by a comma.
For example:
I have an object
const obj = {
  id: 1,
  date: "2021",
  tst1: "111, 222",
  tst2: "AAA, BBB"
}
And I would like the result to be an array of 2 objects in this case (because there are 2 values in tst1 or tst2; these 2 properties will always have the same number of values differentiated by a comma)
[{
id: 1,
date: "2021",
tst1: "111",
tst2: "AAA",
},
{
id: 1,
date: "2021",
tst1: "222",
tst2: "BBB",
}]
What I tried is this:
I created a temporary object
const tempObject = {
id: obj.id,
date: obj.date,
}
And then I would split and map the property that has multiple values, like this:
const newObj = obj.tst1.split(",").map(function(value) {
  return {
    id: tempObject.id,
    date: tempObject.date,
    tst1: value,
  };
});
And now newObj is an array of objects, and each object contains one value of tst1.
The problem is that I still have to do the same for tst2...
And I was wondering if there is a simpler method to do this...
Thank you!
Here is an example that accepts an array of duplicate keys to differentiate. It first maps them to arrays of entries by splitting on ',' and trimming the parts, then zips those arrays by index to create sub-arrays for each specified property, and finally returns the original object spread against an Object.fromEntries of each zipped group.
const mapDuplicateProps = (obj, props) => {
const splitProps = props.map((p) =>
obj[p].split(',').map((s) => [p, s.trim()])
);
// [ [[ 'tst1', '111' ], [ 'tst1', '222' ]], [[ 'tst2', 'AAA' ], [ 'tst2', 'BBB' ]] ]
const dupeEntries = splitProps[0].map((_, i) => splitProps.map((p) => p[i]));
// [ [[ 'tst1', '111' ], [ 'tst2', 'AAA' ]], [[ 'tst1', '222' ], [ 'tst2', 'BBB' ]] ]
return dupeEntries.map((d) => ({ ...obj, ...Object.fromEntries(d) }));
};
const obj = {
id: 1,
date: '2021',
tst1: '111, 222',
tst2: 'AAA, BBB',
};
console.log(mapDuplicateProps(obj, ['tst1', 'tst2']));
Not sure if that's what you're looking for, but I tried making a more general version of what you're trying to do:
const duplicateProperties = obj => {
const properties = Object.entries(obj);
let acc = [{}];
properties.forEach(([key, value]) => {
if (typeof value === 'string' && value.includes(',')) {
const values = value.split(',');
values.forEach((v, i) => {
if (!acc[i]) {
  // start from a copy of the first object so scalar props
  // processed earlier (e.g. id, date) are not lost
  acc[i] = { ...acc[0] };
}
acc[i][key] = v.trim();
});
} else {
acc.forEach(o => o[key] = value);
}
});
return acc;
};
const obj = {
id: 1,
date: '2021',
tst1: '111, 222',
tst2: 'AAA, BBB',
};
console.log(duplicateProperties(obj));
You could start by determining the length of the result using Math.max(), String.split() etc.
Then you'd create an Array using Array.from(), returning the correct object for each value of the output index.
const obj = {
id: 1,
date: "2021",
tst1: "111, 222",
tst2: "AAA, BBB",
}
// Determine the length of our output array...
const length = Math.max(...Object.values(obj).map(s => (s + '').split(',').length))
// Map the object using the relevant index...
const result = Array.from({ length }, (_, idx) => {
return Object.fromEntries(Object.entries(obj).map(([key, value]) => {
const a = (value + '').split(/,\s*/);
return [key, a.length > 1 ? a[idx] : value ]
}))
})
console.log(result)
I need to create an array of arrays.
It is worth noting that the database is very large and that if any attribute does not have a corresponding value, it sends an empty string. I've tried with map and reduce but I wasn't successful:
Any help will be appreciated.
Below I show an example of the expected output:
outputExpected = [
["id", 1, 2],
["name", "name1", "name2"],
["price", 6.95, 998.95],
["promoPrice", 5.91, 333.91],
["category", "test1 | test2", "test3 | test4"],
]
Is there a way to solve this problem performantly?
this is my code:
let arrayObj = [{
"id": 1,
"name": "name1",
"price": 6.95,
"promoPrice": 5.91,
"category": ["test1, test2"]
},
{
"id": 2,
"name": "name2",
"price": 998.95,
"promoPrice": 333.91,
"category": ["test3, test4"]
}
]
const headers = ["id", "name", "price", "promoPrice", "category"]
const result1 = headers.concat(arrayObj.map((obj) => {
  return headers.reduce((arr, key) => {
    arr.push(obj[key]);
    return arr;
  }, [])
}))
console.log(result1)
Reduce the array to a Map. On each iteration convert the object to an array of [key, value] pairs using Object.entries(). Use Array.forEach() to iterate the entries and add them to the map. Convert the Map's values iterator to an array using Array.from():
const arr = [{"id":1,"name":"name1","price":6.95,"promoPrice":5.91,"category":["test1", "test2"]},{"id":2,"name":"name2","price":998.95,"promoPrice":333.91,"category":["test3", "test4"]}]
const result = Array.from(arr.reduce((acc, o) => {
Object.entries(o)
.forEach(([k, v]) => {
if(!acc.has(k)) acc.set(k, [k])
acc.get(k).push(Array.isArray(v) ? v.join(' | ') : v)
})
return acc
}, new Map()).values())
console.log(result)
You could simply map the value and check if an item is an array, then take the joined values or the value itself.
const
data = [{ id: 1, name: "name1", price: 6.95, promoPrice: 5.91, category: ["test1, test2"] }, { id: 2, name: "name2", price: 998.95, promoPrice: 333.91, category: ["test3, test4"] }],
headers = ["id", "name", "price", "promoPrice", "category"],
result = data
.reduce(
(r, o) => headers.map((k, i) => [
...r[i],
Array.isArray(o[k]) ? o[k].join(' | ') : o[k]
]),
headers.map(k => [k]),
);
console.log(result);
Construct an init array with all possible keys in the wanted order, then use Array.reduce and Array.forEach to push the value for each key based on its index.
const arrayObj = [
{
"id":1,
"name":"name1",
"price":6.95,
"promoPrice":5.91,
"category":["test1", "test2"]
},
{
"id":2,
"name":"name2",
"price":998.95,
"promoPrice":333.91,
"category":["test3", "test4"]
}
]
function ConvertToArray2D (items) {
let init = [['id'], ['name'], ['price'], ['promoPrice'], ['category']]
if (!items) return init
return items.reduce((pre, cur) => {
init.forEach((key, index) => {
pre[index].push(Array.isArray(cur[key[0]]) ? cur[key[0]].join(' | ') : cur[key[0]])
})
return pre
}, init.slice())
}
console.log(ConvertToArray2D(arrayObj))
This can be handled with a standard 'zip' after mapping your objects to arrays of values in line with the headers array. (This also allows for the result to be pivoted back).
// @see https://stackoverflow.com/a/10284006/13762301
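// transpose: for each column index of the first row, collect the value at that index from every row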
const zip = (...rs) => [...rs[0]].map((_, c) => rs.map((r) => r[c]));
const headers = ['id', 'name', 'price', 'promoPrice', 'category'];
const arrayObj = [{ id: 1, name: 'name1', price: 6.95, promoPrice: 5.91, category: ['test1', 'test2'] },{ id: 2, name: 'name2', price: 998.95, promoPrice: 333.91, category: ['test3', 'test4'] },];
const result = zip(
headers,
...arrayObj.map((o) => headers.map(h => Array.isArray(o[h]) ? o[h].join(' | ') : o[h]))
);
console.log(result);
// which also allows it to be reversed
console.log(zip(...result));
see: Javascript equivalent of Python's zip function for further zip discussion.
A long title, so I'll explain the problem by example. I have an array of objects:
const myObjects = [
{
id: 1,
name: "a",
stuff: "x"
},
{
id: 2,
name: "b",
stuff: "y"
},
];
Then I have another array of objects like this:
const myTemplate=[
{
desiredProperty: "name",
someOtherProperty: "..."
},
{
desiredProperty: "stuff",
someOtherProperty: "..."
},
];
Now I want to transform myObjects array to new one, so that the individual objects contain only the properties listed in desiredProperty of each object in myTemplate.
The result should look like this:
myResult = [
{
name: "a",
stuff: "x"
},
{
name: "b",
stuff: "y"
}
]
How to achieve this?
This approach lets you partially apply the template to get back a reusable function to run against multiple sets of inputs:
const convert = (template, keys = new Set (template .map (t => t .desiredProperty))) => (xs) =>
xs .map (
(x) => Object .fromEntries (Object .entries (x) .filter (([k, v]) => keys .has (k)))
)
const myObjects = [{id: 1, name: "a", stuff: "x"}, {id: 2, name: "b", stuff: "y"}]
const myTemplate= [{desiredProperty: "name", someOtherProperty: "..."}, {desiredProperty: "stuff", someOtherProperty: "..."}]
console .log (
convert (myTemplate) (myObjects)
)
But I agree with the comment that the template here is better expressed as an array of keys to keep.
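For illustration, here is a sketch of the same approach if the template really were just an array of keys to keep (the convertByKeys name is hypothetical):
const convertByKeys = (keys, keep = new Set (keys)) => (xs) =>
  xs .map (x => Object .fromEntries (Object .entries (x) .filter (([k]) => keep .has (k))))

console .log (
  convertByKeys (['name', 'stuff']) (myObjects)
)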
The following code creates a Set of the keys you want to keep. Then, we map over your myObjects array and only keep the object keys that are in the toKeep Set.
const myObjects=[{id:1,name:"a",stuff:"x"},{id:2,name:"b",stuff:"y"}];
const myTemplate=[{desiredProperty:"name",someOtherProperty:"..."},{desiredProperty:"stuff",someOtherProperty:"..."}];
const toKeep = new Set(myTemplate.map(t => t.desiredProperty));
const newObjs = myObjects.map(o => {
const obj = {};
for (let key in o) {
if (toKeep.has(key)) {
obj[key] = o[key];
}
}
return obj;
});
console.log(newObjs);
I have an object that looks like this:
{
"id": 45745049,
"seller": {
"first_name": "Sam",
"last_name": "Smith",
"email": "samsmith@smith.com",
"phone": {
"number": "1111-1111",
"verified": false
},
},
"order_items": [
{
"item": {
"id": "29239765",
"title": "item1",
"colors": [
"red",
"green",
"blue"
]
},
"quantity": 1,
"unit_price": 230,
},
{
"item": {
"id": "238457363",
"title": "item2",
"colors": [
"red"
]
},
"quantity": 2,
"unit_price": 110,
}
],
"date_created": "2020-08-03T12:17:25.000-04:00",
"date_last_updated": "2020-08-03T16:51:35.61Z"
}
I want an array with pairs of EVERY key in the object with the value.
For example:
[
["id", 45745049],
["first_name", "Sam"],
.....,
["phone.number", "1111-1111"],
["phone.verified", false],
....etc
]
Everything is OK until that point. The problem is when a property is an array of objects. The output I want is the following:
[
...,
["order_items1.item.id", 29239765],
["order_items1.item.colors1", "red"],
["order_items1.item.colors2", "green"],
...,
["order_items2.item.id", 238457363],
["order_items2.item.colors1", "red"],
...etc
]
So it needs to check if the property is an array and add the position number if so.
I know I need a recursive function but I don't know how to do it.
This is what I have got so far.
getObjectKeys = (obj) => {
let FieldName = "";
let FieldValue = "";
for(var prop in obj) {
FieldName += prop;
if(!(prop instanceof Array) && (typeof prop !== "object") && obj[prop]) {
FieldValue = obj[prop];
} else if(prop instanceof Array && prop.length !== 0){
prop.forEach((innerItem, i) => {
FieldName += `${i+1}.`;
// check if the inner item is an array or whatever and do it all again
// Don't know what to do here.
});
} else {
getObjectKeys(obj[prop]);
}
}
return [FieldName, FieldValue];
}
Note: I don't want the empty or null keys.
I would be very grateful if someone can help me. Thanks anyways!
This does something very similar to what you're looking for. It's a technique I use often.
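// getPaths lists every key path (as an array of keys) down to each leaf value,
// path reads the value found at one of those paths, and flatten joins the two
// to build an object keyed by dotted path strings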
const getPaths = (obj) =>
Object (obj) === obj
? Object .entries (obj) .flatMap (([k, v]) => getPaths (v) .map (p => [k, ... p]))
: [[]]
const path = (ps) => (obj) =>
ps .reduce ((o, p) => (o || {}) [p], obj)
const flatten = (obj) =>
Object .fromEntries (getPaths (obj) .map (p => [p.join('.'), path (p) (obj)]))
const input = {id: 45745049, seller: {first_name: "Sam", last_name: "Smith", email: "samsmith#smith.com", phone: {number: "1111-1111", verified: false}}, order_items: [{item: {id: "29239765", title: "item1", colors: ["red", "green", "blue"]}, quantity: 1, unit_price: 230}, {item: {id: "238457363", title: "item2", colors: ["red"]}, quantity: 2, unit_price: 110}], date_created: "2020-08-03T12: 17: 25.000-04: 00", date_last_updated: "2020-08-03T16: 51: 35.61Z"}
console .log (flatten (input))
The differences are that there is a separator before the array index and that I use zero-based arrays, not one-based arrays.
I would suggest that it's a much better output format. If nothing else, it would probably allow you to rehydrate the original format. But if you want to change it, you should probably simply reduce the path to combine the numeric elements with their predecessors, something like:
const flatten = (obj) =>
Object .fromEntries (getPaths (obj) .map (p => [
p .reduce (
(a, k) => /^\d+$/ .test(k) ? [...a .slice (0, -1), a [a .length - 1] + (1 + (+k))] : [...a, k],
[]
) .join ('.'),
path (p) (obj)
]))
But this would require changes if the outer object might be an array.
Again, though, absent a very good reason to use your requested format, I would strongly recommend my alternative.
I am trying to find out the best / most efficient or most functional way to compare / merge / manipulate two arrays (lists) simultaneously in JS.
The example I give below is a simple example of the overall concept. In my current project, I deal with some very crazy list mapping, filtering, etc. with very large lists of objects.
As delineated below, my first idea (version 1) for comparing lists would be to run through the first list (i.e. map) and, in the anonymous/callback function, filter the second list to meet the criteria needed for the comparison (matching ids, for example). This obviously works, as per version 1 below.
I had a question performance-wise, as by this method on every iteration/call of map, the entire 2nd list gets filtered just to find that one item that matches the filter.
Also, the filter passes every other item in list2 which should be matched in list1. Meaning (as that sentence probably did not make sense):
list1.map    list2.filter
id:1         [id:3, id:2, id:1]   <- match found at id:1 (3rd item scanned)
id:2         [id:3, id:2, id:1]   <- match found at id:2 (2nd item scanned)
id:3         [id:3, id:2, id:1]   <- match found at id:3 (1st item scanned)
Ideally on the first iteration of map (list1 id:1), when the filter encounters list2 id:3 (first item) it would just match it to list1 id:3
Thinking with the above concept (matching to a later id when it is encountered earlier), I came up with version 2.
This makes list2 into a dictionary, and then looks up the value in any sequence by key.
const list1 = [
{id: '1',init:'init1'},
{id: '2',init:'init2'},
{id: '3',init:'init3'}
];
const list2 = [
{id: '2',data:'data2'},
{id: '3',data:'data3'},
{id: '4',data:'data4'}
];
/* ---------
* version 1
*/
const mergedV1 = list1.map(n => (
{...n,...list2.filter(f => f.id===n.id)[0]}
));
/* [
{"id": "1", "init": "init1"},
{"id": "2", "init": "init2", "data": "data2"},
{"id": "3", "init": "init3", "data": "data3"}
] */
/* ---------
* version 2
*/
const dictList2 = list2.reduce((dict,item) => (dict[item.id]=item,dict),{});
// does not handle duplicate ids but I think that's
// outside the context of this question.
const mergedV2 = list1.map(n => ({...n,...dictList2[n.id]}));
/* [
{"id": "1", "init": "init1"},
{"id": "2", "init": "init2", "data": "data2"},
{"id": "3", "init": "init3", "data": "data3"}
] */
JSON.stringify(mergedV1) === JSON.stringify(mergedV2);
// true
// and just for fun
const sqlLeftOuterJoinInJS = list1 => list2 => on => {
const dict = list2.reduce((dict,item) => (
dict[item[on]]=item,dict
),{});
return list1.map(n => ({...n,...dict[n[on]]}
))};
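Called with the lists above, that curried helper gives the same result as the other two versions; a quick usage sketch:
const mergedV3 = sqlLeftOuterJoinInJS(list1)(list2)('id');
// [
//   {"id": "1", "init": "init1"},
//   {"id": "2", "init": "init2", "data": "data2"},
//   {"id": "3", "init": "init3", "data": "data3"}
// ]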
Obviously the above examples are pretty simple (merging two lists, each list having a length of 3). There are more complex instances that I am working with.
I don't know if there are some smarter (and ideally functional) techniques out there that I should be using.
You could take a closure over the wanted key for the group and a Map for collecting all objects.
function merge(key) {
var map = new Map;
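// the Map lives in the closure, so every list reduced with this function shares it, keyed by the chosen property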
return function (r, a) {
a.forEach(o => {
if (!map.has(o[key])) r.push(map.set(o[key], {}).get(o[key]));
Object.assign(map.get(o[key]), o);
});
return r;
};
}
const
list1 = [{ id: '1', init: 'init1' }, { id: '2', init: 'init2' }, { id: '3', init: 'init3' }],
list2 = [{ id: '2', data: 'data2' }, { id: '3', data: 'data3' }, { id: '4', data: 'data4' }],
result = [list1, list2].reduce(merge('id'), []);
console.log(result);
Using filter for search is a misstep. Your instinct in version 2 is much better. Map and Set provide much faster lookup times.
Here's a decomposed approach. It should be pretty fast, but maybe not as fast as Nina's. She is a speed demon >_<
const merge = (...lists) =>
Array .from
( lists
.reduce (merge1, new Map)
.values ()
)
const merge1 = (cache, list) =>
list .reduce
( (cache, l) =>
cache .has (l.id)
? update (cache, l.id, l)
: insert (cache, l.id, l)
, cache
)
const insert = (cache, key, value) =>
cache .set (key, value)
const update = (cache, key, value) =>
cache .set
( key
, { ...cache .get (key)
, ...value
}
)
const list1 =
[{ id: '1', init: 'init1' }, { id: '2', init: 'init2' }, { id: '3', init: 'init3' }]
const list2 =
[{ id: '2', data: 'data2' }, { id: '3', data: 'data3' }, { id: '4', data: 'data4' }]
console .log (merge (list1, list2))
I'm offering this for completeness, as I think Nina and @user633183 have most likely offered more efficient solutions.
If you wish to stick with your initial filter example, which is at worst an N*M lookup, and your arrays are mutable, you could consider shrinking the search set as you traverse it. In the old days, shrinking the array had a huge impact on performance.
The general pattern today is to use a Map (or dict) as indicated in other answers, as it is both easy to understand and generally efficient.
Find and Resize
const list1 = [
{id: '1',init:'init1'},
{id: '2',init:'init2'},
{id: '3',init:'init3'}
];
const list2 = [
{id: '2',data:'data2'},
{id: '3',data:'data3'},
{id: '4',data:'data4'}
];
// combine by ID
let merged = list1.reduce((acc, obj)=>{
acc.push(obj);
// find index by ID
let foundIdx = list2.findIndex( el => el.id==obj.id );
// if found, store and remove from search
if ( foundIdx >= 0 ){
obj.data = list2[foundIdx].data;
list2.splice( foundIdx, 1 ); // shrink lookup array
}
return acc;
},[]);
// store remaining (if you want); i.e. {id:4,data:'data4'}
merged = merged.concat(list2)
console.log(merged);
I'm not sure whether I should mark this question as a duplicate because you phrased it differently. Anyway, here's my answer to that question copied verbatim. What you want is an equijoin:
const equijoin = (xs, ys, primary, foreign, sel) => {
const ix = xs.reduce((ix, row) => // loop through m items
ix.set(row[primary], row), // populate index for primary table
new Map); // create an index for primary table
return ys.map(row => // loop through n items
sel(ix.get(row[foreign]), // get corresponding row from primary
row)); // select only the columns you need
};
You can use it as follows:
const equijoin = (xs, ys, primary, foreign, sel) => {
const ix = xs.reduce((ix, row) => ix.set(row[primary], row), new Map);
return ys.map(row => sel(ix.get(row[foreign]), row));
};
const list1 = [
{ id: "1", init: "init1" },
{ id: "2", init: "init2" },
{ id: "3", init: "init3" }
];
const list2 = [
{ id: "2", data: "data2" },
{ id: "3", data: "data3" },
{ id: "4", data: "data4" }
];
const result = equijoin(list2, list1, "id", "id",
(row2, row1) => ({ ...row1, ...row2 }));
console.log(result);
It takes O(m + n) time to compute the answer using equijoin. However, if you already have an index then it'll only take O(n) time. Hence, if you plan to do multiple equijoins using the same tables then it might be worthwhile to abstract out the index.
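A rough sketch of what abstracting out the index could look like (the helper names here are hypothetical, just one way to split it up):
// hypothetical helpers, not part of the answer above
const createIndex = (xs, primary) =>
  xs.reduce((ix, row) => ix.set(row[primary], row), new Map);

const equijoinWithIndex = (ix, ys, foreign, sel) =>
  ys.map(row => sel(ix.get(row[foreign]), row));

// build the index for the primary table once, then reuse it for several joins
const ix = createIndex(list2, "id");
const joined = equijoinWithIndex(ix, list1, "id", (row2, row1) => ({ ...row1, ...row2 }));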