I am wondering how you would go about deleting arrays that contain the same elements (in any order) from a 2-dimensional array.
For example:
let twoDArr = [ [1, 2, 3],
                [3, 2, 1],
                [2, 4, 5],
                [4, 5, 2],
                [4, 3, 1] ];
Deduplicating this array would remove the second and fourth elements, returning the 2D array:
returnedArr = [ [1, 2, 3],
                [2, 4, 5],
                [4, 3, 1] ];
How exactly could this be done, preserving the 2d array? I could only think to loop through elements, comparing elements via a sort, and deleting them as you go along, but this would result in an indexing error if an element is deleted.
1) You can easily achieve the result using reduce and Set as:
let twodArr = [
  [1, 2, 3],
  [3, 2, 1],
  [2, 4, 5],
  [4, 5, 2],
  [4, 3, 1],
];
const set = new Set();
const result = twodArr.reduce((acc, curr) => {
  const key = [...curr].sort((a, b) => a - b).join();
  if (!set.has(key)) {
    set.add(key);
    acc.push(curr);
  }
  return acc;
}, []);
console.log(result);
2) You can also use filter as:
let twodArr = [
  [1, 2, 3],
  [3, 2, 1],
  [2, 4, 5],
  [4, 5, 2],
  [4, 3, 1],
];
const set = new Set();
const result = twodArr.filter((curr) => {
  const key = [...curr].sort((a, b) => a - b).join();
  return !set.has(key) ? (set.add(key), true) : false;
});
console.log(result);
A similar approach with filter and a plain array of seen keys (sorting a copy numerically so the original inner arrays are not mutated):
const array = [[1, 2, 3], [3, 2, 1], [2, 4, 5], [4, 5, 2], [4, 3, 1]]
const seen = []
const res = array.filter((item) => {
  // sort a copy numerically so [1, 2, 3] and [3, 2, 1] produce the same key
  let key = [...item].sort((a, b) => a - b).join()
  if (!seen.includes(key)) {
    seen.push(key)
    return true
  }
  return false
})
console.log(res)
You can use a hash map:
let arr = [ [1, 2, 3], [3, 2, 1], [2, 4, 5], [4, 5, 2], [4, 3, 1] ];
let obj = {}
let final = []
for (let i = 0; i < arr.length; i++) {
  // create a key from the sorted elements
  let sorted = [...arr[i]].sort((a, b) => a - b).join(',')
  // if this key is not yet present in our hash map,
  // add the value to the final output and update the hash map accordingly
  if (!obj[sorted]) {
    obj[sorted] = true
    final.push(arr[i])
  }
}
console.log(final)
Using Array.prototype.filter() and a Set as thisArg
let arr = [ [1, 2, 3],
            [3, 2, 1],
            [2, 4, 5],
            [4, 5, 2],
            [4, 3, 1] ];
let res = arr.filter(function(e) {
  const sorted = [...e].sort((a, b) => a - b).join('|');
  return this.has(sorted) ? false : this.add(sorted)
}, new Set())
console.log(res)
I have an array of arrays, and I want to map over it and get just the values of the inner arrays. But when I map over it and log the result, I still get an array of arrays, and I don't know how to flatten it so I can use the values in other places.
const arr = [
[1, 2, 3],
[4, 5, 6],
[7, 8, 9],
];
const arrMap = arr.map((it) => it.map((itm) => itm));
console.log(arrMap);
//what I expected 1,2,3,4,5,6 , ...
//what I got [Array(3), Array(3), Array(3)]
Actually, I need the values to use them somewhere else, but I don't know what to do.
I also wrote a function for this, but when I return the values and log them, it's undefined:
const arr = [
[1, 2, 3],
[4, 5, 6],
[7, 8, 9],
];
const arrMap = (arr) => {
  arr.forEach((element) => {
    console.log(element);
    // In here, everything works fine
    return element;
  });
};
console.log(arrMap(arr));
// what I got: undefined
Use flatMap -
const arr = [
[1, 2, 3],
[4, 5, 6],
[7, 8, 9],
];
const arrMap = arr.flatMap(m => m);
console.log(arrMap);
Why it won't work: map() is supposed to run on each element of an array and return a transformed array of the same length. You have three elements in your input array, so you will always get three elements in your mapped array.
Your expectation can be met by tweaking your code with forEach() if you want. With forEach() nothing is returned, so you have to start with a separate array variable. The code below uses the spread syntax (...) to push each inner array's elements:
const arr = [
[1, 2, 3],
[4, 5, 6],
[7, 8, 9],
];
let arrMap = [];
arr.forEach((it) => arrMap.push(...it));
console.log(arrMap);
But flatMap() is already there:
const arr = [
[1, 2, 3],
[4, 5, 6],
[7, 8, 9],
];
let ans = arr.flatMap(x => x);
console.log(ans);
Use flat if you just want to flatten the array:
const arr = [
[1, 2, 3],
[4, 5, 6],
[7, 8, 9],
];
console.log(arr.flat());
Use flatMap if you want to do something with each element before the array gets flattened.
const arr = [
[1, 2, 3],
[4, 5, 6],
[7, 8, 9],
];
const arrMap = arr.flatMap((el) => {
  el.forEach((n) => console.log(n));
  return el;
});
console.log(arrMap);
forEach doesn't return anything; it's like a for loop, but only for arrays.
Since you have a nested array, you should flatten it using flatMap:
const arr = [
[1, 2, 3],
[4, 5, 6],
[7, 8, 9],
];
const arrMap = arr.flatMap((it) => it);
console.log(arrMap);
I attempted to ask a more complicated version of this before, but I couldn't explain it well, so I am trying again with a simplified use case.
I will have an array of arrays like the following:
var allData = [[1,2,3,4,5],[1,2,3,4,5],[1,2,3,4,5],[1,2,3,4,5],[1,2,3,4,5]]
I need to select 1 element from each array so that I get a unique set like [2,4,1,3,5]. That's easy to do in this case, as each array has all the values. However, this will rarely be the case. Instead I may have:
var allData = [[1,2,4],[1,2],[1,2],[2,4,5],[1,2,3,5]]
In this case I couldn't pick 1 or 2 from the first array, as that would prevent the 2nd and 3rd arrays from having a unique combination. So something like [4,2,1,5,3] or [4,1,2,5,3] would be the only two possible answers for this combination.
The only way I see to do this is to just go through every combination, but these sets will get fairly large, so that doesn't seem reasonable since this happens in real time. There are going to be at least 7 arrays, possibly 14, and it's distantly possible to have 31, so going through every combination would be fairly rough.
The 2nd part is whether there is some way to "know" you have the best possible option. Say there was some way I would know that having a single duplicate is my best-case scenario: even if I have to brute-force it, once I encounter a one-duplicate solution I would know to stop.
One easy way to get a very rough estimate is to just subtract the number of possible choices from the number of elements, but this is the correct answer in only the simplest of cases. Is there some type of library or anything to help solve these types of problems? It is a bit beyond my math abilities.
Here is something I have tried, but it is too slow for larger sets and can fail. It works sometimes for the 2nd case I presented, but only by luck:
const allData = [[1,2,4],[1,2],[1,2],[2,4,5],[1,2,3,5]]
var selectedData = []
for (var i in allData){
  console.log("length", allData[i].length)
  var j = 0
  while (j < allData[i].length){
    console.log("checking", allData[i][j])
    if (selectedData.includes(allData[i][j])){
      console.log("removing item")
      allData[i].splice(j, 1)
    }
    else { j++ }
  }
  var uniqueIds = Object.keys(allData[i])
  console.log(uniqueIds)
  var randId = Math.floor(Math.random() * uniqueIds.length)
  console.log(randId)
  selectedData.push(allData[i][randId])
  console.log("selectedData", selectedData)
}
You can start with a fairly simple backtracking algorithm:
function pick(bins, n = 0, res = {}) {
  if (n === bins.length) {
    return res
  }
  for (let x of bins[n]) {
    if (!res[x]) {
      res[x] = n + 1
      let found = pick(bins, n + 1, res)
      if (found)
        return found
      res[x] = 0
    }
  }
}
//
let a = [[1, 2, 4], [1, 2], [1, 2], [2, 4, 5], [1, 2, 3, 4]]
console.log(pick(a))
This returns a mapping item => bin index + 1, which is easy to convert back to an array if needed.
This should perform relatively well for N < 10. For more/larger bins you can think of some optimizations, for example, avoiding the worst case by sorting bins from shortest to longest, or, depending on the nature of the elements, representing bins as bitmasks.
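For reference, the returned mapping can be turned back into an array like this (a small sketch building on the pick function and the test array a above; the picks name is just for illustration):
const mapping = pick(a)
const picks = []
for (const [value, bin] of Object.entries(mapping)) {
  // bin is the bin index + 1; entries reset to 0 by backtracking are skipped.
  // Number() is used because object keys are strings; adjust if your elements are not numbers.
  if (bin) picks[bin - 1] = Number(value)
}
console.log(picks) // one element chosen from each bin, e.g. [ 4, 1, 2, 5, 3 ]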
You could count all elements and then compare the values that occur in the same sets of arrays (the same indices).
function x([...data]) {
  // Repeat until every slot has been collapsed from an array of candidates to a single value.
  while (data.some(Array.isArray)) {
    const
      // counts: JSON-encoded value -> indices of the arrays that still contain it
      counts = data.reduce((r, a, i) => {
        if (Array.isArray(a)) a.forEach(v => (r[JSON.stringify(v)] = r[JSON.stringify(v)] || []).push(i));
        return r;
      }, {}),
      entries = Object.entries(counts),
      // if a value occurs in exactly one remaining array, resolve that slot to this value
      update = ([k, v]) => {
        if (v.length === 1) {
          data[v[0]] = JSON.parse(k);
          return true;
        }
      };
    if (entries.some(update)) continue;
    // group the values by the exact set of arrays they appear in
    const grouped = entries.reduce((r, [, a]) => {
      const key = JSON.stringify(a);
      r[key] = (r[key] || 0) + 1;
      return r;
    }, {});
    Object.entries(grouped).forEach(([json, length]) => {
      const indices = JSON.parse(json);
      // as many values share this index set as there are arrays in it:
      // give each array in the group the element at a different position
      if (indices.length === length) {
        let j = 0;
        indices.forEach(i => data[i] = data[i][j++]);
        return;
      }
      // a single value is shared by this group of arrays: remove it from all of them
      // and assign it to the first one
      if (length === 1) {
        const value = JSON.parse(entries.find(([_, a]) => JSON.stringify(a) === json)[0]);
        indices.forEach(i => data[i] = data[i].filter(v => v !== value));
        data[indices[0]] = value;
      }
    });
  }
  return data;
}
console.log(...x([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]));
console.log(...x([[1, 2, 4], [1, 2], [1, 2], [2, 4, 5], [1, 2, 3, 5]]));
console.log(...x([[1, 2, 4], [1, 2], [1, 2], [2, 4, 5], [1, 2, 3, 5], [6, 7, 8, 9], [6, 7, 8, 9], [6, 7, 8, 10], [6, 7, 8, 10], [6, 7, 8, 10]]));
Here is an implementation based around counting occurrences across the arrays.
It first creates a map indexed by value counting the number of inner arrays each value occurs in. It then sorts by inner array length to prioritize shorter arrays, and then iterates over each inner array, sorting by occurrence and selecting the first non-duplicate with the lowest count, or, if there are no unique values, the element with the lowest count.
const
  occurrencesAcrossArrays = (arr) =>
    arr
      .reduce((a, _arr) => {
        [...new Set(_arr)].forEach(n => {
          a[n] = a[n] || 0;
          a[n] += 1;
        });
        return a;
      }, {}),
  generateCombination = (arr) => {
    const dist = occurrencesAcrossArrays(arr)
    return arr
      .sort((a, b) => a.length - b.length)
      .reduce((a, _arr) => {
        _arr.sort((a, b) => dist[a] - dist[b]);
        let m = _arr.find(n => !a.includes(n));
        if (m !== undefined) {
          a.push(m);
        } else {
          a.push(_arr[0]);
        }
        return a;
      }, []);
  };
console.log(generateCombination([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]).toString());
console.log(generateCombination([[1, 2, 4], [1, 2], [1], [2, 4, 5], [1, 2, 3, 5]]).toString());
console.log(generateCombination([[1, 2, 4], [1, 2], [1, 2], [2, 4, 5], [1, 2, 3, 5], [6, 7, 8, 9], [6, 7, 8, 9], [6, 7, 8, 10], [6, 7, 8, 10], [6, 7, 8, 10]]).toString());
Edit
In response to your comment: the situation seems to arise because the values all have the same occurrence count and are sequential.
This can be solved by keeping a running count of each value in the result array, and sorting each inner array both by this running count and by the original distribution count. This adds complexity to the sort, but allows you to simply take the first element in the array (the element with the lowest rate of occurrence in the result and, as a tiebreaker, the lowest occurrence count across all arrays).
const
  occurrencesAcrossArrays = (arr) =>
    arr
      .reduce((a, _arr) => {
        [...new Set(_arr)].forEach(n => {
          a[n] = a[n] || 0;
          a[n] += 1;
        });
        return a;
      }, {}),
  generateCombination = (arr) => {
    const dist = occurrencesAcrossArrays(arr)
    return arr
      .sort((a, b) => a.length - b.length)
      .reduce((acc, _arr) => {
        _arr.sort((a, b) => (acc.occurrences[a] || 0) - (acc.occurrences[b] || 0) || dist[a] - dist[b]);
        let m = _arr[0]
        acc.occurrences[m] = acc.occurrences[m] || 0;
        acc.occurrences[m] += 1;
        acc.result.push(m);
        return acc;
      }, { result: [], occurrences: {} })
      .result; // return the .result property of the accumulator
  };
console.log(generateCombination([[2, 3, 4, 5, 6], [2, 3, 4, 5, 6], [2, 3, 4, 5, 6], [2, 3, 4, 5, 6], [2, 3, 4, 5, 6], [2, 3, 4, 5, 6], [2, 3, 4, 5, 6]]).toString());
// 2,3,4,5,6,2,3
console.log(generateCombination([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]).toString());
// 1,2,3,4,5
console.log(generateCombination([[1, 2, 4], [1, 2], [1], [2, 4, 5], [1, 2, 3, 5]]).toString());
// 1,2,4,5,3
console.log(generateCombination([[1, 2, 4], [1, 2], [1, 2], [2, 4, 5], [1, 2, 3, 5], [6, 7, 8, 9], [6, 7, 8, 9], [6, 7, 8, 10], [6, 7, 8, 10], [6, 7, 8, 10]]).toString());
//1,2,4,5,3,9,6,10,7,8
console.log(generateCombination([[1], [2, 3,], [3, 4, 5], [3, 4, 5, 6], [2, 3, 4, 5, 6, 7]]).toString());
// 1,2,4,6,7
A note on .reduce()
If you're having trouble getting your head around .reduce(), you can rewrite all the instances of it in this example using .forEach() and declaring accumulator variables outside of the loop. (This will not always be possible, depending on how you manipulate the accumulator value within a reduce() call.)
Example below:
const occurrencesAcrossArrays = (arr) => {
  const occurrences = {};
  arr.forEach(_arr => {
    [...new Set(_arr)].forEach(n => {
      occurrences[n] = occurrences[n] || 0;
      occurrences[n] += 1;
    });
  });
  return occurrences;
};
const generateCombination = (arr) => {
  const dist = occurrencesAcrossArrays(arr);
  const result = [];
  const occurrences = {};
  arr.sort((a, b) => a.length - b.length);
  arr.forEach(_arr => {
    _arr.sort((a, b) => (occurrences[a] || 0) - (occurrences[b] || 0) || dist[a] - dist[b]);
    let m = _arr[0]
    occurrences[m] = occurrences[m] || 0;
    occurrences[m] += 1;
    result.push(m);
  });
  return result;
};
console.log(generateCombination([[2, 3, 4, 5, 6], [2, 3, 4, 5, 6], [2, 3, 4, 5, 6], [2, 3, 4, 5, 6], [2, 3, 4, 5, 6], [2, 3, 4, 5, 6], [2, 3, 4, 5, 6]]).toString());
// 2,3,4,5,6,2,3
console.log(generateCombination([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]).toString());
// 1,2,3,4,5
console.log(generateCombination([[1, 2, 4], [1, 2], [1], [2, 4, 5], [1, 2, 3, 5]]).toString());
// 1,2,4,5,3
console.log(generateCombination([[1, 2, 4], [1, 2], [1, 2], [2, 4, 5], [1, 2, 3, 5], [6, 7, 8, 9], [6, 7, 8, 9], [6, 7, 8, 10], [6, 7, 8, 10], [6, 7, 8, 10]]).toString());
//1,2,4,5,3,9,6,10,7,8
console.log(generateCombination([[1], [2, 3,], [3, 4, 5], [3, 4, 5, 6], [2, 3, 4, 5, 6, 7]]).toString());
// 1,2,4,6,7
You could solve this problem using a MILP model. Here is one implementation in MiniZinc (the data has been extended to seven days):
int: Days = 7;
int: Items = 5;
set of int: DAY = 1..Days;
set of int: ITEM = 1..Items;
array[DAY, ITEM] of 0..1: A = % whether item k is allowed on day i
[| 1, 1, 0, 1, 0
| 1, 1, 0, 0, 0
| 1, 1, 0, 0, 0
| 0, 1, 0, 1, 1
| 1, 1, 0, 0, 0
| 0, 1, 0, 1, 1
| 1, 1, 1, 0, 1 |];
array[DAY, ITEM] of var 0..1: x; % 1 if item k is selected on day i, otherwise 0
array[DAY, DAY, ITEM] of var 0..1: w; % 1 if item k is selected on both day i and day j, otherwise 0
% exactly one item per day
constraint forall(i in DAY)
  (sum(k in ITEM)(x[i, k]) = 1);
% linking variables x and w
constraint forall(i, j in DAY, k in ITEM where i < j)
  (w[i, j, k] <= x[i, k] /\ w[i, j, k] <= x[j, k] /\ w[i, j, k] >= x[i, k] + x[j, k] - 1);
% try to minimize duplicates; if there are duplicates, put them as far apart as possible
var int: obj = sum(i, j in DAY, k in ITEM where i < j)(((Days - (j - i))^2)*w[i, j, k]);
solve minimize obj;
output
  ["obj="] ++ [show(obj)] ++
  ["\nitem="] ++ [show([sum(k in ITEM)(k*x[i, k]) | i in DAY])];
Running gives:
obj=8
item=[2, 1, 5, 4, 3, 2, 1]
The following package looks promising for a JavaScript implementation: https://www.npmjs.com/package/javascript-lp-solver
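As an untested sketch of what that might look like (assuming the package's documented solver.Solve(model) interface; the variable and constraint names below are made up for illustration, and this is only a rough port of the MiniZinc model, not a verified implementation):
const solver = require("javascript-lp-solver");

// A[i][k] = 1 if item k is allowed on day i (same data as the MiniZinc model above)
const A = [
  [1, 1, 0, 1, 0],
  [1, 1, 0, 0, 0],
  [1, 1, 0, 0, 0],
  [0, 1, 0, 1, 1],
  [1, 1, 0, 0, 0],
  [0, 1, 0, 1, 1],
  [1, 1, 1, 0, 1],
];
const Days = A.length, Items = A[0].length;

const model = { optimize: "obj", opType: "min", constraints: {}, variables: {}, ints: {} };

// x_i_k = 1 if item k is selected on day i
for (let i = 0; i < Days; i++) {
  model.constraints[`day_${i}`] = { equal: 1 }; // exactly one item per day
  for (let k = 0; k < Items; k++) {
    if (!A[i][k]) continue; // only allowed items get a variable
    model.variables[`x_${i}_${k}`] = { [`day_${i}`]: 1, obj: 0 };
    model.ints[`x_${i}_${k}`] = 1;
  }
}

// w_i_j_k = 1 if item k is selected on both day i and day j (linearized product, as in the MiniZinc model)
for (let i = 0; i < Days; i++) {
  for (let j = i + 1; j < Days; j++) {
    for (let k = 0; k < Items; k++) {
      if (!A[i][k] || !A[j][k]) continue;
      const w = `w_${i}_${j}_${k}`;
      // encodes: w <= x_i_k, w <= x_j_k, w >= x_i_k + x_j_k - 1
      model.constraints[`${w}_le_i`] = { max: 0 };
      model.constraints[`${w}_le_j`] = { max: 0 };
      model.constraints[`${w}_ge`] = { min: -1 };
      model.variables[w] = {
        obj: (Days - (j - i)) ** 2, // same weights as the MiniZinc objective
        [`${w}_le_i`]: 1, [`${w}_le_j`]: 1, [`${w}_ge`]: 1,
      };
      model.variables[`x_${i}_${k}`][`${w}_le_i`] = -1;
      model.variables[`x_${i}_${k}`][`${w}_ge`] = -1;
      model.variables[`x_${j}_${k}`][`${w}_le_j`] = -1;
      model.variables[`x_${j}_${k}`][`${w}_ge`] = -1;
      model.ints[w] = 1;
    }
  }
}

// prints the objective value plus the variables that end up nonzero (the chosen x_i_k)
console.log(solver.Solve(model));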
I am trying to create a function that will get the items that cannot be seen in the 2nd, 3rd, and following arrays passed to the function.
Right now my function gets only the similar items. How can I make it get the difference (which are the items that don't exist in the 2nd, 3rd, and following arrays)?
const callM = function(arrays) {
  arrays = Array.prototype.slice.call(arguments);
  let result = [];
  for (let i = 1; i < arrays.length; i++) {
    for (let x = 0; x < arrays[i].length; x++) {
      if (arrays[0].includes(arrays[i][x])) {
        result.push(arrays[i][x]);
      }
    }
  }
  return result;
};
console.log(callM([1, 2, 3, 4, 5], [5, 2, 10])); // -> must be [1, 3, 4]
console.log(callM([1, 2, 3, 4, 5], [5, 2, 10], [7, 1, 8])); // -> must be [3,4]
The logic right now is a bit off, as it gets the opposite. How do I fix this?
Also, is there a way to do this using higher-order functions such as reduce or filter?
Thanks!
I'd think about this differently: as the difference between two sets, array 0 and arrays 1...n.
To get array 0, just shift it off the top
const arr0 = arrays.shift()
Ref: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/shift
This removes the first array from arrays
Next we combine the remaining arrays
const arrN = arrays.reduce(function(prev, curr) {
  return prev.concat(curr)
})
Ref: http://www.jstips.co/en/javascript/flattening-multidimensional-arrays-in-javascript/
Unneeded, handled by includes as mentioned by #Phil
Next filter duplicates from arrN by comparing with itself
const unique = arrN.filter(function(elem, index, self) {
  return index == self.indexOf(elem);
})
Ref: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/filter
Next, filter with includes to find the difference:
const diff = arr0.filter(function(item) {
  return !arrN.includes(item)
})
Full snippet:
function callM(arrays) {
  const arr0 = arrays.shift()
  const arrN = arrays.reduce(function(prev, curr) {
    return prev.concat(curr)
  })
  //const unique = arrN.filter(function(elem, index, self) {
  //  return index == self.indexOf(elem)
  //})
  return arr0.filter(function(item) {
    return !arrN.includes(item)
  })
}
console.log(callM([[1, 2, 3, 4, 5], [5, 2, 10]]))
console.log(callM([[1, 2, 3, 4, 5], [5, 2, 10], [7, 1, 8]]))
Of course, ES6 would be easier. ;)
const callM = (first, ...rest) => {
  const arrays = [].concat(...rest)
  return first.filter(item => !arrays.includes(item))
}
console.log(callM([1, 2, 3, 4, 5], [5, 2, 10]))
console.log(callM([1, 2, 3, 4, 5], [5, 2, 10], [7, 1, 8]))
A short solution for small and medium sized arrays:
// Return elements in array but not in filters:
function difference(array, ...filters) {
  return array.filter(el => !filters.some(filter => filter.includes(el)));
}
// Example:
console.log(difference([1, 2, 3, 4, 5], [5, 2, 10])); // [1, 3, 4]
console.log(difference([1, 2, 3, 4, 5], [5, 1, 10], [7, 2, 8])); // [3, 4]
For large inputs, consider creating a Set from all filters and filtering in linear time using set.has(el).
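A minimal sketch of that Set-based variant (the name differenceLarge is just for illustration, not part of the original answer):
// Build one combined Set of everything to exclude, then filter in linear time.
function differenceLarge(array, ...filters) {
  const exclude = new Set(filters.flat());
  return array.filter(el => !exclude.has(el)); // Set#has is O(1) on average
}
console.log(differenceLarge([1, 2, 3, 4, 5], [5, 2, 10], [7, 1, 8])); // [3, 4]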
In order to fix your implementation, you could label the outer for-loop and continue from there whenever a filter contains one of the array elements. Only when all filters pass without a match do you push the array element into the result:
// Return elements in array but not in filters:
function difference(array, ...filters) {
  const result = [];
  loop: for (const el of array) {
    for (const filter of filters) {
      if (filter.includes(el)) continue loop;
    }
    result.push(el);
  }
  return result;
}
// Example:
console.log(difference([1, 2, 3, 4, 5], [5, 2, 10])); // [1, 3, 4]
console.log(difference([1, 2, 3, 4, 5], [5, 2, 10], [7, 1, 8])); // [3,4]
If you're willing to use Underscore, you can do this in one line of code:
console.log(_.difference([1, 2, 3, 4, 5], [5, 2, 10], [7, 1, 8]))
https://jsfiddle.net/o1zuaa6m/
You can use array#reduce to create an object lookup from all the arrays other than the first one. Then use array#filter to get the values which are not present in the object lookup:
var callM = (first, ...rest) => {
  var combined = rest
    .reduce((res, arr) => res.concat(arr))
    .reduce((o, v) => {
      o[v] = true;
      return o;
    }, {});
  return first.filter(v => !combined[v]);
}
console.log(callM([1, 2, 3, 4, 5], [5, 2, 10])); // -> must be [1, 3, 4]
console.log(callM([1, 2, 3, 4, 5], [5, 2, 10], [7, 1, 8])); // -> must be [3,4]
The "proper" way to exclude values is usually to use a lookup hash set with the values to exclude:
const callM = (a, ...b) => (b = new Set([].concat(...b)), a.filter(v => !b.has(v)))
console.log(callM([1, 2, 3, 4, 5], [5, 2, 10])); // [1, 3, 4]
console.log(callM([1, 2, 3, 4, 5], [5, 2, 10], [7, 1, 8])); // [3, 4]