I made this function to get all possible combinations of a set that sum to a target value. It works, but it is not as organized/efficient as it could be yet.
After frying my brain trying to optimize this function I ran out of ideas and feel a little blind, so I came here to get some advice and ideas about what more I can do.
const combinate = (set, target) => {
const result = []
set.forEach((element) => {
let division = Math.floor(target / element)
let remainder = target % element
while (remainder !== target) {
let combinations = []
for (let index = 0; index < division; index++) {
combinations.push(element)
}
if (remainder === 0) {
result.push(combinations.sort())
break
} else if (set.includes(remainder)) {
combinations.push(remainder)
result.push(combinations.sort())
break
} else {
division--
remainder += element
}
}
})
return result
}
Here is an example of the expected outcome, showing how this function should work.
combinate([2, 3, 5], 8) -> [[2,2,2,2],[2,3,3],[3,5]]
I think your algorithm is fine.
I personally would structure the code differently, as I'm a fan of expression-only coding. My version might look like this:
// Ex: countDown (6) //=> [6, 5, 4, 3, 2, 1, 0]
const countDown = (n) =>
n < 0 ? [] : [n, ...countDown (n - 1)]
const subsetSum = ([n, ...ns], t) =>
n == undefined
? t == 0 ? [[]] : []
: countDown (Math.floor (t / n)) .flatMap (
(k) => subsetSum (ns, t - k * n) .map (ss => [...Array (k) .fill (n), ...ss])
)
console .log (subsetSum ([2, 3, 5], 8))
I count down rather than up just so the results come in the same order yours did. If I counted up, they would show up as [[3, 5], [2, 3, 3], [2, 2, 2, 2]].
But this is essentially the same algorithm as yours. If it performs poorly, then I might look at a dynamic programming version where we calculate the results for each lower total; then, for our target total, we look up the values found by subtracting each of the numbers in our set, and for each of those results we add one of that number. Here's one version:
const countUp = (n) =>
(n < 1) ? [] : [... countUp (n - 1), n]
const oneMore = (i) => (s) =>
s .split ('-') .map (Number) .map ((n, j) => j == i ? n + 1 : n) .join ('-')
const collect = (ns, t) =>
countUp (t) .reduce (
(rs, v) => [
...rs,
new Set (ns .flatMap ((n, i) => ([...rs [v - n] || []]) .map (oneMore (i))))
],
[new Set([ns .map (n => 0) .join ('-')])]
)
const subsetSum = (ns, t) =>
[...collect (ns, t) [t]]
.map (s => s.split ('-') .map(Number) .flatMap ((c, i) => Array (c) .fill (ns[i])))
console .log (subsetSum ([2, 3, 5], 8))
The main function here, collect, accepts, say, [2, 3, 5] and 8, and returns something like
[
new Set (['0-0-0']), // 0
new Set ([]), // 1
new Set (['1-0-0']), // 2
new Set (['0-1-0']), // 3
new Set (['2-0-0']), // 4
new Set (['1-1-0', '0-0-1']), // 5
new Set (['3-0-0', '0-2-0']), // 6
new Set (['2-1-0', '1-0-1']), // 7
new Set (['4-0-0', '1-2-0', '0-1-1']), // 8
]
where, say, '1-1-0' represents one 2, one 3, and zero 5s, which add up to 5, and '0-1-1' represents zero 2s, one 3, and one 5, which add up to 8. In retrospect, a better string format would probably have been the JSON-stringified version of something like {2: 1, 3: 1, 5: 0}. But I'll leave that as an exercise.
The values are stored as Strings in Sets to eliminate duplicates as we go. For instance, when we hit 5, we can add a 2 to '0-1-0' or a 3 to '1-0-0', both of which end up as '1-1-0'. But we only want a single copy of that result.
We use two minor helper functions. countUp, for instance, turns 7 into [1, 2, 3, 4, 5, 6, 7]. oneMore handles the string-to-array-of-numbers-and-back-to-string conversion such that
oneMore (0) ('1-7-4') //=> '2-7-4'
oneMore (1) ('1-7-4') //=> '1-8-4'
oneMore (2) ('1-7-4') //=> '1-7-5'
The main function simply extracts the last value computed by collect and then for each of the Strings in the set, converts that back into a proper array.
I have not tested for performance, but there's a real chance that this will be faster than the original algorithm. If nothing else, it demonstrates a substantially different technique.
I am trying to return an array of indexes of values that add up to a given target. I am trying to solve it the fastest way I can!
Examples:
sumOfTwo([1, 2, 4, 4], 8) // => [2, 3]
sumOfTwo([1, 2, 3, 9], 8) // => []
So first I tried a simple brute-force. (Time complexity: O(n^2) )
function sumOfTwo(arr, target) {
for (let i = 0; i < arr.length; i++) {
for (let j = i + 1; j < arr.length; j++) {
if (arr[i] + arr[j] === target) {
return [i, j];
}
}
}
return [];
}
Then I tried: (Time complexity: sorting O(n log n) + for loop O(n))
function sumOfTwo(arr, target) {
const sortedArr = arr.sort();
let idxFromBack = arr.length - 1;
for (let [idx, val] of sortedArr.entries()) {
if (val + arr[idxFromBack] > target) {
idxFromBack--;
}
if (val + arr[idxFromBack] === target) {
return [idx, idxFromBack];
}
}
return [];
}
Then I came up with this solution, whose time complexity I don't even know.
function sumOfTwo(arr, target) {
const complements = [];
for (let [idx, val] of arr.entries()) {
if (complements.reduce((acc, v) => (acc || v.value === val), false)) {
return [complements.find(v => v.value === target - val).index, idx];
}
complements.push({index: idx, value: target - val});
}
return [];
}
I know that I am using a for loop, but I don't know the complexity of the built-in higher-order functions .reduce() and .find(). I tried a couple of searches but couldn't find anything.
If anyone can help me would be great! Please include Big-O notation if possible.
Repl.it: https://repl.it/#abranhe/sumOfTwo
Please also include the time complexity of the last solution.
The minimum time complexity of .reduce is O(n), because it must iterate through all elements once (assuming an error isn't thrown), but it can be unbounded (since you can write any code you want inside the callback).
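For instance, here is a small hypothetical illustration (the array and names below are made up, not from the question): the reduce itself visits each element once, but its callback does an O(n) scan, so the whole call ends up O(n^2).
const values = [3, 1, 4, 1, 5];
// reduce makes one pass, but the callback's filter scans the whole array each time,
// so the overall cost is O(n^2) rather than O(n)
const smallerCounts = values.reduce(
  (acc, v) => acc + values.filter(x => x < v).length,
  0
);
console.log(smallerCounts); // 9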
For your
// Loop, O(n), n = length of arr:
for (let [idx, val] of arr.entries()) {
// .reduce, O(n), n = length of complements:
if (complements.reduce((acc, v) => (acc || v.value === val), false)) {
// If test succeeds, .find, O(n), n = length of complements:
return [complements.find(v => v.value === target - val).index, idx];
}
complements.push({index: idx, value: target - val});
}
the time complexity is, worst case, O(n^2). The reduce runs in O(n) time, and you run a reduce for every entry in arr, making it O(n^2).
(The .find is also an O(n) operation, but O(n) + O(n) = O(n))
Your code that sorts the array beforehand has the right idea for decreasing complexity, but it has a couple of flaws.
First, you should sort numerically, with .sort((a, b) => a - b); .sort() with no arguments sorts lexicographically (e.g. [1, 11, 2], which is not what you want).
Second, just decrementing idxFromBack isn't enough: for example, sumOfTwo([1, 3, 8, 9, 9], 9) will not see that 1 and 8 are a match. Perhaps the best strategy here would be to oscillate with a while loop instead: from idxFromBack, iterate backwards until a match is found or the sum is too small, and also iterate forwards until a match is found or the sum is too large (a sketch of the closely related two-pointer walk follows below).
You can also improve the performance of this code by sorting not with .sort((a, b) => a - b), which has a complexity of O(n log n), but with radix sort or counting sort instead (both of which run in O(n + k), where k depends on the range of the values). The optimal algorithm will depend on the general shape and variance of the input.
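To make the sorted idea concrete, here is a minimal sketch (the name sumOfTwoSorted is just illustrative, and this is the classic two-pointer walk rather than the exact oscillating strategy described above). It sorts a copy while remembering each value's original index, since sorting in place would scramble the indices you need to return:
const sumOfTwoSorted = (arr, target) => {
  // pair each value with its original index, then sort numerically by value
  const indexed = arr
    .map((value, index) => [value, index])
    .sort((a, b) => a[0] - b[0]);
  let lo = 0;
  let hi = indexed.length - 1;
  while (lo < hi) {
    const sum = indexed[lo][0] + indexed[hi][0];
    if (sum === target) {
      // return the original indices, smallest first
      return [indexed[lo][1], indexed[hi][1]].sort((a, b) => a - b);
    }
    if (sum < target) lo++; else hi--;
  }
  return [];
};
console.log(sumOfTwoSorted([1, 2, 4, 4], 8)); // [2, 3]
console.log(sumOfTwoSorted([1, 3, 8, 9, 9], 9)); // [0, 2] (1 + 8)
The sort still dominates at O(n log n), though, so this mainly fixes correctness; the Map/object approach below is what gets you to O(n).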
An even better, altogether different O(n) strategy would be to use a Map or object. While iterating over the array, store the value that would complete a match for the current item as a key of the object (with the current index as its value), and first check whether the current value already exists as a key in the object:
const sumOfTwo = (arr, target) => {
const obj = {};
for (const [i, num] of arr.entries()) {
if (obj.hasOwnProperty(String(num))) {
return [obj[num], i];
}
const matchForThis = target - num;
obj[matchForThis] = i;
}
return [];
};
console.log(
sumOfTwo([1, 2, 4, 4], 8), // => [2, 3]
sumOfTwo([1, 2, 8, 9], 9), // 1 + 8 = 9; [0, 2]
sumOfTwo([1, 2, 3, 9], 8) // => []
);
As a supplementary answer, here is the algorithm of the find method in the language spec:
When the find method is called, the following steps are taken:
1. Let O be ? ToObject(this value).
2. Let len be ? LengthOfArrayLike(O).
3. If IsCallable(predicate) is false, throw a TypeError exception.
4. Let k be 0.
5. Repeat, while k < len,
   a. Let Pk be ! ToString(𝔽(k)).
   b. Let kValue be ? Get(O, Pk).
   c. Let testResult be ! ToBoolean(? Call(predicate, thisArg, « kValue, 𝔽(k), O »)).
   d. If testResult is true, return kValue.
   e. Set k to k + 1.
6. Return undefined.
Note the "repeat, while k < len" in step 5. Since time complexity in general measures the worst case scenario (aka the upper bound), we can assume that the searched element is not present in the collection.
The number of iterations made during step 5 then is equal to len which directly depends on the number of elements in the collection. And what time complexity has a direct correlation to the number of elements? Exactly, the linear O(n).
For a visual-ish demonstration, run the following snippet. Apart from some stray dots, the improvised graph should show a linear progression (it takes a little while to display in Stack snippets, but you can watch it live in the devtools console):
const iter = 1e7;
const incr = 2e5;
const test = new Array(iter).fill(0);
const steps = Math.ceil(iter / incr);
for (let i = 0; i < steps; i += 1) {
const sub = test.slice(0, i * incr + incr);
const s = Date.now();
const find = sub.find((v) => v === 1);
const e = Date.now();
const d = e - s;
console.log("\u200A".repeat(Math.floor(d/3))+"*");
}
I have a function which takes an array of numbers as an argument. I want to return a new array with the products of each number except for the number at the current index.
For example, if arr had 5 indexes and we were creating the value for index 1, the numbers at index 0, 2, 3 and 4 would be multiplied.
Here is the code I have written:
function getProducts(arr) {
let products = [];
for(let i = 0; i < arr.length; i++) {
let product = 0;
for(let value in arr.values()) {
if(value != arr[i]) {
product *= value;
}
}
products.push(product);
}
return products;
}
getProducts([1, 7, 3, 4]);
// Output ➞ [0, 0, 0, 0]
// Expected output ➞ [84, 12, 28, 21]
As you can see, the desired output is not produced. I did some experimenting, and it appears that the second for loop never actually runs, as any code I put inside the block does not execute:
function getProducts(arr) {
let products = [];
for(let i = 0; i < arr.length; i++) {
let product = 0;
for(let value in arr.values()) {
console.log('hello!');
if(value != arr[i]) {
product *= value;
}
}
products.push(product);
}
return products;
}
getProducts([1, 7, 3, 4]);
// Output ➞
// Expected Output ➞ 'hello!'
What is wrong with my code?
You could take the product of all the numbers and divide it by the value at each index to get the product of everything except the current value.
function getProducts(array) {
var product = array.reduce((a, b) => a * b, 1);
return array.map(p => product / p);
}
console.log(getProducts([1, 7, 3, 4]));
A more reliable approach handles an array containing a single zero (if an array has more than one zero, all products are zero).
The approach below replaces the value at the current index with one.
function getProducts(array) {
return array.map((_, i, a) => a.reduce((a, b, j) => a * (i === j || b), 1));
}
console.log(getProducts([1, 7, 0, 4]));
console.log(getProducts([1, 7, 3, 4]));
You simply have to change the in keyword to the of keyword. A for...in loop is not the same as a for...of loop.
arr.values() returns an iterator, which has to be iterated with the of keyword.
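To see the difference concretely, here is a small illustrative snippet (not from the question). for...in enumerates enumerable property keys, and the iterator object returned by arr.values() has none, which is exactly why the inner loop body never runs; for...of actually drives the iterator:
const arr = [10, 20, 30];
// for...in walks enumerable property keys; the array iterator has none,
// so this body never executes
for (const key in arr.values()) {
  console.log('in:', key); // never reached
}
// for...of consumes the iterator and yields the values themselves
for (const value of arr.values()) {
  console.log('of:', value); // 10, 20, 30
}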
Also, if product = 0, then all your multiplications will return 0.
By the way, this code is prone to error, because you don't check the current index; you check whether the value you are multiplying is different from the current value. This will lead to a problem if the same number appears more than once in the array (for example, with [2, 2, 3] both 2s would be skipped at every index).
And, regarding good practices, it is a bit odd that you first iterate through the array with a for(let i... loop and the second time with a for...in/of loop.
I've fixed the code for you:
function getProducts(arr) {
let products = [];
for(let i = 0; i < arr.length; i++) {
let product = 1;
for(let ii = 0; ii < arr.length; ii++) {
if(i != ii) {
product *= arr[ii];
}
}
products.push(product);
}
return products;
}
A better way to do that is to get the total product and use map() to divide the total by each value:
function getProducts(arr){
let total = arr.reduce((ac,a) => ac * a,1);
return arr.map(x => x === 0 ? total : total/x);
}
console.log(getProducts([1, 7, 3, 4]))
Explanation: replace the number at i with 1 so it doesn't interfere with the multiplication. Also, apply the fill on a copy of a, hence the [...a].
console.log( [2,3,4,5].map( (n,i,a) => [...a].fill(1,i,i+1).reduce( (a,b) => a*b ) ) )
I need to get all possible subsets of an array.
Say I have this:
[1, 2, 3]
How do I get this?
[], [1], [2], [3], [1, 2], [2, 3], [1, 3], [1, 2, 3]
I am interested in all subsets. For subsets of specific length, refer to the following questions:
Finding subsets of size n: 1, 2
Finding subsets of size > 1: 1
Here is one more very elegant solution with no loops or recursion, only using the map and reduce array native functions.
const getAllSubsets =
theArray => theArray.reduce(
(subsets, value) => subsets.concat(
subsets.map(set => [value,...set])
),
[[]]
);
console.log(getAllSubsets([1,2,3]));
We can solve this problem for the portion of the input array starting at offset, and then recurse to build the complete solution.
Using a generator function allows us to iterate through subsets with constant memory usage:
// Generate all array subsets:
function* subsets(array, offset = 0) {
while (offset < array.length) {
let first = array[offset++];
for (let subset of subsets(array, offset)) {
subset.push(first);
yield subset;
}
}
yield [];
}
// Example:
for (let subset of subsets([1, 2, 3])) {
console.log(subset);
}
Runtime complexity is proportional to the number of solutions (2ⁿ) times the average length per solution (n/2), i.e. O(n·2ⁿ).
Simple solution without recursion:
function getAllSubsets(array) {
const subsets = [[]];
for (const el of array) {
const last = subsets.length-1;
for (let i = 0; i <= last; i++) {
subsets.push( [...subsets[i], el] );
}
}
return subsets;
}
How does it work?
If we have some subsets generated from the input numbers and we want to add one more number to our input array, we can take all the already existing subsets and generate new ones by appending the new number to each of them.
Here is an example for [1, 2, 3]
Start with an empty subset: []
Create new subsets by adding "1" to each existing subset. Now we have: [], [1]
Create new subsets by adding "2" to each existing subset. Now we have: [], [1], [2], [1, 2]
Create new subsets by adding "3" to each existing subset. Now we have: [], [1], [2], [1, 2], [3], [1, 3], [2, 3], [1, 2, 3]
Another simple solution.
function getCombinations(array) {
function fork(i, t) {
if (i === array.length) {
result.push(t);
return;
}
fork(i + 1, t.concat([array[i]]));
fork(i + 1, t);
}
var result = [];
fork(0, []);
return result;
}
var data = [1, 2, 3],
result = getCombinations(data);
console.log(result);
You can easily generate the powerset from an array, using something like the following:
var arr = [1, 2, 3];
function generatePowerSet(array) {
var result = [];
result.push([]);
for (var i = 1; i < (1 << array.length); i++) {
var subset = [];
for (var j = 0; j < array.length; j++)
if (i & (1 << j))
subset.push(array[j]);
result.push(subset);
}
return result;
}
console.log(generatePowerSet(arr));
Throughout the main loop of the function, subsets are created and then pushed into the result array.
I set out to understand what is happening with the examples in this post. While the function generator example, the bit-wise operator example, and the example using the array map and reduce functions are very elegant and impressive, I found it tough to mentally visualize what precisely was happening. I have two examples below, a non-recursive and a recursive solution, that I believe are easy to visualize. I hope this helps others attempting to wrap their heads around the process of finding all subsets.
NON-RECURSIVE:
For each value of the array, clone all existing subsets (including the empty set), add the new value to each of the clones, and push the clones back to the results.
const PowerSet = array => {
const result = [[]] // Starting with empty set
for (let value of array) { // For each value of the array
const length = result.length // Can't use result.length in loop since
// results length is increased in loop
for (let i = 0; i < length; i++){
let temp = result[i].slice(0) // Make a clone of the value at index i
temp.push(value) // Add current value to clone
result.push(temp) // Add clone back to results array
}
}
return result;
}
console.log(PowerSet([1,2,3]))
RECURSIVELY:
Build the power set by recursively pushing the value at the current index concatenated onto an ever-growing prefix array of values.
const powerSetRecursive = (arr, prefix=[], set=[[]]) => {
if(arr.length === 0) return// Base case, end recursion
for (let i = 0; i < arr.length; i++) {
set.push(prefix.concat(arr[i]))// If a prefix comes through, concatenate value
powerSetRecursive(arr.slice(i + 1), prefix.concat(arr[i]), set)
// Call function recursively removing values at or before i and adding
// value at i to prefix
}
return set
}
console.log(powerSetRecursive([1,2,3]))
function subSets(num){
/*
example given number : [1,3]
[]
1: copy push 1
[] [1]
3: copy push 3
[] [1] [3] [1,3]
*/
let result = [];
result.push([]);
for(let i=0; i < num.length;i++){
let currentNum = num[i];
let len = result.length;
for(let j=0; j < len; j++){
let cloneData = result[j].slice();
cloneData.push(currentNum);
result.push(cloneData)
}
}
return result;
}
let test = [1,3];
console.log(subSets(test))//[ [], [ 1 ], [ 3 ], [ 1, 3 ] ]
let subsets = (n) => {
let result = [];
result.push([]);
n.forEach(a => {
//array length
let length = result.length;
let i =0;
while(i < length){
let temp = result[i].slice(0);
temp.push(a);
result.push(temp);
i++;
}
})
return result;
}
Using flatMap and rest/spread, this can be fairly elegant:
const subsets = ([x, ...xs]) =>
x == undefined
? [[]]
: subsets (xs) .flatMap (ss => [ss, [x, ...ss]])
console .log (subsets ([1, 2, 3]))
This version does not return them in the requested order. Doing that seems slightly less elegant, and there's probably a better version:
const subset = (xs = []) => {
if (xs.length == 0) {return [[]]}
const ys = subset (xs .slice (0, -1))
const x = xs .slice (-1) [0]
return [... ys, ... ys .map (y => [... y, x])]
}
Or, the same algorithm in a different style,
const subsets = (
xs = [],
x = xs .slice (-1) [0],
ys = xs.length && subsets (xs .slice (0, -1))
) =>
xs .length == 0
? [[]]
: [... ys, ... ys .map (y => [... y, x])]
A shorter version of @koorchik's answer.
var getAllSubsets = (nums) => {
const subsets = [[]];
for (const n of nums) {
subsets.map((el) => {
subsets.push([...el, n]);
});
}
return subsets;
};
console.log(getAllSubsets([1, 2, 3]));
// [[],[1],[2],[1,2],[3],[1,3],[2,3],[1,2,3]]
For loop:
function powerSet(numbers) {
const subsets = [[]]
for (const number of numbers) {
subsets.forEach(subset => subsets.push([...subset, number]))
}
return subsets
}
Recursion:
function powerSet(numbers) {
const subsets = [[]]
if (numbers.length === 0) return subsets
for (let i = 0; i < numbers.length; i++) {
subsets.push(...powerSet(numbers.slice(i + 1)).map(subset => [numbers[i], ...subset]))
// Or step by step:
// const number = numbers[i]
// const otherNumbers = numbers.slice(i + 1)
// const otherNumbersSubsets = powerSet(otherNumbers)
// const otherNumbersSubsetsWithNumber = otherNumbersSubsets.map(subset => [number, ...subset])
// subsets.push(...otherNumbersSubsetsWithNumber)
}
return subsets
}
Using reduceRight:
const subsets = array =>
array.reduceRight(
(accumulator, a) => [...accumulator, ...accumulator.map(b => [a, ...b])],
[[]]
);
console.log(subsets([1, 2, 3])); // [[], [3], [2], [2, 3], [1], [1, 3], [1, 2], [1, 2, 3]]
This one uses recursion:
var subsets = function(s){
if(s.length === 0) {
return [[]]
}
var h,t,ss_excl_h;
var ss_incl_h = [];
[h,...t] = s;
ss_excl_h = subsets(t)
for (const ss of ss_excl_h) {
let hArr = [];
hArr.push(h);
let temp = hArr.concat(ss)
ss_incl_h.push(temp);
}
return ss_incl_h.concat(ss_excl_h)
}
console.log(subsets([1,2,3])) // returns distinct subsets
Update ES2020
With ES2020 BigInts have become available.
Bigints don’t have a fixed storage size in bits; their sizes adapt to the integers they represent.
- Dr. Axel Rauschmayer; JavaScript for impatient programmers - Chapter 18.2 BigInts
See source.
Using BigInts we can use a binary counter to calculate the power set and are no longer limited by the maximum integer size.
Using a generator, we can additionally loop over a power set with a constant memory requirement, which is important if you want to generate a huge power set.
Here is an example using your original array [1, 2, 3].
/**
 * Generate power set from a given array
 * @param {Array<any>} array array to create power set from
 */
function* powerSet(array) {
  // use BigInt so we are not limited by the 53-bit (or 32-bit bitwise) integer limits
  const size = 2n ** BigInt(array.length);
  // the counter must be a BigInt too, otherwise the bit test breaks for long arrays
  for (let i = 0n; i < size; i++) {
    const cur = [];
    for (let j = 0; j < array.length; j++) {
      // check if the j-th bit of i is set to 1
      if ((i & (1n << BigInt(j))) > 0n) {
        // push the array value represented by that 1-bit to the result
        cur.push(array[j]);
      }
    }
    // yield the next subset
    yield cur;
  }
}
// generate power set for [1, 2, 3] and print results
console.log([...powerSet([1, 2, 3])]);
Here is how you could loop over a very large power set with constant memory and no upper bound on the array length (in practice, of course, compute time imposes one).
/**
 * Generate power set from a given array
 * @param {Array<any>} array array to create power set from
 */
function* powerSet(array) {
  // use BigInt so we are not limited by the 53-bit (or 32-bit bitwise) integer limits
  const size = 2n ** BigInt(array.length);
  // the counter must be a BigInt too, otherwise the bit test breaks for long arrays
  for (let i = 0n; i < size; i++) {
    const cur = [];
    for (let j = 0; j < array.length; j++) {
      // check if the j-th bit of i is set to 1
      if ((i & (1n << BigInt(j))) > 0n) {
        // push the array value represented by that 1-bit to the result
        cur.push(array[j]);
      }
    }
    // yield the next subset
    yield cur;
  }
}
/**
 * Helper function to generate an array containing more than 53 elements
 * @param {number} start
 * @param {number} end
 */
function* range(start, end){
for (let i = start; i < end; i++) {
yield i;
}
}
// create an array containing elements 1 through 60 ([1, 2, 3, ..., 60])
const oneToSixty = [...range(1, 61)];
let i = 0;
const max = 1000;
// loop over whole powerSet with constant memory requirement
// abort after 1000 subsets, otherwise this will take a very long time to complete
for(const subset of powerSet(oneToSixty)){
console.log(subset);
if(i++ === max) break;
}