What is the time complexity of Javascript Array.reduce() and Array.find()? - javascript

I am trying to return an array of indexes of values that add up to a given target. I am trying to solve it the fastest way I can!
Examples:
sumOfTwo([1, 2, 4, 4], 8) // => [2, 3]
sumOfTwo([1, 2, 3, 9], 8) // => []
So first I tried a simple brute-force. (Time complexity: O(n^2) )
function sumOfTwo(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] + arr[j] === target) {
        return [i, j];
      }
    }
  }
  return [];
}
Then I tried: (Time complexity: sorting O(n log n) + for loop O(n))
function sumOfTwo(arr, target) {
  const sortedArr = arr.sort();
  let idxFromBack = arr.length - 1;
  for (let [idx, val] of sortedArr.entries()) {
    if (val + arr[idxFromBack] > target) {
      idxFromBack--;
    }
    if (val + arr[idxFromBack] === target) {
      return [idx, idxFromBack];
    }
  }
  return [];
}
Then I came up with this solution, whose time complexity I don't even know.
function sumOfTwo(arr, target) {
  const complements = [];
  for (let [idx, val] of arr.entries()) {
    if (complements.reduce((acc, v) => (acc || v.value === val), false)) {
      return [complements.find(v => v.value === target - val).index, idx];
    }
    complements.push({index: idx, value: target - val});
  }
  return [];
}
I know that I am using a for-loop, but I don't know the complexity of the built-in higher-order functions .reduce() and .find(). I tried a couple of searches but I couldn't find anything.
If anyone can help me, that would be great! Please include Big-O notation if possible.
Repl.it: https://repl.it/#abranhe/sumOfTwo
Please also include the time complexity of the last solution.

The minimum time complexity of .reduce is O(n), because it must iterate through all elements once (assuming an error isn't thrown), but it can be unbounded (since you can write any code you want inside the callback).
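As a hypothetical illustration (my own, not from the post), the overall cost depends entirely on what the callback does:
const arr = Array.from({ length: 1000 }, (_, i) => i);
// O(n): the callback does constant work per element
const sum = arr.reduce((acc, v) => acc + v, 0);
// O(n^2): the callback itself scans the array (indexOf is O(n)) for every element
const sumOfIndexes = arr.reduce((acc, v) => acc + arr.indexOf(v), 0);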
For your
// Loop, O(n), n = length of arr:
for (let [idx, val] of arr.entries()) {
  // .reduce, O(n), n = length of complements:
  if (complements.reduce((acc, v) => (acc || v.value === val), false)) {
    // If test succeeds, .find, O(n), n = length of complements:
    return [complements.find(v => v.value === target - val).index, idx];
  }
  complements.push({index: idx, value: target - val});
}
the time complexity is, worst case, O(n^2). The reduce runs in O(n) time, and you run a reduce for every entry in arr, making it O(n^2).
(The .find is also an O(n) operation, but O(n) + O(n) = O(n))
Your code that sorts the array beforehand has the right idea for decreasing complexity, but it has a couple of flaws.
First, you should sort numerically (.sort((a, b) => a - b)); .sort() with no arguments sorts lexicographically (e.g., [1, 11, 2] is not what you want).
Second, just decrementing idxFromBack isn't enough: for example, sumOfTwo([1, 3, 8, 9, 9], 9) will not see that 1 and 8 are a match. Perhaps the best strategy here would be to oscillate with a while loop instead: from idxFromBack, iterate backwards until a match is found or the sum is too small, and also iterate forwards until a match is found or the sum is too large.
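One way to realize that idea without the oscillation bookkeeping is the classic two-pointer scan over a sorted copy. The following sketch (mine, assuming numeric input) sorts [value, index] pairs so that the original indices survive the sort:
function sumOfTwoSorted(arr, target) {
  // Sort [value, index] pairs numerically, O(n log n)
  const pairs = arr
    .map((value, index) => [value, index])
    .sort((a, b) => a[0] - b[0]);
  let lo = 0, hi = pairs.length - 1;
  while (lo < hi) { // O(n) scan with two pointers
    const sum = pairs[lo][0] + pairs[hi][0];
    if (sum === target) return [pairs[lo][1], pairs[hi][1]].sort((a, b) => a - b);
    if (sum < target) lo++;
    else hi--;
  }
  return [];
}
console.log(sumOfTwoSorted([1, 2, 4, 4], 8)); // => [2, 3]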
You can also improve the performance of the sorting approach by using not .sort((a, b) => a - b), which has a complexity of O(n log n), but radix sort or counting sort instead (both run in O(n + k), where k depends on the range of the values rather than being a constant). The optimal algorithm will depend on the general shape and variance of the input.
An even better, altogether different O(n) strategy would be to use a Map or object. While iterating over the array, store the value that would complete a match for the current item as a key of the object (with the current index as the key's value), and first check whether the value currently being iterated over already exists as a key:
const sumOfTwo = (arr, target) => {
  const obj = {};
  for (const [i, num] of arr.entries()) {
    if (obj.hasOwnProperty(String(num))) {
      return [obj[num], i];
    }
    const matchForThis = target - num;
    obj[matchForThis] = i;
  }
  return [];
};

console.log(
  sumOfTwo([1, 2, 4, 4], 8), // => [2, 3]
  sumOfTwo([1, 2, 8, 9], 9), // 1 + 8 = 9; [0, 2]
  sumOfTwo([1, 2, 3, 9], 8)  // => []
);

As a supplementary answer, here is the algorithm of the find method in the language spec:
When the find method is called, the following steps are taken:
1. Let O be ? ToObject(this value).
2. Let len be ? LengthOfArrayLike(O).
3. If IsCallable(predicate) is false, throw a TypeError exception.
4. Let k be 0.
5. Repeat, while k < len,
   a. Let Pk be ! ToString(𝔽(k)).
   b. Let kValue be ? Get(O, Pk).
   c. Let testResult be ! ToBoolean(? Call(predicate, thisArg, « kValue, 𝔽(k), O »)).
   d. If testResult is true, return kValue.
   e. Set k to k + 1.
6. Return undefined.
Note the "repeat, while k < len" in step 5. Since time complexity in general measures the worst case scenario (aka the upper bound), we can assume that the searched element is not present in the collection.
The number of iterations made during step 5 then is equal to len which directly depends on the number of elements in the collection. And what time complexity has a direct correlation to the number of elements? Exactly, the linear O(n).
For a visual-ish demonstration, run the following snippet. Apart from some stray dots, the improvised graph should show a linear progression (it takes a little while to display in Stack snippets, but you can watch it live in the devtools console):
const iter = 1e7;
const incr = 2e5;
const test = new Array(iter).fill(0);
const steps = Math.ceil(iter / incr);

for (let i = 0; i < steps; i += 1) {
  const sub = test.slice(0, i * incr + incr);
  const s = Date.now();
  const find = sub.find((v) => v === 1);
  const e = Date.now();
  const d = e - s;
  console.log("\u200A".repeat(Math.floor(d / 3)) + "*");
}


How can I find all the correct combinations?

There will be an array and a number.
I want to get all possible combinations of elements from the array that add up to that number.
I mean, if the array is [2,4,1,3,0,1] and the number is 5,
then the result will be [2,3], [4,1], [2,3,0], [4,1,0], [1,3,1,0].
So I tried this algorithm:
function solution(arr, n) {
let result = []
for (let i = 0; i < arr.length; i++) {
for (let j = i + 1; j < arr.length; j++) {
let sum = arr[i] + arr[j]
if(sum === n) {
result.push([arr[i], arr[j]])
}
}
}
return result
}
But this algorithm only finds combinations of two elements.
I have been thinking about how to do combinations of three and four elements for the last five hours.
What is the way to approach solving that?
You can try something simple and naive:
const sumsTo = ([n, ...ns], t) =>
n === undefined ? [] : [ // if list is empty, no results
... (n === t ? [[n]] : []), // if n == t, include [n]
... sumsTo (ns, t - n) .map (s => [n, ...s]), // include n and recur
... sumsTo (ns, t) // skip n and recur
]
console .log (
sumsTo ([2, 4, 1, 3, 0, 1], 5)
/* display */ .map (JSON.stringify) .join ('\n')
)
Here we use a fairly simple recursion, always reducing the list of numbers to test and sometimes reducing the total we're seeking.
It will repeat arrays, for instance [4, 1] in this example. If you want to remove duplicates, we can run a uniqueness pass over the result; note that because each subset is a fresh array, a plain Set (const uniq = (xs) => [...new Set(xs)]) would compare them by reference and not collapse duplicates, so the pass needs to key on a serialized form of each subset.
The performance is likely to be terrible, but then it seems likely that no algorithm here will have great performance.
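A minimal sketch of such a uniqueness pass (my addition, keyed on JSON.stringify of each subset):
const uniqBySerialization = (subsets) => {
  const seen = new Map();
  for (const s of subsets) {
    const key = JSON.stringify(s); // identical subsets serialize identically
    if (!seen.has(key)) seen.set(key, s);
  }
  return [...seen.values()];
};
console.log(uniqBySerialization(sumsTo([2, 4, 1, 3, 0, 1], 5)));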
One elegant way to solve the problem is with a slightly modified k-combinatorial algorithm.
I don't know enough JavaScript to write it in your native language, but here is a Python implementation that I hope will suffice as well as pseudo-code:
def find_combination(input, value):
    """Find all combinations in input whose sum is equal to value and which have at least two elements."""
    for k in range(2, len(input) + 1):
        result = gen_combination(input, k, value)
        for x in result:
            print(x)

def gen_combination(input, k, value):
    """Return unique combinations of k elements from input whose sum is equal to value."""
    output = []
    _gen_combination_rec(input, output, k, value)
    return output

def _gen_combination_rec(array, output, k, value, index=0, data=None):
    """Recursively traverse input to find combinations of k elements."""
    if data == None:
        data = [0] * k
    if index == k:
        sd = sorted(data)
        if sd not in output and sum(data) == value:
            output.append(sd)
        data[index - 1] = 0
        return
    i = 0
    n = len(array)
    while i < n and index < k:
        data[index] = array[i]
        _gen_combination_rec(array[i + 1:], output, k, value, index + 1, data)
        i += 1

if __name__ == "__main__":
    input = [2, 4, 1, 0, 3, 1]
    find_combination(input, 5)
Here is the output of the above code when run from a terminal:
$ python main.py
[2, 3]
[1, 4]
[0, 2, 3]
[0, 1, 4]
[1, 1, 3]
[0, 1, 1, 3]
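Since the question itself is in JavaScript, here is a rough port of the same idea (my own sketch, not part of the answer): generate combinations of every size of at least two, keep those whose sum hits the target, and drop duplicates by their sorted form.
function findCombinations(input, value) {
  const results = [];
  const seen = new Set(); // dedupe by sorted, serialized form
  const choose = (start, combo, sum) => {
    if (combo.length >= 2 && sum === value) {
      const key = JSON.stringify([...combo].sort((a, b) => a - b));
      if (!seen.has(key)) {
        seen.add(key);
        results.push([...combo]);
      }
    }
    for (let i = start; i < input.length; i++) {
      combo.push(input[i]);
      choose(i + 1, combo, sum + input[i]);
      combo.pop();
    }
  };
  choose(0, [], 0);
  return results;
}
console.log(findCombinations([2, 4, 1, 0, 3, 1], 5));
// => the same six unique combinations as the Python output above (order may differ)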

How to move all zero elements to the center of the array?

Yesterday, I was struggling with a question asked by a company in their pre-screen round. The problem is that you will be given an array of integers. Your task is to move all the elements that have the value of zero into the middle of the array. To make it easier, center = floor(array.length/2).
I already had a solution, but it only works if you have only one zero element in your array.
I hope to receive better answers from you all.
Thank you.
/**
 * @param {*} arr: a list of integers
 * @return: updated list of integers with all zero elements moved to the middle
 * @example: [4,0,1,1,3] => [4,1,0,1,3]
 */
const zeroesToMid = (arr) => {
  const mid = Math.floor(arr.length/2)
  let result
  for(let i = 0; i < arr.length; i++){
    if(arr[i] === 0) {
      let firstHalf = arr.splice(0,i)
      let secondHalf = arr.splice(i, arr.length)
      result = firstHalf.concat(secondHalf)
      result.splice(mid,0,0)
    }
  }
  return result
}
Some comments about your attempt:
In each iteration you restart with a result based on arr. So actually you lose the result from every previous iteration.
splice will move several array items: so this is not very efficient, yet you call it several times in each iteration. Moreover, you have at least one conceptual error here:
let firstHalf = arr.splice(0,i)
let secondHalf = arr.splice(i, arr.length)
The first splice will actually remove those entries from arr (making the array shorter), and so the second call to splice will not have the desired result. Maybe you confused splice with slice here (but not later in your code).
If the previous two errors were corrected, then there still is this: after a zero has been moved from index i to a forward position, there is now another value at arr[i]. But as the next iteration of the loop will have incremented i, you will never look at that value, which might have been zero too.
concat creates a new array and you call it in each iteration, making it even more inefficient.
Solution
Better than splicing and concatenating is to copy values without affecting the array size.
If it is necessary that the non-zero values keep their original order, then I would propose the following algorithm:
In a first iteration you could just count the number of zeroes. From this you can derive the range in which they should be grouped in the result.
In a second iteration you would copy a certain number of non-zero values to the left side of the array; similarly, you would do the same at the right side. Finally, just fill the center section with zeroes.
So here is an implementation:
function moveZeroes(arr) {
  // Count zeroes
  let numZeroes = 0;
  for (let i = 0; i < arr.length; i++) numZeroes += !arr[i];
  // Determine target range for those zeroes:
  let first = (arr.length - numZeroes) >> 1;
  let last = first + numZeroes - 1;
  // Move some non-zero values to the left of the array
  for (let i = 0, j = 0; i < first; i++, j++) {
    while (!arr[j]) j++; // Find next non-zero value
    arr[i] = arr[j];     // Move it to the left
  }
  // Move other non-zero values to the right of the array
  for (let i = arr.length - 1, j = i; i > last; i--, j--) {
    while (!arr[j]) j--; // Find next non-zero value
    arr[i] = arr[j];     // Move it to the right
  }
  // Fill the middle section with zeroes:
  for (let i = first; i <= last; i++) arr[i] = 0;
  return arr;
}

// Demo
console.log(moveZeroes([0, 1, 2, 3, 4, 5, 0, 6, 0, 0]));
NB: If it is not necessary that the non-zero values keep their original order, then you could reduce the number of assignments even more, as you then only need to move the non-zero values that occur in the region where the zeroes should come. The other non-zero values can just stay where they are.
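A sketch of that variant (my own addition, building on moveZeroes above): only the non-zero values that sit inside the target zero region are moved, each into a position outside the region that currently holds a zero. Note that the order of the non-zero values is not preserved.
function moveZeroesUnordered(arr) {
  let numZeroes = 0;
  for (let i = 0; i < arr.length; i++) numZeroes += !arr[i];
  const first = (arr.length - numZeroes) >> 1; // target zero region: [first, last]
  const last = first + numZeroes - 1;
  let j = 0; // scans positions outside the region
  for (let i = first; i <= last; i++) {
    if (arr[i] !== 0) { // a non-zero value inside the region...
      while (arr[j] !== 0 || (j >= first && j <= last)) j++; // ...find a zero outside it
      arr[j] = arr[i]; // park the non-zero value there
      arr[i] = 0;
    }
  }
  return arr;
}
console.log(moveZeroesUnordered([0, 1, 2, 3, 4, 5, 0, 6, 0, 0]));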
Possible approach:
The problem statement doesn't specify whether all zeros should be centered exactly (that would require counting zeros first), so we can just squeeze the left and right parts, shifting non-zero elements toward the corresponding ends.
For example, the left half might be treated like this sketch:
cz = 0;
for(let i = 0; i <= mid;i++) {
if (A[i])
A[i-cz] = A[i];
else
cz++;
}
for(let i = mid - cz + 1; i <= mid;i++) {
A[i] = 0;
}
Now increment mid if cz>0 (so we have already filled central position with zero) and do the same for the right half in reverse direction (backward for-loop, A[i+cz] = A[i]).
Result:
[0, 1, 2, 3, 4, 5, 0, 6, 0, 0] => [1, 2, 3, 4, 0, 0, 0, 0, 5, 6]
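One way to complete that sketch (my own reading of it, treating the halves as half-open ranges rather than using the increment-mid bookkeeping), reproducing the result above:
function squeezeZeroesToCenter(A) {
  const mid = Math.floor(A.length / 2);
  // Left part [0, mid): shift non-zero elements toward index 0, then zero-fill the gap
  let cz = 0;
  for (let i = 0; i < mid; i++) {
    if (A[i]) A[i - cz] = A[i];
    else cz++;
  }
  for (let i = mid - cz; i < mid; i++) A[i] = 0;
  // Right part [mid, end): shift non-zero elements toward the end, then zero-fill the gap
  cz = 0;
  for (let i = A.length - 1; i >= mid; i--) {
    if (A[i]) A[i + cz] = A[i];
    else cz++;
  }
  for (let i = mid; i < mid + cz; i++) A[i] = 0;
  return A;
}
console.log(squeezeZeroesToCenter([0, 1, 2, 3, 4, 5, 0, 6, 0, 0]));
// => [1, 2, 3, 4, 0, 0, 0, 0, 5, 6]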
As always, trincot does a spot-on analysis of what's wrong with your code.
I have what I think is a simpler solution to the problem though. We can use Array.prototype.filter to select the non-zero entries and then split that in half, then creating an output array by combining the first half with an appropriate number of zeros and then the second half. It might look like this:
const moveZeroes = (
ns,
nonZeroes = ns .filter (n => n !== 0),
len = nonZeroes .length >> 1
) => [
... nonZeroes .slice (0, len),
... Array (ns .length - nonZeroes .length) .fill (0),
... nonZeroes .slice (len)
]
const xs = [0, 1, 2, 3, 4, 5, 0, 6, 0, 0]
console .log (... moveZeroes (xs))
console .log (... xs)
First off, nonZeroes .length >> 1 takes advantage of bit-manipulation to create an expression that is simpler than Math .floor (nonZeroes .length / 2). It's likely to be marginally faster too. But they have equivalent behavior.
Notice in the output that the original array did not change. This is very much intentional. I simply much prefer working with immutable data. But if you really want to change the original, we can just use Object.assign, to fold these new values back into the original array:
const moveZeroes = (
ns,
nonZeroes = ns .filter (n => n !== 0),
len = nonZeroes .length >> 1
) => Object .assign (ns, [
... nonZeroes .slice (0, len),
... Array (ns .length - nonZeroes .length) .fill (0),
... nonZeroes .slice (len)
])
const xs = [0, 1, 2, 3, 4, 5, 0, 6, 0, 0]
console .log (... moveZeroes (xs))
console .log (... xs)
Finally, I prefer the style demonstrated here of using only expressions and not statements. But if you don't like the defaulted parameters for extra variables, this could easily be rewritten as follows:
const moveZeroes = (ns) => {
const nonZeroes = ns .filter (n => n !== 0)
const len = nonZeroes .length >> 1
return [
... nonZeroes .slice (0, len),
... Array (ns .length - nonZeroes .length) .fill (0),
... nonZeroes .slice (len)
]
}
and again, we can use Object .assign if we want to.
The big trick here is that we're not actually moving the values, we're extracting some of them, and then simply sticking back together three separate arrays.
Here we have to use a new array to return the value, so it has a space complexity of O(n):
const fun = (arr)=>{
new_array = []
let zero_count = 0
for(let i in arr)
if(!arr[i])
zero_count += 1
let last_index = -1
for(let i = 0; i<arr.length;i++){
if (new_array.length >= Math.floor((arr.length - zero_count)/2)){
last_index = i
break;
}
if(arr[i]) new_array.push(arr[i]);
}
for (let i = 0; i<zero_count;i++)
new_array.push(0)
if (last_index != -1)
for(let i =last_index;i<arr.length;i++)
if(arr[i]) new_array.push(arr[i]);
return new_array
}
console.log(fun([0,1,0,0,2,3,4,0,0]))
The resulting output is [1,2,0,0,0,0,0,3,4].
Time Complexity is O(n),
Space Complexity is O(n)

Trying to solve using recursion without using other algorithms

I am trying to get better at understanding recursion so that I can get better at implementing dynamic programming principles. I am aware this problem can be solved using Kadane's algorithm; however, I would like to solve it using recursion.
Problem statement:
Given an array of integers, find the subset of non-adjacent elements with the maximum sum. Calculate the sum of that subset.
I have written the following partial solution:
const maxSubsetSum = (arr) => {
let max = -Infinity
const helper = (arr, len) => {
if (len < 0) return max
let pointer = len
let sum = 0
while (pointer >= 0) {
sum += arr[pointer]
pointer -= 2
}
return max = Math.max(sum, helper(arr, len - 1))
}
return helper(arr, arr.length - 1)
}
If I had this data:
console.log(maxSubsetSum([3, 5, -7, 8, 10])) //15
//Our subsets are [3,-7,10], [3,8], [3,10], [5,8], [5,10] and [-7,10].
My algorithm calculates 13. I know it's because when I start my algorithm my (n - 2) values are calculated, but I am not accounting for other subsets that are (n - 3) or more apart that still satisfy the problem statement's condition. I can't figure out the logic to account for the other values; please guide me on how I can accomplish that.
The code is combining recursion (the call to helper inside helper) with iteration (the while loop inside helper). You should only be using recursion.
For each element of the array, there are two choices:
Skip the current element. In this case, the sum is not changed, and the next element can be used. So the recursive call is
sum1 = helper(arr, len - 1, sum)
Use the current element. In this case, the current element is added to the sum, and the next element must be skipped. So the recursive call is
sum2 = helper(arr, len - 2, sum + arr[len])
So the code looks something like this:
const maxSubsetSum = (arr) => {
const helper = (arr, len, sum) => {
if (len < 0) return sum
let sum1 = helper(arr, len - 1, sum)
let sum2 = helper(arr, len - 2, sum + arr[len])
return Math.max(sum1, sum2)
}
return helper(arr, arr.length - 1, 0)
}
Your thinking is right in that you need to recurse from (n - 2) once you include the current item. But you don't need to run through your array to compute a sum and then recurse.
So the right way is to
either include the current item and recurse on the remaining n-2 items or
not include the current item and recurse on the remaining n-1 items
Lets look at those two choices:
Choice 1:
You chose to include the item at the current index. Then you recurse on the remaining n-2 items. So your maximum could be the item by itself, without adding any of the remaining n-2 items, or the item added to the best result from those n-2 items.
So Math.max( arr[idx], arr[idx] + recurse(idx-2)) is the maximum for this choice. If recurse(idx-2) gives you -Infinity, you just consider the item at the current index.
Choice 2:
You didn't choose to include the item at the current index. So just recurse on the remaining n-1 items - recurse(n-1)
The final maximum is maximum from those two choices.
Code is :
const maxSubsetSum = (arr) => {
let min = -Infinity
const helper = (arr, idx) => {
if ( idx < 0 ) return min
let inc = helper(arr, idx-2)
let notInc = helper(arr, idx-1)
inc = inc == min ? arr[idx] : Math.max(arr[idx], arr[idx] + inc)
return Math.max( inc, notInc )
}
return helper(arr, arr.length - 1)
}
console.log(maxSubsetSum([-3, -5, -7, -8, 10]))
console.log(maxSubsetSum([-3, -5, -7, -8, -10]))
console.log(maxSubsetSum([-3, 5, 7, -8, 10]))
console.log(maxSubsetSum([3, 5, 7, 8, 10]))
Output :
10
-3
17
20
For the case where all items are negative:
In this case you can say that there are no items to combine together to get a maximum sum. If that is the requirement the result should be zero. In that case just return 0 by having 0 as the default result. Code in that case is :
const maxSubsetSum = (arr) => {
const helper = (arr, idx) => {
if ( idx < 0 ) return 0
let inc = arr[idx] + helper(arr, idx-2)
let notInc = helper(arr, idx-1)
return Math.max( inc, notInc )
}
return helper(arr, arr.length - 1)
}
With memoization:
You could memoize this solution for the indices visited during recursion. There is only one state, i.e. the index, so your memo is one-dimensional. The code with the memo is:
const maxSubsetSum = (arr) => {
let min = -Infinity
let memo = new Array(arr.length).fill(min)
const helper = (arr, idx) => {
if ( idx < 0 ) return min
if ( memo[idx] !== min) return memo[idx]
let inc = helper(arr, idx-2)
let notInc = helper(arr, idx-1)
inc = inc == min ? arr[idx] : Math.max(arr[idx], arr[idx] + inc)
memo[idx] = Math.max( inc, notInc )
return memo[idx]
}
return helper(arr, arr.length - 1)
}
A basic version is simple enough with the obvious recursion. We either include the current value in our sum or we don't. If we do, we need to skip the next value, and then recur on the remaining values. If we don't then we need to recur on all the values after the current one. We choose the larger of these two results. That translates almost directly to code:
const maxSubsetSum = ([n, ...ns]) =>
n == undefined // empty array
? 0
: Math .max (n + maxSubsetSum (ns .slice (1)), maxSubsetSum (ns))
Update
That was missing a case, where our highest sum is just the number itself. That's fixed here (and in the snippets below)
const maxSubsetSum = ([n, ...ns]) =>
n == undefined // empty array
? 0
: Math .max (n, n + maxSubsetSum (ns .slice (1)), maxSubsetSum (ns))
console.log (maxSubsetSum ([3, 5, -7, 8, 10])) //15
But, as you note in your comments, we really might want to memoize this for performance reasons. There are several ways we could choose to do this. One option would be to turn the array we're testing in one invocation of our function into something we can use as a key in an Object or a Map. It might look like this:
const maxSubsetSum = (ns) => {
const memo = {}
const mss = ([n, ...ns]) => {
const key = `${n},${ns.join(',')}`
return n == undefined
? 0
: key in memo
? memo [key]
: memo [key] = Math .max (n, n + maxSubsetSum (ns .slice (1)), maxSubsetSum (ns))
}
return mss(ns)
}
console.log (maxSubsetSum ([3, 5, -7, 8, 10])) //15
We could also do this with a helper function that acted on the index and memoized using the index for a key. It would be about the same level of complexity.
This is a bit ugly, however, and perhaps we can do better.
There's one issue with this sort of memoization: it only lasts for the current run. If I'm going to memoize a function, I would rather it hold that cache across any calls with the same data. That means memoizing in the definition of the function. I usually do this with a reusable external memoize helper, something like this:
const memoize = (keyGen) => (fn) => {
const cache = {}
return (...args) => {
const key = keyGen (...args)
return cache[key] || (cache[key] = fn (...args))
}
}
const maxSubsetSum = memoize (ns => ns .join (',')) (([n, ...ns]) =>
n == undefined
? 0
: Math .max (n, n + maxSubsetSum (ns .slice (1)), maxSubsetSum (ns)))
console.log (maxSubsetSum ([3, 5, -7, 8, 10])) //15
memoize takes a function that uses your arguments to generate a String key, and returns a function that accepts your function and returns a memoized version of it. It works by calling the key generator on your input and checking whether that key is in the cache. If it is, we simply return the cached value. If not, we call your function, store the result under that key, and return it.
For this version, the key generated is simply the string created by joining the array values with ','. There are probably other equally-good options.
Note that we cannot do
const recursiveFunction = (...args) => /* some recursive body */
const memomizedFunction = memoize (someKeyGen) (recursiveFunction)
because the recursive calls in memoizedFunction would then be to the non-memoized recursiveFunction. Instead, we always have to use it like this:
const memomizedFunction = memoize (someKeyGen) ((...args) => /* some recursive body */)
But that's a small price to pay for the convenience of being able to simply wrap up the function definition with a key-generator to memoize a function.
This code was accepted:
function maxSubsetSum(A) {
return A.reduce((_, x, i) =>
A[i] = Math.max(A[i], A[i-1] | 0, A[i] + (A[i-2] | 0)));
}
But trying to recurse that far (I tried submitting Scott Sauyet's last memoised example) results, I believe, in run-time errors, since we potentially exceed the recursion limit.
For fun, here's a bottom-up version that gets filled in top-down :)
function f(A, i=0){
if (i > A.length - 3)
return A[i] = Math.max(A[i] | 0, A[i+1] | 0);
// Fill the table
f(A, i + 1);
return A[i] = Math.max(A[i], A[i] + A[i+2], A[i+1]);
}
var As = [
[3, 7, 4, 6, 5], // 13
[2, 1, 5, 8, 4], // 11
[3, 5, -7, 8, 10] // 15
];
for (let A of As){
console.log('' + A);
console.log(f(A));
}

How to find all subsets of a set in JavaScript? (Powerset of array)

I need to get all possible subsets of an array.
Say I have this:
[1, 2, 3]
How do I get this?
[], [1], [2], [3], [1, 2], [2, 3], [1, 3], [1, 2, 3]
I am interested in all subsets. For subsets of specific length, refer to the following questions:
Finding subsets of size n: 1, 2
Finding subsets of size > 1: 1
Here is one more very elegant solution with no loops or recursion, only using the map and reduce array native functions.
const getAllSubsets =
theArray => theArray.reduce(
(subsets, value) => subsets.concat(
subsets.map(set => [value,...set])
),
[[]]
);
console.log(getAllSubsets([1,2,3]));
We can solve this problem for a subset of the input array, starting from offset. Then we recurse back to get a complete solution.
Using a generator function allows us to iterate through subsets with constant memory usage:
// Generate all array subsets:
function* subsets(array, offset = 0) {
while (offset < array.length) {
let first = array[offset++];
for (let subset of subsets(array, offset)) {
subset.push(first);
yield subset;
}
}
yield [];
}
// Example:
for (let subset of subsets([1, 2, 3])) {
console.log(subset);
}
Runtime complexity is proportional to the number of solutions (2ⁿ) times the average length per solution (n/2), i.e. O(n·2ⁿ).
Simple solution without recursion:
function getAllSubsets(array) {
const subsets = [[]];
for (const el of array) {
const last = subsets.length-1;
for (let i = 0; i <= last; i++) {
subsets.push( [...subsets[i], el] );
}
}
return subsets;
}
How does it work?
If we have the subsets generated from some input numbers and we want to add one more number to the input array, we can take all the already existing subsets and generate new ones by appending the new number to each of them.
Here is an example for [1, 2, 3]:
Start with an empty subset: []
Create new subsets by adding "1" to each existing subset. The result is: [], [1]
Create new subsets by adding "2" to each existing subset. The result is: [], [1], [2], [1, 2]
Create new subsets by adding "3" to each existing subset. The result is: [], [1], [2], [1, 2], [3], [1, 3], [2, 3], [1, 2, 3]
Another simple solution.
function getCombinations(array) {
function fork(i, t) {
if (i === array.length) {
result.push(t);
return;
}
fork(i + 1, t.concat([array[i]]));
fork(i + 1, t);
}
var result = [];
fork(0, []);
return result;
}
var data = [1, 2, 3],
result = getCombinations(data);
console.log(result);
You can easily generate the powerset from an array, using something like the following:
var arr = [1, 2, 3];
function generatePowerSet(array) {
var result = [];
result.push([]);
for (var i = 1; i < (1 << array.length); i++) {
var subset = [];
for (var j = 0; j < array.length; j++)
if (i & (1 << j))
subset.push(array[j]);
result.push(subset);
}
return result;
}
console.log(generatePowerSet(arr));
Throughout the main loop of the function, subsets are created and then pushed into the result array.
I set out to understand what is happening with the examples in this post. While the generator-function example, the bit-wise operator example, and the example using the array map and reduce functions are very elegant and impressive, I found it tough to mentally visualize what precisely was happening. I have two examples below, a non-recursive and a recursive solution, that I believe are easy to visualize. I hope this helps others attempting to wrap their heads around the process of finding all subsets.
NON-RECURSIVE:
For each value of the array clone all existing subsets (including the empty set) and add the new value to each of the clones, pushing the clones back to the results.
const PowerSet = array => {
const result = [[]] // Starting with empty set
for (let value of array) { // For each value of the array
const length = result.length // Can't use result.length in loop since
// results length is increased in loop
for (let i = 0; i < length; i++){
let temp = result[i].slice(0) // Make a clone of the value at index i
temp.push(value) // Add current value to clone
result.push(temp) // Add clone back to results array
}
}
return result;
}
console.log(PowerSet([1,2,3]))
RECURSIVELY:
Build the powerset by recursively pushing a combination of the current index value concatenated with an ever increasing prefix array of values.
const powerSetRecursive = (arr, prefix=[], set=[[]]) => {
if(arr.length === 0) return// Base case, end recursion
for (let i = 0; i < arr.length; i++) {
set.push(prefix.concat(arr[i]))// If a prefix comes through, concatenate value
powerSetRecursive(arr.slice(i + 1), prefix.concat(arr[i]), set)
// Call function recursively removing values at or before i and adding
// value at i to prefix
}
return set
}
console.log(powerSetRecursive([1,2,3]))
function subSets(num){
/*
example given number : [1,3]
[]
1: copy push 1
[] [1]
3: copy push 3
[] [1] [3] [1,3]
*/
let result = [];
result.push([]);
for(let i=0; i < num.length;i++){
let currentNum = num[i];
let len = result.length;
for(let j=0; j < len; j++){
let cloneData = result[j].slice();
cloneData.push(currentNum);
result.push(cloneData)
}
}
return result;
}
let test = [1,3];
console.log(subSets(test))//[ [], [ 1 ], [ 3 ], [ 1, 3 ] ]
let subsets = (n) => {
let result = [];
result.push([]);
n.forEach(a => {
//array length
let length = result.length;
let i =0;
while(i < length){
let temp = result[i].slice(0);
temp.push(a);
result.push(temp);
i++;
}
})
return result;
}
Using flatMap and rest/spread, this can be fairly elegant:
const subsets = ([x, ...xs]) =>
x == undefined
? [[]]
: subsets (xs) .flatMap (ss => [ss, [x, ...ss]])
console .log (subsets ([1, 2, 3]))
This version does not return them in the requested order. Doing that seems slightly less elegant, and there's probably a better version:
const subset = (xs = []) => {
if (xs.length == 0) {return [[]]}
const ys = subset (xs .slice (0, -1))
const x = xs .slice (-1) [0]
return [... ys, ... ys .map (y => [... y, x])]
}
Or, the same algorithm in a different style,
const subsets = (
xs = [],
x = xs .slice (-1) [0],
ys = xs.length && subsets (xs .slice (0, -1))
) =>
xs .length == 0
? [[]]
: [... ys, ... ys .map (y => [... y, x])]
A shorter version of @koorchik's answer.
var getAllSubsets = (nums) => {
const subsets = [[]];
for (const n of nums) {
subsets.map((el) => {
subsets.push([...el, n]);
});
}
return subsets;
};
console.log(getAllSubsets([1, 2, 3]));
// [[],[1],[2],[1,2],[3],[1,3],[2,3],[1,2,3]]
For loop:
function powerSet(numbers) {
const subsets = [[]]
for (const number of numbers) {
subsets.forEach(subset => subsets.push([...subset, number]))
}
return subsets
}
Recursion:
function powerSet(numbers) {
const subsets = [[]]
if (numbers.length === 0) return subsets
for (let i = 0; i < numbers.length; i++) {
subsets.push(...powerSet(numbers.slice(i + 1)).map(subset => [numbers[i], ...subset]))
// Or step by step:
// const number = numbers[i]
// const otherNumbers = numbers.slice(i + 1)
// const otherNumbersSubsets = powerSet(otherNumbers)
// const otherNumbersSubsetsWithNumber = otherNumbersSubsets.map(subset => [number, ...subset])
// subsets.push(...otherNumbersSubsetsWithNumber)
}
return subsets
}
Using reduceRight:
const subsets = array =>
array.reduceRight(
(accumulator, a) => [...accumulator, ...accumulator.map(b => [a, ...b])],
[[]]
);
console.log(subsets([1, 2, 3])); // [[], [3], [2], [2, 3], [1], [1, 3], [1, 2], [1, 2, 3]]
This one is with recursion
var subsets = function(s){
if(s.length === 0) {
return [[]]
}
var h,t,ss_excl_h;
var ss_incl_h = [];
[h,...t] = s;
ss_excl_h = subsets(t)
for(ss of ss_excl_h) {
let hArr = [];
hArr.push(h);
let temp = hArr.concat(ss)
ss_incl_h.push(temp);
}
return ss_incl_h.concat(ss_excl_h)
}
console.log(subsets([1,2,3])) // returns distinct subsets
Update ES2020
With ES2020 BigInts have become available.
Bigints don’t have a fixed storage size in bits; their sizes adapt to the integers they represent.
- Dr. Axel Rauschmayer; JavaScript for impatient programmers - Chapter 18.2 BigInts
See source.
Using BigInts we can use a binary counter to calculate the power set and are no longer limited by the maximum integer size.
Using a generator we can additionally loop over a power set with a constant memory requirement, which is important if you want to generate a huge power set.
Here is an example using your original array [1, 2, 3].
/**
 * Generate power set from a given array
 * @param {Array<any>} array array to create power set from
 */
function* powerSet(array) {
  // use BigInt values (0n, 1n, ...) so we are not limited by the 53-bit (or 32-bit bitwise) integer range
  const size = 2n ** BigInt(array.length);
  for (let i = 0n; i < size; i++) {
    const cur = [];
    for (let j = 0; j < array.length; j++) {
      // check if the j-th bit of i is set to 1
      if ((i & (1n << BigInt(j))) > 0n) {
        // push the array value represented by that 1-bit to the result
        cur.push(array[j]);
      }
    }
    // generate next result
    yield cur;
  }
}

// generate power set for [1, 2, 3] and print results
console.log([...powerSet([1, 2, 3])]);
Here is how you could loop over a very large power set with constant memory and no upper bound on the array length (practically, there will of course be an upper bound in terms of compute time).
/**
 * Generate power set from a given array
 * @param {Array<any>} array array to create power set from
 */
function* powerSet(array) {
  // use BigInt values (0n, 1n, ...) so we are not limited by the 53-bit (or 32-bit bitwise) integer range
  const size = 2n ** BigInt(array.length);
  for (let i = 0n; i < size; i++) {
    const cur = [];
    for (let j = 0; j < array.length; j++) {
      // check if the j-th bit of i is set to 1
      if ((i & (1n << BigInt(j))) > 0n) {
        // push the array value represented by that 1-bit to the result
        cur.push(array[j]);
      }
    }
    // generate next result
    yield cur;
  }
}

/**
 * Helper function to generate an array containing more than 53 elements
 * @param {number} start
 * @param {number} end
 */
function* range(start, end) {
  for (let i = start; i < end; i++) {
    yield i;
  }
}

// create an array containing elements 1 through 60 ([1, 2, 3, ..., 60])
const oneToSixty = [...range(1, 61)];

let i = 0;
const max = 1000;
// loop over the whole power set with a constant memory requirement
// abort after 1000 subsets, otherwise this will take a very long time to complete
for (const subset of powerSet(oneToSixty)) {
  console.log(subset);
  if (i++ === max) break;
}
