I want to find all possible arrays (of non-negative numbers) that sum up to at most N in JavaScript:
function findArrays(maxSize, maxSum){}
Example input: findArrays(3, 10)
Some acceptable outputs (not writing them all, as the full list would be too long):
[[0], [0,0,0], [10,0,0], [1,9], [1,2,3] /*, ... */]
What I tried so far:
I know it looks like homework, but it's not :) I can think of a solution that simply generates all (size*maxSum) possible arrays of acceptable sizes and then iterates through them to check whether each sum is greater than maxSum. However, I think this solution performs very badly as maxSum gets bigger. I'm looking for a more efficient implementation, but I just don't know where to start.
My "bad" solution
function getNextArray(r, maxVal) {
  for (var i = r.length - 1; i >= 0; i--) {
    if (r[i] < maxVal) {
      r[i]++;
      if (i < r.length - 1) {
        r[i + 1] = 0;
      }
      break;
    }
  }
  return r;
}

function getAllArraysOfSize(size, maxVal) {
  var arrays = [], r = [], i;
  for (i = 0; i < size; i++) {
    r[i] = 0;
  }
  while (r.reduce((a, b) => a + b, 0) < (maxVal * size)) {
    r = getNextArray(r.slice(), maxVal);
    arrays.push(r);
  }
  return arrays;
}

function findArrays(maxSize, maxSum) {
  var allArrays = [], arraysOfFixedSize = [], acceptableArrays = [], i, j;
  for (i = 1; i <= maxSize; i++) {
    arraysOfFixedSize = getAllArraysOfSize(i, maxSum);
    for (j = 0; j < arraysOfFixedSize.length; j++) {
      allArrays.push(arraysOfFixedSize[j]);
    }
  }
  for (i = 0; i < allArrays.length; i++) {
    if (allArrays[i].reduce((a, b) => a + b, 0) <= maxSum) {
      acceptableArrays.push(allArrays[i]);
    }
  }
  return acceptableArrays;
}
You can use recursion and a generator. The number of outputs grows quickly for higher valued arguments, so I keep them low here:
function * findArrays(maxSize, maxSum) {
  let arr = [];
  function * recur(maxSum) {
    let k = arr.length;
    yield [...arr]; // or: if (k) yield [...arr]
    if (k === maxSize) return;
    for (let i = 0; i <= maxSum; i++) {
      arr[k] = i;
      yield * recur(maxSum - i);
    }
    arr.length = k;
  }
  yield * recur(maxSum);
}

// demo
for (let arr of findArrays(2, 4))
  console.log(JSON.stringify(arr));
NB: this also produces the empty array, which makes sense. If you want to avoid this, then just check that you don't yield an empty array.
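For example, a minimal way to do that at the call site (just a sketch; it leaves the generator itself untouched) is to skip zero-length results while consuming it:

// skip the empty array when consuming the generator
for (let arr of findArrays(2, 4)) {
  if (arr.length === 0) continue;
  console.log(JSON.stringify(arr));
}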
If you prefer working with plain functions instead of generators, then translate the innermost yield expression into a push onto a result array, as follows:
function findArrays(maxSize, maxSum) {
  let arr = [];
  let result = []; // <--- will collect all the subarrays
  function recur(maxSum) {
    let k = arr.length;
    result.push([...arr]);
    if (k === maxSize) return;
    for (let i = 0; i <= maxSum; i++) {
      arr[k] = i;
      recur(maxSum - i);
    }
    arr.length = k;
  }
  recur(maxSum);
  return result;
}

// demo
for (let arr of findArrays(2, 4))
  console.log(JSON.stringify(arr));
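For findArrays(2, 4), both versions produce the same 21 arrays: the empty array, the single-element arrays [0] through [4], and the fifteen two-element arrays whose sum is at most 4.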
I hope this is helpful:
const data = [[0], [0,0,0], [10,0,0], [1,9], [1,2,3]];

function findArrays(maxSize, maxSum) {
  return data.reduce(
    (acc, value) => {
      if (value.length <= maxSize) {
        const tempValue = value;
        const sum = tempValue.reduce((acc, val) => val >= 0 ? acc + val : 0, 0);
        if (sum <= maxSum && sum > 0) acc.push(value);
      }
      return acc;
    }, []
  );
}

console.log(findArrays(3, 10));
I am working on a LeetCode question and I can't quite think of a way to compare the rest of the elements in the array with one another. I figured out the cases for the biggest and smallest numbers, but comparing with the rest of them is what I am having trouble with. Below you will find the question and my work:
How Many Numbers Are Smaller Than the Current Number?
Given the array nums, for each nums[i] find out how many numbers in the array are smaller than it. That is, for each nums[i] you have to count the number of valid j's such that j != i and nums[j] < nums[i].
Return the answer in an array.
Example 1:
Input: nums = [8,1,2,2,3]
Output: [4,0,1,1,3]
Explanation:
For nums[0]=8 there exist four smaller numbers than it (1, 2, 2 and 3).
For nums[1]=1 there does not exist any smaller number than it.
For nums[2]=2 there exists one smaller number than it (1).
For nums[3]=2 there exists one smaller number than it (1).
For nums[4]=3 there exist three smaller numbers than it (1, 2 and 2).
My work:
var smallerNumbersThanCurrent = (nums) => {
  const output = [];
  const max = nums.reduce(function(a, b) {
    return Math.max(a, b);
  });
  const min = nums.reduce(function(a, b) {
    return Math.min(a, b);
  });
  for (let i = 0; i < nums.length; i++) {
    if (nums[i] === max) {
      output.push(nums.length - 1);
    } else if (nums[i] === min) {
      output.push(0);
    } else if (nums[i] < max && nums[i] > min) {
      // how do i compare with rest of the elements in the array?
    }
  }
}
Use a nested loop.
const nums = [8, 1, 2, 2, 3];
const answer = [];
for (let i = 0; i < nums.length; i++) {
  let count = 0;
  for (let j = 0; j < nums.length; j++) {
    if (nums[j] < nums[i]) {
      count++;
    }
  }
  answer.push(count);
  console.log(`For nums[${i}]=${nums[i]} there are ${count} lower numbers`);
}
console.log(`Answer: ${answer}`);
It's not necessary to test i != j since a number will never be lower than itself.
A much easier way would be to simply sort the array, and then the index of the element will tell you how many are less than it:
const nums = [8, 1, 2, 2, 3];
const sorted = [...nums].sort((a, b) => a - b); // sort numerically; the default sort compares as strings
const result = nums.map((i) => {
  return sorted.findIndex(s => s === i);
});
console.log(result); // [4, 0, 1, 1, 3]
This has the added benefit that you don't have to search the entire array for each number.
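If the repeated findIndex lookups ever become a concern for large inputs, one possible refinement (a sketch of the same idea, not part of the original answer) is to record the first sorted index of each value in a Map, so every later lookup is constant time:

const nums = [8, 1, 2, 2, 3];
const sorted = [...nums].sort((a, b) => a - b); // numeric sort
// remember only the first sorted position of each value
const firstIndex = new Map();
sorted.forEach((v, i) => {
  if (!firstIndex.has(v)) firstIndex.set(v, i);
});
const result = nums.map(v => firstIndex.get(v));
console.log(result); // [4, 0, 1, 1, 3]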
I'd do it like this:
function rankZero(array) {
  const s = [...array], r = [];
  s.sort((a, b) => {
    return a - b;
  });
  for (let n of array) {
    r.push(s.indexOf(n));
  }
  return r;
}
console.log(rankZero([8, 1, 2, 2, 3]));
One way to do this is to filter the array on the condition that the value is less than the current one and then count the number of values in the filtered array:
const nums = [8,1,2,2,3];
const smallerNums = nums.map(v => nums.filter(n => n < v).length);
console.log(smallerNums); // [4,0,1,1,3]
Alternatively you can do a count in reduce, which should be significantly faster:
const nums = [8, 1, 2, 2, 3];
const smallerNums = nums.map(v => nums.reduce((c, n) => c += (n < v), 0));
console.log(smallerNums); // [4,0,1,1,3]
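As a small aside (not from the original answer): this works because the boolean (n < v) is coerced to 1 or 0 when it is added to the accumulator:

console.log(0 + (3 < 5), 0 + (7 < 5)); // 1 0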
Inspired by #tao I did performance testing of each solution. On my computer (an Intel Core I9-9900 with 64GB RAM) #StackSlave's solution is consistently the fastest, followed by the other sorting solution, the reduce solution, the basic iteration and the filter. You can run the tests yourself below:
const datalength = 1000;
const iterations = 100;

const getRandom = (min, max) => Math.random() * (max - min) + min;
const data = Array.from({
  length: datalength
}, () => getRandom(1, 100));

const mapper = arr => arr.map(i => arr.filter(n => n < i).length);

const sorter = nums => {
  const sorted = [...nums].sort((a, b) => a - b); // numeric sort
  const result = nums.map((i) => {
    return sorted.findIndex(s => s === i);
  });
  return result;
};

const iterator = arr => {
  const answer = [];
  for (let i = 0; i < arr.length; i++) {
    let count = 0;
    for (let j = 0; j < arr.length; j++) {
      if (arr[j] < arr[i]) {
        count++;
      }
    }
    answer.push(count);
  }
  return answer;
};

const rankZero = array => {
  const s = [...array],
    r = [];
  s.sort((a, b) => {
    return a - b;
  });
  for (let n of array) {
    r.push(s.indexOf(n));
  }
  return r;
};

const reducer = arr => arr.map(v => arr.reduce((c, n) => c += (n < v), 0));

let fns = {
  'iterator': iterator,
  'mapper': mapper,
  'sorter': sorter,
  'reducer': reducer,
  'rankZero': rankZero
};

for (let [name, fn] of Object.entries(fns)) {
  let total = 0;
  for (let i = 0; i < iterations; i++) {
    let t0 = performance.now();
    fn(data);
    let t1 = performance.now();
    total += t1 - t0;
  }
  console.log(name, (total / iterations).toFixed(2));
}
How would you best turn [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1] into [12,12]?
Basically reducing an array into an aggregate every 12 items.
This is my not so elegant attempt:
let arr = [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1];
let result = 0;
let finalArr = [];
arr.forEach((item, index) => {
  result += item;
  if ((index + 1) % 12 === 0) {
    finalArr.push(result);
    result = 0;
  }
});
Can this be done a bit more elegantly? Perhaps using reduce()? I haven't used JS in a while, so I am a bit rusty! Thanks in advance.
function reduceGroupByN(arr, count, fn) {
  const out = [];
  if (count <= 1) throw new Error("Grouping must be greater than 1");
  for (var last = 0; last < arr.length; last += count) {
    out.push(arr.slice(last, last + count).reduce(fn));
  }
  return out;
}

const arr = [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1];
const result = reduceGroupByN(arr, 12, (a, b) => a + b);
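For the 24-element array above, logging the result shows the two group sums:

console.log(result); // [12, 12]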
For a more general solution, consider using a function that chunks an array into slices. (This sort of utility function is relatively common already.) Then call that function and add up the resulting subarrays with .map:
const arr = [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1];

const chunk = (arr, length) => arr.reduce((a, num, i) => {
  const chunkIndex = Math.floor(i / length);
  if (!a[chunkIndex]) {
    a[chunkIndex] = [];
  }
  a[chunkIndex].push(num);
  return a;
}, []);

const finalArr = chunk(arr, 12)
  .map(
    subarr => subarr.reduce((a, b) => a + b)
  );

console.log(finalArr); // [12, 12]
At the risk of angering the modern JS gods, there is still something to be said for the performance and simplicity of a classic solution.
On my machine this runs two to three times faster than the other answers.
let arr = [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1];
let finalArr = [];
let chunkSize = 12;
for (let n = 0; n < arr.length;) {
  let result = 0;
  // also stop at the end of the array, in case its length is not a multiple of chunkSize
  for (let m = 0; m < chunkSize && n < arr.length; m++, n++) {
    result += arr[n];
  }
  finalArr.push(result);
}
console.log(finalArr); // [12, 12]
I am new to coding for parallel processing and am attempting to implement a multithreaded merge sort algorithm. I am unclear on the proper usage of threads and how exactly they work, but I attempted to implement it anyway, and this is where I landed. (It tells me that slice is not a function.)
function pMergeSort(input, done) {
  const spawn = require('threads').spawn;
  if (input.length < 2)
    return input;
  function partition(arr) {
    let middle = Math.floor(arr.length / 2);
    let left = arr.sclice(0, middle);
    let right = arr.slice(middle + 1, arr.length);
  }
  let left, right = partition(input);
  let task = spawn(pMergeSort).send(left).on('done', function (arr) {
    if (result === undefined) {
      for (let i = 0; i < n.length; i++) {
        result[i] = n[i];
      }
    } else {
      merge(result, n);
      left.kill();
    }
  });
  pMergeSort(right, function (n) {
    if (result === undefined) {
      for (let i = 0; i < n.length; i++) {
        result[i] = n[i];
      }
    } else {
      merge(result, n);
      right.kill();
    }
  });
}
/*
function mergeSort (arr) {
  if (arr.length === 1) {
    // return once we hit an array with a single item
    return arr
  }
  const middle = Math.floor(arr.length / 2) // get the middle item of the array rounded down
  const left = arr.slice(0, middle) // items on the left side
  const right = arr.slice(middle) // items on the right side
  return merge(mergeSort(left), mergeSort(right))
}
*/
// compare the arrays item by item and return the concatenated result
function merge(left, right) {
  let result = []
  let indexLeft = 0
  let indexRight = 0
  while (indexLeft < left.length && indexRight < right.length) {
    if (left[indexLeft] < right[indexRight]) {
      result.push(left[indexLeft])
      indexLeft++
    } else {
      result.push(right[indexRight])
      indexRight++
    }
  }
  return result.concat(left.slice(indexLeft)).concat(right.slice(indexRight))
}
function genArray(size) {
  // randomly fills arr
  var arr = Array(size);
  for (var i = 0; i < size; i++) {
    arr[i] = Math.floor(Math.random() * 98);
  }
  return arr
}

function testParallel() {
  var array = genArray(Math.floor(Math.random() * 100));
  pMergeSort(array, function (i) {
    console.log(JSON.stringify(i));
  })
}

var testArr = genArray(2);
pMergeSort(testArr, function (i) {
  console.log(JSON.stringify(i));
});
Help implementing this would be amazing, but some simpler questions that could push me in the right direction: is the merge in pMergeSort supposed to be a callback function? How do you define your callback function? Would it be better to use a pool of threads rather than trying to spawn threads? And the function wrapped in pMergeSort should be merge sort plus the splitting into threads, right?
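As a side note on the error itself (not a full answer to the threading questions): the "slice is not a function" message most likely comes from the sclice typo, and partition never returns anything, so left and right can never be filled in. A minimal sketch of just that helper, independent of any threading library, might look like this:

function partition(arr) {
  const middle = Math.floor(arr.length / 2);
  const left = arr.slice(0, middle);  // first half
  const right = arr.slice(middle);    // second half; slice(middle + 1) would drop the middle element
  return [left, right];               // return both halves so they can be destructured
}

const [left, right] = partition([5, 3, 8, 1]);
console.log(left, right); // [5, 3] [8, 1]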
I am trying to solve a problem with a lower time complexity, which in this case is O(n).
Here is the problem:
let arr = ['dog', 'come', 'ogd', 'something', 'emoc'];
It should return
[['dog', 'ogd'], ['come', 'emoc']] // groups of words that use the same letters
So far I have solved this problem using two functions, and it works, but the nested loop gives me O(n²).
Here is my code
const isSameChars = (str1, str2) => {
  let str1sorted = str1.split("").sort().join("");
  let str2sorted = str2.split("").sort().join("");
  if (str1.length !== str2.length) {
    return false;
  }
  for (var i = 0; i < str1sorted.length; i++) {
    let char1 = str1sorted[i];
    let char2 = str2sorted[i];
    if (char1 !== char2) {
      return false;
    }
  }
  return true;
}

const isSameCharElements = (arr) => {
  let result = [];
  for (var i = 0; i < arr.length; i++) {
    for (var j = 0; j < i; j++) {
      if (isSameChars(arr[i], arr[j]) === true) {
        result.push(arr[j]);
      }
    }
  }
  return result;
}

console.log(isSameCharElements(['dog', 'come', 'ogd', 'something', 'emoc'])); // expected: [['dog', 'ogd'], ['come', 'emoc']]
Is there any way to solve this with O(n) time complexity?
Thank you in advance!
You can have a 'bag of letters' representation of any string by sorting its letters:
function sortLetters(word) {
  return word.split('').sort().join('');
}
You can then iterate over your input, grouping words that have the same bag of letters representation into an object:
const grouped = arr.reduce(function (m, word) {
  var bagRepr = sortLetters(word);
  var withSameLetters = m[bagRepr] || [];
  withSameLetters.push(word);
  m[bagRepr] = withSameLetters;
  return m;
}, {});

const result = Object.values(grouped)
  .filter(function (arr) {
    return arr.length > 1;
  });
This is O(n), provided that sortLetters() is O(1), which is the case if the word lengths are bounded by a constant.
Disclaimer: note that we're only talking about asymptotic complexity here - this does not mean at all that this approach is the most efficient from a practical standpoint!
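For completeness, here is a small self-contained usage sketch of the same approach applied to the input from the question (variable names shortened here for brevity):

const words = ['dog', 'come', 'ogd', 'something', 'emoc'];

const groups = words.reduce((m, word) => {
  const key = word.split('').sort().join(''); // same idea as sortLetters()
  (m[key] = m[key] || []).push(word);
  return m;
}, {});

const anagramGroups = Object.values(groups).filter(g => g.length > 1);
console.log(anagramGroups); // [ [ 'dog', 'ogd' ], [ 'come', 'emoc' ] ]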