Got this question recently:
Write a function which takes an array of arrays (each of which contains numbers sorted from largest to smallest), and a number (n). Return the n largest numbers.
For example:
findLargest([ [10, 5, 3, 1], [9, 8, 7, 6], [11, 2, 1, 0] ], 5)
=> [11, 10, 9, 8, 7]
findLargest([ [15, 5, 3, 1], [10, 8, 7, 6]], 3)
=> [ 15, 10, 8 ]
Do this without copying or modifying the arrays (just read from them).
Optimize for time complexity.
I came up with this, but am not that happy with my solution:
function findLargest(numberArrays, n) {
var results = [];
var pointers = [];
for (var x = 0; x < numberArrays.length; x++) {
pointers.push(0);
}
while (results.length < n) {
var subMaxes = [];
for (var i = 0; i < pointers.length; i++) {
var point = pointers[i];
subMaxes.push(numberArrays[i][point]);
}
var max = Math.max.apply(null, subMaxes);
var indexOfMax = subMaxes.indexOf(max);
pointers[indexOfMax]++;
results.push(max);
}
return results;
}
I think it is O(n^2)... is there any way to do it in O(n)?
The question can be formalised (and slightly tweaked) as: given a 2D array of dimension n x n, where each row is sorted in decreasing order, find the k largest elements.
For the largest n elements, the time complexity will be O(n log n). The procedure for the k largest elements is explained below:
Build a max heap of the first element from each row: time complexity is O(n).
Extract the largest element from the heap, and insert into the heap the next element from the row to which the extracted element belongs: time complexity is O(log n).
Repeat until the desired number of elements has been extracted.
So an iteration to extract the largest number requires O(log n) time, with a pre-processing cost of O(n).
To extract k elements, the time complexity of the above algorithm is O(k log n).
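A minimal sketch of this approach, assuming a small hand-rolled binary max-heap (JavaScript has no built-in priority queue); each heap entry is a [value, rowIndex, colIndex] tuple:
function findLargest(numberArrays, n) {
  const heap = [];
  const swap = (i, j) => { [heap[i], heap[j]] = [heap[j], heap[i]]; };

  // sift a new entry up until its parent is no smaller
  function push(entry) {
    heap.push(entry);
    let i = heap.length - 1;
    while (i > 0) {
      const parent = (i - 1) >> 1;
      if (heap[parent][0] >= heap[i][0]) break;
      swap(i, parent);
      i = parent;
    }
  }

  // remove the root, move the last entry to the top and sift it down
  function pop() {
    const top = heap[0];
    const last = heap.pop();
    if (heap.length) {
      heap[0] = last;
      let i = 0;
      while (true) {
        const l = 2 * i + 1, r = 2 * i + 2;
        let largest = i;
        if (l < heap.length && heap[l][0] > heap[largest][0]) largest = l;
        if (r < heap.length && heap[r][0] > heap[largest][0]) largest = r;
        if (largest === i) break;
        swap(i, largest);
        i = largest;
      }
    }
    return top;
  }

  // seed the heap with the first (largest) element of every row
  // (repeated pushes for simplicity; a linear-time heapify would match the O(n) preprocessing claim exactly)
  numberArrays.forEach((row, r) => { if (row.length) push([row[0], r, 0]); });

  const results = [];
  while (results.length < n && heap.length) {
    const [value, r, c] = pop();                 // O(log rows) per extraction
    results.push(value);
    if (c + 1 < numberArrays[r].length) push([numberArrays[r][c + 1], r, c + 1]);
  }
  return results;
}

console.log(findLargest([[10, 5, 3, 1], [9, 8, 7, 6], [11, 2, 1, 0]], 5));
// => [11, 10, 9, 8, 7]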
Merge all arrays into a single array. This takes O(n) time, where n is the total number of elements.
Use the median-of-medians algorithm to find the kth largest element in the new array: O(n) time.
Traverse the array and grab all elements greater than or equal to that element. This takes O(n) time.
This algorithm runs in O(n) time.
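A rough sketch of this approach; for brevity it uses randomized quickselect (expected O(n)) in place of a full median-of-medians selection, which is what guarantees the worst-case O(n). The function and helper names are just illustrative:
function kLargest(numberArrays, k) {
  // 1. merge (flatten) all arrays into a single array: O(n) for n total elements
  const all = [].concat(...numberArrays);

  // 2. select the kth largest value (k is 1-based, so k = 1 means the largest)
  function quickselect(arr, k) {
    const pivot = arr[Math.floor(Math.random() * arr.length)];
    const greater = arr.filter(x => x > pivot);
    const equal = arr.filter(x => x === pivot);
    if (k <= greater.length) return quickselect(greater, k);
    if (k <= greater.length + equal.length) return pivot;
    return quickselect(arr.filter(x => x < pivot), k - greater.length - equal.length);
  }
  const threshold = quickselect(all, k);

  // 3. one more O(n) pass to grab everything >= the threshold
  //    (ties at the threshold can make this return more than k values)
  return all.filter(x => x >= threshold);
}

console.log(kLargest([[10, 5, 3, 1], [9, 8, 7, 6], [11, 2, 1, 0]], 5));
// => [10, 9, 8, 7, 11] (order follows the flattened array, not largest-first)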
Related
Given an array of integers, the values should be sorted in the following order:
if we have the array
[1, -1, -3, 9, -2, -5, 4, 8]
we must rearrange it this way: largest number, smallest number, 2nd largest number, 2nd smallest number, ...
[9, -5, 8, -3, 4, -2, 1, -1]
I get the first largest and smallest numbers, but can't figure out how to make it dynamic for all values in the array.
I know that I must take two variables, say firstSmallest and firstLargest, point them to the first and last index of the array respectively, run a loop (which I do already in the code below), and store values into a new array by incrementing firstSmallest and decrementing firstLargest, but I couldn't implement it in code.
let unsortedArr = [1, 5, 8, 7, 6, -1, -5, 4, 9, 5]
let output = [];
function meanderArray(unsorted){
let sorted = unsorted.sort((a, b) => a-b);
let firstSmallest = sorted[0];
let firstLargest = sorted[unsorted.length-1];
for(let i = 0; i <= sorted.length; i++){
//I should increment firstSmallest and decrement firstLargest numbers and store in output
}
return output;
}
meanderArray(unsortedArr);
console.log(output);
You could use a toggle object that switches between taking the first item and the last item from the array, and iterate until no more items are available.
function meanderArray([...array]) {
const
result = [],
toggle = { shift: 'pop', pop: 'shift' };
let fn = 'shift';
array.sort((a, b) => a - b);
while (array.length) result.push(array[fn = toggle[fn]]());
return result;
}
console.log(...meanderArray([1, 5, 8, 7, 6, -1, -5, 4, 9, 5]));
You can sort the array in descending order; then the logic is: take the first from the start and the first from the end, then the second from the start and the second from the end, etc.
let unsortedArr = [1, 5, 8, 7, 6, -1, -5, 4, 9, 5]
function meanderArray(unsorted){
let sorted = unsorted.sort((a, b) => b-a);
let output = []
for(let i = 0; i < sorted.length/2; i++){
output.push(sorted[i])
if(i !== sorted.length - 1 - i){
output.push(sorted[sorted.length - 1 - i])
}
}
return output;
}
let result = meanderArray(unsortedArr);
console.log(result);
You can sort, then loop and extract the last number with pop() and extract the first number with shift().
let unsortedArr = [1, -1, -3, 9, -2, -5, 4, 8,]
let output = [];
function meanderArray(unsorted){
let sorted = unsorted.sort((a, b) => a - b);
while(sorted.length){
  output.push(sorted.pop());
  if(sorted.length) output.push(sorted.shift()); // guard so an odd-length array doesn't lose its middle element
}
console.log(output);
return output;
}
meanderArray(unsortedArr);
Fastest meandering-array method among all the solutions mentioned above.
According to a JSBench.me comparison, this solution is the fastest.
I took a different approach, though I found it is very close to elvira.genkel's answer above.
In my solution for the meandering array, I first sorted the given array and found its middle. Then I divided the sorted array into two arrays: indices 0 up to the middle index, and the middle index to the end of the sorted array.
We need to make sure that the first half's length is greater than the second's; otherwise the for() loop in the next step would put some undefined values into the newly created array. To avoid this issue I incremented the first array's length by one, so it always holds that firstArr.length > secondArr.length.
Then I build the new array in meandering order: a for() loop pushes values from the beginning of the first array and from the end of the second array, making sure the dynamically computed index into the second array is only ever zero or positive; otherwise you can again end up with undefined values inside the new meandering array.
Hope this solution is helpful for everyone who loves high-performance coding :)
Your comments and suggestions are welcome.
const unsorted = [1, 5, 8, 7, 6, -1, -5, 4, 9, 5];
const sorted = unsorted.sort((a,b)=>a-b).reverse();
const half = Math.floor(sorted.length / 2) + 1; // +1 so the first half is never shorter than the second
const leftArr = sorted.slice(0, half);
const rightArr = sorted.slice(half, sorted.length);
const newArr = [];
for(let i=0; i<leftArr.length; i++) {
newArr.push(leftArr[i]);
if (rightArr.length-1-i >= 0) {
newArr.push(rightArr[rightArr.length-1-i]);
}
}
I want to find all possible maximum contiguous subarray averages from an array of values. Each array value represents the value at a duration, the number of seconds passed.
Ex. Input = [6, 4, 3, 10, 5]
Ex. Output = [5.6, 5.75, 6, 7.5, 10]
Output[0] = (6+4+3+10+5) / 5 = 5.6
Output[1] = (6+4+3+10) / 4 = 5.75
Output[2] = (3+10+5) / 3 = 6
Output[3] = (10+5) / 2 = 7.5
Output[4] = 10 / 1 = 10
The issue is that the real data has length of up to 40,000 values.
The result should have the same length as the input. I've done a reduce on subarrays of specific lengths (only 5 s, 60 s, 3600 s, etc.), but that's not a viable solution for every possible duration. Is there a way I can partition or otherwise create a specialized data structure to get these values? If not, how can I exclude durations as I go?
You can just take the reverse of the input array, then calculate the sum and average incrementally, and finally reverse the output array again.
const input = [6, 4, 3, 10, 5].reverse();
let output = [];
let total_sum = 0;
for (var i = 0; i < input.length; i++) {
total_sum += input[i];
let avg = total_sum / (i + 1);
output.push(avg);
}
console.log(output.reverse());
(You can even eliminate the reverse calls by running the loop in reverse order.)
Why not .map()? Mixed with reduce you could do something like this:
const output = [
[1, 2, 3, 4],
[5, 6, 7, 8]
];
const averages = output.map(
subarray =>
subarray.reduce(
(previousValue, currentValue) => previousValue + currentValue,
0
) / subarray.length
);
Here subarray is the collection of values; they're added together and divided by the length of the subarray.
I hope this is what you mean.
I have n (n between 1 and 100) sorted number arrays, each with m elements (m around 1000 in my case). I want to merge them into a single sorted array.
I can think of two possibilities for doing this:
1. Use a two-array merge algorithm (like the merge() function below, from http://www.nczonline.net/blog/2012/10/02/computer-science-and-javascript-merge-sort/) and apply it iteratively (1st and 2nd, then the merge of 1st-2nd with the 3rd, etc.)
function merge(left, right) {
var result = [],
il = 0,
ir = 0;
while (il < left.length && ir < right.length){
if (left[il] < right[ir]){
result.push(left[il++]);
} else {
result.push(right[ir++]);
}
}
return result.concat(left.slice(il)).concat(right.slice(ir));
}
2. Generalize the merge() function to n arrays simultaneously. At each iteration, I would pick the minimum of the n first values not yet processed and append it to the result.
Are these two algorithms equivalent in terms of complexity? I have the feeling that both are in O(m*n). Am I right?
Are there any performance considerations that would favour one algorithm over the other? I have the feeling that 1 is simpler than 2.
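For concreteness, a rough sketch of what option 2 could look like (the linear scan over the n current heads is what each step costs; the function name is just illustrative):
function mergeN(arrays) {
  const pointers = arrays.map(() => 0);   // one read position per array
  const result = [];
  while (true) {
    // linear scan over the current heads to find the smallest one
    let minIndex = -1;
    for (let i = 0; i < arrays.length; i++) {
      if (pointers[i] < arrays[i].length &&
          (minIndex === -1 || arrays[i][pointers[i]] < arrays[minIndex][pointers[minIndex]])) {
        minIndex = i;
      }
    }
    if (minIndex === -1) break;           // every array is exhausted
    result.push(arrays[minIndex][pointers[minIndex]++]);
  }
  return result;
}

console.log(mergeN([[1, 3, 5], [2, 4], [0, 6]]));
// => [0, 1, 2, 3, 4, 5, 6]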
Merge the n arrays using a priority queue (based on a binary heap, for example).
The overall element count is m*n, so the algorithm complexity is O(m * n * log(n)).
Algorithm sketch:
Add the numbers 1..n to the priority queue, using the 1st element of every array as the sorting key (you may also use pairs of (first element, array number)).
At every step:
J = pop_minimum
add the current head of the Jth array to the result
move the head of the Jth array one position to the right
if the Jth array is not exhausted, insert J back into the queue (with the new sorting key)
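A compact sketch of this, using a sorted array of [head value, array index, element index] entries as a simple stand-in for the binary heap (a real heap would give the O(log n) insertion stated above; the splice-based insertion here is O(n) per step but keeps the example short):
function mergeWithQueue(arrays) {
  const queue = [];                         // kept sorted by head value, ascending
  const insert = (entry) => {
    let i = 0;
    while (i < queue.length && queue[i][0] < entry[0]) i++;
    queue.splice(i, 0, entry);
  };
  arrays.forEach((arr, j) => { if (arr.length) insert([arr[0], j, 0]); });

  const result = [];
  while (queue.length) {
    const [value, j, k] = queue.shift();    // pop the minimum
    result.push(value);
    if (k + 1 < arrays[j].length) insert([arrays[j][k + 1], j, k + 1]); // advance that array's head
  }
  return result;
}

console.log(mergeWithQueue([[1, 4, 9], [2, 3, 11], [0, 5]]));
// => [0, 1, 2, 3, 4, 5, 9, 11]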
The 1st algorithm's complexity is
2*m + 3*m + 4*m + ... + n*m = m * (n*(n+1)/2 - 1) = O(n^2 * m)
That's an old question, but for the sake of posterity:
Both algos are indeed equivalent in complexity, though it works out to O(n^2 * m) rather than O(n*m): in algo 1 you have to re-merge the growing result for each of the n arrays, and in algo 2 you do just one big merge, but picking out the minimum of the n array heads at every step is still linear in n.
What I did instead was implement a modified version of merge sort to get O(m*n*log(n)).
The code is there on GitHub https://github.com/jairemix/merge-sorted if anyone needs it.
Here's how it works
The idea is to modify algo 1 and merge each array pairwise instead of linearly.
So in the first iteration you would merge array1 with array2, array3 with array4, etc.
Then in the second iteration, you would merge array1+array2 with array3+array4, array5+array6 with array7+array8, etc.
For example:
// starting with:
[1, 8], [4, 14], [2, 5], [3, 7], [0, 6], [10, 12], [9, 15], [11, 13]
// after iteration 1:
[1, 4, 8, 14], [2, 3, 5, 7], [0, 6, 10, 12], [9, 11, 13, 15]
// after iteration 2
[1, 2, 3, 4, 5, 7, 8, 14], [0, 6, 9, 10, 11, 12, 13, 15]
// after iteration 3
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15]
In JS:
function mergeSortedArrays(arrays) {
// while there are still unmerged arrays
while (arrays.length > 1) {
const result = [];
// merge arrays in pairs
for (let i = 0; i < arrays.length; i += 2) {
const a1 = arrays[i];
const a2 = arrays[i + 1];
// a2 can be undefined if arrays.length is odd, so let's do a check
const mergedPair = a2 ? merge2SortedArrays(a1, a2) : a1;
result.push(mergedPair);
}
arrays = result;
}
// handle the case where no arrays are input
return arrays.length === 1 ? arrays[0] : [];
}
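Here merge2SortedArrays is just a standard two-way merge of two sorted arrays (essentially the merge() function quoted in the question); a minimal version for completeness:
function merge2SortedArrays(left, right) {
  const result = [];
  let il = 0, ir = 0;
  while (il < left.length && ir < right.length) {
    if (left[il] < right[ir]) result.push(left[il++]);
    else result.push(right[ir++]);
  }
  // append whatever is left of either input
  return result.concat(left.slice(il)).concat(right.slice(ir));
}

console.log(mergeSortedArrays([[1, 8], [4, 14], [2, 5], [3, 7]]));
// => [1, 2, 3, 4, 5, 7, 8, 14]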
Notice the similarity to merge sort. In fact, merge sort is the special case where m = 1: you're starting further back, with n presorted "arrays" of 1 item each, hence merge sort's O(n*log(n)) complexity.
Given an index of 5, and an array size of 10, this array is returned: [5, 4, 6, 3, 7, 2, 8, 1, 9, 0]
Code:
function middleOutIterator(index, arraySize) {
var distances = [];
for (var i = 0; i < arraySize; i++) {
distances[i] = [ i, Math.abs(index - i) ];
}
distances.sort(sort);
for (var i = 0; i < distances.length; i++) {
distances[i] = distances[i][0];
}
return distances;
}
function sort(a, b) {
  return a[1] - b[1]; // compare numerically by distance; a boolean comparator is not spec-compliant
}
Basically, you pass in a starting index, and it iterates outward in either direction.
This is not a true iterator, it merely creates an array of indices, so the name I gave it is a bit of a misnomer, but what would you call this kind of iteration/sorting?
I'm not looking to optimize this function as it's not in a crucial area and certainly not a bottleneck, but I am interested in reading more about it and any related algorithms.
This type of iteration was recommended in the initial paper on Robin Hood hashing, where it was used in the searching step when doing lookups. The paper refers to it as "mean-centric searching," since the idea is to jump to the (expected) middle of a range and search outwards in both directions around the mean.
I'm not sure if this is the "official" name of this technique or whether it goes by many names, but it's nice to have something to point to.
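For illustration, the same outward-alternating order can be produced directly, stepping away from the starting index one offset at a time instead of building and sorting a distance table (a sketch with an illustrative name, not a drop-in replacement):
function middleOutIndices(index, arraySize) {
  const result = [index];
  for (let offset = 1; result.length < arraySize; offset++) {
    if (index - offset >= 0) result.push(index - offset);   // lower side first, matching the example output
    if (index + offset < arraySize) result.push(index + offset);
  }
  return result;
}

console.log(middleOutIndices(5, 10));
// => [5, 4, 6, 3, 7, 2, 8, 1, 9, 0]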
What would be the best way to shuffle an array of numbers with the condition that each number must be +3 or -3 of the next/prev number? So, for example [0,1] wouldn't work, but [0,3] would.
Thanks!
Looking at the screenshot it seems you're wanting to pick a random assortment from the list, with no 2 choices being within 3 of each other.
This code takes an array, and gives you a subset of the array satisfying that condition.
You can specify a maximum number of selections too, although you might not always get that many.
var src = [0,1,2,3,4,5,6,7,8,9,10,11,12];
var getRnd = function(max){
var output = [];
var newSrc = src.slice();
var test, index, i, safe;
while (newSrc.length > 0 && output.length < max){
index = Math.floor(Math.random()*newSrc.length);
test = newSrc.splice(index,1)[0]; // splice returns an array, so take the removed value itself
//Make sure it's not within 3
safe = true;
for (i=0; i<output.length;i++){
if(Math.abs(test-output[i]) < 3){
//abort!
safe=false;
}
}
if(safe){
output.push(test);
}
}
return output;
};
alert(getRnd(4));
A way (likely not the fastest) would be to:
sort the array
pick a random element to start the new shuffled array with (mark the element in the sorted array as used, or remove it)
with binary search, find the next element that is +3 or -3 from the last one (randomly pick between -3 and +3); make sure the element is not already marked as used (otherwise find another one)
repeat step 3 while you can still find elements.
You have either picked all elements from the sorted array, or such a shuffling is not possible.
I think you get O(N*log N) with this (sorting is N*log N, and picking N elements costs log N for each search).
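A rough sketch of that idea (the function and helper names are illustrative; duplicates are handled by scanning around the binary-search hit for an unused copy, and the greedy walk simply stops when neither +3 nor -3 can be found):
function shuffleBy3(values) {
  const sorted = values.slice().sort((a, b) => a - b);
  const used = new Array(sorted.length).fill(false);

  // binary search for the target value, then look around the hit for an unused copy
  function findUnused(target) {
    let lo = 0, hi = sorted.length - 1;
    while (lo <= hi) {
      const mid = (lo + hi) >> 1;
      if (sorted[mid] === target) {
        for (let i = mid; i >= 0 && sorted[i] === target; i--) if (!used[i]) return i;
        for (let i = mid + 1; i < sorted.length && sorted[i] === target; i++) if (!used[i]) return i;
        return -1;
      }
      if (sorted[mid] < target) lo = mid + 1; else hi = mid - 1;
    }
    return -1;
  }

  const result = [];
  const start = Math.floor(Math.random() * sorted.length);  // random element to start with
  used[start] = true;
  result.push(sorted[start]);

  while (true) {
    const last = result[result.length - 1];
    const step = Math.random() < 0.5 ? 3 : -3;            // randomly prefer +3 or -3
    let next = findUnused(last + step);
    if (next === -1) next = findUnused(last - step);      // otherwise try the other direction
    if (next === -1) break;                               // neither neighbour is available
    used[next] = true;
    result.push(sorted[next]);
  }
  return result;  // shorter than the input if the walk got stuck before using every element
}

console.log(shuffleBy3([0, 3, 6, 9, 3, 6]));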
Assuming that the values in the array cannot be duplicated.
function one(array, mod){
  var modArray = [];
  for(var index = 0; index < array.length; index++){
    var item = array[index];
    var itemMod = ((item % 3) + 3) % 3; // normalise so negative values also land in 0..2
    if(itemMod === mod){
      modArray.push(item);
    }
  }
  return modArray;
}
function two(modArray){
  var sortedArray = modArray.slice().sort(function(a, b){ return a - b; }); // sort lowest to highest
  for(var index = 1; index < sortedArray.length; index++){
    // every element must be exactly 3 above the previous one
    if(sortedArray[index - 1] !== sortedArray[index] - 3){
      return 0;
    }
  }
  return sortedArray.length;
}
function main(array){
  var a1 = one(array, 0);
  var a2 = one(array, 1);
  var a3 = one(array, 2);
  var a1c = two(a1);
  var a2c = two(a2);
  var a3c = two(a3);
  // return whichever of a1 / a2 / a3 has the greatest count
  if(a1c >= a2c && a1c >= a3c) return a1;
  return a2c >= a3c ? a2 : a3;
}
I think you must be using the phrase "shuffle" in some non-standard way. If all of the numbers are already within +-3 of each other, then sorting the array will put them in the right order, unless there are duplicates, I guess.
More examples would probably be helpful. For instance, are these examples valid, and the sort of thing you're looking for?
[0, 3, 3] -> [3, 0, 3]
[9, 3, 6, 0, 6] -> [0, 3, 6, 9, 6]
[3, 3, 6, 0, 6] -> [0, 3, 6, 3, 6]
It feels like this is probably a solved problem in graph theory - some kind of network traversal with a maximum/minimum cost function.