Find the lexicographically smallest sequence achievable - javascript

Here's the problem statement
Given a sequence of n integers arr, determine the lexicographically smallest sequence which may be obtained from it after performing at most k element swaps, each involving a pair of consecutive elements in the sequence.
Note: A list x is lexicographically smaller than a different equal-length list y if and only if, for the earliest index at which the two lists differ, x's element at that index is smaller than y's element at that index.
I'm trying to wrap my head around what the phrase "lexicographically smaller" implies based on the above note. As I understand the English meaning, we are basically talking about dictionary order. Let me explain my question with an example.
Here's an example
Example 1
n = 3
k = 2
arr = [5, 3, 1]
output = [1, 5, 3]
We can swap the 2nd and 3rd elements, followed by the 1st and 2nd elements, to end up with the sequence [1, 5, 3]. This is the lexicographically smallest sequence achievable after at most 2 swaps.
The above example came with the problem statement. But wouldn't the lexicographically smallest sequence be [1, 3, 5] instead of the provided answer (output) [1, 5, 3]?
Here's another
Example 2
n = 5
k = 3
arr = [8, 9, 11, 2, 1]
output = [2, 8, 9, 11, 1]
We can swap [11, 2], followed by [9, 2], then [8, 2].
Again, the answer I can see in this case is [1, 2, 8, 11, 9] (after three swaps), which is the lexicographically smallest, yet the provided answer is output = [2, 8, 9, 11, 1].
Am I reading the problem statement incorrectly?

The problem statement says that we are allowed to make at most k swaps of consecutive elements to obtain the lexicographically smallest sequence. The following explanation can help us understand it better. [NOTE: keep in mind that you can only swap consecutive elements]
n = 3
k = 2
arr = [5, 3, 1]
output = [1, 5, 3]
Approach:
swap 1: swap 3 and 1 (3<->1) ====> [5,1,3]
swap 2: swap 5 and 1 (5<->1) ====> [1,5,3] #RESULT
n = 5
k = 3
arr = [8, 9, 11, 2, 1]
output = [2, 8, 9, 11, 1]
Approach:
swap 1: swap 11 and 2 (11<->2) ===> [8, 9, 2, 11, 1]
swap 2: swap 9 and 2 (9<->2) ===> [8, 2, 9, 11, 1]
swap 3: swap 8 and 2 (8<->2) ===> [2, 8, 9, 11, 1] #RESULT
So, you can never get [1, 2, 8, 11, 9] with 3 swaps of consecutive elements. 2 is the smallest element that you can move to the first index with at most 3 swaps of consecutive elements, but yes if k=4 then we can bring 1 to the first position.
So, the thing that you are missing is the rule: you are allowed to make at most k swaps, but each swap must involve two elements that are consecutive (adjacent) to each other.
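To make the greedy idea concrete, here is a possible sketch in JavaScript (the function name smallestWithKSwaps is my own, not part of the original problem): for each position, scan at most k elements ahead, pick the smallest one, and bubble it forward with adjacent swaps, paying one swap per step.
// Greedy sketch: at each position, bring the smallest element that is
// reachable within the remaining swap budget to the front.
function smallestWithKSwaps(arr, k) {
  const a = arr.slice(); // work on a copy, leave the input untouched
  for (let i = 0; i < a.length && k > 0; i++) {
    // find the smallest value within reach of the remaining budget
    let minIndex = i;
    for (let j = i + 1; j < a.length && j - i <= k; j++) {
      if (a[j] < a[minIndex]) minIndex = j;
    }
    // bubble it to position i with adjacent swaps, one unit of budget per step
    for (let j = minIndex; j > i; j--) {
      [a[j - 1], a[j]] = [a[j], a[j - 1]];
      k--;
    }
  }
  return a;
}
console.log(smallestWithKSwaps([5, 3, 1], 2));        // [1, 5, 3]
console.log(smallestWithKSwaps([8, 9, 11, 2, 1], 3)); // [2, 8, 9, 11, 1]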

Related

JavaScript - Unexpected result while removing elements from an array using Array.prototype.splice

I know this is an annoying question, but can someone explain why the splice method is executing in a weird way? Please explain why the expected output is different from the actual result.
let numbers = [15, 12, 15, 3, 5, 4, 6];
// Get the indexes of the numbers greater than 5
let indexes = numbers.reduce((arr, current, index) => {
  if (current > 5) {
    arr.push(index);
  }
  return arr;
}, []);
// Loop through the indexes while removing the indexes from the numbers array
indexes.forEach((element) => {
  numbers.splice(element, 1);
});
// expected result: numbers = [ 3, 5, 4 ];
// actual result:   numbers = [ 12, 3, 4, 6 ]
.splice() changes the array it is used on. You might have already known this, but if you debug your code using a console.log, you'll see what's happening. In short, your first number > 5 is 15, and 15 is at index 0, so you remove index 0. However, as splice changes the array it is used on, 12 becomes index 0, the second 15 becomes index 1, and so on and so forth. So your code collects the following indexes: 0, 1, 2, 6.
The first time you remove index 0: [12, 15, 3, 5, 4, 6]
Then you remove index 1: [12, 3, 5, 4, 6]
Then you remove index 2: [12, 3, 4, 6]
Then you remove index 6, which doesn't exist: [12, 3, 4, 6]
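If you wanted to keep the splice-by-index approach, one possible fix (just a sketch) is to walk the recorded indexes from highest to lowest, so each removal only shifts elements you have already dealt with:
let numbers = [15, 12, 15, 3, 5, 4, 6];
let indexes = numbers.reduce((arr, current, index) => {
  if (current > 5) {
    arr.push(index);
  }
  return arr;
}, []);
// splice from the end so earlier removals can't shift the indexes still to come
indexes.reverse().forEach((element) => {
  numbers.splice(element, 1);
});
console.log(numbers); // [ 3, 5, 4 ]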
The better way of accomplishing that goal is with .filter(). Filter creates a new array of all items that pass the test given in the callback, so:
numbers = numbers.filter((num) => num < 6);
That's the arrow function expression shorthand to return only numbers less than 6.
splice actually removes the item in place. It does not create any copy of the array. In your case, after the reduce operation the indexes would be
[0, 1, 2, 6]
and then, while iterating and splicing, in the first iteration the element at position 0 is removed, so the array becomes
numbers = [12, 15, 3, 5, 4, 6];
and its length is also reduced. On the next iteration of forEach the element at index position 1 is removed, which is 15 in our case. So after the second iteration the array becomes
numbers = [12, 3, 5, 4, 6];
Similarly, in the next subsequent iterations you end up with
[12, 3, 4, 6]
As someone has mentioned, the problem is applying changes to an array that is mutated in every iteration.
I assume the example is for learning purposes, as it would have been easier to write it like:
let numbers = [15, 12, 15, 3, 5, 4, 6];
numbers = numbers.filter(elem => elem <= 5);
In any case, and following the demonstration code, it is worth stressing the dangers of mutation, which is prone to spooky effects. I have rewritten the code in a more functional style:
let numbers = [15, 12, 15, 3, 5, 4, 6];
// Get the indexes of the numbers greater than 5
let indexes = numbers.reduce((arr, current, index) => {
  if (current > 5) {
    return arr.concat(index);
  }
  return arr;
}, []);
// Instead of removing, we create a new array filtering out the elements we don't want
let filteredNumbers = numbers.filter((_, index) => indexes.indexOf(index) === -1);
console.log(filteredNumbers);
// expected result: [ 3, 5, 4 ]
// actual result:   [ 3, 5, 4 ]

Finding and removing matching and corresponding values in an array

Here's a sample of the problem I'm having in JavaScript:
first array [1, 2, 3, 4, 5, 6, 7]
second array [7, 8, 9, 4, 2, 5, 7]
In this case, I need to be able to find and eliminate "4" and "7" from both arrays, eliminating both. This is based on their location and matching value.
I haven't been able to find anything other than eliminating matching values. In this case, however, the values must be in the same place and also be matching.
I've tried this so far:
function findCommonElements3(arr1, arr2) {
  return arr1.some(item => arr2.includes(item));
}
It looks like it only looks for matching elements, whereas I need to find matching corresponding elements and then remove them.
As mentioned in the comments, you may use the splice method to remove one or more elements of an array in JavaScript.
First of all, I would store the indexes of the elements that should be removed by looping over the array like so:
const array1 = [1, 2, 3, 4, 5, 6, 7];
const array2 = [7, 8, 9, 4, 2, 5, 7];
// Indexes of same elements
var sameIndexes = [];
function findSameIndexes(element, index) {
  if (array1[index] == array2[index]) {
    sameIndexes.push(index);
  }
}
array1.forEach(findSameIndexes);
Calling console.log(sameIndexes) should give this result:
Array [3, 6]
The problem is that if you loop over the array again and remove the elements in that order, the indexes would no longer correspond to the elements.
For example, if you remove the element at index 3, the number 7 wouldn't be at index 6 anymore. To solve this issue I'd use the reverse method, so you won't lose track of the indexes:
// A simple function to remove the elements in both arrays
function removeElements(index) {
  array1.splice(index, 1);
  array2.splice(index, 1);
}
sameIndexes.reverse().forEach(removeElements);
And the final results would be
Array [1, 2, 3, 5, 6]
Array [7, 8, 9, 2, 5]
Which hopefully is what you were looking for, of course there are better ways to write it down, but maybe this will help you find a solution.
You could just use a for loop and use the index, something like this:
const firstarray = [1, 2, 3, 4, 5, 6, 7];
const secondarray = [7, 8, 9, 4, 2, 5, 7];
for (let i = 0; i < firstarray.length; i++) {
  if (firstarray[i] === secondarray[i]) {
    console.log(`found ${firstarray[i]} at index ${i}`);
    firstarray.splice(i, 1);
    secondarray.splice(i, 1);
    i--; // step back so the element that just shifted into slot i is checked too
  }
}
console.log(firstarray, secondarray);
const excludeCommon = (ar1, ar2) => {
  // concatenate both arrays, then drop every value that equals the element at the
  // same position in the other half (offset by +ar1.length or -ar1.length)
  const both = [...ar1, ...ar2].filter((v, i, ar) => v !== ar[i + (2 * (i < ar1.length) - 1) * ar1.length]);
  // split the surviving values back into the two result arrays
  return [both.slice(0, both.length / 2), both.slice(both.length / 2)];
};
console.log(excludeCommon([1, 2, 3, 4, 5, 6, 7], [7, 8, 9, 4, 2, 5, 7]));

Swapping array values with destructuring assignment and indexOf()

I'm trying to swap the two lowest values in a shuffled array containing the numbers 0-14. For those curious, I'm implementing the shuffling algorithm for a 15 puzzle described by pkpnd here.
I wanted to try destructuring assignment, as described here, but am encountering an unexpected behavior. I realize that I can get my code working (and make it more readable) by just creating a temporary variable, but I'd like to understand what's happening before moving on.
I'm grabbing a subset of my array [1,2] and then trying to replace it with [2,1]. For some reason, it's only swapping the values when their order in the original array is opposite of the order of my subset.
I originally tried this:
var arr1 = [1, 2, 3, 4];
var arr2 = [2, 1, 3, 4];
[arr1[arr1.indexOf(1)], arr1[arr1.indexOf(2)]] = [arr1[arr1.indexOf(2)], arr1[arr1.indexOf(1)]];
[arr2[arr2.indexOf(1)], arr2[arr2.indexOf(2)]] = [arr2[arr2.indexOf(2)], arr2[arr2.indexOf(1)]];
console.log("arr1: " + arr1, "\narr2: " + arr2);
And then tried this:
var arr1 = [1, 2, 3, 4];
var arr2 = [2, 1, 3, 4];
[arr1[arr1.indexOf(1)], arr1[arr1.indexOf(2)]] = [2, 1];
[arr2[arr2.indexOf(1)], arr2[arr2.indexOf(2)]] = [2, 1];
console.log("arr1: " + arr1, "\narr2: " + arr2);
But both produce identical output:
arr1: 1,2,3,4
arr2: 1,2,3,4
I would expect that the position of 1 and 2 would be swapped for both arrays, but they're only swapped in arr2. I suspect this has to do with the way the initial array subset [1,2] is created, but I'm not sure.
Can anybody explain why the values aren't always swapped?
The result is simple to explain: the destructuring assignment is processed step by step, one target at a time, and while the values are changing, the indexes returned by indexOf change as well.
Case 1 [1, 2, 3, 4]
Get index of target for the first value 2
[arr1[arr1.indexOf(1)], arr1[arr1.indexOf(2)]] = [2, 1];
// ^^^^^^^^^^^^^^^ 0
Assign 2 to arr1[0]
[2, 2, 3, 4]
Get index of target for the second value 1:
[arr1[arr1.indexOf(1)], arr1[arr1.indexOf(2)]] = [2, 1];
// ^^^^^^^^^^^^^^^ 0
Assign 1 to arr1[0]
[1, 2, 3, 4]
Case 2 [2, 1, 3, 4]
Get index of target for the first value 2
[arr2[arr2.indexOf(1)], arr2[arr2.indexOf(2)]] = [2, 1];
// ^^^^^^^^^^^^^^^ 1
Assign 2 to arr2[1]
[2, 2, 3, 4]
Get index of target for the second value 1:
[arr2[arr2.indexOf(1)], arr2[arr2.indexOf(2)]] = [2, 1];
// ^^^^^^^^^^^^^^^ 0
Assign 1 to arr2[0]
[1, 2, 3, 4]
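One way to get the swap you intended (a sketch, not part of the original question or answer) is to resolve both indexes before the assignment, so the later lookups can no longer be disturbed by the earlier write:
var arr1 = [1, 2, 3, 4];
// look both positions up first, then destructure using the fixed indexes
var i = arr1.indexOf(1);
var j = arr1.indexOf(2);
[arr1[i], arr1[j]] = [arr1[j], arr1[i]];
console.log("arr1: " + arr1); // arr1: 2,1,3,4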

while(i--) loop in javascript

I normally use a while loop as:
while (i < some_value)
I saw the while (i--) syntax, thought it was shorter and cooler, and tried the following in Google Chrome.
var num_arr= [4,8,7,1,3];
var length_of_num_arr=num_arr.length;
while(length_of_num_arr--) console.log(num_arr);
[4, 8, 7, 1, 3]
[4, 8, 7, 1, 3]
[4, 8, 7, 1, 3]
[4, 8, 7, 1, 3]
[4, 8, 7, 1, 3] **// THIS IS EXPECTED RESULT**
But When I try...
while((num_arr.length)--) console.log(num_arr);
[4, 8, 7, 1]
[4, 8, 7]
[4, 8]
[4]
[] // WHY IS THIS HAPPENING??
Is there something hidden you need to understand to use this syntax?
Arrays’ length property is writable, and will cut off their elements or add empty slots as appropriate when you set it.
var items = [1, 2, 3, 4, 5];
items.length = 3; // items is now [1, 2, 3]
items.length = 6; // items is now a sparse array: [1, 2, 3, undefined × 3]
So, don’t do that.
When you do array.length-- you're potentially shortening the array by one element each time.
See the reference under the section Shortening the array from: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/length
An array's length property can be reassigned programmatically, and doing so truncates the array to the new length you assign.
For example
a = [1,2,3,4,5,6,7,8,9,10];
// Shorten the array by one element
a.length--; // a <-- [1,2,3,4,5,6,7,8,9]
// In case you want to shorten your array to 3 elements, you can:
a.length = 3; // a <-- [1,2,3]
When you set the length property of an array to a lower value, the items at the end are removed:
var arr = [1,2,3,4,5];
arr.length; // 5
arr.length = 3;
arr; // [1,2,3]
This is described in the spec:
While newLen < oldLen repeat,
Set oldLen to oldLen – 1.
Let deleteSucceeded be the result of calling the [[Delete]] internal method of A passing ToString(oldLen) and false as arguments.
In your code you use the postfix decrement operator (--), which reduces the length of the array by one every time the loop condition is evaluated.
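For contrast, a minimal sketch of the non-destructive version: copy the length into an ordinary variable and decrement that instead, so the array itself is never touched.
var arr = [4, 8, 7, 1, 3];
var i = arr.length;               // plain counter, separate from arr.length
while (i--) console.log(arr[i]);  // logs 3, 1, 7, 8, 4
console.log(arr.length);          // still 5, arr is untouched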

Merging n sorted arrays in Javascript

I have n (n between 1 and 100) sorted number arrays, each with m elements (m around 1000 in my case). I want to merge them into a single sorted array.
I can think of two possibilities for doing this:
1. Use a two-array merging algorithm (like the merge() function below, from http://www.nczonline.net/blog/2012/10/02/computer-science-and-javascript-merge-sort/) and apply it iteratively (1st and 2nd, then the merge of 1st-2nd with the 3rd, etc.)
function merge(left, right) {
  var result = [],
      il = 0,
      ir = 0;
  while (il < left.length && ir < right.length) {
    if (left[il] < right[ir]) {
      result.push(left[il++]);
    } else {
      result.push(right[ir++]);
    }
  }
  return result.concat(left.slice(il)).concat(right.slice(ir));
}
2. Generalize the merge() function to n arrays simultaneously. At each iteration, I would pick the minimum of the n first not-yet-processed values and append it to the result.
Are these two algorithms equivalent in terms of complexity? I have the feeling that both are O(m*n). Am I right?
Are there any performance considerations for choosing one algorithm over the other? I have the feeling that 1 is simpler than 2.
Merge the n arrays using a priority queue (based on a binary heap, for example).
The overall element count is m*n, so the algorithm's complexity is O(m * n * log(n)).
Algorithm sketch:
Add the numbers 1..n to the priority queue, using the 1st element of every array as its sorting key (you may also use pairs of (first element, array number)).
At every step:
J = pop_minimum
add the current head of the Jth array to the result
move the head of the Jth array to the right
if the Jth array is not exhausted, insert J back into the queue (with the new sorting key)
The 1st algorithm's complexity is
2*m + 3*m + 4*m + ... + n*m = m * (n*(n+1)/2 - 1) = O(n^2 * m)
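For reference, here is a rough JavaScript sketch of the priority-queue approach described above (the hand-rolled binary min-heap and the name kWayMerge are my own, not from the answer):
function kWayMerge(arrays) {
  // min-heap of [value, arrayIndex, elementIndex] triples, ordered by value
  const heap = [];
  const less = (a, b) => heap[a][0] < heap[b][0];
  const swap = (a, b) => { [heap[a], heap[b]] = [heap[b], heap[a]]; };
  const push = (item) => {
    heap.push(item);
    let i = heap.length - 1;
    while (i > 0) {                       // sift up
      const parent = (i - 1) >> 1;
      if (!less(i, parent)) break;
      swap(i, parent);
      i = parent;
    }
  };
  const pop = () => {
    const top = heap[0];
    const last = heap.pop();
    if (heap.length) {
      heap[0] = last;
      let i = 0;
      while (true) {                      // sift down
        const l = 2 * i + 1, r = 2 * i + 2;
        let smallest = i;
        if (l < heap.length && less(l, smallest)) smallest = l;
        if (r < heap.length && less(r, smallest)) smallest = r;
        if (smallest === i) break;
        swap(i, smallest);
        i = smallest;
      }
    }
    return top;
  };

  // seed the heap with the head of every non-empty array
  arrays.forEach((arr, j) => { if (arr.length) push([arr[0], j, 0]); });

  const result = [];
  while (heap.length) {
    const [value, j, idx] = pop();        // smallest current head
    result.push(value);
    if (idx + 1 < arrays[j].length) {     // advance that array's head
      push([arrays[j][idx + 1], j, idx + 1]);
    }
  }
  return result;
}
console.log(kWayMerge([[1, 4, 8], [2, 5, 7], [0, 6, 10]]));
// [0, 1, 2, 4, 5, 6, 7, 8, 10]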
That's an old question, but for the sake of posterity:
Both algorithms are actually O(n^2 * m). In algo 1, you have to re-merge the growing result against each of the n arrays. In algo 2, you do just one big merge, but picking out the minimum among the n current heads is still linear for every element you output.
What I did instead was implement a modified version of merge sort to get O(n * m * log(n)).
The code is there on GitHub https://github.com/jairemix/merge-sorted if anyone needs it.
Here's how it works
The idea is to modify algo 1 and merge each array pairwise instead of linearly.
So in the first iteration you would merge array1 with array2, array3 with array4, etc.
Then in the second iteration, you would merge array1+array2 with array3+array4, array5+array6 with array7+array8, etc.
For example:
// starting with:
[1, 8], [4, 14], [2, 5], [3, 7], [0, 6], [10, 12], [9, 15], [11, 13]
// after iteration 1:
[1, 4, 8, 14], [2, 3, 5, 7], [0, 6, 10, 12], [9, 11, 13, 15]
// after iteration 2
[1, 2, 3, 4, 5, 7, 8, 14], [0, 6, 9, 10, 11, 12, 13, 15]
// after iteration 3
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15]
In JS:
function mergeSortedArrays(arrays) {
  // while there are still unmerged arrays
  while (arrays.length > 1) {
    const result = [];
    // merge arrays in pairs
    for (let i = 0; i < arrays.length; i += 2) {
      const a1 = arrays[i];
      const a2 = arrays[i + 1];
      // a2 can be undefined if arrays.length is odd, so let's do a check
      // (merge2SortedArrays is a standard two-way merge, e.g. the merge() function from the question)
      const mergedPair = a2 ? merge2SortedArrays(a1, a2) : a1;
      result.push(mergedPair);
    }
    arrays = result;
  }
  // handle the case where no arrays are input
  return arrays.length === 1 ? arrays[0] : [];
}
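To actually run the snippet, merge2SortedArrays just needs to be any standard two-way merge; assuming the merge() function from the question is in scope, the wiring could look like this (my own example, not from the answer):
const merge2SortedArrays = merge; // reuse the two-way merge() shown in the question
console.log(mergeSortedArrays([
  [1, 8], [4, 14], [2, 5], [3, 7], [0, 6], [10, 12], [9, 15], [11, 13]
]));
// [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15]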
Notice the similarity to merge sort. In fact, merge sort is just this algorithm starting further back, with every presorted array holding a single item, so for N total items you get merge sort's familiar O(N log N) complexity.
