How to find the highest number in a 2-dimensional array? - javascript

Hi all and thanks in advance for your help.
So I would like to find the highest number in a 2-dimensional array.
Below is the code:
const matrix = [
  [2, 56, 10],
  [20, 34, 10],
  [23, 144, 26]
];
let maximum = matrix[0][0];
for (var row = 0; row < matrix.length; row++) {
  for (var col = 0; col < matrix.length; col++) {
    if (matrix[row][col] > maximum) {
      maximum = matrix[row][col];
    }
  }
}
document.write(' -- ', maximum);
Here is my problem: could you please help me understand why, when the array contains more numbers, I don't get the highest value? Here is an example:
const matrix = [
  [2, 56, 10, 14, 422, 3242],
  [20, 34, 55, 100, 33, 422],
  [23, 12, 14, 26, 203, 233]
];
let maximum = matrix[0][0];
for (var row = 0; row < matrix.length; row++) {
  for (var col = 0; col < matrix.length; col++) {
    if (matrix[row][col] > maximum) {
      maximum = matrix[row][col];
    }
  }
}
document.write(' -- ', maximum);

row < matrix.length tests the correct thing. col < matrix.length does not: you should replace it with col < matrix[row].length.
However, there is an easier way, using some of the newer JavaScript features:
const matrix = [[2,56,10,14,422,3242],[20,34,55,100,33,422],[23,12,14,26,203,233]];
const maximum = Math.max(...matrix.flat())
console.log(maximum);
matrix.flat() flattens the two-dimensional array into one dimension, and the spread (...) syntax passes each value of the flattened array as its own argument to Math.max, which then returns the biggest one.
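One caveat (not raised in the question): spreading a very large flattened array into Math.max can exceed the engine's argument limit and fail with a RangeError. A reduce over the flattened values avoids that; here is a small sketch of the same idea:
const matrix = [[2,56,10,14,422,3242],[20,34,55,100,33,422],[23,12,14,26,203,233]];
// Fold over the flattened values instead of spreading them all at once.
const maximum = matrix.flat().reduce((max, n) => (n > max ? n : max), -Infinity);
console.log(maximum); // 3242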

There is one small mistake: when you iterate over the columns, make sure you iterate up to the number of columns.
matrix.length gives you the number of rows, while matrix[row].length gives you the number of columns.
const matrix = [[2,56,10,14,422,3242],[20,34,55,100,33,422],[23,12,14,26,203,233]];
let maximum = matrix[0][0];
for (var row = 0; row < matrix.length; row++) {
  for (var col = 0; col < matrix[row].length; col++) {
    if (matrix[row][col] > maximum) {
      maximum = matrix[row][col];
    }
  }
}
document.write(' -- ', maximum);

You are using matrix.length for the number of columns as well, but it gives you the number of rows, i.e. 3, while in your case the number of columns is 6. That's why it only checks the first 3 numbers of each row.
const matrix = [[2,56,10,14,422,3242],[20,34,55,100,33,422],[23,12,14,26,203,233]];
let maximum = matrix[0][0];
for (var row = 0; row < matrix.length; row++) {
  for (var col = 0; col < matrix[0].length; col++) { // <-- correction
    if (matrix[row][col] > maximum) {
      maximum = matrix[row][col];
    }
  }
}
document.write(' -- ', maximum);

You can also use Array.prototype.reduce() combined with Math.max():
const matrix = [[2,56,10,14,422,3242],[20,34,55,100,33,422],[23,12,14,26,203,233]];
// Seed with -Infinity so the result is also correct when every value is negative.
const maximum = matrix.reduce((acc, row) => Math.max(acc, ...row), -Infinity);
console.log(maximum);

Related

How to create 2d array using "for" loop in Javascript?

I need to write a program that creates a 2D array in a variable "numbers" with 5 rows and 4 columns. The elements of the array have to be consecutive integers starting at 1 and ending at 20. I have to use a "for" loop.
[ 1, 2, 3, 4 ],
[ 5, 6, 7, 8 ],
[ 9, 10, 11, 12 ],
[ 13, 14, 15, 16 ],
[ 17, 18, 19, 20 ],
So I came up with that:
const numbers = [];
const columns = 4;
const rows = 5;
for (let i = 0; i < rows; i++) {
  numbers[i] = [];
  for (let j = 0; j < columns; j++) {
    numbers[i][j] = j + 1;
  }
}
console.log(numbers);
But the result of this is five identical rows, like this:
[ 1, 2, 3, 4 ],
[ 1, 2, 3, 4 ],
[ 1, 2, 3, 4 ],
[ 1, 2, 3, 4 ],
[ 1, 2, 3, 4 ]
Do you have any idea how to fix it, so that the second row starts from 5?
Here is some updated code. You need to add i*columns to every value
const numbers = [];
const columns = 4;
const rows = 5;
for (let i = 0; i < rows; i++) {
  numbers[i] = [];
  for (let j = 0; j < columns; j++) {
    numbers[i][j] = j + 1 + (i * columns);
  }
}
console.log(numbers);
Looks like in the second loop, you should do numbers[i][j] = i * columns + j + 1; instead, so each row continues where the previous one left off.
Every time the outer for loop starts a new iteration, j is reset back to 0, which is why you keep getting rows starting with 1.
To fix this, you could declare a variable outside of the for loops that tracks the current number, and use that instead of j like so:
const numbers = [];
const columns = 4;
const rows = 5;
let currNum = 0;
for (let i = 0; i < rows; i++) {
  numbers[i] = [];
  for (let j = 0; j < columns; j++) {
    currNum++;
    numbers[i][j] = currNum;
  }
}
console.log(numbers);
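For completeness, here is a loop-free sketch of the same grid using Array.from (the exercise itself asks for a "for" loop, so treat this only as an alternative):
const columns = 4;
const rows = 5;
// Row i starts at i * columns + 1, so each cell is i * columns + j + 1.
const numbers = Array.from({ length: rows }, (_, i) =>
  Array.from({ length: columns }, (_, j) => i * columns + j + 1)
);
console.log(numbers);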

Convert 1D array into 2D array JavaScript

Hi, I have this example where I want to turn my 1D array into a 4x3 2D array.
var array1 = [15, 33, 21, 39, 24, 27, 19, 7, 18, 28, 30, 38];
var i, j, t;
var positionarray1 = 0;
var array2 = new Array(4);
for (t = 0; t < 4; t++) {
  array2[t] = new Array(3);
}
for (i = 0; i < 4; i++) {
  for (j = 0; j < 3; j++) {
    array2[i][j] = array1[i];
    array2[i][j] = array1[j];
  }
  positionarray1 = positionarray1 + 1; // I do this to know which value we are taking
}
console.log(array2);
My solution only gives me the first numbers of array1. Any idea?
i and j are indexes into the new 2D array that only run from 0 to 3 and 0 to 2, which is why you are seeing the beginning values over and over. You need a way to index array1 that goes from 0 to 11.
It looks like you are on the right track with positionarray1, though you need to move where you increment it. Use that value when indexing array1 rather than i and j:
array2[i][j] = array1[positionarray1];
positionarray1++;
If you rename i to row and j to col, it makes it easier to see what is going on. Also, avoid magic numbers; I am seeing 3 and 4 all over the place. These can be replaced with parameter references. All you need to do is wrap your logic in a reusable function (as seen in the reshape function below).
The main algorithm is:
result[row][col] = arr[row * cols + col];
There is no need to track position, because it can be calculated from the current row and column.
const reshape = (arr, rows, cols) => {
  const result = new Array(rows);
  for (let row = 0; row < rows; row++) {
    result[row] = new Array(cols);
  }
  for (let row = 0; row < rows; row++) {
    for (let col = 0; col < cols; col++) {
      result[row][col] = arr[row * cols + col];
    }
  }
  return result;
};
const array1 = [15, 33, 21, 39, 24, 27, 19, 7, 18, 28, 30, 38];
const array2 = reshape(array1, 4, 3);
console.log(JSON.stringify(array2));
var array1 = [15, 33, 21, 39, 24, 27, 19, 7, 18, 28, 30, 38];
var i, j, t;
var positionarray1 = 0;
var array2 = new Array(4);
for (t = 0; t < 4; t++) {
  array2[t] = new Array(3);
}
for (i = 0; i < 4; i++) {
  for (j = 0; j < 3; j++) {
    array2[i][j] = array1[i * 3 + j]; // here was the error
  }
  positionarray1 = positionarray1 + 1; // I do this to know which value we are taking
}
console.log(array2);
I just solved it, thanks for your comments. I actually used one of the suggestions, but with 3 instead of 2.
A solution with a single loop, for efficiency:
const arr1D = new Array(19).fill(undefined).map((_, i) => i);
const arr2D = [];
const cols = 3;
for (let i = 0, len = arr1D.length; i < len; ++i) {
  const col = i % cols;
  const row = Math.floor(i / cols);
  if (!arr2D[row]) arr2D[row] = []; // create the row array if it does not exist yet
  arr2D[row][col] = arr1D[i];
}
console.log({ arr1D, arr2D });
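Another common pattern is to chunk the 1D array with slice; a small sketch, assuming the last row is simply allowed to be shorter if the length is not a multiple of cols:
const array1 = [15, 33, 21, 39, 24, 27, 19, 7, 18, 28, 30, 38];
const cols = 3;
const array2 = [];
// Take one row-sized slice at a time.
for (let start = 0; start < array1.length; start += cols) {
  array2.push(array1.slice(start, start + cols));
}
console.log(array2); // 4 rows of 3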

Calculate Subtraction of diagonals-summations in a two-dimensional matrix using JavaScript

I am practicing at HackerRank and I have an exercise with a two-dimensional matrix. I am facing an error in my implementation:
11 2 4
4 5 6
10 8 -12
I need to sum across the primary diagonal: 11 + 5 + (-12) = 4, then the other diagonal: 4 + 5 + 10 = 19, and finally 19 - 4 = 15.
function diagonalDifference(arr) {
  var sumRight = 0;
  var sumLeft = 0;
  var array = new Array();
  for (var i = 0; i < arr.length; i++) {
    for (var j = 0; j < arr[i].length; j++) {
      array.push(arr[i][j]);
    }
  }
  for (var i = 0; i < array.length; i = i + 4) {
    sumRight += array[i];
  }
  for (var j = 2; j < array.length - 1; j = j + 2) {
    sumLeft += array[j];
  }
  return sumLeft - sumRight;
}
You can try this:
function sumDiagonal(matrix) {
  let firstSum = 0, secondSum = 0;
  for (let row = 0; row < matrix.length; row++) {
    firstSum += matrix[row][row];
    secondSum += matrix[row][matrix.length - row - 1];
  }
  console.log(firstSum + ' ' + secondSum);
  console.log(secondSum - firstSum); // 19 - 4 = 15, as described in the question
}
sumDiagonal([[11, 2, 4], [4, 5, 6], [10, 8, -12]]);
I don't think you're on the right path. A general solution would first sum the elements from top-left to bottom-right (saved here as sumRight), then sum the elements from top-right to bottom-left (saved here as sumLeft). I took it for granted that the arrays contain numbers and are all of the same size.
function diagonalDifference(array) {
  let sumRight = 0, sumLeft = 0, count = 0;
  for (var i = 0; i < array.length; i++) {
    sumRight += array[i][count++];
  }
  count = array.length - 1;
  for (var i = 0; i < array.length; i++) {
    sumLeft += array[i][count--];
  }
  return sumLeft - sumRight;
}
let arr = [
  [11, 2, 4],
  [4, 5, 6],
  [10, 8, -12]
];
console.log(diagonalDifference(arr));
You could use a single loop and get the two values directly for summing.
function getValue(matrix) {
  let sum = 0;
  for (let i = 0, l = matrix.length; i < l; i++)
    sum += matrix[i][l - i - 1] - matrix[i][i];
  return sum;
}
console.log(getValue([[11, 2, 4], [4, 5, 6], [10, 8, -12]]));
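The same single-loop idea can also be written with reduce; a sketch that follows the secondary-minus-primary convention used above:
const diagonalDifference = matrix =>
  matrix.reduce((sum, row, i) => sum + row[matrix.length - 1 - i] - row[i], 0);
console.log(diagonalDifference([[11, 2, 4], [4, 5, 6], [10, 8, -12]])); // 15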

Javascript for loop skipping a rotation inside matrix

I have been pounding my head for a while now, trying to figure out why my for loop skips a rotation in my matrix. I am trying to make it print out the whole matrix, [[1, 2, 3, 4, 5], [2, 3, 4, 5, 6], [4, 5, 6, 7, 8]].
function solve(args) {
  let arr = args[0].split(' ').map(Number),
      rows = +arr[0],
      cols = +arr[1];
  let matrix = new Array(rows);
  matrix.fill();
  for (let i in matrix) {
    matrix[i] = new Array(cols);
  }
  for (let row = 0; row < rows; row++) {
    matrix[row][0] = Math.pow(2, row);
    for (let col = matrix[row][0]; col < cols; col++) {
      matrix[row][col] = +col + 1;
    }
  }
  console.log(matrix);
}
solve([
  '3 5'
]);
P.S. I tried this too:
for (let row = 0; row < rows; row++) {
  matrix[row][0] = Math.pow(2, row);
  for (let col = matrix[row][0]; col < cols; col++) {
    matrix[row][col] = +col + matrix[row][0];
  }
}
I hope this helps you. The rotation is skipped because your inner loop starts at col = matrix[row][0] (that is, Math.pow(2, row)), so every row after the first skips its leading columns. Start the inner loop at 1 instead: after storing the starting value as the first element (matrix[row][0]), you can fill the rest of the row by adding the column index:
function solve(args) {
  let arr = args[0].split(' '),
      rows = +arr[0],
      cols = +arr[1];
  let matrix = new Array(rows);
  matrix.fill();
  for (let i in matrix) {
    matrix[i] = new Array(cols);
  }
  for (let row = 0; row < matrix.length; row++) {
    matrix[row][0] = Math.pow(2, row);
    for (let col = 1; col < matrix[row].length; col++) {
      matrix[row][col] = col + matrix[row][0];
    }
  }
  console.log(matrix);
  return matrix;
}
solve([
  '3 5'
]);
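For comparison, a compact sketch of the same fill using Array.from (an addition of mine, not part of the original answer), assuming the Math.pow(2, row) starting value is what is wanted:
const rows = 3, cols = 5;
const matrix = Array.from({ length: rows }, (_, row) =>
  Array.from({ length: cols }, (_, col) => Math.pow(2, row) + col)
);
console.log(matrix); // [[1,2,3,4,5],[2,3,4,5,6],[4,5,6,7,8]]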

Insertion Sort help in javascript -- Khan Academy

I think I am on the verge of solving this, but I'm not sure why my code is not executing correctly. Can someone provide some feedback and show me where I messed up?
var insert = function(array, rightIndex, value) {
  for (var j = rightIndex; j >= 0 && array[j] > value; j--) {
    array[j + 1] = array[j];
  }
  array[j + 1] = value;
};
var insertionSort = function(array) {
  for (var i = 1; i < array.length; i++) {
    insert(array, array.length - 1, i);
  }
};
var array = [22, 11, 99, 88, 9, 7, 42];
insertionSort(array);
println("Array after sorting: " + array);
//Program.assertEqual(array, [7, 9, 11, 22, 42, 88, 99]);
If I do insert(array, array[i], i); instead, I get the following output:
Array after sorting: 22,11,12,100,89,10,8,43,5,,4,,1,,
Here is another solution for this insertion sort:
var insert = function(array, rightIndex, value) {
  for (var j = rightIndex; j >= 0 && array[j] > value; j--) {
    array[j + 1] = array[j];
  }
  array[j + 1] = value;
};
var insertionSort = function(array) {
  for (var i = 0; i < array.length - 1; i++) {
    insert(array, i, array[i + 1]);
  }
};
var array = [22, 11, 99, 88, 9, 7, 42];
insertionSort(array);
I think you have a problem here:
In insert(array, array.length - 1, i); it should be insert(array, array.length - 1, array[i]);
You were inserting the array index instead of the value.
You also have an out-of-bounds access in array[j + 1] = array[j]; because j starts at array.length - 1. It should be array[j] = array[j - 1]; while j > 0.
Last thing: your rightIndex should be i on each iteration, not array.length - 1.
Complete code:
var insert = function(array, rightIndex, value) {
  for (var j = rightIndex; j > 0 && array[j - 1] > value; j--) {
    array[j] = array[j - 1];
  }
  array[j] = value;
};
var insertionSort = function(array) {
  for (var i = 0; i < array.length; i++) {
    insert(array, i, array[i]);
  }
};
var array = [22, 11, 99, 88, 9, 7, 42];
insertionSort(array);
In insertion sort, we divide the initial unsorted array into two parts: a sorted part and an unsorted part. Initially the sorted part has just one element (an array of only one element is already sorted). We then pick up elements one by one from the unsorted part, insert each into the sorted part at the correct position, and expand the sorted part one element at a time.
var a = [34, 203, 3, 746, 200, 984, 198, 764, 9];
function insertionSort(values) {
  var length = values.length;
  for (var i = 1; i < length; ++i) {
    var temp = values[i];
    var j = i - 1;
    for (; j >= 0 && values[j] > temp; --j) {
      values[j + 1] = values[j];
    }
    values[j + 1] = temp;
  }
}
console.log(a);
insertionSort(a);
console.log(a);
I know I am late to the party. As you are aware, there are several ways to do this, but the yellow creature on KA apparently wants us to do it in a particular way. Here's the solution that made it happy:
var insert = function(array, rightIndex, value) {
  for (var i = rightIndex; i >= 0 && array[i] > value; i--) {
    array[i + 1] = array[i];
  }
  array[i + 1] = value;
};
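For completeness, it would be driven the same way as in the earlier answers; a usage sketch, assuming the same Khan Academy test array and println helper:
var insertionSort = function(array) {
  for (var i = 1; i < array.length; i++) {
    insert(array, i - 1, array[i]); // rightIndex is the last index of the sorted part
  }
};
var array = [22, 11, 99, 88, 9, 7, 42];
insertionSort(array);
println("Array after sorting: " + array); // 7, 9, 11, 22, 42, 88, 99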
