Order of operations and how does Math.random() play into this scenario - javascript

Just started learning loops and I'm having trouble understanding the order of operations in the let statement below, along with how Math.random() works in this scenario.
From what it looks like: Math.floor() prevents decimals and Math.random() selects a random number between 0 and 36. Does Math.random() select a random value for both MAX and MIN? Or does it generate a single random number by itself, which is then multiplied by whatever MAX minus MIN equals, with MIN added back afterwards?
const MIN = 0;
const MAX = 36;
var testNumber = 15;
var i = 1;
while (MAX) {
  let randomValue = Math.floor(Math.random() * (MAX - MIN)) + MIN;
  if (randomValue == testNumber) {
    break;
  }
  i++;
}

Math.random() provides a random floating-point number between 0 and 1. If you want a wider range of random values, you multiply by the magnitude of the range you want (i.e. MAX - MIN). Then, if MIN is greater than 0, you'll need to add it to the resulting random number; otherwise the resulting range would be 0 up to (MAX - MIN).
As you say, Math.floor() simply rounds the result down to the nearest integer.
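To make the scaling concrete, here is the same formula wrapped in a small helper (the function name is illustrative, not from the question):

```javascript
// Sketch of the scaling described above: Math.random() is in [0, 1);
// multiplying by (max - min) stretches that to [0, max - min), and
// adding min shifts it to [min, max). Math.floor then drops the decimals,
// so the result is an integer from min up to (but not including) max.
function randomIntBelow(min, max) {
  return Math.floor(Math.random() * (max - min)) + min;
}

console.log(randomIntBelow(0, 36)); // an integer from 0 to 35
```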

The Math.floor() function returns the largest integer less than or equal to a given number. This is in contrast to the Math.ceil() function, which returns the smallest integer greater than or equal to a given number.
The Math.random() function returns a floating-point, pseudo-random number in the range 0 to less than 1 (inclusive of 0, but not 1) with an approximately uniform distribution over that range — which you can then scale to your desired range.
So in the case of your randomValue variable, a pseudo-random value between MIN and MAX is generated. This value can have decimals because of how Math.random() operates, so Math.floor() is used to get a whole number. MIN is added at the end so that the random value always falls within the intended range, which matters whenever MIN is not 0.
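A quick illustration of the rounding direction of the two functions, which is easy to get backwards for negative numbers:

```javascript
// Math.floor always rounds down (toward negative infinity),
// Math.ceil always rounds up (toward positive infinity):
console.log(Math.floor(4.7));  // 4
console.log(Math.ceil(4.2));   // 5
console.log(Math.floor(-4.2)); // -5 (down, so away from zero for negatives)
console.log(Math.ceil(-4.7));  // -4
```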

Related

Correct function using Math.random() to get 50/50 chance

Which is the correct function for getting a precise 50/50 chance:
return Math.random() < 0.5;
Vs
return Math.random() <= 0.5;
Math.random():
The Math.random() function returns a floating-point, pseudo-random number in the range [0, 1); that is, from 0 (inclusive) up to but not including 1 (exclusive)
The random number is either in the range [0,0.5) or [0.5,1). So you should use return Math.random() < 0.5; to have a (theoretical) 50/50 chance.
The first one is correct, because the random number generator returns a number from 0 up to 0.99999999… (depending on the exact precision of the generator itself).
So by splitting the values into two groups using the < operator, you get two equal ranges:
[0, 0.5) and [0.5, 1)
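As a sketch, the fair check can be wrapped up like this (coinFlip is an illustrative name):

```javascript
// [0, 0.5) and [0.5, 1) are equally wide halves of [0, 1),
// so comparing with < 0.5 gives a (theoretical) 50/50 chance.
function coinFlip() {
  return Math.random() < 0.5;
}

// Over many flips, heads and tails should come out roughly even:
let heads = 0;
for (let i = 0; i < 10000; i++) {
  if (coinFlip()) heads++;
}
console.log(heads); // roughly 5000
```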

How does the JavaScript expression that generates a random number between two numbers (inclusive) work mathematically?

Math.floor() rounds the decimal down to "a number representing the largest integer less than or equal to the specified number." So Math.floor(45.03) will be 45 and Math.floor(-34.23) will be -35.
And Math.random() generates a floating-point number between 0 (inclusive) and 1 (exclusive).
I just learned this:
Math.floor(Math.random()*(max-min+1)+min);
This will generate a random whole number between max and min inclusive. I can't figure out mathematically why it works. Just wondering.
Let's call Math.random() R, max M, and min m. If you just look at the inside:
R*(M-m+1)+m //or RM-Rm+R+m,
It's obvious the quantity is at least as big as m. But why is it no bigger than M? I assume this works with negative M and m, as well.
Let's assume r is Math.random(), which is a number between 0 (inclusive) and 1 (exclusive); we can show it as:
// Let r be shorthand for Math.random()
r => [0 to 1} // let [ ] denote an inclusive bound and { } an exclusive one
to get a number in a larger scale, we can multiply it with N:
//scale by N:
r * N => [0 to N}
to include N in our range, we can scale by one number greater than N and floor the result:
r * (N+1) => [ 0 to (N+1) }
floor(r * (N+1)) => [ 0 to N ] // decimals after N will be removed
So up to now, we reach a good formula: to have a random number between 0 and N (both inclusive), we should use: floor(r * (N+1))
if we shift the equation to start from a min value:
//add `min` to equation:
floor(r * (N+1)) + min => [ min to N+min ]
it is almost finished: consider N+min as max, we have:
N+min = max => N = max-min
replace it in our equation:
floor(r * (max-min+1)) + min => [ min to max ]
Note: It is obvious that we can move the min inside the floor function, as it is an integer value and does not have any decimal digits, so we could also write it as:
floor( r * (max-min+1) + min )
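The finished formula from the derivation, as a function (the name is illustrative):

```javascript
// floor(r * (max - min + 1)) + min produces integers in [min, max],
// both endpoints included, as derived above.
function randomIntInclusive(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}
```

Note that it also works with negative bounds, e.g. randomIntInclusive(-3, 3) can return any of the seven integers from -3 to 3.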
It's pretty easy, but it helps to write it out a little more verbosely:
var min = 10;
var max = 20;
var difference = max - min; // 10
var random = Math.random() // (0...1)
var randomDifference = random * difference; // (0...difference)
var withMinOffset = randomDifference + min; // (10...20)
The above withMinOffset is the random value between min and max. The reason this works is because you know the number cannot be lower than min, so we will always have to add the min to the randomised number. Then we know we want a range, so we can use max - min to get the maximum amount of randomness we can get.
The number will never be bigger than max simply because difference + min === max, which is the upper limit. Multiplying, say, 10 by a random number between 0 and 1 will always give a number between 0 and 10; adding min to that will always give a number between min and min + difference.

How to get a random number in specific range using Javascript?

I'm trying to find a way to run a command in my Node script that gets a random number from 0.01 to 10.85, for example. The number must look like 0.01 and not like 0.000000001.
How can I do this?
My current command:
var randomTicket = helper.getRandomInt(0, 10.85);
For some reason that command returns only the number 0.
Are there any other alternative ways?
Math.random would do the trick. It returns a value between [0,1) (1 is exclusive). Then you scale the range by the difference between max and min and add min as offset.
function getRandomArbitrary(min, max) {
  return Math.random() * (max - min) + min;
}
But because you get a float value, you cannot simply cut off the decimals after the second position. You can, however, format it as a string with toFixed:
var number = getRandomArbitrary(0.01, 10.85);
var numString = number.toFixed(2);
EDIT
If you want to avoid stray decimals entirely, work with integers: multiply min and max by 100, generate the number, and divide the result by 100:
var number = getRandomArbitrary(1.0, 1085.0);
var numString = (number / 100.0).toFixed(2);
or
var number = getRandomInt(1, 1085);
var numString = (number / 100.0).toFixed(2);
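Putting the edit together as one helper (randomPrice is a hypothetical name; it assumes the bounds are given as whole numbers of cents):

```javascript
// Draw a whole number of "cents" in [minCents, maxCents] (both inclusive),
// then divide by 100 so the result always has at most two decimal places.
function randomPrice(minCents, maxCents) {
  const cents = Math.floor(Math.random() * (maxCents - minCents + 1)) + minCents;
  return (cents / 100).toFixed(2);
}

console.log(randomPrice(1, 1085)); // a string between "0.01" and "10.85"
```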

Random integer functions differences

inspecting some code, I found the following random integer generator function:
function randomInt(min, max) {
  return min ? min : min = 0,
    max ? max : max = 1,
    0 | Math.random() * (max - min + 1) + min
}
Comparing it with the equivalent function at MDN:
// Returns a random integer between min (included) and max (excluded)
// Using Math.round() will give you a non-uniform distribution!
function getRandomInt(min, max) {
  return Math.floor(Math.random() * (max - min)) + min;
}
I understand that the first creates an integer with max included, and that it checks min and max and assigns them defaults if they're missing, but I don't understand how it returns an integer and not a float without the Math.floor() method.
Is it achieved using 0 | Math.random() * (max - min + 1) + min expression? If so, how?
The result is converted to an integer with the | operator, which is the bitwise OR. From MDN, the first step in computing the result is:
The operands are converted to thirty-two-bit integers and expressed by a series of bits (zeros and ones).
And because you're ORing with 0, this operation will not change the value of the result (other than the previously mentioned conversion).
0 | is a bitwise operation.
It has no effect on the value (ORing with zero returns the original value), but, like all bitwise operations, it truncates to an integer (bitwise operations make no sense on non-integers).
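A few console examples of the truncation, plus one caveat the answers above don't mention: the conversion is to a 32-bit integer, so very large values wrap around.

```javascript
console.log(0 | 4.9);        // 4  (truncates toward zero)
console.log(0 | -4.9);       // -4 (unlike Math.floor, which gives -5)
console.log(0 | 2147483648); // -2147483648 (wraps outside the 32-bit range)
```

So for negative values or values near 2^31 and above, `0 |` and Math.floor are not interchangeable.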

Javascript: Math.random

If num parameter is 52, how many possible return values are there? is it 52 or 53? If I understand this correctly, Math.random uses random values from 0 to 1 inclusive. If so, then 0 is a possible return value and so is 52. This results in 53 possible return values. Is this correct? Reason I ask is that a book that I'm learning from uses this code for a deck of cards. I wonder if num should equal 51 ?
Thanks ...
function getRandom(num) {
  var my_num = Math.floor(Math.random * num);
  return my_num;
};
Math.floor(Math.random() * num) // note random() is a function.
This will return all integers from 0 (including 0) to num (NOT including num).
Math.random returns a number between 0 (inclusive) and 1 (exclusive). Multiplying the result by X gives you between 0 (inclusive) and X (exclusive). Adding or subtracting X shifts the range by +-X.
Here's some handy functions from MDN:
// Returns a random number between 0 (inclusive) and 1 (exclusive)
function getRandom() {
  return Math.random();
}

// Returns a random number between min (inclusive) and max (exclusive)
function getRandomArbitrary(min, max) {
  return Math.random() * (max - min) + min;
}

// Returns a random integer between min (inclusive) and max (inclusive)
// Using Math.round() instead would give you a non-uniform distribution!
function getRandomInt(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}
Since Math.random returns a real number between [0,1) (1 is not inclusive), multiplying the result returns a real number between [0, 52).
Since you are flooring the result, the maximum number returned is 51 and there are 52 distinct values (counting 0).
Since the value of Math.random() varies from 0 (inclusive) to 1 (exclusive), if you pass 52 to getRandom, the scaled value will vary from 0 to 52 (exclusive). So getRandom can return only 52 distinct values: because you are using Math.floor, the maximum value returned is 51.
