Function that accepts two numbers and returns a random number between the two numbers (JavaScript)

Been learning JavaScript for a few weeks now, and I am trying to write a function that prompts the user for 2 numbers and then generates a random number between them. I wrote this code:
function randomInt(min, max) {
    var min = prompt("Enter a number.");
    var max = prompt("Enter another number.");
    var randNum = Math.floor(Math.random() * (max - min + 1)) + min;
    return randNum;
}
alert(randomInt());
It does prompt me for the numbers, but it generates a random number, not a number between the two that I've entered. For example, I enter 5 and 20 and it returns 31 or 2. But when I tried this instead:
var min = 5;
var max = 20;
It works and actually gives me a number between 5 and 20; I'm not sure why, though. It seems like a simple task, but I'm just stumped.

The prompt() function returns strings, not numbers. You can force the values to be interpreted as numbers with the unary + operator:
var min = +prompt("Enter a number.");
var max = +prompt("Enter another number.");
Without that, the outer + operator will preferentially perform a string concatenation operation because the right-hand operand (min) is a string. That is, in this expression:
Math.floor(Math.random() * (max - min + 1)) + min
The - operator in max - min forces a numeric subtraction (strings are coerced to numbers), so the range is computed correctly. However, the outer + that adds the minimum back onto the random number performs string concatenation, not numeric addition, when min is a string. Thus you end up with a random integer between 0 and the range size, concatenated onto the minimum value (as a string).
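For example, with inputs 5 and 20 the expression reduces to Math.floor(Math.random() * 16) + "5", producing strings like "35" or "125" rather than numbers. A corrected version of the function, keeping the prompts inside it as in the original and dropping the now-unused parameters (a minimal sketch):
function randomInt() {
    // Unary + coerces each prompt result from string to number up front
    var min = +prompt("Enter a number.");
    var max = +prompt("Enter another number.");
    // Scale [0, 1) to the range size, floor to an integer, then shift by min
    return Math.floor(Math.random() * (max - min + 1)) + min;
}
alert(randomInt()); // e.g. 13 for inputs 5 and 20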

Related

Order of operations, and how does random() play in this scenario?

Just started learning loops, and I'm having trouble understanding the order of operations here in the let value, along with how random() works in this scenario.
From what it looks like, Math.floor() prevents decimals and Math.random() selects a random number between 0 and 36. Does random() select a random value for both MAX and MIN? Or does random() generate a random number by itself, which is multiplied by the difference of MAX and MIN, with MIN then added back?
const MIN = 0;
const MAX = 36;
var testNumber = 15;
var i = 1;
while (MAX) {
    let randomValue = Math.floor(Math.random() * (MAX - MIN)) + MIN;
    if (randomValue == testNumber) {
        break;
    }
}
Math.random() provides a random floating-point number between 0 and 1. If you want a wider range of random values, you multiply by the magnitude of the range you want (i.e. MAX - MIN). Then, if MIN is greater than 0, you'll need to add it to the resulting random number; otherwise the result would range from 0 up to (MAX - MIN).
As you say, Math.floor() simply rounds the result down to the nearest integer.
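A concrete trace of those two steps (a sketch, assuming hypothetical values of min = 5 and max = 20 for illustration):
var r = Math.random();            // e.g. 0.42, somewhere in [0, 1)
var scaled = r * (20 - 5);        // 6.3 -- now in [0, 15)
var shifted = scaled + 5;         // 11.3 -- now in [5, 20)
var result = Math.floor(shifted); // 11 -- an integer in [5, 19]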
The Math.floor() function returns the largest integer less than or equal to a given number. This is in contrast to the Math.ceil() function, which returns the smallest integer greater than or equal to a given number.
The Math.random() function returns a floating-point, pseudo-random number in the range 0 to less than 1 (inclusive of 0, but not 1) with an approximately uniform distribution over that range — which you can then scale to your desired range.
So in the case of your randomValue variable, a pseudo-random value between MIN and MAX is generated. This value could have decimals because of how Math.random() operates, so Math.floor() is used to get a whole number. MIN is added at the end to shift the result into the desired range, which matters whenever MIN is not 0.
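Note that because the question's expression multiplies by (MAX - MIN) without a + 1, MAX itself can never be drawn; if testNumber were 36, the loop would never break. Both common variants, as a sketch:
// Half-open: integer in [min, max - 1], as in the question's loop
function randomExclusive(min, max) {
    return Math.floor(Math.random() * (max - min)) + min;
}

// Inclusive: widening the range by 1 makes max a possible result
function randomInclusive(min, max) {
    return Math.floor(Math.random() * (max - min + 1)) + min;
}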

How to get a random number in a specific range using JavaScript?

I'm trying to find a way, in my Node script, to run a command that gets a random number from 0.01 to 10.85, for example. The number must look like 0.01 and not like 0.000000001.
How can I do this?
My current command:
var randomTicket = helper.getRandomInt(0, 10.85);
For some reason, that command returns only the number 0.
Are there any other alternative ways?
Math.random would do the trick. It returns a value in [0, 1) (1 is exclusive). You then scale by the difference between max and min and add min as an offset.
function getRandomArbitrary(min, max) {
    return Math.random() * (max - min) + min;
}
But because you get a float value, you cannot simply clip the decimals after the second position; instead, you can format it as a string with toFixed:
var number = getRandomArbitrary(0.01, 10.85);
var numString = number.toFixed(2);
EDIT
If you want at most two decimal places directly, multiply min and max by 100.0 and divide the result by 100.0:
var number = getRandomArbitrary(1.0, 1085.0);
var numString = (number / 100.0).toFixed(2);
or
var number = getRandomInt(1, 1085);
var numString = (number / 100.0).toFixed(2);
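The helper.getRandomInt from the question isn't shown, but a typical integer generator matching the getRandomInt(1, 1085) call above would look like this (an assumption, not the questioner's actual helper):
// Hypothetical stand-in for helper.getRandomInt: integer in [min, max]
function getRandomInt(min, max) {
    return Math.floor(Math.random() * (max - min + 1)) + min;
}

var numString = (getRandomInt(1, 1085) / 100.0).toFixed(2); // e.g. "7.42"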

Prompting for a variable not working when trying to find random number

My assignment for Intro to Javascript is to: "Write a function that accepts two numbers and returns a random number between the two values." It seems easy enough until I attempt to prompt the variables to input, at which point the output seems incorrect.
This is the code most often recommended for finding a random number between two ints, in this case 1 and 6:
function getRandomizer(bottom, top) {
    return function() {
        return Math.floor(Math.random() * (1 + top - bottom)) + bottom;
    };
}
var rolldie = getRandomizer(1, 6);
document.write(rolldie());
The output I get is what it should be, a random integer between 1 and 6. However my assignment is to prompt for the numbers, which can be anything. So I do this, using 10 and 1 as example numbers:
var max = prompt("input 10");
var min = prompt("input 1");
function getRandomizer(bottom, top) {
    return function() {
        return Math.floor(Math.random() * (1 + top - bottom)) + bottom;
    };
}
var rolldie = getRandomizer(min, max);
document.write(rolldie());
The output: 961. Second try: 231. etc. If I set the variables max and min directly to 10 and 1, the code works perfectly, returning numbers between 1 and 10. But for some reason, prompting input and then entering the exact same numbers gives completely different output. Why does this happen and how can I fix it?
The reason this happens is that the prompt results are strings. In 1 + top - bottom, 1 + "10" concatenates to the string "110", and "110" - "1" then subtracts numerically to give 109. The final + bottom concatenates again, so you actually get a random integer between 0 and 108 with the string "1" appended, producing values like 961.
You can ensure the vars min and max are numbers by using parseInt:
var max = parseInt(prompt("input 10"), 10);
Or try the code below, which uses Number instead:
var max = Number(prompt("input 10"));
var min = Number(prompt("input 1"));

Random generator in Javascript with min and max value not working

I am new to JavaScript. I am trying to make an exercise program which generates a random number between a min and a max value. I am facing an issue in the program below: var2 + min is not working correctly. If I replace the variable min with its actual value, then it works. What am I doing wrong?
var var1=Math.random()
var min = prompt("Enter Min value:")
var max = prompt("Enter max value:")
alert("min is "+min+" max is "+max)
var var2=var1*(max-min)
var var3=var2+min
var var4=Math.floor(var3)
alert("var1= "+var1+" var2= "+var2+" var3= "+var3+" Var4 "+var4)
Use:
var min = parseInt(prompt("Enter Min value:"), 10);
var max = parseInt(prompt("Enter max value:"), 10);
The problem is that these variables contain strings, so the expressions containing + are performing string concatenation rather than number addition.
And while you're learning, get in the habit of ending statements with ;. JavaScript is lax about requiring this, but you should be explicit about it -- the rules for when semicolons can be omitted are a bit arcane.
var1, var2, var3, etc. are not good variable names. Don't use them.
Your code isn't working because prompt returns a string. 1 - "2" is -1, but 1 + "2" is "12", as the addition operator is used for string concatenation.
Parse the strings into integers:
var min = parseInt(prompt("Enter Min value:"), 10);
prompt returns a string, so you need to convert min and max to numbers:
var min = Number(prompt("Enter Min value:"));
var max = Number(prompt("Enter max value:"));

Math.round vs parseInt

Have a quick JS question. What is the difference between Math.round and parseInt?
I made a JS script to sum the inverses of prompted numbers:
<script type="text/javascript">
var numRep = prompt("How many repetitions would you like to run?");
var sum = 0;
var count = 0;
var i = 1; // variable i starts at 1
while (i <= numRep) { // repeat numRep times
    var number = prompt("Please enter a non zero integer");
    if (number == 0) {
        document.write("Invalid Input <br>");
        count++;
    }
    else {
        document.write("The inverse is: " + 1/number + "<br>");
        sum = sum + (1/parseInt(number)); // add the inverse to the sum
    }
    i++; // increase i by 1
}
if (sum == 0) {
    document.write("You did not enter valid input");
}
else {
    document.write("The sum of the inverses is: " + sum); // display sum
}
</script>
and it uses parseInt. If I wanted to make it use Math.round, is there anything else I need to do so that it knows to limit the number of decimal places accordingly?
In other words, does Math.round have to be formatted in a certain way?
The two functions are really quite different.
parseInt() extracts a number from a string, e.g.
parseInt('1.5')
// => 1
Math.round() rounds the number to the nearest whole number:
Math.round('1.5')
// => 2
parseInt() can get its number by removing extra text, e.g.:
parseInt('12foo')
// => 12
However, Math.round will not:
Math.round('12foo')
// => NaN
You should probably use parseFloat and Math.round since you're getting input from the user:
var number = parseFloat(prompt('Enter number:'));
var rounded = Math.round(number);
Math.round will round the number to the nearest integer. parseInt will parse the string and assure you that the value is a number.
So what you will need is something like this:
number = parseInt(number, 10);
if (isNaN(number) || number == 0) {
    document.write("Invalid Input <br>");
    count++;
}
This will assure you that the user has put in a number.
Math.round expects a number, parseInt expects a string.
Use parseInt('12345', 10) for parsing 10-based numbers.
http://www.javascripter.net/faq/convert2.htm
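The radix argument matters because, without it, parseInt honors prefixes such as 0x (and some older engines treated a leading 0 as octal). For example:
parseInt('0x10');      // 16 -- parsed as hexadecimal
parseInt('0x10', 10);  // 0  -- parses the leading "0", stops at "x"
parseInt('12345', 10); // 12345 -- always decimal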
