Prompting for a variable not working when trying to find random number - javascript

My assignment for Intro to JavaScript is: "Write a function that accepts two numbers and returns a random number between the two values." It seems easy enough until I try to feed the function values gathered with prompt, at which point the output is incorrect.
This is the code most commonly recommended for finding a random integer between two ints, in this case 1 and 6:
function getRandomizer(bottom, top) {
  return function() {
    return Math.floor(Math.random() * (1 + top - bottom)) + bottom;
  }
}
var rolldie = getRandomizer(1, 6);
document.write(rolldie());
The output I get is what it should be, a random integer between 1 and 6. However my assignment is to prompt for the numbers, which can be anything. So I do this, using 10 and 1 as example numbers:
var max = prompt("input 10");
var min = prompt("input 1");
function getRandomizer(bottom, top) {
  return function() {
    return Math.floor(Math.random() * (1 + top - bottom)) + bottom;
  }
}
var rolldie = getRandomizer(min, max);
document.write(rolldie());
The output: 961. Second try: 231. etc. If I set the variables max and min directly to 10 and 1, the code works perfectly, returning numbers between 1 and 10. But for some reason, prompting input and then entering the exact same numbers gives completely different output. Why does this happen and how can I fix it?

The reason this happens is that prompt returns strings, not numbers. Inside the randomizer, 1 + "10" concatenates to "110", the subtraction coerces that back to the number 109, and then adding the string bottom at the end concatenates again, so the random integer (0–108 for your inputs) simply gets a "1" stuck on the end, hence outputs like 961 and 231.
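You can see the mix of concatenation and subtraction directly in the console. Assuming min and max come back as the strings "1" and "10", this is what happens inside the randomizer:
console.log(1 + "10");    // "110" -- number + string concatenates
console.log("110" - "1"); // 109   -- the minus operator coerces back to numbers
console.log(96 + "1");    // "961" -- adding bottom concatenates again, hence outputs like 961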
You can ensure the vars min and max are numbers by using parseInt:
var max = parseInt(prompt("input 10"), 10);

Try the code below, which converts both values with Number:
var max = Number(prompt("input 10"));
var min = Number(prompt("input 1"));
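If you also want to guard against input that isn't numeric at all, here is a minimal sketch (the error message is my own, and it assumes the same getRandomizer function from the question):
var max = Number(prompt("input 10"));
var min = Number(prompt("input 1"));

if (Number.isNaN(min) || Number.isNaN(max)) {
  // Number("abc") is NaN, so bail out instead of producing garbage
  document.write("Please enter valid numbers.");
} else {
  var rolldie = getRandomizer(min, max);
  document.write(rolldie());
}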

Related

Why does it take so many attempts to generate a random value between 1 and 10000?

I have the following code that generates an initial random number between 1 and 10000, then repeatedly generates a second random number until it matches the first:
let upper = 10000;
let randomNumber = getRandomNumber(upper);
let guess;
let attempt = 0;
function getRandomNumber(upper) {
  return Math.floor(Math.random() * upper) + 1;
}
while (guess !== randomNumber) {
  guess = getRandomNumber(upper);
  attempt += 1;
}
document.write('The randomNumber is ' + randomNumber);
document.write(' it took' + attempt);
I am confused by the attempt variable. Why does it take the computer this many attempts to hit randomNumber? Also, why isn't attempt used in the loop condition?
Just to give you a start. This is what your code does:
// define the maximum of the randomly generated number; range = 1 - 10,000
let upper = 10000;
// generate a random number in the range 1 - 10,000
let randomNumber = getRandomNumber(upper);
// predefine the variable guess
let guess;
// set a counter to 0
let attempt = 0;
// generate and return a random number in the range from 1 to `upper`
function getRandomNumber(upper) {
  return Math.floor(Math.random() * upper) + 1;
}
// loop until guess equals randomNumber
while (guess !== randomNumber) {
  // generate a new random number and assign it to the variable guess
  guess = getRandomNumber(upper);
  // increase the counter by 1
  attempt += 1;
}
// output the initially generated number
document.write('The randomNumber is ' + randomNumber);
// output the number of repetitions
document.write(' it took' + attempt);
So, once again: you generate a random number at the start, and then you keep generating another random number until that second number matches the first. As you don't set any limits, e.g. "each random number can only appear once" or "no more than 10,000 tries", your program might need millions of tries until the numbers match, because there are 10,000 possible values and each one might repeat hundreds of times before the match finally happens.
Try to optimize your program by limiting the number of tries to 10,000. You could also have the computer simply count upwards from 0 to 10,000 instead of guessing with randomly generated numbers, as sketched below.
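As a rough sketch of the first suggestion, the loop can be capped so it gives up after at most upper tries (the cap and the output messages here are just examples):
let upper = 10000;
let randomNumber = getRandomNumber(upper);
let guess;
let attempt = 0;

function getRandomNumber(upper) {
  return Math.floor(Math.random() * upper) + 1;
}

// stop after at most `upper` tries so the loop cannot run for millions of iterations
while (guess !== randomNumber && attempt < upper) {
  guess = getRandomNumber(upper);
  attempt += 1;
}

document.write(guess === randomNumber
  ? 'Found ' + randomNumber + ' after ' + attempt + ' attempts'
  : 'Gave up after ' + attempt + ' attempts');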
When I repeatedly run your snippet, I am seeing your code take anywhere from a few thousand to a few tens of thousands of repetitions to get a given value for a random number sampled between 1 and 10000. But this is not surprising -- it is expected.
Assuming your getRandomNumber(upper) function does indeed return a number between 1 and upper with a uniform distribution, the expected probability that the number returned will not be the initial, given value randomNumber is:
1 - (1/upper)
And the chance that the first N generated numbers will not include the given value is:
(1 - (1/upper)) ^ N
And so the chance P that the first N generated numbers will include given value is:
P = 1 - (1 - (1/upper)) ^ N
Thus the following formula gives the number of repetitions you will need to make to generate your initial value with a given probability P:
N = ln(1.0 - P) / ln(1.0 - (1.0/upper))
Using this formula, there is only a 50% chance of getting randomNumber within 6932 repetitions, and a 95% chance within 29956 repetitions.
let upper = 10000;
function numberOfRepetitionsToGetValueWithRequiredProbability(upper, P) {
  return Math.ceil(Math.log(1.0 - P) / Math.log(1.0 - (1.0 / upper)));
}
function printNumberOfRepetitionsToGetValueWithRequiredProbability(upper, P) {
  document.write('The number of tries to get a given value between 1 and ' + upper + ' with a ' + P + ' probability: ' + numberOfRepetitionsToGetValueWithRequiredProbability(upper, P) + ".<br>");
}
var probabilities = [0.10, 0.20, 0.30, 0.40, 0.50, 0.60, 0.70, 0.80, 0.90, 0.95, 0.99, 0.9999];
probabilities.forEach((p) => printNumberOfRepetitionsToGetValueWithRequiredProbability(upper, p));
This is entirely consistent with the observed behavior of your code. And of course, assuming Math.random() is truly random (which it isn't, it's only pseudorandom, according to the docs) there is always going to be a vanishingly small probability of never encountering your initial value no matter how many repetitions you make.

Javascript number comparison not correct

I'm doing some maths in JavaScript but I'm not always getting the expected result.
Here's my function - some parts have been simplified.
function updateExample($widget) {
  var loan = parseInt($widget.attr("data-loan"), 10);
  var term = parseInt($widget.attr("data-term"), 10);
  // Get the best rate
  var rateInfo = GetRateInfo(loan, term);
  var annualRate = rateInfo[2];
  // Calculate costs
  var rate = (term === 1
    ? annualRate / 365 * 30
    : annualRate / 12) / 100;
  var pow = Math.pow(rate + 1, term);
  var payment = round(rate * pow / (pow - 1) * loan, 2);
  var totalRepayable = round(payment * term, 2);
  var totalCostCap = round(loan * 2, 2);
  var costCapped = false;
  console.log(totalRepayable);
  console.log(totalCostCap);
  if (totalRepayable > totalCostCap) {
    console.log("capped");
  }
}
One of the tests that's failing is when I pass in a loan of 500 and a term of 1.
As you can see, I log 3 values to the console. The first 2 values output are:
620.00 and 1000.00
Given the values, I expect the following test to fail but it doesn't.
if (totalRepayable > totalCostCap)
if (620.00 > 1000.00)
The console log reads "capped" to prove the if statement has been entered.
I'm not a javascript expert by any means but I can't see how this is failing.
Here's the custom round function:
function round(value, decimals) {
  return Number(Math.round(value + 'e' + decimals) + 'e-' + decimals).toFixed(decimals);
}
Any advice appreciated.
You don't show your round function (it wasn't in the original post), but I'm assuming it uses .toFixed(). The problem is that a Number can't carry fixed decimal formatting, so .toFixed() returns a string, and
console.log("620.00" > "1000.00"); // true
The thing that tipped me off is that if you log a number like 620.00 to the console, the trailing zeros are dropped automatically; the fact that you were seeing them suggests the value is a string.
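A quick check of that tip:
console.log(620.00);           // 620      -- a Number drops the trailing zeros
console.log((620).toFixed(2)); // "620.00" -- .toFixed() returns a string that keeps them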
Update
Yeah, now that you posted that it's definitely returning a string. The last part of the return value is a call to .toFixed(). Just cast the result back to a number to do the comparison.
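As a minimal sketch of that fix, keeping the posted round function exactly as it is (it returns a string because of the .toFixed() call):
// cast the string results back to numbers before comparing
var totalRepayable = Number(round(payment * term, 2));
var totalCostCap = Number(round(loan * 2, 2));

if (totalRepayable > totalCostCap) {
  // this is now a numeric comparison, so 620 > 1000 is correctly false
  console.log("capped");
}
Alternatively, drop the trailing .toFixed(decimals) inside round so the function returns a number in the first place.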

JavaScript - converting 3 numbers to percentage, doesn't yield 100% total

I have a scenario where I have three numbers:
17
10
90
I need to convert those into whole percentage values (so that, when added, they total 100% as you'd expect). I have this function:
function roundPercentageTotals(num1, num2, num3) {
  var total = num1 + num2 + num3; // 117
  var num1Total = (num1 / total) * 100; // 14.529914529914531
  var num2Total = (num2 / total) * 100; // 8.547008547008546
  var num3Total = (num3 / total) * 100; // 76.92307692307693
  var num1ToDecimal = num1Total.toFixed(1); // 14.5
  var num2ToDecimal = num2Total.toFixed(1); // 8.5
  var num3ToDecimal = num3Total.toFixed(1); // 76.9
  var totalPercentage = parseInt(num1ToDecimal) + parseInt(num2ToDecimal) + parseInt(num3ToDecimal); // 98
  return { percentage1: Math.round(num1ToDecimal), percentage2: Math.round(num2ToDecimal), percentage3: Math.round(num3ToDecimal) };
}
In my example, the total percentage calculated is 98%. Followed by:
Percentage1 = 15
Percentage2 = 9
Percentage3 = 77
Which adds up to 101%, where am I going wrong?
Thanks for any help in advance!
You're getting 98% in the first calculation because you're rounding the values down, and then getting 101% in your second because you're rounding them up.
Change your parseInt() to parseFloat() to get your totals closer to 100% instead of 98%. parseInt() truncates at the decimal point; it does not round.
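A two-line check of the difference:
console.log(parseInt("14.5", 10)); // 14   -- truncated at the decimal point
console.log(parseFloat("14.5"));   // 14.5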
In regards to your second calculation totaling 101%: By rounding up 14.5 to 15, and 8.5 to 9, you've just added a full 1%. This leaves you with 101% instead of 100%.
The bottom line is that you cannot consistently achieve an even 100% if you're going to round the exact values, unless you fudge your percentages to fit somewhere along the way.
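One common way to do that fudging consistently is the largest remainder method: floor every percentage, then hand the leftover points to the entries with the biggest fractional parts. A minimal sketch (the function name toWholePercentages is mine, not from this thread):
function toWholePercentages(values) {
  var total = values.reduce(function (sum, v) { return sum + v; }, 0);
  var exact = values.map(function (v) { return (v / total) * 100; });
  var floored = exact.map(Math.floor);
  var leftover = 100 - floored.reduce(function (sum, v) { return sum + v; }, 0);

  // give one extra point to each of the `leftover` largest fractional parts
  exact
    .map(function (p, i) { return { index: i, frac: p - floored[i] }; })
    .sort(function (a, b) { return b.frac - a.frac; })
    .slice(0, leftover)
    .forEach(function (item) { floored[item.index] += 1; });

  return floored;
}

console.log(toWholePercentages([17, 10, 90])); // [14, 9, 77] -- sums to 100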
OK, so it looks like I cannot mathematically achieve exactly what I was looking for. However, I needed the figures to total 100% in the end (albeit with some of them rounded, so not totally accurate).
Here's my solution, just in case this is useful to someone else:
function roundPercentageTotals(numArr) {
  // Total of all numbers passed.
  var total = numArr[0] + numArr[1] + numArr[2];
  // Percentage representations of each number (out of 100).
  var num1Percent = Math.round((numArr[0] / total) * 100);
  var num2Percent = Math.round((numArr[1] / total) * 100);
  var num3Percent = Math.round((numArr[2] / total) * 100);
  // Total percent of the 3 numbers combined (doesn't always equal 100%).
  var totalPercentage = num1Percent + num2Percent + num3Percent;
  // If not 100%, work around it by subtracting the difference from the largest number (not as accurate, but it works out).
  if (totalPercentage != 100) {
    // Get the index of the largest number in the array.
    var index = getLargestNumInArrayIndex(numArr);
    // Take the difference away from the largest number.
    numArr[index] = numArr[index] - (totalPercentage - 100);
    // Re-run this method recursively, until we get a total percentage of 100%.
    return roundPercentageTotals(numArr);
  }
  // Return the percentage version of the array passed in.
  return [num1Percent, num2Percent, num3Percent];
}
function getLargestNumInArrayIndex(array) {
  return array.indexOf(Math.max.apply(Math, array));
}
Pass an array of the numbers into roundPercentageTotals, such as roundPercentageTotals([13, 54, 38]), and it will return the whole-percentage (or nearest-percentage, I should say) figures in an array.
The percent-round package does just that; it did the job for me.
Just pass in the values and get back percentages that always add up to 100%.
const percentRound = require ('percent-round');
percentRound([16, 56, 18]); // [18, 62, 20]
You cannot convert those numbers into whole percentages without decimals; it only works out exactly when the values divide evenly into 100. So the answer here has to be 14.5, 8.5 and 76.9, and as you can see there is still a 0.1 percent missing, for the same reason: the decimals you threw away (e.g. by converting 14.529914529914531 to 14.5).

Math.ceil not working with negative floats

I am trying to create a RoundUp function with the help of Math.ceil. It works fine with positive numbers but does not round up negative numbers.
Here is what I am trying:
var value = -12.369754; // --> output = -12
// make value = 12.369754; and the output will be 13
var decimalPoints = 0;
if (decimalPoints == 0) {
  value = Math.ceil(parseFloat(value));
}
console.log(value);
Here is the Fiddle http://jsfiddle.net/n7ecyr7h/
Why This function?
I need to create a function in which the user gives a number and the number of decimal places up to which they want to round it. The RoundUp function will round the given value up to that many decimal places.
For example, if the user enters 12.12445 and wants to round up to 3 decimal places, the output will be 12.125.
Here is a table of required outputs with 2 decimal points
Input        Output
1.2369       1.24
1.2869       1.29
-1.1234      -1.13
-1.17321     -1.18
And here is the Updated Fiddle with original JS code http://jsfiddle.net/n7ecyr7h/1/
The Math.ceil method does actually round up even for negative values. The value -12 is the closest integer that is higher than -12.369754.
What you are looking for is to round away from zero:
value = value >= 0 ? Math.ceil(value) : Math.floor(value);
Edit:
To use that with different number of decimal points:
// it seems that the value is actually a string
// judging from the parseFloat calls that you have
var value = '-12.369754';
var decimalPoints = 0;
// parse it once
value = parseFloat(value);
// calculate multiplier
var m = Math.pow(10, decimalPoints);
// round the value
value = (value >= 0 ? Math.ceil(value * m) : Math.floor(value * m)) / m;
console.log(value);
Demo: http://jsfiddle.net/Guffa/n7ecyr7h/3/
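A quick check of that approach against the table in the question, wrapped in a helper (the name roundUp is mine):
function roundUp(value, decimalPoints) {
  var m = Math.pow(10, decimalPoints);
  value = parseFloat(value);
  // round away from zero at the requested number of decimal places
  return (value >= 0 ? Math.ceil(value * m) : Math.floor(value * m)) / m;
}

console.log(roundUp(1.2369, 2));   // 1.24
console.log(roundUp(1.2869, 2));   // 1.29
console.log(roundUp(-1.1234, 2));  // -1.13
console.log(roundUp(-1.17321, 2)); // -1.18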
Rounding -1.1234 up at 2 decimal places gives -1.12 because, for negatives, -1.12 > -1.1234; Math.ceil always rounds toward positive infinity, not away from zero.
I think you have misunderstood the mathematics.

Function that accepts two numbers and returns a random number between the two numbers. (JavaScript)

Been learning JavaScript for a few weeks now, and I am trying to write a function that prompts the user for 2 numbers and then generates a random number between them. I wrote this code:
function randomInt(min, max) {
  var min = prompt("Enter a number.");
  var max = prompt("Enter another number.");
  var randNum = Math.floor(Math.random() * (max - min + 1)) + min;
  return randNum;
}
alert(randomInt());
It does prompt me for the numbers, but it generates a random number that isn't between the two I entered. For example, I enter 5 and 20 and it returns 31 or 2. But when I tried
var min = 5;
var max = 20;
it works and actually gives me a number between 5 and 20. I'm not sure why, though. It seems like a simple task, but I'm stumped.
The prompt() function returns strings, not numbers. You can force the values to be interpreted as numbers with the unary + operator:
var min = +prompt("Enter a number.");
var max = +prompt( "Enter another number." );
Without that, the outer + operator will preferentially perform a string concatenation operation because the right-hand operand (min) is a string. That is, in this expression:
Math.floor(Math.random() * (max - min + 1)) + min
The - operator in max - min will cause that to be done properly as a numeric subtraction. However, the outer + by which the minimum value is added back to the random number will perform a string concatenation and not a numeric addition if min is a string. Thus you'll end up with a random integer between 0 and the implied range, concatenated onto the minimum value (as a string).
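Putting it together, a minimal sketch of the corrected function (prompt text kept from the question):
function randomInt() {
  var min = +prompt("Enter a number.");
  var max = +prompt("Enter another number.");
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

alert(randomInt());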
