Generating a random integer between two inputs in JavaScript

I'm trying to write a function (GetRandomInteger) that generates a random integer between the variables a and b. If a > b, it's supposed to swap the two values so that a <= b before generating the number.
I don't know if I'm misunderstanding the question or what, because I can't figure out where I'm meant to be converting a and b to integers, and my program keeps returning NaN. Here's my code; maybe someone could take a look at it?
function GetRandomInteger(a, b, c) {
  a = Number(a), b = Number(b);
  if (a > b) {
    var a = 2,
      b = 1,
      c = a;
    a = b;
    b = c;
  } else {
    var x = parseInt(Math.random() * b) + a;
  }
  return x;
}

let x;
console.log(Number(GetRandomInteger(x)));

When a > b, you're setting them to specific numbers before swapping them, instead of swapping the original values.
The code that generates the random number shouldn't be in the else branch, because then it won't run after swapping a and b. It should simply come after the if.
You don't need the c parameter. Use a temporary variable declared inside the function when swapping.
Use Math.floor() to convert a floating-point number to an integer. parseInt() is for parsing strings (it happens to work here because it first converts the float to a string, but it's better to use the more specific function). Also note that Math.random() * b ignores the lower bound; to get an integer between a and b inclusive, multiply by (b - a + 1) and add a.
You need to call the function with two arguments. In the example below I just hard-coded them, but you can use your function that asks the user for numbers; just call it twice to set two variables.
function GetRandomInteger(a, b) {
  a = Number(a), b = Number(b);
  if (a > b) {
    let temp = a;
    a = b;
    b = temp;
  }
  // Multiply by (b - a + 1) so both endpoints are possible results.
  var x = Math.floor(Math.random() * (b - a + 1)) + a;
  return x;
}

console.log(GetRandomInteger(1, 10));
console.log(GetRandomInteger(15, 3));
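As a quick sanity check (a sketch, not part of the original answer — the function name getRandomInteger is my own), the inclusive-range formula can be verified by sampling:

```javascript
// Sketch: verify that Math.floor(Math.random() * (b - a + 1)) + a
// always lands in the inclusive range [a, b], even when a > b initially.
function getRandomInteger(a, b) {
  if (a > b) {
    const temp = a;
    a = b;
    b = temp;
  }
  return Math.floor(Math.random() * (b - a + 1)) + a;
}

const samples = Array.from({ length: 1000 }, () => getRandomInteger(15, 3));
console.log(Math.min(...samples) >= 3 && Math.max(...samples) <= 15); // true
```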


How to fix this triangle function in JavaScript?

I am a newbie in programming. What I want is to create a file that automatically computes the perimeter of a right triangle (this is really a Pythagoras problem).
I tried to write it with simple code (the result is below), using a function in JavaScript.
I use a simple HTML file to call the JavaScript file, but the result is always wrong.
function fungsiKllSegitiga1(a, b) {
  var c = Math.SQRT(a * a + b * b);
  var kll = a + b + c;
  return c.sqrt;
}

function fungsiKllSegitiga2(a, b) {
  var c = (a * a + b * b) ^ (0.5);
  var kll = a + b + c;
  return c.sqrt;
}
I want variable c to hold the Pythagorean result (the hypotenuse), but I can't get it working properly. Any suggestions?
Try this:
function fungsiKllSegitiga1(a, b) {
  // Math.sqrt (lowercase) is the square-root function; Math.SQRT is not defined,
  // and returning c.sqrt returns undefined. Just return c itself.
  var c = Math.sqrt(a * a + b * b);
  return c;
}

fungsiKllSegitiga1(tmp, tmp)
tmp means temporary; change it to the numbers you want.
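Since the question also computes kll (keliling, the perimeter), here is a hedged sketch that returns it as well, using the built-in Math.hypot for the hypotenuse (the function name perimeterRightTriangle is my own, not from the original post):

```javascript
// Sketch: Math.hypot(a, b) computes sqrt(a*a + b*b) directly, and the
// perimeter (keliling) of a right triangle is the sum of all three sides.
function perimeterRightTriangle(a, b) {
  const c = Math.hypot(a, b); // the hypotenuse
  return a + b + c;
}

console.log(perimeterRightTriangle(3, 4)); // 12
```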

Why is the browser sorting in the following way

I have following function:
var sortString = function (a, b) {
  a = a.toLowerCase();
  b = b.toLowerCase();
  if (a < b) return 1;
  if (a > b) return -1;
  return 0;
}
and I have following two strings:
x = ["B1C3N_EUR_DFAK_ALL_3M_ALL","B1C3N_EUR_BPP_BCO_3M"];
When I run the above function on this array, I expect "B1C3N_EUR_BPP_BCO_3M" to be at index 0, but the browser returns the elements in the reverse order. I have checked in both Chrome and IE. Why is that?
Do I need to replace "-" with some other values? Is there any way I can do it without replacing?
You return the wrong values for the smaller and greater cases in the callback for Array#sort.
if (a < b) return 1;  // should be -1, because a is smaller than b
if (a > b) return -1; // should be 1, because a is greater than b
For a more concise style, you could use String#localeCompare, which compares the given strings and returns a value in the wanted range.
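For example, a sketch of a corrected comparator using localeCompare on the asker's own data:

```javascript
// localeCompare returns a negative number when the receiver sorts before
// the argument, so it can be used directly as a sort callback.
const x = ["B1C3N_EUR_DFAK_ALL_3M_ALL", "B1C3N_EUR_BPP_BCO_3M"];
x.sort((a, b) => a.toLowerCase().localeCompare(b.toLowerCase()));

console.log(x[0]); // "B1C3N_EUR_BPP_BCO_3M"
```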

Why Modular Exponentiation functions work differently in Python and Javascript for large numbers?

I need to perform modular exponentiation on quite large numbers in both Python 3 and JavaScript. I have functions that do the task, but they give me different outputs.
Python (all three work the same way):
pow(176672119508, 55, 200000023499)

def expmod_iter(a, b, c):
    x = 1
    while b > 0:
        if b & 1 == 1:
            x = (x * a) % c
        a = (a * a) % c
        b >>= 1
    return x % c

def pow_mod(x, y, z):
    number = 1
    while y:
        if y & 1:
            number = number * x % z
        y >>= 1
        x = x * x % z
    return number

# The result is always 124912252967
and now JavaScript (both functions work in the same way):
function powMod(x, y, z) {
  let number = 1;
  while (y) {
    if (y & 1) {
      number = number * x % z;
    }
    y >>= 1;
    x = x * x % z;
  }
  return number;
}

function expmod_iter(a, b, c) {
  let x = 1;
  while (b > 0) {
    if (b & 1 === 1) {
      x = (x * a) % c;
    }
    a = (a * a) % c;
    b >>= 1;
  }
  return x % c;
}

console.log(powMod(176672119508, 55, 200000023499));
console.log(expmod_iter(176672119508, 55, 200000023499));
// The result is always 138693107570
Moreover, when I used this service with my numbers, I also got 138693107570.
Why does this happen? I'm not even sure which variant is correct now. On smaller numbers, the functions give identical results.
Is it possible to somehow get the same result from both functions? It doesn't even matter that much whether the result is mathematically correct; the results should at least be the same.
Could you please explain why this happens? Is it the function design? To me, the functions in both languages seem identical.
Is there a way to get the same result from the functions of both languages?
Python's result is correct.
Python uses an arbitrary-precision integer representation, while JavaScript stores all numbers as IEEE 754 64-bit floating point (and temporarily coerces them to 32-bit integers for bitwise operations). This means that for large integers (above 2^53) the JavaScript code starts losing precision, while the Python code maintains the exact values of all intermediate results throughout the calculation.
If you want to handle large integers exactly in JavaScript, use an arbitrary-precision representation such as the built-in BigInt type or a big-integer library. Alternatively, you say you don't care much about the result being correct. That is a really weird thing not to care about, but if you really feel that way:
# Python
def wrongpow(a, b, c):
    return 0

// Javascript
function wrongpow(a, b, c) {
  return 0;
}
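More seriously, a minimal sketch of the same square-and-multiply loop using native BigInt (assuming a modern engine; the function name powModBig is my own) reproduces Python's exact result, because every intermediate product is kept at full precision:

```javascript
// Square-and-multiply modular exponentiation with BigInt: all operands
// and operators (%, *, &, >>) work on arbitrary-precision integers.
function powModBig(x, y, z) {
  let result = 1n;
  x %= z;
  while (y > 0n) {
    if (y & 1n) {
      result = result * x % z;
    }
    y >>= 1n;
    x = x * x % z;
  }
  return result;
}

console.log(powModBig(176672119508n, 55n, 200000023499n)); // 124912252967n
```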

Javascript: Use reduce() method to find the LCM in an array

I am trying to find the LCM (least common multiple) of several numbers in an array. To combine the numbers pairwise into an LCM, I use the reduce() method, but it is not working. Please tell me what's wrong. Thanks.
function gcd(a, b) {
  // gcd: greatest common divisor
  // uses the Euclidean algorithm
  var temp = 0;
  while (a !== 0) {
    temp = a;
    a = b % a;
    b = temp;
  }
  return b;
}

function lcm(a, b) {
  // least common multiple of two numbers
  return (a * b / gcd(a, b));
}

function smallestCommons(arr) {
  // this function is not working, why?
  arr.reduce(function (a, b) {
    return lcm(a, b);
  });
}

smallestCommons([1, 2, 3, 4, 5]);
//------>undefined
Your smallestCommons function is missing a return statement. undefined is the default return value of any function without an explicit return.
function smallestCommons(arr) {
  return arr.reduce(lcm);
}
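Putting it together, a minimal self-contained sketch of the fixed pipeline (gcd, lcm, and the reduce with the added return):

```javascript
// Euclidean algorithm for the greatest common divisor.
function gcd(a, b) {
  while (a !== 0) {
    const temp = a;
    a = b % a;
    b = temp;
  }
  return b;
}

// lcm(a, b) = a * b / gcd(a, b); reduce folds it pairwise across the array.
const lcm = (a, b) => (a * b) / gcd(a, b);
const smallestCommons = (arr) => arr.reduce(lcm);

console.log(smallestCommons([1, 2, 3, 4, 5])); // 60
```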

How to prevent concatenation when adding JavaScript variables containing numbers

How can I prevent JavaScript from treating my numeric variables as strings?
var a = 100;
var b = -10;
var c = a + b; // 10-10 (string)
Let's say I always want:
var c = a + b; // 100 + (-10) = 90 (number)
In your example, c will always be 90. However:
var a = 100;
var b = "-10";
var c = a + b; // "100-10" (string)
To prevent this, convert the string to an integer:
var c = a + parseInt(b, 10);
or use a unary +:
var c = a + +b;
Your code example...
var a = 100;
var b = -10;
var c = a + b; // 90 (number)
...won't concatenate unless one of the operands is a String. In your example, both are Numbers.
If you do have numbers inside Strings, you can use parseInt() (don't forget to pass the radix of 10 when working in decimal) or simply prefix the String with + to coerce it to a Number.
Your code works fine. JavaScript will always produce the number, as long as both of the variables you are adding are numbers.
The most concise way is to prepend a + if you aren't certain whether the variables are numbers or strings:
var a = "100";
var b = "-10";
var c = +a + +b; // 90
This works because +"123" === 123, etc.
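A quick sketch comparing the coercion options mentioned in the answers above:

```javascript
// Three equivalent ways to force numeric addition when b is a string.
const a = 100;
const b = "-10";

console.log(a + Number(b));       // 90
console.log(a + parseInt(b, 10)); // 90
console.log(a + +b);              // 90
console.log(a + b);               // "100-10" (string concatenation)
```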