Basic JavaScript - Fibonacci Sequence Generator Not Working - javascript

I am new to JavaScript, and for some reason my Fibonacci sequence generator is not working. What is a Fibonacci sequence? Easy: it is a sequence in which each new number is the sum of the previous two. Here is an example: 0, 1, 1, 2, 3, 5, 8, 13, 21... I tried to make a loop with a while statement in which the variable that was once the first number (in this case 0) becomes a new number: whatever it was plus the second number (in this case 1). Example: firstNumber (which is 0) + secondNumber (which is 1) is assigned back to firstNumber (which now equals 1, because we added 0 + 1). If this goes on in a loop then, in theory, it could keep adding the last number and the one before it indefinitely. Here is my code; it is not working at all. Any help would be deeply appreciated. Hopefully I have explained myself correctly.
var firstNumber = 0;
var secondNumber = 1;

function fibonacciGenerator(n) {
    while (secondNumber <= n) {
        firstNumber + secondNumber == firstNumber;
        secondNumber + firstNumber == secondNumber;
    }
}

console.log(fibonacciGenerator(50));

firstNumber + secondNumber == firstNumber; is not the way to assign a value to a variable: == is a comparison operator, not assignment. You also need to add a return statement at the end of your function.
Does this help?
var firstNumber = 0;
var secondNumber = 1;

function fibonacciGenerator(n) {
    while (secondNumber <= n) {
        firstNumber = firstNumber + secondNumber;
        secondNumber = secondNumber + firstNumber;
    }
    return secondNumber;
}

console.log(fibonacciGenerator(50));
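For completeness, here is a sketch (my own variant, not the asker's code) that keeps the state local to the function and returns the whole sequence up to n, so repeated calls don't interfere with each other:

```javascript
// Sketch: collect the full Fibonacci sequence up to n.
// Local variables mean no shared state between calls.
function fibonacciSequence(n) {
    var first = 0, second = 1;
    var sequence = [first];
    while (second <= n) {
        sequence.push(second);
        var next = first + second;
        first = second;   // shift the window forward
        second = next;
    }
    return sequence;
}

console.log(fibonacciSequence(50)); // [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```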

You need a third variable to store the sum of the first and second variables each iteration. After that, you should make your loop start from 2, not 1, because you have already assigned the first two values before the loop starts.
Another thing: when you assign a value from one variable to another, it works from the right side to the left side, so it should be like this:
firstNumber = firstNumber + secondNumber;
secondNumber = secondNumber + firstNumber;
So after editing it would be like this:
function fibonacciGenerator(n) {
    var a = 0,
        b = 1,
        c;
    for (let i = 2; i <= n; i++) {
        c = a + b;
        a = b;
        b = c;
    }
    return b;
}

console.log(fibonacciGenerator(9));
to read more about it check this link

Related

Factorial will not multiply

When I run the code it doesn't multiply my number; it just outputs in a loop until it reaches 1. I have tried multiple different ways, and this one is the closest I've gotten.
......................................
var BR = "<br />"; // html line break

function factorial(Num) {
    document.write("The factorial of your number is " + Num + BR);
    if (Num == 1) return 1;
    else return (Num * factorial(Num - 1));
}
.....................................................
.......................................
var Num; // user's number
var Numfactorial;
var ES = "";
var factorial;
Num = prompt("Enter a number between 1-20" + ES);
Numfactorial = factorial(Num);
.........................................
It's supposed to take the number and multiply it down, so say you put in 20 it should go 19*18*17... down until it multiplies 1 and then outputs the product.
Place the document.write outside your function, and just print the result (Numfactorial) to the page. You also need to parse Num to a number, and remove var factorial:
var BR = "<br />";

function factorial(Num) {
    if (Num == 1) return 1;
    else return Num * factorial(Num - 1);
}

var Num;
var Numfactorial;
var ES = "";
Num = parseInt(prompt("Enter a number between 1-20" + ES));
Numfactorial = factorial(Num);
document.write("The factorial of your number is " + Numfactorial + BR);
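One more caveat worth noting: the recursive version never terminates for 0 or negative input, since factorial(0) calls factorial(-1) and so on until the stack overflows. A guarded iterative sketch (the name factorialIterative is my own) avoids that:

```javascript
// Sketch: iterative factorial with an input guard, avoiding the
// infinite recursion the recursive version hits for 0 or negatives.
function factorialIterative(n) {
    if (!Number.isInteger(n) || n < 0) return NaN; // reject bad input
    var result = 1;
    for (var i = 2; i <= n; i++) {
        result *= i;
    }
    return result; // factorialIterative(0) correctly yields 1
}

console.log(factorialIterative(5)); // 120
```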
Well, that is the expected output of your code. Your function factorial returns a value which you aren't using to display the result. You should instead write it like this:
....................................................
var BR = "<br />"; // html line break

function factorial(Num) {
    if (Num == 1) return 1;
    else return (Num * factorial(Num - 1));
}
.....................................................
.......................................
var Num; // user's number
var Numfactorial;
var ES = "";
var factorial;
Num = prompt("Enter a number between 1-20" + ES);
Numfactorial = factorial(Num);
document.write("The factorial of your number is " + Numfactorial + BR);
.........................................

while loop test case errors

The question as per the practice course is :
Write a JavaScript program to find the maximum integer n such that (1 + 2 + ... + n <= given integer) is true. For example, if the given integer is 10, the maximum integer n is 4, so that 1+2+3+4 <= 10 is true. Your output should be in the format console.log("Value of n is ", variableName)
My code is :
var num = prompt("Enter a number");

function test(x) {
    var sum = 1,
        n = 1,
        a = 0;
    while (sum <= x) {
        sum += n;
        n = n + 1;
        a += 1;
    }
    return a;
}

var output = test(num);
console.log("Result is :", output);
I'm getting the correct outputs as per the test cases I've entered (10-4, 15-5, 16-6, 17-6), but the website says there is something wrong with the program.
What am I doing wrong?
A better answer than looping: exploit maths. Start with the triangular number formula:
1 + 2 + ... + n = n * (n + 1) / 2
Thus, for input x, you need to find n such that
n * (n + 1) / 2 <= x
To solve this, we need to clean up the inequality, then use the quadratic equation formula:
n^2 + n <= 2x
n^2 + n - 2x <= 0
n <= (-1 + sqrt(1 + 8x)) / 2
as the final solution. e.g. for
x = 10: n <= (-1 + sqrt(81)) / 2; n <= 4
x = 16: n <= (-1 + sqrt(129)) / 2; n <= 5.1789...
Round the upper limit down, and you have the largest allowed integer. Translated into JavaScript:
function test(x) {
    return Math.floor((Math.sqrt(8 * x + 1) - 1) / 2);
}

var num = prompt("Enter a number");
console.log("Result is :", test(num));
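As a sanity check (my own addition, not part of the answer), the closed-form result can be cross-checked against a brute-force loop for a range of inputs; both helper names here are made up for illustration:

```javascript
// Sketch: compare the closed-form answer against a brute-force loop.
function maxNFormula(x) {
    return Math.floor((Math.sqrt(8 * x + 1) - 1) / 2);
}

function maxNLoop(x) {
    var sum = 0, n = 0;
    while (sum + n + 1 <= x) { // would adding (n + 1) stay within x?
        n += 1;
        sum += n;
    }
    return n;
}

for (var x = 1; x <= 1000; x++) {
    if (maxNFormula(x) !== maxNLoop(x)) {
        console.log('mismatch at x =', x);
    }
}
console.log('formula agrees with loop for x = 1..1000');
```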
Consider if the passed value is 11. Then, the maximum integer n should be 4, because 1+2+3+4 < 11 is true, while 1+2+3+4+5 < 11 is false. Your current code outputs 5 for an input of 11, though, which is incorrect; your while loop is sometimes overshooting sum.
You also need to initialize sum to start at 0, not at 1.
Subtract one from a before returning it:
function test(x) {
    var sum = 0,
        n = 1,
        a = 0;
    while (sum <= x) {
        sum += n;
        n = n + 1;
        a += 1;
        console.log(a, sum);
    }
    return a - 1;
}

console.log(test(10));
console.log(test(11));

var num = prompt("Enter a number");
var output = test(num);
console.log("Result is :", output);
The code below should work for you. Basically, if the input is 10 and your sum is 9, it will still go into the while loop; then it adds n again, so the sum becomes greater than your input (10), but you still count that iteration. What I did here is that at the end of the while loop, if your sum is greater than your input, I subtract one from a. That way the loop still executes, but it fixes the problem.
Another error I noticed was that sum started at 1 and n started at 1. You wanted 1+2+3+...+n, but with your previous method you got 1+1+2+3+...+n.
var num = prompt("Enter a number");

function test(x) {
    var sum = 0,
        n = 1,
        a = 0;
    while (sum <= x) {
        sum += n;
        n++;
        a++;
        if (sum > x) {
            a--;
        }
    }
    return a;
}

var output = test(num);
console.log("Result is :", output);
Your order of operations is a little funky; all you have to do is add the incrementor. The while loop's exit condition ensures the sum only passes over the number once, so when you return, reduce the counter by one:
var num = prompt("Enter a number");
var output = test(num);
console.log("Result is :", output);

function test(num) {
    let sum = 0;
    let inc = 0;
    while (sum <= num) {
        sum += ++inc;
    }
    return --inc;
}
This is a reduced version of your code: basically we first increment the number to add (n) in each iteration, and then add it to the variable holding the sum. When the loop condition evaluates to false, you need to subtract one from n to get your value:
var num = prompt("Enter a number");

function test(x) {
    var sum = 0, n = 0;
    while (sum <= x) {
        sum += (++n);
    }
    return --n;
}

var output = test(num);
console.log("Result is :", output);
I think this will work for you:
var num = prompt("Enter a number");

function test(x) {
    var sum = 1,
        n = 0;
    while ((sum + n) <= x) {
        n = n + 1;
        sum += n;
    }
    return n;
}

var output = test(num);
console.log("Result is :", output);
Try the function below to find the max number:
function maxNumber(a) {
    var i = 1, sum = 0, maxNumber = 0;
    while (sum <= a) {
        sum = sum + i;
        if (sum <= a) {
            maxNumber = i;
        }
        i += 1;
    }
    return maxNumber;
}
The condition sum <= a is checked twice to preserve the value from the previous iteration; if the condition is not satisfied, the current iteration's value is not useful, so the preserved value from the previous iteration is returned.
The code below will get the job done.
var num = prompt("Enter a number");

function findMaxNumber(num) {
    var sum = 0;
    var counter = 0;
    while (sum < num) {
        if (sum + counter > num) {
            break; // exit loop
        }
        sum = sum + counter;
        counter++;
    }
    return --counter; // the loop leaves this 1 higher than the max int
}

console.log('Result is: ' + findMaxNumber(num));

Multiply all items in an Array JS

I'm new to JavaScript and am fooling around to really understand the basics. Now I'm trying to make a calculator; a very basic one that can add, subtract, divide and multiply. I've gotten it to work with this code (showing multiply only):
var multiply = function () {
    var numbers = prompt("How many numbers do you want to multiply?", "At least 2, max 4");
    numbers = Number(numbers);
    switch (numbers) {
        case 2:
            num1 = prompt("Your first number: ");
            num2 = prompt("Your second number: ");
            ans = Number(num1) * Number(num2);
            alert(num1 + " * " + num2 + " = " + ans);
            break;
        case 3:
            num1 = Number(prompt("Your first number: "));
            num2 = Number(prompt("Your second number: "));
            num3 = Number(prompt("Your third number: "));
            ans = Number(num1) * Number(num2) * Number(num3);
            alert(num1 + " * " + num2 + " * " + num3 + " = " + ans);
            break;
        case 4:
            num1 = Number(prompt("Your first number: "));
            num2 = Number(prompt("Your second number: "));
            num3 = Number(prompt("Your third number: "));
            num4 = Number(prompt("Your fourth number: "));
            ans = Number(num1) * Number(num2) * Number(num3) * Number(num4);
            alert(num1 + " * " + num2 + " * " + num3 + " * " + num4 + " = " + ans);
            break;
        default:
            alert("Not valid");
            break;
    }
};
multiply();
My problem is that I'm very limited when it comes to how many numbers the user can multiply. Making a switch case for each possible quantity is going to take a while, so I thought of this:
var multiply = function () {
    var numbers = [];
    var ans = 0;
    var times = prompt("How many numbers do you want to multiply?");
    for (var i = 0; i < times; i++) {
        Number(numbers.push(prompt("Please, enter one of your numbers")));
    }
    alert(ans);
};
multiply();
So, my question is: How can I get "ans" to be equal to each element of my array "numbers" multiplied with eachother?
You can use the reduce function:
[1, 2, 3, 4].reduce(function (a, b) {
    return a * b;
}); // it returns 24
By the way, in your loop you should push to the array this way:
for (var i = 0; i < times; i++) {
    numbers.push(Number(prompt("Please, enter one of your numbers")));
}
As stated in other answers you can use the Array.reduce method. But rather than rolling your own multiplication function you can also use the native Math.imul method (note that Math.imul performs 32-bit integer multiplication, so it is only safe for small integer values):
var numbers = [1, 2, 3, 4];
var ans = numbers.reduce(Math.imul);
console.log(ans); // 24
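A caveat worth demonstrating (my own addition): Math.imul is C-style 32-bit integer multiplication, so while it works for the small example above, it silently wraps large products and discards fractions. A plain callback avoids that:

```javascript
// Math.imul truncates to 32 bits; a plain multiply callback does not.
var big = [100000, 100000]; // true product is 1e10, which exceeds 32 bits

console.log(big.reduce(Math.imul)); // wrapped 32-bit result: 1410065408
console.log(big.reduce(function (a, b) { return a * b; })); // 10000000000
```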
If I understand you properly you want something like multiply([1, 2, 3, 4]) === 24 ?
Then you can use https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/Reduce
You could just keep asking for a number and show the intermediate result at the same time. The user can exit with escape:
var multiply = function () {
    var s, ans = 1;
    while (s = prompt('Current product is ' + ans +
        '. Enter next factor to multiply with, or hit escape to exit')) {
        ans *= Number(s);
    }
}
multiply();
Reduce is probably the right answer, but to give you a more complete understanding of what it's actually doing, take a look at this. This is how I would manually do basically the same thing, adding a few guards to make it more safe.
//This is an enum. It's basically a cleaner and more
//scalable way to define constants. Here I'm using an
//integer to represent each of the four operations
var OPERATIONS = {
    'ADD': 1,
    'SUBTRACT': 2,
    'MULTIPLY': 3,
    'DIVIDE': 4
};

function calc (operation, values)
{
    if (!operation || !values || values.length < 2)
    {
        //The inputs aren't valid, so throw some kind of error
    }
    //This will be used in all of our cases, so
    //we define it at a larger scope
    var result;
    switch (operation)
    {
        case OPERATIONS.MULTIPLY:
            //Extracts the first value and stores it
            result = values.shift();
            //Iterate through the remaining values.
            //Remember that the first value is no
            //longer in the set
            values.forEach(function (value, index)
            {
                //*= is a compound assignment operator which multiplies
                //a value by the operand, and then stores it back in itself.
                //This is equivalent to result = result * value.
                result *= value;
            });
            break;
        //Create cases for ADD, SUBTRACT, and DIVIDE
    }
    return result;
}

//Test cases (these assume the ADD, SUBTRACT, and DIVIDE cases have been filled in)
console.log(calc(OPERATIONS.ADD, [1, 1])); //Prints 2
console.log(calc(OPERATIONS.SUBTRACT, [10, 1, 1])); //Prints 8
console.log(calc(OPERATIONS.MULTIPLY, [1, 2, 3, 4])); //Prints 24
console.log(calc(OPERATIONS.ADD, [calc(OPERATIONS.MULTIPLY, [5, 5]), 3, 100])); //Prints 128
You can make it a bit more generalised if you want by doing something like this...
function calc2 (operations, values)
{
    //You need one more value than operation here
    if (!operations || !values || values.length < 2 || (values.length - operations.length !== 1))
    {
        //The inputs aren't valid, so throw some kind of error
    }
    var result = values.shift();
    while (values.length)
    {
        switch (operations[0])
        {
            case OPERATIONS.ADD:
                result += values[0];
                break;
            case OPERATIONS.SUBTRACT:
                result -= values[0];
                break;
            case OPERATIONS.MULTIPLY:
                result *= values[0];
                break;
            case OPERATIONS.DIVIDE:
                result /= values[0];
                break;
            default:
                //Something has gone horribly wrong. Throw an error
        }
        //Work your way down the array by continually
        //removing the first value
        values.shift();
        operations.shift();
    }
    //Note that this method solves the equation linearly;
    //BEDMAS (or PEMDAS, depending on where you're from)
    //is not honoured.
    return result;
}

//Test cases
console.log(calc2([OPERATIONS.ADD], [1, 1])); //Prints 2
console.log(calc2([OPERATIONS.ADD, OPERATIONS.ADD], [1, 2, 3])); //Prints 6
console.log(calc2([OPERATIONS.ADD, OPERATIONS.ADD, OPERATIONS.DIVIDE], [6, 7, 5, 3])); //Prints 6
This second function would be used by storing the inputs and the operations one by one. So you get something like 6 + 7 + 5 / 3 =, and then you break it into its individual components to do the calculation.
The general methodology here is that you want to get a base value, and then iterate on top of it to get your final result. In the first case, this means mutating the value with the same operation for every value. In the second case you tell it the type of mutation that you'd like to perform at every step instead of at the beginning.
If you want to generalise this to use BEDMAS or have more complex functionality, you would probably have to create a tree structure where each node represents an operation and its respective operands, and to get your result you would simply traverse the tree.
e.g. PLUS(PLUS(DIVIDE(5, 3), 7), 6)
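To make the tree idea concrete, here is a minimal sketch (names like evalNode are my own, not from the answer): each node holds an operation and two children, leaves are plain numbers, and evaluation is a recursive traversal:

```javascript
// Sketch: evaluate an expression tree by recursion.
// Numbers are leaves; objects carry an op and two children.
function evalNode(node) {
    if (typeof node === 'number') return node; // leaf
    var left = evalNode(node.left);
    var right = evalNode(node.right);
    switch (node.op) {
        case 'PLUS':   return left + right;
        case 'MINUS':  return left - right;
        case 'TIMES':  return left * right;
        case 'DIVIDE': return left / right;
    }
}

// PLUS(PLUS(DIVIDE(5, 3), 7), 6)
var tree = {
    op: 'PLUS',
    left: { op: 'PLUS', left: { op: 'DIVIDE', left: 5, right: 3 }, right: 7 },
    right: 6
};

console.log(evalNode(tree)); // 5 / 3 + 7 + 6, about 14.6667
```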

Euler Project 2 in Javascript

I am going through The Odin Project, and part of that is doing questions 1-3 of Project Euler. I am stumped on question 2:
"By considering the terms in the Fibonacci sequence whose values do not exceed four million, find the sum of the even-valued terms."
I am so frustrated! What am I doing wrong? Here's what I have so far. Thanks!
function f() {
    var fib = [];
    fib.push(1, 2, 3);
    var i = fib.length;
    var total = 0;
    while (fib[i] < 4000000) {
        var x = fib[i - 2] + fib[i - 1];
        if (x % 2 == 0) {
            total += x;
        }
    }
    return total;
}
console.log(f());
The Fibonacci sequence starts 1, 1, 2, not 1, 2, 3.
Also, your solution looks like it will work, but you are storing every number in the sequence instead of just the last two, so this will gobble memory comparatively.
As #DLeh notes, the Fibonacci sequence starts with 1, 1, 2, not 1, 2, 3. However, that doesn't change the result of summing the even-valued items. The problem you're having is that at this point:
while(fib[i] < 4000000) {
fib[i] is undefined (i is fib.length, one past the last index), so the loop never runs and the function immediately exits with the total staying at 0. Also, within the while loop you're not pushing the next item in the sequence into your array. The code below fixes both of these problems:
function f() {
    var fib = [];
    fib.push(1, 1);
    var i = fib.length;
    var total = 0;
    while (fib[i - 1] < 4000000) {
        var x = fib[i - 2] + fib[i - 1];
        fib.push(x);
        i = fib.length;
        if (x % 2 == 0) {
            total += x;
        }
    }
    return total;
}
console.log(f()); //4613732
#DLeh also pointed out that you're storing more numbers than needed, this solution works without using the array:
function f() {
    var f1 = 1;
    var f2 = 1;
    var total = 0;
    while (f2 < 4000000) {
        var t = f1 + f2;
        if (t % 2 == 0)
            total += t;
        f1 = f2;
        f2 = t;
    }
    return total;
}
console.log(f()); //4613732
Just for grins, note that you can do this problem without any use of %, and just + operations. Every third value in the sequence is even. That is, 2 is followed by 3 (odd), and then 3 + 2 is 5 (odd), but that sum of two odd numbers gets us back to even (8) and the cycle repeats.
Thus:
function evenFibTotal(limit) {
    var a = 1, b = 1, c = 2, total = 0;
    while (c < limit) {
        total += c;
        a = b + c;
        b = a + c;
        c = a + b;
    }
    return total;
}
On each iteration, the second trailing value is set to the next value in the sequence (b + c), and that plus the current one is the first trailing value, and finally the next even Fibonacci number is the sum of those two.
(There's also the closed solution but it's no fun :)
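Building on the every-third-term observation above, even Fibonacci numbers also satisfy the identity E(k) = 4*E(k-1) + E(k-2) (2, 8, 34, 144, 610, ...), so a loop can visit only the even terms directly; the function name here is my own:

```javascript
// Sketch: sum even Fibonacci terms below limit, visiting only even
// terms via the identity E(k) = 4*E(k-1) + E(k-2).
function evenFibSum(limit) {
    var prev = 2, curr = 8, total = 0;
    if (limit > 2) total += 2; // count the first even term
    while (curr < limit) {
        total += curr;
        var next = 4 * curr + prev;
        prev = curr;
        curr = next;
    }
    return total;
}

console.log(evenFibSum(4000000)); // 4613732
```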

Javascript Calculator double digits

I am trying to make a simple calculator. However, I am having a problem when trying to add, subtract, multiply and divide double digit numbers. So far, when I plug in a 3 it lets me, but if I press 3 again it will just show another 3 instead of making it 33. How do I fix my JavaScript in order to be able to compute double digit numbers (or even more)? So far this is what I have for this particular problem:
var firstNumber;
var secondNumber;
var clear;
var operation;

function calculate() {
    var answer;
    if (operation == "+") {
        answer = firstNumber + secondNumber;
    } else if (operation == "-") {
        answer = firstNumber - secondNumber;
    } else if (operation == "*") {
        answer = firstNumber * secondNumber;
    } else if (operation == "/") {
        answer = firstNumber / secondNumber;
    }
    firstNumber = answer;
    displayAnswer(answer);
}

function DisplayOutput(data) {
    var display = firstNumber + " " + operation + " " + secondNumber;
    document.getElementById('answer').textContent = data;
}

function SaveNumber(number) {
    if (firstNumber == undefined) {
        firstNumber = number;
    } else if (firstNumber != undefined) {
        secondNumber = number;
    }
    DisplayOutput(number);
}

function Operation(op) {
    operation = op;
    DisplayOutput(op);
}

function displayAnswer(answer) {
    document.getElementById('answer').textContent = answer;
}

function clearData() {
    firstNumber = null;
    secondNumber = null;
    answer = 0;
    DisplayOutput(answer);
}
I'll try and explain what Paul Hagerty did with his calculator in Stanford's Objective-C course.
Add a variable that keeps track of when the user is pressing digits.
For example...
var userIsInTheMiddleOfTyping = false;
When the user presses a digit, this variable changes its value to true, and you concatenate the new digit with the ones you already had. Each time the user enters a digit you check this variable; if it's true, you concatenate again.
When the user presses an operand (+, -, x, etc.), reset the variable to false, so the code knows the user finished entering the number and the next digit starts a new one.
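A minimal sketch of that concatenation approach (function and variable names here are hypothetical, not from the asker's code):

```javascript
// Sketch: build up a number as a string while the user types digits,
// then parse it when an operand key ends the number.
var display = '';
var userIsInTheMiddleOfTyping = false;

function pressDigit(digit) {
    if (userIsInTheMiddleOfTyping) {
        display = display + String(digit); // "3" then "3" becomes "33"
    } else {
        display = String(digit);           // start a fresh number
        userIsInTheMiddleOfTyping = true;
    }
}

function pressOperand() {
    userIsInTheMiddleOfTyping = false; // the next digit starts a new number
    return Number(display);            // the finished number
}

pressDigit(3);
pressDigit(3);
console.log(pressOperand()); // 33
```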
I believe when you enter a number, you use the function SaveNumber.
First of all, you don't need the if after the else; the else already means "if (firstNumber != undefined)".
Second, once you place the first 3, firstNumber becomes defined, so from then on you are changing secondNumber.
What you need to do is switch to secondNumber only when an operator is entered.
For that, you can create an auxiliary variable.
Third, how are you adding the digits?
var isFirstNumber = true;
...
function SaveNumber(number) {
    if (isFirstNumber) {
        // note: firstNumber must start at 0 (not undefined) for this to work
        firstNumber = firstNumber * 10; // shifts the inserted digits to the left
        firstNumber = firstNumber + number;
    } else {
        secondNumber = secondNumber * 10;
        secondNumber = secondNumber + number;
    }
    DisplayOutput(number);
}
And where you process the input, when you get an operator you should do:
isFirstNumber = false;
