I'm doing the Factorialize a Number algorithm challenge on FCC, and I would really appreciate it if someone with greater knowledge than me could explain what's wrong with the thinking process behind my code.
I'm following these steps to factorialize a number:
Create a function with a parameter (num).
Create an if statement so that factorialize(0) returns 1: if (num === 0) { return 1; }
Create an array inside the function.
Create a loop that iterates from num down to 1 and pushes each value into the array. So we add the current number and all the previous values to the array. Example: if our number is 5 we add 5, 4, 3, 2, 1.
Use the reduce method on the array to multiply all the values together (factorialize).
Return the resulting value.
My code:
function factorialize(num) {
  if (num === 0) { return 1; }
  else {
    var array = [];
    for (i = num; i >= 1; i--) {
      var newArray = array.push[i];
      newArray.reduce(function(previousVal, currentVal) {
        return previousVal * currentVal;
      });
    }
  }
}
factorialize(5);
I mainly have two doubts now. This way of solving the algorithm might not be the most efficient one, but:
Is this a viable way to solve it?
Why am I getting "cannot read property 'reduce' of undefined"?
Link to the challenge:
https://www.freecodecamp.org/challenges/factorialize-a-number
You can simply try this:
function factorialize(num) {
  if (num === 0) { return 1; }
  else {
    for (var i = num - 1; i >= 1; i--) {
      num *= i;
    }
    return num;
  }
}
factorialize(5);
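To answer the second doubt: the array-and-reduce plan is viable too. The error comes from array.push[i], where the square brackets look up a property on the push function instead of calling it, so newArray ends up undefined and calling .reduce on undefined throws "cannot read property 'reduce' of undefined". Even with parentheses, push(i) returns the new length rather than the array, and the result of reduce is never returned. A corrected sketch of that approach:
function factorialize(num) {
  if (num === 0) { return 1; }
  var array = [];
  for (var i = num; i >= 1; i--) {
    array.push(i); // call push with parentheses; note it returns the new length, not the array
  }
  // multiply everything in the array and return the result
  return array.reduce(function (previousVal, currentVal) {
    return previousVal * currentVal;
  });
}
factorialize(5); // 120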
So, I'm learning JavaScript through a book and it has some exercises. One of the exercises asks you to build two functions: one that creates an array from two numbers provided as arguments, and another that sums all the numbers in the array. Here's my code:
let beg = 1;
let end = 3;
array = [];
sumNum = 0;
function range(begg, endd) {
  for (let count = begg; count <= endd; count++) {
    array.push(count);
  }
  return array;
}
console.log(range(beg, end));
function sum(arrayy) {
  for (let i = 0; i <= arrayy.length - 1; i++) {
    sumNum = arrayy[i] + sumNum;
    console.log(sumNum);
  }
  console.log("\n")
  console.log(arrayy.length - 1);
  return sumNum / 2;
}
console.log(sum(range(beg, end)));
array2 = [1, 2, 3];
console.log("\n");
console.log(array2.length);
As I was solving the exercise I kept getting double the sum of all the numbers in the array. I started to print some information and discovered that arrayy.length is returning double the value it's supposed to, and the loop runs twice as many times as it should.
Here's my output:
[ 1, 2, 3 ]
1
3
6
7
9
12
5
6
3
Sorry if this is a noob question, but my curiosity is killing me and I have not found anything on the internet, so why am I getting this result?
Thanks in advance.
As Ivan said: the "array" variable is global, so each time you call the range function you keep appending items to that shared array. You should declare the array inside your function and return it. Other than that you did a pretty nice job!
function range(begg, endd) {
  let array = [];
  for (let count = begg; count <= endd; count++) {
    array.push(count);
  }
  return array;
}
Also: the sum function should declare the "sumNum" variable inside the function to prevent it from increasing every time you call it, and once the doubling is gone you can drop the / 2 workaround and return sumNum directly:
function sum(arrayy) {
  let sumNum = 0;
  for (let i = 0; i <= arrayy.length - 1; i++) {
    sumNum = arrayy[i] + sumNum;
  }
  return sumNum;
}
Remove the array and sumNum variables from the top of your code to get rid of the global variables.
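With both variables local (and the / 2 workaround removed), the functions give consistent results no matter how many times they are called:
console.log(range(1, 3));      // [ 1, 2, 3 ]
console.log(sum(range(1, 3))); // 6
console.log(sum(range(1, 3))); // 6 again, since nothing is shared between calls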
I'm 3 months into JavaScript, and find it has no respect for its own declarations.
I have declared an empty array and want to push a number to it, so that it returns an array of numbers (i.e., separable). However, I am told .push() is not a function. concat() will not work either, and of course += ruins the algorithm.
How do I get it to accept each next value into an array?
I have tried using 'new Array()', but it does not help.
Similar to, but not the same as, Array.push throwing TypeError for array.push is not a function
In brief:
const fn = (n) => {
  let factors = [];
  let index = 1;
  while (n % 2 == 0) {
    let out = 2 ** index;
    factors.push(out);
    index++;
    n /= 2;
  }
}
This returns:
Uncaught TypeError: factors.push is not a function
(I have left out a lot of code that does not affect the issue.)
Edit: WIP. Apparently the loop this is enclosed in has an effect, as do other variable declarations.
With any luck, I will return with a different question, having solved the initial problem.
Nested loops seem to have been the actual problem, which is why others could not reproduce it.
const fn = (m, n) => {
  let factors = [];
  let index = 1;
  for (let i = m; i <= n; i++) {
    while (i % 2 == 0) {
      let out = 2 ** index;
      factors.push(out);
      index++;
      n /= 2;
    }
  }
}
For unknown reasons, this causes a type mismatch. The declaration of the array appears to be the cause.
This problem has been abandoned as the initiating function was completed by other methods.
Here it is working fine; what might have gone wrong on your side?
const fn = (n) => {
  let index = 1;
  let factors = [];
  while (n % 2 == 0) {
    let out = 2 ** index;
    factors.push(out);
    index++;
    n /= 2;
  }
  return factors;
}
document.write(fn(400));
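Since the snippet as posted runs fine, the error most likely comes from the code that was left out. One common way to get exactly this message (just a guess about the omitted code, not something visible in the snippet) is assigning the return value of push back to the array variable, because push returns the new length, not the array:
let factors = [];
factors = factors.push(2); // push returns the new length (1), so factors is now a number
factors.push(4);           // TypeError: factors.push is not a function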
I was doing a simple task in JS: take an integer as input, split it into single digits, and multiply them together, ignoring any zeros.
I have solved it, but I ran into some trouble that was fixed simply by changing the loop. I am just curious why the code did not work properly with the for loop but started to work when I changed it to a for...of loop. I could not figure out the answer by myself. Could somebody tell me where I am wrong?
The first one works as intended; the second one always returns 1.
function digitsMultip1(data) {
  var stringg = data.toString().split("", data.lenght);
  for (let elements of stringg) {
    if (elements != 0) {
      sum = parseInt(elements) * sum
    } else {
      continue
    };
  }
  return sum;
}
console.log(digitsMultip1(12035))
function digitsMultip2(data) {
  var sum = 1;
  var stringg = data.toString().split("", data.lenght);
  for (var i = 0; i > stringg.lenght; i++) {
    if (stringg[i] != 0) {
      sum = parseInt(stringg[i]) * sum
    } else {
      continue
    };
  }
  return sum;
}
console.log(digitsMultip2(12035))
There is no big difference; for...of works in newer browsers.
The for...of statement creates a loop iterating over iterable objects, including: built-in String, Array, Array-like objects (e.g., arguments or NodeList), TypedArray, Map, Set, and user-defined iterables. It invokes a custom iteration hook with statements to be executed for the value of each distinct property of the object.
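For example, both loop forms visit the same digits of the split string; only the iteration syntax differs:
var digits = String(12035).split("");
for (let d of digits) {                   // iterates over the values directly
  console.log(d);
}
for (var i = 0; i < digits.length; i++) { // iterates by index
  console.log(digits[i]);
}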
Several typos:
length is spelled wrong
> (greater than) should be < (less than) in your for loop
Now they both work:
function digitsMultip1(data) {
  var sum = 1, stringg = data.toString().split("");
  for (let elements of stringg) {
    if (elements != 0) {
      sum *= parseInt(elements)
    } else {
      continue
    };
  }
  return sum;
}
console.log(digitsMultip1(12035))
function digitsMultip2(data) {
  var sum = 1, stringg = data.toString().split("");
  for (var i = 0; i < stringg.length; i++) {
    if (stringg[i] != 0) {
      sum *= parseInt(stringg[i])
    } else {
      continue
    };
  }
  return sum;
}
console.log(digitsMultip2(12035))
You might want to look at reduce
const reducer = (accumulator, currentValue) => {
  // a zero (or non-numeric) digit becomes 1 here, so zeros don't affect the product
  currentValue = +currentValue || 1;
  return accumulator *= currentValue;
};
console.log(String(12035).split("").reduce(reducer, 1));
I'm doing some HackerRank challenges to improve my problem-solving skills. One of the challenges was about finding how many times the maximum number occurs in an array of numbers. For example, if we have 3 2 1 3 1 3 it should return 3.
This is what I did:
function birthdayCakeCandles(ar) {
  let total = 0
  let sortedArray = ar.sort((cur, next) => {
    return cur < next
  })
  ar.map(item => {
    if (item === sortedArray[0]) {
      total++;
    }
  })
  return total
}
So I sorted the given array, then mapped through it and checked how many of the numbers are equal to the maximum number in the array, counting the total.
This passes 8/9 test cases; the one it fails has an array with a length of 100000. This is the given data for that test case.
I really can't see why it fails this test. Is it possible that this happens because JavaScript is always synchronous and single-threaded?
I tried to use Promise and async/await, but HackerRank considers the first return as the output (which is the Promise itself) and does not use the resolved value as the output, so I can't really test this.
Is something wrong with my logic?
The sorting approach is too slow (O(n log n) time complexity). For algorithmic challenges on HR, it's unlikely that features somewhat particular to your language choice like promises/async are going to rescue you.
You can do this in one pass using an object to keep track of how many times you've "seen" each number and the array's maximum number, then simply index into the object to get your answer:
function birthdayCakeCandles(ar) {
  let best = -Infinity;
  const seen = {};
  for (let i = 0; i < ar.length; i++) {
    if (ar[i] > best) {
      best = ar[i];
    }
    seen[ar[i]] = ++seen[ar[i]] || 1;
  }
  return seen[best];
}
Time and space complexity: O(n).
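For the sample input from the question, the loop ends with best === 3 and seen === { '1': 2, '2': 1, '3': 3 }, so seen[best] returns 3.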
Edit:
This answer is even better, with constant space (here it is in JS):
function birthdayCakeCandles(ar) {
  let best = -Infinity;
  let count = 0;
  for (const n of ar) {
    if (n > best) {
      best = n;
      count = 1;
    } else if (n === best) {
      count++;
    }
  }
  return count;
}
In your case, the built-in sort function is using resources heavily. Maybe that's the reason it is failing on time/space complexity.
BTW, this problem can be solved easily using a for loop. The idea is:
Pseudocode:
var maxNum = -999999; // or the lowest possible value for whatever data type you use
int count = 0;
for (x in arr)
{
    if (x > maxNum)
    {
        maxNum = x;
        count = 1;
    }
    else if (x == maxNum) count++;
}
Here count will be the output.
The full code is
function birthdayCakeCandles(ar) {
  var maxNum = -1;
  var count = 0;
  for (var i = 0; i < ar.length; i++) {
    var x = ar[i];
    if (x < maxNum) continue;
    if (x > maxNum) {
      maxNum = x;
      count = 1;
    } else {
      count++;
    }
  }
  return count;
}
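A quick check with the sample input from the question:
console.log(birthdayCakeCandles([3, 2, 1, 3, 1, 3])); // 3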
If I have a simple test function that adds even numbers to an array:
function isEven(n) {
  var enumbers = [];
  if (n % 2 == 0) {
    enumbers.push(n);
  }
}
How can I increment my parameter until I have a set number of members in my array? For instance, I've tried this:
function isEven(n) {
  var enumbers = [];
  while (enumbers.length < 10) {
    if (n % 2 == 0) {
      enumbers.push(n);
    }
    console.log(enumbers);
    n = n + 1;
    isEven(n);
  }
}
isEven(1);
but it seems to just create a new array for each number until it finally throws a range error (maximum call stack size exceeded).
It's creating that array multiple times because you're constantly calling that function with:
isEven(n);
You're also not comparing to the length of the array, just the array. Add .length to enumbers. Try changing to:
var enumbers = [];
while (enumbers.length < 10) {
  if (n % 2 == 0) {
    enumbers.push(n);
  }
  console.log(enumbers);
  n = n + 1; // keep incrementing n; no recursive call needed
}
I'm not sure if I understood your question.
But you shouldn't use global variables, and it is unnecessary to call your function recursively inside a while loop.
The "maximum call stack size exceeded" error is your browser cutting off what is effectively an infinite loop of recursive calls.
This is what you need.
Examples here jsFiddle1 and jsFiddle2
function isEven(n) {
  var enumbers = [];
  while (enumbers.length < 10) {
    if (n % 2 == 0) {
      enumbers.push(n);
    }
    n++;
  }
  return enumbers;
}
Set up a test:
var n = 1;
var evenArray = isEven(n); //call isEven function and it returns an array
document.body.innerHTML = evenArray; //2,4,6,8,10,12,14,16,18,20
The problem is that (enumbers < 10) apparently always evaluates to true, causing an endless loop. It is this comparison that is wrong, since you're comparing an array with an integer; I think you're trying to get the array length:
while (enumbers.length < 10) {
Another thing: enumbers is a local variable, so each call to isEven has its own array. Therefore, the function is called recursively, over and over again.
I suggest you create the array outside of the isEven method.
I would have written something like:
function isEven(n, enumbers) {
  while (enumbers.length < 10) {
    if (n % 2 == 0) {
      enumbers.push(n);
    }
    console.log(enumbers);
    n = n + 1;
    isEven(n, enumbers);
  }
}
var enumbers = [];
isEven(1,enumbers);
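Logging the shared array afterwards shows the first ten even numbers:
console.log(enumbers); // [ 2, 4, 6, 8, 10, 12, 14, 16, 18, 20 ]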