Hello, I'm trying to understand recursion in JavaScript.
So far I have:
function countVowels(string) {
  let vowelCount = 0;
  // if we're not at the end of the string,
  // and if the character in the string is a vowel
  if (string.length - 1 >= 0 && charAt(string.length - 1) === "aeiouAEIOU") {
    // increase vowel count every time we iterate
    countVowels(vowelCount++);
  }
  return vowelCount;
}
First of all, this is giving me issues because charAt is not defined. How else can I say "the character at the current index" while iterating?
I can't use a for loop; I have to use recursion.
Second of all, am I using recursion correctly here?
countVowels(vowelCount++);
I'm trying to increase the vowel count every time the function is called.
Thanks for your guidance.
If you're interested, here is a version that does not keep track of the index or count, which might illuminate more about how the recursion can be done.
function countVowels(string) {
  if (!string.length) return 0;
  return (
    "aeiou".includes(string.charAt(0).toLowerCase()) +
    countVowels(string.substr(1))
  );
}
console.log(countVowels("")); // 0
console.log(countVowels("abcde")); // 2
console.log(countVowels("eee")); // 3
// Note that:
console.log('"hello".substr(1)', "hello".substr(1)) // ello
console.log('"hello".charAt(0)', "hello".charAt(0)) // h
console.log('"aeiou".includes("a")', "aeiou".includes("a")) // true
console.log('"a".includes("aeiou")', "a".includes("aeiou")) // false
Our base case is that the string is empty, so we return 0.
Otherwise, we check whether the first character of the string is a vowel (true coerces to 1 and false to 0 in JavaScript) and add that to the count for the rest of the string, which is one character shorter.
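For instance, the boolean-to-number coercion at work (a quick console check, not part of the original explanation):
console.log(true + 0);    // 1
console.log(false + 0);   // 0
console.log(true + true); // 2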
You are making two mistakes:
You should have three parameters: string, count (the count of vowels), and the current index i.
You should use includes() instead of comparing the character with "aeiouAEIOU".
function countVowels(string, count = 0, i = 0) {
  if (!string[i]) return count;
  if ("aeiou".includes(string[i].toLowerCase())) count++;
  return countVowels(string, count, i + 1);
}
console.log(countVowels("abcde")) //2
As asked by the OP in the comments: "Can you please explain why it's if("aeiou".includes(string[i].toLowerCase())) instead of if(string[i].includes("aeiou".toLowerCase()))?"
So first we should know what includes() does. includes() checks whether a string contains a certain substring passed to it. The string on which the method is called should be the larger string, and the value passed to includes() should be the smaller one.
Wrong one.
"a".includes('aeiou') //checking if 'aeiou' is present in string "a" //false
Correct one.
"aeiou".includes('a') //checking if 'a' is present in string "aeiou" //true
One possible solution would be:
function countVowels(string, number) {
  if (!string) return number;
  return countVowels(string.slice(1), 'aeiouAEIOU'.includes(string[0]) ? number + 1 : number);
}
// tests
console.log('abc --> ' + countVowels('abc', 0));
console.log('noor --> ' + countVowels('noor', 0));
console.log('hi --> ' + countVowels('hi', 0));
console.log('xyz --> ' + countVowels('xyz', 0));
and you should call your function like: countVowels('abc', 0)
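Alternatively (this is my own tweak, not part of the solution above), you could give the second parameter a default value so callers don't have to pass 0:
function countVowels(string, number = 0) {
  if (!string) return number;
  return countVowels(string.slice(1), 'aeiouAEIOU'.includes(string[0]) ? number + 1 : number);
}
console.log(countVowels('abc')); // 1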
Notes about your solution:
You always reset vowelCount inside your function; this usually does not work with recursion, because each call starts the count over from 0.
You defined your function to accept a string, but you call it again with an integer in countVowels(vowelCount++), so it will misbehave.
Always remember to define your base case first in a recursive function, to make sure the recursion stops at some point and does not recurse forever.
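As a rough sketch (my own variant, not taken from your question) of how your original "walk from the end of the string" idea could be made to work, pass the index as a parameter instead of resetting a local counter:
// Hypothetical fix that keeps the end-of-string approach:
// the index travels with each call, and the counts are summed on the way back up.
function countVowels(string, index = string.length - 1) {
  if (index < 0) return 0; // base case first
  const isVowel = "aeiouAEIOU".includes(string.charAt(index)) ? 1 : 0;
  return isVowel + countVowels(string, index - 1);
}
console.log(countVowels("abcde")); // 2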
An alternative ES6 solution using a regex and the slice() method. The regex test() method returns true for vowels, and as stated in a previous answer, JavaScript evaluates true + true === 2.
const countVowels = str => {
  return !str.length ? 0 : /[aeiou]/i.test(str[0]) + countVowels(str.slice(1));
};
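For example (my own test calls, not from the answer):
console.log(countVowels(""));      // 0
console.log(countVowels("abcde")); // 2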
As far as I understand, the addition assignment operator simply shortens the way we increment a value: instead of writing x = x + 1, we can write x += 1. Now, here is a case where this approach didn't work for me as intended. I want to loop through a given string and add the current character to a pre-declared empty object: if the character already exists in the object, I increment its value by 1; if it does not, I add it with a value of 1. In other words, I am building a character map of a given string.
let charMap = {};
for (let char of 'doooppy') {
  charMap[char] = charMap[char] + 1 || 1;
}
This works like a charm. Now check the following:
let charMap = {};
for (let char of 'doooppy') {
  charMap[char] += 1 || 1;
}
This now returns NaN, which is pretty weird to me since it's the same idea of using the assignment operator. Can somebody explain why that is? Thanks.
1 || 1 is evaluated before charMap[char] += 1 || 1; is evaluated. That's the way operator precedence in JavaScript works: the part with || is evaluated before the part with +=. The first time that line of code runs, charMap[char] is undefined. Using += on undefined produces NaN (undefined coerces to NaN in numeric addition), and in subsequent iterations using += on NaN also produces NaN.
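To make that concrete (a quick console illustration, not part of the original answer):
console.log(undefined + 1);       // NaN  (undefined coerces to NaN)
console.log(undefined + 1 || 1);  // 1    (this is what the working version computes)
// charMap[char] += 1 || 1  desugars to  charMap[char] = charMap[char] + (1 || 1),
// i.e. undefined + 1 on the first pass, which is NaN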
Correct solution with += would be:
let charMap = {};
for (let char of 'doooppy') {
  if (!charMap[char]) {
    charMap[char] = 1;
  } else {
    charMap[char] += 1;
  }
}
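An equivalent, arguably clearer one-liner (my own variant, not from the answer) defaults the missing entry to 0 before adding:
let charMap = {};
for (let char of 'doooppy') {
  charMap[char] = (charMap[char] || 0) + 1;
}
console.log(charMap); // { d: 1, o: 3, p: 2, y: 1 }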
I am currently learning JS, and while doing some practice I ran into an issue I am unclear on regarding data types in JavaScript. I understand that JS does NOT require explicit type declarations; it will automatically do type conversion whenever possible. However, I hit one problem when I do NOT do the type conversion, which is as follows:
var sum = 0;
function totalSum(a) {
  if (a == 0) {
    return sum;
  } else {
    sum += a;
    return totalSum(--a);
  }
}
var i = prompt("Give me an integer");
// var num = parseInt(i);
alert("Total sum from 1 to " + i + " = " + totalSum(i));
// alert("Total sum from 1 to " + i + " = " + totalSum(num));
I notice that the code works perfectly if I convert the data type from string to int using the parseInt function, as the commented-out lines do. BUT when I do NOT do the type conversion, things get strange: I get a final result of 054321 if I enter 5 at the prompt. Similarly, an input of 3 gives 0321, and so on.
Why is this the case? Can someone explain why totalSum produces such a number? Doesn't JavaScript automatically convert it to an integer so that it works in the totalSum function?
The sample code can also be viewed in http://jsfiddle.net/hphchan/66ghktd2/.
Thanks.
I will try to decompose what's happening in the totalSum method.
First, the method totalSum is called with a string as its parameter, like calling totalSum("5");
Then sum += a; (sum = 0 + "5", so sum = "05") (note that sum has become a string now)
Then return totalSum(--a);. --a converts the value of a to a number and decrements it, so it's like calling return totalSum(4);
Then sum += a (sum = "05" + 4, so sum = "054"), and so on.
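You can see the same coercions in isolation (my own illustration of the steps above):
console.log(0 + "5");   // "05"  -> + with a string operand concatenates
console.log("05" + 4);  // "054"
console.log("5" - 1);   // 4     -> -- and - always coerce to a number, which is why --a works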
See the documentation of window.prompt: (emphasis mine)
result is a string containing the text entered by the user, or the value null.
Why does this code output 3, not 2?
var i = 1;
i = ++i + --i;
console.log(i);
I expected:
++i // i == 2
--i // i == 1
i = 1 + 1 // i == 2
Where did I make a mistake?
The changes occur in this order:
Increment i (to 2)
Take i for the left hand side of the addition (2)
Decrement i (to 1)
Take i for the right hand side of the addition (1)
Perform the addition and assign to i (3)
… and seeing you attempt to do this gives me some insight into why JSLint doesn't like ++ and --.
Look at it this way
x = (something)
x = (++i) + (something)
x = (2) + (something)
x = (2) + (--i)
x = (2) + (1)
The terms are evaluated from left to right, once the first ++i is evaluated it won't be re-evaluated when you change its value with --i.
Your second line is adding 2 + 1.
In order, the interpreter would execute:
++i // i == 2
+
--i // i == 1
i = 2 + 1
++i equals 2, --i equals 1. 2 + 1 = 3.
You're a little off on your order of operations. Here's how it goes:
i is incremented by 1 (++i), resulting in a value of 2. This is stored in i.
That value of 2 is then added to the value of (--i), which is 1. 2 + 1 = 3.
Because when you use ++i the value of i is incremented and then returned. However, if you use i++, the value of i is returned and then incremented. Reference:
++$a Increments $a by one, then returns $a.
$a++ Returns $a, then increments $a by one.
--$a Decrements $a by one, then returns $a.
$a-- Returns $a, then decrements $a by one.
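The same pre/post behaviour written in JavaScript (my own example, mirroring the reference above):
let a = 1;
console.log(++a); // 2  (increments, then returns)
console.log(a--); // 2  (returns, then decrements)
console.log(a);   // 1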
Because you're expecting this code to work as if i were a reference object whose values aren't collected until the unary operations are complete. But in most languages an expression is evaluated immediately, so i returns the value of i, not i itself.
If you had ++(--i) then you'd be right.
In short, don't do this.
The result of that operation isn't defined the same in every language/compiler/interpreter. So while it results in 3 in JavaScript, it may result in 2 elsewhere.
What's the "best" way to convert a number to a string (in terms of speed advantage, clarity advantage, memory advantage, etc) ?
Some examples:
String(n)
n.toString()
""+n
n+""
like this:
var foo = 45;
var bar = '' + foo;
Actually, even though I typically do it like this for simple convenience, over thousands of iterations it appears that for raw speed .toString() has an advantage.
See Performance tests here (not by me, but found when I went to write my own):
http://jsben.ch/#/ghQYR
Fastest based on the benchmark above: str = num.toString();
It should be noted that the difference in speed is not overly significant when you consider that any of these approaches can do the conversion a million times in about 0.1 seconds.
Update: The speed seems to differ greatly by browser. In Chrome, num + '' seems to be the fastest based on this test http://jsben.ch/#/ghQYR
Update 2: Again based on my test above, note that Firefox 20.0.1 executes .toString() about 100 times slower than the '' + num sample.
In my opinion n.toString() takes the prize for its clarity, and I don't think it carries any extra overhead.
Explicit conversions are very clear to someone that's new to the language. Using type coercion, as others have suggested, leads to ambiguity if a developer is not aware of the coercion rules. Ultimately developer time is more costly than CPU time, so I'd optimize for the former at the cost of the latter. That being said, in this case the difference is likely negligible, but if not I'm sure there are some decent JavaScript compressors that will optimize this sort of thing.
So, for the above reasons I'd go with: n.toString() or String(n). String(n) is probably a better choice because it won't fail if n is null or undefined.
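A quick illustration of that difference:
String(null);        // "null"
String(undefined);   // "undefined"
// (null).toString() throws a TypeError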
Below are the methods to convert an integer to a string in JS.
The methods are arranged in decreasing order of performance.
var num = 1
Method 1:
num = `${num}`
Method 2:
num = num + ''
Method 3:
num = String(num)
Method 4:
num = num.toString()
Note: You can't call toString() directly on a number literal. 2.toString() will throw Uncaught SyntaxError: Invalid or unexpected token.
(The performance test results are given by @DarckBlezzer in his answer.)
Other answers already covered other options, but I prefer this one:
s = `${n}`
Short, succinct, already used in many other places (if you're using a modern framework / ES version) so it's a safe bet any programmer will understand it.
Not that it (usually) matters much, but it also seems to be among the fastest compared to other methods.
...JavaScript's parser tries to parse the dot notation on a number as a floating point literal.
2..toString(); // the second point is correctly recognized
2 .toString(); // note the space left to the dot
(2).toString(); // 2 is evaluated first
Source
Tongue-in-cheek obviously:
var harshNum = 108;
"".split.call(harshNum,"").join("");
Or in ES6 you could simply use template strings:
var harshNum = 108;
`${harshNum}`;
The simplest way to convert any variable to a string is to add an empty string to that variable.
5.41 + '' // Result: the string '5.41'
Math.PI + '' // Result: the string '3.141592653589793'
I used https://jsperf.com to create a test case for the following cases:
number + ''
`${number}`
String(number)
number.toString()
https://jsperf.com/number-string-conversion-speed-comparison
As of the 24th of July, 2018, the results say that number + '' is the fastest in Chrome; in Firefox it ties with template string literals.
Both String(number), and number.toString() are around 95% slower than the fastest option.
I recommend `${expression}` because you don't need to worry about errors (it doesn't throw for values like null or undefined).
[undefined, null, NaN, true, false, "2", "", 3].forEach(elem => {
  console.log(`${elem}`, typeof(`${elem}`))
})
/* output
undefined string
null string
NaN string
true string
false string
2 string
string
3 string
*/
Below you can test the speed, but note that the order of the test cases will affect the result (when run here on Stack Overflow); you can test it on your own platform.
const testCases = [
  ["${n}", (n) => `${n}`], // 👈
  ['----', undefined],
  [`"" + n`, (n) => "" + n],
  [`'' + n`, (n) => '' + n],
  [`\`\` + n`, (n) => `` + n],
  [`n + ''`, (n) => n + ''],
  ['----', undefined],
  [`String(n)`, (n) => String(n)],
  ["${n}", (n) => `${n}`], // 👈
  ['----', undefined],
  [`(n).toString()`, (n) => (n).toString()],
  [`n.toString()`, (n) => n.toString()],
]
for (const [name, testFunc] of testCases) {
  if (testFunc === undefined) {
    console.log(name)
    continue
  }
  console.time(name)
  for (const n of [...Array(1000000).keys()]) {
    testFunc(n)
  }
  console.timeEnd(name)
}
I'm going to re-edit this with more data when I have time to; for right now this is fine...
Test in Node.js v8.11.2: 2018/06/06
let i = 0;
console.time("test1")
for (; i < 10000000; i = i + 1) {
  const string = "" + 1234;
}
console.timeEnd("test1")
i = 0;
console.time("test1.1")
for (; i < 10000000; i = i + 1) {
  const string = '' + 1234;
}
console.timeEnd("test1.1")
i = 0;
console.time("test1.2")
for (; i < 10000000; i = i + 1) {
  const string = `` + 1234;
}
console.timeEnd("test1.2")
i = 0;
console.time("test1.3")
for (; i < 10000000; i = i + 1) {
  const string = 1234 + '';
}
console.timeEnd("test1.3")
i = 0;
console.time("test2")
for (; i < 10000000; i = i + 1) {
  const string = (1234).toString();
}
console.timeEnd("test2")
i = 0;
console.time("test3")
for (; i < 10000000; i = i + 1) {
  const string = String(1234);
}
console.timeEnd("test3")
i = 0;
console.time("test4")
for (; i < 10000000; i = i + 1) {
  const string = `${1234}`;
}
console.timeEnd("test4")
i = 0;
console.time("test5")
for (; i < 10000000; i = i + 1) {
  const string = 1234..toString();
}
console.timeEnd("test5")
i = 0;
console.time("test6")
for (; i < 10000000; i = i + 1) {
  const string = 1234 .toString();
}
console.timeEnd("test6")
output
test1: 72.268ms
test1.1: 61.086ms
test1.2: 66.854ms
test1.3: 63.698ms
test2: 207.912ms
test3: 81.987ms
test4: 59.752ms
test5: 213.136ms
test6: 204.869ms
If you need to format the result to a specific number of decimal places, for example to represent currency, you need something like the toFixed() method.
number.toFixed( [digits] )
digits is the number of digits to display after the decimal place.
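For example:
var price = 8.1;
price.toFixed(2);        // "8.10"  -> note that toFixed returns a string
(1234.5678).toFixed(2);  // "1234.57" (rounded)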
The only valid solution for almost all possible existing and future cases (input is a number, null, undefined, a Symbol, anything else) is String(x). Do not use three different ways for a simple operation based on assumptions about the value's type, like "here I definitely convert a number to a string and here definitely a boolean to a string".
Explanation:
String(x) handles null, undefined, and Symbols, and calls .toString() for objects.
'' + x calls .valueOf() on x (as part of coercing it to a primitive), throws on Symbols, and can give implementation-dependent results.
x.toString() throws on null and undefined.
Note: String(x) will still fail on prototype-less objects like Object.create(null).
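For example:
const bare = Object.create(null);
String(bare); // throws a TypeError (the object cannot be converted to a primitive)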
If you don't like strings like 'Hello, undefined' or want to support prototype-less objects, use the following type conversion function:
/**
 * Safely casts any value to string. Null and undefined are converted to ''.
 * @param {*} value
 * @return {string}
 */
function string (value) {
  return value == null ? '' : (typeof value === 'object' && !value.toString ? '[object]' : String(value));
}
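Example calls (my own, based on the function above):
string(null);                 // ''
string(undefined);            // ''
string(42);                   // '42'
string(Object.create(null));  // '[object]'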
With number literals, the dot for accessing a property must be distinguished from the decimal dot. This leaves you with the following options if you want to invoke toString() on the number literal 123:
123..toString()
123 .toString() // space before the dot
123.0.toString()
(123).toString()
I like the first two since they're easier to read. I tend to use String(n), but it is more a matter of style than anything else.
That is, unless you have a line such as
var n = 5;
console.log ("the number is: " + n);
which is very self-explanatory.
I think it depends on the situation, but in any case you can use the .toString() method, as it is very clear to understand.
.toString() is the built-in type-casting function; I'm no expert on the details, but whenever we compare built-in type casting with explicit methodologies, the built-in approach is usually preferred.
If I had to take everything into consideration, I would suggest the following:
var myint = 1;
var mystring = myint + '';
/*or int to string*/
myint = myint + ''
IMHO, it's the fastest way to convert to a string. Correct me if I am wrong.
If you are curious as to which is the most performant, check this out, where I compare all the different Number -> String conversions.
It looks like 2+'' or 2+"" are the fastest.
https://jsperf.com/int-2-string
We can also use the String constructor. According to this benchmark it's the fastest way to convert a Number to a String in Firefox 58, even though it's slower than "" + num in the popular browser Google Chrome.
The toFixed() method will also serve the purpose.
var n = 8.434332;
n.toFixed(2) // 8.43
You can call the Number object and then call toString() on it.
Number.call(null, n).toString()
You may use this trick for other JavaScript native objects.
I just came across this recently: methods 3 and 4 are not appropriate because of how the strings are copied and then put together. For a small program this problem is insignificant, but for any real web application, frequent string manipulation can affect performance and readability.
Here is the link to read.
It seems to give similar results when using Node.js. I ran this script:
let bar;
let foo = ["45","foo"];
console.time('string concat testing');
for (let i = 0; i < 10000000; i++) {
  bar = "" + foo;
}
console.timeEnd('string concat testing');
console.time("string obj testing");
for (let i = 0; i < 10000000; i++) {
  bar = String(foo);
}
console.timeEnd("string obj testing");
console.time("string both");
for (let i = 0; i < 10000000; i++) {
  bar = "" + foo + "";
}
console.timeEnd("string both");
and got the following results:
❯ node testing.js
string concat testing: 2802.542ms
string obj testing: 3374.530ms
string both: 2660.023ms
Similar times each time I ran it.
Just use template literal syntax:
`${this.num}`