I am trying to write a function that gets the length of a string without using the .length property, loops, or built-in methods. I am using recursion; however, the base case never triggers, which causes a stack overflow. I tried console logging string[i] and sure enough, once the end of the string is reached, it logs undefined, but the if condition still never evaluates to true.
const getLength = (string, i = 0) => {
  if (string[i] === 'undefined') return 0;
  return 1 + getLength(string, i+1);
}
console.log(getLength("what is going on??")); // 18
The if statement should be string[i] === undefined; you are currently comparing against the string 'undefined', which never matches.
Your test for undefined is not working.
You can test for falsiness, or do a more detailed check.
Here is the falsy version; on an array it would fail for falsy elements like 0, though for a string every character is a truthy one-character string:
const getLength = (string, i = 0) => string[i] ? 1 + getLength(string, i+1) : 0;
console.log(getLength("what is going on??")); //18
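For the more detailed check, here is a minimal sketch comparing against the value undefined itself (same recursion, stricter base case):

const getLength = (string, i = 0) =>
  string[i] === undefined ? 0 : 1 + getLength(string, i + 1);

console.log(getLength("what is going on??")); // 18
console.log(getLength(""));                   // 0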
if (!value || value.length<1)
if (value.length<1)
What's the difference between the two conditions? Wouldn't they be the same?
No, they are quite different.
!value
This checks whether the value is falsy (undefined, null, false, 0, '', or NaN); essentially it checks for absence. Arrays are always truthy: ![] and ![3] are both false, and even an empty [] is truthy.
value.length
This counts the number of elements in the array (strings have a length too).
For [], value.length < 1 returns true.
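A quick console check of those claims:

console.log(!undefined);     // true  (absent value)
console.log(![]);            // false (arrays are always truthy)
console.log(![3]);           // false
console.log([].length < 1);  // true  (empty array)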
If value is null or undefined, the second if will throw an error stating that you can't access length of null / undefined.
The first one prevents that, as it will only be accessing value.length if value is truthy. Otherwise, the first condition (!value) is satisfied, so the second one (value.length < 1) won't even be evaluated.
const arr1 = null;
const arr2 = [];
// Satisfies first condition:
if (!arr1 || arr1.length < 1) console.log('No values in arr1.');
// Satisfies second condition:
if (!arr2 || arr2.length < 1) console.log('No values in arr2.');
// Breaks:
if (arr1.length < 1) console.log('No values in arr1.');
Anyway, that's not specific to TS, it's just how vanilla JS works.
A quick way to understand it: you cannot access the length property of undefined. So the second condition alone will throw an error akin to Cannot read property 'length' of undefined.
The first condition, however, first checks that the array is defined, so it won't throw any error.
TypeScript has a native way of doing this check: the optional chaining operator ?. (sometimes called a "safe navigation operator"). So in TS, you could simply do
if (value?.length < 1) { }
It compiles down to the equivalent JS:
if ((value === null || value === void 0 ? void 0 : value.length) < 1) { }
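A quick sketch of how that check actually behaves. Note it is not identical to !value || value.length < 1, because null and undefined fall through to false here (null?.length yields undefined, and undefined < 1 is false):

const check = (value) => value?.length < 1;

console.log(check([]));        // true  (genuinely empty array)
console.log(check([1, 2]));    // false
console.log(check(null));      // false (short-circuits to undefined)
console.log(check(undefined)); // false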
I'm currently studying JavaScript algorithms. Below is the algorithm I'm currently trying to learn/understand.
function same(arr1, arr2){
  if(arr1.length !== arr2.length){
    return false;
  }
  let frequencyCounter1 = {}
  let frequencyCounter2 = {}
  for(let val of arr1){
    frequencyCounter1[val] = (frequencyCounter1[val] || 0) + 1
    console.log(frequencyCounter1);
  }
  for(let val of arr2){
    frequencyCounter2[val] = (frequencyCounter2[val] || 0) + 1
  }
  for(let key in frequencyCounter1){
    if(!(key ** 2 in frequencyCounter2)){
      return false
    }
    if(frequencyCounter2[key ** 2] !== frequencyCounter1[key]){
      return false
    }
  }
  return true
}
same([1,2,3,2,5], [9,1,4,4,11])
I understand the code except for one line:
frequencyCounter1[val] = (frequencyCounter1[val] || 0) + 1
What this algorithm does is compare two arrays: if every value in array b is the square of a value in array a, with matching frequencies, it returns true; otherwise it returns false.
So in this example, it returns false.
If I pass [1,2,3,4,5] and [1,4,9,16,25], it returns true.
I know what this line does:
frequencyCounter1[val] = (frequencyCounter1[val] || 0) + 1
It makes a key-value pair: on the first iteration, say, it takes 1 as the key and (frequencyCounter1[val] || 0) + 1 as the value. That value represents the number of times a number appears in the array, so if 1 appears 10 times it ends up as the pair 1: 10.
I understand that part clearly; I just want to know how this expression is evaluated and what's happening behind the scenes:
(frequencyCounter1[val] || 0) + 1
The idea is that if frequencyCounter1[val] is undefined, it defaults to 0. undefined + 1 returns NaN, which wouldn't work as the programmer intended, so || is used to work around that problem without writing additional lines of code.
In JavaScript the || operator doesn't return true or false as you might expect: it returns the first operand that would evaluate to true if converted to a boolean, or defaults to the last operand if none is found.
For example, (null || "" || undefined || false || NaN || "test" || 2) will return "test"
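Spelling out the short-circuiting step by step with a small counter:

const counts = {};

// First occurrence: counts['a'] is undefined (falsy),
// so (undefined || 0) evaluates to 0, and 0 + 1 = 1.
counts['a'] = (counts['a'] || 0) + 1;
console.log(counts); // { a: 1 }

// Second occurrence: counts['a'] is 1 (truthy),
// so (1 || 0) evaluates to 1, and 1 + 1 = 2.
counts['a'] = (counts['a'] || 0) + 1;
console.log(counts); // { a: 2 }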
This pattern starts with the first value of the array and checks whether it already exists as a key. If it does not exist, the lookup yields undefined; the || operator replaces that with 0, and adding 1 registers the value with a count of 1.
Here we are basically setting key:value pairs, as in the example below:
const obj = {
  1: 1,
  2: 2,
  3: 5
};
obj[3] = 7;
console.log(obj); // { '1': 1, '2': 2, '3': 7 }
If val already exists, the || operator short-circuits on the existing count and 1 is added to it. This one-liner is very useful as a counting operation in JavaScript, and the whole function runs in O(N), which is better than the nested-loop O(N^2) that is a common solution to this kind of problem.
In the for loop, counter < (x.lenght) is misspelled, yet the function returns zero. When corrected to x.length, the function returns the correct number of Bs, 3. 1) Why is zero returned? 2) Why does JavaScript not catch this error? 3) For the future, is there anything I can do to make sure these kinds of errors are caught?
function countBs(x){
  var lCounter = 0;
  for (var counter = 0; counter < (x.lenght); counter++){
    if((x.charAt(counter)) == "B"){
      lCounter++;
    }
  }
  return lCounter;
}
console.log(countBs("BCBDB"));
Accessing x.lenght returns undefined, causing the for loop to terminate immediately. Therefore the initial value of lCounter, 0, is returned.
You can check for the existence of a property on an object by using the in operator (note that in works on objects, so a primitive string would need to be wrapped first), like so:

if ('lenght' in x) {
  ...
x.lenght is returning undefined. Relational operators perform automatic type coercion, so undefined is converted to a number for the comparison, and it converts to NaN. Any relational comparison involving NaN returns false, so the loop ends immediately.
JavaScript doesn't catch this error because it is loosely typed, automatically converting types as needed in most cases.
There's no easy way to guarantee that typos like this are caught, but a good IDE can detect them if you provide good type annotations.
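For example (assuming an editor with TypeScript-powered checking, such as VS Code), a // @ts-check directive plus a JSDoc annotation is enough to get the typo flagged:

// @ts-check

/** @param {string} x */
function countBs(x) {
  var lCounter = 0;
  // With checking enabled, the editor flags x.lenght here,
  // because 'lenght' does not exist on type 'string'.
  for (var counter = 0; counter < x.lenght; counter++) {
    if (x.charAt(counter) == "B") {
      lCounter++;
    }
  }
  return lCounter;
}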
JavaScript does all kinds of conversions instead of throwing an error: https://www.w3schools.com/js/js_type_conversion.asp
undefined in particular becomes NaN when converted to a number (very last line of the very last table). Every relational comparison with NaN (<, >, <=, >=, and also ==) yields false; NaN does not even equal itself, so NaN == NaN is false while NaN != NaN is true.
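A quick console check of that behavior:

console.log(undefined < 5);  // false (undefined converts to NaN)
console.log(undefined > 5);  // false
console.log(undefined == 5); // false
console.log(NaN == NaN);     // false (NaN never equals anything, itself included)
console.log(NaN != NaN);     // true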
If you want to catch or log an error to make sure the property is defined, see the code below:
function countBs(x){
  var lCounter = 0;
  // Guard: bail out early if the (misspelled) property is undefined.
  if (typeof x.lenght == 'undefined') {
    console.log('Undefined property lenght on variable x');
    return 'Error catch';
  }
  for (var counter = 0; counter < (x.lenght); counter++){
    if ((x.charAt(counter)) == "B"){
      lCounter++;
    }
  }
  return lCounter;
}
console.log(countBs("BCBDB"));
To catch this particular error, set lCounter to -1 instead of 0.
If the for condition is correct and at least one B is found, the sentinel value is overwritten.
You can return (or throw) an error if it is still -1 afterwards.
Otherwise, return lCounter + 1 to account for the initialization at -1.
function countBs(x) {
  var lCounter = -1; // sentinel: only changes if a B is actually counted
  for (var counter = 0; counter < (x.lenght); counter++) {
    if (x.charAt(counter) == "B") {
      lCounter++;
    }
  }
  if (lCounter == -1) {
    // Note: this fires both when the typo keeps the loop from running
    // and when the string simply contains no Bs.
    return 'Error';
  } else {
    return lCounter + 1; // compensate for starting at -1
  }
}
Since jQuery 3, .outerHeight() returns undefined instead of null when called on an empty selection. This causes problems when adding up the heights of elements that may not exist, because number + undefined results in NaN, whereas prior to jQuery 3, number + null returned the number.
var lorem = $('#div1').outerHeight() + $('#div2').outerHeight();
Returns NaN if for example #div2 does not exist.
Potential solution:
const undef2null = function(myvar) {
  if (myvar === undefined) { return null; }
  else { return myvar; }
};
Would turn above code into:
var lorem = undef2null($('#div1').outerHeight()) + undef2null($('#div2').outerHeight());
Is there a more elegant solution than this?
You can guard against an undefined or null value using the || operator:
($('#div1').outerHeight() || 0)
...which is less typing than calling a function on each of the potentially problematic values (although obviously you could make a one-line function out of that if you wanted to).
Or, instead of a function that checks a single value for undefined, you could write a function that adds up all supplied arguments, checking each:
function addNumbers() {
  var result = 0;
  for (var i = 0; i < arguments.length; i++)
    result += arguments[i] || 0;
  return result;
}
console.log( addNumbers(1,2,3) );            // 6
console.log( addNumbers(1,undefined,3) );    // 4
console.log( addNumbers(1,undefined,null) ); // 1
(Note that the code I've shown doesn't actually test the type of the supplied values, so if you called it with strings or whatever it would happily concatenate those into the result. You could add an extra test for non-numeric/undefined/null values if desired.)
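For instance, a sketch of such a guard (a hypothetical stricter variant, not what the snippet above does) that only adds finite numbers:

function addNumbersStrict() {
  var result = 0;
  for (var i = 0; i < arguments.length; i++) {
    // Skip anything that is not a finite number:
    // strings, undefined, null, and NaN are all ignored.
    if (typeof arguments[i] === 'number' && isFinite(arguments[i])) {
      result += arguments[i];
    }
  }
  return result;
}

console.log(addNumbersStrict(1, '2', 3)); // 4 (the string '2' is ignored)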
It seems strange to write a method that turns undefined into null when you really want to treat both as zero.
To coerce both undefined and null to zero, you can
someValThatMightBeNullOrUndefined || 0
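If you ever need to preserve legitimate falsy values such as 0 (which || would also replace with the default), the nullish coalescing operator ?? substitutes the default only for null and undefined. It is a later addition to the language (ES2020), so it needs a reasonably modern environment:

var lorem = ($('#div1').outerHeight() ?? 0) + ($('#div2').outerHeight() ?? 0);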
I am a bit stuck with the scenario below.
I am currently reading JavaScript: The Definitive Guide and came across the code below under Accessor Properties.
function inherit(p) {
  if (p == null) throw TypeError();
  if (Object.create) {
    return Object.create(p);
  }
  var t = typeof p;
  if (t !== 'object' && t !== 'function') throw TypeError();
  function f() {}
  f.prototype = p;
  return new f();
}
The above is a simple function that creates a new object whose prototype is the given object p.
Below is a simple snippet in which reading next is supposed to return a value greater than 55, i.e. 56:
var serialNum = {
  $n: 0,
  get next() {
    return this['$n']++;
  },
  set next(n) {
    if (n >= this.$n) {
      this.$n = n;
    } else {
      throw 'serial number can only be set to a larger value';
    }
  }
};
var genSerialNum = inherit(serialNum);
genSerialNum.$n = 55;
console.log(genSerialNum.next);
So when I set $n on the genSerialNum object, it creates an own $n property on genSerialNum, and on reading the getter property next, it increments 55 to 56 (this is what I can see in the Scope variables of the Chrome dev tools), but the console prints 55. Why?
Think about it:
> i = 0
0
> i++
0
> i
1
See JavaScript Increment operator (++).
As others have said, you're seeing the old value returned because you're using the postfix operator.
This confusion is (part of) the reason why Douglas Crockford recommends using the += operator instead of pre/post --/++ in JavaScript the Good Parts.
get next() {
  return this['$n'] += 1;
}
Since you have used this['$n']++ (the postfix operator), the value of $n is incremented after its value is returned to the caller. If you want to print 56, use ++this['$n'] (the prefix operator) instead.
Another way to say what #Speransky is saying is that this problem has nothing to do with getters and setters, but rather the way that the ++ operator works. To put it another way:
var i = 0;
console.log(i); // 0
console.log(i++); // 0
console.log(i); // 1
i++ will return the original value of i and then will increment it.
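Applied back to the serial-number example, a minimal sketch of the prefix-operator fix (setter omitted for brevity; only the getter changes):

var serialNum = {
  $n: 0,
  get next() {
    // Prefix ++ increments first, then returns the new value.
    return ++this.$n;
  }
};

serialNum.$n = 55;
console.log(serialNum.next); // 56
console.log(serialNum.next); // 57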