If I set a variable to 0, I get the weird behavior that a comparison to "" (empty) is true. How do I check that the variable is really empty?
tmp = 0;
if ( tmp != "")
{
//do something - This is where the code goes.
}
else
{
//isEmpty - I would expect to be here
}
Use strict comparison operators
=== and !==
With == and != (the abstract comparison operators), if the two operands are not of the same type, JavaScript attempts to convert the operands to an appropriate type for the comparison.
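For example, a quick sketch with the values from the question:

var tmp = 0;

console.log(tmp == "");   // true  - "" is converted to the number 0 before comparing
console.log(tmp === "");  // false - number and string are different types

if (tmp !== "") {
    // reached, because 0 and "" are not strictly equal
}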
If by empty you mean the variable hasn't been defined, use typeof:
if (typeof tmp !== "undefined") {
// it exists!
}
What do you mean by empty variable? If you mean an empty string, then you should use !== to check it.
if (tmp !== "")
JavaScript implicitly converts values to other types. To check the type as well, use the !== operator:
if ( tmp !== "")
In JavaScript, everything except 0, NaN, undefined, false, null and "" is considered truthy; "" is considered falsy.
if (tmp) {
}
The above if will be executed if the variable contains any value other than 0, NaN, undefined, false, null and "".
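A small illustration of how that check behaves with the question's value:

var tmp = 0;
if (tmp) {
    // not reached: 0, NaN, undefined, false, null and "" are all falsy
} else {
    // reached
}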
If tmp is a string then you can use the following code:
if (tmp !== "") {
}
The === and !== operators compare without doing type conversion.
Related
So my uploaded media file (event.target.files[0]) does not equal true or false.
Its typeof is "object".
It's part of some form state and I'd like to check the object whether all fields are not empty "".
I thought a JS object should always === true, but maybe this is different for 'files' objects?
=== checks for strict equality, so the two values must be exactly the same.
An object is truthy, but does not equal true, so what you are really doing is { ... } === true, which is false.
If you want to check if none of the object's values are empty, you can filter for empty values:
const empty = Object.keys(theObject).length === 0 || Object.values(theObject).filter(value => {
    return value.trim() === ''; // assumes every value is a string
}).length > 0;
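For example, with a hypothetical form-state object (note that value.trim() assumes every value is a string; calling it on a File object would throw):

const formState = { name: 'Alice', email: '' }; // hypothetical example object
const empty = Object.keys(formState).length === 0 || Object.values(formState).filter(value => {
    return value.trim() === '';
}).length > 0;
console.log(empty); // true, because email is ''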
=== tests for equal value and equal type (ref). typeof true is "boolean", but a File object is not a boolean, so the comparison will never yield true.
See also https://stackoverflow.com/a/8511350/4640820
To check the type of a value, you can write:
if (typeof <your value> == "<expected type>") {
    ...
}
For example, a statement like this:
if (typeof 42 == "number")
is true.
Reference for most cases of typeof
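As a rough illustration, typeof reports the following for some common values:

typeof 42          // "number"
typeof "hello"     // "string"
typeof true        // "boolean"
typeof undefined   // "undefined"
typeof {}          // "object"
typeof null        // "object" (a long-standing quirk of the language)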
Below is my code, and the if statement always evaluates as false. Shouldn't it be true?
The variables associated with the if statement:
var coloredUI = '';
var coloredText = '';
And here's the if statement:
if (coloredText && coloredUI == '') {
} else {
}
In JavaScript, values can be "truthy" or "falsy". You set both your variables to empty strings, which are "falsy" (a string with no characters coerces to false). Other falsy values are:
undefined, 0, NaN, false, null
An if statement always wants to test a condition for a truthy Boolean result. If you give it an expression, that expression is evaluated, and if the result is not a Boolean, the JavaScript engine will coerce it into one. Falsy values become false and truthy values become true, so:
if(coloredText) {}
Evaluates to:
if(false) {}
because coloredText was initialized to a falsy value (''). And because you used the short-circuiting logical AND, both expressions would have to be truthy for the entire if to be true. But since the first one was coerced to false, the if statement proceeds to the false branch.
To avoid this, you can write an explicit comparison rather than letting the value be coerced on its own, as in:
if(coloredText == '') // true
This concept of implicit type coercion is also why JavaScript provides two mechanisms for equality testing. Take this for example:
var x = 0;
if(x == false)
This will result in true because the double equal sign means equality with conversion. The false is converted to a number (0) and then checked against the number (0), so we get true.
But this:
var x = 0;
if(x === false)
will result in false because the triple equal sign means strict equality, where no conversion takes place and the two values/expressions are compared as-is.
Getting back to your original scenario. We leverage this implicit type coercion often when checking for feature support. For example, older browsers don't have support for Geolocation (they don't implement the object that provides that feature). We can test for support like this:
if(navigator.geolocation)
If the navigator object doesn't have a geolocation property, the expression will evaluate to undefined (falsy) and the if will head into its false branch. But, if the browser does support geolocation, then the expression will evaluate to an object reference (truthy) and we proceed into the true branch.
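A minimal sketch of that feature-detection pattern (the log messages are just placeholders):

if (navigator.geolocation) {
    // truthy: the browser exposes the Geolocation API
    navigator.geolocation.getCurrentPosition(function (pos) {
        console.log(pos.coords.latitude, pos.coords.longitude);
    });
} else {
    // falsy (undefined): no geolocation support
    console.log("Geolocation is not supported in this browser");
}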
An empty string ('') is a falsy value.
The following example tests whether both variables hold truthy values.
var coloredUI = '';
var coloredText = '';
if (coloredText && coloredUI) {
alert('if');
} else {
alert('else');
}
To test that both values are '':
var coloredUI = '';
var coloredText = '';
if (coloredText == '' && coloredUI == '') {
alert('if');
} else {
alert('else');
}
Truthy and Falsy Values
if (coloredText == '' && coloredUI == '') {
} else {
}
if (coloredText == '' && coloredUI == '') {
} else {
}
if ((coloredText==='') && (coloredUI == '')){
} else {
}
Or, if you want to check whether there is a value in coloredText, use this:
if ((coloredText) && (coloredUI == '')){
} else {
}
What is the difference between the following if statements in JavaScript when checking against null?
var a = "";
if( a != null ) // one equals sign
....
if( a !== null ) // two equals signs
....
When comparing to null I can't find any difference.
According to http://www.w3schools.com/js/js_comparisons.asp
!= - not equal
!== - not equal value or not equal type
In JavaScript, null has type "object" (try executing typeof null yourself).
That is, !== also checks that a has the same type as null before comparing the values.
Actually you know that === and !== are meant to check that both left and right side of the equality have the same type without implicit conversions involved. For example:
"0" == 0 // true
"0" === 0 // false
Same reasoning works on null checking.
!= checks for negative equality, while !== checks for negative identity.
For example,
var a = "";
a != false; // returns false since "" == false
a !== false; // returns true since "" is not the same type as false
But if you compare it with null, the result will be true in both cases, since "" is neither equal nor identical to null.
There is no difference between them when comparing to null.
When we use the strict operator (!==) it is obvious, because the operands have different types; with the loose operator (!=) it is an important thing to remember about JavaScript language design that null compares loosely equal only to undefined, so "" != null is still true.
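A short demonstration of that design detail: the only value that == treats as equal to null (besides null itself) is undefined.

null == undefined   // true  - loose equality treats them as interchangeable
null === undefined  // false - different types
"" == null          // false - an empty string is neither null nor undefined
0 == null           // false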
Because of language design there are also some common questions:
How do you check for an empty string in JavaScript?
Is there a standard function to check for null, undefined, or blank variables in JavaScript?
var a = "";
(1) if( a != null ) // one equals sign
Condition (1) above checks the value only, not the data type; this will be true.
....
(2) if( a !== null ) // two equals signs
Condition (2) checks both value and data type; this will be true as well.
To understand it more precisely,
var a = "1";
if( a == 1) {
alert("works and doesn't check data type string");
}
if( a === 1) {
alert('does not work because "1" is a string');
}
if( a === "1") {
alert('works because "1" is a string');
}
There is a difference if the variable has the value undefined:
var a = undefined;
if( a != null ) // doesn't pass
console.log("ok");
if( a !== null ) // passes
console.log("ok");
I got the idea from reading this great post: Why ==null, Not ===null. Also, != is faster.
I have this code:
if (window.content.document.getElementById("error-msg") != null )
{
if (window.content.document.getElementById("error-msg").offsetParent !== null)
{
...
}
}
Can it be written in one if statement?
I tried the following...
if ( (window.content.document.getElementById("error-msg") != null) || (window.content.document.getElementById("error-msg").offsetParent !== null) ) {}
But it didn't work, and it produces an error:
TypeError: window.content.document.getElementById(...) is null
The common idiom is to use the && operator like this
var errorMsg = window.content.document.getElementById("error-msg");
if (errorMsg && errorMsg.offsetParent) {
...
}
Here, JavaScript evaluates errorMsg first and, only if it is truthy, evaluates the errorMsg.offsetParent part. The condition is satisfied only if both expressions in && are truthy.
Note: the truthiness check will fail if the expression being tested is 0, false, etc. (see the list of falsy values here). So, if you specifically want to test that they are not null, write that explicitly, like this:
if (errorMsg !== null && errorMsg.offsetParent !== null) {
...
}
On the other hand, the || operator evaluates its second operand only if the first expression is falsy. In your case, if
(window.content.document.getElementById("error-msg") != null)
is false, it means that getElementById("error-msg") returned null. Since the first expression evaluated as falsy, the || goes on to evaluate the other expression, and it effectively tries to check
null.offsetParent !== null
That is why it fails.
Maybe you want to use &&
if (a != null && b != null) {
// Do something.
}
If Not (oResponse.selectSingleNode("BigGroupType") Is Nothing) Then
End If
I need to convert this to javascript. Is that enough to check null?
This was my lead's answer, please verify this:
if(typeof $(data).find("BigGroupType").text() != "undefined" && $(data).find("BigGroupType").text() != null) {
}
JavaScript has two values which mean "nothing": undefined and null. undefined has a much stronger "nothing" meaning than null because it is the default value of every variable. No variable can be null unless it is set to null, but variables are undefined by default.
var x;
console.log(x === undefined); // => true
var X = { foo: 'bar' };
console.log(X.baz); // => undefined
If you want to check to see if something is undefined, you should use === because == isn't good enough to distinguish it from null.
var x = null;
console.log(x == undefined); // => true
console.log(x === undefined); // => false
However, this can be useful because sometimes you want to know if something is undefined or null, so you can just do if (value == null) to test if it is either.
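A small sketch of that idiom (the helper name is just illustrative):

function isNullOrUndefined(value) {
    return value == null; // true only for null and undefined
}

isNullOrUndefined(null);      // true
isNullOrUndefined(undefined); // true
isNullOrUndefined(0);         // false
isNullOrUndefined("");        // false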
Finally, if you want to test whether a variable even exists in scope, you can use typeof. This can be helpful when testing for built-ins which may not exist in older browsers, such as JSON.
if (typeof JSON == 'undefined') {
// Either no variable named JSON exists, or it exists and
// its value is undefined.
}
You need to check for both null and undefined; this implicitly does so:
if( oResponse.selectSingleNode("BigGroupType") != null ) {
}
It is the equivalent of:
var node = oResponse.selectSingleNode("BigGroupType");
if( node !== null &&
node !== void 0 ) {
}
void 0 is a bulletproof expression for getting undefined.
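You can verify this yourself:

console.log(void 0 === undefined); // true
// Unlike the global identifier undefined, which can be shadowed by a local variable
// (and could even be reassigned in very old engines), void 0 always produces the
// undefined value.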
In JavaScript, the equivalent of Nothing is undefined:
if(oResponse.selectSingleNode("BigGroupType") != undefined){
}
This logic:
If Not (oResponse.selectSingleNode("BigGroupType") Is Nothing)
Can be written like this in JavaScript:
if (typeof oResponse.selectSingleNode("BigGroupType") != 'undefined')
Nothing would correspond to undefined, but comparing directly against undefined is not recommended for several reasons; it's generally safer to use typeof.
However, if the selectSingleNode can return other falsy values such as null, it’s probably OK to just do a simple check if it is truthy:
if (oResponse.selectSingleNode("BigGroupType"))
JavaScript:
(document.getElementById("BigGroupType") == undefined) // Returns true
jQuery:
(typeof $("#BigGroupType").val() === "undefined") // Returns true
Note that in the examples above, undefined is a global value in JavaScript, whereas in the jQuery example "undefined" is just a string.