I am trying to write an if statement that prints "Input is defined" if the input is defined and otherwise prints undefined.
function isItUndefined(param)
{
    console.log(typeof(param))
    if (typeof(param) !== undefined)
    {
        return 'Input is defined'
    }
    return undefined
}
console.log(isItUndefined(5))
console.log(isItUndefined(null))
console.log(isItUndefined(undefined))
The above code, however, gives the following output; the final return statement never executes, even when I expect the condition to be false:
number
Input is defined
object
Input is defined
undefined
Input is defined
typeof returns a string, so your code should be:
if (typeof(param) !== 'undefined') {
// Print
}
MDN actually has a dedicated page on how to detect undefined values and what kind of hiccups you might run into with each method. For example, using typeof here can hide a bug that's hard to catch: if you make a typo in param, typeof silently evaluates to 'undefined' for the misspelled name instead of throwing a ReferenceError, so the condition quietly takes the wrong branch.
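Applied to the function in the question, a fixed version might look like this sketch:

```javascript
function isItUndefined(param) {
  // Compare against the STRING 'undefined', not the value undefined:
  // typeof always produces a string.
  if (typeof param !== 'undefined') {
    return 'Input is defined';
  }
  return undefined;
}

console.log(isItUndefined(5));         // 'Input is defined'
console.log(isItUndefined(null));      // 'Input is defined' (typeof null is 'object')
console.log(isItUndefined(undefined)); // undefined
```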
The typeof operator returns a string, so either compare the input directly with the value undefined, or compare typeof param with the string 'undefined'.
if (typeof param !== 'undefined')
// or
if (param !== undefined)
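The practical difference between the two: typeof is safe even when the identifier was never declared, while a direct comparison against undefined throws a ReferenceError. A small sketch (the variable names are made up for illustration):

```javascript
// `declaredVar` exists but holds undefined; `neverDeclared` does not exist.
let declaredVar;

console.log(typeof declaredVar === 'undefined');   // true
console.log(declaredVar === undefined);            // true

console.log(typeof neverDeclared === 'undefined'); // true — typeof never throws
try {
  neverDeclared === undefined;                     // throws ReferenceError
} catch (e) {
  console.log(e instanceof ReferenceError);        // true
}
```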
You don't need typeof to do that:
function isItUndefined(param) {
    // If the goal is to print the return value, you may prefer to return a
    // string in both cases, in order to have a consistent return type.
    return param === undefined ? "undefined" : "Input is defined";
}
console.log(isItUndefined(5))
console.log(isItUndefined(null))
console.log(isItUndefined(undefined))
console.log(isItUndefined())
Related
I am using this "!!" comparison in order to compare with "undefined". However, is there any other way to do it?
isWidgetTemplatesLoaded: function(widgetName) {
return !!templates[widgetName];
}
Thanks
You could use typeof to check for undefined:
(typeof templates[widgetName] !== 'undefined')
typeof always returns a string. It returns "undefined" if the value of the variable is undefined or the variable does not exist.
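For example, with a hypothetical templates map (the object shape here is made up):

```javascript
const templates = { header: '<div>header</div>' };

console.log(typeof templates.header);  // "string"
console.log(typeof templates.missing); // "undefined" — no error, the key is simply absent
```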
You can use the typeof operator
typeof templates[widgetName];
Alternatively, you can use it like this:
if (templates[widgetName] === undefined) {
// rest of code
}
else {
// rest of code
}
To check for undefined you can just use the no-conversion equal/different operators:
if (x !== undefined) { ... }
if (y === undefined) { ... }
Note, however, that this doesn't have the exact same meaning as !!. The double negation, for example, returns false even for false, null, 0, or "", despite the fact that these values are not undefined.
By the way, in JavaScript you should basically always use === and !== instead of == and !=, unless you really need the implicit conversion that the loose equality operators perform (and almost no one does). Good JavaScript editors will warn you about any use of == or !=, as they are bug-hiding places.
In the specific case of your code, the !! logic seems wrong on a philosophical level, because an empty template could be "", and this function would say the template has not been loaded, while in fact it has been loaded and simply happens to be the empty string.
You could use the in operator if what you want to know is whether the property exists at all, even when its value is undefined:
return widgetName in templates;
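The distinction matters when a key exists but was explicitly set to undefined (the object here is hypothetical):

```javascript
const templates = { widget: undefined };

console.log('widget' in templates);                   // true  — the key exists
console.log(typeof templates.widget !== 'undefined'); // false — its value is undefined
console.log('other' in templates);                    // false — the key doesn't exist
```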
Here is the code:
if(!typeOf(node.parentNode)) return null;
Here is the error:
Uncaught TypeError: Cannot read property 'parentNode' of null
I am trying to test if it is null/undefined/false. But it keeps sending me errors.
How can I test it without getting an error with the if statement?
Test the object reference too:
if (!node || !node.parentNode) return null;
If "node" can really be anything (like, a string or a number in addition to an object reference), you'd also want to test the type.
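A sketch of the guarded version (getParent is a made-up name); in modern JavaScript, optional chaining expresses the same idea more compactly:

```javascript
function getParent(node) {
  // Bail out when node itself is null/undefined, or it has no parentNode.
  if (!node || !node.parentNode) return null;
  return node.parentNode;
}

// Modern equivalent using optional chaining and nullish coalescing (ES2020):
const getParent2 = node => node?.parentNode ?? null;

console.log(getParent(null));  // null — no TypeError
console.log(getParent2(null)); // null
```

Note the optional-chaining form also treats a missing parentNode as null, which matches the guard above.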
As the other answers mentioned, your particular errors come from the fact that your node object is actually null. The most bullet-proof way of testing if node.parentNode exists and is not null is:
if ((typeof node==='undefined') || !node || !node.parentNode) return null;
This covers the following cases:
the node variable doesn't exist
the node variable is null or undefined
parentNode is falsy (undefined, null, false, 0, NaN, or '')
As per Blue Skies' comments, be careful with the first check (typeof node === 'undefined'), because it masks undeclared variables, which may lead to problems later on:
function f() {
    if (typeof node === 'undefined') {
        node = {}; // global variable node, usually not what you want
    }
}
You have to check to see if node is null first.
if(!node || !node.parentNode) {
return null;
}
This is also known as "short-circuit" evaluation: when it sees that !node is true, the right-hand operand is never evaluated, because with an OR (||), as soon as one operand is true the whole expression must be true, so the block executes immediately.
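To see the short-circuit in action (sideEffect is a throwaway helper for illustration):

```javascript
function sideEffect() {
  throw new Error('this never runs');
}

const node = null;
// Because !node is true, the right-hand side is never evaluated:
console.log(!node || sideEffect()); // true
```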
Also, the built-in typeof is an operator (spelled in lowercase), not a function; the typeOf your code calls has to be defined somewhere else (e.g. by a library), or it will throw a ReferenceError.
try {
    if (!typeOf(node.parentNode)) return null;
} catch (err) {
    console.log(err);
}
I know there are two methods to determine if a variable exists and not null(false, empty) in javascript:
1) if ( typeof variableName !== 'undefined' && variableName )
2) if ( window.variableName )
which one is more preferred and why?
A variable is declared if accessing the variable name will not produce a ReferenceError. The expression typeof variableName !== 'undefined' will be false in only one of two cases:
the variable is not declared (i.e., there is no var variableName in scope), or
the variable is declared and its value is undefined (i.e., it was declared but never assigned a value, or was explicitly assigned undefined)
Otherwise, the comparison evaluates to true.
If you really want to test if a variable is declared or not, you'll need to catch any ReferenceError produced by attempts to reference it:
var barIsDeclared = true;
try {
    bar;
} catch (e) {
    if (e.name == "ReferenceError") {
        barIsDeclared = false;
    }
}
If you merely want to test if a declared variable's value is neither undefined nor null, you can simply test for it:
if (variableName !== undefined && variableName !== null) { ... }
Or equivalently, with a non-strict equality check against null:
if (variableName != null) { ... }
Both your second example and the right-hand expression in the && operation test whether the value is "falsy", i.e., whether it coerces to false in a boolean context. Such values include null, false, 0, and the empty string, not all of which you may want to discard.
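A quick comparison of the two filters (the values list is illustrative):

```javascript
const values = [undefined, null, false, 0, '', NaN];

// Loose inequality with null matches only null and undefined:
console.log(values.filter(v => v != null)); // [ false, 0, '', NaN ]

// A truthiness test discards every falsy value:
console.log(values.filter(v => v));         // []
```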
It is important to note that 'undefined' is a perfectly valid value for a variable to hold. If you want to check if the variable exists at all,
if (window.variableName)
is a more complete check, since it is verifying that the variable has actually been defined. However, this is only useful if the variable is guaranteed to be an object! In addition, as others have pointed out, this could also return false if the value of variableName is false, 0, '', or null.
That said, that is usually not enough for our everyday purposes, since we often don't want to have an undefined value. As such, you should first check that the variable is defined using the typeof operator (which, as Adam has pointed out, only yields 'undefined' when the variable truly is undefined or undeclared), and only then assert that its value is truthy:
if ( typeof variableName !== 'undefined' && variableName )
If you want to check if a variable (say v) has been defined and is not null:
if (typeof v !== 'undefined' && v !== null) {
// Do some operation
}
If you want to check for all falsy values such as: undefined, null, '', 0, false:
if (v) {
// Do some operation
}
I'm writing an answer only because I do not have enough reputation to comment on the accepted answer from apsillers.
I agree with his answer, but
If you really want to test if a variable is undeclared, you'll need to
catch the ReferenceError ...
is not the only way. One can do just:
this.hasOwnProperty("bar")
to check if there is a variable bar declared in the current context.
(I'm not sure, but calling hasOwnProperty might also be faster than raising and catching an exception.)
This works only for the current context (not for the whole current scope).
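A sketch of that idea: in a browser script, top-level var declarations become properties of the global object, so the check can be written against globalThis (used here, as an assumption, so the snippet also runs in Node, where a bare module-level var does not attach to the global object):

```javascript
// Simulate a top-level `var bar = 1;` from a browser script:
globalThis.bar = 1;

console.log(globalThis.hasOwnProperty('bar')); // true  — "declared"
console.log(globalThis.hasOwnProperty('baz')); // false — not declared
```

Note the caveat: let and const declarations do not become global-object properties, so this trick won't see them.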
if ( typeof variableName !== 'undefined' && variableName )
//// safe: the typeof check short-circuits when the var doesn't exist at all
if ( window.variableName )
//// could be false even when the var exists, e.g. if var == 0
//// beyond that, it depends on what is stored in that var
// if you expect an object to be stored in that var, maybe
if ( !!window.variableName )
// could be the right way
Best way => see what works for your case.
I found this shorter, but beware: (undefined || null) simply evaluates to null, so varName !== (undefined || null) only tests against null. The short form that actually covers both null and undefined is a loose inequality against null:
if (varName != null) { //do something }
if (variable) can be used if variable is guaranteed to be an object, or if false, 0, etc. are considered "default" values (hence equivalent to undefined or null).
typeof variable == 'undefined' can be used in cases where a specified null has a distinct meaning from an uninitialised variable or property. This check will not throw an error if the variable is not declared.
You can simply do
if(variableName){console.log("Variable exist")}
I'm checking if(response[0].title !== undefined), but I get the error:
Uncaught TypeError: Cannot read property 'title' of undefined.
response[0] is not defined; check whether it is defined first, and only then check its property title.
if(typeof response[0] !== 'undefined' && typeof response[0].title !== 'undefined'){
//Do something
}
Just check if response[0] is undefined:
if(response[0] !== undefined) { ... }
If you still need to explicitly check the title, do so after the initial check:
if(response[0] !== undefined && response[0].title !== undefined){ ... }
I had trouble with all of the other code examples above. In Chrome, this was the condition that worked for me:
typeof possiblyUndefinedVariable !== "undefined"
I will have to test that in other browsers and see how things go I suppose.
Actually, you should surround it with a try/catch block so your code won't stop working.
Like this:
try {
    if (typeof response[0].title !== 'undefined') {
        doSomething();
    }
} catch (e) {
    console.log('response[0].title is undefined');
}
typeof:
var foo;
if (typeof foo == "undefined"){
//do stuff
}
It'll be because response[0] itself is undefined.
Check whether response[0] == null; that will resolve the problem.
Check if your response[0] actually exists; the error seems to suggest it doesn't.
You must first check whether response[0] is undefined, and only if it's not, check for the rest. That means that in your case, response[0] is undefined.
I know I'm getting here 7 months late, but I found this question and it looks interesting. I tried this in my browser console:
try{x,true}catch(e){false}
If variable x is undefined, the error is caught and the expression evaluates to false; if not, it returns true. So you can use the eval function to set the result to a variable:
var isxdefined = eval('try{x,true}catch(e){false}')
In some of these answers there is a fundamental misunderstanding about how to use typeof.
Incorrect
if (typeof myVar === undefined) {
Correct
if (typeof myVar === 'undefined') {
The reason is that typeof returns a string. Therefore, you should be checking that it returned the string "undefined" rather than undefined (not enclosed in quotation marks), which is itself one of JavaScript's primitive types. The typeof operator will never return a value of type undefined.
Addendum
Your code might technically work if you use the incorrect comparison, but probably not for the reason you think. There is no preexisting undefined variable in JavaScript - it's not some sort of magic keyword you can compare things to. You can actually create a variable called undefined and give it any value you like.
let undefined = 42;
And here is an example of how you can use this to prove the first method is incorrect:
https://jsfiddle.net/p6zha5dm/
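A self-contained variant of that demonstration (demo is a made-up name; the shadowing is kept inside a function scope):

```javascript
function demo() {
  var undefined = 42; // legal: `undefined` is not a reserved word
  var myVar;          // myVar really is undefined
  return [
    typeof myVar === undefined,   // false — the string "undefined" !== 42
    typeof myVar === 'undefined', // true  — the correct comparison
  ];
}

console.log(demo()); // [ false, true ]
```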
Using JavaScript, I have a function:
function test(variable)
{
    if(variable != 'undefined')
    {
        console.log('variable is not set');
        console.log('variable', where);
    }
}
I call it using test(); yet in the console I get:
'where is not set'
'where is set as undefined'
Why?
Update
This function is not what I am actually using. The function should not do anything if the variable is undefined; the example was just to show the if statement not working. The issue was actually that I used if (variable != 'undefined') instead of if (variable != undefined).
You have both console.log calls in the same if branch. Do this instead:
function test(variable)
{
    if (variable != 'undefined')
    {
        console.log('where is not set');
    }
    else
    {
        console.log('where is set as ', where);
    }
}
Besides that: If you want to test if a variable is undefined, use the typeof operator to test the type: typeof variable != 'undefined'. Currently you just test if variable is not equal to the string value 'undefined'.
You are testing whether variable has the string content of "undefined".
What you probably want is
if(typeof variable != 'undefined')
The rest of the function is not making sense to me yet, either. Where does where come from?
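The difference between the two comparisons, spelled out (check is a made-up helper name):

```javascript
function check(variable) {
  return [
    variable != 'undefined',        // compares the VALUE to the string 'undefined'
    typeof variable != 'undefined', // compares the TYPE to the string 'undefined'
  ];
}

console.log(check());            // [ true, false ] — undefined is not the string 'undefined'
console.log(check('undefined')); // [ false, true ] — defined, but equal to the string
```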
I don't quite understand your question, but try using a different parameter name instead of "variable" and see if the error is still there.
Also, call the function like this: test(parameterValueHere);
Best