Uncaught ReferenceError: var is undefined when testing for truthy - javascript

So,
I have always used this type of construction for testing the presence of a variable:
if (foo) {
    doThings();
}
Now, I'm getting an
Uncaught ReferenceError: foo is undefined
Here's a fiddle
It's a fact that the variable was never even declared.
My question is: is this normal behaviour? I've used this construction many times, and I'm sure this is not the first time the variable was not declared; I'm almost certain I never had a problem with this before. It just evaluated to false and never entered the condition.
Any help and clarification is welcome.

If a variable has not been declared then an attempt to reference it will result in a reference error.
If a variable has been declared but not assigned a value then it will implicitly have the value undefined and your code will work as expected.
In your case, this is what happens:
Evaluate the if statement [if ( Expression ) Statement]
This involves evaluating Expression, which returns a reference, as per 10.3.1
Call GetValue on the returned reference
If the reference is not resolvable (its base value is undefined), throw a ReferenceError
Otherwise, coerce the value of the reference to a boolean value
The algorithm for determining the value of a reference traverses the chain of nested lexical environments until it reaches the outermost context. When it reaches that point and still does not find a binding for the provided identifier it returns a reference whose base value is undefined.
When the base value of a reference is undefined that reference is said to be "unresolvable", and any attempt to get the value of an unresolvable reference results (unsurprisingly) in a ReferenceError.
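As a minimal sketch of the two cases described above (bar and baz are placeholder names, not from the question):

// bar was never declared anywhere, so even reading it throws
try {
    if (bar) { /* never reached */ }
} catch (e) {
    console.log(e instanceof ReferenceError); // true
}

// baz is declared but never assigned: it holds undefined, which is falsy
var baz;
if (baz) {
    console.log("truthy");
} else {
    console.log("falsy, but no error"); // this branch runs
}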

Check the updated fiddle. If you haven't declared a variable, then in the condition you will have to check its type.
var a = 1;
var b;
try {
    // c was never declared, but typeof never throws,
    // so this check is safe and simply evaluates to false
    if (typeof c !== 'undefined') {
        alert("OK");
    }
} catch (ex) {
    alert(ex); // never reached: no ReferenceError occurs
}
fiddle

var is a reserved keyword in JavaScript.
The following is the corresponding error:
Uncaught SyntaxError: Unexpected token var

Related

javaScript closures. Undefined value as input stops script. Sometimes

Playing with JavaScript closures, I ended up with this question that I cannot explain.
(function() {
    console.log("Inside closure"); // does not work
}(foo));
It does not work because foo is not defined: a ReferenceError is thrown.
But if I first set var foo = undefined; it works (tested on Chrome and Firefox).
Is being undefined not the same as being set to undefined?
example in jsfiddle
In JavaScript, 'undefined' is one of the primitive types; it is both the value and the reported type of a variable that has not been assigned anything.
var a = undefined;
var b;
console.log(typeof a); // "undefined"
console.log(typeof b); // "undefined"
The output will be "undefined" in both cases.
If you don't declare a variable at all and try to access it, the following error is thrown. Here I tried to access c, which is not declared at all.
Uncaught ReferenceError: c is not defined(…)
Having a variable whose value is undefined is not the same as having no variable with that name.
In ECMAScript, identifiers are resolved according to Identifier Resolution, which returns a reference given by GetIdentifierReference.
In case the variable doesn't exist, the base value of that reference will be undefined, otherwise it will be an environment record.
When GetValue gets the value of that reference, it will check IsUnresolvableReference. If the base value is undefined, IsUnresolvableReference returns true, and GetValue will throw a ReferenceError exception.
If the variable exists (even if its value is undefined), GetValue will return its value.
If you are not sure whether some variable exists, you can use the typeof operator, which returns "undefined" for unresolvable references instead of throwing.
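For instance, a small sketch of that guard (assuming foo may or may not have been declared):

// typeof never throws, even when the identifier was never declared
if (typeof foo === "undefined") {
    console.log("foo is undeclared, or declared but holding undefined");
} else {
    console.log("foo exists and holds a value other than undefined");
}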
If the question is:
But what if somebody uses the common jQuery wrapper and the jQuery script has not loaded yet?
Then you will still get a ReferenceError if the jQuery identifier is not in the global scope, because the engine tries to resolve a reference to jQuery and no such binding exists.
The undefined identifier is passed into the closure so that, if undefined has been assigned a value somewhere else, it will not mess with your code. For example:
var undefined = 123;
if (undefined) {
    console.log("pass");
}
This code will log "pass". If you declare undefined as a parameter but leave out the corresponding argument when you invoke the function, undefined will have no value assigned to it; it won't be looking up an outer reference because you never supplied the argument. For example:
(function (undefined) {
    // undefined will be undefined here
}());
I've done a quick example here for you : https://jsfiddle.net/gn7h3641/
TL;DR: if you're asking JavaScript for the value of a variable that doesn't exist (it was never declared with var), it will throw, because there is no reference from which to find a value. That is different from checking whether something is undefined with if (test === undefined).
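To round that off, a sketch of the wrapper pattern this answer alludes to (the parameter names are just the conventional ones, nothing more):

// Only one argument is passed, so the `undefined` parameter is never assigned
// and is guaranteed to hold the real undefined value inside the function,
// even if some other script has shadowed or reassigned undefined elsewhere.
(function (window, undefined) {
    var notSet;
    console.log(notSet === undefined); // true
}(window));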

Javascript undefined, truthy vs typeof operator

Considering the following code:
// the variable Test has NOT been declared yet
console.log(typeof Test); // "undefined"
console.log(Test); // "Uncaught ReferenceError: Test is not defined"
Why does the second console.log statement throw a ReferenceError while the first displays "undefined"?
Because Test is not defined.
In that first console.log you're asking the system to tell you the type of a variable. So it looks through the current scope chain to find a reference to that variable so it may infer its type.
When it doesn't find the variable, it falls back to the undefined primitive, which has a type (as I'm sure you've guessed) of "undefined".
The second time, you're asking it to print out the value of an undeclared variable. Since the variable is not defined (there is no reference to it in the current scope chain), this is an error: you're trying to access data that doesn't exist, not just infer its type.
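A compact way to see both behaviours in one run (the try/catch only keeps the snippet from stopping; Test is assumed not to exist beforehand):

console.log(typeof Test); // "undefined" - typeof never throws

try {
    console.log(Test); // reading an undeclared name
} catch (e) {
    console.log(e.name); // "ReferenceError"
}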

Assigning existing or empty object

This might be a duplicate but I did not know how to search for it.
Why does
var test = test || {};
work, but
var test = testing || {};
throws an error? At the point of definition both test and testing are undefined.
Edit
The error thrown is "Reference Error: testing is not defined"
There's a difference between a variable being undefined in the sense of not existing and a variable holding the value undefined.
In your first example you declare test with var such that when the expression on the right of the = is evaluated the variable test exists but has the value undefined.
In your second example you haven't defined testing at all, hence the error.
EDIT: It occurs to me that perhaps further explanation wouldn't hurt.
To simplify, the JavaScript engine takes two passes through the code. The first pass is the parse/compile phase, and it is then that variable declarations but not assignments happen. The second pass is the actual execution, and it is then that assignments occur. This results in an effect often called "variable hoisting" - it is as if the JS engine "hoists" the declarations to the top of their scope but still does the assignments in place.
So as to the point of code like this:
var test = test || {};
...it is basically saying "Does test already exist with a truthy value? If so use it, otherwise set it to a new empty object."
The JS engine doesn't mind if the same variable is declared more than once in the same scope - it basically ignores the second declaration, but doesn't ignore any assignment included with the second declaration. So if test is declared in some other script block, perhaps in a separate JS include file, then the line in question just assigns test to itself (assuming it has a truthy value, where all objects are truthy). But if it hasn't been declared elsewhere it will still exist as a result of the current var statement but it will be undefined before the current assignment so then the || operator returns the right-hand operand, the {}.
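In other words, the engine treats the first example roughly like the sketch below (an illustration of the hoisting described above, not a literal transformation it performs):

var test;          // declaration hoisted to the top of its scope

// ... any earlier code runs here ...

test = test || {}; // test already exists at this point; if nothing gave it
                   // a truthy value, it is undefined and || yields the {}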
It's due to the var keyword.
Because the variable is declared it also exists albeit with the value undefined. What the || then does is to check for truthiness and when it finds that the object is undefined it gives you a new object to work with.
The latter does the exact same thing, but since testing was never declared anywhere, it isn't defined when the right-hand side is evaluated, and thus it throws the exception.
var test = test || {};
Here, test has been declared but not defined
but in,
var test = testing || {};
There's no declaration of testing whatsoever, yet you are still trying to use its value.
A corresponding code for the first case would be,
var testing; // See testing is still undefined
var test = testing || {};

why is window.foo undefined whereas a call to foo throws an error?

In my understanding, defining a variable without the var keyword amounts to adding that variable to the window object. On the other hand, trying to access a member of an object that isn't defined yet evaluates to undefined. So I can do things like this:
> foo = "bar";
"bar"
> window.foo;
"bar"
> window.bar;
undefined
Why am I not able to get an undefined variable's value (undefined) when accessing it directly?
> bar;
ReferenceError: bar is not defined
There is another thing that I don't quite get, which I think could be related. When I type some literals into the console, they always evaluate to themselves: 1 evaluates to 1, [1] to [1], and so on. I always thought of a function as also being a literal because it has some value-like qualities (being a first-class citizen). But when I try to evaluate an anonymous function, I get a syntax error.
> function() {}
SyntaxError: Unexpected token (
I know that I can define a named function, but that evaluates to undefined (it defines the function somewhere rather than being the function itself). So why aren't functions literals?
Thanks.
For the first part of your question, see ReferenceError and the global object. Basically, explicitly referencing a non-existent property of an object will return undefined because there may be cases where you would want to handle that and recover. Referencing a variable that doesn't exist should never happen, though, so it will fail loudly.
For the second part of your question, you are trying to declare a function without a name, which isn't possible. There's a subtle difference between a function declaration and a function expression. Function expressions, for which the function name is optional, can only appear as a part of an expression, not a statement. So these are legal:
var foo = function () { };
(function () { });
But not this:
function () { };
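If it helps, here is a small sketch showing that a function used as an expression really does have a value (this is only an illustration, not part of the original answer):

// Wrapped in parentheses, the function is parsed as an expression,
// so it produces a value that can be stored, called or inspected.
var result = (function () { return "I am a value"; })();
console.log(result);                // "I am a value"
console.log(typeof function () {}); // "function"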
If you just access 'bar', the scope is unclear. The variable is first sought in the local scope (if you're inside a function). If it's not found there, the window object is checked. But any error you get logically refers to the 'bar' that doesn't exist in the local scope.
What would you expect to be displayed if you want to show a function like that? A function has no value and its declaration certainly hasn't. You could expect the console to be able to execute a function and return the result, but that's also dangerous. Not only can you have a function that doesn't return a value, but also, functions can contain code that modify their environment, in other words, running a function in the console could modify the current state of the document and the current Javascript state in it.
Javascript has been coded such that if you try to read a property of an object, you get undefined.
But, if you try to read the value of a variable that doesn't exist without referencing it as a property of the global object, it throws an error.
One could explain this choice a number of ways, but you'd have to interview one of the original designers to find out exactly why they chose it. The good news is that once you understand the behavior, you can code it one way or the other depending upon which behavior you want. If you know a global variable might not be defined, then you can preface it with window.xxx and check for undefined.
In any case, if you want to test if a global variable exists, you can do so like this:
if (typeof bar !== "undefined")
or
if (window.bar !== undefined)
Also, be very careful about assuming the console is exactly the same as a real javascript execution because it's not quite the same in a number of ways. If you're testing a borderline behavior, it's best to test it in the actual javascript execution context (I find jsFiddle very useful for that).

Why does an undefined variable in Javascript sometimes evaluate to false and sometimes throw an uncaught ReferenceError?

Everything I've ever read indicates that in JavaScript, the boolean value of an undefined variable is false. I've used code like this hundreds of times:
if (!elem) {
...
}
with the intent that if "elem" is undefined, the code in the block will execute. It usually works, but on occasion the browser will throw an error complaining about the undefined reference. This seems so basic, but I can't find the answer.
Is it that there's a difference between a variable that has not been defined and one that has been defined but which has a value of undefined? That seems completely unintuitive.
What is a ReferenceError?
As defined by ECMAScript 5, a ReferenceError indicates that an invalid reference has been detected. That doesn't say much by itself, so let's dig a little deeper.
Leaving aside strict mode, a ReferenceError occurs when the scripting engine is instructed to get the value of a reference that it cannot resolve the base value for:
A Reference is a resolved name binding. A Reference consists of three components, the base value, the referenced name and the Boolean valued strict reference flag. The base value is either undefined, an Object, a Boolean, a String, a Number, or an environment record (10.2.1). A base value of undefined indicates that the reference could not be resolved to a binding. The referenced name is a String.
When we are referencing a property, the base value is the object whose property we are referencing. When we are referencing a variable, the base value is unique for each execution context and it's called an environment record. When we reference something that is neither a property of the base object value nor a variable of the base environment record value, a ReferenceError occurs.
Consider what happens when you type foo in the console when no such variable exists: you get a ReferenceError because the base value is not resolvable. However, if you do var foo; foo.bar then you get a TypeError instead of a ReferenceError -- a subtle perhaps but very significant difference. This is because the base value was successfully resolved; however, it was of type undefined, and undefined does not have a property bar.
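A sketch of those two failure modes side by side (the names are placeholders and are assumed not to exist before this snippet runs):

try {
    missing;             // no binding anywhere: the reference is unresolvable
} catch (e) {
    console.log(e.name); // "ReferenceError"
}

try {
    var resolved;        // the name resolves, but its value is undefined...
    resolved.bar;        // ...and undefined has no properties
} catch (e) {
    console.log(e.name); // "TypeError"
}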
Guarding against ReferenceError
From the above it follows that to catch a ReferenceError before it occurs you have to make sure that the base value is resolvable. So if you want to check if foo is resolvable, do
if(this.foo) //...
In the global context, this equals the window object so doing if (window.foo) is equivalent. In other execution contexts it does not make as much sense to use such a check because by definition it's an execution context your own code has created -- so you should be aware of which variables exist and which do not.
Checking for undefined works for variables that have no value associated with them, but if the variable itself hasn't been declared you can run into these reference issues.
if (typeof elem === "undefined")
This is a far better check, as it doesn't run the risk of the reference issue: typeof isn't a method but an operator built into JavaScript, so it never throws for undeclared names.
Is it that there's a difference between a variable that has not been defined and one that has been defined but which has a value of undefined?
Yes. An undeclared variable will throw a ReferenceError when used in an expression, which is what you're seeing.
if (x) { // error, x is undeclared
}
Compared to:
var y;
alert(y === undefined); // true
if (y) { // false, but no error
}
That seems completely unintuitive.
Meh... what I find unintuitive:
if (y) // error, y is undeclared
var x = {};
if (x.someUndeclaredAttribute) // no error... someUndeclaredAttribute is implicitly undefined.
Here's an example of Jon's explanation regarding environment records:
var bar = function bar() {
        if (!b) { // throws a ReferenceError: no b anywhere in this scope chain
        }
    },
    foo = function foo() {
        var a = false;
        if (a) {
            var b = true; // the declaration of b is hoisted to the top of foo
        }
        if (!b) { // no ReferenceError: b exists (hoisted) with the value undefined
        }
    };
In strict mode, it is an error:
function a() {
    "use strict";
    if (!banana) alert("no banana"); // throws error
}
It's always really been better to make an explicit test for globals:
if (!window['banana']) alert("no banana");
It doesn't make sense to perform such a test for non-global variables. (That is, testing to see whether the variable is defined that way; it's fine to test to see whether a defined variable has a truthy value.)
edit I'll soften that to say that it rarely makes sense to thusly test for the existence of non-globals.
When a variable is declared but not initialized, its value is "undefined". The browser complains exactly when this undefined variable is referenced; here "referenced" means some piece of JavaScript code tries to access an attribute or method of it. For example, if "elem" is undefined, this will throw an exception:
elem.id = "BadElem";
Or you can use try/catch:
try { x } catch (err) {}
This way you're not doing anything in case of an error but at least your script won't jump off the cliff...
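A slightly fuller sketch of that try/catch guard (x is just a placeholder name and is assumed not to exist):

var exists = true;
try {
    x;              // throws a ReferenceError if x was never declared
} catch (err) {
    exists = false; // swallow the error and record the result instead
}
console.log(exists ? "x exists" : "x was never declared");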
undefined = variable exists but has no value in it
ReferenceError = variable does not exist
