Assigning a non-existent variable in JavaScript

I've found a line of JavaScript code that I don't clearly understand. Let's say the window object didn't have an app property set before:
var app = window.app || {};
The question is: why does JS not throw a ReferenceError for the non-existent app property of the window object and instead end up creating window.app as {}? If I execute the following line:
var a = b || {}
I get ReferenceError: b is not defined, and I'm OK with that. I also understand the xxx || {} expression: it returns the first operand if it's truthy, and the second one otherwise. It's useful for initializing something undefined with a blank object (e.g. default function parameters in JS).

You only get reference errors when trying to use undeclared variables, never for undefined properties.
|| returns the left-hand side if the left-hand side is a truthy value; otherwise it returns the right-hand side.
window.app is undefined, which is a falsy value. The expression therefore returns the right-hand side: {}.
The result of evaluating the || expression is then assigned to app.
If app is a global variable (i.e. isn't declared inside a function), then the property app of the window object will be created. This is not a consequence of using window.app in the test. It is just how global variables work.
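A minimal sketch of the difference, assuming it runs at global scope in a browser:
console.log(window.app);    // undefined: a missing property is not an error
var app = window.app || {}; // app is now {}, and window.app refers to it too
try {
  var a = b || {};          // b was never declared anywhere
} catch (e) {
  console.log(e.name);      // "ReferenceError"
}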

Related

JavaScript closures: undefined value as input stops script. Sometimes

Playing with JavaScript closures, I ended up with this question that I cannot manage to explain.
(function() {
  console.log("Inside closure"); // does not work
}(foo));
It does not work because foo is not defined: ReferenceError.
But if prior I set var foo = undefined; it works (tested on Chrome and ff).
Is being undefined not the same as being set to undefined?
In JavaScript, undefined is one of the primitive types. A variable that has been declared but never assigned a value holds the value undefined.
var a = undefined;
var b;
console.log(typeof a);
console.log(typeof b);
The output will be "undefined" in both cases.
If you don't declare a variable at all and try to access it, the following error is thrown. Here I tried to access c, which is not declared at all:
Uncaught ReferenceError: c is not defined(…)
Having a variable whose value is undefined is not the same as having no variable with that name.
In ECMAScript, identifiers are resolved according to Identifier Resolution, which returns a reference given by GetIdentifierReference.
In case the variable doesn't exist, the base value of that reference will be undefined, otherwise it will be an environment record.
When GetValue gets the value of that reference, it checks IsUnresolvableReference. If the base value is undefined, IsUnresolvableReference returns true, and GetValue will throw a ReferenceError exception.
If the variable exists (even if its value is undefined), GetValue will return its value.
If you are not sure whether some variable exists, you can use the typeof operator, which returns "undefined" for unresolvable references instead of throwing.
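For example (neverDeclared is a hypothetical name that is declared nowhere):
console.log(typeof neverDeclared); // "undefined", no ReferenceError
if (typeof neverDeclared !== "undefined") {
  // only reached when the variable exists and holds a non-undefined value
}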
If the question is:
But what if somebody uses the common jQuery wrapper and the jQuery script hasn't loaded yet?
Then you will still get a reference error, because the jQuery identifier does not exist in the global scope and the lookup cannot resolve it.
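A hedged sketch of guarding against that case, assuming jQuery may or may not have loaded:
if (typeof jQuery !== "undefined") {
  jQuery(function ($) {
    // safe to use $ here
  });
} else {
  console.warn("jQuery has not loaded yet");
}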
An undefined parameter is declared in the closure so that if the identifier undefined has been assigned a value elsewhere, it will not mess with your code. For example:
var undefined = 123;
if (undefined) {
  console.log("pass");
}
Inside a function scope this code will log "pass", because the local undefined shadows the real one (in modern engines the global undefined is read-only, so reassigning it at the top level silently fails). If you declare an undefined parameter but leave out the argument when you invoke the function, undefined will genuinely have no value assigned to it, since you never supplied one. For example:
(function (undefined) {
  // undefined will be undefined here
}());
I've done a quick example here for you : https://jsfiddle.net/gn7h3641/
TL;DR: if you're asking JavaScript for the value of a variable that doesn't exist (was never declared with var), it will blow up, because it has no reference through which to find the value. That is different from checking whether an existing variable is undefined with if (test === undefined).

Assigning existing or empty object

This might be a duplicate but I did not know how to search for it.
Why does
var test = test || {};
work, but
var test = testing || {};
throws an error? At the point of definition both test and testing are undefined.
Edit
The error thrown is "Reference Error: testing is not defined"
There's a difference between a variable being undefined in the sense of not existing and a variable holding the value undefined.
In your first example you declare test with var such that when the expression on the right of the = is evaluated the variable test exists but has the value undefined.
In your second example you haven't defined testing at all, hence the error.
EDIT: It occurs to me that perhaps further explanation wouldn't hurt.
To simplify, the JavaScript engine takes two passes through the code. The first pass is the parse/compile phase, and it is then that variable declarations but not assignments happen. The second pass is the actual execution, and it is then that assignments occur. This results in an effect often called "variable hoisting" - it is as if the JS engine "hoists" the declarations to the top of their scope but still does the assignments in place.
So as to the point of code like this:
var test = test || {};
...it is basically saying "Does test already exist with a truthy value? If so use it, otherwise set it to a new empty object."
The JS engine doesn't mind if the same variable is declared more than once in the same scope - it basically ignores the second declaration, but doesn't ignore any assignment included with the second declaration. So if test is declared in some other script block, perhaps in a separate JS include file, then the line in question just assigns test to itself (assuming it has a truthy value, where all objects are truthy). But if it hasn't been declared elsewhere it will still exist as a result of the current var statement but it will be undefined before the current assignment so then the || operator returns the right-hand operand, the {}.
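In other words, the engine effectively treats that line as this two-step sequence:
var test;          // declaration hoisted; test exists with the value undefined
test = test || {}; // assignment runs in place; undefined is falsy, so test becomes {}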
It's due to the var keyword.
Because the variable is declared, it exists, albeit with the value undefined. What the || then does is check for truthiness, and when it finds that the value is undefined (falsy), it gives you a new object to work with.
The second snippet throws because testing was never declared anywhere, so evaluating it fails before the || can do its work.
var test = test || {};
Here, test has been declared, just not assigned a value yet;
but in,
var test = testing || {};
there's no declaration of testing whatsoever, yet you are still trying to read its value.
Corresponding code for the first case would be:
var testing; // See testing is still undefined
var test = testing || {};

When do I initialize variables in JavaScript with null or not at all?

I have seen different code examples with variables declared and set to undefined and null. Such as:
var a; // undefined - unintentional value, object of type 'undefined'
var b = null; // null - deliberate non-value, object of type 'object'
If the code to follow these declarations assigns a value to a or to b, what is the reason for using one type of declaration over another?
The first example doesn't assign anything to the variable, so it implicitly refers to the undefined value (see 10.5 in the spec for the details). It is commonly used when declaring variables for later use. There is no need to explicitly assign anything to them before necessary.
The second example is explicitly assigned null (which is actually of the Null type but, due to a quirk of the JavaScript specification, claims to have type "object"). It is commonly used to clear a value already stored in an existing variable. It can be seen as more robust to use null when clearing a value, since it is possible to overwrite undefined, and in that situation assigning undefined would result in unexpected behaviour.
As an aside, that quirk of null is a good reason to use a more robust form of type checking:
Object.prototype.toString.call(null); // Returns "[object Null]"
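A quick illustration of clearing a value with null and the typeof quirk (user is a hypothetical variable):
var user = { name: "Ada" };
user = null;                // deliberately clear the stored value
console.log(user === null); // true
console.log(typeof user);   // "object": the typeof null quirk
console.log(Object.prototype.toString.call(user)); // "[object Null]"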
I just saw this link which clarified my question.
What reason is there to use null instead of undefined in JavaScript?
Professional JavaScript for Web Developers states: "When defining a variable that is meant to later hold an object, it is advisable to initialize the variable to null as opposed to anything else. That way, you can explicitly check for the value null to determine if the variable has been filled with an object reference at a later time."
I don't think there's one particular better way of doing things, but I'm inclined to avoid undefined as much as possible. Maybe it's due to a strong OOP background.
When I try to mimic OOP with Javascript, I would generally declare and initialize explicitly my variables to null (as do OOP languages when you declare an instance variable without explicitly initializing it).
If I'm not going to initialize them, why even declare them in the first place? When you debug, if you haven't set a value for a variable you watch, you will see it as undefined whether you declared it or not...
I prefer keeping undefined for specific behaviours:
when you call a method, any argument you don't provide would have an undefined value. Good way of having an optional argument, implementing a specific behaviour if the argument is not defined.
lazy init... in my book undefined means "not initialized: go fetch the value", and null means "initialized but the value was null (eg maybe there was a server error?)"
arrays: myArray[myKey] === null if there is a null value for this key, undefined if a value has never been set for this key.
Be careful though: myVar == null and myVar == undefined are equivalent tests, both true when myVar is either null or undefined and false otherwise. Use === if you want to know specifically whether a variable is undefined.
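A small demonstration:
var v = null;
console.log(v == undefined);  // true: == treats null and undefined as equal
console.log(v === undefined); // false
console.log(v === null);      // true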
I would explicitly choose null, so as not to create an object whose identity is later overwritten anyway:
var foo = {};    // empty object, has an identity etc.
var foo2 = null; // good practice: filled with an object reference at a later time
var foo3;        // undefined; I would avoid it
console.log(typeof foo);  // object
console.log(typeof foo2); // object
console.log(typeof foo3); // undefined
Setting null also aids readability by identifying the intended datatype (object).
Source (Professional JavaScript for Web Developers 3rd Edition):
When defining a variable that is meant to later hold an object, it is
advisable to initialize the variable to null as opposed to anything
else. That way, you can explicitly check for the value null to
determine if the variable has been filled with an object reference at
a later time, such as in this example:
if (foo2 != null) {
  // do something with foo2
}
The value undefined is a derivative of null, so ECMA-262 defines them to be superficially equal as follows:
console.log(null == undefined); // prints true
console.log(null === undefined);// prints false
typeof reports null as an "object". When something is undefined it is nothing: not an object, not a string, because it hasn't been defined yet. It is simply a name with no value attached.
example:
You declare a variable with var but never set it.
var foo;
alert(foo); //undefined.
You attempt to access a property on an object you've never set.
var foo = {};
alert(foo.bar); //undefined
You attempt to access an argument that was never provided.
function myFunction(foo) {
  alert(foo); // undefined
}

why is window.foo undefined whereas a call to foo throws an error?

In my understanding, defining a variable without the var keyword amounts to adding that variable to the window object. On the other hand, accessing a member of an object that isn't yet defined evaluates to undefined. So I can do things like this:
> foo = "bar";
"bar"
> window.foo;
"bar"
> window.bar;
undefined
Why am I not able to get an undefined variable's value (undefined) when accessing it directly?
> bar;
ReferenceError: bar is not defined
There is another thing that I don't quite get, which I think could be related. When I type literals into the console, they always evaluate to themselves: 1 evaluates to 1, [1] to [1], and so on. I always thought of a function as also being a literal, because it has some value-like qualities (being a first-class citizen). But when I try to evaluate an anonymous function, I get a syntax error.
> function() {}
SyntaxError: Unexpected token (
I know that I can define a named function, but that evaluates to undefined (it defines the function somewhere rather than being it itself). So why aren't functions literals?
thanks
For the first part of your question, see ReferenceError and the global object. Basically, explicitly referencing a non-existent property of an object will return undefined because there may be cases where you would want to handle that and recover. Referencing a variable that doesn't exist should never happen, though, so it will fail loudly.
For the second part of your question, you are trying to declare a function without a name, which isn't possible. There's a subtle difference between a function declaration and a function expression. Function expressions, for which the function name is optional, can only appear as a part of an expression, not a statement. So these are legal:
var foo = function () { };
(function () { });
But not this:
function () { };
If you just access bar, the scope is unclear. The variable is first sought in the local scope (if you're inside a function). If it's not found there, the window object is checked. But any error you get logically refers to the bar that doesn't exist in the local scope.
What would you expect to be displayed if you wanted to show a function like that? A function has no value, and its declaration certainly doesn't. You could expect the console to execute the function and return the result, but that's also dangerous: not only can a function fail to return a value, but functions can contain code that modifies their environment. In other words, running a function in the console could modify the current state of the document and of the JavaScript in it.
Javascript has been coded such that if you try to read a property of an object, you get undefined.
But, if you try to read the value of a variable that doesn't exist without referencing it as a property of the global object, it throws an error.
One could explain this choice a number of ways, but you'd have to interview one of the original designers to find out exactly why they chose it. The good news is that once you understand the behavior, you can code it one way or the other depending upon which behavior you want. If you know a global variable might not be defined, then you can preface it with window.xxx and check for undefined.
In any case, if you want to test if a global variable exists, you can do so like this:
if (typeof bar !== "undefined")
or
if (window.bar !== undefined)
Also, be very careful about assuming the console is exactly the same as a real JavaScript execution context, because it's not quite the same in a number of ways. If you're testing borderline behaviour, it's best to test it in an actual JavaScript execution context (I find jsFiddle very useful for that).

Why does an undefined variable in Javascript sometimes evaluate to false and sometimes throw an uncaught ReferenceError?

Everything I've ever read indicates that in JavaScript, the boolean value of an undefined variable is false. I've used code like this hundreds of times:
if (!elem) {
...
}
with the intent that if "elem" is undefined, the code in the block will execute. It usually works, but on occasion the browser will throw an error complaining about the undefined reference. This seems so basic, but I can't find the answer.
Is it that there's a difference between a variable that has not been defined and one that has been defined but which has a value of undefined? That seems completely unintuitive.
What is a ReferenceError?
As defined by ECMAScript 5, a ReferenceError indicates that an invalid reference has been detected. That doesn't say much by itself, so let's dig a little deeper.
Leaving aside strict mode, a ReferenceError occurs when the scripting engine is instructed to get the value of a reference that it cannot resolve the base value for:
A Reference is a resolved name binding. A Reference consists of three
components, the base value, the referenced name and the Boolean valued
strict reference flag. The base value is either undefined, an Object,
a Boolean, a String, a Number, or an environment record (10.2.1). A
base value of undefined indicates that the reference could not be
resolved to a binding. The referenced name is a String.
When we are referencing a property, the base value is the object whose property we are referencing. When we are referencing a variable, the base value is unique for each execution context and it's called an environment record. When we reference something that is neither a property of the base object value nor a variable of the base environment record value, a ReferenceError occurs.
Consider what happens when you type foo in the console when no such variable exists: you get a ReferenceError because the base value is not resolvable. However, if you do var foo; foo.bar then you get a TypeError instead of a ReferenceError -- a subtle perhaps but very significant difference. This is because the base value was successfully resolved; however, it was of type undefined, and undefined does not have a property bar.
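A small sketch of the distinction (missing is a hypothetical undeclared name):
try {
  missing; // never declared: the base value is unresolvable
} catch (e) {
  console.log(e.name); // "ReferenceError"
}
var foo;
try {
  foo.bar; // foo resolves, but its value is undefined
} catch (e) {
  console.log(e.name); // "TypeError"
}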
Guarding against ReferenceError
From the above it follows that to catch a ReferenceError before it occurs you have to make sure that the base value is resolvable. So if you want to check if foo is resolvable, do
if(this.foo) //...
In the global context, this equals the window object so doing if (window.foo) is equivalent. In other execution contexts it does not make as much sense to use such a check because by definition it's an execution context your own code has created -- so you should be aware of which variables exist and which do not.
Checking for undefined works for variables that have no value associated but if the variable itself hasn't been declared you can run into these reference issues.
if (typeof elem === "undefined")
This is a far better check, as it doesn't run the risk of the reference issue: typeof isn't a method but an operator built into JavaScript.
Is it that there's a difference between a variable that has not been defined and one that has been defined but which has a value of undefined?
Yes. An undeclared variable will throw a ReferenceError when used in an expression, which is what you're seeing.
if (x) { // error, x is undeclared
}
Compared to:
var y; // y === undefined is true
if (y) { // false, but no error
}
That seems completely unintuitive.
Meh... what I find unintuitive:
if (y) // error, y is undeclared
var x = {};
if (x.someUndeclaredAttribute) // no error... someUndeclaredAttribute is implicitly undefined.
Here's an example of Jon's explanation regarding environment records:
var bar = function bar() {
    if (!b) { // will throw a reference error.
    }
  },
  foo = function foo() {
    var a = false;
    if (a) {
      var b = true;
    }
    if (!b) { // will not throw a reference error.
    }
  };
Referencing an undeclared variable is an error, in strict mode or not:
function a() {
  "use strict";
  if (!banana) alert("no banana"); // throws ReferenceError
}
It's always really been better to make an explicit test for globals:
if (!window['banana']) alert("no banana");
It doesn't make sense to perform such a test for non-global variables. (That is, testing to see whether the variable is defined that way; it's fine to test to see whether a defined variable has a truthy value.)
Edit: I'll soften that to say that it rarely makes sense to test for the existence of non-globals this way.
When a variable is declared but not initialized, its value is undefined. The browser complains when this undefined value is dereferenced, that is, when some piece of JavaScript code tries to access an attribute or method of it. For example, if elem is undefined, this will throw an exception:
elem.id = "BadElem";
or you use try/catch:
try { x } catch (err) {}
This way you're not doing anything in case of an error but at least your script won't jump off the cliff...
undefined = variable exists but has no value in it
ReferenceError = variable does not exist
