JS object key sequence - javascript

Does JavaScript guarantee that the sequence of keys of an object is preserved even if a new value is assigned to a key?
For example, if I have the following object:
var obj = {
  keyX: "value1",
  keyB: "value2",
  keyZ: "value3"
};
If I iterate through the keys using for...in, I get the original sequence, i.e. keyX, keyB, keyZ, and if I change the value of keyB, I still get the same sequence during iteration.
My question is: will the sequence always remain the same, or might it change in some case?

Well, it's stated quite clearly in the MDN docs:
A for...in loop iterates over the properties of an object in an
arbitrary order.
And this section of the documentation gives a more comprehensive explanation:
Although ECMAScript makes iteration order of objects
implementation-dependent, it may appear that all major browsers
support an iteration order based on the earliest added property coming
first (at least for properties not on the prototype). However, in the
case of Internet Explorer, when one uses delete on a property, some
confusing behavior results, preventing other browsers from using
simple objects like object literals as ordered associative arrays.
In Explorer, while the property value is indeed set to undefined, if
one later adds back a property with the same name, the property will
be iterated in its old position--not at the end of the iteration
sequence as one might expect after having deleted the property and
then added it back.
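If the order genuinely matters to your code, a common workaround (sketched below with placeholder values) is to state the key order explicitly in an array, or to use a Map, rather than relying on for...in:
var data = { keyX: "value1", keyB: "value2", keyZ: "value3" };
var order = ["keyX", "keyB", "keyZ"];     // the sequence you want, stated explicitly

order.forEach(function (k) {
  console.log(k, data[k]);                // always keyX, keyB, keyZ
});

// A Map also iterates its entries in insertion order:
var m = new Map([["keyX", "value1"], ["keyB", "value2"], ["keyZ", "value3"]]);
m.forEach(function (value, key) { console.log(key, value); });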

Related

How do arrays store empty values - JavaScript

Not too long ago, I discovered that arrays in JavaScript need not contain a contiguous set of keys (0 to x) to store values, and that some numeric keys may simply not be defined (0-4 ... 6-x, where 5 is not defined).
And this creates semantically two types of arrays that are similar:
arrayA = [, ,] (partially-empty arrays or sparse arrays)
arrayB = [undefined, undefined] (filled arrays)
But recently, I was tinkering with JavaScript in the Google Chrome Developer Console and came across this:
Now the second array is like arrayA, and the third like arrayB as shown in the console.
But the first array ([...'🏃🏽‍♀️'])... what is it?
I expanded it in the console and saw that the elements displayed as hole were undefined, with their respective keys present in the array.
I also ran a few types of JavaScript loops on the array:
for...in statement captures all elements, except the *hole*s.
for...of statement captures all elements, except the *hole*s, and then throws an error saying that the iterator variable used is undefined, i.e.:
for (var value of [...'🏃🏽‍♀️']) console.log(value);
// Throw 'ReferenceError' when loop is done.
Array.prototype.forEach method captures all elements, except the *hole*s.
do...while, for and while statements capture all elements, except the *hole*s.
Why does the console see those values as different from empty or undefined (as with arrayA and arrayB)?
The main question is: Is there implicitly another type of array and if so, is there anything to note about it?
The ... is known as spread syntax. Read more about it here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Spread_syntax
Emoji can be made up of several elements (base characters, modifiers, and joiners) which the browser renders as a single emoji. Here's a quick article that expands on that: https://til.hashrocket.com/posts/2f488279a3-expand-emojis-with-the-spread-operator
By applying the spread syntax to an emoji, you can look at the individual pieces it's composed of.
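As a rough illustration (the exact pieces depend on the emoji; this particular one is typically a runner, a skin-tone modifier, a zero-width joiner, a female sign and a variation selector):
var parts = [...'🏃🏽‍♀️'];               // spreading a string iterates its code points
console.log(parts.length);               // several entries, not 1
parts.forEach(function (p) {
  console.log(p, p.codePointAt(0).toString(16));   // e.g. 1f3c3, 1f3fd, 200d, 2640, fe0f
});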

Does anyone know how consistent `[1,,2]` only having two keys is?

I'm wondering if I can rely on the fact that [1,,2] only has two keys: 0 and 2, or if anyone happens to know if any JS engines will also give me a 1.
Every browser I've tested shows keys 0, 2, but I don't have older versions available at the moment, or an android phone, or ...
Reasoning:
I'm writing a custom timer library on top of requestAnimationFrame and so am returning cancelable ids based on internal array indices. I'm trying to figure out if simply delete ary[ix] is sufficient to be able to walk all of the object keys on the array without extra sanity checks.
You are covered if you are willing to assume that your code will be running on a conforming implementation of ECMAScript. From the spec:
Array elements may be elided at the beginning, middle or end of the
element list. Whenever a comma in the element list is not preceded
by an AssignmentExpression (i.e., a comma at the beginning or after
another comma), the missing array element contributes to the length of
the Array and increases the index of subsequent elements. Elided array
elements are not defined. If an element is elided at the end of an
array, that element does not contribute to the length of the Array.
[1,,3] will produce an array that reads as
[1, undefined, 3]
and whose length is 3.
Edit: to clarify, even though reading the elided index yields undefined, no key is created for it.
Object.keys([1,,3]) returns ["0", "2"]
Edit 2: a great answer about checking if an array key exists.
Checking if a key exists in a JavaScript object?
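To make the spec language concrete, here is a small sketch (assuming a conforming engine) showing that both elided and deleted elements become holes that key iteration skips:
var ids = [1, , 3];
console.log(Object.keys(ids));   // ["0", "2"] - the elided slot has no key
console.log(1 in ids);           // false - index 1 is a hole
console.log(ids[1]);             // undefined - reading a hole yields undefined

delete ids[0];                   // deleting also leaves a hole...
console.log(Object.keys(ids));   // ["2"]
console.log(ids.length);         // ...and does not change length (still 3)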

Javascript associative array modification during for loop

The JavaScript for...in statement will iterate over all enumerable properties of an object. If the object is modified within the loop body, what happens?
For example, is the following code OK?
for (var key in obj)
  if (whatever(obj[key]))
    delete obj[key];
"OK" here means that the code works deterministically and, preferably, that every key in obj is tested exactly once. By contrast, similar constructs in .NET or Java will typically throw an exception.
I think it works. Just be careful to check hasOwnProperty(key), because for...in will also happily iterate over inherited properties (and methods, which are just properties with function values).
Also: http://www.w3schools.com/js/js_loop_for_in.asp says:
Note: The code in the body of the for...in loop is executed once for each property.
Also: https://developer.mozilla.org/en/JavaScript/Reference/Statements/for...in says:
A for...in loop iterates over the properties of an object in an arbitrary order (see the delete operator for more on why one cannot depend on the seeming orderliness of iteration, at least in a cross-browser setting). If a property is modified in one iteration and then visited at a later time, the value exposed by the loop will be its value at that later time. A property which is deleted before it has been visited will not then be visited later. Properties added to the object over which iteration is occurring may either be visited or omitted from iteration. In general it is best not to add, modify, or remove properties from the object during iteration, other than the property currently being visited; there is no guarantee whether or not an added property will be visited, whether a modified property will be visited before or after it is modified, or whether a deleted property will be visited before it is deleted.
What I read from this is - if you're modifying values other than the current one, the nondeterminism might bite you in the ass. However, modifying the current one should be okay.
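For example, applying the hasOwnProperty guard to the loop from the question might look like this (whatever() is the question's placeholder predicate):
for (var key in obj) {
  if (obj.hasOwnProperty(key) && whatever(obj[key])) {
    delete obj[key];   // removing the property currently being visited is the safe case
  }
}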
It may be possible to modify an object's properties while iterating over them using for-in. The question is: should you care? It is easy to rewrite the code to first create a list of the properties and then to delete the ones that match the criteria for deletion.
If, for example, you wanted to delete certain properties that are not functions and are not inherited:
var keys = Object.keys(obj).filter(k => !(obj[k] instanceof Function));
for (var key of keys)
  if (whatever(obj[key]))
    delete obj[key];
In general, it is better to write code that does not give rise to tricky questions like the one you raise.
Note that this code assumes the existence of Object.keys() in your runtime environment. Older browsers may not support this, so you may need to find a workaround if you need to support those.
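If you do need to support such an environment, one possible fallback (a sketch, not a complete polyfill) is to collect the own property names with for...in first and then delete in a second pass:
function ownKeys(obj) {
  var keys = [];
  for (var k in obj) {
    if (Object.prototype.hasOwnProperty.call(obj, k)) keys.push(k);
  }
  return keys;
}

var keys = ownKeys(obj).filter(function (k) { return !(obj[k] instanceof Function); });
for (var i = 0; i < keys.length; i++) {
  if (whatever(obj[keys[i]])) delete obj[keys[i]];
}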

Can chrome be forced to iterate "string numeric" keys in the order they are stored

I seem to be having an interesting problem only in chrome (not IE, FF). Given the following object:
var myObj = {
  "59": "Hello",
  "52": "and",
  "50": "how",
  "31": "are",
  "65": "you"
};
Iterating over this object with a for...in loop spits out the contents in the following order:
for (var j in myObj) { document.write(myObj[j] + ', '); }
are, how, and, Hello, you
All the other major browsers give it in the 'proper' (source) order. Chrome is treating the keys as integers instead of strings. The problem is that I have a JSON data source I can't change, and I need to access the items in the order they appear in the object.
Can anyone suggest a way to do this in Google Chrome?
When iterating over an object via for...in, the order of properties is not guaranteed. There is nothing you can do:
A for...in loop iterates over the properties of an object in an arbitrary order (see the delete operator for more on why one cannot depend on the seeming orderliness of iteration, at least in a cross-browser setting).
And from the delete page:
Although ECMAScript makes iteration order of objects implementation-dependent, it may appear that all major browsers support an iteration order based on the earliest added property coming first (at least for properties not on the prototype). However, in the case of Internet Explorer, when one uses delete on a property, some confusing behavior results, preventing other browsers from using simple objects like object literals as ordered associative arrays. In Explorer, while the property value is indeed set to undefined, if one later adds back a property with the same name, the property will be iterated in its old position--not at the end of the iteration sequence as one might expect after having deleted the property and then added it back.
So if you want to simulate an ordered associative array in a cross-browser environment, you are forced to either use two separate arrays (one for the keys and the other for the values), or build an array of single-property objects, etc.
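For example, the "separate array of keys" idea could be sketched with the data from the question like this (assuming you can recover the key order from the JSON text yourself, since the parsed object won't preserve it in Chrome):
var order = ["59", "52", "50", "31", "65"];   // the order taken from the JSON source
order.forEach(function (k) {
  document.write(myObj[k] + ', ');            // Hello, and, how, are, you,
});

// In modern engines, a Map preserves insertion order for any key type:
var m = new Map([["59", "Hello"], ["52", "and"], ["50", "how"], ["31", "are"], ["65", "you"]]);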
If you would like to know why it is like that, feel free to read the very long discussion in the V8 Issue Tracker "Wrong order in Object properties interation":
http://code.google.com/p/v8/issues/detail?id=164

If I set only a high index in an array, does it waste memory?

In Javascript, if I do something like
var alpha = [];
alpha[1000000] = 2;
does this waste memory somehow? I remember reading something about JavaScript arrays still setting values for unspecified indices (maybe setting them to undefined?), but I think this may have had something to do with delete. I can't really remember.
See this topic:
are-javascript-arrays-sparse
In most implementations of Javascript (probably all modern ones) arrays are sparse. That means no, it's not going to allocate memory up to the maximum index.
If it's anything like a Lua implementation, there is actually an internal array and a dictionary: densely populated parts from the starting index are stored in the array, sparse portions in the dictionary.
This is an old myth. The other indexes on the array will not be assigned.
When you assign a property name that is an "array index" (e.g. alpha[10] = 'foo', a name that represents an unsigned 32-bit integer) and it is greater than or equal to the current value of the length property of an Array object, two things will happen:
The "index named" property will be created on the object.
The length will be set to that index + 1.
Proof of concept:
var alpha = [];
alpha[10] = 2;
alpha.hasOwnProperty(0); // false, the property doesn't exist
alpha.hasOwnProperty(9); // false
alpha.hasOwnProperty(10); // true, the property exists
alpha.length; // 11
As you can see, the hasOwnProperty method returns false when we test for the 0 or 9 properties, because they don't physically exist on the object, whereas it returns true for 10, because that property was created.
This misconception probably comes from popular JS consoles, like Firebug, because when they detect that the object being printed is an array-like one, they will simply make a loop, showing each of the index values from 0 to length - 1.
For example, Firebug detects array-like objects simply by checking whether they have a length property whose value is an unsigned 32-bit integer (less than 2^32 - 1) and a splice property that is a function:
console.log({length:3, splice:function(){}});
// Firebug will log: `[undefined, undefined, undefined]`
In the above case, Firebug will internally run a sequential loop to show each of the property values, but none of those indexes really exists, and showing [undefined, undefined, undefined] gives you the false impression that the properties exist, or that they were "allocated", but that's not the case...
This has been the behavior forever; it's specified even in the ECMAScript 1st Edition specification (from 1997), so you shouldn't worry about implementation differences.
About a year ago, I did some testing on how browsers handle arrays (obligatory self-promotional link to my blog post.) My testing was aimed more at CPU performance than at memory consumption, which is much harder to measure. The bottom line, though, was that every browser I tested with seemed to treat sparse arrays as hash tables. That is, unless you initialized the array from the get-go by putting values in consecutive indexes (starting from 0), the array would be implemented in a way that seemed to optimize for space.
So while there's no guarantee, I don't think that setting array[100000] will take any more room than setting array[1] -- unless you also set all the indexes leading up to those.
I don't think so, because JavaScript treats arrays somewhat like dictionaries whose keys are property names, so a numeric index and its string form refer to the same property:
alpha[1000000] === alpha["1000000"]
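A quick sketch showing that the numeric and string index name the same single property, and that nothing else gets created:
var alpha = [];
alpha[1000000] = 2;
console.log(alpha[1000000] === alpha["1000000"]);   // true - same property
console.log(Object.keys(alpha));                    // ["1000000"] - only one own index
console.log(alpha.length);                          // 1000001, but only one element is stored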
I don't really know JavaScript, but it would be pretty odd behaviour if it DIDN'T allocate space for the entire array. Why would you think it wouldn't take up space? You're asking for a huge array. If it didn't give it to you, that would be a specific optimisation.
This obviously ignores OS optimisations such as memory overcommit and other kernel and implementation specifics.
