I discovered a bug on a project I'm working on that can be replicated by this snippet:
const original = [ { value: 1 } ];
function test() {
const copy = Object.assign([], original);
copy.forEach(obj => obj.value = obj.value + 1);
}
console.log(original[0].value); // -> 1, expected 1
test();
console.log(original[0].value); // -> 2, expected 1
test();
console.log(original[0].value); // -> 3, expected 1
I do not understand why this is the case. In the MDN web docs, the following statements can be found in the deep copy warning section:
For deep cloning, we need to use alternatives, because Object.assign() copies property values.
If the source value is a reference to an object, it only copies the reference value.
How do these notes apply to arrays / in this case? Are array values somehow considered as properties?
Looking back now, the method was probably not intended to work with arrays, so I guess I reap what I sow... but I'd still like to understand what's going on here. The intent was to deep copy the array in order to mutate the objects inside while keeping the original intact.
Are array values somehow considered as properties?
Yes. In JavaScript, arrays are objects (which is why Object.assign works with them), and properties with a special class of names called array indexes (strings defining decimal numbers in standard form with numeric values < 2^32 - 1) represent the elements of the array. (Naturally, JavaScript engines optimize them into true arrays when they can, but they're defined as objects and performing object operations on them is fully supported.) I found this sufficiently surprising when getting deep into JavaScript that I wrote it up on my anemic old blog.
Given:
const obj = {a: 1};
const arr = [1];
these two operations are the same from a specification viewpoint:
console.log(obj["a"]);
console.log(arr["0"]); // Yes, in quotes
Of course, we don't normally write the quotes when accessing array elements by index; normally we'll just write arr[0]. But in theory, the number is converted to a string and then the property is looked up by name, although, again, modern JavaScript engines optimize.
const obj = {a: 1};
const arr = [1];
console.log(obj["a"]);
console.log(arr["0"]); // Yes, in quotes
console.log(arr[0]);
If you need to clone an array and the objects in it, map + property spread is a useful way to do that, but note the objects are only cloned shallowly (which is often sufficient, but not always):
const result = original.map((value) => ({...value}));
For a full deep copy, see this question's answers.
Here we can use structuredClone for a deep copy.
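For example, applied to the snippet from the question (a sketch, assuming an environment where structuredClone is available, i.e. a modern browser or Node.js 17+):
const original = [ { value: 1 } ];
const copy = structuredClone(original); // nested objects are cloned too, not just referenced
copy[0].value += 1;
console.log(original[0].value); // -> 1, the original stays intact
console.log(copy[0].value); // -> 2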
If I wanted to clone any javascript object (that's not null), I would think I could just copy all of its own properties (enumerable and non-enumerable) -- using Object.getOwnPropertyNames -- onto a new empty object.
But I've noticed that an example of a deep cloning function provided by the Dojo toolkit (https://davidwalsh.name/javascript-clone) treats RegExp, Date, and Node objects as special cases, and lodash.cloneDeep also has logic far more complicated than simply copying properties, including special cases of its own, and apparently doesn't support all types of objects (https://github.com/lodash/lodash/blob/master/.internal/baseClone.js).
Why is simply copying the object properties not sufficient? What else is there to a javascript object besides its properties that I don't know about?
EDIT: to be clear, I'm talking about deep cloning an object. Sorry for the confusion.
If the top-level properties are all primitive values like strings and numbers, then just copying the top-level properties is fine for a clone of an object. If there are any reference values, such as dates, arrays or other objects, then all you are doing is copying a reference from one object to another. If you change the referenced object on the clone, you will mutate the original object.
Take a look at my clone function at https://stackblitz.com/edit/typescript-qmzgf7
If the value is an array it clones every item in the array; if it is a date it creates a new date with the same time; if it is an object it clones every property; otherwise it just copies the value.
The cloned object can now be mutated without worrying about effects it might have on the original object.
const clone = obj =>
  Array.isArray(obj)
    ? obj.map(item => clone(item))
    : obj instanceof Date
      ? new Date(obj.getTime())
      : (typeof obj === 'object') && obj
        ? Object.getOwnPropertyNames(obj).reduce((o, prop) => ({ ...o, [prop]: clone(obj[prop]) }), {})
        : obj;
let original = { prop1: "Original", objProp: { prop1: "Original" } };
let shallowCopy = { ...original };
let clonedObj = clone(original);
clonedObj.prop1 = "Changed";
clonedObj.objProp.prop1 = "Changed";
console.log(`Original objects properties are '${original.prop1}' and '${original.objProp.prop1}'`);
shallowCopy.prop1 = "Changed";
shallowCopy.objProp.prop1 = "Changed";
console.log(`Original objects properties are '${original.prop1}' and '${original.objProp.prop1}'`);
Notice how modifying the nested object's property through the shallow copy causes the original to change as well.
The easiest way to clone an object in JS is by using the ... spread operator.
Let's say you have this object:
const object = { foo: 1, bar: 2 }
To clone it, you can simply declare:
const objectClone = {...object}
This will copy all the properties present in the original object onto the clone, along with their values.
Now the problem is, if you have any object nested in there, the copies will be made by reference. Suppose the original object is this instead:
const student = { studentID: 1, tests: { test1: 90, test2: 95}}
If you create a copy of that object by using the spread operator (or Object.assign, which behaves much the same here), the nested object in the copy will actually point to the very same object inside the original! So repeating this:
const studentClone = {...student}
And now you edit a property of the nested object inside the clone:
studentClone.tests.test1 = 80
This will change the value in both the clone and the original object, because the nested object is really just one object in memory that both of them point to.
Now what those utilities, like _.cloneDeep, do is iterate through all inner objects in the object you're cloning and repeat the process. You could technically do it yourself, but it wouldn't be easy on objects with many levels of nesting. Something like this:
const studentClone = {...student, tests: {...student.tests}}
This would create new objects, with no reference problems.
Hope this helped!
EDIT: Just adding that object spreading only works properly for plain objects, of course. Instantiated objects such as arrays, Date objects, etc. each have their own way of being cloned.
Arrays can be copied similarly, through [...array], and they follow the same rules regarding references. For dates, you can simply pass the original date object into the Date constructor again:
const clonedDate = new Date(date)
This is where the third-party utilities will come in handy, as they'll usually handle most use cases.
This answer does a good job of explaining two of the problems with cloning a normal JavaScript object: prototype properties and circular references. But to answer your question regarding certain built-in types, the TL;DR answer is that there are 'under the hood' properties that you have no programmatic access to.
Consider:
let foo = [1, 2];
let bar = {};
Object.assign(bar, foo);
Object.setPrototypeOf(bar, foo.constructor.prototype); // aka Array.prototype
bar[0]; // 1
bar instanceof Array; // true
bar.map(x => x + 1); // [] ????
An empty array? Why? Just to make sure we're not crazy:
foo.map(x => x + 1); // [2, 3]
The reason why map (and the other array methods) fails to work is that an Array isn't simply a bag of properties: it has exotic internal behaviour (its length/index bookkeeping) that you don't get to replicate as the JavaScript programmer; in this case, length is non-enumerable, so Object.assign never copied it, and bar falls back to Array.prototype.length, which is 0. As another example, every JavaScript object has an internal [[Class]] property that says what kind of object it is. Fortunately for us, there's a loophole in the spec that allows us indirect access to it: the good ol' Object.prototype.toString.call hack. So let's see what that has to say about various stuff:
Object.prototype.toString.call(true); // [object Boolean]
Object.prototype.toString.call(3); // [object Number]
Object.prototype.toString.call({}); // [object Object]
Object.prototype.toString.call([]); // [object Array]
Object.prototype.toString.call(null); // [object Null]
Object.prototype.toString.call(/\w/); // [object RegExp]
Object.prototype.toString.call(JSON); // [object JSON]
Object.prototype.toString.call(Math); // [object Math]
Let's see what it says about our foo and bar:
Object.prototype.toString.call(foo); // [object Array]
Object.prototype.toString.call(bar); // [object Object] Doh!
There's no way to 'convert' a random object to an Array... or a Date... or an HTMLElement... or a regex. Now, there are in fact ways to clone all of those things, but they require special logic: you can't just copy properties, or even set the prototype, because they have internal logic you can't access or directly replicate.
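For example, a hand-rolled clone has to rebuild such objects through their own constructors rather than by copying properties. A minimal sketch covering just Date, RegExp and Array (the function name naiveClone is made up for illustration; real libraries handle many more cases, plus prototypes and cycles):
function naiveClone(value) {
  if (value === null || typeof value !== 'object') return value;  // primitives pass through
  if (value instanceof Date) return new Date(value.getTime());    // rebuild via constructor
  if (value instanceof RegExp) return new RegExp(value.source, value.flags);
  if (Array.isArray(value)) return value.map(naiveClone);         // real Array, correct length
  const copy = {};                                                 // plain-object fallback
  for (const key of Object.getOwnPropertyNames(value)) copy[key] = naiveClone(value[key]);
  return copy;
}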
In normal everyday JavaScript programming we don't worry too much about this stuff; it's the kind of thing that's generally of interest to library authors (or language implementers). We everyday working stiffs just use a library to cover the edge cases and call it a day. But every once in a while the abstractions we use leak and the ugliness bubbles through. This is, however, a great illustration of why you should probably use battle-tested libraries rather than trying to roll your own.
An object in JavaScript holds fields and functions together, and every field can itself be another object (like a Date). If you copy a date field, you are only copying a reference.
Example:
var obj1 = { myField : new Date('2018/9/17') };
var obj2 = {};
obj2.myField = obj1.myField;
Now, if we change "obj2.myField" like this:
obj2.myField.setDate(obj2.myField.getDate() + 2);
console.log(obj1.myField); // Result =====> Wed Sep 19 2018 00:00:00 GMT+0430
As you can see, obj1 and obj2 are still linked.
The correct way to copy a date field:
obj2.myField = new Date(obj1.myField.getTime());
Most native objects (like the ones you have mentioned; I'm not sure of the correct name for them, maybe "built-in"?) are treated as "simple": it does not make sense to copy a Date object property-by-property. At the same time, they are all mutable in some way.
let a = {test: new Date(1)}; // test === Thu Jan 01 1970 00:00:00GMT
let copy_a = {test: a.test}; // looks like cloned
a.test.setDate(12); // let's mutate original date
console.log(copy_a.test); // Thu Jan 12 1970 00:00:00GMT ooops modified as well
So you should either handle those exceptions (special cases) explicitly, or accept the risk of side effects in some cases.
I have a fairly decent knowledge of JavaScript and the prototypal inheritance that is used when initializing data structures, but I am still not completely sure how one of JS's unique features works.
Let's say I create an array:
var myArr = [];
I can now push items to the array:
myArr.push('foo');
myArr.push('bar');
At this time myArr.length == 2
Now from there I can do something like
myArr['myProp'] = 5; // OR myArr.myProp = 5;
But myArr.length still equals 2, and I can use some iteration method to iterate over the two values pushed initially.
So basically this object is a "hybrid" data structure that can be treated like an Array or an Object.
So my question is: does the native object property syntax (myObj.someProperty = 'blah' or myObj['someProperty'] = 'blah') apply to Object.prototype, and therefore to ANY object that inherits from that prototype? This would make sense, because an Object's prototype chain looks like:
var myObj = {} -> Object.prototype -> null
And an Array's prototype chain looks like:
var myArr = [] -> Array.prototype -> Object.prototype -> null
Which would make me assume that anything you can do with an object (myObj.someProperty //as getter or setter) can be done with an Array which would then explain the phenomena I stated above.
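Those chains can be checked directly with Object.getPrototypeOf; a quick sketch:
var myArr = [];
console.log(Object.getPrototypeOf(myArr) === Array.prototype); // true
console.log(Object.getPrototypeOf(Array.prototype) === Object.prototype); // true
console.log(Object.getPrototypeOf(Object.prototype)); // null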
To formalise my answer: this is exactly correct. Barring primitive literals and temporary values, pretty much everything in JavaScript is an object, including functions and arrays.
It is for this exact reason that it is considered dangerous to iterate through an array with the syntax:
for (x in myArray){}
The above line of code could lead to unpredictable results!
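To illustrate, continuing the myArr example from the question (a small sketch): for...in visits the extra string key, and potentially enumerable additions to Array.prototype, while an index-based loop or for...of only sees the ordered elements.
var myArr = [];
myArr.push('foo');
myArr.push('bar');
myArr.myProp = 5;

for (var key in myArr) {
  console.log(key); // "0", "1", "myProp" (plus any enumerable props added to Array.prototype)
}

for (var i = 0; i < myArr.length; i++) {
  console.log(myArr[i]); // only "foo" and "bar"
}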
This approach of creating all data types as objects allows JavaScript to be as dynamic and flexible as it is.
You can see the full description of Object here.
Why are objects not iterable by default?
I see questions all the time related to iterating objects, the common solution being to iterate over an object's properties and access the values within the object that way. This seems so common that it makes me wonder why objects themselves aren't iterable.
Statements like the ES6 for...of would be nice to use for objects by default. Because these features are only available for special "iterable objects", which don't include {} objects, we have to jump through hoops to make this work for objects we want to use it with.
The for...of statement creates a loop iterating over iterable objects
(including Array, Map, Set, arguments object and so on)...
For example using an ES6 generator function:
var example = {a: {e: 'one', f: 'two'}, b: {g: 'three'}, c: {h: 'four', i: 'five'}};
function* entries(obj) {
  for (let key of Object.keys(obj)) {
    yield [key, obj[key]];
  }
}
for (let [key, value] of entries(example)) {
  console.log(key);
  console.log(value);
  for (let [key, value] of entries(value)) {
    console.log(key);
    console.log(value);
  }
}
The above properly logs data in the order I expect it to when I run the code in Firefox (which supports ES6).
By default, {} objects are not iterable, but why? Would the disadvantages outweigh the potential benefits of objects being iterable? What are the issues associated with this?
In addition, because {} objects are different from "Array-like" collections and "iterable objects" such as NodeList, HTMLCollection, and arguments, they can't be converted into Arrays.
For example:
var argumentsArray = Array.prototype.slice.call(arguments);
or be used with Array methods:
Array.prototype.forEach.call(nodeList, function (element) {}).
Besides the questions I have above, I would love to see a working example of how to make {} objects into iterables, especially from those who have mentioned [Symbol.iterator]. This should allow these new {} "iterable objects" to use statements like for...of. Also, I wonder whether making objects iterable would allow them to be converted into Arrays.
I tried the below code, but I get a TypeError: can't convert undefined to object.
var example = {a: {e: 'one', f: 'two'}, b: {g: 'three'}, c: {h: 'four', i: 'five'}};
// I want to be able to use "for...of" for the "example" object.
// I also want to be able to convert the "example" object into an Array.
example[Symbol.iterator] = function* (obj) {
  for (let key of Object.keys(obj)) {
    yield [key, obj[key]];
  }
};
for (let [key, value] of example) { console.log(value); } // error
console.log([...example]); // error
I'll give this a try. Note that I'm not affiliated with ECMA and have no visibility into their decision-making process, so I cannot definitively say why they have or have not done anything. However, I'll state my assumptions and take my best shot.
1. Why add a for...of construct in the first place?
JavaScript already includes a for...in construct that can be used to iterate the properties of an object. However, it's not really a forEach loop, as it enumerates all of the properties on an object and tends to only work predictably in simple cases.
It breaks down in more complex cases (including with arrays, where its use tends to be either discouraged or thoroughly obfuscated by the safeguards needed to use for...in with an array correctly). You can work around that by using hasOwnProperty (among other things), but that's a bit clunky and inelegant.
So therefore my assumption is that the for...of construct is being added to address the deficiencies associated with the for...in construct, and provide greater utility and flexibility when iterating things. People tend to treat for...in as a forEach loop that can be generally applied to any collection and produce sane results in any possible context, but that's not what happens. The for...of loop fixes that.
I also assume that it's important for existing ES5 code to run under ES6 and produce the same result as it did under ES5, so breaking changes cannot be made, for instance, to the behavior of the for...in construct.
2. How does for...of work?
The reference documentation is useful for this part. Specifically, an object is considered iterable if it defines the Symbol.iterator property.
The property definition should be a function that returns the items in the collection one by one, and sets a flag indicating whether or not there are more items to fetch. Predefined implementations are provided for some object types, and it's relatively clear that using for...of simply delegates to the iterator function.
This approach is useful, as it makes it very straightforward to provide your own iterators. I might say the approach could have presented practical issues due to its reliance upon defining a property where previously there was none, except that, from what I can tell, that's not the case: the new property is essentially ignored unless you deliberately go looking for it (i.e. it will not appear in for...in loops as a key, etc.). So that's not a concern.
Practical non-issues aside, it may have been considered conceptually controversial to start all objects off with a new pre-defined property, or to implicitly say that "every object is a collection".
3. Why are objects not iterable using for...of by default?
My guess is that this is a combination of:
Making all objects iterable by default may have been considered unacceptable because it adds a property where previously there was none, or because an object isn't (necessarily) a collection. As Felix notes, "what does it mean to iterate over a function or a regular expression object"?
Simple objects can already be iterated using for...in, and it's not clear what a built-in iterator implementation could have done differently/better than the existing for...in behavior. So even if #1 is wrong and adding the property was acceptable, it may not have been seen as useful.
Users who want to make their objects iterable can easily do so, by defining the Symbol.iterator property.
The ES6 spec also provides a Map type, which is iterable by default and has some other small advantages over using a plain object as a Map.
There's even an example provided for #3 in the reference documentation:
var myIterable = {};
myIterable[Symbol.iterator] = function* () {
  yield 1;
  yield 2;
  yield 3;
};
for (var value of myIterable) {
  console.log(value);
}
Given that objects can easily be made iterable, that they can already be iterated using for...in, and that there's likely not clear agreement on what a default object iterator should do (if what it does is meant to be somehow different from what for...in does), it seems reasonable enough that objects were not made iterable by default.
Note that your example code can be rewritten using for...in:
for (let levelOneKey in object) {
  console.log(levelOneKey); // "example"
  console.log(object[levelOneKey]); // {"random":"nest","another":"thing"}
  var levelTwoObj = object[levelOneKey];
  for (let levelTwoKey in levelTwoObj) {
    console.log(levelTwoKey); // "random"
    console.log(levelTwoObj[levelTwoKey]); // "nest"
  }
}
...or you can also make your object iterable in the way you want by doing something like the following (or you can make all objects iterable by assigning to Object.prototype[Symbol.iterator] instead):
obj = {
  a: '1',
  b: { something: 'else' },
  c: 4,
  d: { nested: { nestedAgain: true }}
};
obj[Symbol.iterator] = function() {
  var keys = [];
  var ref = this;
  for (var key in this) {
    //note: can do hasOwnProperty() here, etc.
    keys.push(key);
  }
  return {
    next: function() {
      if (this._keys && this._obj && this._index < this._keys.length) {
        var key = this._keys[this._index];
        this._index++;
        return { key: key, value: this._obj[key], done: false };
      } else {
        return { done: true };
      }
    },
    _index: 0,
    _keys: keys,
    _obj: ref
  };
};
You can play with that here (in Chrome, at least): http://jsfiddle.net/rncr3ppz/5/
Edit
And in response to your updated question, yes, it is possible to convert an iterable to an array, using the spread operator in ES6.
However, this doesn't seem to be working in Chrome yet, or at least I cannot get it to work in my jsFiddle. In theory it should be as simple as:
var array = [...myIterable];
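As for the TypeError in the question's code: for...of invokes the [Symbol.iterator] method with no arguments, so the generator's obj parameter is undefined and Object.keys(undefined) throws. A minimal adjustment (just a sketch) is to iterate over this instead of a parameter:
example[Symbol.iterator] = function* () {
  // "this" is the object being iterated; no argument is passed in
  for (let key of Object.keys(this)) {
    yield [key, this[key]];
  }
};

for (let [key, value] of example) { console.log(value); } // works
console.log([...example]); // an Array of [key, value] pairs, where spreading iterables is supported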
Objects don't implement the iteration protocols in Javascript for very good reasons. There are two levels at which object properties can be iterated over in JavaScript:
the program level
the data level
Program Level Iteration
When you iterate over an object at the program level you examine a portion of the structure of your program. It is a reflective operation. Let's illustrate this statement with an array type, which is usually iterated over at the data level:
const xs = [1,2,3];
xs.f = function f() {};
for (let i in xs) console.log(xs[i]); // logs `f` as well
We just examined the program level of xs. Since arrays store data sequences, we are usually interested in the data level only. for..in evidently makes no sense in connection with arrays and other "data-oriented" structures in most cases. That is the reason why ES2015 introduced for..of and the iterable protocol.
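For contrast, iterating the same xs at the data level ignores the function property:
for (const x of xs) console.log(x); // logs 1, 2, 3 -- but not f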
Data Level Iteration
Does that mean that we can simply distinguish the data from the program level by distinguishing functions from primitive types? No, because functions can also be data in Javascript:
Array.prototype.sort for instance expects a function to perform a certain sort algorithm
Thunks like () => 1 + 2 are just functional wrappers for lazily evaluated values
Besides primitive values can represent the program level as well:
[].length for instance is a Number but represents the length of an array and thus belongs to the program domain
That means that we can't distinguish the program and data level by merely checking types.
It is important to understand that the implementation of the iteration protocols for plain old Javascript objects would rely on the data level. But as we've just seen, a reliable distinction between data and program level iteration is not possible.
With Arrays this distinction is trivial: Every element with an integer-like key is a data element. Objects have an equivalent feature: The enumerable descriptor. But is it really advisable to rely on this? I believe it is not! The meaning of the enumerable descriptor is too blurry.
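For example (a small sketch): enumerability is just a flag anyone can toggle, so it says nothing reliable about whether a property is data or program structure:
const point = { x: 1, y: 2 };
// a "program level" helper, yet enumerable by default:
point.distance = function () { return Math.hypot(this.x, this.y); };
// a "data level" value, deliberately made non-enumerable:
Object.defineProperty(point, 'z', { value: 3, enumerable: false });
console.log(Object.keys(point)); // ["x", "y", "distance"] -- z is missing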
Conclusion
There is no meaningful way to implement the iteration protocols for objects, because not every object is a collection.
If object properties were iterable by default, program and data level would be mixed up. Since every composite type in Javascript is based on plain objects, this would apply to Array and Map as well.
for..in, Object.keys, Reflect.ownKeys etc. can be used for both reflection and data iteration; a clear distinction is often not possible. If you're not careful, you quickly end up with meta programming and weird dependencies. The Map abstract data type effectively ends the conflation of program and data level. I believe Map is the most significant achievement in ES2015, even if Promises are much more exciting.
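A Map, for instance, is iterable by default precisely because every entry is data by construction:
const m = new Map([['a', 1], ['b', 2]]);
m.set('c', 3);
for (const [key, value] of m) console.log(key, value); // a 1, b 2, c 3 -- methods like set never show up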
I was also bothered with this question.
Then I came up with the idea of using Object.entries({...}), which returns an Array, and an Array is an Iterable.
Also, Dr. Axel Rauschmayer posted an excellent answer on this.
See Why plain objects are NOT iterable
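A quick sketch of that approach, using the example object from the question:
const example = {a: {e: 'one', f: 'two'}, b: {g: 'three'}, c: {h: 'four', i: 'five'}};
for (const [key, value] of Object.entries(example)) {
  console.log(key, value);
}
// Object.entries already returns a real Array, so Array methods work directly:
console.log(Object.entries(example).map(([key]) => key)); // ["a", "b", "c"]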
I guess the question should be "why is there no built-in object iteration?"
Adding iterability to objects themselves could conceivably have unintended consequences, and no, there is no way to guarantee order, but writing an iterator is as simple as
function* iterate_object(o) {
  var keys = Object.keys(o);
  for (var i = 0; i < keys.length; i++) {
    yield [keys[i], o[keys[i]]];
  }
}
Then
for (var [key, val] of iterate_object({a: 1, b: 2})) {
console.log(key, val);
}
a 1
b 2
You can easily make all objects iterable globally:
Object.defineProperty(Object.prototype, Symbol.iterator, {
  enumerable: false,
  value: function * () {
    for (let key in this) {
      if (this.hasOwnProperty(key)) {
        yield [key, this[key]];
      }
    }
  }
});
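With that in place, any plain object can be used directly in for...of (a quick usage sketch; note that patching Object.prototype globally is generally discouraged in shared code):
for (const [key, value] of {a: 1, b: 2}) {
  console.log(key, value); // a 1, then b 2
}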
This is the latest approach (which works in Chrome Canary):
var files = {
'/root': {type: 'directory'},
'/root/example.txt': {type: 'file'}
};
for (let [key, {type}] of Object.entries(files)) {
console.log(type);
}
Yes, entries() is now a method that's part of Object :)
edit
After looking more into it, it seems you could do the following
Object.prototype[Symbol.iterator] = function * () {
  for (const [key, value] of Object.entries(this)) {
    yield {key, value}; // or [key, value]
  }
};
so you can now do this
for (const {key, value:{type}} of files) {
console.log(key, type);
}
edit2
Back to your original example, if you wanted to use the above prototype method it would look like this
for (const {key, value:item1} of example) {
  console.log(key);
  console.log(item1);
  for (const {key, value:item2} of item1) {
    console.log(key);
    console.log(item2);
  }
}
Technically, this is not an answer to the question "why?", but I have adapted Jack Slocum's answer above, in light of BT's comments, into something which can be used to make an Object iterable.
var iterableProperties = {
  enumerable: false,
  value: function * () {
    for (let key in this) if (this.hasOwnProperty(key)) yield this[key];
  }
};
var fruit={
'a': 'apple',
'b': 'banana',
'c': 'cherry'
};
Object.defineProperty(fruit,Symbol.iterator,iterableProperties);
for(let v of fruit) console.log(v);
Not quite as convenient as it should have been, but it’s workable, especially if you have multiple objects:
var instruments={
'a': 'accordion',
'b': 'banjo',
'c': 'cor anglais'
};
Object.defineProperty(instruments,Symbol.iterator,iterableProperties);
for(let v of instruments) console.log(v);
And, because everyone is entitled to an opinion, I can't see why Objects are not already iterable either. If you can polyfill them as above, or use for … in, then I can't see a simple argument against it.
One possible suggestion is that what is iterable is a type of object, so it is possible that iterability has been limited to a subset of objects just in case some other objects explode in the attempt.
I have a situation where I need to create a new JavaScript object that inherits from Array. I am using the following code:
// Create constructor function.
var SpecialArray = function () {};
// Create intermediate function to create closure upon Array's prototype.
// This prevents littering of native Array's prototype.
var ISpecialArray = function () {};
ISpecialArray.prototype = Array.prototype;
SpecialArray.prototype = new ISpecialArray();
SpecialArray.prototype.constructor = SpecialArray;
// Use Array's push() method to add two elements to the prototype itself.
SpecialArray.prototype.push('pushed proto 0', 'pushed proto 1');
// Use [] operator to add item to 4th position
SpecialArray.prototype[4] = 'direct [] proto to 4';
// Create new instance of Special Array
var x = new SpecialArray();
// Directly add items to this new instance.
x.push('pushed directly on X');
x[9] = 'direct [] to 9'
console.log(x, 'length: ' + x.length);
Quite interestingly, the [] operation seems to be useless, and the console output is:
["pushed proto 0", "pushed proto 1", "pushed directly on X"] length: 3
What am I missing here?
It is not possible to subclass the Array class and use it this way.
The best solution for you is to just extend the Array class and use it as it is.
There are two other options that I do not like, but they exist:
http://ajaxian.com/archives/another-trick-to-allow-array-subclasses
http://dean.edwards.name/weblog/2006/11/hooray/
This is one of those things that always trips people up. The length property only applies to the ordered elements. You can't extend an array, insert an arbitrary non-sequential key, and expect it to work, because the relationship between the length property and the array contents is broken once you extend the array. Pointy's link above does a very good job of explaining this in more detail.
To prove this add this to the end of your example:
console.log(x[4]);
As you can see, your entry is present and correct; it's just not part of the ordered array.
Like everything else in JavaScript, the Array object is just an associative array with string keys. Non-numerical, non-sequential keys are hidden to fool you into thinking it's a 'proper' numerically indexed array.
This strange mixed design of the Array object does mean you can do some strange and wonderful things like storing ordered and unordered information in the same object. I'm not saying this is a good idea, I'm just saying it's possible.
As you will have noticed by now, when iterating structures like this the non-sequential keys don't appear, which makes sense for the general use case of arrays: ordered information. It's less useful, or in fact useless, when you want to get keyed info. I would venture that if ordering is unimportant you should use an object, not an array. If you need both ordered and unordered data, store an array as a property of an object.
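To see the difference concretely (a small sketch, assuming the SpecialArray setup from the question has run): on a real array, assigning to a numeric index beyond the end updates length, while the SpecialArray instance is an ordinary object inheriting from Array.prototype, so no such bookkeeping happens.
var realArr = [];
realArr[9] = 'direct [] to 9';
console.log(realArr.length); // 10 -- real arrays track numeric indexes

var y = new SpecialArray();
y[9] = 'direct [] to 9';
console.log(y.length); // 2 (inherited from the prototype) -- no automatic update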
The best way I have found to create a child prototype of an "Array" is to not make a child prototype of "Array" but rather create a child of an "Array-Like" prototype. There are many prototypes floating around that attempt to mimic the properties of an "Array" while still being able to "inherit" from it, the best one I've found is Collection because it preserves the ability to use brackets []. The major downfall is that it doesn't work well with non-numeric keys (i.e. myArray["foo"] = "bar") but if you're only using numeric keys it works great.
You can extend this prototype like this:
http://codepen.io/dustinpoissant/pen/AXbjxm?editors=0011
var MySubArray = function(){
  Collection.apply(this, arguments);
  this.myCustomMethod = function(){
    console.log("The second item is " + this[1]);
  };
};
MySubArray.prototype = Object.create(Collection.prototype);
var msa = new MySubArray("Hello", "World");
msa[2] = "Third Item";
console.log(msa);
msa.myCustomMethod();