Eloquent JavaScript: Persistent Group

In the book Eloquent JavaScript, an exercise asks you to write a class PGroup, similar to a class from a previous exercise. It is essentially a simplified Set class with add, delete, and has methods. The specific part I don't understand at all is at the end. It says:
"The constructor shouldn't be part of the class's interface (though
you'll definitely want to use it internally). Instead, there is an
empty instance, PGroup.empty, that can be used as a starting value.
Why do you need only one PGroup.empty value, rather than having a
function that creates a new, empty map every time?"
This is the given answer to the problem:
class PGroup {
  constructor(members) {
    this.members = members;
  }
  add(value) {
    if (this.has(value)) return this;
    return new PGroup(this.members.concat([value]));
  }
  delete(value) {
    if (!this.has(value)) return this;
    return new PGroup(this.members.filter(m => m !== value));
  }
  has(value) {
    return this.members.includes(value);
  }
}
PGroup.empty = new PGroup([]);
let a = PGroup.empty.add("a");
let ab = a.add("b");
let b = ab.delete("a");
tldr: What is PGroup.empty?
edit: To clear up confusion, what I meant is that I don't understand the purpose of PGroup.empty or what it even is in relation to the class PGroup. Like, for example, is it a property of the constructor?

PGroup.empty represents an empty set. You can use PGroup.empty as the starting point to create more sets.
What makes this particular implementation of PGroup interesting is that the add and delete methods do not modify the existing PGroup instance that you're operating on. Instead, add and delete return entirely new PGroup instances. This means that every time you add or remove an element from a PGroup that you already have, you create an entirely new PGroup instance rather than modify your existing PGroup instance.
Using this pattern means that given an empty set (in our case PGroup.empty), we can create a whole bunch of other PGroups without ever explicitly using the new keyword. In particular, if we wanted a set of ['a', 'b', 'c'], we could do the following:
let abc = PGroup.empty.add('a').add('b').add('c');
Moreover, because the PGroup.empty instance itself does not change when you call the add method on it, you can reuse the same PGroup.empty instance as many times as you want.
let xyz = PGroup.empty.add('x').add('y').add('z');
let efg = PGroup.empty.add('e').add('f').add('g');
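A quick check (a small sketch, assuming the PGroup implementation given above) confirms that PGroup.empty itself is never modified by any of those calls:
console.log(PGroup.empty.has("x")); // false — building xyz never touched empty
console.log(xyz.has("x"));          // true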
This aspect of immutability allows us to satisfy the following requirement:
The constructor shouldn't be part of the class's interface
Instead, we can use add and delete to create more instances of PGroup.
The technical term for having add and delete create a new instance of PGroup rather than modify the existing one is immutability.

Related

Reconstructing JS objects from JSON

Don't do it
This question has some comments with a low opinion of the very notion of reconstructing the objects. The commenters either couldn't or wouldn't explain why they thought it was a bad idea, but since asking I have come to the same conclusion. Here's why.
If you think about MVVM, the purpose of having a model and a view-model is to separate behaviour from data. This is kind of funny, because the point of object-orientation is to combine them. But in a distributed world, the data has to be shipped around. If your code and data are all munged together then you have to either invent MVVM or keep de- and re-constructing objects.
The code to de- and re-construct objects is a testing and maintenance time-sink you don't need, and introduces two failure modes. Don't do it. Have a method-less class to hold the state and a stateless class that operates on the method-less class. This is the essence of MVVM, and really nothing more than an application of the Memento pattern (a small sketch of this separation follows the quotation below).
Memento (283)
Without violating encapsulation, capture and externalize an object's internal state so that the object can be restored to this state later.
Design Patterns, Gamma et al, 1995
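As a minimal sketch of that separation, with illustrative names (OrderState, OrderOperations are not from the question), a method-less class holds the data while a stateless class carries the behaviour:
class OrderState {
  constructor(id, items) {
    this.id = id;       // plain data only, so it survives JSON round-trips
    this.items = items;
  }
}
class OrderOperations {
  static total(state) {
    return state.items.reduce((sum, item) => sum + item.price, 0);
  }
}
// Restoring from the wire is just a parse plus a shallow copy — no bespoke
// de/re-construction code to test and maintain.
const restored = Object.assign(new OrderState(), JSON.parse('{"id":1,"items":[{"price":5},{"price":7}]}'));
console.log(OrderOperations.total(restored)); // 12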
Original question
The data of my view models is passed back and forth between client JS and server Web APIs as JSON.
It is well understood that JSON.stringify(object) serialises only members whose values are not undefined and not functions. Thus, JSON.parse(JSON.stringify(someObject)) will strip all the methods from the object.
My current implementation has each graph node implemented as a Typescript class with Serialise and Deserialise methods. JQuery.ajax calls a Web API and implicitly parses the resultant JSON into a DAG of object definitions, each of which has a Type property indicating which type of class it was prior to serialisation. I have a map of constructors indexed by name and the appropriate constructor is retrieved and the data passed as the constructor parameter.
Depending on type there may be children; if so things proceed recursively down the graph.
I have been wondering whether, rather than copy all the property values, I couldn't just assign an appropriate prototype. Bring the mountain to Mahomed, you might say. This would eliminate quite a bit of clutter in my codebase.
As I write it occurs to me that I could use $.extend, but I'm progressively weeding jQuery out of my codebase so this would be a retrograde step.
Is there any known peril in my proposition of diddling the prototype?
Does anyone have a better idea? Other than $.extend, I mean. Something TypeScripty, by preference.
It has been observed in comments that assigning the prototype means the constructor is never called. This is irrelevant. The object state is already set up; all that is required is to make the methods available.
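For what it's worth, a minimal sketch of that proposition (the Widget class here is illustrative, not from my codebase; Object.setPrototypeOf is standard ES2015) looks like this:
class Widget {
  describe() { return 'Widget ' + this.id; }
}
const plain = JSON.parse('{"id": 42}');          // method-less data off the wire
Object.setPrototypeOf(plain, Widget.prototype);  // graft the methods back on
console.log(plain.describe()); // "Widget 42"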
I recently built an object with methods whose content could be serialized and then reconstructed.
I simply added an argument which can take a parsed JSON object and merge it into the instance itself.
Example using plain object:
function myObject() {
  this.valueA = 1;
  this.valueB = 2;
  this.valueC = 3;
  this.add = function() {
    return this.valueA + this.valueB + this.valueC;
  };
}
var o = new myObject();
console.log(o.add());
console.log(JSON.stringify(o));
If you serialized this you would get:
{"valueA":1,"valueB":2,"valueC":3}
Now, to reconstruct this, you can add an Object.assign() call to the constructor, taking the argument and merging it with the instance:
function myObject(json) {
  this.valueA = 0;
  this.valueB = 0;
  this.valueC = 0;
  this.add = function() {
    return this.valueA + this.valueB + this.valueC;
  };
  Object.assign(this, json); // will merge the argument into the instance
}
If we now pass the parsed JSON object as the argument, it will be merged into the new object, recreating what you had:
var json = JSON.parse('{"valueA":1,"valueB":2,"valueC":3}');
function myObject(json) {
  this.valueA = 0;
  this.valueB = 0;
  this.valueC = 0;
  this.add = function() {
    return this.valueA + this.valueB + this.valueC;
  };
  Object.assign(this, json); // will merge the argument into the instance
}
var o = new myObject(json); // reconstruct using original data
console.log(o.add());
If you now have children stored in an array, you simply repeat the process recursively down the chain (see the sketch below).
(A bonus is that you can also pass options this way).
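As a rough sketch of that recursive step (the Child and Parent constructors here are illustrative, not from the question), each plain child object produced by JSON.parse is re-wrapped with its own constructor:
function Child(json) {
  this.value = 0;
  Object.assign(this, json);
}
function Parent(json) {
  this.children = [];
  Object.assign(this, json);
  // Re-wrap the plain child objects as Child instances
  this.children = this.children.map(function (c) { return new Child(c); });
}
var parent = new Parent(JSON.parse('{"children":[{"value":1},{"value":2}]}'));
console.log(parent.children[0] instanceof Child); // true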

How to use a dynamic variable in JavaScript?

I need to create something like:
var term = new Terminal();
each time I click on a button. I have found that we can create a dynamic variable like this in JavaScript:
window["term_" + _idContainer] = new Terminal({
cursorBlink: true,
});
But I'm not sure about this because I can only use the last one that I create.
Could someone tell me whether this really creates dynamic variables and whether they are overwritten each time we create one?
You could use an object, without polluting the global space, like
var collection = Object.create(null); // empty object without a prototype
// use
collection["term_" + _idContainer] = new Terminal({ cursorBlink: true });
JavaScript objects are effectively dictionary-style objects. Therefore, you can add a property to any object in these two ways:
myobj.newProp = 'I am new!';
myobj['newProp2'] = 'So am I';
And therefore, yes, what you are doing is creating a sequence of new properties on the window object. There is no reason one would overwrite another, unless you neglected to increment _idContainer.
I should add that adding variables to the window object is not a popular thing to do and you could be adding many with this mechanism. Perhaps better would be to just add one, and expand it as needed:
window.termList = {};
// Then, in a loop or whatever
window.termList[_idContainer] = new Terminal({cursorBlink: true});
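Later, a specific instance can be retrieved by the same key, or you can walk all of them (a trivial sketch; _idContainer is whatever id was used when the terminal was created):
var existingTerm = window.termList[_idContainer];
Object.keys(window.termList).forEach(function (id) {
  console.log("terminal for container", id, window.termList[id]);
});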

Immutables and collections in JavaScript

I'm trying to get my head around how to use immutables in JavaScript/TypeScript without it taking all day. I'm not quite ready to take the dive into Immutable.js, because it seems to leave you high and dry as far as type safety goes.
So let's take an example where I have an Array where the elements are all of Type MyType. In my Class, I have a method that searches the Array and returns a copy of a matching element so we don't edit the original. Say now that at a later time, I need to look and see if the object is in the Array, but what I have is the copy, not the original.
What is the standard method of handling this? Any method I can think of to determine whether I already have this item is going to take some form of looping through the collection and visiting each element and then doing a clunky equality match, whether that's turning both of them to strings or using a third-party library.
I'd like to use Immutables, but I keep running into situations like this that make them look pretty unattractive. What am I missing?
I suspect that my solution is not "...the standard method of handling this." However, I think it at least is a way of doing what I think you're asking.
You write that you have a method that "...returns a copy of a matching element so we don't edit the original". Could you change that method so that it instead returns both the original and a copy?
As an example, the strategy below involves retrieving both an original element from the array (which can later be used to search by reference) as well as a clone (which can be manipulated as needed without affecting the original). There is still the cost of cloning the original during retrieval, but at least you don't have to do such conversions for every element in the array when you later search the array. Moreover, it even allows you to differentiate between array elements that are identical-by-value, something that would be impossible if you only originally retrieved a copy of an element. The code below demonstrates this by making every array element identical-by-value (but, by definition of what objects are, different-by-reference).
I don't know if this violates other immutability best practices by, e.g., keeping references to the original elements (which, I suppose, leaves the code open to future violations of immutability even if none are currently occurring... though you could deep-freeze the original to prevent future mutations). However, it at least allows you to keep everything technically immutable while still being able to search by reference. Thus you can mutate your clone as much as you want and still always hold onto an associated reference to the original.
const retrieveDerivative = (array, elmtNum) => {
  const orig = array[elmtNum];
  const clone = JSON.parse(JSON.stringify(orig));
  return {orig, clone};
};
const getIndexOfElmt = (array, derivativeOfElement) => {
  return array.indexOf(derivativeOfElement.orig);
};
const obj1 = {a: {b: 1}}; // Object #s are irrelevant.
const obj3 = {a: {b: 1}}; // Note that all objects are identical
const obj5 = {a: {b: 1}}; // by value and thus can only be
const obj8 = {a: {b: 1}}; // differentiated by reference.
const myArr = [obj3, obj5, obj1, obj8];
const derivedFromSomeElmt = retrieveDerivative(myArr, 2);
const indexOfSomeElmt = getIndexOfElmt(myArr, derivedFromSomeElmt);
console.log(indexOfSomeElmt);
The situation you've described is one where a mutable datastructure has obvious advantages, but if you otherwise benefit from using immutables there are better approaches.
While keeping it immutable means that your new updated object is completely new, that cuts both ways: you may have a new object, but you also still have access to the original object! You can do a lot of neat things with this, e.g. chain your objects so you have an undo-history, and can go back in time to roll back changes.
So don't use some hacky looking-up-the-properties in the array. The problem with your example is because you're building a new object at the wrong time: don't have a function return a copy of the object. Have the function return the original object, and call your update using the original object as an index.
let myThings = [new MyType(), new MyType(), new MyType()];

// We update by taking the thing, and replacing it with a new one.
// I'll keep the array immutable too
function replaceThing(oldThing, newThing) {
  const oldIndex = myThings.indexOf(oldThing);
  myThings = myThings.slice();
  myThings[oldIndex] = newThing;
  return myThings;
}

// then when I want to update it
// Keep immutable by spreading
const redThing = myThings.find(({ red }) => red);
if (redThing) {
  // In this example, there is a 'clone' method
  replaceThing(redThing, Object.assign(redThing.clone(), {
    newProperty: 'a new value in my immutable!',
  }));
}
All that said, classes make this a whole lot more complex too. It's much easier to keep simple objects immutable, since you can simply spread the old object into the new one, e.g. { ...redThing, newProperty: 'a new value' }. Once your objects are more than one level deep, you may find Immutable.js far more useful, since you can mergeDeep.
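To illustrate why depth matters (a small sketch with illustrative names): plain spreads only copy one level, so every nested level has to be spread by hand.
const state = { user: { name: 'Ada', prefs: { theme: 'dark' } } };
const next = {
  ...state,
  user: { ...state.user, prefs: { ...state.user.prefs, theme: 'light' } },
};
console.log(state.user.prefs.theme); // 'dark'  — the original is untouched
console.log(next.user.prefs.theme);  // 'light'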

Saving/Loading a complex object structure in Javascript (Node.js)

I have been coding in javascript for some time, but am fairly new to Node. I recently undertook a project that involves a complex object structure with multiple levels of prototypical inheritance and sub objects. This structure needs to be periodically saved / loaded. Saving and loading in JSON is desirable.
The Question
Is there a more elegant way of accomplishing the task of saving/loading these complex Javascript objects than my current method (outlined below)? Is it possible to design it in such a way where the constructors can initialize themselves as if they were normal objects without being bound by all of the restoring functionality?
My Solution
The base 'class' (from which, by design, all other objects under consideration inherit prototypically) has a function which processes an 'options' argument, adding all of its properties to the current object. All deriving objects must include an options argument as the last argument and call the processing function in their constructor.
Each object must also add its function name to a specific property so that the correct constructor function can be called when the object needs to be rebuilt.
An unpack function takes the saved object JSON, creates a plain object with JSON.parse and then passes that object in as the 'options' argument to the object's constructor.
Each object is given a unique id and stored in a lookup table, so that an object under construction with links to other objects can point to the right ones, or create them if it needs to.
Here is a plunker which demonstrates the idea (obviously in a non-Node way).
If you don't want to load the plunker, here's an excerpt which should hopefully provide the gist of what I'm trying to do:
function BaseClass(name, locale, options){
  if(name) this.name = name;
  if(locale) this.locale = locale;
  // If options are defined, apply them
  this.processOptions(options);
  // create the classList array which keeps track of
  // the object's prototype chain
  this._classList = [arguments.callee.name];
  // Create a unique id for the object and add it to
  // the lookup table
  if(!this.id) this.id = numEntities++;
  lookupTable[this.id] = this;
  if(!this.relations) this.relations = [];
  // other initialization stuff
}
BaseClass.prototype = {
  processOptions: function(options) {
    if(options && !options._processed){
      for(var key in options){
        if(options.hasOwnProperty(key)){
          this[key] = options[key];
        }
      }
      options._processed = true;
    }
  },
  addNewRelation: function(otherObj){
    this.relations.push(otherObj.id);
  }
  // Other functions and such for the base object
};
function DerivedClassA(name, locale, age, options){
  if(age) this.age = age;
  this.processOptions(options);
  if(options && options.subObj){
    // Get the sub object if it already exists
    if(lookupTable[options.subObj.id]){
      this.subObj = lookupTable[options.subObj.id];
    }
    // Otherwise, create it from the options
    else {
      this.subObj = new OtherDerivedClass(options.subObj);
    }
  }
  else {
    // If no options then construct as normal
    this.subObj = new OtherDerivedClass();
  }
  // If something needs to be done before calling the super
  // constructor, it's done here.
  BaseClass.call(this, name, locale, options);
  this._classList.push(arguments.callee.name);
}
DerivedClassA.prototype = Object.create(BaseClass.prototype);
As mentioned, this gets the job done, but I can't help feeling that it could be much better. It seems to impose a ridiculous number of restrictions on the inheriting 'classes' and how their constructors must behave. It makes a specific order of execution critical, and requires that each object be deeply involved in and aware of the restoration process, which is far from ideal.

Copy and extend global objects in javascript

Is there a way to copy a global object (Array, String...) and then extend the prototype of the copy without affecting the original one? I've tried this:
var copy=Array;
copy.prototype.test=2;
But if I check Array.prototype.test it's 2, because copy and Array refer to the same object. I want to know if there's a way to make the "copy" variable behave like an array but be extendable without affecting the original Array object.
Good question. I have a feeling you might have to write a wrapper class for this. What you're essentially doing with copy.prototype.test=2 is setting a class prototype which will (of course) be visible for all instances of that class.
I think the reason the example in http://dean.edwards.name/weblog/2006/11/hooray/ doesn't work is that it uses an anonymous function. So, instead of the following:
// create the constructor
var Array2 = function() {
  // initialise the array
};
// inherit from Array
Array2.prototype = new Array;
// add some sugar
Array2.prototype.each = function(iterator) {
  // iterate
};
you would want something like this:
function Array2() {
}
Array2.prototype = new Array();
From my own testing, the length property is maintained in IE with this inheritance. Also, anything added to Array2.prototype does not appear to be added to Array.prototype (see the check below). Hope this helps.
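As a quick check of that last claim (a small sketch; the each body here is only illustrative):
Array2.prototype.each = function (iterator) {
  for (var i = 0; i < this.length; i++) iterator(this[i]);
};
var a2 = new Array2();
a2.push(1, 2, 3);             // push is generic, so it works on Array2 instances
console.log(typeof a2.each);  // "function"
console.log(typeof [].each);  // "undefined" — Array.prototype is untouched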
Instead of extending the prototype, why don't you simply extend the copy variable? For example, by adding a function:
copy.newFunction = function(pParam1) {
  alert(pParam1);
};
