Adding a custom compare function for sets (and other containers) [duplicate]

ES6 (Harmony) introduces the new Set object. The identity algorithm used by Set is similar to the === operator, so it is not well suited for comparing objects:
var set = new Set();
set.add({a:1});
set.add({a:1});
console.log([...set.values()]); // Array [ Object, Object ]
How to customize equality for Set objects in order to do deep object comparison? Is there anything like Java equals(Object)?

Update 3/2022
There is currently a proposal to add Records and Tuples (basically immutable Objects and Arrays) to JavaScript. That proposal offers direct comparison of Records and Tuples using === or !==, comparing values rather than just object references, and, relevant to this answer, both Set and Map objects would use the value of the Record or Tuple in key comparisons/lookups, which would solve what is being asked for here.
Since the Records and Tuples are immutable (can't be modified) and because they are easily compared by value (by their contents, not just their object reference), it allows Maps and Sets to use object contents as keys and the proposed spec explicitly names this feature for Sets and Maps.
This original question asked for customizability of a Set comparison in order to support deep object comparison. This doesn't propose customizability of the Set comparison, but it directly supports deep object comparison if you use the new Record or a Tuple instead of an Object or an Array and thus would solve the original problem here.
Note, this proposal advanced to Stage 2 in mid-2021. It has been moving forward recently, but is certainly not done.
Mozilla work on this new proposal can be tracked here.
Original Answer
The ES6 Set object does not have any compare methods or custom compare extensibility.
The .has(), .add() and .delete() methods work only off it being the same actual object or same value for a primitive and don't have a means to plug into or replace just that logic.
You could presumably derive your own object from a Set and replace .has(), .add() and .delete() methods with something that did a deep object comparison first to find if the item is already in the Set, but the performance would likely not be good since the underlying Set object would not be helping at all. You'd probably have to just do a brute force iteration through all existing objects to find a match using your own custom compare before calling the original .add().
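For illustration only, a minimal sketch of that brute-force approach; deepEqual here is a naive placeholder that you would replace with whatever deep comparison you trust (a library, or a real recursive check):
const deepEqual = (a, b) => JSON.stringify(a) === JSON.stringify(b); // naive placeholder

class DeepSet extends Set {
  add(item) {
    for (const existing of this) {
      if (deepEqual(existing, item)) return this; // already present, keep the first one
    }
    return super.add(item);
  }
  has(item) {
    for (const existing of this) {
      if (deepEqual(existing, item)) return true;
    }
    return false;
  }
  delete(item) {
    for (const existing of this) {
      if (deepEqual(existing, item)) return super.delete(existing);
    }
    return false;
  }
}

const s = new DeepSet();
s.add({ a: 1 });
s.add({ a: 1 });
console.log([...s].length); // 1
As described above, every operation is O(n) over the set contents, since the underlying Set cannot help with the custom comparison.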
Here's some info from this article and discussion of ES6 features:
5.2 Why can’t I configure how maps and sets compare keys and values?
Question: It would be nice if there were a way to configure what map
keys and what set elements are considered equal. Why isn’t there?
Answer: That feature has been postponed, as it is difficult to
implement properly and efficiently. One option is to hand callbacks to
collections that specify equality.
Another option, available in Java, is to specify equality via a method
that object implement (equals() in Java). However, this approach is
problematic for mutable objects: In general, if an object changes, its
“location” inside a collection has to change, as well. But that’s not
what happens in Java. JavaScript will probably go the safer route of
only enabling comparison by value for special immutable objects
(so-called value objects). Comparison by value means that two values
are considered equal if their contents are equal. Primitive values are
compared by value in JavaScript.

As mentioned in jfriend00's answer, customization of the equality relation is probably not possible.
The following code presents an outline of a computationally efficient (but memory expensive) workaround:
class GeneralSet {
  constructor() {
    this.map = new Map();
    this[Symbol.iterator] = this.values;
  }

  add(item) {
    this.map.set(item.toIdString(), item);
  }

  values() {
    return this.map.values();
  }

  delete(item) {
    return this.map.delete(item.toIdString());
  }

  // ...
}
Each inserted element has to implement a toIdString() method that returns a string. Two objects are considered equal if and only if their toIdString methods return the same value.
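For illustration, a hypothetical element type that works with the GeneralSet sketch above (the Point class and its id format are made up):
class Point {
  constructor(x, y) {
    this.x = x;
    this.y = y;
  }
  toIdString() {
    return `${this.x}|${this.y}`; // equal coordinates produce equal id strings
  }
}

const points = new GeneralSet();
points.add(new Point(1, 2));
points.add(new Point(1, 2)); // same id string, so it replaces the first entry
console.log([...points].length); // 1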

As the top answer mentions, customizing equality is problematic for mutable objects. The good news is (and I'm surprised no one has mentioned this yet) there's a very popular library called immutable-js that provides a rich set of immutable types which provide the deep value equality semantics you're looking for.
Here's your example using immutable-js:
const { Map, Set } = require('immutable');
var set = new Set();
set = set.add(Map({a:1}));
set = set.add(Map({a:1}));
console.log([...set.values()]); // [Map {"a" => 1}]

Maybe you can try to use JSON.stringify() to do deep object comparison.
For example:
const arr = [
  {name:'a', value:10},
  {name:'a', value:20},
  {name:'a', value:20},
  {name:'b', value:30},
  {name:'b', value:40},
  {name:'b', value:40}
];
const names = new Set();
const result = arr.filter(item => !names.has(JSON.stringify(item)) ? names.add(JSON.stringify(item)) : false);
console.log(result);

To add to the answers here, I went ahead and implemented a Map wrapper that takes a custom hash function, a custom equality function, and stores distinct values that have equivalent (custom) hashes in buckets.
Predictably, it turned out to be slower than czerny's string concatenation method.
Full source here: https://github.com/makoConstruct/ValueMap

Comparing them directly does not seem possible, but JSON.stringify works if the keys are sorted first. As I pointed out in a comment:
JSON.stringify({a:1, b:2}) !== JSON.stringify({b:2, a:1});
But we can work around that with a custom stringify method. First we write the method
Custom Stringify
Object.prototype.stringifySorted = function() {
  let oldObj = this;
  let obj = (oldObj.length || oldObj.length === 0) ? [] : {};
  for (let key of Object.keys(this).sort((a, b) => a.localeCompare(b))) {
    let type = typeof (oldObj[key])
    if (type === 'object') {
      obj[key] = oldObj[key].stringifySorted();
    } else {
      obj[key] = oldObj[key];
    }
  }
  return JSON.stringify(obj);
}
The Set
Now we use a Set. But we use a Set of Strings instead of objects
let set = new Set()
set.add({a:1, b:2}.stringifySorted());
set.has({b:2, a:1}.stringifySorted());
// returns true
Get all the values
After we created the set and added the values, we can get all values by
let iterator = set.values();
let done = false;
while (!done) {
  let val = iterator.next();
  if (!val.done) {
    console.log(val.value);
  }
  done = val.done;
}
Here's a link with all in one file
http://tpcg.io/FnJg2i

For TypeScript users the answers by others (especially czerny) can be generalized to a nice type-safe and reusable base class:
/**
 * Map that stringifies the key objects in order to leverage
 * the javascript native Map and preserve key uniqueness.
 */
abstract class StringifyingMap<K, V> {
  private map = new Map<string, V>();
  private keyMap = new Map<string, K>();

  has(key: K): boolean {
    let keyString = this.stringifyKey(key);
    return this.map.has(keyString);
  }
  get(key: K): V {
    let keyString = this.stringifyKey(key);
    return this.map.get(keyString);
  }
  set(key: K, value: V): StringifyingMap<K, V> {
    let keyString = this.stringifyKey(key);
    this.map.set(keyString, value);
    this.keyMap.set(keyString, key);
    return this;
  }

  /**
   * Puts new key/value if key is absent.
   * @param key key
   * @param defaultValue default value factory
   */
  putIfAbsent(key: K, defaultValue: () => V): boolean {
    if (!this.has(key)) {
      let value = defaultValue();
      this.set(key, value);
      return true;
    }
    return false;
  }
  keys(): IterableIterator<K> {
    return this.keyMap.values();
  }
  keyList(): K[] {
    return [...this.keys()];
  }
  delete(key: K): boolean {
    let keyString = this.stringifyKey(key);
    let flag = this.map.delete(keyString);
    this.keyMap.delete(keyString);
    return flag;
  }
  clear(): void {
    this.map.clear();
    this.keyMap.clear();
  }
  size(): number {
    return this.map.size;
  }

  /**
   * Turns the `key` object to a primitive `string` for the underlying `Map`
   * @param key key to be stringified
   */
  protected abstract stringifyKey(key: K): string;
}
Example implementation is then this simple: just override the stringifyKey method. In my case I stringify some uri property.
class MyMap extends StringifyingMap<MyKey, MyValue> {
  protected stringifyKey(key: MyKey): string {
    return key.uri.toString();
  }
}
Example usage is then as if this was a regular Map<K, V>.
const key1 = new MyKey(1);
const value1 = new MyValue(1);
const value2 = new MyValue(2);
const myMap = new MyMap();
myMap.set(key1, value1);
myMap.set(key1, value2); // native Map would put another key/value pair
myMap.size(); // returns 1, not 2

A good stringification method for the special but frequent case of a TypedArray as Set/Map key is using
const key = String.fromCharCode(...new Uint16Array(myArray.buffer));
It generates the shortest possible unique string that can be easily converted back. However, it is not always a valid UTF-16 string for display, because of low and high surrogates. Set and Map seem to ignore surrogate validity.
As measured in Firefox and Chrome, the spread operator performs slowly. If your myArray has fixed size, it executes faster when you write:
const a = new Uint16Array(myArray.buffer); // here: myArray = Uint32Array(2) = 8 bytes
const key = String.fromCharCode(a[0],a[1],a[2],a[3]); // 8 bytes too
Probably the most valuable advantage of this method of key-building: It works for Float32Array and Float64Array without any rounding side-effect. Note that +0 and -0 are then different. Infinities are same. Silent NaNs are same. Signaling NaNs are different depending on their signal (never seen in vanilla JavaScript).
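A small round-trip sketch of this key-building idea (the Float64Array sample values are mine; converting back per code unit avoids surrogate-pair problems):
const myArray = new Float64Array([1.5, -0.25]);
const key = String.fromCharCode(...new Uint16Array(myArray.buffer));

// Converting the key string back into the original typed array:
const codes = new Uint16Array(key.length);
for (let i = 0; i < key.length; i++) {
  codes[i] = key.charCodeAt(i); // one code unit per original Uint16 value
}
const restored = new Float64Array(codes.buffer);
console.log(restored); // Float64Array [ 1.5, -0.25 ]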

As others have said, there is no native method that can do this so far.
But if you would like to distinguish an array with your custom comparator, you can try to do it with the reduce method.
function distinct(array, equal) {
  // No need to convert it to a Set object since it may give you a wrong signal that the set can work with your objects.
  return array.reduce((p, c) => {
    p.findIndex((element) => equal(element, c)) > -1 || p.push(c);
    return p;
  }, []);
}

// You can call this method like below,
const users = distinct(
  [
    {id: 1, name: "kevin"},
    {id: 2, name: "sean"},
    {id: 1, name: "jerry"}
  ],
  (a, b) => a.id === b.id
);
...

As others have said, there is no way to do it with the current version of Set.
My suggestion is to do it using a combination of arrays and maps.
The code snippet below will create a map of unique keys based on your own defined key and then transform that map of unique items into an array.
const array = [
  { "name": "Joe", "age": 17 },
  { "name": "Bob", "age": 17 },
  { "name": "Carl", "age": 35 }
]
const key = 'age';
const arrayUniqueByKey = [...new Map(array.map(item =>
  [item[key], item])).values()];

console.log(arrayUniqueByKey);

/* OUTPUT
[
  { "name": "Bob", "age": 17 },
  { "name": "Carl", "age": 35 }
]
*/
// Note: this will pick the last duplicated item in the list.

To someone who found this question on Google (as I did) wanting to get a value from a Map using an object as the key:
Warning: this answer will not work with all objects
var map = new Map<string,string>();
map.set(JSON.stringify({"A":2} /*string of object as key*/), "Worked");
console.log(map.get(JSON.stringify({"A":2}))||"Not worked");
Output:
Worked

Related

Check if object already exists in object

I want to check if an object already exists in a given object by only having the object.
For instance:
const information = {
  ...
  city: {
    Streetname: ''
  }
}
Now, I get the city object and want to check if it is already in the information object (without knowing the property name). The city could be n deep in the information object.
To get the property name of an object you can use Object.keys(). The first problem solved.
Now we need to iterate through the whole object including nested objects. This is the second problem.
And compare it to a query object. This is the third problem.
I assume that we have an object that only contains "simple" though nested objects with primitive values (I do not consider objects with functions or arrays)
// let's assume we have this object
const information = {
  city: {
    Streetname: 'streetname1'
  },
  house: {
    color: "blue",
    height: 100,
    city: {
      findMe: { Streetname: '' } // we want to get the path to this property 'findMe'
    }
  },
  findMeToo: {
    Streetname: '' // we also want to get the path to this property 'findMeToo'
  },
  willNotFindMe: {
    streetname: '' // case sensitive
  }
}

// this is our object we want to use to find the property name with
const queryObject = {
  Streetname: ''
}
If you use === to compare Objects you will always compare by reference. In our case, we are interested in comparing the values. There is rather extensive checking involved if you want to do it for more complex objects (read this SO comment for details); we will use a simplistic version:
// Note that this only evaluates to true if EVERYTHING is equal.
// This includes the order of the properties, since we are eventually comparing strings here.
JSON.stringify(obj1) === JSON.stringify(obj2)
Before we start to implement our property pathfinder I will introduce a simple function to check if a given value is an Object or a primitive value.
function isObject(obj) {
  return obj === Object(obj); // if you pass a string it will create an object and compare it to a string and thus result to false
}
We use this function to know when to stop diving deeper since we reached a primitive value which does not contain any further objects. We loop through the whole object and dive deeper every time we find a nested object.
function findPropertyPath(obj, currentPropertyPath) {
  const keys = isObject(obj) ? Object.keys(obj) : []; // if it is not an Object we want to assign an empty array or Object.keys() will implicitly cast a String to an array object
  const previousPath = currentPropertyPath; // set to the parent node
  keys.forEach(key => {
    const currentObj = obj[key];
    currentPropertyPath = `${previousPath}.${key}`;
    if (JSON.stringify(currentObj) === JSON.stringify(queryObject)) console.log(currentPropertyPath); // this is what we are looking for
    findPropertyPath(currentObj, currentPropertyPath); // since we are using recursion this is not suited for deeply nested objects
  })
}
findPropertyPath(information, "information"); // call the function with the root key
This will find all "property paths" that contain an object that is equal to your query object (compared by value) using recursion.
information.house.city.findMe
information.findMeToo
const contains = (item, data) => item === data || Object.getOwnPropertyNames(data).some(prop => contains(item, data[prop]));
const information = {
  city: {
    Streetname: ''
  }
}
console.log(contains(information.city, information));
console.log(contains({}, information));

How to map over arbitrary Iterables?

I wrote a reduce function for Iterables and now I want to derive a generic map that can map over arbitrary Iterables. However, I have encountered an issue: Since Iterables abstract the data source, map couldn't determine the type of it (e.g. Array, String, Map etc.). I need this type to invoke the corresponding identity element/concat function. Three solutions come to mind:
pass the identity element/concat function explicitly const map = f => id => concat => xs (this is verbose and would leak internal API though)
only map Iterables that implement the monoid interface (that were cool, but introducing new types?)
rely on the prototype or constructor identity of ArrayIterator,StringIterator, etc.
I tried the latter but isPrototypeOf/instanceof always yield false no matter what I do, for instance:
Array.prototype.values.prototype.isPrototypeOf([].values()); // false
Array.prototype.isPrototypeOf([].values()); // false
My questions:
Where are the prototypes of ArrayIterator/StringIterator/...?
Is there a better approach that solves the given issue?
Edit: [][Symbol.iterator]() and ("")[Symbol.iterator]() seem to share the same prototype:
Object.getPrototypeOf(Object.getPrototypeOf([][Symbol.iterator]())) ===
Object.getPrototypeOf(Object.getPrototypeOf(("")[Symbol.iterator]()))
A distinction by prototypes seems not to be possible.
Edit: Here is my code:
const values = o => keys(o).values();
const next = iter => iter.next();
const foldl = f => acc => iter => {
  let loop = (acc, {value, done}) => done
    ? acc
    : loop(f(acc) (value), next(iter));
  return loop(acc, next(iter));
}
// static `map` version only for `Array`s - not what I desire
const map = f => foldl(acc => x => [...acc, f(x)]) ([]);
console.log( map(x => x + x) ([1,2,3].values()) ); // A
console.log( map(x => x + x) (("abc")[Symbol.iterator]()) ); // B
The code in line A yields the desired result. However, B yields an Array instead of a String, and the concatenation only works because Strings and Numbers are coincidentally equivalent in this regard.
Edit: There seems to be confusion for what reason I do this: I want to use the iterable/iterator protocol to abstract iteration details away, so that my fold/unfold and derived map/filter etc. functions are generic. The problem is, that you can't do this without also having a protocol for identity/concat. And my little "hack" to rely on prototype identity didn't work out.
#redneb made a good point in his response and I agree with him that not every iterable is also a "mappable". However, keeping that in mind I still think it is meaningful - at least in Javascript - to utilize the protocol in this way, until maybe in future versions there is a mappable or collection protocol for such usage.
I have not used the iterable protocol before, but it seems to me that it is essentially an interface designed to let you iterate over container objects using a for loop. The problem is that you are trying to use that interface for something that it was not designed for. For that you would need a separate interface. It is conceivable that an object might be "iterable" but not "mappable". For example, imagine that in an application we are working with binary trees and we implement the iterable interface for them by traversing them say in BFS order, just because that order makes sense for this particular application. How would a generic map work for this particular iterable? It would need to return a tree of the "same shape", but this particular iterable implementation does not provide enough information to reconstruct the tree.
So the solution to this is to define a new interface (call it Mappable, Functor, or whatever you like), but it has to be a distinct interface. Then you can implement that interface for types where it makes sense, such as arrays.
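A minimal JavaScript sketch of such a distinct interface, assuming a symbol-keyed method as the protocol (the symbol name and the two implementations are illustrative, not any standard):
const mappable = Symbol('mappable');

// Arrays know how to rebuild themselves from mapped values.
Array.prototype[mappable] = function (f) {
  return this.map(f);
};

// Strings rebuild a string, not an array.
String.prototype[mappable] = function (f) {
  return Array.from(this, f).join('');
};

// Generic map dispatches on the interface, not on the iterator.
const map = f => xs => xs[mappable](f);

console.log(map(x => x + x)([1, 2, 3]));        // [2, 4, 6]
console.log(map(c => c.toUpperCase())('abc'));  // "ABC"
A type that is iterable but not "mappable" (like the BFS-ordered tree mentioned above) would simply not implement the symbol.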
Pass the identity element/concat function explicitly const map = f => id => concat => xs
Yes, this is almost always necessary if the xs parameter doesn't expose the functionality to construct new values. In Scala, every collection type features a builder for this, unfortunately there is nothing in the ECMAScript standard that matches this.
only map Iterables that implement the monoid interface
Well, yes, that might be one way to go. You don't even need to introduce "new types", a standard for this already exists with the Fantasyland specification. The downsides however are
most builtin types (String, Map, Set) don't implement the monoid interface despite being iterable
not all "mappables" are even monoids!
On the other hand, not all iterables are necessarily mappable. Trying to write a map over arbitrary iterables without falling back to an Array result is doomed to fail.
So rather just look for the Functor or Traversable interfaces, and use them where they exist. They might internally be built on an iterator, but that should not concern you. The only thing you might want to do is to provide a generic helper for creating such iterator-based mapping methods, so that you can e.g. decorate Map or String with it. That helper might as well take a builder object as a parameter.
rely on the prototype or constructor identity of ArrayIterator, StringIterator, etc.
That won't work, for example typed arrays are using the same kind of iterator as normal arrays. Since the iterator does not have a way to access the iterated object, you cannot distinguish them. But you really shouldn't anyway, as soon as you're dealing with the iterator itself you should at most map to another iterator but not to the type of iterable that created the iterator.
Where are the prototypes of ArrayIterator/StringIterator/...?
There are no global variables for them, but you can access them by using Object.getPrototypeOf after creating an instance.
You could compare the object strings, though this is not foolproof, as there have been known bugs in certain environments and in ES6 the user can modify these strings.
console.log(Object.prototype.toString.call(""[Symbol.iterator]()));
console.log(Object.prototype.toString.call([][Symbol.iterator]()));
Update: You could get more reliable results by testing an iterator's callability against a known object of each type; it does require a fully ES6 spec-compliant environment. Something like this:
var sValues = String.prototype[Symbol.iterator];
var testString = 'abc';

function isStringIterator(value) {
  if (value === null || typeof value !== 'object') {
    return false;
  }
  try {
    return value.next.call(sValues.call(testString)).value === 'a';
  } catch (ignore) {}
  return false;
}

var aValues = Array.prototype.values;
var testArray = ['a', 'b', 'c'];

function isArrayIterator(value) {
  if (value === null || typeof value !== 'object') {
    return false;
  }
  try {
    return value.next.call(aValues.call(testArray)).value === 'a';
  } catch (ignore) {}
  return false;
}

var mapValues = Map.prototype.values;
var testMap = new Map([
  [1, 'MapSentinel']
]);

function isMapIterator(value) {
  if (value === null || typeof value !== 'object') {
    return false;
  }
  try {
    return value.next.call(mapValues.call(testMap)).value === 'MapSentinel';
  } catch (ignore) {}
  return false;
}

var setValues = Set.prototype.values;
var testSet = new Set(['SetSentinel']);

function isSetIterator(value) {
  if (value === null || typeof value !== 'object') {
    return false;
  }
  try {
    return value.next.call(setValues.call(testSet)).value === 'SetSentinel';
  } catch (ignore) {}
  return false;
}
var string = '';
var array = [];
var map = new Map();
var set = new Set();
console.log('string');
console.log(isStringIterator(string[Symbol.iterator]()));
console.log(isArrayIterator(string[Symbol.iterator]()));
console.log(isMapIterator(string[Symbol.iterator]()));
console.log(isSetIterator(string[Symbol.iterator]()));
console.log('array');
console.log(isStringIterator(array[Symbol.iterator]()));
console.log(isArrayIterator(array[Symbol.iterator]()));
console.log(isMapIterator(array[Symbol.iterator]()));
console.log(isSetIterator(array[Symbol.iterator]()));
console.log('map');
console.log(isStringIterator(map[Symbol.iterator]()));
console.log(isArrayIterator(map[Symbol.iterator]()));
console.log(isMapIterator(map[Symbol.iterator]()));
console.log(isSetIterator(map[Symbol.iterator]()));
console.log('set');
console.log(isStringIterator(set[Symbol.iterator]()));
console.log(isArrayIterator(set[Symbol.iterator]()));
console.log(isMapIterator(set[Symbol.iterator]()));
console.log(isSetIterator(set[Symbol.iterator]()));
<script src="https://cdnjs.cloudflare.com/ajax/libs/es6-shim/0.35.1/es6-shim.js"></script>
Note: included ES6-shim because Chrome does not currently support Array#values
I know this question was posted quite a while back, but take a look at
https://www.npmjs.com/package/fluent-iterable
It supports iterable maps along with ~50 other methods.
Using the iter-ops library, you can apply any processing logic while iterating only once:
import {pipe, map, concat} from 'iter-ops';
// some arbitrary iterables:
const iterable1 = [1, 2, 3];
const iterable2 = 'hello'; // strings are also iterable
const i1 = pipe(
  iterable1,
  map(a => a * 2)
);

console.log([...i1]); //=> 2, 4, 6

const i2 = pipe(
  iterable1,
  map(a => a * 3),
  concat(iterable2)
);
console.log([...i2]); //=> 3, 6, 9, 'h', 'e', 'l', 'l', 'o'
There's a plethora of operators in the library that you can use with iterables.
There's no clean way to do this for an arbitrary iterable. It is possible to create a map for built-in iterables and refer to it.
const iteratorProtoMap = [String, Array, Map, Set]
  .map(ctor => [
    Object.getPrototypeOf((new ctor)[Symbol.iterator]()),
    ctor
  ])
  .reduce((map, entry) => map.set(...entry), new Map);

function getCtorFromIterator(iterator) {
  return iteratorProtoMap.get(Object.getPrototypeOf(iterator));
}
With a possibility of custom iterables an API for adding them can also be added.
To provide a common pattern for concatenating/constructing the desired iterable, a callback can be provided for the map instead of a constructor, as sketched below.
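A hedged sketch of that builder-callback idea, reusing getCtorFromIterator from above (the builders map and mapIterable name are illustrative):
const builders = new Map([
  [String, values => values.join('')],
  [Array,  values => values],
  [Map,    values => new Map(values)],
  [Set,    values => new Set(values)],
]);

function mapIterable(f, iterable) {
  const ctor = getCtorFromIterator(iterable[Symbol.iterator]());
  const build = builders.get(ctor) || (values => values); // fall back to a plain Array
  return build([...iterable].map(f));
}

console.log(mapIterable(x => x * 2, [1, 2, 3]));       // [2, 4, 6]
console.log(mapIterable(c => c.toUpperCase(), 'abc')); // "ABC"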


How do I persist a ES6 Map in localstorage (or elsewhere)?

var a = new Map([[ 'a', 1 ]]);
a.get('a') // 1
var forStorageSomewhere = JSON.stringify(a);
// Store, in my case, in localStorage.
// Later:
var a = JSON.parse(forStorageSomewhere);
a.get('a') // TypeError: undefined is not a function
Unfortunately, JSON.stringify(a) simply returns '{}', which means a becomes an empty object when restored.
I found es6-mapify that allows up/down-casting between a Map and a plain object, so that might be one solution, but I was hoping I wouldn't need to resort to an external dependency simply to persist my map.
Assuming that both your keys and your values are serialisable,
localStorage.myMap = JSON.stringify(Array.from(map.entries()));
should work. For the reverse, use
map = new Map(JSON.parse(localStorage.myMap));
Clean as a whistle:
JSON.stringify([...myMap])
Usually, serialization is only useful if this property holds
deserialize(serialize(data)).get(key) ≈ data.get(key)
where a ≈ b could be defined as serialize(a) === serialize(b).
This is satisfied when serializing an object to JSON:
var obj1 = {foo: [1,2]},
obj2 = JSON.parse(JSON.stringify(obj1));
obj1.foo; // [1,2]
obj2.foo; // [1,2] :)
JSON.stringify(obj1.foo) === JSON.stringify(obj2.foo); // true :)
And this works because properties can only be strings, which can be losslessly serialized into strings.
However, ES6 maps allow arbitrary values as keys. This is problematic because objects are uniquely identified by their reference, not their data, and when serializing objects you lose the references.
var key = {},
map1 = new Map([ [1,2], [key,3] ]),
map2 = new Map(JSON.parse(JSON.stringify([...map1.entries()])));
map1.get(1); // 2
map2.get(1); // 2 :)
map1.get(key); // 3
map2.get(key); // undefined :(
So I would say in general it's not possible to do it in a useful way.
And for those cases where it would work, most probably you can use a plain object instead of a map. This will also have these advantages:
It will be able to be stringified to JSON without losing key information.
It will work on older browsers.
It might be faster.
Building off of Oriol's answer, we can do a little better. We can still use object references for keys as long as there is a primitive root or entrance into the map, and each object key can be transitively found from that root key.
Modifying Oriol's example to use Douglas Crockford's JSON.decycle and JSON.retrocycle we can create a map that handles this case:
var key = {},
map1 = new Map([ [1, key], [key, 3] ]),
map2 = new Map(JSON.parse(JSON.stringify([...map1.entries()]))),
map3 = new Map(JSON.retrocycle(JSON.parse(JSON.stringify(JSON.decycle([...map1.entries()])))));
map1.get(1); // key
map2.get(1); // key
map3.get(1); // key
map1.get(map1.get(1)); // 3 :)
map2.get(map2.get(1)); // undefined :(
map3.get(map3.get(1)); // 3 :)
Decycle and retrocycle make it possible to encode cyclical structures and DAGs in JSON. This is useful if we want to build relations between objects without creating additional properties on those objects themselves, or want to interchangeably relate primitives to objects and vice versa, by using an ES6 Map.
The one pitfall is that we cannot use the original key object for the new map (map3.get(key); would return undefined). However, holding the original key reference, but a newly parsed JSON map seems like a very unlikely case to ever have.
If you implement your own toJSON() function for any class objects you have then just regular old JSON.stringify() will just work!
Maps with Arrays for keys? Maps with other Map as values? A Map inside a regular Object? Maybe even your own custom class; easy.
Map.prototype.toJSON = function() {
return Array.from(this.entries());
};
That's it!
prototype manipulation is required here. You could go around adding toJSON() manually to all your non-standard stuff, but really you're just avoiding the power of JS
DEMO
test = {
  regular: 'object',
  map: new Map([
    [['array', 'key'], 7],
    ['stringKey', new Map([
      ['innerMap', 'supported'],
      ['anotherValue', 8]
    ])]
  ])
};
console.log(JSON.stringify(test));
outputs:
{"regular":"object","map":[[["array","key"],7],["stringKey",[["innerMap","supported"],["anotherValue",8]]]]}
Deserialising all the way back to real Maps isn't as automatic, though. Using the above resultant string, I'll remake the maps to pull out a value:
test2 = JSON.parse(JSON.stringify(test));
console.log((new Map((new Map(test2.map)).get('stringKey'))).get('innerMap'));
outputs
"supported"
That's a bit messy, but with a little magic sauce you can make deserialisation automagic too.
Map.prototype.toJSON = function() {
  return ['window.Map', Array.from(this.entries())];
};

Map.fromJSON = function(key, value) {
  return (value instanceof Array && value[0] == 'window.Map')
    ? new Map(value[1])
    : value;
};
Now the JSON is
{"regular":"object","test":["window.Map",[[["array","key"],7],["stringKey",["window.Map",[["innerMap","supported"],["anotherValue",8]]]]]]}
And deserialising and use is dead simple with our Map.fromJSON
test2 = JSON.parse(JSON.stringify(test), Map.fromJSON);
console.log(test2.map.get('stringKey').get('innerMap'));
outputs (and no new Map()s used)
"supported"
DEMO
The accepted answer will fail when you have multi-dimensional Maps. One should always keep in mind that a Map object can take another Map object as a key or value.
So a better and safer way of handling this job could be as follows:
function arrayifyMap(m) {
  return m.constructor === Map
    ? [...m].map(([k, v]) => [arrayifyMap(k), arrayifyMap(v)])
    : m;
}
Once you have this tool then you can always do like;
localStorage.myMap = JSON.stringify(arrayifyMap(myMap))
// store
const mapObj = new Map([['a', 1]]);
localStorage.a = JSON.stringify(mapObj, replacer);
// retrieve
const newMapObj = JSON.parse(localStorage.a, reviver);
// required replacer and reviver functions
function replacer(key, value) {
const originalObject = this[key];
if(originalObject instanceof Map) {
return {
dataType: 'Map',
value: Array.from(originalObject.entries()), // or with spread: value: [...originalObject]
};
} else {
return value;
}
}
function reviver(key, value) {
if(typeof value === 'object' && value !== null) {
if (value.dataType === 'Map') {
return new Map(value.value);
}
}
return value;
}
I wrote the explanation about the replacer and reviver functions here: https://stackoverflow.com/a/56150320/696535
This code will work for any other value like regular JSON.stringify so there's no assumption that the serialised object must be a Map. It can also be a Map deeply nested in an array or an object.
One thing that is being left out is that Map is an ORDERED structure - i.e. when iterating, the first item entered will be the first listed.
This is NOT like a Javascript Object. I required this type of structure (so i used Map) and then to find out that JSON.stringify doesn't work is painful (but understandable).
I ended up making a 'value_to_json' function, which means parsing EVERYTHING -
using JSON.stringify only for the most basic 'types'.
Unfortunately subclassing Map with a .toJSON() doesn't work as it expects a value, not a JSON string. Also it is considered legacy.
My use case would be exceptional though.
related:
https://github.com/DavidBruant/Map-Set.prototype.toJSON/issues/16
JSON left out Infinity and NaN; JSON status in ECMAScript?
How to stringify objects containing ES5 Sets and Maps?
JSON stringify a Set
function value_to_json(value) {
  if (value === null) {
    return 'null';
  }
  if (value === undefined) {
    return 'null';
  }
  // DEAL WITH +/- INF at your leisure - null instead..
  const type = typeof value;
  // handle as much as possible that have no side effects. function could
  // return some MAP / SET -> TODO, but not likely
  if (['string', 'boolean', 'number', 'function'].includes(type)) {
    return JSON.stringify(value)
  } else if (Object.prototype.toString.call(value) === '[object Object]') {
    let parts = [];
    for (let key in value) {
      if (Object.prototype.hasOwnProperty.call(value, key)) {
        parts.push(JSON.stringify(key) + ': ' + value_to_json(value[key]));
      }
    }
    return '{' + parts.join(',') + '}';
  } else if (value instanceof Map) {
    let parts_in_order = [];
    value.forEach((entry, key) => {
      if (typeof key === 'string') {
        parts_in_order.push(JSON.stringify(key) + ':' + value_to_json(entry));
      } else {
        console.log('Non String KEYS in MAP not directly supported');
      }
      // FOR OTHER KEY TYPES ADD CUSTOM... 'Key' encoding...
    });
    return '{' + parts_in_order.join(',') + '}';
  } else if (typeof value[Symbol.iterator] !== "undefined") {
    // Other iterables like SET (also in ORDER)
    let parts = [];
    for (let entry of value) {
      parts.push(value_to_json(entry))
    }
    return '[' + parts.join(',') + ']';
  } else {
    return JSON.stringify(value)
  }
}
let m = new Map();
m.set('first', 'first_value');
m.set('second', 'second_value');
let m2 = new Map();
m2.set('nested', 'nested_value');
m.set('sub_map', m2);
let map_in_array = new Map();
map_in_array.set('key', 'value');
let set1 = new Set(["1", 2, 3.0, 4]);
m2.set('array_here', [map_in_array, "Hello", true, 0.1, null, undefined, Number.POSITIVE_INFINITY, {
"a": 4
}]);
m2.set('a set: ', set1);
const test = {
"hello": "ok",
"map": m
};
console.log(value_to_json(test));
Using the localStorage API to store an ES6 Map
bug
Storing the Map object directly produces "[object Map]" ❌
(() => {
  const map = new Map();
  map.set(1, {id: 1, name: 'eric'});
  // Map(1) {1 => {…}}
  // ❌
  localStorage.setItem('app', map);
  localStorage.getItem('app');
  // "[object Map]"
})();
solution
Use JSON.stringify to serialize the Map object before storing it, and then use JSON.parse to deserialize it before accessing the Map object ✅
(() => {
  const map = new Map();
  map.set(1, {id: 1, name: 'eric'});
  // Map(1) {1 => {…}}
  // ✅
  localStorage.setItem('app', JSON.stringify([...map]));
  const newMap = new Map(JSON.parse(localStorage.getItem('app')));
  // Map(1) {1 => {…}}
})();
refs
https://www.cnblogs.com/xgqfrms/p/14431425.html
It's important to remember that if you try to setItem on a huge map collection, it will throw Quota Exceeded Error. I tried persisting to local storage a map with 168590 entries and got this error. :(

Is there hash code function accepting any object type?

Basically, I'm trying to create an object of unique objects, a set. I had the brilliant idea of just using a JavaScript object with objects for the property names. Such as,
set[obj] = true;
This works, up to a point. It works great with string and numbers, but with other objects, they all seem to "hash" to the same value and access the same property. Is there some kind of way I can generate a unique hash value for an object? How do strings and numbers do it, can I override the same behavior?
If you want a hashCode() function like Java's in JavaScript, here it is:
function hashCode(string) {
  var hash = 0;
  for (var i = 0; i < string.length; i++) {
    var code = string.charCodeAt(i);
    hash = ((hash << 5) - hash) + code;
    hash = hash & hash; // Convert to 32bit integer
  }
  return hash;
}
That is the way of implementation in Java (bitwise operator).
Please note that hashCode can be positive or negative, and that's normal; see HashCode giving negative values. So you could consider using Math.abs() along with this function.
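To apply it to objects rather than strings, one hedged option (not part of the original answer) is to stringify first; this inherits JSON.stringify's caveat that property order matters:
const obj1 = { a: 1, b: 2 };
const obj2 = { a: 1, b: 2 };

console.log(hashCode(JSON.stringify(obj1)) === hashCode(JSON.stringify(obj2))); // true
console.log(Math.abs(hashCode("stackoverflow"))); // always non-negative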
JavaScript objects can only use strings as keys (anything else is converted to a string).
You could, alternatively, maintain an array which indexes the objects in question, and use its index string as a reference to the object. Something like this:
var ObjectReference = [];
ObjectReference.push(obj);
set['ObjectReference.' + ObjectReference.indexOf(obj)] = true;
Obviously it's a little verbose, but you could write a couple of methods that handle it and get and set all willy nilly.
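For illustration, one hedged sketch of what those helper methods might look like (the function names are made up):
var ObjectReference = [];

function keyFor(obj) {
  if (ObjectReference.indexOf(obj) === -1) {
    ObjectReference.push(obj); // register the object the first time it is seen
  }
  return 'ObjectReference.' + ObjectReference.indexOf(obj);
}

function setAdd(set, obj) {
  set[keyFor(obj)] = true;
}

function setHas(set, obj) {
  return set[keyFor(obj)] === true;
}

var set = {};
var item = { a: 1 };
setAdd(set, item);
console.log(setHas(set, item));     // true
console.log(setHas(set, { a: 1 })); // false - different reference, so a different key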
Edit:
Your guess is fact -- this is defined behaviour in JavaScript -- specifically a toString conversion occurs, meaning that you can define your own toString function on the object that will be used as the property name. - olliej
This brings up another interesting point; you can define a toString method on the objects you want to hash, and that can form their hash identifier.
The easiest way to do this is to give each of your objects its own unique toString method:
(function() {
  var id = 0;

  /*global MyObject */
  MyObject = function() {
    this.objectId = '<#MyObject:' + (id++) + '>';
    this.toString = function() {
      return this.objectId;
    };
  };
})();
I had the same problem and this solved it perfectly for me with minimal fuss, and was a lot easier than re-implementing some fatty Java style Hashtable and adding equals() and hashCode() to your object classes. Just make sure that you don't also stick a string '<#MyObject:12>' into your hash or it will wipe out the entry for your existing object with that id.
Now all my hashes are totally chill. I also just posted a blog entry a few days ago about this exact topic.
What you described is covered by Harmony WeakMaps, part of the ECMAScript 6 specification (next version of JavaScript). That is: a set where the keys can be anything (including undefined) and is non-enumerable.
This means it's impossible to get a reference to a value unless you have a direct reference to the key (any object!) that links to it. It's important for a bunch of engine implementation reasons relating to efficiency and garbage collection, but it's also super cool in that it allows for new semantics like revokable access permissions and passing data without exposing the data sender.
From MDN:
var wm1 = new WeakMap(),
wm2 = new WeakMap();
var o1 = {},
o2 = function(){},
o3 = window;
wm1.set(o1, 37);
wm1.set(o2, "azerty");
wm2.set(o1, o2); // A value can be anything, including an object or a function.
wm2.set(o3, undefined);
wm2.set(wm1, wm2); // Keys and values can be any objects. Even WeakMaps!
wm1.get(o2); // "azerty"
wm2.get(o2); // Undefined, because there is no value for o2 on wm2.
wm2.get(o3); // Undefined, because that is the set value.
wm1.has(o2); // True
wm2.has(o2); // False
wm2.has(o3); // True (even if the value itself is 'undefined').
wm1.has(o1); // True
wm1.delete(o1);
wm1.has(o1); // False
WeakMaps are available in current Firefox, Chrome and Edge. They're also supported in Node v7, and in v6 with the --harmony-weak-maps flag.
The solution I chose is similar to Daniel's, but rather than use an object factory and override the toString, I explicitly add the hash to the object when it is first requested through a getHashCode function. A little messy, but better for my needs :)
Function.prototype.getHashCode = (function(id) {
  return function() {
    if (!this.hashCode) {
      this.hashCode = '<hash|#' + (id++) + '>';
    }
    return this.hashCode;
  }
}(0));
For my specific situation I only care about the equality of the object as far as keys and primitive values go. The solution that worked for me was converting the object to its JSON representation and using that as the hash. There are limitations such as order of key definition potentially being inconsistent; but like I said it worked for me because these objects were all being generated in one place.
var hashtable = {};
var myObject = {a:0,b:1,c:2};
var hash = JSON.stringify(myObject);
// '{"a":0,"b":1,"c":2}'
hashtable[hash] = myObject;
// {
// '{"a":0,"b":1,"c":2}': myObject
// }
I put together a small JavaScript module a while ago to produce hashcodes for strings, objects, arrays, etc. (I just committed it to GitHub :) )
Usage:
Hashcode.value("stackoverflow")
// -2559914341
Hashcode.value({ 'site' : "stackoverflow" })
// -3579752159
In ECMAScript 6 there's now a Set that works how you'd like: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set
It's already available in the latest Chrome, FF, and IE11.
The JavaScript specification defines indexed property access as performing a toString conversion on the index name. For example,
myObject[myProperty] = ...;
is the same as
myObject[myProperty.toString()] = ...;
This is necessary as in JavaScript
myObject["someProperty"]
is the same as
myObject.someProperty
And yes, it makes me sad as well :-(
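A tiny demonstration of why this conversion makes plain objects collide as property names (this is exactly the problem the question describes):
var set = {};
var a = { id: 1 };
var b = { id: 2 };

set[a] = true; // stored under the key "[object Object]"
set[b] = true; // same key, overwrites the previous entry

console.log(Object.keys(set));  // ["[object Object]"]
console.log(set[a] === set[b]); // true - both hit the same property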
Based on the title, we can generate strong SHA hashes. In a browser context, this can be used to generate a unique hash from an object, an array of params, a string, or whatever.
async function H(m) {
  const msgUint8 = new TextEncoder().encode(m)
  const hashBuffer = await crypto.subtle.digest('SHA-256', msgUint8)
  const hashArray = Array.from(new Uint8Array(hashBuffer))
  const hashHex = hashArray.map(b => b.toString(16).padStart(2, '0')).join('')
  console.log(hashHex)
}
/* Examples ----------------------- */
H("An obscure ....")
H(JSON.stringify( {"hello" : "world"} ))
H(JSON.stringify( [54,51,54,47] ))
The above output in my browser, it should be equal for you too:
bf1cf3fe6975fe382ab392ec1dd42009380614be03d489f23601c11413cfca2b
93a23971a914e5eacbf0a8d25154cda309c3c1c72fbb9914d47c60f3cb681588
d2f209e194045604a3b15bdfd7502898a0e848e4603c5a818bd01da69c00ad19
Supported algos:
SHA-1 (but don't use this in cryptographic applications)
SHA-256
SHA-384
SHA-512
https://developer.mozilla.org/en-US/docs/Web/API/SubtleCrypto/digest#Converting_a_digest_to_a_hex_string
However, for a simple FAST checksum hash function, made only for collision avoidance, see CRC32 (Cyclic Redundancy Check)
JavaScript CRC32
You might also be interested by this similar method to generate HMAC codes via the web crypto api.
Reference: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Symbol
You can use an ES6 Symbol to create a unique key and access the object.
Every symbol value returned from Symbol() is unique. A symbol value may be used as an identifier for object properties; this is the data type's only purpose.
var obj = {};
obj[Symbol('a')] = 'a';
obj[Symbol.for('b')] = 'b';
obj['c'] = 'c';
obj.d = 'd';
Here's my simple solution that returns a unique integer.
function hashcode(obj) {
  var hc = 0;
  var chars = JSON.stringify(obj).replace(/\{|\"|\}|\:|,/g, '');
  var len = chars.length;
  for (var i = 0; i < len; i++) {
    // Bump 7 to larger prime number to increase uniqueness
    hc += (chars.charCodeAt(i) * 7);
  }
  return hc;
}
My solution introduces a static function for the global Object object.
(function() {
  var lastStorageId = 0;

  this.Object.hash = function(object) {
    var hash = object.__id;
    if (!hash)
      hash = object.__id = lastStorageId++;
    return '#' + hash;
  };
}());
I think this is more convenient with other object manipulating functions in JavaScript.
I will try to go a little deeper than other answers.
Even if JS had better hashing support it would not magically hash everything perfectly, in many cases you will have to define your own hash function. For example Java has good hashing support, but you still have to think and do some work.
One problem is with the term hash/hashcode ... there is cryptographic hashing and non-cryptographic hashing. The other problem, is you have to understand why hashing is useful and how it works.
When we talk about hashing in JavaScript or Java most of the time we are talking about non-cryptographic hashing, usually about hashing for hashmap/hashtable (unless we are working on authentication or passwords, which you could be doing server-side using NodeJS ...).
It depends on what data you have and what you want to achieve.
Your data has some natural "simple" uniqueness:
The hash of an integer is ... the integer, as it is unique, lucky you !
The hash of a string ... it depends on the string, if the string represents a unique identifier, you may consider it as a hash (so no hashing needed).
Anything which is indirectly pretty much a unique integer is the simplest case
This will respect: hashcode equal if objects are equal
Your data has some natural "composite" uniqueness:
For example with a person object, you may compute a hash using firstname, lastname, birthdate, ... see how Java does it: Good Hash Function for Strings, or use some other ID info that is cheap and unique enough for your use case (a sketch follows at the end of this answer)
You have no idea what your data will be:
Good luck ... you could serialize to string and hash it Java style, but that may be expensive if the string is large and it will not avoid collisions as well as say the hash of an integer (self).
There is no magically efficient hashing technique for unknown data, in some cases it is quite easy, in other cases you may have to think twice. So even if JavaScript/ECMAScript adds more support, there is no magic language solution for this problem.
In practice you need two things: enough uniqueness, enough speed
In addition to that it is great to have: "hashcode equal if objects are equal"
https://en.wikipedia.org/wiki/Hash_table#Collision_resolution
Relationship between hashCode and equals method in Java
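A hedged sketch of the "composite uniqueness" case mentioned above, loosely following Java's 31-multiplier convention for combining fields (the person shape with firstname/lastname/birthdate is assumed, not prescribed):
function stringHash(s) {
  let hash = 0;
  for (let i = 0; i < s.length; i++) {
    hash = ((hash << 5) - hash + s.charCodeAt(i)) | 0; // keep it a 32-bit integer
  }
  return hash;
}

function personHash(person) {
  let hash = 17;
  hash = (hash * 31 + stringHash(person.firstname)) | 0;
  hash = (hash * 31 + stringHash(person.lastname)) | 0;
  hash = (hash * 31 + stringHash(person.birthdate)) | 0;
  return hash;
}

console.log(personHash({ firstname: 'Ada', lastname: 'Lovelace', birthdate: '1815-12-10' }));
Equal field values always give the same hash ("hashcode equal if objects are equal"), while unequal objects may still occasionally collide, which is expected for a hashmap-style hash.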
I combined the answers from eyelidlessness and KimKha.
The following is an angularjs service and it supports numbers, strings, and objects.
exports.Hash = () => {
  let hashFunc;

  function stringHash(string, noType) {
    let hashString = string;
    if (!noType) {
      hashString = `string${string}`;
    }
    var hash = 0;
    for (var i = 0; i < hashString.length; i++) {
      var character = hashString.charCodeAt(i);
      hash = ((hash << 5) - hash) + character;
      hash = hash & hash; // Convert to 32bit integer
    }
    return hash;
  }

  function objectHash(obj, exclude) {
    if (exclude.indexOf(obj) > -1) {
      return undefined;
    }
    let hash = '';
    const keys = Object.keys(obj).sort();
    for (let index = 0; index < keys.length; index += 1) {
      const key = keys[index];
      const keyHash = hashFunc(key);
      const attrHash = hashFunc(obj[key], exclude);
      exclude.push(obj[key]);
      hash += stringHash(`object${keyHash}${attrHash}`, true);
    }
    return stringHash(hash, true);
  }

  function Hash(unkType, exclude) {
    let ex = exclude;
    if (ex === undefined) {
      ex = [];
    }
    if (!isNaN(unkType) && typeof unkType !== 'string') {
      return unkType;
    }
    switch (typeof unkType) {
      case 'object':
        return objectHash(unkType, ex);
      default:
        return stringHash(String(unkType));
    }
  }

  hashFunc = Hash;

  return Hash;
};
Example Usage:
Hash('hello world'), Hash('hello world') == Hash('hello world')
Hash({hello: 'hello world'}), Hash({hello: 'hello world'}) == Hash({hello: 'hello world'})
Hash({hello: 'hello world', goodbye: 'adios amigos'}), Hash({hello: 'hello world', goodbye: 'adios amigos'}) == Hash({goodbye: 'adios amigos', hello: 'hello world'})
Hash(['hello world']), Hash(['hello world']) == Hash(['hello world'])
Hash(1), Hash(1) == Hash(1)
Hash('1'), Hash('1') == Hash('1')
Output
432700947 true
-411117486 true
1725787021 true
-1585332251 true
1 true
-1881759168 true
Explanation
As you can see, the heart of the service is the hash function created by KimKha. I have added types to the strings so that the structure of the object would also impact the final hash value. The keys are hashed to prevent array/object collisions.
eyelidlessness's object comparison is used to prevent infinite recursion by self-referencing objects.
Usage
I created this service so that I could have an error service that is accessed with objects. So that one service can register an error with a given object and another can determine if any errors were found.
ie
JsonValidation.js
ErrorSvc({id: 1, json: '{attr: "not-valid"}'}, 'Invalid Json Syntax - key not double quoted');
UserOfData.js
ErrorSvc({id: 1, json: '{attr: "not-valid"}'});
This would return:
['Invalid Json Syntax - key not double quoted']
While
ErrorSvc({id: 1, json: '{"attr": "not-valid"}'});
This would return
[]
If you truly want set behavior (I'm going by Java knowledge), then you will be hard pressed to find a solution in JavaScript. Most developers will recommend a unique key to represent each object, but this is unlike set, in that you can get two identical objects each with a unique key. The Java API does the work of checking for duplicate values by comparing hash code values, not keys, and since there is no hash code value representation of objects in JavaScript, it becomes almost impossible to do the same. Even the Prototype JS library admits this shortcoming, when it says:
"Hash can be thought of as an
associative array, binding unique keys
to values (which are not necessarily
unique)..."
http://www.prototypejs.org/api/hash
In addition to eyelidlessness's answer, here is a function that returns a reproducible, unique ID for any object:
var uniqueIdList = [];

function getConstantUniqueIdFor(element) {
  // HACK, using a list results in O(n), but how do we hash e.g. a DOM node?
  if (uniqueIdList.indexOf(element) < 0) {
    uniqueIdList.push(element);
  }
  return uniqueIdList.indexOf(element);
}
As you can see it uses a list for look-up, which is very inefficient; however, that's the best I could find for now.
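For what it's worth, a different technique with the same observable behaviour but O(1) look-up would be to key the ids off a WeakMap (not part of the original answer; keys must be objects, as in the original, and unreferenced keys can be garbage-collected):
var idMap = new WeakMap();
var nextId = 0;

function getConstantUniqueIdFast(element) {
  if (!idMap.has(element)) {
    idMap.set(element, nextId++); // assign an id the first time the object is seen
  }
  return idMap.get(element);
}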
If you want to use objects as keys you need to overwrite their toString method, as some have already mentioned here. The hash functions that were used are all fine, but they only work for the same objects, not for equal objects.
I've written a small library that creates hashes from objects, which you can easily use for this purpose. The objects can even have a different order, the hashes will be the same. Internally you can use different types for your hash (djb2, md5, sha1, sha256, sha512, ripemd160).
Here is a small example from the documentation:
var hash = require('es-hash');

// Save data in an object with an object as a key
Object.prototype.toString = function () {
  return '[object Object #' + hash(this) + ']';
}

var foo = {};
foo[{bar: 'foo'}] = 'foo';

/*
 * Output:
 * foo
 * undefined
 */
console.log(foo[{bar: 'foo'}]);
console.log(foo[{}]);
The package can be used both in the browser and in Node.js.
Repository: https://bitbucket.org/tehrengruber/es-js-hash
If you want to have unique values in a lookup object you can do something like this:
Creating a lookup object
var lookup = {};
Setting up the hashcode function
function getHashCode(obj) {
  var hashCode = '';
  if (typeof obj !== 'object')
    return hashCode + obj;
  for (var prop in obj) // No hasOwnProperty needed
    hashCode += prop + getHashCode(obj[prop]); // Add key + value to the result string
  return hashCode;
}
Object
var key = getHashCode({ 1: 3, 3: 7 });
// key = '1337'
lookup[key] = true;
Array
var key = getHashCode([1, 3, 3, 7]);
// key = '01132337'
lookup[key] = true;
Other types
var key = getHashCode('StackOverflow');
// key = 'StackOverflow'
lookup[key] = true;
Final result
{ 1337: true, 01132337: true, StackOverflow: true }
Do note that getHashCode doesn't return any value when the object or array is empty
getHashCode([{},{},{}]);
// '012'
getHashCode([[],[],[]]);
// '012'
This is similar to ijmacd's solution, only getHashCode doesn't have the JSON dependency.
Just use a hidden property defined with defineProperty and enumerable: false.
It works very fast:
The first read uniqueId: 1,257,500 ops/s
All others: 309,226,485 ops/s
var nextObjectId = 1

function getNextObjectId() {
  return nextObjectId++
}

var UNIQUE_ID_PROPERTY_NAME = '458d576952bc489ab45e98ac7f296fd9'

function getObjectUniqueId(object) {
  if (object == null) {
    return null
  }

  var id = object[UNIQUE_ID_PROPERTY_NAME]
  if (id != null) {
    return id
  }

  if (Object.isFrozen(object)) {
    return null
  }

  var uniqueId = getNextObjectId()
  Object.defineProperty(object, UNIQUE_ID_PROPERTY_NAME, {
    enumerable: false,
    configurable: false,
    writable: false,
    value: uniqueId,
  })

  return uniqueId
}
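A quick usage sketch of the function above:
var a = {}
var b = {}

console.log(getObjectUniqueId(a)) // 1
console.log(getObjectUniqueId(b)) // 2
console.log(getObjectUniqueId(a)) // 1 - stable on repeated calls
console.log(Object.keys(a))       // [] - the id property is not enumerable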
