The Mozilla Dev Center says:
it is best not to add, modify, or remove properties from the object
during iteration, other than the property currently being visited;
there is no guarantee whether or not an added property will be visited,
However, I have no need to visit the added properties until later. So is it safe to add them?
E.g.
var animals = {"cats": 25, "dogs": 15};
for (var key in animals) {
  if (key.substring(0, 3) !== "big") { // no danger of referencing them
    var newAnimal = "big" + key;
    animals[newAnimal] = 0;
  }
}
Or will increasing the size of the object confuse the for...in iteration?
The docs don't say that the added properties won't be visited; they say it's undefined. So depending on the implementation, you may end up with a bigdogs, a bigbigdogs, etc., resulting in an endless loop. Or it may do something completely different; after all, it's undefined behavior.
To solve this, work with a copy of the object instead and add the new properties to the copy without mutating the looped object.
Edit: Looks like you are checking whether the key starts with "big"; I missed that when I first looked at it. So you should be fine.
It is still good practice to avoid undefined behavior like this. This can easily come back and bite you when the code has to be changed at some point in the future and the reasoning behind the loop/check is not absolutely clear.
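A minimal sketch of that copy-based approach, reusing the question's animals example: loop over a shallow copy, and add the new properties to the original, so the object actually being iterated never changes.

```javascript
// Sketch: iterate a shallow copy so the looped object is never the one
// being mutated. Object.assign gives a shallow copy, enough for flat data.
var animals = { cats: 25, dogs: 15 };
var snapshot = Object.assign({}, animals);

for (var key in snapshot) {
  animals["big" + key] = 0; // mutates animals, never snapshot
}

console.log(animals); // { cats: 25, dogs: 15, bigcats: 0, bigdogs: 0 }
```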
If you don't care that "there is no guarantee whether or not an added property will be visited", then you can do it.
If you want to make sure you won't visit them, make a snapshot of the properties before the loop:
var animals = {"cats":25, "dogs":15}
for(var key of Object.keys(animals))
animals["big" + key] = 0;
console.log(animals);
If you want to make sure you will visit them, use maps:
var animals = new Map([ ["cats",25], ["dogs",15]]);
for(var key of animals.keys())
if(key.slice(0,6) !== 'bigbig')
animals.set("big" + key, 0);
console.log([...animals].map(a => a.join(': ')));
Yes, it is safe to add them, since you have a check to make sure you won't end up in an infinite loop.
An added property may land in an arbitrary position, so the order is not something you can count on; but since you don't need to visit the newly created properties in the same loop, this should be fine.
If you just want to add new properties, I would suggest building a new object that replaces the current object once you are done, because that way it will perform better.
It also won't drive the person reviewing your code nuts :). Have a good day.
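The build-then-replace suggestion above can be sketched like this: accumulate the additions in a fresh object, then merge once the loop is done, so the looped object is never mutated mid-iteration.

```javascript
// Sketch: collect new properties separately, then merge after the loop.
var animals = { cats: 25, dogs: 15 };
var additions = {};

for (var key in animals) {
  additions["big" + key] = 0; // never touches the object being iterated
}

animals = Object.assign({}, animals, additions);
console.log(animals); // { cats: 25, dogs: 15, bigcats: 0, bigdogs: 0 }
```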
It's safe.
After the iteration has started:
If you delete an item before visiting it, you will not see it.
If you update an item, you will see the new value.
If you insert an item, you may or may not see it in the current cycle, but that is also safe.
source
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for...in
Related
I have what seems like it should be a simple operation. For each bridgedSection, I check for a potentialSection with an id that matches the bridged.referenceSection
Then I take that result, parse the HTML on the object with Cheerio, make a slight modification (using an id for testing), store both the bridgedSection and the modified result on an object, and push that object to the array.
If I log the new object BEFORE pushing, I get the correct object values. If I log it from the array I get incorrect values only for reference.section. bridgedSection is fine, but reference.section matches across all entries in the array.
To say that I'm thoroughly flummoxed is an understatement. Can anyone shed some light on what I am (clearly) doing wrong?
var sectionCount = 0;
bridgedSections.forEach(bridged => {
  var obj = potentialSections.find(obj => obj._id == bridged.referenceSection);
  $ = cheerio.load(obj.html);
  $(".meditor").html(bridged._id); // dropping the id here so it's easy to see if it was updated
  obj.html = $.html();
  obj.rand = Math.floor(Math.random() * 1000); // can't seem to add to obj either
  var thisSection = {
    referenceSection: obj,
    bridgedSection: bridged,
  };
  console.log(thisSection); // correct value logged
  currentSections.push(thisSection);
  sectionCount++;
});
console.log(currentSections);
// this logs an array of the correct length, but each
// {}.referenceSection is identical to the last entry pushed above
To try to clarify what both of the above folks are saying, the JavaScript language (like many others) has the concept of references, and makes very heavy use of that concept.
When one variable "refers to" another, there is only one copy of the value in question: everything else is a reference to that one value. Changes made through any of those references will therefore change the one underlying value (and be reflected instantaneously in all of the references).
The advantage of references is, of course, that they are extremely "lightweight."
If you need to make a so-called "deep copy" of an array or structure or what-have-you, you can do so. If you want to push the value and be sure that it cannot be changed, you need to make sure that what you've pushed is either such a "deep copy," or that there are no references (as there obviously are, now ...) to whatever it contains. Your choice.
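A short sketch of the difference, assuming structuredClone is available (Node 17+ or a modern browser); JSON.parse(JSON.stringify(...)) is the older fallback for JSON-safe data:

```javascript
// Sketch: a second variable shares the underlying object; a deep copy
// is fully independent of it.
var original = { section: { title: "intro" } };

var reference = original;             // same underlying object
var copy = structuredClone(original); // independent deep copy

reference.section.title = "changed";
console.log(original.section.title); // "changed" - seen through the reference
console.log(copy.section.title);     // "intro"  - the deep copy is unaffected
```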
N.B. References, especially circular references, also have important implications for memory management (and "leaks"), because a thing will not be "reaped" by the memory manager while it can still be reached through some reference. (Modern JavaScript engines use tracing garbage collection rather than simple reference counting, which is why circular references do not, by themselves, cause leaks.)
And, all of what I've just said pretty much applies equally to every language that supports this – as most languages now do.
JavaScript passes objects to functions by reference (strictly speaking, the reference itself is passed by value: reassigning the parameter does not affect the caller, but mutating the object it points to does). This means the following happens:
var derp = { a: 1 };
function passedByRef(param) {
  param['a'] = 2;
}
passedByRef(derp);
console.log(derp['a']); // 2
So when you pass an object to a function and modify that object inside the function, you change the original object. You probably want to make a deep copy of bridged before you assign it to thisSection, because if you later modify the version of bridged held in thisSection, you will modify the original object.
Here is a post that talks about cloning objects, or you could look into something like Immutable.js.
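A minimal sketch of the deep-copy approach applied to the question's pattern, assuming the data is JSON-safe (no functions, Dates, or circular references); the sample array contents here are just an illustration:

```javascript
// Sketch: find() returns a reference into the array; deep-copy it
// before mutating so the original element stays untouched.
var potentialSections = [{ _id: 1, html: "<div/>" }];

var found = potentialSections.find(function (s) { return s._id === 1; });
var obj = JSON.parse(JSON.stringify(found)); // independent copy

obj.html = "<div>modified</div>";
console.log(found.html); // "<div/>" - the original is untouched
```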
I think you need to look into Javascript deep copy.
You are modifying the original object when you modify the second assigned variable, because they are pointing to the same object. What you really need is to duplicate the object, not simply making a pointer to it.
Take a look at this:
https://scotch.io/bar-talk/copying-objects-in-javascript#toc-deep-copying-objects
Typically, you cannot safely delete items from an list while you're looping through that list. Does this concept remain true for ES6 Maps?
I tried this simple test without exceptions:
var map = new Map([['a',1],['b',2],['c',3]]);
map.forEach((value,key,map)=>
{
map.delete(key);
let str = `["${key}",${value}] was deleted. `;
str += `map.size = ${map.size}`;
console.log(str);
});
It seems ok.
Update: I just read this reference from Mozilla. It is certainly doable. I'd be interested in any performance benchmarks comparing this method of deletion with other methods (on larger datasets).
Well, I guess you're right. I am not that familiar with ES6 Maps, but I did a bit of research and found this blog post helpful, where it explains Maps:
https://hackernoon.com/what-you-should-know-about-es6-maps-dc66af6b9a1e
It also explains the deletion mechanism, which looks something like this:
var m = new Map()
m.set('a', 1)
m.set('b', 2)
m.delete('a'); // true
m.delete('c'); // false (key was not there to delete)
Hope this helps.
Why not? If you are working with the same instance, you can indeed delete: Map provides a delete method for exactly that purpose.
From an optimization standpoint, however, avoid deleting items from a Map or an array. JavaScript engines have optimizations based on the shape of an object; if the shape stays the same across all references to that object, the code can be better optimized. Instead, create a new object from the filtered values of the current one.
var map = new Map([['a',1],['b',2],['c',3]]);
map.forEach((value,key,map)=>
{
map.delete(key);
});
console.log(map);
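The filtered-copy alternative suggested above can be sketched like this (which key to drop is, of course, just an illustration):

```javascript
// Sketch: instead of deleting in place, build a new Map from the
// entries you want to keep - here, everything except key 'b'.
var map = new Map([['a', 1], ['b', 2], ['c', 3]]);

var filtered = new Map(
  [...map].filter(function (entry) { return entry[0] !== 'b'; })
);

console.log([...filtered.keys()]); // ['a', 'c']
```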
There are some languages (C#, for example) in which you can't remove items from an IEnumerable inside a foreach loop, because under the hood it works differently: it gives you only read and update access, not delete.
I just found out that javascript has a delete statement. I've read a bit about it and am not much the wiser.
So I am hoping to get a functional definition of when I should use it, if at all. So I know I can delete properties of an object; as is made obvious by this fiddle:
var myData = {a:"hello",b:"world"};
alert(myData.b);
delete myData.b;
alert(myData.b);
Which shows "world" then undefined in successive alerts. However, you cannot use delete like this (as one might in C++):
function data() {
this.attribute1 = "aww";
this.attribute2 = "poo";
}
var myData = new data();
delete myData;
Here delete returns false indicating that you cannot delete myData. I used to work primarily in C++ and this was like the whole idea of delete. I can't think of any reason I would use delete to remove properties. Should I ever worry about using delete to mark memory to be freed? Like if I do something like this.
var myData = new data();
... //do stuff
myData = new data();
Addition
So I dug up the post that confused me. The most upvoted answer on this question states (as quoted from the Apple Javascript Coding Guidelines):
Use delete statements. Whenever you create an object using a new statement, pair it with a delete statement. This ensures that all of the memory associated with the object, including its property name, is available for garbage collection. The delete statement is discussed more in “Freeing Objects.”
So, if I understand some of the comments and answers I've been given, this statement is not accurate, because you cannot even call delete on an object created using a new statement.
According to mozilla's developer documents, delete does not work that way.
The delete operator deletes a property from an object, it does not delete the object itself.
So instead of using it as you have demonstrated, you would use it more like the following:
myGlobalObject = {};
var myObject = {};
myObject.propertyA = "blah";
// Do some stuff
delete myObject.propertyA; // This works because you're deleting a property off myObject
delete myGlobalObject; // This works because myGlobalObject is a property of the global object.
delete myObject; // This does NOT work - most likely because you declared it using the var keyword
This doesn't actually do garbage collection though. Also if myObject has a prototype up the chain that has propertyA it would still inherit that property through the prototype.
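That prototype behavior can be seen directly (the property names here are illustrative):

```javascript
// Sketch: delete removes only the own property; a property of the same
// name further up the prototype chain then shows through.
var proto = { propertyA: "from prototype" };
var obj = Object.create(proto);
obj.propertyA = "own value";

console.log(obj.propertyA); // "own value"
delete obj.propertyA;
console.log(obj.propertyA); // "from prototype" - inherited, not gone
```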
For more in-depth information, feel free to check out the developer documents:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/delete
The documentation for delete itself specifically states:
The delete operator removes a property from an object.
You might remove a property if you don't want it included in data sent to a server, or used by other code, e.g., something that automatically takes object data and turns it into a table.
In general you'd never use it for memory management, a possible exception being if you had a huge chunk of data in an object (like received from the back end) that you explicitly don't need.
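For example, stripping a field before serializing data for a server (the record shape and the field name internalId are just an illustration):

```javascript
// Sketch: remove a property you don't want included in serialized output.
var record = { name: "widget", price: 10, internalId: "x-123" };

delete record.internalId;
var payload = JSON.stringify(record);

console.log(payload); // {"name":"widget","price":10}
```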
I have been trying to clone the following in such a way as to avoid all references to the original data:
initially, a d3.js selection, using the clone_d3_selection() method, which correctly duplicates DOM elements but maintains references to the selection data (the d parameter in function calls);
then the array at its heart, extracted using d3's selection.data() function. Cloning seems to fail partly because the target structure is a mix of object and array, but mainly because what are claimed to be clones generally maintain references to the original data, meaning changes to one are reflected in the other. A further (minor) issue has been that null elements were (generally) being copied.
Note: JSON.parse(JSON.stringify(object)) is of no use in either case, as it applies to objects, whereas d3 uses / coerces / outputs arrays).
Applied to an array, an object clone/copy function of the type shown below works fine in all respects EXCEPT that it too replicates references. Despite the preserved references, it has been offered (and accepted) as a resolution to many a javascript-tagged object-cloning question.
function transfer(obj) {
  var result = [];
  for (var property in obj) {
    if (obj.hasOwnProperty(property)) {
      result[property.toString()] = obj[property];
    }
  }
  return result;
}
I, however, really need complete independence from the current/original. Seems no matter what I do, references are copied.
How do I know? At regular intervals, the current selection/array is a) cloned, then b) designated previous (with no further direct changes). Beyond this point, any changes made to the clone are instantly reflected in the original, and remain through its redesignation into previous. The whole point of the clone was to avoid this.
sole
modifications!
:
v
--------> current ------> clone
^ :
: v
: previous
: :
merge.....:
Is there a better way, or might the code above be modified so that it provides a new, completely independent array, but bearing the same data values? Might this even be done directly to the original selection in a d3-centric way?
Incidentally, the array being cloned is simple enough, having been created along these lines:
var arr = [];
arr["key1"] = "value1";
arr["key2"] = 2;
: : :
... followed by the accustomed d3 append() call chain.
Incidentally, every attempt at a simulation outside my rather large codebase has become mired in data formatting issues. It's astonishing what a minefield this is.
Glad of any suggestions.
Thanks
Thug
To deep copy an array as retrieved from a d3.js selection using selection.data():
http://www.overset.com/2007/07/11/javascript-recursive-object-copy-deep-object-copy-pass-by-value/
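In the spirit of that link, here is a minimal recursive deep copy that handles both arrays and plain objects, including arrays carrying string keys as in the question (the sample data is illustrative):

```javascript
// Sketch: recursively copy arrays and plain objects so no references
// to the original survive. Functions, Dates, etc. are not handled.
function deepCopy(value) {
  if (value === null || typeof value !== "object") return value;
  var result = Array.isArray(value) ? [] : {};
  for (var key in value) {
    if (Object.prototype.hasOwnProperty.call(value, key)) {
      result[key] = deepCopy(value[key]);
    }
  }
  return result;
}

var arr = [];
arr["key1"] = "value1";
arr["key2"] = { nested: 2 };

var clone = deepCopy(arr);
clone["key2"].nested = 99;
console.log(arr["key2"].nested); // 2 - the original is untouched
```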
This link (were it more easily findable) turns out to be provided in other answers, making this question something of a duplicate.
The problem will be encountered more frequently as d3.js's limits are pushed, though, so rather than delete it, here it stays...
I've got an array, and it's got a method I threw onto it called add, which I use as a wrapper around push. I've found myself using push a few times when I should have used add, and now I'm thinking it would be nice to assign a reference to my add method to the array's native push. Thus, calling push on the array would call add.
Do internals depend on externally available native methods like push? How would this affect compatibility? Is this a bad idea? If so, why?
Some code:
PR.state = {
  views: []
};

_.extend(PR.state.views, {
  add: function(view) {
    var parent = view.parent;
    if ((!this.length && view instanceof PR.Views.App) || (parent && _.contains(this, parent))) {
      this.push(view);
    }
  }
});

// I am thinking:
PR.state.views.push = PR.state.views.add;
I would strongly advise against changing the behavior of a standard array method. If you really want a custom method, then just create a new method, give it its own unique name, and use that.
Changing the behavior of existing methods could have all sorts of bad consequences:
Incompatibility with code retrieved from any other source.
Creates a non-standard and unexpected implementation if anybody else ever works on this project. This is like adding in a time bomb to trip up some future developer.
Training yourself to use your own custom method instead of .push() is just something that a decent developer would do. Just do it.
Creating a newly named method with an appropriate and descriptive name improves the readability, understandability and maintainability of your code. Replacing an existing method with something that works differently does the opposite.
It's not so bad if you just replace the method on one instance of an array, not the whole array prototype, but it's still not a good idea.
What a stupid question. If I replace push with add, then what happens when I call push from add? :< :< I haven't tested it, but I suspect that while Array.prototype.push will still be available, unless I use Array.prototype.push explicitly, calling add will result in an endless loop.
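That suspicion looks right: if add calls this.push and push has been aliased to add on the instance, the call recurses forever. A minimal sketch of the safe variant, delegating through Array.prototype.push explicitly (the views array and its string contents are just an illustration):

```javascript
// Sketch: add delegates to Array.prototype.push directly, so aliasing
// the instance's push to add cannot recurse into itself.
var views = [];
views.add = function (view) {
  Array.prototype.push.call(this, view); // bypasses the aliased instance method
};
views.push = views.add;

views.push("app");
views.push("child");
console.log(views.length); // 2
```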