OpenLayers 3 - How to garbage collect Canvas ReplayGroup objects? - javascript

I've got an issue with ol.render.canvas.ReplayGroup objects not being released for garbage collection.
The layer in question is an ol.layer.Image, created from an ol.source.ImageVector, in turn created from an ol.source.Vector source.
The sequence of events that I'd like to result in some garbage collection is:
the Image's style is set to null with setStyle(null);
then the Image's source is set to null with setSource(null);
then the ol.layer.Image object is removed from the map with setMap(null).
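A minimal sketch of that sequence (hypothetical names; the style lives on the ol.source.ImageVector, as the resolution further down also shows):
// hypothetical names; teardown order as described above
var imageVectorSource = imageLayer.getSource(); // the ol.source.ImageVector
imageVectorSource.setStyle(null);               // 1. clear the style
imageLayer.setSource(null);                     // 2. detach the source
imageLayer.setMap(null);                        // 3. remove the layer from the map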
This does result in the layer being removed from the map (I think; it disappears), but when I profile the web page with Chrome's heap allocation profiler, the canvas.ReplayGroup object is still there, never to be used again.
Is this something anyone else has run into? I've tried using map.addLayer() instead of layer.setMap(); same results.
== edit ==
I forgot to write that the ol.layer.Image had been added to an ol.layer.Group. More specifically, that last part above was map.addLayer(group), and also group.getLayers().forEach(function (l) { l.setMap(map); }); no difference, it seems.

https://github.com/openlayers/ol3/blob/master/src/ol/source/imagevectorsource.js
In the ol.source.ImageVector that's the source for the ol.layer.Image, I found a reference to the replay group classes called replayGroup_. Setting that 'private' property to null after setting the layer source to null results in garbage collection... yay!
var imageVector = layer.getSource(); // the ol.source.ImageVector
layer.setSource(null);               // detach the source from the layer
imageVector.setStyle(null);          // clear the style
imageVector.replayGroup_ = null;     // drop the 'private' replay group reference
imageVector = null;                  // release our own reference
This works for me for now, though relying on a 'private', underscore-suffixed property like replayGroup_ may well break in a future OpenLayers release.

Related

Do JavaScript Maps have a set amount of keys that they can set, or do they have a set amount of key operations that can be done?

So I know that JavaScript Maps have a set amount of keys that they can store (around 16.7M, i.e. 2^24 = 16,777,216).
I was trying to test whether I can (in a very ugly way) remove the oldest elements from the Map. I noticed that no matter what I did, it was actually not the Map size that was the limiting factor, but rather the number of key operations I had performed.
Below is an example code:
const map = new Map();
let i = 0;
while (true) {
  i++;
  set(i, i);
  if (i % 1000 === 0)
    console.log('INSERTED: ', i, 'KEYS', 'MAP SIZE :', map.size);
}

function set(key, value) {
  if (map.size > 16770000) {
    Array.from(map.keys()).slice(0, 10000).forEach(key => map.delete(key));
    console.log('DELETED, current map size:', map.size);
  }
  try {
    map.set(key, value);
  } catch (e) {
    console.log('MAP SIZE:', map.size, 'INSERTED:', key);
    throw e;
  }
}
When you run the snippet, check your console. At the end (when the exception is thrown) you will get the Map size and the INSERTED count. The Map size will vary (depending on how many elements you remove, which in this case is 10,000), but INSERTED will always be the same value. So if I am not reaching the limit of the Map, how am I somehow reaching a limit? Is this some kind of reference issue that I am missing?
EDIT: As mentioned by @CRice, if you increase the number of items deleted to around 10,000,000, then the cycle continues on seemingly forever.
EDIT 2: Here is an answer from one of the V8 devs talking about the limit of 16.7M keys: https://stackoverflow.com/a/54466812/5507414
EDIT 3: See answer: https://stackoverflow.com/a/63234302/5507414. We still need a V8 developer, or someone with deeper knowledge of the engine, to clarify this.
I adapted your script (see below) to see how many items had to be deleted before keys could be inserted into the Map again.
The result is 8388608 (= 16777216/2) with node v12.18.1 (built on Chrome's V8 JavaScript engine).
It reminded me of a usual pattern where the underlying data structure doubles in size when it's almost full.
So I looked for the actual Map implementation in the V8 engine.
Here's what V8 development blog says about it:
ECMAScript 2015 introduced several new data structures such as Map, Set, WeakSet, and WeakMap, all of which use hash tables under the hood.
And here's an interesting comment in V8 source code:
HashTable is a subclass of FixedArray that implements a hash table
that uses open addressing and quadratic probing.
In order for the quadratic probing to work, elements that have not
yet been used and elements that have been deleted are
distinguished. Probing continues when deleted elements are
encountered and stops when unused elements are encountered.
- Elements with key == undefined have not been used yet.
- Elements with key == the_hole have been deleted.
Basically, when the script deletes a key, it seems that it's just marked as deleted. It becomes a "hole", as the V8 code comment puts it.
It's actually removed only when the engine rebuilds the underlying data structure (which is what happens when the script deletes half of the elements).
Anyway, that's my understanding. We would need to delve into V8 code in order to clarify all the details.
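To make that concrete, here is a toy sketch of an open-addressing hash table with lazy deletion. It is an illustration only: it uses linear rather than quadratic probing, assumes non-negative integer keys, assumes the table never completely fills up, and looks nothing like V8's actual code. Like V8's table, it never reuses deleted slots for new keys, so deletions consume capacity until a rebuild.
const EMPTY = Symbol('empty'); // never used: probing stops here
const HOLE = Symbol('hole');   // deleted: probing continues past it

class ToyHashTable {
  constructor(capacity) {
    this.keys = new Array(capacity).fill(EMPTY);
    this.values = new Array(capacity);
  }
  _find(key, forInsert) {
    let i = key % this.keys.length;
    while (true) {
      const k = this.keys[i];
      if (k === key) return i;                    // found the key itself
      if (k === EMPTY) return forInsert ? i : -1; // unused slot: stop probing
      i = (i + 1) % this.keys.length;             // HOLE or other key: keep probing
    }
  }
  set(key, value) {
    const i = this._find(key, true);
    this.keys[i] = key;
    this.values[i] = value;
  }
  get(key) {
    const i = this._find(key, false);
    return i === -1 ? undefined : this.values[i];
  }
  delete(key) {
    const i = this._find(key, false);
    if (i === -1) return false;
    this.keys[i] = HOLE; // mark as deleted; the slot is NOT reclaimed
    this.values[i] = undefined;
    return true;
  }
  rebuild() {
    // copying the live entries into a fresh table is the only way holes go away
    const fresh = new ToyHashTable(this.keys.length);
    for (let i = 0; i < this.keys.length; i++) {
      const k = this.keys[i];
      if (k !== EMPTY && k !== HOLE) fresh.set(k, this.values[i]);
    }
    return fresh;
  }
}
Holes only vanish when the whole table is rebuilt, which matches the observation above: deleting half of the entries forces a rebuild, and only then can the script insert again.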
Other interesting references:
Maximum number of entries in Node.js Map and FixedArray limit
Open addressing and Lazy deletion
const map = new Map();
let i = 0;
while (true) {
  i++;
  try {
    map.set(i, i);
  } catch (e) {
    console.log(e);
    break;
  }
  if (i % 100000 === 0)
    console.log('inserted: ', i);
}
console.log('max map size:', map.size, 'inserted:', i);

let j = 0;
while (true) {
  j++;
  map.delete(j);
  if (j % 100000 === 0) {
    console.log('deleted: ', j, 'map size: ', map.size);
    if (map.size == 0) {
      break;
    }
  }
  try {
    map.set(i, i);
  } catch (e) {
    continue;
  }
  break;
}
console.log('deleted before inserting again: ', j);
I dug into the ECMA language spec to take a look at Maps (Link). It seems that the behavior you are seeing is consistent with spec, and comes out of the spec'd definition for Map's delete prototype.
When a Map element is deleted with Map.prototype.delete(key), the spec only requires that the element with the matching key be set to empty.
Here's the definition copied and pasted from the ECMA spec:
23.1.3.3 Map.prototype.delete ( key )
The following steps are taken:
1. Let M be the this value.
2. Perform ? RequireInternalSlot(M, [[MapData]]).
3. Let entries be the List that is M.[[MapData]].
4. For each Record { [[Key]], [[Value]] } p that is an element of entries, do
   a. If p.[[Key]] is not empty and SameValueZero(p.[[Key]], key) is true, then
      i. Set p.[[Key]] to empty.
      ii. Set p.[[Value]] to empty.
      iii. Return true.
5. Return false.
The most important piece to us here is 4a.
When deleting an element, Map.prototype.delete checks each record p for an element where p.[[Key]] matches the provided key argument.
When found, p.[[Key]] and p.[[Value]] are both set to empty.
This means that, while the key and value are gone and no longer stored or retrievable, the element itself, where the key and value were stored, may well be left in the Map's storage and still take up space behind the scenes.
While the specification contains the following note about its use of "empty"...
The value empty is used as a specification device to indicate that an entry has been deleted. Actual implementations may take other actions such as physically removing the entry from internal data structures.
...it's still leaving the door open for implementations to simply wipe the data without reclaiming the space, which is apparently what is occurring in your example here.
What about Map.prototype.set(key, value)?
In the case of set(), the function checks first for an existing element with a matching key to mutate the value of, and skips over all empty elements in the process. If none is found, then "Append p [<key, value>] as the last element of entries".
What, then, about Map.prototype.size?
In the case of size, the spec loops over all elements in the Map, and simply increments a counter for all non-empty elements it encounters.
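As a rough illustration, the spec's abstract model can be sketched directly in JavaScript (a specification device only; real engines use hash tables, as discussed in the other answer):
const EMPTY = Symbol('empty');
const entries = []; // M.[[MapData]]: a List of { key, value } Records

function sameValueZero(a, b) {
  return a === b || (Number.isNaN(a) && Number.isNaN(b));
}

function specDelete(key) {
  for (const p of entries) {
    if (p.key !== EMPTY && sameValueZero(p.key, key)) {
      p.key = EMPTY;   // step 4.a.i
      p.value = EMPTY; // step 4.a.ii: the Record itself stays in the List
      return true;
    }
  }
  return false;
}

function specSet(key, value) {
  for (const p of entries) {
    if (p.key !== EMPTY && sameValueZero(p.key, key)) {
      p.value = value; // mutate an existing entry
      return;
    }
  }
  entries.push({ key, value }); // append as the last element of entries
}

function specSize() {
  let count = 0;
  for (const p of entries) {
    if (p.key !== EMPTY) count++; // deleted Records are skipped, not removed
  }
  return count;
}
Note how specDelete never shrinks entries: the storage used by deleted Records lingers even though specSize no longer counts them.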
I found this really interesting. If I had to hazard a guess, I suppose the overhead of finding and removing empty elements is seen as unnecessary in most cases, since the quantities needed to fill the structure are so large, i.e. since Maps hold so much. I wonder how large the time and space overhead of removing an empty element would be for a dataset big enough to need it.
Please check this; I made some changes to the code and now it's working. Please let me know if it still isn't.
I accept this is not the best way to do it, but reinitializing the Map
object (map = new Map(map)) copies the live entries into a fresh table,
discarding the deleted holes, so we can add some more data. It does slow
the operation down, though.
Please open the console to see the output.
var map = new Map();
var keys = []; // insertion order, so the oldest keys can be deleted first
let i = 0;
while (true) {
  i++;
  set(i, i);
  if (i % 1000 === 0)
    console.log('INSERTED: ', i, 'KEYS', 'MAP SIZE :', map.size);
}

function set(key, value) {
  if (map.size >= 16730000) {
    // delete the 10000 oldest keys...
    keys.splice(0, 10000).forEach(k => map.delete(k));
    console.log('DELETED, current map size:', map.size);
    // ...then rebuild the Map so the deleted slots are actually reclaimed
    map = new Map(map);
  }
  try {
    keys.push(key);
    map.set(key, value);
  } catch (e) {
    console.log('MAP SIZE:', map.size, 'INSERTED:', key);
    throw e;
  }
}

JavaScript JSON Encode of associative array

I know there are several issues with the JavaScript Array.push() method posted here, but I can't get the following example to work:
First of all, I've got a global Array:
var results = new Array();
for (var i = 0; i < X.length; i++) {
  results[i] = new Array();
}
Now this array should be filled on a button event. All I want is to fill the Array with new Arrays and push some data to them. The following code should check whether results[x][y] already is an Array (and create one if it's not) and push data to it.
So, in the end, there should be an Array (results) that contains X.length Arrays, filled with an unknown number of new Arrays, each of them containing an unknown amount of data:
function pushResult(result) {
  if (typeof results[currentStim][currentDist] == 'undefined') {
    results[currentStim][currentDist] = new Array();
  }
  results[currentStim][currentDist].push(result);
}
Problem is: it just doesn't work. I made sure that currentStim is never out of bounds, I made sure that the if statement is only entered when needed (so the Array isn't overwritten with a new one), and I watched the return value of push(), which always gave back a number representing the new array length. As expected, this number increased every time a value was pushed to an Array.
However, when I finally call:
document.getElementById('results').value = JSON.stringify(results);
to pass results to my PHP script, something like this will be passed:
[[],[[1]],[],[]]
push() was called MUCH more often than once (at least "1" is one of the results I wanted to be stored) and, as described, always returned an increasing array length. How does that work? What happened to my data?
I tested this on Chrome as well as on Firefox, with the same result. It might be interesting that a separate loop draws to a Canvas at the same time, but that shouldn't interrupt array handling and onKey events, right?
Hope you can help me,
MA3o
EDIT:
pushResult is called like this:
// Handles Space-Events
function pushed(event) {
  if (event.which == 32 && stimAlive) {
    pushResult(1);
    hitCurrent = true;
  }
}
Where hitCurrent and stimAlive are just flags set somewhere else. A bit further on, the function pushed is registered as an event listener:
document.onkeydown = function(event) { pushed(event)}
All the functions are called correctly. Adding console.log(results) to every loop just shows the right Array, as far as I can see.
According to the comments, the problem might be that "currentDist" can be a float value.
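If so, the data was never pushed into an array element at all; a small sketch (with illustrative values) of what a fractional index does:
// A non-integer index doesn't create an array element; it creates a
// plain property, which JSON.stringify ignores when serializing an array.
var row = [];
row[1.5] = [];            // e.g. currentDist === 1.5
row[1.5].push(1);         // push() happily returns increasing lengths...
console.log(row.length);           // 0
console.log(JSON.stringify(row));  // "[]" -- ...but the data never shows up

// Rounding (or validating) the index avoids the problem:
row[Math.round(1.5)] = [1];
console.log(JSON.stringify(row));  // "[null,null,[1]]"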

KineticJS - destroying nodes: infinite while loop because a child doesn't get removed

KineticJS version: 4.4.3
I have a problem when I want to remove a layer from the stage. When I call Layer.destroy(), Kinetic runs upon a node which can't be removed. I don't get any error, but the while loop becomes infinite, since the loop is based on the length of the children.
destroy: function (_nameSpace) {
  var parent = this.getParent(),
    stage = this.getStage(),
    dd = Kinetic.DD,
    go = Kinetic.Global;
  var tempLength;
  // destroy children
  while (this.children && this.children.length > 0) {
    this.children[0].destroy();
  }
}
The object which, in my case, cannot be removed is a Kinetic.Image. When I trace the node type it returns a Shape (which is correct). Also, I can trace all the stuff I want to know from the object...
Rectangles do get removed.
I created a low-level test in a fiddle, and there everything is working fine and dandy, so it has to be something with my code. Then again, the node which is not being removed IS a valid object, so why is it not being removed?
I created an error message, so I wouldn't crash my browser every time:
tempLength = this.children.length;
this.children[0].destroy();
if (tempLength == this.children.length) {
  throw 'item not removed ' + this.children[0].getNodeType();
}
As you can see, I check right after I destroy an item whether the length of the children changed, so I can assume that no other code is interfering, for example adding a node while destroying one.
I'm at a dead end here. I hope somebody can help me out or point me in any direction. Anyway, thanks for reading :)
OK, I figured out what the issue was. Somehow I added the same Kinetic.Image twice to a layer. When destroying the layer, it could not remove its child.
I assume this is a Kinetic bug. I should not be able to break the framework so (relatively) easily.
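A minimal sketch of the situation (hypothetical names; untested against 4.4.3):
// the same node accidentally added twice to one layer
var layer = new Kinetic.Layer();
var img = new Kinetic.Image({ image: someImageElement }); // someImageElement: a loaded HTMLImageElement
layer.add(img);
layer.add(img);  // duplicate add -- this is what triggered the hang
stage.add(layer); // stage: an existing Kinetic.Stage
layer.destroy();  // the destroy loop can no longer empty this.children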
https://github.com/ericdrowell/KineticJS/issues/434

Removing items from data bound array

How do I remove items from a data-bound array? My code follows.
for (var i = 0; i < listBox.selectedIndices.length; i++) {
  var toRemove = listFiles.selectedIndices[i];
  dataArray.splice(toRemove, 1);
}
Thanks in advance!
Edit: Here is my SWF. The Add Photos feature works, except when you remove items.
http://www.3rdshooter.com/Content/Flash/PhotoUploader.html
Add 3 different photos.
Remove the 2nd photo.
Add a different photo.
The SWF adds the 2nd photo to the end.
Any ideas on why it would be doing this?
Edit 2: Here is my code:
private function OnSelectFileRefList(e:Event):void
{
    Alert.show('addstart:' + arrayQueue.length);
    for each (var f:FileReference in fileRefList.fileList)
    {
        var lid:ListItemData = new ListItemData();
        lid.fileRef = f;
        arrayQueue[arrayQueue.length] = lid;
    }
    Alert.show('addcomplete:' + arrayQueue.length);
    listFiles.executeBindings();
    Alert.show(ListItemData(arrayQueue[arrayQueue.length - 1]).fileRef.name);
    PushStatus('Added ' + fileRefList.fileList.length.toString() + ' photo(s) to queue!');
    fileRefList.fileList.length = 0;
    buttonUpload.enabled = (arrayQueue.length > 0);
}

private function OnButtonRemoveClicked(e:Event):void
{
    for (var i:Number = 0; i < listFiles.selectedIndices.length; i++) {
        var toRemove:Number = listFiles.selectedIndices[i];
        //Alert.show(toRemove.toString());
        arrayQueue.splice(toRemove, 1);
    }
    listFiles.executeBindings();
    Alert.show('removecomplete:' + arrayQueue.length);
    PushStatus('Removed photos from queue.');
    buttonRemove.enabled = (listFiles.selectedItems.length > 0);
    buttonUpload.enabled = (arrayQueue.length > 0);
}
It would definitely be helpful to know two things:
Which version of ActionScript are you targeting?
Judging from the behavior of your application, the error isn't occurring when the user removes an item from the list of files to upload. Looks more like an issue with your logic when a user adds a new item to the list. Any chance you could post that code as well?
UPDATE:
Instead of: arrayQueue[arrayQueue.length] = lid
Try: arrayQueue.push(lid)
That will add a new item to the end of the array and push the item into that spot.
UPDATE 2:
OK, did a little more digging. It turns out that the fileList doesn't get cleared every time the dialog is opened (if you're not creating a new instance of the FileReferenceList each time the user selects new files). You need to call splice() on the fileList after you add each file to your Array.
Try something like this in your AddFile() method...
for (var j:int = 0; j < fileRefList.fileList.length; j++)
{
    arrayQueue.push(fileRefList.fileList[j]);
    fileRefList.fileList.splice(j, 1);
    j--; // splice() shifts the remaining items down, so revisit this index
}
That will keep the fileList up to date rather than holding on to previous selections.
I see one issue. The selected indices are no longer valid once you have spliced out the first element from the array. But that should only be a problem when removing multiple items at once.
I think we need to see more code about how you are handling the upload before we can figure out what is going on. It looks to me like you are holding a reference to the removed FileReference or something. The described problem is occurring when you upload a new file, not when you remove the selected one.
Do you mean to use listBox and listFiles to refer to the same thing?
I'm stepping out on a limb here, because I don't have a ton of experience with JavaScript, but I'd do this the same way that I'd do it in C, C++, or Java: By copying the remaining array elements down into their new locations.
Assuming that listFiles.selectedIndices is sorted (and its contents are valid indices for dataArray), the code would be something like the following:
(WARNING: untested code follows.)
// Don't bother copying any elements below the first selected element.
var writeIndex = listFiles.selectedIndices[0];
var readIndex = listFiles.selectedIndices[0] + 1;
var selectionIndex = 1;
while (writeIndex < (dataArray.length - listFiles.selectedIndices.length)) {
  if (selectionIndex < listFiles.selectedIndices.length) {
    // If the read pointer is currently at a selected element,
    // then bump it up until it's past the selected range.
    while (selectionIndex < listFiles.selectedIndices.length &&
           readIndex == listFiles.selectedIndices[selectionIndex]) {
      selectionIndex++;
      readIndex++;
    }
  }
  dataArray[writeIndex++] = dataArray[readIndex++];
}
// Remove the tail of the dataArray
if (writeIndex < dataArray.length) {
  dataArray.splice(writeIndex, dataArray.length - writeIndex);
}
EDIT 2009/04/04: Your Remove algorithm still suffers from the flaw that as you remove items in listFiles.selectedIndices, you break the correspondence between the indices in arrayQueue and those in listFiles.selectedIndices.
To see this, try adding 3 files, then doing "Select All" and then hit Remove. It will start by removing the 1st file in the list (index 0). Now what had been the 2nd and 3rd files in the list are at indices 0 and 1. The next value taken from listFiles.selectedIndices is 1 -- but now, what had been the 3rd file is at index 1. So the former File #3 gets spliced out of the array, leaving the former 2nd file un-removed and at index 0. (Using more files, you'll see that this implementation only removes every other file in the array.)
This is why my JavaScript code (above) uses a readIndex and a writeIndex to copy the entries in the array, skipping the readIndex over the indices that are to be deleted. This algorithm avoids the problem of losing correspondence between the array indices. (It does need to be coded carefully to guard against various edge conditions.) I tried some JavaScript code similar to what I wrote above; it worked for me.
I suspect that the problem in your original test case (removing the 2nd file, then adding another) is analogous. Since you've only shown part of your code, I can't tell whether the array indices and the data in listFiles.selectedIndices, arrayQueue, and fileRefList.fileList are always going to match up appropriately. (But I suspect that the problem is that they don't.)
BTW, even if you fix the problem with using splice() by adjusting the array index values appropriately, it's still an O(N^2) algorithm in the general case. The array copy algorithm is O(N).
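As an aside, a simpler fix than index bookkeeping is to splice in descending index order, so each removal only shifts elements above the indices still to be processed; a minimal JavaScript sketch:
// Copy and sort the selected indices from highest to lowest, then splice.
var indices = listFiles.selectedIndices.slice().sort(function (a, b) {
  return b - a;
});
for (var i = 0; i < indices.length; i++) {
  dataArray.splice(indices[i], 1);
}
(Still O(N^2) in the general case, as noted above, but the indices stay valid.)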
I'd really need to see the whole class to provide a definitive answer, but I would write a method to handle removing multiple objects from the dataProvider, and perhaps assign a new array as the dataProvider for the list instead of toying with binding and using the same list for the duration. Like I said, this is probably inefficient and would require a look at the context of the question, but that is what I would do (unless you have a big need for binding in this circumstance).
/**
 * Returns a new Array with the selected objects removed
 */
private function removeSelected(selectedItems:Array):Array
{
    var returnArray:Array = [];
    for each (var object:Object in this.arrayQueue)
    {
        if (selectedItems.indexOf(object) == -1)
            returnArray.push(object);
    }
    return returnArray;
}
You might be interested in this blog entry about the fact that robust iterators are missing in the Java language.
The programming language (you mentioned JavaScript) is not the issue; it's the concept of robust iterators that I wanted to point out (the paper is actually about C++ as the programming language).
The research document about providing robust iterators for the ET++ C++ framework may still be helpful in solving your problem. I am sure the document can provide you with the necessary ideas for how to approach it.

Why does GetClusterShape return null when the cluster specification was retrieved through the GetClusteredShapes method?

Suppose I have a Virtual Earth shape layer called shapeLayer1 (my creative energy is apparently at an all-time low).
When I call the GetClusteredShapes method, I get an array of VEClusterSpecification objects that represent each and every one of my currently visible clusters; no problem there. But when I call the GetClusterShape() method, it returns null... null! Why on earth would it do that? I used Firebug to confirm that the private variable of the VEClusterSpecification that's supposed to hold a reference to the shape is indeed null, so it's not the method that's causing the problem.
Some have suggested that this is actually documented behavior
Returns null if a VEClusterSpecification object was returned from the VEShapeLayer.GetClusteredShapes Method
But looking at the current MSDN documentation for the VEShape class it says:
Returns if a VEClusterSpecification object was returned from the VEShapeLayer.GetClusteredShapes Method
Is this a bug or a feature? Is there any known workarounds or (if it is a bug) some plan on when they are going to fix it?
I know it sucks... I'm still looking at the code, but from what I can tell, they want you to set the custom stuff using the VEClusteringOptions callback method. This doesn't work for me because I'm using a custom infobox, but it may help someone else; using the method below you have complete access to the shapes inside the cluster.
function clusteringCallback(clusters)
{
  for (var i = 0; i < clusters.length; ++i)
  {
    var cluster = clusters[i];
    var clusterShape = cluster.GetClusterShape();
    clusterShape.SetCustomIcon("<div class='cluster'><div class='text'>" + cluster.Shapes.length + "</div></div>");
    clusterShape.SetTitle("This is my Cluster #" + i);
    clusterShape.SetDescription("This cluster has " + cluster.Shapes.length + " shapes in it!");
  }
}

function SetClustering()
{
  var options = new VEClusteringOptions();
  options.Callback = clusteringCallback;
  shapeLayer.SetClusteringConfiguration(VEClusteringType.Grid, options);
}
If you need to get the layer ID from the layer which the cluster shape belongs to, you can do it like this:
var layerId = clusters[i].Shapes[0].GetShapeLayer().MsnId;
If you find another way, please inform us ;-)
