I have a question that's been bothering me for some time.
I am using the three.js WebGL library to render a large scene with many textures and meshes.
This question is not necessarily bound to WebGL; it's more about JavaScript arrays and memory management.
I am basically doing this:
var modelArray = [];
var model = function(geometry,db_data){
    var tex = THREE.ImageUtils.loadTexture('texture.jpg');
    var mat = new THREE.MeshPhongMaterial({map:tex})
    this.mesh = new THREE.Mesh(geometry,mat);
    this.db = db_data;
    scene.add(this.mesh);
};
function loadModels(model_array){
    for(var i = 0; i < model_array.length; i++){
        modelArray.push(new model(model_array[i]['geometry'],model_array[i]['db_info']));
    }
}
loadModels();
Am I being inefficient here? Am I essentially doubling the amount of memory used, since I have the mesh both in the scene and in an array? Or does the model object in the array (specifically model.mesh) simply point to a single block of memory?
Should I just keep an array of mesh IDs and look the objects up in the scene, or is it fine to add the mesh to both the scene and an array?
Thanks in advance and I hope I was clear enough.
The main thing that jumps out at me is this:
var tex = THREE.ImageUtils.loadTexture('texture.jpg');
var mat = new THREE.MeshPhongMaterial({map:tex})
If you are loading the same texture every time you create a new model, that could create a lot of overhead (and it can also be pretty slow). I would load the texture(s) and corresponding material(s) you need outside of your loop once.
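For illustration, a minimal sketch of what that could look like, assuming every model really does use the same texture (sharedTex and sharedMat are names introduced here; the loader call mirrors the one in your snippet):
// load the shared texture and material once, up front
var sharedTex = THREE.ImageUtils.loadTexture('texture.jpg');
var sharedMat = new THREE.MeshPhongMaterial({map: sharedTex});

var model = function(geometry, db_data){
    // every instance reuses the same material and texture objects
    this.mesh = new THREE.Mesh(geometry, sharedMat);
    this.db = db_data;
    scene.add(this.mesh);
};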
Your modelArray is a list of plain model objects, each of which has a pointer to the corresponding mesh object (and db object). The scene has a pointer to the same mesh object so you are not exploding your memory use by cloning meshes.
It's possible that your memory use is just because your mesh geometries take up a lot of memory. Try loading your models one by one while watching memory usage; perhaps you have one that is unexpectedly detailed.
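If it helps to narrow that down, three.js keeps simple resource counters you can log after each model is added; a rough sketch, assuming renderer is your THREE.WebGLRenderer instance:
// log how many geometries/textures the renderer is currently holding
console.log(
    'geometries:', renderer.info.memory.geometries,
    'textures:', renderer.info.memory.textures
);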
I am creating a Geometry in three.js and populating it with vertices to build a 2D terrain. I am pushing all of the Vector3s and Face3s to the geometry as soon as my terrain is created, and then modifying each vertex and face every frame.
Because I am modifying the face vertices every frame, I need to tell three.js to update the faces. I am doing this with geometry.elementsNeedUpdate = true. This works, but I have noticed it causes very high memory usage (my app allocates an extra ~50 MB of RAM every second).
The following code demonstrates what I'm trying to do:
function pushEverything(geom) {
    for (var i = 0; i < 10000; i++) {
        geom.vertices.push(new THREE.Vector3(...));
        geom.faces.push(new THREE.Face3(...));
        geom.faces.push(new THREE.Face3(...));
    }
}

function rebuild(geom) {
    for (var face of geom.faces) {
        face.a = ...
        face.b = ...
        face.c = ...
    }
    geom.elementsNeedUpdate = true;
}
var renderer = new THREE.WebGLRenderer({
    canvas: document.getElementById("my-canvas")
});
var geom = new THREE.Geometry();
var camera = new THREE.PerspectiveCamera(...);
var scene = new THREE.Scene();

pushEverything(geom);
scene.add(new THREE.Mesh(geom, new THREE.MeshBasicMaterial())); // some material for the terrain mesh

while (true) {
    // Perform some terrain modifications
    rebuild(geom);
    renderer.render(scene, camera);
    sleep(1000 / 30); // pseudocode: ~30 fps
}
I have already followed the advice of this question, which suggested using geometry.vertices[x].copy(...) instead of geometry.vertices[x] = new Vector3(...).
My question is: why is my memory usage so high when using geometry.elementsNeedUpdate = true? Is there an alternative method to updating a Geometry's faces?
I am using three.js 0.87.1 from NPM.
I have found and solved the issue. It was not a memory leak in three.js, but a memory leak in my own code.
I was creating a Geometry, cloning it, performing modifications on the clone, and then merging it back into the original. What I didn't realise is that I should have been calling geometry.dispose() on the cloned geometry when I was done with it. So I was effectively keeping a fresh clone of the geometry alive every frame, which explains the huge memory usage.
I have fixed my issue by converting the Geometry to a BufferGeometry, and calling geometry.dispose() on the geometry when I am done with it. I now have expected memory usage.
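Roughly, the leaking pattern and its fix looked like this (a sketch with illustrative names; baseGeometry, workingGeom and modifyTerrain stand in for my own code):
// clone, modify, merge back, then release the clone
var workingGeom = baseGeometry.clone();
modifyTerrain(workingGeom);      // the per-frame edits
baseGeometry.merge(workingGeom);
workingGeom.dispose();           // without this, each frame's clone was never released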
I'm trying to clone and then scale a mesh, but the scaling does not seem to take effect on the cloned object immediately, which I need because I operate on it programmatically with CSG ThreeBSP. I think I should call some function after the scaling to force the matrix (or other internal state) to be recalculated immediately, instead of waiting for the next render update.
My code looks something like this:
var someMesh2 = someMesh1.clone();
someMesh2.scale.set(2,2,2);
someProgrammingOperation(someMesh2);
//It turns out that internally, someMesh2 still has the same properties (matrix?) as someMesh1 :(
What am I missing? Suggestions are also welcomed :)
object.matrix is updated for you by the renderer whenever you call renderer.render().
If you need to update the object matrix manually, call
object.updateMatrix();
and it will update the matrix from the current values of object.position, object.quaternion, and object.scale.
(Note that object.rotation and object.quaternion remain synchronized. When you update one, the other updates automatically.)
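Applied to the snippet from the question, that would look roughly like this (the ThreeBSP call is only an example of a consumer that reads the matrix immediately):
var someMesh2 = someMesh1.clone();
someMesh2.scale.set(2, 2, 2);
someMesh2.updateMatrix();                // bake position/quaternion/scale into someMesh2.matrix now
someProgrammingOperation(someMesh2);     // e.g. new ThreeBSP(someMesh2) sees the scaled matrix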
three.js r.84
In the end, my problem was that the CSG ThreeBSP operation needed to work on the Geometry of the object, not on the Mesh itself. I applied the scaling to the Geometry and it worked as expected.
One caveat: be careful with the mesh and geometry instances; some cloning is needed to keep the original objects as they were, as in the following example:
var clonedMesh = original.mesh.clone()
var clonedGeometry = clonedMesh.geometry.clone()
clonedMesh.geometry = clonedGeometry
clonedMesh.geometry.scale(2,2,2)
var someBsp = new ThreeBSP( clonedMesh )
var newMesh = someBsp.toMesh()
someScene.add( newMesh )
I am developing a THREE.JS WebGL application where I need to render multiple objects with the same geometry, and I've stumbled upon a bottleneck. It seems that my instancing of objects has some issue that I can't really pin down; maybe someone can help me with that.
For context, I have a PointCloud with normals that tells me where to position each instanced object, and the orientation of the object through a quaternion derived from the normal. I then loop through this array and place each instanced object accordingly. After looking at various posts about instancing, merging, etc., I can't figure out what I'm doing wrong.
I attach the code snippet of the method in question:
bitbucket.org/snippets/electricganesha/Mdddz
After reviewing it multiple times, I'm really wondering what is wrong here, and why this particular method slows my application down from 60 fps to 20 fps.
You might be overcompensating with the optimization.
In your loop where you merge all these geometries, try adding something like this:
var maxVerts = 1 << 16;

// if merging a new object would push the vertex count over 2^16, add the merged
// geometry to the scene and start a new one for the next batch
if( singleGeometry.vertices.length + newObject.geometry.vertices.length > maxVerts ){
    scene.add(new THREE.Mesh(singleGeometry, material)); // wrap the batch in a mesh, using whatever material your objects share
    singleGeometry = new THREE.Geometry();
}
singleGeometry.merge(newObject.geometry, newObject.matrix);
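Put together, the whole batching loop might look roughly like this; objects and material are assumptions standing in for your own instance list and shared material:
var maxVerts = 1 << 16;                      // 16-bit index limit per batch
var singleGeometry = new THREE.Geometry();

objects.forEach(function(object){
    // start a new batch before the merged geometry exceeds the vertex limit
    if( singleGeometry.vertices.length + object.geometry.vertices.length > maxVerts ){
        scene.add(new THREE.Mesh(singleGeometry, material));
        singleGeometry = new THREE.Geometry();
    }
    singleGeometry.merge(object.geometry, object.matrix);
});

// don't forget the last, partially filled batch
if( singleGeometry.vertices.length > 0 ){
    scene.add(new THREE.Mesh(singleGeometry, material));
}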
I need to export the scene as a single STL file.
Whereas it's easy to export each single <asset>/<mesh>/<model>, exporting the whole scene with its transformations is another story. That requires applying the world matrix transform to every vertex of each asset's data on the fly before export.
Does XML3D have any mechanism that would help me with that?
Where should I start?
Actually, XML3D is a presentation format and was never designed for extracting anything useful other than interactive renderings. However, since it is JavaScript, you can access everything somehow, and obviously you can also get the data you need to apply all transformations and create a single huge STL mesh from the scene.
The easiest way I can imagine is using the internal scene:
var scene = document.querySelector("xml3d")._configured.adapters["webgl_1"].getScene();

// Iterate render objects
scene.ready.forEach(function(renderObject) {
    // Get world matrix
    var worldMatrix = new Float32Array(16);
    renderObject.getWorldMatrix(worldMatrix);

    // Get local position data
    var dataRequest = new Xflow.ComputeRequest(renderObject.drawable.dataNode, ["position"]);
    var positions = dataRequest.getResult().getOutputData("position").getValue();

    console.log(worldMatrix, positions.length);

    // apply worldmatrix to all positions
    ...
});
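The elided part could be done by hand; a sketch assuming the usual column-major layout of the 16-float world matrix (transformed is a name introduced here):
// multiply every (x, y, z, 1) by the column-major 4x4 world matrix
var transformed = new Float32Array(positions.length);
for (var i = 0; i < positions.length; i += 3) {
    var x = positions[i], y = positions[i + 1], z = positions[i + 2];
    transformed[i]     = worldMatrix[0]*x + worldMatrix[4]*y + worldMatrix[8]*z  + worldMatrix[12];
    transformed[i + 1] = worldMatrix[1]*x + worldMatrix[5]*y + worldMatrix[9]*z  + worldMatrix[13];
    transformed[i + 2] = worldMatrix[2]*x + worldMatrix[6]*y + worldMatrix[10]*z + worldMatrix[14];
}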
My application loads a lot of meshes.
To get rid of old meshes I try to dispose of them, but the memory is never freed.
Am I missing something?
My simple example for reproducing:
1. load 100 big binary meshes
2. dispose of all of them again
Chrome's task manager says ~250 MB of memory is used; it's exactly the same as without step 2.
var scene = new THREE.Scene();
var mymesh=Array();

// 1. load a lot of geometry/meshes...
for(var i=0;i<100;i++)
{
    var bloader = new THREE.BinaryLoader();
    bloader.load( "objekte/presto_6.js", function( geometry )
    {
        mymesh.push(new THREE.Mesh( geometry, new THREE.MeshBasicMaterial( {color:0xffffff } ) ));
        scene.add(mymesh[mymesh.length-1]); // add the mesh itself, not its index
    });
}

// 2. try to dispose objects and free memory...
for(var j=0;j<mymesh.length;j++)
{
    mymesh[j].geometry.dispose();
    mymesh[j].material.dispose();
    screne.remove(mymesh[j]);
}
mymesh=Array();
Probably a typo, but if it isn't: screne.remove(mymesh[j]); should be scene.remove(mymesh[j]);
Other than that: remember (or find out) how JS manages memory. Its garbage collector is a mark-and-sweep GC: it flags objects that aren't referenced anywhere and then frees them the next time the GC kicks in:
for(var j=0;j<mymesh.length;j++)
{
    mymesh[j].geometry.dispose();
    mymesh[j].material.dispose();
    scene.remove(mymesh[j]);
}
The mymesh array still contains references to the mesh objects you are attempting to free. The GC sees these references and therefore refrains from flagging those objects. Reassign or delete either the entire array or the specific keys you no longer need:
for(var j=0;j<mymesh.length;j++)
{
    mymesh[j].geometry.dispose();
    mymesh[j].material.dispose(); //don't know if you even need these
    scene.remove(mymesh[j]);
    mymesh[j] = undefined; //or
    delete mymesh[j];
}
//or simply:
mymesh = undefined; //or some other value
That allows the memory to be freed, unless another variable remains in scope that references some or all of these objects, too.
As an aside:
mymesh=Array();
is bad code on many levels. JS functions that begin with an uppercase letter are constructors and should be called using the new keyword, though most constructors (especially the native objects) should be called as little as possible.
Their behaviour can be unpredictable, and there's often a shorter way to write the code:
mymesh = [];//an array literal
//examples of weird behaviour:
mymesh = new Array(10);   //[undefined, undefined, undefined, ...] (an array of length 10)
mymesh = [10];            //an array containing the single element 10
mymesh = new Array('10'); //['10']
mymesh = new Array(1.2);  //RangeError: invalid array length
var o = new Object(123);  //returns a new Number
o = new Object('something');//new String
o = new Object(false);//new Boolean
o = new Object('foo', 'bar');//new String('foo')!!!
o = {foo: 'bar'};//<-- this is soooo much easier