I have several meshes with about 5000 vertices each. These are the original vertices, which I'll call "verticesA". The problem is that the following line is slow to run:
shapeFigure[x] = new THREE.Shape( geometry[x].vertices ); // very slow
Then I turn "shapeFigure[x]" into a mesh:
shape[x] = new THREE.ShapeGeometry( shapeFigure[x][x] );
mesh[x] = new THREE.Mesh( shape[x], new THREE.MeshLambertMaterial({ color: "#FF0000" }) );
This is expected, since there are many meshes with many vertices. I have a button in my application that runs an algorithm. This algorithm ONLY generates new vertices (which I'll call "verticesB"; there are exactly as many of them as in "verticesA"). I want to update "verticesA" with "verticesB".
how I can update the "verticesA" that figure is in the form of "verticesB".
I do not want to call "new THREE.Shape(...)" again because it is slow (I have many meshes with many vertices). I want to update verticesA directly to verticesB, which is faster.
I'm doing something like this:
// mesh -> the original meshes (verticesA)
http://imgur.com/ympHorb (verticesA)
// geometry[x] -> an array with the new vertices (verticesB)
// geometry[x] and mesh[x] have the same number of vertices
for (var a in mesh[x].geometry.vertices) {
    mesh[x].geometry.vertices[a].x = geometry[x][a].x;
    mesh[x].geometry.vertices[a].y = geometry[x][a].y;
}
In my render() function, I have:
for (var t in mesh){
    mesh[t].geometry.verticesNeedUpdate = true;
    mesh[t].geometry.dynamic = true;
}
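For reference, here is a minimal sketch consolidating the two snippets above into one update step, assuming the legacy THREE.Geometry API and that geometry[x] holds the new verticesB (the helper name applyVerticesB and the array parameters are hypothetical):

// Hypothetical helper: copy the new positions into the existing geometries once,
// then flag each geometry so the renderer re-sends its vertex buffer.
function applyVerticesB( meshArray, newVertices ) {
    for ( var x = 0; x < meshArray.length; x++ ) {
        var verts = meshArray[ x ].geometry.vertices;
        for ( var a = 0; a < verts.length; a++ ) {
            verts[ a ].x = newVertices[ x ][ a ].x;
            verts[ a ].y = newVertices[ x ][ a ].y;
        }
        meshArray[ x ].geometry.verticesNeedUpdate = true; // flag once, right after changing
    }
}
// Note: geometry.dynamic = true only needs to be set once, when the mesh is created,
// not on every frame inside render().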
This is a map. After running the code above it does update, but it comes out distorted; the geometry is not being updated correctly.
http://imgur.com/qCWoMWe
The map should look like this:
http://imgur.com/MYXzaEd (this is verticesB)
How can I update the geometry correctly?
I made a line mesh with an elliptical shape, representing an orbit with eccentricity e and semi-major axis a. The mesh is a child of a group called orbitGroup that contains other objects.
I also added a GUI to change these parameters. Every time the GUI changes, it calls the following function:
function ElementsUpdate(){
    scene.remove(orbitGroup);
    orbitGroup.remove(Orbit);
    Orbit = undefined;
    Orbit = new THREE.Line( GetGeometryOrbit(GetOrbitLine(a,e,100)), materialOrbit );
    orbitGroup.add(Orbit);
    scene.add(orbitGroup);
}
The mesh (Orbit) is being created successfully; however, it does not update. I'm aware that the setGeometry method no longer works. Any solution? I am replacing the whole mesh because replacing only the geometry seemed more complicated.
Thanks beforehand for the help.
The project is in this link
You should be able to replace the vertex (position) buffer and call it a day.
function ElementsUpdate(){
    let points = GetOrbitLine(a,e,100).getPoints(); // THREE.Curve.getPoints
    Orbit.geometry.setFromPoints( points );         // replaces the position buffer
}
Curve.getPoints gives you an array of the points on your ellipse.
BufferGeometry.setFromPoints replaces the position buffer with one derived from your array of points.
Because it replaces the buffer (and the BufferAttribute), you should not need to mark anything as needing to be re-sent to the GPU.
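As a usage example, if the GUI is dat.GUI (an assumption; the question only mentions "a gui"), wiring the controls to ElementsUpdate could look roughly like this; `params`, `a` and `e` stand in for the question's orbital elements:

var gui = new dat.GUI(); // assumes dat.GUI is loaded
var params = { a: 1.0, e: 0.1 };
gui.add( params, 'a', 0.1, 10 ).onChange( function ( value ) { a = value; ElementsUpdate(); } );
gui.add( params, 'e', 0.0, 0.99 ).onChange( function ( value ) { e = value; ElementsUpdate(); } );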
I have a small web app that I've designed for viewing bathymetric data of the seafloor in Three.js. Basically I am using a loader to bring JSON models of my extruded bathymetry into my scene, allowing the user to rotate the model or click "next" to load a new part of the seafloor.
All of my models have the same 2D footprint so are identical in two dimensions, only elevations and texture change from model to model.
My question is this: What is the most cost effective way to update my model?
1. Using scene.remove(mesh);, then calling my loader again to load a new model and adding it to the scene with scene.add(mesh);.
2. Updating the existing mesh by calling my loader to bring in material and geometry, then setting mesh.geometry = geometry; and mesh.material = material;, and flagging the geometry as needing an update.
I've heard that updating is pretty intensive from a computational point of view, but all of the articles that I've read on this state that the two methods are almost the same. Is this information correct? Is there a better way to approach my code in this instance?
An alternative that I've considered is skipping the step where I create the model (in Blender) and instead using a displacement map to update the y coordinates of my vertices. Then to update I could push new vertices on an existing plane geometry before replacing the material. Would this be a sound approach? At the very least I think the displacement map would be a smaller file to load than a .JSON file. I could even optimize the display by loading a GUI element to divide the mesh into more or fewer divisions for high or low quality render...
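A rough sketch of that displacement-map idea, assuming the legacy THREE.Geometry plane API and that the heightmap has already been drawn into a canvas whose pixel grid matches the plane's vertex grid (applyHeightmap is a hypothetical helper name):

// Hypothetical helper: read grayscale heights from a canvas and push them onto
// the plane's vertices, one pixel per vertex.
function applyHeightmap( planeMesh, canvas, maxHeight ) {
    var ctx = canvas.getContext( '2d' );
    var pixels = ctx.getImageData( 0, 0, canvas.width, canvas.height ).data;
    var verts = planeMesh.geometry.vertices;
    for ( var i = 0; i < verts.length; i++ ) {
        var gray = pixels[ i * 4 ]; // red channel of pixel i (grayscale image assumed)
        verts[ i ].z = ( gray / 255 ) * maxHeight; // displace along the plane's local z axis
    }
    planeMesh.geometry.verticesNeedUpdate = true;
    planeMesh.geometry.computeVertexNormals();
    planeMesh.geometry.normalsNeedUpdate = true;
}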
I don't know off the top of my head exactly what happens under the hood, but from what I remember these two are effectively the same thing.
You aren't updating the existing mesh. A mesh extends from Object3D, so it just sits there, wiring some geometry and some materials.
mesh.geometry = geometry did not "update the mesh", or it did, but with new geometry (which may be the thing you are actually referring to as mesh).
In other words, you always have your container, but when you replace the geometry by doing =geometry you set it up for all sorts of GL calls in the next THREE.WebGLRenderer.render() call.
Where that new geometry gets attached, be it an existing mesh or a new one, shouldn't matter at all. The geometry is what triggers the low-level WebGL calls like gl.bufferData().
// upload two geometries to the GPU on the first render()
var meshA = new THREE.Mesh( new THREE.BoxGeometry(1,1,1) );
var meshB = new THREE.Mesh( new THREE.BoxGeometry(1,1,1) );

// upload one geometry to the GPU on the first render()
var bg = new THREE.BoxGeometry(1,1,1);
var meshA = new THREE.Mesh( bg );
var meshB = new THREE.Mesh( bg );
for ( var i = 0; i < someBigNumber; i++ ){
    var meshTemp = new THREE.Mesh( bg );
}
// doesn't matter that you have X meshes, you only have one geometry

// 1 mesh, two geometries / "computations"
var meshA = new THREE.Mesh( new THREE.BoxGeometry(1,1,1) ); // first computation - compute box geometry
scene.add( meshA );
renderer.render( scene, camera ); // upload box to the GPU
meshA.geometry = new THREE.SphereGeometry();
renderer.render( scene, camera ); // upload sphere to the GPU
THREE.Mesh seems to be the most confusing concept in three.js.
I have two meshes: mesh1 and mesh2. Both have the same number of vertices and both are extruded.
mesh1 = 5000 vertices.
mesh2 = 5000 vertices.
I assign the vertices of mesh1 to mesh2. Then I do:
mesh2.geometry.verticesNeedUpdate = true;
mesh2.geometry.vertices = mesh1.geometry.vertices;
This updates the vertices of mesh2, but it happens instantly. I cannot see an animation while mesh2's vertices change into mesh1's vertices.
I want to see the transformation as mesh2 turns into mesh1, that is, an animation of the vertices while they are changing.
I used "Tween.js" for animations such as position and color. I'm not sure if this can help to view animations when vertices begin to change.
I do:
new TWEEN.Tween( mesh2.geometry.vertices ).to( mesh1.geometry.vertices, 1000 ).start();
but it does not work. Sorry for my level of English.
As you've seen, this doesn't work, in part because on every frame that updates the mesh vertices you must also set geometry2.verticesNeedUpdate = true;.
--
More specifically, I think you will want to add .onUpdate(function() { geometry2.verticesNeedUpdate = true; }) to your tween.
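A minimal sketch of one way to do this, assuming the legacy THREE.Geometry API and matching vertex counts on both meshes (the helper name animateVertices is hypothetical):

// Hypothetical helper: tween a progress value from 0 to 1 and lerp each vertex
// of mesh2 from its starting position toward the corresponding vertex of mesh1.
function animateVertices( mesh1, mesh2, duration ) {
    var startPositions = mesh2.geometry.vertices.map( function ( v ) { return v.clone(); } );
    var targetPositions = mesh1.geometry.vertices;
    var state = { t: 0 };

    new TWEEN.Tween( state )
        .to( { t: 1 }, duration )
        .onUpdate( function () {
            for ( var i = 0; i < startPositions.length; i++ ) {
                mesh2.geometry.vertices[ i ].lerpVectors( startPositions[ i ], targetPositions[ i ], state.t );
            }
            mesh2.geometry.verticesNeedUpdate = true; // re-send the positions each frame
        } )
        .start();
}
// Remember to call TWEEN.update() inside the render loop.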
Adding new vertices to a three.js mesh goes by mesh.geometry.vertices.push(new THREE.Vector3(x, y, z)), but how do I remove them?
"geometry" is an array, so I thought, I could remove vertices with:
mesh.geometry.vertices.splice(vertexIndex, 1)
mesh.geometry.verticesNeedUpdate = true;
But when I do that, the whole thing breaks with internal three.js error messages saying "Uncaught TypeError: Cannot read property 'x' of undefined" inside three.min.js.
I searched their wiki and their GitHub issues and can't find an answer to this. The mesh is a simple BoxGeometry, not even a custom one.
In three.js each face is made of 3 vertices. Here is an example to make it clearer; this is how you create a geometry in r71:
geometry = new THREE.Geometry();
geometry.vertices.push( // a few vertices with random coordinates
    new THREE.Vector3(12,15,5), // index: 0 (the numbers are x,y,z coordinates)
    new THREE.Vector3(10,15,5), // index: 1
    new THREE.Vector3(12,10,2), // index: 2
    new THREE.Vector3(10,10,2)  // index: 3
);
geometry.faces.push(
    new THREE.Face3(0,1,2), // these numbers are indices of vertices in the previous array
    new THREE.Face3(0,3,2)
);
geometry.computeFaceNormals(); // we won't care about this here
(I did not pay attention to the values, so I do not know what shape this gives.)
What you can see is that two arrays are built: vertices and faces. At each frame, each face is drawn using the positions of its vertices.
You ask what goes wrong when deleting a vertex from the geometry.vertices array: let's imagine the second vertex above is deleted. The array now looks like this:
geometry.vertices = [
    new THREE.Vector3(12,15,5), // index: 0
    new THREE.Vector3(12,10,2), // new index: 1
    new THREE.Vector3(10,10,2)  // new index: 2
];
There is no longer a vertex at index 3. So when the next frame is drawn, any face that points to it (here the second face) will try to access its coordinates (x first, then y and z). That is why the console says it cannot read 'x' of undefined.
That was a long explanation of the error. You can also see that deleting the vertex shifted the array, so the faces no longer have the correct shape and their normals no longer correspond. Worse, the buffer size would have to change, and that is simply not allowed, as stated for example here:
Dynamically Adding Vertices to a Line in Three.js
Adding geometry to a three.js mesh after render
The solution is to use tricks, as those answers suggest: modify your vertex coordinates, hide faces, and so on; it depends on what you want to do.
If your scene does not have many vertices, you can also remove the previous mesh and create a new one with a new geometry, minus the deleted vertex and with a corrected face array.
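A minimal sketch of that rebuild, assuming the legacy THREE.Geometry API (the helper name removeVertex is hypothetical):

// Hypothetical helper: build a new Geometry without the vertex at `index`,
// dropping any face that used it and re-indexing the remaining faces.
function removeVertex( oldGeometry, index ) {
    var newGeometry = new THREE.Geometry();

    newGeometry.vertices = oldGeometry.vertices
        .filter( function ( v, i ) { return i !== index; } )
        .map( function ( v ) { return v.clone(); } );

    oldGeometry.faces.forEach( function ( f ) {
        if ( f.a === index || f.b === index || f.c === index ) return; // face used the deleted vertex
        var a = f.a > index ? f.a - 1 : f.a;
        var b = f.b > index ? f.b - 1 : f.b;
        var c = f.c > index ? f.c - 1 : f.c;
        newGeometry.faces.push( new THREE.Face3( a, b, c ) );
    } );

    newGeometry.computeFaceNormals();
    return newGeometry;
}

// Usage: replace the mesh's geometry (or the whole mesh) with the rebuilt one.
mesh.geometry = removeVertex( mesh.geometry, vertexIndex );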
Is it possible to merge vertices only at render time? I'm doing a series of morphs which requires the vertex list to stay the same; however, I want to merge the vertices to get a smooth reflection in a cube camera. Is anyone aware of a command similar to "unmerge vertices"?
Have you tried doing it? It should work.
You'll need to set
geometry.verticesNeedUpdate = true;
geometry.elementsNeedUpdate = true;
to tell three.js that the vertices and faces, respectively, have changed. There are other update flags you may need to set too (for instance if normals have changed). More details here: https://github.com/mrdoob/three.js/wiki/Updates
Note the comment on that page that the total number of vertices can't change. This may require you to do the merge on a temp geometry and then copy the vertices to your rendered geometry.
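One way to sketch that temp-geometry idea (an interpretation, not the answerer's exact recipe): merge a clone to compute smooth normals, then copy those normals back to the unmerged geometry so its vertex count never changes. This assumes the legacy THREE.Geometry API and that mergeVertices does not discard any faces, so face order stays aligned between the two geometries.

// Sketch: compute smooth normals on a merged clone, then transfer them back.
var temp = geometry.clone();
temp.mergeVertices();          // collapse duplicate vertices on the clone only
temp.computeFaceNormals();
temp.computeVertexNormals();   // smooth normals across the merged vertices

geometry.faces.forEach( function ( face, i ) {
    // assumes face i in `geometry` corresponds to face i in `temp`
    face.vertexNormals = temp.faces[ i ].vertexNormals.map( function ( n ) { return n.clone(); } );
} );
geometry.normalsNeedUpdate = true;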
Alright, this is not in the documentation section, but you need to use the explode modifier as demonstrated in this example: http://threejs.org/examples/#webgl_geometry_tessellation
var explodeModifier = new THREE.ExplodeModifier();
explodeModifier.modify( geometry );
geometry.computeFaceNormals();
geometry.computeVertexNormals();
//This will undo the geometry.mergeVertices();