So - I have a situation where I'm filling the screen with calculated polygons. The polygons are constantly changing shape - i.e. the number of vertices changes each frame. If I create a new geometry each frame, my machine effectively halts and memory usage climbs rapidly - it seems I have to update the buffers of an existing geometry instead.
So I've tried using BufferGeometry, basically doing:
var position = asBuffer.attributes.position; // asBuffer is the BufferGeometry
position.array[0] = changedValue;
position.needsUpdate = true;
in my render loop - but it doesn't seem to work at all. If I use a normal Geometry it does change dynamically - provided I set needsUpdate - but only if I change the values in the original vectors. If I change the arrays themselves, nothing shows up.
I've got an example of all of this here: http://jsbin.com/fanebah/edit?js,console,output - if you swap the lines that create the "cube" - it goes from working to not working.
I'd prefer to use BufferGeometry - it's faster and closer to the way I'm producing the data. What am I doing wrong? Or does three.js just not support dynamic BufferGeometry?
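For reference, the usual pattern for this kind of per-frame geometry is to preallocate a buffer big enough for the worst case, rewrite it in place, and only draw the valid range. A minimal sketch of that idea (MAX_POINTS, updatePositions and material are my own placeholder names, and I'm using the r84-era addAttribute / setDynamic API):

var MAX_POINTS = 10000; // assumed upper bound on vertices per frame

var geometry = new THREE.BufferGeometry();
var positions = new Float32Array(MAX_POINTS * 3); // allocated once, reused every frame
geometry.addAttribute('position', new THREE.BufferAttribute(positions, 3).setDynamic(true));
scene.add(new THREE.Mesh(geometry, material));

function updatePositions(points) { // points: this frame's array of THREE.Vector3
    var position = geometry.attributes.position;
    for (var i = 0; i < points.length; i++) {
        position.array[3 * i]     = points[i].x;
        position.array[3 * i + 1] = points[i].y;
        position.array[3 * i + 2] = points[i].z;
    }
    geometry.setDrawRange(0, points.length); // hide the unused tail instead of resizing
    position.needsUpdate = true;             // re-upload the buffer on the next render
}

The key point is that the Float32Array is never reallocated; setDrawRange hides whatever part of it isn't used in a given frame.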
Related
I'm exporting a simple scene from Blender to three.js. Aside from the texture not showing up (which I'm also fighting with), I have a weird problem with the positions of objects. Here's how it looks in Blender:
and this is how it renders in three.js:
As you can see, elements are stacked on top of each other (and the skybox texture is missing, even though it's referenced properly in the JSON, embedded as a base64 image). I'm using the Three.js exporter v1.5.0, three.js r84 and Blender 2.77.
This is my exporter configuration:
Here's the code loading the scene:
var loader = new THREE.ObjectLoader();
loader.load(
    '../dist/landscape.json',
    function ( obj ) {
        scene.add(obj);
    }
);
Now, I do realise that this way I'm adding a scene to a scene, but for some reason, if I try to extract its children like this:
loader.load(
    '../dist/landscape.json',
    function ( obj ) {
        obj.children.forEach(function(elem) {
            scene.add(elem);
        });
    }
);
I only get half of the objects, and I have no idea why. Besides, the objects are still stacked on top of each other. I checked the positions in the result against the original values in Blender: aside from the standard y/z swap, the x values are reversed (though that's not the cause of the problem), and the rotation is removed from the bridge, which causes it to render upside down. I'm completely lost.
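(For what it's worth, the "half of the objects" symptom looks like the classic gotcha where scene.add() reparents each child and so removes it from obj.children while forEach is still walking that same array, skipping every other element. Iterating over a copy avoids that; a sketch, though it doesn't address the placement issue:)

loader.load(
    '../dist/landscape.json',
    function ( obj ) {
        // copy the array first: scene.add() removes each child from obj.children
        obj.children.slice().forEach(function(elem) {
            scene.add(elem);
        });
    }
);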
Also, here are the .blend and .json files:
http://www.filehosting.org/file/details/653174/landscape.blend
http://www.filehosting.org/file/details/653175/landscape.json
EDIT:
Partial solution: the scale was set to 10 in the exporter, which made the objects look as if they were misplaced. The thing is, they are still rotated and there's still some mismatch compared to the original. Picture here:
I've just come across this issue myself once again. Having the scale setting at 1 didn't fix it. The issue was that I hadn't applied object transformations in Blender:
Select all problematic objects in your Blender file (or just everything with A)
Press CTRL+A
Select Rotation & Scale
Repeat for Location if necessary
I'm trying to clone and then scale a mesh for CSG operations with ThreeBSP, but the scaling does not seem to take effect on the cloned object immediately. I think I need to call something after scaling to force the matrix (or other internal state) to recalculate right away, rather than waiting for the full update loop on the render side.
My code looks something like this:
var someMesh2 = someMesh1.clone();
someMesh2.scale.set(2,2,2);
someProgrammingOperation(someMesh2);
//It turns out that internally, someMesh2 still has the same properties (matrix?) as someMesh1 :(
What am I missing? Suggestions are also welcomed :)
object.matrix is updated for you by the renderer whenever you call renderer.render().
If you need to update the object matrix manually, call
object.updateMatrix();
and it will update the matrix from the current values of object.position, object.quaternion, and object.scale.
(Note that object.rotation and object.quaternion remain synchronized. When you update one, the other updates automatically.)
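Applied to the snippet in the question, that would look something like this (assuming someProgrammingOperation reads the object's matrix rather than its position/scale properties):

var someMesh2 = someMesh1.clone();
someMesh2.scale.set(2, 2, 2);
someMesh2.updateMatrix(); // bake position/quaternion/scale into .matrix right now
someProgrammingOperation(someMesh2);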
three.js r.84
In the end, my problem was that the CSG ThreeBSP object needed to work from the Geometry of the object, not from the Mesh itself. I applied the scaling to the Geometry (which bakes it into the vertices) and it worked as expected.
There is a caveat, though: mesh and geometry instances are shared, so some cloning is needed to keep the original objects as they were, as in the following example:
var clonedMesh = original.mesh.clone();
var clonedGeometry = clonedMesh.geometry.clone(); // clone() shares the geometry, so clone that too
clonedMesh.geometry = clonedGeometry;
clonedMesh.geometry.scale(2, 2, 2);

var someBsp = new ThreeBSP( clonedMesh );
var newMesh = someBsp.toMesh();
someScene.add( newMesh );
I am trying to combine WebGL earth with d3.geo.satellite projection.
I have managed to overlay the two projections on top of each other and sync rotation, but I am having trouble syncing zooming. When I sync them to match size, the WebGL projection gets deformed, but the d3.geo.satellite one remains the same. I have tried different combinations of projection.scale and projection.distance without much success.
Here is a JSFiddle (it takes a little while to load the resources). You can drag to rotate (works well), but if you zoom in (use the mouse wheel) you can see the problem.
https://jsfiddle.net/nxtwrld/7x7dLj4n/2/
The important code is at the bottom of the script - the scale function.
function scale(){
    var scale = d3.event.scale;
    var ratio = scale / scale0; // scale0 is the initial zoom scale

    // scale the d3 projection
    projection.scale(scale);

    // scale the Three.js earth to match
    earth.scale.x = earth.scale.y = earth.scale.z = ratio;
}
I don't use WebGL Earth either, and your JSFiddle isn't working anymore, but my assumption is that your underlying problem is integrating D3.js with Three.js as a solution for a 3D globe.
May I suggest you try earthjs as your solution. Under the hood it uses D3.js v4 and Three.js revision 8x (both the latest at the time of writing), and it can combine SVG, canvas and Three.js (WebGL).
const g = earthjs({padding:60})
.register(earthjs.plugins.mousePlugin())
.register(earthjs.plugins.threejsPlugin())
.register(earthjs.plugins.autorotatePlugin())
.register(earthjs.plugins.dropShadowSvg(),'dropshadow')
.register(earthjs.plugins.worldSvg('../d/world-110m.json'))
.register(earthjs.plugins.globeThreejs('../globe/world.jpg'))
g._.options.showLakes = false;
g.ready(function(){
g.create();
})
You can run the snippet above from here.
I am developing a Three.js WebGL application where I need to render multiple objects with the same geometry, and I've stumbled upon a bottleneck. There seems to be some issue with my instancing of objects that I can't really pin down; maybe someone can help me with that. For context, I have a PointCloud with normals that tells me where to position each instanced object, and its orientation via a quaternion derived from the normal. I then loop through this array and place each instanced object accordingly. After looking at various posts about instancing, merging, etc., I can't figure out what I'm doing wrong.
I attach the code snippet of the method in question:
bitbucket.org/snippets/electricganesha/Mdddz
After reviewing it multiple times, I'm really wondering what is wrong here, and why this particular method slows my application down from 60 fps to 20 fps.
You might be overcompensating with the optimization.
In your loop where you merge all these geometries, try adding something like this:
var maxVerts = 1 << 16;

// if merging a new object would push the vertex count over 2^16,
// flush the merged geometry into a mesh and start a new one for the next batch
if( singleGeometry.vertices.length + newObject.geometry.vertices.length > maxVerts ){
    scene.add( new THREE.Mesh( singleGeometry, sharedMaterial ) ); // sharedMaterial: whatever material these objects share
    singleGeometry = new THREE.Geometry();
}

singleGeometry.merge(newObject.geometry, newObject.matrix);
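For completeness, a rough sketch of how the surrounding loop might look with that batching in place (objects and sharedMaterial are placeholder names, and note the final flush for the last, partially filled batch):

var maxVerts = 1 << 16;
var singleGeometry = new THREE.Geometry();

objects.forEach(function (obj) {
    // flush the current batch before it crosses the 2^16 vertex limit
    if (singleGeometry.vertices.length + obj.geometry.vertices.length > maxVerts) {
        scene.add(new THREE.Mesh(singleGeometry, sharedMaterial));
        singleGeometry = new THREE.Geometry();
    }
    obj.updateMatrix(); // make sure obj.matrix reflects its position/rotation/scale
    singleGeometry.merge(obj.geometry, obj.matrix);
});

// flush whatever is left over
if (singleGeometry.vertices.length > 0) {
    scene.add(new THREE.Mesh(singleGeometry, sharedMaterial));
}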
I have a model that I've animated with Mixamo and then exported as an FBX into Maya. I've then used the Three.js exporter to output the animation 'baked' as morph targets.
Here's how the model looks when loaded into Maya:
However, when I read the data in, it includes not just the animation, but also the base model in a static pose, and each morphTarget array has the vertices repeated in it. This is what it ends up looking like:
Beyond manually writing some code to de-duplicate the vertices, is there any way to just get the animation out and not the model as well? I'm very new to Maya, so I'm guessing there's an option that I need to untick, or some selection step that I'm missing.
Thanks in advance
Should someone else have this problem, there's a simple answer (at least in this instance): truncate the vertex and face arrays to half their length. After checking the vertices for duplicates, it turned out they were all in the second half of these arrays and could simply be dropped.
geometry.vertices.length = geometry.vertices.length / 2
geometry.faces.length = geometry.faces.length / 2
geometry.morphTargets.forEach(function(target) {
target.vertices.length = target.vertices.length / 2
})
There's almost certainly a better way of doing it, however.
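One small improvement, sketched here on the assumption that the second half really is an exact copy of the first, is to verify that before truncating:

var half = geometry.vertices.length / 2;

// only truncate if every vertex in the second half duplicates its counterpart in the first
var allDuplicated = geometry.vertices.slice(half).every(function (v, i) {
    return v.equals(geometry.vertices[i]);
});

if (allDuplicated) {
    geometry.vertices.length = half;
    geometry.faces.length = geometry.faces.length / 2;
    geometry.morphTargets.forEach(function (target) {
        target.vertices.length = half;
    });
}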