THREE.js. Change mesh position after applying EdgesHelper - javascript

I tried to rotate or change the position of a mesh after applying EdgesHelper, but it doesn't work; the mesh stays in the same position. (Without EdgesHelper it works fine.) What am I doing wrong?
var mesh = new THREE.Mesh( geometry, material );
var edges = new THREE.EdgesHelper( mesh, 0xcf0000 );
edges.position.z = 100;
edges.position.x = 100;
scene.add( edges );

Looking into the source of THREE.EdgesHelper, it seems that matrixAutoUpdate is set to false. This prevents the position and rotation from being recomputed on every update.
https://github.com/mrdoob/three.js/blob/master/src/extras/helpers/EdgesHelper.js
Setting matrixAutoUpdate of the EdgesHelper to true should do the trick, but calling .updateMatrix() after setting the new position or rotation seems cleaner.
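For reference, a minimal sketch of both options, reusing the edges helper from the question above:
// Option 1: let three.js recompute the helper's matrix every frame.
edges.matrixAutoUpdate = true;
// Option 2: keep matrixAutoUpdate off and rebuild the matrix once after each manual change.
edges.position.set( 100, 0, 100 );
edges.updateMatrix();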

I had the same problem; as suggested, I added:
egh.matrixAutoUpdate = true;
So that the edges and the cube followed my coordinates, I used:
var cube = new THREE.Mesh(
    new THREE.CubeGeometry( width, height, depth ),
    new THREE.MeshBasicMaterial( { color: 0x00AA00, side: THREE.DoubleSide, opacity: 0.5, transparent: true } )
);
scene.add( cube );
var egh = new THREE.EdgesHelper( cube, 0x777777 );
egh.material.linewidth = 1;
egh.position.x = cube.position.x;
egh.position.y = cube.position.y;
egh.position.z = cube.position.z;
egh.matrixAutoUpdate = true; // this helped
scene.add( egh );

Related

three.js clone() property changes for all my meshes

I would like to create 100 clones of a simple cube and gradually decrease the opacity of each cube. Here's the loop I have:
var geometry = new THREE.BoxGeometry(0.15,0.15,0.15);
var material = new THREE.MeshNormalMaterial();
var cube = new THREE.Mesh( geometry, material );
cube.material.transparent = true;
scene.add( cube );
for (let i = 0; i < 100; i++) {
    window['cube' + i] = cube.clone();
    window['cube' + i].position.x = i;
    window['cube' + i].material.opacity = 1 - (0.01 * i);
    scene.add(window['cube' + i]);
}
Unfortunately, all my meshes end up with the last opacity that was set.
I don't understand why all my meshes have the same opacity while the x position increases normally.
Does anyone have an idea of how to give each mesh its own opacity property? Thank you
Cloning a mesh does not clone its geometry and material by default for performance reasons. If you want to control the opacity per mesh, it's best to clone the material for each instance.
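A minimal sketch of that fix, adapted from the loop above (the geometry can stay shared between the clones):
for (let i = 0; i < 100; i++) {
    const clone = cube.clone();
    clone.material = cube.material.clone(); // independent material per mesh
    clone.material.transparent = true;
    clone.material.opacity = 1 - 0.01 * i;
    clone.position.x = i;
    scene.add(clone);
}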

Align BoxGeometry between two 3D Points

I'm making a game where you can add objects to a world without using a grid. Now I want to make a footpath. When you click on "Create footpath", you can add a point to the world at the raycaster position. After you add a first point, you can add a second point. Once these two points are placed, a line/footpath becomes visible from the first point to the second one.
I can do this really simple with THREE.Line. See the code:
var lineGeometry = new THREE.Geometry();
lineGeometry.vertices.push( new THREE.Vector3(x1,0,z1), new THREE.Vector3(x2,0,z2) );
lineGeometry.computeLineDistances();
var lineMaterial = new THREE.LineBasicMaterial( { color: 0xFF0000 } );
var line = new THREE.Line( lineGeometry, lineMaterial );
scene.add(line);
But I can't add a texture to a simple line. Now I want to do the same thing with a mesh. I have the position of the first point and the raycaster position of the second point. I also have the length between the two points for the length of the footpath. But I don't know how I can get the rotation that is needed.
Note: I saw something about lookAt. Is that maybe a good idea, and how can I use it with a mesh?
Can anyone help me get the correct rotation for the footpath object?
I use this code for the footpath mesh:
var loader = new THREE.TextureLoader();
loader.load('images/floor.jpg', function ( texture ) {
    var geometry = new THREE.BoxGeometry(10, 0, 2);
    var material = new THREE.MeshBasicMaterial({ map: texture, overdraw: 0.5 });
    var footpath = new THREE.Mesh(geometry, material);
    footpath.position.copy(point2);
    var direction = // What can I do here?
    footpath.rotation.y = direction;
    scene.add(footpath);
});
I want to get the correct rotation for direction.
[UPDATE]
The code from WestLangley helps a lot, but it doesn't work in all directions. I used this code for the length:
var length = footpaths[i].position.z - point2.position.z;
What can I do so that the length works in all directions?
You want to align a box between two 3D points. You can do that like so:
var geometry = new THREE.BoxGeometry( width, height, length ); // align length with z-axis
geometry.translate( 0, 0, length / 2 ); // so one end is at the origin
...
var footpath = new THREE.Mesh( geometry, material );
footpath.position.copy( point1 );
footpath.lookAt( point2 );
three.js r.84
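Regarding the [UPDATE] about the length: a direction-independent length is just the Euclidean distance between the two points (e.g. via Vector3.distanceTo()) rather than a difference of z coordinates. A sketch, reusing the names from the answer above:
var length = point1.distanceTo( point2 ); // works in all directions
var geometry = new THREE.BoxGeometry( width, height, length );
geometry.translate( 0, 0, length / 2 ); // so one end is at the origin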

THREE.js Geometry map does not appear

In the following I'm loading an image map onto a custom geometry;
it represents the brown colored geometry in the picture above:
var aqua_ground_geo = new THREE.Geometry();
var top0 = new THREE.Vector3(aqua_ground_geo_x_NEG, user_data['aqua_soil_calc_b_y'], aqua_ground_geo_z_NEG);
var top1 = new THREE.Vector3(aqua_ground_geo_x_POS, user_data['aqua_soil_calc_b_y'], aqua_ground_geo_z_NEG);
var top2 = new THREE.Vector3(aqua_ground_geo_x_NEG, user_data['aqua_soil_calc_f_y'], aqua_ground_geo_z_POS);
aqua_ground_geo.vertices.push(top0);
aqua_ground_geo.vertices.push(top1);
aqua_ground_geo.vertices.push(top2);
aqua_ground_geo.faces.push( new THREE.Face3(0,1,2) );
aqua_ground_geo.computeFaceNormals();
aqua_ground_geo.computeVertexNormals();
var textureUrl = "http://www.lifeguider.de/wp-content/uploads/aquag/bodengrund/dennerle_kies_naturweiss_1-2mm.jpg";
var aqua_bodengrund_tex = new THREE.TextureLoader().load( textureUrl );
var aqua_bodengrund_mat = new THREE.MeshLambertMaterial( {
    map: aqua_bodengrund_tex,
    color: 0xffffff
} );
aqua_bodengrund_mat.shading = THREE.FlatShading;
aqua_bodengrund_mat.side = THREE.DoubleSide;
var aqua_bodengrund = new THREE.Mesh( aqua_ground_geo,aqua_bodengrund_mat);
With a simple THREE.BoxGeometry everything works as expected with the same material (it represents the cube in the picture above):
var lala = new THREE.BoxGeometry( 100, 100, 100 );
var lala2 = new THREE.Mesh( lala,aqua_bodengrund_mat);
I'm not an expert in 3D; what is missing in my code so that the image texture is shown correctly?
You need to apply the texture in the callback of the THREE.TextureLoader. Check also the documentation here; the second argument (onLoad) is the callback.
var textureUrl = "https://raw.githubusercontent.com/mrdoob/three.js/master/examples/textures/crate.gif";
var aqua_bodengrund_mat = new THREE.MeshLambertMaterial( {
    color: 0xffffff
} );
var onLoad = function ( texture ) {
    aqua_bodengrund_mat.map = texture;
    aqua_bodengrund_mat.needsUpdate = true;
};
var loader = new THREE.TextureLoader();
loader.load( textureUrl, onLoad );
See this fiddle for a demo.
UPDATE
In case you have a custom geometry, you also need to calculate the UVs for the texture to show correctly. I used this answer here to calculate them in another fiddle here.
Note: the UVs in my fiddle are calculated for faces in the XY plane; if your faces are in another plane, you will have to update them accordingly...
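As a rough sketch of that approach (assuming the legacy THREE.Geometry API from the question, and faces lying in the XY plane), planar UVs can be generated by normalizing each vertex against the geometry's bounding box:
geometry.computeBoundingBox();
var bbox = geometry.boundingBox;
var size = new THREE.Vector3().subVectors( bbox.max, bbox.min );
geometry.faceVertexUvs[0] = geometry.faces.map( function ( face ) {
    return [ face.a, face.b, face.c ].map( function ( index ) {
        var v = geometry.vertices[ index ];
        // map x/y into the 0..1 UV range
        return new THREE.Vector2( ( v.x - bbox.min.x ) / size.x, ( v.y - bbox.min.y ) / size.y );
    } );
} );
geometry.uvsNeedUpdate = true;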

Three.js - Things disappear when zooming out

In my three.js project I use a high z position for my camera.
When the z position is too high, my scene becomes black. So, when I zoom out it becomes black, but I don't want that to happen.
This is how it looks with camera.position.z = 3000;
And when I zoom out, by just one zoom step, it looks like this:
For the controls I use OrbitControls. My camera is set up like this:
var camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 1, 3000);
camera.position.z = 3000;
And here is the code for the sun and some of the planets' orbits:
var scene = new THREE.Scene();
var material = new THREE.MeshLambertMaterial({
    map: THREE.ImageUtils.loadTexture("assets/img/sun.jpg")
});
var sun = new THREE.Mesh(new THREE.SphereGeometry(200, 50, 50), material);
scene.add(sun);
var orbitLine = function (radius, y) {
    var segments = 64,
        line_material = new THREE.LineBasicMaterial({ color: 0xffffff }),
        geometry = new THREE.CircleGeometry(radius, segments);
    geometry.vertices.shift();
    var orbit = new THREE.Line(geometry, line_material);
    if (y)
        orbit.position.y = y;
    else
        orbit.position.y = 0;
    scene.add(orbit);
};
var Mercury_orbit = orbitLine(400,-70);
var Venus_orbit = orbitLine(700,70);
var Earth_orbit = orbitLine(900,70);
var Mars_orbit = orbitLine(1250,70);
var Jupiter_orbit = orbitLine(3000,70);
I couldn't provide a fiddle because for some reason it didn't work.
If you need more code, tell me in the comments and I will add it.
Any ideas?
Thanks.
Your camera's far plane is at 3000, which means everything that is more than 3000 units away from the camera will be clipped and not drawn.
At the same time you have placed your camera at (0, 0, 3000), so you are right at the position where things start to disappear.
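A minimal sketch of the fix: push the far plane out past the farthest geometry. The 10000 here is an arbitrary assumption; pick a value that comfortably covers the Jupiter orbit at radius 3000 from any camera position:
var camera = new THREE.PerspectiveCamera(
    45, window.innerWidth / window.innerHeight,
    1,     // near plane
    10000  // far plane, was 3000
);
camera.position.z = 3000;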

Threejs PlaneGeometry doesn't receive shadows or reflections

I'm pretty new to 3D and to three.js, and I can't figure out how I can get a PlaneGeometry to show individually illuminated polygons, i.e. receive shadows or show reflections. What I basically do is take a PlaneGeometry and apply some noise to the z value of every vertex. Then I have a simple directional light in my scene which is supposed to make the emerging noise pattern on the plane visible. I tried different things like plane.castShadow = true or renderer.shadowMapEnabled = true without success. Am I just missing a simple option, or is this way more complicated than I think?
Here are the relevant pieces of my code:
renderer.setSize(width, height);
renderer.setClearColor(0x111111, 1);
...
var directionalLight = new THREE.DirectionalLight( 0xffffff, 0.9);
directionalLight.position.set(10, 2, 20);
directionalLight.castShadow = true;
directionalLight.shadowCameraVisible = true;
scene.add( directionalLight );
var geometry = new THREE.PlaneGeometry(20, 20, segments, segments);
var index = 0;
for (var i = 0; i < segments + 1; i++) {
    for (var j = 0; j < segments + 1; j++) {
        zOffset = simplex.noise2D(i * xNoiseScale, j * yNoiseScale) * 5;
        geometry.vertices[index].z = zOffset;
        index++;
    }
}
var material = new THREE.MeshLambertMaterial({
    side: THREE.DoubleSide,
    color: 0xf50066
});
var plane = new THREE.Mesh(geometry, material);
plane.rotation.x = -Math.PI / 2.35;
plane.castShadow = true;
plane.receiveShadow = true;
scene.add(plane);
This is the output I get. Obviously the plane is aware of the light, because the bottom side is darker than the upper side, but there is no sign of individual polygons being lit individually, and no 3D structure is visible. Interestingly, when I put in a different geometry like a BoxGeometry, individual polygons are illuminated individually (see the 2nd image). Any ideas?
OK, I figured it out thanks to this post. The trick is to use THREE.FlatShading on the material (it is a shading mode, not a separate shader). It is important to note that after every update of the vertices two things need to be done before rendering: geometry.computeFaceNormals() must be called, because when you alter the vertices the normals are not the same anymore, and geometry.normalsNeedUpdate must be set to true so the renderer incorporates the recomputed normals.
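A minimal sketch of that fix, applied to the material from the question (note that in later three.js releases shading: THREE.FlatShading was replaced by flatShading: true):
var material = new THREE.MeshLambertMaterial({
    side: THREE.DoubleSide,
    color: 0xf50066,
    shading: THREE.FlatShading
});
// after every vertex update, before rendering:
geometry.computeFaceNormals();
geometry.normalsNeedUpdate = true;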
