Three.js shadows acting weirdly - javascript

I am trying to create a little solar system but found a bug... or a feature. I'd like all of my planets to be able to cast and receive shadows from all other planets. However, it seems as if whether shadows are cast or not depends on the instancing (creation) order.
Code for the light and shadows:
const sunLight = new THREE.PointLight(0xffffff, 3, 100);
sunLight.position.set(0, 0, 0);
sunLight.castShadow = true;
scene.add(sunLight);
//Set up shadow properties for the light
sunLight.shadow.mapSize.width = 512; // default
sunLight.shadow.mapSize.height = 512; // default
sunLight.shadow.camera.near = 0.5; // default
sunLight.shadow.camera.far = 500; // default
const sphereSize = 1;
const pointLightHelper = new THREE.PointLightHelper(sunLight, sphereSize);
scene.add(pointLightHelper);
const shadowHelper = new THREE.CameraHelper( sunLight.shadow.camera );
scene.add( shadowHelper );
Basic code for the planet objects:
var earth = new THREE.Mesh(
new THREE.SphereGeometry(1, 32, 16),
new THREE.MeshStandardMaterial({
map: tLoader.load("/textures/nasa-world.jpg"),
bumpMap: tLoader.load("/textures/nasa-jpl-world-bump.png"),
bumpScale: 0.01,
}));
earth.castShadow = true;
earth.receiveShadow = true;
// position goes here
scene.add(earth);
var mars = new THREE.Mesh(
new THREE.SphereGeometry(0.53, 32, 16),
new THREE.MeshStandardMaterial({
map: tLoader.load("/textures/nasa-mars.jpg"),
bumpMap: tLoader.load("/textures/nasa-mars-bump.png"),
bumpScale: 0.01,
}));
mars.castShadow = true;
mars.receiveShadow = true;
//position goes here
scene.add(mars);
Case 1 (working shadow):
earth.position.x = 18;
mars.position.x = 15;
(https://ibb.co/gS26Sfz)
Case 2 (not working):
earth.position.x = 15;
mars.position.x = 18;
(https://ibb.co/PZrh2wS)
Case 3 (not sure why, but it works):
When I switch the instancing order around (I first instance mars, then earth), Case 2 DOES work.
(https://ibb.co/pRz06b1)
It does seem to me that only objects that are instanced BEFORE the objects that cast shadows can actually receive them. I cannot imagine, though, that this is truly a limitation; I am probably doing something wrong.
Please help me: how can I make both objects cast and receive shadows from one another?

After playing around and reading some more documentation, it seems as if this behavior is built in. The instancing order does seem to determine what can cast and receive shadows; in other words, the meshes that are to receive a shadow must be instanced before the objects casting it. This is quite a limitation, in some ways.
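For comparison, here is a minimal, self-contained sketch of the same two-sphere setup with plain colors instead of textures. The renderer setup is not shown in the question, so the renderer.shadowMap lines below are an assumption about that part:
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.shadowMap.enabled = true;                 // required for any shadows to render
renderer.shadowMap.type = THREE.PCFSoftShadowMap;  // optional, softer shadow edges
document.body.appendChild(renderer.domElement);
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.set(0, 10, 40);
camera.lookAt(scene.position);
const sunLight = new THREE.PointLight(0xffffff, 3, 100);
sunLight.castShadow = true;
scene.add(sunLight);
// Helper that builds a shadow-casting, shadow-receiving sphere at a given x offset
function makePlanet(radius, color, x) {
  const mesh = new THREE.Mesh(
    new THREE.SphereGeometry(radius, 32, 16),
    new THREE.MeshStandardMaterial({ color: color })
  );
  mesh.castShadow = true;
  mesh.receiveShadow = true;
  mesh.position.x = x;
  scene.add(mesh);
  return mesh;
}
const earth = makePlanet(1, 0x2266ff, 15);   // closer to the light
const mars = makePlanet(0.53, 0xff5533, 18); // should receive earth's shadow
renderer.render(scene, camera);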

Related

In THREE.js, how to map one texture to a 3D rectangle

I'm trying to make a box in THREE that represents a box of 2x4 Legos, 24 pieces wide by 48 pieces long and an arbitrary number of pieces tall. I've generated a texture that shows this pattern using random colors:
I need to show two sides of this cube, but the textures have to align so that the pieces on the edges are the same colors, like so (generated in Blender):
I'd really prefer not to make six images for a CubeTexture, particularly since four are not visible. Is it possible to flip the texture on one side so that they appear to align? (We're just going for visual effect here.)
Further, not all 3D rectangles will be cubes, but I can't quite figure out how to set the texture.repeat.x and texture.repeat.y so that the x is scaled correctly and the y is at the same scale, but just cuts off when the height of the object ends, like so:
Thanks!
You can flip an image by flipping the UVs.
You'll need to figure out which UVs correspond to the face you're trying to mirror, and which direction to flip them (not sure how your geometry is created).
Here's an example using a basic BoxBufferGeometry and modifying its uv attribute. (The face on the right is the mirrored-by-UV-flipping face.)
var textureURL = "https://upload.wikimedia.org/wikipedia/commons/0/02/Triangular_hebesphenorotunda.png";
// attribution and license here: https://commons.wikimedia.org/wiki/File:Triangular_hebesphenorotunda.png
var renderer = new THREE.WebGLRenderer({antialias:true});
document.body.appendChild(renderer.domElement);
renderer.setSize(500, 500);
var textureLoader = new THREE.TextureLoader();
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(28, 1, 1, 1000);
camera.position.set(50, 25, 50);
camera.lookAt(scene.position);
scene.add(camera);
camera.add(new THREE.PointLight(0xffffff, 1, Infinity));
var cubeGeo = new THREE.BoxBufferGeometry(20, 20, 20);
var uvs = cubeGeo.attributes.uv;
// originally:
// [0] = 0,1
// [1] = 1,1
// [2] = 0,0
// [3] = 1,0
// convert to:
// [0] = 1,1
// [1] = 0,1
// [2] = 1,0
// [3] = 0,0
uvs.setX(0, 1);
uvs.setY(0, 1);
uvs.setX(1, 0);
uvs.setY(1, 1);
uvs.setX(2, 1);
uvs.setY(2, 0);
uvs.setX(3, 0);
uvs.setY(3, 0);
uvs.needsUpdate = true;
var mat = new THREE.MeshLambertMaterial({
color: "white",
map: textureLoader.load(textureURL, function(){
animate();
})
});
var mesh = new THREE.Mesh(cubeGeo, mat);
scene.add(mesh);
function render() {
renderer.render(scene, camera);
}
function animate() {
requestAnimationFrame(animate);
render();
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/91/three.min.js"></script>
You can create six PlaneBufferGeometries, assign them the same material, and then position them to form a cube. Rotate them in 90-degree increments until you reach the desired result. For performance reasons, you could merge these back into a single BufferGeometry afterwards.
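A rough sketch of that approach (the size, positions, and rotation values are illustrative, and a texture and scene are assumed to exist already; all six planes share one material):
var size = 20;
var half = size / 2;
var sharedMat = new THREE.MeshLambertMaterial({ map: texture }); // 'texture' assumed to be loaded elsewhere
var planeGeo = new THREE.PlaneBufferGeometry(size, size);
// [x, y, z, rotX, rotY] for each face of the cube
var faces = [
  [0, 0,  half,  0,            0           ], // front
  [0, 0, -half,  0,            Math.PI     ], // back
  [ half, 0, 0,  0,            Math.PI / 2 ], // right
  [-half, 0, 0,  0,           -Math.PI / 2 ], // left
  [0,  half, 0, -Math.PI / 2,  0           ], // top
  [0, -half, 0,  Math.PI / 2,  0           ]  // bottom
];
faces.forEach(function (f) {
  var plane = new THREE.Mesh(planeGeo, sharedMat);
  plane.position.set(f[0], f[1], f[2]);
  plane.rotation.set(f[3], f[4], 0);
  scene.add(plane);
});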
You can export the model you made in Blender, either using the THREE.js JSON exporter or a format like OBJ or glTF, and load and render it directly.
What you are describing is simply having the UVs laid out the way you have them in Blender, so if you need that level of control, it's probably easier to just load the model instead of trying to generate it.
If you use either the three.js .json or .gltf format, both exporters have an option to embed the textures directly in the export. This can make it easier to get things working quickly, at the expense of possibly less efficient storage.
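If you go the export route, loading the result is only a few lines. A sketch, assuming the model was exported as glTF to a hypothetical path and that examples/js/loaders/GLTFLoader.js is included on the page:
var loader = new THREE.GLTFLoader();
loader.load('models/lego-box.gltf', function (gltf) { // hypothetical path to the exported model
  scene.add(gltf.scene);                              // the loaded scene graph from the glTF file
}, undefined, function (error) {
  console.error(error);
});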

three js spotlight above r73

I have a problem with spotlights. I was using r73 and had 50 simple spotlights without shadows etc.; it worked without problems, still 60 fps even on mobile.
Now I have changed to r84 (the problem occurs above r73), and the spotlights are of much better quality, but they also drop my frame rate. I know there were some changes in r74 that added the penumbra option; I don't really understand how I can turn the quality down.
In the fiddle you don't see the quality changes, but that doesn't matter; the frame rate still drops.
So my question: is it possible to set up the spotlights in a way that still gives me 60 fps?
The problem occurs only when the mesh (the floor) is big enough.
var spotLightSize=50;
var spotLight=[];
var geometry = new THREE.BoxGeometry( 500, 1, 500 );
var material = new THREE.MeshPhongMaterial( {color: "blue"} );
var floor = new THREE.Mesh( geometry, material );
var renderer = new THREE.WebGLRenderer({precision:"lowp",alpha:true});
for (var i=0;i<spotLightSize;i++){
spotLight.push(new THREE.SpotLight("green", 2, 20, 0.1, 0, 1));
spotLight[spotLight.length-1].position.set( 0, 5, 0 );
scene.add(spotLight[spotLight.length-1]);
var spotLightHelper = new THREE.SpotLightHelper( spotLight[spotLight.length-1] );
scene.add( spotLightHelper );
}
http://jsfiddle.net/killerkarnikel/hyqgjLLz/19/

Three js children length

I created a three.js object and added some children to it. Then I changed the length of children to 0, and the objects went off the screen. Will that fully remove the objects from the screen and from memory?
var balls = new THREE.Object3D(); // parent
For creating the children:
var geometry = new THREE.SphereGeometry(5, 32, 32);
var material = new THREE.MeshPhongMaterial({color: 0x0f0ff0, shininess: 50, transparent: true, opacity: 1});
var sphere = new THREE.Mesh(geometry, material);
sphere.position.x = scale('some random value');
sphere.position.y = scale('some random value');
balls.add(sphere);
The above steps are repeated for more spheres.
Then, in the console, I wrote:
balls.children = [];
This removes all the spheres from the scene. Will that also remove all the sphere objects from memory?
Yes, when you have an array and set array.length = 0;, all of its elements are removed. When you set array.length = 2, all elements other than the first two are removed. JavaScript also has splice(), which removes elements in place and does a similar thing. Note, however, that this only empties the JavaScript array; GPU resources such as geometries and materials still need to be disposed of (see the next answer).
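A quick illustration of those two behaviors (plain JavaScript, independent of three.js):
var arr = ['a', 'b', 'c', 'd'];
arr.length = 2;    // arr is now ['a', 'b']
arr.length = 0;    // arr is now []
var arr2 = ['a', 'b', 'c', 'd'];
arr2.splice(1, 2); // removes 'b' and 'c' in place; arr2 is now ['a', 'd']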
The correct way to delete a child is to call remove(child) on its parent, and then call dispose() on the child's material and geometry.
In your code:
var balls = new THREE.Object3D(); // parent
var geometry = new THREE.SphereGeometry(5, 32, 32);
var material = new THREE.MeshPhongMaterial({color: 0x0f0ff0, shininess: 50, transparent: true, opacity: 1});
var sphere = new THREE.Mesh(geometry, material);
sphere.position.x = scale('some random value');
sphere.position.y = scale('some random value');
balls.add(sphere);
// Do some work
balls.remove(sphere);
geometry.dispose();
material.dispose();
Dispose of the material/geometry only when they are no longer used by any other mesh.
From THREE.Object3D at remove(object, ...):
"Removes object as child of this object. An arbitrary number of objects may be removed."
From THREE.Geometry at dispose():
"Don't forget to call this method when you remove a geometry because it can cause memory leaks."
From THREE.Material at dispose():
"This disposes the material. Textures of a material don't get disposed. These needs to be disposed by Texture."
If you use textures, you must dispose these too.
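A sketch of the full cleanup for one sphere, assuming its material has a texture in map (the property name would differ for bump maps and so on):
balls.remove(sphere);            // detach from the scene graph
sphere.geometry.dispose();       // free the geometry's GPU buffers
if (sphere.material.map) {
  sphere.material.map.dispose(); // free the texture
}
sphere.material.dispose();       // free the material's GPU resources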
(THREE.js r85).

How to merge two geometries or meshes using three.js r71?

I bumped into a problem here, since I need to merge two geometries (or meshes) into one. In earlier versions of three.js there was a nice function for this:
THREE.GeometryUtils.merge(pendulum, ball);
However, it is not in the new version anymore.
I tried to merge pendulum and ball with the following code (ball is a mesh):
var ballGeo = new THREE.SphereGeometry(24,35,35);
var ballMat = new THREE.MeshPhongMaterial({color: 0xF7FE2E});
var ball = new THREE.Mesh(ballGeo, ballMat);
ball.position.set(0,0,0);
var pendulum = new THREE.CylinderGeometry(1, 1, 20, 16);
ball.updateMatrix();
pendulum.merge(ball.geometry, ball.matrix);
scene.add(pendulum);
After all, I got the following error:
THREE.Object3D.add: object not an instance of THREE.Object3D. THREE.CylinderGeometry {uuid: "688B0EB1-70F7-4C51-86DB-5B1B90A8A24C", name: "", type: "CylinderGeometry", vertices: Array[1332], colors: Array[0]…}
THREE.error # three_r71.js:35
THREE.Object3D.add # three_r71.js:7770
(anonymous function) # pendulum.js:20
To explain Darius' answer more clearly (as I struggled with it, while trying to update a version of Mr Doob's procedural city to work with the Face3 boxes):
Essentially you are merging all of your Meshes into a single Geometry. So, if you, for instance, want to merge a box and sphere:
var box = new THREE.BoxGeometry(1, 1, 1);
var sphere = new THREE.SphereGeometry(.65, 32, 32);
...into a single geometry:
var singleGeometry = new THREE.Geometry();
...you would create a Mesh for each geometry:
var boxMesh = new THREE.Mesh(box);
var sphereMesh = new THREE.Mesh(sphere);
...then call the merge method of the single geometry for each, passing the geometry and matrix of each into the method:
boxMesh.updateMatrix(); // as needed
singleGeometry.merge(boxMesh.geometry, boxMesh.matrix);
sphereMesh.updateMatrix(); // as needed
singleGeometry.merge(sphereMesh.geometry, sphereMesh.matrix);
Once merged, create a mesh from the single geometry and add to the scene:
var material = new THREE.MeshPhongMaterial({color: 0xFF0000});
var mesh = new THREE.Mesh(singleGeometry, material);
scene.add(mesh);
A working example:
<!DOCTYPE html>
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/r77/three.js"></script>
<!-- OrbitControls.js is not versioned and may stop working with r77 -->
<script src='http://threejs.org/examples/js/controls/OrbitControls.js'></script>
<body style='margin: 0px; background-color: #bbbbbb; overflow: hidden;'>
<script>
// init renderer
var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
// init scene and camera
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 0.01, 3000);
camera.position.z = 5;
var controls = new THREE.OrbitControls(camera)
// our code
var box = new THREE.BoxGeometry(1, 1, 1);
var sphere = new THREE.SphereGeometry(.65, 32, 32);
var singleGeometry = new THREE.Geometry();
var boxMesh = new THREE.Mesh(box);
var sphereMesh = new THREE.Mesh(sphere);
boxMesh.updateMatrix(); // as needed
singleGeometry.merge(boxMesh.geometry, boxMesh.matrix);
sphereMesh.updateMatrix(); // as needed
singleGeometry.merge(sphereMesh.geometry, sphereMesh.matrix);
var material = new THREE.MeshPhongMaterial({color: 0xFF0000});
var mesh = new THREE.Mesh(singleGeometry, material);
scene.add(mesh);
// a light
var light = new THREE.HemisphereLight(0xfffff0, 0x101020, 1.25);
light.position.set(0.75, 1, 0.25);
scene.add(light);
// render
requestAnimationFrame(function animate(){
requestAnimationFrame(animate);
renderer.render(scene, camera);
})
</script>
</body>
At least, that's how I am interpreting things; apologies to anyone if I have something wrong, as I am nowhere close to being a three.js expert (currently learning). I just had the "bad luck" to try my hand at customizing Mr. Doob's procedural city code when the latest version breaks things (the merge stuff being one of them, and the fact that three.js no longer uses quads for cube (ahem, box) geometry the other, which has led to all kinds of fun getting the shading and such to work properly again).
Finally, I found a possible solution. I am posting it since it could be useful for somebody else; I wasted a lot of hours on this. The tricky part is keeping the concepts of meshes and geometries straight:
var ballGeo = new THREE.SphereGeometry(10,35,35);
var material = new THREE.MeshPhongMaterial({color: 0xF7FE2E});
var ball = new THREE.Mesh(ballGeo, material);
var pendulumGeo = new THREE.CylinderGeometry(1, 1, 50, 16);
ball.updateMatrix();
pendulumGeo.merge(ball.geometry, ball.matrix);
var pendulum = new THREE.Mesh(pendulumGeo, material);
scene.add(pendulum);
The error message is right. CylinderGeometry is not an Object3D. Mesh is. A Mesh is constructed from a Geometry and a Material. A Mesh can be added to the scene, while a Geometry cannot.
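In terms of the question's code, the fix this implies is to wrap the merged geometry in a Mesh before adding it to the scene (ballMat is reused here just for illustration):
pendulum.merge(ball.geometry, ball.matrix);
var pendulumMesh = new THREE.Mesh(pendulum, ballMat); // Mesh = Geometry + Material
scene.add(pendulumMesh);                              // a Mesh is an Object3D, so this works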
In the newest versions of three.js, Geometry has two merge methods: merge and mergeMesh.
merge takes a mandatory argument geometry, and two optional arguments matrix and materialIndexOffset.
geom.mergeMesh(mesh) is basically a shorthand for geom.merge(mesh.geometry, mesh.matrix), as used in other answers. ('geom' and 'mesh' being arbitrary names for a Geometry and a Mesh, respectively.) The Material of the Mesh is ignored.
This is my ultimate compact version in four (or five) lines (as long as material is defined somewhere else) making use of mergeMesh:
var geom = new THREE.Geometry();
geom.mergeMesh(new THREE.Mesh(new THREE.BoxGeometry(2,20,2)));
geom.mergeMesh(new THREE.Mesh(new THREE.BoxGeometry(5,5,5)));
geom.mergeVertices(); // optional
scene.add(new THREE.Mesh(geom, material));
Edit: added optional extra line to remove duplicate vertices, which should help performance.
Edit 2: I'm using the newest version, 94.
The answers and code posted here did not work for me because, in current versions of three.js (where Geometry has been removed), the second argument of BufferGeometry's merge method is an integer offset, not a matrix. As far as I can tell, that merge method is not really usable for this purpose. Therefore, I used the following approach to make a simple rocket with a nose cone.
import * as BufferGeometryUtils from '../three.js/examples/jsm/utils/BufferGeometryUtils.js'
const lengthSegments = 2
const radius = 5
const radialSegments = 32
const bodyLength = dParamWithUnits['launchVehicleBodyLength'].value
const noseConeLength = dParamWithUnits['launchVehicleNoseConeLength'].value
// Create the vehicle's body
const launchVehicleBodyGeometry = new THREE.CylinderGeometry(radius, radius, bodyLength, radialSegments, lengthSegments, false)
launchVehicleBodyGeometry.name = "body"
// Create the nose cone
const launchVehicleNoseConeGeometry = new THREE.CylinderGeometry(0, radius, noseConeLength, radialSegments, lengthSegments, false)
launchVehicleNoseConeGeometry.name = "noseCone"
launchVehicleNoseConeGeometry.translate(0, (bodyLength+noseConeLength)/2, 0)
// Merge the nosecone into the body
const launchVehicleGeometry = BufferGeometryUtils.mergeBufferGeometries([launchVehicleBodyGeometry, launchVehicleNoseConeGeometry])
// Rotate the vehicle to horizontal
launchVehicleGeometry.rotateX(-Math.PI/2)
const launchVehicleMaterial = new THREE.MeshPhongMaterial( {color: 0x7f3f00})
const launchVehicleMesh = new THREE.Mesh(launchVehicleGeometry, launchVehicleMaterial)

How to get correct values for normals in threejs?

I don’t understand how normals are computed in threejs.
Here is my problem :
I create a simple plane
var plane = new THREE.PlaneGeometry(10, 100, 10, 10);
var material = new THREE.MeshBasicMaterial();
material.setValues({side: THREE.DoubleSide, color: 0xaabbcc});
var mesh = new THREE.Mesh(plane, material);
mesh.rotateY(Math.PI / 2);
scene.add(mesh);
When I read the normal of this plane, I get (0, 0, 1).
But the plane is parallel to the z axis so the value is wrong.
I tried adding
mesh.geometry.computeFaceNormals();
mesh.geometry.computeVertexNormals();
but I still get the same result.
Did I miss anything?
How can I get correct values for normals from three.js?
Thanks.
Geometry normals are in object space. To transform them to world space, first make sure the object matrix is updated.
object.updateMatrixWorld();
(The renderer does this for you in each render loop, so you may be able to skip this step.)
Then, compute the normal matrix:
var normalMatrix = new THREE.Matrix3().getNormalMatrix( object.matrixWorld );
Now transform the normal to world space like so:
var newNormal = normal.clone().applyMatrix3( normalMatrix ).normalize();
three.js r.66
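Putting those steps together with the plane from the question (a sketch; the first face is used for illustration):
var normal = mesh.geometry.faces[0].normal;   // (0, 0, 1) in object space
mesh.updateMatrixWorld();
var normalMatrix = new THREE.Matrix3().getNormalMatrix(mesh.matrixWorld);
var worldNormal = normal.clone().applyMatrix3(normalMatrix).normalize();
console.log(worldNormal);                     // approximately (1, 0, 0) after mesh.rotateY(Math.PI / 2)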
