I'm trying to implement the code from this tutorial, but at a much larger scale (radius = 100000 units).
I don't know if the size matters, but on my earth render the clouds show a strange artifact.
As the tutorial does, I'm using two spheres and three textures (earth map, bump map, clouds).
Here is the result (it gets worse when the clouds are closer):
The closer the clouds are to the planet surface, the more visible this glitch becomes. If the clouds are sufficiently far away (which isn't realistic) the problem disappears completely.
What can I do?
Use a logarithmic depth buffer instead of the linear one. This is a very simple change: just enable logarithmicDepthBuffer when you create your THREE.WebGLRenderer, like so:
var renderer = new THREE.WebGLRenderer({ antialias: true, logarithmicDepthBuffer: true });
Here's an example you can have a look at:
http://threejs.org/examples/#webgl_camera_logarithmicdepthbuffer
Using polygonOffset as suggested by LJ_1102 is a possibility, but it shouldn't be necessary.
What you're experiencing is z-fighting due to insufficient depth buffer resolution.
You basically have three options to counteract this:
Write or use a multi-texture shader that renders all three textures on one sphere (see the sketch after this list).
Increase the distance between the two sphere surfaces, or decrease the distance between your near and far clipping planes.
Use polygonOffset and the POLYGON_OFFSET_FILL renderstate to offset the depth values written by your outer sphere. Read more about polygonOffset here.
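For the first option, here is a minimal sketch of blending the surface and cloud textures on a single sphere with a ShaderMaterial, so there is no second sphere to z-fight with. The file names and uniform names are my assumptions, not from the tutorial, and the bump map is left out for brevity:
var loader = new THREE.TextureLoader();

var material = new THREE.ShaderMaterial({
    uniforms: {
        mapSurface: { value: loader.load('earth_map.jpg') }, // placeholder file names
        mapClouds:  { value: loader.load('clouds.png') }     // clouds with an alpha channel
    },
    vertexShader: [
        'varying vec2 vUv;',
        'void main() {',
        '    vUv = uv;',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
        '}'
    ].join('\n'),
    fragmentShader: [
        'uniform sampler2D mapSurface;',
        'uniform sampler2D mapClouds;',
        'varying vec2 vUv;',
        'void main() {',
        '    vec4 surface = texture2D( mapSurface, vUv );',
        '    vec4 clouds = texture2D( mapClouds, vUv );',
        '    // blend the cloud layer over the surface by its alpha',
        '    gl_FragColor = vec4( mix( surface.rgb, clouds.rgb, clouds.a ), 1.0 );',
        '}'
    ].join('\n')
});

scene.add(new THREE.Mesh(new THREE.SphereGeometry(100000, 64, 64), material));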
I'm using Three.js to render a few textured planes, one on top of another.
The distance between the planes is 10 units (though I'm not sure exactly what these units correspond to).
All the planes are MeshBasicMaterial, with the following configuration:
let frontMaterial = new THREE.MeshBasicMaterial( {
    map: getFrontCover(),
    side: THREE.FrontSide
} );
When the planes are placed at the regular distance of 10, I can see strange stripes rendered, like in the picture:
Assuming that the distance is mandatory, how can I solve this issue?
Thanks
There are a few things you can do.
Make the near and far settings of your perspective camera fit your content as tightly as possible.
For example, let's assume the camera is 2 units from the book and the book is 1 unit deep. In that case, setting near and far to 0.5 and 3.5 would possibly solve the issue. In other words:
.... new THREE.PerspectiveCamera(fieldOfView, aspect, 0.5 /* near */, 3.5 /* far */)
If those numbers are orders of magnitude off, you'll get this issue.
Use a logarithmic depth buffer
See this example: https://threejs.org/examples/?q=log#webgl_camera_logarithmicdepthbuffer
Set the material's polygon offset
You want to set this on the material of the mesh that is supposed to be behind:
material.polygonOffset = true;
material.polygonOffsetFactor = 1;
material.polygonOffsetUnits = 1;
This THREE.BoxHelper is wildly inaccurate, and the position of the cube is wrong when drawing a line to it!?
See the proof of concept JSFiddle: https://jsfiddle.net/can35bj0/15/
cubeBox = new THREE.BoxHelper(cube, 0xffff00);
scene.add(cubeBox);
cube.position.copy(positionVector);
// move the last vertex of the trace line to the cube's current position
cubeTrace.geometry.vertices[cubeTrace.geometry.vertices.length - 1].copy(cube.position);
Why is this, and is there a way to fix this? So far I've come up empty...
P.S. The scale needs to stay small while the position stays large.
I've more or less concluded that this is due to a "long vector" problem in THREE.js, i.e. a loss of 32-bit floating-point precision far from the origin.
When an object (such as the cube in the JSFiddle) sits at the end of a long arm from the origin compared to its size, in this case a positionVector of roughly 100,000 units against a cube size of 0.001 units, positions become erratic and fluctuate, as can be seen from the weird behavior of the BoxHelper. (Note that the cube is set to move slightly every second and the camera moves with it.)
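One workaround, sketched below under my own assumptions rather than anything built into THREE.js, is a "floating origin": keep the rendered scene near the origin by subtracting a large fixed offset from every world position in JavaScript (where numbers are double precision) before handing the values to three.js:
// Hypothetical floating-origin helper; worldOrigin marks the far-away region of interest.
var worldOrigin = new THREE.Vector3(100000, 0, 0);

function toLocal(worldPosition) {
    // The subtraction happens in JS doubles, so precision survives;
    // only the small local result ends up in 32-bit GPU buffers.
    return worldPosition.clone().sub(worldOrigin);
}

cube.position.copy(toLocal(cubeWorldPosition));     // cubeWorldPosition is an assumed variable
camera.position.copy(toLocal(cameraWorldPosition)); // cameraWorldPosition is an assumed variable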
I have placed 3D objects with translateZ. They look fine when paused, stacked on top of each other. But when I rotate the scene, the objects merge into each other. Can someone tell me how to get rid of this issue?
The black block is on top of the brown one, but when I rotate the object, the positions get disturbed.
I see you have large faces in the models; I think that's a z-buffer problem.
Try a logarithmic depth buffer and see what happens:
var renderer = new THREE.WebGLRenderer({logarithmicDepthBuffer: true});
I'm trying to generate some terrain in the low-poly style; for reference, something like this:
What I mean by this is that each triangle is a single flat shade.
When I attempt something like this, the shading is very smooth. Here's an example with only a few triangles:
I also tried adding shadows, but that didn't create the desired effect either. Here's a shot with more triangles and shadows enabled:
Looking through the Three documentation, the shading property on the materials class sounds like it would do the trick, but THREE.FlatShading and THREE.NoShading don't seem to have any effect.
Is there a special technique I need to use to create this effect? Any direction you can point me in would be much appreciated.
You can find my first demo here
Many thanks,
Will
EDIT: This answer was outdated. Updating:
material.shading = THREE.FlatShading is now material.flatShading = true.
You modified the vertex positions of your PlaneGeometry.
To generate flat shading with MeshLambertMaterial, you must update your normals by calling
geometry.computeFlatVertexNormals();
For other materials, simply setting material.flatShading = true is sufficient to get the flat look.
three.js r.87
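For reference, a minimal sketch of that recipe, assuming a displaced PlaneGeometry, a MeshLambertMaterial, and a light already in the scene (r87-era Geometry API):
var geometry = new THREE.PlaneGeometry(100, 100, 20, 20);

// Displace the vertices to create the low-poly terrain.
for (var i = 0; i < geometry.vertices.length; i++) {
    geometry.vertices[i].z = Math.random() * 10;
}

// Give every face a single normal so each triangle gets one flat shade.
geometry.computeFlatVertexNormals();

var material = new THREE.MeshLambertMaterial({ color: 0x88aa55 });
scene.add(new THREE.Mesh(geometry, material));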
I am using the Three.JS library to display a point cloud in a web browser. The point cloud is generated once at start up and no further points are added or removed. But it does need to be rotated, panned and zoomed. I've gone through the tutorial about creating particles in three.js here.
Using the example, I can create particles that are squares, or use an image of a sphere to create a texture. The image is closer to what I want, but is it possible to generate the point cloud without using the image? With sphere geometry, for example.
The problem with the image is that when you have thousands of points, it seems they sometimes obscure each other around the edges. From what I can gather, it seems like the black region in a point's PNG file blocks the image immediately behind the current point (but it is transparent to points further behind).
This obscuring of the images is the reason I would like to generate the points using shapes. I have tried replacing particles = new THREE.Geometry() with THREE.SphereGeometry(radius, segments, rings) and tried to change the vertices to spheres.
So my question is. How do I modify the example code so that it renders spheres (or points) instead of squares? Also, is a particle system the most efficient system for my particular case or should I just generate the particles and set their individual positions? As I mentioned I only generate the points once, but then rotate, zoom, pan the points. (I used the TrackBall sample code to get the mouse events working).
Thanks for your help
I don't think rendering a point cloud with spheres is very efficient. You should be able to get away with a particle system and use a texture or a small canvas program to draw a circle.
One of the first three.js samples uses a canvas program; here are the important bits:
var PI2 = Math.PI * 2;

// Canvas "program": draws a filled unit circle into the particle's 2D context.
var program = function ( context ) {
    context.beginPath();
    context.arc( 0, 0, 1, 0, PI2, true );
    context.closePath();
    context.fill();
};

// THREE.Particle and THREE.ParticleCanvasMaterial are CanvasRenderer-era APIs.
var particle = new THREE.Particle( new THREE.ParticleCanvasMaterial( {
    color: Math.random() * 0x808008 + 0x808080,
    program: program
} ) );
Feel free to adapt the code for the WebGL renderer.
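For example, here is a hedged sketch of the same idea adapted to the WebGL renderer, using the modern Points / PointsMaterial API; pointData is an assumed flat array of x, y, z coordinates:
// Draw a circle on a small canvas once and use it as a point sprite texture.
function makeCircleTexture(size) {
    var canvas = document.createElement('canvas');
    canvas.width = canvas.height = size;
    var ctx = canvas.getContext('2d');
    ctx.beginPath();
    ctx.arc(size / 2, size / 2, size / 2, 0, Math.PI * 2);
    ctx.fillStyle = '#ffffff';
    ctx.fill();
    return new THREE.CanvasTexture(canvas);
}

var geometry = new THREE.BufferGeometry();
// setAttribute replaced addAttribute in recent three.js versions.
geometry.setAttribute('position', new THREE.BufferAttribute(new Float32Array(pointData), 3));

var material = new THREE.PointsMaterial({
    size: 2,
    map: makeCircleTexture(64),
    transparent: true,
    alphaTest: 0.5 // discard the transparent corners of each sprite
});

scene.add(new THREE.Points(geometry, material));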
Another clever solution I've seen in the examples is using an encoded WebM video to store the data and passing that to a GLSL shader, which is rendered through a particle system in three.js.
If your point cloud comes from a Kinect, these resources might be useful:
DepthCam
KinectJS
When comparing my code to http://threejs.org/examples/#webgl_custom_attributes_particles3
I saw the only difference was:
vec4 outColor = texture2D( texture, gl_PointCoord );
if ( outColor.a < 0.5 ) discard;
gl_FragColor = outColor;
Added to the fragment shader, this fixed the problem for me.
It wasn't z-fighting: some corners would randomly overlap distant particles.
material.alphaTest = 0.5 didn't work, and turning off depth writes/tests messed up the viewing order.
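For context, this is roughly where that snippet lives: the fragment shader of the particle system's ShaderMaterial, which discards mostly-transparent texels. The sprite file name, point size, and uniform names are assumptions:
var material = new THREE.ShaderMaterial({
    uniforms: { texture: { value: new THREE.TextureLoader().load('sprite.png') } }, // placeholder sprite
    vertexShader: [
        'void main() {',
        '    gl_PointSize = 8.0;',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
        '}'
    ].join('\n'),
    fragmentShader: [
        'uniform sampler2D texture;',
        'void main() {',
        '    vec4 outColor = texture2D( texture, gl_PointCoord );',
        '    if ( outColor.a < 0.5 ) discard; // drop the transparent corners entirely',
        '    gl_FragColor = outColor;',
        '}'
    ].join('\n')
});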
The problem with the image is that when you have thousands of points, it seems they sometimes obscure each other around the edges. From what I can gather, it seems like the black region in a point's PNG file blocks the image immediately behind the current point (but it is transparent to points further behind).
You can get rid of the transparency overlap caused by the underlying square sprites by turning off depth testing on the material:
depthTest: false
The problem then is that if you add other objects to the scene, depth testing against them will fail and the point cloud will be rendered in front of those objects, ignoring the actual depth order. To get around that, you can additionally turn off depth writes:
depthWrite: false
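Putting both flags together on a material, as a minimal sketch (the PointsMaterial type and the circleTexture are my assumptions; older three.js builds used THREE.PointCloudMaterial):
var material = new THREE.PointsMaterial({
    size: 1,
    map: circleTexture,   // assumed sprite texture with transparent corners
    transparent: true,
    depthTest: false,     // sprites no longer occlude each other at the corners
    depthWrite: false     // and no longer stamp their squares into the depth buffer
});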