I'm using Three.js to render a few planes with textures, one on top of another.
The distance between the planes is 10 units (though I'm not sure what these units correspond to exactly).
All the planes use MeshBasicMaterial, with the following configuration:
let frontMaterial = new THREE.MeshBasicMaterial({
    map: getFrontCover(),
    side: THREE.FrontSide
});
When the planes are placed at the regular distance of 10, I can see strange stripes rendered, like in the picture:
Assuming that the distance is mandatory, how can I solve this issue?
Thanks
There are a few things you can do.
Make the near and far settings of your perspective camera as tight-fitting to your content as possible.
For example, let's assume the camera is 2 units from the book and the book is 1 unit deep. In that case, setting your near and far values to 0.5 and 3.5 would possibly solve the issue. In other words:
camera = new THREE.PerspectiveCamera(fieldOfView, aspect, 0.5 /* near */, 3.5 /* far */);
If those numbers are orders of magnitude off, you'll get this issue.
Use a logarithmic depth buffer
See this example: https://threejs.org/examples/?q=log#webgl_camera_logarithmicdepthbuffer
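Enabling it is a one-line change when constructing the renderer (the same flag the linked example uses):
var renderer = new THREE.WebGLRenderer({ antialias: true, logarithmicDepthBuffer: true });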
Set the material's polygon offset
You want to set this on the material for the mesh that is supposed to be behind
material.polygonOffset = true;
material.polygonOffsetFactor = 1;
material.polygonOffsetUnits = 1;
What I am attempting:
I am building a lightweight GIS application that includes topography. One thing I would like to add is atmospheric haze.
My current code:
(Please excuse my code, I have multiple scenes)
fogColor = new THREE.Color(0x7EC0EE);
this.scenes[name].background = fogColor;
this.scenes[name].fog = new THREE.Fog(fogColor, 250, 2000);
// alternatively:
this.scenes[name].fog = new THREE.FogExp2(fogColor, 0.001);
Problem encountered:
Both Fog and FogExp2 work well for the units of scale in my app when I am close to the ground. However, when I move the camera farther above the ground and look down, eventually the earth turns 100% blue, as it's obscured by the fog setting.
My Question:
Is there a way to apply a max opacity to the fog?
I would like the topography to stay hazy at a distance but not be completely obscured by fog as a solid color. I was thinking I could calculate the furthest object in view and adjust the fog settings on every camera change, but I am not sure how, or whether I am overthinking this. I'd like to calculate the fog based on the amount of "air" between the camera and the object, and never go over a certain fog opacity. Is this done better in a shader?
There's no way to apply max opacity to fog, but you could change the fog's near and far parameters on the fly. For example:
var origin = new THREE.Vector3(0, 0, 0);

function update() {
    var dist = camera.position.distanceTo(origin);
    fog.far = 2000 + dist;
}
I'm not sure what kind of units you're dealing with, so you might need to play with the way you calculate dist. With this approach, the further you are from (0, 0, 0), the further away the fog will reach.
With FogExp2, you could try modifying the .density property.
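For example, a hedged sketch that thins the exponential fog as the camera moves away from the origin (the constants are illustrative and would need tuning to your units):
var origin = new THREE.Vector3(0, 0, 0);
function update() {
    var dist = camera.position.distanceTo(origin);
    // Reduce density with distance so distant terrain stays hazy but is never fully obscured.
    fog.density = Math.min(0.001, 2 / (dist + 1000));
}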
Introduction:
I render an isometric map with Three.js (v95, WebGLRenderer). The map includes many different graphic tilesets. I get each specific tile via a TextureAtlasLoader and its position from a JSON file. It looks like this:
The problem is that it performs really slowly the more tiles I render (I need to render about 120’000 tiles on one map); I can then barely move the camera. I know there are several better approaches than adding every single tile as a sprite to the scene, but I'm stuck somehow.
Current extract from the code that creates the tiles (it runs in a loop):
var ts_tile = Map.Imagesets[ims].Map.getTexture((bg_left / tw), (bg_top / th));
var material = new THREE.SpriteMaterial({ map: ts_tile, color: 0xffffff, fog: false });
var sprite = new THREE.Sprite(material);
sprite.position.set(pos_left, -top, 0);
sprite.scale.set(tw, th, 1);
scene.add(sprite);
I also tried rendering it as a Mesh, which works too, but the performance is the same (of course):
var material = new THREE.MeshBasicMaterial({ map: ts_tile, color: 0xffffff, transparent: true, depthWrite: false });
var geo = new THREE.PlaneGeometry(1, 1, 1);
var sprite = new THREE.Mesh(new THREE.BufferGeometry().fromGeometry(geo), material);
Possible solutions on the web:
I know that I can't add so many sprites or meshes to a scene, and I have tried different things and looked at examples where it works flawlessly, but I can't adapt their approaches to my code. Every tile on my map has a different texture and its own position.
There is an example in the official three.js examples: they work with PointsMaterial and Points. In the end they only add 5 Points objects to the scene, each containing about 10,000 vertices/images. Example: https://threejs.org/examples/#webgl_points_sprites
Another approach can be found here on github: https://github.com/YaleDHLab/pix-plot
They create 5 meshes, each containing around 4096 “tiles”, which they build up from faces, vertices, etc.
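For reference, here is a minimal sketch of that merged-geometry idea: if all tiles share one texture atlas, a whole chunk of the map can collapse into a single mesh and a single draw call. The tiles array with its col/row/x/y fields, atlasCols, atlasRows, and atlasTexture are hypothetical names standing in for your own data:
var merged = new THREE.Geometry();
for (var i = 0; i < tiles.length; i++) {
    var tileGeo = new THREE.PlaneGeometry(tw, th);
    // Remap each face's UVs from [0,1] into this tile's sub-rectangle of the atlas.
    tileGeo.faceVertexUvs[0].forEach(function (faceUvs) {
        faceUvs.forEach(function (uv) {
            uv.x = (tiles[i].col + uv.x) / atlasCols;
            uv.y = (tiles[i].row + uv.y) / atlasRows; // row order may need flipping
        });
    });
    tileGeo.translate(tiles[i].x, -tiles[i].y, 0); // same placement as sprite.position.set
    merged.merge(tileGeo);
}
var atlasMaterial = new THREE.MeshBasicMaterial({ map: atlasTexture, transparent: true });
scene.add(new THREE.Mesh(new THREE.BufferGeometry().fromGeometry(merged), atlasMaterial));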
Final question:
My question is: how can I render my map more efficiently? I'm simply overwhelmed by the task of adapting my code to one of the possible solutions.
I think Sergiu Paraschiv is on the right track. Try to split your rendering into chunks. This strategy, among others, is outlined here: Tilemap Performance. Depending on how dynamic your terrain is, these chunks could be bigger or smaller; this way you only have to re-render chunks that have changed. Assuming your terrain doesn't change, you can render the whole terrain to a texture, and then you only have to render a single texture per frame rather than a huge array of them. Take a look at this tutorial on rendering to a texture; it should give you an idea of where to start with rendering your chunks.
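As a starting point, here is a minimal sketch of rendering one chunk of static tiles to a texture so it can be drawn as a single quad afterwards. It assumes the r95 renderer.render(scene, camera, target) signature, and chunkWidth / chunkHeight are hypothetical chunk dimensions in world units:
// Build a throwaway scene containing only this chunk's tiles.
var chunkScene = new THREE.Scene();
// ...add this chunk's tile sprites/meshes to chunkScene...

// An orthographic camera that exactly frames the chunk.
var chunkCamera = new THREE.OrthographicCamera(0, chunkWidth, 0, -chunkHeight, 0.1, 100);
chunkCamera.position.z = 10;

// Render the chunk once into a texture (r95 render-to-target call).
var chunkTarget = new THREE.WebGLRenderTarget(2048, 2048);
renderer.render(chunkScene, chunkCamera, chunkTarget);

// From now on, draw the whole chunk as one textured quad in the main scene.
var chunkMesh = new THREE.Mesh(
    new THREE.PlaneGeometry(chunkWidth, chunkHeight),
    new THREE.MeshBasicMaterial({ map: chunkTarget.texture, transparent: true })
);
scene.add(chunkMesh);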
I have a problem updating a vertex of a line in three.js
So, I want a line in my scene whose start is always at (0, 0, 0) and whose end is always at a specific position on the user's screen (in x, y coordinates).
What I do to achieve that (and I almost succeed) is to have an invisible plane that always looks at the camera and is always positioned a little bit in front of it. I do this because I want the line to seem to "go towards" the user's screen. So I send a raycaster from the desired screen position (in x, y) and check at which point it intersects the plane; that is my 3D point in the three.js scene. Then I update one of the two vertices of the line.
The problem
What I do works fine: the line's end is where I want it to be. But something in the updating of the camera and the vertex is not synchronized, and this causes some noticeable glitches. When I move the camera, the line does not update itself quickly and smoothly, and as a result I see the line in another position before I see it in the calculated, desirable one.
Please take a look at this jsfiddle I created to emulate the problem.
What can I do to avoid these glitches?
Thanks
Code I use in the render function:
var cameToCenterScaled = camera.position.clone();
cameToCenterScaled.setLength(cameToCenterScaled.length() * 0.9);
plane.position.set(cameToCenterScaled.x, cameToCenterScaled.y, cameToCenterScaled.z);
plane.lookAt(camera.position);
// define in pixels where on screen we want the line to end
var notePos = findNotePoint(120, 30);
linemesh.geometry.vertices[1].set(notePos.x, notePos.y, notePos.z);
linemesh.geometry.verticesNeedUpdate = true;
When you raycast, you set the raycaster from the camera, so you have to make sure the camera's matrices are updated.
simply add
camera.updateMatrixWorld();
before you call
raycaster.setFromCamera( new THREE.Vector2( x_, y_ ) , camera );
and the line will behave as you described
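Putting it together, the order inside the render function would be (findNotePoint is the asker's helper, which calls raycaster.setFromCamera internally):
camera.updateMatrixWorld(); // make sure the raycast uses the camera's current transform
var notePos = findNotePoint(120, 30);
linemesh.geometry.vertices[1].set(notePos.x, notePos.y, notePos.z);
linemesh.geometry.verticesNeedUpdate = true;
renderer.render(scene, camera);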
I'm trying to implement the code from this tutorial, but in much greater proportions (radius = 100000 units).
I don't know if the size matters, but in my earth render the clouds render strangely.
As the tutorial does, I'm using two spheres and three textures (earth map, bump map, clouds).
Here is the result (it is worse when the clouds are closer):
The closer the clouds are to the planet's surface, the more visible this glitch is. If the clouds are sufficiently far away (which would not be realistic), the problem disappears completely.
What can I do?
Use a logarithmic depth buffer instead of the linear one. This is a very simple change: just enable logarithmicDepthBuffer when you create your THREE.WebGLRenderer, like so:
var renderer = new THREE.WebGLRenderer({ antialias: true, logarithmicDepthBuffer: true });
Here's an example you can have a look at:
http://threejs.org/examples/#webgl_camera_logarithmicdepthbuffer
Using polygonOffset as suggested by LJ_1102 is a possibility, but it shouldn't be necessary.
What you're experiencing is z-fighting due to insufficient depth buffer resolution.
You basically have three options to counteract this:
Write / use a multi-texture shader that renders all three textures on one sphere.
Increase the distance between the sphere faces. / Decrease the distance between your near and far clipping planes.
Use polygonOffset and the POLYGON_OFFSET_FILL render state to offset the depth values written by your outer sphere. Read more about polygonOffset here.
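For completeness, a minimal sketch of option 3, mirroring the polygonOffset settings from the earlier answer and applying them to the material of the sphere that should render behind (earthMaterial is an assumed name for your inner sphere's material):
earthMaterial.polygonOffset = true;
earthMaterial.polygonOffsetFactor = 1; // positive values push the earth's depth values away from the camera
earthMaterial.polygonOffsetUnits = 1;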
I am relatively new to three.js and am trying to position and manipulate a plane object so that it lies over the surface of a sphere object (or any object, for that matter) and takes the form of the object's surface. The intention is to be able to move the plane across the surface later on.
I position the plane in front of the sphere and step through the plane's vertices, casting a ray toward the sphere to detect the intersection with it. I then try to change the z position of those vertices, but this does not achieve the desired result. Can anyone give me some guidance on how to get this working, or suggest another method?
This is how I attempt to change the vertices (with an offset of 1 so the plane is visible 'on' the sphere surface):
planeMesh.geometry.vertices[vertexIndex].z = collisionResults[0].distance - 1;
Making sure to set the following before rendering:
planeMesh.geometry.verticesNeedUpdate = true;
planeMesh.geometry.normalsNeedUpdate = true;
I have a fiddle that shows where I am. Here I cast my rays in z and do not get intersections (collisions) with the sphere, so I cannot change the plane in the manner I wish.
http://jsfiddle.net/stokewoggle/vuezL/
You can rotate the camera around the scene with the left and right arrow keys (in Chrome, anyway) to see the shape of the plane. I have made the sphere see-through, as I find it useful for seeing the plane better.
EDIT: Updated fiddle and corrected description mistake.
Sorry for the delay, but it took me a couple of days to figure this one out. The reason the collisions were not working was that (as we had suspected) the planeMesh vertices are in local space, which essentially means the rays start at the center of the sphere, not where you expect. At first I thought a quick fix would be to apply the worldMatrix, as stemkoski did in the three.js collision example on his GitHub that I linked to, but that didn't work either, because the plane itself is defined in x and y coordinates (up and down, left and right); no z (depth) information is created locally when you make a flat 2D planeMesh.
What ended up working is manually setting the z component of each vertex of the plane. You had originally wanted the plane to be at z = 201, so I just moved that code inside the loop that goes through each vertex and manually set each one to z = 201. Now all the ray start positions were correct (globally), and a ray direction of (0, 0, -1) resulted in correct collisions.
var localVertex = planeMesh.geometry.vertices[vertexIndex].clone();
localVertex.z = 201;
One more thing: in order to make the plane wrap absolutely perfectly in shape, instead of using (0, 0, -1) as each ray direction, I manually calculated each direction by subtracting each vertex from the sphere's center position and normalizing the resulting vector. Now the collisionResult intersection points are even better.
var directionVector = new THREE.Vector3();
directionVector.subVectors(sphereMesh.position, localVertex);
directionVector.normalize();
var ray = new THREE.Raycaster(localVertex, directionVector);
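For completeness, a hedged sketch of feeding the collision result back into the vertex (the offset of 1 comes from the question; the exact coordinate handling depends on the plane's own transform):
var collisionResults = ray.intersectObject(sphereMesh);
if (collisionResults.length > 0) {
    // Snap the vertex just outside the sphere surface at the hit point.
    planeMesh.geometry.vertices[vertexIndex].z = collisionResults[0].point.z + 1;
}
planeMesh.geometry.verticesNeedUpdate = true;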
Here is a working example:
http://jsfiddle.net/FLyaY/1/
As you can see, the planeMesh fits snugly on the sphere, kind of like a patch or a band-aid. :)
Hope this helps. Thanks for posting the question on three.js's GitHub page; I wouldn't have seen it here. At first I thought it was a bug in THREE.Raycaster, but in the end it was just user error (mine). I learned a lot about collision code from working on this problem, and I will be using it later down the line in my own 3D game projects. You can check out one of my games at: https://github.com/erichlof/SpacePong3D
Best of luck to you!
-Erich
Your ray start position is not good, probably because the vertex coordinates are local to the plane. You start the raycast from inside the sphere, so it never hits anything.
I changed the ray start position like this as a test and get 726 collisions:
var rayStart = new THREE.Vector3(0, 0, 500);
var ray = new THREE.Raycaster(rayStart, new THREE.Vector3(0, 0, -1));
Forked jsfiddle: http://jsfiddle.net/H5YSL/
I think you need to transform the vertex coordinates to world coordinates to get the position right. That should be easy to figure out from the docs and examples.
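A minimal sketch of that transform using the standard Object3D helper (it assumes planeMesh.updateMatrixWorld() has already run, which the render loop normally takes care of):
var worldVertex = planeMesh.localToWorld(planeMesh.geometry.vertices[vertexIndex].clone());
var ray = new THREE.Raycaster(worldVertex, directionVector);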