Three.js transparency issue with PointClouds and textures - javascript

I have two PointCloud objects, each for a specific structure and texture. One should be clickable but not the other. Let's call them P1 and P2, respectively.
P1 is initialized using a THREE.ShaderMaterial as:
var p1Material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    attributes: attributes,
    vertexShader: document.getElementById('vertexShader').textContent,
    fragmentShader: document.getElementById('fragmentShader').textContent,
    transparent: true
});
P2, in turn, uses a THREE.PointCloudMaterial:
var p2Material = new THREE.PointCloudMaterial({
    size: SIZE,
    map: THREE.ImageUtils.loadTexture("icons/myAwesomeIcon.png"),
    sizeAttenuation: true,
    transparent: true
});
Both resulting THREE.PointCloud objects have their sortParticles property set to true.
However, I'm running into transparency issues such as the following:
(REMOVED - LOOK AT EDIT)
Everything is a texture, except for the white line. The Sphere texture is used in P2, and the other textures in P1.
We can see that P2's textures are not really transparent against P1's, although they are against each other, as seen in the second image. Inversely, the same happens between P1's textures. However, here's a different example, in the same scene:
(REMOVED - LOOK AT EDIT)
Some of P1's textures are ok, but P2's don't want to behave properly.
I suspect having the textures reside in different PointClouds is not helping. Yet, since P2's elements shouldn't be clickable, for performance reasons I decided to separate them from the rest, hence P1 and P2. Note that selectability is done by clicking on something and using a THREE.Raycaster.
Any ideas on what I'm doing wrong guys?
Thanks in advance!
EDIT: Apparently the problem is due to using BufferGeometry...
Here are two JSFiddle sources that are exactly the same, except for the used geometry.
http://jsfiddle.net/vf6uu90t/3/
http://jsfiddle.net/2uh0q8Lr/2/
Am I missing something?
I had to remove the links from before, because Stack Overflow only allows me to insert two links... --'

Here's the Three.js GitHub issue, and a possible solution:
https://github.com/mrdoob/three.js/issues/5668
The trick was alpha testing. In any case, there seems to be a bug related to this in r69.
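For reference, a minimal sketch of what the alphaTest workaround looks like on the P2 material (the 0.5 threshold is an assumption; tune it to your texture's alpha channel):
// Fragments whose alpha falls below the threshold are discarded instead of
// blended, so fully transparent texels no longer occlude points behind them.
var p2Material = new THREE.PointCloudMaterial({
    size: SIZE,
    map: THREE.ImageUtils.loadTexture("icons/myAwesomeIcon.png"),
    sizeAttenuation: true,
    transparent: true,
    alphaTest: 0.5
});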

Related

Three.JS: render a large map based on different tilesets (Texture Atlas)

Introduction:
I render an isometric map with Three.JS (v95, WebGL Renderer). The map includes many different graphic tilesets. I get the specific tile via a TextureAtlasLoader and its position from a JSON. It looks like this:
The problem is that it performs really slowly the more tiles I render (I need to render about 120,000 tiles on one map). I can barely move the camera then. I know there are several better approaches than adding every single tile as a sprite to the scene. But I'm stuck somehow.
Current extract from the code to create the tiles (it’s in a loop):
var ts_tile = Map.Imagesets[ims].Map.getTexture((bg_left / tw), (bg_top / th));
var material = new THREE.SpriteMaterial({ map: ts_tile, color: 0xffffff, fog: false });
var sprite = new THREE.Sprite(material);
sprite.position.set(pos_left, -top, 0);
sprite.scale.set(tw, th, 1);
scene.add(sprite);
I also tried to render it as a Mesh, which also works, but the performance is the same (of course):
var material = new THREE.MeshBasicMaterial({ map: ts_tile, color: 0xffffff, transparent: true, depthWrite: false });
var geo = new THREE.PlaneGeometry(1, 1, 1);
var sprite = new THREE.Mesh(new THREE.BufferGeometry().fromGeometry(geo), material);
Possible solutions on the web:
I know that I can't add so many sprites or meshes to a scene, and I have tried different things and looked at examples where it works flawlessly, but I can't adapt their approaches to my code. Every tile on my map has a different texture and its own position.
There is an example in the official three.js docs: they work with PointsMaterial and Points. In the end they only add 5 Points objects to the scene, which include about 10,000 “vertices / images”: https://threejs.org/examples/#webgl_points_sprites
Another approach can be found on GitHub: https://github.com/YaleDHLab/pix-plot
They create 5 meshes; every mesh includes around 4,096 “tiles”, which they build up with faces, vertices, etc.
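As far as I understand it, that merged-geometry idea boils down to something like this sketch (tilePositions and atlasTexture are placeholders here; a real texture atlas would additionally need a custom ShaderMaterial with per-point UV offsets):
// Pack every tile position into one BufferGeometry and draw it as a single
// THREE.Points object instead of 120,000 individual sprites.
var positions = new Float32Array(tilePositions.length * 3);
for (var i = 0; i < tilePositions.length; i++) {
    positions[i * 3] = tilePositions[i].left;
    positions[i * 3 + 1] = -tilePositions[i].top;
    positions[i * 3 + 2] = 0;
}
var geometry = new THREE.BufferGeometry();
geometry.addAttribute('position', new THREE.BufferAttribute(positions, 3));
var material = new THREE.PointsMaterial({ size: tw, map: atlasTexture, transparent: true });
scene.add(new THREE.Points(geometry, material));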
Final question:
My question is, how can I render my map more performantly? I'm simply overwhelmed trying to adapt my code to one of these possible solutions.
I think Sergiu Paraschiv is on the right track. Try to split your rendering into chunks. This strategy and others are outlined here: Tilemap Performance. Depending on how dynamic your terrain is, these chunks could be bigger or smaller. This way you only have to re-render chunks that have changed. Assuming your terrain doesn't change, you can render the whole terrain to a texture and then you only have to render a single texture per frame, rather than a huge array of them. Take a look at this tutorial on rendering to a texture, it should give you an idea on where to start with rendering your chunks.
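A rough sketch of the render-to-texture idea, assuming a static terrain (terrainScene, terrainCamera, mapWidth/mapHeight and the target size are placeholders, not taken from your code):
// Render the static terrain once into an offscreen target...
var terrainTarget = new THREE.WebGLRenderTarget(4096, 4096);
renderer.render(terrainScene, terrainCamera, terrainTarget);
// ...then draw a single textured quad per frame instead of 120,000 sprites.
var terrainQuad = new THREE.Mesh(
    new THREE.PlaneBufferGeometry(mapWidth, mapHeight),
    new THREE.MeshBasicMaterial({ map: terrainTarget.texture })
);
scene.add(terrainQuad);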

Render Order of BufferGeometry in Three.js

Following the previous question, I'm dealing with building models in BufferGeometry, and realize that the transparent flag affects the render order: objects with transparent materials will be rendered after non-transparent ones.
Also, I read this thread, did an experiment on JSFiddle, and realized the render order of faces in BufferGeometry is the order in which they are specified in the buffers, not their distance from the camera. (In the above experiment, I specify a closer triangle first in the buffer, and it occludes others behind it.)
So my question is: is it possible to set render order of faces manually in BufferGeometry?
In my case, I may need to change transparency of building elements dynamically.
(I've read the thread saying we can set renderOrder of Object3D.)
Thank you.
Faces are rendered in the order in which they appear in the BufferGeometry.
If you have to vary the transparency of scene elements dynamically, I suggest you maintain separate geometries, each paired with its own material.
The renderer will render the objects having transparent = false first. Then it will render the objects having transparent = true.
You will likely find you have fewer artifacts if you use the following settings for your transparent materials:
material.transparent = true;
material.opacity = 0.5; // or as desired
material.depthTest = true; // the default
material.depthWrite = false; // use for transparent materials only
Also, self-transparency is particularly tricky. An example would be a semi-transparent cube (or building). One way to reduce artifacts in such situations is to render the object twice: first with material.side = THREE.BackSide and then again with material.side = THREE.FrontSide. You can use object.renderOrder to force a specific render order between objects.
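A minimal sketch of that two-pass setup for a single semi-transparent geometry (the color and opacity values are just placeholders):
// Render back faces first, then front faces; renderOrder enforces the order.
var backMaterial = new THREE.MeshPhongMaterial({
    color: 0x2194ce,
    transparent: true,
    opacity: 0.5,
    depthWrite: false,
    side: THREE.BackSide
});
var frontMaterial = backMaterial.clone();
frontMaterial.side = THREE.FrontSide;

var backMesh = new THREE.Mesh(geometry, backMaterial);
var frontMesh = new THREE.Mesh(geometry, frontMaterial);
backMesh.renderOrder = 1;
frontMesh.renderOrder = 2;
scene.add(backMesh, frontMesh);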
three.js r.75

Three.js - sprite depths rendering backwards in r70+

It looks like something broke with r70+ regarding z-depth of sprites.
Here is a jsfiddle that works perfect with r69.
Here is the same jsfiddle except using r71.
You can see that now when the scene rotates, the depths of the sprites are not always shown correctly. Half the time they are rotated into view with wrong z-depths.
Is this a bug or is something new I need to add that I missed?
I've tried all variations of common commands below and nothing seems to work all around like it used to.
var shaderMaterial = new THREE.ShaderMaterial({
...
depthTest: false,
depthWrite: false,
transparent: true
});
particleSystem.sortParticles = true;
I'm aware of the new renderDepth, but that solution seems to be unrelated and doesn't explain why it would break previous behaviour. We don't need to continually update renderDepths manually for all camera angles now do we?
PointCloud.sortParticles was removed in three.js r70; see this commit.
In your original example (without transparency), you can get your desired behavior by enabling the depth test for your material:
var shaderMaterial = new THREE.ShaderMaterial({
...
depthTest: true
});
In your updated example (with transparency), it's necessary to sort the particles yourself in three.js r70.
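A rough sketch of such a manual sort, assuming a non-indexed BufferGeometry with only a position attribute (any other per-point attributes would need the same permutation applied):
// Re-order the points back-to-front relative to the camera each frame.
function sortPoints(pointCloud, camera) {
    var positions = pointCloud.geometry.attributes.position;
    var array = positions.array;
    var count = array.length / 3;
    var order = [];
    var v = new THREE.Vector3();
    for (var i = 0; i < count; i++) {
        v.set(array[i * 3], array[i * 3 + 1], array[i * 3 + 2]).applyMatrix4(pointCloud.matrixWorld);
        order.push({ index: i, depth: v.distanceToSquared(camera.position) });
    }
    order.sort(function (a, b) { return b.depth - a.depth; }); // farthest first
    var sorted = new Float32Array(count * 3);
    for (var j = 0; j < count; j++) {
        sorted.set(array.subarray(order[j].index * 3, order[j].index * 3 + 3), j * 3);
    }
    array.set(sorted);
    positions.needsUpdate = true;
}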
Note that three.js still handles z-sorting when rendering THREE.Sprite objects. That could be worth investigating.

Rendering multiple objects in WebGL

I have tried following the suggestions given as an answer to this question, but I still can't figure out how the "rendering flow" of a WebGL program really works.
I am simply trying to draw two triangles on a canvas, and it works in a rather non-deterministic way: sometimes both triangles are rendered, sometimes only the second one (second as in the last one drawn) is rendered.
(It appears to depend on rendering time: strangely enough, the longer it takes, the better the odds of ending up with both triangles rendered.) EDIT: not true; I tried refreshing over and over, and the two triangles sometimes show up on very rapid renders (~55 ms), sometimes on longer-running ones (~120 ms). What does seem to be a recurring pattern is that the very first time the page is rendered, both triangles show, and on subsequent repeated refreshes the red one either shows for good or shows for a very short lapse of time and then flickers away.
Apparently I'm missing something here, let me explain my program's flow in pseudo-code (can include the real code if need be) to see if I'm doing something wrong:
var canvas = new Canvas(/*...*/);
var redTriangle = new Shape(/* vertex positions & colors */);
var blueTriangle = new Shape(/* vertex positions & colors */);
canvas.add(redTriangle, blueTriangle);
canvas.init(); //compiles and links shaders, calls gl.enableVertexAttribArray()
//for vertex attributes "position" and "color"
for(shape in canvas) {
for(bufferType in [PositionBuffer, ColorBuffer]) {
shape.bindBuffer(bufferType); //calls gl.bindBuffer() and gl.bufferData()
//This is equivalent to the initBuffers()
//function in the tutorial
}
}
for(shape in canvas) {
shape.draw();
//calls:
//-gl.bindBuffer() and gl.vertexAttribPointer() for each buffer (position & color),
//-setMatrixUniforms()
//-drawArrays()
//This is equivalent to the drawScene() function in the tutorial
}
Despite the fact that I've wrapped the instructions inside object methods in my attempt to make the use of WebGL slightly more OO, it seems to me I have fully complied with the instructions in this lesson (comparing the lesson's source and my own code), hence I cannot figure out what I'm doing wrong.
I've even tried to use only one for(shape in canvas) loop, as so:
for(shape in canvas) {
for(bufferType in [PositionBuffer, ColorBuffer]) {
shape.bindBuffer(bufferType); //calls gl.bindBuffer() and gl.bufferData()
//This is equivalent to the initBuffers()
//function in the tutorial
}
shape.draw();
//calls:
//-gl.bindBuffer() and gl.vertexAttribPointer() for each buffer (position & color),
//-setMatrixUniforms()
//-drawArrays()
//This is equivalent to the drawScene() function in the tutorial
}
but it doesn't seem to have any effect.
Any clues?
I'm guessing the issue is that, by default, WebGL canvases are cleared every time they are composited.
Try changing your WebGL context creation to
var gl = someCanvas.getContext("webgl", { preserveDrawingBuffer: true });
I'm just guessing your app is doing things asynchronously, which means each triangle is drawn in response to some event? So, if both events happen to come in quickly enough (within a single composite), then you get both triangles. If they come on different composites, you'll only see the second one.
preserveDrawingBuffer: true says "don't clear after each composite". Clearing is the default because it allows certain optimizations on certain devices, specifically iOS, and the majority of WebGL apps clear at the beginning of each draw operation anyway. The few apps that don't clear can set preserveDrawingBuffer: true.
In your particular case, line 21 of angulargl-canvas.js is
options = {alpha: false, premultipliedAlpha: false};
try changing it to
options = {alpha: false, premultipliedAlpha: false, preserveDrawingBuffer: true};
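Alternatively, if you'd rather not preserve the drawing buffer, make sure both shapes are drawn within the same composite, e.g. from a single requestAnimationFrame callback. A rough sketch (canvas.shapes is a hypothetical stand-in for however your Canvas wrapper stores the shapes added via canvas.add()):
// Drawing every shape inside one animation frame keeps both triangles in
// the same composite, so the canvas is never cleared between the two draws.
requestAnimationFrame(function renderFrame() {
    for (var i = 0; i < canvas.shapes.length; i++) {
        canvas.shapes[i].draw();
    }
});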

Three.js - Getting the low poly look

I'm trying to generate some terrain in the low-poly style. For reference, this kind of style:
What I mean by this is each triangle is one shade.
When I attempt something like this, the shading is very smooth. Here's an example with only a few triangles:
(source: willdonohoe.com)
I also tried adding shadows, but this didn't create the desired effect either. Here's a shot with more triangles with added shadows:
(source: willdonohoe.com)
Looking through the Three.js documentation, the shading property on the material classes sounds like it would do the trick, but THREE.FlatShading and THREE.NoShading don't seem to have any effect.
Is there a special technique that I need to use to create this effect? Any direction you can point my way would be much appreciated.
You can find my first demo here
Many thanks,
Will
EDIT: This answer was outdated. Updating:
material.shading = THREE.FlatShading is now material.flatShading = true.
You modified the vertex positions of your PlaneGeometry.
To generate flat shading with MeshLambertMaterial, you must update your normals by calling
geometry.computeFlatVertexNormals();
For other materials, simply setting material.flatShading = true is sufficient to get the flat look.
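A minimal sketch of the MeshLambertMaterial case (the plane size and random height offsets are placeholders):
// Displace the plane's vertices, then recompute flat (per-face) normals so
// MeshLambertMaterial shades each triangle with a single tone.
var geometry = new THREE.PlaneGeometry(100, 100, 20, 20);
geometry.vertices.forEach(function (v) {
    v.z = Math.random() * 5;
});
geometry.computeFlatVertexNormals();

var material = new THREE.MeshLambertMaterial({ color: 0x88aa66 });
scene.add(new THREE.Mesh(geometry, material));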
three.js r.87
