EDL Shader in Potree Viewer with orthographic mode - javascript

I have a problem with the EDL Shader in Potree Viewer. While it works correctly in perspective mode, I need to use orthographic mode in my project.
Please see the GIF and the example at the bottom.
The problem is that the EDL renderer does not work correctly in orthographic mode. It appears when you zoom into the point cloud: at some point the shadows disappear and we lose the depth perception.
You should be able to reproduce this on any public example. Just remember to set the orthographic mode from the Tools panel in the sidebar.
I would really appreciate any help.
You can see what it looks like in this GIF:
https://im.ezgif.com/tmp/ezgif-1-9e188b342d1c.gif
The example I used in the GIF is available here: http://www.potree.org/potree/examples/lion_laz.html
You can see Potree shaders here: https://github.com/potree/potree/tree/develop/src/materials/shaders
Also, here are the navigation controls that manipulate the camera position:
https://github.com/potree/potree/blob/develop/src/navigation/OrbitControls.js

In the example you linked, it looks like the EDL shading gets cut off at the point where the camera's near plane is crossed. You can see this by switching between perspective and ortho views; the shading ends at the same point where the particles stop showing in the perspective camera.
You could change the camera.near value to something lower; I believe you can even set it to a negative value on orthographic cameras.
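As a rough illustration in plain three.js (not Potree-specific; which camera object you actually need to touch depends on the viewer):

    // Plain three.js sketch: orthographic cameras accept a negative near plane,
    // so depth data is still available when you zoom in close.
    const camera = new THREE.OrthographicCamera(-10, 10, 10, -10, 0.1, 1000);

    camera.near = -1000;   // negative near is valid for orthographic projections
    camera.far  =  1000;
    camera.updateProjectionMatrix();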

Related

React-Three-Fiber + ThreeJS: Sprite alpha very rough on edges

I am working on a project for a client, and we are finding that some of the transparent logos have a very ugly-looking dark border around them when they are used in three.js. I have tried many things with no luck, so I would love help getting the alpha to look nicer.
Three.js vs. the supplied image:
It is very faint, but when you zoom in (which users can do to an extent in the application) you can see the border:
Things I have tried:
Setting the texture's min and mag filters to LinearFilter/NearestFilter. This is the most common suggestion, but you can see in my codesandbox that it does not help. If I set them to NearestFilter, the logos become pixelated and alias when the camera moves around.
I changed the blend modes of the standard material and I got weird colors on the edges.
I wrote my own custom shader that blends between the image color and white based on the supplied image's alpha but I still get a weird color leaking through.
I played with the alphaTest value but this ends up causing the edges to end abruptly/not look great.
I demonstrate all of my approaches here:
https://codesandbox.io/s/interesting-wozniak-povk1?file=/src/App.js
I think that my shader is close but not perfect. I would really appreciate any advice on the right way to approach and solve this problem.
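For reference, this is roughly the kind of setup the attempts above refer to (simplified; the texture path and values are illustrative, not a fix in themselves):

    // Illustrative sketch of the filter and alphaTest settings mentioned above.
    const texture = new THREE.TextureLoader().load('logo.png');   // placeholder path
    texture.minFilter = THREE.LinearFilter;   // NearestFilter makes the logo pixelated
    texture.magFilter = THREE.LinearFilter;

    const material = new THREE.MeshStandardMaterial({
      map: texture,
      transparent: true,
      alphaTest: 0.5,   // higher values remove the halo but cut the edges off abruptly
    });
    const logo = new THREE.Mesh(new THREE.PlaneGeometry(1, 1), material);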
You need to increase the resolution of the image; the mesh at half the scale looks better with that resolution on the PNG.

Head-coupled/Off-axis perspective in Three.js

I'm trying to achieve a permanent head-coupled perspective without using the full headtrackr library. My head won't be moving, but it won't be directly in front of the screen.
I have a little demo that you can download and run with python -m SimpleHTTPServer 8000
The code is adapted mainly from this headtrackr example and part of the headtrackr source.
My expectations are based off this diagram:
In the third image, I imagine slightly swiveling my monitor counter-clockwise from above. This should be equivalent to reducing Z and making X less than zero. I expect my monitor to show the middle image, but instead I see something like this:
I think the "window" I'm looking through is the XY-plane, but shouldn't it stretch like the middle orange rectangle in the first diagram? Here's another window that stays fixed: http://kode80.com/2012/04/09/holotoy-perspective-in-webgl/ to see what I mean by "window."
Are off-axis perspective and head-tracking unrelated? How do I get a convincing illusion of off-axis perspective in THREE.js?
I think you can accomplish what you're aiming for with setViewOffset. Your diagram looks a little off to me. Perhaps it's because there is no cube in the off-axis projection, but I think the point is that the frustum should remain framed on a fixed point without rotating the camera, which would introduce perspective distortion.
To accomplish this with setViewOffset, I would set the fullWidth and fullHeight to some extra-large size. The view offset will then be a window into that oversized view. As the user moves, that window will be offset in the opposite direction of the viewer.
http://threejs.org/docs/#Reference/Cameras/PerspectiveCamera
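A rough sketch of that idea (the oversized full size and the mapping from head position to offset are assumptions, not taken from the docs):

    // Oversized "full" view; the visible window slides opposite to the viewer's head.
    const fullWidth = window.innerWidth * 2;     // assumed extra-large virtual viewport
    const fullHeight = window.innerHeight * 2;

    // headX/headY: viewer offset from the screen centre, in pixels, from whatever tracker you use.
    function updateViewOffset(camera, headX, headY) {
      const x = (fullWidth - window.innerWidth) / 2 - headX;
      const y = (fullHeight - window.innerHeight) / 2 - headY;
      camera.setViewOffset(fullWidth, fullHeight, x, y, window.innerWidth, window.innerHeight);
    }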
3D Projection mapping can be broken into a corner-pinning texture step and a perspective adjustment step. I think they're somewhat unrelated.
In Three.js you can make a quad surface (composed of two triangle Face3 faces) and map a texture onto it, then move the corners of the quad in XY (not Z). I don't think this step introduces perspective artifacts beyond what's necessary for minor deformations of a quad texture; I'm talking about banding issues and nearest-neighbor artifacts, not 3D perspective errors. The size of these artifacts depends on how the projector is shining on the object. If the projector is nicely perpendicular to the surface, very little corner-mapping is necessary. If you're using a monitor instead of a projector, then no corner-pinning is necessary.
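A rough sketch of such a corner-pinned quad, using the current BufferGeometry API rather than the older Face3 one (the texture path is a placeholder):

    // A textured quad whose corners can be dragged in XY for corner-pinning.
    const texture = new THREE.TextureLoader().load('projected-content.png');   // placeholder

    const geometry = new THREE.BufferGeometry();
    geometry.setAttribute('position', new THREE.BufferAttribute(new Float32Array([
      -1, -1, 0,   // 0: bottom-left
       1, -1, 0,   // 1: bottom-right
       1,  1, 0,   // 2: top-right
      -1,  1, 0,   // 3: top-left
    ]), 3));
    geometry.setAttribute('uv', new THREE.BufferAttribute(new Float32Array([
      0, 0,  1, 0,  1, 1,  0, 1,
    ]), 2));
    geometry.setIndex([0, 1, 2,  0, 2, 3]);

    const quad = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture }));

    // Pin a corner: move the top-right vertex in XY only, leaving Z alone.
    geometry.attributes.position.setXY(2, 1.2, 0.9);
    geometry.attributes.position.needsUpdate = true;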
Next is the perspective adjustment step. You have to adjust the content of the texture based on where the user is in relation to the real-life surface. I believe you can do this with an XYZ distance from the physical viewer to the center of the screen surface, a scaling factor between pixels and real-life size, and the pixel dimensions of the surface.
In my demo, the blueish faces of my cubes point in the positive Z direction. When I rotate my monitor in the real-life world, they continue to point in the positive Z direction of their monitor world. The diagram that I posted is a little misleading because the orange box in the middle picture is locally rotated to compensate for the rotating of the real-life monitor world. That orange box's front face is no longer pointing exactly in the positive Z direction of its monitor world.
Outside of Three.js, in Processing there are some techniques for projection mapping.
This one may be the simplest, although I haven't tried it myself: http://blogs.bl0rg.net/netzstaub/2008/08/24/wiimote-headtracking-in-processing/
SurfaceMapper for Processing has support for real-life curved surfaces (not just flat rectangles), but it only works for Processing before Processing 2.0.
If anyone develops a SurfaceMapper library for Three.js that would be really cool! I'd love to design a virtual world, put cameras in the world, have each camera consider real-life viewer perspective, and then put those rendered textures on real-life displays.
You need to adjust the perspective matrix. It's built from -left, +right, -bottom, +top; changing these will produce the effect you are looking for.
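For example, something like this (a sketch only: recent three.js exposes the raw planes through Matrix4.makePerspective, older releases used makeFrustum, and the field of view and head-offset mapping here are assumptions):

    // Off-axis (asymmetric) frustum: shift the left/right and top/bottom planes
    // by the viewer's offset from the screen centre instead of rotating the camera.
    const near = 0.1, far = 1000;
    const halfH = near * Math.tan(THREE.MathUtils.degToRad(45) / 2);   // vertical half-size at the near plane
    const halfW = halfH * (window.innerWidth / window.innerHeight);

    // headX/headY: viewer offset from the screen centre, in the same units as halfW/halfH.
    function applyOffAxis(camera, headX, headY) {
      camera.projectionMatrix.makePerspective(
        -halfW - headX,   // left
         halfW - headX,   // right
         halfH - headY,   // top
        -halfH - headY,   // bottom
        near, far
      );
      // Keep the inverse in sync; note that calling camera.updateProjectionMatrix()
      // later would overwrite this custom matrix.
      camera.projectionMatrixInverse.copy(camera.projectionMatrix).invert();
    }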

Including an image with Three.js

I'm trying to include a .jpg image in my 3D scene. All the solutions I have found consist of applying the image to a mesh as a texture, but then the scene does not look right: we can see the mesh border, whether it's a plane or a sphere... I just want to see the image. Is there another solution?
In my application, I want to rotate an airplane around the earth, and the problem is with including this airplane.
Thanks for your help :)
Perhaps the class THREE.Sprite will accomplish the effect you want. THREE.Sprite can display an image, and can use either screen coordinates (e.g. canvas coordinates) or be part of your 3D scene, but a sprite image always faces the camera. If you want the image itself to rotate, you do need to use it as a texture on a mesh. Whatever you decide to do in the end, I've posted a bunch of tutorial-style examples at http://stemkoski.github.io/Three.js/ that may help. Good luck!
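A minimal sketch of the sprite approach with the current API (the texture path is a placeholder and `scene` is assumed to exist):

    // A sprite always faces the camera, so the image shows without a visible mesh border.
    const map = new THREE.TextureLoader().load('airplane.png');   // placeholder path
    const sprite = new THREE.Sprite(new THREE.SpriteMaterial({ map: map }));
    sprite.scale.set(10, 10, 1);       // size in world units
    sprite.position.set(0, 0, 120);    // position it along its orbit around the earth
    scene.add(sprite);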

Three.js CanvasRenderer problems: faces flicker in and out

I am drawing two relatively simple shapes whose geometry does not overlap.
Here is the code sample:
http://jsfiddle.net/pGD4n/9/
The Three.js Trackball control is in there, so you can click and drag to spin the objects around in 3D space. The problem is that as the objects rotate, some faces disappear, revealing the object below. A slightly further rotation and the missing face returns, but others go missing.
I've tried BasicMaterial, NormalMaterial, and LambertMaterial with both smooth shading and flat shading. I have tried the scene with and without lighting. Moving the objects farther apart seems to correct the issue, but in the given example code the meshes do not overlap and should not have this problem. The problem happens in both Chrome and Firefox.
I imagine that switching to the WebGL renderer would resolve the issue, but for compatibility we need to use the Canvas renderer.
Any help or ideas appreciated.
This is a limitation of CanvasRenderer. Unfortunately, per-pixel z-sorting is not available in CanvasRenderer, so it basically tries to sort whole polygons instead. Depending on where you're looking from, the center of one polygon may be closer than that of the polygon next to it, and so it "jumps".
The only solution right now is using WebGLRenderer. I'm working on a new renderer for context2d which will hopefully solve this without requiring WebGL, but it will still take some time.
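If compatibility is the concern, a common pattern from that era is to use WebGLRenderer where the browser supports it and only fall back to CanvasRenderer otherwise (a sketch, assuming the Detector.js helper that ships with the three.js examples is included):

    // Prefer WebGLRenderer; only accept CanvasRenderer's polygon-sorting artifacts as a fallback.
    const renderer = Detector.webgl
      ? new THREE.WebGLRenderer({ antialias: true })
      : new THREE.CanvasRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);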

Background with three.js

Can anybody help me with three.js?
I need to draw a background, something like a THREE.Sprite, but it needs to be UNDER every 3D object that is drawn later. I have a camera that can move only along the Z axis.
I tried to use:
A cube mapping shader - PROBLEM: artifacts with the shadow planes, the drawing is unstable.
A THREE.Sprite that duplicates the camera movement - PROBLEM: artifacts with the shadow plane - it has edge highlighting, OR it draws only the other sprites without the objects.
An HTML DOM background - PROBLEM: big, ugly aliasing on the models.
What else can I try? Thanks!
You could maybe try drawing in several passes, i.e. making a first render of the background scene to a buffer, and then a second render over that buffer. Maybe use the buffer as the background (painting it in 2D with an orthographic projection, and disabling depth-buffer writes in that pass).
I haven't tried it myself with three.js, but that's how I'd do it with "traditional" OpenGL.
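In three.js that would look roughly like this (a sketch; backgroundScene/backgroundCamera and the main scene/camera are assumed to already exist):

    // Two passes: draw the background first, then clear only the depth buffer
    // so the main scene always renders on top of it.
    renderer.autoClear = false;

    function render() {
      renderer.clear();                                     // clear color + depth once
      renderer.render(backgroundScene, backgroundCamera);   // e.g. an orthographic camera
      renderer.clearDepth();                                // keep the colors, drop the depth
      renderer.render(scene, camera);                       // main 3D scene over the background
      requestAnimationFrame(render);
    }
    render();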
If you want a "3d" background i.e. something that will follow the rotation of your camera, but not react to the movement (be infinitely far), then the only way to do it is with a cubemap.
The other solution is a environment dome - a fully 3d object.
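A minimal sketch of such a dome - a large sphere drawn from the inside (the texture path is a placeholder and `scene` is assumed to exist):

    // Environment dome: a big sphere with its material rendered on the inside faces.
    const domeTexture = new THREE.TextureLoader().load('sky.jpg');   // placeholder path
    const dome = new THREE.Mesh(
      new THREE.SphereGeometry(500, 60, 40),
      new THREE.MeshBasicMaterial({ map: domeTexture, side: THREE.BackSide })
    );
    scene.add(dome);   // stays convincing as long as the camera remains near the centre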
If you want a static background, then you should be able to just use an HTML background; I'm not sure why that would fail or what 'aliasing in models' you are talking about.
