Can anybody help me with three.js?
I need to draw a background, something like a THREE.Sprite, but it needs to be UNDER any 3D object that is drawn later. I have a camera that can move only on the Z axis.
I tried to use:
a cube mapping shader - PROBLEM: artifacts with the shadow planes; the drawing is unstable
a THREE.Sprite that duplicates the camera movement - PROBLEM: artifacts with the shadow plane - it gets edge highlighting, OR only the other sprites are drawn, without the objects
an HTML DOM background - PROBLEM: big and ugly aliasing on the models
What else can I try? Thanks!
You could maybe try drawing in several passes, i.e. first rendering the background scene to a buffer, and then rendering a second pass over that buffer. You could use the buffer as the background by painting it in 2D with an orthographic projection and disabling depth-buffer writes in that pass.
I haven't tried it myself with three.js, but that's how I'd do that with "traditional" OpenGL.
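In three.js terms, a minimal sketch of that two-pass idea might look like the following (assuming scene, camera and a loaded backgroundTexture already exist; the names are illustrative):

    // Sketch: draw a 2D background first, then the 3D scene on top without clearing it.
    const renderer = new THREE.WebGLRenderer({ antialias: true });
    renderer.autoClear = false; // we clear manually so the second pass doesn't wipe the first

    // Background pass: a full-screen quad seen through an orthographic camera.
    const bgScene = new THREE.Scene();
    const bgCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, -1, 1);
    const bgMesh = new THREE.Mesh(
      new THREE.PlaneGeometry(2, 2),
      new THREE.MeshBasicMaterial({
        map: backgroundTexture, // assumed: a texture loaded elsewhere
        depthWrite: false       // don't write depth, so the 3D scene always draws over it
      })
    );
    bgScene.add(bgMesh);

    function render() {
      requestAnimationFrame(render);
      renderer.clear();                    // clear color and depth once per frame
      renderer.render(bgScene, bgCamera);  // first pass: background
      renderer.render(scene, camera);      // second pass: the real 3D scene
    }
    render();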
If you want a "3d" background, i.e. something that follows the rotation of your camera but does not react to its movement (it appears infinitely far away), then the only way to do it is with a cubemap.
The other solution is an environment dome - a fully 3D object.
If you want a static background, then you should be able to just use an HTML background; I'm not sure why this would fail or what 'aliasing in models' you are talking about.
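For the cubemap route, recent three.js versions let you assign a cube texture directly as the scene background; a minimal sketch (the six image file names are placeholders):

    // Sketch: a cube texture assigned to scene.background is drawn behind everything
    // and follows camera rotation but not camera position.
    const loader = new THREE.CubeTextureLoader();
    scene.background = loader.load([
      'px.jpg', 'nx.jpg', // +X, -X (placeholder file names)
      'py.jpg', 'ny.jpg', // +Y, -Y
      'pz.jpg', 'nz.jpg'  // +Z, -Z
    ]);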
I'm interested in whether it would be possible to reproduce the Labeled Geometry three.js example on an animated asset, and if so, how?
I want to attach labels to an animated 3D model, but they do not move along with the asset (even if I parent them to it), I'm guessing because the animation performs a mesh deformation itself.
Is there a performant way to have, say, the sprite's position reference that of a specific vertex in the animated mesh buffer? Something that would not require me to manually update the sprite's position whenever the mesh's animation progresses. Ideally I would like to parent the sprite to a particular face or vertex on the mesh, but as far as I know there is no way to do this.
Thanks in advance for any help!
I have a WebGL application that I've written using three.js, but the FPS is not good enough on some of my test machines. I've tried to profile the application using Chrome's about:tracing, with the help of this article: http://www.html5rocks.com/en/tutorials/games/abouttracing/
It appears that the GPU is being overloaded. I also found that my FPS drops drastically when my entire scene is in the camera's view. The scene contains about 17 meshes and a single directional light source. It's not really a heavy scene; I've seen much heavier scenes rendered flawlessly on the same GPU.
So, what changes can I make to the scene to make it less heavy, without completely changing it? I've already tried removing the textures, but that doesn't seem to fix the problem.
Is there a way to figure out what computation three.js is pushing onto the GPU? Or would this break the basic abstraction three.js provides?
What are general tips for profiling WebGL three.js apps on the GPU?
There are various things to try.
Are you draw bound?
Change your canvas to 1x1 pixel. Does your framerate go way up? If so, you're drawing too many pixels or your fragment shaders are too complex.
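For example, assuming your renderer is stored in a variable called renderer, a quick way to run this test:

    // Sketch: shrink the drawing buffer to 1x1 to test whether you are fill-rate bound.
    renderer.setSize(1, 1);

    // ...and restore the real size afterwards:
    renderer.setSize(window.innerWidth, window.innerHeight);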
To see if simplifying your fragment shader would help, use a simpler shader. I don't know three.js that well; maybe the Basic Material?
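Something like this might do as a quick test, assuming scene is your THREE.Scene (keep references to the original materials if you want to restore them afterwards):

    // Sketch: temporarily swap every mesh's material for a flat MeshBasicMaterial
    // to see whether fragment shading is the bottleneck.
    scene.traverse(function (obj) {
      if (obj instanceof THREE.Mesh) {
        obj.material = new THREE.MeshBasicMaterial({ color: 0x808080 });
      }
    });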
Do you have shadows? Turn them off. Does it go faster? Can you use simpler shadows? For example, the shadows in this sample are fake; they are just planes with a circle texture.
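Turning shadow mapping off is a one-liner; the exact property name depends on the three.js release you're using:

    // Sketch: disable shadow mapping to measure its cost.
    renderer.shadowMap.enabled = false;   // current three.js releases
    // renderer.shadowMapEnabled = false; // older releases used this flag instead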
Are you using any postprocessing effects? Postprocessing effects are expensive, especially on mobile GPUs.
Are you drawing lots of opaque stuff? If so, can you sort your drawing order so you draw front to back (close to far)? I'm not sure if three.js has an option to do this or not. I know it can sort transparent stuff back to front, so it should be simple to reverse the test. This will make rendering go quicker, assuming you're drawing with the depth test on, because pixels in the back will be rejected by the DEPTH_TEST and so won't have the fragment shader run for them.
Another thing you can do to save bandwidth is draw to a smaller canvas and have it stretched with CSS to cover the area where you want it to appear. Lots of games do this.
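In three.js you can do that by giving the renderer a smaller drawing buffer and stretching the canvas element with CSS; a sketch:

    // Sketch: render at half resolution and let CSS scale the canvas up to full size.
    renderer.setSize(window.innerWidth / 2, window.innerHeight / 2, false); // false: don't touch the CSS size
    renderer.domElement.style.width = '100%';
    renderer.domElement.style.height = '100%';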
Are you geometry bound?
You say you're only drawing 17 meshes, but how big are those meshes? 17 twelve-triangle cubes, or 17 one-million-triangle meshes?
If you're geometry bound, can you simplify? If the geometry goes far into the distance, can you split it up and use LODs? See the LOD sample.
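A minimal THREE.LOD sketch (highDetailGeometry, lowDetailGeometry and material are placeholders):

    // Sketch: show a detailed mesh up close and a cheaper one farther away.
    const lod = new THREE.LOD();
    lod.addLevel(new THREE.Mesh(highDetailGeometry, material), 0);  // used from 0 units away
    lod.addLevel(new THREE.Mesh(lowDetailGeometry, material), 50);  // used beyond 50 units
    scene.add(lod);
    // Older three.js releases may also need lod.update(camera) once per frame.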
I'm actually trying to include a .jpg image in my 3D scene. All the solutions I have found consist of applying those images to meshes as textures, but then the scene does not look good. Indeed, we can see the mesh border, whether it's a plane or a sphere... I just want to see the image. Is there another solution?
In my application, I want to rotate an airplane around the earth, and the problem is how to include this airplane.
Thanks for your help :)
Perhaps the class THREE.Sprite will accomplish the effect you want. THREE.Sprite can display an image, and can use either screen coordinates (e.g. canvas coordinates) or it can be part of your 3D scene, but a sprite image is always facing the camera. If you want the image to rotate, you do need to use it as a texture on a mesh. Whatever you decide to do in the end, I've posted a bunch of tutorial-style examples at http://stemkoski.github.io/Three.js/ that may help. Good luck!
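For reference, a minimal sprite setup in recent three.js versions looks roughly like this ('airplane.png' is a placeholder file name):

    // Sketch: a sprite is a camera-facing image placed in the 3D scene.
    const map = new THREE.TextureLoader().load('airplane.png'); // placeholder file
    const sprite = new THREE.Sprite(new THREE.SpriteMaterial({ map: map }));
    sprite.position.set(0, 0, 10); // position in world units
    sprite.scale.set(5, 5, 1);     // size in world units
    scene.add(sprite);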
I've been working on a WebGL project that runs on top of the Three.js library. I am rendering several semi-transparent meshes, and I notice that depending on the angle you tilt the camera, a different object is on top.
To illustrate the problem, I made a quick demo using three semi-transparent cubes. When you rotate the image past perpendicular to the screen, the second half of the smallest cube "jumps" and is no longer visible. However, shouldn't it still be visible? I tried adjusting some of the blending equations, but that didn't seem to make a difference.
What I'm wondering is whether or not this is a bug in WebGL/Three, or something I can fix. Any insight would be much appreciated :)
Well, that's something they weren't able to solve when they invented all this hardware-accelerated graphics business, and it sounds like we'll have to deal with it for a long while.
The issue here is that graphics cards do not sort the polygons or the objects. The graphics card is "dumb": you tell it to draw an object and it will draw the pixels that represent it, and also, in another non-visible "image" called the z-buffer (or depth buffer), it will draw the pixels that represent the object, but instead of a color it stores the distance to the camera for each pixel. For any other objects that you draw afterwards, the graphics card will check the distance to the camera for each pixel, and if it's farther away, it won't draw it (unless you disable the check, that is).
This speeds things up a lot and gives you nice intersections between solid objects, but it doesn't play well with transparency. Say you have two transparent objects and you want A to be drawn behind B. You'll need to tell the graphics card to draw A first and then B. This works fine as long as they're not intersecting. To draw two intersecting transparent objects, all the polygons would have to be sorted, and since the graphics card doesn't do that, you'll have to do it yourself.
It's one of these things that you need to understand and specifically tweak for your case.
In three.js, if you set material.transparent = true, we'll sort that object so it's drawn before (earlier than) other objects that are in front of it. But we can't really help you if you want them to intersect.
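For two non-intersecting transparent objects, recent three.js versions also let you force the order explicitly via renderOrder; a sketch (meshA and meshB are placeholder names):

    // Sketch: force A (behind) to be drawn before B (in front).
    meshA.material.transparent = true;
    meshB.material.transparent = true;
    meshA.renderOrder = 0; // drawn first
    meshB.renderOrder = 1; // drawn second, blends over A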
I am drawing two relatively simple shapes, and their geometries do not overlap.
Here is the code sample:
http://jsfiddle.net/pGD4n/9/
The Three.js Trackball is in there, so you can click and drag to spin the objects around in 3D space. The problem is that as the objects rotate, some faces disappear, revealing the object below. Slightly more rotation and the missing face returns, but others go missing.
I've tried the Basic, Normal, and Lambert materials with both smooth and flat shading. I have tried the scene with and without lighting. Moving the objects farther apart seems to correct the issue, but in the given example code the meshes do not overlap and should not have this problem. The problem happens in both Chrome and Firefox.
I imagine that switching to the WebGL renderer would resolve the issue, but for compatibility we need to use the Canvas renderer.
Any help or ideas appreciated.
This is a limitation of CanvasRenderer. Unfortunately, per-pixel z-sorting is not available in CanvasRenderer, so it basically tries to sort whole polygons instead. Depending on where you're looking from, the center of one polygon may be closer than that of the polygon next to it, and so it "jumps".
The only solution right now is using WebGLRenderer. I'm working on a new renderer for context2d that will hopefully solve this without requiring WebGL, but it will still take some time.
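If it helps in the meantime, a common compromise is to use WebGLRenderer when the browser supports it and fall back to CanvasRenderer otherwise, e.g. with the Detector helper that ships in the three.js examples; a sketch:

    // Sketch: prefer WebGLRenderer, fall back to CanvasRenderer for browsers without WebGL.
    // Detector comes from examples/js/Detector.js in the three.js repository.
    const renderer = Detector.webgl
      ? new THREE.WebGLRenderer({ antialias: true })
      : new THREE.CanvasRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);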