Let's say I have a geometry whose vertices I'm using to create Points or an InstancedMesh. But then I want to change this underlying geometry to something else, say from a cone to a sphere, or anything else with the same number of vertices. I would like to animate between these without using morph targets, so I guess I need a custom vertex shader, which is fine; however, I'm a bit stuck as to how to pass the additional BufferGeometrys into the vertex shader.
I can't really think how I might do this with uniforms. Has anyone got any ideas? In my understanding I can only use int/float/bool/vec/ivec/mat, but I need multiple vertex buffers. Is it just an array of some kind?
I guess I'm trying to find a way of having multiple "full" geometries that I can interrogate within the vertex shader, but I can't figure out how to access or pass these additional buffers into WebGL from three.js.
The default position of the vertices is defined in a BufferAttribute called position. This is passed to the vertex shader as attribute vec3 position.
You could create a new BufferAttribute in your geometry called position2 and define it in your vertex shader as attribute vec3 position2.
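A minimal sketch of the idea, assuming both geometries have the same vertex count. The `mixPositions` helper below does on the CPU exactly what `mix(position, position2, t)` would do per vertex on the GPU, just to make the math concrete; the commented three.js/GLSL lines show where the second buffer would actually go (the attribute name `position2` and the uniform `t` are illustrative choices, not a fixed API):

```javascript
// CPU illustration of the per-vertex blend the shader performs.
// a and b are flat [x,y,z,x,y,z,...] position buffers of equal length.
function mixPositions(a, b, t) {
  if (a.length !== b.length) throw new Error("buffers must match");
  const out = new Float32Array(a.length);
  for (let i = 0; i < a.length; i++) {
    out[i] = a[i] * (1 - t) + b[i] * t; // same as GLSL mix(a, b, t)
  }
  return out;
}

// In three.js you would instead upload the second buffer once:
//   geometry.setAttribute('position2',
//     new THREE.BufferAttribute(spherePositions, 3));
// and blend on the GPU in the vertex shader:
//   attribute vec3 position2;
//   uniform float t;
//   vec3 p = mix(position, position2, t);
```

Animating is then just tweening the single uniform `t` between 0 and 1, rather than re-uploading vertex data every frame.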
I am trying to get back and export the mesh that is being displaced by a displacementMap.
The shader is transforming vertices according to this line (from
three.js/src/renderers/shaders/ShaderChunk/displacementmap_vertex.glsl):
transformed += normalize( objectNormal ) * ( texture2D( displacementMap, uv ).x * displacementScale + displacementBias );
This displaces each vertex along its normal by an amount read from the displacementMap at that vertex's uv coordinates.
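That shader line can be reproduced on the CPU, which is one route to an exportable geometry. A hedged sketch, where `texel` stands in for `texture2D(displacementMap, uv).x` (a 0..1 value you would sample from the canvas/image at the vertex's uv):

```javascript
// CPU version of:
// transformed += normalize(objectNormal) *
//                (texture2D(displacementMap, uv).x * displacementScale
//                 + displacementBias);
function displaceVertex(position, normal, texel, scale, bias) {
  const len = Math.hypot(normal[0], normal[1], normal[2]);
  const amount = texel * scale + bias;
  return [
    position[0] + (normal[0] / len) * amount,
    position[1] + (normal[1] / len) * amount,
    position[2] + (normal[2] / len) * amount,
  ];
}
```

Looping this over every vertex of the plane (sampling the canvas pixel nearest each vertex's uv) produces a genuinely displaced geometry that an STL exporter can see.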
I am trying to create this mesh/geometry so that I can then later export it.
I have created a "demo" of the problem here:
Github Page
I would like to get the displaced mesh, as seen in the viewport, upon pressing exportSTL. However I am only getting the undisplaced plane.
I understand why this happens: the displacement only happens in the shader and never actually modifies the geometry of the plane itself.
I have not found a method provided by three.js for this, and so far I have not found any way of getting the changes out of the shader.
So I am trying to do it with a function in the "demo.js".
However, I am a WebGL/three.js newbie and have problems re-creating what the shader does.
I have found exporters handling morphTargets, but these are of no help.
After reading this question I tried PlaneBufferGeometry, as this is closer to the shader - but this produces the same results for me.
I think this question originally tried to produce something similar, but it accepted an unrelated answer.
In the end I would like to draw on a HTML-canvas which then updates the texture in real time (I have this part working). The user can then export the mesh for 3d printing.
Is there a way three.js can give me the modified geometry of the shader?
Or can someone help me translate the shader line in to a "conventional" Three.js function?
Maybe this is totally the wrong approach to get a displaced mesh?
Update - Example is working
Thanks to the example from DeeFisher I can now calculate the displacement on the CPU, as originally suggested by imerso.
If you click on the Github Page now, you will get a working example.
At the moment I do not fully understand why I have to mirror the canvas to get the correct displacement in the end, but this is at worst a minor nuisance.
To do that while still using a shader for the displacement, you will need to switch to WebGL2 and use Transform-Feedback (Google search: WebGL2 Transform-Feedback).
An alternative would be to read the texture back to CPU, and scan it while displacing the vertices using CPU only (Google search: WebGL readPixels).
Both alternatives will require some effort, so no code sample at this time. =)
BABYLON.js can be used in conjunction with THREE.js and it allows you to displace the actual mesh vertices when applying displacement maps:
var sphere = BABYLON.Mesh.CreateSphere("Sphere", 64, 10, scene, true);
sphere.applyDisplacementMap(url, minHeight, maxHeight, onSuccess, uvOffset, uvScale)
See an example of the function in use here.
You can then use a for loop to transfer the BABYLON mesh data into a THREE mesh object.
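Both engines store vertex positions as flat [x, y, z, x, y, z, ...] arrays, so the transfer loop is mostly buffer copying. A hedged sketch (the Babylon calls in the comments are the standard API, but this glue code is untested):

```javascript
// Turn a flat position buffer into per-vertex objects, which is the
// shape the loop would feed into a THREE geometry's vertex list.
function flatToVector3Triples(flat) {
  const out = [];
  for (let i = 0; i < flat.length; i += 3) {
    out.push({ x: flat[i], y: flat[i + 1], z: flat[i + 2] });
  }
  return out;
}

// Against the real APIs the transfer would look roughly like:
//   const flat = sphere.getVerticesData(BABYLON.VertexBuffer.PositionKind);
//   const verts = flatToVector3Triples(flat);
//   // ...then build THREE.Vector3s (or a BufferAttribute) from verts
//   // and copy sphere.getIndices() across as faces.
```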
I am developing a 3D engine with WebGL and I am trying to implement shadows. My logic is as follow:
The first time the scene is rendered I loop over all meshes and create and compile the shader program (vertex and fragment shader). I only have one shader program per mesh, so when the shader is created I need to know the lights the scene has, the mesh's material, and other considerations.
Once the shader is created I attach it to the mesh object and render the object. On the next iteration the shader is not created again (because it was created previously).
I have heard about shadow mapping. In order to implement it I need to render to a texture and compute the distance between the light and the current fragment, and do this for each light source. So if I have 2 lights I need to do this process twice and then pass those textures to the shader that renders the scene.
The problem is that I could create 100 lights if I wanted, and then I would need to render 100 textures and pass them to the shader that renders the scene, but OpenGL and WebGL have a limited number of texture units, so I couldn't bind all the textures needed to render the complete scene.
How can I implement shadow mapping with an arbitrary number of lights?
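One standard way around the texture-unit limit (an assumption on my part, not something stated above) is a shadow atlas: render every light's depth map into tiles of one large texture, and pass the shader a per-light offset/scale instead of a per-light sampler. A sketch of the tile bookkeeping:

```javascript
// Compute the tile rectangle for light i when packing lightCount
// square shadow maps into a single square atlas texture.
function atlasTile(i, lightCount, atlasSize) {
  const tilesPerRow = Math.ceil(Math.sqrt(lightCount));
  const tile = atlasSize / tilesPerRow;
  return {
    x: (i % tilesPerRow) * tile,              // pixel offset in the atlas
    y: Math.floor(i / tilesPerRow) * tile,
    size: tile,                               // each light's map resolution
  };
}
```

The scene shader then needs only one bound depth texture plus a uniform array of these offsets, so the light count is bounded by uniform space (or a data texture), not by texture units.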
As far as I know, it is called projective texture mapping. Are there any library methods to project primitive 2D shapes (mostly lines) onto a texture?
This threejs example looks close to what I need. I tried replacing the decal texture (decalMaterial) with
THREE.LineBasicMaterial
but I get squares instead of lines.
Drawing a line onto a 3D texture is possible. But do you want that, or just a decal on your model? There are other ways of putting on a decal that might be better, but I'll assume you want to paint onto the texture. I've done it in Unity, but not WebGL, and am not familiar with WebGL's limitations. It was a program that lets you paint onto 3D models like in Substance Painter, ZBrush, etc. I recommend doing this in a non-destructive manner at runtime: paint into a separate render texture, then combine the two textures when rendering your final model.
To do this you will need to render your model into texture space. So in your vertex shader, output the uv of the model as your position.
I've been writing HLSL and Cg a lot recently, so my GLSL is super rusty. Treat this more like pseudocode.
//Vertex Shader
in vec4 position;
in vec2 uv;
out vec4 fworldPos;
out vec2 fuv;
uniform mat4 transform;
void main()
{
//We use the uv position of the model so we can draw into
//its texture space.
gl_Position = vec4(uv.x*2.0-1.0,uv.y*2.0-1.0,0.0,1.0);
//Give the fragment shader the uv
fuv = uv;
//We will need to do per pixel collision detection with
//the line or object so we need the world position.
fworldPos = transform * position;
}
You then need to do collision detection with your brush in the pixel shader and see if that world position is contained within your brush. Again, though, this is if you want to paint on it. In a project I did, we used raycasts onto the model so the brush would follow along the surface, which makes the above easy. This will give you a very rough-looking brush, and you'll need to add a lot of parameters to adjust how the brush looks, like falloff strength, falloff radius, etc. Basically all the parameters you see in other 3D painting software.
You can do this with planes, boxes, spheres, lines, whatever. You just need to test whether the world position is contained within your brush. You will need different shaders for different types of brushes, though.
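For a spherical brush, the per-pixel test the fragment shader would run looks like this. A hedged CPU sketch (the falloff shape is an illustrative choice; `worldPos` is what `fworldPos` carries in the shader above):

```javascript
// Returns paint strength in [0,1] for one fragment of the model,
// given the fragment's interpolated world position.
function sphereBrushStrength(worldPos, brushCenter, radius, falloff) {
  const dx = worldPos[0] - brushCenter[0];
  const dy = worldPos[1] - brushCenter[1];
  const dz = worldPos[2] - brushCenter[2];
  const d = Math.sqrt(dx * dx + dy * dy + dz * dz);
  if (d >= radius) return 0;       // outside the brush: no paint
  const t = d / radius;            // 0 at the centre, 1 at the edge
  return Math.pow(1 - t, falloff); // falloff exponent shapes the edge
}
```

In the shader this becomes a few lines of GLSL with `brushCenter`, `radius`, and `falloff` as uniforms, and the returned strength used to blend the brush colour into the render texture.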
I know that we can load JSON models in WebGL, but I don't know how to animate them if we have a rigged model loaded. Is there any way of doing this without three.js?
You can animate a rigged model using THREE.js (however, you seem to not want to use the built-in functionality).
What THREE.js does in the background is pass all the matrix transforms (an array of matrices), and per vertex it passes the bone indices (up to 4) and bone weights to the vertex shader. In the vertex shader, it blends between those matrices based on the vertex weights and transforms the vertex. So in theory you can pass those values to the vertex shader yourself to animate things. Or just use the THREE.js animation routines.
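The blending step can be sketched on the CPU like this (a simplified illustration of linear-blend skinning, not THREE.js's actual code; matrices are column-major 4x4 flat arrays, like GLSL mat4):

```javascript
// Blend up to four bone matrices by weight, then transform vertex v.
// v: [x, y, z]; boneMatrices: array of 16-element arrays;
// boneIndices/boneWeights: the per-vertex data the answer describes.
function skinVertex(v, boneMatrices, boneIndices, boneWeights) {
  const m = new Float32Array(16);
  for (let b = 0; b < boneIndices.length; b++) {
    const bm = boneMatrices[boneIndices[b]];
    for (let k = 0; k < 16; k++) m[k] += bm[k] * boneWeights[b];
  }
  // Multiply the blended matrix by (v.x, v.y, v.z, 1).
  return [
    m[0] * v[0] + m[4] * v[1] + m[8] * v[2] + m[12],
    m[1] * v[0] + m[5] * v[1] + m[9] * v[2] + m[13],
    m[2] * v[0] + m[6] * v[1] + m[10] * v[2] + m[14],
  ];
}
```

Run per vertex, per frame, with the bone matrices updated from the animation, this is the whole trick; the GPU version is the same math with `skinIndex`/`skinWeight` attributes.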
It can use two methods to store all this data. One method uses an "image texture" which stores all those matrices and does some fancy footwork to turn the image back into matrices in the vertex shader. The other method is just passing a uniform matrix array (on newer graphics cards this is the preferred method).
I'm having difficulties with vertex normals in THREE.js. (For reference I'm using revision 58.) For various reasons I'd like to first calculate the face vertex normals when I setup my geometry, then be free to transform it, merge and whatnot.
While I realize the normals depend on the vertices which are transformed when you apply a matrix, I thought geometry.applyMatrix was able to transform them as well. However, while the following works fine:
geometry.applyMatrix(new THREE.Matrix4().makeScale(1, -1, 1));
geometry.computeFaceNormals();
geometry.computeVertexNormals();
...the following order of operations yields reversed vertex normals:
geometry.computeFaceNormals();
geometry.computeVertexNormals();
geometry.applyMatrix(new THREE.Matrix4().makeScale(1, -1, 1));
So I'm simply wondering, is this working as intended? Do I need to first do all the transformations on the geometry before I calculate the vertex normals?
three.js does not support reflections in the object matrix. By setting a negative scale factor, you are reflecting the geometry of the object.
You are free to apply such a matrix to your geometry directly, however, which of course, is what you are doing.
However, this will result in a number of undesirable consequences, one of which is that the geometry faces will no longer have counterclockwise winding order, but clockwise. It will also result in reversed face normals as computed by geometry.computeFaceNormals().
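A small sketch of why the reflection reverses the computed normals: face normals come from a cross product of two edge vectors, and mirroring a triangle flips its winding, so the cross product flips sign.

```javascript
// Face normal of triangle (a, b, c) from the cross product of its
// edges, the same construction computeFaceNormals() uses.
function faceNormal(a, b, c) {
  const e1 = [b[0] - a[0], b[1] - a[1], b[2] - a[2]];
  const e2 = [c[0] - a[0], c[1] - a[1], c[2] - a[2]];
  return [
    e1[1] * e2[2] - e1[2] * e2[1],
    e1[2] * e2[0] - e1[0] * e2[2],
    e1[0] * e2[1] - e1[1] * e2[0],
  ];
}

// The reflection applied in the question: makeScale(1, -1, 1).
const mirrorY = (p) => [p[0], -p[1], p[2]];
```

A CCW triangle in the XY plane has normal +Z; after mirroring each vertex through the same matrix, the recomputed normal is -Z. That is exactly the "compute normals first, then applyMatrix" ordering from the question, which is why transforming first and computing normals last gives the expected result.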
I would advise against doing this unless you are familiar with the inner-workings of the library.
three.js r.58