I'm new to Three.js and JavaScript.
I made an object with a material in Blender, UV-unwrapping it twice so the bump map could have its own UV layout.
Then I exported it and loaded it with THREE.JSONLoader.
The object shows up, but it uses the first UV map for the bump map, while I want it to use the second UV map.
In this case, how can I make it use the second UV map? What coding is needed?
Thank you for reading.
This is currently a limitation of three.js.
The default materials in three.js all use the first set of UVs for all maps, with the exception of the light map and ambient occlusion map, which each use the second set of UVs.
Your workarounds are to hack the library, write a custom ShaderMaterial, or modify your geometry and/or textures. Since you are new to three.js and JS, I'd suggest the last option.
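If the bump map is the only texture on your material, a minimal sketch of the geometry-side workaround (assuming the classic Geometry returned by JSONLoader, with both UV layers present) is to copy the second UV layer over the first, so every map samples your second unwrap:

// Sketch only: this makes ALL maps use the second UV set, so it only
// works when the bump map is the sole texture (or all maps can share
// the same unwrap).
geometry.faceVertexUvs[ 0 ] = geometry.faceVertexUvs[ 1 ];
geometry.uvsNeedUpdate = true;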
Also see this related answer.
three.js r.90
EDIT:
I originally had a question about exporting to OBJ and MTL, but discovered that I could export from three.js using GLTFExporter.js, and had success getting the geometry and texture out of three.js that way.
The issue I'm having with the GLTF Exporter is that my textures have offset and repeat settings that don't seem to be exported from three.js: when I open the file in Blender, the whole texture covers the mesh plane that showed only a small part of the texture in the three.js scene.
Might anyone know what I could add to the GLTF Exporter to be able to record and keep the repeat and offset texture settings?
Many Thanks :)
I've hit this myself, and as far as I know, the answer is no.
Offset and repeat are THREE.js-specific features. Some other libraries have equivalents; some engines use direct texture-matrix manipulation to achieve the same effect.
One workaround is to modify your model's UV coordinates before exporting, to reflect the settings of texture.offset and texture.repeat.
You would basically multiply each vertex UV by texture.repeat and then add texture.offset. That effectively "bakes" those parameters into the model's UVs, but it then requires you to reset .repeat and .offset back to (1, 1) and (0, 0) respectively, in order to render the model correctly again in THREE.js.
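As a sketch of that baking step (assuming a BufferGeometry with a uv attribute; the function name is illustrative, not a library API):

function bakeUvTransform( geometry, texture ) {
    var uv = geometry.attributes.uv;
    for ( var i = 0; i < uv.count; i ++ ) {
        // new UV = old UV * repeat + offset, matching what the renderer does
        uv.setXY(
            i,
            uv.getX( i ) * texture.repeat.x + texture.offset.x,
            uv.getY( i ) * texture.repeat.y + texture.offset.y
        );
    }
    uv.needsUpdate = true;
    // Reset so the model still renders correctly in THREE.js afterwards.
    texture.repeat.set( 1, 1 );
    texture.offset.set( 0, 0 );
}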
Here's a slightly relevant thread from the GLTF working group:
https://github.com/KhronosGroup/glTF/issues/107
I am trying to recover, and then export, the mesh that is being displaced by a displacementMap.
The shader transforms vertices according to this line (from three.js/src/renderers/shaders/ShaderChunk/displacementmap_vertex.glsl):
transformed += normalize( objectNormal ) * ( texture2D( displacementMap, uv ).x * displacementScale + displacementBias );
This displaces each vertex along its normal according to the displacementMap value sampled at that vertex's UV coordinates.
I am trying to create this mesh/geometry so that I can then later export it.
I have created a "demo" of the problem here:
Github Page
I would like to get the displaced mesh, as seen in the viewport, upon pressing exportSTL. However, I am only getting the undisplaced plane.
I understand why this happens: the displacement happens only in the shader and does not actually displace the geometry of the plane.
I have not found a method provided by three.js, and so far have not found any way of getting the changes out of the shader.
So I am trying to do it with a function in the "demo.js".
However, I am a WebGL/three.js newbie and have problems re-creating what the shader does.
I have found exporters handling morphTargets, but these are of no help.
After reading this question I tried PlaneBufferGeometry, as this is closer to what the shader works with, but it produces the same result for me.
I think this question originally tried to produce something similar, but accepted an unrelated answer.
In the end I would like to draw on an HTML canvas, which then updates the texture in real time (I have this part working). The user can then export the mesh for 3D printing.
Is there a way three.js can give me the modified geometry of the shader?
Or can someone help me translate the shader line in to a "conventional" Three.js function?
Maybe this is totally the wrong approach to get a displaced mesh?
Update - Example is working
Thanks to the example from DeeFisher I can now calculate the displacement on the CPU, as originally suggested by imerso.
If you click on the Github Page now, you will get a working example.
At the moment I do not fully understand why I have to mirror the canvas to get the correct displacement in the end, but this is at worst a minor nuisance.
To do that while still using a shader for the displacement, you will need to switch to WebGL2 and use Transform-Feedback (Google search: WebGL2 Transform-Feedback).
An alternative would be to read the texture back to CPU, and scan it while displacing the vertices using CPU only (Google search: WebGL readPixels).
Both alternatives will require some effort, so no code sample at this time. =)
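As a rough sketch of the CPU route (this is not a library API; it assumes a BufferGeometry with position, normal and uv attributes, and samples the displacement map through a 2D canvas instead of readPixels for simplicity):

function displaceGeometry( geometry, image, displacementScale, displacementBias ) {
    // Draw the displacement map into a 2D canvas so its pixels can be read on the CPU.
    var canvas = document.createElement( 'canvas' );
    canvas.width = image.width;
    canvas.height = image.height;
    var ctx = canvas.getContext( '2d' );
    ctx.drawImage( image, 0, 0 );
    var data = ctx.getImageData( 0, 0, canvas.width, canvas.height ).data;

    var pos = geometry.attributes.position;
    var nrm = geometry.attributes.normal;
    var uv = geometry.attributes.uv;

    for ( var i = 0; i < pos.count; i ++ ) {
        // Nearest-neighbour sample of the red channel at this vertex's UV.
        // The 1 - v flip accounts for canvas rows running top-down while UV v
        // runs bottom-up; this may be the mirroring mentioned in the update above.
        var x = Math.min( canvas.width - 1, Math.floor( uv.getX( i ) * canvas.width ) );
        var y = Math.min( canvas.height - 1, Math.floor( ( 1 - uv.getY( i ) ) * canvas.height ) );
        var texel = data[ ( y * canvas.width + x ) * 4 ] / 255;
        var d = texel * displacementScale + displacementBias;
        // transformed += normalize( objectNormal ) * d, as in the shader chunk quoted above.
        pos.setXYZ( i,
            pos.getX( i ) + nrm.getX( i ) * d,
            pos.getY( i ) + nrm.getY( i ) * d,
            pos.getZ( i ) + nrm.getZ( i ) * d );
    }
    pos.needsUpdate = true;
    geometry.computeVertexNormals();
}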
BABYLON.js can be used in conjunction with THREE.js and it allows you to displace the actual mesh vertices when applying displacement maps:
var sphere = BABYLON.Mesh.CreateSphere("Sphere", 64, 10, scene, true);
sphere.applyDisplacementMap(url, minHeight, maxHeight, onSuccess, uvOffset, uvScale)
See an example of the function in use here.
You can then use a for loop to transfer the BABYLON mesh data into a THREE mesh object, as sketched below.
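A rough sketch of that transfer (getVerticesData and getIndices are BABYLON API; the rest is illustrative, and note that addAttribute was the three.js method around r.90, later renamed setAttribute):

var positions = sphere.getVerticesData( BABYLON.VertexBuffer.PositionKind );
var indices = sphere.getIndices();

var geometry = new THREE.BufferGeometry();
// addAttribute in older three.js releases; setAttribute in current ones.
geometry.addAttribute( 'position', new THREE.Float32BufferAttribute( positions, 3 ) );
geometry.setIndex( Array.prototype.slice.call( indices ) );
geometry.computeVertexNormals();
var mesh = new THREE.Mesh( geometry, new THREE.MeshPhongMaterial() );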
I'm trying to load the same image with three.js into a large number (~1000) of two-dimensional shapes, but with a different offset in every shape.
I've taken this demo from the official website and customized it into this other demo, with all my shapes and a random background texture.
The problem is that if I clone the texture once per shape, the page eats a lot of RAM and ends up crashing.
You can see this in action by going in the javascript and changing the comments in the addShape function (you'll find the instructions in the code).
I've done some research and found some results, like this open issue or this older question where it's recommended to clone the texture; however, nothing seems to work in my example.
Am I doing something wrong? Has something changed since those posts about this problem?
Maybe I'm misunderstanding the problem, but why don't you change the UV coordinates of the individual shapes to align the texture, and use just one texture?
From the documentation:
Geometry.faceVertexUvs
Array of face UV layers, used for mapping textures onto the geometry. Each UV layer is an array of UVs matching the order and number of vertices in faces.
To signal an update in this array, Geometry.uvsNeedUpdate needs to be set to true.
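A sketch of that idea with the classic Geometry API quoted above, shifting each shape's UVs by the offset you would otherwise have put on a cloned texture (offsetU/offsetV are illustrative names):

shapeGeometry.faceVertexUvs[ 0 ].forEach( function ( faceUvs ) {
    faceUvs.forEach( function ( uv ) {
        uv.x += offsetU;
        uv.y += offsetV;
    } );
} );
shapeGeometry.uvsNeedUpdate = true;
// All ~1000 meshes can then share the single original texture.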
I'm having trouble correctly applying a texture to an object.
As you can see from this picture: http://i.stack.imgur.com/4WpP4.png the texture is repeated instead of being applied continuously across the whole front face of the object.
Here: http://goo.gl/Dx6hDI you can find the code and a live example.
Can someone help me?
Your object does not have correct UV coordinates. Load the object into a 3D editor, apply the coordinates you want, then export a new version.
Currently I'm working with three.js, and I need to combine a ShaderMaterial with a MeshPhongMaterial: I'm using a custom shader that combines several textures into a single one, but I don't want to lose all the work (lights and reflections) that the MeshPhongMaterial shader does.
Is there a way to do this?
The solution was rather easy: I just took the shader code from the phong material and added my custom code in the section where the texel variable is assigned.
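If you'd rather not copy the whole phong shader, here is a sketch of the same idea using onBeforeCompile, available in newer three.js releases (overlayMap and the blend are illustrative assumptions, vUv is the varying used by the phong shader of that era, and color-space conversion is omitted for brevity):

var material = new THREE.MeshPhongMaterial( { map: baseTexture } ); // baseTexture: your main texture
material.onBeforeCompile = function ( shader ) {
    shader.uniforms.overlayMap = { value: overlayTexture }; // overlayTexture: assumed second texture
    shader.fragmentShader = shader.fragmentShader
        .replace( 'void main() {', 'uniform sampler2D overlayMap;\nvoid main() {' )
        // map_fragment is the chunk where the phong shader assigns the texel variable;
        // all the lighting and reflection chunks are left untouched.
        .replace( '#include <map_fragment>', [
            'vec4 texelColor = texture2D( map, vUv );',
            'vec4 overlay = texture2D( overlayMap, vUv );',
            'texelColor.rgb = mix( texelColor.rgb, overlay.rgb, overlay.a );',
            'diffuseColor *= texelColor;'
        ].join( '\n' ) );
};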