I'm currently working with Three.js and need to combine a ShaderMaterial (I'm using a custom shader that combines several textures into a single one) with MeshPhongMaterial, since I don't want to lose all the work (lighting and reflections) that the MeshPhongMaterial shader already does.
Is there a way to do this?
The solution was rather easy: I took the shader code from the Phong material and added my custom code in the section where the texel variable is assigned.
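For reference, a lighter way to do the same thing in more recent three.js revisions is to patch MeshPhongMaterial with onBeforeCompile instead of copying the whole Phong shader. This is only a sketch: textureA, textureB and mixFactor are made-up names, and the exact contents of the map_fragment chunk (for example whether it samples vUv or vMapUv) differ between revisions, so check the chunk for your version.

// Sketch: blend a second texture into MeshPhongMaterial where the texel is
// assigned, keeping the built-in lighting intact. Names are hypothetical.
const material = new THREE.MeshPhongMaterial({ map: textureA });

material.onBeforeCompile = (shader) => {
  shader.uniforms.textureB = { value: textureB };
  shader.uniforms.mixFactor = { value: 0.5 };

  shader.fragmentShader = shader.fragmentShader
    .replace(
      '#include <common>',
      '#include <common>\nuniform sampler2D textureB;\nuniform float mixFactor;'
    )
    .replace(
      '#include <map_fragment>',
      `
      vec4 texelColor = texture2D( map, vUv );   // stock lookup
      vec4 texelB = texture2D( textureB, vUv );  // extra texture
      texelColor = mix( texelColor, texelB, mixFactor );
      diffuseColor *= texelColor;
      `
    );
};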
Related
Is there a way to bring three.js material features and scene elements (like lights) into my ShaderMaterial, in order to light it and have it cast shadows (on itself)?
I'm using react-three-fiber and haven't been able to find appropriate resources yet.
Here's my code: https://codesandbox.io/s/r3f-wavey-image-shader-forked-nm3ykn?file=/src/App.js
EDIT:
I had a question about exporting to OBJ and MTL, but discovered that I could export from three.js using GLTFExporter.js and had success getting the geometry and texture out that way.
The issue I'm having with the GLTFExporter is that my textures have offset and repeat settings that don't seem to be exported from three.js: when I open the file in Blender, the whole texture covers the plane mesh that only showed a small part of the texture in the Three.js scene.
Does anyone know what I could add to the GLTFExporter to record and keep the repeat and offset texture settings?
Many Thanks :)
I've hit this myself, and as far as I know the answer is no.
Offset and repeat are THREE.js-specific features. Some other libraries have equivalents; some engines use direct texture-matrix manipulation to achieve the same effect.
One workaround is to modify your model's UV coordinates before exporting, to reflect the settings of texture.offset and texture.repeat.
You would basically multiply each vertex UV by texture.repeat, and then add texture.offset. That effectively "bakes" those parameters into the model's UVs, but it then requires you to reset .repeat and .offset back to 1,1 and 0,0 respectively, in order to render the model correctly again in THREE.js.
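Something like this sketch, assuming a BufferGeometry with a 'uv' attribute and a texture whose .repeat and .offset were set in three.js:

// Bake texture.repeat / texture.offset into the UVs before exporting.
const uv = mesh.geometry.attributes.uv;
const repeat = texture.repeat;
const offset = texture.offset;

for (let i = 0; i < uv.count; i++) {
  uv.setXY(
    i,
    uv.getX(i) * repeat.x + offset.x,
    uv.getY(i) * repeat.y + offset.y
  );
}
uv.needsUpdate = true;

// Reset so the model still renders correctly in three.js afterwards.
texture.repeat.set(1, 1);
texture.offset.set(0, 0);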
Here's a slightly relevant thread from the GLTF working group:
https://github.com/KhronosGroup/glTF/issues/107
I'm new to Three.js and JavaScript.
I made an object with a material in Blender after UV unwrapping it twice for bump maps.
Then I exported it to the three.js JSON format and loaded it with THREE.JSONLoader.
The object shows up, but the problem is that it's using the first UV map for the bump map; I want it to use the second UV map.
In this case, how can I make it use the second UV map? What coding is needed?
Thank you for reading.
This is currently a limitation of three.js.
The default materials in three.js all use the first set of UVs for all maps, with the exception of the light map and ambient occlusion map, which each use the second set of UVs.
Your workarounds are to hack the library, write a custom ShaderMaterial, or modify your geometry and/or textures. Since you are new to three.js and JS, I'd suggest the latter.
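For example, if your model ends up as a BufferGeometry carrying both UV sets (by three.js convention the attributes 'uv' and 'uv2'), one crude geometry-side workaround is to swap the two sets. This is only a sketch, and it makes all of the material's regular maps, not just the bump map, sample the second set:

// Swap the UV sets so the bump map (which reads 'uv') samples the second set.
const geometry = mesh.geometry;
const firstSet = geometry.attributes.uv;
const secondSet = geometry.attributes.uv2;

geometry.attributes.uv = secondSet;   // color/bump/specular maps now use set 2
geometry.attributes.uv2 = firstSet;   // light map / AO map now use set 1
geometry.attributes.uv.needsUpdate = true;
geometry.attributes.uv2.needsUpdate = true;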
Also see this related answer.
three.js r.90
I created a THREE.Points object, using a BufferGeometry, to render thousands of particles with PointsMaterial. I update the material at runtime, switching between the default square particle and some other textures and colors, and it works fine.
The problem comes when I want to create particles with different sizes. I can't do it just by setting a BufferAttribute. I tried to make a custom shader, following Three.js Particles of various sizes, but I couldn't make it work; maybe a Three.js version problem? I don't know.
So I thought of grouping my particles by size and creating one BufferGeometry for each particle size. But I'm not sure this is the best approach. Is this the best choice in terms of performance, or do I have to create a custom shader to achieve this goal?
Three.js revision: 72
An example of what you want to do is given in this answer -- just use the alpha attribute value in the example to vary the point size instead: http://jsfiddle.net/8mrH7/196/
gl_PointSize = 8.0 * alpha; // use alpha to vary point size, instead
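Roughly, the fiddle adapted to a per-point 'size' attribute looks like the sketch below. It is written against the BufferGeometry API of that era (addAttribute, typed uniforms); newer revisions use setAttribute. The attribute and uniform names are just examples, and scene is assumed to already exist.

// One size value per particle, read in the vertex shader as gl_PointSize.
const particleCount = 1000;
const positions = new Float32Array(particleCount * 3);
const sizes = new Float32Array(particleCount);

for (let i = 0; i < particleCount; i++) {
  positions[3 * i]     = (Math.random() - 0.5) * 100;
  positions[3 * i + 1] = (Math.random() - 0.5) * 100;
  positions[3 * i + 2] = (Math.random() - 0.5) * 100;
  sizes[i] = 2 + 10 * Math.random(); // per-particle size in pixels
}

const geometry = new THREE.BufferGeometry();
geometry.addAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.addAttribute('size', new THREE.BufferAttribute(sizes, 1));

const material = new THREE.ShaderMaterial({
  uniforms: {
    color: { type: 'c', value: new THREE.Color(0xffffff) }
  },
  vertexShader: [
    'attribute float size;',
    'void main() {',
    '  gl_PointSize = size;',
    '  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
    '}'
  ].join('\n'),
  fragmentShader: [
    'uniform vec3 color;',
    'void main() {',
    '  gl_FragColor = vec4( color, 1.0 );',
    '}'
  ].join('\n')
});

scene.add(new THREE.Points(geometry, material));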
three.js r.73
I am building an application that dynamically loads images from a server to use as textures in the scene and I am working on how to load/unload these textures properly.
My simple question is: where, in the Three.js call graph, do textures get loaded and/or updated into the GPU? Is it when I create a texture (var tex = new THREE.Texture()) or when I apply it to a mesh (var mesh = new THREE.Mesh(geom, mat))? The Texture class in three.js suggests that textures are not loaded when the texture is created, but I cannot find anything in Mesh either.
Am I missing something? Are textures loaded in the render loop rather than on object creation? That would probably make sense.
Thanks in advance!
All GPU instructions have been abstracted away to the WebGLRenderer.
This means the creation of any object within three.js will not interact with the GPU in the slightest until you call:
renderer.render(scene, camera);
This call will automatically setup all the relevant WebGL buffers, shaders, attributes, uniforms, textures, etc. So until that point in time, all three.js meshes with their materials and geometries are really just nicely abstracted objects, completely separated from the way they are rendered to the screen (why assume they will be rendered at all?).
The main reason for this is that there are other renderers, such as the CanvasRenderer, which have an entirely different API.
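To make the timing concrete, here's a small sketch (scene, camera and renderer are assumed to already exist, and the image path is made up): nothing below touches the GPU until renderer.render() runs.

// CPU side only: fetch the image asynchronously and build the scene graph.
const texture = new THREE.TextureLoader().load('textures/example.jpg');
const material = new THREE.MeshBasicMaterial({ map: texture });
const mesh = new THREE.Mesh(new THREE.PlaneGeometry(1, 1), material);
scene.add(mesh);

// If you construct a THREE.Texture from an image yourself instead of using a
// loader, flag it so the renderer knows to (re)upload it on the next render:
// texture.needsUpdate = true;

// First render: the WebGLRenderer creates the GL texture object and uploads
// the image data (once loaded), along with buffers, shaders and uniforms.
renderer.render(scene, camera);

// When the texture is no longer needed, free the GPU memory explicitly.
texture.dispose();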