I am using the three.js Collada loader to import a .dae file with a texture (a .png image) applied. I need to overwrite that .png file with a new texture, which I create using a canvas element that exports to .png format. The new texture only shows up if I clear my cache (I am giving the exported "new" .png texture the same filename as the original .png texture referenced in the .dae file).
How can I ensure that the new texture is recognized/rendered without making the user clear their cache? Example: the user creates a new texture, it is exported over the original texture, and the Collada model re-renders after the user clicks a button to render the box.
Once you have a JavaScript Image object in memory, regardless of its source, you can assign it to your objects' materials' textures via the .image attribute. You'll need to let THREE.js know to update the binding. For example, if you have a new Image called, say, img and a mesh called mesh with a typical material:
mesh.material.map.image = img;
mesh.material.map.needsUpdate = true;
should do the trick. There is no need to write a DOM element to disk as a .png; just use it directly.
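Concretely, a canvas element can be assigned in place of an Image. A minimal sketch, assuming mesh already has a material with a .map texture loaded by the Collada loader (the helper name is illustrative):

```javascript
// Minimal sketch: hand a canvas straight to an existing texture.
// Assumes `mesh.material.map` already exists (e.g. set up by ColladaLoader).
function applyCanvasTexture(mesh, canvas) {
  mesh.material.map.image = canvas;     // a canvas is accepted wherever an Image is
  mesh.material.map.needsUpdate = true; // tells THREE.js to re-upload it to the GPU
}
```

Calling this from the button's click handler means the next rendered frame shows the new texture, with no file write and no browser cache involved.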
I have exported a model using the glTF exporter included in Blender 2.8. Exporting to .gltf works fine, but when I export to .glb I cannot see the texture anymore. What is weird is that if I check the .glb file in the glTF viewer https://gltf-viewer.donmccurdy.com/ it works fine, while in my environment and in the three.js editor https://threejs.org/editor/ the texture is black. Why does this happen, and how can I fix it? Does the glTF viewer load something differently? Here is the model to check for yourself: https://drive.google.com/open?id=1gqdujds0VAgU__92GgTMsgWkO1BbbnPJ
glTF Viewer - works fine
Three.js editor - black texture (an ambient light is added)
That's because the viewer applies an environment map to the materials of your skinned mesh. This is not true for the editor. When you load the model in the viewer, just set Environment to None in order to see this effect.
Instead of using an environment map, you can also set the metalness value for all materials from 1 to 0. Why a metalness value of 1 is problematic in your case is explained here:
https://stackoverflow.com/a/53349297/5250847
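As a sketch of the second option, you could traverse the loaded scene graph and zero out metalness on every mesh material; here `root` stands for the `gltf.scene` object returned by GLTFLoader (the function name is illustrative):

```javascript
// Sketch: walk the loaded glTF scene and set metalness to 0 on every mesh
// material, so the materials no longer depend on an environment map.
function zeroMetalness(root) {
  root.traverse(function (obj) {
    if (obj.isMesh && obj.material) obj.material.metalness = 0;
  });
}
```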
I had a question about exporting to .obj and .mtl, but discovered that I could export from three.js using GLTFExporter.js, and had success getting the geometry and texture out of three.js that way.
The issue I'm having with the GLTF exporter is that my textures have offset and repeat settings that do not seem to be exported from three.js: when I open the file in Blender, the whole texture fills the plane that used to show only a small part of the texture in the three.js scene.
Might anyone know what I could add to the GLTF exporter to record and keep the repeat and offset texture settings?
Many Thanks :)
I've hit this myself, and as far as I know the answer is no.
Offset and repeat are THREE.js-specific features. Some other libraries have equivalents; some engines use direct texture-matrix manipulation to achieve the same effect.
One workaround is to modify your model's UV coordinates before exporting so that they reflect the settings of texture.offset and texture.repeat.
You would basically multiply each vertex UV by texture.repeat and then add texture.offset. That effectively "bakes" those parameters into the model's UVs, but it then requires you to reset .repeat and .offset back to (1, 1) and (0, 0) respectively in order to render the model correctly again in THREE.js.
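A minimal sketch of that baking step, operating on a flat [u, v, u, v, ...] array such as the one behind geometry.attributes.uv (the function name is an assumption):

```javascript
// Bake texture.repeat / texture.offset into the UV coordinates in place,
// so the exported model looks the same without those settings.
// uvArray is a flat [u, v, u, v, ...] array, e.g. geometry.attributes.uv.array.
function bakeUvTransform(uvArray, repeat, offset) {
  for (var i = 0; i < uvArray.length; i += 2) {
    uvArray[i]     = uvArray[i]     * repeat.x + offset.x; // u
    uvArray[i + 1] = uvArray[i + 1] * repeat.y + offset.y; // v
  }
  return uvArray;
}
```

After exporting, remember to set texture.repeat back to (1, 1) and texture.offset back to (0, 0) so the model still renders correctly in THREE.js.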
Here's a slightly relevant thread from the GLTF working group:
https://github.com/KhronosGroup/glTF/issues/107
I am working on displaying animated 3D models on a webpage. These models come as .obj, .mtl and .fbx files, with and without textures. I have successfully displayed textured .obj 3D models on the webpage (with their .mtl files), but I am unable to display textured, animated .fbx 3D models.
I have already searched this topic on Google, SO and GitHub (https://github.com/mrdoob/three.js/issues) but have not found any solution.
I want to ask 2 questions here:
Is it possible to display FBX 3D models with textures via three.js?
If it is possible, how can I do this? If it is not, what alternative can I use to render a textured FBX model on a webpage?
After spending a lot of time on this, I have come to understand that you cannot use a .fbx model file directly with three.js (as of r82). Alternatively, you can convert the .fbx file to .json (using the three.js Maya exporter) or .js (using Blender).
So I am moving ahead with the .json file format. Thanks for your suggestion #mlkn
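For reference, loading such a converted .json file in three.js of that era (around r82) looked roughly like this; the function name and URL argument are illustrative, and JSONLoader was removed in later releases:

```javascript
// Era-specific sketch (three.js ~r82): load a model converted to the
// three.js JSON format and add it to the scene.
function loadConvertedModel(scene, url) {
  var loader = new THREE.JSONLoader();
  loader.load(url, function (geometry, materials) {
    // `materials` comes parsed from the file; wrap it for multi-material meshes
    var mesh = new THREE.Mesh(geometry, new THREE.MultiMaterial(materials));
    scene.add(mesh);
  });
}
```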
Is there a good/recommended way to do image processing in fragment shaders and then export the results to an external JavaScript structure?
I am currently using a shader-based texture with THREE.js (WebGL 1.0) to display my data.
It contains an array of 2D textures as uniform. I use it to emulate a 3D texture.
At this point all my data is stored in the Fragment Shader and I want to run some image processing on the whole data (not just the pixels on screen), such as thresholding, then export the results of the segmentation to a proper JS object.
I want to do it in the shaders as it runs so much faster.
Render-to-texture would not help in this case (I believe) because I want to modify/update the whole 3D texture, not only what is visible on screen.
It doesn't seem that the EffectComposer from THREE.js is what I am looking for either.
Does it make sense? Am I missing something?
Is there any code/demo/literature available out there on how to do "advanced" image processing in shaders (or better yet, with a THREE.js shader texture) and then save out the results?
Best
You can render as usual into the canvas and then read the canvas contents back (for a 2D context via getImageData(); for a WebGL context via gl.readPixels()).
Then there is the method renderer.readRenderTargetPixels() (see here). I haven't used it yet, but it appears to do what you want.
So you can just render as you described (rendering to a texture won't overwrite your textures, as far as I can tell) into a framebuffer (i.e. using THREE.WebGLRenderTarget) and then use that method to retrieve the image data.
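A sketch of that readback path, assuming an existing renderer, scene, and camera (note that in older three.js releases the render target was passed as a third argument to render() instead of via setRenderTarget()):

```javascript
// Render into an offscreen framebuffer, then copy the raw RGBA bytes
// back into a plain typed array that regular JS code can process.
function readProcessedPixels(renderer, scene, camera, width, height) {
  var target = new THREE.WebGLRenderTarget(width, height);
  renderer.setRenderTarget(target);
  renderer.render(scene, camera);                  // draw into the framebuffer
  renderer.setRenderTarget(null);                  // restore the default target
  var pixels = new Uint8Array(width * height * 4); // RGBA, one byte per channel
  renderer.readRenderTargetPixels(target, 0, 0, width, height, pixels);
  return pixels;
}
```

For the 3D-texture emulation described above, you would run this once per 2D slice, rendering each layer's processing pass into the target before reading it back.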
I am building an application that dynamically loads images from a server to use as textures in the scene and I am working on how to load/unload these textures properly.
My simple question is: where, in the three.js call graph, do textures get loaded and/or uploaded to the GPU? Is it when I create a texture (var tex = new THREE.Texture()) or when I apply it to a mesh (var mesh = new THREE.Mesh(geom, mat))? The Texture class of three.js suggests that textures are not loaded when the texture is created, but I cannot find anything in Mesh either.
Am I missing something? Are textures loaded in the render loop rather than on object creation? That would probably make sense.
Thanks in advance!
All GPU instructions have been abstracted away to the WebGLRenderer.
This means the creation of any object within three.js will not interact with the GPU in the slightest until you call:
renderer.render(scene, camera);
This call will automatically set up all the relevant WebGL buffers, shaders, attributes, uniforms, textures, etc. So until that point in time, all three.js meshes with their materials and geometries are really just nicely abstracted objects, completely separated from the way they are rendered to the screen (why assume they will be rendered at all?).
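To illustrate with a hypothetical helper: everything below is plain JavaScript object construction, and the GPU is only touched by the eventual render call.

```javascript
// None of these constructors talk to WebGL — they only build JS objects.
function buildTexturedBox(image) {
  var tex = new THREE.Texture(image);  // no GPU texture exists yet
  tex.needsUpdate = true;              // merely flags it for upload on the next render
  var mat = new THREE.MeshBasicMaterial({ map: tex });
  return new THREE.Mesh(new THREE.BoxGeometry(1, 1, 1), mat);
}
// Only renderer.render(scene, camera) actually creates the WebGL buffers,
// compiles the shaders, and uploads the texture.
```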
The main reason for this is that there are other renderers, such as the CanvasRenderer, which have an entirely different API.