I have loaded an .obj (together with its .mtl) using OBJLoader2. Now, when the user clicks on one of the meshes, I want to change that mesh's geometry so that it is divided into two equal parts, each with a different material.
//this.currentobj represents the user clicked mesh.
let geometry = this.currentobj.geometry;
geometry.clearGroups();
geometry.addGroup( 0, Infinity, 0 );
geometry.addGroup( 0, Infinity, 1 );
geometry.addGroup( 0, Infinity, 2 );
geometry.addGroup( 0, Infinity, 3 );
let material0 = new THREE.MeshBasicMaterial({color: 0xff0000});
let material1 = new THREE.MeshBasicMaterial({color: 0x444444});
let material2 = new THREE.MeshBasicMaterial({color: 0x111111});
let material3 = new THREE.MeshBasicMaterial({color: 0x555555});
let materials = [ material0, material1, material2, material3 ];
let mesh = new THREE.Mesh(geometry, materials);
this.scene.add(mesh);
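(Note: as written, each addGroup call above spans the entire index range, so all four materials are drawn on top of each other. Below is a minimal sketch of a group-based split of the index range into two halves, assuming the geometry is an indexed BufferGeometry as produced by OBJLoader2; this splits by draw order, not spatially, which is why a CSG approach like the one in the answer below is usually preferred.)

// Sketch: split the clicked mesh's index range into two material groups.
// Assumes this.currentobj.geometry is an indexed THREE.BufferGeometry.
let geometry = this.currentobj.geometry;
let indexCount = geometry.index ? geometry.index.count : geometry.attributes.position.count;
let half = Math.floor(indexCount / 6) * 3; // keep the split on a triangle boundary

geometry.clearGroups();
geometry.addGroup(0, half, 0);                 // first half uses material index 0
geometry.addGroup(half, indexCount - half, 1); // second half uses material index 1

this.currentobj.material = [
  new THREE.MeshBasicMaterial({ color: 0xff0000 }),
  new THREE.MeshBasicMaterial({ color: 0x444444 }),
];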
Dividing a mesh is a solved problem in three.js. It was recently revised with a new implementation in June by Manthrax, and mrdoob asked for it to be pulled into main because the original CSG solution had issues; see this thread: https://discourse.threejs.org/t/looking-for-updated-plug-in-for-csg/6785/8
I do not know the current status of main, but Manthrax's library is available here: https://github.com/manthrax/THREE-CSGMesh with example code.
The operation returns the resulting mesh collection, and the material objects can be modified individually. My own tangentially related question, "Threecsg flat sides when expecting volumetric result", was answered by Manthrax in April; it shows two different materials on the resulting cut of a sphere and a cube.
For example:
function doCSG(a, b, op, mat) {
  // Convert both meshes to BSP trees, apply the boolean op, and convert back to a mesh.
  var bspA = CSG.fromMesh( a );
  var bspB = CSG.fromMesh( b );
  var bspC = bspA[op]( bspB );
  var result = CSG.toMesh( bspC, a.matrix );
  result.material = mat;
  result.castShadow = result.receiveShadow = true;
  return result;
}
var meshA = new THREE.Mesh(new THREE.BoxGeometry(1,1,1));
var meshB = new THREE.Mesh(new THREE.BoxGeometry(1,1,1));
meshB.position.add(new THREE.Vector3( 0.5, 0.5, 0.5 ));
var meshC = doCSG( meshA,meshB, 'subtract',meshA.material);
console.log(meshC.material); // meshC has its own material derived from meshA, but it can be a new material.
In your case you'd want to use the bounding box helper to produce a mesh that you move halfway into the object, and then use that to cut your geometry in half.
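A rough sketch of that idea, reusing the doCSG() helper above and assuming the THREE-CSGMesh API (CSG.fromMesh / CSG.toMesh); `target` stands in for the clicked mesh, and the cutter box covers its upper half:

// Sketch: cut `target` (e.g. this.currentobj) into two halves with different materials.
var box = new THREE.Box3().setFromObject(target);
var size = new THREE.Vector3();
box.getSize(size);

// The cutter is a box the size of the object, shifted up so it only covers the top half.
var cutter = new THREE.Mesh(new THREE.BoxGeometry(size.x, size.y, size.z));
box.getCenter(cutter.position);
cutter.position.y += size.y / 2;
cutter.updateMatrix();

var topHalf    = doCSG(target, cutter, 'intersect', new THREE.MeshBasicMaterial({ color: 0xff0000 }));
var bottomHalf = doCSG(target, cutter, 'subtract',  new THREE.MeshBasicMaterial({ color: 0x444444 }));
scene.add(topHalf, bottomHalf);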
I want to display a photograph in my THREE.js scene. The size of the geometry is based on the window, and the photograph, added as a texture, should cover the geometry. Basically what I want is background-size: cover from CSS, but in THREE.js, so that I can apply some more transformations to this geometry.
This is my code so far:
const txture = this.loader.load('/site/themes/pride/img/home_1.jpg');
const material = new MeshBasicMaterial({
map: txture,
transparent: true,
opacity: 1,
});
const geom = new BoxBufferGeometry( window.innerWidth, window.innerHeight, 0.0001 );
const plane = new Mesh(geom, material);
plane.overdraw = true;
plane.position.set(0,0,0);
Is this even possible, or should I approach it in a completely different way?
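It is possible. One common way to mimic background-size: cover is to keep the plane at the window size and adjust the texture's repeat and offset once the image has loaded, based on the two aspect ratios. A sketch of that idea (replacing the load call above; the math assumes the plane spans the full window):

// Sketch: scale and offset the texture so it covers the plane like CSS background-size: cover.
const planeAspect = window.innerWidth / window.innerHeight;

const txture = this.loader.load('/site/themes/pride/img/home_1.jpg', (tex) => {
  const imageAspect = tex.image.width / tex.image.height;

  if (imageAspect > planeAspect) {
    // Image is proportionally wider than the plane: crop left/right.
    tex.repeat.set(planeAspect / imageAspect, 1);
    tex.offset.set((1 - tex.repeat.x) / 2, 0);
  } else {
    // Image is proportionally taller than the plane: crop top/bottom.
    tex.repeat.set(1, imageAspect / planeAspect);
    tex.offset.set(0, (1 - tex.repeat.y) / 2);
  }
});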
I am trying to draw a least squares plane through a set of points in Three.js. I have a plane defined as follows:
var plane = new THREE.Plane();
plane.setFromNormalAndCoplanarPoint(normal, point).normalize();
My understanding is that I need to take that plane and use it to come up with a Geometry in order to create a mesh to add to the scene for display:
var dispPlane = new THREE.Mesh(planeGeometry, planeMaterial);
scene.add(dispPlane);
I've been trying to apply this answer to get the geometry. This is what I came up with:
plane.setFromNormalAndCoplanarPoint(dir, centroid).normalize();
planeGeometry.vertices.push(plane.normal);
planeGeometry.vertices.push(plane.orthoPoint(plane.normal));
planeGeometry.vertices.push(plane.orthoPoint(planeGeometry.vertices[1]));
planeGeometry.faces.push(new THREE.Face3(0, 1, 2));
planeGeometry.computeFaceNormals();
planeGeometry.computeVertexNormals();
But the plane is not displayed at all, and there are no errors to indicate where I may have gone wrong.
So my question is: how can I take my THREE.Plane object and use it as the geometry for a mesh?
This approach should create a mesh visualization of the plane. I'm not sure how applicable this would be towards the least-squares fitting however.
// Create plane
var dir = new THREE.Vector3(0,1,0);
var centroid = new THREE.Vector3(0,200,0);
var plane = new THREE.Plane();
plane.setFromNormalAndCoplanarPoint(dir, centroid).normalize();
// Create a basic rectangle geometry
var planeGeometry = new THREE.PlaneGeometry(100, 100);
// Align the geometry to the plane
var coplanarPoint = plane.coplanarPoint();
var focalPoint = new THREE.Vector3().copy(coplanarPoint).add(plane.normal);
planeGeometry.lookAt(focalPoint);
planeGeometry.translate(coplanarPoint.x, coplanarPoint.y, coplanarPoint.z);
// Create mesh with the geometry
var planeMaterial = new THREE.MeshLambertMaterial({color: 0xffff00, side: THREE.DoubleSide});
var dispPlane = new THREE.Mesh(planeGeometry, planeMaterial);
scene.add(dispPlane);
var material = ...;
var plane = new THREE.Plane(...);
// Align to plane
var geometry = new THREE.PlaneGeometry(100, 100);
var mesh = new THREE.Mesh(geometry, material);
mesh.position.copy(plane.coplanarPoint()); // place the mesh on the plane
mesh.quaternion.setFromUnitVectors(new THREE.Vector3(0,0,1), plane.normal);
Note that Plane.coplanarPoint() simply returns -normal*constant, so it might be a better option to use Plane.projectPoint() to determine a center that is "close to" an arbitrary point.
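For example (sketch; referencePoint is whatever point the visualization should sit near):

// Sketch: center the plane mesh on the projection of an arbitrary point onto the plane.
var referencePoint = new THREE.Vector3(10, 20, 30); // any point of interest
var center = plane.projectPoint(referencePoint, new THREE.Vector3());
mesh.position.copy(center);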
I have output data like this:
geom[0] = {
texturesindexT: new Int16Array([0,1,2,3]),
texturesindexS: new Int16Array([-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,...]),
materialsindexT: new Int16Array([-1,-1,-1,-1]),
materialsindexS: new Int16Array([-1,0,1,2,3,4,5,0,6,2,7,8,-1,0,...]),
startIndicesT: new Uint32Array([0,288,606,897,1380]),
startIndicesS: new Uint32Array([1380,1431,1479,1485,1497,1515,1659,...]),
m_indices: new Uint16Array([0,1,2,3,0,2,4,2,5,4,6,2,7,3,2,8,9,10,...]),
m_vertices: new Float32Array([-81.93996,25.7185,-85.53822,-81.93996,...]),
m_normals: new Float32Array([-0.004215205,0.9999894,-0.001817489,-0.004215205,...]),
m_texCoords: new Float32Array([0,0.04391319,0,0.2671326,0.009521127,0.03514284,...]),
}
var textures = new Array("-1_-1/t0.jpg","-1_-1/t1.jpg","-1_-1/t2.jpg",...);
The data is ordered for an index, vertex and normal buffer, but sections have to be rendered with different textures and materials.
I have tried to make a THREE.Geometry out of the indices, vertices and texCoords/UV coords, but that didn't work.
Now I am trying to use a THREE.BufferGeometry(), and this works, BUT I need to render indices 0 to 287 with textures[0], indices 288 to 605 with textures[1], and so on.
My first attempt was to make a BufferGeometry for each part (e.g. indices 288 to 605), but since the indices are ordered for the whole model, I have to put the complete vertices, normals and UV coords into the buffer for just a couple of faces.
Is there a way to render sections of the BufferGeometry with different textures, or to set the texture index for each face?
Or is it possible to create a material that renders the first X faces with texture A and the rest with texture B?
If you want to use two different textures with a single BufferGeometry, you can use this pattern, which sets drawcalls:
var geometry1 = new THREE.BufferGeometry();
// ...and set the data...
var geometry2 = geometry1.clone();
// set drawcalls
geometry1.offsets = geometry1.drawcalls = []; // currently required
geometry1.addDrawCall( start1, count1, 0 );
geometry2.offsets = geometry2.drawcalls = []; // currently required
geometry2.addDrawCall( start2, count2, 0 );
var material1 = new THREE.MeshPhongMaterial( { map: map1 } );
var material2 = new THREE.MeshPhongMaterial( { map: map2 } );
var mesh1 = new THREE.Mesh( geometry1, material1 );
var mesh2 = new THREE.Mesh( geometry2, material2 );
three.js r.70
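Note that the offsets/drawcalls/addDrawCall API shown above was removed in later releases. In current three.js the same effect can be achieved with a single geometry, BufferGeometry.addGroup() and an array of materials on one mesh, roughly like this (sketch using the first two startIndicesT ranges from your data):

// Sketch for recent three.js versions: one geometry, one mesh, multiple materials.
var geometry = new THREE.BufferGeometry();
// ...set the index, position, normal and uv attributes as before...

// One group per texture range: addGroup(start, count, materialIndex)
geometry.addGroup(0, 288, 0);   // indices 0..287 drawn with materials[0]
geometry.addGroup(288, 318, 1); // indices 288..605 drawn with materials[1]
// ...and so on for the remaining ranges...

var materials = [
  new THREE.MeshPhongMaterial({ map: map1 }),
  new THREE.MeshPhongMaterial({ map: map2 }),
];
var mesh = new THREE.Mesh(geometry, materials);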
You can create two geometries with the same vertex buffers and different indices:
var position = new THREE.BufferAttribute(positionArray, 3);
var normal = new THREE.BufferAttribute(normalArray, 3);
var uv = new THREE.BufferAttribute(uvArray, 2);
var indices1 = new THREE.BufferAttribute(indexArray1, 1);
var geometry1 = new THREE.BufferGeometry();
geometry1.addAttribute('position', position);
geometry1.addAttribute('normal', normal);
geometry1.addAttribute('uv', uv);
geometry1.addAttribute('index', indices1);
var indices2 = new THREE.BufferAttribute(indexArray2, 1);
var geometry2 = new THREE.BufferGeometry();
geometry2.addAttribute('position', position);
geometry2.addAttribute('normal', normal);
geometry2.addAttribute('uv', uv);
geometry2.addAttribute('index', indices2);
and then create two meshes with different materials as you normally would. As far as I understand, this will re-use the same data in both meshes.
var arcShape = new THREE.Shape();
arcShape.moveTo( 50, 10 );
arcShape.absarc( 10, 10, 40, 0, Math.PI*2, false );
var map1 = THREE.ImageUtils.loadTexture( 'moon.jpg' );
var geometry = new THREE.ExtrudeGeometry( arcShape, extrudeSettings );
var new3D = new THREE.Mesh( geometry, new THREE.MeshBasicMaterial( { map: map1 } ) );
new3D.receiveShadow = true;
obj3Dmassive.add( new3D );
Texture (512x512): http://f3.s.qip.ru/cMfvUhNj.png
Result: http://f3.s.qip.ru/cMfvUhNh.png
How can I get the texture to fill the extruded shape properly?
If you just have a straight extrusion path, you can apply textures to extruded shapes without the need for a custom UV generator. e.g.
var extrudeSettings = {
  bevelEnabled: false,
  steps: 1,
  amount: 20,          // extrusion depth; don't define an extrudePath
  material: 0,         // material index of the front and back faces
  extrudeMaterial: 1   // material index of the side faces
};
var geometry = shape.extrude(extrudeSettings);
var mesh = new THREE.Mesh(geometry,
  new THREE.MeshFaceMaterial([materialFace, materialSide]));
This is handy for cookie-cutter type shapes.
EDIT: This answer is outdated. See "Extruding multiple polygons with multiple holes and texturing the combined shape" instead.
You are lucky. What you are trying to do has been done in the following example:
https://github.com/mrdoob/three.js/blob/master/examples/webgl_geometry_extrude_uvs2.html
You have to specify your own UV generator function. This example shows you how to do that.
Remember, this is just an example. It may not be correct -- or easy to implement in your case.
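For orientation, a UV generator is just an object with generateTopUV and generateSideWallUV methods, passed in via the UVGenerator extrude option. A sketch of the interface shape in recent three.js versions (the UV values here are placeholders, not real mapping math; older releases used different method signatures):

// Sketch of the UVGenerator interface expected by ExtrudeGeometry (recent versions).
var uvGenerator = {
  // Called for each front/back face triangle; must return 3 THREE.Vector2 UVs.
  generateTopUV: function (geometry, vertices, indexA, indexB, indexC) {
    return [
      new THREE.Vector2(vertices[indexA * 3], vertices[indexA * 3 + 1]),
      new THREE.Vector2(vertices[indexB * 3], vertices[indexB * 3 + 1]),
      new THREE.Vector2(vertices[indexC * 3], vertices[indexC * 3 + 1]),
    ];
  },
  // Called for each side-wall quad; must return 4 THREE.Vector2 UVs.
  generateSideWallUV: function (geometry, vertices, indexA, indexB, indexC, indexD) {
    return [
      new THREE.Vector2(0, 0),
      new THREE.Vector2(1, 0),
      new THREE.Vector2(1, 1),
      new THREE.Vector2(0, 1),
    ];
  }
};

// Used as: new THREE.ExtrudeGeometry(arcShape, { ...extrudeSettings, UVGenerator: uvGenerator })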