I have output data like this:
geom[0] = {
texturesindexT: new Int16Array([0,1,2,3]),
texturesindexS: new Int16Array([-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,...]),
materialsindexT: new Int16Array([-1,-1,-1,-1]),
materialsindexS: new Int16Array([-1,0,1,2,3,4,5,0,6,2,7,8,-1,0,...]),
startIndicesT: new Uint32Array([0,288,606,897,1380]),
startIndicesS: new Uint32Array([1380,1431,1479,1485,1497,1515,1659,...]),
m_indices: new Uint16Array([0,1,2,3,0,2,4,2,5,4,6,2,7,3,2,8,9,10,...]),
m_vertices: new Float32Array([-81.93996,25.7185,-85.53822,-81.93996,...]),
m_normals: new Float32Array([-0.004215205,0.9999894,-0.001817489,-0.004215205,...]),
m_texCoords: new Float32Array([0,0.04391319,0,0.2671326,0.009521127,0.03514284,...]),
}
var textures = new Array("-1_-1/t0.jpg","-1_-1/t1.jpg","-1_-1/t2.jpg",...);
The data is laid out for an index, vertex and normal buffer, but sections of it have to be rendered with different textures and materials.
I have tried to build a THREE.Geometry out of the indices, vertices and texCoords/UV coords, but that didn't work.
Now I am trying to use a THREE.BufferGeometry(), and this works, BUT I need to render indices 0 to 287 with textures[0], indices 288 to 605 with textures[1], and so on.
My first attempt was to make a separate BufferGeometry for each part (e.g. indices 288 to 605), but since the indices refer to the whole model, I have to put the complete vertices, normals and UV coords into the buffer for just a couple of faces.
Is there a way to render sections of the BufferGeometry with different textures, or to set the texture index for each face?
Or is it possible to create a material that renders the first X faces with texture A and the next ones with texture B?
If you want to use two different textures with a single BufferGeometry, you can use this pattern, which sets drawcalls:
var geometry1 = new THREE.BufferGeometry();
// ...and set the data...
var geometry2 = geometry1.clone();
// set drawcalls
geometry1.offsets = geometry1.drawcalls = []; // currently required
geometry1.addDrawCall( start1, count1, 0 );
geometry2.offsets = geometry2.drawcalls = []; // currently required
geometry2.addDrawCall( start2, count2, 0 );
var material1 = new THREE.MeshPhongMaterial( { map: map1 } );
var material2 = new THREE.MeshPhongMaterial( { map: map2 } );
var mesh1 = new THREE.Mesh( geometry1, material1 );
var mesh2 = new THREE.Mesh( geometry2, material2 );
three.js r.70
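In more recent three.js releases the drawcall API was replaced by geometry groups, which let a single mesh use an array of materials. A hedged sketch of the same idea, with the group boundaries taken from the first startIndicesT values in the question (map1 and map2 are assumed to be textures loaded elsewhere):
// Sketch: one BufferGeometry split into groups, one material per texture
var geometry = new THREE.BufferGeometry();
// ...set the position/normal/uv attributes and the index as above...
// addGroup( start, count, materialIndex ) — start and count are in indices
geometry.addGroup( 0, 288, 0 );   // indices 0..287   -> materials[ 0 ]
geometry.addGroup( 288, 318, 1 ); // indices 288..605 -> materials[ 1 ]
var materials = [
    new THREE.MeshPhongMaterial( { map: map1 } ),
    new THREE.MeshPhongMaterial( { map: map2 } )
];
var mesh = new THREE.Mesh( geometry, materials );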
You can create two geometries that share the same vertex buffers but use different indices:
var position = new THREE.BufferAttribute(positionArray, 3);
var normal = new THREE.BufferAttribute(normalArray, 3);
var uv = new THREE.BufferAttribute(uvArray, 2);
var indices1 = new THREE.BufferAttribute(indexArray1, 1);
var geometry1 = new THREE.BufferGeometry();
geometry1.addAttribute('position', position);
geometry1.addAttribute('normal', normal);
geometry1.addAttribute('uv', uv);
geometry1.addAttribute('index', indices1);
var indices2 = new THREE.BufferAttribute(indexArray2, 1);
var geometry2 = new THREE.BufferGeometry();
geometry2.addAttribute('position', position);
geometry2.addAttribute('normal', normal);
geometry2.addAttribute('uv', uv);
geometry2.addAttribute('index', indices2);
and then create two meshes with different materials as you normally would. As far as I understand, this will reuse the same data in both meshes.
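For example, a minimal sketch of that last step (map1 and map2 are placeholder textures assumed to be loaded elsewhere):
var material1 = new THREE.MeshPhongMaterial( { map: map1 } );
var material2 = new THREE.MeshPhongMaterial( { map: map2 } );
var mesh1 = new THREE.Mesh( geometry1, material1 );
var mesh2 = new THREE.Mesh( geometry2, material2 );
scene.add( mesh1, mesh2 );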
Related
I have loaded an .obj using OBJLoader2, along with its .mtl. Now, when the user clicks on one of the meshes, I want to change the mesh geometry so that it divides into two equal parts, each with a different material.
// this.currentobj is the mesh the user clicked on.
let geometry = this.currentobj.geometry;
geometry.clearGroups();
geometry.addGroup( 0, Infinity, 0 );
geometry.addGroup( 0, Infinity, 1 );
geometry.addGroup( 0, Infinity, 2 );
geometry.addGroup( 0, Infinity, 3 );
let material0 = new THREE.MeshBasicMaterial({color: 0xff0000});
let material1 = new THREE.MeshBasicMaterial({color: 0x444444});
let material2 = new THREE.MeshBasicMaterial({color: 0x111111});
let material3 = new THREE.MeshBasicMaterial({color: 0x555555});
var materials = [ material0, material1, material2, material3 ];
let mesh = new THREE.Mesh(geometry, materials);
this.scene.add(mesh);
Dividing a mesh is a solved problem in three.js. It was recently revised with a new implementation in June by Manthrax, and mrdoob requested it be pulled into main because the original CSG solution had issues; see this thread: https://discourse.threejs.org/t/looking-for-updated-plug-in-for-csg/6785/8
I do not know the current status of main, but Manthrax's library, with example code, is available here: https://github.com/manthrax/THREE-CSGMesh
The operation returns the resulting mesh collection, and the material objects can be modified individually. My own tangentially related question, "Threecsg flat sides when expecting volumetric result", was answered by Manthrax in April. It shows two different materials on the resulting cut of a sphere and a cube.
For example:
function doCSG(a,b,op,mat){
var bspA = CSG.fromMesh( a );
var bspB = CSG.fromMesh( b );
var bspC = bspA[op]( bspB );
var result = CSG.toMesh( bspC, a.matrix );
result.material = mat;
result.castShadow = result.receiveShadow = true;
return result;
}
var meshA = new THREE.Mesh(new THREE.BoxGeometry(1,1,1));
var meshB = new THREE.Mesh(new THREE.BoxGeometry(1,1,1));
meshB.position.add(new THREE.Vector3( 0.5, 0.5, 0.5 ));
var meshC = doCSG( meshA,meshB, 'subtract',meshA.material);
console.log(meshC.material); // meshC has its own material derived from meshA, but it can be a new material.
In your case you'd want to use the bounding box helper to produce a cutter mesh that you move halfway into the object, and then use that to cut your geometry in half.
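A hedged sketch of that idea, reusing the doCSG helper above (the cutter size, offset, and colors are illustrative, not taken from the original answer):
// cut this.currentobj in half with a box-shaped cutter
var target = this.currentobj;
var bbox = new THREE.Box3().setFromObject( target );
var size = bbox.getSize( new THREE.Vector3() );
var center = bbox.getCenter( new THREE.Vector3() );
// a box covering the whole object, shifted halfway along x so it overlaps only one half
var cutter = new THREE.Mesh( new THREE.BoxGeometry( size.x, size.y, size.z ) );
cutter.position.copy( center );
cutter.position.x += size.x / 2;
cutter.updateMatrix();
var rightHalf = doCSG( target, cutter, 'intersect', new THREE.MeshBasicMaterial( { color: 0xff0000 } ) );
var leftHalf  = doCSG( target, cutter, 'subtract',  new THREE.MeshBasicMaterial( { color: 0x444444 } ) );
this.scene.add( rightHalf, leftHalf );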
I am trying to draw a least squares plane through a set of points in Three.js. I have a plane defined as follows:
var plane = new THREE.Plane();
plane.setFromNormalAndCoplanarPoint(normal, point).normalize();
My understanding is that I need to take that plane and use it to come up with a Geometry in order to create a mesh to add to the scene for display:
var dispPlane = new THREE.Mesh(planeGeometry, planeMaterial);
scene.add(dispPlane);
I've been trying to apply this answer to get the geometry. This is what I came up with:
plane.setFromNormalAndCoplanarPoint(dir, centroid).normalize();
planeGeometry.vertices.push(plane.normal);
planeGeometry.vertices.push(plane.orthoPoint(plane.normal));
planeGeometry.vertices.push(plane.orthoPoint(planeGeometry.vertices[1]));
planeGeometry.faces.push(new THREE.Face3(0, 1, 2));
planeGeometry.computeFaceNormals();
planeGeometry.computeVertexNormals();
But the plane is not displayed at all, and there are no errors to indicate where I may have gone wrong.
So my question is, how can I take my THREE.Plane object and use it as a geometry for a mesh?
This approach should create a mesh visualization of the plane. I'm not sure how applicable it is to the least-squares fitting, however.
// Create plane
var dir = new THREE.Vector3(0,1,0);
var centroid = new THREE.Vector3(0,200,0);
var plane = new THREE.Plane();
plane.setFromNormalAndCoplanarPoint(dir, centroid).normalize();
// Create a basic rectangle geometry
var planeGeometry = new THREE.PlaneGeometry(100, 100);
// Align the geometry to the plane
var coplanarPoint = plane.coplanarPoint();
var focalPoint = new THREE.Vector3().copy(coplanarPoint).add(plane.normal);
planeGeometry.lookAt(focalPoint);
planeGeometry.translate(coplanarPoint.x, coplanarPoint.y, coplanarPoint.z);
// Create mesh with the geometry
var planeMaterial = new THREE.MeshLambertMaterial({color: 0xffff00, side: THREE.DoubleSide});
var dispPlane = new THREE.Mesh(planeGeometry, planeMaterial);
scene.add(dispPlane);
var material = ...;
var plane = new THREE.Plane(...);
// Align to plane
var geometry = new THREE.PlaneGeometry(100, 100);
var mesh = new THREE.Mesh(geometry, material);
mesh.position.copy( plane.coplanarPoint() );
mesh.quaternion.setFromUnitVectors(new THREE.Vector3(0,0,1), plane.normal);
Note that Plane.coplanarPoint() simply returns -normal*constant, so it might be a better option to use Plane.projectPoint() to determine a center that is "close to" an arbitrary point.
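For example, a hedged sketch that centers the helper plane near the centroid of the fitted points instead of at -normal*constant (centroid comes from your own fit, as in the question; newer three.js versions expect an explicit target vector for projectPoint):
// closest point on the plane to the centroid of the data
var center = plane.projectPoint( centroid, new THREE.Vector3() );
var geometry = new THREE.PlaneGeometry( 100, 100 );
var mesh = new THREE.Mesh( geometry, material );
mesh.position.copy( center );
mesh.quaternion.setFromUnitVectors( new THREE.Vector3( 0, 0, 1 ), plane.normal );
scene.add( mesh );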
In the following I'm loading an image map onto a custom geometry; it represents the brown colored geometry in the picture above:
var aqua_ground_geo = new THREE.Geometry();
var top0 = new THREE.Vector3(aqua_ground_geo_x_NEG, user_data['aqua_soil_calc_b_y'], aqua_ground_geo_z_NEG);
var top1 = new THREE.Vector3(aqua_ground_geo_x_POS, user_data['aqua_soil_calc_b_y'], aqua_ground_geo_z_NEG);
var top2 = new THREE.Vector3(aqua_ground_geo_x_NEG, user_data['aqua_soil_calc_f_y'], aqua_ground_geo_z_POS);
aqua_ground_geo.vertices.push(top0);
aqua_ground_geo.vertices.push(top1);
aqua_ground_geo.vertices.push(top2);
aqua_ground_geo.faces.push( new THREE.Face3(0,1,2) );
aqua_ground_geo.computeFaceNormals();
aqua_ground_geo.computeVertexNormals();
var textureUrl = "http://www.lifeguider.de/wp-content/uploads/aquag/bodengrund/dennerle_kies_naturweiss_1-2mm.jpg";
var aqua_bodengrund_tex = new THREE.TextureLoader().load( textureUrl );
var aqua_bodengrund_mat = new THREE.MeshLambertMaterial( {
map: aqua_bodengrund_tex,
color: 0xffffff,
} );
aqua_bodengrund_mat.shading = THREE.FlatShading;
aqua_bodengrund_mat.side = THREE.DoubleSide;
var aqua_bodengrund = new THREE.Mesh( aqua_ground_geo,aqua_bodengrund_mat);
On a simple THREE.BoxGeometry all works as expected with the same material (it represents the cube in the picture above):
var lala = new THREE.BoxGeometry( 100, 100, 100 );
var lala2 = new THREE.Mesh( lala,aqua_bodengrund_mat);
I'm not an expert in 3D; what is missing in my code so that the image texture is shown correctly?
You need to apply the texture in the callback of the THREE.TextureLoader. Check also the documentation here; the second argument (onLoad) is the callback.
var textureUrl = "https://raw.githubusercontent.com/mrdoob/three.js/master/examples/textures/crate.gif";
var aqua_bodengrund_mat = new THREE.MeshLambertMaterial( {
color: 0xffffff
});
var onLoad = function( texture ){
aqua_bodengrund_mat.map = texture;
aqua_bodengrund_mat.needsUpdate = true;
}
var loader = new THREE.TextureLoader();
loader.load( textureUrl, onLoad );
See this fiddle for a demo.
UPDATE
In case you have a custom geometry, you also need to calculate the UVs so that the texture shows correctly. I used this answer here to calculate them in another fiddle here.
Note: the UVs in my fiddle are calculated for faces in the XY plane; if your faces lie in another plane you will have to adjust accordingly...
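A hedged sketch of that kind of planar UV mapping for the old THREE.Geometry/Face3 API used above (an illustration, not the code from the linked fiddle; it projects onto the XY plane, matching the note about face orientation):
// generate planar UVs by projecting the vertices onto the XY plane
function assignPlanarUVs( geometry ) {
    geometry.computeBoundingBox();
    var min = geometry.boundingBox.min;
    var max = geometry.boundingBox.max;
    var rangeX = max.x - min.x;
    var rangeY = max.y - min.y;
    geometry.faceVertexUvs[ 0 ] = [];
    geometry.faces.forEach( function ( face ) {
        var vA = geometry.vertices[ face.a ];
        var vB = geometry.vertices[ face.b ];
        var vC = geometry.vertices[ face.c ];
        geometry.faceVertexUvs[ 0 ].push( [
            new THREE.Vector2( ( vA.x - min.x ) / rangeX, ( vA.y - min.y ) / rangeY ),
            new THREE.Vector2( ( vB.x - min.x ) / rangeX, ( vB.y - min.y ) / rangeY ),
            new THREE.Vector2( ( vC.x - min.x ) / rangeX, ( vC.y - min.y ) / rangeY )
        ] );
    } );
    geometry.uvsNeedUpdate = true;
}
assignPlanarUVs( aqua_ground_geo );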
I am trying to map lat/long data to a sphere. I am able to get vectors with different positions and set the position of the cube mesh to those. After I merge and display, it appears that there is only one cube, so I am assuming that all the cubes end up in the same position. I'm wondering where I am going wrong here. (latLongToSphere returns a vector.)
// simple function that converts the data to the markers on screen
function renderData() {
// the geometry that will contain all the cubes
var geom = new THREE.Geometry();
// add non reflective material to cube
var cubeMat = new THREE.MeshLambertMaterial({color: 0xffffff,opacity:0.6, emissive:0xffffff});
for (var i = quakes.length - 1; i >= 0; i--) {
var objectCache = quakes[i]["geometry"]["coordinates"];
// calculate the position where we need to start the cube
var position = latLongToSphere(objectCache[0], objectCache[1], 600);
// create the cube
var cubeGeom = new THREE.BoxGeometry(2,2,2000,1,1,1),
cube = new THREE.Mesh(cubeGeom, cubeMat);
// position the cube correctly
cube.position.set(position.x, position.y, position.z);
cube.lookAt( new THREE.Vector3(0,0,0) );
// merge with main model
geom.merge(cube.geometry, cube.matrix);
}
// create a new mesh, containing all the other meshes.
var combined = new THREE.Mesh(geom, cubeMat);
// and add the total mesh to the scene
scene.add(combined);
}
You have to update the mesh matrix before merging its geometry:
cube.updateMatrix();
geom.merge(cube.geometry, cube.matrix);
jsfiddle: http://jsfiddle.net/L0rdzbej/222/
I don't know what I'm doing wrong. I have multiple meshes that I am trying to merge into one mesh so that I can save on draw calls.
Each of my meshes has a unique material. In this example it's just a different color, but eventually they will each have a unique texture mapped.
This is my code:
materials = [];
blocks = [];
var tempMat;
var tempCube;
var tempGeo;
var tempvec;
// block 1
tempMat = new THREE.MeshLambertMaterial({ color: 0x0000ff });
materials.push( tempMat );
tempGeo = new THREE.CubeGeometry(1, 1, 1);
for (var ix=0; ix<tempGeo.faces.length; ix++) {
tempGeo.faces[ix].materialIndex = 0;
}
tempCube = new THREE.Mesh( tempGeo, tempMat );
tempCube.position.set(0, 3, -6);
blocks.push( tempCube );
// block 2
tempMat = new THREE.MeshLambertMaterial({ color: 0x00ff00 });
materials.push( tempMat );
tempGeo = new THREE.CubeGeometry(1, 1, 1);
for (var ix=0; ix<tempGeo.faces.length; ix++) {
tempGeo.faces[ix].materialIndex = 1;
}
tempCube = new THREE.Mesh( tempGeo, tempMat );
tempCube.position.set(1, 3, -6);
blocks.push( tempCube );
// Merging them all into one
var geo = new THREE.Geometry();
for (var i=0; i<blocks.length; i++) {
blocks[i].updateMatrix();
geo.merge(blocks[i].geometry, blocks[i].matrix, i);
}
var newmesh = new THREE.Mesh( geo, new THREE.MeshFaceMaterial( materials ) );
scene.add(newmesh);
Basically, that gives me an error that says:
Uncaught TypeError: Cannot read property 'visible' of undefined
every time my render function is called.
Where did I go wrong?
You are merging geometries into one, and using MeshFaceMaterial (renamed MultiMaterial in r.72).
It does not make any sense to merge geometries having different material indices.
WebGLRenderer needs to segment the geometry by material to render it.
As a rule-of-thumb, only merge geometries if they will be rendered with a single material.
three.js r.72
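For example, a minimal sketch of that rule-of-thumb using the r.72-era Geometry API and the blocks array from the question, merging both cubes under one shared material so no material indices are involved:
// merge geometries that share a single material
var sharedMat = new THREE.MeshLambertMaterial( { color: 0x0000ff } );
var merged = new THREE.Geometry();
for ( var i = 0; i < blocks.length; i++ ) {
    blocks[ i ].updateMatrix();
    merged.merge( blocks[ i ].geometry, blocks[ i ].matrix ); // no materialIndexOffset
}
scene.add( new THREE.Mesh( merged, sharedMat ) );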