I am trying to draw a least squares plane through a set of points in Three.js. I have a plane defined as follows:
var plane = new THREE.Plane();
plane.setFromNormalAndCoplanarPoint(normal, point).normalize();
My understanding is that I need to take that plane and use it to come up with a Geometry in order to create a mesh to add to the scene for display:
var dispPlane = new THREE.Mesh(planeGeometry, planeMaterial);
scene.add(dispPlane);
I've been trying to apply this answer to get the geometry. This is what I came up with:
plane.setFromNormalAndCoplanarPoint(dir, centroid).normalize();
planeGeometry.vertices.push(plane.normal);
planeGeometry.vertices.push(plane.orthoPoint(plane.normal));
planeGeometry.vertices.push(plane.orthoPoint(planeGeometry.vertices[1]));
planeGeometry.faces.push(new THREE.Face3(0, 1, 2));
planeGeometry.computeFaceNormals();
planeGeometry.computeVertexNormals();
But the plane is not displayed at all, and there are no errors to indicate where I may have gone wrong.
So my question is: how can I take my THREE.Plane object and use it as the geometry for a mesh?
This approach should create a mesh visualization of the plane. I'm not sure how applicable it is to the least-squares fitting itself, however.
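For the fit itself, one simple option is to solve z = ax + by + c by ordinary least squares and read the normal and centroid off the result. A minimal sketch, assuming points is an array of THREE.Vector3 and the best-fit plane is not near-vertical:
// least-squares fit of z = a*x + b*y + c; such a fit always passes through the centroid
function fitPlane( points ) {
    var n = points.length;
    var sx = 0, sy = 0, sz = 0, sxx = 0, sxy = 0, syy = 0, sxz = 0, syz = 0;
    points.forEach( function ( p ) {
        sx += p.x; sy += p.y; sz += p.z;
        sxx += p.x * p.x; sxy += p.x * p.y; syy += p.y * p.y;
        sxz += p.x * p.z; syz += p.y * p.z;
    } );
    // Cramer's rule on the 3x3 normal equations; only a and b are needed
    var det = sxx * ( syy * n - sy * sy ) - sxy * ( sxy * n - sy * sx ) + sx * ( sxy * sy - syy * sx );
    var a = ( sxz * ( syy * n - sy * sy ) - sxy * ( syz * n - sy * sz ) + sx * ( syz * sy - syy * sz ) ) / det;
    var b = ( sxx * ( syz * n - sz * sy ) - sxz * ( sxy * n - sy * sx ) + sx * ( sxy * sz - syz * sx ) ) / det;
    // z = a*x + b*y + c rearranges to a*x + b*y - z + c = 0, so the normal is (a, b, -1)
    return {
        dir: new THREE.Vector3( a, b, -1 ).normalize(),
        centroid: new THREE.Vector3( sx / n, sy / n, sz / n )
    };
}
The returned dir and centroid are what the code below feeds to setFromNormalAndCoplanarPoint().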
// Create plane
var dir = new THREE.Vector3(0,1,0);
var centroid = new THREE.Vector3(0,200,0);
var plane = new THREE.Plane();
plane.setFromNormalAndCoplanarPoint(dir, centroid).normalize();
// Create a basic rectangle geometry
var planeGeometry = new THREE.PlaneGeometry(100, 100);
// Align the geometry to the plane
var coplanarPoint = plane.coplanarPoint();
var focalPoint = new THREE.Vector3().copy(coplanarPoint).add(plane.normal);
planeGeometry.lookAt(focalPoint);
planeGeometry.translate(coplanarPoint.x, coplanarPoint.y, coplanarPoint.z);
// Create mesh with the geometry
var planeMaterial = new THREE.MeshLambertMaterial({color: 0xffff00, side: THREE.DoubleSide});
var dispPlane = new THREE.Mesh(planeGeometry, planeMaterial);
scene.add(dispPlane);
var material = ...;
var plane = new THREE.Plane(...);
// Align to plane
var geometry = new THREE.PlaneGeometry(100, 100);
var mesh = new THREE.Mesh(geometry, material);
mesh.position.copy( plane.coplanarPoint() ); // Object3D has no translate( vector ) method, so set the position directly
mesh.quaternion.setFromUnitVectors(new THREE.Vector3(0,0,1), plane.normal);
Note that Plane.coplanarPoint() simply returns -normal*constant, so it might be a better option to use Plane.projectPoint() to determine a center that is "close to" an arbitrary point.
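For example (a sketch; the reference point is a hypothetical stand-in for a point near your data):
// project an arbitrary reference point onto the plane and center the mesh there
var reference = new THREE.Vector3( 10, 20, 30 ); // hypothetical
var center = plane.projectPoint( reference ); // newer releases require a target vector argument
mesh.position.copy( center );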
I am trying to add a texture to my central sphere (the earth), but when I try, the object disappears. Can I have some guidance on where I am going wrong? Thanks.
Here is the link to my JS Bin: http://jsbin.com/cabape/edit?html,output . I am going to get the moon to rotate around the earth.
//earth
var loader = new THREE.TextureLoader();
loader.load( 'http://simpletruthsforhealthyliving.com/js/earth.jpg', function ( texture ) {
    var center = new THREE.SphereGeometry( 20, 20, 20 );
    var materialShereCenter = new THREE.MeshPhongMaterial( { ambient: 0xee0011, color: 0xff0000, specular: 0xee0000, shininess: 70, wireframe: false, map: texture } );
    centralSphere = new THREE.Mesh( center, materialShereCenter );
    centralSphere.position.z = 0;
    centralSphere.position.x = 0;
    centralSphere.position.y = 0;
    scene.add( centralSphere );
});
This needed to be added as well:
var texture = new THREE.Texture();
var loader = new THREE.ImageLoader();
loader.addEventListener( 'load', function ( event ) {
    texture.image = event.content;
    texture.needsUpdate = true;
} );
loader.load( 'http://simpletruthsforhealthyliving.com/js/earth.jpg' );
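In current three.js releases, TextureLoader.load() also returns the texture immediately, so the material can be created synchronously; a minimal sketch:
// the returned texture is usable right away and updates itself once the image arrives
var texture = new THREE.TextureLoader().load( 'http://simpletruthsforhealthyliving.com/js/earth.jpg' );
var material = new THREE.MeshPhongMaterial( { color: 0xff0000, map: texture } );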
I have output data like this:
geom[0] = {
    texturesindexT: new Int16Array([0,1,2,3]),
    texturesindexS: new Int16Array([-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,...]),
    materialsindexT: new Int16Array([-1,-1,-1,-1]),
    materialsindexS: new Int16Array([-1,0,1,2,3,4,5,0,6,2,7,8,-1,0,...]),
    startIndicesT: new Uint32Array([0,288,606,897,1380]),
    startIndicesS: new Uint32Array([1380,1431,1479,1485,1497,1515,1659,...]),
    m_indices: new Uint16Array([0,1,2,3,0,2,4,2,5,4,6,2,7,3,2,8,9,10,...]),
    m_vertices: new Float32Array([-81.93996,25.7185,-85.53822,-81.93996,...]),
    m_normals: new Float32Array([-0.004215205,0.9999894,-0.001817489,-0.004215205,...]),
    m_texCoords: new Float32Array([0,0.04391319,0,0.2671326,0.009521127,0.03514284,...]),
}
var textures = new Array("-1_-1/t0.jpg","-1_-1/t1.jpg","-1_-1/t2.jpg",...);
The data is in order for an index, vertex and normal buffer, but sections have to be rendered with different textures and materials.
I have tried to make a THREE.Geometry out of the indices, vertices and texCoords/UV coords, but that didn't work.
Now I am trying to use a THREE.BufferGeometry(), and this works, BUT I need to render indices 0 to 287 with textures[0], indices 288 to 605 with textures[1], and so on.
My first attempt was to make a separate BufferGeometry for each part (e.g. indices 288 to 605), but since the indices are in order for the whole model, I had to put the complete vertices, normals and UV coords in the buffer for just a couple of faces.
Is there a way to render sections of a BufferGeometry with different textures, or to set the texture index for each face?
Or is it possible to create a material that renders the first X faces with texture A and the rest with texture B?
If you want to use two different textures with a single BufferGeometry, you can use this pattern, which sets drawcalls:
var geometry1 = new THREE.BufferGeometry();
// ...and set the data...
var geometry2 = geometry1.clone();
// set drawcalls
geometry1.offsets = geometry1.drawcalls = []; // currently required
geometry1.addDrawCall( start1, count1, 0 );
geometry2.offsets = geometry2.drawcalls = []; // currently required
geometry2.addDrawCall( start2, count2, 0 );
var material1 = new THREE.MeshPhongMaterial( { map: map1 } );
var material2 = new THREE.MeshPhongMaterial( { map: map2 } );
var mesh1 = new THREE.Mesh( geometry1, material1 );
var mesh2 = new THREE.Mesh( geometry2, material2 );
three.js r.70
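In later releases (r72 and up) drawcalls were replaced by geometry groups, and a single mesh accepts an array of materials; a sketch of the equivalent, using the start/count values from the question:
var geometry = new THREE.BufferGeometry();
// ...set the index and the attributes...
geometry.addGroup( 0, 288, 0 ); // indices 0..287 rendered with the first material
geometry.addGroup( 288, 318, 1 ); // indices 288..605 rendered with the second material
var mesh = new THREE.Mesh( geometry, [ material1, material2 ] );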
You can create two geometries that share the same vertex buffers but use different index buffers:
var position = new THREE.BufferAttribute(positionArray, 3);
var normal = new THREE.BufferAttribute(normalArray, 3);
var uv = new THREE.BufferAttribute(uvArray, 2);
var indices1 = new THREE.BufferAttribute(indexArray1, 1);
var geometry1 = new THREE.BufferGeometry();
geometry1.addAttribute('position', position);
geometry1.addAttribute('normal', normal);
geometry1.addAttribute('uv', uv);
geometry1.addAttribute('index', indices1);
var indices2 = new THREE.BufferAttribute(indexArray2, 1);
var geometry2 = new THREE.BufferGeometry();
geometry2.addAttribute('position', position);
geometry2.addAttribute('normal', normal);
geometry2.addAttribute('uv', uv);
geometry2.addAttribute('index', indices2);
and then create two meshes with different materials, as you normally would (see the sketch below). As far as I understand, this will re-use the same data in both meshes.
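For example (a sketch, assuming texture1 and texture2 are already loaded):
var mesh1 = new THREE.Mesh( geometry1, new THREE.MeshPhongMaterial( { map: texture1 } ) );
var mesh2 = new THREE.Mesh( geometry2, new THREE.MeshPhongMaterial( { map: texture2 } ) );
scene.add( mesh1 );
scene.add( mesh2 );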
When it comes to making skyboxes in three.js, I have seen two different schools of thought. Assuming that we have the code
var imagePrefix = "images/mountains-";
var directions = ["xpos", "xneg", "ypos", "yneg", "zpos", "zneg"];
var imageSuffix = ".jpg";
var skyGeometry = new THREE.CubeGeometry( 10000, 10000, 10000 );
In both methods, one creates a really big cube and applies textures. The difference is whether shaders are used. For example:
Material without using shader:
var materialArray = [];
for (var i = 0; i < 6; i++)
    materialArray.push( new THREE.MeshBasicMaterial({
        map: THREE.ImageUtils.loadTexture( imagePrefix + directions[i] + imageSuffix ),
        side: THREE.BackSide
    }));
var skyMaterial = new THREE.MeshFaceMaterial( materialArray );
var skyBox = new THREE.Mesh( skyGeometry, skyMaterial );
scene.add( skyBox );
Material using shader:
var imageURLs = [];
for (var i = 0; i < 6; i++)
    imageURLs.push( imagePrefix + directions[i] + imageSuffix );
var textureCube = THREE.ImageUtils.loadTextureCube( imageURLs );
var shader = THREE.ShaderLib[ "cube" ];
shader.uniforms[ "tCube" ].value = textureCube;
var skyMaterial = new THREE.ShaderMaterial( {
    fragmentShader: shader.fragmentShader,
    vertexShader: shader.vertexShader,
    uniforms: shader.uniforms,
    depthWrite: false,
    side: THREE.BackSide
} );
var skyBox = new THREE.Mesh( skyGeometry, skyMaterial );
scene.add( skyBox );
My own informal performance tests show no significant difference in FPS using 2048x2048 images for textures. The shader-free code is easier (at least for me) to understand. Are there situations in which there is an advantage to using the shader-based texture?
You have a conceptual misunderstanding.
For WebGL, both methods involve shaders. MeshBasicMaterial has a vertex and fragment shader that has been written for you for convenience.
The primary difference between the two examples is the second example uses a cube map for input.
You would use that approach if you were already using the same cube map as an environment map in a reflective material, for example.
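For example, the same cube texture can feed a reflective material directly (a sketch, reusing the textureCube from the question):
// the skybox cube map doubles as the environment map of a reflective surface
var reflectiveMaterial = new THREE.MeshPhongMaterial( { envMap: textureCube } );
var sphere = new THREE.Mesh( new THREE.SphereGeometry( 50, 32, 16 ), reflectiveMaterial );
scene.add( sphere );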
The first example is just another way to render a skybox, and is the only one of the two that will work with CanvasRenderer.
three.js r.58
Can someone please verify the following code for three.js r53?
It's taken from this question: How to use multiple materials in a Three.js cube?
I tried this code and a few variations but I don't get visible cubes. My texture images are named as they should be.
var materials = [];
for (var i=0; i<6; i++) {
    var img = new Image();
    img.src = i + '.png';
    var tex = new THREE.Texture(img);
    img.tex = tex;
    img.onload = function() {
        this.tex.needsUpdate = true;
    };
    var mat = new THREE.MeshBasicMaterial({color: 0xffffff, map: tex});
    materials.push(mat);
}
var cubeGeo = new THREE.CubeGeometry(400, 400, 400, 1, 1, 1, materials);
var cube = new THREE.Mesh(cubeGeo, new THREE.MeshFaceMaterial());
Do this instead:
var cubeGeo = new THREE.BoxGeometry( 400, 400, 400, 1, 1, 1 );
var cube = new THREE.Mesh( cubeGeo, materials );
materials is an array of 6 three.js materials, one for each face.
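For example (a sketch following the question's 0.png ... 5.png naming):
var loader = new THREE.TextureLoader();
var materials = [];
for ( var i = 0; i < 6; i++ ) {
    materials.push( new THREE.MeshBasicMaterial( { map: loader.load( i + '.png' ) } ) );
}
var cube = new THREE.Mesh( new THREE.BoxGeometry( 400, 400, 400 ), materials );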
See the Migration Guide: https://github.com/mrdoob/three.js/wiki/Migration-Guide.
EDIT: CubeGeometry has been renamed to BoxGeometry and THREE.MeshFaceMaterial has been deprecated.
three.js r.92
THREE.CubeGeometry() doesn't support a list of materials. I thought it did too, but if you check the current source code... it doesn't.