I'm trying to build a robotic arm using the three.js library. My idea is to use hierarchical levels to create the arm's geometry, so that the first level of the geometry is the basis of the movements for the entire arm. Could someone help me accomplish this task?
You should use a separate Mesh for each part of the robotic arm,
and then add each part to its parent Mesh.
Then, when you rotate a parent Mesh, its child Meshes will rotate with it.
for example:
var mat = new THREE.MeshBasicMaterial();
var mainHandGeometry = new THREE.BoxGeometry(50, 10, 10);
var mainMesh = new THREE.Mesh(mainHandGeometry, mat);
var midHandGeometry = new THREE.BoxGeometry(30, 5, 5);
var midMesh = new THREE.Mesh(midHandGeometry, mat);
var lastHandGeometry = new THREE.BoxGeometry(15, 3, 3);
var lastMesh = new THREE.Mesh(lastHandGeometry, mat);
midMesh.add(lastMesh);
lastMesh.position.set(10,10,10);
mainMesh.add(midMesh);
midMesh.position.set(10,10,10);
Now when you rotate mainMesh, it will rotate together with its children.
If you rotate midMesh, lastMesh will rotate with it.
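For example, as a minimal follow-up sketch (not part of the original answer, assuming a scene already exists):
scene.add( mainMesh );                // the whole arm is controlled through the root mesh
mainMesh.rotation.z = Math.PI / 4;    // midMesh and lastMesh follow
midMesh.rotation.z = Math.PI / 8;     // only lastMesh follows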
Create object
A three.js mesh object is created:
var geometry = new THREE.BufferGeometry();
var standardMaterial = new THREE.MeshStandardMaterial( {/* inputs */ } );
var mesh = new THREE.Mesh( geometry, standardMaterial );
// Then mesh is added as an object to 3D scene.
Set transformation matrix
I intend to set the transformation matrix of the object:
object = ... // Is already created and passed around.
const pkm = ... // Containing 16 numbers, equivalent of a 4x4 matrix.
var matrix = new THREE.Matrix4();
matrix.set(
pkm.X00, pkm.X01, pkm.X02, pkm.X03,
pkm.X10, pkm.X11, pkm.X12, pkm.X13,
pkm.X20, pkm.X21, pkm.X22, pkm.X23,
pkm.X30, pkm.X31, pkm.X32, pkm.X33,
);
object.applyMatrix4( matrix );
object.updateMatrixWorld( true );
Problem
The problem is that the above approach of setting the transformation matrix just multiplies the new matrix into the object's previous matrix. But we want the previous matrix to be completely replaced by the new one.
What is the best practice - most robust way - to replace the previous matrix of a three.js object/mesh with a completely new one?
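For reference, here is a minimal sketch (based on standard three.js APIs, not taken from this thread) of two ways to replace the matrix instead of multiplying it in:
// Option 1: decompose the new matrix into the object's position/quaternion/scale,
// so three.js rebuilds object.matrix from these components on the next update.
matrix.decompose( object.position, object.quaternion, object.scale );
// Option 2: overwrite the local matrix directly and stop three.js from
// recomposing it from position/quaternion/scale every frame.
object.matrix.copy( matrix );
object.matrixAutoUpdate = false;
object.updateMatrixWorld( true );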
I am working through an example from ami.js (https://github.com/FNNDSC/ami/blob/dev/examples/viewers_quadview/viewers_quadview.js) to visualize a 2D slice of a 3D bone model, overlaid as a segmentation mask on the corresponding 2D CT scan image.
The only difference is that I modified it to read the 3D model mesh from an STL file instead of FreeSurfer.
The problem is that it renders artifacts of the segmentation mask from previous slices on top of the current CT scan, instead of rendering only the mask that corresponds to the current slice.
In order to visualize the slices of the 3D model as a 2D segmentation mask over the 2D CT scans, the example creates a plane corresponding to the 2D slice containing the CT scan:
function updateClipPlane(refObj, clipPlane) {
  const stackHelper = refObj.stackHelper;
  const camera = refObj.camera;
  let vertices = stackHelper.slice.geometry.vertices;

  let p1 = new THREE.Vector3(vertices[0].x, vertices[0].y, vertices[0].z)
    .applyMatrix4(stackHelper._stack.ijk2LPS);
  let p2 = new THREE.Vector3(vertices[1].x, vertices[1].y, vertices[1].z)
    .applyMatrix4(stackHelper._stack.ijk2LPS);
  let p3 = new THREE.Vector3(vertices[2].x, vertices[2].y, vertices[2].z)
    .applyMatrix4(stackHelper._stack.ijk2LPS);

  clipPlane.setFromCoplanarPoints(p1, p2, p3);

  let cameraDirection = new THREE.Vector3(1, 1, 1);
  cameraDirection.applyQuaternion(camera.quaternion);

  if (cameraDirection.dot(clipPlane.normal) > 0) {
    clipPlane.negate();
  }
}
and applies it to the 3D mesh as a clipping plane during rendering
function render() {
  ... some code ...

  // render r2 view
  r2.renderer.clear();
  r2.renderer.render(r2.scene, r2.camera);

  // mesh
  r2.renderer.clearDepth();
  data.forEach(function(object, key) {
    object.materialFront.clippingPlanes = [clipPlane2];
    object.materialBack.clippingPlanes = [clipPlane2];
  });
  r2.renderer.render(sceneClip, r2.camera);

  ... some code...
}
A jsfiddle that illustrates the problem is here: http://jsfiddle.net/crfs6ugq/226/
Currently I am developing an FPS with three.js and PointerLockControls.
Using the code below I can shoot into any horizontal direction:
var direction = new THREE.Vector3( 0, 0, -1 );
var rotation = new THREE.Euler( 0, 0, 0, "XYZ" );
var cameraDirection = new THREE.Vector3(this.game.usermodel.root.children[0].position.x, this.game.usermodel.root.children[0].rotation._x, this.game.usermodel.root.children[0].position.z);
cameraDirection.copy( direction ).applyEuler( this.game.user.rotation );
var raycaster = new THREE.Raycaster(this.game.usermodel.root.children[0].position, cameraDirection);
But my code doesn't take the y-axis into account. The line below holds the pitch rotation:
this.game.usermodel.root.children[0].rotation._x
How can I apply this value so that I can also shoot along the y-axis (vertically in any direction)? Currently the bullet only travels horizontally.
Thanks in advance for your assistance.
If you are using PointerLockControls and you want to set a raycaster, you can use this pattern:
var direction = new THREE.Vector3();
var raycaster = new THREE.Raycaster(); // create once and reuse
...
controls.getDirection( direction );
raycaster.set( controls.getObject().position, direction );
Do not set the camera position or rotation directly if you are using PointerLockControls.
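For example, a hypothetical usage sketch (not part of the original answer) to find what the ray hits:
var intersects = raycaster.intersectObjects( scene.children, true );
if ( intersects.length > 0 ) {
    // intersects[ 0 ] is the closest object along the shooting direction
}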
three.js r.71
Investigating this a bit more, I finally came up with a workaround myself. It might not be the perfect way to do this, but it works.
It works like this: I get the basic mesh rotation and apply the Euler, then I add the pitch rotation. This way I pass both the horizontal and vertical rotation into the raycaster.
var direction = new THREE.Vector3( 0, 0, -1 );
direction.applyEuler( this.game.user.rotation );
direction.y = this.game.usermodel.root.children[0].rotation._x;
var raycaster = new THREE.Raycaster(this.game.usermodel.root.children[0].position, direction);
Everyone is still welcome to comment on this or come up with a more elegant solution.
My goal is to create an interactive Earth that has lines normal to the surface so that you can click on them and it pulls up pictures that my health care team has taken from around the world. I have the world completely coded (or more accurately someone else did it and I made a few small changes).
Below is the code for the Earth, which functions as expected. What I want to know is how to make lines normal to the surface and have them be clickable. Ideally, the lines would fade and disappear as they rotate to the back of the Earth, or as the user rotates the Earth, so that the lines on the side the user cannot see fade out.
I thought about making an array of cities and associating a location on the sphere with each one, but I'm not really sure how to do that. I am very new to Three.js and HTML/JS in general.
It may be helpful to know that I am using three.min.js, Detector.js, and TrackballControls.js.
The code so far is as follows:
(function () {
  var webglEl = document.getElementById('webgl');

  if (!Detector.webgl) {
    Detector.addGetWebGLMessage(webglEl);
    return;
  }

  var width = window.innerWidth,
      height = window.innerHeight;

  // Earth params
  var radius = 0.5,
      segments = 32,
      rotation = 6;

  var scene = new THREE.Scene();
  var uniforms, mesh, meshes = [];

  var camera = new THREE.PerspectiveCamera(45, width / height, 0.01, 1000);
  camera.position.z = 1.5;

  var renderer = new THREE.WebGLRenderer();
  renderer.setSize(width, height);

  scene.add(new THREE.AmbientLight(0x333333));

  var light = new THREE.DirectionalLight(0xffffff, 1);
  light.position.set(5, 3, 5);
  scene.add(light);

  var sphere = createSphere(radius, segments);
  sphere.rotation.y = rotation;
  scene.add(sphere);

  var clouds = createClouds(radius, segments);
  clouds.rotation.y = rotation;
  scene.add(clouds);

  var stars = createStars(90, 64);
  scene.add(stars);

  var controls = new THREE.TrackballControls(camera);

  webglEl.appendChild(renderer.domElement);

  render();

  function render() {
    controls.update();
    sphere.rotation.y += 0.0005;
    clouds.rotation.y += 0.0007;
    requestAnimationFrame(render);
    renderer.render(scene, camera);
  }

  function createSphere(radius, segments) {
    return new THREE.Mesh(
      new THREE.SphereGeometry(radius, segments, segments),
      new THREE.MeshPhongMaterial({
        map: THREE.ImageUtils.loadTexture('images/Color_Map.jpg'),
        bumpMap: THREE.ImageUtils.loadTexture('images/elev_bump_4k.jpg'),
        bumpScale: 0.005,
        specularMap: THREE.ImageUtils.loadTexture('images/water_4k.png'),
        specular: new THREE.Color('grey')
      })
    );
  }

  function createClouds(radius, segments) {
    return new THREE.Mesh(
      new THREE.SphereGeometry(radius + 0.003, segments, segments),
      new THREE.MeshPhongMaterial({
        map: THREE.ImageUtils.loadTexture('images/fair_clouds_4k.png'),
        transparent: true
      })
    );
  }

  function createStars(radius, segments) {
    return new THREE.Mesh(
      new THREE.SphereGeometry(radius, segments, segments),
      new THREE.MeshBasicMaterial({
        map: THREE.ImageUtils.loadTexture('images/galaxy_starfield.png'),
        side: THREE.BackSide
      })
    );
  }
}());
The hope is that it would look like this demo, but with the Earth instead of a building (http://3d.cl3ver.com/uWfsD?tryitlocation=3) [also click Explore when you go there].
I built a quick demo that most faithfully represents what I think your needs are. It shows some images that seem to be attached to an Earth sphere through lines. It uses sprites to create those images (and the lines themselves, actually). I think it resembles quite well that demo of a building that you linked to. Here is the technique:
1. Images are added using GIMP to this template and saved as PNGs.
2. Those images are loaded as textures in the JS app.
3. A sprite is created using each loaded texture.
4. The sprite is added to an Object3D and its position set to (0, 0, radiusOfTheEarthSphere).
5. The Object3D is added to the sphere and rotated until the center of the sprite lies at the position on the Earth where you want it to rest.
6. Each frame, a dot product between a vector from the center of the Earth to the camera and a vector from the center of the Earth to each sprite is used to calculate the sprite's opacity.
The equation in step 6 is:
opacity = (dot(normalize(cameraPosition - centerOfEarth), normalize(spriteCenter - centerOfEarth)) + 1) * 0.5
Also note that the sprite's center is different from its position, due to the Object3D used as its parent; I calculate its center using the .localToWorld(vec) method.
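Here is a minimal sketch of steps 4-6, assuming an earthSphere mesh, a loaded spriteTexture, and the sphere radius are already defined (these names are placeholders, not taken from the demo):
// Steps 4-5: attach the sprite to a pivot Object3D, push it out to the surface,
// then rotate the pivot until the sprite sits over the desired location on the Earth
// (the exact angles depend on your latitude/longitude convention).
var pivot = new THREE.Object3D();
var sprite = new THREE.Sprite( new THREE.SpriteMaterial( { map: spriteTexture, transparent: true } ) );
sprite.position.set( 0, 0, radius );
pivot.add( sprite );
earthSphere.add( pivot );
pivot.rotation.set( latitudeInRadians, longitudeInRadians, 0 );

// Step 6: each frame, fade the sprite based on how much it faces the camera.
var spriteCenter = new THREE.Vector3();
var toCamera = new THREE.Vector3();
var toSprite = new THREE.Vector3();
function updateSpriteOpacity( camera ) {
    sprite.localToWorld( spriteCenter.set( 0, 0, 0 ) );                        // world-space center of the sprite
    toCamera.copy( camera.position ).sub( earthSphere.position ).normalize();  // Earth center -> camera
    toSprite.copy( spriteCenter ).sub( earthSphere.position ).normalize();     // Earth center -> sprite
    sprite.material.opacity = ( toCamera.dot( toSprite ) + 1 ) * 0.5;
}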
Please see the demo here: https://33983769c6a202d6064de7bcf6c5ac7f51fd6d9e.googledrive.com/host/0B9scOMN0JFaXSE93YTZRTE5XeDQ/test.html
It is hosted in my Google Drive and it may take some time to load. Three.js will give some errors in the console until all the textures are loaded, because I coded it quickly just to show you my implementation ideas.
I have added a normal map to a model in Three.js that is mirrored down the middle. It looks like one of the channels (green perhaps?) is flipped on the mirrored side.
I have one ambient light, one directional headlight, and one spotlight. Here is the code that I use to make the material:
// Create a MeshPhongMaterial for the model
var material = new THREE.MeshPhongMaterial();
material.map = THREE.ImageUtils.loadTexture(texture_color);
// Wrapping modes
//THREE.RepeatWrapping = 1000;
//THREE.ClampToEdgeWrapping = 1001;
//THREE.MirroredRepeatWrapping = 1002;
material.map.wrapS = THREE.RepeatWrapping;
material.map.wrapT = THREE.MirroredRepeatWrapping;
if (texture_normal != null) {
  material.normalMap = THREE.ImageUtils.loadTexture(texture_normal);
  material.normalMap.wrapS = THREE.RepeatWrapping;
  material.normalMap.wrapT = THREE.MirroredRepeatWrapping;
}
material.wrapAround = true;
material.morphTargets = true;
material.shininess = 15;
material.specular = new THREE.Color(0.1, 0.1, 0.1);
material.ambient = new THREE.Color(0, 0, 0);
material.alphaTest = 0.5;
var mesh = new THREE.MorphAnimMesh( geometry, material );
// Turn on shadows
mesh.castShadow = true;
if (shadows) {
  mesh.receiveShadow = true;
}
scene.add( mesh );
I tried all of the different combinations of material.normalMap.wrapS and material.normalMap.wrapT but that didn't solve it (tried diffuse map too). What am I doing wrong?
Thank you!
Normal maps are dependent on the geometry, so you can't just mirror it and expect it to work like a diffuse texture would.
To make it work, you need to flip the normal map's red channel wherever the UVWs are mirrored on the model.
http://www.polycount.com/forum/showthread.php?t=116922
It turns out I was using an older version (1.2) of the Blender Three.js exporter. After switching to the latest version (1.5) of the exporter from the r67 repository, Three.js now correctly handles mirrored normal maps with its Phong shader out of the box.
Edit: The Phong shader was still having issues with the flipped channel. I ended up using the "Normal Map Shader" (see the Three.js examples), and that gave me correct results. Unfortunately, the Normal Map Shader doesn't work with morph animations, only skeletal ones.