Refraction using Three.CubeCamera - javascript

I'm trying to set up a refraction sphere in a-frame, following Mr. Lee Stemkoski's example. I've managed to get a reflection sphere, which is essentially adapted from the cool a-frame mirror component.
The CubeCamera is created, and its renderTarget texture is used as the material's envMap on init:
Init:
this.refractionCamera = new THREE.CubeCamera( 0.5, 3000, 128 );
this.el.object3D.add( this.refractionCamera );
this.refractionMaterial = new THREE.MeshBasicMaterial({
  color: 0xffffff,
  refractionRatio: 0.4,
  envMap: this.refractionCamera.renderTarget.texture
});
Update the cubemap on tick:
tick:
this.refractionCamera.updateCubeMap( AFRAME.scenes[0].renderer, this.el.sceneEl.object3D );
this.mesh.material = this.refractionMaterial;
Of course, the image is mirrored because of how THREE.CubeCamera works, so I tried to rotate or flip the texture somehow so that I wouldn't get a mirror.
As far as I can see, Lee Stemkoski is only using:
refractSphereCamera.renderTarget.mapping = new THREE.CubeRefractionMapping();
I tried doing that on init, but nothing changes. Check out the fiddle.
I also tried offsetting the texture and flipping it vertically, but nothing happens.
Any ideas what's wrong with my approach?

You do
refractSphereCamera.renderTarget.mapping = THREE.CubeRefractionMapping;
If you want refraction, set the mapping on the render target's texture instead:
refractSphereCamera.renderTarget.texture.mapping = THREE.CubeRefractionMapping;
jsfiddle example
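Applied to the component in the question, that is a one-line addition in init (a sketch assuming the same r84 API as the fiddle; only the mapping line is new):
this.refractionCamera = new THREE.CubeCamera( 0.5, 3000, 128 );
// set the refraction mapping on the render target's texture, not on the render target itself
this.refractionCamera.renderTarget.texture.mapping = THREE.CubeRefractionMapping;
this.el.object3D.add( this.refractionCamera );
this.refractionMaterial = new THREE.MeshBasicMaterial({
  color: 0xffffff,
  refractionRatio: 0.4,
  envMap: this.refractionCamera.renderTarget.texture
});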

Lee Stemkoski's example was built four years ago for an older version of Three.js. His source code states Date: July 2013 (three.js v59dev), but your fiddle is using v84.
You should look at this example, which is compatible with the latest version of Three.js. If you look at the source code, the implementation is much simpler now:
// Load textureCube, change mapping from reflection to refraction
var textureCube = new THREE.CubeTextureLoader().load( urls );
textureCube.mapping = THREE.CubeRefractionMapping;
// Assign textureCube to background
scene = new THREE.Scene();
scene.background = textureCube;
// Use textureCube as environment map, which will use refraction mapping.
var cubeMaterial3 = new THREE.MeshPhongMaterial({
  color: 0xccddff,
  envMap: textureCube,
  refractionRatio: 0.98,
  reflectivity: 0.9
});

Related

Three.js Mirror reflection issue

I am using the Three.js Mirror class to reflect a ring mesh in my scene, and the reflection has strange artifacts in it.
This is the relevant code:
verticalMirror = new THREE.Mirror( renderer, camera, {
  clipBias: 0.003,
  textureWidth: 512,
  textureHeight: 512,
  color: 0xdddddd
} );
var verticalMirrorMesh = new THREE.Mesh( new THREE.PlaneBufferGeometry( 300, 300 ), verticalMirror.material );
verticalMirrorMesh.add( verticalMirror );
verticalMirrorMesh.position.y = 0;
verticalMirrorMesh.position.z = -10;
editor.scene.add( verticalMirrorMesh );
function render() {
  renderer.clear();
  verticalMirror.render();
  renderer.render( scene, camera );
}
In the center the reflection looks more or less ok, but on the sides it's stretched to the edges of the mirror. Please see the images below.
I've researched other posts on mirroring in Three.js and seem to follow what they are suggesting. What am I still doing wrong? Any advice is appreciated.
Thank you,
Anton.

Size of sprites along geometry Three.js

I want to show an animation of sprites moving along a figure, specifically sprites traveling along the lines of a shape, as shown here:
http://armsglobe.chromeexperiments.com/
I am basing my code on the answer to this question:
http://stackoverflow.com/questions/25898635/three-js-how-to-animate-particles-along-a-line
This is my result: the sprites start out small, then get bigger. I want them all to be the same size, but I looked at the code and can't work out how to make all the sprites the same size.
I would greatly appreciate it if you can help.
A Sprite cannot have its sizeAttenuation turned off. You need to use PointCloud instead.
http://codepen.io/seanseansean/pen/EaBZEY
The PointCloud material parameter you are looking for is sizeAttenuation: false.
It makes all particles render at a size relative to the canvas, so they will not change with perspective.
var geometry = new THREE.Geometry();
geometry.vertices.push( new THREE.Vector3() );
var size = 32; // means 32px
var sectorIcon = new THREE.PointCloud( geometry, new THREE.PointCloudMaterial({
  size: size,
  color: new THREE.Color( 0xffffff ),
  depthTest: 1,
  depthWrite: false,
  opacity: 0.5,
  sizeAttenuation: false,
  blending: THREE.AdditiveBlending,
  transparent: true,
  map: THREE.ImageUtils.loadTexture( "img/sectors/" + value.image_file )
}) );

How to add outline on child mesh in three js

I'd like to add an outline to meshes. I followed the example which creates a new mesh using the same geometry and scales that mesh up.
var outlineMaterial = new THREE.MeshBasicMaterial({color: 0x00ffff, side: THREE.BackSide});
this.outlineMesh = new THREE.Mesh(target.geometry, outlineMaterial);
this.outlineMesh.quaternion = target.quaternion;
this.outlineMesh.position = target.position;
this.outlineMesh.scale.copy(target.scale);
this.outlineMesh.scale.multiplyScalar(1.05);
this.scene.add(this.outlineMesh);
It works fine; the position of outlineMesh always matches the target mesh. However, when I added the target mesh as a child of another mesh, the position of outlineMesh no longer matched the target mesh. I think it's because the target's position is relative to its parent's coordinate system, while the outlineMesh is still in world coordinates.
Any idea how to make the outline work for a child mesh? Thank you very much!
Just add the outlineMesh as a child of the target mesh. It then inherits the target's full world transform, so it stays aligned no matter how deeply the target is nested:
var outlineMaterial = new THREE.MeshBasicMaterial( { color: 0x00ffff, side: THREE.BackSide } );
outlineMesh = new THREE.Mesh( geometry, outlineMaterial );
outlineMesh.scale.multiplyScalar( 1.05 );
mesh.add( outlineMesh );
three.js r.67

Using textures with Three.js and PLY file

I have the ASCII PLY file loading fine now, but it has a texture and I cannot seem to get it to load no matter how I configure things.
cameraMain = new THREE.Camera(8, MainWidth / MainHeight, 1, 10000);
sceneMain = new THREE.Scene();
ambientLightMain = new THREE.AmbientLight(0x202020);
directionalLightMainRight = new THREE.DirectionalLight(0xffffff, 0.5);
directionalLightMainLeft = new THREE.DirectionalLight(0xffffff, 0.5);
pointLightMain = new THREE.PointLight(0xffffff, 0.3);
directionalLightMainRight.position.x = dlpx;
directionalLightMainRight.position.y = dlpy;
directionalLightMainRight.position.z = dlpz;
directionalLightMainRight.position.normalize();
directionalLightMainLeft.position.x = -dlpx;
directionalLightMainLeft.position.y = dlpy;
directionalLightMainLeft.position.z = dlpz;
directionalLightMainLeft.position.normalize();
pointLightMain.position.x = plpx;
pointLightMain.position.y = plpy;
pointLightMain.position.z = plpz;
sceneMain.addLight(ambientLightMain);
sceneMain.addLight(directionalLightMainRight);
sceneMain.addLight(directionalLightMainLeft);
sceneMain.addLight(pointLightMain);
rendererMain = new THREE.WebGLRenderer();
texture = THREE.ImageUtils.loadTexture('3D/'+ mainURL +'.jpg', {}, function() {
rendererMain.render(sceneMain);
});
rendererMain.domElement.style.backgroundColor = backgroundColor;
rendererMain.setSize(MainWidth, MainHeight);
rendererMain.domElement.addEventListener('DOMMouseScroll', onRendererMainScroll, false);
rendererMain.domElement.addEventListener('mousewheel', onRendererMainScroll, false);
rendererMain.domElement.addEventListener('dblclick', onRendererMainDblClick, false);
rendererMain.domElement.addEventListener('mousedown', onRendererMainMouseDown, false);
$("#viewerMain").append(rendererMain.domElement);
window.addEventListener('mousemove', onMouseMove, false);
window.addEventListener('mouseup', onMouseUp, false);
That chunk of code initializes the model. Some of it is not needed for the purposes of this question; it has code for zooming, rotating, etc. I know there is another line; I have it commented out, but for some reason it won't show here, so I will add it separately.
material = new THREE.MeshBasicMaterial({map: texture});
That line is after the loadTexture code.
The rest of it is loaded with a loadPly function
var geometryMain = new THREE.Geometry();
for (i in event.data.content[0])
geometryMain.vertices.push(new THREE.Vertex(new THREE.Vector3(event.data.content[0][i][0], event.data.content[0][i][1], event.data.content[0][i][2])));
for (i in event.data.content[1])
geometryMain.faces.push(new THREE.Face3(event.data.content[1][i][0], event.data.content[1][i][1], event.data.content[1][i][2]));
geometryMain.computeCentroids();
geometryMain.computeFaceNormals();
mainModel = new THREE.Mesh(geometryMain, new THREE.MeshLambertMaterial({color:0xffffff, shading:THREE.FlatShading}));
sceneMain.addObject(mainModel);
mainModel.overdraw = true;
mainModel.doubleSided = true;
I have tried tweaking the THREE.Mesh() section to include the material, but it breaks everything. I have tried just adding the map to the MeshLambertMaterial, also a no-go. Does anyone have any insight? Sorry if this is overcomplicated, but I am far from knowing this well enough to be efficient yet.
If I add
map:THREE.ImageUtils.loadTexture('3D/'+ mainURL +'.jpg')
to the THREE.Mesh() line, I get
.WebGLRenderingContext: GL ERROR :GL_INVALID_OPERATION : glDrawElements: attempt to access out of range vertices in attribute 2
In order to display a texture on an object, it needs texture coordinates. You only seem to add vertices and faces to the geometry. Looking at the THREE.PLYLoader sources, it doesn't seem to support loading texture coordinates. (Although I'm not sure you are using that loader, as it should already return a ready Geometry instance instead of requiring you to construct the arrays manually.)
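For illustration, here is a minimal sketch of adding texture coordinates to the hand-built geometry, assuming the same old-style Geometry API the question uses (the constant UVs are placeholders; real coordinates would have to come from the PLY data, and depending on your three.js revision the class may be THREE.UV or THREE.Vector2):
geometryMain.faceVertexUvs[0] = [];
for (i in geometryMain.faces) {
  // placeholder UVs so that attribute 2 exists; replace with the real PLY texture coords
  geometryMain.faceVertexUvs[0].push([
    new THREE.UV( 0, 0 ),
    new THREE.UV( 1, 0 ),
    new THREE.UV( 1, 1 )
  ]);
}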
Use THREE.MeshNormalMaterial. It shows the texture when you use MeshNormalMaterial.

WebRTC and ThreeJS to create a brushed metal textureCube

I'm trying to apply a THREE.ImageUtils.loadTextureCube() texture built from the real-time camera feed to a spinning cube.
So far, I have managed to apply a simple texture from my video to a MeshLambertMaterial:
var geometry = new THREE.CubeGeometry(100, 100, 100, 10, 10, 10);
videoTexture = new THREE.Texture( Video ); // var "Video" is my <video> element
var material = new THREE.MeshLambertMaterial({ map: videoTexture });
Cube = new THREE.Mesh(geometry, material);
Scene.add( Cube );
That's OK and you can see the result at http://jmpp.fr/three-camera
Now I'd like to use this video stream to get a brushed-metal texture, so I tried to create another kind of material:
var videoSource = decodeURIComponent( Video.src );
var environment = THREE.ImageUtils.loadTextureCube([
  videoSource, // left
  videoSource, // right
  videoSource, // top
  videoSource, // bottom
  videoSource, // front
  videoSource  // back
]);
var material = new THREE.MeshPhongMaterial({ envMap: environment });
... but it throws the following error:
blob:http://localhost/dad58cd1-1557-41dd-beed-dbfea4c340db 404 (Not Found)
I guess loadTextureCube() is trying to load each of the six array entries as an image, but it doesn't seem to accept a video source instead.
I'm just beginning with Three.js and wondered if there is a way to do this?
Thx,
jmpp
There are two ways I can see. First, if you just want the same image but with some specular highlights/shininess, then just change
var material = new THREE.MeshLambertMaterial({ map:texture});
to
var material = new THREE.MeshPhongMaterial({
  map: texture,
  ambient: 0x030303,
  specular: 0xffffff,
  shininess: 90
});
and play with the ambient, specular, shininess settings to find what you like.
Second, if you really want to add effects to the video image itself, you could draw the image to a canvas, manipulate the pixels, and then set the texture image to that new image. This could also be done with custom shaders, avoiding the canvas step, but there are already libraries for applying image filters to canvas elements, so I'd stick with that. It would work something like this:
You would need a canvas to draw to: <canvas id='testCanvas' width=256 height=256></canvas>. Then, with JavaScript:
var ctx = document.getElementById('testCanvas').getContext('2d');
texture = new THREE.Texture();
// in the render loop
ctx.drawImage(Video,0,0);
var img = ctx.getImageData( 0, 0, ctx.canvas.width, ctx.canvas.height );
// do something with the img.data pixels, see
// this article http://www.html5rocks.com/en/tutorials/canvas/imagefilters/
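// for instance, a simple grayscale pass (one illustrative filter; any
// effect from the article would slot in the same way):
var d = img.data;
for (var i = 0; i < d.length; i += 4) {
  var v = 0.299 * d[i] + 0.587 * d[i + 1] + 0.114 * d[i + 2];
  d[i] = d[i + 1] = d[i + 2] = v;
}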
// then write it back to the texture
texture.image = img;
texture.needsUpdate = true;
Updated!
Actually, you can do it as an envMap; you just need to force the video to be a power of two with equal width and height. Video streams come into Chrome at 640x480, so you still need to draw through a canvas, but only to crop/square the image. So I got this to work:
// In the access-camera part
var canvas = document.createElement( 'canvas' );
canvas.width = 512;
canvas.height = 512;
ctx = canvas.getContext( '2d' );
// In the render loop
ctx.drawImage( Video, 0, 0, 512, 512 );
img = ctx.getImageData( 0, 0, 512, 512 );
// This part is a little different, but env maps take an array
// of images instead of just one (cubeVideo is the cube texture
// assigned as the material's envMap)
cubeVideo.image = [ img, img, img, img, img, img ];
if ( Video.readyState === Video.HAVE_ENOUGH_DATA )
  cubeVideo.needsUpdate = true;
Try this:
var environment = new THREE.Texture( [ Video, Video, Video, Video, Video, Video ] );
var material = new THREE.MeshPhongMaterial({ envMap: environment });
// in animate()
environment.needsUpdate = true;
Okay, now I managed to get a shiny effect on the cube using a Phong material:
videoTexture = new THREE.Texture( Video );
var material = new THREE.MeshPhongMaterial({
  map: videoTexture,
  ambient: 0x030303,
  specular: 0xc0c0c0,
  shininess: 25
});
It doesn't look too bad.
But it seems that a THREE.Texture([ Video, Video, Video, Video, Video, Video ]) doesn't work as an envMap. I still get a black cube.
