I'm trying to render a CircleGeometry in front of a cube (from the camera's point of view both are visible). The cube has a flat color, and the circle uses a canvas texture containing just an arc and no background color.
If I use a CanvasRenderer, the canvas transparency works and only the arc is drawn.
If I use the WebGLRenderer, the full circle is filled with the page background color, with just the arc shown on it, so the transparency is lost.
I created a test case for this: http://jsfiddle.net/f4u7s/6/
where you can switch between the WebGL and Canvas renderers to show the problem.
(look for
// ------------> Switch HERE
//renderer = new THREE.CanvasRenderer();
renderer = new THREE.WebGLRenderer();
)
It sounds like the "three.js textures working with CanvasRenderer, but show up as black with WebGLRenderer" ticket, but even with the solution proposed there (mesh.dynamic = true), the problem is still here.
Am I missing something?
You need to set material.transparent to true.
plane = new THREE.Mesh(
    new THREE.CircleGeometry( 50, 50 ),
    new THREE.MeshBasicMaterial( {
        map: texture,
        transparent: true
    } )
);
three.js r.144
This is the basic example from the three.js documentation:
function initSphere() {
    const geometry = new THREE.SphereGeometry( 150, 14, 14 );
    const material = new THREE.MeshBasicMaterial( { color: 0xFF0000, vertexColors: 0xFFFFFF } );
    const sphere = new THREE.Mesh( geometry, material );
    scene.add( sphere );
}
It creates a red sphere, which is what I wanted, but I can't really see the sphere and its edges because it just looks like a flat red circle. I was thinking that changing the edges to a white color would create the effect I want, but I don't know how to achieve it.
Can anyone help?
First, your shape is "just a red circle" because you are using MeshBasicMaterial. This material is simply a color and does not include any kind of shading or highlights; it doesn't even need a light source. Every bit of the shape will be rendered as 0xff0000.
If you want shading/highlights, you will need to use a more complex material like MeshPhongMaterial, MeshLambertMaterial, or MeshStandardMaterial. Because these are shaded, you will need to include a light in your scene.
Secondly, the vertexColors property does not change the color of the "edges." Instead, it is a Boolean that indicates whether vertex colors are used to color the Mesh.
If you want to show edges, you could try using EdgesGeometry to define a secondary shape.
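As a minimal sketch of both suggestions (assuming a scene already exists), here is a Phong-shaded sphere lit by a point light, with its edges overlaid in white via EdgesGeometry and LineSegments:

// Shaded red sphere; MeshPhongMaterial needs a light to be visible
const geometry = new THREE.SphereGeometry( 150, 14, 14 );
const sphere = new THREE.Mesh( geometry, new THREE.MeshPhongMaterial( { color: 0xff0000 } ) );
scene.add( sphere );

// White edge overlay built from the same geometry
const edges = new THREE.EdgesGeometry( geometry );
sphere.add( new THREE.LineSegments( edges, new THREE.LineBasicMaterial( { color: 0xffffff } ) ) );

// A light so the Phong shading is visible
const light = new THREE.PointLight( 0xffffff, 1 );
light.position.set( 300, 300, 300 );
scene.add( light );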
I have the following Mesh in my scene:
const cylinder = new Mesh(
    new CylinderGeometry(2, 2, 1, 32),
    new MeshPhongMaterial({
        color: color,
        shininess: 32,
        opacity: 0,
        transparent: true,
        specular: 0xffff82,
    }),
);
Because I want to fade each circle in, I made the Mesh transparent. When I move the camera there are some weird rendering artifacts, and I have no clue why this happens or what I need to change. As soon as I remove transparent, it works just fine.
EDIT
Here is a fiddle showing the problem. Line 139 of the JavaScript is where the cylinders get created.
https://jsfiddle.net/mxmtsk/tb6gqm10/35/
It seems that some faces of the transparent cylinders disappear behind the plane. You can easily fix this by slightly moving the cylinders towards the camera, like so:
cylinder.rotation.x = Math.PI / 2;
cylinder.position.z = 0.5; // fix
In this way, the cylinders do not intersect with the plane.
Updated fiddle: https://jsfiddle.net/f8m1u4rg/
I came across this site: https://travisscott.com/
As you can see, when you move the camera, the gradient background has different 360 degree shading.
Which part of three.js would be used for something like this?
Can someone point me in the right direction?
As @gaitat said in their comment above, the background is a cube map wrapped in a sphere. This is just a normal three.js material with a texture map applied. Here is the code from the page, cleaned up to be readable:
var backgroundSphere = new THREE.Mesh(
    new THREE.SphereGeometry( 30, 10, 10 ),
    new THREE.MeshBasicMaterial({
        map: ( new THREE.TextureLoader() ).load( "assets/images/textures/pano.jpg" ),
        side: THREE.DoubleSide
    })
);
The shiny material on the model is achieved using the same environment map:
var shinyMaterial = new THREE.MeshStandardMaterial({
    color: 16777215, // 0xffffff
    metalness: 1,
    roughness: -1,
    envMap: <A loaded cube texture (the same as the pano image above)>,
    side: THREE.DoubleSide,
    shading: THREE.FlatShading
});
There is more information on loading a cube texture in the three.js docs: https://threejs.org/docs/#api/textures/CubeTexture
The page is using three.js r79 from what I can see.
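For reference, loading a cube texture for the envMap looks roughly like this; the path and file names here are placeholders:

// Hypothetical paths; replace with your own six cube face images
var envMap = new THREE.CubeTextureLoader()
    .setPath( 'assets/images/textures/cube/' )
    .load( [ 'px.jpg', 'nx.jpg', 'py.jpg', 'ny.jpg', 'pz.jpg', 'nz.jpg' ] );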
Here's the process (a code sketch follows the steps):
Create the asset: get a 360 panorama image from some source and blur it in Photoshop.
Create a sphere in your three.js setup. Make its scale about 10x larger than the main model's scale.
Apply a MeshLambertMaterial to it with its side set to THREE.BackSide.
Load the edited 360 image into your scene.
Apply the image as the material's emissiveMap. Make sure that the emissive color is set to white.
Render.
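A minimal sketch of those steps, assuming a scene already exists; the texture path is a placeholder:

// Blurred 360 pano applied as an emissive map on a back-faced sphere
var bgTexture = new THREE.TextureLoader().load( 'textures/pano-blurred.jpg' ); // placeholder path
var backgroundSphere = new THREE.Mesh(
    new THREE.SphereGeometry( 500, 32, 32 ),   // roughly 10x the main model's scale
    new THREE.MeshLambertMaterial({
        side: THREE.BackSide,    // render the inside of the sphere
        emissive: 0xffffff,      // emissive color must be white
        emissiveMap: bgTexture
    })
);
scene.add( backgroundSphere );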
I'm developing for the Oculus Rift using the OculusRiftEffect from https://github.com/mrdoob/three.js/blob/master/examples/js/effects/OculusRiftEffect.js and am using Sprites. The problem is the sprites don't appear in the correct position in each eye, as in the screenshot. You can see the house sprite is in different positions in each eye, which causes a 'double vision' effect in the Oculus. While playing around with the code (I have a demo plunker here) you can notice that near the edges of the screen the positioning is more accurate, but I need it nearer the center of the screen, where the positioning is off. I assume this has something to do with the shading/rendering in OculusRiftEffect but don't know enough about it to break it down. Any direction would be appreciated, thanks!
Sample code:
var _scene, _camera, _renderer, _effect, _sprite;

function init() {
    _scene = new THREE.Scene();
    _camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, .1, 100000);
    _camera.lookAt(new THREE.Vector3());

    _renderer = new THREE.WebGLRenderer({
        antialias: true,
        canvas: document.getElementById('legit')
    });
    _renderer.setSize(window.innerWidth, window.innerHeight);

    _effect = new THREE.OculusRiftEffect(_renderer, {
        worldScale: 1000
    });
    _effect.setSize(window.innerWidth, window.innerHeight);

    THREE.ImageUtils.crossOrigin = 'anonymous';
    _sprite = new THREE.Sprite(
        new THREE.SpriteMaterial({
            map: new THREE.Texture(document.getElementById('icon')),
            color: 0xff0000
        })
    );
    _sprite.scale.set(200, 200, 1);
    _sprite.position.set(500, 800, 1);
    _scene.add(_sprite);

    _scene.add(new THREE.Mesh(
        new THREE.SphereGeometry(3000, 64, 32),
        new THREE.MeshBasicMaterial({
            color: 0xffffff,
            wireframe: true,
            side: THREE.DoubleSide
        })
    ));

    animate();
}

function animate() {
    requestAnimationFrame(animate);
    render();
}

function render() {
    _renderer.render(_scene, _camera);
    _effect.render(_scene, _camera);
}

document.addEventListener('DOMContentLoaded', init);
I am not that familiar with the Oculus plugin, but I think your understanding of how sprites work is wrong.
A sprite is a rendering technique, a renderable rasterized surface. It can still be anywhere in space, and space is a relative term.
Is your house aware that it's part of the HUD and not part of a particle system somewhere in the distance?
One way to achieve what you want would be to overlay a copy at equal distance from the two points that represent the center of each eye, all in screen space. I think that this would give the effect that you are looking for.
That's as far as the positioning goes. Orientation-wise, I'm not sure if they are actually properly aligned in your image above, but the fish-eye effect is kicking in.
My reading of Three.js sprites indicates that they are positioned using screen coordinates, not in-scene geometry. Because of this, they ignore the per-eye projection matrix offset that is imposed on the scene, and I don't believe they will function properly in combination with the Oculus Rift distortion effect.
As pailhead and Jherico mentioned, this is not possible with the Oculus plugin. A workaround I found effective is to simply use a PlaneGeometry and set it as a child of the camera. Make it look at the camera's position and it will act just like a sprite and render correctly with the OculusRiftEffect. Basic example (assuming camera and scene):
var sprite = new THREE.Mesh(
    new THREE.PlaneGeometry(100, 100),
    new THREE.MeshBasicMaterial({ color: 0xffffff })
);
// assuming the camera is at (0,0,0) we need to offset the plane
sprite.position.set(20, 20, -100);
sprite.lookAt(camera.position);
camera.add(sprite);
// normally the camera doesn't need to be added to the scene,
// but if it has child meshes it is necessary
scene.add(camera);
Since the plane is a child of the camera, once its offset is set correctly it will follow the camera wherever it goes. The caveat is that it won't function exactly like a Sprite, in that you can't move it independently around the scene, but it's the closest solution I've found.
I tried to create a skybox in three.js.
I created 2 scenes. The first is the skybox, and the second is my game scene.
I'm just learning three.js, and I don't really know why it doesn't work. Only the skybox is rendered; the other scene isn't.
Code: http://jsfiddle.net/5bqFr/
Thanks in advance
What's happening now is that, even though the skybox is rendered first, it still writes to the depth buffer. The skybox happens to be closer to the camera than the sphere, and that's why you don't see the sphere.
You just need to disable writing into depth:
new THREE.MeshBasicMaterial( { color: 0x0000FF, depthWrite: false } );
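A minimal sketch of the corresponding render loop, assuming the two scenes from the question are named skyboxScene and gameScene; autoClear is disabled so the second render doesn't wipe out the first:

renderer.autoClear = false;  // clear manually, once per frame

function render() {
    requestAnimationFrame( render );
    renderer.clear();
    renderer.render( skyboxScene, camera );  // skybox material has depthWrite: false
    renderer.render( gameScene, camera );    // game objects draw on top
}
render();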