I want to get the best performance possible when rendering simple textured shapes. The problem is that the Phong model requires extra lighting (which involves calculations), and the colors are not the desired ones and need some tweaking.
To simplify the case I've decided to use a simple flat shader, but some problems occur:
<script id="vertShader" type="shader">
varying vec2 vUv;
void main() {
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4(position,1.0);
}
</script>
<script id="fragShader" type="shader">
varying vec2 vUv;
uniform sampler2D material;
void main() {
gl_FragColor = texture2D(material, vUv);
}
</script>
Under certain camera angles some of the shelves disappear (you can notice the darker places and see through them), which does not happen with the Phong material:
It happens with the shadow texture put inside each shelf. It's a textured cube with a shadow texture placed inside each compartment (don't ask me why, this is just the task I was given :)).
I don't know what may be causing this. Maybe the loading?
I'm using the standard OBJ loader and adding textures. The OBJ loader sets the material to Phong, and I'm switching it to the custom shader like this:
var objLoader = new THREE.OBJLoader( manager );
objLoader.load( obj, function ( model ) {
    elements[name] = model;
    console.log('loaded ', name);
    var img = THREE.ImageUtils.loadTexture(mat);
    elements[name].traverse( function ( child ) {
        if ( child instanceof THREE.Mesh ) {
            child.material = new THREE.ShaderMaterial( {
                uniforms: {
                    color: { type: 'f', value: 0.0 },
                    material: { type: 't', value: img }
                },
                fragmentShader: document.getElementById('fragShader').text,
                vertexShader: document.getElementById('vertShader').text
            } );
        }
    });
});
Any suggestions would be helpful.
Every surface is drawn with one winding order (clockwise or counter-clockwise). If you view a surface from the other side, it will "disappear". I think this is the problem with your own shader: you should either render faces from both sides (at the cost of some performance) or work out which side each face should be rendered from.
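In this case the quick fix is to tell the ShaderMaterial to draw both sides as well; a minimal sketch based on the loader code above (side is a standard property on every three.js material, ShaderMaterial included):

child.material = new THREE.ShaderMaterial( {
    side: THREE.DoubleSide, // draw front and back faces
    uniforms: {
        material: { type: 't', value: img }
    },
    fragmentShader: document.getElementById('fragShader').text,
    vertexShader: document.getElementById('vertShader').text
} );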
To optimize performance slightly you can use a standard material from three.js; those work without writing your own shader. Something like:
child.material = new THREE.MeshBasicMaterial({
    side: THREE.DoubleSide,
    color: 0x000000
    // ...
});
I created a skybox material with textures in a project of my own:
function getSkyboxMaterial() {
    var faceMaterials = getSkyboxFaces();
    var skyboxMaterial = new THREE.MeshFaceMaterial(faceMaterials);
    return skyboxMaterial;
}

function getSkyboxFaces() {
    var NUMBER_OF_FACES = 6, faces = [], texture, faceMaterial, texturePath, i;
    for (i = 0; i < NUMBER_OF_FACES; i++) {
        texturePath = IMAGE_PREFIX + DIRECTIONS[i] + IMAGE_SUFFIX;
        texture = loadFlippedTexture( texturePath );
        faceMaterial = getFaceMaterial( texture );
        faces.push( faceMaterial );
    }
    return faces;
}

function loadFlippedTexture(texturePath) {
    var texture = loadTexture(texturePath);
    flipTexture(texture); // This is necessary, because the skybox textures are mirrored.
    return texture;
}

function loadTexture(path) {
    return THREE.ImageUtils.loadTexture(path);
}

function flipTexture(texture) {
    texture.repeat.set(-1, 1);
    texture.offset.set(1, 0);
    return texture;
}

function getFaceMaterial(texture) {
    var faceMaterial = new THREE.MeshBasicMaterial({
        map: texture,
        side: THREE.DoubleSide
    });
    return faceMaterial;
}
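Usage might look like this (the box size here, and the IMAGE_PREFIX/DIRECTIONS/IMAGE_SUFFIX constants used above, are specific to my project, so treat them as placeholders):

var skyboxGeometry = new THREE.BoxGeometry(1000, 1000, 1000); // size assumed
var skybox = new THREE.Mesh(skyboxGeometry, getSkyboxMaterial());
scene.add(skybox);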
Related
I have a scene where I want to add a background. I have a PNG image (2K resolution); when I try it on PC it is the right size, but on mobile it is quite distorted.
My code is the following:
var texture = THREE.ImageUtils.loadTexture('img/texture.png');
And to add it as background is just this:
scene = new THREE.Scene();
scene.background = texture;
From some searching I've seen that I may have to create a separate scene for the background, but I don't think that's the easiest solution. Is there a better way to do this?
(As always, sorry for my bad English.)
You can try approaching this with THREE.ShaderMaterial:
class MyBackgroundPlane extends THREE.Mesh {
    constructor() {
        super(
            new THREE.PlaneBufferGeometry(2, 2, 1, 1),
            new THREE.ShaderMaterial({
                uniforms: {
                    uTexture: { value: null },
                    uAspect: { value: 1 }
                },
                vertexShader: `
                    varying vec2 vUv;
                    uniform float uAspect;
                    void main(){
                        vUv = uv;         // pass coordinates to the fragment shader
                        vUv.x *= uAspect; // scale the coordinates
                        gl_Position = vec4(position.xy, 1., 1.);
                    }
                `,
                fragmentShader: `
                    varying vec2 vUv;
                    uniform sampler2D uTexture;
                    void main(){
                        gl_FragColor = texture2D( uTexture, vUv );
                    }
                `
            })
        )
        this.frustumCulled = false
    }
    setAspect( aspect ){
        this.material.uniforms.uAspect.value = aspect
    }
    setTexture( texture ){
        this.material.uniforms.uTexture.value = texture
    }
}
You kinda have to figure out what needs to happen when it's portrait and when it's landscape.
One approach could be to use a uniform vec2 uScale; and then set the vertical and horizontal aspects differently depending on the orientation.
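A minimal sketch of that vertex shader tweak (the uScale name is just the suggestion above; computing its value per orientation is up to you):

varying vec2 vUv;
uniform vec2 uScale;
void main(){
    vUv = uv * uScale; // scale u and v independently for portrait/landscape
    gl_Position = vec4(position.xy, 1., 1.);
}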
The same thing could be done with the scene graph, by attaching a regular plane to the camera for example, and then managing its scale.
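A rough sketch of that scene-graph variant (the distance and the sizing math are assumptions, not a drop-in solution):

// hypothetical: a textured plane parented to the camera
var backgroundPlane = new THREE.Mesh(
    new THREE.PlaneBufferGeometry(1, 1),
    new THREE.MeshBasicMaterial({ map: texture, depthWrite: false })
);
backgroundPlane.position.z = -10; // push it away from the camera
camera.add(backgroundPlane);
scene.add(camera); // the camera must be part of the scene graph

// scale the plane so it fills the frustum at that distance
var planeHeight = 2 * Math.tan(THREE.Math.degToRad(camera.fov) * 0.5) * 10;
backgroundPlane.scale.set(planeHeight * camera.aspect, planeHeight, 1);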
As an alternative, you can use a CSS-based background:
#background {
background-image: url('http://youring.com/test/img/texture.png');
position: fixed;
top: 0;
left: 0;
height: 100%;
width: 100%;
z-index: -1;
}
Just create your renderer like this so it's possible to see through the canvas:
renderer = new THREE.WebGLRenderer( { antialias: true, alpha: true } );
DEMO: https://jsfiddle.net/f2Lommf5/5052/
I was wondering: since RGBA vertex colors are not supported in three.js (the alpha component is not used), is there a way to make a face with an opacity gradient?
I saw it's probably possible with a ShaderMaterial, using custom attributes, but as I'm new to WebGL, I don't really understand it yet.
attributes = {
    // ...
    customColor: { type: 'v4', value: [] }
    // ...
};

var values_color = attributes.customColor.value;

for( var v = 0; v < vertices.length; v++ ) {
    // ...
    values_color[ v ] = new THREE.Vector4();
    // ...
}
I would like to do something like this, but with transparency: http://jsfiddle.net/FtML5/3/
You can use THREE.ShaderMaterial with a custom vertex attribute for the alpha value. Here is a step-by-step guide:
1) In your vertex shader, declare an attribute float which will take the alpha value. Also declare a varying float in both the vertex and fragment shaders.
Vertex shader:
attribute float alphaValue;
varying float vAlphaValue;
Fragment shader:
varying float vAlphaValue;
2) Assign the alpha attribute value to the varying in the vertex shader.
Vertex shader:
vAlphaValue = alphaValue;
3) After all the calculations are done, assign the varying alpha value to the alpha component of gl_FragColor.
Fragment shader:
gl_FragColor.a = vAlphaValue;
4) On the host side, create an array with one entry per vertex. Here is a code sample:
var geometry = new THREE.BufferGeometry();
geometry.addAttribute('position', new THREE.BufferAttribute(vertices, 3));

var alphaArray = [];
var alphaArrayLength = vertices.length / 3; // one alpha per vertex (positions come in xyz triples)

for (var i = 0; i < alphaArrayLength; i++) {
    alphaArray.push(0.5);
}
5) Add a custom attribute for the alpha value to the geometry and fill it with the created array:
geometry.addAttribute('alphaValue', new THREE.BufferAttribute(new Float32Array(alphaArray), 1));
6) Create a THREE.ShaderMaterial:
var material = new THREE.ShaderMaterial({
    vertexColors: THREE.VertexColors,
    side: THREE.DoubleSide,
    transparent: true,
    vertexShader: document.getElementById('vertex_shader_for_face').text,
    fragmentShader: document.getElementById('fragment_shader_for_face').text
});
7) Create the mesh with the geometry and material:
var mesh = new THREE.Mesh(geometry, material);
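Putting the shader snippets from steps 1-3 together, a minimal pair might look like this (the script IDs match step 6; the white base color is just an example):

<script id="vertex_shader_for_face" type="x-shader/x-vertex">
attribute float alphaValue;
varying float vAlphaValue;
void main() {
    vAlphaValue = alphaValue;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
</script>
<script id="fragment_shader_for_face" type="x-shader/x-fragment">
varying float vAlphaValue;
void main() {
    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
    gl_FragColor.a = vAlphaValue;
}
</script>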
The quickest solution seems to be using a custom shader and setting the fragment opacity based on the UV values.
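For example, a fragment shader along these lines (assuming a varying vec2 vUv passed from the vertex shader as usual, and transparent: true on the material):

varying vec2 vUv;
void main() {
    // fade from opaque at the bottom of the face to transparent at the top
    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0 - vUv.y);
}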
In the past it was possible to incorporate the shadow calculations in custom shaders as described here and summed up here.
With r75, the lighting and shadow systems seem to have been merged, which changed this. I attempted to work my way through the source to understand it, but the abstraction/modules are a little tricky to follow.
I've distilled my shaders down to what I have so far:
Vertex:
#chunk(shadowmap_pars_vertex);
void main() {
vec4 worldPosition = modelMatrix * vec4(position, 1.0);
vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
gl_Position = projectionMatrix * mvPosition;
#chunk(shadowmap_vertex);
}
Fragment:
#chunk(common);
#chunk(lights_pars);
#chunk(shadowmap_pars_fragment);
void main() {
//#if ( NUM_DIR_LIGHTS > 0 )
IncidentLight directLight;
DirectionalLight directionalLight;
float shadowValue = 1.0;
for ( int i = 0; i < NUM_DIR_LIGHTS; i ++ ) {
directionalLight = directionalLights[ i ];
shadowValue = getShadow( directionalShadowMap[ i ], directionalLight.shadowMapSize, directionalLight.shadowBias, directionalLight.shadowRadius, vDirectionalShadowCoord[ i ] );
}
//#endif
gl_FragColor = vec4(vec3(shadowValue), 1.0);
}
I pulled the directional light loop from the lights_template chunk. Unfortunately, shadowValue always seems to come back as 1.0, though the shader compiles and renders correctly otherwise.
My JS has the appropriate castShadow and receiveShadow set. Other meshes using Phong render shadows correctly.
Thanks so much in advance.
Edit:
Adding material.lights = true; to the ShaderMaterial makes something appear; however, the value of shadowValue in the fragment shader is clearly incorrect on the side of the sphere facing away from the light. Screenshots attached.
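For reference, this is roughly how a ShaderMaterial gets wired up to the renderer's light and shadow uniforms in the r7x API; a sketch, assuming the shader sources above (merging THREE.UniformsLib['lights'] is what populates directionalLights and the shadow map uniforms):

var material = new THREE.ShaderMaterial({
    uniforms: THREE.UniformsUtils.merge([
        THREE.UniformsLib['lights']
        // plus any uniforms of your own
    ]),
    vertexShader: vertexSource,     // the vertex shader above
    fragmentShader: fragmentSource, // the fragment shader above
    lights: true // have the renderer feed its light uniforms to this material
});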
In Three js, I'm using a vertex shader to animate a large geometry.
I've also set up a Depth of Field effect on the output. The problem is that the Depth of Field effect doesn't seem to know about the changed positioning created in my vertex shader. It is responding as if the geometry is in the original position.
How can I update the depth information in my shader/material so that the DOF works correctly? THREE.Material has a depthWrite property, but it doesn't seem to be that...
My depth of field pass works like this:
renderer.render( this.originalScene, this.originalCamera, this.rtTextureColor, true );
this.originalScene.overrideMaterial = this.material_depth;
renderer.render( this.originalScene, this.originalCamera, this.rtTextureDepth, true );
rtTextureColor and rtTextureDepth are both WebGLRenderTargets. For some reason rtTextureColor is correct, but rtTextureDepth is not.
Here is my vertex shader:
int sphereIndex = int(floor(position.x/10.));
float displacementVal = displacement[sphereIndex].w;
vec3 rotationDisplacement = displacement[sphereIndex].xyz;
vNormal = normalize( normalMatrix * normal );
vec3 vNormel = normalize( normalMatrix * viewVector );
intensity = abs(pow( c - dot(vNormal, vNormel), p ));
float xVal = (displacementVal*orbitMultiplier) * sin(timeValue*rotationDisplacement.x);
float yVal = (displacementVal*orbitMultiplier) * cos(timeValue*rotationDisplacement.y);
float zVal = 0.0; // must be a float literal; a plain 0 will not compile in GLSL
vec3 rotatePosition = vec3(xVal,yVal,zVal);
vec3 newPos = (position-vec3((10.*floor(position.x/10.)),0,0))+rotatePosition;
vec4 mvPosition;
mvPosition = (modelViewMatrix * vec4(newPos,1));
vViewPosition = -mvPosition.xyz;
vec4 p = projectionMatrix * mvPosition;
gl_Position = p;
Because you set the scene override material (this.originalScene.overrideMaterial = this.material_depth) before rendering into this.rtTextureDepth, the renderer doesn't use your custom vertex shader. The scene override material is a THREE.MeshDepthMaterial, which includes its own vertex shader.
One thing to try is writing a THREE.ShaderMaterial that works like THREE.MeshDepthMaterial but uses your custom vertex shader. Modifying built-in shaders isn't straightforward, but I would start from something like this:
var depthShader = THREE.ShaderLib['depth'];
var uniforms = THREE.UniformsUtils.clone(depthShader.uniforms);

var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: customVertexShader, // your custom vertex shader source (a string)
    fragmentShader: depthShader.fragmentShader
});
You'll have to add the uniforms for your custom vertex shader and also set the uniforms for the built-in depth shaders; search WebGLRenderer.js in the three.js source for MeshDepthMaterial.
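Hooking it up would then mirror the depth pass from the question, with the custom material as the override (names reused from the code in the question):

// render color as before, then depth with the custom depth material
renderer.render( this.originalScene, this.originalCamera, this.rtTextureColor, true );
this.originalScene.overrideMaterial = material; // instead of the built-in MeshDepthMaterial
renderer.render( this.originalScene, this.originalCamera, this.rtTextureDepth, true );
this.originalScene.overrideMaterial = null; // reset for normal rendering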
I'm trying to make an app that will simulate long exposure photography. The idea is that I grab the current frame from the webcam and composite it onto a canvas. Over time, the photo will 'expose', getting brighter and brighter. (see http://www.chromeexperiments.com/detail/light-paint-live-mercury/?f=)
I have a shader that works perfectly. It's just like the 'add' blend mode in photoshop. The problem is that I can't get it to recycle the previous frame.
I thought that it would be something simple like renderer.autoClear = false; but that option seems to do nothing in this context.
Here's the code that uses THREE.EffectComposer to apply the shader.
onWebcamInit: function () {
    var $stream = $("#user-stream"),
        width = $stream.width(),
        height = $stream.height(),
        near = .1,
        far = 10000;

    this.renderer = new THREE.WebGLRenderer();
    this.renderer.setSize(width, height);
    this.renderer.autoClear = false;

    this.scene = new THREE.Scene();
    this.camera = new THREE.OrthographicCamera(width / -2, width / 2, height / 2, height / -2, near, far);
    this.scene.add(this.camera);
    this.$el.append(this.renderer.domElement);

    this.frameTexture = new THREE.Texture(document.querySelector("#webcam"));
    this.compositeTexture = new THREE.Texture(this.renderer.domElement);

    this.composer = new THREE.EffectComposer(this.renderer);

    // same effect with or without this line
    // this.composer.addPass(new THREE.RenderPass(this.scene, this.camera));

    var addEffect = new THREE.ShaderPass(addShader);
    addEffect.uniforms[ 'exposure' ].value = .5;
    addEffect.uniforms[ 'frameTexture' ].value = this.frameTexture;
    addEffect.renderToScreen = true;
    this.composer.addPass(addEffect);

    this.plane = new THREE.Mesh(new THREE.PlaneGeometry(width, height, 1, 1), new THREE.MeshBasicMaterial({ map: this.compositeTexture }));
    this.scene.add(this.plane);

    this.frameTexture.needsUpdate = true;
    this.compositeTexture.needsUpdate = true;

    new FrameImpulse(this.renderFrame);
},

renderFrame: function () {
    this.frameTexture.needsUpdate = true;
    this.compositeTexture.needsUpdate = true;
    this.composer.render();
}
Here is the shader. Nothing fancy.
uniforms: {
    "tDiffuse": { type: "t", value: null },
    "frameTexture": { type: "t", value: null },
    "exposure": { type: "f", value: 1.0 }
},
vertexShader: [
    "varying vec2 vUv;",
    "void main() {",
    "    vUv = uv;",
    "    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
    "}"
].join("\n"),
fragmentShader: [
    "uniform sampler2D frameTexture;",
    "uniform sampler2D tDiffuse;",
    "uniform float exposure;",
    "varying vec2 vUv;",
    "void main() {",
    "    vec4 n = texture2D(frameTexture, vUv);",
    "    vec4 o = texture2D(tDiffuse, vUv);",
    "    vec3 sum = n.rgb + o.rgb;",
    "    gl_FragColor = vec4(mix(o.rgb, sum.rgb, exposure), 1.0);",
    "}"
].join("\n")
This is in essence equivalent to posit labs' answer, but I've had success with a more streamlined solution: I create an EffectComposer with only the ShaderPass I want recycled, then swap renderTargets for that composer on each render.
Initialization:
THREE.EffectComposer.prototype.swapTargets = function() {
    var tmp = this.renderTarget2;
    this.renderTarget2 = this.renderTarget1;
    this.renderTarget1 = tmp;
};

...

composer = new THREE.EffectComposer(renderer,
    new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat })
);

var addEffect = new THREE.ShaderPass(addShader, 'frameTexture');
addEffect.renderToScreen = true;
composer.addPass(addEffect); // note: composer, not this.composer, matching the declaration above
render:
composer.render();
composer.swapTargets();
A secondary EffectComposer can then take one of the two renderTargets and push it to the screen or transform it further.
Also note I pass "frameTexture" as the textureID when initializing the ShaderPass. This lets the ShaderPass know to update the frameTexture uniform with the result of the previous pass.
To achieve this kind of feedback effect, you have to alternate writing to separate instances of WebGLRenderTarget. Otherwise, the frame buffer is overwritten. Not totally sure why this happens, but presumably a texture can't be read from and written to in the same pass. Here is the solution.
init:
this.rt1 = new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat });
this.rt2 = new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat });
render:
this.renderer.render(this.scene, this.camera);
this.renderer.render(this.scene, this.camera, this.rt1, false);
// swap buffers
var a = this.rt2;
this.rt2 = this.rt1;
this.rt1 = a;
this.shaders.add.uniforms.tDiffuse.value = this.rt2;
Try creating your renderer with preserveDrawingBuffer, which keeps the canvas contents between frames instead of clearing them:
this.renderer = new THREE.WebGLRenderer( { preserveDrawingBuffer: true } );