I'm trying to make an app that will simulate long exposure photography. The idea is that I grab the current frame from the webcam and composite it onto a canvas. Over time, the photo will 'expose', getting brighter and brighter. (see http://www.chromeexperiments.com/detail/light-paint-live-mercury/?f=)
I have a shader that works perfectly. It's just like the 'add' blend mode in Photoshop. The problem is that I can't get it to recycle the previous frame.
I thought that it would be something simple like renderer.autoClear = false; but that option seems to do nothing in this context.
Here's the code that uses THREE.EffectComposer to apply the shader.
onWebcamInit: function () {
var $stream = $("#user-stream"),
width = $stream.width(),
height = $stream.height(),
near = .1,
far = 10000;
this.renderer = new THREE.WebGLRenderer();
this.renderer.setSize(width, height);
this.renderer.autoClear = false;
this.scene = new THREE.Scene();
this.camera = new THREE.OrthographicCamera(width / -2, width / 2, height / 2, height / -2, near, far);
this.scene.add(this.camera);
this.$el.append(this.renderer.domElement);
this.frameTexture = new THREE.Texture(document.querySelector("#webcam"));
this.compositeTexture = new THREE.Texture(this.renderer.domElement);
this.composer = new THREE.EffectComposer(this.renderer);
// same effect with or without this line
// this.composer.addPass(new THREE.RenderPass(this.scene, this.camera));
var addEffect = new THREE.ShaderPass(addShader);
addEffect.uniforms[ 'exposure' ].value = .5;
addEffect.uniforms[ 'frameTexture' ].value = this.frameTexture;
addEffect.renderToScreen = true;
this.composer.addPass(addEffect);
this.plane = new THREE.Mesh(new THREE.PlaneGeometry(width, height, 1, 1), new THREE.MeshBasicMaterial({map: this.compositeTexture}));
this.scene.add(this.plane);
this.frameTexture.needsUpdate = true;
this.compositeTexture.needsUpdate = true;
new FrameImpulse(this.renderFrame);
},
renderFrame: function () {
this.frameTexture.needsUpdate = true;
this.compositeTexture.needsUpdate = true;
this.composer.render();
}
Here is the shader. Nothing fancy.
uniforms: {
"tDiffuse": { type: "t", value: null },
"frameTexture": { type: "t", value: null },
"exposure": { type: "f", value: 1.0 }
},
vertexShader: [
"varying vec2 vUv;",
"void main() {",
"vUv = uv;",
"gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
"}"
].join("\n"),
fragmentShader: [
"uniform sampler2D frameTexture;",
"uniform sampler2D tDiffuse;",
"uniform float exposure;",
"varying vec2 vUv;",
"void main() {",
"vec4 n = texture2D(frameTexture, vUv);",
"vec4 o = texture2D(tDiffuse, vUv);",
"vec3 sum = n.rgb + o.rgb;",
"gl_FragColor = vec4(mix(o.rgb, sum.rgb, exposure), 1.0);",
"}"
].join("\n")
This is in essence equivalent to posit labs' answer, but I've had success with a more streamlined solution: I create an EffectComposer with only the ShaderPass I want recycled, then swap renderTargets for that composer with each render.
Initialization:
THREE.EffectComposer.prototype.swapTargets = function() {
var tmp = this.renderTarget2;
this.renderTarget2 = this.renderTarget1;
this.renderTarget1 = tmp;
};
...
composer = new THREE.EffectComposer(renderer,
new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat })
);
var addEffect = new THREE.ShaderPass(addShader, 'frameTexture');
addEffect.renderToScreen = true;
this.composer.addPass(addEffect);
render:
composer.render();
composer.swapTargets();
A secondary EffectComposer can then take one of the two renderTargets and push it to the screen or transform it further.
Also note that I declare "frameTexture" as the textureID when initializing the ShaderPass. This lets the ShaderPass know to update the frameTexture uniform with the result of the previous pass.
To achieve this kind of feedback effect, you have to alternate writing to separate instances of WebGLRenderTarget; otherwise, the frame buffer is overwritten. This happens because WebGL cannot sample a texture that is simultaneously bound as the render target, so reading and writing the same buffer in a single pass is undefined. Here is the solution.
init:
this.rt1 = new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat });
this.rt2 = new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat });
render:
this.renderer.render(this.scene, this.camera);
this.renderer.render(this.scene, this.camera, this.rt1, false);
// swap buffers
var a = this.rt2;
this.rt2 = this.rt1;
this.rt1 = a;
this.shaders.add.uniforms.tDiffuse.value = this.rt2;
Try with this:
this.renderer = new THREE.WebGLRenderer( { preserveDrawingBuffer: true } );
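On its own this only stops the browser from discarding the canvas after each frame; to actually accumulate light you still need to stop clearing before each render. A minimal sketch combining the two flags, assuming the rest of the question's setup:
// keep the drawing buffer between frames...
this.renderer = new THREE.WebGLRenderer({ preserveDrawingBuffer: true });
// ...and don't clear it before each render, so new frames composite on top
this.renderer.autoClear = false;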
I want to do an audio input visualizer with Phaser 3. I'm trying to get the mic input to a shader, but I can't find a way to make it work.
I have a basic understanding of shaders and I can work with textures that are images, but I really don't understand how to provide a sound. I checked a working example made in three.js (three.js webaudio - visualizer), and I already managed to get the sound input from the microphone as a Uint8Array of 1024 numbers.
Here’s the shader I’m using:
// simplesound.glsl.js
#ifdef GL_ES
precision highp float;
#endif
precision mediump float;
uniform vec2 resolution;
uniform sampler2D iChannel0;
varying vec2 fragCoord;
void main() {
vec2 uv = fragCoord.xy / resolution.xy;
vec2 mu = texture2D(iChannel0, uv).rg;
float y = uv.y - mu.x;
y = smoothstep(0., 0.02, abs(y - 0.1));
gl_FragColor = vec4(y);
}
And here's my scene code, trying to make it work:
import Phaser from 'phaser';
// This will provide the array mentioned above with code that will use `navigator.getUserMedia`.
import { setupAudioContext } from '../audiostream';
export default class MainScene2 extends Phaser.Scene {
constructor() {
super({ key: 'MainScene2' });
}
preload() {
this.load.glsl('simplesound', '/static/simplesound.glsl.js');
}
create() {
this.shader = this.add.shader('simplesound', 400, 300, 800, 600);
// When the user presses the 'g' key we will start listening for mic input
const GKey = this.input.keyboard.addKey('G');
GKey.on('down', () => {
setupAudioContext((array) => {
// this array is the array mentioned above, in the three.js example they do something like creating
// a texture from this input and providing that texture to the shader uniform. I tried different things but
// nothing worked :(
//
// I tried using this.shader.setChannel0 and this.shader.setUniform but nothing seems to work as well.
});
});
}
}
I've been trying to make this work for a while, but couldn't get anything :(
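For context, the ../audiostream helper isn't shown in the question, but a sketch of what its setupAudioContext could look like follows; this is an assumption, shaped only by the description above (navigator.getUserMedia plus a Uint8Array of 1024 frequency values handed to a callback):
export function setupAudioContext(onData) {
    // ask for the microphone and wire it into an AnalyserNode
    navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
        const ctx = new (window.AudioContext || window.webkitAudioContext)();
        const analyser = ctx.createAnalyser();
        analyser.fftSize = 2048; // frequencyBinCount = 1024, matching the question
        ctx.createMediaStreamSource(stream).connect(analyser);
        const data = new Uint8Array(analyser.frequencyBinCount);
        (function poll() {
            analyser.getByteFrequencyData(data);
            onData(data); // hand the 1024 frequency values to the caller
            requestAnimationFrame(poll);
        })();
    });
}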
A possible solution without shaders, using only Phaser and JavaScript, could look like this (really no shader, but I'm also really interested in what a shader version would look like).
In this demo I'm using data from an audio file. To make it work for your use case, you would just have to plug the microphone data into the data variable.
Demo:
(Comments in the code highlight the main idea.)
Click and wait some seconds. By the way, I added some screen shake to give the demo more juice.
document.body.style = 'margin:0;';
var data = [];
var playing = -1;
var audioContext = new (window.AudioContext || window.webkitAudioContext)();
var analyser = audioContext.createAnalyser();
var buffer;
var source;
var url = 'https://labs.phaser.io/assets/audio/Rossini - William Tell Overture (8 Bits Version)/left.ogg'
// START Audio part for Demo
function loadAudio() {
var request = new XMLHttpRequest();
request.open('GET', url, true);
request.responseType = 'arraybuffer';
request.onload = function() {
audioContext.decodeAudioData(request.response, function(buf) {
buffer = buf;
playAudio();
});
};
request.send();
}
function playAudio() {
source = audioContext.createBufferSource();
source.buffer = buffer;
source.connect(audioContext.destination);
source.connect(analyser);
source.start(0);
}
// END Audio part for Demo
var config = {
type: Phaser.AUTO,
width: 536,
height: 183,
scene: {
create,
update
},
banner: false
};
var game = new Phaser.Game(config);
// this would be the variable that should be updated from the audio source
var markers;
var createRandomData = true;
function create () {
// Start create Marker texture
// this could be removed if you want to load an actual image
let g = this.make.graphics({x: 0, y: 0, add: false});
g.lineStyle(10, 0xffffff);
g.beginPath();
g.moveTo(0, 0);
g.lineTo(50, 0);
g.strokePath();
g.generateTexture('marker', 30, 10);
// End create Marker texture
// Create the markers
// the repeat property sets how many markers you want to display; if you want all 1024, that would be your value
markers = this.add.group({ key: 'marker', repeat: 50,
setXY: { x: 10, y: 10, stepX: 35 }, setOrigin: { x: 0, y: 0}});
this.add.rectangle(10, 10, 180, 20, 0).setOrigin(0);
let label = this.add.text( 10, 10, 'Click to start music', {color: 'red', fontSize:'20px', fontStyle:'bold'} )
// start and stop the playback of music
this.input.on('pointerdown', function () {
switch (playing) {
case -1:
loadAudio();
playing = 1;
label.setText('Click to stop music');
break;
case 0:
playAudio();
playing = 1;
label.setText('Click to stop music');
break;
case 1:
source.stop();
playing = 0;
label.setText('Click to start music');
break;
}
});
}
function update(){
if (markers){
// here we update the y-position of each marker depending on the value of the data
// (min y = 10 and max y ~ 245)
markers.children.iterate(function (child, idx) {
child.y = 10 + (config.height - 20) / 255 * data[idx];
// you could even add some camera shake, for more effect
if(idx < 3 && data[idx] > 253){
this.cameras.main.shake(30);
}
}, this);
// if the analyser is valid, update the data variable
// this part could live somewhere else, I just wanted to keep the code concise
if(analyser){
var spectrums = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(spectrums);
// convert the spectrum to a plain array and update the data variable
data = [].slice.call(spectrums);
}
}
}
<script src="https://cdn.jsdelivr.net/npm/phaser@3.55.2/dist/phaser.js"></script>
Basically this application "only" alters the Y-positions of each marker, based on the value returned from the audio array, which is loaded from the audio file.
Disclaimer: this is rough demo code, and could use some cleanup/improvement, if it should be used in production.
This is a working solution, using the mentioned shader, through Phaser.
After a short deep dive into GLSL and starting to understand shaders and the surrounding concepts, I found out how shaders work with Phaser (at least for this task); it is actually pretty elegant and straightforward how it is implemented.
Basically you "just" need to:
load the shader in preload
this.load.glsl('simplesound', 'simplesound.glsl.js');
add the shader to the scene, in the create function for example
this.shader = this.add.shader('simplesound', x, y, width, height);
create two textures to pass to the shader (so that you can swap them), in the create function
(be careful to select a texture size that respects the power-of-two requirement)
this.g = this.make.graphics({ x: 0, y: 0, add: false });
this.g.generateTexture('tex0', 64, 64);
this.g.generateTexture('tex1', 64, 64);
In the update function, create a texture that represents the audio data
...
var spectrums = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(spectrums);
data = [].slice.call(spectrums);
...
let g = this.make.graphics({add:false});
data.forEach( (value, idx ) => {
g.fillStyle(value <<16 );
g.fillRect(idx * 10, 0, 10, value);
});
and at last, update the next texture and pass the texture key of the newly recreated texture to the shader
if(this.shader.getUniform('iChannel0').textureKey == 'tex0'){
this.textures.remove('tex1');
g.generateTexture('tex1', 64, 64);
this.shader.setChannel0('tex1');
} else {
this.textures.remove('tex0');
g.generateTexture('tex0', 64, 64);
this.shader.setChannel0('tex0');
}
Of course this code could be optimized with functions and better variable and texture-key naming, but this is left to the reader.
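As one example of such a cleanup, the whole swap could be pulled into a small helper; the function name and parameters here are mine, not Phaser's:
// swap to whichever texture key the shader is NOT currently using
function updateChannel0(scene, shader, g, size) {
    const next = shader.getUniform('iChannel0').textureKey === 'tex0' ? 'tex1' : 'tex0';
    if (scene.textures.exists(next)) {
        scene.textures.remove(next);
    }
    g.generateTexture(next, size, size);
    shader.setChannel0(next);
}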
Small working Demo:
(uses an audio file instead of the mic, and is based on the code from my previous answer)
document.body.style = 'margin:0;';
const TEXTURE_SIZE = 128;
const glslScript = `#ifdef GL_ES
precision highp float;
#endif
precision mediump float;
uniform vec2 resolution;
uniform sampler2D iChannel0;
varying vec2 fragCoord;
void main() {
vec2 uv = fragCoord.xy / resolution.xy;
vec2 mu = texture2D(iChannel0, uv).rg;
float y = uv.y - mu.x;
y = smoothstep(0., 0.02, abs(y - 0.1));
gl_FragColor = vec4(y);
}`;
var playing = -1;
var audioContext = new (window.AudioContext || window.webkitAudioContext)();
var analyser = audioContext.createAnalyser();
var source;
var url = 'https://labs.phaser.io/assets/audio/Rossini - William Tell Overture (8 Bits Version)/left.ogg'
// START Audio part for Demo
function loadAudio() {
var request = new XMLHttpRequest();
request.open('GET', url, true);
request.responseType = 'arraybuffer';
request.onload = function () {
audioContext.decodeAudioData(request.response, function (buf) {
playAudio(buf);
});
};
request.send();
}
function playAudio(buffer) {
source = audioContext.createBufferSource();
source.buffer = buffer;
source.connect(audioContext.destination);
source.connect(analyser);
source.start(0);
}
// END Audio part for Demo
var config = {
type: Phaser.WEBGL,
width: 536,
height: 183,
scene: {
create,
update
},
banner: false
};
var game = new Phaser.Game(config);
function create() {
this.add.text(10, 10, 'Click to start and stop the music')
.setColor('#000000')
.setOrigin(0)
.setDepth(1000)
.setFontFamily('Arial');
this.add.rectangle(0, 0, config.width, 40, 0xffffff)
.setDepth(999)
.setOrigin(0);
var baseShader = new Phaser.Display.BaseShader('shader', glslScript);
this.shader = this.add.shader(baseShader, 0, 10, config.width, config.height - 10)
.setOrigin(0);
// start and stop the playback of music
this.input.on('pointerdown', function () {
switch (playing) {
case -1:
loadAudio();
playing = 1;
break;
case 0:
playAudio();
playing = 1;
break;
case 1:
source.stop();
playing = 0;
break;
}
});
}
function update() {
if (analyser) {
var spectrums = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(spectrums);
let data = [].slice.call(spectrums);
let g = this.make.graphics({ add: false });
data.forEach((value, idx) => {
g.fillStyle(value << 16);
g.fillRect(idx * 10, 0, 10, value);
});
let textureName = this.shader.getUniform('iChannel0').textureKey == 'tex0' ? 'tex1' : 'tex0';
if(this.textures.exists(textureName)){
this.textures.remove(textureName);
}
g.generateTexture(textureName, TEXTURE_SIZE, TEXTURE_SIZE);
this.shader.setChannel0(textureName);
}
}
<script src="https://cdn.jsdelivr.net/npm/phaser@3.55.2/dist/phaser.js"></script>
I'm trying to figure out how to code a slider that will change the radius of certain shapes. The assignment outputs as an HTML web document, and many of the ways I've tried to solve this problem end up causing it to output as a blank page. I have created a slider in my HTML file inside a table element, and created subsequent code in my JavaScript file. One tutorial suggested using object.setRadius to change the radius, but it was for a Java file, which could explain why it didn't work. I've tried multiple approaches; everything seems to allow the slider to move, but nothing ever affects the shapes. What am I doing wrong?
threeComponents2.js
"use strict";
let vertex_shader = `
attribute vec4 vPosition;
attribute vec4 vColor;
varying vec4 fcolor;
void main()
{
gl_PointSize = 10.0;
gl_Position = vPosition;
fcolor = vColor;
}
`;
let fragment_shader = `
precision mediump float;
varying vec4 fcolor;
void
main()
{
gl_FragColor = fcolor;
}
`;
let lg;
window.addEventListener("load", function init() {
lg = noLogging();
lg.insertAtEnd = false;
const canvas = document.getElementById( "gl-canvas" );
const gl = WebGLUtils.setupWebGL( canvas );
if ( !gl ) { alert( "WebGL isn't available" ); }
gl.viewport( 0, 0, canvas.width, canvas.height );
gl.clearColor( 1.0, 1.0, 1.0, 1.0 );
gl.enable(gl.DEPTH_TEST);
//
// Load shaders and initialize attribute buffers
//
const program = initShaders( gl, vertex_shader, fragment_shader);
gl.useProgram( program );
let node1 = new cs4722_objects.Node(new cs4722_objects.Cylinder(10));
node1.object.center = [.27,.25,.25,1];
node1.object.radius_top = .4;
node1.object.radius_bottom = .25;
node1.object.height = .5;
let node2 = new cs4722_objects.Node(new cs4722_objects.Sphere());
node2.object.center = [-.25,-.25,-.25,1];
node2.object.radius = .25;
node2.object.scheme_colors = [X11.SkyBlue3, X11.Gold2];
let node3 = new cs4722_objects.Node(new cs4722_objects.Block());
node3.object.center = [-.4, .25, 0, 1];
node3.object.height = .5;
node3.object.width = .5;
node3.object.scheme_colors = [X11.SkyBlue1, X11.SkyBlue2, X11.SkyBlue3, X11.Purple1, X11.Purple2, X11.Purple3];
let node0 = new cs4722_objects.Node();
node0.children.push(node2);
node0.children.push(node1);
node0.children.push(node3);
let scene = new cs4722_objects.Scene(node0);
scene.load_vertices(gl, program);
scene.load_colors(gl, program);
const csc = cs4722_controls;
function render() {
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
scene.draw(gl);
}
var sphereSlider = document.getElementById("radius");
var output = document.getElementById("demo");
// Update the current slider value (each time you drag the slider handle)
sphereSlider.oninput = function() {
output.innerHTML = this.value;
node2.setRadius(this.value);
}
/* csc.addCheckbox("controls", "sphere-vis", "Sphere Visible",
node2.visible,
(event) => {
node2.visible = event.target.checked;
},
render);
csc.addCheckbox("controls", "cylinder-vis", "Cylinder Visible",
node1.visible,
(event) => {
node1.visible = event.target.checked;
},
render);
csc.addCheckbox("controls", "block-vis", "Block Visible",
node3.visible,
(event) => {
node3.visible = event.target.checked;
},
render);*/
render();
});
Try looking at this; it might help you:
https://codepen.io/codedevlo/pen/BOOOaQ
I think it is something like this:
var slider = document.getElementById("slider").value;
ellipse(200, 200, slider); // use the slider value when drawing the shape
document.getElementById("demo").innerHTML = slider; // display the current value
and the HTML
<input type="range" min="40" max="400" value="40" id="slider" class="slider">
I think that should work.
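Tying this back to the question's code: Node has no setRadius method there; the radius was set in init as a plain property (node2.object.radius), so one hedged sketch of the handler would update that property, re-upload the geometry, and redraw. Whether load_vertices/load_colors is the right rebuild path in the cs4722 library is an assumption:
sphereSlider.oninput = function () {
    output.innerHTML = this.value;
    // update the property the init code used, not a setter
    // (the slider's 40-400 range may need scaling down to the scene's ~0-1 units)
    node2.object.radius = parseFloat(this.value);
    // re-upload the vertex data so the new radius reaches the GPU (assumed API)
    scene.load_vertices(gl, program);
    scene.load_colors(gl, program);
    render();
};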
I want to get the best performance possible when rendering simple textured shapes. The problem with the Phong model is that it requires extra lighting (which involves calculations), and the colors are not the desired ones and need some tweaking.
To simplify the case I've decided to use a simple flat shader, but some problems occur:
<script id="vertShader" type="shader">
varying vec2 vUv;
void main() {
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4(position,1.0);
}
</script>
<script id="fragShader" type="shader">
varying vec2 vUv;
uniform sampler2D material;
void main() {
gl_FragColor = texture2D(material, vUv);
}
</script>
Under certain camera angles some of the shelves disappear (you can notice the darker places and see through them), which does not occur with the Phong material:
It happens with the shadow texture put inside each shelf. It's a textured cube with a shadow texture put inside each space (don't ask me why, this is just the task I got :)).
I don't know what may be causing this. Maybe the loading?
I'm using the standard OBJ loader and adding textures. The OBJ loader sets the material to Phong, and I'm switching it to the custom shader like this:
var objLoader = new THREE.OBJLoader( manager );
objLoader.load( obj, function ( model ) {
elements[name] = model;
console.log('loaded ', name);
var img = THREE.ImageUtils.loadTexture(mat);
elements[name].traverse( function ( child ) {
if ( child instanceof THREE.Mesh ) {
child.material = new THREE.ShaderMaterial( {
uniforms: {
color: {type: 'f', value: 0.0},
material: {type: 't', value: img}
},
fragmentShader: document.getElementById('fragShader').text,
vertexShader: document.getElementById('vertShader').text,
} );
}
});
Any suggestions would be helpful.
Every surface is drawn with one winding direction (clockwise or counter-clockwise). If you are viewing a surface from the other side, it will "disappear" because back faces are culled. I think this is the problem with your own shader: you should either render the faces from both sides (at the cost of some performance) or work out from which side each face should be rendered.
To optimize performance slightly, you could use a standard material from THREE; you can use one without writing your own shader.
something like:
child.material = new THREE.MeshBasicMaterial({
side: THREE.DoubleSide,
color: 0x000000
// ...
});
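If you keep the custom shader instead, the same option applies, since ShaderMaterial also accepts side; the question's material could be created like this:
child.material = new THREE.ShaderMaterial({
    uniforms: {
        color: { type: 'f', value: 0.0 },
        material: { type: 't', value: img }
    },
    fragmentShader: document.getElementById('fragShader').text,
    vertexShader: document.getElementById('vertShader').text,
    side: THREE.DoubleSide // draw both faces so shelves don't vanish from behind
});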
I created a skybox material with textures in a project of my own:
function getSkyboxMaterial() {
var faceMaterials = getSkyboxFaces();
var skyboxMaterial = new THREE.MeshFaceMaterial(faceMaterials);
return skyboxMaterial;
}
function getSkyboxFaces() {
var NUMBER_OF_FACES = 6, faces = [], texture, faceMaterial, texturePath, i;
for (i = 0; i < NUMBER_OF_FACES; i++) {
texturePath = IMAGE_PREFIX + DIRECTIONS[i] + IMAGE_SUFFIX;
texture = loadFlippedTexture( texturePath );
faceMaterial = getFaceMaterial( texture );
faces.push( faceMaterial );
}
return faces;
}
function loadFlippedTexture(texturePath) {
var texture = loadTexture(texturePath);
flipTexture(texture); // This is necessary, because the skybox-textures are mirrored.
return texture;
}
function loadTexture(path) {
return THREE.ImageUtils.loadTexture(path);
}
function flipTexture(texture) {
texture.repeat.set(-1, 1);
texture.offset.set(1, 0);
return texture;
}
function getFaceMaterial(texture) {
var faceMaterial = new THREE.MeshBasicMaterial({
map: texture,
side: THREE.DoubleSide
});
return faceMaterial;
}
I'm trying to render a few objects to a canvas and I'm having a bit of trouble understanding what's not working.
I'm building two objects at the moment that represent the two meshes I want to render. If I create one mesh, the code works fine, so the problem, I think, is that the data gets screwed up when I'm building two or more.
Here's an example of the mesh data:
"name":"cone",
"buffers":{
"vertexPosition":{}, // Buffer
"vertexIndex":{} // Buffer
},
"mesh":{
"vertices":[], // emptied it to fit on page
"faces":[] // emptied it to fit on page
},
"mvMatrix": Float32Array[16],
"itemSize":3,
"numItems":12,
"program":{
"vertexPosAttrib":0,
"mvMatrixUniform":{},
"pMatrixUniform":{}
}
This is built from this function:
buildMeshData: function(){
this.mvMatrix = mat4.create();
this.buffers.vertexPosition = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, this.buffers.vertexPosition);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(this.mesh.vertices), gl.STATIC_DRAW);
this.buffers.vertexIndex = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.buffers.vertexIndex);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(this.mesh.faces), gl.STATIC_DRAW);
this.itemSize = 3;
this.numItems = this.mesh.faces.length;
var vertexProps = {
attributes: ['vec3', 'VertexPosition'],
uniforms: ['mat4', 'MVMatrix', 'mat4', 'PMatrix'],
varyings: ['vec3', 'TexCoord']
}
var vertexShaderFunction = 'vTexCoord = aVertexPosition + 0.5; gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1);';
var vshaderInput = utils.buildVertexShader(vertexProps, vertexShaderFunction);
var fragmentProps = {
attributes: [],
uniforms: [],
varyings: ['vec3', 'TexCoord']
}
var fragmentShaderFunction = 'gl_FragColor = vec4(vTexCoord, 1);';
var fshaderInput = utils.buildFragmentShader(fragmentProps, fragmentShaderFunction);
this.program = gl.createProgram();
var vshader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vshader, vshaderInput);
gl.compileShader(vshader);
var fshader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fshader, fshaderInput);
gl.compileShader(fshader);
gl.attachShader(this.program, vshader);
gl.attachShader(this.program, fshader);
gl.linkProgram(this.program);
gl.useProgram(this.program);
this.program.vertexPosAttrib = gl.getAttribLocation(this.program, 'aVertexPosition');
gl.vertexAttribPointer(this.program.vertexPosAttrib, this.itemSize, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(this.program.vertexPosAttrib);
this.program.mvMatrixUniform = gl.getUniformLocation(this.program, "uMVMatrix");
this.program.pMatrixUniform = gl.getUniformLocation(this.program, "uPMatrix");
scene.add(this);
}
and the render function goes like this:
function render(){
currentTime = new Date().getTime();
deltaTime = (currentTime - initialTime) / 1000; // in seconds
gl.viewport(0, 0, stage.width, stage.height);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
for(var i in scene.meshes){
(function(mesh){
mat4.translate(mesh.mvMatrix, mesh.mvMatrix, [0, 2 * i, -10 - (10 * i)]);
gl.useProgram(mesh.program);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, null);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, mesh.buffers.vertexIndex);
gl.vertexAttribPointer(mesh.program.vertexPosAttrib, mesh.itemSize, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(mesh.program.vertexPosAttrib);
gl.uniformMatrix4fv(mesh.program.mvMatrixUniform, false, mesh.mvMatrix);
gl.uniformMatrix4fv(mesh.program.pMatrixUniform, false, scene.pMatrix);
gl.drawElements(gl.TRIANGLES, mesh.numItems, gl.UNSIGNED_SHORT, 0);
gl.disableVertexAttribArray(mesh.program.vertexPosAttrib);
})(scene.meshes[i])
}
// requestAnimationFrame(render);
}
The result of this is that the second object is drawn correctly, but the first causes the error:
[.WebGLRenderingContext]GL ERROR :GL_INVALID_OPERATION : glDrawElements: attempt to access out of range vertices in attribute 0
...and is therefore not drawn.
Any ideas where the problem lies? Hopefully that's enough information from the code; I didn't want to put up too much, but if you need to see anything else I'll update.
This code
gl.vertexAttribPointer(this.program.vertexPosAttrib, this.itemSize, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(this.program.vertexPosAttrib);
needs to be called when drawing each mesh, not where it's called now. Additionally, before calling gl.vertexAttribPointer for this.program.vertexPosAttrib, you need to call
gl.bindBuffer(gl.ARRAY_BUFFER, mesh.buffers.vertexPosition);
Because gl.vertexAttribPointer binds the buffer currently bound to gl.ARRAY_BUFFER to the specified attribute.
In other words
gl.bindBuffer(gl.ARRAY_BUFFER, mesh.buffers.vertexPosition);
gl.vertexAttribPointer(mesh.program.vertexPosAttrib, mesh.itemSize, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(mesh.program.vertexPosAttrib);
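Put together, each mesh's draw inside the loop would look something like this, a sketch folding the advice above into the question's render function:
gl.useProgram(mesh.program);
// bind THIS mesh's vertex buffer before pointing the attribute at it
gl.bindBuffer(gl.ARRAY_BUFFER, mesh.buffers.vertexPosition);
gl.vertexAttribPointer(mesh.program.vertexPosAttrib, mesh.itemSize, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(mesh.program.vertexPosAttrib);
// and THIS mesh's index buffer
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, mesh.buffers.vertexIndex);
gl.uniformMatrix4fv(mesh.program.mvMatrixUniform, false, mesh.mvMatrix);
gl.uniformMatrix4fv(mesh.program.pMatrixUniform, false, scene.pMatrix);
gl.drawElements(gl.TRIANGLES, mesh.numItems, gl.UNSIGNED_SHORT, 0);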
I've modified this example: http://stemkoski.github.io/Three.js/Shader-Fireball.html
and inserted:
var depthShader = THREE.ShaderLib["depthRGBA"];
var depthUniforms = THREE.UniformsUtils.clone(depthShader.uniforms);
depthMaterial = new THREE.ShaderMaterial({
fragmentShader : depthShader.fragmentShader,
vertexShader : depthShader.vertexShader,
uniforms : depthUniforms
});
depthMaterial.blending = THREE.NoBlending;
depthTarget = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight, {
minFilter : THREE.NearestFilter,
magFilter : THREE.NearestFilter,
format : THREE.RGBAFormat
});
quadCamera = new THREE.OrthographicCamera(window.innerWidth / -2, window.innerHeight / 2, window.innerWidth / 2, window.innerHeight / -2, -1000, 2000);
quadCamera.position.z = 100;
var shader = THREE.UnpackDepthRGBAShader;
var uniforms = new THREE.UniformsUtils.clone(shader.uniforms);
uniforms.tDiffuse.value = depthTarget;
quadMaterial = new THREE.ShaderMaterial({
vertexShader : shader.vertexShader,
fragmentShader : shader.fragmentShader,
uniforms : uniforms
});
var mesh = new THREE.Mesh(new THREE.PlaneGeometry(window.innerWidth, window.innerHeight*1.5), quadMaterial);
mesh.position.z = -500;
mesh.position.y = 200;
quadScene = new THREE.Scene();
quadScene.add(mesh);
And changed the render function to :
function render() {
renderer.overrideMaterial = depthMaterial;
renderer.render(scene, camera, depthTarget, true);
renderer.overrideMaterial = null;
renderer.render(quadScene, quadCamera);
}
and it looks like: http://i.imgur.com/hiHLc8g.png
How do I get the depth buffer to look like a depth buffer and not be black?
Will custom ShaderMaterials that displace vertices write correctly to the depth buffer? I have another project with objects of displaced vertices, and the depth doesn't account for the displacement. Is there a way to do that?
Well, I think it is strange that the plane is black too; by your theory, wouldn't the plane be correctly colored if the problem only affected displaced vertices?
Maybe the camera's near and far clip plane values are off, so the depth isn't being handled correctly. Have you checked that the near and far values are in a sane range?
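For example, if the whole scene sits in a tiny slice of a huge [near, far] range, the packed depth values are nearly identical and the unpacked buffer can look uniformly black. A hedged sketch (the values are placeholders, not taken from the example):
camera.near = 1;
camera.far = 1000; // pull far in as close as the scene allows
camera.updateProjectionMatrix(); // required after changing near/far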