I am trying to create a skybox, and for that I am building a unit cube textured with a cube map texture.
I have my six image file names stored in an array:
const cubeImageSources = [
  "grimmnight_bk.jpg",
  "grimmnight_dn.jpg",
  "grimmnight_ft.jpg",
  "grimmnight_lf.jpg",
  "grimmnight_rt.jpg",
  "grimmnight_up.jpg",
];
and created the cube map texture in WebGL:
let indexedImage = new Image();
indexedImage.crossOrigin = "";
let cubeMapTexture = gl.createTexture();
gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_CUBE_MAP, cubeMapTexture);
for (let i = 0, length = cubeImageSources.length; i < length; i++) {
  indexedImage.src =
    `${window.location.origin}/game/images/texture/${cubeImageSources[i]}`;
  indexedImage.onLoad = function(){
    gl.texImage2D(
      gl.TEXTURE_CUBE_MAP_POSITIVE_X + i,
      0,
      gl.RGBA,
      gl.RGBA,
      gl.UNSIGNED_BYTE,
      indexedImage
    );
  }
}
gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
In the GLSL fragment shader:
precision highp float;
in vec3 texPosition;
// texPosition is the position attribute passed from the vertex shader
out vec4 outColor;
uniform samplerCube u_SkyTexture;
void main(){
  // outColor = vec4(0.0, 1.0, 0.0, 1.0);
  outColor = texture(u_SkyTexture, texPosition);
}
The samplerCube uniform is assigned the value 1 with
gl.uniform1i(texIndexLocation, 1);
where texIndexLocation is the location of the samplerCube in the fragment shader.
I tried to Google it, but I couldn't find a solution to this problem or what is generating this error.
Thank you in advance !!!
I am currently working on a simple project in WebGL that requires rendering multiple 2D objects, each with a simple image textured on. The actual project generates a random number of objects, usually between 5 and 9, sets vertex data for them around the canvas to separate them, and is supposed to render them all. However, it will only render one at a time (usually the last, although I can change around gl.activeTexture to show other objects in the array). I tried to use a question on here about texture arrays in the shader, but to no avail, so I ended up creating a very simple test program that just tries to load two objects and textures, one on the left of the canvas and the other on the right.
From here I tried to do everything completely separately, even giving each object its own shaders, program, buffers and everything, and subsequently binding everything in the draw call for each before calling gl.drawElements for each. This still doesn't show me the correct result; only the second texture appears. However, it did lead me to discover what I believe is happening. By commenting out the bindings and draw call for the second one, the first texture shows up, but it appears at the location of the second texture, not where its vertices should be placing it. So I assume what is happening in this program (and my project code) is that it is in fact drawing both, but for some reason applying the vertices of the last drawn one to all of them, thus stacking them and only showing the top (or last drawn) one.
I have also tried a mishmash of tweaks to the code below: using only one program, using the same indices and texture coordinates; there are also some commented-out lines from trying to make calls in different orders. Anything commented out doesn't mean I necessarily think it is wrong or right; it's just left over from various things I've aimlessly tried at this point.
I have worked with OpenGL a little and had little to no trouble drawing multiple objects with their own textures, and I know that WebGL works differently from OpenGL in some ways, including textures, but I do not see where I am creating the issue. I'm sure it is something very simple, and any guidance would be greatly appreciated.
I apologize for the long block of code; it's pretty much just straight typing everything out that I believe to be needed, without trying to take any shortcuts. The initShaders call is from the WebGL JS files I'm using from my textbook and isn't something I've written, and the loadImage call simply loads an <img> from the HTML code. There are no issues with the images being loaded correctly as far as I can tell. I only included the first vertex and fragment shader because the other two are the same save for the id.
<script id="vertex-shader1" type="x-shader/x-vertex">
attribute vec4 vPosition;
attribute vec2 vTexCoord;
varying vec2 fTexCoord;
void main() {
fTexCoord = vTexCoord;
gl_Position = vPosition;
}
</script>
<script id="fragment-shader1" type="x-shader/x-fragment">
precision mediump float;
varying vec2 fTexCoord;
uniform sampler2D texture;
void main() {
gl_FragColor = texture2D(texture, fTexCoord);
}
</script>
"use-strict"
var gl;
var images = [];
var program1;
var program2;
var texture1;
var texture2;
var vBuff1;
var vBuff2;
var iBuff1;
var iBuff2;
var tBuff1;
var tBuff2;
var vPos1;
var vPos2;
var fTexCoord1;
var fTexCoord2;
var sampler1;
var sampler2;
var vertices1 = [
vec4(-0.8, 0.1, 0.0, 1.0),
vec4(-0.8, 0.3, 0.0, 1.0),
vec4(-0.6, 0.3, 0.0, 1.0),
vec4(-0.6, 0.1, 0.0, 1.0)
];
var vertices2 = [
vec4(0.1, 0.1, 0.0, 1.0),
vec4(0.1, 0.3, 0.0, 1.0),
vec4(0.3, 0.3, 0.0, 1.0),
vec4(0.3, 0.1, 0.0, 1.0)
];
var indices1 = [
0, 1, 2,
0, 2, 3
];
var indices2 = [
0, 1, 2,
0, 2, 3
];
var tcs1 = [
vec2(0, 0),
vec2(0, 1),
vec2(1, 1),
vec2(1, 0)
];
var tcs2 = [
vec2(0, 0),
vec2(0, 1),
vec2(1, 1),
vec2(1, 0)
];
window.onload = function init() {
var canvas = document.getElementById("gl-canvas");
gl = WebGLUtils.setupWebGL(canvas);
if (!gl) { alert("WebGL isn't available"); }
gl.viewport(0, 0, canvas.width, canvas.height);
gl.clearColor(0.0, 0.0, 0.0, 1.0);
loadImages();
program1 = initShaders(gl, "vertex-shader1", "fragment-shader1");
gl.useProgram(program1);
vBuff1 = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vBuff1);
gl.bufferData(gl.ARRAY_BUFFER, flatten(vertices1), gl.STATIC_DRAW);
vPos1 = gl.getAttribLocation(program1, "vPosition");
gl.vertexAttribPointer(vPos1, 4, gl.FLOAT, false, 0, 0);
//gl.enableVertexAttribArray(vPos1);
iBuff1 = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, iBuff1);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint8Array(indices1), gl.STATIC_DRAW);
tBuff1 = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, tBuff1);
gl.bufferData(gl.ARRAY_BUFFER, flatten(tcs1), gl.STATIC_DRAW);
fTexCoord1 = gl.getAttribLocation(program1, "vTexCoord");
gl.vertexAttribPointer(fTexCoord1, 2, gl.FLOAT, false, 0, 0);
//gl.enableVertexAttribArray(fTexCoord1);
sampler1 = gl.getUniformLocation(program1, "texture");
texture1 = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture1);
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, images[0]);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.bindTexture(gl.TEXTURE_2D, null);
///////////////////////////////////////////////////////////////////////////////////////
/*
program2 = initShaders(gl, "vertex-shader2", "fragment-shader2");
gl.useProgram(program2);
*/
vBuff2 = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vBuff2);
gl.bufferData(gl.ARRAY_BUFFER, flatten(vertices2), gl.STATIC_DRAW);
vPos2 = gl.getAttribLocation(program1, "vPosition");
gl.vertexAttribPointer(vPos2, 4, gl.FLOAT, false, 0, 0);
//gl.enableVertexAttribArray(vPos2);
iBuff2 = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, iBuff2);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint8Array(indices2), gl.STATIC_DRAW);
tBuff2 = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, tBuff2);
gl.bufferData(gl.ARRAY_BUFFER, flatten(tcs2), gl.STATIC_DRAW);
fTexCoord2 = gl.getAttribLocation(program1, "vTexCoord");
gl.vertexAttribPointer(fTexCoord2, 2, gl.FLOAT, false, 0, 0);
//gl.enableVertexAttribArray(fTexCoord2);
sampler2 = gl.getUniformLocation(program1, "texture");
texture2 = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture2);
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, images[1]);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.bindTexture(gl.TEXTURE_2D, null);
render();
};
function render() {
gl.clear(gl.COLOR_BUFFER_BIT);
gl.useProgram(program1);
gl.bindBuffer(gl.ARRAY_BUFFER, vBuff1);
gl.enableVertexAttribArray(vPos1);
gl.enableVertexAttribArray(fTexCoord1);
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture1);
gl.uniform1i(sampler1, 0);
// gl.bindBuffer(gl.ARRAY_BUFFER, vBuff1);
// gl.enableVertexAttribArray(vPos1);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, iBuff1);
gl.drawElements(gl.TRIANGLES, indices1.length, gl.UNSIGNED_BYTE, 0);
//gl.bindTexture(gl.TEXTURE_2D, null);
// gl.useProgram(program2);
gl.bindBuffer(gl.ARRAY_BUFFER,vBuff2);
gl.enableVertexAttribArray(vPos2);
gl.enableVertexAttribArray(fTexCoord2);
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture2);
gl.uniform1i(sampler2, 0);
// gl.bindBuffer(gl.ARRAY_BUFFER, vBuff2);
// gl.enableVertexAttribArray(vPos2);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, iBuff2);
gl.drawElements(gl.TRIANGLES, indices2.length, gl.UNSIGNED_BYTE, 0);
requestAnimFrame(render);
}
First off, AFAIK your code can't work. It calls a function loadImages and then immediately uses the images. Images load asynchronously in the browser, so you either need a callback for when the images load or you need to use async functions.
Here is your code working. First I made a loadImage function that returns a Promise. Then I made an async function called loadImages that uses it to load all the images and wait for them to load. Then I made another async function called main that first waits for loadImages and then calls init.
The second issue is that in WebGL1 attributes are global state. That means you need to set them at render time, not init time, so the calls to gl.enableVertexAttribArray and gl.vertexAttribPointer need to happen at render time with the appropriate values for the particular thing you are rendering. gl.vertexAttribPointer copies the current ARRAY_BUFFER binding to that attribute.
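In other words, each object needs its own bind / vertexAttribPointer / enableVertexAttribArray sequence just before it is drawn. Here is a minimal sketch of that per-object pattern, reusing the vPos1 / fTexCoord1 / sampler1 locations from your code (the obj fields are illustrative names, not from your code):
function drawObject(obj) {
  // describe the position data that is bound RIGHT NOW to this attribute
  gl.bindBuffer(gl.ARRAY_BUFFER, obj.vertexBuffer);
  gl.enableVertexAttribArray(vPos1);
  gl.vertexAttribPointer(vPos1, 4, gl.FLOAT, false, 0, 0);
  // same for the texture coordinates
  gl.bindBuffer(gl.ARRAY_BUFFER, obj.texCoordBuffer);
  gl.enableVertexAttribArray(fTexCoord1);
  gl.vertexAttribPointer(fTexCoord1, 2, gl.FLOAT, false, 0, 0);
  // bind this object's texture to unit 0 and point the sampler at it
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, obj.texture);
  gl.uniform1i(sampler1, 0);
  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, obj.indexBuffer);
  gl.drawElements(gl.TRIANGLES, obj.numIndices, gl.UNSIGNED_BYTE, 0);
}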
You might find these tutorials helpful, in particular this one about attributes, and this state diagram that might help you visualize what is happening inside WebGL.
"use-strict";
const vec2 = (...args) => [...args];
const vec4 = (...args) => [...args];
const flatten = a => new Float32Array(a.flat());
const WebGLUtils = {
setupWebGL: (canvas) => { return canvas.getContext('webgl'); },
};
const initShaders = (gl, vs, fs) => twgl.createProgram(gl, [vs, fs]);
const requestAnimFrame = requestAnimationFrame;
var gl;
var images = [];
var program1;
var program2;
var texture1;
var texture2;
var vBuff1;
var vBuff2;
var iBuff1;
var iBuff2;
var tBuff1;
var tBuff2;
var vPos1;
var vPos2;
var fTexCoord1;
var fTexCoord2;
var sampler1;
var sampler2;
var vertices1 = [
vec4(-0.8, 0.1, 0.0, 1.0),
vec4(-0.8, 0.3, 0.0, 1.0),
vec4(-0.6, 0.3, 0.0, 1.0),
vec4(-0.6, 0.1, 0.0, 1.0)
];
var vertices2 = [
vec4(0.1, 0.1, 0.0, 1.0),
vec4(0.1, 0.3, 0.0, 1.0),
vec4(0.3, 0.3, 0.0, 1.0),
vec4(0.3, 0.1, 0.0, 1.0)
];
var indices1 = [
0, 1, 2,
0, 2, 3
];
var indices2 = [
0, 1, 2,
0, 2, 3
];
var tcs1 = [
vec2(0, 0),
vec2(0, 1),
vec2(1, 1),
vec2(1, 0)
];
var tcs2 = [
vec2(0, 0),
vec2(0, 1),
vec2(1, 1),
vec2(1, 0)
];
function init() {
var canvas = document.getElementById("gl-canvas");
gl = WebGLUtils.setupWebGL(canvas);
if (!gl) { alert("WebGL isn't available"); }
gl.viewport(0, 0, canvas.width, canvas.height);
gl.clearColor(0.0, 0.0, 0.0, 1.0);
program1 = initShaders(gl, "vertex-shader1", "fragment-shader1");
gl.useProgram(program1);
vBuff1 = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vBuff1);
gl.bufferData(gl.ARRAY_BUFFER, flatten(vertices1), gl.STATIC_DRAW);
vPos1 = gl.getAttribLocation(program1, "vPosition");
iBuff1 = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, iBuff1);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint8Array(indices1), gl.STATIC_DRAW);
tBuff1 = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, tBuff1);
gl.bufferData(gl.ARRAY_BUFFER, flatten(tcs1), gl.STATIC_DRAW);
fTexCoord1 = gl.getAttribLocation(program1, "vTexCoord");
sampler1 = gl.getUniformLocation(program1, "texture");
texture1 = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture1);
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, images[0]);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.bindTexture(gl.TEXTURE_2D, null);
///////////////////////////////////////////////////////////////////////////////////////
/*
program2 = initShaders(gl, "vertex-shader2", "fragment-shader2");
gl.useProgram(program2);
*/
vBuff2 = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vBuff2);
gl.bufferData(gl.ARRAY_BUFFER, flatten(vertices2), gl.STATIC_DRAW);
vPos2 = gl.getAttribLocation(program1, "vPosition");
iBuff2 = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, iBuff2);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint8Array(indices2), gl.STATIC_DRAW);
tBuff2 = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, tBuff2);
gl.bufferData(gl.ARRAY_BUFFER, flatten(tcs2), gl.STATIC_DRAW);
fTexCoord2 = gl.getAttribLocation(program1, "vTexCoord");
gl.vertexAttribPointer(fTexCoord2, 2, gl.FLOAT, false, 0, 0);
//gl.enableVertexAttribArray(fTexCoord2);
sampler2 = gl.getUniformLocation(program1, "texture");
texture2 = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture2);
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, images[1]);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.bindTexture(gl.TEXTURE_2D, null);
render();
};
function render() {
gl.clear(gl.COLOR_BUFFER_BIT);
gl.useProgram(program1);
gl.bindBuffer(gl.ARRAY_BUFFER, vBuff1);
gl.enableVertexAttribArray(vPos1);
gl.vertexAttribPointer(vPos1, 4, gl.FLOAT, false, 0, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, tBuff1);
gl.enableVertexAttribArray(fTexCoord1);
gl.vertexAttribPointer(fTexCoord1, 2, gl.FLOAT, false, 0, 0);
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture1);
gl.uniform1i(sampler1, 0);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, iBuff1);
gl.drawElements(gl.TRIANGLES, indices1.length, gl.UNSIGNED_BYTE, 0);
gl.bindBuffer(gl.ARRAY_BUFFER,vBuff2);
gl.enableVertexAttribArray(vPos2);
gl.vertexAttribPointer(vPos2, 4, gl.FLOAT, false, 0, 0);
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture2);
gl.uniform1i(sampler2, 0);
gl.bindBuffer(gl.ARRAY_BUFFER,tBuff2);
gl.enableVertexAttribArray(fTexCoord2);
gl.vertexAttribPointer(fTexCoord2, 2, gl.FLOAT, false, 0, 0);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, iBuff2);
gl.drawElements(gl.TRIANGLES, indices2.length, gl.UNSIGNED_BYTE, 0);
requestAnimFrame(render);
}
function loadImage(url) {
return new Promise((resolve, reject) => {
const img = new Image();
img.onload = () => resolve(img);
img.onerror = reject;
img.crossOrigin = 'anonymous';
img.src = url;
});
}
async function loadImages(imgs) {
images = await Promise.all(imgs.map(loadImage));
}
async function main() {
await loadImages([
'https://webglfundamentals.org/webgl/resources/f-texture.png',
'https://webglfundamentals.org/webgl/lessons/resources/noodles-01.jpg',
]);
init();
}
main();
<script id="vertex-shader1" type="x-shader/x-vertex">
attribute vec4 vPosition;
attribute vec2 vTexCoord;
varying vec2 fTexCoord;
void main() {
fTexCoord = vTexCoord;
gl_Position = vPosition;
}
</script>
<script id="fragment-shader1" type="x-shader/x-fragment">
precision mediump float;
varying vec2 fTexCoord;
uniform sampler2D texture;
void main() {
gl_FragColor = texture2D(texture, fTexCoord);
}
</script>
<canvas id="gl-canvas"></canvas>
<script src="https://twgljs.org/dist/4.x/twgl.min.js"></script>
Currently, I'm using the 2D canvas context to draw an image generated from JavaScript at about a 25fps rate (generated pixel by pixel, but refreshed as a whole buffer at once after each generated frame). The generated image is always one byte (integer / typed array) per pixel, and a fixed palette is used to produce the final RGB result. Scaling is also needed, to adapt to the size of the canvas (i.e. going fullscreen) and/or at the user's request (zoom in/out buttons).
The 2D canvas context is OK for this purpose; however, I'm curious whether WebGL can provide a better result and/or better performance. Please note: I don't want to put pixels via WebGL directly; I want to put pixels into my buffer (which is basically a Uint8Array) and use that buffer, all at once, to refresh the context. I don't know too much about WebGL, but would using the generated image as some kind of texture work somehow, for example? Then I would need to refresh the texture at about a 25fps rate, I guess.
It would be really fantastic if WebGL supported the colour space conversion somehow. With the 2D context, I need to convert the 1 byte / pixel buffer into RGBA for the ImageData in JavaScript, for every pixel... Scaling (for the 2D context) is currently done by altering the height/width style of the canvas, so the browser scales the image. However, I guess that can be slower than what WebGL can do with hardware support, and (I hope) WebGL can give greater flexibility to control the scaling: e.g. with the 2D context, the browser will do antialiasing even when I don't want it to (e.g. with an integer zoom factor), and maybe that's one reason it can be quite slow sometimes.
I've already tried to learn from several WebGL tutorials, but all of them start with objects, shapes, 3D cubes, etc. I don't need any classical objects to render, only what the 2D context can already do, in the hope that WebGL can be a faster solution for the very same task! Of course, if there is no win here with WebGL at all, I would continue to use the 2D context.
To be clear: this is a kind of computer hardware emulator written in JavaScript, and its output (what would be seen on a PAL TV connected to it) is rendered via a canvas context. The machine has a fixed palette with 256 elements; internally it only needs one byte per pixel to define its colour.
You can use a texture as your palette and a different texture as your image. You then get a value from the image texture and use it to look up a color from the palette texture.
The palette texture is 256x1 RGBA pixels. Your image texture is any size you want, but it's just a single-channel ALPHA texture. You can then look up a value from the image:
float index = texture2D(u_image, v_texcoord).a * 255.0;
And use that value to look up a color in the palette
gl_FragColor = texture2D(u_palette, vec2((index + 0.5) / 256.0, 0.5));
Your shaders might be something like this
Vertex Shader
attribute vec4 a_position;
varying vec2 v_texcoord;
void main() {
gl_Position = a_position;
// assuming a unit quad for position we
// can just use that for texcoords. Flip Y though so we get the top at 0
v_texcoord = a_position.xy * vec2(0.5, -0.5) + 0.5;
}
Fragment shader
precision mediump float;
varying vec2 v_texcoord;
uniform sampler2D u_image;
uniform sampler2D u_palette;
void main() {
float index = texture2D(u_image, v_texcoord).a * 255.0;
gl_FragColor = texture2D(u_palette, vec2((index + 0.5) / 256.0, 0.5));
}
Then you just need a palette texture.
// Setup a palette.
var palette = new Uint8Array(256 * 4);
// I'm lazy so just setting 4 colors in palette
function setPalette(index, r, g, b, a) {
palette[index * 4 + 0] = r;
palette[index * 4 + 1] = g;
palette[index * 4 + 2] = b;
palette[index * 4 + 3] = a;
}
setPalette(1, 255, 0, 0, 255); // red
setPalette(2, 0, 255, 0, 255); // green
setPalette(3, 0, 0, 255, 255); // blue
// upload palette
...
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 256, 1, 0, gl.RGBA,
gl.UNSIGNED_BYTE, palette);
And your image. It's an alpha only image so just 1 channel.
// Make image. Just going to make something 8x8
var image = new Uint8Array([
0,0,1,1,1,1,0,0,
0,1,0,0,0,0,1,0,
1,0,0,0,0,0,0,1,
1,0,2,0,0,2,0,1,
1,0,0,0,0,0,0,1,
1,0,3,3,3,3,0,1,
0,1,0,0,0,0,1,0,
0,0,1,1,1,1,0,0,
]);
// upload image
....
gl.texImage2D(gl.TEXTURE_2D, 0, gl.ALPHA, 8, 8, 0, gl.ALPHA,
gl.UNSIGNED_BYTE, image);
You also need to make sure both textures use gl.NEAREST for filtering, since one holds indices and the other a palette, and filtering between values in those cases makes no sense.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
Here's a working example:
var canvas = document.getElementById("c");
var gl = canvas.getContext("webgl");
// Note: createProgramFromScripts will call bindAttribLocation
// based on the index of the attribute names we pass to it.
var program = twgl.createProgramFromScripts(
gl,
["vshader", "fshader"],
["a_position", "a_textureIndex"]);
gl.useProgram(program);
var imageLoc = gl.getUniformLocation(program, "u_image");
var paletteLoc = gl.getUniformLocation(program, "u_palette");
// tell it to use texture units 0 and 1 for the image and palette
gl.uniform1i(imageLoc, 0);
gl.uniform1i(paletteLoc, 1);
// Setup a unit quad
var positions = [
1, 1,
-1, 1,
-1, -1,
1, 1,
-1, -1,
1, -1,
];
var vertBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(positions), gl.STATIC_DRAW);
gl.enableVertexAttribArray(0);
gl.vertexAttribPointer(0, 2, gl.FLOAT, false, 0, 0);
// Setup a palette.
var palette = new Uint8Array(256 * 4);
// I'm lazy so just setting 4 colors in palette
function setPalette(index, r, g, b, a) {
palette[index * 4 + 0] = r;
palette[index * 4 + 1] = g;
palette[index * 4 + 2] = b;
palette[index * 4 + 3] = a;
}
setPalette(1, 255, 0, 0, 255); // red
setPalette(2, 0, 255, 0, 255); // green
setPalette(3, 0, 0, 255, 255); // blue
// make palette texture and upload palette
gl.activeTexture(gl.TEXTURE1);
var paletteTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, paletteTex);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 256, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE, palette);
// Make image. Just going to make something 8x8
var image = new Uint8Array([
0,0,1,1,1,1,0,0,
0,1,0,0,0,0,1,0,
1,0,0,0,0,0,0,1,
1,0,2,0,0,2,0,1,
1,0,0,0,0,0,0,1,
1,0,3,3,3,3,0,1,
0,1,0,0,0,0,1,0,
0,0,1,1,1,1,0,0,
]);
// make image textures and upload image
gl.activeTexture(gl.TEXTURE0);
var imageTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, imageTex);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.ALPHA, 8, 8, 0, gl.ALPHA, gl.UNSIGNED_BYTE, image);
gl.drawArrays(gl.TRIANGLES, 0, positions.length / 2);
canvas { border: 1px solid black; }
<script src="https://twgljs.org/dist/twgl.min.js"></script>
<script id="vshader" type="whatever">
attribute vec4 a_position;
varying vec2 v_texcoord;
void main() {
gl_Position = a_position;
// assuming a unit quad for position we
// can just use that for texcoords. Flip Y though so we get the top at 0
v_texcoord = a_position.xy * vec2(0.5, -0.5) + 0.5;
}
</script>
<script id="fshader" type="whatever">
precision mediump float;
varying vec2 v_texcoord;
uniform sampler2D u_image;
uniform sampler2D u_palette;
void main() {
float index = texture2D(u_image, v_texcoord).a * 255.0;
gl_FragColor = texture2D(u_palette, vec2((index + 0.5) / 256.0, 0.5));
}
</script>
<canvas id="c" width="256" height="256"></canvas>
To animate just update the image and then re-upload it into the texture
gl.texImage2D(gl.TEXTURE_2D, 0, gl.ALPHA, 8, 8, 0, gl.ALPHA,
gl.UNSIGNED_BYTE, image);
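If the image size never changes, you could presumably also use gl.texSubImage2D to update the texture's existing storage instead of re-allocating it every frame with gl.texImage2D. A small variation, not used in the examples below, assuming the texture was already allocated at the same width, height, and gl.ALPHA format:
// update the already-allocated ALPHA texture in place
gl.bindTexture(gl.TEXTURE_2D, imageTex);
gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0, width, height, gl.ALPHA,
                 gl.UNSIGNED_BYTE, image);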
Example:
var canvas = document.getElementById("c");
var gl = canvas.getContext("webgl");
// Note: createProgramFromScripts will call bindAttribLocation
// based on the index of the attribute names we pass to it.
var program = twgl.createProgramFromScripts(
gl,
["vshader", "fshader"],
["a_position", "a_textureIndex"]);
gl.useProgram(program);
var imageLoc = gl.getUniformLocation(program, "u_image");
var paletteLoc = gl.getUniformLocation(program, "u_palette");
// tell it to use texture units 0 and 1 for the image and palette
gl.uniform1i(imageLoc, 0);
gl.uniform1i(paletteLoc, 1);
// Setup a unit quad
var positions = [
1, 1,
-1, 1,
-1, -1,
1, 1,
-1, -1,
1, -1,
];
var vertBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(positions), gl.STATIC_DRAW);
gl.enableVertexAttribArray(0);
gl.vertexAttribPointer(0, 2, gl.FLOAT, false, 0, 0);
// Setup a palette.
var palette = new Uint8Array(256 * 4);
// I'm lazy so just setting 4 colors in palette
function setPalette(index, r, g, b, a) {
palette[index * 4 + 0] = r;
palette[index * 4 + 1] = g;
palette[index * 4 + 2] = b;
palette[index * 4 + 3] = a;
}
setPalette(1, 255, 0, 0, 255); // red
setPalette(2, 0, 255, 0, 255); // green
setPalette(3, 0, 0, 255, 255); // blue
// make palette texture and upload palette
gl.activeTexture(gl.TEXTURE1);
var paletteTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, paletteTex);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 256, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE, palette);
// Make image. Just going to make something 8x8
var width = 8;
var height = 8;
var image = new Uint8Array([
0,0,1,1,1,1,0,0,
0,1,0,0,0,0,1,0,
1,0,0,0,0,0,0,1,
1,0,2,0,0,2,0,1,
1,0,0,0,0,0,0,1,
1,0,3,3,3,3,0,1,
0,1,0,0,0,0,1,0,
0,0,1,1,1,1,0,0,
]);
// make image textures and upload image
gl.activeTexture(gl.TEXTURE0);
var imageTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, imageTex);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.ALPHA, width, height, 0, gl.ALPHA, gl.UNSIGNED_BYTE, image);
var frameCounter = 0;
function render() {
++frameCounter;
// skip 3 of 4 frames so the animation is not too fast
if ((frameCounter & 3) == 0) {
// rotate the image left
for (var y = 0; y < height; ++y) {
var temp = image[y * width];
for (var x = 0; x < width - 1; ++x) {
image[y * width + x] = image[y * width + x + 1];
}
image[y * width + width - 1] = temp;
}
// re-upload image
gl.activeTexture(gl.TEXTURE0);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.ALPHA, width, height, 0, gl.ALPHA,
gl.UNSIGNED_BYTE, image);
gl.drawArrays(gl.TRIANGLES, 0, positions.length / 2);
}
requestAnimationFrame(render);
}
render();
canvas { border: 1px solid black; }
<script src="https://twgljs.org/dist/twgl.min.js"></script>
<script id="vshader" type="whatever">
attribute vec4 a_position;
varying vec2 v_texcoord;
void main() {
gl_Position = a_position;
// assuming a unit quad for position we
// can just use that for texcoords. Flip Y though so we get the top at 0
v_texcoord = a_position.xy * vec2(0.5, -0.5) + 0.5;
}
</script>
<script id="fshader" type="whatever">
precision mediump float;
varying vec2 v_texcoord;
uniform sampler2D u_image;
uniform sampler2D u_palette;
void main() {
float index = texture2D(u_image, v_texcoord).a * 255.0;
gl_FragColor = texture2D(u_palette, vec2((index + 0.5) / 256.0, 0.5));
}
</script>
<canvas id="c" width="256" height="256"></canvas>
Of course that assumes your goal is to do the animation on the CPU by manipulating pixels. Otherwise you can use any normal webgl techniques to manipulate texture coordinates or whatever.
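For instance (a purely hypothetical variation, not something the snippets above do), horizontal scrolling could stay on the GPU by adding an offset uniform to the fragment shader and bumping it every frame, instead of rotating the pixels in JavaScript and re-uploading them:
// hypothetical fragment shader variation: sample the image at a per-frame offset
var fshaderScrollSrc = [
  "precision mediump float;",
  "varying vec2 v_texcoord;",
  "uniform sampler2D u_image;",
  "uniform sampler2D u_palette;",
  "uniform vec2 u_offset;  // e.g. (time * speed, 0)",
  "void main() {",
  "  float index = texture2D(u_image, fract(v_texcoord + u_offset)).a * 255.0;",
  "  gl_FragColor = texture2D(u_palette, vec2((index + 0.5) / 256.0, 0.5));",
  "}",
].join("\n");
// per frame, instead of re-uploading the image:
// gl.uniform2f(offsetLoc, time * 0.25, 0.0);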
You can also update the palette similarly for palette animation. Just modify the palette and re-upload it
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 256, 1, 0, gl.RGBA,
gl.UNSIGNED_BYTE, palette);
Example:
var canvas = document.getElementById("c");
var gl = canvas.getContext("webgl");
// Note: createProgramFromScripts will call bindAttribLocation
// based on the index of the attribute names we pass to it.
var program = twgl.createProgramFromScripts(
gl,
["vshader", "fshader"],
["a_position", "a_textureIndex"]);
gl.useProgram(program);
var imageLoc = gl.getUniformLocation(program, "u_image");
var paletteLoc = gl.getUniformLocation(program, "u_palette");
// tell it to use texture units 0 and 1 for the image and palette
gl.uniform1i(imageLoc, 0);
gl.uniform1i(paletteLoc, 1);
// Setup a unit quad
var positions = [
1, 1,
-1, 1,
-1, -1,
1, 1,
-1, -1,
1, -1,
];
var vertBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(positions), gl.STATIC_DRAW);
gl.enableVertexAttribArray(0);
gl.vertexAttribPointer(0, 2, gl.FLOAT, false, 0, 0);
// Setup a palette.
var palette = new Uint8Array(256 * 4);
// I'm lazy so just setting 4 colors in palette
function setPalette(index, r, g, b, a) {
palette[index * 4 + 0] = r;
palette[index * 4 + 1] = g;
palette[index * 4 + 2] = b;
palette[index * 4 + 3] = a;
}
setPalette(1, 255, 0, 0, 255); // red
setPalette(2, 0, 255, 0, 255); // green
setPalette(3, 0, 0, 255, 255); // blue
// make palette texture and upload palette
gl.activeTexture(gl.TEXTURE1);
var paletteTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, paletteTex);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 256, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE, palette);
// Make image. Just going to make something 8x8
var width = 8;
var height = 8;
var image = new Uint8Array([
0,0,1,1,1,1,0,0,
0,1,0,0,0,0,1,0,
1,0,0,0,0,0,0,1,
1,0,2,0,0,2,0,1,
1,0,0,0,0,0,0,1,
1,0,3,3,3,3,0,1,
0,1,0,0,0,0,1,0,
0,0,1,1,1,1,0,0,
]);
// make image textures and upload image
gl.activeTexture(gl.TEXTURE0);
var imageTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, imageTex);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.ALPHA, width, height, 0, gl.ALPHA, gl.UNSIGNED_BYTE, image);
var frameCounter = 0;
function render() {
++frameCounter;
// skip 3 of 4 frames so the animation is not too fast
if ((frameCounter & 3) == 0) {
// rotate the 3 palette colors
var tempR = palette[4 + 0];
var tempG = palette[4 + 1];
var tempB = palette[4 + 2];
var tempA = palette[4 + 3];
setPalette(1, palette[2 * 4 + 0], palette[2 * 4 + 1], palette[2 * 4 + 2], palette[2 * 4 + 3]);
setPalette(2, palette[3 * 4 + 0], palette[3 * 4 + 1], palette[3 * 4 + 2], palette[3 * 4 + 3]);
setPalette(3, tempR, tempG, tempB, tempA);
// re-upload palette
gl.activeTexture(gl.TEXTURE1);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 256, 1, 0, gl.RGBA,
gl.UNSIGNED_BYTE, palette);
gl.drawArrays(gl.TRIANGLES, 0, positions.length / 2);
}
requestAnimationFrame(render);
}
render();
canvas { border: 1px solid black; }
<script src="https://twgljs.org/dist/twgl.min.js"></script>
<script id="vshader" type="whatever">
attribute vec4 a_position;
varying vec2 v_texcoord;
void main() {
gl_Position = a_position;
// assuming a unit quad for position we
// can just use that for texcoords. Flip Y though so we get the top at 0
v_texcoord = a_position.xy * vec2(0.5, -0.5) + 0.5;
}
</script>
<script id="fshader" type="whatever">
precision mediump float;
varying vec2 v_texcoord;
uniform sampler2D u_image;
uniform sampler2D u_palette;
void main() {
float index = texture2D(u_image, v_texcoord).a * 255.0;
gl_FragColor = texture2D(u_palette, vec2((index + 0.5) / 256.0, 0.5));
}
</script>
<canvas id="c" width="256" height="256"></canvas>
Slightly related is this tile shader example
http://blog.tojicode.com/2012/07/sprite-tile-maps-on-gpu.html
Presumably you're building up a javascript array that's around 512 x 512 (PAL size)...
A WebGL fragment shader could definitely do your palette conversion pretty nicely. The recipe would go something like this:
Set up WebGL with a "geometry" of just two triangles that span your viewport. (GL is all triangles.) This is the biggest bother, if you're not already GL fluent. But it's not that bad. Spend some quality time with http://learningwebgl.com/blog/?page_id=1217 . But it will be ~100 lines of stuff. Price of admission.
Build your in-memory frame buffer 4 times bigger. (I think textures always have to be RGBA?) And populate every fourth byte, the R component, with your pixel values. Use new Float32Array to allocate it. You can use values 0-255, or divide it down to 0.0 to 1.0. We'll pass this to webgl as a texture. This one changes every frame.
Build a second texture that's 256 x 1 pixels, which is your palette lookup table. This one never changes (unless the palette can be modified?).
In your fragment shader, use your emulated frame buffer texture as a lookup into your palette. The first pixel in the palette is accessed at location (0.5/256.0, 0.5), middle of the pixel.
On each frame, resubmit the emulated frame buffer texture and redraw (see the sketch at the end of this answer). Pushing pixels to the GPU is expensive... but a PAL-sized image is pretty small by modern standards.
Bonus step: You could enhance the fragment shader to imitate scanlines, interlace video, or other cute emulation artifacts (phosphor dots?) on modern high-resolution displays, all at no cost to your javascript!
This is just a sketch. But it will work. WebGL is a pretty low-level API, and quite flexible, but well worth the effort (if you like that kind of thing, which I do. :-) ).
Again, http://learningwebgl.com/blog/?page_id=1217 is well-recommended for overall WebGL guidance.
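Here is a minimal sketch of the per-frame upload and palette lookup described in the recipe above. Names and sizes are illustrative; it assumes the gl context, the two-triangle quad, a frameTex texture with NEAREST filtering, and a compiled program with u_frame / u_palette samplers already exist. A Uint8Array with UNSIGNED_BYTE is used here rather than the Float32Array mentioned above, so no float-texture extension is needed.
var W = 512, H = 512;                   // roughly PAL-sized
var frame = new Uint8Array(W * H * 4);  // emulator writes the palette index into frame[i * 4] (the R byte)
// fragment shader: index in the red channel -> colour from the 256 x 1 palette texture
var fs = [
  "precision mediump float;",
  "varying vec2 v_texcoord;",
  "uniform sampler2D u_frame;    // W x H RGBA, index stored in .r",
  "uniform sampler2D u_palette;  // 256 x 1 RGBA",
  "void main() {",
  "  float index = texture2D(u_frame, v_texcoord).r * 255.0;",
  "  gl_FragColor = texture2D(u_palette, vec2((index + 0.5) / 256.0, 0.5));",
  "}",
].join("\n");
function drawFrame() {
  // resubmit the emulated frame buffer, then redraw the two triangles
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, frameTex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, W, H, 0, gl.RGBA,
                gl.UNSIGNED_BYTE, frame);
  gl.drawArrays(gl.TRIANGLES, 0, 6);
  requestAnimationFrame(drawFrame);
}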
I can't get this WebGL rippling effect to happen on touch screens: http://onestopdesktop.com/D2/ Apparently the problem has nothing to do with the touch functions after all; something in the WebGL code itself is preventing it from operating on touch devices. I uncovered that when I changed the SOURCEFLOW default to 4, which causes a single ripple to flow on PCs on screen load; however, nothing at all happens on touch devices on load. It does work perfectly in the Chrome touch device emulator. My test touch devices (iPad 2 and 3, iPhone 5, Android Asus) are all up to date OS- and browser-wise and are fully WebGL capable.
I really love this beautiful water rippling code that I found in a tutorial at WebGL Academy, and I'm trying to help a student get this effect to work with her website.
Below is the code I'm using. I added the touch event stuff to it.
var main=function() {
var CANVAS=document.getElementById("your_canvas");
CANVAS.width = window.innerWidth;
CANVAS.height = window.innerHeight;
var GL = CANVAS.getContext("webgl", {antialias: false, alpha: false});
var POINTER_X=0.5, POINTER_Y=0.5;
CANVAS.addEventListener("mousemove", waterMove);
CANVAS.addEventListener("touchstart", waterTouch);
CANVAS.addEventListener("touchmove", waterMove2);
CANVAS.addEventListener("touchend", waterStop);
function waterMove(event) {
event.preventDefault();
POINTER_X=(event.clientX-CANVAS.offsetLeft)/CANVAS.width;
POINTER_Y=1-event.clientY/CANVAS.height;
};
function waterMove2(event) {
event.preventDefault();
POINTER_X=(event.changedTouches[0].clientX-CANVAS.offsetLeft)/CANVAS.width;
POINTER_Y=1-event.changedTouches[0].clientY/CANVAS.height;
};
function waterTouch(event) {
event.preventDefault();
SOURCEFLOW = 4;
POINTER_X=(event.touches[0].clientX-CANVAS.offsetLeft)/CANVAS.width;
POINTER_Y=1-event.touches[0].clientY/CANVAS.height;
};
function waterStop(event) {
event.preventDefault();
SOURCEFLOW = 0;
POINTER_X=(event.changedTouches[0].clientX-CANVAS.offsetLeft)/CANVAS.width;
POINTER_Y=1-event.changedTouches[0].clientY/CANVAS.height;
};
SOURCEFLOW=4;
CANVAS.addEventListener("mouseup", function(event) { event.preventDefault();SOURCEFLOW = 0; } , false);
CANVAS.addEventListener("mousedown", function(event) { event.preventDefault();SOURCEFLOW = 4; } , false);
//enable texture float
var EXT_FLOAT = GL.getExtension('OES_texture_float') ||
GL.getExtension('MOZ_OES_texture_float') ||
GL.getExtension('WEBKIT_OES_texture_float');
/*========================= PARAMETERS ========================= */
var SIMUSIZEPX=512; //GPGPU simulation texture size in pixel
var SIMUWIDTH=1; //Simulation size in meters
var GPGPU_NPASS=6; //number of GPGPU pass per rendering
var WATER_DEEP=0.01; //mean height of water in meters
var RENDERING_FLOOR_SIZE=1; //size of the water floor texture in meters
/*========================= RENDERING SHADERS ========================= */
var vertSrc_render="\n\
attribute vec2 position;\n\
\n\
varying vec2 vUV;\n\
\n\
void main(void) {\n\
gl_Position = vec4(position, 0., 1.);\n\
vUV=0.5*(position+vec2(1.,1.));\n\
}";
var fragSrc_render="\n\
precision mediump float;\n\
\n\
uniform float H; //water deep (meters)\n\
uniform float L; //simulation size (meters)\n\
uniform float l; //ground texture tile size (meters)\n\
uniform sampler2D sampler;\n\
uniform sampler2D sampler_normals;\n\
\n\
varying vec2 vUV;\n\
\n\
const vec3 light=vec3(1.,1.,1.);\n\
const vec4 color_sky=vec4(60./255., 90./255., 150./255., 1.);\n\
const float n=1./1.33;\n\
\n\
void main(void) {\n\
vec4 water=texture2D(sampler_normals, vUV);\n\
vec3 water_normal=water.rgb+vec3(0.,0.,1.);\n\
float water_height=water.a;\n\
\n\
vec3 I=vec3(0.,0.,1.); //incident vector\n\
vec3 R = refract(I, water_normal, n); //refracted vector\n\
\n\
float k=(water_height+H)/R.z; //k so that M=kR belongs to the water floor\n\
vec3 M=k*R; //belongs to the water floor\n\
vec2 uv=(vUV*L-M.xy)/l; //texture coordinates of the water floor\n\
vec4 color_floor=texture2D(sampler, uv);\n\
\n\
float F=water_normal.z; //Fresnel reflexion coefficient = (I.N)\n\
\n\
vec4 color=mix(color_sky, color_floor, 0.6+F*0.3);\n\
\n\
vec3 lightDir=normalize(light-vec3(L*(vUV-vec2(0.5,0.5)), water_height));\n\
\n\
float lightPow=clamp(pow(dot(lightDir, water_normal),4.),1., 1.3);\n\
\n\
gl_FragColor=lightPow*color;\n\
}";
/*================= SHALLOW WATER EQUATION SHADERS ================== */
var fragSrc_water="\n\
precision mediump float;\n\
\n\
uniform float dt, H, b, g, epsilon;\n\
uniform float scale;\n\
uniform vec2 mouse;\n\
\n\
uniform float sourceRadius, sourceFlow;\n\
uniform sampler2D sampler_water, sampler_normals;\n\
\n\
varying vec2 vUV;\n\
\n\
void main(void) {\n\
\n\
vec4 water_t = texture2D(sampler_water, vUV);\n\
float h = water_t.r;\n\
vec2 uvSpeed = water_t.gb;\n\
\n\
vec2 dx=vec2(epsilon, 0.);\n\
vec2 dy=vec2(0., epsilon);\n\
float du_dx=(texture2D(sampler_water, vUV+dx).g-texture2D(sampler_water, vUV- dx).g)/(2.*scale);\n\
float dv_dy=(texture2D(sampler_water, vUV+dy).b-texture2D(sampler_water, vUV- dy).b)/(2.*scale);\n\
\n\
vec3 normals=texture2D(sampler_normals,vUV).xyz;\n\
\n\
//we add 1 to Nz because RGB = (0,0,0) -> Normal = (0,0,1)\n\
vec2 d_uvSpeed = -dt * (g * normals.xy/(normals.z+1.) + b*uvSpeed);\n\
\n\
float d_h = -dt * H * (du_dx + dv_dy);\n\
\n\
float dSource = length(vUV-mouse);\n\
\n\
d_h += dt * sourceFlow * (1. - smoothstep(0., sourceRadius, dSource));\n\
gl_FragColor = vec4(h+d_h, uvSpeed+d_uvSpeed, 1.);\n\
}";
/*================= TEXTURE COPY SHADERS ================== */
var fragSrc_copy="\n\
precision mediump float;\n\
\n\
uniform float scale;\n\
uniform sampler2D sampler;\n\
\n\
varying vec2 vUV;\n\
\n\
void main(void) {\n\
float dxy=1./scale;\n\
vec4 waterData = texture2D(sampler, vUV);\n\
vec4 waterDataAvg=(texture2D(sampler, vUV+vec2(dxy,0.))\n\
+.5*texture2D(sampler, vUV+vec2(dxy,dxy))\n\
+texture2D(sampler, vUV+vec2(0.,dxy))\n\
+.5*texture2D(sampler, vUV+vec2(-dxy,dxy))\n\
+texture2D(sampler, vUV+vec2(-dxy,0.))\n\
+.5*texture2D(sampler, vUV+vec2(-dxy,-dxy))\n\
+texture2D(sampler, vUV+vec2(0.,-dxy))\n\
+.5*texture2D(sampler, vUV+vec2(dxy,-dxy)))/6.;\n\
\n\
gl_FragColor = mix(waterData, waterDataAvg, 0.3);\n\
}";
/*================= NORMALS SHADERS ================== */
var fragSrc_normals="\n\
precision mediump float;\n\
\n\
uniform sampler2D sampler;\n\
uniform float epsilon, scale; //horizontal scale in meters\n\
varying vec2 vUV;\n\
\n\
vec3 getPoint(float x, float y, vec2 uv){\n\
float h = texture2D(sampler, uv+vec2(x,y)).r; //water height\n\
return vec3(x*scale,y*scale,h);\n\
}\n\
\n\
void main(void) {\n\
vec3 points[4];\n\
points[0]=getPoint(-epsilon,0., vUV);\n\
points[1]=getPoint(0.,-epsilon, vUV);\n\
points[2]=getPoint(epsilon ,0., vUV);\n\
points[3]=getPoint(0. ,epsilon, vUV);\n\
\n\
vec3 normal=normalize(cross(points[1]-points[3], points[2]-points[0]));\n\
\n\
//We subtract 1 from Nz because Normal = (0,0,1) -> RGB = (0,0,0)\n\
normal.z-=1.;\n\
\n\
float height=texture2D(sampler, vUV).r;\n\
gl_FragColor=vec4(normal, height);\n\
}";
//compile a shader :
var get_shader=function(source, type, typeString) {
var shader = GL.createShader(type);
GL.shaderSource(shader, source);
GL.compileShader(shader);
if (!GL.getShaderParameter(shader, GL.COMPILE_STATUS)) {
alert("ERROR IN "+typeString+ " SHADER : " + GL.getShaderInfoLog(shader));
return false;
}
return shader;
};
//build a shader program :
var get_shaderProgram=function(vertex_source, fragment_source, typeStr){
var shader_vertex=get_shader(vertex_source, GL.VERTEX_SHADER, typeStr+" VERTEX");
var shader_fragment=get_shader(fragment_source, GL.FRAGMENT_SHADER, typeStr+" FRAGMENT");
var shader_program=GL.createProgram();
GL.attachShader(shader_program, shader_vertex);
GL.attachShader(shader_program, shader_fragment);
GL.linkProgram(shader_program);
return shader_program;
};
//final rendering shader program
var SHP_VARS={};
var SHP_RENDERING=get_shaderProgram(vertSrc_render, fragSrc_render, "RENDER");
SHP_VARS.rendering={
H: GL.getUniformLocation(SHP_RENDERING, "H"),
L: GL.getUniformLocation(SHP_RENDERING, "L"),
l: GL.getUniformLocation(SHP_RENDERING, "l"),
sampler: GL.getUniformLocation(SHP_RENDERING, "sampler"),
sampler_normals: GL.getUniformLocation(SHP_RENDERING, "sampler_normals"),
position: GL.getAttribLocation(SHP_RENDERING, "position")
};
var SHP_WATER=get_shaderProgram(vertSrc_render, fragSrc_water, "WATER");
SHP_VARS.water={
dt: GL.getUniformLocation(SHP_WATER, "dt"),
H: GL.getUniformLocation(SHP_WATER, "H"),
b: GL.getUniformLocation(SHP_WATER, "b"),
g: GL.getUniformLocation(SHP_WATER, "g"),
mouse: GL.getUniformLocation(SHP_WATER, "mouse"),
sourceFlow: GL.getUniformLocation(SHP_WATER, "sourceFlow"),
sourceRadius: GL.getUniformLocation(SHP_WATER, "sourceRadius"),
epsilon: GL.getUniformLocation(SHP_WATER, "epsilon"),
scale: GL.getUniformLocation(SHP_WATER, "scale"),
sampler_water: GL.getUniformLocation(SHP_WATER, "sampler_water"),
sampler_normals : GL.getUniformLocation(SHP_WATER, "sampler_normals"),
position: GL.getAttribLocation(SHP_WATER, "position")
};
var SHP_COPY=get_shaderProgram(vertSrc_render, fragSrc_copy, "COPY");
SHP_VARS.copy={
scale : GL.getUniformLocation(SHP_COPY, "scale"),
sampler: GL.getUniformLocation(SHP_COPY, "sampler"),
position: GL.getAttribLocation(SHP_COPY, "position")
};
var SHP_NORMALS=get_shaderProgram(vertSrc_render, fragSrc_normals, "NORMALS");
SHP_VARS.normals={
sampler: GL.getUniformLocation(SHP_NORMALS, "sampler"),
scale: GL.getUniformLocation(SHP_NORMALS, "scale"),
epsilon: GL.getUniformLocation(SHP_NORMALS, "epsilon"),
position: GL.getAttribLocation(SHP_NORMALS, "position")
};
/*========================= THE QUAD ========================= */
//POINTS :
var quad_vertex=[
-1,-1, //first summit -> bottom left of the viewport
1,-1, //bottom right
1,1, //top right
-1,1 //top left
];
var QUAD_VERTEX= GL.createBuffer ();
GL.bindBuffer(GL.ARRAY_BUFFER, QUAD_VERTEX);
GL.bufferData(GL.ARRAY_BUFFER,new Float32Array(quad_vertex),GL.STATIC_DRAW);
//FACES :
var quad_faces = [0,1,2, 0,2,3];
var QUAD_FACES= GL.createBuffer ();
GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, QUAD_FACES);
GL.bufferData(GL.ELEMENT_ARRAY_BUFFER, new Uint16Array([0,1,2, 0,2,3]),GL.STATIC_DRAW);
/*========================= THE TEXTURE ========================= */
var renderingImage=new Image();
renderingImage.src='ressources/waterFloor.jpg';
var renderingTexture=GL.createTexture();
GL.pixelStorei(GL.UNPACK_FLIP_Y_WEBGL, true);
GL.bindTexture(GL.TEXTURE_2D, renderingTexture);
GL.texImage2D(GL.TEXTURE_2D, 0, GL.RGBA, 1, 1, 0, GL.RGBA, GL.UNSIGNED_BYTE,
new Uint8Array([255, 0, 0, 255]));
renderingImage.onload=function() {
GL.bindTexture(GL.TEXTURE_2D, renderingTexture);
GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MAG_FILTER, GL.LINEAR);
GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MIN_FILTER, GL.LINEAR);
GL.texImage2D(GL.TEXTURE_2D, 0, GL.RGBA, GL.RGBA, GL.UNSIGNED_BYTE, renderingImage);
};
/*====================== RENDER TO TEXTURE ====================== */
//GPGPU WATER RTT :
var fb_water=GL.createFramebuffer();
GL.bindFramebuffer(GL.FRAMEBUFFER, fb_water);
var texture_water=GL.createTexture();
GL.bindTexture(GL.TEXTURE_2D, texture_water);
GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MAG_FILTER, GL.NEAREST);
GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MIN_FILTER, GL.NEAREST);
GL.texParameteri( GL.TEXTURE_2D, GL.TEXTURE_WRAP_S, GL.CLAMP_TO_EDGE );
GL.texParameteri( GL.TEXTURE_2D, GL.TEXTURE_WRAP_T, GL.CLAMP_TO_EDGE );
GL.texImage2D(GL.TEXTURE_2D, 0, GL.RGBA, SIMUSIZEPX, SIMUSIZEPX, 0, GL.RGBA, GL.FLOAT, null);
GL.framebufferTexture2D(GL.FRAMEBUFFER, GL.COLOR_ATTACHMENT0, GL.TEXTURE_2D, texture_water, 0);
//COPY RTT :
var fb_copy=GL.createFramebuffer();
GL.bindFramebuffer(GL.FRAMEBUFFER, fb_copy);
var texture_water_copy=GL.createTexture();
GL.bindTexture(GL.TEXTURE_2D, texture_water_copy);
GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MAG_FILTER, GL.NEAREST);
GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MIN_FILTER, GL.NEAREST);
GL.texParameteri( GL.TEXTURE_2D, GL.TEXTURE_WRAP_S, GL.CLAMP_TO_EDGE );
GL.texParameteri( GL.TEXTURE_2D, GL.TEXTURE_WRAP_T, GL.CLAMP_TO_EDGE );
GL.texImage2D(GL.TEXTURE_2D, 0, GL.RGBA, SIMUSIZEPX, SIMUSIZEPX, 0, GL.RGBA, GL.FLOAT, null);
GL.framebufferTexture2D(GL.FRAMEBUFFER, GL.COLOR_ATTACHMENT0, GL.TEXTURE_2D, texture_water_copy, 0);
//NORMALS RTT :
var fb_normals=GL.createFramebuffer();
GL.bindFramebuffer(GL.FRAMEBUFFER, fb_normals);
var texture_normals=GL.createTexture();
GL.bindTexture(GL.TEXTURE_2D, texture_normals);
GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MAG_FILTER, GL.NEAREST);
GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MIN_FILTER, GL.NEAREST);
GL.texParameteri( GL.TEXTURE_2D, GL.TEXTURE_WRAP_S, GL.CLAMP_TO_EDGE );
GL.texParameteri( GL.TEXTURE_2D, GL.TEXTURE_WRAP_T, GL.CLAMP_TO_EDGE );
GL.texImage2D(GL.TEXTURE_2D, 0, GL.RGBA, SIMUSIZEPX, SIMUSIZEPX, 0, GL.RGBA, GL.FLOAT, null);
GL.framebufferTexture2D(GL.FRAMEBUFFER, GL.COLOR_ATTACHMENT0, GL.TEXTURE_2D, texture_normals, 0);
/*========================= INIT ========================= */
//WEBGL GENERAL INIT
GL.disable(GL.DEPTH_TEST);
GL.disable(GL.SCISSOR_TEST);
GL.clearColor(0.0, 0.0, 0.0, 0.0);
//SHADER PROGRAM RENDERING INIT
GL.useProgram(SHP_RENDERING);
GL.enableVertexAttribArray(SHP_VARS.rendering.position);
GL.uniform1f(SHP_VARS.rendering.H, WATER_DEEP);
GL.uniform1f(SHP_VARS.rendering.L, SIMUWIDTH);
GL.uniform1f(SHP_VARS.rendering.l, RENDERING_FLOOR_SIZE);
GL.uniform1i(SHP_VARS.rendering.sampler, 0);
GL.uniform1i(SHP_VARS.rendering.sampler_normals, 1);
GL.vertexAttribPointer(SHP_VARS.rendering.position, 2, GL.FLOAT, false,8,0) ;
GL.bindBuffer(GL.ARRAY_BUFFER, QUAD_VERTEX);
GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, QUAD_FACES);
GL.disableVertexAttribArray(SHP_VARS.rendering.position);
//SHADER PROGRAM GPGPU WATER INIT
GL.useProgram(SHP_WATER);
GL.uniform1i(SHP_VARS.water.sampler_water, 0);
GL.uniform1i(SHP_VARS.water.sampler_normals, 1);
//WE SIMULATE A SQUARE WATER SURFACE SIDE MEASURING 2 METERS
GL.uniform1f(SHP_VARS.water.g, -9.8); //gravity acceleration
GL.uniform1f(SHP_VARS.water.H, WATER_DEEP); //mean height of water in meters
GL.uniform1f(SHP_VARS.water.b, 0.001); //viscous drag coefficient
GL.uniform1f(SHP_VARS.water.epsilon, 1/SIMUSIZEPX); //used to compute space derivatives
GL.uniform1f(SHP_VARS.water.scale, SIMUWIDTH/SIMUSIZEPX);
GL.uniform1f(SHP_VARS.water.sourceRadius, 0.04); //percentage of the surface which is flowed by the source
GL.enableVertexAttribArray(SHP_VARS.water.position);
GL.vertexAttribPointer(SHP_VARS.water.position, 2, GL.FLOAT, false,8,0) ;
GL.bindBuffer(GL.ARRAY_BUFFER, QUAD_VERTEX);
GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, QUAD_FACES);
GL.disableVertexAttribArray(SHP_VARS.water.position);
//SHADER PROGRAM TEXTURE COPY INIT
GL.useProgram(SHP_COPY);
GL.uniform1f(SHP_VARS.copy.scale, SIMUSIZEPX);
GL.uniform1i(SHP_VARS.copy.sampler, 0);
GL.enableVertexAttribArray(SHP_VARS.copy.position);
GL.vertexAttribPointer(SHP_VARS.copy.position, 2, GL.FLOAT, false,8,0) ;
GL.bindBuffer(GL.ARRAY_BUFFER, QUAD_VERTEX);
GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, QUAD_FACES);
GL.disableVertexAttribArray(SHP_VARS.copy.position);
//SHADER PROGRAM NORMALS INIT
GL.useProgram(SHP_NORMALS);
GL.uniform1i(SHP_VARS.normals.sampler, 0);
GL.uniform1f(SHP_VARS.normals.epsilon, 1/SIMUSIZEPX); //used to compute space derivatives
GL.uniform1f(SHP_VARS.normals.scale, SIMUWIDTH);
GL.enableVertexAttribArray(SHP_VARS.normals.position);
GL.vertexAttribPointer(SHP_VARS.normals.position, 2, GL.FLOAT, false,8,0) ;
GL.bindBuffer(GL.ARRAY_BUFFER, QUAD_VERTEX);
GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, QUAD_FACES);
GL.disableVertexAttribArray(SHP_VARS.normals.position);
/*========================= RENDER LOOP ========================= */
var old_timestamp=new Date().getTime();
var old_timestamp=0;
var animate=function(timestamp) {
var dt=(timestamp-old_timestamp)/1000; //time step in seconds;
dt=Math.min(Math.abs(dt), 0.017);
old_timestamp=timestamp;
GL.clear(GL.COLOR_BUFFER_BIT);
for (var i=0; i<GPGPU_NPASS; i++) {
//COPY
GL.bindFramebuffer(GL.FRAMEBUFFER, fb_copy);
GL.useProgram(SHP_COPY);
GL.viewport(0.0, 0.0, SIMUSIZEPX, SIMUSIZEPX);
GL.enableVertexAttribArray(SHP_VARS.copy.position);
GL.bindTexture(GL.TEXTURE_2D, texture_water);
GL.drawElements(GL.TRIANGLES, 6, GL.UNSIGNED_SHORT, 0);
GL.disableVertexAttribArray(SHP_VARS.copy.position);
//GPGPU PHYSICAL SIMULATION :
GL.bindFramebuffer(GL.FRAMEBUFFER, fb_water);
GL.useProgram(SHP_WATER);
GL.enableVertexAttribArray(SHP_VARS.water.position);
GL.activeTexture(GL.TEXTURE1);
GL.bindTexture(GL.TEXTURE_2D, texture_normals);
GL.activeTexture(GL.TEXTURE0);
GL.bindTexture(GL.TEXTURE_2D, texture_water_copy);
if (!i) {
GL.uniform2f(SHP_VARS.water.mouse, POINTER_X, POINTER_Y);
GL.uniform2f(SHP_VARS.water.touch, POINTER_X, POINTER_Y);
GL.uniform1f(SHP_VARS.water.sourceFlow, SOURCEFLOW);
GL.uniform1f(SHP_VARS.water.dt, dt/GPGPU_NPASS);
};
GL.drawElements(GL.TRIANGLES, 6, GL.UNSIGNED_SHORT, 0);
GL.disableVertexAttribArray(SHP_VARS.water.position);
//NORMALS :
GL.bindFramebuffer(GL.FRAMEBUFFER, fb_normals);
GL.useProgram(SHP_NORMALS);
GL.enableVertexAttribArray(SHP_VARS.normals.position);
GL.bindTexture(GL.TEXTURE_2D, texture_water);
GL.drawElements(GL.TRIANGLES, 6, GL.UNSIGNED_SHORT, 0);
GL.disableVertexAttribArray(SHP_VARS.normals.position);
} //end for GPGPU_NPASS
//RENDERING :
GL.bindFramebuffer(GL.FRAMEBUFFER, null);
GL.useProgram(SHP_RENDERING);
GL.enableVertexAttribArray(SHP_VARS.rendering.position);
GL.viewport(0.0, 0.0, CANVAS.width, CANVAS.height);
GL.activeTexture(GL.TEXTURE1);
GL.bindTexture(GL.TEXTURE_2D, texture_normals);
GL.activeTexture(GL.TEXTURE0);
GL.bindTexture(GL.TEXTURE_2D, renderingTexture);
GL.drawElements(GL.TRIANGLES, 6, GL.UNSIGNED_SHORT, 0);
GL.disableVertexAttribArray(SHP_VARS.rendering.position);
GL.flush();
window.requestAnimationFrame(animate);
};
animate(new Date().getTime());
};
Thanks for any help!!!!
I used the WEBKIT_WEBGL_depth_texture extension
and initialized the buffers below.
But how am I able to draw this framebuffer? I'm totally stuck right now. -.-
function InitDepthtextures (){
var size = 256;
// Create a color texture
var colorTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, colorTexture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, size, size, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
// Create the depth texture
depthTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, depthTexture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT, size, size, 0, gl.DEPTH_COMPONENT, gl.UNSIGNED_SHORT, null);
framebuffer = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, colorTexture, 0);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.TEXTURE_2D, depthTexture, 0);
//set to default
gl.bindTexture(gl.TEXTURE_2D, null);
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
}
You don't draw framebuffers. You draw textures. So, first you attach a texture to a framebuffer. Now, with that framebuffer bound, draw something; the result is drawn into the attachments of the framebuffer. Next, unbind the framebuffer and draw something using the textures you attached to the framebuffer to see their contents.
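One optional sanity check, not shown in the example below: after attaching your color and depth textures, you can ask WebGL whether that attachment combination is actually renderable before drawing into it.
// with the framebuffer still bound
const status = gl.checkFramebufferStatus(gl.FRAMEBUFFER);
if (status !== gl.FRAMEBUFFER_COMPLETE) {
  console.log("framebuffer is incomplete: 0x" + status.toString(16));
}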
Example:
const gl = document.querySelector('#c').getContext('webgl');
const vshader = `
attribute vec4 a_position;
varying vec2 v_texcoord;
void main() {
gl_Position = a_position;
v_texcoord = a_position.xy * 0.5 + 0.5;
}
`;
const fshader = `
precision mediump float;
varying vec2 v_texcoord;
uniform sampler2D u_sampler;
void main() {
gl_FragColor = texture2D(u_sampler, v_texcoord);
}
`;
// compiles shaders, links program
const program = twgl.createProgram(gl, [vshader, fshader]);
gl.useProgram(program);
const positionLocation = gl.getAttribLocation(program, "a_position");
// a single triangle from top right to bottom left to bottom right
const verts = [
1, 1,
-1, 1,
-1, -1,
];
const vertBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(verts), gl.STATIC_DRAW);
{
const numElements = 2;
const type = gl.FLOAT;
const normalize = false;
const stride = 0;
const offset = 0;
gl.vertexAttribPointer(
positionLocation, numElements, type, normalize, stride, offset);
gl.enableVertexAttribArray(positionLocation);
}
// create an empty texture
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
{
const level = 0;
const internalFormat = gl.RGBA;
const width = 1;
const height = 1;
const border = 0;
const format = gl.RGBA;
const type = gl.UNSIGNED_BYTE;
const data = null;
gl.texImage2D(gl.TEXTURE_2D, level, internalFormat, width, height, border,
format, type, data);
}
// Create a framebuffer and attach the texture.
const fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
{
const level = 0
gl.framebufferTexture2D(
gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, level);
}
// Render to the texture (using clear because it's simple)
gl.clearColor(0, 1, 0, 1); // green;
gl.clear(gl.COLOR_BUFFER_BIT);
// Now draw with the texture to the canvas
// NOTE: We clear the canvas to red so we'll know that
// anywhere that shows up green is coming from the
// texture we rendered into above.
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
gl.clearColor(1, 0, 0, 1); // red
gl.clear(gl.COLOR_BUFFER_BIT);
{
const offset = 0;
const count = 3;
gl.drawArrays(gl.TRIANGLES, offset, count);
}
canvas { border: 1px solid black; }
<script src="https://twgljs.org/dist/4.x/twgl.min.js"></script>
<canvas id="c" width="400" height="400"></canvas>
You can read more about this topic here