WebGL: data to non-square alpha texture - javascript

I have a weird problem with WebGL.
I'm using a dynamically generated texture of which only the alpha channel matters.
Here's the code:
var texture = new Uint8Array(ar); // ar is my array
gl.bindTexture(gl.TEXTURE_2D, this.transparencyTexture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.ALPHA, array.length, array[0].length, 0, gl.ALPHA, gl.UNSIGNED_BYTE, texture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
I'm always using power-of-two (POT) values for the array's "width" and "height", but whenever width ≠ height it doesn't work. So it currently only works with squares.
What can be done?
EDIT:
http://jsfiddle.net/SergeJcqmn/EAmjU/9/

In the jsfiddle, line 76 of the js is incorrect:
ar.push(array[x][(array.length - 1) - y] ? 128 : 0);
I believe this should be:
ar.push(array[x][(array[0].length - 1) - y] ? 128 : 0);
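To make the row/column bookkeeping explicit, here is a minimal sketch (not the original fiddle code) of flattening a 2D array into the Uint8Array in the layout texImage2D expects, given the width/height arguments used above; buildAlphaData is an illustrative name:

```javascript
// Minimal sketch: flatten a 2D array into a 1D alpha buffer whose layout
// matches texImage2D(width = array.length, height = array[0].length).
// Rows must be emitted one after another, width pixels per row, or
// non-square sizes come out scrambled.
function buildAlphaData(array) {
  var width = array.length;     // matches the width passed to texImage2D
  var height = array[0].length; // matches the height passed to texImage2D
  var ar = [];
  for (var y = 0; y < height; y++) {
    for (var x = 0; x < width; x++) {
      // Flip vertically using the height (array[0].length), as in the fixed fiddle line
      ar.push(array[x][(height - 1) - y] ? 128 : 0);
    }
  }
  return new Uint8Array(ar);
}
```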

Related

error: WebGL warning: texImage2D: Desired upload requires more data than is available: (when loading the texture with triangle mesh data and normals)

I have probably messed something up when loading two textures. I am getting this error: "WebGL warning: texImage2D: Desired upload requires more data than is available: (0 rows plus 246 pixels needed, 0 rows plus 244 pixels available)"
and nothing on the screen. I have two textures, one with the vertices of a mesh and another with the normals. Here is how I try to load the textures:
const uLSr = gl.getUniformLocation(P, 'uMeshData');
const uNormData = gl.getUniformLocation(P, 'uNormData');
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
const verts = [
...
4.910892,0.000000,4.910892,
-4.910892,0.000000,-4.910892,
-4.910892,0.000000,4.910892,
4.910892,0.000000,4.910892,
4.910892,0.000000,-4.910892,
...
];
const vertsNorm = [
...
0.0000,-1.0000,0.0000,
0.4253,-0.8506,0.3090,
-0.1625,-0.8506,0.5000,
0.7236,-0.4472,0.5257,
0.4253,-0.8506,0.3090,
...
];
const meshVerts = new Float32Array(verts);
const vertsLenght = meshVerts.length / 3;
gl.uniform1i(uLvertices, vertsLenght);
gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB32F, vertsLenght, 1, 0, gl.RGB, gl.FLOAT, meshVerts);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
const textureNorm = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, textureNorm);
const meshNorm = new Float32Array(vertsNorm);
gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);
// vertsLenght is the same because normals and vertices have the same length:
// for each normal there is one single vertex
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB32F, vertsLenght, 1, 0, gl.RGB, gl.FLOAT, meshNorm);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
And then in the draw function I bind them like this:
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_2D, textureNorm);
gl.uniform1i(uLSr, 0);
gl.uniform1i(uNormData, 1);
And then inside the shader I try to unpack them like this:
for (int i = 6; i < vertsCount; i += 3) {
    a = texelFetch(uMeshData, ivec2(i, 0), 0);
    b = texelFetchOffset(uMeshData, ivec2(i, 0), 0, ivec2(1, 0));
    c = texelFetchOffset(uMeshData, ivec2(i, 0), 0, ivec2(2, 0));
    aN = texelFetch(uNormData, ivec2(i, 0), 0);
    bN = texelFetchOffset(uNormData, ivec2(i, 0), 0, ivec2(1, 0));
    cN = texelFetchOffset(uNormData, ivec2(i, 0), 0, ivec2(2, 0));
    triangleNormal = (aN.xyz + bN.xyz + cN.xyz) / 3.;
    vec3 uvt;
    vec3 intersect;
    float z;
    bool isHit = hitTriangleSecond(R_.orig, R_.dir, a.xyz, b.xyz, c.xyz, uvt, triangleNormal, intersect, z);
    if (isHit) {
        if (z < mindist && z > 0.001) {
            hitPos1 = intersect;
            mindist = z;
            weHitSomething = true;
            material.type = DIEL;
            material.albedo = vec3(.8, .3, .4);
            normal = triangleNormal;
            hitPos = hitPos1;
        }
    }
}
If I comment out the line where I calculate the normal of the surface and replace it with the normal I got from the texture, I get a white page. So I assume I have made something wrong with the loading of the textures. Or is it in the hitTriangleSecond function, where I calculate the face normal of the triangle?
Below is a link to a jsfiddle with a minimal example (it's long because I include the vertices and normals of a plane and a sphere made of triangles):
link of jsfiddle example
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB32F, vertsLenght, 1, 0, gl.RGB, gl.FLOAT, meshNorm);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
I found the mistake: it was the vertsLenght variable in the snippet above. I made a new variable for the second texture, because reusing vertsLenght didn't seem to work for some reason:
const normLength = meshNorm.length /3;
gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB32F, normLength, 1, 0, gl.RGB, gl.FLOAT, meshNorm);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
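That matches the error message: judging from it, the second texture was declared vertsLenght (246) texels wide, but meshNorm only contained enough floats for 244 RGB texels. A minimal sketch of the same upload with the width derived from the array actually being uploaded (uploadRgb32fRow is an illustrative helper name, not from the original code):

```javascript
// Minimal sketch: derive the texture width from the data being uploaded,
// so the declared size can never exceed the number of RGB32F texels
// available in the Float32Array.
function uploadRgb32fRow(gl, data) {
  const pixels = new Float32Array(data);
  const width = pixels.length / 3; // one RGB32F texel per 3 floats
  gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB32F, width, 1, 0, gl.RGB, gl.FLOAT, pixels);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  return width;
}
```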

Rendering Pipeline works in WebGL 1 but not in WebGL 2

Update
This was already answered by @gman.
A call to gl.getExtension('EXT_color_buffer_float') was needed when starting things up.
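A minimal sketch of that startup call, assuming gl is the WebGL2 rendering context (the variable name is illustrative):

```javascript
// Without EXT_color_buffer_float, float formats such as R32F are not
// color-renderable in WebGL2, so the framebuffer stays incomplete.
const ext = gl.getExtension('EXT_color_buffer_float');
if (!ext) {
  throw new Error('EXT_color_buffer_float not supported; cannot render to float textures');
}
```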
The issue is that a call to gl.checkFramebufferStatus(gl.FRAMEBUFFER) does not return gl.FRAMEBUFFER_COMPLETE under WebGL2. I make this call just prior to calling gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4).
Here's the common sequence of things for either version. The differences are in createTexture() and readTexture(), which are specific to each WebGL version.
General Flow:
gl.useProgram(program);
var texShape = getTextureShape(programInfo.outputTensor);
var outputTexture = this.textureManager.getOrCreateTexture(programInfo.outputTensor);
this.gpuContext.attachFramebuffer(outputTexture, texShape.width, texShape.height);
var inputTextures = this.createTextures(programInfo.textureData);
this.bindAttributes(buildArtifact.attribLocations);
this.bindUniforms(buildArtifact.uniformLocations, programInfo.uniformData);
this.bindTextures(buildArtifact.uniformLocations, inputTextures);
if (!this.gpuContext.isFramebufferReady()) {
throw new Error("Framebuffer is not ready");
}
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
this.copyToOutput(outputTexture, programInfo.outputTensor);
attachFramebuffer: common
GpuContext.prototype.attachFramebuffer = function (texture, width, height) {
var gl = this.gl;
gl.activeTexture(gl.TEXTURE0 + (this.maxTextureImageUnits - 1));
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.bindFramebuffer(gl.FRAMEBUFFER, this.framebuffer);
gl.framebufferTexture2D(gl.FRAMEBUFFER,
gl.COLOR_ATTACHMENT0,
gl.TEXTURE_2D,
texture,
0);
gl.viewport(0, 0, width, height);
};
createTexture: WebGL 1.0:
GpuContext.prototype.createTexture = function (width, height, data) {
var gl = this.gl;
var texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
// TODO: remove this override
var type = gl.FLOAT;
var buffer = null;
if (data) {
buffer = new Float32Array(data.length * 4);
data.forEach(function (value, index) { return buffer[index * 4] = value; });
}
// Pixel format and data for the texture
gl.texImage2D(gl.TEXTURE_2D, // Target, matches bind above.
0,
gl.RGBA,
width,
height,
0,
gl.RGBA,
type,
buffer);
gl.bindTexture(gl.TEXTURE_2D, null);
return texture;
};
readTexture: WebGL 1.0:
GpuContext.prototype.readTexture = function (texture, width, height) {
var gl = this.gl;
var buffer = new Float32Array(width * height * 4);
var format = gl.RGBA;
var type = gl.FLOAT;
// bind texture to framebuffer
gl.framebufferTexture2D(gl.FRAMEBUFFER,
gl.COLOR_ATTACHMENT0,
gl.TEXTURE_2D,
texture,
0);
if (!this.isFramebufferReady()) {
throw new Error("Framebuffer is not ready after attaching texture");
}
gl.readPixels(0, // x-coord of lower left corner
0, // y-coord of lower left corner
width, // width of the block
height, // height of the block
format, // Format of pixel data.
type, // Data type of the pixel data, must match makeTexture
buffer); // Load pixel data into buffer
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
return buffer.filter(function (value, index) { return index % 4 === 0; });
};
WebGL 2.0 Overrides:
WebGL2GpuContext.prototype.createTexture = function (width, height, data) {
var gl = this.gl;
var texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
var internalFormat = WebGL2RenderingContext.R32F;
var format = WebGL2RenderingContext.RED;
var type = gl.FLOAT;
gl.texImage2D(gl.TEXTURE_2D,
0,
internalFormat,
width,
height,
0,
format,
type,
data);
gl.bindTexture(gl.TEXTURE_2D, null);
return texture;
};
WebGL2GpuContext.prototype.readTexture = function (texture, width, height) {
var gl = this.gl;
var buffer = new Float32Array(width * height);
var format = WebGL2RenderingContext.RED;
var type = gl.FLOAT;
gl.bindFramebuffer(gl.FRAMEBUFFER, this.framebuffer);
gl.framebufferTexture2D(gl.FRAMEBUFFER,
gl.COLOR_ATTACHMENT0,
gl.TEXTURE_2D,
texture,
0);
if (!this.isFramebufferReady()) {
throw new Error("Framebuffer is not ready after attaching texture");
}
gl.readPixels(0,
0,
width,
height,
format,
type,
buffer);
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
return buffer;
};
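For reference, isFramebufferReady() is called above but not shown; it is assumed to be a thin wrapper around the completeness check mentioned at the top of the question, roughly:

```javascript
// Assumed helper (not shown in the original post): the framebuffer is "ready"
// only when the currently bound FRAMEBUFFER reports completeness.
GpuContext.prototype.isFramebufferReady = function () {
  var gl = this.gl;
  return gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE;
};
```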

WebGL texture not shown in rectangle

I'm having a problem texturing my rectangle: a black canvas is displayed instead of a canvas textured with the image.
First I create the WebGL program, attach the shaders, and link the program as usual.
Then I create the texture when the image has loaded, like this:
var texture = gl.createTexture();
var image = document.createElement("img");
image.src = "https://upload.wikimedia.org/wikipedia/commons/3/3a/Saint-Gervais-les-Bains_-_Mt-Blanc_JPG01.jpg";
image.onload = function() {
gl.bindTexture(gl.TEXTURE_2D, texture);
// Set the parameters so we can render any size image.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
// Upload the image into the texture.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
}
After that I pass the information about the rectangle vertices into the vertex shader:
var pos = gl.getAttribLocation(program, "pos");
gl.enableVertexAttribArray(pos);
var pos_Buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, pos_Buffer);
var vertices = [-1.0, -1.0, // "left-down"
-1.0, 1.0, // "left-top"
1.0, -1.0, // "right-down"
1.0, 1.0, // "right-top"
];
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);
gl.vertexAttribPointer(pos, 2, gl.FLOAT, false, 0, 0);
And in the end I draw my rectangle by passing the vertex indices into the drawElements function:
var indexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
var indices = [0, 1, 2, 1, 2, 3];
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint8Array(indices), gl.STATIC_DRAW);
// draw triangles
gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_BYTE, 0);
Here is a jsfiddle with my problem.
Do you have any idea how to solve it?
image.onload is an asynchronous callback, and the draw call runs before it executes (the image isn't loaded yet when you draw on the canvas).
You must put gl.drawElements inside it:
image.onload = function() { // image.onload STARTS
gl.bindTexture(gl.TEXTURE_2D, texture);
// Set the parameters so we can render any size image.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
// Upload the image into the texture.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
// anything you want, blehblahbleh ...
// draw on canvas
gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_BYTE, 0);
} // image.onload ENDS
Also, yes, there was a problem with the cross-origin resource request, but I guess you solved that with an extension (or for jsfiddle tests you could use a base64 data URL).
Slightly updated sample: http://jsfiddle.net/windkiller/6cLo3890/
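For the cross-origin issue mentioned above, a minimal sketch of requesting the image with CORS instead of relying on a browser extension; it assumes the image host sends the appropriate Access-Control-Allow-Origin header (Wikimedia Commons does):

```javascript
var image = document.createElement("img");
// Request the image with CORS so WebGL may read its pixels; without this,
// texImage2D with a cross-origin image throws a security error.
image.crossOrigin = "anonymous";
image.onload = function () {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // texParameteri calls as in the snippet above (CLAMP_TO_EDGE / NEAREST)
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_BYTE, 0);
};
image.src = "https://upload.wikimedia.org/wikipedia/commons/3/3a/Saint-Gervais-les-Bains_-_Mt-Blanc_JPG01.jpg";
```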

Using array of images as texture in WebGL

How can I use an array of textures (each containing a different image) in WebGL without initializing each texture separately? I want to display a cube and a pyramid (organized as an array), each with a different texture image. Here is part of the code (to initialize the textures):
function handleTexture(texture) {
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, texture.Img);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);
gl.generateMipmap(gl.TEXTURE_2D);
gl.bindTexture(gl.TEXTURE_2D, null);
}
function initTextures() {
for (i=0; i<2; i++) {
tex[i] = gl.createTexture();
tex[i].Img = new Image();
tex[i].Img.onload = function() { handleTexture(tex[i]); }
tex[i].Img.src = texImgs[i]; // The name of the image file
}
}
The cube and pyramid are displayed in black (no texture), and I get this error on the page:
Uncaught TypeError: Cannot read property 'Img' of undefined // gl.texImage2D()
tex.(anonymous function).Img.onload // tex[i].Img.onload = ...
I don't get this error if I initialize the textures separately (not using an array, but tex1 and tex2). Any suggestion on how to do this using an array?
When you do:
tex[i] = gl.createTexture();
tex[i] becomes a WebGLTexture, and you cannot attach new properties to that object the way you did with the .Img property. So a slightly more correct version would be:
var tex = [], texImgs = ['pyra.jpg', 'cube.jpg'];
function handleTexture(myTexture) {
myTexture.texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, myTexture.texture);
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, myTexture.image);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);
gl.generateMipmap(gl.TEXTURE_2D);
gl.bindTexture(gl.TEXTURE_2D, null);
}
function initTextures() {
for (var i=0; i<2; i++) {
(function(index) {
tex[index] = new Object();
tex[index].texture = null;
tex[index].image = new Image();
tex[index].image.onload = function() { handleTexture(tex[index]); }
tex[index].image.src = texImgs[index]; // The name of the image file
})(i);
}
}
Note that you now create a plain object with separate image and texture properties.
Hope this helps.
EDIT:
Without the closure (the immediately invoked function around the loop body), every onload handler would see the final value of the loop index i, which messes up the index into tex.
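As an aside, a minimal sketch of the same loop using let instead of the IIFE; block scoping gives each iteration its own index, so the closure problem from the EDIT doesn't arise (handleTexture and texImgs are assumed to be the same as above):

```javascript
function initTextures() {
  for (let i = 0; i < 2; i++) {
    // "let" is block-scoped, so each onload closure captures its own i.
    tex[i] = { texture: null, image: new Image() };
    tex[i].image.onload = function () { handleTexture(tex[i]); };
    tex[i].image.src = texImgs[i]; // The name of the image file
  }
}
```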

WebGL weird rendering with a simple texture

I'm having trouble rendering textures on my custom 3D shape.
With the following parameters:
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, texture.image);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);
gl.generateMipmap(gl.TEXTURE_2D);
gl.bindTexture(gl.TEXTURE_2D, null);
It gives me this result:
Changing the following parameters:
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);
It gives me this :
And with these parameters:
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
I tried changing the texture coordinates, but without success; the ones used for each face were:
0.0, 0.0,
0.0, 10.0,
10.0, 10.0,
10.0, 0.0
Any idea why one triangle (actually two, counting the one on the parallel face) behaves weirdly?
The issue is that you're using faces with a varying number of points, but still using a fixed number (4) of texture coords per face. Since some faces have 5 points, at some point the texture coords get "shifted".
Instead, you should add them just like the vertices:
for (var j = 0; j < face.length; j++) {
  // ... push the vertices for this face point, as before ...
  // generate texture coords; they could also be stored like the points
  textureCoords.push(points[face[j]][0] / 4);
  textureCoords.push((points[face[j]][1] + points[face[j]][2]) / 4);
}
http://jsfiddle.net/bNTkK/9/
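A more self-contained sketch of the same idea follows; it assumes points is an array of [x, y, z] positions and faces is an array of index lists of varying length (these names are illustrative, matching the snippet above rather than the full fiddle):

```javascript
// One texture coordinate is generated per emitted vertex, so the texcoords
// always stay in step with the vertices even when faces have 4 or 5 points.
var vertices = [];
var textureCoords = [];
faces.forEach(function (face) {
  face.forEach(function (pointIndex) {
    var p = points[pointIndex];
    vertices.push(p[0], p[1], p[2]);
    // Simple planar-style mapping derived from the point position, as above.
    textureCoords.push(p[0] / 4);
    textureCoords.push((p[1] + p[2]) / 4);
  });
});
```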
