How to move a texture in WebGL along the x-axis? - javascript

I am able to render an image to a texture, which is drawn in the top-left of the screen. Now I want to move that texture slowly to the right along the x-axis, add another texture in the top-left corner, and keep doing this continuously.
So far I can render a single texture, but I am unable to get any further. Can anybody help me?
Here is the code I have so far.
Code:
Vertex Shader
attribute vec2 a_position;
uniform vec2 u_resolution;
uniform mat3 u_matrix;
varying vec2 v_texCoord;

void main() {
    gl_Position = vec4(u_matrix * vec3(a_position, 1), 1);
    v_texCoord = a_position;
}
Fragment Shader
precision mediump float;

// our texture
uniform sampler2D u_image;

// the texCoords passed in from the vertex shader
varying vec2 v_texCoord;

void main() {
    gl_FragColor = texture2D(u_image, v_texCoord);
}
Javascript
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);  // note: bitwise OR, not ||
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, image.width, image.height, 0, gl.RGBA, gl.UNSIGNED_BYTE, dataTypedArray);
// Draw the rectangle.
gl.drawArrays(gl.TRIANGLES, 0, 6);
Where dataTypedArray is a Uint8Array of pixels.
Next I want to move this rendered texture and add more textures at the previous locations. Any suggestions?

I think the easiest way to accomplish this is to draw two quads with two separate draw calls. In the first draw call, draw the first quad at the desired x location; then switch to the second texture and draw the second quad at whatever its desired location is. You do not have to alter the shader in any way.
In pseudo-js:
// allocate the two textures
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

gl.bindTexture(gl.TEXTURE_2D, firstTexture);
// use whatever method you want (uniform/attribute/whatever) to upload the first quad's position
gl.drawArrays(gl.TRIANGLES, 0, 6);

gl.bindTexture(gl.TEXTURE_2D, secondTexture);
// upload the second quad's position
gl.drawArrays(gl.TRIANGLES, 0, 6);
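For the position, one option is to reuse the asker's u_matrix uniform and upload a different translation before each draw call. A rough, untested sketch (it assumes matrixLocation was looked up with gl.getUniformLocation, that firstTexture/secondTexture already exist, and that a_position is in the space u_matrix maps to clip space):

// Hypothetical helper: column-major 3x3 translation matrix for (tx, ty).
function translation3x3(tx, ty) {
    return new Float32Array([
        1,  0,  0,
        0,  1,  0,
        tx, ty, 1
    ]);
}

var matrixLocation = gl.getUniformLocation(program, "u_matrix");

function drawFrame(time) {
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

    // First quad: slide right along the x-axis over time (units depend on a_position's space).
    gl.bindTexture(gl.TEXTURE_2D, firstTexture);
    gl.uniformMatrix3fv(matrixLocation, false, translation3x3((time * 0.0001) % 2.0, 0));
    gl.drawArrays(gl.TRIANGLES, 0, 6);

    // Second quad: stays in the top-left corner.
    gl.bindTexture(gl.TEXTURE_2D, secondTexture);
    gl.uniformMatrix3fv(matrixLocation, false, translation3x3(0, 0));
    gl.drawArrays(gl.TRIANGLES, 0, 6);

    requestAnimationFrame(drawFrame);
}
requestAnimationFrame(drawFrame);

Increasing the x translation a little every frame makes the first quad drift to the right while the second one stays put.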

This is typically accomplished by translating the texture coordinates themselves. The vertex shader will often include a texture matrix:
attribute vec2 vertexTexCoord;  // input texture coord (s,t)
uniform mat4 TextureMatrix;     // for transforming (s,t)
varying vec2 texCoord;          // output texture coord
...
void main() {
    gl_Position = ModelViewProjection * vec4(vertexPosition, 1.0);
    texCoord = (TextureMatrix * vec4(vertexTexCoord, 0.0, 1.0)).st;
    ...
}
The texture matrix can translate, rotate, scale, ... the texture. The client JS code constructs the appropriate transform and loads it into the shader's texture matrix. If you wish to animate the translation, you might recompute the texture matrix at each frame:
... compute s-translation ds based on frame time ...
Texture = new Matrix4x4;
Texture.translate(ds, 0, 0).scale(4, 2, 1);
...
gl.uniformMatrix4fv(program.TextureMatrix, false, Texture.array);
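If you are not using a matrix library like the hypothetical Matrix4x4 above, the same translate-then-scale texture matrix can be written out by hand as a column-major Float32Array (a sketch, assuming program.TextureMatrix already holds the uniform location):

// Column-major 4x4 equivalent of translate(ds, 0, 0) followed by scale(4, 2, 1).
function makeTextureMatrix(ds) {
    return new Float32Array([
        4,  0, 0, 0,
        0,  2, 0, 0,
        0,  0, 1, 0,
        ds, 0, 0, 1
    ]);
}

var ds = (performance.now() * 0.0001) % 1.0;  // s-translation based on frame time
gl.uniformMatrix4fv(program.TextureMatrix, false, makeTextureMatrix(ds));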
If you wish to involve multiple textures, then you will have to use multiple texture units and access them as needed in the fragment shader:
varying vec2 texCoord;
uniform sampler2D texUnit0;
uniform sampler2D texUnit1;
...
void main() {
    ... choose which texture you want to access based on texCoord.st ...
    gl_FragColor = texture2D(desiredTexUnit, texCoord);
    ...
}
Take a look at this textured donut to see the effect of scaling texture coordinates with a single texture. You'll need to perform multi-texturing to get the effect you desire.
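As a concrete (untested) sketch of the placeholder above, the fragment shader could pick a sampler based on the s coordinate; the names texUnit0/texUnit1 and the 0.5 split are just illustrative:

varying vec2 texCoord;
uniform sampler2D texUnit0;
uniform sampler2D texUnit1;

void main() {
    // sample the left half of texture space from unit 0, the right half from unit 1
    if (texCoord.s < 0.5) {
        gl_FragColor = texture2D(texUnit0, texCoord);
    } else {
        gl_FragColor = texture2D(texUnit1, texCoord);
    }
}

and on the JavaScript side, bind one texture to each unit and tell the shader which unit each sampler uses:

gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, firstTexture);
gl.uniform1i(gl.getUniformLocation(program, "texUnit0"), 0);

gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_2D, secondTexture);
gl.uniform1i(gl.getUniformLocation(program, "texUnit1"), 1);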

Related

check if webgl texture is loaded in fragment shader

I am writing a WebGL program with texturing.
As long as the image isn't loaded, the texture2D function returns vec4(0.0, 0.0, 0.0, 1.0), so all objects are black.
So I would like to check whether my sampler2D is available.
I have already tried:
<script id="shader-fs" type="x-shader/x-fragment">
    precision mediump float;
    varying vec2 vTextureCoord;
    uniform sampler2D uSampler;

    void main(void) {
        vec4 color = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
        if (color.r == 0.0 && color.g == 0.0 && color.b == 0.0)
            color = vec4(1.0, 1.0, 1.0, 1.0);
        gl_FragColor = color;
    }
</script>
But of course this doesn't make sense, because the texture itself could be black.
Can anybody help me? How can I check, in the fragment shader, whether my texture image has already been loaded?
You can't really check that in WebGL.
Solutions:
1. Don't render until the texture is loaded.
2. Use a 1x1 pixel texture to start, then fill it in with the image once it's loaded. See this answer.
3. Pass more info to the shader, like a uniform bool textureLoaded.
Me, I always pick #2 because it means the app runs immediately and the textures get filled in as they download.
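A minimal sketch of that 1x1 placeholder approach (assuming a power-of-two image and a made-up URL):

var texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
// start with a single blue pixel so rendering can begin immediately
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE,
              new Uint8Array([0, 0, 255, 255]));

var image = new Image();
image.onload = function() {
    // replace the placeholder with the real image once it has downloaded
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
    gl.generateMipmap(gl.TEXTURE_2D);
};
image.src = "myTexture.png";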
I'd provide a new uniform that stores whether the texture is loaded or not.
Or you could write two shaders, with and without the texture, and select the proper one before rendering.

Calculating accurate UV coordinates in a spritesheet

I am working on a WebGL website.
I have a 256x256 canvas that I use as a texture in WebGL.
On this canvas I have rendered several images packed together with 1px spacing between them, using regular canvas rendering.
I use a 1x1 rectangle (scaled with the world matrix) to render the images in batches, i.e. I set up the entire render state, then change the UV as a uniform to the shader. It's a spritesheet of icons.
The shader I use to render it is
precision highp float;

attribute vec3 vertexPosition;
attribute vec2 textureCoordinate;

uniform mat4 worldMatrix;
uniform mat4 projectionMatrix;
uniform vec4 actualUV;
uniform float cacheSize;

varying vec2 fragCoord;

vec2 scaleVec;

void main(void) {
    scaleVec = vec2(cacheSize, cacheSize);
    gl_Position = projectionMatrix * worldMatrix * vec4(vertexPosition, 1.0);
    fragCoord = textureCoordinate * actualUV.zw;
    fragCoord = fragCoord + actualUV.xy;
    fragCoord = fragCoord * scaleVec;
}
The values I use are
actualUV = {x: 50, y: 50, z: 19, w: 19}; // for example
cacheSize = 256;
Which should render the 19x19 pixels at (50,50) in the texture into a 19x19 rectangle on the screen. And it does, almost.
The image is slightly off. It's blurry, and when I set MAG_FILTER to NEAREST I get a sharper image, but it is sometimes off by one pixel or, worse, half a pixel, causing some (very minor but noticeable) stretching. If I add a slight offset to correct this, other images rendered in the same way are off in the other direction. I cannot seem to figure it out. It seems like an issue with the floating-point calculation not being exact, but I cannot figure out where.
Try re-adjusting your coordinate system so that the UVs you pass in are within the [0,1] range, and get rid of your scaling factor. This can also be a premultiplied-alpha problem; try using gl.blendFunc(gl.ONE, gl.ONE_MINUS_SRC_ALPHA) together with gl.pixelStorei(gl.UNPACK_PREMULTIPLY_ALPHA_WEBGL, true) instead. If you are using this for tiles, then you need to pad the borders with actual pixels, and pad more if you need anisotropic filtering.
Also, if all of the above fails, try using a quad with dimensions equal to the image (the portion of the spritesheet) instead of a 1x1 quad.
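For the first suggestion, the normalization can happen on the CPU so the shader only ever sees [0,1] coordinates. A sketch using the question's numbers, with an assumed actualUVLocation from gl.getUniformLocation:

var cacheSize = 256;
var sprite = { x: 50, y: 50, w: 19, h: 19 };  // pixel rectangle in the spritesheet

// upload the offset and size already divided by the sheet size
gl.uniform4f(actualUVLocation,
             sprite.x / cacheSize, sprite.y / cacheSize,
             sprite.w / cacheSize, sprite.h / cacheSize);

The vertex shader then reduces to fragCoord = textureCoordinate * actualUV.zw + actualUV.xy; with no scaleVec multiplication.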

WebGL Weird Performance Drop in Chrome

I have been trying to learn WebGL so I have made a simple model loading script. I thought it was working all fine and dandy until I pulled up Chrome's devtools timeline and saw that I was getting ~20 FPS when rendering 100 models. The weird part was, the CPU was idle most of the time on each frame.
My first thought was that I must be GPU bound, which seemed unlikely because my shaders were short and not very complex. I tested with 10 models and again with only one, and that did increase performance, but I was still getting less than 40 FPS. I pulled up chrome://tracing and saw that my GPU wasn't doing much each frame either. The majority of each frame in CrGpuMain was filled with "Processing Swap", and CrBrowserMain spent nearly all of its time on glFinish (called by "CompositingIOSurfaceMac::DrawIOSurface").
Processing Swap appears to be the major problem, not glFinish.
So my question is, what is causing the browser to spend so much time on glFinish? Any ideas on how to fix this problem?
Each model has 40608 elements and 20805 vertices.
Update - corrected numbers for the simplified mesh:
3140 elements
6091 vertices
The models are pseudo-instanced, so data is only passed to the GPU once per frame. Is the large number of vertices the problem?
The demo can be found here.
Rendering Code:
Thing.prototype.renderInstances = function() {
    if (this.loaded) {
        var instance;

        gl.bindBuffer(gl.ARRAY_BUFFER, this.vertexbuffer);
        gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, 3, gl.FLOAT, false, 0, 0);

        gl.bindBuffer(gl.ARRAY_BUFFER, this.normalbuffer);
        gl.vertexAttribPointer(shaderProgram.vertexNormalAttribute, 3, gl.FLOAT, true, 0, 0);

        gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.indexbuffer);

        for (var i in this.instances) {
            instance = this.instances[i];
            setMatrixUniforms(instance.matrix);
            gl.drawElements(gl.TRIANGLES, this.numItems, gl.UNSIGNED_SHORT, 0);
        }
    }
};
Vertex Shader Code:
attribute vec3 aVertexPosition;
attribute vec3 aVertexNormal;

uniform mat4 uMMatrix;
uniform mat4 uVMatrix;
uniform mat4 uPMatrix;
uniform vec3 uLightDirection;

varying float vLight;

void main(void) {
    mat4 MVMatrix = uVMatrix * uMMatrix;
    // Toji's manual transpose
    mat3 normalMatrix = mat3(MVMatrix[0][0], MVMatrix[1][0], MVMatrix[2][0],
                             MVMatrix[0][1], MVMatrix[1][1], MVMatrix[2][1],
                             MVMatrix[0][2], MVMatrix[1][2], MVMatrix[2][2]);
    gl_Position = uPMatrix * uVMatrix * uMMatrix * vec4(aVertexPosition, 1.0);
    vec3 lightDirection = uLightDirection * normalMatrix;
    vec3 normal = normalize(aVertexNormal * normalMatrix);
    vLight = max(dot(aVertexNormal, uLightDirection), 0.0);
}

Is it possible to apply fog from a point in a three.js scene that is independent of the camera?

For example, given a terrain with an avatar on it with a camera far away overhead: is it possible to render the fog so that the avatar remains perfectly unfogged while the terrain around the avatar fades into the fog?
Sure, though as far as I know, you'll have to make your own shader rather than using the ones provided with three.js. There may be a way to customize them in this way, but if there is, I'm not familiar with it.
Check out this answer on doing fog as distance from the camera. The idea, as explained there, is to pass the camera position in as a uniform to the shader, then in the vertex shader on all your objects, you find the distance from the camera position to the vertex you're transforming. You then pass that distance along as a varying to the fragment shader, and you can figure out the distance per pixel, which you use to mix between a fogged color and the object's regular color. You can see that in this example from the OpenGL ES 2.0 Programming guide.
To change it to be based on distance from the character is simple: you just pass in the character position as the uniform that you're calculating distance from instead of the camera position (in that sample code, you would replace u_eyePos with something like u_characterPos and maybe change the varying from v_eyeDist to v_characterDist). Except for any name changes, the fragment shader can be exactly the same.
So, something like this (WARNING: NOT TESTED. You're going to have to fix this up to get three.js happy with using it. There are a ton of examples of that, though, like this one):
vertex shader:
uniform mat4 matViewProjection;
uniform mat4 matView;
uniform vec4 u_characterPos;

attribute vec4 rm_Vertex;
attribute vec2 rm_TexCoord0;

varying vec2 v_texCoord;
varying float v_characterDist;

void main() {
    // Transform vertex to view space
    vec4 vViewPos = matView * rm_Vertex;
    // Compute the distance to the character
    v_characterDist = length(vViewPos - u_characterPos);
    gl_Position = matViewProjection * rm_Vertex;
    v_texCoord = rm_TexCoord0.xy;
}
fragment shader:
precision mediump float;

uniform vec4 u_fogColor;
uniform float u_fogMaxDist;
uniform float u_fogMinDist;
uniform sampler2D baseMap;

varying vec2 v_texCoord;
varying float v_characterDist;

float computeLinearFogFactor() {
    float factor;
    // Compute linear fog equation
    factor = (u_fogMaxDist - v_characterDist) /
             (u_fogMaxDist - u_fogMinDist);
    // Clamp in the [0,1] range
    factor = clamp(factor, 0.0, 1.0);
    return factor;
}

void main() {
    float fogFactor = computeLinearFogFactor();
    vec4 baseColor = texture2D(baseMap, v_texCoord);
    // Compute final color as a lerp between the base color and the fog color
    gl_FragColor = baseColor * fogFactor + u_fogColor * (1.0 - fogFactor);
}
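To drive this from three.js, the character position has to be refreshed every frame. A rough, untested sketch with a ShaderMaterial follows; since the vertex shader above measures distance in view space, the avatar's world position is transformed by the camera's view matrix first, and the material's matrix uniforms (or their three.js built-in equivalents) are omitted here:

// Hypothetical setup: vertexShaderSource/fragmentShaderSource hold the shaders above,
// avatar is the character Object3D, terrainTexture is the terrain's texture.
var fogUniforms = {
    u_characterPos: { value: new THREE.Vector4() },
    u_fogColor:     { value: new THREE.Vector4(0.8, 0.8, 0.8, 1.0) },
    u_fogMinDist:   { value: 10.0 },
    u_fogMaxDist:   { value: 50.0 },
    baseMap:        { value: terrainTexture }
};

var fogMaterial = new THREE.ShaderMaterial({
    uniforms: fogUniforms,
    vertexShader: vertexShaderSource,
    fragmentShader: fragmentShaderSource
});

function animate() {
    requestAnimationFrame(animate);
    // express the avatar's position in view space so it matches vViewPos in the shader
    var viewPos = avatar.position.clone().applyMatrix4(camera.matrixWorldInverse);
    fogUniforms.u_characterPos.value.set(viewPos.x, viewPos.y, viewPos.z, 1.0);
    renderer.render(scene, camera);
}
animate();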

WEBGL - How to display an image?

I'm looking to simply display an image on the canvas at x and y coordinates using WebGL, but I have no clue how to do it. Do I need to include shaders and all that stuff? I've seen code that displays images, but it is very bulky. I do not wish to use a framework. If possible, could you comment and explain what the important sections do? I will be using WebGL for a 2D tile-based game.
Thank you for your time.
Yes, you need a vertex and a fragment shader, but they can be relatively simple. I'd recommend starting from the Mozilla example, as suggested by Ido, and after you get it running, removing the 3D aspect. In particular, you don't need the uMVMatrix and uPMatrix, and your coordinate array can be 2D. For the vertex shader, that means:
attribute vec2 aVertexPosition;
attribute vec2 aTextureCoord;

varying highp vec2 vTextureCoord;

void main(void) {
    gl_Position = vec4(aVertexPosition, 0.0, 1.0);
    vTextureCoord = aTextureCoord;
}
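The matching fragment shader just samples the texture (this is essentially the one from the Mozilla example):

varying highp vec2 vTextureCoord;
uniform sampler2D uSampler;

void main(void) {
    gl_FragColor = texture2D(uSampler, vTextureCoord);
}

To place the image at a particular pixel position, convert the pixel coordinates to clip space when you fill the vertex buffer, e.g. clipX = (x / canvas.width) * 2 - 1 and clipY = 1 - (y / canvas.height) * 2.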
