I am trying to blend two textures with an alpha channel over each other.
After looking around the net, it seemed there was no simple way to solve this, so I tried this trick in the fragment shader:
if(gl_FragColor.a < 0.5){
discard;
}
This works for simpler textures without much alpha variation, like the human sprite in the background. But I want to be able to work with more complex images like the gradient sprite, which doesn't work at all.
This is my fragment shader:
precision mediump float;
varying vec3 fragColor;
varying highp vec2 vTextureCoord;
uniform sampler2D uSampler;
void main()
{
vec4 tex = texture2D(uSampler, vTextureCoord);
gl_FragColor = tex * vec4(fragColor, 1.0);
if(gl_FragColor.a < 0.5){
discard;
}
}
This is my vertex shader:
precision mediump float;
attribute vec3 vertPosition;
attribute vec3 vertColor;
attribute vec2 aTextureCoord;
varying vec3 fragColor;
varying highp vec2 vTextureCoord;
uniform mat4 uPMatrix;
uniform mat4 uMVMatrix;
uniform vec2 uvOffset;
uniform vec2 uvScale;
void main()
{
fragColor = vertColor;
gl_Position = uPMatrix * uMVMatrix * vec4(vertPosition.x, vertPosition.y, vertPosition.z, 1.0);
vTextureCoord = aTextureCoord * uvScale + uvOffset;
}
This is a part of the gl setup I use:
gl.enable(gl.DEPTH_TEST);
gl.enable(gl.BLEND);
gl.blendEquation(gl.FUNC_ADD);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE);
Currently all sprites are being drawn on the same z axis, 0. However, I don't know if this is the source of the problem, as I tested giving each object a random z value and the problem persisted.
Edit:
In response to Rabbid76's comment.
This works very well! The alpha is blended, but the only problem is that the sprites look "burned":
I tried to alter the fragment shader to this:
gl_FragColor = tex * vec4(tex.rgb, tex.a);
But it still looks burned.
Edit 2
I solved it. gl_FragColor should be:
gl_FragColor = vec4(tex.rgb, tex.a);
and not
gl_FragColor = vec4(fragColor* tex.rgb, tex.a);
otherwise it creates a burned-looking blend effect.
Currently all sprites are being drawn on the same z axis, 0.
Since the depth test is enabled (gl.enable(gl.DEPTH_TEST)) and the default depth function (depthFunc) is gl.LESS, the second sprite drawn at the same depth won't pass the depth test. You have to disable the depth test:
gl.disable(gl.DEPTH_TEST);
Further, I recommend adapting the blend function (blendFunc):
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
Alternatively, you can use alpha premultiplication. For that you have to adapt the fragment shader:
gl_FragColor = vec4(fragColor * tex.rgb * tex.a, tex.a);
And you have to use the following blend function (blendFunc):
gl.blendFunc(gl.ONE, gl.ONE_MINUS_SRC_ALPHA);
Note, you don't need if(gl_FragColor.a < 0.5) discard; any more.
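For reference, here is a minimal sketch of the two setups on the JavaScript side. The drawSprites() call is a placeholder for your own draw code, not part of the original question:

// Option 1: standard ("straight") alpha blending
gl.disable(gl.DEPTH_TEST);
gl.enable(gl.BLEND);
gl.blendEquation(gl.FUNC_ADD);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
drawSprites(); // hypothetical: draw the sprites back-to-front with the unmodified shader

// Option 2: premultiplied alpha (the shader multiplies rgb by alpha)
gl.blendFunc(gl.ONE, gl.ONE_MINUS_SRC_ALPHA);
drawSprites(); // same draw code, but with the premultiplying fragment shader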
Related
While learning the topic from the Webgl-2-textures example, I found that the two textures share the same a_position and a_texCoord, so they are displayed at the same size on screen using one drawArrays call.
Is it possible to have different a_position and a_texCoord values, but still have the two images processed together? (Suppose u_image0 is the background, and u_image1 is the front image, which could have a screen effect with the background, as follows.)
precision mediump float;
// our textures
uniform sampler2D u_image0;
uniform sampler2D u_image1;
// the texCoords passed in from the vertex shader.
varying vec2 v_texCoord0;
varying vec2 v_texCoord1;
void main() {
vec4 color0 = texture2D(u_image0, v_texCoord0);
vec4 color1 = texture2D(u_image1, v_texCoord1);
gl_FragColor = color0 * color1;
}
It is not possible to have different positions since WebGL only draws one pixel at a time. There is only one gl_Position to set.
It is possible to have different texture coordinates for each texture: you can either compute different texture coords in your fragment shader, for example
vec4 color0 = texture2D(u_image0, v_texCoord0);
vec4 color1 = texture2D(u_image1, v_texCoord0 * 2.0);
Now the second image is using different texture coordinates. That example might be silly, but the point is that it's your code; you can put any math there you want. For example, you could pass in offsets and scales as uniforms:
uniform vec2 offset1;
uniform vec2 offset2;
uniform vec2 scale1;
uniform vec2 scale2;
vec4 color0 = texture2D(u_image0, v_texCoord0 * scale1 + offset1);
vec4 color1 = texture2D(u_image1, v_texCoord0 * scale2 + offset2);
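If you go the uniform route, those offsets and scales get set from JavaScript before drawing. A rough sketch, assuming a linked program in a variable named program (the values here are just illustrative):

// look up the uniform locations once after linking
const offset1Loc = gl.getUniformLocation(program, "offset1");
const scale1Loc = gl.getUniformLocation(program, "scale1");
const offset2Loc = gl.getUniformLocation(program, "offset2");
const scale2Loc = gl.getUniformLocation(program, "scale2");

// per draw: shift/scale each texture's coordinates independently
gl.useProgram(program);
gl.uniform2f(offset1Loc, 0.0, 0.0);
gl.uniform2f(scale1Loc, 1.0, 1.0);
gl.uniform2f(offset2Loc, 0.25, 0.25);
gl.uniform2f(scale2Loc, 2.0, 2.0);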
Alternatively, you can make a vertex shader that passes in different texture coordinates for each texture:
attribute vec2 texcoord0;
attribute vec2 texcoord1;
varying vec2 v_texCoord0;
varying vec2 v_texCoord1;
void main() {
v_texCoord0 = texcoord0;
v_texCoord1 = texcoord1;
}
In the same way as above, what math you include is up to you: add an offset, a scale, a full matrix, or whatever you want.
It's common to allow a full matrix.
vec2 uv = (some4x4Matrix * vec4(texcoord, 0, 1)).xy;
This lets you translate (offset), scale, and even rotate the texture coordinates.
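As a rough sketch of driving that matrix from JavaScript (this assumes the gl-matrix library; angleInRadians, scaleX, and scaleY are made-up placeholders):

// build a texture matrix that scales and rotates around the texture center,
// then upload it to the some4x4Matrix uniform used in the shader snippet above
const texMatrix = mat4.create();
mat4.translate(texMatrix, texMatrix, [0.5, 0.5, 0]);   // move pivot to texture center
mat4.rotateZ(texMatrix, texMatrix, angleInRadians);
mat4.scale(texMatrix, texMatrix, [scaleX, scaleY, 1]);
mat4.translate(texMatrix, texMatrix, [-0.5, -0.5, 0]); // move pivot back

gl.uniformMatrix4fv(gl.getUniformLocation(program, "some4x4Matrix"), false, texMatrix);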
Here's an article that uses that style
So I made a WebGL website and it can be accessed here:
https://jameswebsite.azurewebsites.net/
It looks fine (or as expected) on the PC, but on mobile devices it looks funny. It looks like the texture mapping might be off (maybe texture clipping is the problem), but there also doesn't appear to be any shading occurring.
Here is the screenshot from the PC:
PC Image
Here is the screenshot from the Mobile Device:
Mobile Image
I have turned off the textures and still have this problem. This leads me to believe the problem could be in my shaders. Here are my shaders:
<script id="per-fragment-lighting-fs" type="x-shader/x-fragment">
precision mediump float;
varying vec2 vTextureCoord;
varying vec3 vTransformedNormal;
varying vec4 vPosition;
uniform float uMaterialShininess;
uniform bool uShowSpecularHighlights;
uniform bool uUseLighting;
uniform bool uUseTextures;
uniform vec3 uAmbientColor;
uniform vec3 uPointLightingLocation;
uniform vec3 uPointLightingSpecularColor;
uniform vec3 uPointLightingDiffuseColor;
uniform sampler2D uSampler;
void main(void) {
vec3 lightWeighting;
if (!uUseLighting) {
lightWeighting = vec3(1.0, 1.0, 1.0);
} else {
vec3 lightDirection = normalize(uPointLightingLocation - vPosition.xyz);
vec3 normal = normalize(vTransformedNormal);
float specularLightWeighting = 0.0;
if (uShowSpecularHighlights) {
vec3 eyeDirection = normalize(-vPosition.xyz);
vec3 reflectionDirection = reflect(-lightDirection, normal);
specularLightWeighting = pow(max(dot(reflectionDirection, eyeDirection), 0.0), uMaterialShininess);
}
float diffuseLightWeighting = max(dot(normal, lightDirection), 0.0);
lightWeighting = uAmbientColor
+ uPointLightingSpecularColor * specularLightWeighting
+ uPointLightingDiffuseColor * diffuseLightWeighting;
}
vec4 fragmentColor;
if (uUseTextures) {
fragmentColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
} else {
fragmentColor = vec4(1.0, 1.0, 1.0, 1.0);
}
gl_FragColor = vec4(fragmentColor.rgb * lightWeighting, fragmentColor.a);
}
<script id="per-fragment-lighting-vs" type="x-shader/x-vertex">
attribute vec3 aVertexPosition;
attribute vec3 aVertexNormal;
attribute vec2 aTextureCoord;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
uniform mat3 uNMatrix;
varying vec2 vTextureCoord;
varying vec3 vTransformedNormal;
varying vec4 vPosition;
void main(void) {
vPosition = uMVMatrix * vec4(aVertexPosition, 1.0);
gl_Position = uPMatrix * vPosition;
vTextureCoord = aTextureCoord;
vTransformedNormal = uNMatrix * aVertexNormal;
}
Any ideas? Thanks in Advance!
On mobile devices, WebGL implementations can be sensitive to things like non-power-of-two texture dimensions, or can require texture wrapping behaviour to be explicitly set.
I noticed your grass texture has dimensions of 590x590. Consider resizing your textures to the closest power-of-two dimensions (i.e. in the case of your grass texture, 512x512).
Also, I would recommend you explicitly set the gl.TEXTURE_WRAP_S and gl.TEXTURE_WRAP_T parameters on your texture object(s) to gl.REPEAT, as this may be another cause of your textures only partially displaying on geometry.
You can set wrapping behaviour on a texture object in the following way:
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);
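If you'd rather keep the original image sizes, one common pattern (a sketch along the lines of the MDN tutorial; texture and image stand for your texture object and loaded image) is to check the dimensions at upload time and fall back to clamped, non-mipmapped parameters for non-power-of-two textures:

function isPowerOf2(value) {
  return (value & (value - 1)) === 0;
}

gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
if (isPowerOf2(image.width) && isPowerOf2(image.height)) {
  gl.generateMipmap(gl.TEXTURE_2D);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);
} else {
  // WebGL1 requires CLAMP_TO_EDGE wrapping and no mipmaps for NPOT textures
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
}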
Per @gman:
change mediump to highp in your shaders and see what that does
This actually worked!! Thanks @gman!
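Since not every mobile GPU supports highp in fragment shaders, a small sketch of how one might check the supported precision before choosing it (not from the original answer):

// query whether the fragment shader supports high-precision floats
const highpSupport = gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT);
const floatPrecision = (highpSupport && highpSupport.precision > 0) ? "highp" : "mediump";
// e.g. prepend "precision " + floatPrecision + " float;" to the fragment shader source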
I'm new to Three.js. Mostly I'm wondering whether this is the right approach for what I want to do.
I'm trying to render a certain type of wireframe material on a simple spherical geometry. I'm after this particular look:
My current efforts:
Note: Moved to Plunker below
http://plnkr.co/edit/FrCUIwxH1IL3wFKwHSRJ?p=preview
Currently I'm using EdgesHelper to get a neat grid, but I'm not sure how to remove the vertical lines.
Ideally I need to control the distance between the horizontal lines and their opacity as well, but have been unable to do this with the helper. My other idea is to draw separate line geometries for each "line", but I think this is a bit of overkill. Any ideas are appreciated.
Would a simple shader like this be good enough for what you need?
vertex:
varying vec2 vUv;
void main() {
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4(position,1.0);
}
fragment:
uniform vec3 color1;
uniform float alpha1;
uniform vec3 color2;
uniform float alpha2;
uniform float lines;
uniform float linewidth;
varying vec2 vUv;
void main() {
float p = abs(fract(lines*vUv.y)*2.0-1.0);
if(p < linewidth / 100.0){
gl_FragColor = vec4(color1, alpha1);
}else{
gl_FragColor = vec4(color2, alpha2);
}
}
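If that look is right, one way to wire it up would be a ShaderMaterial on the sphere. This is only a sketch: the uniform values are illustrative, and vertexShaderSource/fragmentShaderSource/scene are assumed to be your shader strings and scene:

// feed the shaders above into a ShaderMaterial with the uniforms they expect
const material = new THREE.ShaderMaterial({
  uniforms: {
    color1: { value: new THREE.Color(0xffffff) }, // line color
    alpha1: { value: 1.0 },
    color2: { value: new THREE.Color(0x000000) }, // fill color
    alpha2: { value: 0.0 },                       // transparent fill
    lines: { value: 12.0 },                       // number of horizontal lines
    linewidth: { value: 10.0 },
  },
  vertexShader: vertexShaderSource,     // the vertex shader above
  fragmentShader: fragmentShaderSource, // the fragment shader above
  transparent: true,
});
const sphere = new THREE.Mesh(new THREE.SphereGeometry(1, 32, 32), material);
scene.add(sphere);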
I'm trying to display multiple textured objects using HTML5 and WebGL. The problem is that the textures are being shaded very dark. I believe it has something to do with the way my shaders are being generated or used. I have been using the default shaders from https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/Tutorial/Lighting_in_WebGL. It works fine when I use one object, such as in their example, but if I use two, both objects are drawn very dark. My fragment shader, with 4 textures shared between 5 objects, looks like this:
varying highp vec2 vTextureCoord;
varying highp vec3 vLighting;
uniform sampler2D u_image0;
uniform sampler2D u_image1;
uniform sampler2D u_image2;
uniform sampler2D u_image3;
void main(void){
highp vec4 texelColor0 = texture2D(u_image0, vec2(vTextureCoord.s, vTextureCoord.t));
highp vec4 texelColor1 = texture2D(u_image1, vec2(vTextureCoord.s, vTextureCoord.t));
highp vec4 texelColor2 = texture2D(u_image2, vec2(vTextureCoord.s, vTextureCoord.t));
highp vec4 texelColor3 = texture2D(u_image3, vec2(vTextureCoord.s, vTextureCoord.t));
gl_FragColor =
vec4(texelColor0.rgb * vLighting, texelColor0.a) *
vec4(texelColor1.rgb * vLighting, texelColor1.a) *
vec4(texelColor2.rgb * vLighting, texelColor2.a) *
vec4(texelColor3.rgb * vLighting, texelColor3.a);
}
The vertex shader:
attribute highp vec3 aVertexNormal;
attribute highp vec3 aVertexPosition;
attribute highp vec2 aTextureCoord;
uniform highp mat4 uNormalMatrix;
uniform highp mat4 uMVMatrix;
uniform highp mat4 uPMatrix;
varying highp vec2 vTextureCoord;
varying highp vec3 vLighting;
void main(void) {
gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
vTextureCoord = aTextureCoord;
// Apply lighting effect
highp vec3 ambientLight = vec3(0.5, 0.5, 0.5);
highp vec3 directionalLightColor = vec3(1.0, 1.0, 1.0);
highp vec3 directionalVector = vec3(1, 2.0, 2.0);
highp vec4 transformedNormal = uNormalMatrix * vec4(aVertexNormal, 1.0);
highp float directional = max(dot(transformedNormal.xyz, directionalVector), 0.0);
vLighting = ambientLight + (directionalLightColor * directional);
}
I also call this at the start of each draw cycle:
gl.clearColor(255.0, 255.0, 255.0, 1.0);
gl.clearDepth(1.0); // Clear everything
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
My canvas draws something like this: it is very light on some parts of the objects and very dark on others. What can I do to create an evenly distributed but "normal"-looking object with no "glare" but more "clear"-looking textures?
Here is a link to what my scene looks like:
http://i.imgur.com/S9fwrEm.png
It seems that the problem was with the gl_FragColor calculation. I thought that when using multiple textures you were supposed to multiply them all together. However, it makes sense that multiplying the current texture by the others not in use would darken the currently drawn texture. If you only use something like:
gl_FragColor =
vec4(texelColor0.rgb * vLighting, texelColor0.a);
Then it is drawn fine. However, this doesn't seem proper, since I am using one fragColor of one texture for each texture drawn. If anyone has insight as to how to change fragColors based on the current texture being used, please leave another answer. Thanks!
The color is off because you are blending in 4 different "colors" from 4 different textures in your fragment shader. Of course the result will be wrong. The way you're doing it is not how you draw multiple models. If you are serious about this, you should go find some tutorials on WebGL.
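For what it's worth, the usual approach is one sampler uniform in the shader and a separate draw call per object, binding that object's texture before each call. A rough sketch with made-up names (objects, obj.texture, bindObjectBuffers, obj.indexCount are placeholders, not from the question):

// each object keeps its own buffers and texture; the shader has a single sampler
const samplerLoc = gl.getUniformLocation(program, "u_image0");
gl.useProgram(program);
for (const obj of objects) {             // hypothetical list of objects to draw
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, obj.texture);
  gl.uniform1i(samplerLoc, 0);           // the sampler reads from texture unit 0
  bindObjectBuffers(obj);                // hypothetical: set up this object's attributes
  gl.drawElements(gl.TRIANGLES, obj.indexCount, gl.UNSIGNED_SHORT, 0);
}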
In WebGL I created a scene with 4 objects, and I added a different texture to each of them. Each texture is passed to the shader via uniform variables. I am trying to make half of one of my objects red, but every time I try to change half of one of the objects to red, I somehow get half of every texture red. The following is what I have in my fragment shader at the moment.
varying vec2 vTextureCoord;
uniform sampler2D Texture2;
vec4 a = texture2D(Texture2, vTextureCoord);
if(a.t>0.5)
{
gl_FragColor = vec4(1.0,0.0,0.0,1.0)
}
else
{
gl_FragColor = vec4(t.x,0.0,0.0,1.0)
}
So to sum up my question, how can I get half of only one of my textures to red?
Thank you
What is t.x? You're looking up a color from a texture, and then, based on the green value of the texel you looked up (a.t), if that texel's green value is > 0.5 you're deciding to return either red or vec4(t.x, 0, 0, 1), which is another shade of red.
I'm guessing from your description you want to choose red based on the texture coordinates, not the texture's texel colors. In that case you want
if (vTextureCoord.t > 0.5) // If the texture coordinate's t is > 0.5
Instead of
if (a.t > 0.5) // If the texel's green value is > 0.5
And if you want to choose the texture color or red then you'd do this
varying vec2 vTextureCoord;
uniform sampler2D Texture2;
void main() {
vec4 a = texture2D(Texture2, vTextureCoord);
if (vTextureCoord.t > 0.5)
{
gl_FragColor = vec4(1.0,0.0,0.0,1.0);
}
else
{
gl_FragColor = a;
}
}
You could also do this
varying vec2 vTextureCoord;
uniform sampler2D Texture2;
void main() {
vec4 a = texture2D(Texture2, vTextureCoord);
gl_FragColor = (vTextureCoord.t > 0.5) ? vec4(1,0,0,1) : a;
}
or this
varying vec2 vTextureCoord;
uniform sampler2D Texture2;
void main() {
gl_FragColor = (vTextureCoord.t > 0.5) ? vec4(1,0,0,1) : texture2D(Texture2, vTextureCoord);
}