ThreeJS: Adding shadows to custom shaders r75 - javascript

In the past it was possible to incorporate the shadow calculations into custom shaders, as described here and summed up here.
With r75 the lighting and shadow systems appear to have been merged, which changes this. I attempted to work my way through the source to understand the new structure, but the abstractions/modules are a little tricky to follow.
I've distilled my shaders down to what I have so far:
Vertex:
#chunk(shadowmap_pars_vertex);
void main() {
    vec4 worldPosition = modelMatrix * vec4(position, 1.0);
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * mvPosition;
    #chunk(shadowmap_vertex);
}
Fragment:
#chunk(common);
#chunk(lights_pars);
#chunk(shadowmap_pars_fragment);
void main() {
    //#if ( NUM_DIR_LIGHTS > 0 )
    IncidentLight directLight;
    DirectionalLight directionalLight;
    float shadowValue = 1.0;
    for ( int i = 0; i < NUM_DIR_LIGHTS; i ++ ) {
        directionalLight = directionalLights[ i ];
        shadowValue = getShadow( directionalShadowMap[ i ], directionalLight.shadowMapSize, directionalLight.shadowBias, directionalLight.shadowRadius, vDirectionalShadowCoord[ i ] );
    }
    //#endif
    gl_FragColor = vec4(vec3(shadowValue), 1.0);
}
I pulled the directional-light loop from the lights_template chunk. Unfortunately, shadowValue always comes back as 1.0, although the shader compiles and otherwise renders correctly.
My JS has the appropriate castShadow and receiveShadow set. Other meshes using Phong render shadows correctly.
Thanks so much in advance.
Edit:
Adding material.lights = true; to the ShaderMaterial makes something appear; however, the value of shadowValue in the fragment shader is clearly incorrect on the side of the sphere facing away from the light. Screenshots attached.
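For reference, the ShaderMaterial is built roughly like this (the shader source variable names are just placeholders for the two shaders above, and merging THREE.UniformsLib['lights'] in is my understanding of what gives the renderer light/shadow uniforms to fill):
var material = new THREE.ShaderMaterial({
    uniforms: THREE.UniformsUtils.merge([
        THREE.UniformsLib['lights']
        // ...plus any custom uniforms the shaders need
    ]),
    vertexShader: customVertexShader,     // the vertex shader above (placeholder name)
    fragmentShader: customFragmentShader, // the fragment shader above (placeholder name)
    lights: true                          // needed so the light/shadow uniforms get populated
});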

Related

generating a texture to pull values from during fragment shading yields blank screen for correct width and height

I would like to create a texture in code consisting of an array of RGBA color values and use those values to determine the colors of tiles that I'm generating in a fragment shader. I got the idea, and much of the code to do this, from the top solution to this SO question: Index expression must be constant - WebGL/GLSL error
However, if I create the texture using the height and width that correspond to my color array, I don't see anything render to the canvas. If I hardcode different values, I sometimes get an image, but that image doesn't place the tile colors in the desired positions, of course, and they move around as I change my viewPos variables.
From trial-and-error testing with a handful of handpicked values, it seems that I MIGHT only be getting an image when gl.texImage2D() receives a width and a height equal to a power of 2, though I don't see anything about this in the documentation. 32 was the largest width I could produce an image with, and 16 was the largest height; 1, 2, 4, and 8 also work. (The texture size should be 27 by 20 for the window size I'm testing with.)
Note that the fragment shader still receives the uTileColorSampSize vector that relates to the size of the color array. I only need the gl.texImage2D() width and height values to be hardcoded to produce an image. In fact, every value I've tried for the uniform has produced an image, though each with a different tile color pattern.
I've included a slightly simplified version of my Gfx class (the original is kind of messy and includes a lot of stuff not relevant to this issue) below. I'd imagine the problem is above line 186 or so, but I've included a few additional functions below that in case they happen to be relevant.
class Gfx {
constructor() {
this.canvas = document.getElementById("canvas");
this.gl = canvas.getContext("webgl");
//viewPos changes as you drag your cursor across the canvas
this.x_viewPos = 0;
this.y_viewPos = 0;
}
init() {
this.resizeCanvas(window.innerWidth, window.innerHeight);
const vsSource = `
attribute vec4 aVertPos;
uniform mat4 uMVMat;
uniform mat4 uProjMat;
void main() {
gl_Position = uProjMat * uMVMat * aVertPos;
}
`;
//my tiles get drawn in the frag shader below
const fsSource = `
precision mediump float;
uniform vec2 uViewPos;
uniform vec2 uTileColorSampSize;
uniform sampler2D uTileColorSamp;
void main() {
//tile width and height are both 33px including a 1px border
const float lineThickness = (1.0/33.0);
//gridMult components will either be 0.0 or 1.0. This is used to place the grid lines
vec2 gridMult = vec2(
ceil(max(0.0, fract((gl_FragCoord.x-uViewPos.x)/33.0) - lineThickness)),
ceil(max(0.0, fract((gl_FragCoord.y-uViewPos.y)/33.0) - lineThickness))
);
//tileIndex is used to pull color data from the sampler texture
//add 0.5 due to pixel coords being off in gl
vec2 tileIndex = vec2(
floor((gl_FragCoord.x-uViewPos.x)/33.0) + 0.5,
floor((gl_FragCoord.y-uViewPos.y)/33.0) + 0.5
);
//divide by samp size as tex coords are 0.0 to 1.0
vec4 tileColor = texture2D(uTileColorSamp, vec2(
tileIndex.x/uTileColorSampSize.x,
tileIndex.y/uTileColorSampSize.y
));
gl_FragColor = vec4(
tileColor.x * gridMult.x * gridMult.y,
tileColor.y * gridMult.x * gridMult.y,
tileColor.z * gridMult.x * gridMult.y,
1.0 //the 4th rgba in our sampler is always 1.0 anyway
);
}
`;
const shader = this.buildShader(vsSource, fsSource);
this.programInfo = {
program: shader,
attribLocs: {
vertexPosition: this.gl.getAttribLocation(shader, 'aVertPos')
},
uniformLocs: {
projMat: this.gl.getUniformLocation(shader, 'uProjMat'),
MVMat: this.gl.getUniformLocation(shader, 'uMVMat'),
viewPos: this.gl.getUniformLocation(shader, 'uViewPos'),
tileColorSamp: this.gl.getUniformLocation(shader, 'uTileColorSamp'),
tileColorSampSize: this.gl.getUniformLocation(shader, 'uTileColorSampSize')
}
};
const buffers = this.initBuffers();
//check and enable OES_texture_float to allow us to create our sampler tex
if (!this.gl.getExtension("OES_texture_float")) {
alert("Sorry, your browser/GPU/driver doesn't support floating point textures");
}
this.gl.clearColor(0.0, 0.0, 0.15, 1.0);
this.gl.clearDepth(1.0);
this.gl.enable(this.gl.DEPTH_TEST);
this.gl.depthFunc(this.gl.LEQUAL);
const FOV = 45 * Math.PI / 180; // in radians
const aspect = this.gl.canvas.width / this.gl.canvas.height;
this.projMat = glMatrix.mat4.create();
glMatrix.mat4.perspective(this.projMat, FOV, aspect, 0.0, 100.0);
this.MVMat = glMatrix.mat4.create();
glMatrix.mat4.translate(this.MVMat, this.MVMat, [-0.0, -0.0, -1.0]);
this.gl.bindBuffer(this.gl.ARRAY_BUFFER, buffers.position);
this.gl.vertexAttribPointer(this.programInfo.attribLocs.vertexPosition, 2, this.gl.FLOAT, false, 0, 0);
this.gl.enableVertexAttribArray(this.programInfo.attribLocs.vertexPosition);
this.glDraw();
}
//glDraw() gets called once above, as well as in every frame of my render loop
//(not included here as I have it in a separate Timing class)
glDraw() {
this.gl.clear(this.gl.COLOR_BUFFER_BIT | this.gl.DEPTH_BUFFER_BIT);
this.gl.useProgram(this.programInfo.program);
//X and Y TILE_COUNTs verified to correspond to colorArray size in testing
//(colorArray.length = rgbaLength * X_TILE_COUNT * Y_TILE_COUNT)
//(colorArray.length = rgbaLength * widthInTiles * heightInTiles)
//(colorArray.length = 4 * 27 * 20)
let x_tileColorSampSize = X_TILE_COUNT;
let y_tileColorSampSize = Y_TILE_COUNT;
//getTileColorArray() produces a flat array of floats between 0.0 and 1.0
//equal in length to rgbaLength * X_TILE_COUNT * Y_TILE_COUNT
//every 4th value is 1.0, representing tile alpha
let colorArray = this.getTileColorArray();
let colorTex = this.colorMapTexFromArray(
x_tileColorSampSize,
y_tileColorSampSize,
colorArray
);
//SO solution said to use anything between 0 and 15 for texUnit, they used 3
//I imagine this is just an arbitrary location in memory to hold a texture
let texUnit = 3;
this.gl.activeTexture(this.gl.TEXTURE0 + texUnit);
this.gl.bindTexture(this.gl.TEXTURE_2D, colorTex);
this.gl.uniform1i(
this.programInfo.uniformLocs.tileColorSamp,
texUnit
);
this.gl.uniform2fv(
this.programInfo.uniformLocs.tileColorSampSize,
[x_tileColorSampSize, y_tileColorSampSize]
);
this.gl.uniform2fv(
this.programInfo.uniformLocs.viewPos,
[-this.x_viewPos, this.y_viewPos] //these change as you drag your cursor across the canvas
);
this.gl.uniformMatrix4fv(
this.programInfo.uniformLocs.projMat,
false,
this.projMat
);
this.gl.uniformMatrix4fv(
this.programInfo.uniformLocs.MVMat,
false,
this.MVMat
);
this.gl.drawArrays(this.gl.TRIANGLE_STRIP, 0, 4);
}
colorMapTexFromArray(width, height, colorArray) {
let float32Arr = Float32Array.from(colorArray);
let oldActive = this.gl.getParameter(this.gl.ACTIVE_TEXTURE);
//SO solution said "working register 31, thanks", next to next line
//not sure what that means but I think they're just looking for any
//arbitrary place to store the texture?
this.gl.activeTexture(this.gl.TEXTURE15);
var texture = this.gl.createTexture();
this.gl.bindTexture(this.gl.TEXTURE_2D, texture);
this.gl.texImage2D(
this.gl.TEXTURE_2D, 0, this.gl.RGBA,
//if I replace width and height with certain magic numbers
//like 4 or 8 (all the way up to 32 for width and 16 for height)
//I will see colored tiles, though obviously they don't map correctly.
//I THINK I've only seen it work with widths and heights that are
//a power of 2... could the issue be that I need my texture to have
//width and height equal to a power of 2?
width, height, 0,
this.gl.RGBA, this.gl.FLOAT, float32Arr
);
//use gl.NEAREST to prevent gl from blurring texture
this.gl.texParameteri(this.gl.TEXTURE_2D, this.gl.TEXTURE_MAG_FILTER, this.gl.NEAREST);
this.gl.texParameteri(this.gl.TEXTURE_2D, this.gl.TEXTURE_MIN_FILTER, this.gl.NEAREST);
this.gl.bindTexture(this.gl.TEXTURE_2D, null);
this.gl.activeTexture(oldActive);
return texture;
}
//I don't think the issue would be in the functions below, but I included them anyway
resizeCanvas(baseWidth, baseHeight) {
let widthMod = 0;
let heightMod = 0;
//...some math is done here to account for some DOM elements that consume window space...
this.canvas.width = baseWidth + widthMod;
this.canvas.height = baseHeight + heightMod;
this.gl.viewport(0, 0, this.gl.canvas.width, this.gl.canvas.height);
}
initBuffers() {
const posBuff = this.gl.createBuffer();
this.gl.bindBuffer(this.gl.ARRAY_BUFFER, posBuff);
const positions = [
-1.0, 1.0,
1.0, 1.0,
-1.0, -1.0,
1.0, -1.0,
];
this.gl.bufferData(
this.gl.ARRAY_BUFFER,
new Float32Array(positions),
this.gl.STATIC_DRAW
);
return {
position: posBuff
};
}
buildShader(vsSource, fsSource) {
const vertShader = this.loadShader(this.gl.VERTEX_SHADER, vsSource);
const fragShader = this.loadShader(this.gl.FRAGMENT_SHADER, fsSource);
const shaderProg = this.gl.createProgram();
this.gl.attachShader(shaderProg, vertShader);
this.gl.attachShader(shaderProg, fragShader);
this.gl.linkProgram(shaderProg);
if (!this.gl.getProgramParameter(shaderProg, this.gl.LINK_STATUS)) {
console.error('Unable to initialize the shader program: ' + gl.getProgramInfoLog(shaderProg));
return null;
}
return shaderProg;
}
loadShader(type, source) {
const shader = this.gl.createShader(type);
this.gl.shaderSource(shader, source);
this.gl.compileShader(shader);
if (!this.gl.getShaderParameter(shader, this.gl.COMPILE_STATUS)) {
console.error('An error occurred compiling the shaders: ' + this.gl.getShaderInfoLog(shader));
this.gl.deleteShader(shader);
return null;
}
return shader;
}
//getTileColorArray as it appears in my code, in case you want to take a peek.
//every tileGrid[i][j] has a color, which is an array of 4 values between 0.0 and 1.0
//the fourth (last) value in tileGrid[i][j].color is always 1.0
getTileColorArray() {
let i_min = Math.max(0, Math.floor(this.x_pxPosToTilePos(this.x_viewPos)));
let i_max = Math.min(GLOBAL.map.worldWidth-1, i_min + Math.ceil(this.x_pxPosToTilePos(this.canvas.width)) + 1);
let j_min = Math.max(0, Math.floor(this.y_pxPosToTilePos(this.y_viewPos)));
let j_max = Math.min(GLOBAL.map.worldHeight-1, j_min + Math.ceil(this.y_pxPosToTilePos(this.canvas.height)) + 1);
let colorArray = [];
for (let i=i_min; i <= i_max; i++) {
for (let j=j_min; j <= j_max; j++) {
colorArray = colorArray.concat(GLOBAL.map.tileGrid[i][j].color);
}
}
return colorArray;
}
}
I've also included a pastebin of my full unaltered Gfx class in case you would like to look at that as well: https://pastebin.com/f0erR9qG
And a pastebin of my simplified code for the line numbers: https://pastebin.com/iB1pUZJa
WebGL 1.0 does not support REPEAT (or MIRRORED_REPEAT) texture wrapping, or mipmapping, on textures with non-power-of-two dimensions, and REPEAT is the default wrap mode. There are two ways to solve this: one is to pad the texture data out to power-of-two dimensions, and the other is to simply switch the wrap mode to CLAMP_TO_EDGE, like so:
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
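If you want to go the padding route instead, it could look roughly like this; padToPowerOfTwo is just a sketch of a helper you would write yourself, and it expects a Float32Array like the one you already build with Float32Array.from(colorArray):
// Copy an RGBA float array of size (width x height) into a larger,
// power-of-two-sized array so WebGL 1.0 treats it as a POT texture.
function padToPowerOfTwo(width, height, rgbaFloats) {
  const potWidth = Math.pow(2, Math.ceil(Math.log2(width)));
  const potHeight = Math.pow(2, Math.ceil(Math.log2(height)));
  const padded = new Float32Array(potWidth * potHeight * 4); // zero-filled by default
  for (let y = 0; y < height; y++) {
    // copy one source row into the (wider) destination row
    padded.set(
      rgbaFloats.subarray(y * width * 4, (y + 1) * width * 4),
      y * potWidth * 4
    );
  }
  return { width: potWidth, height: potHeight, data: padded };
}
If you do that, the shader also has to divide tileIndex by the padded size instead of uTileColorSampSize, which is why simply switching the wrap mode to CLAMP_TO_EDGE is usually the simpler fix here.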
I'm still getting strange behavior in my frag shader, but it's at least showing tiles now. I think the additional strange behavior is just a result of my shader algorithm not matching what I had envisioned.

Texture which maintain proportion as scene background in three.js

I have a scene where I want to add a background. I have a PNG image (2K resolution); when I try it on PC it is the right size, but on mobile it comes out badly stretched out of proportion.
My code is the following:
var texture = THREE.ImageUtils.loadTexture('img/texture.png');
And adding it as the background is just this:
scene = new THREE.Scene();
scene.background = texture;
I've seen from some searching that maybe I have to create a separate scene for the background, but I don't think that is the easiest solution. Maybe there is a better way to do this?
(As always, sorry for my bad English.)
You can try approaching this with THREE.ShaderMaterial:
class MyBackgroundPlane extends THREE.Mesh{
constructor(){
super(
new THREE.PlaneBufferGeometry(2,2,1,1),
new THREE.ShaderMaterial({
uniforms:{
uTexture: { value: null },
uAspect: { value: 1 }
},
vertexShader: `
varying vec2 vUv;
uniform float uAspect;
void main(){
vUv = uv; //pass coordinates to screen
vUv.x *= uAspect; //scale the coordinates
gl_Position = vec4(position.xy, 1., 1.);
}
`,
fragmentShader:`
varying vec2 vUv;
uniform sampler2D uTexture;
void main(){
gl_FragColor = texture2D( uTexture, vUv );
}
`
})
)
this.frustumCulled = false
}
setAspect( aspect ){
this.material.uniforms.uAspect.value = aspect
}
setTexture( texture ){
this.material.uniforms.uTexture.value = texture
}
}
You kind of have to figure out what needs to happen when it's portrait and when it's landscape.
One approach could be to use uniform vec2 uScale; and then set the vertical and horizontal aspects differently depending on the orientation.
The same thing could be done with the scene graph, for example by attaching a regular plane to the camera and then managing its scale.
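For example, a rough sketch of wiring up the class above (whether you divide or multiply the two aspect ratios depends on whether you want cover or contain behaviour, and texture comes from the question's loader):
const bgPlane = new MyBackgroundPlane();
bgPlane.setTexture(texture);
scene.add(bgPlane);

function updateBackgroundAspect() {
    const screenAspect = window.innerWidth / window.innerHeight;
    // texture.image is only available once the image has actually loaded
    const imageAspect = texture.image ? texture.image.width / texture.image.height : 1;
    bgPlane.setAspect(screenAspect / imageAspect);
}

window.addEventListener('resize', updateBackgroundAspect);
updateBackgroundAspect();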
As an alternative, you can use a CSS based background:
#background {
background-image: url('http://youring.com/test/img/texture.png');
position: fixed;
top: 0;
left: 0;
height: 100%;
width: 100%;
z-index: -1;
}
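This CSS assumes a matching element exists in the page, something like:
<div id="background"></div>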
Just create your renderer like this so it's possible to see through the canvas:
renderer = new THREE.WebGLRenderer( { antialias: true, alpha: true } );
DEMO: https://jsfiddle.net/f2Lommf5/5052/

Change values of Phong Shader with sliders

I am trying to implement a 3D scene with WebGL and JavaScript. The final scene is supposed to show a cuboid with smaller cuboids, pyramids and spheres on all sides. The smaller spheres have to rotate with the big cuboid. I implemented Phong shading, and that works fine. Now I want to change the values of shininess, lightPos, and lightIntensity with three sliders to the right of the canvas that displays the scene. The slider for shininess is apparently not working, and I'm struggling even more with the other two sliders, since lightPos and lightIntensity are vec3 constants. The code for the three variables looks like this:
const vec3 lightPos = vec3(1.0,-1.0,1.0);
float shininess = 16.0;
const vec3 lightIntensity = vec3(1.0, 1.0, 1.0);
At the moment the slider for shininess looks like this:
<input id="shininess" type="range" min="1" max="50"></input>
var shininessElement = document.getElementById("shininess");
shininessElement.onchange = function(){
shininess = shininessElement.value;
window.requestAnimationFrame(animate);
I'm pretty sure that I did something terribly wrong, but my research didn't lead to any results and I have no idea what to do next, so I'd really appreciate your help.
If you need the complete code, please let me know.
You probably should read some other tutorials on WebGL. In particular you can't set shininess unless you make it a uniform, then look up the uniform's location and set it with gl.uniform???.
Here's simple example of using a slider to set a value and then sending that value to a shader by setting a uniform variable in the shader.
const gl = document.querySelector("canvas").getContext('webgl');
const vs = `
void main() {
gl_Position = vec4(0, 0, 0, 1);
gl_PointSize = 100.0;
}
`;
const fs = `
precision mediump float;
uniform float shininess;
void main() {
gl_FragColor = vec4(shininess, 0, 0, 1);
}
`;
// compiles shaders, links program
const prg = twgl.createProgram(gl, [vs, fs]);
const shininessLocation = gl.getUniformLocation(prg, "shininess");
let shininess = .5;
draw();
function draw() {
gl.useProgram(prg);
gl.uniform1f(shininessLocation, shininess);
gl.drawArrays(gl.POINTS, 0, 1);
}
document.querySelector("input").addEventListener('input', (e) => {
shininess = e.target.value / 100;
draw();
});
<script src="https://twgljs.org/dist/3.x/twgl.min.js"></script>
<canvas></canvas>
<input type="range" min="0" max="100" value="50" />

Three js Shader Material modify depth buffer

In Three js, I'm using a vertex shader to animate a large geometry.
I've also set up a Depth of Field effect on the output. The problem is that the Depth of Field effect doesn't seem to know about the changed positioning created in my vertex shader. It is responding as if the geometry is in the original position.
How can I update the depth information in my shader/material so that the DOF works correctly? THREE.Material has a depthWrite property, but it doesn't seem to be that...
My depth of field pass works like this:
renderer.render( this.originalScene, this.originalCamera, this.rtTextureColor, true );
this.originalScene.overrideMaterial = this.material_depth;
renderer.render( this.originalScene, this.originalCamera, this.rtTextureDepth, true );
rtTextureColor and rtTextureDepth are both WebGLRenderTargets. For some reason rtTextureColor comes out correct, but rtTextureDepth does not.
here is my vertex shader:
int sphereIndex = int(floor(position.x/10.));
float displacementVal = displacement[sphereIndex].w;
vec3 rotationDisplacement = displacement[sphereIndex].xyz;
vNormal = normalize( normalMatrix * normal );
vec3 vNormel = normalize( normalMatrix * viewVector );
intensity = abs(pow( c - dot(vNormal, vNormel), p ));
float xVal = (displacementVal*orbitMultiplier) * sin(timeValue*rotationDisplacement.x);
float yVal = (displacementVal*orbitMultiplier) * cos(timeValue*rotationDisplacement.y);
float zVal = 0.0;
vec3 rotatePosition = vec3(xVal,yVal,zVal);
vec3 newPos = (position-vec3((10.*floor(position.x/10.)),0,0))+rotatePosition;
vec4 mvPosition;
mvPosition = (modelViewMatrix * vec4(newPos,1));
vViewPosition = -mvPosition.xyz;
vec4 p = projectionMatrix * mvPosition;
gl_Position = p;
Because you set the scene override material (this.originalScene.overrideMaterial = this.material_depth) before rendering into this.rtTextureDepth, the renderer doesn't use your custom vertex shader. The scene override material is a THREE.MeshDepthMaterial, which includes its own vertex shader.
One thing to try is writing a THREE.ShaderMaterial that works like THREE.MeshDepthMaterial but uses your custom vertex shader. Modifying built-in shaders isn't straightforward, but I would start from something like this:
var depthShader = THREE.ShaderLib['depth'];
var uniforms = THREE.UniformsUtils.clone(depthShader.uniforms);
var material = new THREE.ShaderMaterial({
uniforms: uniforms,
vertexShader: /* your custom vertex shader */
fragmentShader: depthShader.fragmentShader
});
You'll have to add the uniforms for your custom vertex shader and also set the uniforms for the built-in depth shaders; search WebGLRenderer.js in the three.js source for MeshDepthMaterial.
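Then, roughly, the depth pass would use that material as the override instead of letting overrideMaterial force the stock THREE.MeshDepthMaterial (material here is the ShaderMaterial built above, and you still have to keep your custom uniform values on it up to date each frame):
renderer.render( this.originalScene, this.originalCamera, this.rtTextureColor, true );

// depth pass with the custom depth material, so the displaced vertices
// are what end up in rtTextureDepth
this.originalScene.overrideMaterial = material;
renderer.render( this.originalScene, this.originalCamera, this.rtTextureDepth, true );
this.originalScene.overrideMaterial = null;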

Change color in middle of circle

I'm new to WebGL and I'm trying to create a black ring in the middle of this green circle without making additional circles. I believe I can do this by making the normals of those triangles point the other way, but I'm not sure exactly how to do that. My friend suggested changing the texture coordinates, but I don't really understand how that would help. Can anyone shed some light on these ideas and the intuition behind them?
_______HTML File__________
<!DOCTYPE html>
<html>
<head>
<script id="vertex-shader" type="x-shader/x-vertex">
attribute vec4 vPosition;
void
main()
{
gl_Position = vPosition;
}
</script>
<script id="fragment-shader" type="x-shader/x-fragment">
precision mediump float;
void
main()
{
gl_FragColor = vec4( 0.0, 1.0, 0.0, 1.0 );
}
</script>
<script type="text/javascript" src="../Common/webgl-utils.js"></script>
<script type="text/javascript" src="../Common/initShaders.js"></script>
<script type="text/javascript" src="../Common/MV.js"></script>
<script type="text/javascript" src="Circle.js"></script>
</head>
<body>
<canvas id="gl-canvas" width="512" height="512">
Oops ... your browser doesn't support the HTML5 canvas element
</canvas>
</body>
</html>
_____Javascript File______
var gl;
var points;
window.onload = function init()
{
var canvas = document.getElementById( "gl-canvas" );
gl = WebGLUtils.setupWebGL( canvas );
if ( !gl ) { alert( "WebGL isn't available" ); }
// The Vertices
var pi = 3.14159;
var x = 2*pi/100;
var y = 2*pi/100;
var r = 0.9;
points = [ vec2(0.0, 0.0) ]; //establish origin
//for loop to push points
for(var i = 0; i < 100; i++){
points.push(vec2(r*Math.cos(x*i), r*Math.sin(y*i)));
points.push(vec2(r*Math.cos(x*(i+1)), r*Math.sin(y*(i+1))));
}
//
// Configure WebGL
//
gl.viewport( 0, 0, canvas.width, canvas.height );
gl.clearColor( 0.3, 0.3, 0.3, 1.0 );
// Load shaders and initialize attribute buffers
var program = initShaders( gl, "vertex-shader", "fragment-shader" );
gl.useProgram( program );
// Load the data into the GPU
var bufferId = gl.createBuffer();
gl.bindBuffer( gl.ARRAY_BUFFER, bufferId );
gl.bufferData( gl.ARRAY_BUFFER, flatten(points), gl.STATIC_DRAW );
// Associate out shader variables with our data buffer
var vPosition = gl.getAttribLocation( program, "vPosition" );
gl.vertexAttribPointer( vPosition, 2, gl.FLOAT, false, 0, 0 );
gl.enableVertexAttribArray( vPosition );
render();
};
function render() {
gl.clear( gl.COLOR_BUFFER_BIT );
gl.drawArrays( gl.TRIANGLE_FAN, 0, points.length );
}
I put together part of your task as you requested. I tried not to change your code much, so you can follow all the changes I made. First, a small demo:
Triangle with your code
Circle made out of 3 points
You made the circle out of 100 points (vertices). Now you want to make another shape inside, which would mean using another 100 points, and that is probably what you don't want to do. Instead you would like to use normals. But from the point of view of the shaders (which are responsible for drawing), normals, vertices and other things like texture coordinates are just data, and you are the one who decides whether that data means vertices, normals, texture coordinates or anything else.
If I understand correctly, you want to customize your object without adding too much additional data. I don't think normals or textures can help you here.
There are a few problems you would have to face with a texture ...
First, if the circle is too big (close to you), it will not look that nice with just 100 points.
If the circle is too small (far from you) but there are a lot of circles, you will use many points for nothing, which will lower performance.
If you use a texture for the black ring inside, it will be fuzzy when you get closer.
And if you use too large a texture for a lot of small circles, it will again lower performance.
... and normals are used to do light reflection, like this.
Here is the way I think about the problem: you can define a circle with just a few parameters, a radius and a center. With WebGL you can only draw triangles (and points), but you can, for example, customize the shader to draw the inscribed circle of each triangle.
So I defined just radius and center:
var r = 0.9;
var middle = vec2(0.0, 0.0);
Then I generate the 3 points of a triangle around the circle (the circle is the inscribed circle of this new triangle):
function buildCircle(center, r) {
var points = [];
points.push(vec2((r * TRI_HEIGHT_MOD * Math.cos(0 * DEG_TO_RAD)) + center[0], (r * TRI_HEIGHT_MOD * Math.sin(0 * DEG_TO_RAD)) + center[1]));
points.push(vec2((r * TRI_HEIGHT_MOD * Math.cos(120 * DEG_TO_RAD)) + center[0], (r * TRI_HEIGHT_MOD * Math.sin(120 * DEG_TO_RAD) + center[1])));
points.push(vec2((r * TRI_HEIGHT_MOD * Math.cos(240 * DEG_TO_RAD)) + center[0], (r * TRI_HEIGHT_MOD * Math.sin(240 * DEG_TO_RAD)) + center[1]));
vertexPositions = points;
}
Then I pass middle, radius and triangle to my shader:
var vPosition = gl.getAttribLocation(program, "vPosition");
gl.vertexAttribPointer(vPosition, 2, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(vPosition);
program.middle = gl.getUniformLocation(program, "middle");
gl.uniform2f(program.middle, middle[0], middle[1]);
program.r = gl.getUniformLocation(program, "r");
gl.uniform1f(program.r, r);
And then I render it the same way you do, except that I need to enable alpha blending, because some parts of the triangle will be invisible so that it looks like a circle:
gl.blendFunc(gl.SRC_ALPHA, gl.ONE);
gl.enable(gl.BLEND);
gl.disable(gl.DEPTH_TEST);
OK, now the shaders.
There are few things you really need to know to continue, so please read about it here: http://webglfundamentals.org/webgl/lessons/webgl-how-it-works.html
My vertex shader is the same as yours, except that I need to pass the interpolated vertex position to the fragment shader:
varying vec4 pos;
...
void main() {
pos = vPosition;
My fragment shader needs to do only one thing: decide whether the pixel is inside the circle or not. The test is the usual circle equation, (pos.x - middle.x)^2 + (pos.y - middle.y)^2 < r^2. If the left side is smaller than the right side, the pixel is inside the circle; if not, it is outside and therefore invisible:
float inside = pow(pos.r - middle.r, 2.0) + pow(pos.g - middle.g, 2.0);
if (inside < pow(r, 2.0)) {
gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
} else {
gl_FragColor = vec4(0.0, 0.0, 0.0, 0.0);
}
End
So now you know how to make a circle from just a few points, and you can use a similar approach to draw a ring inside; a sketch of that is below. Then you can draw thousands of them at any distance and make them move, and the program will still be fast and the shapes will stay as sharp as possible.
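For example, the fragment-shader test above could be extended to cut a black ring into the circle, roughly like this (ringInner and ringOuter would be extra uniforms that you declare and upload yourself, just like r and middle):
// assumes: uniform float ringInner; uniform float ringOuter; (hypothetical names)
float dist2 = pow(pos.x - middle.x, 2.0) + pow(pos.y - middle.y, 2.0);
if (dist2 > pow(r, 2.0)) {
    gl_FragColor = vec4(0.0, 0.0, 0.0, 0.0);      // outside the circle: invisible
} else if (dist2 > pow(ringInner, 2.0) && dist2 < pow(ringOuter, 2.0)) {
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);      // inside the ring band: black
} else {
    gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);      // rest of the circle: green
}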
Just one last thing: usually you don't simplify shapes like this, but sometimes you might. A good example is the Bézier curve, which can help you make crazy sharp shapes with just a few points. But it all depends on what you want to do; one technique can't solve every problem, and you have to keep looking for more solutions.
EDIT 1: "What is var middle = vec2(0.0, 0.0)? I meam, vec2?"
There are 3 other scripts in this question that I replicated in my solution (in jsfiddle on the left: External Resources). It wasnt part of this question, but it was easy to find theirs origin:
<script type="text/javascript" src="../Common/webgl-utils.js"></script>
<script type="text/javascript" src="../Common/initShaders.js"></script>
<script type="text/javascript" src="../Common/MV.js"></script>
MV.js is a supporting JavaScript file with basic math, i.e. algebraic constructs like vectors and matrices. vec2 is a function that returns an array of length 2, so var middle = [0.0, 0.0]; is exactly the same thing. This is not part of native JavaScript, so you need some library for it (you don't strictly need one, but it is very useful). I use glMatrix.
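Roughly speaking, vec2 boils down to something like this (a simplified sketch, not the actual MV.js source):
function vec2(x, y) {
    // build a plain length-2 array, defaulting missing components to 0.0
    return [x !== undefined ? x : 0.0, y !== undefined ? y : 0.0];
}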
On the other hand, in shaders, vectors and matrices are native types. You can find out more on your own in chapter 4.1, Basic Types.
