Hello, I am a bit new to 3D programming. I am trying to improve the efficiency of a particle system that I am simulating with LiquidFun.
Currently I am drawing the particle system this way:
for (var j = 0; j < maxParticleSystems; j++) {
    var currentParticleSystem = world.particleSystems[j];
    var particles = currentParticleSystem.GetPositionBuffer();
    var maxParticles = particles.length;
    for (var k = 0; k < maxParticles; k += 2) {
        context.drawImage(particleImage, (particles[k] * mToPx) + offsetX, (particles[k + 1] * mToPx) + offsetY);
        context.fill();
    }
}
This basically draws each particle one at a time, which is very slow. I have been doing some reading and I read about position buffer objects in WebGL. How would I use one to draw these?
This is arguably too broad a question for Stack Overflow. WebGL is just a rasterization API which means there's an infinite number of ways to render and/or compute particles with it.
Some common ways
Compute particle positions in JavaScript, render with POINTS in WebGL.
Compute particle positions in JavaScript, render with quads in WebGL (rendering quads lets you orient the particles).
Compute particle positions based on time alone in a shader, render POINTS.
Compute particle positions based on time alone in a shader, render quads.
Compute particle positions in shaders with state, by reading and writing state to a texture through a framebuffer.
And hundreds of other variations.
Particle system using webgl
Efficient particle system in javascript? (WebGL)
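As a concrete starting point, here is a minimal sketch of the first approach (positions computed in JavaScript, rendered as POINTS). It assumes a canvas element with id "c" and particle positions already converted to clip space (-1 to 1); shader setup is reduced to the bare minimum, with no error checking.
// Minimal sketch: one buffer holds all particle positions, and a single
// drawArrays(gl.POINTS, ...) call replaces the per-particle drawImage loop.
var gl = document.getElementById('c').getContext('webgl');

var vs = 'attribute vec2 position;' +
         'void main() { gl_Position = vec4(position, 0.0, 1.0); gl_PointSize = 4.0; }';
var fs = 'precision mediump float;' +
         'void main() { gl_FragColor = vec4(1.0, 0.8, 0.2, 1.0); }';

function compile(type, src) {
    var s = gl.createShader(type);
    gl.shaderSource(s, src);
    gl.compileShader(s);
    return s;
}
var prog = gl.createProgram();
gl.attachShader(prog, compile(gl.VERTEX_SHADER, vs));
gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fs));
gl.linkProgram(prog);
gl.useProgram(prog);

var buf = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buf);
var loc = gl.getAttribLocation(prog, 'position');
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

// Call once per frame with the flat [x0, y0, x1, y1, ...] position buffer
// (e.g. from GetPositionBuffer()), already scaled to clip space.
function drawParticles(positions) {
    gl.clear(gl.COLOR_BUFFER_BIT);
    gl.bindBuffer(gl.ARRAY_BUFFER, buf);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(positions), gl.DYNAMIC_DRAW);
    gl.drawArrays(gl.POINTS, 0, positions.length / 2);
}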
I am new to THREE.js. I have some questions. Can someone help me with simply squishing this ball when it contacts the border and changes direction? Or maybe just scaling the ball based on the angle of the contact point on the border, to make the ball a little more realistic?
Some code: Making rotating 3d sphere with velocity
Code from this thread:
THREE.js - moving a 3D ball with a rotation
// Canvas setup assumed (not shown in the original snippet)
var canv = document.createElement('canvas');
canv.width = canv.height = 256;
var ctx = canv.getContext('2d');
// Draw a 16x16 checkerboard onto the canvas and use it as the ball texture
for (var y = 0; y < 16; y++)
    for (var x = 0; x < 16; x++)
        if ((x & 1) != (y & 1)) ctx.fillRect(x * 16, y * 16, 16, 16);
var ballTex = new THREE.Texture(canv);
ballTex.needsUpdate = true;
Sorry for my bad English!
If the ball is rolling when it hits an edge, it'd be really difficult to create a believable squish by using the .scale attribute, since that only affects its scaling along the x, y, or z axis.
The best way to realistically achieve this effect is to use a physics engine to detect collisions and morph the geometry accordingly. There's already an example of this on the Three.js website. You can look at its source code to follow along. Just keep in mind it uses Ammo.js as its physics engine, so you'd need to learn how to use the Ammo API if you want to make modifications.
I'm using Three.js to create points on a sphere, similar to the periodic table of elements example.
My data set is circles of irregular size, and I wish to evenly distribute them around the surface of a sphere. After numerous hours searching the web, I realize that is much harder than it sounds.
Here are examples of this idea in action:
Vimeo
Picture
circlePack Java applet
Is there an algorithm that will allow me to do this? The packing ratio doesn't need to be super high, and it should ideally be something quick and easy to calculate in JavaScript for rendering in Three.js (in a Cartesian or spherical coordinate system). Efficiency is key here.
The circle radii can vary widely. Here's an example using the periodic table code:
Here's a method to try: an iterative search using a simulated repulsive force.
Algorithm
First, initialize the data set by arranging the circles across the surface using any algorithm. This is just for initialization, so it doesn't have to be great; the periodic table code will do nicely. Also, assign each circle a "mass", using its radius as its mass value.
Now begin the iteration to converge on a solution. For each pass through the main loop, do the following:
Compute repulsive forces for each circle. Model your repulsive force after the formula for gravitational force, with two adjustments: (a) objects should be pushed away from each other, not attracted toward each other, and (b) you'll need to tweak the "force constant" value to fit the scale of your model. Depending on your math ability, you may be able to calculate a good constant value during planning; otherwise, just experiment a little at first and you'll find a good value.
After computing the total forces on each circle (please look up the n-body problem if you're not sure how to do this), move each circle along the vector of its total calculated force, using the length of the vector as the distance to move. This is where you may find that you have to tweak the force constant value. At first you'll want movements with lengths that are less than 5% of the radius of the sphere.
The movements in step 2 will have pushed the circles off the surface of the sphere (because they are repelling each other). Now move each circle back to the surface of the sphere, in the direction toward the center of the sphere.
For each circle, calculate the distance from the circle's old position to its new position. The largest distance moved is the movement length for this iteration in the main loop.
Continue iterating through the main loop for a while. Over time the movement length should become smaller and smaller as the relative positions of the circles stabilize into an arrangement that meets your criteria. Exit the loop when the movement length drops below some very small value. (A sketch of one pass of this loop is shown below.)
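Here is a minimal sketch of one pass of the main loop, assuming each circle is an object with a pos (a THREE.Vector3 on a sphere of radius R) and a mass; the force constant k and the exit threshold are values you would tune:
// One relaxation pass: returns the largest distance any circle moved,
// so the caller can stop iterating once that drops below a small value.
function relaxStep(circles, R, k) {
    var forces = circles.map(function () { return new THREE.Vector3(); });

    // Step 1: repulsive force between every pair (inverse-square, pushing apart)
    for (var i = 0; i < circles.length; i++) {
        for (var j = i + 1; j < circles.length; j++) {
            var d = new THREE.Vector3().subVectors(circles[i].pos, circles[j].pos);
            var r2 = Math.max(d.lengthSq(), 1e-6);
            var f = (k * circles[i].mass * circles[j].mass) / r2;
            d.normalize().multiplyScalar(f);
            forces[i].add(d); // push i away from j
            forces[j].sub(d); // and j away from i
        }
    }

    // Steps 2 and 3: move each circle along its total force,
    // then project it back onto the sphere surface.
    // Step 4: track the largest movement for the convergence test.
    var maxMove = 0;
    for (var i = 0; i < circles.length; i++) {
        var oldPos = circles[i].pos.clone();
        circles[i].pos.add(forces[i]).normalize().multiplyScalar(R);
        maxMove = Math.max(maxMove, circles[i].pos.distanceTo(oldPos));
    }
    return maxMove;
}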
Tweaking
You may find that you have to tweak the force calculation to get the algorithm to converge on a solution. How you tweak depends on the type of result you're looking for. Start by tweaking the force constant. If that doesn't work, you may have to change the mass values up or down. Or maybe change the exponent of the radius in the force calculation. For example, instead of this:
f = ( k * m[i] * m[j] ) / ( r * r );
You might try this:
f = ( k * m[i] * m[j] ) / pow( r, p );
Then you can experiment with different values of p.
You can also experiment with different algorithms for the initial distribution.
The amount of trial-and-error will depend on your design goals.
Here is something you can build on, perhaps. It will randomly distribute your spheres over the surface of a sphere; later we will iterate on this starting point to get an even distribution.
// Random points on a sphere of radius R
var R = 1.0;
var sphereCenters = [];
var numSpheres = 100;
for (var i = 0; i < numSpheres; i++) {
    // Components in [-1, 1) so the directions cover the whole sphere, not just one octant
    var vec = new THREE.Vector3(Math.random() * 2 - 1, Math.random() * 2 - 1, Math.random() * 2 - 1).normalize();
    var sphereCenter = new THREE.Vector3().copy(vec).multiplyScalar(R);
    sphereCenter.radius = Math.random() * 5; // Random sphere size. Plug in your sizes here.
    sphereCenters.push(sphereCenter);
    // Create a Three.js sphere at sphereCenter
    ...
}
Then run the below code a few times to pack the spheres efficiently:
for (var i = 0; i < sphereCenters.length; i++) {
    for (var j = 0; j < sphereCenters.length; j++) {
        if (i === j)
            continue;
        // Vector between sphereCenters[i] and sphereCenters[j]
        var dist = new THREE.Vector3().copy(sphereCenters[i]).sub(sphereCenters[j]);
        // The spheres overlap if their centers are closer than the sum of their radii
        var minDist = sphereCenters[i].radius + sphereCenters[j].radius;
        if (dist.length() < minDist) {
            // How far do we have to move?
            var mDist = minDist - dist.length();
            // Perturb this sphere along dist by magnitude mDist
            var mVec = new THREE.Vector3().copy(dist).normalize();
            mVec.multiplyScalar(mDist);
            // Offset the actual sphere, then project it back onto the sphere of radius R
            sphereCenters[i].add(mVec).normalize().multiplyScalar(R);
        }
    }
}
Running the second section a number of times will "converge" on the solution you are looking for. You have to choose how many times it should be run to find the best trade-off between speed and accuracy.
You can use the same code as in the periodic table of elements example.
The rectangles there do not touch, so you can get the same effect with circles using virtually the same code.
Here is the code they have:
var vector = new THREE.Vector3();
for ( var i = 0, l = objects.length; i < l; i ++ ) {
    // Spread the objects evenly over a sphere of radius 800 using a spherical spiral
    var phi = Math.acos( -1 + ( 2 * i ) / l );
    var theta = Math.sqrt( l * Math.PI ) * phi;
    var object = new THREE.Object3D();
    object.position.x = 800 * Math.cos( theta ) * Math.sin( phi );
    object.position.y = 800 * Math.sin( theta ) * Math.sin( phi );
    object.position.z = 800 * Math.cos( phi );
    // Orient each object to face away from the sphere's center
    vector.copy( object.position ).multiplyScalar( 2 );
    object.lookAt( vector );
    targets.sphere.push( object );
}
What is PlaneBufferGeometry exactly, and how is it different from PlaneGeometry? (r69)
PlaneBufferGeometry is a low-memory alternative to PlaneGeometry. The objects themselves differ in a lot of ways; for instance, the vertices of a PlaneBufferGeometry are located in PlaneBufferGeometry.attributes.position instead of PlaneGeometry.vertices.
You can take a quick look in the browser console to see more differences, but as far as I understand, since the vertices are spaced at a uniform distance (in X and Y) from each other, you usually only need to set the heights (Z) to displace the vertices.
The main differences are between Geometry and BufferGeometry.
Geometry is a "user-friendly", object-oriented data structure, whereas BufferGeometry is a data structure that maps more directly to how the data is used in the shader program. BufferGeometry is faster and requires less memory, but Geometry is in some ways more flexible, and certain operations can be done with greater ease.
I have very little experience with Geometry, as I have found that BufferGeometry does the job in most cases. It is useful to learn, and work with, the actual data structures that are used by the shaders.
In the case of a PlaneBufferGeometry, you can access the vertex positions like this:
let pos = geometry.getAttribute("position");
let pa = pos.array;
Then set z values like this:
var hVerts = geometry.parameters.heightSegments + 1;
var wVerts = geometry.parameters.widthSegments + 1;
for (let j = 0; j < hVerts; j++) {
    for (let i = 0; i < wVerts; i++) {
        // Each vertex takes 3 consecutive floats: +0 is x, +1 is y, +2 is z
        pa[3 * (j * wVerts + i) + 2] = Math.random();
    }
}
pos.needsUpdate = true;
geometry.computeVertexNormals();
Randomness is just an example. You could also (as another example) plot a function of x and y, if you let x = pa[3*(j*wVerts+i)]; and let y = pa[3*(j*wVerts+i)+1]; in the inner loop. For a small performance benefit in the PlaneBufferGeometry case, let y = (0.5 - j/(hVerts-1)) * geometry.parameters.height in the outer loop instead.
geometry.computeVertexNormals(); is recommended if your material uses normals and you haven't calculated more accurate normals analytically. If you don't supply or compute normals, the material will use the default plane normals which all point straight out of the original plane.
Note that the number of vertices along a dimension is one more than the number of segments along the same dimension.
Note also that (counterintuitively) the y values are flipped with respect to the j indices: vertices.push( x, - y, 0 ); (source)
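Building on that, here is a small sketch of the plot-a-function-of-x-and-y variant, assuming a PlaneBufferGeometry and a made-up height function f(x, y):
// Displace each vertex of a plane by z = f(x, y)
var geometry = new THREE.PlaneBufferGeometry(10, 10, 50, 50);
var pos = geometry.getAttribute('position');
var pa = pos.array;
var hVerts = geometry.parameters.heightSegments + 1;
var wVerts = geometry.parameters.widthSegments + 1;

function f(x, y) { // example surface, just for illustration
    return Math.sin(x) * Math.cos(y);
}

for (var j = 0; j < hVerts; j++) {
    // y is constant along a row; note the sign flip relative to j
    var y = (0.5 - j / (hVerts - 1)) * geometry.parameters.height;
    for (var i = 0; i < wVerts; i++) {
        var x = pa[3 * (j * wVerts + i)];
        pa[3 * (j * wVerts + i) + 2] = f(x, y);
    }
}
pos.needsUpdate = true;
geometry.computeVertexNormals();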
I'm using a BufferGeometry and some predefined data to create an object similar to a Minecraft chunk (made of voxels and containing cave-like structures). I'm having a problem lighting up this object efficently.
At the moment I'm using a MeshLambertMaterial and a DirectionalLight which enables me to cast shadows on voxels not in view of the light, however this isn't efficient to use for a large terrain because it requires a very large shadow map and will often cause glitchy shadow artifacts as a result.
Here's the code I'm using to add the indices and vertices to the BufferGeometry:
// Add indices to BufferGeometry
for ( var i = 0; i < section.indices.length; i ++ ) {
    var j = i * 3;
    var q = section.indices[i];
    indices[ j ]     = q[0] % chunkSize;
    indices[ j + 1 ] = q[1] % chunkSize;
    indices[ j + 2 ] = q[2] % chunkSize;
}
// Add vertices to BufferGeometry
for ( var i = 0; i < section.vertices.length; i ++ ) {
    var q = section.vertices[i];
    // There's 1 color for every 4 vertices (square)
    var hexColor = section.colors[ i / 4 ];
    addVertex( i, q[0], q[1], q[2], hexColor );
}
And my 'chunk' example: http://jsfiddle.net/9sSyz/4/
A screenshot:
If I were to remove the shadows from my example, all voxels on the correct side would be lit up even if another voxel obstructed the light. I just need another, scalable way to give the illusion of a shadow, perhaps by changing vertex colors if they're not in view of the light? It doesn't have to be as accurate as the current shadow implementation, so changing the vertex colors (to give a blocky, vertex-bound shadow) would be enough.
Would appreciate any help or advice. Thanks.
Generally, if you have large terrains, the idea is to split the scene into several cascades, each with its own shadow map. The technique is called CSM (cascaded shadow maps). The problem is, I haven't heard of a WebGL example that implements this technique. CSMs are used on dynamic scenes, but I'm not sure how easy it would be to implement this with Three.js.
The second option is adding ambient occlusion, as suggested by WestLangley, but it's just occlusion, not a shadow; the results are very different.
The third option, if your scene is mostly static, is baked shadows: preprocessed textures that you simply apply to the terrain, etc. To support dynamic objects, just render their shadow maps and apply those to some geometry that mimics the shadowed area (perhaps a plane that hovers slightly above the ground and receives the shadow); a rough sketch of this idea is shown below.
Any combination of the techniques mentioned is also an option.
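For the baked-shadows route, here is a rough sketch of the idea in Three.js. The asset name terrainBaked.png, the scene, renderer, terrainGeometry and dynamicMesh variables, and the exact API calls (recent Three.js versions) are all assumptions, not something from your fiddle:
// Static terrain: lighting/shadows are baked into its texture,
// so it neither casts nor receives real-time shadows
var terrain = new THREE.Mesh(
    terrainGeometry,
    new THREE.MeshBasicMaterial({ map: new THREE.TextureLoader().load('terrainBaked.png') })
);
scene.add(terrain);

// A shadow-catcher plane hovering slightly above the ground,
// receiving only the dynamic objects' shadows
var catcher = new THREE.Mesh(
    new THREE.PlaneGeometry(50, 50),
    new THREE.ShadowMaterial({ opacity: 0.4 })
);
catcher.rotation.x = -Math.PI / 2;
catcher.position.y = 0.01;
catcher.receiveShadow = true;
scene.add(catcher);

// Only the dynamic object casts shadows, so the light's shadow camera
// (and therefore the shadow map) can stay small and artifact-free
var light = new THREE.DirectionalLight(0xffffff, 1);
light.castShadow = true;
light.shadow.mapSize.set(1024, 1024);
scene.add(light);

dynamicMesh.castShadow = true;
scene.add(dynamicMesh);

renderer.shadowMap.enabled = true;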
P.S. Could you also supply a screenshot? The fiddles fail to load.
I have a basic particle system in JavaScript (utilising canvas for rendering), and I'm trying to find the best way to handle collisions between particles. The particle system can handle about 70,000 particles at a pretty decent FPS.
It consists of an array that contains every Particle object.
Each Particle object contains 3 Vector objects (one for displacement, velocity, and acceleration) which contain an x and a y variable.
Before each frame, acceleration vectors are applied to velocity vectors, and velocity vectors are applied to displacement vectors for every single Particle object.
The renderer then iterates through each Particle and then draws a 1x1 pixel square at the location of every displacement vector.
The particle system also has 'magnetic' fields, which can cause the particles to accelerate towards or away from a given point.
I tried applying a 'magnetic' field to each particle, but the calculations I use to get the updated acceleration vectors for each particle are too inefficient, and this method reduced the FPS considerably.
Below is the code I use to recalculate Particle acceleration vectors, with respect to nearby magnetic fields (This function is called before every frame):
Particle.prototype.submitToFields = function (fields) {
    // our starting acceleration this frame
    var totalAccelerationX = 0;
    var totalAccelerationY = 0;
    // for each passed field
    for (var i = 0; i < fields.length; i++) {
        var field = fields[i];
        // find the vector from the particle to the field
        var vectorX = field.point.x - this.point.x;
        var vectorY = field.point.y - this.point.y;
        // calculate the force via MAGIC and HIGH SCHOOL SCIENCE!
        var force = field.mass / Math.pow(vectorX * vectorX + vectorY * vectorY, 1.5);
        // add to the total acceleration the force adjusted by distance
        totalAccelerationX += vectorX * force;
        totalAccelerationY += vectorY * force;
    }
    // update our particle's acceleration
    this.acceleration = new Vector(totalAccelerationX, totalAccelerationY);
};
It's obvious why the above method reduced the performance drastically: the work grows with the number of particles times the number of fields, and naive pairwise collision checks would grow quadratically with every new particle added.
Is there another method of particle collision detection that will have good performance with thousands of particles? Will these methods work with my current object structure?
Don't create a new Vector here. It means that you're creating 70,000 new Vectors each frame. Just change the vector's values:
this.acceleration.x = totalAccelerationX; // or : this.acceleration[0] = totalAccelerationX;
this.acceleration.y = totalAccelerationY; // or : this.acceleration[1] = totalAccelerationY;
If that doesn't help enough, you'll have to use a Web Worker.
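If you do go the Web Worker route, here is a minimal sketch of moving the field calculation off the main thread. The worker file name (fieldWorker.js) is made up, and the main-thread particles array is assumed to be in scope; positions and the resulting accelerations travel as transferable Float32Arrays to avoid copying:
// main.js
var worker = new Worker('fieldWorker.js');
worker.onmessage = function (e) {
    // Flat array [ax0, ay0, ax1, ay1, ...] coming back from the worker
    var acc = new Float32Array(e.data);
    for (var i = 0; i < particles.length; i++) {
        particles[i].acceleration.x = acc[2 * i];
        particles[i].acceleration.y = acc[2 * i + 1];
    }
};

function requestAccelerations(particles, fields) {
    var pos = new Float32Array(particles.length * 2);
    for (var i = 0; i < particles.length; i++) {
        pos[2 * i] = particles[i].point.x;
        pos[2 * i + 1] = particles[i].point.y;
    }
    var plainFields = fields.map(function (f) {
        return { x: f.point.x, y: f.point.y, mass: f.mass };
    });
    // Transfer the position buffer instead of copying it
    worker.postMessage({ positions: pos.buffer, fields: plainFields }, [pos.buffer]);
}

// fieldWorker.js
onmessage = function (e) {
    var pos = new Float32Array(e.data.positions);
    var fields = e.data.fields;
    var acc = new Float32Array(pos.length);
    for (var i = 0; i < pos.length; i += 2) {
        for (var j = 0; j < fields.length; j++) {
            var dx = fields[j].x - pos[i];
            var dy = fields[j].y - pos[i + 1];
            var force = fields[j].mass / Math.pow(dx * dx + dy * dy, 1.5);
            acc[i] += dx * force;
            acc[i + 1] += dy * force;
        }
    }
    postMessage(acc.buffer, [acc.buffer]);
};
Note that the results arrive asynchronously, so in practice the accelerations will be applied one frame late; for a particle system that is usually acceptable.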