How to rotate an object along a circular path? - javascript

I'm trying to model a bicycle that moves along a circular path using WebGL. My problem is with the rotation of the bicycle itself: it does not turn as it moves around the circle, but keeps its initial heading, even though the object is correctly translated in a circular fashion along the track.
To move the bicycle along the circle, I'm using the cosine and sine functions and varying the angle each frame. The axes I have to take into consideration are the x-axis and z-axis, while the y-axis is fixed.
Any suggestions?

If I understood you correctly, what you want is essentially to orient the bicycle so that it faces its direction of motion?
That is usually done by calculating the model's (the bicycle's) modelToWorld matrix. Your matrix lib probably has a .lookAt function, and you should use that to calculate the modelToWorld matrix.
You should be able to calculate the bike's forward direction. If it's moving around in a circle, then it is normalize(cross(normalize(bikePos - circleCenter), UP_VECTOR)).
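A minimal sketch of that idea, assuming gl-matrix is available (center, radius, and angle are placeholder names; note that gl-matrix's mat4.targetTo builds a model-to-world matrix, whereas mat4.lookAt builds a view matrix):

```javascript
// Sketch, not the poster's code: place the bike on a circle of radius `radius`
// around `center` and point it along the tangent. Assumes the gl-matrix UMD
// bundle is loaded (global `glMatrix`).
const { mat4, vec3 } = glMatrix;

function bikeModelMatrix(center, radius, angle) {
  // Position on the circle in the XZ plane (y stays fixed).
  const pos = vec3.fromValues(
    center[0] + radius * Math.cos(angle),
    center[1],
    center[2] + radius * Math.sin(angle)
  );

  // Forward = tangent of the circle = cross(radial direction, up).
  const radial = vec3.create();
  vec3.subtract(radial, pos, center);
  vec3.normalize(radial, radial);
  const forward = vec3.create();
  vec3.cross(forward, radial, [0, 1, 0]);
  vec3.normalize(forward, forward);

  // modelToWorld: stand at `pos` and face a point one unit ahead.
  const target = vec3.create();
  vec3.add(target, pos, forward);
  const modelToWorld = mat4.create();
  mat4.targetTo(modelToWorld, pos, target, [0, 1, 0]); // not mat4.lookAt (that is a view matrix)
  return modelToWorld;
}
```

Upload the returned matrix as the bike's model matrix each frame and the bike will face along the tangent of its circular track.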

Related

Ray-Tracing inspired algorithm for sound reflection simulation

I am a civil architect by profession, with a passion for maths, physics and computers. For one of my projects, I am designing a room with three straight walls and a fourth curved wall. A sound source is near the left wall.
+-------
| \
| + |
| /
+-------
Having some spare time on my hands, I decided to try modeling the acoustics of this room using JavaScript and Canvas API. My goal was to calculate for every point in the room:
Net intensity of sound, by summing the sound coming directly from the source and the reflections off the walls (including the curved one). This would include attenuation due to the inverse square law and absorption by the walls.
Reverb characteristics, by keeping track of path lengths both directly from the source and via reflections from the walls. If a point in the room receives a reflected signal about 0.05 seconds after the primary signal arrives, we might have an echo problem.
I assumed a canvas size of 800x600 pixels and real-world dimensions of the room of 45x44 feet (left wall = 44ft, top/bottom walls 31ft, curved wall radius 22ft), with the sound source 5ft from the left wall. I modeled each wall as a line or circle equation and wrote a function that tells me whether a point is inside the room or not. For each pixel in the canvas, I converted it to a real-world coordinate, calculated its distance from the source, and used the inverse square law to calculate the sound intensity. What I ended up with was this:
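A rough sketch of that per-pixel direct-sound pass (not the original code; the source's vertical position and isInsideRoom() are stand-ins for details of the real project):

```javascript
// Direct-sound pass: canvas pixel -> world coordinate -> inverse-square intensity.
const FT_PER_PX_X = 45 / 800;            // room is 45 ft across an 800 px canvas
const FT_PER_PX_Y = 44 / 600;            // and 44 ft across 600 px
const source = { x: 5, y: 22 };          // 5 ft from the left wall (height assumed)

function pixelToWorld(px, py) {
  return { x: px * FT_PER_PX_X, y: py * FT_PER_PX_Y };
}

function directIntensity(px, py, power = 1) {
  const p = pixelToWorld(px, py);
  if (!isInsideRoom(p)) return 0;        // the poster's point-in-room test
  const dx = p.x - source.x, dy = p.y - source.y;
  const r2 = dx * dx + dy * dy;
  return power / Math.max(r2, 1e-6);     // inverse square law, guarded near r = 0
}
```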
However, needless to say, this only captures the primary bounce from the source. It doesn't capture any reflections and those are proving way too hard for me to calculate.
I'm looking for insight into how I can do that. I have partially tried the following:
Instead of iterating points in the room grid-wise, I've tried generating rays from the source. Calculating reflections off the straight walls is easy, but the curved wall presents some challenges. The biggest problem I'm having is this: if I start with 360 rays, the points closest to the source are packed far too densely (many per pixel), but as we move outwards the points become so diluted that there may be tens of pixels between adjacent points. This also means that when I reflect a ray, it would almost certainly not land on the points created by the primary bounce, so I wouldn't be able to simply add them up. Even if I interpolate, the result would not be correct: some points would register intensity due to the primary bounce, even fewer would register intensity due to secondary/tertiary bounces, and many points would register nothing. In the image below, I've tried this approach with primary bounces only.
Iterate the room grid-wise. For each point in the room, calculate the direct distance to the source and the reflected (mirror-image) location of the source in each wall. Use these distances to calculate the net intensity at every sample point in the room. This is easy to do for the straight walls, but the math turns EXTRAORDINARILY complicated and unsolvable for me for the curved wall.
            X
            +
      + B
  +         +
  A         O
Assume A is the source, O is the center of the curve, B is the point in the room we're currently testing, and X is a point on the curve. For secondary bounces, ∠AXO = ∠BXO. We know A, O and B. If we can find X, then BX needs to be extended backwards a distance equal to AX, and the image of the source would be located there. The problem is that finding X is a very hard problem. And even if this can be done, it only accounts for secondary bounces. Tertiary bounces would be even harder to calculate.
I believe Option #2 is the better way to go about this. But I do not possess enough math/computer skills to tackle this problem on my own. At this point in time, I'm trying to solve this not for my project, but for my personal curiosity. I would be grateful if any of you can solve this and share your solutions. Or if you can just give me insight into this problem, I would be glad to work on this further.
I lack expertise in computing interpolation and ray-tracing (which would be required for this problem, I think).
Thanks,
Asim
So. With great pointers from lara and a deep dive into matrix math and Bresenham's line rendering algorithm, I was finally able to complete this hobby project. =) Outlined here are the steps for anyone wishing to follow this route for similar problems.
Ditch algebraic equations in favor of matrix math. Ditch lines in favor of parametric lines.
Represent walls in terms of rays and circles. Lines can be represented parametrically as [x y 1] = [x0 y0 1] + t*[dx dy 0]. Circles can be represented as |X - C|^2 = r^2.
Project rays outwards from the source. For each ray, calculate its intersection with one of the walls, and resize it to span from its starting point to the intersection point.
When an intersection point is calculated, also calculate the normal vector of the wall at the point of intersection. For straight walls, calculating the normal is simple ([-dy dx 0]). For circles, the normal is X - C (normalized).
Use matrices to reflect the incident ray about the normal.
Repeat the process for as many bounces as needed. (The first sketch after these steps shows this intersect-and-reflect loop for the curved wall.)
Map the World Coordinate System the plan is in to a Cell Coordinate System that divides the world into a grid of cells. Each cell can be 1'-0" x 1'-0" in size. Use matrices again for the transformation between the two coordinate systems.
Use the transformation matrix to convert the ray to the Cell Coordinate System.
Use Bresenham's Line algorithm to determine which cells the ray passes through. For each cell, use the inverse square law to calculate the ray's intensity at that cell and add it to the cell grid. (The second sketch after these steps shows this.)
Finally, use the Canvas API and another transformation matrix to convert from the Cell Coordinate System to the Screen Coordinate System and render the cell grid on screen.
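A minimal sketch of the intersect-and-reflect step for the curved wall (the helper names are made up for this example; straight walls work the same way, with a ray-line intersection and the [-dy dx 0] normal):

```javascript
// A ray is {o, d} with a unit direction; the curved wall is {c, r}.
function dot(a, b) { return a.x * b.x + a.y * b.y; }
function sub(a, b) { return { x: a.x - b.x, y: a.y - b.y }; }
function add(a, b) { return { x: a.x + b.x, y: a.y + b.y }; }
function scale(a, s) { return { x: a.x * s, y: a.y * s }; }
function normalize(a) { const l = Math.hypot(a.x, a.y); return { x: a.x / l, y: a.y / l }; }

// Smallest positive t with o + t*d on the circle |X - C|^2 = r^2.
function intersectCircle(ray, circle) {
  const oc = sub(ray.o, circle.c);
  const b = 2 * dot(oc, ray.d);
  const c = dot(oc, oc) - circle.r * circle.r;
  const disc = b * b - 4 * c;                  // a = 1 because d is a unit vector
  if (disc < 0) return null;
  const t1 = (-b - Math.sqrt(disc)) / 2;
  const t2 = (-b + Math.sqrt(disc)) / 2;
  return t1 > 1e-9 ? t1 : (t2 > 1e-9 ? t2 : null);
}

// Reflect the incident direction about the (unit) wall normal at the hit point.
function reflect(d, n) {
  return normalize(sub(d, scale(n, 2 * dot(d, n))));
}

// One bounce off the curved wall; repeat for as many bounces as needed.
function bounceOffCircle(ray, circle) {
  const t = intersectCircle(ray, circle);
  if (t === null) return null;
  const hit = add(ray.o, scale(ray.d, t));
  const n = normalize(sub(hit, circle.c));     // circle normal is X - C
  return { o: hit, d: reflect(ray.d, n), length: t };
}
```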
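And a minimal sketch of the cell-accumulation step, assuming the ray segment's endpoints have already been transformed into integer cell coordinates; the one-cell-per-step distance increment is an approximation (diagonal steps are slightly longer):

```javascript
// Walk the cells a ray segment crosses with Bresenham's line algorithm and
// deposit inverse-square intensity. `cells` is a width x height Float32Array of
// 1 ft x 1 ft cells; `distSoFar` is the path length from the source to `start`, in feet.
function depositRay(cells, width, start, end, distSoFar, power = 1) {
  let x0 = start.x, y0 = start.y;
  const x1 = end.x, y1 = end.y;
  const dx = Math.abs(x1 - x0), dy = -Math.abs(y1 - y0);
  const sx = x0 < x1 ? 1 : -1, sy = y0 < y1 ? 1 : -1;
  let err = dx + dy;
  let traveled = 0;

  while (true) {
    const r = distSoFar + traveled;
    cells[y0 * width + x0] += power / Math.max(r * r, 1); // inverse square law
    if (x0 === x1 && y0 === y1) break;
    const e2 = 2 * err;
    if (e2 >= dy) { err += dy; x0 += sx; }
    if (e2 <= dx) { err += dx; y0 += sy; }
    traveled += 1;                                         // ~1 ft per cell step
  }
}
```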
Here are a few screenshots of what I achieved using this:
Rays of light emanating from the source and intersecting with the walls.
Calculating multiple reflections of a single ray.
Calculating single reflections of all rays emanating from the source.
Using Bresenham's Line algorithm to identify cells crossed by the ray and plotting logarithmic intensity.
Rendering all cells for all primary rays only.
Rendering all cells for reflected rays only. The caustic surface is clearly visible here.
Rendering all cells for primary and reflected rays using color-coding.
I got to learn a lot of very interesting math during the course of this project. I respect matrices a lot more now. All during my college years, I wondered if I would ever need to use Bresenham's Line algorithm in my life, since all graphics libraries have line drawing algorithms built in. For the first time, I found that I needed to use this algorithm directly, and without it this project would not have been possible.
I will make the code available on GitHub soon. Thanks to everyone who contributed to my understanding of these concepts.
Asim

How to get real measurements of two points from a 360 image in a-frame?

How can I get a real measurement between two points in a 360 picture of an interior using the a-frame.io framework?
We tried converting A-Frame's unit system to centimeters, took two points whose real-world dimensions were known, and set that as a reference, estimating that any other points we take would then be relatively correct, but they aren't.
Any other suggestions or formulas that could help?
Thank you
That can't work, at least not unless you have a depth image as well. What you can easily get from a single 360° image are two angles, for pan and tilt. If you add a third value, the distance from the camera (also called depth), you have so-called spherical coordinates, which can be converted to Cartesian coordinates (x, y, z).
Without knowing that distance you can only reconstruct a ray, but not a single point. You need one more piece of information to determine where along that ray the point is (which is what you need to know for any measurements in the image).
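For reference, a small sketch of that conversion in plain JavaScript (the axis convention and the function names are assumptions):

```javascript
// Spherical (pan, tilt, depth) -> Cartesian (x, y, z).
// pan and tilt in radians; depth in whatever unit you want the result in.
function sphericalToCartesian(pan, tilt, depth) {
  return {
    x: depth * Math.cos(tilt) * Math.sin(pan),
    y: depth * Math.sin(tilt),
    z: depth * Math.cos(tilt) * Math.cos(pan),
  };
}

// With depth known for two pixels, the real distance between them is simply:
function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}
```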

Mathematics for rotation in d3.js for 3d charts

I am trying to implement rotate-on-drag functionality in d3 for a 3d chart. For reference I am using the following example from d3, but I am not getting the maths behind it.
Can anyone please explain the math for the rotation in this example?
http://bl.ocks.org/supereggbert/aff58196188816576af0
As @coderPi mentioned, you have to rotate and project every single point. Think about it like this: your 3d point (x, y, z) cannot be displayed directly, because your screen only has x and y. To get the point onto your screen you have to project it; a very common projection is the orthographic projection. For the rotation you should use a rotation matrix (and there are various ones). If you don't want to do this all by yourself, I created a decent d3.js plugin.
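A small sketch of the rotate-then-project idea (not the linked example's code and not the plugin's API; the axis order and scale are assumptions):

```javascript
// Rotate a 3d point about the y- and x-axes, then drop z for an orthographic
// projection onto the screen.
function rotateY(p, a) {
  return {
    x:  p.x * Math.cos(a) + p.z * Math.sin(a),
    y:  p.y,
    z: -p.x * Math.sin(a) + p.z * Math.cos(a),
  };
}

function rotateX(p, a) {
  return {
    x: p.x,
    y: p.y * Math.cos(a) - p.z * Math.sin(a),
    z: p.y * Math.sin(a) + p.z * Math.cos(a),
  };
}

// Orthographic projection: ignore z (keep it only for depth sorting if needed).
function project(p, scale, cx, cy) {
  return { x: cx + scale * p.x, y: cy - scale * p.y, depth: p.z };
}

// A drag handler then maps pixels dragged to yaw/pitch angles and re-projects
// every point, e.g.:
// const q = project(rotateX(rotateY(point, yaw), pitch), 40, width / 2, height / 2);
```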

Three.js rotate everything except camera

I have given up trying to orbit a camera around my scene in Three.js and have now decided to revert to doing what I used to do in XNA: just rotate everything except the camera.
The reason I have given up is that I cannot get the camera to orbit properly 360 degrees around all axes; it starts inverting after going over the top or under the bottom. Using THREE.OrbitControls does not solve this, because it merely restricts rotation around the problematic axis instead of fixing the problem.
My problem now is getting this other rotation approach working. What I have done is put all objects except the camera in another object, "rotSection", and I am now just rotating that object. This works, but rotation is always performed around the rotation object's own (0, 0, 0) position, which seems to stay in one corner, whereas I would like to rotate around the centre of my world and not around the edge. I have tried to centre rotSection relative to the scene, but it still rotates around its corner and not its centre. Any idea how I can get rotation of an Object3D around a certain point?
The engines don’t move the ship at all. The ship stays where it is and
the engines move the universe around it.
Futurama
The camera in 3d technically never rotates; everything else is rotated and moved in order to bring it into the camera's local space. You don't have to do any tricks to achieve this. This should be the core of the 3d engine: setting the matrices, setting up the shaders, and doing the correct transforms. Three.js does this for you.
Perhaps you should look into quaternions? Specifically, the axis-angle conversion to quaternions. THREE.OrbitControls won't do what you want.
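As a rough sketch of rotating the question's rotSection group around an arbitrary pivot with an axis-angle quaternion (the pivot, axis, and drag factor below are placeholders, not part of either answer):

```javascript
// Rotating about an arbitrary point = translate so the pivot is at the origin,
// rotate, translate back, and apply the same rotation to the orientation.
const pivot = new THREE.Vector3(0, 0, 0);            // centre of your world

function rotateGroupAroundPoint(group, point, axis, angle) {
  const q = new THREE.Quaternion().setFromAxisAngle(axis.clone().normalize(), angle);

  // Move the group's position around the pivot...
  group.position.sub(point);
  group.position.applyQuaternion(q);
  group.position.add(point);

  // ...and rotate its orientation by the same quaternion.
  group.quaternion.premultiply(q);
}

// e.g. in a drag handler:
// rotateGroupAroundPoint(rotSection, pivot, new THREE.Vector3(0, 1, 0), dx * 0.01);
```

An equivalent approach is to parent rotSection to an empty Object3D positioned at the pivot and rotate that parent instead.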

Averaging rotations with euler angles or rotational matrices

I have been working on a small app that controls the rotation of a cube map panorama via the gyroscope of a mobile device or tablet. I finally have it working, albeit roughly. My solution involved converting the Euler angles coming from the gyroscope into rotation matrices and passing those matrices through various modification matrices.
Now that I have this working, I am looking to smooth out the animation. I was thinking it would be best to collect the rotation data in an array and then take its average. However, I am totally unsure how to do this.
Can I average the rotation matrices, or the Euler angles themselves? Or am I going to need to convert the data into quaternions and then apply some kind of averaging function?
Any help would be great. Thanks!
Can I average the rotation matrices, or the Euler angles themselves?
Nope.
Or am I going to need to convert the data into quaternions and then apply some kind of averaging function?
Yes, only quaternions are appropriate for inter/extrapolation. See 45:05 here (David Sachs, Google Tech Talk).
I haven't done smoothing like the one you are looking for, but in any case only quaternions are appropriate.
Quaternion Slerps are commonly used to construct smooth animation curves ...
From Wikipedia, Slerp.
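A minimal slerp-based smoothing sketch in plain JavaScript (the exponential filter and the 0.15 factor are assumptions, not something from the question or the linked talk):

```javascript
// Quaternions as plain {x, y, z, w} objects, assumed normalized.
function slerp(a, b, t) {
  let cos = a.x * b.x + a.y * b.y + a.z * b.z + a.w * b.w;
  // Take the short way around.
  if (cos < 0) { b = { x: -b.x, y: -b.y, z: -b.z, w: -b.w }; cos = -cos; }
  if (cos > 0.9995) {                      // nearly identical: lerp + normalize
    const q = {
      x: a.x + t * (b.x - a.x),
      y: a.y + t * (b.y - a.y),
      z: a.z + t * (b.z - a.z),
      w: a.w + t * (b.w - a.w),
    };
    const l = Math.hypot(q.x, q.y, q.z, q.w);
    return { x: q.x / l, y: q.y / l, z: q.z / l, w: q.w / l };
  }
  const theta = Math.acos(cos);
  const s = Math.sin(theta);
  const wa = Math.sin((1 - t) * theta) / s;
  const wb = Math.sin(t * theta) / s;
  return {
    x: wa * a.x + wb * b.x,
    y: wa * a.y + wb * b.y,
    z: wa * a.z + wb * b.z,
    w: wa * a.w + wb * b.w,
  };
}

// Exponential smoothing: blend the previous smoothed orientation toward each
// new gyro sample instead of averaging a whole array.
let smoothed = null;
function smoothSample(gyroQuat, alpha = 0.15) {
  smoothed = smoothed ? slerp(smoothed, gyroQuat, alpha) : gyroQuat;
  return smoothed;
}
```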
