BabylonJS Radial vs Rectangular Textures (Conversion or Code Change) - javascript

I am working with the planetary textures from this site. They are all in rectangular form.
However, in my BabylonJS application, textures are expected to be like this.
I have tried setting the coordinates mode, but it doesn't seem to do anything.
// These didn't have an effect
material.diffuseTexture.coordinatesMode = BABYLON.Texture.SPHERICAL_MODE;
material.diffuseTexture.coordinatesMode = BABYLON.Texture.EXPLICIT_MODE;
material.diffuseTexture.coordinatesMode = BABYLON.Texture.PLANAR_MODE;
material.diffuseTexture.coordinatesMode = BABYLON.Texture.CUBIC_MODE;
material.diffuseTexture.coordinatesMode = BABYLON.Texture.PROJECTION_MODE;
material.diffuseTexture.coordinatesMode = BABYLON.Texture.SKYBOX_MODE;
Is there a way to convert between these two kinds of textures? Alternatively, are there planet textures like the bottom one available somewhere?

In fact this is related to the texture coordinates embedded in your mesh. You should use Blender to export different coordinates, or you can play with texture.uOffset, texture.vOffset, texture.uScale and texture.vScale to move your texture on your mesh, as in the sketch below.
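For example, a minimal sketch of nudging the existing mapping (these are real BABYLON.Texture properties; the values are arbitrary illustrations, not known-good ones for your texture):
var tex = material.diffuseTexture;
tex.uOffset = 0.25; // slide the texture a quarter of the way around horizontally
tex.vOffset = 0.0;  // no vertical shift
tex.uScale = 1.0;   // repetitions along U
tex.vScale = 1.0;   // repetitions along V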

Related

Three.js is there a way to adapt the geometry to the texture

I am working on a project in Three.js and I need to have multiple images floating in a 3D space. So I started by simply using these images as textures on planes. However, the images have different heights and widths, so I am wondering if there is a way to make each plane adapt to the size of its texture, or at least be proportional to it.
There may be a simple way to do it, but I didn't find anything. Maybe one of you can help me, or tell me to stop looking for it?
When loading a texture, you can check its size and THEN create the plane to host it with the right width/height ratio:
var loader = new THREE.TextureLoader();
var texture = loader.load( "./img.png", function ( tex ) {
    console.log( tex.image.width, tex.image.height );
    // Create the plane only now, matching the image's width/height ratio
    var aspect = tex.image.width / tex.image.height;
    var geometry = new THREE.PlaneGeometry( 10 * aspect, 10 ); // 10 is an arbitrary base size
    var plane = new THREE.Mesh( geometry, new THREE.MeshBasicMaterial( { map: tex } ) );
    scene.add( plane ); // assumes an existing "scene"
} );

Parallax effect using three.js

I would like to build a parallax effect from a 2D image using a depth map, similar to this, or this but using three.js.
Question is, where should I start? Using just a PlaneGeometry with a MeshStandardMaterial renders my 2D image without parallax occlusion. Once I add my depth map as the displacementMap property I can see some sort of displacement, but it is very low-res. (Maybe because displacement maps are not meant for this: displacement moves vertices, so a coarsely subdivided plane can only show coarse relief.)
My first attempt
import * as THREE from "three";
import image from "./Resources/Images/image.jpg";
import depth from "./Resources/Images/depth.jpg";
[...]
// 10x10 segments = only 11x11 vertices to displace, hence the coarse result
const geometry = new THREE.PlaneGeometry(200, 200, 10, 10);
const material = new THREE.MeshStandardMaterial(); // note: needs a light in the scene
const spriteMap = new THREE.TextureLoader().load(image);
const depthMap = new THREE.TextureLoader().load(depth);
material.map = spriteMap;
material.displacementMap = depthMap;
material.displacementScale = 20;
const plane = new THREE.Mesh(geometry, material);
Or should I use a Sprite object, whose face always points to the camera? But how would I apply the depth map to it then?
I've set up a codesandbox with what I've got so far. It also contains an event listener for mouse movement and rotates the camera on movement, as it is work in progress.
Update 1
So I figured out that I seem to need a custom ShaderMaterial for this. After looking at pixi.js's implementation I found out that it is based on a custom shader.
Since I have access to the source, all I need to do is rewrite it to be compatible with three.js. But the big question is: HOW?
Would be awesome if someone could point me in the right direction, thanks!
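For reference, a minimal sketch of the underlying idea, not pixi's actual shader (the uniform names and the 0.02 strength factor are mine): offset each fragment's texture lookup by an amount proportional to the depth map, driven by the mouse position.
import * as THREE from "three";

const material = new THREE.ShaderMaterial({
  uniforms: {
    map:      { value: new THREE.TextureLoader().load("./Resources/Images/image.jpg") },
    depthMap: { value: new THREE.TextureLoader().load("./Resources/Images/depth.jpg") },
    mouse:    { value: new THREE.Vector2(0, 0) } // update on mousemove, roughly -1..1
  },
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    uniform sampler2D map;
    uniform sampler2D depthMap;
    uniform vec2 mouse;
    varying vec2 vUv;
    void main() {
      float depth = texture2D(depthMap, vUv).r; // bright = near
      vec2 offset = mouse * depth * 0.02;       // nearer pixels shift more
      gl_FragColor = texture2D(map, vUv + offset);
    }
  `
});
Applied to the PlaneGeometry from the first attempt, this shifts near pixels more than far ones, which is the parallax illusion the linked demos produce.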

syncing d3.js with THREE.js earth

I am trying to combine WebGL earth with d3.geo.satellite projection.
I have managed to overlay the two projections on top of each other and sync their rotation, but I am having trouble syncing zooming. When I sync them to match size, the WebGL projection gets deformed while the d3.geo.satellite one remains the same. I have tried different combinations of projection.scale and projection.distance without much success.
Here is a JS fiddle (it takes a little while to load the resources). You can drag it to rotate (works well), but if you zoom in (use the mousewheel) you can see the problem.
https://jsfiddle.net/nxtwrld/7x7dLj4n/2/
The important code is at the bottom of the script - the scale function.
function scale(){
    var scale = d3.event.scale;
    var ratio = scale / scale0; // scale0: presumably the projection's initial scale, captured at setup
    // scale the d3 projection
    projection.scale(scale);
    // scale the Three.js earth uniformly by the same ratio
    earth.scale.x = earth.scale.y = earth.scale.z = ratio;
}
I am not using WebGL Earth either, and your jsfiddle is not working anymore, but my assumption is that you want to integrate D3.js with Three.js as a solution for a 3D globe.
May I suggest you try earthjs as your solution? Under the hood it uses D3.js v4 and Three.js revision 8x (both the latest at the time of writing), and it can combine SVG, canvas and Three.js (WebGL).
const g = earthjs({ padding: 60 })
    .register(earthjs.plugins.mousePlugin())
    .register(earthjs.plugins.threejsPlugin())
    .register(earthjs.plugins.autorotatePlugin())
    .register(earthjs.plugins.dropShadowSvg(), 'dropshadow')
    .register(earthjs.plugins.worldSvg('../d/world-110m.json'))
    .register(earthjs.plugins.globeThreejs('../globe/world.jpg'));
g._.options.showLakes = false;
g.ready(function () {
    g.create();
});
You can run the snippet above from here.

ThreeJS: loading OBJ files keeping quadrilateral faces

Is it possible to load an OBJ file under ThreeJS keeping the quadrilateral faces? Here is an example:
http://www.professores.im-uff.mat.br/hjbortol/disciplinas/2014.2/hwc00001/test/threejs/viewer-04/viewer-04-b.html
Note that each quadrilateral face is rendered as two triangles in wireframe. I would like to keep the original quadrilateral faces, as shown here (in Java):
http://www.uff.br/cdme/triplets/triplets-html/triplets-en.html
And what about general n-sided polygon faces in OBJ files? Is it possible to keep those?
Thanks, Humberto.
Unfortunately everything gets translated to triangles. However, you may be able to achieve the result you are after with this code: EdgesHelper only draws edges whose adjacent faces meet at more than a threshold angle, so the diagonal between a quad's two coplanar triangles is not drawn.
var edges = new THREE.EdgesHelper( mesh );
scene.add( edges );
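In newer three.js releases, where EdgesHelper has been removed, the equivalent is EdgesGeometry plus LineSegments, as in this sketch (threshold in degrees):
var thresholdAngle = 1; // only edges between faces differing by more than this are kept
var edges = new THREE.LineSegments(
    new THREE.EdgesGeometry( mesh.geometry, thresholdAngle ),
    new THREE.LineBasicMaterial( { color: 0xffffff } )
);
scene.add( edges );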

Rendering spheres (or points) in a particle system

I am using the Three.JS library to display a point cloud in a web browser. The point cloud is generated once at start up and no further points are added or removed, but it does need to be rotated, panned and zoomed. I've gone through the tutorial about creating particles in three.js here.
Using the example I can create particles that are squares or use an image of a sphere to create a texture. The image is closer to what I want, but is it possible to generate the point clouds without using the image? The sphere geometry for example.
The problem with the image is that when you have thousands of points it seems they sometimes obscure each other around the edges. From what I can gather it seems like the black region in a point's png file blocks the image immediately behind the current point. (But it is transparent to points further behind)
This obscuring of the images is the reason I would like to generate the points using shapes. I have tried replacing particles = new THREE.Geometry() with THREE.SphereGeometry(radius, segments, rings) and tried to change the vertices to spheres.
So my question is: how do I modify the example code so that it renders spheres (or points) instead of squares? Also, is a particle system the most efficient approach for my particular case, or should I just generate the particles and set their individual positions? As I mentioned, I only generate the points once, but then rotate, zoom and pan them. (I used the TrackBall sample code to get the mouse events working.)
Thanks for your help
I don't think rendering a point cloud with spheres is very efficient. You should be able to get away with a particle system and use a texture or a small canvas program to draw a circle.
One of the first three.js samples uses a canvas program; here are the important bits:
var PI2 = Math.PI * 2;

// Canvas "program": draws a filled unit circle at each particle's position
var program = function ( context ) {
    context.beginPath();
    context.arc( 0, 0, 1, 0, PI2, true );
    context.closePath();
    context.fill();
};

var particle = new THREE.Particle( new THREE.ParticleCanvasMaterial( {
    color: Math.random() * 0x808008 + 0x808080,
    program: program
} ) );
Feel free to adapt the code for the WebGL renderer.
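For the WebGL renderer, a sketch of the same trick with the current API (Particle and ParticleCanvasMaterial are long gone; CanvasTexture and PointsMaterial are their modern counterparts, and the sizes here are illustrative):
function circleTexture() {
    // Bake a filled circle into a small canvas once and reuse it for every point
    var canvas = document.createElement( 'canvas' );
    canvas.width = canvas.height = 64;
    var ctx = canvas.getContext( '2d' );
    ctx.fillStyle = '#ffffff';
    ctx.beginPath();
    ctx.arc( 32, 32, 30, 0, Math.PI * 2 );
    ctx.fill();
    return new THREE.CanvasTexture( canvas );
}

var points = new THREE.Points(
    geometry, // a BufferGeometry holding the cloud's "position" attribute
    new THREE.PointsMaterial( { size: 4, map: circleTexture(), transparent: true, alphaTest: 0.5 } )
);
scene.add( points );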
Another clever solution I've seen in the examples is using an encoded webm video to store the data and pass that to a GLSL shader which is rendered through a particle system in three.js
If your point cloud comes from a Kinect, these resources might be useful:
DepthCam
KinectJS
When comparing my code to http://threejs.org/examples/#webgl_custom_attributes_particles3
I saw the only difference was:
vec4 outColor = texture2D( texture, gl_PointCoord );
if ( outColor.a < 0.5 ) discard;
gl_FragColor = outColor;
Adding this to the fragment shader fixed the problem for me: fragments whose alpha is below 0.5 are discarded, so the transparent corners of each point sprite no longer write to the depth buffer and block the particles behind them.
It wasn't z-fighting, because some corners would randomly overlap distant particles.
material.alphaTest = 0.5 didn't work for me, and turning off depth writes/tests messed up the viewing order.
The problem with the image is that when you have thousands of points it seems they sometimes obscure each other around the edges. From what I can gather it seems like the black region in a point's png file blocks the image immediately behind the current point. (But it is transparent to points further behind)
You can get rid of the transparency overlapping problem of the underlying square structure by turning off depth testing:
depthTest: false
The problem then is that if you add additional objects to the scene, depth testing will fail and the point cloud will be rendered in front of the other objects, ignoring the actual order. To get around that you can additionally disable depth writes:
depthWrite: false
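Put together, a sketch of the material this answer describes (PointsMaterial is the current name for the old particle materials; circleSprite stands in for whatever point texture you use):
var material = new THREE.PointsMaterial( {
    map: circleSprite,   // illustrative: your point sprite texture
    transparent: true,
    depthTest: false,    // fixes the overlapping sprite corners
    depthWrite: false    // so the cloud doesn't write depth that would clip other objects
} );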
