I am using the D3-threeD2.js library to translate SVG files into THREE.Shape objects that I can then extrude with three.js. It works fine except for holes.
Let's say I have a donut shape: a disc with a hole inside. The library gives me one THREE.Shape that represents the disc and one THREE.Shape that represents the hole.
I know I can punch a hole in the disc if I have a THREE.Path, but I don't - I have a THREE.Shape.
So is there a way to get a THREE.Path from a THREE.Shape? Or alternatively is there a way to punch a hole in a THREE.Shape with another THREE.Shape?
I am posting here again because I recently realized that the answer to this question is almost too simple to be true and worth mentioning separately.
THREE.Shape extends THREE.Path, so you can simply use a THREE.Shape to punch a hole in another shape directly by adding it as a hole.
I tested it in two different ways. With shapes directly:
var shape = new THREE.Shape();
//...define your shape
var hole = new THREE.Shape();
//...define your hole
shape.holes.push( hole );
var geometry = shape.extrude( extrusionSettings ); // returns a THREE.ExtrudeGeometry
It also works if you create the shapes from a path with the toShapes method:
var path = new THREE.Path();
//...define your path
var shape = path.toShapes()[0];
var hole = new THREE.Shape();
//...define your hole
var holes = hole.toShapes();
shape.holes = holes;
var geometry = shape.extrude( extrusionSettings ); // returns a THREE.ExtrudeGeometry
See a fiddle here which demonstrates that both solutions work...
There is a ThreeCSG library on GitHub that you can use for performing boolean operations on Three.js meshes. That could be something for you.
The library is a bit outdated, but there are many forks that are compatible with more recent Three.js versions.
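As a rough sketch of that approach, assuming the commonly used ThreeBSP build of ThreeCSG (constructor, subtract, toMesh), punching a hole by boolean subtraction would look something like this; the cylinder dimensions are placeholder values:
var discMesh = new THREE.Mesh( discGeometry, material );   // your extruded disc
var holeMesh = new THREE.Mesh(
    new THREE.CylinderGeometry( 2, 2, 10, 32 ),            // tall enough to cut all the way through
    material
);
var discBSP = new ThreeBSP( discMesh );
var holeBSP = new ThreeBSP( holeMesh );
var result = discBSP.subtract( holeBSP );                  // boolean difference
var resultMesh = result.toMesh( material );
resultMesh.geometry.computeVertexNormals();
scene.add( resultMesh );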
EDIT:
I don't think it should be hard to convert your shape to a path using the points array.
var shape = /* ...your shape... */;
var points = shape.extractAllPoints().shape; // extractAllPoints() returns { shape: [...], holes: [...] }
var path = new THREE.Path( points );
Not tested but I think it should work.
I am trying to take any three.js geometry and subdivide its existing faces into smaller faces. This would essentially give the geometry a higher "resolution". There is a subdivision modifier tool in the examples of three.js that works great for what I'm trying to do, but it ends up changing and morphing the original shape of the geometry. I'd like to retain the original shape.
View the Subdivision Modifier Example
Example of how the current subdivision modifier behaves:
Rough example of how I'd like it to behave:
The subdivision modifier is applied like this:
let originalGeometry = new THREE.BoxGeometry(1, 1, 1);
let subdivisionModifier = new THREE.SubdivisionModifier(3);
let subdividedGeometry = originalGeometry.clone();
subdivisionModifier.modify(subdividedGeometry);
I attempted to dig around the source of the subdivision modifier, but I wasn't sure how to modify it to get the desired result.
Note: The subdivision should be able to be applied to any geometry. My example of the desired result might make it seem that a three.js PlaneGeometry with increased segments would work, but I need this to be applied to a variety of geometries.
Based on the suggestions in the comments by TheJim01, I was able to dig through the original source and modify the vertex weight, edge weight, and beta values to retain the original shape. My modifications should remove any averaging and put all the weight toward the source shape.
There were three sections that had to be modified, so I went ahead and made it a constructor option called retainShape, which defaults to false.
I made a gist with the modified code for SubdivisionGeometry.js.
View the modified SubdivisionGeometry.js Gist
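For orientation, here is a rough sketch of the kind of change involved (this is not the gist itself; the weight variable names are taken from the stock Loop SubdivisionModifier in the three.js examples and may differ in your version):
// Hedged sketch: with retainShape enabled, the smoothing weights collapse to
// pass-through values, so subdivision only splits faces without moving points.
if ( retainShape ) {
    edgeVertexWeight = 0.5;     // new edge point = plain midpoint of its two endpoints
    adjacentVertexWeight = 0;   // ignore the opposite vertices of the adjacent faces
    beta = 0;                   // no pull toward connected vertices
    sourceVertexWeight = 1;     // original vertices keep their exact position
    connectingVertexWeight = 0;
}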
Below is an example of a cube being subdivided with the option turned off, and turned on.
Left: new THREE.SubdivisionModifier(2, false);
Right: new THREE.SubdivisionModifier(2, true);
If anyone runs into any issues with this or has any questions, let me know!
The current version of three.js has optional parameters for PlaneGeometry that specify the number of segments for the width and height; both default to 1. In the example below I set both widthSegments and heightSegments to 128. This has a similar effect to using SubdivisionModifier, except that SubdivisionModifier distorts the shape while specifying the segments does not, which works better for me.
var widthSegments = 128;
var heightSegments = 128;
var geometry = new THREE.PlaneGeometry(10, 10, widthSegments, heightSegments);
// var geometry = new THREE.PlaneGeometry(10, 10); // segments default to 1
// var modifier = new THREE.SubdivisionModifier( 7 );
// geometry = modifier.modify(geometry);
https://threejs.org/docs/#api/en/geometries/PlaneGeometry
I'm new to three.js, so I'm asking for advice.
I use a CubeTexture as the envMap for my materials to make my objects look like steel.
loader = new THREE.CubeTextureLoader();
this.cubeTexture = loader.load([
    posXUrl, negXUrl,
    posYUrl, negYUrl,
    posZUrl, negZUrl
]);
...
mesh.material.envMap = this.cubeTexture;
Everything is OK with it, but I want to make one enhancement to my scene.
The thing is that the floor (negYUrl) of the CubeTexture is static, but my scene assumes that the floor is rotated. Unfortunately, I didn't find any API that allows rotating the bottom side of a CubeTexture instance.
Could you help me and point me to techniques that allow doing such things?
I originally made this post asking for your opinion about which JS library is better, or could do the work I have shown. Since I'm not allowed to do that here, I did some research and tried out EaselJS. So my question has now changed.
I have this piece of code:
function handleImageLoad(event) {
    var img = event.target;
    bmp = new createjs.Bitmap(img);

    /* Matrix2D transformation */
    var a = 0.880114;
    var b = 0.0679298;
    var c = -0.053145;
    var d = 0.954348;
    var tx = 37.4898;
    var ty = -16.5202;
    var matrix = new createjs.Matrix2D(a, b, c, d, tx, ty);

    var polygon = new createjs.Shape();
    polygon.graphics.beginStroke("blue");
    polygon.graphics.beginBitmapFill(img, "no-repeat", matrix)
        .moveTo(37.49, -16.52)
        .lineTo(336.27, -36.20)
        .lineTo(350.96, 171.30)
        .lineTo(50.73, 169.54)
        .lineTo(37.49, -16.52);

    stage.addChild(polygon);
    stage.update();
}
where the variables a, b, c, d, tx, and ty are values from a homography matrix:
 0.880114    0.067979298    37.4898
-0.053145    0.954348      -16.5202
-0.000344    1.0525e-006     1
As you can see in the attached files, I draw the deformed rectangle well, but the image still doesn't wrap to the created shape. Does anyone know how I can do it? Is there a better way to do this? Am I doing something wrong?
Thanks for your time.
Edit: To be more specific, I have added another image to show what I want.
You are attempting to do something similar to a perspective transform, using a 3x3 matrix.
Canvas's 2D context, and by extension EaselJS, only supports affine transformations with a 2x3 matrix: transformations where opposite edges of the bounding rectangle remain parallel, such as scaling, rotation, skewing, and translation.
http://en.wikipedia.org/wiki/Affine_transformation
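To make the limitation concrete, here is a minimal sketch using the raw canvas API (the six values are copied from your code; the canvas id is an assumption). setTransform, like Matrix2D, only has slots for the six affine terms, so the projective bottom row of your homography (-0.000344, 1.0525e-006, 1) simply has nowhere to go:
// The 2D canvas transform is always of the form
//   [ a  c  tx ]
//   [ b  d  ty ]
//   [ 0  0  1  ]   <- the bottom row is fixed
var ctx = document.getElementById("canvas").getContext("2d"); // assumes a <canvas id="canvas">
ctx.setTransform(0.880114, 0.0679298, -0.053145, 0.954348, 37.4898, -16.5202);
ctx.drawImage(img, 0, 0); // img: the image loaded in handleImageLoad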
You might be able to fake this with multiple objects that have been skewed (this was used extensively in Flash to fake perspective transforms), or you may have to look into another solution.
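For illustration only, here is a rough sketch of that slicing trick in EaselJS; the strip count and the 40% shrink toward the right edge are arbitrary assumptions, not values derived from your homography:
// Hedged sketch: fake a perspective-like taper by cutting the image into
// vertical strips and shrinking each strip a little more toward the right.
function fakePerspective( img, stage, strips ) {
    var stripWidth = img.width / strips;
    for ( var i = 0; i < strips; i++ ) {
        var t = i / ( strips - 1 );                      // 0 at the left edge, 1 at the right
        var bmp = new createjs.Bitmap( img );
        bmp.sourceRect = new createjs.Rectangle( i * stripWidth, 0, stripWidth, img.height );
        var scale = 1 - 0.4 * t;                         // right side ends up 40% shorter
        bmp.x = i * stripWidth;
        bmp.y = ( img.height - img.height * scale ) / 2; // keep each strip vertically centred
        bmp.scaleY = scale;
        stage.addChild( bmp );
    }
    stage.update();
}
// e.g. call fakePerspective( event.target, stage, 50 ); from inside handleImageLoad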
Is it possible to load an OBJ file under ThreeJS keeping the quadrilateral faces? Here is an example:
http://www.professores.im-uff.mat.br/hjbortol/disciplinas/2014.2/hwc00001/test/threejs/viewer-04/viewer-04-b.html
Note that each quadrilateral face is rendered as two triangles in wireframe. I would like to keep the original quadrilateral faces, as shown here (in Java):
http://www.uff.br/cdme/triplets/triplets-html/triplets-en.html
And what about general n-sided polygon faces in OBJ files? Is it possible to keep them?
Thanks, Humberto.
Unfortunately everything gets translated to triangles. However, you may be able to achieve the results you are after with this code:
var edges = new THREE.EdgesHelper( mesh );
scene.add( edges );
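For example, a rough sketch of combining this with OBJLoader (the file name and edge color are placeholder assumptions); EdgesHelper only draws edges whose adjacent faces meet above a threshold angle, so the triangulation diagonals inside flat quads disappear from the wireframe:
var loader = new THREE.OBJLoader();
loader.load( 'model.obj', function ( object ) {          // placeholder file name
    object.traverse( function ( child ) {
        if ( child instanceof THREE.Mesh ) {
            scene.add( child );
            // draw only the "hard" edges, hiding coplanar triangulation diagonals
            var edges = new THREE.EdgesHelper( child, 0xffffff );
            scene.add( edges );
        }
    } );
} );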
I am working with the planetary textures from this site. They are all in rectangular form.
However, in my BabylonJS application, textures are expected to be like this.
I have tried setting the coordinates mode, but it doesn't seem to do anything.
// These didn't have an effect
material.diffuseTexture.coordinatesMode = BABYLON.Texture.SPHERICAL_MODE;
material.diffuseTexture.coordinatesMode = BABYLON.Texture.EXPLICIT_MODE;
material.diffuseTexture.coordinatesMode = BABYLON.Texture.PLANAR_MODE;
material.diffuseTexture.coordinatesMode = BABYLON.Texture.CUBIC_MODE;
material.diffuseTexture.coordinatesMode = BABYLON.Texture.PROJECTION_MODE;
material.diffuseTexture.coordinatesMode = BABYLON.Texture.SKYBOX_MODE;
Is there a way to convert between these two kinds of textures? Alternatively, are there planet textures like the bottom one?
In fact, this is related to the texture coordinates embedded in your mesh. You could use Blender to export different coordinates, or you can also play with texture.uOffset, texture.vOffset, texture.uScale, and texture.vScale to move your texture on your mesh.
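For instance, here is a minimal sketch of playing with those properties (the texture path and mesh variable are placeholders, and the specific offset and scale values are just guesses to illustrate the idea):
var material = new BABYLON.StandardMaterial("planetMaterial", scene);
material.diffuseTexture = new BABYLON.Texture("textures/planet.jpg", scene); // placeholder path
material.diffuseTexture.uScale = 1;     // horizontal tiling
material.diffuseTexture.vScale = -1;    // a negative value flips the map vertically
material.diffuseTexture.uOffset = 0.25; // shift the map a quarter turn around the sphere
material.diffuseTexture.vOffset = 0;
sphere.material = material;             // "sphere" being your planet mesh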