ThreeJS static text animations - javascript

I was hoping that Three.js has some way of animating text other than rendering it on top of a plane. I would prefer to animate the text as 2D, just floating above a model. I tried using divs outside of the canvas, but I had trouble keeping them pointed at the correct place while staying responsive.
I want it to look something like this, while animating the line underneath the text.

What you can do is create a 2D canvas and use it as a texture, something like this:
var canvas1 = document.createElement('canvas');
var context1 = canvas1.getContext('2d');
context1.font = "Bold 40px Arial";
context1.fillStyle = "rgba(255,0,0,0.95)";
context1.fillText('Hello, world!', 0, 50);
// canvas contents will be used for a texture
var texture1 = new THREE.Texture(canvas1);
texture1.needsUpdate = true;
var material1 = new THREE.MeshBasicMaterial( { map: texture1, side: THREE.DoubleSide } );
material1.transparent = true;
var mesh1 = new THREE.Mesh(
    new THREE.PlaneGeometry(canvas1.width, canvas1.height),
    material1
);
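Since the goal is to animate the text (and the line underneath it), the usual pattern is to redraw the 2D canvas every frame and set texture1.needsUpdate so three.js re-uploads it. A minimal sketch, assuming a renderer, scene and camera already exist and mesh1 has been added to the scene:
var lineProgress = 0;
function animate() {
    requestAnimationFrame(animate);
    // redraw the 2D canvas from scratch
    context1.clearRect(0, 0, canvas1.width, canvas1.height);
    context1.font = "Bold 40px Arial";
    context1.fillStyle = "rgba(255,0,0,0.95)";
    context1.fillText('Hello, world!', 0, 50);
    // example animation: a line growing underneath the text
    lineProgress = (lineProgress + 2) % canvas1.width;
    context1.fillRect(0, 60, lineProgress, 4);
    texture1.needsUpdate = true; // re-upload the canvas to the GPU
    renderer.render(scene, camera);
}
animate();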

Related

Three.js - PlaneGeometry from Math.Plane

I am trying to draw a least squares plane through a set of points in Three.js. I have a plane defined as follows:
var plane = new THREE.Plane();
plane.setFromNormalAndCoplanarPoint(normal, point).normalize();
My understanding is that I need to take that plane and use it to come up with a Geometry in order to create a mesh to add to the scene for display:
var dispPlane = new THREE.Mesh(planeGeometry, planeMaterial);
scene.add(dispPlane);
I've been trying to apply this answer to get the geometry. This is what I came up with:
plane.setFromNormalAndCoplanarPoint(dir, centroid).normalize();
planeGeometry.vertices.push(plane.normal);
planeGeometry.vertices.push(plane.orthoPoint(plane.normal));
planeGeometry.vertices.push(plane.orthoPoint(planeGeometry.vertices[1]));
planeGeometry.faces.push(new THREE.Face3(0, 1, 2));
planeGeometry.computeFaceNormals();
planeGeometry.computeVertexNormals();
But the plane is not displayed at all, and there are no errors to indicate where I may have gone wrong.
So my question is, how can I take my Math.Plane object and use that as a geometry for a mesh?
This approach should create a mesh visualization of the plane. I'm not sure how applicable it is to the least-squares fitting, however.
// Create plane
var dir = new THREE.Vector3(0,1,0);
var centroid = new THREE.Vector3(0,200,0);
var plane = new THREE.Plane();
plane.setFromNormalAndCoplanarPoint(dir, centroid).normalize();
// Create a basic rectangle geometry
var planeGeometry = new THREE.PlaneGeometry(100, 100);
// Align the geometry to the plane
var coplanarPoint = plane.coplanarPoint();
var focalPoint = new THREE.Vector3().copy(coplanarPoint).add(plane.normal);
planeGeometry.lookAt(focalPoint);
planeGeometry.translate(coplanarPoint.x, coplanarPoint.y, coplanarPoint.z);
// Create mesh with the geometry
var planeMaterial = new THREE.MeshLambertMaterial({color: 0xffff00, side: THREE.DoubleSide});
var dispPlane = new THREE.Mesh(planeGeometry, planeMaterial);
scene.add(dispPlane);
Alternatively, align the mesh itself instead of the geometry:
var material = ...;
var plane = new THREE.Plane(...);
// Align the mesh to the plane
var geometry = new THREE.PlaneGeometry(100, 100);
var mesh = new THREE.Mesh(geometry, material);
mesh.position.copy(plane.coplanarPoint(new THREE.Vector3()));
mesh.quaternion.setFromUnitVectors(new THREE.Vector3(0, 0, 1), plane.normal);
Note that Plane.coplanarPoint() simply returns -normal*constant, so it might be a better option to use Plane.projectPoint() to determine a center that is "close to" an arbitrary point.
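For example (a sketch, assuming centroid is the mean of the fitted points and material is defined as above; recent three.js versions require the target argument):
// project the data centroid onto the plane and use that as the display position
var center = plane.projectPoint(centroid, new THREE.Vector3());
var geometry = new THREE.PlaneGeometry(100, 100);
var mesh = new THREE.Mesh(geometry, material);
mesh.position.copy(center);
mesh.quaternion.setFromUnitVectors(new THREE.Vector3(0, 0, 1), plane.normal);
scene.add(mesh);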

How to insert a canvas using a Mesh inside a WebGL render in Three.js

I'm experimenting with Three.js after reading an article about augmented reality using a webcam.
As an experiment, I'm trying to put a simple canvas into a scene where I previously set up a WebGLRenderer, by inserting a webcam stream.
I read in this article (https://hacks.mozilla.org/2013/10/an-ar-game-technical-overview/) that you need to create a virtual scene, rendered by the WebGLRenderer object, on top of a reality scene (the webcam). In my file testingthreejs.js, lines 30 to 41 are where I create the new (virtual) scene and where I create the canvas and insert it into my mesh object.
But nothing happens. How can I insert the canvas as a Mesh object?
I'm trying to avoid the canvas renderer, since rendering with canvas is supposedly slower than rendering with WebGL (I'm also thinking of using the ray class).
This is my gist: https://gist.github.com/fernandosg/75ec701c0295761a77e6; it contains the files testingthreejs.js and index.html.
Thanks for your help.
You can take a look at Stemkoski's website, which has a lot of examples with very nice explanations.
I think this is what you are looking for; take a look at the script.
More precisely, here is the commented code:
/////// draw text on canvas /////////
// create a canvas element
var canvas1 = document.createElement('canvas');
var context1 = canvas1.getContext('2d');
context1.font = "Bold 40px Arial";
context1.fillStyle = "rgba(255,0,0,0.95)";
context1.fillText('Hello, world!', 0, 50);
// canvas contents will be used for a texture
var texture1 = new THREE.Texture(canvas1);
texture1.needsUpdate = true;
var material1 = new THREE.MeshBasicMaterial( { map: texture1, side: THREE.DoubleSide } );
material1.transparent = true;
var mesh1 = new THREE.Mesh(
    new THREE.PlaneGeometry(canvas1.width, canvas1.height),
    material1
);
mesh1.position.set(0,50,0);
scene.add( mesh1 );
/////// draw image on canvas /////////
// create a canvas element
var canvas2 = document.createElement('canvas');
var context2 = canvas2.getContext('2d');
// canvas contents will be used for a texture
var texture2 = new THREE.Texture(canvas2);
// load an image
var imageObj = new Image();
imageObj.src = "images/Dice-Blue-1.png";
// after the image is loaded, this function executes
imageObj.onload = function()
{
    context2.drawImage(imageObj, 0, 0);
    if ( texture2 ) // checks if texture exists
        texture2.needsUpdate = true;
};
var material2 = new THREE.MeshBasicMaterial( {map: texture2, side:THREE.DoubleSide} );
material2.transparent = true;
var mesh2 = new THREE.Mesh(
    new THREE.PlaneGeometry(canvas2.width, canvas2.height),
    material2
);
mesh2.position.set(0,50,-50);
scene.add( mesh2 );
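Not part of the original answer, but possibly relevant to the webcam case: if the goal is simply to get the camera stream onto a plane, three.js (r62 and later) also provides THREE.VideoTexture, which pulls new frames from a <video> element automatically, so no per-frame drawImage or needsUpdate is required. A minimal sketch, assuming video is a <video> element that is already playing the webcam stream and scene is the scene being rendered:
var videoTexture = new THREE.VideoTexture(video);
videoTexture.minFilter = THREE.LinearFilter; // video frames are usually not power-of-two
var videoMaterial = new THREE.MeshBasicMaterial({ map: videoTexture });
var videoMesh = new THREE.Mesh(new THREE.PlaneGeometry(4, 3), videoMaterial);
scene.add(videoMesh);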
I think I am getting closer to the solution. I can finally put the canvas in (using a MeshBasicMaterial and a PlaneGeometry), but now the canvas is drawn while the video is not rendered:
problem canvas
The code:
var canvas=document.createElement("canvas");
canvas.width=512;
canvas.height=512;
var video=document.createElement("video");
streamVideo(video);
var canvasContext=canvas.getContext("2d");
var glCanvas=document.createElement("canvas");
var renderer_webgl=new THREE.WebGLRenderer({canvas:glCanvas});
renderer_webgl.setSize(512,512);
renderer_webgl.autoClear=false;
document.getElementById("container").appendChild(glCanvas);
var camera = new THREE.Camera();
var scene=new THREE.Scene();
var geometry=new THREE.PlaneGeometry(2,2,0);
var texture=new THREE.Texture(canvas);
var material=new THREE.MeshBasicMaterial({
map:texture,
depthTest:false,
depthWrite:false
});
var mesh=new THREE.Mesh(geometry,material);
canvas_draw=document.createElement("canvas");
canvas_draw.width=128;
canvas_draw.height=128;
ctx=canvas_draw.getContext("2d");
ctx.fillStyle = "#AAA";
ctx.fillRect(1,1,20,25);
// HERE IS WHEN INSERT THE CANVAS
var texture_canvas=new THREE.Texture(canvas_draw);
texture_canvas.needsUpdate=true;
var basic_material_canvas=new THREE.MeshBasicMaterial({
map:texture_canvas
});
var mesh_canvas=new THREE.Mesh(new THREE.PlaneGeometry(2,2,0),basic_material_canvas);
scene.add(mesh);
scene.add(mesh_canvas);
function renderScenes(){
    renderer_webgl.render(scene, camera);
}
function loop(){
    if(video.readyState === video.HAVE_ENOUGH_DATA){
        canvas.changed = true;
        canvasContext.drawImage(video, 0, 0);
        verifyDetector();
        texture.needsUpdate = true;
        texture_canvas.needsUpdate = true;
        renderScenes();
    }
    requestAnimationFrame(loop);
}
loop();
function verifyDetector(){
    detector = new AR.Detector();
    imageData = canvasContext.getImageData(0, 0, canvas.width, canvas.height);
    var markers = detector.detect(imageData);
    /*if(markers.length > 0)
        canvasContext.drawImage(canvas_draw, markers[0].corners[0].x, markers[0].corners[0].y);
    */
}
Why does this happen?

three.js edges and text jagged/blurry on rotation

I'm a three.js beginner, trying to apply an HTML canvas as a texture to a plane geometry. It works fine when there is no rotation. When rotated, a) the outside edges of the mesh become jagged (edges/lines within the image are fine) and b) the text becomes blurry.
What do I have to do to keep outside edges and text crisp?
My texture is power-of-two (128 x 512). Turning antialiasing on doesn't help.
Here is a screenshot without rotation
And here with rotation.
Code looks like this:
var eltTexture = toTexture2(d, this, go.w, go.Res.ow[0]);
// map texture to material
var material = new THREE.MeshBasicMaterial( { map: eltTexture } );
// define sub-image of texture to be applied to mesh
var cutout = [
    new THREE.Vector2(0, (128 - go.Res.ow[0])/128),
    new THREE.Vector2(go.w/512, (128 - go.Res.ow[0])/128),
    new THREE.Vector2(go.w/512, 1),
    new THREE.Vector2(0, 1)];
// geometry and UV mapping
var geometry = new THREE.PlaneGeometry (go.w, go.Res.ow[0]);
geometry.faceVertexUvs[0] = []; // initialize
geometry.faceVertexUvs[0][0] = [cutout[3], cutout[0], cutout[2]];
geometry.faceVertexUvs[0][1] = [cutout[0], cutout[1], cutout[2]];
var mesh = new THREE.Mesh( geometry, material );
mesh.position.set(d.threeX, d.threeY, d.threeX);
mesh.rotation.set(0, 0.6, 0);
scene.add( mesh );
});
renderer.render( scene, camera );
function toTexture2 (d, svgNode, svgWidth, svgHeight){
    // step 1: serialize eltSVG to xml
    var eltXML = new XMLSerializer().serializeToString(svgNode);
    eltXML = 'data:image/svg+xml;charset = utf8,' + eltXML;
    // step 2: draw eltXML to image
    var eltImage = document.body.appendChild(
        document.createElement("img"));
    eltImage.id = "eltImage";
    eltImage.style.display = "none";
    eltImage.width = svgWidth;
    eltImage.height = svgHeight;
    eltImage.src = eltXML;
    // step 3: draw image to canvas
    // NOTE: define canvas parameters position, width and
    // height in html, NOT IN CSS, otherwise image
    // will become blurry - don't ask why!
    var eltCanvas = document.body.appendChild(
        document.createElement("canvas"));
    eltCanvas.id = "eltCanv";
    eltCanvas.style.display = "none";
    eltCanvas.width = 512;
    eltCanvas.height = 128;
    // get context
    var ctx = eltCanvas.getContext("2d", {alpha: false});
    // draw svg element to canvas, not including portrait image
    ctx.drawImage(eltImage, parseInt(0), parseInt(0),
        svgWidth, svgHeight);
    // draw portrait image to canvas
    var portrait = document.getElementById(d.nameConcat + "Img");
    ctx.globalAlpha = 0.6; // opacity of portrait image
    ctx.drawImage(portrait, go.Res.strw[0], go.Res.strw[0],
        go.Res.iw[0], go.Res.iw[0]);
    var texture = new THREE.Texture(eltCanvas);
    texture.needsUpdate = true;
    return texture;
} // function toTexture2
Many thanks in advance for your help!
Even though it's been a long time since you've posted this, I have the solution in case anyone else is experiencing the same issue.
Add this to your renderer:
const renderer = new THREE.WebGLRenderer({antialias: true});
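If antialiasing alone is not enough (the question notes it didn't help), a common additional remedy, offered here as a suggestion rather than something from the original answer, is to raise the texture's anisotropy so it stays sharp when viewed at an angle:
// assumption: `texture` is the canvas texture returned by toTexture2()
texture.anisotropy = renderer.capabilities.getMaxAnisotropy();
texture.needsUpdate = true;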

Three.js - Things disappear when zooming out

In my three.js project I use a high z position for my camera.
When the z position is too high my scene becomes black.
So, when I zoom out it becomes black. But I don't want that to happen.
This is how it is with camera.position.z = 3000;
And when I zoom out, just one zoom, it is like this:
For the controls I use OrbitControls. My camera is set up like this:
var camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 1, 3000);
camera.position.z = 3000;
And here is the code for the planet and some of the planets' orbits:
var scene = new THREE.Scene();
var material = new THREE.MeshLambertMaterial({
    map: THREE.ImageUtils.loadTexture("assets/img/sun.jpg")
});
var sun = new THREE.Mesh(new THREE.SphereGeometry(200, 50, 50), material);
scene.add(sun);
var orbitLine = function(radius, y)
{
    var segments = 64,
        line_material = new THREE.LineBasicMaterial( { color: 0xffffff } ),
        geometry = new THREE.CircleGeometry( radius, segments );
    geometry.vertices.shift();
    var orbit = new THREE.Line( geometry, line_material );
    if(y)
        orbit.position.y = y;
    else if(!y)
        orbit.position.y = 0;
    scene.add(orbit);
};
var Mercury_orbit = orbitLine(400,-70);
var Venus_orbit = orbitLine(700,70);
var Earth_orbit = orbitLine(900,70);
var Mars_orbit = orbitLine(1250,70);
var Jupiter_orbit = orbitLine(3000,70);
I couldn't provide a fiddle because for some reason it didn't work.
If you need more code, tell me in the comments and I will add it.
Any ideas?
Thanks.
Your camera's far plane is at 3000, which means everything that is more than 3000 units away from the camera will be clipped and not drawn.
At the same time you have placed your camera at (0, 0, 3000), so you are right at the point where things start to disappear.
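Based on that diagnosis, the fix is to push the far plane out and keep the camera comfortably inside the near/far range. A minimal sketch, where 10000 is just an arbitrary example value:
// everything closer than 1 or farther than 10000 units from the camera gets clipped
var camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 1, 10000);
camera.position.z = 3000;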

Threejs, making glass shatter effect

I have an idea I want to build, but since my knowledge of Three.js and 3D programming in general is limited, I am stuck...
Idea: the user is doing some things, and at some point the whole front screen shatters and reveals something different behind it.
My initial idea was: take a screenshot of what is happening right now, put that image in front of everything (I'm still having difficulties making that plane fill 100% of the view; the point is that when it appears, the user cannot tell the difference between the old 3D rendering and the new 2D picture), and then shatter it. So, by looking around the Web at different examples, I made some... thing: I created a screenshot, made a plane, and applied the screenshot as a texture to it. To create the shattering effect, I used TessellateModifier to subdivide the plane and ExplodeModifier to turn each face into a separate face.
Here is the code I have so far.
function drawSS()
{
    var img = new Image;
    img.onload = function() { // When screenshot is ready
        var canvas = document.createElement('canvas');
        canvas.width = window.innerWidth;
        canvas.height = window.innerHeight;
        var context = canvas.getContext('2d');
        // CANVAS DRAWINGS
        // Draws screenshot
        // END
        var texture = new THREE.Texture(canvas);
        texture.needsUpdate = true;
        var multiMaterial = [
            new THREE.MeshBasicMaterial({map: texture, side: THREE.FrontSide}), // canvas drawings
            new THREE.MeshBasicMaterial( { color: 0xffffff, wireframe: true, transparent: true}) // for displaying wireframe
        ];
        var geometry = new THREE.PlaneGeometry(canvas.width, canvas.height); // create plane
        for (var i = 0, len = geometry.faces.length; i < len; i++) { // Snippet from some stackoverflow post, that works.
            var face = geometry.faces[i].clone();
            face.materialIndex = 1;
            geometry.faces.push(face);
            geometry.faceVertexUvs[0].push(geometry.faceVertexUvs[0][i].slice(0));
        }
        geometry.dynamic = true;
        THREE.GeometryUtils.center( geometry ); // ?
        var tessellateModifier = new THREE.TessellateModifier( 10 );
        for ( var i = 0; i < 5; i ++ )
            tessellateModifier.modify( geometry );
        new THREE.ExplodeModifier().modify( geometry );
        geometry.vertices[0].x -= 300;
        geometry.vertices[1].x -= 300;
        geometry.vertices[2].x -= 300;
        var mesh = new THREE.Mesh( geometry, new THREE.MeshFaceMaterial(multiMaterial) );
        scene.add( mesh );
    };
    // THEN, set the src
    img.src = THREEx.Screenshot.toDataURL(renderer);
}
For now I have moved one face by changing the coordinates of its three vertices. I'm asking whether this approach is the way to go. The result looks like this (white lines: the wireframe; black lines: my desired wireframe drawn on the canvas, a problem for later). Note that when I move a face this way the texture moves along with it; if I instead created new triangles from these vertices, I don't know how I would set the texture on them.
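To illustrate the mechanic described above (the UVs stay attached to the vertices, so the texture travels with each moved face), here is a hedged sketch of a per-frame shatter step, under the assumption that geometry is the tessellated and exploded PlaneGeometry from the code, so every face owns its own three vertices:
function shatterStep(geometry, speed) {
    for (var i = 0; i < geometry.faces.length; i++) {
        var face = geometry.faces[i];
        // push each piece away from the (centred) plane's middle, plus a little along its normal
        var centroid = new THREE.Vector3()
            .add(geometry.vertices[face.a])
            .add(geometry.vertices[face.b])
            .add(geometry.vertices[face.c])
            .divideScalar(3);
        var offset = centroid.add(face.normal).normalize().multiplyScalar(speed);
        geometry.vertices[face.a].add(offset);
        geometry.vertices[face.b].add(offset);
        geometry.vertices[face.c].add(offset);
    }
    geometry.verticesNeedUpdate = true;
}
// call shatterStep(mesh.geometry, 2) from the render loop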
