Manage click on a mesh - javascript

I am trying to handle a click on a mesh (a cube) in order to do some processing:
var cubeFor3D = new THREE.Mesh(new THREE.CubeGeometry(40,80,40),img3D);
scene.add(cubeFor3D);
//
renderer.render(scene,camera);
//
animate();
//Track the click on cube3d
cubeFor3D.on('click', function(){
    // response to click...
    console.log('you have clicked on cube 2D');
});
When running, I get this error in the web console:
TypeError: cubeFor3D.on is not a function
The API documentation shows usage like this:
mesh.on('click',..
I guess I should replace mesh with the actual name of my mesh, but it seems I am going wrong somewhere. Help, please.
I am including the API JS in my file: <script src='threex.domevent.js'></script>

Late answer, but if you add this at the beginning of the init function it should work:
THREE.Object3D._threexDomEvent.camera(camera);
In fact, I normally put it straight after scene = new THREE.Scene();
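For reference, a minimal sketch of the full setup order, assuming the threex.domevent.js build referenced in the question and the asker's img3D material; the other variable names are illustrative:

// threex.domevent.js must be loaded after three.js
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 1, 1000);
camera.position.z = 200;

// Tell the extension which camera to raycast from (this is the missing step)
THREE.Object3D._threexDomEvent.camera(camera);

var cubeFor3D = new THREE.Mesh(new THREE.CubeGeometry(40, 80, 40), img3D);
scene.add(cubeFor3D);

// With the camera registered, mesh.on() no longer throws
cubeFor3D.on('click', function () {
    console.log('you have clicked on cubeFor3D');
});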

Texture not displayed on object Forge Three.js

I am trying to display a textured plane with Three.js. I'm working with Forge RCDB.
At first I managed to display the plane, but instead of being textured it was completely black. I then made some changes and now nothing is displayed at all.
Here is my code:
render () {
    var viewer = NOP_VIEWER;
    var scene = viewer.impl.scene;
    var camera = viewer.autocamCamera;
    var renderer = viewer.impl.renderer();
    renderer.render(scene, camera);
}
and in the function that is supposed to display the textured plane:
new THREE.TextureLoader(texture).load(texture, this.render);
tex.wrapS = THREE.RepeatWrapping //ClampToEdgeWrapping //MirroredRepeatWrapping
tex.wrapT = THREE.RepeatWrapping //ClampToEdgeWrapping //MirroredRepeatWrapping
tex.mapping = THREE.UVMapping
At the beginning I used loadTexture(). I managed to display my plane, but it was all black and no texture was applied to it.
Then I switched to THREE.TextureLoader().load(); in this case I believe it is trying to find the image on localhost. The image is downloaded, I can see it in the console.
But now I get these errors:
Uncaught TypeError: scope.manager.itemStart is not a function
and:
Uncaught TypeError: renderer.render is not a function
Now the object is not displayed, even in black.
So I think this may be linked to render, but I don't understand how...
I found this, and it partially answers my question.
In the end, I decided to keep THREE.ImageUtils.loadTexture() and replaced MeshLambertMaterial with MeshBasicMaterial.
No explicit render call was needed.
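For anyone else hitting this, a rough sketch of that workaround; the texture path, plane size and the invalidate call are assumptions, not the exact Forge RCDB code:

// Legacy loader (deprecated in recent three.js, but matches the viewer's older build)
var tex = THREE.ImageUtils.loadTexture('textures/myTexture.jpg'); // hypothetical path
tex.wrapS = THREE.RepeatWrapping;
tex.wrapT = THREE.RepeatWrapping;

// MeshBasicMaterial is unlit, so the plane no longer renders black
// when no scene lights reach it
var material = new THREE.MeshBasicMaterial({ map: tex, side: THREE.DoubleSide });
var plane = new THREE.Mesh(new THREE.PlaneGeometry(100, 100), material);

viewer.impl.scene.add(plane);
viewer.impl.invalidate(true); // ask the viewer to refresh; may be unnecessary, as noted above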

Babylon.js OnIntersectionEnterTrigger not triggering with camera

I'm using Babylon.js 2.4.0.
I have a mesh (in the shape of a couch) loaded from a .obj file, and a camera set up like this:
let camera = new BABYLON.FreeCamera('camera1', new BABYLON.Vector3(0, 2, 0), scene);
camera.checkCollisions = true;
camera.applyGravity = true;
camera.ellipsoid = new BABYLON.Vector3(1, 1, 1);
camera.attachControl(canvas, false);
camera.speed = 0.5;
camera.actionManager = new BABYLON.ActionManager(scene);
I want to set up an event so that when I walk through the couch, "intersection" is logged to the console:
let action = new BABYLON.ExecuteCodeAction(
    { trigger: BABYLON.ActionManager.OnIntersectionEnterTrigger, parameter: { mesh: couchMesh } },
    (evt) => {
        console.log("intersection");
    }
);
this.camera.actionManager.registerAction(action);
When I walk through the mesh, nothing is logged to the console.
I've created an example on the Babylon.js Playground using an example that they provide to check that it wasn't a problem with my mesh or camera set up, and it doesn't appear to be (the playground doesn't work either).
A camera in Babylon.js has no action manager, so even if you set one it won't really work.
To get this to work using action managers, you could define an invisible box around the camera with a predefined size and attach the action manager to that mesh. Then set the mesh's parent to be the camera, and you are done. Here is your playground with those changes: http://www.babylonjs-playground.com/#KNXZF#3
Another solution is to use the internal collision system of Babylon.js and set the camera's onCollide function to actually do something :) Here is an example: http://www.babylonjs-playground.com/#KNXZF#4
Notice that in the second playground the camera won't go through the box, as the collision system prevents it from doing so. I am not sure about your use case, so it is hard to say which of the two will work better.
If you need a "gate" system (knowing when a player moved through a gate, for example), use the first method, sketched below. The second is much cleaner, but has its downsides.
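A rough sketch of the first approach; the box size and variable names are placeholder assumptions, and the linked playgrounds remain the authoritative versions:

// Invisible box that follows the camera and carries the action manager
var cameraBox = BABYLON.Mesh.CreateBox('cameraBox', 2, scene);
cameraBox.isVisible = false;
cameraBox.parent = camera; // the box now moves with the camera

cameraBox.actionManager = new BABYLON.ActionManager(scene);
cameraBox.actionManager.registerAction(new BABYLON.ExecuteCodeAction(
    { trigger: BABYLON.ActionManager.OnIntersectionEnterTrigger, parameter: { mesh: couchMesh } },
    function (evt) {
        console.log("intersection");
    }
));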

THREE.Audio setRefDistance is not a function

I'm using three.js r73 and got stuck on a simple 3D audio example:
// ...
var listener = new THREE.AudioListener();
camera.add( listener );
var sound1 = new THREE.Audio( listener );
sound1.load( 'sounds/song.ogg' );
sound1.setVolume(1);
sound1.setRefDistance(10);
sound1.autoplay = true;
mesh.add(sound1);
I get setRefDistance is not a function.
If I remove the line sound1.setRefDistance(10);, the sound plays but isn't "3D aware".
I don't know what is different from this simple example http://threejs.org/examples/misc_sound.html except that I'm in an Angular context + Node.js.
OK, my bad: I was in fact using r74dev, and I needed to use THREE.PositionalAudio instead of THREE.Audio.
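For anyone landing here, a minimal sketch of the positional variant; the file path comes from the question, and the load() call follows the r73/r74-era API (newer releases use THREE.AudioLoader and setBuffer instead):

// PositionalAudio, not THREE.Audio, is the class that exposes setRefDistance
var listener = new THREE.AudioListener();
camera.add(listener);

var sound1 = new THREE.PositionalAudio(listener);
sound1.load('sounds/song.ogg');
sound1.setRefDistance(10);
sound1.setVolume(1);
sound1.autoplay = true;

// Attaching the sound to a mesh makes its volume depend on the
// distance between that mesh and the listener on the camera
mesh.add(sound1);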

Cannot render Three js objects added to the scene within a jQuery $.get request function

I'm a three.js beginner. I'm attempting to add a sphere to the scene, using position coordinates returned from a JavaScript get request.
A sphere created before the get request is rendered properly, but a sphere created and added to the scene in the callback function is not rendered - although if I debug and inspect the scene, both spheres exist as its children.
My code:
var sphere = new THREE.Mesh(
    new THREE.SphereGeometry(radius),
    sphereMaterial);
sphere.position.set(-20, -20, -20);
scene.add(sphere); // this sphere shows

sphere2 = sphere.clone();
sphere2.position.set(50, 50, 50); // testing initializing outside

$.get("{% /graph %}", function(data, status){
    scene.add(sphere2); // this sphere does not show
});

renderer.render(scene, camera);
I have tried initializing the second sphere both inside and outside the callback, and I have tried creating a new sphere instead of cloning it. I'm not sure what else to try and I don't know what I'm missing.
The jQuery get request is an asynchronous request. The callback function will not be called until after the render call. This means that the object will get added to the scene, but never drawn.
You will need to call the render function inside the get handler.
$.get("{% /graph %}",function(data,status){
scene.add(sphere2);
renderer.render(scene, camera);
});
Alternatively, if you create a rendering loop for things like animation, this is unnecessary.
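A minimal sketch of that alternative: a continuous render loop redraws every frame, so anything added later (including inside async callbacks) shows up on the next pass.

// Render loop: no per-callback render calls needed
function animate() {
    requestAnimationFrame(animate);
    renderer.render(scene, camera);
}
animate();

$.get("{% /graph %}", function (data, status) {
    scene.add(sphere2); // drawn automatically on the next frame
});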

Uncaught TypeError: Type error in WebGL using Three.js

I am building a small music visualizer with WebGL and Three.js, using the ThreeAudio.js library to convert the audio into a texture that is passed into the shader. Though everything is currently functioning, I am getting the following error that I'd like to track down:
"Uncaught Type Error: Type error"
The stack trace goes from my animate function, to my render function, to the three.js render function, to something called "l", to renderBuffer, to "z".
My animate function is as follows:
function animate() {
    requestAnimationFrame( animate );
    audioTextures.update();
    stats.update();
    render();
}
And my render function is as follows:
function render() {
    renderer.render( scene, camera );
}
I believe it's an issue with the mesh I am creating, because when I comment out the code to add it to the scene the error goes away.
The code for the animated sphere I have is as follows:
audioSource = (new ThreeAudio.Source()).load('https://api.soundcloud.com/tracks/125098652/stream?client_id=MYCLIENTID').play();
audioTextures = new ThreeAudio.Textures(renderer, audioSource);
audioMaterial = new ThreeAudio.Material(audioTextures, vertexShader, fragmentShader);
audioMesh = new THREE.Mesh(geometry, audioMaterial);
scene.add(audioMesh);
The ThreeAudio github can be found here: https://github.com/unconed/ThreeAudio.js
Please let me know if it would be helpful to post my shaders as well.
Does anyone know how I should begin to track down this error, or has anyone seen it present this way?
OK, I ended up answering this question myself. In case anyone stumbles upon this looking for the answer to a similar problem, this is how I fixed it:
I realized that the animate function was being called before the audio source had loaded, which meant the audio textures did not have any data in them yet. It seems the library I was using (ThreeAudio) handled the exception; in my limited experience, issues with data getting to the shader tend to blow everything up.
The solution was to move the animate() call into a callback that runs once the audio source has loaded. My code looks like this:
audioSource = (new ThreeAudio.Source()).load('https://api.soundcloud.com/tracks/125098652/stream?client_id=MYCLIENTID', function(){
    clock = new THREE.Clock();
    audioSource.play();
    audioTextures = new ThreeAudio.Textures(renderer, audioSource);
    audioMaterial = new ThreeAudio.Material(audioTextures, vertexShader, fragmentShader);
    audioMaterial.uniforms['volume'] = {type: 'f', value: 0};
    audioMaterial.uniforms['volume2'] = {type: 'f', value: 0};
    audioMaterial.uniforms['volume3'] = {type: 'f', value: 0};

    // My custom cube geometry
    var geometry = CubeGeometry(audioTextures, 500, 500, 500, 500, 500);
    audioMesh = new THREE.Mesh(geometry, audioMaterial);
    audioMesh.position.y = -200;
    scene.add(audioMesh);

    animate();
});
