Is there a way to disable canvas interaction? - javascript

I'm working on a web application that lets users create a scene using a canvas element; the Web Audio API then changes the scene in real time according to frequencies extracted from the song (creating audio visualizations). I implemented the canvas interactivity with event listeners.
My question: is there a way to disable canvas interaction (stop the event listeners) while the song is playing (while the scene is changing), so that the user can modify the scene again once playback stops?
Thank you all.
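One common pattern (not from the original thread, just a sketch of one way to do it) is to route every listener through a small "gate" object and toggle it when playback starts and stops; the `canvas`, `audio`, and handler names in the usage comments are placeholders:

```javascript
// A tiny "gate" that wraps event handlers so they can be switched
// off while audio is playing, without removing any listeners.
function createInteractionGate() {
  let enabled = true;
  return {
    enable()    { enabled = true;  },
    disable()   { enabled = false; },
    isEnabled() { return enabled;  },
    // Wrap any handler; the wrapped version is a no-op while disabled.
    guard(handler) {
      return function (event) {
        if (enabled) handler(event);
      };
    },
  };
}

// Usage in the app (browser-only, illustrative):
// const gate = createInteractionGate();
// canvas.addEventListener('mousedown', gate.guard(onMouseDown));
// audio.addEventListener('play',  () => gate.disable());
// audio.addEventListener('ended', () => gate.enable());
```

Because the same wrapped function stays registered, there is no need to call removeEventListener; alternatively, setting `canvas.style.pointerEvents = 'none'` during playback blocks mouse events at the CSS level.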

Related

How to have multiple HTML5 canvas with previous, next and add icon

I am making a whiteboard based on the HTML5 canvas and have added many features like pencil, eraser, image upload, PDF upload, and so on. Could anyone help me with how to have multiple canvases that can be accessed with previous and next buttons? I also need an add button for adding a canvas. I have already built the buttons in HTML and CSS and need help with the JavaScript.
This is a picture of the next and previous whiteboard buttons, and here is a picture of the add and previous whiteboard buttons. When the page is on the last whiteboard, the add icon should be shown. I think there should also be a whiteboard limit, which would help conserve browser memory.
Instead of adding multiple canvases, you could use only one canvas:
on the add-canvas button, save the current state of the canvas and clear it
on pressing previous, save the current state and load the previous state (and the same for next)
This way you won't have to worry about creating too many canvases and running into memory problems.
So basically you will have one canvas and an array of saved states, which you load based on the board number.
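The answer above can be sketched as a small board manager. The `takeSnapshot`/`restoreSnapshot` callbacks are placeholders for the actual canvas calls (e.g. `canvas.toDataURL()` and redrawing the saved image); everything else is plain bookkeeping:

```javascript
// Manages N "virtual" whiteboards backed by a single canvas.
// takeSnapshot() should return the current canvas state (e.g. a data URL);
// restoreSnapshot(state) should redraw that state (or clear when state is null).
function createBoardManager(takeSnapshot, restoreSnapshot, limit = 10) {
  const states = [null];   // one entry per board; null = blank board
  let current = 0;

  function switchTo(index) {
    if (index < 0 || index >= states.length) return current;
    states[current] = takeSnapshot();   // save the board we are leaving
    current = index;
    restoreSnapshot(states[current]);   // load the board we arrive at
    return current;
  }

  return {
    next()     { return switchTo(current + 1); },
    previous() { return switchTo(current - 1); },
    add() {
      if (states.length >= limit) return current;  // enforce the board limit
      states.push(null);
      return switchTo(states.length - 1);
    },
    count() { return states.length; },
    index() { return current; },
    // Per the question, the add icon should only show on the last board.
    canAdd() { return current === states.length - 1 && states.length < limit; },
  };
}
```

In the real app, `takeSnapshot` could be `() => canvas.toDataURL()` and `restoreSnapshot` could clear the canvas and draw the saved image back with `ctx.drawImage`.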

Is it possible to record a video of the contents of an HTML element?

So I want to use JavaScript and CSS to create an animated video of maps. I planned to do some animations using Leaflet etc. and use some JavaScript to record a video of them.
However, so far I have found that video recording is only possible for the entire screen or for a canvas, according to MDN (https://developer.mozilla.org/en-US/docs/Web/API/MediaStream_Recording_API/Recording_a_media_element) and Google's documentation (https://developers.google.com/web/updates/2016/10/capture-stream)
I tried to render Leaflet onto a canvas, but I could not find any way to do so.
So is there a way to create an animated video using JavaScript, either by rendering Leaflet onto a canvas or by recording the contents of an element?
You can use RecordRTC to record a canvas element.
If you already have a canvas that has animations in it, it is very simple to use RecordRTC to record that canvas element.
If you don't have a canvas element and you need to record an HTML element, you would first need to place a hidden canvas on the screen and draw your HTML content onto it using window.requestAnimationFrame(). Then you can use RecordRTC to record that canvas element.
You can use RecordRTC's CanvasRecorder to record just the canvas.
But if you want to do other things, like record the user's microphone or the browser tab's audio alongside recording your canvas, you can use canvas.captureStream() and RecordRTC's MediaStreamRecorder to record the combined stream.
In both the aforementioned cases, you will get a video file as the result.
A full example demo can be found here -> https://www.webrtc-experiment.com/RecordRTC/Canvas-Recording/Canvas-Animation-Recording-Plus-Microphone.html
The source code for that demo can be found here -> https://github.com/muaz-khan/RecordRTC/blob/master/Canvas-Recording/Canvas-Animation-Recording-Plus-Microphone.html
All credit goes to Muaz Khan for maintaining such an awesome project, alongside many other equally awesome WebRTC projects.
NOTE: Rendering HTML element to canvas can be CPU intensive.
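For comparison, the same recording can be done without RecordRTC using the native `canvas.captureStream()` and `MediaRecorder` APIs mentioned in the answer. This is a browser-only sketch; the `canvas` argument is assumed to already be animating:

```javascript
// Records a canvas to a WebM blob using the native MediaRecorder API.
// Returns a function that stops the recording and resolves with the blob.
function startCanvasRecording(canvas, fps = 30) {
  const stream = canvas.captureStream(fps);   // live MediaStream of the canvas
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
  const chunks = [];
  recorder.ondataavailable = (e) => { if (e.data.size > 0) chunks.push(e.data); };
  recorder.start();

  return function stop() {
    return new Promise((resolve) => {
      recorder.onstop = () => resolve(new Blob(chunks, { type: 'video/webm' }));
      recorder.stop();
    });
  };
}

// Usage (illustrative):
// const stop = startCanvasRecording(document.querySelector('canvas'));
// ... let the animation run ...
// stop().then(blob => { video.src = URL.createObjectURL(blob); });
```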

Is there a way to avoid javascript audio hiccuping while your program is loading other media at the same time?

I'm building a WebGL program where the scene changes at certain intervals. The scene change consists of destroying the previous scene, and loading up a new one, which means loading some texture files along with it.
At the same time, I'm also playing some audio throughout the program, that should keep playing during the destroy scene/create scene process. The program works fine, but I notice that when I'm loading the new scene with the new assets, the audio hiccups right before all of the new assets are finished loading.
Is there anything I can possibly do to prevent the small hiccup during the new load? I don't think it's a file size issue or an audio buffer issue, since all of the assets, including the audio, are pretty small (500k or less).
Any ideas would be helpful!
The hiccup was caused by the heavy JavaScript work during the scene switch interrupting the audio. I managed to fix the issue by using the Web Audio API instead of the JavaScript Audio() object, which allowed the audio to play in a context separate from the one the program was running in.
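A typical way to make that switch (a browser-only sketch; the URL in the usage comment is a placeholder) is to decode the file once into an AudioBuffer and play it through an AudioBufferSourceNode, so playback is scheduled on the audio rendering thread instead of competing with main-thread work:

```javascript
// Plays audio via the Web Audio API instead of `new Audio(url)`.
async function playWithWebAudio(url) {
  const ctx = new AudioContext();
  const response = await fetch(url);                   // download the file
  const encoded = await response.arrayBuffer();
  const buffer = await ctx.decodeAudioData(encoded);   // decode into an AudioBuffer
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);
  source.start();                                      // playback runs on the audio thread
  return source;   // keep a reference so the caller can call source.stop() later
}

// Usage (illustrative):
// playWithWebAudio('assets/music.mp3');
```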

Three.js - Can I render a sequence of frames, then play back at high framerate?

I'm working on a nonlinear editing animation system with Three.js as the 3D library. I'd like to render a sequence of frames to memory, then play the frames back at an arbitrary frame rate. The idea is that the scene might be too complex to render in real time, so I want to pre-render the frames, then play them back at the target fps. I don't necessarily need interactivity while the animation is playing, but it's important that I see it at full speed.
From these links (How to Render to a Texture in Three.js, Rendering a scene as a texture), I understand how to render to a framebuffer instead of the canvas. Can I store multiple framebuffers then render each of those to the canvas later at a smooth frame rate?
You can try one of the following:
Option a) capture the canvas content and play it back as a PNG sequence (I didn't test this, but it sounds like it could work):
render to the canvas just as you always do
export the canvas as a data URL using canvas.toDataURL()
since huge data URLs are often a problem with devtools, you might want to consider converting the data URLs to Blobs:
Blob from DataURL?
repeat for all frames
play back the stored frames as a PNG sequence
Option b) using render targets:
render the scene to a render target/framebuffer
use renderer.readRenderTargetPixels() to read the rendered result into memory (this returns essentially a bitmap)
the data can be copied into a 2D canvas ImageData instance
Option c) using render-target textures (no downloading from the GPU):
render into a render target and create a new one for each frame (there is very likely a limit on how many of them you can keep around, so this might not be the best solution)
the image data stays on the GPU and is referenced via rendertarget.texture
for playback, draw a fullscreen quad textured with rendertarget.texture. This only needs to rebind a texture for each playback frame, so it would be the most efficient option.
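Option a) can be sketched roughly like this (browser-only; `renderFrame(i)` stands in for whatever draws your Three.js scene for frame i):

```javascript
// Pre-render `total` frames into memory as PNG data URLs, then
// play them back on a 2D canvas at the target frame rate.
function captureFrames(canvas, renderFrame, total) {
  const frames = [];
  for (let i = 0; i < total; i++) {
    renderFrame(i);                          // draw frame i (slow is fine here)
    frames.push(canvas.toDataURL('image/png'));
  }
  return frames;
}

function playFrames(frames, targetCanvas, fps) {
  const ctx = targetCanvas.getContext('2d');
  let i = 0;
  const timer = setInterval(() => {
    if (i >= frames.length) { clearInterval(timer); return; }
    const img = new Image();
    img.onload = () => ctx.drawImage(img, 0, 0);
    img.src = frames[i++];
  }, 1000 / fps);
}
```

Decoding a PNG on every tick adds overhead, so pre-decoding all frames into Image objects (or ImageBitmaps via createImageBitmap) before starting playback would make the playback loop itself cheaper.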
You can also use canvas.captureStream() and MediaRecorder to record the scene and save it; you can then play it back as a video without worrying about the frame rate. You may miss some frames, since recording has its own performance overhead, but it all depends on your use case.

How to save canvas state to a DB?

So I built this real-time drawing app with Node.js, Socket.IO, and the HTML5 canvas. Every pixel the mouse moves over while clicked is tracked and broadcast (to display the drawing input on other computers).
I know it is possible to save an image of the canvas, but this canvas is very large (10000x10000+ pixels). Right now, when the page is refreshed, all the drawings are gone (the data was just sent over a socket, nothing was saved).
I would like to save all the canvas data to a db and then somehow redraw it when the page is loaded again, but it is simply too much data. How would you go about doing this?
Rather than storing pixels, you can store the clicks and mouse moves that made the canvas look the way it does. You are already sending them over the socket, so persist that same event stream to the database and replay it to rebuild the image when the page loads.
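The replay idea can be sketched as an event log. `drawSegment` is a placeholder for the actual canvas drawing call; the log itself is plain JSON, so the same objects can be broadcast over the socket and stored in a database as-is:

```javascript
// Records drawing events as plain objects and replays them later.
function createDrawLog() {
  const events = [];
  return {
    // Call this from the mousemove handler (the same data the socket broadcasts).
    record(x0, y0, x1, y1, color = '#000') {
      events.push({ x0, y0, x1, y1, color });
    },
    // Serialize for storage in the database.
    toJSON() { return JSON.stringify(events); },
    size()   { return events.length; },
  };
}

// Rebuilds the image from a stored log by re-running every event.
function replayDrawLog(json, drawSegment) {
  const events = JSON.parse(json);
  for (const e of events) drawSegment(e);
  return events.length;
}

// In the browser, drawSegment might look like:
// const drawSegment = ({ x0, y0, x1, y1, color }) => {
//   ctx.strokeStyle = color;
//   ctx.beginPath(); ctx.moveTo(x0, y0); ctx.lineTo(x1, y1); ctx.stroke();
// };
```

A long-lived board's log grows without bound, so periodically flattening it into a saved image plus a short tail of recent events keeps replay time bounded.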
