HTML5 Canvas Recording Crashing onstop new Blob recordedChunks - javascript

I am recording an HTML5 canvas stream using MediaRecorder. The stream fed to MediaRecorder is a mixed stream, combining (1) canvas.captureStream(30) and (2) an audio stream (since the canvas animation has audio on the page).
When the recording is longer than, say, 1 minute, the Chrome tab crashes after the line:
var blob = new Blob(recordedChunks, { 'type' : 'video/mp4' });
When it is shorter, say 10 seconds, the crash does not occur.
The resulting video has large dimensions; I am not sure if that is the issue. My canvas animation is mostly images and MP4 clips played in sequence (think of it as a slide show).
How can I fix this crash? Even when there is no crash, it takes a long time for new Blob to complete before I get the final video.
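One commonly used mitigation (a sketch, not taken from the question) is to collect data in timeslices while recording, instead of building one giant buffer at stop time, and to build the Blob with the MIME type the recorder actually produced (typically video/webm in Chrome) rather than labelling the data as video/mp4. The canvas, audioTrack and onRecordingDone names below are placeholders:
// Hedged sketch: `canvas`, `audioTrack` and `onRecordingDone` are assumed to exist.
const canvasStream = canvas.captureStream(30);
const mixedStream = new MediaStream([
  ...canvasStream.getVideoTracks(),
  audioTrack // e.g. from an AudioContext destination or getUserMedia
]);
const mimeType = MediaRecorder.isTypeSupported('video/webm;codecs=vp9')
  ? 'video/webm;codecs=vp9'
  : 'video/webm';
const recorder = new MediaRecorder(mixedStream, { mimeType });
const recordedChunks = [];
// Request a chunk every second instead of one huge buffer at stop time.
recorder.ondataavailable = (e) => {
  if (e.data && e.data.size > 0) recordedChunks.push(e.data);
};
recorder.onstop = () => {
  // Use the recorder's own MIME type rather than 'video/mp4'.
  const blob = new Blob(recordedChunks, { type: recorder.mimeType });
  onRecordingDone(URL.createObjectURL(blob));
};
recorder.start(1000); // timeslice in milliseconds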

Related

How to capture high quality video of DOM?

I'm working on a web project where the user chooses a mobile mockup design and saves some chat conversations.
As output, the application should produce a high-quality video (1080p at least) of the saved chat, so that it looks like a real chat conversation was captured.
As of now I'm drawing the mockup and conversation on an HTML5 canvas and recording it with the canvas.captureStream() method.
It is able to record a canvas up to 1280px wide, but when I tried increasing the size to achieve 1080p video, the canvas animation slows down and the browser sometimes stops working.
I have already googled how to optimize the canvas and everything else that might help.
It looks like the canvas can no longer work for me, so is there any way to record the DOM and render it as video?
I was using the captureStream method of the canvas:
const stream = canvas.captureStream();
And a MediaRecorder to capture it:
let options = {mimeType: 'video/webm'};
let mediaRecorder = new MediaRecorder(stream, options);
I expect to find a way of recording video of the DOM in high quality, so that I can run the chat with JavaScript and record it to achieve the desired output.
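As a rough sketch of what a 1080p canvas recording setup might look like (the bitrate, frame rate and codec below are assumptions, not values from the question), raising videoBitsPerSecond is usually needed so the encoder does not starve large frames:
const canvas = document.querySelector('canvas');
canvas.width = 1920;
canvas.height = 1080;
const stream = canvas.captureStream(30); // cap at 30 fps
const recorder = new MediaRecorder(stream, {
  mimeType: 'video/webm;codecs=vp9',
  videoBitsPerSecond: 8000000 // illustrative bitrate for 1080p
});
const chunks = [];
recorder.ondataavailable = (e) => chunks.push(e.data);
recorder.onstop = () => {
  const blob = new Blob(chunks, { type: recorder.mimeType });
  // download or upload the blob here
};
recorder.start();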

Capture the html canvas for processing

I am working on a small project that requires capturing and analysing the content of a canvas.
It is an agent which plays the Google no-internet dinosaur game.
I can access content of the canvas from a console with:
canvas = document.getElementById("gamecanvas");
context = canvas.getContext("2d");
imgData = context.getImageData(0,0,600,150);
But I have been trying to use HTMLCanvasElement.captureStream() to generate an event at a given frame rate, or whenever the canvas changes.
But when I implement it as:
const canvas = document.getElementById("gamecanvas");
const stream = canvas.captureStream(25)
stream.onaddtrack = function(event) { console.log("Called") }
I would expect the console.log("Called") to be called 25 times per second, but nothing gets called. Have I misunderstood something about the streams?
HTMLCanvasElement.captureStream returns a MediaStream. This MediaStream is initially composed of a special kind of MediaStreamTrack: a CanvasCaptureMediaStreamTrack which is simply a special video track with a link to the original HTMLCanvasElement.
This may still sound like a foreign language at this stage...
A MediaStream is a container object holding tracks, which themselves hold a stream of raw media data. An audio track holds a stream of raw audio data; a video or canvas track holds a stream of raw video data.
Tracks can be added to or removed from a MediaStream, so that a MediaStream that was fed by a webcam's video can be switched to a video coming from WebRTC, etc. This is what the onaddtrack event monitors: when a MediaStreamTrack is added to the MediaStream container.
It has nothing to do with frames being appended to the video stream; as far as the MediaStream is concerned, it is either streaming or paused.
So your MediaStream holds a stream of video data, generated from the canvas's current state.
A stream captured from a canvas has one special feature: you can request the maximum frequency at which the browser should append new frames to the video stream. However, this is just a maximum; if nothing new has been painted on the canvas, then no image gets appended, and the stream will continue to display the last image that was appended.
I don't think there is any way to know when this operation happens, but even if there were one, your process would be far too convoluted:
draw on canvas1
capture stream
render stream in <video>
draw <video> on canvas2
process the image drawn on canvas2
While all you need is
draw on canvas1
process the image drawn on canvas1
If you want to do it at a certain frame-rate, then set up a timeout loop.
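A minimal sketch of such a timeout loop, reusing the canvas id and the 25 fps target from the question; analyse() is a hypothetical stand-in for whatever processing the agent does:
const canvas = document.getElementById("gamecanvas");
const context = canvas.getContext("2d");
function processFrame() {
  // Read the current canvas pixels directly; no stream needed.
  const imgData = context.getImageData(0, 0, canvas.width, canvas.height);
  analyse(imgData); // hypothetical processing function
  setTimeout(processFrame, 1000 / 25); // roughly 25 frames per second
}
processFrame();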

Make an audio sprite without loading the complete song

I'm using Howler JS to play songs on a website. I want just a portion of the song to be played.
I'm making a sprite of each MP3, and those sprites can be played. However, it takes really long before the audio plays. It's as if the whole MP3 is downloaded first and then the sprite begins, which really decreases performance and consumes bandwidth.
I'm not familiar with Howler at all; maybe there's a method to download just the portion to be played, or if not, are there any other libraries/ways to accomplish this?
<div
  className="playExtrait"
  onClick={() => {
    Howler.unload();
    let song = new Howl({
      src: [url],
      html5: true,
      sprite: {
        extrait: [0, 30000]
      }
    });
    let songID = song.play("extrait");
    setPlayPause("playing");
    song.fade(1, 0, 30000, songID);
    song.on("end", () => {
      setPlayPause("paused");
    });
  }}
>
You can create recordings of specific time slices of the media by using a Media Fragments URI, for example by setting the src of an <audio> element to /path/to/media#t=10,15 for playback of seconds 10 through 15 of the media resource, and using MediaRecorder to record the playback and save the recording as a .webm media file, where the MediaRecorder is stopped at the pause event of the HTMLMediaElement.
See
How to edit (trim) a video in the browser?
How to get a precise timeupdate on a video to return upto 2 decimal numbers (milliseconds)?
Javascript - Seek audio to certain position when at exact position in audio track
How to use Blob URL, MediaSource or other methods to play concatenated Blobs of media fragments??
For an example of concatenating multiple media fragments into a single recording, see MediaFragmentRecorder (I am the author of the code at that repository). MediaSource at Chromium/Chrome has issues when MediaRecorder is used to record a MediaSource stream, though the code should still produce the expected result in Firefox.
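As a rough sketch of the idea (the file path, the 0-30 second fragment, and the use of captureStream()/mozCaptureStream() are illustrative assumptions, not code from the linked repository): play only the fragment via a Media Fragments URI, record that playback with MediaRecorder, and stop the recorder at the element's pause event.
const audio = new Audio("/path/to/song.mp3#t=0,30"); // media fragment: seconds 0-30
audio.addEventListener("play", () => {
  const stream = audio.captureStream ? audio.captureStream()
                                     : audio.mozCaptureStream(); // Firefox
  const recorder = new MediaRecorder(stream, { mimeType: "audio/webm" });
  const chunks = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => {
    const clip = new Blob(chunks, { type: "audio/webm" });
    // `clip` now holds only the 30 second excerpt
  };
  // Playback pauses at the end of the fragment; stop the recorder there.
  audio.addEventListener("pause", () => recorder.stop(), { once: true });
  recorder.start();
}, { once: true });
audio.play();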

Playing Live Audio Stream in HTML5 - MediaSource Errors in Chrome

I need a way to play a live audio stream using HTML5 (primarily in Google Chrome), so I tried using the following:
<audio>
<source src="my-live-stream.ogg" type="audio/ogg">
</audio>
While this does work for a live stream, there seems to be a very large, uncontrollable delay/buffer of around 30 seconds or so when this is played.
I need the delay to be a couple of seconds or less so this method doesn't work.
As an alternative I have tried sending the audio over a WebSocket connection as 1 second individual audio files, which are then appended to a SourceBuffer and played in an audio element using Media Source Extensions.
After experimenting with a number of formats (MediaSource.isTypeSupported seems to be rather limited in audio support), I got this working using a Vorbis audio stream in a WebM container, which sounds perfect with no audible gaps. Other formats worked less well as they need to be gapless - e.g. MP3 and AAC end up with tiny audible gaps between each 1 second segment.
While this seems to work at first, when looking at chrome://media-internals, the following errors repeatedly appear:
00:00:09 544 info Estimating WebM block duration to be 3ms for the last (Simple)Block in the Cluster for this Track. Use BlockGroups with BlockDurations at the end of each Track in a Cluster to avoid estimation.
00:00:09 585 error Large timestamp gap detected; may cause AV sync to drift. time:8994999us expected:9231000us delta:-236001us
00:01:05 239 debug Skipping splice frame generation: not enough samples for splicing new buffer at 65077997us. Have 1us, but need 1000us.
Eventually the playback stops as though the pause button has been pressed on the audio element. It still shows the pause rather than play button, but the current time stops advancing.
Pressing the pause button and then the play button that replaces it doesn't make it start playing again, but manually dragging the position slider further ahead makes it continue playing.
I have tried setting sourceBuffer.mode = 'sequence'; but this doesn't seem to help.
Is there anything that needs to be changed in how the audio files are being encoded, or how they are played back in JavaScript to fix this?
Additional details:
The audio stream is encoded into 1 second WebM/Vorbis files using FFmpeg on Windows.
A background worker is used in the browser to receive the audio segments and pass them to the main page thread, which appends them to the audio stream. Otherwise the playback freezes.
Source code:
Web player: https://github.com/SamuelFisher/WebSocketAudio
WebSocket server and encoder: https://github.com/SamuelFisher/WebSocketAudioServer
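For reference, a bare-bones sketch of the pipeline described above (the WebSocket URL, element selector and codec string are placeholders, and the background-worker hand-off is omitted): segments received over the WebSocket are queued and appended to a SourceBuffer in 'sequence' mode, since appendBuffer() cannot be called while an update is pending.
const audio = document.querySelector("audio");
const mediaSource = new MediaSource();
audio.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener("sourceopen", () => {
  const sourceBuffer = mediaSource.addSourceBuffer('audio/webm; codecs="vorbis"');
  sourceBuffer.mode = "sequence";
  const queue = [];
  sourceBuffer.addEventListener("updateend", () => {
    if (queue.length && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });
  const ws = new WebSocket("wss://example.com/audio"); // placeholder URL
  ws.binaryType = "arraybuffer";
  ws.onmessage = (event) => {
    if (sourceBuffer.updating || queue.length) {
      queue.push(event.data); // appendBuffer is not re-entrant
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };
});
audio.play();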

How to fully buffer chrome video?

While creating a video player using the HTML5 video tag, I have noticed undesirable behavior in Google Chrome. When I pause the video, buffering starts, and when I play, buffering stops. As a result I get an undesirable user experience.
I'm using large video files, about 2-4 GB in size. Often, when I seek to some position and monitor the buffered ranges, I notice Chrome buffers the wrong range. If I pause the player, Chrome continues to buffer the wrong range and never buffers the range that currentTime is in and that the player is monitoring.
Another problem: I play the video in the background, hidden from the viewer under other DOM elements, so I can force Chrome to buffer during playback. When enough has been played/buffered, I seek the video back a few seconds. Once I do this, Chrome stops buffering, my buffered range is quickly played through, and the process starts again, leaving a bad user experience.
Is this a known issue, or am I doing something wrong? Is there any workaround to make Google Chrome continue buffering and not stop?
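For anyone investigating this, a small sketch of how the buffered ranges mentioned above can be monitored (the element selector and logging interval are assumptions):
const video = document.querySelector("video");
setInterval(() => {
  const ranges = [];
  for (let i = 0; i < video.buffered.length; i++) {
    ranges.push(video.buffered.start(i).toFixed(1) + "-" + video.buffered.end(i).toFixed(1) + "s");
  }
  console.log("currentTime=" + video.currentTime.toFixed(1) + "s buffered: " + ranges.join(", "));
}, 1000);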
