Assigning different streams to `srcObject` of a video element - javascript

I have a video element. I have multiple streams captured by navigator.getUserMedia.
I can assign srcObject successfully the first time:
previewVideoElement.srcObject = stream;
However, if I later assign a different stream to srcObject on the same element, the stream doesn't play (no errors, just a blank video). How can I switch streams without recreating the video element each time?
Edit: trying this fails as well:
const previewVideoElement = document.getElementById("new-device-preview");
previewVideoElement.pause();
previewVideoElement.srcObject = stream;
previewVideoElement.play();
Edit: calling this works a few times, but then fails with "The play() request was interrupted by a call to pause()". Without the pause I get "The play() request was interrupted by a new load request".
previewVideoElement.pause();
previewVideoElement.srcObject = stream;
previewVideoElement.load();
previewVideoElement.play();
Edit: even this heap of garbage doesn't work:
const previewVideoElement = document.getElementById("new-device-preview");
//previewVideoElement.pause();
previewVideoElement.srcObject = stream;
previewVideoElement.load();
const isPlaying = previewVideoElement.currentTime > 0
    && !previewVideoElement.paused
    && !previewVideoElement.ended
    && previewVideoElement.readyState > 2;
if (!isPlaying)
    setTimeout(function () {
        previewVideoElement.play();
The only thing I could get working reliably:
var previewVideoElement = document.getElementById("new-device-preview");
if (previewVideoElement.srcObject) {
    $("#new-device-preview-container").empty();
    $("#new-device-preview-container").html('<video autoplay class="new-device-preview" id="new-device-preview"></video>');
}
previewVideoElement = document.getElementById("new-device-preview");
previewVideoElement.srcObject = stream;
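A sketch of the fix that usually avoids recreating the element: stop the old stream's tracks before assigning the new one, and await play() so an interrupted load only surfaces as a harmless AbortError. `switchStream` is a made-up helper name, not a DOM API:

```javascript
// Hypothetical helper: switch a <video> element to a new MediaStream.
async function switchStream(videoEl, newStream) {
  const oldStream = videoEl.srcObject;
  if (oldStream) {
    // release the camera/mic held by the previous stream
    oldStream.getTracks().forEach((track) => track.stop());
  }
  videoEl.srcObject = newStream;
  try {
    await videoEl.play();
  } catch (err) {
    // play() rejects with AbortError when a new load interrupts it
    if (err.name !== "AbortError") throw err;
  }
}
```

With something like this, each preview switch becomes `switchStream(previewVideoElement, stream)` on the same element.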


JavaScript: Use MediaRecorder to record streams from <video> but failed

I'm trying to record parts of the video from a <video> tag and save them for later use. I found this article: Recording a media element, which describes a method: first call stream = video.captureStream(), then use new MediaRecorder(stream) to get a recorder.
I've tested some demos, and MediaRecorder works fine when the stream comes from the user's device (such as a microphone). However, with a media element, Firefox throws an exception: MediaRecorder.start: The MediaStream's isolation properties disallow access from MediaRecorder.
So any idea on how to deal with it?
Browser: Firefox
The page (including the JS file) is stored locally.
The src attribute of the <video> tag can be either a file from local storage or a URL from the Internet.
Code snippets:
let chunks = [];
let getCaptureStream = function () {
    let stream;
    const fps = 0;
    if (video.captureStream) {
        console.log("use captureStream");
        stream = video.captureStream(fps);
    } else if (video.mozCaptureStream) {
        console.log("use mozCaptureStream");
        stream = video.mozCaptureStream(fps);
    } else {
        console.error('Stream capture is not supported');
        stream = null;
    }
    return stream;
}
video.addEventListener('play', () => {
    let stream = getCaptureStream();
    const mediaRecorder = new MediaRecorder(stream);
    mediaRecorder.onstop = function () {
        const newVideo = document.createElement('video');
        newVideo.controls = true;
        const blob = new Blob(chunks, { 'type': 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"' });
        chunks = [];
        const videoURL = window.URL.createObjectURL(blob);
        newVideo.src = videoURL;
        document.body.appendChild(newVideo);
    }
    mediaRecorder.ondataavailable = function (e) {
        chunks.push(e.data);
    }
    stopButton.onclick = function () {
        mediaRecorder.stop();
    }
    mediaRecorder.start(); // This is the line that triggers the exception.
});
});
I found the solution myself.
When I tried Chrome, it showed a CORS issue that prevented me from even playing the original video. So I guess it's the security policy that prevents MediaRecorder from accessing the MediaStream. Therefore, I served the local files from a local server, following the instructions on this page.
After that, the MediaRecorder started working. Hope this helps someone in need.
Still, the official documentation doesn't seem to say much about the isolation properties of media elements, so any further explanation is welcome.
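Given the fix above (the page failed from disk but worked once served over HTTP), one small guard is to warn early when the page runs on the file: scheme. This is a sketch; `isLikelyIsolated` is a hypothetical helper name:

```javascript
// Warn before wiring up MediaRecorder on pages opened straight from disk,
// where (as described above) captureStream + MediaRecorder failed until
// the files were served over HTTP.
function isLikelyIsolated(protocol) {
  return protocol === "file:";
}

if (typeof location !== "undefined" && isLikelyIsolated(location.protocol)) {
  console.warn("Serve this page over http(s); MediaRecorder may be blocked on file://");
}
```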

How can I record a canvas with MediaRecorder to the exact same duration as another video?

I have one video of duration 9200 ms, and a canvas displaying user's webcam video. I'm aiming to record the webcam video while the original video plays to create an output blob of the exact same duration with MediaRecorder but seem to always get a video with longer length (typically around 9400ms).
I've found that if I take the difference in durations and skip ahead in the output video by that amount, it basically syncs up with the original video, but I'm hoping not to have to use this hack. Knowing this, I assumed the difference was because HTML5 video's play() function is asynchronous, but even calling recorder.start() inside a .then() after the play() promise still results in an output blob with a longer duration.
I start() the MediaRecorder after play()ing the original video, and call stop() inside a requestAnimationFrame loop when I see that the original video has ended. Changing the MediaRecorder.start() to begin in the requestAnimationFrame loop only after checking the original video is playing also results in a longer output blob.
What might be the reason for the longer output? From the documentation it doesn't appear that MediaRecorder's start or stop functions are asynchronous, so is there some way to guarantee an exact starting time with HTML5 video and MediaRecorder?
Yes, start() and stop() are async; that's why the onstart and onstop events exist:
const stream = makeEmptyStream();
const rec = new MediaRecorder(stream);
rec.onstart = (evt) => { console.log("took %sms to start", performance.now() - begin); };
const begin = performance.now();
rec.start();
setTimeout(() => {
    rec.onstop = (evt) => { console.log("took %sms to stop", performance.now() - begin); };
    const begin = performance.now();
    rec.stop();
}, 1000);

function makeEmptyStream() {
    const canvas = document.createElement('canvas');
    canvas.getContext('2d').fillRect(0, 0, 1, 1);
    return canvas.captureStream();
}
You can thus try to pause your video once it's ready to play, then wait until your recorder has started before resuming playback.
However, given that everything in both HTMLMediaElement and MediaRecorder is async, there is no way to get a perfect one-to-one relation...
const vid = document.querySelector('video');
onclick = (evt) => {
    onclick = null;
    vid.play().then(() => {
        // pause the main video immediately
        vid.pause();
        // we may have advanced a few µs already, so go back to the beginning
        vid.currentTime = 0;
        // only when we're back at the beginning
        vid.onseeked = (evt) => {
            console.log('recording will begin shortly, please wait until the end of the video');
            console.log('original %ss', vid.duration);
            const stream = vid.captureStream ? vid.captureStream() : vid.mozCaptureStream();
            const chunks = [];
            const rec = new MediaRecorder(stream);
            rec.ondataavailable = (evt) => {
                chunks.push(evt.data);
            };
            rec.onstop = (evt) => {
                logVideoDuration(new Blob(chunks), "recorded %ss");
            };
            vid.onended = (evt) => {
                rec.stop();
            };
            // wait until the recorder is ready before playing the video again
            rec.onstart = (evt) => {
                vid.play();
            };
            rec.start();
        };
    });
    function logVideoDuration(blob, name) {
        const el = document.createElement('video');
        el.src = URL.createObjectURL(blob);
        el.play().then(() => {
            el.pause();
            el.onseeked = (evt) => console.log(name, el.duration);
            el.currentTime = 10e25;
        });
    }
};
video { pointer-events: none; width: 100% }
click to start<br>
<video src="https://upload.wikimedia.org/wikipedia/commons/a/a4/BBH_gravitational_lensing_of_gw150914.webm" controls crossorigin></video>
Also note that there may be some discrepancy between the duration declared by your media, the calculated duration of the recorded media, and their actual duration. Indeed, these durations are often just a value hard-coded in the file's metadata, but given how the MediaRecorder API works, it's hard to do that there. For instance, Chrome will produce files without a duration, and players then try to approximate it from the last point they can seek to in the media.

starting/stopping MediaRecorder API causes Chrome to crash

I am implementing the MediaRecorder API as a way to record webm blobs for use as segments in a livestream. I have the functionality I need, but ran into a problem with Chrome crashing when MediaRecorder.stop() and MediaRecorder.start() are called multiple times at regular intervals.
Here is the recording code:
let Recorder = null;
let segmentBuffer = [];
let recordInterval = null;
let times = 0; // limiter to avoid the crash
function startRecording() {
    Recorder = new MediaRecorder(LocalStream, {
        mimeType: 'video/webm;codecs=opus, vp8',
        audioBitsPerSecond: 50000,
        videoBitsPerSecond: 1000000,
    });
    // error event
    Recorder.onerror = (evt) => {
        console.error(evt.error);
    };
    // push blob data to the segment buffer
    Recorder.ondataavailable = (evt) => {
        segmentBuffer.push(evt.data);
    };
    // start initial recording
    Recorder.start();
    // stop/start delivery interval every 5 seconds
    recordInterval = setInterval(() => {
        // stop recording
        Recorder.stop();
        // bail out after a few passes to avoid the crash
        if (times > 5) {
            clearInterval(recordInterval);
            Recorder = null;
            console.log('end');
            return;
        }
        times++;
        // check if there are segments
        if (segmentBuffer.length) {
            // produce a segment; it is playable (not just a byte-stream) thanks to start/stop
            let webm = segmentBuffer.reduce((a, b) => new Blob([a, b], { type: "video/webm;codecs=opus, vp8" }));
            // reset the buffer
            segmentBuffer = [];
            // handle the blob, i.e. send it to the server
            handleBlob(webm);
        }
        // restart the recorder
        Recorder.start();
    }, 5000);
}
I've also looked at the performance panel and discovered that a new audio and video encoder thread is started for each start/stop. I think this is the main problem, since setting the interval to 10s instead of 5s creates fewer encoding threads. The buildup of encoding threads makes Chrome lag and finally crash after a few passes.
How do I prevent multiple encoding threads from piling up while still being able to start/stop the MediaRecorder? (Start/stop is the only way I've found to get webm files that are playable separately; otherwise each subsequent blob is missing the webm header.)
It appears that this is a bug in Chrome:
https://bugs.chromium.org/p/chromium/issues/detail?id=1012378&q=mediaRecorder%20thread&can=2
I'm not sure there is anything you can do to fix it.
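For reference, the alternative the question alludes to is a single long-lived recorder using the timeslice argument to start(), which avoids the repeated stop()/start() cycles (and their per-cycle encoder threads) but has the drawback the question names: chunks after the first lack the webm header, so they only play when concatenated in order. A sketch, where `onSegment` is a hypothetical callback and the injectable constructor exists only for testing:

```javascript
// One long-lived MediaRecorder emitting a chunk roughly every `intervalMs`.
function startSegmentedRecording(stream, onSegment, intervalMs, RecorderCtor = MediaRecorder) {
  const recorder = new RecorderCtor(stream, {
    mimeType: "video/webm;codecs=opus, vp8",
  });
  // fires with the data recorded since the previous event
  recorder.ondataavailable = (evt) => {
    if (evt.data.size > 0) onSegment(evt.data);
  };
  recorder.start(intervalMs); // timeslice: no stop()/start() needed
  return recorder;
}
```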

How can I properly record a MediaStream?

The situation
I need to do the following:
Get the video from a <video> and play inside a <canvas>
Record the stream from the canvas as a Blob
That's it. The first part is okay.
For the second part, I managed to record a Blob. The problem is that the Blob is empty.
The view
<video id="video" controls="true" src="http://upload.wikimedia.org/wikipedia/commons/7/79/Big_Buck_Bunny_small.ogv"></video>
<canvas id="myCanvas" width="532" height="300"></canvas>
The code
// Init
console.log(MediaRecorder.isTypeSupported('video/webm')) // true
const canvas = document.querySelector("canvas")
const ctx = canvas.getContext("2d")
const video = document.querySelector("video")
// Start the video in the player
video.play()
// On play event - draw the video in the canvas
video.addEventListener('play', () => {
    function step() {
        ctx.drawImage(video, 0, 0, canvas.width, canvas.height)
        requestAnimationFrame(step)
    }
    requestAnimationFrame(step);
    // Init stream and recorder
    const stream = canvas.captureStream()
    const recorder = new MediaRecorder(stream, {
        mimeType: 'video/webm',
    });
    // Get the blob data when it is available
    let allChunks = [];
    recorder.ondataavailable = function (e) {
        console.log({ e }) // img1
        allChunks.push(e.data);
    }
    // Start recording
    recorder.start()
    // Stop the recorder after 5s and check the result
    setTimeout(() => {
        recorder.stop()
        const fullBlob = new Blob(allChunks, { 'type': 'video/webm' });
        const downloadUrl = window.URL.createObjectURL(fullBlob)
        console.log({ fullBlob }) // img2
    }, 5000);
})
The result
This is the console.log of the ondataavailable event:
This is the console.log of the Blob:
The fiddle
Here is the JSFiddle. You can check the results in the console:
https://jsfiddle.net/1b7v2pen/
Browsers behavior
This behavior (Blob data size: 0) happens on Chrome and Opera.
On Firefox it behaves slightly differently.
It records a very small video Blob (725 bytes). The video length is 5 seconds, as it should be, but it's just a black screen.
The question
What is the proper way to record a stream from a canvas?
Is there something wrong in the code?
Why did the Blob come out empty?
MediaRecorder.stop() is kind of an asynchronous method.
In the stop algorithm, there is a call to requestData, which itself will queue a task to fire a dataavailable event with the data available since the last such event.
This means that synchronously after you call MediaRecorder#stop(), the last data grabbed will not be part of your allChunks Array yet. It will arrive not long after (normally in the same event loop).
So, when you are about to save a recording made with a MediaRecorder, be sure to build the final Blob in the MediaRecorder's onstop event, which signals that the MediaRecorder has actually ended, has fired its last dataavailable event, and that everything is good.
One thing I missed at first is that you are requesting a cross-domain video. Doing so without the correct cross-origin headers makes your canvas (and MediaElement) tainted, so your MediaStream will be muted.
Since the video you are trying to request is from wikimedia, you can simply request it as a cross-origin resource, but for other resources, you'll have to be sure the server is configured to allow these requests.
const canvas = document.querySelector("canvas")
const ctx = canvas.getContext("2d")
const video = document.querySelector("video")
// Start the video in the player
video.play()
// On play event - draw the video in the canvas
video.addEventListener('play', () => {
    function step() {
        ctx.drawImage(video, 0, 0, canvas.width, canvas.height)
        requestAnimationFrame(step)
    }
    requestAnimationFrame(step);
    // Init stream and recorder
    const stream = canvas.captureStream()
    const recorder = new MediaRecorder(stream, {
        mimeType: 'video/webm',
    });
    // Get the blob data when it is available
    let allChunks = [];
    recorder.ondataavailable = function (e) {
        allChunks.push(e.data);
    }
    recorder.onstop = (e) => {
        const fullBlob = new Blob(allChunks, { 'type': 'video/webm' });
        const downloadUrl = window.URL.createObjectURL(fullBlob)
        console.log({ fullBlob })
        console.log({ downloadUrl })
    }
    // Start recording
    recorder.start()
    // Stop the recorder after 5s and check the result
    setTimeout(() => {
        recorder.stop()
    }, 5000);
})
<!--add the 'crossorigin' attribute to your video -->
<video id="video" controls="true" src="https://upload.wikimedia.org/wikipedia/commons/7/79/Big_Buck_Bunny_small.ogv" crossorigin="anonymous"></video>
<canvas id="myCanvas" width="532" height="300"></canvas>
Also, I can't refrain from noting that if you don't do any special drawing on your canvas, you might want to save the video source directly, or at least record the <video>'s captureStream MediaStream directly.
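The advice above (build the Blob only in onstop) can be wrapped in a small promise helper. This is a sketch: `recordStream` is a made-up name, and the injectable constructor exists only so the logic can be tested outside a browser; pass it `video.captureStream()` (or `mozCaptureStream()`) to record the element directly as suggested:

```javascript
// Record any MediaStream for `ms` milliseconds and resolve with the final
// Blob, assembled only in onstop, once every dataavailable event has fired.
function recordStream(stream, ms, RecorderCtor = MediaRecorder) {
  return new Promise((resolve, reject) => {
    const recorder = new RecorderCtor(stream, { mimeType: "video/webm" });
    const allChunks = [];
    recorder.ondataavailable = (e) => allChunks.push(e.data);
    recorder.onerror = (e) => reject(e.error);
    recorder.onstop = () => resolve(new Blob(allChunks, { type: "video/webm" }));
    recorder.start();
    setTimeout(() => recorder.stop(), ms);
  });
}

// Usage with the <video> element directly, as suggested above:
// recordStream(video.captureStream(), 5000)
//   .then((blob) => console.log(URL.createObjectURL(blob)));
```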
