I am working with MediaRecorder to record video. I need to show a LOCAL preview of this video directly in the browser. My problem is long recordings and huge video files: I can't keep the MediaRecorder blobs in memory, because on machines with little RAM that would crash the browser. So I store each webm blob from ondataavailable in IndexedDB.
After the recording finishes I request all stored data from IndexedDB with a getAll transaction, but then I have the same memory problem. How can I play this array of blobs without loading all of the data into memory? How can I buffer it?
I tried to use the MediaSource API.
My method to get the data from IndexedDB:
function getAllData(db) {
    return new Promise(function (resolve, reject) {
        var transaction = db.transaction(["blobs"], 'readonly');
        var dbResult = transaction.objectStore("blobs").getAll();
        dbResult.onerror = function (error) {
            reject('failed to read from db: ' + error);
        };
        dbResult.onsuccess = function (event) {
            resolve(event.target.result);
        };
    });
}
And here is my attempt to play it with the MediaSource API:
getAllData(db).then(function (chunks) { // db: the opened IDBDatabase
    var mediaSource = new MediaSource();
    var sourceBuffer;
    var player = document.createElement('video');
    var previewContainer = document.getElementById('preview_container');

    player.width = 300;
    player.height = 200;
    player.autoplay = false;
    player.controls = true;
    player.src = URL.createObjectURL(mediaSource);

    mediaSource.addEventListener('sourceopen', function (e) {
        sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
        (function readChunks(index) {
            var fileReader = new FileReader();
            index = index || 0;
            if (index >= chunks.length) {
                return;
            }
            fileReader.onload = function (e) {
                sourceBuffer.appendBuffer(new Uint8Array(e.target.result));
                readChunks(++index);
            };
            fileReader.readAsArrayBuffer(chunks[index]);
        })();
    }, false);

    previewContainer.innerHTML = "";
    previewContainer.appendChild(player);
    player.play();
}).catch(function (e) {
    console.log('error on get data from db: ' + e);
});
But I get the error "Failed to execute 'appendBuffer' on 'SourceBuffer': This SourceBuffer has been removed from the parent media source."
Update:
I also tested this code in Firefox, where it partially worked: after appending chunk 100 to the sourceBuffer I got a QuotaExceededError.
So I have two main questions:
How can I fix the SourceBuffer error, and why does it happen?
Is this the best way to store and play huge .webm files locally?
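One likely explanation, for anyone hitting the same wall: the loop above calls appendBuffer again as soon as the FileReader finishes, without waiting for the previous append to complete, and once an append fails the media element errors out and the SourceBuffer is detached. Also, getAll still pulls every blob into memory at once, which defeats the point of IndexedDB here. Below is a minimal sketch (not from the question) that fetches one blob at a time by key and only appends after the previous append's updateend event; the store name "blobs" and an open IDBDatabase db match the question, everything else is an assumption.

function getAllKeys(db) {
    return new Promise(function (resolve, reject) {
        var req = db.transaction(["blobs"], 'readonly').objectStore("blobs").getAllKeys();
        req.onsuccess = function () { resolve(req.result); };
        req.onerror = function () { reject(req.error); };
    });
}

function getBlob(db, key) {
    return new Promise(function (resolve, reject) {
        var req = db.transaction(["blobs"], 'readonly').objectStore("blobs").get(key);
        req.onsuccess = function () { resolve(req.result); };
        req.onerror = function () { reject(req.error); };
    });
}

mediaSource.addEventListener('sourceopen', function () {
    var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
    getAllKeys(db).then(function (keys) {
        var index = 0;
        function appendNext() {
            if (index >= keys.length) {
                mediaSource.endOfStream(); // all chunks appended
                return;
            }
            getBlob(db, keys[index++])
                .then(function (blob) { return blob.arrayBuffer(); })
                .then(function (buffer) {
                    // Safe: updateend guarantees the previous append has finished.
                    sourceBuffer.appendBuffer(buffer);
                });
        }
        // Each updateend signals the previous append (or remove) completed.
        sourceBuffer.addEventListener('updateend', appendNext);
        appendNext();
    });
});

This keeps only one chunk in JS memory at a time, but the SourceBuffer itself still has a quota, which is what the Firefox QuotaExceededError points at; for hour-long recordings you would additionally have to sourceBuffer.remove() ranges the player has already passed.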
Related
I'm trying to record parts of the video from a <video> tag and save them for later use. I found this article: Recording a media element, which describes a method of first calling stream = video.captureStream() and then using new MediaRecorder(stream) to get a recorder.
I've tested some demos, and MediaRecorder works fine when the stream comes from the user's device (such as a microphone). However, with a media element, my Firefox browser throws an exception: MediaRecorder.start: The MediaStream's isolation properties disallow access from MediaRecorder.
So, any idea how to deal with it?
Browser: Firefox.
The page (including the JS file) is stored locally.
The src attribute of the <video> tag can be either a local file or a URL from the Internet.
Code snippets:
let chunks = [];

let getCaptureStream = function () {
    let stream;
    const fps = 0;
    if (video.captureStream) {
        console.log("use captureStream");
        stream = video.captureStream(fps);
    } else if (video.mozCaptureStream) {
        console.log("use mozCaptureStream");
        stream = video.mozCaptureStream(fps);
    } else {
        console.error('Stream capture is not supported');
        stream = null;
    }
    return stream;
};
video.addEventListener('play', () => {
    let stream = getCaptureStream();
    const mediaRecorder = new MediaRecorder(stream);

    mediaRecorder.onstop = function () {
        const newVideo = document.createElement('video');
        newVideo.controls = true;
        // The MIME type belongs on the Blob; createObjectURL takes no options.
        const blob = new Blob(chunks, { 'type': 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"' });
        chunks = [];
        newVideo.src = window.URL.createObjectURL(blob);
        document.body.appendChild(newVideo);
    };

    mediaRecorder.ondataavailable = function (e) {
        chunks.push(e.data);
    };

    stopButton.onclick = function () {
        mediaRecorder.stop();
    };

    mediaRecorder.start(); // This is the line that triggers the exception.
});
I found the solution myself.
When I tried Chrome, it showed a CORS error that prevented me from even playing the original video. So I guess it is the same security policy that keeps MediaRecorder from accessing the MediaStream. I therefore served the local files from a local web server, following the instructions on this page.
After that, MediaRecorder started working. Hope this helps someone in need.
Still, the official documentation doesn't seem to say much about the isolation properties of media elements, so any further explanation is welcome.
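For reference, serving the folder over HTTP can be done with a few lines of Node.js. This is only a minimal sketch, not the server from the linked instructions; the port and file layout are arbitrary assumptions.

// Minimal static file server (run with: node server.js, then open http://localhost:8080)
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer(function (req, res) {
    // Map the request URL to a file in the current directory.
    const file = path.join(__dirname, req.url === '/' ? 'index.html' : req.url);
    fs.readFile(file, function (err, data) {
        if (err) {
            res.writeHead(404);
            res.end('Not found');
            return;
        }
        res.writeHead(200);
        res.end(data);
    });
}).listen(8080);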
I'm using the Screen Capture API and am trying to save the final capture to a video file (WebM, MP4, etc.). I have these two JavaScript functions:
async function startCapture() {
    try {
        videoElem.srcObject = await navigator.mediaDevices.getDisplayMedia(displayMediaOptions);
    } catch (err) {
        console.error("Error: " + err);
    }
}

function stopCapture() {
    let tracks = videoElem.srcObject.getTracks();
    tracks.forEach(track => track.stop());
    videoElem.srcObject = null;
}
The video displays live just fine while the capture is running, but I'm not sure how to actually store its contents. getDisplayMedia() returns a Promise that resolves to a MediaStream, which I assign to videoElem.srcObject, and tracks is an array of MediaStreamTrack objects. This is my first time doing any kind of web development, so I'm a bit lost!
let recorder; // kept in outer scope so recorder.stop() can be called later

async function startRecording() {
    const stream = await navigator.mediaDevices.getDisplayMedia({
        video: true,
        audio: true
    });
    recorder = new MediaRecorder(stream);

    const chunks = [];
    recorder.ondataavailable = e => chunks.push(e.data);
    recorder.onstop = e => {
        const blob = new Blob(chunks, { type: chunks[0].type });
        console.log(blob);
        stream.getVideoTracks()[0].stop();

        const filename = "yourCustomFileName";
        if (window.navigator.msSaveOrOpenBlob) {
            // Legacy Edge/IE code path
            window.navigator.msSaveOrOpenBlob(blob, filename);
        } else {
            var elem = window.document.createElement('a');
            elem.href = window.URL.createObjectURL(blob);
            elem.download = filename;
            document.body.appendChild(elem);
            elem.click();
            document.body.removeChild(elem);
        }
    };

    recorder.start();
}
startRecording(); // start the recording

recorder.stop(); // later, end the recording by calling stop()

This will save your recording as a .webm file.
Recording a media element on the MDN docs helped me a ton. Basically, instead of using getUserMedia(), we use getDisplayMedia().
Like with any MediaStream, you can record it using the MediaRecorder API.
I'm working on a project which requires the ability to stream audio from a webpage to other clients. I'm already using WebSockets and would like to channel the data there.
My current approach uses MediaRecorder, but there is a problem with sampling which causes interruptions: it records one second of audio and then sends it to the server, which relays it to the other clients. Is there a way to capture a continuous audio stream and transform it to base64?
Maybe if there were a way to create base64 audio from the MediaStream without delay, it would solve the problem. What do you think?
I would like to keep using WebSockets; I know WebRTC exists.
Has anyone done something like this? Is it doable?
                                                              --> Device 1
MediaStream -> MediaRecorder -> base64 -> WebSocket -> Server --> Device ..
                                                              --> Device 18
Here is a demo of the current approach; you can try it here: https://jsfiddle.net/8qhvrcbz/
var sendAudio = function (b64) {
    // The receiving client would run this snippet; in the demo it is
    // eval'd locally to simulate that client.
    var message = 'var audio = document.createElement(\'audio\');';
    message += 'audio.src = "' + b64 + '";';
    message += 'audio.play().catch(console.error);';
    eval(message);
    console.log(b64);
};
navigator.mediaDevices.getUserMedia({
    audio: true
}).then(function (stream) {
    // Every second, start a fresh recorder and stop it after 1050 ms,
    // so consecutive chunks overlap slightly.
    setInterval(function () {
        var chunks = [];
        var recorder = new MediaRecorder(stream);
        recorder.ondataavailable = function (e) {
            chunks.push(e.data);
        };
        recorder.onstop = function (e) {
            var audioBlob = new Blob(chunks);
            var reader = new FileReader();
            reader.readAsDataURL(audioBlob);
            reader.onloadend = function () {
                var b64 = reader.result;
                b64 = b64.replace('application/octet-stream', 'audio/mpeg');
                sendAudio(b64);
            };
        };
        recorder.start();
        setTimeout(function () {
            recorder.stop();
        }, 1050);
    }, 1000);
});
WebSockets were not the best fit. I solved it by using WebRTC instead.
The WebSocket version only became tolerable when recording 1050 ms instead of 1000: it causes a bit of overlap, but that is still better than hearing gaps.
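To make the WebRTC route concrete, here is a minimal loopback sketch with both peers in the same page, so no signaling server is involved; in a real deployment the offer/answer and ICE candidates would travel over your existing WebSocket, and all names below are assumptions.

// pc1 sends microphone audio to pc2 inside one page (loopback demo).
const pc1 = new RTCPeerConnection();
const pc2 = new RTCPeerConnection();

// Exchange ICE candidates directly instead of via a signaling channel.
pc1.onicecandidate = e => e.candidate && pc2.addIceCandidate(e.candidate);
pc2.onicecandidate = e => e.candidate && pc1.addIceCandidate(e.candidate);

// Play whatever arrives on the receiving side.
pc2.ontrack = e => {
    const audio = new Audio();
    audio.srcObject = e.streams[0];
    audio.play().catch(console.error);
};

navigator.mediaDevices.getUserMedia({ audio: true }).then(async stream => {
    stream.getTracks().forEach(track => pc1.addTrack(track, stream));
    // Offer/answer dance; normally these travel over the WebSocket.
    await pc1.setLocalDescription(await pc1.createOffer());
    await pc2.setRemoteDescription(pc1.localDescription);
    await pc2.setLocalDescription(await pc2.createAnswer());
    await pc1.setRemoteDescription(pc2.localDescription);
});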
Although you have solved this through WebRTC, which is the industry-recommended approach, I'd like to share my answer anyway.
The problem here is not WebSockets in general but the MediaRecorder API. Instead of using it, one can capture raw PCM audio and hand the captured array buffers to a web worker or WASM for encoding into MP3 chunks or similar.
// audioStream is the MediaStream obtained from getUserMedia()
const context = new AudioContext();
let leftChannel = []; // a second array would be needed for stereo input
let recordingLength = 0;
let bufferSize = 512;
let sampleRate = context.sampleRate;

const audioSource = context.createMediaStreamSource(audioStream);
const scriptNode = context.createScriptProcessor(bufferSize, 1, 1);
audioSource.connect(scriptNode);
scriptNode.connect(context.destination);

scriptNode.onaudioprocess = function (e) {
    // Copy the samples; the underlying buffer is reused between callbacks.
    leftChannel.push(new Float32Array(e.inputBuffer.getChannelData(0)));
    recordingLength += bufferSize;
    // Do something with the data, e.g. convert it to WAV or MP3.
};
Based on my experiments this gives you "real-time" audio. My theory about the MediaRecorder API is that it buffers internally before emitting anything, which causes the observable delay.
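To illustrate the hand-off described above, here is a rough sketch that converts each Float32 buffer to 16-bit PCM and pushes it over a WebSocket as it arrives; the socket URL and the raw-PCM wire format are assumptions, and a production setup would do this conversion in a web worker as suggested.

const socket = new WebSocket('ws://localhost:8081'); // hypothetical relay server
socket.binaryType = 'arraybuffer';

function floatTo16BitPCM(float32) {
    const int16 = new Int16Array(float32.length);
    for (let i = 0; i < float32.length; i++) {
        // Clamp to [-1, 1], then scale to the signed 16-bit range.
        const s = Math.max(-1, Math.min(1, float32[i]));
        int16[i] = s < 0 ? s * 0x8000 : s * 0x7FFF;
    }
    return int16;
}

scriptNode.onaudioprocess = function (e) {
    const pcm = floatTo16BitPCM(e.inputBuffer.getChannelData(0));
    if (socket.readyState === WebSocket.OPEN) {
        socket.send(pcm.buffer); // raw 16-bit mono samples
    }
};

The server then relays these small binary frames to the other devices, which can feed them back into an AudioContext, so there is no per-second re-encoding and none of the resulting gaps.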
I am working with MediaRecorder to save canvas actions as video (the format does not matter).
var recordedBlobs = [];
var stream = canvas.captureStream();
var mediaRecorder = new MediaRecorder(stream, options); // options, e.g. { mimeType: 'video/webm' }
captureStream() captures my canvas as it is being edited, and I save the recorded data with this handler:
function handleDataAvailable(event) {
    if (event.data && event.data.size > 0) {
        recordedBlobs.push(event.data);
    }
}
When the recording stops, I call this function to play my recordedBlobs as a video:
function handleStop(event) {
    console.log('Recorder stopped: ', event);
    const superBuffer = new Blob(recordedBlobs, { type: 'video/webm' });
    video.src = window.URL.createObjectURL(superBuffer);
}
It works fine: the video player starts playing my canvas data as video.
But I want to save these values to a .txt file or something like that, because the canvas actions may take an hour and the video may be 1 GB or more. My goal is to read this file on another page and play the data as video, but I don't know how to do that.
What is the best way to save this data?
You are pretty much there, just add:
const superBuffer = new Blob(recordedBlobs, { type: 'video/webm' });
let objUrl = window.URL.createObjectURL(superBuffer);
let anchor = document.createElement('a');
anchor.href = objUrl;
anchor.setAttribute('download', 'YourFileName.txt'); // the bytes are still WebM, only the extension differs
anchor.click();
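To close the loop on playing it from another page: the downloaded file can be loaded back with a plain file input, since a File is already a Blob and needs no parsing. A minimal sketch, assuming that page has an <input type="file" id="picker"> and a <video id="playback" controls> element:

const picker = document.getElementById('picker');
const playback = document.getElementById('playback');

picker.onchange = function () {
    const file = picker.files[0]; // a File is a Blob
    playback.src = URL.createObjectURL(file);
    playback.play();
};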
I am currently trying to stream a .webm video file via socket.io to my client (currently using Chrome as the client).
Appending the first Uint8Array to the SourceBuffer works fine, but appending further ones does not and throws the following error:
Uncaught DOMException: Failed to execute 'appendBuffer' on 'SourceBuffer': The HTMLMediaElement.error attribute is not null.
My current code:
'use strict';

let socket = io.connect('http://localhost:1337');
let mediaSource = new MediaSource();
let video = document.getElementById("player");
let queue = [];
let sourceBuffer;

video.src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
    sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vorbis,vp8"');
    socket.on("video", function (data) {
        let uIntArray = new Uint8Array(data);
        if (!sourceBuffer.updating) {
            sourceBuffer.appendBuffer(uIntArray);
        } else {
            queue.push(data);
        }
    });
});
Server-side code (snippet):
io.on('connection', function (socket) {
    console.log("Client connected");
    let readStream = fs.createReadStream("bunny.webm");
    readStream.addListener('data', function (data) {
        socket.emit('video', data);
    });
});
I also removed the webkit checks since this will only run on Chromium browsers.
I think you have to free the buffer; see the remove() function:
http://w3c.github.io/media-source/#widl-SourceBuffer-remove-void-double-start-unrestricted-double-end
Let me know if it helped.
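Expanding on that with a sketch: the snippet in the question pushes to queue but never drains it, so chunks that arrive while the buffer is updating are silently dropped. Draining from updateend and periodically calling remove() on already-played ranges keeps the buffer from filling up; the identifiers match the question's code, while the 30-second retention window is an arbitrary assumption.

sourceBuffer.addEventListener('updateend', function () {
    // First drain any chunks queued while the previous append was running.
    if (queue.length > 0 && !sourceBuffer.updating) {
        sourceBuffer.appendBuffer(new Uint8Array(queue.shift()));
        return;
    }
    // Then evict data the playhead has already passed to avoid QuotaExceededError.
    var buffered = sourceBuffer.buffered;
    if (!sourceBuffer.updating && buffered.length > 0 &&
            video.currentTime - buffered.start(0) > 30) {
        // remove() is asynchronous too and fires its own updateend.
        sourceBuffer.remove(buffered.start(0), video.currentTime - 10);
    }
});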