I am trying to use media streams with getUserMedia in Chrome on Android. To test, I worked up the script below, which simply connects the input stream to the output. This code works as expected in Chrome on Windows, but on Android I do not hear anything. The user is prompted to allow microphone access, but no audio comes out of the speaker, handset speaker, or headphone jack.
navigator.webkitGetUserMedia({
  video: false,
  audio: true
}, function (stream) {
  var audioContext = new webkitAudioContext();
  // Route the microphone input straight to the output
  var input = audioContext.createMediaStreamSource(stream);
  input.connect(audioContext.destination);
});
In addition, the feedback beeps when rolling the volume up and down do not sound, as if Chrome were actually playing audio to the system.
Is it true that this functionality isn't supported on Chrome for Android yet? The following questions are along similar lines, but neither really has a definitive answer or explanation.
HTML5 audio recording not working in Google Nexus
detecting support for getUserMedia on Android browser fails
As I am new to using getUserMedia, I wanted to make sure there wasn't something I was doing in my code that could break compatibility.
I should also note that this problem doesn't seem to apply to getUserMedia itself. It is possible to use getUserMedia with an <audio> tag, as demonstrated by this code (which uses jQuery):
navigator.webkitGetUserMedia({
  video: false,
  audio: true
}, function (stream) {
  $('body').append(
    $('<audio>').attr('autoplay', 'true').attr('src', webkitURL.createObjectURL(stream))
  );
});
Chrome on Android now properly supports getUserMedia. I suspect that this originally had something to do with a difference in sample rate between recording and playback (desktop Chrome exhibits the same issue). In any case, everything started working in the stable channel some time around February 2014.
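For anyone retesting this today, here is a sketch of the same loopback written against the current unprefixed APIs (an untested sketch, assuming a modern browser):

navigator.mediaDevices.getUserMedia({ audio: true })
  .then(function (stream) {
    var audioContext = new AudioContext();
    // Newer Chrome may suspend the context until a user gesture;
    // resume() here is a precaution.
    audioContext.resume();
    var input = audioContext.createMediaStreamSource(stream);
    input.connect(audioContext.destination); // mic -> speakers loopback
  })
  .catch(function (err) {
    console.error('getUserMedia failed:', err);
  });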
Problem
I am recording WebM chunks with MediaRecorder in Chrome 83 on Windows 10 and sending them to another computer. The chunks are played back in another Chrome instance using Media Source Extensions (MSE):
sourceBuffer.appendBuffer(webmChunkData);
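For context, the surrounding playback code follows the usual MSE pattern, roughly like this (a sketch; `video`, the chunk queue, and the exact mime string are assumptions):

var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

var queue = []; // webm chunks arriving from the network
var sourceBuffer;

mediaSource.addEventListener('sourceopen', function () {
  // The mime string must match what MediaRecorder produced
  sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
  sourceBuffer.addEventListener('updateend', appendNext);
  appendNext();
});

// The network receive code should push chunks into `queue` and call appendNext()
function appendNext() {
  if (queue.length && !sourceBuffer.updating) {
    sourceBuffer.appendBuffer(queue.shift());
  }
}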
Everything works fine for the first 1 to 1.20 seconds, but after that an audio/video syncing problem starts. The gap between audio and video is minimal at first, but it grows as time goes on.
The weird thing is that we see different behaviour on different browsers (Chrome's version is 83+ on almost all operating systems).
Could the camera be the problem?
I don't think the camera is the problem: I dual-boot Fedora and Windows on the same machine, and the WebM chunks play fine under Fedora.
Could the sample rate be the problem?
I doubt it. When I compare the sample rates the browsers use during playback, chrome://media-internals shows 48000 both with and without the syncing problem.
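(If you want to verify the rate on a given machine, it is quick to check from the console:)

// Log the sample rate the browser's audio stack is running at
var ctx = new (window.AudioContext || window.webkitAudioContext)();
console.log('AudioContext sample rate:', ctx.sampleRate); // e.g. 48000
ctx.close();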
Warning message from Chrome
The Chrome instance that has the sync problem also logs a warning message in chrome://media-internals.
Questions:
Why is there an audio/video syncing problem when both recording and playback are done in Chrome on Windows 10?
How can I eliminate this syncing problem?
I believe I have a workaround for you. The problem seems specific to Chrome + MediaRecorder + VP8, and has nothing to do with MSE or the platform. I have the same issues on Chrome 98 on macOS 12.2.1. Additionally, if you decrease the .start(timeslice) argument, the issue appears more rapidly and more severely.
However... when I use VP9 the problem does not manifest!
I'm using code like this:
function supportedMimeTypes(): string[] {
  // From least desirable to most desirable
  return [
    // most compatible, but Chrome creates bad chunks
    'video/webm;codecs=vp8,opus',
    // works in Chrome; Firefox falls back to vp8
    'video/webm;codecs=vp9,opus'
  ].filter(
    (m) => MediaRecorder.isTypeSupported(m)
  );
}
const mimeType = supportedMimeTypes().pop();
if (!mimeType) throw new Error("Could not find a supported mime-type");

const recorder = new MediaRecorder(stream, {
  // be sure to use a mimeType with a specific `codecs=`, as otherwise
  // Chrome completely ignores it and uses video/x-matroska!
  // https://stackoverflow.com/questions/64233494/mediarecorder-does-not-produce-a-valid-webm-file
  mimeType,
});
recorder.start(1000);
The resulting VP9 appears to play in Firefox, and a VP8 recorded in Firefox plays well in Chrome too.
I use audio-recorder-polyfill in my React project to make audio recording possible in Safari. The recording itself appears to start, but no audio data ever becomes available: the "dataavailable" event never fires, and no data seems to be produced after stopping the recording either.
recordFunc() {
  navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
    recorder = new MediaRecorder(stream);
    // Set record to <audio> when recording will be finished
    recorder.addEventListener('dataavailable', e => {
      this.audio.current.src = URL.createObjectURL(e.data);
    });
    // Start recording
    recorder.start();
  });
}

stopFunc() {
  // Stop recording
  recorder.stop();
  // Remove "recording" icon from browser tab
  recorder.stream.getTracks().forEach(i => i.stop());
}
There have been a number of similar issues posted on audio-recorder-polyfill's issue tracker.
Root cause
One of those issues, #4, is still open. Several comments on it hint that the root cause is that Safari cancels the AudioContext if it was not created in a handler for a user interaction (e.g. a click).
Possible solutions
You may be able to get it to work if you:
Do the initialisation inside a handler for a user interaction (e.g. <button onclick="recordFunc()">); see the sketch after this list
Do not attempt to reuse the MediaStream returned from getUserMedia() for multiple recordings
Do not attempt more than 4 (or 6?) audio recordings on the same page (sources [1], [2] mention that Safari will block this)
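A minimal sketch of the first two tips combined (the button ids are hypothetical): the recorder, and therefore the polyfill's AudioContext, is created directly inside the click handler, and the stream is released after each recording so the next one starts fresh:

let recorder;

// Hypothetical button ids; the point is that getUserMedia/MediaRecorder
// are set up inside the click handler itself.
document.getElementById('record').addEventListener('click', () => {
  navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
    recorder = new MediaRecorder(stream);
    recorder.addEventListener('dataavailable', e => {
      document.querySelector('audio').src = URL.createObjectURL(e.data);
    });
    recorder.start();
  });
});

document.getElementById('stop').addEventListener('click', () => {
  recorder.stop();
  // Stop the tracks so a future recording gets a fresh stream
  recorder.stream.getTracks().forEach(t => t.stop());
});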
Alternative libraries
You might also try the StereoAudioRecorder class from the RecordRTC package, which has more users (3K) but appears less maintained; it might work for you.
Upcoming support
If you'd prefer to stick to the MediaRecorder API and the tips above don't work for you, the good news is that there is experimental support for MediaRecorder in Safari 12.4 and up (iOS and macOS), and it appears to be supported in the latest Safari Technology Preview.
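If you keep using the polyfill in the meantime, a common pattern (per the polyfill's README, as far as I can tell) is to install it only where MediaRecorder is missing, so native support is used as soon as it ships:

import AudioRecorder from 'audio-recorder-polyfill';

// Only replace MediaRecorder where there is no native implementation
if (!window.MediaRecorder) {
  window.MediaRecorder = AudioRecorder;
}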
See also
The comments in this issue may also be useful
I've been developing a web app to scan barcodes in a live video stream. I have used the following code for the video streaming:
navigator.mediaDevices.getUserMedia({ video: constraints }).then(function (stream) {
  // video.src = window.URL.createObjectURL(stream);
  video.srcObject = stream;
  video.play();
  // ...
});
It works as expected in Chrome on Android and in Safari on iOS. But when I tried it in Chrome on iOS, it does not work.
I have also tried adding the following attributes:
video.setAttribute('autoplay', '');
video.setAttribute('muted', '');
video.setAttribute('playsinline', '');
But it was no use. Can anyone please suggest the right way to do this?
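Until camera access works there, one option is to detect the unsupported case up front and show a fallback instead of a dead video element; a hedged sketch:

function liveCameraSupported() {
  // Older Chrome on iOS (WKWebView before camera access was allowed)
  // exposes no navigator.mediaDevices.getUserMedia at all
  return !!(navigator.mediaDevices && navigator.mediaDevices.getUserMedia);
}

if (!liveCameraSupported()) {
  alert('Live barcode scanning is not supported in this browser; please open this page in Safari.');
}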
I'm attempting to display a video in Firefox. The video has to be MP4; converting the video isn't an option. However, this will only work in some situations, as Firefox relies on OS-level support for MP4 rather than built-in support.
It's ok that it won't always work, but I want to be able to detect when it will fail.
I've tried several existing solutions on StackOverflow ( How to check if the browser can play mp4 via html5 video tag? )
My current testing code reads:
var mp4Supported = !!document.createElement('video').canPlayType('video/mp4; codecs=avc1.42E01E,mp4a.40.2');
if (!mp4Supported) {
  console.log("MP4 not supported");
} else {
  console.log("MP4 supported");
}
However, since Firefox now does (technically) support MP4, this seems to always return true, whether or not the video can actually be decoded.
Console output from the above on Firefox where there's no native support for MP4:
"MP4 supported"
Media resource <My resource URL> could not be decoded.
Does anyone know of a reliable way to detect whether playback will actually succeed, now that Firefox has partial support?
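One approach that should catch the failing case is to listen for the media element's error event and inspect the MediaError code, rather than trusting canPlayType(); a sketch (the element lookup is an assumption):

var video = document.querySelector('video');

video.addEventListener('error', function () {
  var err = video.error;
  // MEDIA_ERR_DECODE / MEDIA_ERR_SRC_NOT_SUPPORTED mean the browser
  // could not actually play the file, however optimistic canPlayType() was
  if (err && (err.code === MediaError.MEDIA_ERR_DECODE ||
              err.code === MediaError.MEDIA_ERR_SRC_NOT_SUPPORTED)) {
    console.log('MP4 playback failed; switching to fallback');
  }
});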
Ok, I'm going to try and make my question as clear as possible, but I'm pretty confused, so let me know if I'm not getting the message across.
I'm trying to use getUserMedia to access the webcam, and then use this
http://www.w3.org/TR/mediastream-recording/
to record a brief captured video. The problem is, when I try to create a new MediaRecorder(stream), I'm told that MediaRecorder is undefined. I haven't used this API before, so I don't really know what I'm missing. Here is the relevant code:
function onVideoFail(e) {
  console.log('webcam fail!', e);
}

function hasGetUserMedia() {
  return !!(navigator.getUserMedia || navigator.webkitGetUserMedia ||
            navigator.mozGetUserMedia || navigator.msGetUserMedia);
}

if (hasGetUserMedia()) {
  window.URL = window.URL || window.webkitURL;
  navigator.getUserMedia = navigator.getUserMedia ||
                           navigator.webkitGetUserMedia ||
                           navigator.mozGetUserMedia ||
                           navigator.msGetUserMedia;
  if (navigator.getUserMedia) {
    navigator.getUserMedia({ video: true, audio: false }, function (stream) {
      var video = document.querySelector('video');
      var recorder = new MediaRecorder(stream); // <<<<<< THIS IS MY PROBLEM SPOT
      video.src = window.URL.createObjectURL(stream);
      video.play();
      // webcamstream = stream;
      // streamrecorder = webcamstream.record();
    }, onVideoFail);
  } else {
    alert('failed');
  }
} else {
  alert('getUserMedia() is not supported by this browser!!');
}
I've been trying to look at this for reference:
HTML5 getUserMedia record webcam, both audio and video
MediaStream Recording (also called the Media Recorder API, after the MediaRecorder JS object it defines) has now been implemented in 2 major desktop browsers:
Firefox 30: audio + video
Chrome 47 and 48: video only, with Experimental Web Platform features enabled in chrome://flags
Chrome 49: audio + video
Containers:
Both record to .webm containers.
Video codecs:
Both record VP8 video
Chrome 49+ can record VP9 video
Chrome 52+ can record H.264 video
Audio codecs:
Firefox records Vorbis audio @ 44.1 kHz
Chrome 49 records Opus audio @ 48 kHz
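You can probe these container/codec combinations at runtime with MediaRecorder.isTypeSupported(), for example:

['video/webm;codecs=vp8,opus',
 'video/webm;codecs=vp9,opus',
 'video/webm;codecs=h264,opus'].forEach(function (type) {
  console.log(type, MediaRecorder.isTypeSupported(type));
});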
Demos:
https://simpl.info/mediarecorder/
https://addpipe.com/media-recorder-api-demo/
Make sure you run these demos on HTTPS or localhost:
Starting with Chrome 47, getUserMedia() requests are only allowed from secure origins: HTTPS or localhost.
Further reading:
MediaRecorder spec on w3c
HTML5’s Media Recorder API in Action on Chrome and Firefox on addpipe.com
MediaRecorder API on mozilla.org
Chrome 47 WebRTC: media recording, secure origins on developers.google.com
MediaRecorder API Available in Chrome 49 on developers.google.com
Disclaimer: I work at Pipe where we handle video recording.
I am currently using this API, and I've found out that it is currently only implemented in the Nightly version of Firefox, and that it can only record audio.
It isn't implemented on Chrome (to my knowledge).
Here is how I use it, if it can help:
function record(length, stream) {
  var recorder = new window.MediaRecorder(stream);

  recorder.ondataavailable = function (event) {
    if (recorder.state == 'recording') {
      var blob = new window.Blob([event.data], {
        type: 'audio/ogg'
      });
      // use the created blob
      recorder.stop();
    }
  };

  recorder.onstop = function () {
    recorder = null;
  };

  recorder.start(length);
}
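For example, to grab roughly five seconds of audio from a microphone stream, you might call it like this (a sketch):

navigator.mediaDevices.getUserMedia({ audio: true })
  .then(function (stream) {
    // dataavailable fires after ~5000 ms, and the recorder then stops
    record(5000, stream);
  });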
I put a MediaStream Recording demo at simpl.info/mediarecorder.
This currently works with Firefox Nightly and, as @Apzx says, it's audio only. There is an Intent to Implement for Chrome.
As of version 49, Chrome now has support for the MediaRecorder API. You may now record MediaStream objects. Unfortunately, if you're building a product that must record MediaStreams for browsers older than literally the latest version of Chrome (at least as of this writing), then you will need to make use of a WebRTC Media Server/Gateway.
The basic idea is that a peer connection is instantiated on a server and that your local peer attaches a video and/or audio stream to its connection object to be sent to the server peer. The server then listens to the incoming WebRTC stream and records it to a file. A couple of media servers you may want to check out:
Kurento
Janus
I personally am using Kurento and recording streams with it with great success.
In order to get a media server to work, you will need to spin up your own app server that handles signaling and handling of ICE Candidates. This is pretty simple, and Kurento has some pretty good examples with Node and Java.
If you are targeting a general audience and are using a media server, you will also probably need a STUN or TURN server. These servers essentially use the network topology to discover your media server's public IP and your client's public IP. Beware that if either end (the media server or the client) lies behind a symmetric NAT, a STUN server will not be enough and you will need a TURN server (a free one can be found here). Without going into too much detail, a good thing to remember is that a STUN server acts as a signaling gateway whereas a TURN server is a relay gateway: media streams will literally pass through a TURN server, whereas with STUN they pass directly from one RTC peer connection to the other.
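On the client, wiring in the STUN/TURN servers is just part of the RTCPeerConnection configuration; a sketch with placeholder URLs and credentials:

// Placeholder server addresses and credentials; substitute your own
var pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.example.org:3478' },
    {
      urls: 'turn:turn.example.org:3478',
      username: 'user',
      credential: 'secret'
    }
  ]
});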
Hopefully this was helpful. If you really need RTC recording capabilities, then you're going to be going down a long road, so make sure it's worth it.
See also RecordRTC, which has workarounds for Chrome to roughly emulate the capabilities of MediaStream Recording. The Firefox documentation is here.