getUserMedia not working in iOS 11.3 Chrome - javascript

I've been developing a web app that scans barcodes from a live video stream. I have used the following code for the video stream:
navigator.mediaDevices.getUserMedia({ video: constraints }).then(function(stream) {
  // video.src = window.URL.createObjectURL(stream); // deprecated approach
  video.srcObject = stream;
  video.play();
  // ...
});
It works as expected in the Android Chrome browser and also in the iOS Safari browser. But when I tried it in the iOS Chrome browser, it does not work.
I have also tried adding the following attributes to the video element:
video.setAttribute('autoplay', '');
video.setAttribute('muted', '');
video.setAttribute('playsinline', '');
But that made no difference. Can anyone please suggest the right way to do this?
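For reference, a minimal sketch of the setup described above, combining feature detection with the playsinline/muted/autoplay attributes (the element id and constraint values here are placeholders, not taken from the original code):

const video = document.getElementById('preview'); // hypothetical element id
video.setAttribute('autoplay', '');
video.setAttribute('muted', '');
video.setAttribute('playsinline', '');

if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
  navigator.mediaDevices.getUserMedia({ video: { facingMode: 'environment' } })
    .then(function (stream) {
      video.srcObject = stream;
      return video.play();
    })
    .catch(function (err) {
      console.error('getUserMedia failed:', err.name, err.message);
    });
} else {
  // Browsers that do not expose the camera API (e.g. some in-app/WebView browsers) land here
  console.warn('navigator.mediaDevices.getUserMedia is not available in this browser.');
}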

Related

WebRTC Webcam not working on Safari and Microsoft Edge

I want to access the webcam and audio device to record video using WebRTC. However, it only works in Chrome and Firefox.
Interestingly, it does not work in Edge or Safari: it asks for camera permission, but after I grant it, the camera doesn't load and I get the following error in the console.
Error message on Safari and Edge:
navigator.getUserMedia error: ReferenceError
My code looks like this:
async init(constraints) {
  try {
    if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
      const stream = await navigator.mediaDevices.getUserMedia(constraints);
      this.handleSuccess(stream);
    } else {
      this.setState({
        error:
          'Your Browser is not supported. Please use latest version of Chrome, Safari, Firefox or Edge.',
      });
    }
    if (!MediaRecorder) {
      this.setState({
        error:
          'Your Browser is not supported. Please use latest version of Chrome, Safari, Firefox or Edge.',
      });
    }
  } catch (e) {
    this.setState({
      error:
        'We could not find any audio/video recording device. Please make sure you have given permission to use webcam and microphone.',
    });
    console.error('navigator.getUserMedia error:', e.name);
  }
}
Any suggestions for getting the camera to load properly in all of these browsers?
It looks like you are having this issue in the legacy MS Edge browser; correct me if I am misunderstanding.
I tried the official sample code in the legacy MS Edge browser (44.18362.449.0) and it works fine.
WebRTC samples getUserMedia
I suggest testing that example in the MS Edge browser and letting us know whether it works.
If it does, you can adapt your code based on the official sample from their GitHub page.
If you are using an older version of the MS Edge browser, I suggest updating to the latest version and testing again.
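For reference, the core pattern that the official sample demonstrates boils down to roughly the following (the constraint values here are placeholders):

const constraints = { audio: true, video: { width: 1280, height: 720 } };

async function start() {
  try {
    const stream = await navigator.mediaDevices.getUserMedia(constraints);
    const videoElement = document.querySelector('video');
    videoElement.srcObject = stream; // preferred over createObjectURL(stream)
  } catch (e) {
    console.error('getUserMedia error:', e.name, e.message);
  }
}

start();

If this sketch runs in the affected browser, the problem is more likely in the surrounding application code than in getUserMedia itself.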

Cannot record video+audio on iPhone Safari

I have used RecordRTC for capturing the video+audio from the browser.
On Android devices it works perfectly, as expected. But on iPhone devices, especially in the Safari browser, it does not record as expected.
The browser console produces the following error:
Your browser does not support Media Recorder API. Please try other modules e.g. WhammyRecorder or StereoAudioRecorder.
Could someone please help me out with the following:
Does Safari support basic video capture?
For Safari it is better to use StereoAudioRecorder, which is built into RecordRTC.js, as the recorder.
This is the documentation for StereoAudioRecorder; you're going to have to read it first :)
https://recordrtc.org/StereoAudioRecorder.html
const options = {
  type: 'video',
  recorderType: StereoAudioRecorder
};
const recorder = new RecordRTC(mic, options); // mic is the MediaStream from getUserMedia
recorder.startRecording();
I hope this will help you.
https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder
MediaRecorder was introduced in iOS 14 (Safari 14); if you have an older version, this may be your problem.
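A possible way to combine the two answers is to feature-detect MediaRecorder before picking a recorder. The option values below follow the snippet above and are illustrative only; stream is assumed to be the MediaStream returned by getUserMedia:

const hasMediaRecorder = typeof MediaRecorder !== 'undefined';

const options = hasMediaRecorder
  ? { type: 'video' }                                     // Safari 14+ / Chrome / Firefox
  : { type: 'audio', recorderType: StereoAudioRecorder }; // older Safari: audio only

const recorder = new RecordRTC(stream, options);
recorder.startRecording();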

getUserMedia not working for IOS, how to access microphone on IOS from website?

I recently tried using navigator.mediaDevices.getUserMedia. I am trying to access the microphone from a website I created (www.speechbud.com) so that speech-to-text transcription can be performed. This works on PC and on Android, but doesn't seem to work on iOS. I have checked many previous articles and they say getUserMedia should work from iOS 11 onwards, yet it still does not work. Is iOS still not compatible, and if that is the case, how am I supposed to access the microphone from a website?
I have checked previous articles and tried using different npm packages, with no luck.
getUserMedia({ video: false, audio: true }, function (err, stream) {
  if (err) {
    console.log('failed');
    stream.end(); // end the stream
  } else {
    micStream.setStream(stream);
    if (keepMic) {
      preservedMicStream = micStream;
    }
  }
});
TL;DR: I would basically like to be able to access the microphone on an iOS device on a button click, for live transcription.
THANKS!
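For what it's worth, a minimal sketch of microphone access through the standard promise-based API (the button id and the transcription hand-off are placeholders; note that iOS 11 only exposes getUserMedia in Safari itself, not in in-app browsers):

document.getElementById('recordButton').addEventListener('click', function () { // hypothetical button id
  if (!(navigator.mediaDevices && navigator.mediaDevices.getUserMedia)) {
    alert('Microphone access is not available in this browser.');
    return;
  }
  navigator.mediaDevices.getUserMedia({ audio: true, video: false })
    .then(function (stream) {
      // hand the MediaStream over to the transcription pipeline here
      console.log('microphone stream acquired', stream.getAudioTracks());
    })
    .catch(function (err) {
      console.error('getUserMedia error:', err.name, err.message);
    });
});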

getUserMedia with MediaStreamAudioSourceNode on Android Chrome

I am trying to use media streams with getUserMedia in Chrome on Android. To test, I worked up the script below, which simply connects the input stream to the output. This code works as expected in Chrome on Windows, but on Android I do not hear anything. The user is prompted to allow microphone access, but no audio comes out of the speaker, handset speaker, or headphone jack.
navigator.webkitGetUserMedia({
  video: false,
  audio: true
}, function (stream) {
  var audioContext = new webkitAudioContext();
  var input = audioContext.createMediaStreamSource(stream);
  input.connect(audioContext.destination);
});
In addition, the feedback beeps you normally hear when rolling the volume up and down do not sound, as though Chrome were playing audio to the system.
Is it true that this functionality isn't supported in Chrome for Android yet? The following questions are along similar lines, but neither really has a definitive answer or explanation.
HTML5 audio recording not working in Google Nexus
detecting support for getUserMedia on Android browser fails
As I am new to using getUserMedia, I wanted to make sure there wasn't something I was doing in my code that could break compatibility.
I should also note that this problem doesn't seem to apply to getUserMedia itself. It is possible to feed a getUserMedia stream into an <audio> tag, as demonstrated by this code (which uses jQuery):
navigator.webkitGetUserMedia({
  video: false,
  audio: true
}, function (stream) {
  $('body').append(
    $('<audio>').attr('autoplay', 'true').attr('src', webkitURL.createObjectURL(stream))
  );
});
Chrome on Android now properly supports getUserMedia. I suspect the original problem had something to do with the difference in sample rate between recording and playback (which exhibits the same issue in desktop Chrome). In any case, it all started working in the stable channel some time around February 2014.
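For completeness, roughly the unprefixed equivalent of the snippet above on current browsers (on mobile, the AudioContext usually has to be created or resumed from a user gesture):

navigator.mediaDevices.getUserMedia({ audio: true, video: false })
  .then(function (stream) {
    var audioContext = new (window.AudioContext || window.webkitAudioContext)();
    var input = audioContext.createMediaStreamSource(stream);
    input.connect(audioContext.destination); // route the microphone straight to the speakers
    return audioContext.resume();            // in case the context starts out suspended
  })
  .catch(function (err) {
    console.error('getUserMedia error:', err.name, err.message);
  });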

How to use MediaStream Recording

Ok, I'm going to try and make my question as clear as possible, but I'm pretty confused, so let me know if I'm not getting the message across.
I'm trying to use getUserMedia to access the webcam, and then use the MediaStream Recording API
http://www.w3.org/TR/mediastream-recording/
to record a brief captured video. The problem is that when I try to create new MediaRecorder(stream), I'm told it is undefined. I haven't used this API before, so I don't really know what I'm missing. Here is the relevant code:
function onVideoFail(e) {
  console.log('webcam fail!', e);
}

function hasGetUserMedia() {
  return !!(navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia || navigator.msGetUserMedia);
}

if (hasGetUserMedia()) {
  window.URL = window.URL || window.webkitURL;
  navigator.getUserMedia = navigator.getUserMedia ||
    navigator.webkitGetUserMedia ||
    navigator.mozGetUserMedia ||
    navigator.msGetUserMedia;
  if (navigator.getUserMedia) {
    navigator.getUserMedia({video: true, audio: false}, function(stream) {
      var video = document.querySelector('video');
      var recorder = new MediaRecorder(stream); // <<<<<< THIS IS MY PROBLEM SPOT
      video.src = window.URL.createObjectURL(stream);
      video.play();
      // webcamstream = stream;
      // streamrecorder = webcamstream.record();
    }, onVideoFail);
  } else {
    alert('failed');
  }
} else {
  alert('getUserMedia() is not supported by this browser!!');
}
I've been trying to look at this for reference:
HTML5 getUserMedia record webcam, both audio and video
MediaStream Recording (or the Media Recorder API, after the MediaRecorder JS object it defines) has now been implemented in two major desktop browsers:
Firefox 30: audio + video
Chrome 47 and 48: video only, with "Experimental Web Platform features" enabled in chrome://flags
Chrome 49: audio + video
Containers:
Both record to .webm containers.
Video codecs:
Both record VP8 video
Chrome 49+ can record VP9 video
Chrome 52+ can record H.264 video
Audio codecs:
Firefox records Vorbis audio @ 44.1 kHz
Chrome 49 records Opus audio @ 48 kHz
Demos:
https://simpl.info/mediarecorder/
https://addpipe.com/media-recorder-api-demo/
Make sure you run these demos on HTTPS or localhost:
Starting with Chrome 47, getUserMedia() requests are only allowed from secure origins: HTTPS or localhost.
Further reading:
MediaRecorder spec on w3c
HTML5’s Media Recorder API in Action on Chrome and Firefox on addpipe.com
MediaRecorder API on mozilla.org
Chrome 47 WebRTC: media recording, secure origins on developers.google.com
MediaRecorder API Available in Chrome 49 on developers.google.com
Disclaimer: I work at Pipe where we handle video recording.
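As a sketch of how that support matrix translates into code, MediaRecorder.isTypeSupported can be used to pick a container/codec combination the current browser can actually record; the candidate list below is illustrative:

function pickMimeType() {
  var candidates = [
    'video/webm;codecs=vp9,opus',
    'video/webm;codecs=vp8,opus',
    'video/webm'
  ];
  return candidates.find(function (t) { return MediaRecorder.isTypeSupported(t); });
}

navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then(function (stream) {
    var recorder = new MediaRecorder(stream, { mimeType: pickMimeType() });
    var chunks = [];
    recorder.ondataavailable = function (e) { if (e.data.size > 0) chunks.push(e.data); };
    recorder.onstop = function () {
      var blob = new Blob(chunks, { type: recorder.mimeType });
      // upload the blob or play it back in a <video> element here
    };
    recorder.start();
    setTimeout(function () { recorder.stop(); }, 5000); // record roughly five seconds
  });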
I am currently using this API, and I've found out that it is currently only implemented in the Nightly version of Firefox, and it can only record audio.
It isn't implemented in Chrome (to my knowledge).
Here is how I use it, in case it helps:
function record(length, stream) {
  var recorder = new window.MediaRecorder(stream);
  recorder.ondataavailable = function(event) {
    if (recorder.state == 'recording') {
      var blob = new window.Blob([event.data], {
        type: 'audio/ogg'
      });
      // use the created blob
      recorder.stop();
    }
  };
  recorder.onstop = function() {
    recorder = null;
  };
  recorder.start(length);
}
I put a MediaStream Recording demo at simpl.info/mediarecorder.
This currently works with Firefox Nightly and, as @Apzx says, it's audio only. There is an Intent to Implement for Chrome.
As of version 49, Chrome now has support for the MediaRecorder API. You may now record MediaStream objects. Unfortunately, if you're building a product that must record MediaStreams for browsers older than literally the latest version of Chrome (at least as of this writing), then you will need to make use of a WebRTC Media Server/Gateway.
The basic idea is that a peer connection is instantiated on a server and that your local peer attaches a video and/or audio stream to its connection object to be sent to the server peer. The server then listens to the incoming WebRTC stream and records it to a file. A couple of media servers you may want to check out:
Kurento
Janus
I personally am using Kurento and recording streams with it with great success.
In order to get a media server to work, you will need to spin up your own app server that handles signaling and handling of ICE Candidates. This is pretty simple, and Kurento has some pretty good examples with Node and Java.
If you are targeting a general audience and are using a media server, you will also probably need a STUN or TURN server. These servers essentially use the network topology to discover your media server's public IP and your client's public IP. Beware that, if either end (the media server or the client) lies behind a symmetric NAT, a STUN server will not be enough and you will need to use a TURN server (a free one can be found here). Without going into too much detail, a good thing to remember is that a STUN server acts as a signaling gateway, whereas a TURN server is a relay gateway: with TURN the media streams literally pass through the TURN server, whereas with STUN the media passes directly from one RTC peer connection to the other.
Hopefully this was helpful. If you really need RTC recording capabilities, then you're going to be going down a long road, so make sure it's worth it.
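To make the STUN/TURN point concrete, a typical client-side configuration looks roughly like this; the server URLs and credentials are placeholders, and the signaling (offer/answer exchange with the app server) is omitted:

var pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.example.com:3478' },  // STUN: discovers your public IP/port
    {                                        // TURN: relays media when no direct path exists
      urls: 'turn:turn.example.com:3478',
      username: 'user',
      credential: 'secret'
    }
  ]
});

// Attach the local stream so the media server (e.g. Kurento) can receive and record it.
navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then(function (stream) {
    stream.getTracks().forEach(function (track) {
      pc.addTrack(track, stream);
    });
  });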
See also RecordRTC, which has workarounds for Chrome to roughly emulate the capability of MediaStream Recording. Firefox documentation is here
