How to stream a webcam into a mobile browser using ReactJS? - javascript

I have created a simple React app that streams the webcam video to the browser. Here's the link to the GitHub project: Basic WebCam Streamer
The code is pretty simple and straightforward:
class AppStreamCam extends React.Component {
  constructor(props) {
    super(props);
    this.streamCamVideo = this.streamCamVideo.bind(this);
  }

  streamCamVideo() {
    var constraints = { audio: true, video: { width: 1280, height: 720 } };
    navigator.mediaDevices
      .getUserMedia(constraints)
      .then(function(mediaStream) {
        var video = document.querySelector("video");
        video.srcObject = mediaStream;
        video.onloadedmetadata = function(e) {
          video.play();
        };
      })
      .catch(function(err) {
        console.log(err.name + ": " + err.message);
      }); // always check for errors at the end.
  }

  render() {
    return (
      <div>
        <div id="container">
          <video autoPlay={true} id="videoElement" controls></video>
        </div>
        <br/>
        <button onClick={this.streamCamVideo}>Start streaming</button>
      </div>
    );
  }
}
And this is the result:
Once I click on the button, the webcam turns on and starts streaming into the browser.
Here's my problem:
When I open Chrome on my phone and enter the local server address, then click on the button, the app crashes. That is to be expected, since the code is meant to run from the PC browser so that it can turn on the PC webcam.
So when I click on the button from my phone, I understandably get this error:
TypeError: Cannot read property 'getUserMedia' of undefined
My goal is to click on the button from my mobile browser and start streaming the PC webcam to my mobile browser, just as it works on the PC.
However, I do not know exactly where to start. Any help?

I have solved this issue.
1. Open package.json and paste this inside scripts:
"start": "set HTTPS=true&&react-scripts start"
This should serve the app over HTTPS. (The set HTTPS=true&& syntax is Windows-specific.)
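For reference, here is a cross-platform variant of that script (a sketch only; it assumes you install the cross-env package as a dev dependency, which is not part of the original setup):
"scripts": {
  "start": "cross-env HTTPS=true react-scripts start"
}
With create-react-app you can also put HTTPS=true in a .env file at the project root instead of editing the start script.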
2. If this gives you this error:
React app error: Failed to construct 'WebSocket': An insecure WebSocket connection may not be initiated from a page loaded over HTTPS
Open
node_modules/react-dev-utils/webpackHotDevClient.js
and paste this code inside the definition of the connection:
protocol: window.location.protocol === 'https:' ? 'wss' : 'ws',
This is apparently a bug in react-scripts that hasn't been solved yet. If the HTTPS protocol is being used, the dev server should use WebSockets over SSL/TLS (wss) instead of plain WebSockets (ws). You can learn more about it here:
NOTE: This will not stream your PC webcam to your phone; it will stream the phone's own camera instead.
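If the goal really is to view the PC's webcam from the phone, the two browsers need a WebRTC peer connection (or a media server) between them, on top of the getUserMedia call above. The following is only a rough sketch of the browser side; sendSignal and onSignal are hypothetical helpers standing in for whatever signaling channel (WebSocket, long-polling, etc.) you use to exchange offers, answers and ICE candidates between the two pages:
// Each page (PC and phone) creates its own RTCPeerConnection.
const pc = new RTCPeerConnection({ iceServers: [{ urls: "stun:stun.l.google.com:19302" }] });
// forward ICE candidates to the other side via your signaling channel
pc.onicecandidate = (e) => { if (e.candidate) sendSignal({ candidate: e.candidate }); };

// PC side: capture the webcam and send an offer.
async function startSending() {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
  await pc.setLocalDescription(await pc.createOffer());
  sendSignal({ sdp: pc.localDescription });
}

// Phone side: show whatever track arrives.
pc.ontrack = (e) => { document.querySelector("video").srcObject = e.streams[0]; };

// Both sides: react to incoming signaling messages.
onSignal(async (msg) => {
  if (msg.sdp) {
    await pc.setRemoteDescription(msg.sdp);
    if (msg.sdp.type === "offer") {
      await pc.setLocalDescription(await pc.createAnswer());
      sendSignal({ sdp: pc.localDescription });
    }
  } else if (msg.candidate) {
    await pc.addIceCandidate(msg.candidate);
  }
});
Both pages still have to be served over HTTPS, and libraries such as peer.js or simple-peer wrap most of this boilerplate for you.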

Related

video.play() occurred unhandled rejection (NotAllowedError) on iOS

I am using peer.js to stream video in a React app:
addVideoStream(video: HTMLVideoElement, stream: MediaStream) {
  video.srcObject = stream
  video?.addEventListener('loadedmetadata', () => {
    video.play()
  })
  if (this.videoGrid) this.videoGrid.append(video)
}
I got this error at video.play():
The request is not allowed by the user agent or the platform in the current context
I have already allowed permission for audio and video on iOS.
This code works well on other platforms, but not on iOS, and I have no idea why. If I deploy, I just get a black screen on iOS.
How can I fix this?
Thanks in advance.
The problem was how the video tag works on iOS with WebRTC.
Use an HTTPS environment (production), then add these attributes:
if (isMobile && isSafari) {
  this.myVideo.playsInline = true
  this.myVideo.autoplay = true
}
Then it works.
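For reference, the same fix can be expressed as attributes on the element itself (a sketch; the muted attribute is an extra assumption that often helps with autoplay policies on iOS):
<video autoPlay playsInline muted id="videoElement"></video>
In plain HTML (outside JSX) the attributes are written lowercase: autoplay playsinline muted.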

DOMException: Could not start video source in JavaScript

I am working on a webcam recorder app in JavaScript and WebRTC, but when I click on the "Start Recording" button, I get this error:
Cannot access media devices: DOMException: Could not start video source
(anonymous) # scripts.js:43
Promise.catch (async)
(anonymous) # scripts.js:42
And here's my code:
HTML:
<button id="btn-start-recording">Start Recording</button>
<hr>
<video id="my-preview" controls autoplay></video>
<script src="./scripts.js"></script>
<script src="https://cdn.webrtc-experiment.com/RecordRTC.js"></script>
<script src="https://webrtc.github.io/adapter/adapter-latest.js"></script>
JavaScript:
// when the user clicks on the button start video recording
document.getElementById("btn-start-recording").addEventListener(
  "click",
  function () {
    // disable the start recording button
    this.disabled = true;
    // request access to the media devices
    navigator.mediaDevices
      .getUserMedia({
        audio: true,
        video: true,
      })
      .then(function (stream) {
        // display a live preview on the video element of the page
        setSrcObject(stream, video);
        // start to display the preview on the video element
        // and mute the video to disable the echo issue!
        video.play();
        video.muted = true;
        // initialize the recorder
        recorder = new RecordRTCPromisesHandler(stream, {
          mimeType: "video/webm",
          bitsPerSecond: 128000,
        });
        // start recording the video
        recorder
          .startRecording()
          .then(function () {
            console.info("Recording video ...");
          })
          .catch(function (error) {
            console.error("Cannot start video recording: ", error);
          });
        // release stream on stopRecording
        recorder.stream = stream;
        // enable the stop recording button
        document.getElementById("btn-stop-recording").disabled = false;
      })
      .catch(function (error) {
        console.error("Cannot access media devices: ", error); // this is line 43
      });
  },
  false
);
I gave the browser access to the microphone and camera when prompted, and enabled it in the Windows 10 settings.
I also tried serving the page with a live-server extension in Visual Studio Code, as well as running the file locally, but neither worked.
I am working on Windows 10 with Microsoft Edge Chromium 90 and Google Chrome 90.
When I tried Firefox, I got: DOMException: Failed to allocate videosource
getUserMedia in the browser requires the page to be served over HTTPS (i.e. TLS, usually port 443, with a valid lock icon in the address bar); localhost is also treated as a secure origin.
If your web server serves the HTML page over plain HTTP (port 80, page marked as insecure, no lock in the address bar), the request to getUserMedia will fail.
Source: me https://webrtchacks.com/chrome-secure-origin-https/
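As a quick sanity check (my own sketch, not part of the original answer), you can log whether the page is a secure context and whether mediaDevices is exposed before calling getUserMedia:
// If isSecureContext is false, navigator.mediaDevices will usually be undefined.
console.log("secure context:", window.isSecureContext);
console.log("mediaDevices available:", !!navigator.mediaDevices);
if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
  navigator.mediaDevices.getUserMedia({ video: true, audio: true })
    .then((stream) => console.log("got stream:", stream.id))
    .catch((err) => console.error(err.name + ": " + err.message));
}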
Edit
Another potential explanation is that another process is using the camera at the same time. Have you verified that your webcam is not being used by another application? Try completely closing any applications or browsers that have used your camera recently, to release any lock on the device.

Chrome: navigator.mediaDevices.getUserMedia is not a function

I'm on localhost and trying to use the MediaDevices.getUserMedia method in Chrome. I receive the error as titled. I understand that in Chrome it is only possible to use this function with a secure origin and that localhost is considered a secure origin. Also, this works in Firefox.
This is how I'm using it as shown on the Google Developers website https://developers.google.com/web/updates/2015/10/media-devices?hl=en:
var constraints = window.constraints = {
  audio: false,
  video: true
};
navigator.mediaDevices.getUserMedia(constraints).then(function(stream) {
  callFactory.broadcastAssembly(stream);
  ...
});
On some newer browsers the legacy navigator.getUserMedia does not work reliably, so try navigator.mediaDevices.getUserMedia instead. Better still, check whether navigator.mediaDevices.getUserMedia is available: if it is, use it; otherwise fall back to the prefixed navigator.getUserMedia variants.
navigator.getWebcam = (navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia || navigator.msGetUserMedia);
if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
  navigator.mediaDevices.getUserMedia({ audio: true, video: true })
    .then(function (stream) {
      // Display the video stream in the video object
    })
    .catch(function (e) { logError(e.name + ": " + e.message); });
} else {
  navigator.getWebcam({ audio: true, video: true },
    function (stream) {
      // Display the video stream in the video object
    },
    function () { logError("Web cam is not accessible."); });
}
Hope this will solve your problem.
Try enabling: chrome://flags/#enable-experimental-web-platform-features
Worked for me in Chromium.
I too had the same problem in my Chrome browser.
First, check that your phone is supported by testing it at https://test.webrtc.org/. If your phone passes all the cases, go to step 2.
Step 2: if you're hosting a webpage or running a third-party webpage, check whether camera permissions are enabled on your phone.
The main issue, though, is that WebRTC is not supported on HTTP sites; it is supported only on HTTPS sites.
I got stuck on the same issue. One solution is to install the Web Server for Chrome extension shared in the comment above by @ellerynz, or, if you have Python installed, you can also run
python -m SimpleHTTPServer [port]
After you hit enter, you should see the following message:
Serving HTTP on 0.0.0.0 port 8000 ...
Then open the browser and go to
http://127.0.0.1:[port]
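If you are on Python 3 (where SimpleHTTPServer no longer exists), the equivalent built-in server is:
python -m http.server [port]
Note that this still serves plain HTTP, which is fine on localhost because localhost is treated as a secure origin.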
Have you tried including the adapter.js polyfill? Check this page:
https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia#Browser_compatibility
It looks like this, or enabling chrome://flags/#enable-experimental-web-platform-features as per @Simon Malone's note, is needed for Chrome.
I was having this problem too and changing flags didn't seem to work. I came across the Web Server for Chrome extension in Google's WebRTC tutorial, which seemed to do the trick.
Use navigator.getUserMedia() instead.
navigator.getUserMedia(constraints, successCallback, errorCallback);

Trouble with WebRTC in Nightly (22) and Chrome (25)

I'm experimenting with WebRTC between two browsers using RTCPeerConnection and my own long-polling implementation. I've created a demo application which works successfully in Mozilla Nightly (22); however, in Chrome (25) I can't get any remote video, and only an empty black video appears. Is there something wrong in my JS code?
The function sendMessage(message) sends a message to the server via long-polling; on the other side it is received by onMessage().
var peerConnection;
var peerConnection_config = {"iceServers": [{"url": "stun:23.21.150.121"}]};

// when message from server is received
function onMessage(evt) {
  if (!peerConnection)
    call(false);
  var signal = JSON.parse(evt);
  if (signal.sdp) {
    peerConnection.setRemoteDescription(new RTCSessionDescription(signal.sdp));
  } else {
    peerConnection.addIceCandidate(new RTCIceCandidate(signal.candidate));
  }
}

function call(isCaller) {
  peerConnection = new RTCPeerConnection(peerConnection_config);
  // send any ice candidates to the other peer
  peerConnection.onicecandidate = function(evt) {
    sendMessage(JSON.stringify({"candidate": evt.candidate}));
  };
  // once remote stream arrives, show it in the remote video element
  peerConnection.onaddstream = function(evt) {
    // attach media stream to remote video - WebRTC Wrapper
    attachMediaStream($("#remote-video").get(0), evt.stream);
  };
  // get the local stream, show it in the local video element and send it
  getUserMedia({"audio": true, "video": true}, function(stream) {
    // attach media stream to local video - WebRTC Wrapper
    attachMediaStream($("#local-video").get(0), stream);
    $("#local-video").get(0).muted = true;
    peerConnection.addStream(stream);
    if (isCaller)
      peerConnection.createOffer(gotDescription);
    else {
      peerConnection.createAnswer(gotDescription);
    }
    function gotDescription(desc) {
      sendMessage(JSON.stringify({"sdp": desc}));
      peerConnection.setLocalDescription(desc);
    }
  }, function() {
  });
}
My best guess is that there is a problem with your STUN server configuration. To determine if this is the issue, try using Google's public STUN server stun:stun.l.google.com:19302 (which won't work in Firefox, but should definitely work in Chrome), or test on a local network with no STUN server configured.
Also, verify that your ice candidates are being delivered properly. Firefox doesn't actually generate 'icecandidate' events (it includes the candidates in the offer/answer), so an issue with delivering candidate messages could also explain the discrepancy.
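One detail worth checking (my note, not part of the original answer): the onicecandidate handler in the question forwards evt.candidate even when it is null (the end-of-candidates event), and the receiver then calls addIceCandidate with that null value. Guarding both sides avoids spurious errors:
// sender: only forward real candidates
peerConnection.onicecandidate = function(evt) {
  if (evt.candidate) {
    sendMessage(JSON.stringify({"candidate": evt.candidate}));
  }
};
// receiver: ignore empty candidate messages
if (signal.candidate) {
  peerConnection.addIceCandidate(new RTCIceCandidate(signal.candidate));
}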
Make sure your video tag attribute autoplay is set to 'autoplay'.
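For example, with the remote video element id used in the question:
<video id="remote-video" autoplay></video>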

Is HTML5's getUserMedia for audio recording working now?

I have searched for a lot of demos and examples of getUserMedia, but most only capture the camera, not the microphone.
So I downloaded some examples and tried them on my own computer. Camera capture works, but when I changed
navigator.webkitGetUserMedia({video : true},gotStream);
to
navigator.webkitGetUserMedia({audio : true},gotStream);
the browser asked me to allow microphone access first, and then it failed at
document.getElementById("audio").src = window.webkitURL.createObjectURL(stream);
The message is:
GET blob:http%3A//localhost/a5077b7e-097a-4281-b444-8c1d3e327eb4 404 (Not Found)
This is my code: getUserMedia_simple_audio_test
Did I do something wrong? Or does getUserMedia only work for the camera right now?
It is currently not available in Google Chrome. See Issue 112367.
You can see in the demo that it will always throw an error saying
GET blob:http%3A//whatever.it.is/b0058260-9579-419b-b409-18024ef7c6da 404 (Not Found)
And you can't listen to the microphone either with
{
  video: true,
  audio: true
}
It is currently supported in Chrome Canary. You need to type about:flags into the address bar then enable Web Audio Input.
The following code connects the audio input to the speakers. WATCH OUT FOR THE FEEDBACK!
<script>
// this is to store a reference to the input so we can kill it later
var liveSource;

// creates an audio context and hooks up the audio input
function connectAudioInToSpeakers(){
  var context = new webkitAudioContext();
  navigator.webkitGetUserMedia({audio: true}, function(stream) {
    console.log("Connected live audio input");
    liveSource = context.createMediaStreamSource(stream);
    liveSource.connect(context.destination);
  });
}

// disconnects the audio input
function makeItStop(){
  console.log("killing audio!");
  liveSource.disconnect();
}

// run this when the page loads
connectAudioInToSpeakers();
</script>
<input type="button" value="please make it stop!" onclick="makeItStop()"/>
(sorry, I forgot to login, so the answer above is mine; a working example is at http://jsfiddle.net/2mLtM/)
It's working; you just need to add the toString parameter after audio: true.
Check this article - link
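For anyone reading this thread now: microphone capture with getUserMedia works in current browsers, and recording is usually done with the MediaRecorder API rather than pointing an audio element at a live blob URL. A minimal sketch (mine, not from the original answers; it reuses the "audio" element id from the question):
async function recordFiveSeconds() {
  // ask for the microphone only
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);
  const chunks = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => {
    // play back what was just recorded
    const blob = new Blob(chunks, { type: recorder.mimeType });
    const audio = document.getElementById("audio");
    audio.src = URL.createObjectURL(blob);
    audio.play();
    stream.getTracks().forEach((t) => t.stop()); // release the microphone
  };
  recorder.start();
  setTimeout(() => recorder.stop(), 5000);
}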
