Electron: how to capture audio from only one app - javascript

We are developing a desktop application in Electron with screen-sharing capability. For this we use the getUserMedia API, with the option to choose which screen or window to capture. This is part of the code for that:
let constraints = {
  audio: {
    mandatory: {
      chromeMediaSource: 'desktop',
      chromeMediaSourceId: sourceId
    }
  },
  video: {
    mandatory: {
      chromeMediaSource: 'desktop',
      chromeMediaSourceId: sourceId
    }
  }
}
let stream = await navigator.mediaDevices.getUserMedia(constraints)
We would like to capture audio only from the application that is being streamed. Is it possible to do this, perhaps with some third-party solution?
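As far as I know, Chromium's desktop capture only exposes system-wide loopback audio (and only on Windows and some Linux setups); there is no constraint that isolates a single application's audio. A minimal sketch of a constraint builder, assuming you still want system loopback (note that the audio side takes no sourceId):

```javascript
// Sketch: build desktop-capture constraints. The loopback audio that
// chromeMediaSource: 'desktop' provides is *system-wide*; Chromium/Electron
// expose no per-application audio constraint.
function buildDesktopConstraints(sourceId, withLoopbackAudio) {
  return {
    audio: withLoopbackAudio
      ? { mandatory: { chromeMediaSource: 'desktop' } } // no sourceId: loopback is system-wide
      : false,
    video: {
      mandatory: {
        chromeMediaSource: 'desktop',
        chromeMediaSourceId: sourceId
      }
    }
  };
}
```

For genuinely per-app audio, the usual workaround is a third-party virtual audio device (e.g. VB-Audio Cable on Windows, BlackHole on macOS): route the target application's output into the virtual device, then capture that device as an ordinary audio input with getUserMedia.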

Related

Camera stream not loading on all devices with maximum quality requirements (HTML/Angular)

I'm developing an application in Angular to capture a photo with the maximum possible quality depending on the device's camera.
Currently, I have this code:
HTML:
<video #video id="video" autoplay muted playsinline></video>
Angular TS:
_requestCameraStream(width: number, height: number, secondTry: boolean) {
  if (navigator.mediaDevices.getUserMedia) {
    navigator.mediaDevices
      .getUserMedia({
        video: {
          facingMode: 'environment',
          width: { ideal: width },
          height: { ideal: height },
          frameRate: { ideal: 30 },
        },
      })
      .then(stream => {
        console.log('_getUserMedia -> stream loaded');
        this.loadStream(stream);
      })
      .catch(err => {
        console.log(err);
        if (!secondTry) {
          console.log('Started second try');
          this._requestCameraStream(2560, 1440, true);
        } else {
          this.router.navigateByUrl('id-document/camera-error');
        }
      });
  }
}
private loadStream(stream: MediaStream) {
  const videoDevices = stream.getVideoTracks();
  this.lastStream = stream;
  this.video!.nativeElement.srcObject = stream;
  this.video!.nativeElement.play();
  this.ref.detectChanges();
}
Basically I check if the device has a camera available and try to load it with the width and height values that the function receives. On the first try I call the function as follows:
this._requestCameraStream(4096, 2160, false);
If the stream fails to load (probably because the camera does not support 4k quality) then it tries again with the values this._requestCameraStream(2560, 1440, true);
This actually works pretty well on most devices, but on a Galaxy Note 10 Plus the stream does not load; yet if I click the button to take the picture, the camera does capture the image in 4K quality.
I suspect that the camera has a higher resolution than the screen, so the camera can capture a 4k image, but the screen can't load a 4k video as a preview. The problem is: the system does not trigger any warning or errors that I could capture. It is as if the preview loaded successfully.
How can I detect and treat this error? Or maybe, is there any other way that I can request the camera to capture a maximum quality image with the preview loading correctly?
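One way to detect the silent fallback is to inspect what the track actually delivers via MediaStreamTrack.getSettings() inside loadStream, rather than trusting that the promise resolved. A sketch, where resolutionSatisfied is a hypothetical helper (the orientation swap handling is an assumption about how some Android devices report dimensions):

```javascript
// Sketch: getSettings() reports the resolution the camera actually
// delivered, even when the browser silently fell back to a lower one.
function resolutionSatisfied(settings, wantedWidth, wantedHeight) {
  // Tolerate portrait/landscape swaps in the reported dimensions.
  const w = Math.max(settings.width || 0, settings.height || 0);
  const h = Math.min(settings.width || 0, settings.height || 0);
  return w >= Math.max(wantedWidth, wantedHeight) &&
         h >= Math.min(wantedWidth, wantedHeight);
}

// Usage (inside loadStream, browser only):
// const settings = stream.getVideoTracks()[0].getSettings();
// if (!resolutionSatisfied(settings, 4096, 2160)) {
//   // retry with a lower target instead of showing a broken preview
// }
```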
You can try defining a range of resolutions instead of trying only two:
async _requestCameraStream() {
  if (!navigator.mediaDevices.getUserMedia) return;
  try {
    const stream = await navigator.mediaDevices.getUserMedia({
      video: {
        facingMode: { exact: 'environment' },
        width: { min: 2288, ideal: 4096, max: 4096 },
        height: { min: 1080, ideal: 2160, max: 2160 },
        frameRate: { ideal: 30 },
      },
    });
    if (stream) {
      console.log('_getUserMedia -> stream loaded');
      this.loadStream(stream);
    }
  } catch (err) {
    console.log(err);
    this.router.navigateByUrl('id-document/camera-error');
  }
}
I think your current approach to capturing the maximum-quality image is correct; I have used a similar approach in one of my projects. I think the problem is with video playback. Browsers have an autoplay policy, and it behaves differently from browser to browser: video content is not allowed to play without user interaction. See https://developer.chrome.com/blog/autoplay/
I think you should add the muted attribute to the video element, and it would be better to ask the user to click before capturing the camera stream. This may not solve the exact problem you are facing, but it will come up in some browsers regardless; Apple devices in particular do not allow video playback without user interaction.
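Since play() returns a promise in modern browsers, a blocked autoplay surfaces as a rejection (typically a NotAllowedError) and nowhere else. A minimal sketch of catching it:

```javascript
// Sketch: a rejected play() promise is the only signal that autoplay
// was blocked; fall back to waiting for a user gesture in that case.
async function safePlay(video) {
  try {
    await video.play();
    return true;
  } catch (err) {
    // NotAllowedError: autoplay blocked -> prompt for a click/tap first
    return false;
  }
}
```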

When I try to stream my webcam video by using getUserMedia() api a black screen is appearing

I am trying to access the user's webcam by using navigator.getUserMedia(). I am assigning video.srcObject to this stream, but I am getting a black screen on the video.
I even tried navigator.mediaDevices.getUserMedia().
<video controls id="webcam"></video>
<script>
  const webcam = document.getElementById("webcam");
  function startVideo() {
    navigator.getUserMedia(
      {
        video: true,
        audio: false
      },
      liveStream => {
        console.log(liveStream);
        webcam.setAttribute("controls", "true");
        webcam.srcObject = liveStream;
        webcam.play();
      },
      error => console.log(error)
    );
  }
  startVideo();
</script>
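For reference, a promise-based sketch using the non-deprecated navigator.mediaDevices.getUserMedia; passing navigator in as a parameter is only to keep the helper testable outside a browser, and the muted line matters because autoplay policies commonly block unmuted playback:

```javascript
// Sketch: navigator.getUserMedia is deprecated; the modern API returns
// a promise. Muting before play() avoids autoplay-policy blocks.
async function startVideo(videoEl, nav) {
  const stream = await nav.mediaDevices.getUserMedia({ video: true, audio: false });
  videoEl.srcObject = stream;
  videoEl.muted = true;   // autoplay policies commonly require muted playback
  await videoEl.play();   // a rejection here means autoplay was blocked
  return stream;
}

// Usage (browser):
// startVideo(document.getElementById('webcam'), navigator).catch(console.log);
```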

How to detect user's connection speed using angular 2 and set quality to twilio video call

I am trying to set the default video quality like this (from the example):
createLocalTracks({
  audio: true,
  video: { width: 640 }
}).then(localTracks => {
  return connect('$TOKEN', {
    name: 'my-room-name',
    tracks: localTracks
  });
}).then(room => {
  console.log('Connected to Room:', room.name);
});
Instead of width: 640 I want to set a dynamic resolution for the video based on the user's internet connection. How can I do that?
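One possible approach (not from the Twilio docs) is to read the downlink estimate from the Network Information API, where available, and map it to a capture width before calling createLocalTracks. The thresholds below are illustrative assumptions, and navigator.connection is not available in all browsers:

```javascript
// Sketch: map an estimated downlink (Mbps) to a capture width.
// Threshold values are illustrative, not a recommendation.
function pickVideoWidth(downlinkMbps) {
  if (downlinkMbps >= 5) return 1280; // comfortable bandwidth
  if (downlinkMbps >= 2) return 640;  // moderate bandwidth
  return 320;                         // constrained connection
}

// Usage (browser):
// const downlink = navigator.connection ? navigator.connection.downlink : 2;
// createLocalTracks({ audio: true, video: { width: pickVideoWidth(downlink) } })
//   .then(localTracks => connect('$TOKEN', { name: 'my-room-name', tracks: localTracks }));
```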

Unable to obtain desktop picker dialog #electron

I am unable to get the desktop picker dialog for the available sources. I am a newbie; can someone guide me on what I am missing? In Chrome we use chrome.desktopCapture.chooseDesktopMedia. I obtained the source with the code below.
function onAccessApproved(error, sources) {
  if (error) throw error;
  for (var i = 0; i < sources.length; ++i) {
    navigator.webkitGetUserMedia({
      audio: false,
      video: {
        mandatory: {
          chromeMediaSource: 'desktop',
          chromeMediaSourceId: sources[i].id,
          minWidth: 1280,
          maxWidth: 1280,
          minHeight: 720,
          maxHeight: 720
        }
      }
    }, gotShareStream, errorCallback);
    return;
  }
}
I have tried the Option link, but I am getting a BrowserWindow undefined error.
Thanks!
I haven't used Electron, but in WebRTC you need to use something like video: {optional: [{sourceId: source.id}]}. And don't do this for all the sources; do it only for the one source you want a stream from.
To get the available sources, use navigator.mediaDevices.enumerateDevices() and then filter them by kind, which can be audioinput, audiooutput, or videoinput.
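For context: Electron has no built-in picker dialog like chrome.desktopCapture.chooseDesktopMedia; you call desktopCapturer.getSources({ types: ['screen', 'window'] }) yourself and build the chooser UI from the returned ids, names, and thumbnails. A small sketch of grouping those results (the groupSourcesForPicker helper is hypothetical; the 'screen:'/'window:' id prefixes are how Electron labels sources):

```javascript
// Sketch: split desktopCapturer sources into screens and windows for a
// hand-rolled picker UI. Electron source ids are prefixed "screen:" or
// "window:" (e.g. "screen:0:0", "window:123:0").
function groupSourcesForPicker(sources) {
  return {
    screens: sources.filter(s => s.id.startsWith('screen:')),
    windows: sources.filter(s => s.id.startsWith('window:'))
  };
}

// Usage (Electron renderer, older versions; newer ones go through IPC):
// const { desktopCapturer } = require('electron');
// desktopCapturer.getSources({ types: ['screen', 'window'] })
//   .then(sources => renderPicker(groupSourcesForPicker(sources)));
```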

Desktop Audio Capture Not working for chrome app

According to https://code.google.com/p/chromium/issues/detail?id=223639, Chromium has issues with audio loopback, and it never works in a Chrome app. Can anyone share some links and an explanation of why this is not working, or whether it is possible at all? I tried the code below, but there is a lot of disturbance in the desktop audio.
video: {
  mandatory: {
    chromeMediaSource: 'screen',
    chromeMediaSourceId: id
  }
},
audio: {
  mandatory: {
    chromeMediaSource: 'system',
    chromeMediaSourceId: id
  }
}
Multiple streams are captured and attached to a single peer connection?
Thanks!
AFAIK only the desktopCapture API supports audio plus tab capture, and the chromeMediaSource value must be 'desktop':
video: {
  mandatory: {
    chromeMediaSource: 'desktop',
    chromeMediaSourceId: 'sourceIdCapturedUsingChromeExtension'
  }
},
audio: {
  mandatory: {
    chromeMediaSource: 'desktop',
    chromeMediaSourceId: 'sourceIdCapturedUsingChromeExtension'
  }
}
Try the following demo on Chrome Canary:
https://rtcmulticonnection.herokuapp.com/demos/Audio+ScreenSharing.html
However, make sure to enable the chrome://flags#tab-for-desktop-share flag.
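As a present-day footnote: in current Chrome the supported route to screen capture with audio is getDisplayMedia with audio: true (tab or system audio, depending on the OS and what the user picks in the browser's own dialog). A sketch with the mediaDevices object injected as a parameter so it can be exercised outside a browser:

```javascript
// Sketch: getDisplayMedia shows the browser's built-in picker; audio: true
// adds tab/system audio where the platform supports it (no extension,
// no chromeMediaSource constraints needed).
async function captureScreenWithAudio(mediaDevices) {
  const stream = await mediaDevices.getDisplayMedia({ video: true, audio: true });
  // Report what was actually granted: audio may be absent on some platforms.
  return {
    video: stream.getVideoTracks().length,
    audio: stream.getAudioTracks().length
  };
}

// Usage (browser): captureScreenWithAudio(navigator.mediaDevices)
```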
