I'm trying to mute only the local audio playback in WebRTC, more specifically after getUserMedia() and prior to any server connection being made. None of the options I've found work; this one from Muaz Khan fails:
var audioTracks = localMediaStream.getAudioTracks();
// if MediaStream has reference to microphone
if (audioTracks[0]) {
    audioTracks[0].enabled = false;
}
This technique is also described elsewhere as working, but it fails for me on Chrome Version 39.0.2171.95 (64-bit) on Ubuntu 14.04.
Another method that is said to work zeroes the volume with a gain node:
window.AudioContext = window.AudioContext || window.webkitAudioContext;
var audioContext = new AudioContext();
var source = audioContext.createMediaStreamSource(clientStream);
var volume = audioContext.createGain();
source.connect(volume);
volume.connect(audioContext.destination);
volume.gain.value = 0; //turn off the speakers
tl;dr I don't want to hear the input from my microphone on my speakers, but I do want to see my video image.
Workaround
This workaround was suggested by Benjamin Trent and it mutes the audio by setting the muted attribute on the video tag like so:
document.getElementById("html5vid").muted = true;
There is also a similar question, but it's about video, so it's not the same issue.
Adding this as the answer because it is the de facto correct answer:
What you stated as a workaround is what's used by many major WebRTC Video platforms:
navigator.mediaDevices.getUserMedia({ video: true, audio: true })
    .then(stream => {
        const vid = document.getElementById('html5vid');
        vid.autoplay = true;
        vid.muted = true;
        vid.srcObject = stream;
    });
Related
I am taking the MediaStream from WebRTC and doing some audio processing and monitoring. It works on Firefox but is silent on Chrome.
Here is a simplified version with a single gainNode as an example.
const AudioContext = window.AudioContext || window.webkitAudioContext;
let myAudioCtx = new AudioContext();
let mySource = myAudioCtx.createMediaStreamSource(stream);
let gainNode = myAudioCtx.createGain();
gainNode.gain.value = 2;
mySource.connect(gainNode);
gainNode.connect(myAudioCtx.destination);
Whereas if I instead assign the stream directly to srcObject, I hear the sound.
It appears that createMediaStreamSource() is not returning any audio, because my monitoring shows silence. However, if I assign the stream from WebRTC to srcObject as well as running it through my monitoring, then the monitoring detects sound.
myAudioCtx.state reports 'running'.
I can't think of where else to check. Any help would be appreciated.
Found the solution after a good night's sleep and another look at the MDN docs.
You must assign the stream to the audio element:
audio.srcObject = stream;
but you then have to mute the output so it doesn't go directly to the speakers:
audio.muted = true;
This doesn't stop your Web Audio processing from working:
const AudioContext = window.AudioContext || window.webkitAudioContext;
let myAudioCtx = new AudioContext();
let mySource = myAudioCtx.createMediaStreamSource(stream);
let gainNode = myAudioCtx.createGain();
gainNode.gain.value = 2;
mySource.connect(gainNode);
gainNode.connect(myAudioCtx.destination);
This works on Chrome, Safari and Firefox.
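For completeness, here is a minimal end-to-end sketch of the same fix; the getUserMedia constraints and the gain value of 2 are illustrative, not prescriptive:
// Sketch only: attach the stream to a muted element so Chrome keeps the
// audio flowing, then process it through the Web Audio graph as usual.
navigator.mediaDevices.getUserMedia({ audio: true })
    .then(stream => {
        const audio = new Audio();
        audio.srcObject = stream;   // required for the graph to receive audio
        audio.muted = true;         // but don't play the raw stream aloud

        const AudioContext = window.AudioContext || window.webkitAudioContext;
        const myAudioCtx = new AudioContext();
        const mySource = myAudioCtx.createMediaStreamSource(stream);
        const gainNode = myAudioCtx.createGain();
        gainNode.gain.value = 2;    // illustrative gain
        mySource.connect(gainNode);
        gainNode.connect(myAudioCtx.destination);
    })
    .catch(err => console.error(err));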
Is there a global way to detect when audio is playing or starts playing in the browser?
Something along the lines of if (window.mediaPlaying()) {...},
without the code being tied to a specific element?
EDIT: What's important here is to be able to detect ANY audio no matter where the audio comes from. Whether it comes from an iframe, a video, the Web Audio API, etc.
No one should use this, but it works.
Basically, the only way I found to access the entire window's audio is by using MediaDevices.getDisplayMedia().
From there, a MediaStream can be fed into an AnalyserNode, which can be used to check whether the audio volume is greater than zero.
Only works in Chrome and maybe Edge (only tested in Chrome 80 on Linux).
JSFiddle with <video>, <audio> and YouTube!
Important bits of code (cannot post in a working snippet because of the Feature Policies on the snippet iframe):
var audioCtx = new AudioContext();
var analyser = audioCtx.createAnalyser();

var bufferLength = analyser.fftSize;
var dataArray = new Float32Array(bufferLength);

window.isAudioPlaying = () => {
    analyser.getFloatTimeDomainData(dataArray);
    for (var i = 0; i < bufferLength; i++) {
        if (dataArray[i] != 0) return true;
    }
    return false;
}

navigator.mediaDevices.getDisplayMedia({
    video: true,
    audio: true
})
.then(stream => {
    if (stream.getAudioTracks().length > 0) {
        var source = audioCtx.createMediaStreamSource(stream);
        source.connect(analyser);
        document.body.classList.add('ready');
    } else {
        console.log('Failed to get stream. Audio not shared or browser not supported');
    }
}).catch(err => console.log("Unable to open capture: ", err));
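Once the body has the ready class, the flag can be polled; a hypothetical usage:
// Hypothetical usage: sample the flag once a second after setup succeeds.
setInterval(() => {
    console.log('Audio playing:', window.isAudioPlaying());
}, 1000);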
I read all the MDN docs about the Web Audio API, but I didn't find any global flag on window that shows audio is playing. However, I found a tricky way that detects any playing <audio> or <video> element on the page (it won't see audio coming from iframes or the Web Audio API):
const allAudio = Array.from( document.querySelectorAll('audio') );
const allVideo = Array.from( document.querySelectorAll('video') );
const isPlaying = [...allAudio, ...allVideo].some(item => !item.paused);
Now, via the isPlaying flag, we can detect whether any audio or video element is playing in the browser.
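If you want it behind a window.mediaPlaying()-style helper like the question sketched, a small wrapper (the helper name comes from the question; it is not a built-in API) could be:
// Re-query the DOM on each call so newly added elements are included.
// This only sees <audio>/<video> elements in the current document.
window.mediaPlaying = () =>
    [...document.querySelectorAll('audio, video')]
        .some(el => !el.paused && !el.ended);

if (window.mediaPlaying()) {
    // ...react to playing media
}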
There is a playbackState property (https://developer.mozilla.org/en-US/docs/Web/API/MediaSession/playbackState), but not all browsers support it.
if(navigator.mediaSession.playbackState === "playing"){...
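Since support varies, it is worth feature-detecting first; also note that playbackState reflects what the page itself has reported to the Media Session API, not a browser-wide "audio is audible" flag. A guarded sketch:
// Guarded check; mediaSession is absent in some browsers.
if ('mediaSession' in navigator &&
        navigator.mediaSession.playbackState === 'playing') {
    // ...
}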
I was looking for a solution on Google, but I didn't find anything yet.
Maybe you could check some value that only has a given state while audio is playing. If you have a button that starts playing the audio file, you can be sure the audio is playing by adding an event listener to that button.
You could also add an event listener to the <audio> tag; if I remember correctly, the audio element has a paused attribute.
Also, you may want to check this topic: HTML5 check if audio is playing? (I just found it five seconds ago, haha.)
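For instance, a minimal sketch of the event-listener idea (the element id is made up):
// Hypothetical: track playback state through media element events.
const audio = document.getElementById('my-audio'); // made-up id
let audioPlaying = !audio.paused;

audio.addEventListener('play', () => { audioPlaying = true; });
audio.addEventListener('pause', () => { audioPlaying = false; });
audio.addEventListener('ended', () => { audioPlaying = false; });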
I'm looking to get the microphone activity level of a WebRTC MediaStream. However, I need to get this information without playing back the microphone to the user (otherwise there will be the loopback effect).
The answer in Microphone activity level of WebRTC MediaStream relies on the audio being played back to the user. How can I do this, without playing back the microphone?
Take a look at the createGain method. It allows you to set the stream's volume.
Here is my (simplified) example that I use in my project:
navigator.getUserMedia({audio: true, video: true}, function(stream) {
    var audioContext = new AudioContext; //or webkitAudioContext
    var source = audioContext.createMediaStreamSource(stream);
    var volume = audioContext.createGain();
    source.connect(volume);
    volume.connect(audioContext.destination);
    volume.gain.value = 0; //turn off the speakers
    //further manipulations with source
}, function(err) {
    console.log('error', err);
});
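If the goal is an activity level without any playback at all, note that an analyser node works without being connected to the destination. A sketch under that assumption (the RMS metric and polling interval are my choices):
navigator.mediaDevices.getUserMedia({audio: true})
    .then(function(stream) {
        var audioContext = new AudioContext();
        var source = audioContext.createMediaStreamSource(stream);
        var analyser = audioContext.createAnalyser();
        source.connect(analyser); // not connected to destination: no loopback

        var data = new Float32Array(analyser.fftSize);

        function micLevel() {
            analyser.getFloatTimeDomainData(data);
            var sum = 0;
            for (var i = 0; i < data.length; i++) {
                sum += data[i] * data[i];
            }
            return Math.sqrt(sum / data.length); // RMS of the latest samples
        }

        setInterval(function() {
            console.log('mic level:', micLevel());
        }, 250);
    })
    .catch(function(err) {
        console.log('error', err);
    });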
I want to make a simple audio only stream over WebRTC, using Peer.js. I'm running the simple PeerServer locally.
The following works perfectly fine in Firefox 30, but I can't get it to work in Chrome 35. I would suspect something is wrong with the PeerJS setup, but Chrome -> Firefox works perfectly fine, while Chrome -> Chrome seems to send the stream but won't play it over the speakers.
Setting up getUserMedia
Note: uncommenting the lines below will let me hear the loopback in Chrome and Firefox.
navigator.getUserMedia = (navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia || navigator.msGetUserMedia);
window.AudioContext = window.AudioContext || window.webkitAudioContext;

if (navigator.getUserMedia) {
    navigator.getUserMedia({video: false, audio: true}, getMediaSuccess, getMediaError);
} else {
    alert('getUserMedia not supported.');
}

var localMediaStream;
//var audioContext = new AudioContext();

function getMediaSuccess(mediaStream) {
    //var microphone = audioContext.createMediaStreamSource(mediaStream);
    //microphone.connect(audioContext.destination);
    localMediaStream = mediaStream;
}

function getMediaError(err) {
    alert('getUserMedia error. See console.');
    console.error(err);
}
Making the connection
var peer = new Peer({host: '192.168.1.129', port: 9000});

peer.on('open', function(id) {
    console.log('My ID:', id);
});

peer.on('call', function(call) {
    console.log('answering call with', localMediaStream);
    call.answer(localMediaStream);
    //THIS WORKS IN CHROME, localMediaStream exists

    call.on('stream', function(stream) {
        console.log('stream received', stream);
        //THIS WORKS IN CHROME, the stream has come through
        var audioContext = new AudioContext();
        var audioStream = audioContext.createMediaStreamSource(stream);
        audioStream.connect(audioContext.destination);
        //I HEAR AUDIO IN FIREFOX, BUT NOT CHROME
    });

    call.on('error', function(err) {
        console.log(err);
        //LOGS NO ERRORS
    });
});

function connect(id) {
    var voiceStream = peer.call(id, localMediaStream);
}
This still appears to be an issue even in Chrome 73.
The solution that saved me for now is to also connect the media stream to a muted HTML audio element. This seems to make the stream work, and audio starts flowing into the Web Audio nodes.
This would look something like:
let a = new Audio();
a.muted = true;
a.srcObject = stream;
a.addEventListener('canplaythrough', () => {
    a = null;
});

let audioStream = audioContext.createMediaStreamSource(stream);
audioStream.connect(audioContext.destination);
JSFiddle: https://jsfiddle.net/jmcker/4naq5ozc/
Original Chromium issue and workaround:
https://bugs.chromium.org/p/chromium/issues/detail?id=121673#c121
New Chromium issue: https://bugs.chromium.org/p/chromium/issues/detail?id=687574 https://bugs.chromium.org/p/chromium/issues/detail?id=933677
In Chrome, there is currently a known bug where remote audio streams gathered from a peer connection are not accessible through the Web Audio API.
Latest comment on the bug:
We are working really hard towards the feature. The reason why this takes long time is that we need to move the APM to chrome first, implement a render mixer to get the unmixed data from WebRtc, then we can hook up the remote audio stream to webaudio.
It was recently patched in Firefox; I remember this being an issue there as well in the past.
I was unable to play the stream using Web Audio, but I did manage to play it using a basic audio element:
var audio = new Audio();
audio.src = (URL || webkitURL || mozURL).createObjectURL(remoteStream);
audio.play();
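createObjectURL with a MediaStream has since been removed from browsers; the modern equivalent would presumably be to assign the stream directly:
// Modern replacement for the blob-URL approach above.
var audio = new Audio();
audio.srcObject = remoteStream;
audio.play();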
I've been toying with WebRTC but I'm completely unable to play a simple audio stream after properly granting rights to the browser to use the input device.
I just try to connect the input device to the context destination, but it doesn't work.
This snippet isn't working and I think it should:
function success(stream) {
    var audioContext = new webkitAudioContext();
    var mediaStreamSource = audioContext.createMediaStreamSource(stream);
    mediaStreamSource.connect(audioContext.destination);
}

navigator.webkitGetUserMedia({audio: true, video: false}, success);
This doesn't seem to capture any sound from my working microphone, but if I use a simple <audio> tag and create a blob URL, the code suddenly starts working:
function success(stream) {
    audio = document.querySelector('audio');
    audio.src = window.URL.createObjectURL(stream);
    audio.play();
}

navigator.webkitGetUserMedia({audio: true, video: false}, success);
Also, not a single one of these demos seems to work for me: http://webaudiodemos.appspot.com/.
Fiddle for the first snippet: http://jsfiddle.net/AvMtt/
Fiddle for the second snippet: http://jsfiddle.net/vxeDg/
Using Chrome 28.0.1500.71 beta-m on Windows 7 x64.
I have a single input device, and two output devices (speakers, headsets). Every device is using the same sample rate.
This question is almost 6 years old, but for anyone who stumbles across it, the modern version of this looks something like:
function success(stream) {
    let audioContext = new AudioContext();
    let mediaStreamSource = audioContext.createMediaStreamSource(stream);
    mediaStreamSource.connect(audioContext.destination);
}

navigator.mediaDevices.getUserMedia({audio: true, video: false})
    .then(success)
    .catch((e) => {
        console.dir(e);
    });
It appears to work, based on https://jsfiddle.net/jmcker/g3j1yo85.