WebRTC via Web Audio API silent on Google Chrome

I am taking the MediaStream from WebRTC and doing some audio processing and monitoring. It works on Firefox but is silent on Chrome.
Here is a simplified version with a single gainNode as an example.
const AudioContext = window.AudioContext || window.webkitAudioContext;
let myAudioCtx = new AudioContext();
let mySource = myAudioCtx.createMediaStreamSource(stream);
let gainNode = myAudioCtx.createGain();
gainNode.gain.value = 2;
mySource.connect(gainNode);
gainNode.connect(myAudioCtx.destination);
Whereas if I instead assign the stream directly to srcObject, I hear the sound.
It appears that createMediaStreamSource() is not returning any audio, because my monitoring shows silence. However, if I assign the stream from WebRTC to srcObject as well as run it through my monitoring, then the monitoring detects sound.
myAudioCtx.state says 'running'.
I can't think of where else to check. Any help would be appreciated.
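For context, the monitoring in question can be done with an AnalyserNode along these lines (a minimal sketch; the original monitoring code isn't shown, so this meter is an assumed stand-in):
// Hypothetical level meter tapped off the gain node from the snippet above.
const analyser = myAudioCtx.createAnalyser();
gainNode.connect(analyser);

const samples = new Float32Array(analyser.fftSize);
function checkLevel() {
    analyser.getFloatTimeDomainData(samples);
    let peak = 0;
    for (const s of samples) peak = Math.max(peak, Math.abs(s));
    console.log('peak level:', peak); // stays at 0 while the graph is silent
    requestAnimationFrame(checkLevel);
}
checkLevel();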

Found the solution after a good night's sleep and looking at the MDN docs again.
You must assign the stream to the audio element:
audio.srcObject = stream;
but then you have to mute the output so it doesn't go directly to the speakers:
audio.muted = true;
This doesn't stop your Web Audio graph from working:
const AudioContext = window.AudioContext || window.webkitAudioContext;
let myAudioCtx = new AudioContext();

// The stream must also be attached to a muted audio element (as above),
// otherwise Chrome feeds silence into createMediaStreamSource().
audio.srcObject = stream;
audio.muted = true;

let mySource = myAudioCtx.createMediaStreamSource(stream);
let gainNode = myAudioCtx.createGain();
gainNode.gain.value = 2;
mySource.connect(gainNode);
gainNode.connect(myAudioCtx.destination);
This works on Chrome, Safari and Firefox.

Related

Adding panner / spatial audio to Web Audio Context from a WebRTC stream not working

I would like to create a Web Audio panner to position the sound from a WebRTC stream.
I have the stream connecting OK and can hear the audio and see the video, but the panner does not have any effect on the audio (changing panner.setPosition(10000, 0, 0) to + or - 10000 makes no difference to the sound).
This is the onaddstream function where the audio and video get piped into a video element, and where I presume I need to add the panner.
There are no errors, it just isn't panning at all.
What am I doing wrong?
Thanks!
peer_connection.onaddstream = function(event) {
    var AudioContext = window.AudioContext || window.webkitAudioContext;
    var audioCtx = new AudioContext();
    audioCtx.listener.setOrientation(0, 0, -1, 0, 1, 0);
    var panner = audioCtx.createPanner();
    panner.panningModel = 'HRTF';
    panner.distanceModel = 'inverse';
    panner.refDistance = 1;
    panner.maxDistance = 10000;
    panner.rolloffFactor = 1;
    panner.coneInnerAngle = 360;
    panner.coneOuterAngle = 0;
    panner.coneOuterGain = 0;
    panner.setPosition(10000, 0, 0); // this doesn't do anything
    peerInput.connect(panner);
    panner.connect(audioCtx.destination);
    // attach the stream to the document element
    var remote_media = USE_VIDEO ? $("<video>") : $("<audio>");
    remote_media.attr("autoplay", "autoplay");
    if (MUTE_AUDIO_BY_DEFAULT) {
        remote_media.attr("muted", "false");
    }
    remote_media.attr("controls", "");
    peer_media_elements[peer_id] = remote_media;
    $('body').append(remote_media);
    attachMediaStream(remote_media[0], event.stream);
}
Try creating a source from the event's stream before wiring up the panner:
var source = audioCtx.createMediaStreamSource(event.stream);
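Wired into the question's code, that looks something like this (a sketch; it replaces the undefined peerInput):
// Inside onaddstream, after creating and configuring the panner:
var source = audioCtx.createMediaStreamSource(event.stream);
source.connect(panner);
panner.connect(audioCtx.destination);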
Reference: Mozilla Developer Network - AudioContext
createPanner Reference: Mozilla Developer Network - createPanner
3rd Party Library: wavesurfer.js
Remove all the options you've set for the panner node and see if that helps. (The cone angles seem a little funny to me, but I always forget how they work.)
If that doesn't work, create a smaller test with the panner but use a simple oscillator as the input. Play around with the parameters and positions to make sure it does what you want.
Put this back into your app. Things should work then.
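A minimal oscillator test along those lines might look like this (assumed values; not the original app code):
var AudioContext = window.AudioContext || window.webkitAudioContext;
var ctx = new AudioContext();

var osc = ctx.createOscillator();
osc.frequency.value = 440; // a steady tone makes position changes easy to hear

var panner = ctx.createPanner();
panner.panningModel = 'HRTF';
panner.setPosition(10, 0, 0); // flip between +10 and -10 and listen

osc.connect(panner);
panner.connect(ctx.destination);
osc.start();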
Figured this out for myself.
The problem was not the code; it was because I was connected via Bluetooth audio.
Bluetooth apparently can only do stereo audio with the microphone turned off. As soon as you activate the mic, that steals one of the channels and the audio output downgrades to mono.
With mono audio you definitely cannot do 3D positioned sound, hence my thinking the code was not working.
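If you suspect the same cause, one quick check (my suggestion, not part of the original answer) is the destination's channel capability:
// 1 means output has degraded to mono, so spatialisation won't be audible.
console.log('max output channels:', audioCtx.destination.maxChannelCount);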

HTML5 mobile: Play only one side of audio from a video stream on iOS

I am trying to play a video on iOS while listening to only one side of the stereo audio.
The code below works fine in desktop Chrome but not in Safari on an iPhone 5 with iOS 8.3.
var AudioContext = window.webkitAudioContext;
var audioCtx = new AudioContext();
var splitter = audioCtx.createChannelSplitter(2);
var merger = audioCtx.createChannelMerger(1);
source = audioCtx.createMediaElementSource(video);
source.connect(splitter);
splitter.connect(merger, 0);
merger.connect(audioCtx.destination);
'video' is the reference to the DOM video element.
Any help would be much appreciated. Thanks a lot,
Sa'ar
Here is a polyfill that checks whether Web Audio exists and whether to use the -webkit prefix:
// borrowed from underscore.js
function isUndef(val) {
    return val === void 0;
}

if (isUndef(window.AudioContext)) {
    window.AudioContext = window.webkitAudioContext;
}

if (!isUndef(window.AudioContext)) {
    audioContext = new AudioContext();
} else {
    throw new Error("Web Audio is not supported in this browser");
}
There is also a similar iOS error here that might help.
Check which kind of audio context the device supports instead of using ||. For example:
// Note: don't name the variable AudioContext, or it shadows the constructor
// and `new AudioContext()` below would fail.
var audioCtx;
if ('webkitAudioContext' in window) {
    audioCtx = new webkitAudioContext();
} else {
    audioCtx = new AudioContext();
}

Clicking sounds in Stream played with Web Audio Api

I have a strange problem. I'm using Web Audio to play a stream from the server, which I do the following way:
// evt.data: ArrayBuffer of little-endian float32 samples from the server
var d2 = new DataView(evt.data);
var data = new Float32Array(d2.byteLength / Float32Array.BYTES_PER_ELEMENT);
for (var jj = 0; jj < data.length; ++jj) {
    data[jj] = d2.getFloat32(jj * Float32Array.BYTES_PER_ELEMENT, true);
}
var buffer = context.createBuffer(1, data.length, 44100);
buffer.getChannelData(0).set(data);
source = context.createBufferSource();
source.buffer = buffer;
source.connect(context.destination);
source.start(startTime);
startTime += buffer.duration;
This works fine.
If I play the stream on my computer I don't have any problems, but if I play the same stream on my Windows 8 tablet (same Chrome version) I hear a lot of clicking sounds in the audio, multiple of them within one second.
It seems there is a click at the end of each buffer.
I don't understand the difference. The only difference I could find is that the sampling rate of the sound card on my computer is 44100, while on the tablet it's 48000.
The transmitted stream is at 44100 and I don't have any sample-rate problems, just the clicking sounds.
Does anybody have an idea why this is happening?
Thank you,
metabolic
AudioBufferSourceNodes resample their buffers to the AudioContext's sample rate. As you can imagine, the API does not allow you to carry resampler state over from one AudioBufferSourceNode to the next, so there is a discontinuity between consecutive buffers.
I think the easiest fix is to provide a stream at the sample rate of the device, by resampling server-side. Once AudioWorkerNode is ready and implemented you'll be able to fix this client-side as well, but it isn't yet.
Alternatively, you can simply stream using an audio element and pipe that into the Web Audio API using AudioContext.createMediaElementSource(), as sketched below.
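A sketch of that alternative (the stream URL is a hypothetical placeholder; the media element then handles buffering and resampling for you):
var audio = new Audio();
audio.src = 'https://example.com/stream.mp3'; // hypothetical stream URL
audio.crossOrigin = 'anonymous'; // required if the stream is served cross-origin

var elementSource = context.createMediaElementSource(audio);
elementSource.connect(context.destination);
audio.play();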
I had the same issue; thanks to Padenot's answer I checked the sample rates. AudioContext.sampleRate defaulted to 44100, but the PCM data and AudioBuffers were at 48000. Initialising the AudioContext with a matching sampleRate solved the problem:
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext({
latencyHint: 'interactive',
sampleRate: 48000,
});
With this, I can schedule the playback of 20 ms, 48 kHz PCM16 AudioBuffers back-to-back without any clicks or distortion, as in the sketch below.
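Condensed, the scheduling pattern (adapted from the question's code; playChunk is a hypothetical helper that receives Float32 samples already at 48 kHz):
var startTime = audioCtx.currentTime;

function playChunk(samples) {
    // Wrap the samples in a one-channel buffer at the context's own rate,
    // so no resampling happens between chunks.
    var buffer = audioCtx.createBuffer(1, samples.length, 48000);
    buffer.getChannelData(0).set(samples);

    var source = audioCtx.createBufferSource();
    source.buffer = buffer;
    source.connect(audioCtx.destination);
    source.start(startTime);
    startTime += buffer.duration; // queue each chunk exactly where the last one ends
}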

WebRTC - How to mute local audio output

I'm trying to mute only the local audio playback in WebRTC, more specifically after getUserMedia() and prior to any server connection being made. None of the options I've found work; this one from Muaz Khan fails:
var audioTracks = localMediaStream.getAudioTracks();
// if MediaStream has reference to microphone
if (audioTracks[0]) {
    audioTracks[0].enabled = false;
}
(source)
This technique is also described here as "working", but fails here on Chrome Version 39.0.2171.95 (64-bit) (Ubuntu 14.04).
An additional method that is said to work uses a volume gain node:
window.AudioContext = window.AudioContext || window.webkitAudioContext;
var audioContext = new AudioContext();
var source = audioContext.createMediaStreamSource(clientStream);
var volume = audioContext.createGain();
source.connect(volume);
volume.connect(audioContext.destination);
volume.gain.value = 0; //turn off the speakers
tl;dr I don't want to hear the input from my microphone on my speakers, but I do want to see my video image.
Workaround
This workaround was suggested by Benjamin Trent and it mutes the audio by setting the muted attribute on the video tag like so:
document.getElementById("html5vid").muted = true;
There is also a similar question, but it's for video, so not the same.
Adding this as the answer because it is the de facto correct answer:
What you stated as a workaround is what's used by many major WebRTC video platforms:
navigator.mediaDevices.getUserMedia({ video: true, audio: true })
    .then(stream => {
        const vid = document.getElementById('html5vid');
        vid.autoplay = true;
        vid.muted = true;
        vid.srcObject = stream;
    });

Web Audio in Firefox

I am trying to build a web app that visualises and controls the source audio. It works brilliantly in Chrome, but completely breaks in Firefox; it won't even play the audio. Here is the code:
var audio = new Audio();
audio.src = 'track.mp3';
audio.controls = true;
audio.loop = false;
audio.autoplay = false;

window.addEventListener("load", initPlayer, false);

function initPlayer() {
    $("#player").append(audio);
    context = new AudioContext();
    analyser = context.createAnalyser();
    canvas = document.getElementById("vis");
    ctx = canvas.getContext("2d");
    source = context.createMediaElementSource(audio);
    source.connect(analyser);
    analyser.connect(context.destination);
}
The line that breaks everything is:
source = context.createMediaElementSource(audio);
After adding this line, the player just hangs at 0:00 in Firefox. I have done my research and have come across CORS, but as far as I can understand this should be irrelevant, as the file is kept on the same server.
Please help.
You have to serve the audio from a proper server so that MIME types are set; run it from localhost rather than file:///..../track.mp3.
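For example, a minimal static server (a hypothetical sketch using Node's built-in http module; any static server that sets Content-Type works just as well):
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
    // Serve the page and the track with proper MIME types.
    var file = req.url === '/track.mp3' ? 'track.mp3' : 'index.html';
    var type = file.endsWith('.mp3') ? 'audio/mpeg' : 'text/html';
    res.writeHead(200, { 'Content-Type': type });
    fs.createReadStream(file).pipe(res);
}).listen(8000); // then browse to http://localhost:8000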
We used to have a bug in Firefox where MediaElementSourceNode did not work properly in some cases. It's now fixed (I believe the fix is in Aurora and Nightly at the time of writing).
Sorry about that.
