Web Audio: No sound in right channel - javascript

I'm trying to create a custom panning control using the Web Audio API, but I can't get any sound to come out of the right channel using the channel splitter and merger nodes:
var context = new webkitAudioContext(),
destination = context.destination,
osc = context.createOscillator(),
gainL = context.createGainNode(),
gainR = context.createGainNode(),
splitter = context.createChannelSplitter(2),
merger = context.createChannelMerger(2);
osc.frequency.value = 500;
osc.connect(splitter);
splitter.connect(gainL, 0);
splitter.connect(gainR, 1);
gainL.connect(merger, 0, 0);
gainR.connect(merger, 0, 1);
osc.noteOn(0);
gainL.gain.value = 0.1;
gainR.gain.value = 0.5;
osc.noteOff(2);
merger.connect(destination);
Am I missing something obvious here? There's a JSBin preview of the above code here: http://jsbin.com/ayijoy/1/
I'm running Chrome v24.0.1312.57, just in case that's of any use.

My best guess is this happens because the Oscillator outputs a mono signal. Try using a stereo source and you should probably have more luck.
Edit: here's how you can pan a "mono" signal: bypass the splitter (there is no stereo signal to split) and connect the oscillator directly to the two gains, then connect the two mono paths to the merger after adjusting the gain for each channel: http://jsbin.com/ayijoy/16/
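A minimal sketch of that mono-panning graph, assuming a standard AudioContext (the createMonoPanner helper name and the equal-power gain curve are illustrative, not from the original answer):

```javascript
// Equal-power panning of a mono source: skip the splitter and fan the
// oscillator out into one gain node per channel, then merge.
// `createMonoPanner` is a hypothetical helper; pan runs from -1 (left) to 1 (right).
function createMonoPanner(context, pan) {
  var osc = context.createOscillator();
  osc.frequency.value = 500;

  var gainL = context.createGain();
  var gainR = context.createGain();
  var merger = context.createChannelMerger(2);

  var angle = (pan + 1) * Math.PI / 4;  // map -1..1 to 0..π/2
  gainL.gain.value = Math.cos(angle);   // full left at pan = -1
  gainR.gain.value = Math.sin(angle);   // full right at pan = 1

  osc.connect(gainL);
  osc.connect(gainR);
  gainL.connect(merger, 0, 0);          // gainL output → merger input 0 (left)
  gainR.connect(merger, 0, 1);          // gainR output → merger input 1 (right)
  merger.connect(context.destination);

  return { osc: osc, gainL: gainL, gainR: gainR };
}
```

With pan = 1 the left gain falls to ~0 and the right gain to 1, which is the hard-right case the question's splitter-based graph could not reach.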

Related

Web audio: Output to center speaker only

I'm playing around with the web audio context and trying to understand how the splitter node works.
I've managed to split an oscillator node and connect it so it outputs only to the left/right channels, but I'm having a hard time outputting it to the "center" channel alone...
Given the following code:
const ac = new AudioContext();
const splitter = ac.createChannelSplitter(6);
const oscilator = ac.createOscillator();
const merger = ac.createChannelMerger(6);
oscilator.frequency.value = 440;
const gainNode = ac.createGain();
// connect oscilator to splitter channel
oscilator.connect(splitter);
gainNode.gain.setValueAtTime(0.5, ac.currentTime);
splitter.connect(gainNode, 0);
gainNode.connect(merger, 0, 2);
merger.connect(ac.destination);
oscilator.start(0);
oscilator.stop(1);
I expect the oscillator tune to be outputted only in my "center" speaker, but I hear the sound from both.
What am I missing?
This depends a lot on how your output device is configured. If you only have stereo output speakers, then the center channel will be heard in both speakers.
For these kinds of things, I find it best to use a 6-channel OfflineAudioContext and then to look at the channels of output. Don't know if this is still maintained, but I find Canopy to be very useful for this kind of exploration.
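That offline check might look like this (a sketch; channelPeak and checkCenterChannel are illustrative helper names, and the gain node from the question is omitted for brevity):

```javascript
// Render the graph offline into 6 discrete channels and measure each one,
// rather than trusting what a stereo speaker setup plays back.
function channelPeak(data) {
  var peak = 0;
  for (var i = 0; i < data.length; i++) {
    peak = Math.max(peak, Math.abs(data[i]));
  }
  return peak;
}

function checkCenterChannel() {
  var ctx = new OfflineAudioContext(6, 44100, 44100); // 6 channels, 1 s, 44.1 kHz
  var osc = ctx.createOscillator();
  var merger = ctx.createChannelMerger(6);
  osc.frequency.value = 440;
  osc.connect(merger, 0, 2); // route straight into merger input 2 (center)
  merger.connect(ctx.destination);
  osc.start(0);
  return ctx.startRendering().then(function (rendered) {
    for (var ch = 0; ch < 6; ch++) {
      // Only channel 2 should carry signal; any bleed shows up here.
      console.log('channel ' + ch + ' peak: ' + channelPeak(rendered.getChannelData(ch)));
    }
  });
}
```

If channel 2 is the only non-zero channel, the graph is correct and any sound from both speakers is just your output device down-mixing 5.1 to stereo.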

Web Audio : Play stereo (2 channel) mp3 so only hear channel 1 in left ear and channel 2 in right

I have a stereo mp3 file. Is it possible using Web Audio to play the file so that channel 1 is only heard through the left speaker and channel 2 is only heard through the right speaker?
I've played with splitter, panner and merger nodes (channelCount, channelInterpretation, channelCountMode), but the result always seems to be mixed somewhere; e.g. if I attach a blank source as a second input to the merger, I still always hear channel 1 clearly in the right speaker.
See below for an example I have tried, with various permutations of connect inputs and outputs.
audioCtx = new window.AudioContext();
var source = audioCtx.createMediaElementSource(myAudio);
var blank = audioCtx.createBufferSource();
var splitter = audioCtx.createChannelSplitter(2);
source.connect(splitter);
var panner = audioCtx.createPanner();
panner.setPosition(-1,0,0);
splitter.connect(panner, 0, 0);
var merger = audioCtx.createChannelMerger(2);
panner.connect(merger, 0, 0);
blank.connect(merger, 0, 1);
merger.connect(audioCtx.destination);
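For reference, the usual way to keep the two file channels fully isolated is to skip the panner and wire the splitter outputs straight to the matching merger inputs. A sketch under that assumption (routeStereoHard is an illustrative name, not from the question):

```javascript
// Hard-route a stereo source: file channel 0 only to the left speaker,
// file channel 1 only to the right. No panner needed.
function routeStereoHard(audioCtx, source) {
  var splitter = audioCtx.createChannelSplitter(2);
  var merger = audioCtx.createChannelMerger(2);
  source.connect(splitter);
  splitter.connect(merger, 0, 0); // splitter output 0 → merger input 0 (left)
  splitter.connect(merger, 1, 1); // splitter output 1 → merger input 1 (right)
  merger.connect(audioCtx.destination);
  return merger;
}
```

A PannerNode, by contrast, spatializes its input and so mixes it into both output channels, which matches the bleed described above.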

Adding panner / spacial audio to Web Audio Context from a WebRTC stream not working

I would like to create a Web Audio panner to position the sound from a WebRTC stream.
I have the stream connecting OK and can hear the audio and see the video, but the panner does not have any effect on the audio (changing panner.setPosition(10000, 0, 0) to + or - 10000 makes no difference to the sound).
This is the onaddstream function where the audio and video get piped into a video element, and where I presume I need to add the panner.
There are no errors, it just isn't panning at all.
What am I doing wrong?
Thanks!
peer_connection.onaddstream = function(event) {
    var AudioContext = window.AudioContext || window.webkitAudioContext;
    var audioCtx = new AudioContext();
    audioCtx.listener.setOrientation(0, 0, -1, 0, 1, 0);
    var panner = audioCtx.createPanner();
    panner.panningModel = 'HRTF';
    panner.distanceModel = 'inverse';
    panner.refDistance = 1;
    panner.maxDistance = 10000;
    panner.rolloffFactor = 1;
    panner.coneInnerAngle = 360;
    panner.coneOuterAngle = 0;
    panner.coneOuterGain = 0;
    panner.setPosition(10000, 0, 0); // this doesn't do anything
    peerInput.connect(panner);
    panner.connect(audioCtx.destination);
    // attach the stream to the document element
    var remote_media = USE_VIDEO ? $("<video>") : $("<audio>");
    remote_media.attr("autoplay", "autoplay");
    if (MUTE_AUDIO_BY_DEFAULT) {
        remote_media.attr("muted", "false");
    }
    remote_media.attr("controls", "");
    peer_media_elements[peer_id] = remote_media;
    $('body').append(remote_media);
    attachMediaStream(remote_media[0], event.stream);
}
Try to get the event stream before setting the panner:
var source = audioCtx.createMediaStreamSource(event.stream);
Reference: Mozilla Developer Network - AudioContext
createPanner Reference: Mozilla Developer Network - createPanner
3rd Party Library: wavesurfer.js
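Putting that together: the key step is tapping the stream with createMediaStreamSource and routing that node (not the media element) through the panner. A sketch, with the attachPannedStream name being illustrative:

```javascript
// Tap the WebRTC stream into the audio graph and pan it.
// The <video>/<audio> element itself should then stay muted, or its direct
// (unpanned) output will play alongside the panned one.
function attachPannedStream(audioCtx, stream) {
  var source = audioCtx.createMediaStreamSource(stream);
  var panner = audioCtx.createPanner();
  panner.panningModel = 'HRTF';
  panner.setPosition(10000, 0, 0); // now actually affects the stream's audio
  source.connect(panner);
  panner.connect(audioCtx.destination);
  return panner;
}
```

Inside onaddstream this would be called as `attachPannedStream(audioCtx, event.stream)` in place of the bare `peerInput.connect(panner)`.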
Remove all the options you've set for the panner node and see if that helps. (The cone angles seem a little funny to me, but I always forget how they work.)
If that doesn't work, create a smaller test with the panner but use a simple oscillator as the input. Play around with the parameters and positions to make sure it does what you want.
Put this back into your app. Things should work then.
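A minimal isolated test along those lines might look like this (an oscillator in place of the WebRTC stream, default cone settings; pannerSmokeTest is an illustrative name):

```javascript
// Smallest possible panner check: one oscillator, one PannerNode, no options
// beyond the position. Flip x between -1 and 1 and listen for the side change.
function pannerSmokeTest(audioCtx, x) {
  var osc = audioCtx.createOscillator();
  var panner = audioCtx.createPanner();
  osc.frequency.value = 440;
  panner.setPosition(x, 0, 0); // x = -1 → left, x = 1 → right
  osc.connect(panner);
  panner.connect(audioCtx.destination);
  osc.start(0);
  osc.stop(audioCtx.currentTime + 1);
  return panner;
}
```

If this pans correctly but the full app does not, the problem is in the stream wiring (or, as it turned out below, the output device) rather than the panner parameters.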
Figured this out for myself.
The problem was not the code; it was that I was connected over Bluetooth audio.
Bluetooth apparently can only do stereo audio with the microphone turned off. As soon as you activate the mic, that steals one of the channels and the audio output downgrades to mono.
With mono audio you definitely cannot do 3D positioned sound, hence my thinking the code was not working.

JavaScript Web Audio API not properly connected to each other

I have created Web Audio API biquad filters (lowpass, highpass, etc.) using JavaScript. The application works (I think...); it displays on the canvas without errors, so I'm guessing it does. Anyway, I'm not a pro at JavaScript, far from it. I showed someone a small snippet of my code and they said it was very messy and that I'm not building my audio graph properly, for example not connecting all of the nodes from start to finish.
Now I know that the source connects to the gain, the gain connects to the filter, and the filter connects to the destination. I tried to look at it but I can't figure out what's wrong and how to fix it.
JavaScript
// Play the sound.
function playSound(buffer) {
    aSoundSource = audioContext.createBufferSource(); // creates a sound source.
    aSoundSource.buffer = buffer; // tell the source which sound to play.
    aSoundSource.connect(analyser); // Connect the source to the analyser.
    analyser.connect(audioContext.destination); // Connect the analyser to the context's destination (the speakers).
    aSoundSource.start(0); // play the source now.
    //Create Filter
    var filter = audioContext.createBiquadFilter();
    //Create the audio graph
    aSoundSource.connect(filter);
    //Set the gain node
    gainNode = audioContext.createGain();
    aSoundSource.connect(gainNode); //Connect the source to the gain node
    gainNode.connect(audioContext.destination);
    //Set the current volume
    var volume = document.getElementById('volume').value;
    gainNode.gain.value = volume;
    //Create and specify parameters for Low-Pass Filter
    filter.type = "lowpass"; //Low pass filter
    filter.frequency.value = 440;
    //End Filter
    //Connect source to destination(speaker)
    filter.connect(audioContext.destination);
    //Set the playing flag
    playing = true;
    //Clear the spectrogram canvas
    var canvas = document.getElementById("canvas2");
    var context = canvas.getContext("2d");
    context.fillStyle = "rgb(255,255,255)";
    context.fillRect(0, 0, spectrogramCanvasWidth, spectrogramCanvasHeight);
    // Start visualizer.
    requestAnimFrame(drawVisualisation);
}
Because of this, my volume bar thingy has stopped working. I also tried doing "Highpass filter" but it's displaying the same thing. I'm confused and have no one else to ask. By the way, the person I asked didn't help but just said it's messy...
Appreciate all of the help guys and thank you!
So, there's a lot of context missing because of how you posted this - e.g. you don't include your drawVisualisation() code, and you don't explain exactly what you mean by your "volume bar thingy has stopped working".
My guess is that you have a graph that connects your source node to the output (audioContext.destination) three times in parallel: through the analyser (which is a pass-through, and is connected to the output), through the filter, AND through the gain node. Your analyser in this case would show the unfiltered signal only (you won't see any effect from the filter, because that's a parallel signal path), and the actual output sums three chains of the source node (one through the filter, one through the analyser, one through the gain node) - which might have some odd phasing effects, but will also roughly triple the volume and quite possibly clip.
Your graph looks like this:
source → analyser → destination
  ↳ filter → destination
  ↳ gain → destination
What you probably want is to connect each of these nodes in series, like this:
source → filter → gain → analyser → destination
I think you want something like this:
// Play the sound.
function playSound(buffer) {
    aSoundSource = audioContext.createBufferSource(); // creates a sound source.
    aSoundSource.buffer = buffer; // tell the source which sound to play.

    //Create Filter
    var filter = audioContext.createBiquadFilter();
    //Create and specify parameters for Low-Pass Filter
    filter.type = "lowpass"; //Low pass filter
    filter.frequency.value = 440;

    //Create the gain node
    gainNode = audioContext.createGain();
    //Set the current volume
    var volume = document.getElementById('volume').value;
    gainNode.gain.value = volume;

    //Set up the audio graph: source → filter → gain → analyser → destination
    aSoundSource.connect(filter);
    filter.connect(gainNode);
    gainNode.connect(analyser);
    analyser.connect(audioContext.destination);

    aSoundSource.start(0); // play the source now.

    //Set the playing flag
    playing = true;

    //Clear the spectrogram canvas
    var canvas = document.getElementById("canvas2");
    var context = canvas.getContext("2d");
    context.fillStyle = "rgb(255,255,255)";
    context.fillRect(0, 0, spectrogramCanvasWidth, spectrogramCanvasHeight);

    // Start visualizer.
    requestAnimFrame(drawVisualisation);
}

ChannelMergerNode in Web audio API not merging channels

I'm trying to use the web audio API to create an audio stream with the left and right channels generated with different oscillators. The output of the left channel is correct, but the right channel is 0. Based on the spec, I can't see what I'm doing wrong.
Tested in Chrome dev.
Code:
var context = new AudioContext();
var l_osc = context.createOscillator();
l_osc.type = "sine";
l_osc.frequency.value = 100;
var r_osc = context.createOscillator();
r_osc.type = "sawtooth";
r_osc.frequency.value = 100;
// Combine the left and right channels.
var merger = context.createChannelMerger(2);
merger.channelCountMode = "explicit";
merger.channelInterpretation = "discrete";
l_osc.connect(merger, 0, 0);
r_osc.connect(merger, 0, 1);
var dest_stream = context.createMediaStreamDestination();
merger.connect(dest_stream);
// Dump the generated waveform to a MediaStream output.
l_osc.start();
r_osc.start();
var track = dest_stream.stream.getAudioTracks()[0];
var plugin = document.getElementById('plugin');
plugin.postMessage(track);
The channelInterpretation means the merger node will up-mix each oscillator connection to two channels - but then, because you have an explicit channelCountMode, it stacks the two channels per connection to get four channels and (because it's explicit) just drops the top two. Unfortunately, those top two channels are the two channels from the second input - so it loses all channels from the second connection.
In general, you shouldn't need to mess with the channelCount interpretation stuff, and it will do the right thing for stereo.
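A sketch of that simplification (a hypothetical mergeStereo helper; it keeps the question's wiring but drops the two property assignments):

```javascript
// Each mono oscillator feeds one merger input; a ChannelMerger's defaults
// already treat every input as a single channel, so no channelCountMode /
// channelInterpretation overrides are needed.
function mergeStereo(context, l_osc, r_osc) {
  var merger = context.createChannelMerger(2);
  l_osc.connect(merger, 0, 0); // sine → output channel 0 (left)
  r_osc.connect(merger, 0, 1); // sawtooth → output channel 1 (right)
  return merger;
}
```

The returned merger would then be connected to the MediaStreamAudioDestinationNode exactly as in the question.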
