Web audio: Output to center speaker only - javascript

I'm playing around with the Web Audio API and trying to understand how the splitter node works.
I've managed to split an oscillator node and connect it so that it outputs only to the left or right channel, but I'm having a hard time outputting it to the "center" channel alone.
Given the following code:
const ac = new AudioContext();
const splitter = ac.createChannelSplitter(6);
const oscillator = ac.createOscillator();
const merger = ac.createChannelMerger(6);
oscillator.frequency.value = 440;
const gainNode = ac.createGain();
// connect the oscillator to the splitter
oscillator.connect(splitter);
gainNode.gain.setValueAtTime(0.5, ac.currentTime);
splitter.connect(gainNode, 0);
gainNode.connect(merger, 0, 2);
merger.connect(ac.destination);
oscillator.start(0);
oscillator.stop(1);
I expect the oscillator tone to be output only from my "center" speaker, but I hear it from both the left and right speakers. What am I missing?

This depends a lot on how your output device is configured. If you only have stereo output speakers, the center channel will be mixed down and heard in both speakers.
For these kinds of things, I find it best to use a 6-channel OfflineAudioContext and then look at the channels of the rendered output. I don't know if it's still maintained, but I find Canopy very useful for this kind of exploration.
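As a rough sketch of that approach (assuming a one-second render at 44.1 kHz; the channel index matches the merger input used above):
const oac = new OfflineAudioContext(6, 44100, 44100);
const osc = oac.createOscillator();
const merger = oac.createChannelMerger(6);
osc.frequency.value = 440;
osc.connect(merger, 0, 2); // route the mono oscillator to input 2 ("center")
merger.connect(oac.destination);
osc.start(0);
oac.startRendering().then((buffer) => {
  for (let ch = 0; ch < buffer.numberOfChannels; ch++) {
    const data = buffer.getChannelData(ch);
    const peak = data.reduce((max, v) => Math.max(max, Math.abs(v)), 0);
    console.log(`channel ${ch} peak: ${peak}`); // only channel 2 should be non-zero
  }
});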

Related

JavaScript new AudioContext mono to stereo not working

I need to convert the mp3 files I play in a project from mono to stereo with the Web Audio API, but I can't do that with a1 = new Audio('/1.mp3');, and my entire system is built around that pattern. Is there a way to convert a sound created with new Audio('/1.mp3') to stereo, so that all the sounds playing on the page become stereo?
var a1 = new Audio('/1.mp3');
a1.volume = 0.5;
a1.play();
I am using a simple code structure like the one above.
https://stackoverflow.com/a/54990085/15929287
I couldn't adapt the above answer to my case. I can't find any way to convert a sound I created with new Audio() to stereo. In the linked example, an oscillator is used as the source. I'm just trying to build something where I can adjust the mono/stereo setting. I need your help.
The linked answer assumes the mono audio source is an oscillator created with the Web Audio API, but it is also possible to use an audio element as the source via a MediaElementAudioSourceNode.
The audio graph would then need to be wired up like this:
const audio = new Audio('/1.mp3');
const audioCtx = new AudioContext();
const source = audioCtx.createMediaElementSource(audio);
const gainNodeL = audioCtx.createGain();
const gainNodeR = audioCtx.createGain();
const merger = audioCtx.createChannelMerger(2);
source.connect(gainNodeL);
source.connect(gainNodeR);
gainNodeL.connect(merger, 0, 0);
gainNodeR.connect(merger, 0, 1);
merger.connect(audioCtx.destination);
Please note that it's probably still necessary to call resume() on the AudioContext in response to a user gesture to make sure the AudioContext is running.
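For example, a minimal sketch (the button selector here is just a placeholder for whatever element actually triggers playback):
document.querySelector('button').addEventListener('click', () => {
  // Autoplay policies typically create the AudioContext in the
  // "suspended" state, so resume it from a user gesture.
  audioCtx.resume().then(() => audio.play());
});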

Web Audio : Play stereo (2 channel) mp3 so only hear channel 1 in left ear and channel 2 in right

I have a stereo mp3 file. Is it possible using Web Audio to play the file so that channel 1 is only heard through the left speaker and channel 2 is only heard through the right speaker?
I've played with splitter, panner and merger nodes (channelCount, channelInterpretation, channelCountMode), but the result always seems to get mixed somewhere along the way. For example, if I attach a blank source as a second input to the merger, I still hear channel 1 clearly in the right speaker.
See below for an example I have tried, with various permutations of connection inputs and outputs.
audioCtx = new window.AudioContext();
var source = audioCtx.createMediaElementSource(myAudio);
var blank = audioCtx.createBufferSource();
var splitter = audioCtx.createChannelSplitter(2);
source.connect(splitter);
var panner = audioCtx.createPanner();
panner.setPosition(-1,0,0);
splitter.connect(panner, 0, 0);
var merger = audioCtx.createChannelMerger(2);
panner.connect(merger, 0, 0);
blank.connect(merger, 0, 1);
merger.connect(audioCtx.destination);
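For what it's worth, the most direct wiring for this kind of hard left/right separation is usually to drop the panner and blank source entirely and connect the splitter's outputs straight to the merger's inputs. A minimal sketch, reusing the myAudio element from above:
const ctx = new AudioContext();
const src = ctx.createMediaElementSource(myAudio);
const split = ctx.createChannelSplitter(2);
const merge = ctx.createChannelMerger(2);
src.connect(split);
split.connect(merge, 0, 0); // channel 1 -> left only
split.connect(merge, 1, 1); // channel 2 -> right only
merge.connect(ctx.destination);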

JavaScript Web Audio API not properly connected to each other

I have created Web Audio API biquad filters (lowpass, highpass, etc.) using JavaScript. The application works well, I think; it's displaying on the canvas without errors, so I'm guessing it does. Anyway, I'm not a pro at JavaScript, far from it. I showed someone a small snippet of my code and they said it was very messy and that I'm not building my audio graph properly, for example not connecting all of the nodes from start to finish.
Now, I know that the source connects to the gain, the gain connects to the filter, and the filter connects to the destination. I tried to look at it, but I can't figure out what's wrong or how to fix it.
JavaScript
// Play the sound.
function playSound(buffer) {
    aSoundSource = audioContext.createBufferSource(); // creates a sound source.
    aSoundSource.buffer = buffer; // tell the source which sound to play.
    aSoundSource.connect(analyser); // Connect the source to the analyser.
    analyser.connect(audioContext.destination); // Connect the analyser to the context's destination (the speakers).
    aSoundSource.start(0); // play the source now.

    //Create Filter
    var filter = audioContext.createBiquadFilter();

    //Create the audio graph
    aSoundSource.connect(filter);

    //Set the gain node
    gainNode = audioContext.createGain();
    aSoundSource.connect(gainNode); //Connect the source to the gain node
    gainNode.connect(audioContext.destination);

    //Set the current volume
    var volume = document.getElementById('volume').value;
    gainNode.gain.value = volume;

    //Create and specify parameters for Low-Pass Filter
    filter.type = "lowpass"; //Low pass filter
    filter.frequency.value = 440;
    //End Filter

    //Connect source to destination(speaker)
    filter.connect(audioContext.destination);

    //Set the playing flag
    playing = true;

    //Clear the spectrogram canvas
    var canvas = document.getElementById("canvas2");
    var context = canvas.getContext("2d");
    context.fillStyle = "rgb(255,255,255)";
    context.fillRect(0, 0, spectrogramCanvasWidth, spectrogramCanvasHeight);

    // Start visualizer.
    requestAnimFrame(drawVisualisation);
}
Because of this, my volume bar thingy has stopped working. I also tried doing "Highpass filter" but it's displaying the same thing. I'm confused and have no one else to ask. By the way, the person I asked didn't help but just said it's messy...
Appreciate all of the help guys and thank you!
So, there's a lot of context missing because of how you posted this; e.g. you haven't included your drawVisualisation() code, and you don't explain exactly what you mean by your "volume bar thingy has stopped working".
My guess is that you have a graph that connects your source node to the output (audioContext.destination) three times in parallel: through the analyser (which is a pass-through connected to the output), through the filter, AND through the gain node. Your analyser in this case will show only the unfiltered signal (you won't see any effect from the filter, because that's a parallel signal path), and the actual output is the sum of three chains from the source node (one through the filter, one through the analyser, one through the gain node), which might cause some odd phasing effects, but will also roughly triple the volume and quite possibly clip.
Your graph currently looks like this:
source → analyser → destination
  ↳ filter → destination
  ↳ gain → destination
What you probably want is to connect these nodes in series instead:
source → filter → gain → analyser → destination
I think you want something like this:
// Play the sound.
function playSound(buffer) {
    aSoundSource = audioContext.createBufferSource(); // creates a sound source.
    aSoundSource.buffer = buffer; // tell the source which sound to play.

    //Create Filter
    var filter = audioContext.createBiquadFilter();

    //Create and specify parameters for Low-Pass Filter
    filter.type = "lowpass"; //Low pass filter
    filter.frequency.value = 440;

    //Create the gain node
    gainNode = audioContext.createGain();

    //Set the current volume
    var volume = document.getElementById('volume').value;
    gainNode.gain.value = volume;

    //Set up the audio graph in series: source → filter → gain → analyser → destination
    aSoundSource.connect(filter);
    filter.connect(gainNode);
    gainNode.connect(analyser);
    analyser.connect(audioContext.destination);

    aSoundSource.start(0); // play the source now.

    //Set the playing flag
    playing = true;

    //Clear the spectrogram canvas
    var canvas = document.getElementById("canvas2");
    var context = canvas.getContext("2d");
    context.fillStyle = "rgb(255,255,255)";
    context.fillRect(0, 0, spectrogramCanvasWidth, spectrogramCanvasHeight);

    // Start visualizer.
    requestAnimFrame(drawVisualisation);
}

ChannelMergerNode in Web audio API not merging channels

I'm trying to use the web audio API to create an audio stream with the left and right channels generated with different oscillators. The output of the left channel is correct, but the right channel is 0. Based on the spec, I can't see what I'm doing wrong.
Tested in Chrome dev.
Code:
var context = new AudioContext();
var l_osc = context.createOscillator();
l_osc.type = "sine";
l_osc.frequency.value = 100;
var r_osc = context.createOscillator();
r_osc.type = "sawtooth";
r_osc.frequency.value = 100;
// Combine the left and right channels.
var merger = context.createChannelMerger(2);
merger.channelCountMode = "explicit";
merger.channelInterpretation = "discrete";
l_osc.connect(merger, 0, 0);
r_osc.connect(merger, 0, 1);
var dest_stream = context.createMediaStreamDestination();
merger.connect(dest_stream);
// Dump the generated waveform to a MediaStream output.
l_osc.start();
r_osc.start();
var track = dest_stream.stream.getAudioTracks()[0];
var plugin = document.getElementById('plugin');
plugin.postMessage(track);
The channelInterpretation setting means the merger node will mix each oscillator connection up to two channels; but then, because you set an explicit channelCountMode, it stacks the two channels per connection to get four channels and (because the mode is explicit) simply drops the top two. Unfortunately, those top two channels are the two channels from the second input, so everything from the second connection is lost.
In general, you shouldn't need to touch the channelCount, channelCountMode, or channelInterpretation settings; the defaults will do the right thing for stereo.
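In other words, dropping the two channel-configuration lines should be enough. A minimal sketch of the relevant part, reusing the names from the snippet above:
var merger = context.createChannelMerger(2);
// Leave channelCountMode and channelInterpretation at their defaults.
l_osc.connect(merger, 0, 0); // left
r_osc.connect(merger, 0, 1); // right
merger.connect(dest_stream);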

Web Audio: No sound in right channel

I'm trying to create a custom panning control using the Web Audio API, but I can't get any sound to come out of the right channel using the channel splitter and merger nodes:
var context = new webkitAudioContext(),
destination = context.destination,
osc = context.createOscillator(),
gainL = context.createGainNode(),
gainR = context.createGainNode(),
splitter = context.createChannelSplitter(2),
merger = context.createChannelMerger(2);
osc.frequency.value = 500;
osc.connect(splitter);
splitter.connect(gainL, 0);
splitter.connect(gainR, 1);
gainL.connect(merger, 0, 0);
gainR.connect(merger, 0, 1);
osc.noteOn(0);
gainL.gain.value = 0.1;
gainR.gain.value = 0.5;
osc.noteOff(2);
merger.connect(destination);
Am I missing something obvious here? There's a JSBin preview of the above code here: http://jsbin.com/ayijoy/1/
I'm running Chrome v24.0.1312.57, just in case that's of any use.
My best guess is that this happens because the oscillator outputs a mono signal, so the splitter's second output (the one feeding gainR) carries nothing. Try using a stereo source and you should have more luck.
Edit: here's how you can pan a mono signal: bypass the splitter, since there is no stereo signal to split, and connect the oscillator directly to the two gains; then connect the two mono signals to the merger after adjusting the gain for each channel: http://jsbin.com/ayijoy/16/
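In case the JSBin link goes stale, the wiring described there looks roughly like this (a sketch using the current API names rather than the deprecated createGainNode/noteOn):
const ctx = new AudioContext();
const osc = ctx.createOscillator();
const gainL = ctx.createGain();
const gainR = ctx.createGain();
const merger = ctx.createChannelMerger(2);
osc.frequency.value = 500;
osc.connect(gainL); // no splitter: feed the mono oscillator to both gains
osc.connect(gainR);
gainL.gain.value = 0.1;
gainR.gain.value = 0.5;
gainL.connect(merger, 0, 0); // left channel
gainR.connect(merger, 0, 1); // right channel
merger.connect(ctx.destination);
osc.start(0);
osc.stop(ctx.currentTime + 2);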
