Create custom oscillator with long periodic wave - javascript

I'm trying to create a complex periodic sound with a long period. I want to define frequencies as accurately as I can, so I'm using a frequency step of sampleRate*0.5/tableLen. But I have some issues with large wave tables: the sound becomes distorted and loses high frequencies.
Here is a minimal example with a ~440 Hz sine wave. When I use a table of length 8192, the resulting sine wave is quite recognizable:
https://jsfiddle.net/zxqzntf0/
var gAudioCtx = new AudioContext();
var osc = gAudioCtx.createOscillator();
var tableSize = 8192;
var real = new Float32Array(tableSize);
var imag = new Float32Array(tableSize);
var freq = 440;
var step = gAudioCtx.sampleRate*0.5/tableSize; // frequency resolution of the table
real[Math.floor(freq/step)] = 1; // put a single partial into the bin nearest 440 Hz
var wave = gAudioCtx.createPeriodicWave(real, imag, {disableNormalization: true});
osc.frequency.value = step; // fundamental = one "bin" of the table
osc.setPeriodicWave(wave);
osc.connect(gAudioCtx.destination);
osc.start();
But when I increase the table size, I get something strange. The result is not a sine wave at all!
https://jsfiddle.net/0cc75nnm/
This problem reproduces in all browsers (Chrome, Firefox, Edge), so it doesn't seem to be a browser bug, but I've found nothing about it in the documentation.
Added
I found that if the oscillator frequency is a whole number >= 2 Hz, I get no artifacts in the resulting sound with a table size of 16384. That is acceptable for my needs for now, but someday I may want to create longer periods. If someone explains why I get sound artifacts when the step is less than 2 Hz, I will accept their answer.
Here is an example of a complex melody that I generate in JavaScript:
https://jsfiddle.net/h9rfzrnL/1/

You're creating your periodic waves incorrectly. When filling the arrays for the periodic wave, assume the sample rate is 1. Then if you want an oscillator at a frequency of 440 Hz, set the oscillator frequency to 440 Hz.
Thus, for a sine wave, the real array should be all zeroes and the imaginary array is [0, 1]. (You're actually creating a cosine wave, but that doesn't really matter too much.)
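For example, a minimal sketch of that approach for a 440 Hz sine (not the asker's fiddle; element k of the arrays describes the k-th harmonic of the oscillator's own frequency):
var ctx = new AudioContext();
var osc = ctx.createOscillator();
var real = new Float32Array([0, 0]); // no cosine terms
var imag = new Float32Array([0, 1]); // fundamental only -> a sine
var wave = ctx.createPeriodicWave(real, imag, {disableNormalization: true});
osc.setPeriodicWave(wave);
osc.frequency.value = 440;           // the oscillator frequency is the actual pitch
osc.connect(ctx.destination);
osc.start();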

Related

How to create beat by multiplying square wave with tone?

See the code below. How I understand things:
beat is a square wave oscillating between -1 and 1.
Connecting beat to multiplier.gain adds the square wave of beat to the default gain of 1. The result is a gain that oscillates between 0 and 2.
As tone is connected to multiplier, I expect to hear a 440 Hz tone for two seconds, then silence for two seconds, then the tone again, and so on.
However, where I expect the gain to be 0, I still hear a tone, only muted. What am I doing wrong?
I tested with Chrome 74 and Firefox 66, both on Windows 10.
Code:
<!doctype html>
<meta charset=utf-8>
<script>
var context = new window.AudioContext();
var tone = context.createOscillator();
var beat = context.createOscillator();
beat.frequency.value = 0.25;
beat.type = "square";
var multiplier = context.createGain();
tone.connect(multiplier);
beat.connect(multiplier.gain);
multiplier.connect(context.destination);
tone.start();
beat.start();
</script>
<button onclick="context.resume()">Play</button>
The problem is that the 'square' type doesn't really oscillate between -1 and 1. The range is more or less from -0.848 to 0.848. Setting the GainNode's gain AudioParam to this value should work.
multiplier.gain.value = 0.848;
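Dropped into the question's code, that means setting the base gain right after creating the GainNode (a sketch; the exact peak value can vary slightly between implementations):
var multiplier = context.createGain();
multiplier.gain.value = 0.848;   // base gain matches the square wave's actual peak
tone.connect(multiplier);
beat.connect(multiplier.gain);   // the summed gain now swings between ~0 and ~1.7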
To see the actual output of an oscillator you could, for example, use Canopy. It can run Web Audio code and then visualize the results.
If you do for example execute the following snippet, it will show you the corresponding waveform.
var osc = new OscillatorNode(context);
osc.type = "square";
osc.connect(context.destination);
osc.start();
I hope this helps.

Wavetable Synthesis - WebAudioApi

I am trying to create a wavetable synthesizer using the Web Audio API. What I would like to achieve is the possibility to morph linearly from one waveform to another (like Massive or Serum).
For example: starting from a sine wave, I rotate a knob that gradually transforms it into a square wave.
I've searched the documentation and so far I've found how to create a custom waveform:
var real = new Float32Array(2);
var imag = new Float32Array(2);
var ac = new AudioContext();
var osc = ac.createOscillator();
real[0] = 0;
imag[0] = 0;
real[1] = 1;
imag[1] = 0;
var wave = ac.createPeriodicWave(real, imag, {disableNormalization: true});
osc.setPeriodicWave(wave);
osc.connect(ac.destination);
osc.start();
osc.stop(2);
The main problem is that this waveform is static; I am not able to change it gradually into something else.
How can I achieve my goal? I was thinking of two gain nodes, one placed after each oscillator, working complementarily to each other.
For example: my sine wave goes into Gain1, which is 10, and my square wave into Gain2, which is 0. Then I change them in a complementary way: Gain1 = 5, Gain2 = 5, and so on.
Is that a valid approach?
IIUC, I don't think using a set of gain nodes will produce what you want, and there's no built-in node to do this.
I think you will have to do this yourself with an AudioWorkletNode.
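For instance, here is a rough sketch (my own, untested) of an AudioWorkletProcessor that linearly interpolates between two single-cycle tables. The processor name, parameter names, and table contents are placeholders, and the naive square table will alias at high frequencies:
// morph-processor.js (hypothetical module file)
class MorphProcessor extends AudioWorkletProcessor {
  static get parameterDescriptors() {
    return [
      { name: "frequency", defaultValue: 440, minValue: 0 },
      { name: "morph", defaultValue: 0, minValue: 0, maxValue: 1 }
    ];
  }

  constructor() {
    super();
    const size = 2048;
    this.phase = 0;
    // Two single-cycle tables: a sine and a naive (aliasing) square.
    this.tableA = new Float32Array(size);
    this.tableB = new Float32Array(size);
    for (let i = 0; i < size; i++) {
      this.tableA[i] = Math.sin(2 * Math.PI * i / size);
      this.tableB[i] = i < size / 2 ? 1 : -1;
    }
  }

  process(inputs, outputs, parameters) {
    const out = outputs[0][0];
    const freq = parameters.frequency[0];
    const morph = parameters.morph[0];
    const size = this.tableA.length;
    for (let i = 0; i < out.length; i++) {
      const idx = Math.floor(this.phase * size);
      // Linear blend between the two tables -- this is the "knob".
      out[i] = (1 - morph) * this.tableA[idx] + morph * this.tableB[idx];
      this.phase += freq / sampleRate; // sampleRate is a worklet global
      if (this.phase >= 1) this.phase -= 1;
    }
    return true;
  }
}

registerProcessor("morph-processor", MorphProcessor);
On the main thread you would load the module with context.audioWorklet.addModule("morph-processor.js"), create the node with new AudioWorkletNode(context, "morph-processor"), and drive node.parameters.get("morph") from your knob.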

Getting dampingRatio to work with Box2D and DistanceJoint

I have tried unsuccessfully on several projects to get a distance joint to stop swinging forever in Box2D for JavaScript. No matter what values I set for the density of the bodies and the dampingRatio and frequencyHz of the distance joint definition, the result is that you pick up one end and the other end swings endlessly. I want the swing to get smaller and then stop after a few swings.
// I have made a world and bodies with density of 1 (although I have tried bigger)
var distanceJointDef = new b2DistanceJointDef();
distanceJointDef.Initialize(circleBody, triBody, circleBody.GetWorldCenter(), triBody.GetWorldCenter());
distanceJointDef.dampingRatio = 1; // tried .5, 20, etc. no difference
distanceJointDef.frequencyHz = 30; // tried all sorts of numbers
world.CreateJoint(distanceJointDef);
The joint works - but the damping does not. Any help would be appreciated. Here is a link to the Box2D I am using: https://github.com/joelgwebber/bench2d/tree/master/js/Box2dWeb-2.1a.3
The answer is to put linear damping on the objects you are swinging:
var definition = new b2BodyDef();
definition.linearDamping = 0.5; // values toward 1 slow the body quickly
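For example, a sketch of where the damping goes (same Box2dWeb-2.1a.3 namespaces as the question; fixture setup omitted):
var b2BodyDef = Box2D.Dynamics.b2BodyDef,
    b2Body = Box2D.Dynamics.b2Body;

var definition = new b2BodyDef();
definition.type = b2Body.b2_dynamicBody;
definition.linearDamping = 0.5;   // values toward 1 bleed off velocity quickly
definition.angularDamping = 0.5;  // optionally damp spinning as well
definition.position.Set(5, 5);
var circleBody = world.CreateBody(definition);
// ...create triBody the same way, then the distance joint exactly as in the question.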

Why the size of the analysis array needs to be half of the fftSize aka frequencyBinCount

Trying to make sense of the Web Audio API spec.
What is the reason that we are using the frequencyBinCount and not the fftSize for the size of the analysis array when getting the frequency data?
And should we use frequencyBinCount or the fftSize for the size of the array when getting the time domain data?
And the last question: the spec mentions that if we pass an array larger than frequencyBinCount, the excess elements will be ignored; but what happens if we pass a smaller array?
So:
var analyser = context.createAnalyser();
analyser.fftSize = 1024;
// should fftSize be used?
// or frequencyBinCount?
// what happens if the size is smaller than fftSize?
var timeArray = new Float32Array(analyser.fftSize);
//why are we using frequencyBinCount and not fftSize?
var freqArray = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(freqArray);
analyser.getFloatTimeDomainData(timeArray);
It's true that, generally, an FFT of size N will give you N frequency bins. When you're analyzing "real" signals, though, half of these bins are redundant. Specifically, the second half of the FFT mirrors the first: counting bins from 1, bins 2..(N/2)+1 equal bins N..(N/2)+1 (up to complex conjugation). Since all audio signals are "real", this symmetry holds for any FFT you do in the Web Audio API, so the result contains only N/2 unique values.
In other words, the analysis array has size N/2 because that's the size of the result. A larger array would be wasteful.
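Concretely, with the sizes from the question (as I read the spec, the time-domain call fills up to fftSize values, while the frequency call fills frequencyBinCount values):
var analyser = context.createAnalyser();
analyser.fftSize = 1024;                                    // N time-domain samples
var timeArray = new Float32Array(analyser.fftSize);         // 1024: one value per sample
var freqArray = new Uint8Array(analyser.frequencyBinCount); // 512: only N/2 unique bins
analyser.getFloatTimeDomainData(timeArray);
analyser.getByteFrequencyData(freqArray);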
A more rigorous discussion of FFT symmetry is here: https://dsp.stackexchange.com/questions/4825/why-is-the-fft-mirrored

How can I set a phase offset for an OscillatorNode in the Web Audio API?

I'm trying to implement Stereo Phase as it is described here: http://www.image-line.com/support/FLHelp/html/plugins/3x%20OSC.htm
"Stereo Phase (SP) - Allows you to set different phase offset for the left and right channels of the generator. The offset results in the oscillator starting at a different point on the oscillator's shape (for example, start at the highest value of the sine function instead at the zero point). Stereo phase offset adds to the richness and stereo panorama of the sound produced."
I'm trying to achieve this for an OscillatorNode. My only idea is to use createPeriodicWave (https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#dfn-createPeriodicWave). However, the description of createPeriodicWave in the specification is above my understanding, and I have not found any examples via Google.
Any help in deciphering the description of createPeriodicWave would be helpful as would any other ideas about how to achieve this effect.
Thanks!
Mcclellan and others,
This answer helped and subsequently warped me into the world of Fourier. With the help of a page on the subject and some Wikipedia, I think I got the square and sawtooth patterns down, but the triangle pattern still eludes me. Does anyone know?
It indeed gives you the ability to phase shift, as this article by Nick Thompson explains (he names the AudioContext methods differently, but the principle is the same).
As far as the square and sawtooth patterns go:
var n = 4096;
var real = new Float32Array(n);
var imag = new Float32Array(n);
var ac = new AudioContext();
var osc = ac.createOscillator();
/* Pick a wave pattern */
/* Sine
imag[1] = 1; */
/* Sawtooth
for(x=1;x<n;x++)
imag[x] = 2.0 / (Math.pow(-1, x) * Math.PI * x); */
/* Square */
for(x=1;x<n;x+=2)
imag[x] = 4.0 / (Math.PI * x);
var wave = ac.createPeriodicWave(real, imag);
osc.setPeriodicWave(wave);
osc.connect(ac.destination);
osc.start();
osc.stop(2); /* Have it stop after 2 seconds */
This will play the activated pattern, the square pattern in this case. What would the triangle formula look like?
A simple way to fake it would be to add separate delay nodes to the left and right channels, and give them user-controlled delay values. This would be my approach and will have more-or-less the same effect as a phase setting.
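A minimal sketch of that delay approach (the node names and numbers are mine): delaying one channel by phase / (2 * PI * frequency) seconds approximates the phase offset.
var ctx = new AudioContext();
var osc = ctx.createOscillator();      // 440 Hz by default
var delayL = ctx.createDelay();
var delayR = ctx.createDelay();
var merger = ctx.createChannelMerger(2);
// Offset the right channel by a quarter cycle (PI/2 radians).
delayR.delayTime.value = (Math.PI / 2) / (2 * Math.PI * osc.frequency.value);
osc.connect(delayL);
osc.connect(delayR);
delayL.connect(merger, 0, 0);          // left channel
delayR.connect(merger, 0, 1);          // right channel
merger.connect(ctx.destination);
osc.start();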
If you want to use createPeriodicWave, unfortunately you'll probably have to understand the somewhat difficult math behind it.
Basically, you'll first have to represent your waveform as a sum of sine wave "partials". All periodic waves have some representation of this form. Then, once you've found the relative magnitudes of each partial, you'll have to phase shift them separately for left and right channels by multiplying each by a complex number. You can read more details about representing periodic waves as sums of sine waves here: http://music.columbia.edu/cmc/musicandcomputers/chapter3/03_03.php
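As a rough sketch (my own, not from the linked article) of what that rotation looks like for the sawtooth partials used earlier, where phi is the phase offset in radians and harmonic k gets rotated by k*phi:
function makeShiftedSawtooth(ac, phi, n) {
  var real = new Float32Array(n);
  var imag = new Float32Array(n);
  for (var k = 1; k < n; k++) {
    var a = 0;                                      // sawtooth has no cosine partials
    var b = 2.0 / (Math.pow(-1, k) * Math.PI * k);  // sine partials, as above
    // Rotate partial k by k * phi.
    real[k] = a * Math.cos(k * phi) + b * Math.sin(k * phi);
    imag[k] = b * Math.cos(k * phi) - a * Math.sin(k * phi);
  }
  return ac.createPeriodicWave(real, imag);
}
You would build one wave per channel with a different phi, feed each to its own oscillator, and merge the two oscillators into the left and right channels.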
Using createPeriodicWave has a significant advantage over using a BufferSourceNode: createPeriodicWave waveforms will automatically avoid aliasing. It's rather difficult to avoid aliasing if you're generating the waveforms "by hand" in a buffer.
I do not think it is possible to set a phase offset on an OscillatorNode directly.
A way to do that would be to use context.createBuffer to generate a sine wave buffer (or any other waveform you want), set it as the buffer of an AudioBufferSourceNode, and then use the offset parameter of its start() method. But you need to calculate the offset in seconds.
var buffer = context.createBuffer(1, 1024, 44100);
var data = buffer.getChannelData(0);
for (var i = 0; i < data.length; i++) {
  // generate one cycle of the waveform, e.g. a sine:
  data[i] = Math.sin(2 * Math.PI * i / data.length);
}
var osc = context.createBufferSource();
osc.buffer = buffer;
osc.loop = true;
osc.connect(context.destination);
osc.start(when, offsetInSeconds); // the offset (in seconds) determines the starting phase
According to this article on Wolfram MathWorld, the triangle wave can be constructed like this:
/* Triangle */
for (var x = 1; x < n; x += 2)
  imag[x] = 8.0 / Math.pow(Math.PI, 2) * Math.pow(-1, (x-1)/2) / Math.pow(x, 2);
Also helpful by the way is the Wikipedia page that actually shows how the Fourier constructions work.
function getTriangleWave(imag, n) {
for (var i = 1; i < n; i+=2) {
imag[i] = (8*Math.sin(i*Math.PI/2))/(Math.pow((Math.PI*i), 2));
}
return imag;
}
With Chrome 66 adding AudioWorklets, you can write sound-processing code much as you could with the now-deprecated ScriptProcessorNode.
I made a convenience library using this: it behaves like a normal Web Audio API OscillatorNode, but its phase (among other things) can also be varied. You can find it here.
const context = new AudioContext()
context.audioWorklet.addModule("worklet.js").then(() => {
const osc = new BetterOscillator(context)
osc.parameters.get("phase").value = Math.PI / 4
osc.connect(context.destination)
osc.start()
})
