I am trying to create a wavetable synthesizer using the Web Audio API. What I would like to achieve is the ability to linearly morph from one waveform into another (like Massive or Serum).
For example: starting from a sine wave, I rotate a knob that gradually transforms it into a square wave.
I've searched the documentation and so far I've found how to create a custom waveform:
var real = new Float32Array(2);
var imag = new Float32Array(2);
var ac = new AudioContext();
var osc = ac.createOscillator();
real[0] = 0;
imag[0] = 0;
real[1] = 1;
imag[1] = 0;
var wave = ac.createPeriodicWave(real, imag, {disableNormalization: true});
osc.setPeriodicWave(wave);
osc.connect(ac.destination);
osc.start();
osc.stop(2);
The main problem is that this waveform is static; I am not able to change it gradually into something else.
How can I achieve my goal? I was thinking about two gain nodes placed after each waveform, working complementarily to each other.
For example: my sine wave goes into Gain1, which is 10, and my square wave into Gain2, which is 0. Then I change them complementarily: Gain1 = 5, Gain2 = 5, and so on.
Is it a valid approach?
IIUC, I don't think a set of gain nodes will produce what you want, and there's no built-in node to do this.
I think you will have to do this yourself with an AudioWorkletNode.
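To make the suggestion concrete, here is a sketch of the per-sample work such a worklet's `process()` callback could do (the `morphSample` helper and the table names are mine, not part of any API): read the same phase position from two single-cycle tables and linearly crossfade between them with a 0..1 morph amount.

```javascript
// Hypothetical core of a wavetable-morphing AudioWorkletProcessor:
// read one phase position from two single-cycle tables and crossfade.
function morphSample(tableA, tableB, phase, morph) {
  // phase is 0..1; linearly interpolate between adjacent table entries
  const idx = phase * tableA.length;
  const i0 = Math.floor(idx) % tableA.length;
  const i1 = (i0 + 1) % tableA.length;
  const frac = idx - Math.floor(idx);
  const a = tableA[i0] + (tableA[i1] - tableA[i0]) * frac;
  const b = tableB[i0] + (tableB[i1] - tableB[i0]) * frac;
  return a * (1 - morph) + b * morph; // morph 0 = pure A, 1 = pure B
}

// Build a sine table and a square table of the same length.
const N = 2048;
const sine = new Float32Array(N);
const square = new Float32Array(N);
for (let i = 0; i < N; i++) {
  sine[i] = Math.sin(2 * Math.PI * i / N);
  square[i] = i < N / 2 ? 1 : -1;
}
```

Inside `process()` you would advance `phase` by `frequency / sampleRate` per output sample and read `morph` from an AudioParam, so the knob movement becomes sample-accurate.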
Related
I'm trying to create a complex periodic sound with a long period. I want to define frequencies as accurately as I can, so I'm using a frequency step of sampleRate*0.5/tableLen. But I have some issues with large wave tables: the sound becomes distorted and loses high frequencies.
Here is a minimal example with a ~440 Hz sine wave. When I use a table of length 8192, the resulting sine wave is quite recognizable:
https://jsfiddle.net/zxqzntf0/
var gAudioCtx = new AudioContext();
var osc = gAudioCtx.createOscillator();
var tableSize = 8192;
var real = new Float32Array(tableSize);
var imag = new Float32Array(tableSize);
var freq = 440;
var step = gAudioCtx.sampleRate*0.5/tableSize;
real[Math.floor(freq/step)] = 1;
var wave = gAudioCtx.createPeriodicWave(real, imag, {disableNormalization: true});
osc.frequency.value = step;
osc.setPeriodicWave(wave);
osc.connect(gAudioCtx.destination);
osc.start();
But when I increase the table size, I get something strange. The result is not a sine wave at all!
https://jsfiddle.net/0cc75nnm/
This problem reproduces in all browsers (Chrome, Firefox, Edge), so it doesn't seem to be a browser bug. But I've found nothing about it in the documentation.
Added
I found that if the oscillator frequency is a whole number >= 2 Hz, there are no artifacts in the resulting sound with a table size of 16384. That is quite acceptable for my needs for now, but someday I may want to create longer periods. If someone explains why I get sound artifacts when the step is less than 2 Hz, I will accept their answer.
Here is an example of a complex melody that I generate in JavaScript:
https://jsfiddle.net/h9rfzrnL/1/
You're creating your periodic waves incorrectly. When filling the arrays for the periodic wave, assume the sample rate is 1; then, if you want an oscillator at a frequency of 440 Hz, simply set the oscillator frequency to 440 Hz.
Thus, for a sine wave, the real array should be all zeroes and the imaginary array should be [0, 1]. (Your code actually creates a cosine wave, but that doesn't really matter too much.)
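Under that convention, a 440 Hz sine looks like this (a sketch; the coefficient arrays are the only part that runs outside a browser, so the audio wiring is shown commented out):

```javascript
// Coefficient arrays: index n is the harmonic at n * (oscillator frequency).
// For a pure sine, only the first sine (imaginary) term is nonzero.
const real = new Float32Array(2); // cosine terms, all zero
const imag = new Float32Array(2); // sine terms
imag[1] = 1;

// Browser-only part (sketch): the pitch lives on the oscillator itself.
// const ac = new AudioContext();
// const osc = ac.createOscillator();
// osc.setPeriodicWave(ac.createPeriodicWave(real, imag));
// osc.frequency.value = 440; // the 440 Hz goes here, not into the table
// osc.connect(ac.destination);
// osc.start();
```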
I would like to try making a synth using JavaScript, but I can't find any basic examples on how to do so.
What I have figured out from research is that it appears to be possible, and that you should use a canvas pixel array rather than normal ECMA arrays.
I've also found info on MDN's audio documentation, and I have seen audio elements used for continuous playback by web radio players before, although I couldn't figure out how they did it.
My goal is to make something that allows me to synthesize continuous sine waves and play them using my keyboard, without using pre-made samples.
EDIT: One of the comments below pointed me in the right direction. I'm currently working on a solution, but if you would like to post one as well, feel free.
Here is a basic example from which anyone should be able to figure out how to play sine waves with their keyboard:
<script type="text/javascript">
  // WARNING: VERY LOUD. TURN DOWN YOUR SPEAKERS BEFORE TESTING

  // create the Web Audio API context
  var audioCtx = new (window.AudioContext || window.webkitAudioContext)();

  // create an Oscillator node
  var oscillator = audioCtx.createOscillator();
  oscillator.type = 'sine';
  oscillator.frequency.value = 750; // value in hertz
  oscillator.connect(audioCtx.destination);
  oscillator.start();

  // uncomment for fun
  // setInterval(changeFreq, 100);

  // choose a random interval from a list of consonant ratios
  var intervals = [1.0, 0.5, 0.3333, 1.5, 1.3333, 2.0, 2.25];

  function changeFreq() {
    var intervalIndex = ~~(Math.random() * intervals.length);
    var noteFreq = oscillator.frequency.value * intervals[intervalIndex];
    // because this is random, make an effort to keep it in a comfortable frequency range
    if (noteFreq > 1600)
      noteFreq *= 0.5;
    else if (noteFreq < 250)
      noteFreq *= 2;
    oscillator.frequency.value = noteFreq;
  }
</script>
<body>
<button onclick="changeFreq()">Change Places!</button>
</body>
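To get from the button above to actual keyboard playing, one hedged approach is to map a row of keys to an equal-tempered scale and set the oscillator frequency on keydown. The key names, the 440 Hz base, and the `keyToFreq` helper are my assumptions, not part of the answer:

```javascript
// Sketch: map a row of keys to semitone steps above a base frequency.
const keyRow = ["a", "s", "d", "f", "g", "h", "j", "k"];

function keyToFreq(key, baseFreq) {
  const i = keyRow.indexOf(key);
  if (i === -1) return null; // key not mapped
  return baseFreq * Math.pow(2, i / 12); // one semitone per key
}

// Browser-only wiring (sketch), assuming `oscillator` from the snippet above:
// document.addEventListener("keydown", (e) => {
//   const f = keyToFreq(e.key, 440);
//   if (f !== null) oscillator.frequency.value = f;
// });
```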
Suppose that I make a simple canvas drawing app like this:
I now have a series of points. How can I feed them to one of the Web Audio objects (an oscillator, or a sound made from a byte array, or something) to actually generate and play a wave out of them (in this case a sine-like wave)? What is the theory behind it?
If you have the data from your graph in an array, y, you can do something like
var buffer = context.createBuffer(1, y.length, context.sampleRate);
buffer.copyToChannel(y, 0); // y must be a Float32Array; 0 is the channel index
var src = context.createBufferSource();
src.buffer = buffer;
src.connect(context.destination);
src.start();
You may need to set the sample rate in context.createBuffer to something other than context.sampleRate, depending on the data from your graph.
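If the drawn curve has far fewer points than the buffer needs samples, one option is to linearly interpolate the points up to the target length before copying them in (a sketch; `resampleLinear` is my name, not a Web Audio API function):

```javascript
// Stretch an arbitrary-length array of drawn y-values to targetLen samples
// using linear interpolation, so it can fill an AudioBuffer channel.
function resampleLinear(points, targetLen) {
  const out = new Float32Array(targetLen);
  for (let i = 0; i < targetLen; i++) {
    // map output index i onto a fractional position in the input
    const pos = (i / (targetLen - 1)) * (points.length - 1);
    const i0 = Math.floor(pos);
    const i1 = Math.min(i0 + 1, points.length - 1);
    const frac = pos - i0;
    out[i] = points[i0] + (points[i1] - points[i0]) * frac;
  }
  return out;
}
```

The result is a `Float32Array`, which is exactly what `copyToChannel` expects.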
I made a post asking your opinion about which JS library is better, or can do the work
that I have shown. Since I'm not allowed to ask that here, I did some research and tried out EaselJS to do the work. So my question has now changed.
I have this piece of code:
function handleImageLoad(event) {
var img = event.target
bmp = new createjs.Bitmap(img);
/*Matrix2D Transformation */
var a = 0.880114;
var b = 0.0679298;
var c = -0.053145;
var d = 0.954348;
var tx = 37.4898;
var ty = -16.5202;
var matrix = new createjs.Matrix2D(a, b, c, d, tx, ty);
var polygon = new createjs.Shape();
polygon.graphics.beginStroke("blue");
polygon.graphics.beginBitmapFill(img, "no-repeat", matrix).moveTo(37.49, -16.52).lineTo(336.27, -36.20).lineTo(350.96, 171.30).lineTo(50.73, 169.54).lineTo(37.49, -16.52);
stage.addChild(polygon);
stage.update();
}
where the variables a, b, c, d, tx and ty are values from a homography matrix:
 0.880114   0.067979298   37.4898
-0.053145   0.954348     -16.5202
-0.000344   1.0525e-006    1
As you can see in the attached files, I draw the deformed rectangle correctly, but the image still doesn't wrap to the created shape. Does anyone know how I can do it? Is there a better way to do this? Am I doing something wrong?
Thanks for your time.
Edit: To be more specific I have added other image to see what I want.
You are attempting to do something similar to a perspective transform, using a 3x3 matrix.
Canvas's 2D context, and by extension EaselJS, only supports affine transformations with a 2x3 matrix - transformations where the opposite edges of the bounding rectangle remain parallel. For example, scaling, rotation, skewing, and translation.
http://en.wikipedia.org/wiki/Affine_transformation
You might be able to fake this with multiple objects that have been skewed (this was used extensively in Flash to fake perspective transforms), or you may have to look into another solution.
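The difference is easy to see in code. An affine 2x3 matrix maps points without any perspective divide, which is exactly why opposite edges stay parallel (sketch; `applyAffine` is my name):

```javascript
// A 2x3 affine matrix {a, b, c, d, tx, ty} (Canvas/EaselJS layout) maps
// (x, y) linearly plus a translation; there is no perspective divide.
function applyAffine(m, x, y) {
  return {
    x: m.a * x + m.c * y + m.tx,
    y: m.b * x + m.d * y + m.ty,
  };
}

// A full 3x3 homography adds a bottom row (g, h, 1) and a divide:
//   x' = (a*x + c*y + tx) / (g*x + h*y + 1)
// That per-point divide is what canvas's setTransform cannot express,
// which is why the third row of your matrix gets lost.
```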
I'm trying to implement Stereo Phase as it is described here: http://www.image-line.com/support/FLHelp/html/plugins/3x%20OSC.htm
"Stereo Phase (SP) - Allows you to set different phase offset for the left and right channels of the generator. The offset results in the oscillator starting at a different point on the oscillator's shape (for example, start at the highest value of the sine function instead at the zero point). Stereo phase offset adds to the richness and stereo panorama of the sound produced."
I'm trying to achieve this for an OscillatorNode. My only idea is to use createPeriodicWave (https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#dfn-createPeriodicWave). However, the description of createPeriodicWave in the specification is above my understanding, and I have not found any examples via Google.
Any help in deciphering the description of createPeriodicWave would be helpful as would any other ideas about how to achieve this effect.
Thanks!
Mcclellan and others,
This answer helped and subsequently pulled me into the world of Fourier. With the help of a page on the subject and some Wikipedia, I think I got the square and sawtooth patterns down, but the triangle pattern still eludes me. Does anyone know it?
It indeed gives you the ability to phase shift, as this article by Nick Thompson explains (he names the AudioContext methods differently, but the principle is the same).
As far as the square and sawtooth patterns go:
var n = 4096;
var real = new Float32Array(n);
var imag = new Float32Array(n);
var ac = new AudioContext();
var osc = ac.createOscillator();

/* Pick a wave pattern */

/* Sine
imag[1] = 1; */

/* Sawtooth
for (var x = 1; x < n; x++)
  imag[x] = 2.0 / (Math.pow(-1, x) * Math.PI * x); */

/* Square */
for (var x = 1; x < n; x += 2)
  imag[x] = 4.0 / (Math.PI * x);

var wave = ac.createPeriodicWave(real, imag);
osc.setPeriodicWave(wave);
osc.connect(ac.destination);
osc.start();
osc.stop(2); /* have it stop after 2 seconds */
This will play the activated pattern, the square pattern in this case. What would the triangle formula look like?
A simple way to fake it would be to add separate delay nodes to the left and right channels, and give them user-controlled delay values. This would be my approach and will have more-or-less the same effect as a phase setting.
If you want to use createPeriodicWave, unfortunately you'll probably have to understand the somewhat difficult math behind it.
Basically, you'll first have to represent your waveform as a sum of sine wave "partials". All periodic waves have some representation of this form. Then, once you've found the relative magnitudes of each partial, you'll have to phase shift them separately for left and right channels by multiplying each by a complex number. You can read more details about representing periodic waves as sums of sine waves here: http://music.columbia.edu/cmc/musicandcomputers/chapter3/03_03.php
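That phase-shift-by-complex-multiplication step can be sketched in a few lines. Using the identity sin(nωt + nφ) = cos(nφ)·sin(nωt) + sin(nφ)·cos(nωt), shifting a wave built from sine partials by φ just redistributes each partial between createPeriodicWave's imaginary (sine) and real (cosine) arrays (the `phaseShiftPartials` helper is my name):

```javascript
// Given sine-partial magnitudes imagIn[n] and a phase offset in radians,
// compute the (real, imag) pair for createPeriodicWave, using
// sin(n*w*t + n*phi) = cos(n*phi)*sin(n*w*t) + sin(n*phi)*cos(n*w*t).
function phaseShiftPartials(imagIn, phi) {
  const n = imagIn.length;
  const real = new Float32Array(n); // cosine terms
  const imag = new Float32Array(n); // sine terms
  for (let k = 1; k < n; k++) {
    real[k] = imagIn[k] * Math.sin(k * phi);
    imag[k] = imagIn[k] * Math.cos(k * phi);
  }
  return { real, imag };
}
```

Running it twice with different offsets, once per channel, gives the left/right stereo phase described in the question.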
Using createPeriodicWave has a significant advantage over using a BufferSourceNode: createPeriodicWave waveforms will automatically avoid aliasing. It's rather difficult to avoid aliasing if you're generating the waveforms "by hand" in a buffer.
I do not think it is possible to have a phase offset using an OscillatorNode.
One way to do it is to use context.createBuffer to generate a sine wave buffer (or any type of wave that you want), set it as the buffer of an AudioBufferSourceNode, and then use the offset parameter of its start() method. Note that you need to express the offset in seconds.
var buffer = context.createBuffer(1, 1024, 44100);
var data = buffer.getChannelData(0);
for (var i = 0; i < data.length; i++) {
  // generate one cycle of the waveform, e.g. a sine:
  data[i] = Math.sin(2 * Math.PI * i / data.length);
}
var osc = context.createBufferSource(); // note: createBufferSource, not createBufferSourceNode
osc.buffer = buffer;
osc.loop = true;
osc.connect(context.destination);
osc.start(startTime, offsetInSeconds);
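Converting the desired phase into that offset is a one-liner (sketch; `phaseToOffsetSeconds` is my name, and it assumes the buffer loops one cycle at the given frequency):

```javascript
// Convert a phase offset in radians into the start() offset in seconds
// for a looping single-cycle buffer playing at frequencyHz.
function phaseToOffsetSeconds(phaseRadians, frequencyHz) {
  const cycleFraction = phaseRadians / (2 * Math.PI); // 0..1 of one cycle
  return cycleFraction / frequencyHz; // one cycle lasts 1/frequency seconds
}
```

For example, a half-cycle (π radians) head start at 440 Hz is half of 1/440 of a second.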
By the looks of this article on Wolfram, the triangle wave can be established like this:
/* Triangle */
for (var x = 1; x < n; x += 2)
  imag[x] = 8.0 / Math.pow(Math.PI, 2) * Math.sin(x * Math.PI / 2) / Math.pow(x, 2); // sin(x*pi/2) alternates +1/-1 over the odd harmonics
Also helpful by the way is the Wikipedia page that actually shows how the Fourier constructions work.
function getTriangleWave(imag, n) {
for (var i = 1; i < n; i+=2) {
imag[i] = (8*Math.sin(i*Math.PI/2))/(Math.pow((Math.PI*i), 2));
}
return imag;
}
With Chrome 66 adding AudioWorklets, you can write sound-processing programs in the same way as with the now-deprecated ScriptProcessorNode.
I made a convenience library using this: a normal Web Audio API OscillatorNode that can also have its phase (among other things) varied. You can find it here
const context = new AudioContext()
context.audioWorklet.addModule("worklet.js").then(() => {
  const osc = new BetterOscillator(context)
  osc.parameters.get("phase").value = Math.PI / 4
  osc.connect(context.destination)
  osc.start()
})