Using getByteFrequencyData on iPad and Safari/Chrome - javascript

I am working on a web animation that moves lips in sync with played speech audio. I use JavaScript's getByteFrequencyData() to determine low and high frequencies and get very realistic mouth movement.
The code runs perfectly in all Windows browsers and in Chrome on Android, but I face a huge problem on iOS with Safari and also with Chrome: getByteFrequencyData() always returns a constant value. No errors are logged to the console, so all objects are created, and the audio plays (it is started from a user-interaction context).
UPDATE: I tested Safari, Firefox, Edge and Chrome, all with the same result.
UPDATE 2: The whole app does not work in Safari on macOS, but it does in Chrome on macOS (Catalina), where getByteFrequencyData() also works well. It seems to be an iOS-only issue.
This is how I init the audio system (once):
...
audio = document.getElementById("audio");
audio.addEventListener('ended', onAudioPlayed, false);
var AudioContext = window.AudioContext || window.webkitAudioContext;
ctx = new AudioContext();
audioSrc = ctx.createMediaElementSource(audio);
audioSrc.connect(ctx.destination);
analyser = ctx.createAnalyser();
frequencyData = new Uint8Array(analyser.frequencyBinCount);
...
And I use the requestAnimationFrame method to drive the animation:
...
audio.src = "audio/" + audioQueue.shift();
console.log("play: "+audio.src);
audio.type = "audio/mp3";
analyser.fftSize = 64;
analyser.smoothingTimeConstant = 0.4;
analyser.minDecibels = -80;
analyser.maxDecibels = -20;
audioSrc.connect(analyser);
analyser.getByteFrequencyData(frequencyData);
function renderFrame() {
    if (!audio.paused) requestAnimationFrame(renderFrame);
    analyser.getByteFrequencyData(frequencyData);
    avgHigh = 0;
    avgLow = 0;
    count = analyser.frequencyBinCount;
    //console.log("rendermund " + frequencyData[0]);
    for (i = 0; i < count; i++) {
        if (i < count / 4)
            avgLow += frequencyData[i];
        else
            avgHigh += frequencyData[i];
    }
    console.log("rendermund " + avgHigh + "/" + avgLow);
    avgLow = avgLow / 2000;
    avgHigh = avgHigh / 2000;
    if (avgLow > 1) avgLow = 1;
    if (avgHigh > 1) avgHigh = 1;
    mund(avgHigh, avgLow); // Here I draw the mouth...
}
audio.load();
audio.play();
renderFrame();
...
Is there an alternative way to get live information about the samples currently being played?
The strange thing is that I do not get any error messages: getByteFrequencyData() returns an array of the expected size, but filled with constant values (0) on iOS. I tested with the latest iOS version on an iPad Air, an iPad Air 2 and iPads of the 5th and 6th generation, all with the same result.
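One thing worth ruling out (this is an assumption on my part, not something confirmed above): on iOS, the AudioContext is often created in the "suspended" state, and an analyser fed from a suspended context reports silence (all zeros) even while the audio element itself is audible. A minimal sketch of resuming the context from the same user gesture that starts playback; playAudio() is a hypothetical wrapper around the play code above, reusing the existing audio, ctx and renderFrame:
// Hypothetical wrapper: resume the (webkit)AudioContext inside the user
// gesture before starting playback, then kick off the render loop.
function playAudio(src) {
    audio.src = "audio/" + src;
    var resumed = (ctx.state === "suspended" && ctx.resume) ? ctx.resume() : Promise.resolve();
    resumed
        .then(function () { audio.load(); return audio.play(); })
        .then(renderFrame)
        .catch(function (err) { console.log("playback failed: " + err); });
}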

Related

Is there a way to use a js function as a source of audio? [duplicate]

Is it possible to generate a constant sound stream with javascript/html5? For example, to generate a perpetual sine wave, I would have a callback function, that would be called whenever the output buffer is about to become empty:
function getSampleAt(timestep)
{
return Math.sin(timestep);
}
(The idea is to use this to make an interactive synth. I don't know in advance how long a key will be pressed, so I can't use a fixed length buffer)
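Incidentally, the buffer-callback idea described above maps almost directly onto a ScriptProcessorNode (now deprecated in favour of AudioWorklet, but still widely supported). A rough sketch, reusing the getSampleAt() above and treating its argument as a phase that is advanced per sample to produce a 440 Hz tone:
var context = new (window.AudioContext || window.webkitAudioContext)();
var node = context.createScriptProcessor(4096, 1, 1); // buffer size, input/output channels
var t = 0;
node.onaudioprocess = function (e) {
    var out = e.outputBuffer.getChannelData(0);
    for (var i = 0; i < out.length; i++) {
        out[i] = getSampleAt(t);                      // fill the output buffer sample by sample
        t += 2 * Math.PI * 440 / context.sampleRate;  // advance the phase
    }
};
node.connect(context.destination);   // sound plays while the node is connected...
// node.disconnect();                // ...and stops when disconnected (e.g. on key release)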
You can use the Web Audio API in most browsers now (excepting IE and Opera Mini).
Try out this code:
// one context per document
var context = new (window.AudioContext || window.webkitAudioContext)();
var osc = context.createOscillator(); // instantiate an oscillator
osc.type = 'sine'; // this is the default - also square, sawtooth, triangle
osc.frequency.value = 440; // Hz
osc.connect(context.destination); // connect it to the destination
osc.start(); // start the oscillator
osc.stop(context.currentTime + 2); // stop 2 seconds after the current time
If you want the volume lower, you can do something like this:
var context = new (window.AudioContext || window.webkitAudioContext)();
var osc = context.createOscillator();
var vol = context.createGain();
vol.gain.value = 0.1; // from 0 to 1, 1 full volume, 0 is muted
osc.connect(vol); // connect osc to vol
vol.connect(context.destination); // connect vol to context destination
osc.start(context.currentTime + 3); // start it three seconds from now
I got most of this from experimenting in Chromium while reading the Web Audio API Working Draft, which I found via brainjam's link.
Lastly, it is very helpful to inspect the various objects in the Chrome inspector (Ctrl+Shift+I).
Using the HTML5 audio element
Cross-browser generative sustained audio using JavaScript and the audio element isn't currently possible, as Steven Wittens notes in a blog post on creating a JavaScript synth:
"...there is no way to queue up chunks of synthesized audio for seamless playback".
Using the Web Audio API
The Web Audio API was designed to facilitate JavaScript audio synthesis. The Mozilla Developer Network has a Web Based Tone Generator that works in Firefox 4+ [demo 1]. Add these two lines to that code and you have a working synth with generative sustained audio upon keypress [demo 2 - works in Firefox 4 only, click the 'Results' area first, then press any key]:
window.onkeydown = start;
window.onkeyup = stop;
The BBC's page on the Web Audio API is worth reviewing too. Unfortunately, support for the Web Audio API doesn't extend to other browsers yet.
Possible workarounds
To create a cross-browser synth at present, you'll likely have to fall back on prerecorded audio by:
Using long prerecorded ogg/mp3 sample tones, embedding them in separate audio elements and starting and stopping them upon keypress.
Embedding an swf file containing the audio elements and controlling playback via JavaScript. (This appears to be the method that the Google Les Paul Doodle employs.)
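For reference, the same keydown/keyup idea can be expressed with the current Web Audio API by starting an oscillator on keydown and stopping it on keyup. A rough sketch (the variable names and the fixed 440 Hz pitch are my own choices, not taken from the demos above):
var ctx = new (window.AudioContext || window.webkitAudioContext)();
var osc = null;
window.onkeydown = function () {
    if (osc) return;                 // ignore key repeat while a note is held
    osc = ctx.createOscillator();
    osc.type = 'sine';
    osc.frequency.value = 440;
    osc.connect(ctx.destination);
    osc.start();
};
window.onkeyup = function () {
    if (!osc) return;
    osc.stop();                      // an oscillator can only be started and stopped once,
    osc.disconnect();
    osc = null;                      // so create a fresh one for the next key press
};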
Sure! You could use the tone synthesizer in this demo:
audioCtx = new(window.AudioContext || window.webkitAudioContext)();
show();
function show() {
    frequency = document.getElementById("fIn").value;
    document.getElementById("fOut").innerHTML = frequency + ' Hz';
    switch (document.getElementById("tIn").value * 1) {
        case 0: type = 'sine'; break;
        case 1: type = 'square'; break;
        case 2: type = 'sawtooth'; break;
        case 3: type = 'triangle'; break;
    }
    document.getElementById("tOut").innerHTML = type;
    volume = document.getElementById("vIn").value / 100;
    document.getElementById("vOut").innerHTML = volume;
    duration = document.getElementById("dIn").value;
    document.getElementById("dOut").innerHTML = duration + ' ms';
}
function beep() {
    var oscillator = audioCtx.createOscillator();
    var gainNode = audioCtx.createGain();
    oscillator.connect(gainNode);
    gainNode.connect(audioCtx.destination);
    gainNode.gain.value = volume;
    oscillator.frequency.value = frequency;
    oscillator.type = type;
    oscillator.start();
    setTimeout(
        function () {
            oscillator.stop();
        },
        duration
    );
}
frequency
<input type="range" id="fIn" min="40" max="6000" oninput="show()" />
<span id="fOut"></span><br>
type
<input type="range" id="tIn" min="0" max="3" oninput="show()" />
<span id="tOut"></span><br>
volume
<input type="range" id="vIn" min="0" max="100" oninput="show()" />
<span id="vOut"></span><br>
duration
<input type="range" id="dIn" min="1" max="5000" oninput="show()" />
<span id="dOut"></span>
<br>
<button onclick='beep();'>Play</button>
Have fun!
I got the solution from Houshalter here:
How do I make Javascript beep?
You can clone and tweak the code here:
Tone synthesizer demo on JS Bin
Compatible browsers:
Chrome mobile & desktop
Firefox mobile & desktop
Opera mobile, mini & desktop
Android browser
Microsoft Edge browser
Safari on iPhone or iPad
Not Compatible
Internet Explorer version 11 (but does work on the Edge browser)
The Web Audio API is coming to Chrome. See http://googlechrome.github.io/web-audio-samples/samples/audio/index.html
Follow the directions in "Getting Started" there, and then check out the very impressive demos.
Update(2017): by now this is a much more mature interface. The API is documented at https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API
You can generate a WAV file on the fly and play it (src):
// Legend
// DUR - duration in seconds   SPS - samples per second (default 44100)
// NCH - number of channels    BPS - bytes per sample
// t   - a number in the range [0, DUR); the function returns a number in the range [0, 1]
function getSampleAt(t,DUR,SPS)
{
return Math.sin(6000*t);
}
function genWAVUrl(fun, DUR=1, NCH=1, SPS=44100, BPS=1) {
let size = DUR*NCH*SPS*BPS;
let put = (n,l=4) => [(n<<24),(n<<16),(n<<8),n].filter((x,i)=>i<l).map(x=> String.fromCharCode(x>>>24)).join('');
let p = (...a) => a.map( b=> put(...[b].flat()) ).join('');
let data = `RIFF${put(44+size)}WAVEfmt ${p(16,[1,2],[NCH,2],SPS,NCH*BPS*SPS,[NCH*BPS,2],[BPS*8,2])}data${put(size)}`
for (let i = 0; i < DUR*SPS; i++) {
let f= Math.min(Math.max(fun(i/SPS,DUR,SPS),0),1);
data += put(Math.floor( f * (2**(BPS*8)-1)), BPS);
}
return "data:Audio/WAV;base64," + btoa(data);
}
var WAV = new Audio( genWAVUrl(getSampleAt,5) ); // 5s
WAV.setAttribute("controls", "controls");
document.body.appendChild(WAV);
//WAV.play()
Here is a visualisation:
function getSampleAt(t,DUR,SPS)
{
return 0.5+Math.sin(15*t)/(1+t*t);
}
// ----------------------------------------------
function genWAVUrl(fun, DUR=1, NCH=1, SPS=44100, BPS=1) {
let size = DUR*NCH*SPS*BPS;
let put = (n,l=4) => [(n<<24),(n<<16),(n<<8),n].filter((x,i)=>i<l).map(x=> String.fromCharCode(x>>>24)).join('');
let p = (...a) => a.map( b=> put(...[b].flat()) ).join('');
let data = `RIFF${put(44+size)}WAVEfmt ${p(16,[1,2],[NCH,2],SPS,NCH*BPS*SPS,[NCH*BPS,2],[BPS*8,2])}data${put(size)}`
for (let i = 0; i < DUR*SPS; i++) {
let f= Math.min(Math.max(fun(i/SPS,DUR,SPS),0),1);
data += put(Math.floor( f * (2**(BPS*8)-1)), BPS);
}
return "data:Audio/WAV;base64," + btoa(data);
}
function draw(fun, DUR=1, NCH=1, SPS=44100, BPS=1) {
time.innerHTML=DUR+'s';
time.setAttribute('x',DUR-0.3);
svgCh.setAttribute('viewBox',`0 0 ${DUR} 1`);
let p='', n=100; // n: plot only every n-th sample
for (let i = 0; i < DUR*SPS/n; i++) p+= ` ${DUR*(n*i/SPS)/DUR}, ${1-fun(n*i/SPS, DUR,SPS)}`;
chart.setAttribute('points', p);
}
function frame() {
let t=WAV.currentTime;
point.setAttribute('cx',t)
point.setAttribute('cy',1-getSampleAt(t))
window.requestAnimationFrame(frame);
}
function changeStart(e) {
var r = e.target.getBoundingClientRect();
var x = e.clientX - r.left;
WAV.currentTime = dur*x/r.width;
WAV.play()
}
var dur=5; // seconds
var WAV = new Audio(genWAVUrl(getSampleAt,dur));
draw(getSampleAt,dur);
frame();
.chart { border: 1px dashed #ccc; }
.axis { font-size: 0.2px}
audio { outline: none; }
Click on the blue line (turn the volume up to maximum):
<svg class="chart" id="svgCh" onclick="changeStart(event)">
<circle cx="0" cy="-1" r="0.05" style="fill: rgba(255,0,0,1)" id="point"></circle>
<polyline id="chart" fill="none" stroke="#0074d9" stroke-width="0.01" points=""/>
<text x="0.03" y="0.9" class="axis">0</text>
<text x="0.03" y="0.2" class="axis">1</text>
<text x="4.8" y="0.9" class="axis" id="time"></text>
</svg><br>
This is not a real answer to your question, because you asked for a JavaScript solution, but you can use ActionScript. It should run in all major browsers.
ActionScript API Audio Generation
You can call ActionScript functions from within JavaScript.
ActionScript API Call JavaScript
That way you can wrap the ActionScript sound-generation functions and expose a JavaScript implementation of them. Just use Adobe Flex to build a tiny SWF and then use that as the backend for your JavaScript code.
This is what I had been looking for, for what felt like forever, and in the end I managed to build it myself the way I wanted. Maybe you will like it too.
A simple frequency slider with an on/off button:
buttonClickResult = function () {
    var button = document.getElementById('btn1');
    button.onclick = function buttonClicked() {
        if (button.className == "off") {
            button.className = "on";
            oscOn();
        }
        else if (button.className == "on") {
            button.className = "off";
            oscillator.disconnect();
        }
    }
};
buttonClickResult();
var oscOn = function () {
    window.AudioContext = window.AudioContext || window.webkitAudioContext;
    var context = new AudioContext();
    var gainNode = context.createGain ? context.createGain() : context.createGainNode();
    //context = new window.AudioContext();
    oscillator = context.createOscillator();
    oscillator.type = 'sine';
    oscillator.frequency.value = document.getElementById("fIn").value;
    //gainNode = createGainNode();
    oscillator.connect(gainNode);
    gainNode.connect(context.destination);
    gainNode.gain.value = 1;
    oscillator.start(0);
};
<p class="texts">Frekvence [Hz]</p>
<input type="range" id="fIn" min="20" max="20000" step="100" value="1234" oninput="show()" />
<span id="fOut"></span><br>
<input class="off" type="button" id="btn1" value="Start / Stop" />
You can use the following code to generate sounds; try different frequencies to generate even more sounds:
// generate sounds using frequencies
const audioContext = new AudioContext();
const oscillator = audioContext.createOscillator();
const frequency = 440; // 440 Hz is the default pitch (try different frequencies)
oscillator.type = "triangle"; // also "square", "sine", "sawtooth"
oscillator.frequency.value = frequency;
oscillator.connect(audioContext.destination); // connects to your audio output
oscillator.start(0); // starts immediately
oscillator.stop(0.5); // stops at 0.5 s on the context clock (roughly half a second later)

Different behaviour of WebAudio API on Google Chrome

I'm building a sound player with a spectrum visualizer. It works very well in Firefox, but in Google Chrome I'm running into problems. This follows on from a question I asked the other day: My previous question
In Firefox I can skip forward or back through the track list as many times as I want without a problem, but in Google Chrome I get this error when I press next/previous:
Failed to execute 'createMediaElementSource' on 'BaseAudioContext':
HTMLMediaElement already connected previously to a different MediaElementSourceNode.
I don't know why Google Chrome complains and Firefox doesn't. The code of the visualizer is:
function visualizer(audio) {
    closeAudioContext = true;
    let src = context.createMediaElementSource(audio); // This call fails on Google Chrome
    let analyser = context.createAnalyser();
    let canvas = document.getElementById("canvas");
    canvas.width = window.innerWidth;
    canvas.height = window.innerHeight;
    let ctx = canvas.getContext("2d");
    src.connect(analyser);
    analyser.connect(context.destination);
    analyser.fftSize = 2048;
    let bufferLength = analyser.frequencyBinCount;
    let dataArray = new Uint8Array(bufferLength);
    let WIDTH = ctx.canvas.width;
    let HEIGHT = ctx.canvas.height;
    let barWidth = (WIDTH / bufferLength) * 1.5;
    let barHeight;
    let x = 0;
    let color = randomColor();
    function renderFrame() {
        requestAnimationFrame(renderFrame);
        x = 0;
        analyser.getByteFrequencyData(dataArray);
        ctx.clearRect(0, 0, WIDTH, HEIGHT);
        for (let i = 0; i < bufferLength; i++) {
            barHeight = dataArray[i];
            ctx.fillStyle = color;
            ctx.fillRect(x, HEIGHT - barHeight, barWidth, barHeight);
            x += barWidth + 1;
        }
    }
    renderFrame();
}
And immediately after pressing the next/previous button I do:
if (closeAudioContext && context !== undefined) {
    context.close();
}
context = new AudioContext();
And then:
visualizer(audio);
musicPlay();
So my question is: why does the audio player work fine in Firefox but crash in Google Chrome?
I'm using Bowser to detect the user's browser because of Chrome's new policy of muting all sound when autoplay is active (in that case I set autoPlay to false, and if the user presses play the sound is not muted). So if I need different code for Google Chrome, I can wrap that code in an "if" for that browser.
Regards.
That's a bug in Chrome; you cannot attach an HTMLMediaElement to a second MediaElementAudioSourceNode. But since you close the context and create a new one, the fact that Chrome still complains is arguably another, different bug.
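A common way to sidestep this (a sketch under the assumption that one AudioContext can live for the whole player session; the names are mine) is to create the context, the MediaElementAudioSourceNode and the AnalyserNode once, and only change audio.src when the user skips tracks:
// Created once, e.g. at player start-up.
var context = new (window.AudioContext || window.webkitAudioContext)();
var src = context.createMediaElementSource(audio);   // call this only once per element
var analyser = context.createAnalyser();
src.connect(analyser);
analyser.connect(context.destination);

// On next/previous, reuse the existing graph instead of rebuilding it.
function playTrack(url) {
    audio.src = url;    // the source node keeps following the same element
    audio.load();
    audio.play();
}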

Exporting intensity of audio in Web Audio API

I'm trying to find the intensity of a moment of audio with the Web Audio API. The only things related to intensity that I found in the specification are:
analyser.minDecibels
analyser.maxDecibels
Is there a way to do this?
If I understand correctly, you want a number that is high when the sound is loud, and low when the sound is quiet. You can use the "Sound Pressure Level" for that.
Getting this number from the Web Audio API is rather straightforward, and you guessed correctly that we will use the AnalyserNode to achieve this. Here is example code that shows you how to do it:
var ac = new AudioContext();
/* create the Web Audio graph, let's assume we have sound coming out of the
* node `source` */
var an = ac.createAnalyser();
source.connect(an);
/* Get an array that will hold our values */
var buffer = new Uint8Array(an.fftSize);
function f() {
    /* getFloatTimeDomainData is also available in newer browsers, if needed. */
    an.getByteTimeDomainData(buffer);
    /* RMS stands for Root Mean Square: the square root of the average of the
     * squared sample values. The bytes are unsigned, with 128 meaning silence,
     * so centre them on 0 and scale to [-1, 1] before squaring. */
    var rms = 0;
    for (var i = 0; i < buffer.length; i++) {
        var v = (buffer[i] - 128) / 128;
        rms += v * v;
    }
    rms = Math.sqrt(rms / buffer.length);
    /* rms now holds the value we want: 0 for silence, close to 1 at full scale. */
    requestAnimationFrame(f);
}
requestAnimationFrame(f);
/* start our hypothetical source. */
source.start(0);
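If a figure on a decibel scale (closer to the analyser.minDecibels/maxDecibels range mentioned in the question) is more convenient, the normalised RMS from above can be converted; a small sketch:
// Convert a normalised RMS value (0..1) to dBFS; returns -Infinity for pure silence.
function rmsToDb(rms) {
    return 20 * Math.log10(rms);   // e.g. rms = 0.5 -> about -6 dBFS
}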
I wanted to thank you for this answer some 4 years later.
I just did a quick POC and got it to work with the following code. I hope it might help somebody else too.
In this example, I take live audio from my mic and log the results to the console (in my case, in the Chrome dev tools).
<html>
<head>
  <title>Intensity test</title>
</head>
<body>
  <script>
    var ac = new AudioContext();
    var an = ac.createAnalyser();
    var source;
    var buffer = new Uint8Array(an.fftSize);
    var scriptProcessorNode = ac.createScriptProcessor(16384, 1, 1); // unused in this POC

    navigator.getUserMedia = navigator.getUserMedia ||
      navigator.webkitGetUserMedia || navigator.mozGetUserMedia ||
      navigator.msGetUserMedia;

    if (navigator.getUserMedia) {
      navigator.getUserMedia(
        { audio: true },
        function (stream) {
          source = ac.createMediaStreamSource(stream);
          source.connect(an);
          requestAnimationFrame(f);
        },
        function (e) {
          alert('Error capturing audio.');
        }
      );
    }

    function f() {
      an.getByteTimeDomainData(buffer);
      var rms = 0;
      for (var i = 0; i < buffer.length; i++) {
        var v = (buffer[i] - 128) / 128; // bytes are centred on 128 (silence)
        rms += v * v;
      }
      rms = Math.sqrt(rms / buffer.length);
      console.log(rms);
      requestAnimationFrame(f);
    }
  </script>
</body>
</html>
