I'm having a bit of trouble building an audio frequency visualization. I am using the HTML5 audio tag:
<audio id="music" src="https://cdn.glitch.com/02dcea11-9bd2-4462-ac38-eeb6a5ad9530%2F331_full_beautiful-minds_0171_preview.mp3?1522829295082" crossorigin="use-URL-credentials" controls="true"></audio>
(song is from www.premiumbeat.com)
used with an AudioContext and an AnalyserNode, as shown below:
const audio = document.getElementById('music');
audio.load();
audio.play();
const ctx = new AudioContext();
const audioSrc = ctx.createMediaElementSource(audio);
const analyser = ctx.createAnalyser();
audioSrc.connect(analyser);
analyser.connect(ctx.destination);
analyser.fftSize = 256;
const bufferLength = analyser.frequencyBinCount;
const frequencyData = new Uint8Array(bufferLength);
analyser.getByteFrequencyData(frequencyData);
setTimeout(() => {
  analyser.getByteFrequencyData(frequencyData);
  console.log(frequencyData);
}, 5000);
Output:
Even though the song is playing when I call getByteFrequencyData(), the output is a 128-item array full of zeros.
Expected behaviour:
After 5 seconds the console should output a 128-item array of current frequency data. (I do it this way because requestAnimationFrame would lag the window; I tried it anyway and the result was the same.)
Any idea what I'm doing wrong? (I'm using Firefox Quantum 59.0.2.)
Try it here: JSFiddle example
Thank you!
Following my experiments with the Web Audio API, I modified your script to use getByteTimeDomainData instead of getByteFrequencyData.
https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode/getByteTimeDomainData
I'm using an interval instead of a timeout for the demo.
const audio = document.getElementById('music');
audio.load();
audio.play();
const ctx = new AudioContext();
const audioSrc = ctx.createMediaElementSource(audio);
const analyser = ctx.createAnalyser();
audioSrc.connect(analyser);
analyser.connect(ctx.destination);
analyser.fftSize = 256;
const bufferLength = analyser.fftSize; // getByteTimeDomainData fills up to fftSize values
const timeDomainData = new Uint8Array(bufferLength);
setInterval(() => {
  analyser.getByteTimeDomainData(timeDomainData);
  console.log(timeDomainData);
}, 1000);
<audio id="music" src="https://cdn.glitch.com/02dcea11-9bd2-4462-ac38-eeb6a5ad9530%2F331_full_beautiful-minds_0171_preview.mp3?1522829295082" crossorigin="use-URL-credentials" controls="true"></audio>
Related
I want to get the frequencies of an audio file with JS in non-real-time, e.g before the file is played, and store it in an array.
I used this code:
var context = new AudioContext();
var src = context.createMediaElementSource(source);
var analyser = context.createAnalyser();
var listen = context.createGain();
src.connect(listen);
listen.connect(analyser);
analyser.connect(context.destination);
analyser.fftSize = 2 ** 12;
var frequencyBins = analyser.fftSize / 2;
var bufferLength = analyser.frequencyBinCount;
console.log(bufferLength);
var dataArray = new Float32Array(bufferLength);
// var dataArray = new Uint8Array(bufferLength);
var scale = bufferLength / WIDTH; // WIDTH is the canvas width used for drawing
And then in an animation frame,
dataArray = new Float32Array(2048);
analyser.getFloatFrequencyData(dataArray);
This works great in real time, but I'd like to get the audio data beforehand. How can I do that?
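One way to get at the data before playback (a sketch, not code from the question): skip the media element and decode the file up front with fetch() and decodeAudioData(), which hands you the raw PCM samples in an AudioBuffer. The function name loadSamples and the URL 'track.mp3' below are placeholders; you would still need to window and FFT the samples yourself (or run them through an OfflineAudioContext) to get frequency data.
// Sketch: decode the whole file before playing anything.
async function loadSamples(url) {
  const context = new AudioContext();
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();
  // decodeAudioData resolves with an AudioBuffer holding the decoded PCM data
  const audioBuffer = await context.decodeAudioData(arrayBuffer);
  // Float32Array of channel 0 samples, available before playback starts
  return audioBuffer.getChannelData(0);
}
loadSamples('track.mp3').then(samples => console.log(samples.length));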
I want to play only the audio of an mp4 video file. So far I have this, and the function works fine without any errors:
function MovieAudio() {
  const video = document.createElement('video');
  document.body.appendChild(video);
  video.id = 'clip';
  const clip = document.getElementById("clip");
  clip.style.visibility = "hidden";
  const source = document.createElement('source');
  source.src = 'myvideo.mp4';
  source.type = 'video/mp4';
  video.appendChild(source);
  video.load();
  clip.volume = 1;
  clip.play();
} // end of MovieAudio function
The problem is that I want to double the volume of the audio, and if I set the volume like this I get an error:
clip.volume = 2;
I found a solution here but I can't make the code work:
https://stackoverflow.com/a/43794379/10715551
// create an audio context and hook up the video element as the source
var audioCtx = new AudioContext();
var source = audioCtx.createMediaElementSource(clip);
// create a gain node
var gainNode = audioCtx.createGain();
gainNode.gain.value = 2; // double the volume
source.connect(gainNode);
// connect the gain node to an output destination
gainNode.connect(audioCtx.destination);
How can I double the volume of the audio with the given code?
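One possible way to wire the linked snippet into MovieAudio (a sketch under assumptions, not a confirmed fix): create the AudioContext and gain node once, route the clip through them, and leave clip.volume in its valid 0..1 range. The helper name playBoosted is made up for this example.
// Sketch: route the clip through a gain node so the output can exceed element volume 1.
function playBoosted(clip) {
  const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
  // createMediaElementSource may only be called once per media element
  const source = audioCtx.createMediaElementSource(clip);
  const gainNode = audioCtx.createGain();
  gainNode.gain.value = 2; // double the volume
  source.connect(gainNode);
  gainNode.connect(audioCtx.destination);
  clip.volume = 1; // the element's own volume must stay between 0 and 1
  return clip.play(); // autoplay policies may require a prior user gesture
}
Calling playBoosted(clip) at the end of MovieAudio in place of the last two lines should give the doubled output, assuming the AudioContext is allowed to start.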
I know how to use navigator.getUserMedia to get the audio stream from the browser and the system's default input device (a microphone). But what if I want to get a MediaStream from an uploaded audio file or an audio file URL?
I'd appreciate a code example.
Two ways come to mind directly:
A three-liner with HTMLMediaElement.captureStream():
const audio = new Audio(url);
const stream = audio.captureStream();
audio.play(); // stream now has input
// more than two lines actually in stacksnippets®
const audio = new Audio("https://upload.wikimedia.org/wikipedia/en/d/dc/Strawberry_Fields_Forever_(Beatles_song_-_sample).ogg");
audio.loop = true;
audio.crossOrigin = 'anonymous';
audio.play();
const stream = audio.captureStream ? audio.captureStream()
  : audio.mozCaptureStream ? audio.mozCaptureStream()
  : null;
// feed our visible receiver with this stream
receiver.srcObject = stream;
console.log(stream.toString());
<audio id="receiver" controls></audio>
A bit more code, using AudioContext.createMediaStreamDestination() and AudioContext.createMediaElementSource():
const audio = new Audio(url);
const ctx = new (window.AudioContext || window.webkitAudioContext)();
const stream_dest = ctx.createMediaStreamDestination();
const source = ctx.createMediaElementSource(audio);
source.connect(stream_dest);
const stream = stream_dest.stream;
audio.play();
const audio = new Audio("https://upload.wikimedia.org/wikipedia/en/d/dc/Strawberry_Fields_Forever_(Beatles_song_-_sample).ogg");
audio.loop = true;
audio.crossOrigin = 'anonymous';
audio.play();
const ctx = new (window.AudioContext || window.webkitAudioContext)();
const stream_dest = ctx.createMediaStreamDestination();
const source = ctx.createMediaElementSource(audio);
source.connect(stream_dest);
const stream = stream_dest.stream;
// feed our visible receiver with this stream
receiver.srcObject = stream;
console.log(stream.toString());
<audio id="receiver" controls></audio>
In real time we do this to get the frequency of the audio while it is playing:
window.onload = () => {
  audio.load();
  audio.play();
  var context = new AudioContext();
  var source = context.createMediaElementSource(audio);
  var analyser = context.createAnalyser();
  source.connect(analyser);
  analyser.connect(context.destination);
  analyser.fftSize = 512;
  var array = new Uint8Array(analyser.frequencyBinCount);
  function render() {
    requestAnimationFrame(render);
    analyser.getByteFrequencyData(array);
    // process frequency details here
  }
  render();
};
I want to get the frequency data of an audio clip 30 times a second, in non-real time.
For example:
var subSecond = 1 / 30;
var frequency = getFrequencyAt(64 * subSecond);
function getFrequencyAt(s) {
  // logic to get the frequency at time s (in seconds) here
}
How can I achieve this efficiently?
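One approach that might fit (a sketch, assuming an OfflineAudioContext is acceptable and the file URL is known; 'clip.mp3' and getFrequencySnapshots are placeholder names): render the decoded buffer offline and call suspend() every 1/30 s to copy the analyser's frequency data. Suspend times are quantized to render-quantum boundaries, so the timestamps are approximate.
// Sketch: sample analyser frequency data every 1/30 s without real-time playback.
async function getFrequencySnapshots(url, fps = 30) {
  const decodeCtx = new AudioContext();
  const buffer = await decodeCtx.decodeAudioData(await (await fetch(url)).arrayBuffer());
  const offline = new OfflineAudioContext(buffer.numberOfChannels, buffer.length, buffer.sampleRate);
  const source = offline.createBufferSource();
  source.buffer = buffer;
  const analyser = offline.createAnalyser();
  analyser.fftSize = 512;
  source.connect(analyser);
  analyser.connect(offline.destination);
  const snapshots = [];
  const step = 1 / fps;
  for (let t = step; t < buffer.duration; t += step) {
    // suspend() pauses offline rendering so the analyser can be read at ~time t
    offline.suspend(t).then(() => {
      const data = new Uint8Array(analyser.frequencyBinCount);
      analyser.getByteFrequencyData(data);
      snapshots.push(data);
      offline.resume();
    });
  }
  source.start(0);
  await offline.startRendering();
  return snapshots; // snapshots[i] ~ frequency data near i/fps seconds
}
getFrequencySnapshots('clip.mp3').then(snaps => console.log(snaps.length));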
HTML:
<audio id="myAudio" src="song.mp3" oncanplay="done(this)"></audio>
JS:
function done(audio) {
  var ctx = new AudioContext();
  var audioSrc = ctx.createMediaElementSource(audio);
  var analyser = ctx.createAnalyser();
  audioSrc.connect(analyser);
  audioSrc.connect(ctx.destination);
  var array = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(array);
  console.log(array);
}
All that is written to the console is an array of all zeros. Does anybody know what's wrong with this code?
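A likely explanation (hedged, since it isn't confirmed in the thread): the array is filled exactly once inside oncanplay, before a single sample has been played, so the analyser has only seen silence. A sketch that reads the data repeatedly while the audio actually plays:
function done(audio) {
  var ctx = new AudioContext();
  var audioSrc = ctx.createMediaElementSource(audio);
  var analyser = ctx.createAnalyser();
  audioSrc.connect(analyser);
  audioSrc.connect(ctx.destination);
  var array = new Uint8Array(analyser.frequencyBinCount);
  audio.play();
  (function poll() {
    requestAnimationFrame(poll);
    analyser.getByteFrequencyData(array); // non-zero once playback has produced samples
    console.log(array);
  })();
}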