Scheduling Web Audio API playback, multiple plays issue - javascript

I am trying to schedule the beep sound to play 3x, one second apart. However, the sound is only playing once. Any thoughts on why this might be? (It's included within a larger JavaScript function that declares context, etc.)
var beepBuffer;

var loadBeep = function() {
    var getSound = new XMLHttpRequest(); // Load the Sound with XMLHttpRequest
    getSound.open("GET", "/static/music/chime.wav", true); // Path to Audio File
    getSound.responseType = "arraybuffer"; // Read as Binary Data
    getSound.onload = function() {
        context.decodeAudioData(getSound.response, function(buffer) {
            beepBuffer = buffer; // Decode the Audio Data and Store it in a Variable
        });
    };
    getSound.send(); // Send the Request and Load the File
};

var playBeep = function() {
    for (var j = 0; j < 3; j++) {
        var beeper = context.createBufferSource(); // Declare a New Sound
        beeper.buffer = beepBuffer; // Attach our Audio Data as its Buffer
        beeper.connect(context.destination); // Link the Sound to the Output
        console.log(j);
        beeper.start(j); // Play the Sound Immediately
    }
};

Close - and the other answer's code will work - but it's not the synchronicity: it's that you're not adding context.currentTime to the start time. start() doesn't take an offset from "now" - it takes an absolute time. Add context.currentTime to the start parameter, and you should be good.
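For illustration, a minimal sketch of the asker's loop with that fix applied (assuming context and beepBuffer are set up as in the question):

for (var j = 0; j < 3; j++) {
    var beeper = context.createBufferSource();
    beeper.buffer = beepBuffer;
    beeper.connect(context.destination);
    // schedule relative to "now": beeps at 0s, 1s and 2s from the current time
    beeper.start(context.currentTime + j);
}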

Your code assumes that beeper.start(j) is a synchronous method, i.e. that it waits for the sound to finish playing. This is not the case, so your loop is probably starting all 3 instances at nearly the same time.
One solution is to delay the playing of each instance, by passing a time parameter to the start() method:
var numSecondsInBeep = 3;
for (var j = 0; j < 3; j++) {
    var beeper = context.createBufferSource(); // Declare a New Sound
    beeper.buffer = beepBuffer; // Attach our Audio Data as its Buffer
    beeper.connect(context.destination); // Link the Sound to the Output
    console.log(j);
    beeper.start(context.currentTime + j * numSecondsInBeep);
}
See the AudioBufferSourceNode documentation for more info on the start() API.

Related

Javascript append multiple buffers to SourceBuffer and play them as a single video

I'm trying to concatenate multiple buffers, but it doesn't work. This is the code I'm using:
let socket = io();
let mediaSource = new MediaSource();
let video = document.getElementById("player");
let queue = [];
let sourceBuffer;

video.src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function() {
    sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vorbis,vp8"');
    socket.on('broadcast', function (buffer) {
        console.log('new buffer');
        let uIntArray = new Uint8Array(buffer);
        if (!sourceBuffer.updating) {
            sourceBuffer.appendBuffer(uIntArray);
        } else {
            queue.push(uIntArray);
        }
    });
});
When the first buffer arrives the video starts to play, but as soon as the second buffer comes through Socket.IO the video freezes. I don't know how to append the second buffer so that when the first one ends, playback moves on to the second as if it were one video.
You have to offset the current SourceBuffer duration after each append:

var duration = 0;
...
// (within the append loop)
sourceBuffer.timestampOffset = duration;
var delta = buffer.duration;
duration = duration + delta;
If you're dealing with a sequential stream, just set sourceBuffer.mode = "sequence"; timestampOffset will then be increased automatically in the order the chunks are appended.
See SourceBuffer.mode on MDN.
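As a rough sketch of how that could look in the question's sourceopen handler (the updateend handler that drains the queue is my addition, since the original code never emptied it):

mediaSource.addEventListener('sourceopen', function () {
    sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vorbis,vp8"');
    // timestamp chunks in the order they are appended
    sourceBuffer.mode = "sequence";
    // drain queued chunks once the previous append finishes
    sourceBuffer.addEventListener('updateend', function () {
        if (queue.length > 0 && !sourceBuffer.updating) {
            sourceBuffer.appendBuffer(queue.shift());
        }
    });
    socket.on('broadcast', function (buffer) {
        let uIntArray = new Uint8Array(buffer);
        if (!sourceBuffer.updating) {
            sourceBuffer.appendBuffer(uIntArray);
        } else {
            queue.push(uIntArray);
        }
    });
});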
I do not think you can. A MediaSource can only have one video SourceBuffer and one audio SourceBuffer; if you add two, the video will fire an error.
https://developers.google.com/web/fundamentals/media/mse/basics

Play PCM as it is being generated

I am generating some raw audio data in javascript and I need to play it as I am generating it. I searched for this here and the closest thing to what I am looking for is this. However, in the answer given there the array of data points is generated first and then the audio is played. I need to play it while generating it. Basically I am receiving some stream of some other data, processing it and generating the audio based on that. I need to play the audio corresponding to the data I am receiving as I am receiving it. (A simplified example is receiving the audio volume and frequency.)
If I'm understanding the request correctly, then all you need is a ScriptProcessorNode.
You will feed it with your PCM data in the following way:
- wait for its onaudioprocess event;
- get the outputBuffer from the event, which is an AudioBuffer;
- loop through each channel of the outputBuffer (getChannelData() returns a Float32Array);
- loop through all the samples of each channel's data;
- set your own data.
function makeSomeNoise() {
    var ctx = new AudioContext();
    var processor = ctx.createScriptProcessor(4096, 1, 1);
    processor.onaudioprocess = function(evt) {
        var outputBuffer = evt.outputBuffer;
        // Loop through the output channels
        for (var channel = 0; channel < outputBuffer.numberOfChannels; channel++) {
            var outputData = outputBuffer.getChannelData(channel);
            // Loop through the 4096 samples
            for (var sample = 0; sample < outputBuffer.length; sample++) {
                outputData[sample] = ((Math.random() * 2) - 1) * 0.5;
            }
        }
    };
    processor.connect(ctx.destination);
}

btn.onclick = function() {
    if (confirm("That won't be really nice")) {
        makeSomeNoise();
    }
};
<button id="btn">make some noise</button>

Javascript: Audio from local file not playing without HTML "<audio>" element present

Now this one really has me stumped, so I was hoping to see if anyone could spot where I am doing things incorrectly.
So essentially, I have a page with two elements. One is an HTML5 file handler, and the second is a button. When the user selects a file I respond to the onchange event that is generated, decoding the audio and constructing a buffer to be used. I know there is the HTML5 audio tag, but this is going to be a utility that needs to be able to break up the file into manageable chunks.
I have done several tests and have found that the audio that I decode myself will only play after an audio element on the page has been played. I have absolutely no idea what could be causing this behavior, since I have studied several examples online on playing audio. I will include my audio engine below.
Just to clarify, everything is handled by this code.
Thank you.
"use strict";
var AudioFile = function (){
this.length = 0; // Number of samples
this.duration = 0; // Time in seconds
this.sampleRate = 0; // Sample rate
this.channels = 0; // Number of channels
this.data = []; //Audio/Waveform data
};
var audioCtx = null;
class AudioEngine {
constructor (){
// All of the necessary audio control variables
if(!audioCtx){
window.AudioContext = window.AudioContext || window.webkitAudioContext;
audioCtx = new AudioContext();
}
// Will hold audio data waiting to be played
this.buffer = null;
// This will hold the decoded audio file upon completion
this.decodedFile = new AudioFile();
// Automatically create buffer upon finished decoding?
this.autoCreateBuffer = true;
// Specify this if you want to have a function recieve
// the decoded audio, i.e. completionCallback(decodedFile);
this.completionCallback = null;
}
// This will decode an audio file
fileCallback (event){
console.log("file callback");
this.buffer = null;
var reader = new FileReader();
var file = event.target.files[0];
reader.onload = this.loadCallback.bind(this);
reader.readAsArrayBuffer(file);
}
// Called by fileCallback after file has been loaded
loadCallback (file){
console.log("load callback");
var raw = file.target.result;
audioCtx.decodeAudioData(raw, this.decodeCallback.bind(this));
}
// Called by loadCallback after file has been decoded
decodeCallback (data){
console.log("decode callback");
var audioTemp = new AudioFile();
audioTemp.length = data.length;
audioTemp.duration = data.duration;
audioTemp.sampleRate = data.sampleRate;
audioTemp.channels = data.numberOfChannels;
var arr = [];
for(var i = 0; i < data.numberOfChannels; i++){
arr.push(new Float32Array(data.length));
data.copyFromChannel(arr[i], i);
}
audioTemp.data = arr.slice(0);
this.decodedFile = audioTemp;
if(this.autoCreateBuffer){
var buffer = audioCtx.createBuffer(audioTemp.channels, audioTemp.length, audioTemp.sampleRate);
var samples;
for(var c = 0; c < audioTemp.channels; c++){
samples = buffer.getChannelData(c);
for(var i = 0; i < audioTemp.length; i++){
samples[i] = this.decodedFile.data[c][i];
}
}
this.buffer = buffer;
}
if(this.completionCallback){
this.completionCallback(audioTemp);
}
}
// Play data that is in buffer
play(){
if(this.buffer){
var source = audioCtx.createBufferSource();
var tmp = this.buffer.getChannelData(0);
source.buffer = this.buffer;
source.connect(audioCtx.destination);
source.start(0);
console.log("play");
}
}
}
Okay, I have figured out what seems to be the problem. This code is currently in the testing phase, and the issue seems to have something to do with the console being open in Firefox, which interferes with sound being played. Although it is not entirely consistent in how it behaves, at least I know the general cause of the problem I was having.
In other words, audio elements have no problems whether the console is open or not, but there seems to be undefined behavior when opening/minimizing the console with JavaScript-controlled audio. However, it always behaves as expected when the console is closed.
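A side note, and an assumption on my part rather than part of the diagnosis above: modern autoplay policies can leave an AudioContext in the "suspended" state until a user gesture, which produces very similar silent-until-interaction symptoms. A defensive version of the play() method could resume the context before starting the source:

play () {
    if (this.buffer) {
        var source = audioCtx.createBufferSource();
        source.buffer = this.buffer;
        source.connect(audioCtx.destination);
        // the context may be suspended if it was created before any user gesture;
        // resume() returns a Promise, so start the source once it settles
        if (audioCtx.state === "suspended") {
            audioCtx.resume().then(function () { source.start(0); });
        } else {
            source.start(0);
        }
        console.log("play");
    }
}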

Play sound with delay (100ms)?

What is the best way to play a sound with a 50ms or 100ms delay?
Here is what I tried:
var beat = new Audio('/sound/BEAT.wav');
var time = 300;

playbeats();

function playbeats() {
    beat.cloneNode().play();
    setTimeout(playbeats, time);
}
This works correctly, but my goal is to play BEAT.wav every 100ms. When I change the "time" variable to 100, it gets "laggy".
My BEAT.wav is 721ms long (that's why I'm using cloneNode()).
What alternatives are there to solve this?
You can use setInterval(); the arguments are the same as setTimeout()'s.
setInterval(function() {
    playbeats();
}, 100);

and your playbeats function should be:

function playbeats() {
    var tempBeat = beat.cloneNode();
    tempBeat.play();
}

Your whole program should then look like this:

var beat = new Audio('/sound/BEAT.wav');

setInterval(function() {
    playbeats();
}, 100);

function playbeats() {
    var tempBeat = beat.cloneNode();
    tempBeat.play();
}
You can also use the Web Audio API, but the code will be a bit different. If you want the Web Audio API's timing and loop capabilities, you will need to load the file into a buffer first. It also requires that your code runs on a server. Here is an example:
var audioContext = new AudioContext();
var audioBuffer;

var getSound = new XMLHttpRequest();
getSound.open("get", "sound/BEAT.wav", true);
getSound.responseType = "arraybuffer";
getSound.onload = function() {
    audioContext.decodeAudioData(getSound.response, function(buffer) {
        audioBuffer = buffer;
    });
};
getSound.send();

function playback() {
    var playSound = audioContext.createBufferSource();
    playSound.buffer = audioBuffer;
    playSound.loop = true;
    playSound.connect(audioContext.destination);
    // start(when, offset, duration): begin now, from the start of the buffer, playing 0.3s
    playSound.start(audioContext.currentTime, 0, 0.3);
}

window.addEventListener("mousedown", playback);
I would also recommend using the Web Audio API. With it, you can simply loop a buffer source node every 100ms, or 50ms, or whatever time you want.
To do this, as stated in the other answers, you'll need to use an XMLHttpRequest to load the sound file from a server:
// set up the Web Audio context
var audioCtx = new AudioContext();

// create a new buffer
// 2 channels, 4410 samples (100ms at 44100 samples/sec), 44100 samples per sec
var buffer = audioCtx.createBuffer(2, 4410, 44100);

// load the sound file via an XMLHttpRequest from a server
var request = new XMLHttpRequest();
request.open('GET', '/sound/BEAT.wav', true);
request.responseType = 'arraybuffer';
request.onload = function () {
    var audioData = request.response;
    audioCtx.decodeAudioData(audioData, function (newBuffer) {
        buffer = newBuffer;
    });
};
request.send();
Now you can make a Buffer Source Node to loop the playback
// create the buffer source
var bufferSource = audioCtx.createBufferSource();
// set the buffer we want to use
bufferSource.buffer = buffer;
// set the buffer source node to loop
bufferSource.loop = true;
// specify the loop points in seconds (0.1s = 100ms)
// this is a little redundant since we already set our buffer to be 100ms
// so by default it would loop when the buffer comes to an end (at 100ms)
bufferSource.loopStart = 0;
bufferSource.loopEnd = 0.1;
// connect the buffer source to the Web Audio sound output
bufferSource.connect(audioCtx.destination);
// play!
bufferSource.start();
Note that if you stop the playback via bufferSource.stop(), you will not be able to start it again. You can only call start() once, so you'll need to create a new source node if you want to start playback again.
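A small sketch of that restart pattern (the helper name is mine):

function startLoop() {
    // a fresh node is required for every start() call
    var src = audioCtx.createBufferSource();
    src.buffer = buffer;
    src.loop = true;
    src.connect(audioCtx.destination);
    src.start();
    return src; // keep the reference so you can call stop() later
}

var current = startLoop();
current.stop();        // this node is now spent
current = startLoop(); // restart with a brand-new node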
Note that because of the way the sound file is loaded via an XMLHttpRequest, if you try to test this on your machine without running a server, you'll get a cross-origin request error on most browsers. So the simplest way around this if you want to test on your machine is to run Python's SimpleHTTPServer (python -m SimpleHTTPServer, or python3 -m http.server).

Optimize audio for iOS Web App

I'm currently developing and testing a game for iOS using JavaScript with the Cordova framework. I'm attempting to add sound effects when certain nodes are touched. Since nodes can be touched repeatedly at any rate, I'm using...
var snd = new Audio("audio/note_"+currentChain.length+".mp3");
snd.play();
Which works for what I need, but when I enable these effects I find that the game lags. I'm working with mp3 files that have been shrunk down to about 16kb in size, and even still the lag is substantial.
What is the best way to optimize sound in my situation? Am I limited on quality because the application is not native?
Thanks!
Your best option would be to preload the sounds and have them ready when needed. I just wrote up a quick self-contained closure that I think will show you most of what you'd like to know how to do.
var playSound = (function () {
    var sounds = 15, // idk how many mp3 files you have
        loaded = new Array(sounds),
        current = -1,
        chain = [],
        audio,
        i;

    function incrementLoaded(index) {
        loaded[index] = true;
    }

    // preloads the sound data
    for (i = 0; i < sounds; i++) {
        audio = new Audio();
        audio.addEventListener('canplay', incrementLoaded.bind(audio, i));
        audio.src = 'audio/note_' + i + '.mp3';
        chain.push(audio);
    }

    // this will play audio only when ready and in sequential order automatically,
    // or at "which" index, if supplied
    return function (which) {
        if (typeof which === 'number') {
            current = which;
        } else {
            current = (current + 1) % sounds;
        }
        // only play if loaded
        if (loaded[current]) {
            chain[current].pause();
            chain[current].currentTime = 0;
            chain[current].play();
        }
    };
}());

// would play sounds in order every second
setInterval(function () {
    playSound();
}, 1000);
If you are using multiple files, I'd suggest you change that to a single file, using the idea of sound sprites. This link has more details about it: http://www.ibm.com/developerworks/library/wa-ioshtml5/
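To give the gist of a sound sprite, here's a rough sketch; the file name and clip offsets are made up, and you'd take the real values from wherever you concatenated your notes:

// one file containing all the effects back to back (hypothetical name)
var sprite = new Audio('audio/notes_sprite.mp3');

// start time and length of each clip within the sprite, in seconds (hypothetical values)
var clips = {
    note_0: { start: 0.0, length: 0.4 },
    note_1: { start: 0.5, length: 0.4 }
};

function playClip(name) {
    var clip = clips[name];
    sprite.currentTime = clip.start; // seek to the start of the clip
    sprite.play();
    // pause once the clip has played out
    setTimeout(function () { sprite.pause(); }, clip.length * 1000);
}

playClip('note_1');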
From my own experience, try increasing the file bitrate if the sound isn't playing exactly where you want it to. Ref: http://pupunzi.open-lab.com/2013/03/13/making-html5-audio-actually-work-on-mobile/
