Optimize audio for iOS Web App - javascript

I'm currently developing and testing a game for iOS using JavaScript with the Cordova framework. I'm attempting to add sound effects when certain nodes are touched. Since nodes can be touched repeatedly at any rate, I'm using...
var snd = new Audio("audio/note_"+currentChain.length+".mp3");
snd.play();
This works for what I need, but when I enable these effects I find that the game lags. I'm working with mp3 files that have been shrunk down to about 16 KB in size, and even still the lag is substantial.
What is the best way to optimize sound in my situation? Am I limited on quality because the application is not native?
Thanks!

The best option would be to preload them and have them ready when needed. I just wrote up a quick self-contained closure that I think will show you most of what you'd like to know how to do.
var playSound = (function () {
    var sounds = 15, // idk how many mp3 files you have
        loaded = new Array(sounds),
        current = -1,
        chain = [],
        audio,
        i;

    function incrementLoaded(index) {
        loaded[index] = true;
    }

    // preloads the sound data
    for (i = 0; i < sounds; i++) {
        audio = new Audio();
        audio.addEventListener('canplay', incrementLoaded.bind(audio, i));
        audio.src = 'audio/note_' + i + '.mp3';
        chain.push(audio);
    }

    // this will play audio only when ready, in sequential order automatically,
    // or at the "which" index, if supplied
    return function (which) {
        if (typeof which === 'number') {
            current = which;
        } else {
            current = (current + 1) % sounds;
        }
        // only play if loaded
        if (loaded[current]) {
            chain[current].pause();
            chain[current].currentTime = 0;
            chain[current].play();
        }
    };
}());

// would play sounds in order every second
setInterval(function () {
    playSound();
}, 1000);

If you are using multiple files, I'd suggest changing that to a single file, using the idea of sound sprites. This link has more details about it: http://www.ibm.com/developerworks/library/wa-ioshtml5/
From my own experience, try increasing the file bitrate if you are not getting the sound to play exactly where you want it to; ref: http://pupunzi.open-lab.com/2013/03/13/making-html5-audio-actually-work-on-mobile/
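To make the sprite idea concrete, here is a minimal sketch of a sound-sprite player built on the Web Audio API. The file name audio/notes.mp3 and the offset table are hypothetical; you would measure the real start/duration values from your own combined file.

var spriteCtx = new (window.AudioContext || window.webkitAudioContext)();
var spriteBuffer = null;

// hypothetical offsets: where each effect lives inside the sprite (seconds)
var sprite = {
    note_0: { start: 0.0, duration: 0.4 },
    note_1: { start: 0.5, duration: 0.4 }
};

// download and decode the whole sprite once, up front
var request = new XMLHttpRequest();
request.open('GET', 'audio/notes.mp3', true);
request.responseType = 'arraybuffer';
request.onload = function () {
    spriteCtx.decodeAudioData(request.response, function (buffer) {
        spriteBuffer = buffer;
    });
};
request.send();

function playSprite(name) {
    if (!spriteBuffer) { return; } // not decoded yet
    var source = spriteCtx.createBufferSource();
    source.buffer = spriteBuffer;
    source.connect(spriteCtx.destination);
    // start(when, offset, duration) plays just the named slice
    source.start(0, sprite[name].start, sprite[name].duration);
}

// e.g. playSprite('note_0');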

Related

Jest testing HTML Media Element

I have written a web-app which plays selected mp3 files in order of selection. When it comes to testing, I cannot get my Jest tests to enter the onended() handler of the HTMLAudioElement.
I have tried to spyOn the play() method on the elements but cannot find a way to set the audio's ended attribute to true.
The code for playing the audio is as follows:
playAudioFiles = () => {
    const audioFiles = this.getAudioFiles()
    const audio = audioFiles[0]
    let index = 1
    audio.play()
    audio.onended = () => {
        if (index < audioFiles.length) {
            audio.src = audioFiles[index].src
            audio.play()
            index++
        }
    }
}
I am spying on the play method of the audio elements as follows:
mockPlay = jest.spyOn(window.HTMLMediaElement.prototype, 'play')
Potentially there is something I could do here to trigger the ended event?
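One approach worth trying, sketched below under the assumption that `component` exposes the playAudioFiles/getAudioFiles methods from the question: since onended is just an event handler property, dispatching a synthetic 'ended' event on the element should invoke it. jsdom never fires 'ended' on its own, so the test fires it manually.

const playSpy = jest
    .spyOn(window.HTMLMediaElement.prototype, 'play')
    .mockImplementation(() => Promise.resolve()); // jsdom's play() is unimplemented

component.playAudioFiles();

// fire the event the handler is waiting for; this runs the onended
// callback assigned in playAudioFiles
const audio = component.getAudioFiles()[0];
audio.dispatchEvent(new Event('ended'));

// the handler should have advanced to the next file and called play() again
expect(playSpy).toHaveBeenCalledTimes(2);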

starting/stopping MediaRecorder API causes Chrome to crash

I am implementing the MediaRecorder API as a way to record webm blobs for use as segments in a livestream. I have gotten the functionality I need, but ran into a problem with Chrome crashing when calling MediaRecorder.stop() and MediaRecorder.start() multiple times at regular intervals.
Here is the recording code:
let Recorder = null;
let segmentBuffer = [];
let recordInterval = null;
let times = 0; // limiter for crashes

function startRecording() {
    Recorder = new MediaRecorder(LocalStream, {
        mimeType: 'video/webm;codecs=opus, vp8',
        audioBitsPerSecond: 50000,
        videoBitsPerSecond: 1000000,
    });
    // error evt
    Recorder.onerror = (evt) => {
        console.error(evt.error);
    };
    // push blob data to segments buffer
    Recorder.ondataavailable = (evt) => {
        segmentBuffer.push(evt.data);
    };
    // start initial recording
    Recorder.start();
    // set stop/start delivery interval every 5 seconds
    recordInterval = setInterval(() => {
        // stop recording
        Recorder.stop();
        // here to prevent crash
        if (times > 5) {
            Recorder = null;
            console.log('end');
            return;
        }
        times++;
        // check if has segments
        if (segmentBuffer.length) {
            // produce segment; this segment is playable and not just a
            // byte-stream, due to the start/stop
            let webm = segmentBuffer.reduce((a, b) => new Blob([a, b], { type: "video/webm;codecs=opus, vp8" }));
            // unset buffer
            segmentBuffer = [];
            // handle blob, i.e. send to server
            handleBlob(webm);
        }
        // restart recorder
        Recorder.start();
    }, 5000);
}
I've also dug into the performance profile and discovered that a new audio and video encoder thread is started for each start/stop. I think this is the major problem, as setting the interval to 10s vs. 5s creates fewer encoding threads. The buildup of multiple encoding threads causes Chrome to lag and then finally crash after a few passes.
How do I prevent multiple encoding threads from piling up while still being able to start/stop MediaRecorder? (Start/stop is the only way I found to get webm files that are playable separately; otherwise each subsequent blob is missing the webm header.)
It appears that this is a bug in Chrome:
https://bugs.chromium.org/p/chromium/issues/detail?id=1012378&q=mediaRecorder%20thread&can=2
I'm not sure there is anything you can do to fix it.
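For completeness, a sketch of the timeslice alternative the question alludes to; it keeps a single encoder alive and avoids the repeated stop()/start(), but with the trade-off already noted above: only the first blob carries the webm header, so later chunks are not independently playable without concatenation or server-side remuxing.

Recorder = new MediaRecorder(LocalStream, {
    mimeType: 'video/webm;codecs=opus, vp8',
    audioBitsPerSecond: 50000,
    videoBitsPerSecond: 1000000,
});
Recorder.ondataavailable = (evt) => {
    // fires roughly every 5 seconds with the next chunk of the stream
    handleBlob(evt.data);
};
Recorder.start(5000); // timeslice in milliseconds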

Splitting large file load into chunks, stitching to AudioBuffer?

In my app, I have an hour-long audio file that's entirely sound effects. Unfortunately I do need them all - they're species-specific sounds, so I can't cut any of them out. They were separate before, but I audiosprite'd them all into one large file.
The export file is about 20MB compressed, but it's still a large download for users with a slow connection. I need this file to be in an AudioBuffer, since I'm seeking to sections of an audioSprite and using loopStart/loopEnd to only loop that section. I more or less need the whole thing downloaded before playback can start, because the requested species are randomly picked when the app starts. They could be looking for sounds at the start of the file, or at the very end.
What I'm wondering is, if I were to split this file in fourths, could I load them in in parallel, and stitch them into the full AudioBuffer once loading finishes? I'm guessing I'd be merging multiple arrays, but only performing decodeAudioData() once? Requesting ~100 separate files (too many) was what brought me to audiosprites in the first place, but I'm wondering if there's a way to leverage some amount of async loading to lower the time it takes. I thought about having four <audio> elements and using createMediaElementSource() to load them, but my understanding is that I can't (?) turn a MediaElementSource into an AudioBuffer.
Consider playing the files immediately in chunks instead of waiting for the entire file to download. You could do this with the Streams API and:
Queuing chunks with the MediaSource Extensions (MSE) API and switching between buffers.
Playing back decoded PCM audio with the Web Audio API and AudioBuffer.
See examples for low-latency audio playback of file chunks as they are received.
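As a rough illustration of the MSE route, here is a sketch that streams a file with fetch and appends each network chunk to a SourceBuffer as it arrives. The URL is a placeholder, and 'audio/mpeg' support in MSE varies by browser, so check MediaSource.isTypeSupported() first.

const audioEl = document.querySelector('audio');
const mediaSource = new MediaSource();
audioEl.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
    const sourceBuffer = mediaSource.addSourceBuffer('audio/mpeg');
    const response = await fetch('audio/sprites.mp3'); // placeholder URL
    const reader = response.body.getReader();
    for (;;) {
        const { done, value } = await reader.read();
        if (done) { mediaSource.endOfStream(); break; }
        // wait for each append to finish before queuing the next
        await new Promise(resolve => {
            sourceBuffer.addEventListener('updateend', resolve, { once: true });
            sourceBuffer.appendBuffer(value);
        });
    }
});

audioEl.play(); // playback can begin long before the download finishes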
I think in principle you can. Just download each chunk as an ArrayBuffer, concatenate all of the chunks together and send that to decodeAudioData.
But if you're on a slow link, I'm not sure how downloading in parallel will help.
Edit: this code is functional, but on occasion produces really nasty audio glitches, so I don't recommend using it without further testing. I'm leaving it here in case it helps someone else figure out working with Uint8Arrays.
So here's a basic version of it, basically what Raymond described. I haven't tested this with a split version of the large file yet, so I don't know if it improves the load speed at all, but it works. The JS is below, but if you want to test it yourself, here's the pen.
// mp3 link is from: https://codepen.io/SitePoint/pen/JRaLVR
(function () {
    'use strict';

    const context = new AudioContext();
    let bufferList = [];

    // change the urlList for your needs
    const URL = 'https://s3-us-west-2.amazonaws.com/s.cdpn.io/123941/Yodel_Sound_Effect.mp3';
    const urlList = [URL, URL, URL, URL, URL, URL];

    const loadButton = document.querySelector('.loadFile');
    const playButton = document.querySelector('.playFile');
    loadButton.onclick = () => loadAllFiles(urlList, loadProgress);

    function play(audioBuffer) {
        const source = context.createBufferSource();
        source.buffer = audioBuffer;
        source.connect(context.destination);
        source.start();
    }

    // concatenates all the buffers into one collected ArrayBuffer
    function concatBufferList(buflist, len) {
        let tmp = new Uint8Array(len);
        let pos = 0;
        for (let i = 0; i < buflist.length; i++) {
            tmp.set(new Uint8Array(buflist[i]), pos);
            pos += buflist[i].byteLength;
        }
        return tmp.buffer;
    }

    function loadAllFiles(list, onProgress) {
        let fileCount = 0;
        let fileSize = 0;
        for (let i = 0; i < list.length; i++) {
            loadFileXML(list[i], onProgress, i).then(e => {
                bufferList[i] = e.buf;
                fileSize += e.size;
                fileCount++;
                if (fileCount == bufferList.length) {
                    let b = concatBufferList(bufferList, fileSize);
                    context.decodeAudioData(b).then(audioBuffer => {
                        playButton.disabled = false;
                        playButton.onclick = () => play(audioBuffer);
                    }).catch(error => console.log(error));
                }
            });
        }
    }

    // adapted from petervdn's audiobuffer-load on npm
    function loadFileXML(url, onProgress, index) {
        return new Promise((resolve, reject) => {
            const request = new XMLHttpRequest();
            request.open('GET', url, true);
            request.responseType = 'arraybuffer';
            if (onProgress) {
                request.onprogress = event => {
                    onProgress(event.loaded / event.total);
                };
            }
            request.onload = () => {
                if (request.status === 200) {
                    const fileSize = request.response.byteLength;
                    resolve({
                        buf: request.response,
                        size: fileSize
                    });
                } else {
                    reject(`Error loading '${url}' (${request.status})`);
                }
            };
            request.onerror = error => {
                reject(error);
            };
            request.send();
        });
    }

    function loadProgress(e) {
        console.log("Progress: " + e);
    }
}());
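And if you'd rather not split the file on disk at all, the same stitching approach could in principle be driven by HTTP Range requests against the single large file. A hypothetical sketch (the URL and total size are placeholders, and the server must honor the Range header, indicated by a 206 Partial Content response):

function loadRange(url, start, end) {
    return fetch(url, { headers: { Range: 'bytes=' + start + '-' + end } })
        .then(res => {
            if (res.status !== 206) { throw new Error('Range requests not supported'); }
            return res.arrayBuffer();
        });
}

// quarters of an assumed 20 MB file; the byte boundaries don't need to
// align with mp3 frames because the chunks are re-joined before decoding
const totalBytes = 20 * 1024 * 1024; // placeholder; read Content-Length in practice
const quarter = Math.ceil(totalBytes / 4);
const parts = [0, 1, 2, 3].map(i =>
    loadRange('audio/sprites.mp3', i * quarter, Math.min(totalBytes, (i + 1) * quarter) - 1));

Promise.all(parts).then(chunks => {
    // reuse concatBufferList() from above, then call decodeAudioData() once
    const totalLength = chunks.reduce((sum, c) => sum + c.byteLength, 0);
    return context.decodeAudioData(concatBufferList(chunks, totalLength));
});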

JavaScript: Audio from local file not playing without HTML "<audio>" element present

Now this one really has me stumped, so I was hoping to see if anyone could spot where I am doing things incorrectly.
So essentially, I have a page with two elements. One is an HTML5 file handler, and the second is a button. When the user selects a file I respond to the onchange event that is generated, decoding the audio and constructing a buffer to be used. I know there is the HTML5 audio tag, but this is going to be a utility that needs to be able to break up the file into manageable chunks.
I have done several tests and have found that the audio that I decode myself will only play after an audio element on the page has been played. I have absolutely no idea what could be causing this behavior, since I have studied several examples online on playing audio. I will include my audio engine below.
Just to clarify, everything is handled by this code.
Thank you.
"use strict";
var AudioFile = function (){
this.length = 0; // Number of samples
this.duration = 0; // Time in seconds
this.sampleRate = 0; // Sample rate
this.channels = 0; // Number of channels
this.data = []; //Audio/Waveform data
};
var audioCtx = null;
class AudioEngine {
constructor (){
// All of the necessary audio control variables
if(!audioCtx){
window.AudioContext = window.AudioContext || window.webkitAudioContext;
audioCtx = new AudioContext();
}
// Will hold audio data waiting to be played
this.buffer = null;
// This will hold the decoded audio file upon completion
this.decodedFile = new AudioFile();
// Automatically create buffer upon finished decoding?
this.autoCreateBuffer = true;
// Specify this if you want to have a function recieve
// the decoded audio, i.e. completionCallback(decodedFile);
this.completionCallback = null;
}
// This will decode an audio file
fileCallback (event){
console.log("file callback");
this.buffer = null;
var reader = new FileReader();
var file = event.target.files[0];
reader.onload = this.loadCallback.bind(this);
reader.readAsArrayBuffer(file);
}
// Called by fileCallback after file has been loaded
loadCallback (file){
console.log("load callback");
var raw = file.target.result;
audioCtx.decodeAudioData(raw, this.decodeCallback.bind(this));
}
// Called by loadCallback after file has been decoded
decodeCallback (data){
console.log("decode callback");
var audioTemp = new AudioFile();
audioTemp.length = data.length;
audioTemp.duration = data.duration;
audioTemp.sampleRate = data.sampleRate;
audioTemp.channels = data.numberOfChannels;
var arr = [];
for(var i = 0; i < data.numberOfChannels; i++){
arr.push(new Float32Array(data.length));
data.copyFromChannel(arr[i], i);
}
audioTemp.data = arr.slice(0);
this.decodedFile = audioTemp;
if(this.autoCreateBuffer){
var buffer = audioCtx.createBuffer(audioTemp.channels, audioTemp.length, audioTemp.sampleRate);
var samples;
for(var c = 0; c < audioTemp.channels; c++){
samples = buffer.getChannelData(c);
for(var i = 0; i < audioTemp.length; i++){
samples[i] = this.decodedFile.data[c][i];
}
}
this.buffer = buffer;
}
if(this.completionCallback){
this.completionCallback(audioTemp);
}
}
// Play data that is in buffer
play(){
if(this.buffer){
var source = audioCtx.createBufferSource();
var tmp = this.buffer.getChannelData(0);
source.buffer = this.buffer;
source.connect(audioCtx.destination);
source.start(0);
console.log("play");
}
}
}
Okay, I have figured out what seems to be the problem. This code is currently in the testing phase, and the issue seems to have something to do with having the console open in Firefox, which interferes with sound playback. The behavior isn't entirely consistent, but at least I know the general cause of the problem I was having.
In other words, "audio" elements have no problems whether the console is open or not, but there seems to be undefined behavior when opening/minimizing the console with JavaScript-controlled audio. It always behaves as expected when the console is closed.
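An aside not from the original thread: the symptom described (Web Audio playback only working after an <audio> element has played) also matches browsers' autoplay policies, which leave an AudioContext in the suspended state until a user gesture. It may be worth ruling out by resuming the context from a gesture handler; playButton and engine below are placeholder names.

// hypothetical wiring: resume the context inside a user-gesture handler
playButton.addEventListener("click", function () {
    if (audioCtx.state === "suspended") {
        audioCtx.resume().then(function () { engine.play(); });
    } else {
        engine.play();
    }
});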

Scheduling Web Audio Api playback, multiple plays issue

I am trying to schedule the beep sound to play 3x, one second apart. However, the sound is only playing once. Any thoughts on why this might be? (It's included within a larger JavaScript function that declares context etc...)
var beepBuffer;

var loadBeep = function () {
    var getSound = new XMLHttpRequest(); // Load the Sound with XMLHttpRequest
    getSound.open("GET", "/static/music/chime.wav", true); // Path to Audio File
    getSound.responseType = "arraybuffer"; // Read as Binary Data
    getSound.onload = function () {
        context.decodeAudioData(getSound.response, function (buffer) {
            beepBuffer = buffer; // Decode the Audio Data and Store it in a Variable
        });
    };
    getSound.send(); // Send the Request and Load the File
};

var playBeep = function () {
    for (var j = 0; j < 3; j++) {
        var beeper = context.createBufferSource(); // Declare a New Sound
        beeper.buffer = beepBuffer; // Attach our Audio Data as its Buffer
        beeper.connect(context.destination); // Link the Sound to the Output
        console.log(j);
        beeper.start(j); // Play the Sound Immediately
    }
};
Close - and the other answer's code will work - but it's not the synchronicity; it's that you're not adding context.currentTime to the start time. start() doesn't take an offset from "now" - it takes an absolute time on the context's timeline. Add context.currentTime to the start param, and you should be good.
Your code assumes that beeper.start(j) is a synchronous method, i.e. it waits for the sound to complete playing. This is not the case, so your loop is probably playing all 3 instances at nearly the same exact time.
One solution is to delay the playing of each instance, by passing a time parameter to the start() method:
var numSecondsInBeep = 3;
for (var j = 0; j < 3; j++) {
    var beeper = context.createBufferSource(); // Declare a New Sound
    beeper.buffer = beepBuffer; // Attach our Audio Data as its Buffer
    beeper.connect(context.destination); // Link the Sound to the Output
    console.log(j);
    beeper.start(context.currentTime + j * numSecondsInBeep);
}
See here for more info on the start() API.
