JavaScript: Audio from local file not playing without HTML "<audio>" element present

Now this one really has me stumped, so I was hoping someone could spot where I am going wrong.
Essentially, I have a page with two elements: an HTML5 file input and a button. When the user selects a file, I respond to the change event it generates, decoding the audio and constructing a buffer to be used. I know there is the HTML5 audio tag, but this is going to be a utility that needs to be able to break the file up into manageable chunks.
I have run several tests and found that the audio I decode myself will only play after an audio element on the page has been played. I have absolutely no idea what could be causing this behavior, since I have studied several examples of playing audio online. I will include my audio engine below.
Just to clarify, everything is handled by this code.
Thank you.
"use strict";

var AudioFile = function (){
    this.length = 0;     // Number of samples
    this.duration = 0;   // Time in seconds
    this.sampleRate = 0; // Sample rate
    this.channels = 0;   // Number of channels
    this.data = [];      // Audio/waveform data
};

var audioCtx = null;

class AudioEngine {
    constructor (){
        // All of the necessary audio control variables
        if(!audioCtx){
            window.AudioContext = window.AudioContext || window.webkitAudioContext;
            audioCtx = new AudioContext();
        }
        // Will hold audio data waiting to be played
        this.buffer = null;
        // This will hold the decoded audio file upon completion
        this.decodedFile = new AudioFile();
        // Automatically create buffer upon finished decoding?
        this.autoCreateBuffer = true;
        // Specify this if you want to have a function receive
        // the decoded audio, i.e. completionCallback(decodedFile);
        this.completionCallback = null;
    }

    // This will decode an audio file
    fileCallback (event){
        console.log("file callback");
        this.buffer = null;
        var reader = new FileReader();
        var file = event.target.files[0];
        reader.onload = this.loadCallback.bind(this);
        reader.readAsArrayBuffer(file);
    }

    // Called by fileCallback after the file has been loaded
    loadCallback (file){
        console.log("load callback");
        var raw = file.target.result;
        audioCtx.decodeAudioData(raw, this.decodeCallback.bind(this));
    }

    // Called by loadCallback after the file has been decoded
    decodeCallback (data){
        console.log("decode callback");
        var audioTemp = new AudioFile();
        audioTemp.length = data.length;
        audioTemp.duration = data.duration;
        audioTemp.sampleRate = data.sampleRate;
        audioTemp.channels = data.numberOfChannels;
        var arr = [];
        for(var i = 0; i < data.numberOfChannels; i++){
            arr.push(new Float32Array(data.length));
            data.copyFromChannel(arr[i], i);
        }
        audioTemp.data = arr.slice(0);
        this.decodedFile = audioTemp;
        if(this.autoCreateBuffer){
            var buffer = audioCtx.createBuffer(audioTemp.channels, audioTemp.length, audioTemp.sampleRate);
            var samples;
            for(var c = 0; c < audioTemp.channels; c++){
                samples = buffer.getChannelData(c);
                for(var i = 0; i < audioTemp.length; i++){
                    samples[i] = this.decodedFile.data[c][i];
                }
            }
            this.buffer = buffer;
        }
        if(this.completionCallback){
            this.completionCallback(audioTemp);
        }
    }

    // Play data that is in buffer
    play(){
        if(this.buffer){
            var source = audioCtx.createBufferSource();
            var tmp = this.buffer.getChannelData(0);
            source.buffer = this.buffer;
            source.connect(audioCtx.destination);
            source.start(0);
            console.log("play");
        }
    }
}
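The channel-copy step in decodeCallback above can be isolated as a small pure helper, which makes it easy to sanity-check outside the browser. This is a minimal sketch of my own; `cloneChannels` is a name I introduced, and its argument is anything exposing the `length`/`numberOfChannels`/`copyFromChannel` surface of the AudioBuffer handed to decodeAudioData's success callback:

```javascript
// Sketch of decodeCallback's channel-copy loop as a standalone helper.
// `buffer` mimics the relevant part of a decoded AudioBuffer.
function cloneChannels(buffer) {
    var channels = [];
    for (var i = 0; i < buffer.numberOfChannels; i++) {
        channels.push(new Float32Array(buffer.length));
        buffer.copyFromChannel(channels[i], i);
    }
    return channels;
}
```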

Okay, I have figured out what seems to be the problem. This code is currently in the testing phase, and the issue appears to have something to do with having the console open in Firefox, which interferes with sound being played. It is not entirely consistent in how it behaves, but at least I know the general cause of the problem I was having.
In other words, "audio" elements have no problems whether the console is open or not, but there seems to be undefined behavior when opening/minimizing the console with JavaScript-controlled audio. It always behaves as expected when the console is closed.

Related

Automating Photoshop with JS

I had to place videos (mp4 files) into one Photoshop document. I thought it would be easier to find a solution with png/jpg and then apply it to mp4, but the fact is that Photoshop saves png/jpg and mp4 in different ways. So, even though there is an import solution, I am having difficulty exporting mp4 by code.
I have 2 arrays of mp4 files, and each mp4 from the first array needs to be overlaid on each of the second and saved as mp4. I solved the import part by loading a video into an open Photoshop file with this simple code:
function replaceContents(newFile) {
    var docRef = app.open(newFile);
    return docRef;
}

function importVideos(order_number) {
    var doc = app.activeDocument;
    var file = new File('E:/path/' + order_number + '.mp4');
    // Open a new document with the needed video
    var docTemp = replaceContents(file);
    // Copy the opened layer with the video from the new doc to my main doc
    var layer = docTemp.activeLayer.duplicate(doc.layerSets.getByName(color), ElementPlacement.PLACEATEND);
    // Close the now-unnecessary doc
    docTemp.close(SaveOptions.DONOTSAVECHANGES);
    layer.name = order_number;
    return layer;
}
Here is the code for saving the videos; in doExport() the doc should be saved as a video.
function Saving(color) {
    var array1 = app.activeDocument.layerSets.getByName('s');
    var array2 = app.activeDocument.layerSets.getByName(color);
    for (var i = 0; i < 5; i++) {
        array1.artLayers[i].visible = true;
        for (var j = 0; j < 5; j++) {
            array2.artLayers[j].visible = true;
            doExport();
            array2.artLayers[j].visible = false;
        }
        array1.artLayers[i].visible = false;
    }
}
So, a new question: how do I export a video from Photoshop via code, with the ability to specify the file name and the save path?
P.S. If you do this through Actions, you can't pass input parameters like the name of the saved file; the Action replays exactly as you recorded it.
If you know how to pass arguments to Actions, you are welcome to share!

Struggling to play back a Float32Array (Web Audio API)

I'm building a simple looper to help me come to an understanding of the Web Audio API, but I'm struggling to get a buffer source to play back the recorded audio.
The code has been simplified as much as possible, but with annotations it's still 70+ lines even omitting the CSS and HTML, so apologies for that. A version including the CSS and HTML can be found on JSFiddle:
https://jsfiddle.net/b5w9j4yk/10/
Any help would be much appreciated. Thank you :)
// Aim of the code is to record the input from the mic to a Float32Array, then pass
// that to a buffer which is linked to a buffer source, so the audio can be played back.

// Grab DOM elements
const playButton = document.getElementById('play');
const recordButton = document.getElementById('record');

// If allowed access to the microphone, run this code
const promise = navigator.mediaDevices.getUserMedia({audio: true, video: false})
    .then((stream) => {
        recordButton.addEventListener('click', () => {
            // When the record button is pressed, clear and instantiate the record buffer
            if (!recordArmed) {
                recordArmed = true;
                recordButton.classList.add('on');
                console.log('recording armed');
                recordBuffer = new Float32Array(audioCtx.sampleRate * 10);
            }
            else {
                recordArmed = false;
                recordButton.classList.remove('on');
                // After the recording has stopped, pass the recordBuffer to the source's buffer
                myArrayBuffer.copyToChannel(recordBuffer, 0);
                // Looks like the buffer has been passed
                console.log(myArrayBuffer.getChannelData(0));
            }
        });

        // This should start the playback of the source, intended to be used after the
        // audio has been recorded; I can't get it to work in this given context
        playButton.addEventListener('click', () => {
            playButton.classList.add('on');
            source.start();
        });

        // Transport variables
        let recordArmed = false;
        let playing = false;

        // This buffer will later be assigned a Float32Array / I'd like to keep this
        // intermediate buffer so the audio can be sliced and manipulated with ease later
        let recordBuffer;

        // Declare the context, input source, and a block processor to pass the input source to the recordBuffer
        const audioCtx = new AudioContext();
        const audioIn = audioCtx.createMediaStreamSource(stream);
        const processor = audioCtx.createScriptProcessor(512, 1, 1);

        // Create a source and corresponding buffer for playback, then link them
        const myArrayBuffer = audioCtx.createBuffer(1, audioCtx.sampleRate * 10, audioCtx.sampleRate);
        const source = audioCtx.createBufferSource();
        source.buffer = myArrayBuffer;

        // Audio routing
        audioIn.connect(processor);
        source.connect(audioCtx.destination);

        // When recording is armed, pass the samples of the block one at a time to the record buffer
        processor.onaudioprocess = ((audioProcessingEvent) => {
            let inputBuffer = audioProcessingEvent.inputBuffer;
            let i = 0;
            if (recordArmed) {
                for (let channel = 0; channel < inputBuffer.numberOfChannels; channel++) {
                    let inputData = inputBuffer.getChannelData(channel);
                    inputData.forEach(sample => {
                        recordBuffer.set([sample], i);
                        i++;
                    });
                }
            }
            else {
                i = 0;
            }
        });
    })
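One way to reason about the per-block copy in onaudioprocess is to factor it into a pure helper that carries a persistent write offset between blocks (note that the question's code re-declares `i = 0` on every callback, so each block overwrites the last). This is a sketch of my own; `appendBlock` is a name I introduced, not part of the question's code:

```javascript
// Sketch: copy one processing block into a preallocated record buffer and
// return the new write offset. Samples that would run past the end are dropped.
function appendBlock(recordBuffer, writeOffset, blockData) {
    var room = recordBuffer.length - writeOffset;
    var n = Math.min(room, blockData.length);
    recordBuffer.set(blockData.subarray(0, n), writeOffset);
    return writeOffset + n;
}
```

The caller would hold the returned offset in a variable that outlives the callback, the way recordArmed does.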

Play raw audio with JavaScript

I have a stream of numbers like this
-0.00015259254737998596,-0.00009155552842799158,0.00009155552842799158,0.00021362956633198035,0.0003662221137119663,0.0003967406231879635,0.00024414807580797754,0.00012207403790398877,0.00012207403790398877,0.00012207403790398877,0.0003357036042359691,0.0003357036042359691,0.00018311105685598315,0.00003051850947599719,0,-0.00012207403790398877,0.00006103701895199438,0.00027466658528397473,0.0003967406231879635,0.0003967406231879635,0.0003967406231879635,0.0003967406231879635,0.0003967406231879635,0.0003662221137119663,0.0004882961516159551,0.0004577776421399579,0.00027466658528397473,0.00003051850947599719,-0.00027466658528397473....
which supposedly represents an audio stream. I got them from here, and I've transmitted them over the web. Now I'm trying to play the actual sound with a snippet I got from here, but I'm getting Uncaught (in promise) DOMException: Unable to decode audio data.
I feel like I'm missing quite a lot; I just expected this to work like magic, and that may simply not be the case.
My code
var ws = new WebSocket("ws://....");
ws.onmessage = function (event) {
    playByteArray(event.data);
};

var context = new AudioContext();
var buf;

function playByteArray(byteArray) {
    var arrayBuffer = new ArrayBuffer(byteArray.length);
    var bufferView = new Uint8Array(arrayBuffer);
    for (var i = 0; i < byteArray.length; i++) {
        bufferView[i] = byteArray[i];
    }
    context.decodeAudioData(arrayBuffer, function (buffer) {
        buf = buffer;
        play();
    });
}

// Play the loaded file
function play() {
    // Create a source node from the buffer
    var source = context.createBufferSource();
    source.buffer = buf;
    // Connect to the final output node (the speakers)
    source.connect(context.destination);
    // Play immediately
    source.start(0);
}
And the broadcasting part
var ws = new WebSocket("ws://.....");
window.addEventListener("audioinput", function onAudioInput(evt) {
    if (ws) {
        ws.send(evt.data);
    }
}, false);

audioinput.start({
    bufferSize: 8192
});
It doesn't look like you're dealing with compatible audio data formats. The code you linked to is for playing byte arrays, in which case your audio data should be a (much longer) string of integer numbers from 0 to 255.
What you've got is a fairly short (as audio data goes) string of floating point numbers. I can't tell what audio format that's supposed to be, but it would require a different player.
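Since the data is already raw float samples, one alternative to decodeAudioData (which expects an encoded format like WAV or MP3) is to parse the text into a Float32Array and write it straight into an AudioBuffer. A sketch, with names of my own: the parsing half is plain JavaScript, the browser half is shown in comments, and the 44100 Hz sample rate is an assumption that must match whatever rate the sender actually recorded at:

```javascript
// Parse a comma-separated string of float samples into a Float32Array.
function parseSamples(text) {
    return Float32Array.from(text.split(',').map(Number));
}

// Browser-only sketch (assumes an AudioContext named `context`):
// var samples = parseSamples(event.data);
// var buffer = context.createBuffer(1, samples.length, 44100); // assumed rate
// buffer.copyToChannel(samples, 0);
// var source = context.createBufferSource();
// source.buffer = buffer;
// source.connect(context.destination);
// source.start(0);
```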

Scheduling Web Audio Api playback, multiple plays issue

I am trying to schedule the beep sound to play 3x, one second apart. However, the sound is only playing once. Any thoughts on why this might be? (It's included within a larger JavaScript function that declares context, etc.)
var beepBuffer;

var loadBeep = function() {
    var getSound = new XMLHttpRequest(); // Load the sound with XMLHttpRequest
    getSound.open("GET", "/static/music/chime.wav", true); // Path to audio file
    getSound.responseType = "arraybuffer"; // Read as binary data
    getSound.onload = function() {
        context.decodeAudioData(getSound.response, function(buffer){
            beepBuffer = buffer; // Decode the audio data and store it in a variable
        });
    };
    getSound.send(); // Send the request and load the file
};

var playBeep = function() {
    for (var j = 0; j < 3; j++) {
        var beeper = context.createBufferSource(); // Declare a new sound
        beeper.buffer = beepBuffer; // Attach our audio data as its buffer
        beeper.connect(context.destination); // Link the sound to the output
        console.log(j);
        beeper.start(j); // Play the sound
    }
};
Close - and the other answer's code will work - but it's not the synchronicity, it's that you're not adding context.currentTime to the start time. start() doesn't take an offset from "now" - it takes an absolute time. Add context.currentTime to the start param, and you should be good.
Your code assumes that beeper.start(j) is a synchronous method, i.e. that it waits for the sound to finish playing. This is not the case, so your loop is probably playing all 3 instances at nearly the same time.
One solution is to delay the playing of each instance, by passing a time parameter to the start() method:
var numSecondsInBeep = 3;
for (var j = 0; j < 3; j++) {
    var beeper = context.createBufferSource(); // Declare a new sound
    beeper.buffer = beepBuffer; // Attach our audio data as its buffer
    beeper.connect(context.destination); // Link the sound to the output
    console.log(j);
    beeper.start(context.currentTime + j * numSecondsInBeep);
}
See here for more info on the start() API.
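To make the absolute-time point concrete, the times passed to start() can be computed up front from context.currentTime. A small sketch with a name of my own (`scheduleTimes`), where `now` stands in for context.currentTime at the moment of scheduling:

```javascript
// Sketch: absolute start times for `count` plays spaced `interval` seconds
// apart, beginning at `now` (i.e. context.currentTime when scheduling).
function scheduleTimes(now, count, interval) {
    var times = [];
    for (var j = 0; j < count; j++) {
        times.push(now + j * interval);
    }
    return times;
}
```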

How do I keep playing through the audio files in an array?

I am making an audio player. It tries to play the audio files in the array using the embed tag, but playback does not continue to the next one; only one is played. I want to play the contents of the array in order.
var files = ['notibell_1.wav', 'notibell_2.wav', 'notibell_3.wav'];
var daudio = document.createElement('embed');
daudio.setAttribute('id', 'daudio');
daudio.height = "50";
daudio.width = "400";
daudio.controls = true;
daudio.autoplay = true;
daudio.type = "audio/wav";
for (var i = 0; i < files.length; i++) {
    daudio.src = files[i];
}
bottom.appendChild(daudio);
This won't work because you are updating daudio.src 3 times before it is able to play any sound. I would recommend that you use the HTML5 native audio element and have your sounds ready in at least mp3 and ogg.
If you want to play the contents of the array in order, you can use the ended event listener to play the next sound:
JavaScript
var files = ['notibell_1.wav', 'notibell_2.wav', 'notibell_3.wav'];
var i = 0;
var audio = new Audio(files[0]);
// Let's add the ended event listener,
// change the source and play it.
audio.addEventListener('ended', function(){
    if (++i < files.length) {
        audio.src = files[i];
        audio.pause();
        audio.load();
        audio.play();
    }
    else {
        audio.pause();
    }
});
I haven't tested it, but you get the idea.
You are creating one single audio element, assigning all the properties, and then setting the source in a loop, overwriting it 2 times. Your loop should contain the whole element creation and appending:
var files = ['notibell_1.wav', 'notibell_2.wav', 'notibell_3.wav'];
for (var i = 0; i < files.length; i++) {
    var daudio = document.createElement('embed');
    daudio.setAttribute('id', 'daudio');
    daudio.height = "50";
    daudio.width = "400";
    daudio.controls = true;
    daudio.autoplay = true;
    daudio.type = "audio/wav";
    daudio.src = files[i];
    bottom.appendChild(daudio);
}
Now all 3 audio files play at the same time. One easy way to avoid this is to make a playlist file; .m3u is an easy format. Make another file called playlist.m3u:
notibell_1.wav
notibell_2.wav
notibell_3.wav
Then just make one audio element, giving it the playlist as a source:
var daudio = document.createElement('embed');
daudio.setAttribute('id', 'daudio');
daudio.height = "50";
daudio.width = "400";
daudio.controls = true;
daudio.autoplay = true;
daudio.type = "audio/wav";
daudio.src = 'playlist.m3u';
bottom.appendChild(daudio);
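If the file list is built at runtime, the .m3u body itself is trivial to generate: one path per line. A minimal sketch (`buildM3U` is a name of my own, and this is the plain .m3u form without extended #EXTM3U directives):

```javascript
// Build a plain .m3u playlist string: one file path per line.
function buildM3U(files) {
    return files.join('\n') + '\n';
}
```

The resulting string would then be saved server-side (or served dynamically) as playlist.m3u.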
