Firefox Audiocontext suspended - javascript

I am trying to record audio and upload it to a server using JavaScript. I am using Recorder.js by Matt Diamond, but the generated file has a duration of 0 minutes. When I debugged it in the Firebug console, I found the AudioContext's state was suspended. From googling I learned that the AudioContext's state must be "running" for recording to work. I don't know if the issue is caused by the state or if I am missing something else; I'd like to know what causes an AudioContext to be in the suspended state. On other browsers the state is running and the file is generated correctly, but my restriction is that I must use Firefox for my application.
Firefox version: 42.0
Below is the code:
if (audioRecorder) {
    audioRecorder.clear();
    audioRecorder.record();
    setTimeout(stopRecorder, 9000); // 9 secs
}

function stopRecorder() {
    if (audioRecorder) {
        audioRecorder.stop();
        audioRecorder.exportWAV(function (blob) {
            alert("Blob size: " + blob.size);
            // code for sending the blob to the server
        });
    }
}
When I debugged the above code in Firebug, the AudioContext was suspended.
Thanks in advance

This is not a direct answer, but it solves the issue (taken from my other answer). If all you need is to send audio files to a server, then instead of using bulky uncompressed WAV files you can easily (and natively) record the audio in compressed Ogg format using the MediaRecorder API, supported in Firefox since v25 and Chrome since v47.

Created a JSFiddle and tested it a few times - it works correctly on Firefox 42 [macosx]:
https://jsfiddle.net/8unmn650/
function createDownloadLink() {
    recorder && recorder.exportWAV(function (blob) {
        var url = URL.createObjectURL(blob);
        var li = document.createElement('li');
        var au = document.createElement('audio');
        var hf = document.createElement('a');
        au.controls = true;
        au.src = url;
        hf.href = url;
        hf.download = new Date().toISOString() + '.wav';
        hf.innerHTML = hf.download;
        li.appendChild(au);
        li.appendChild(hf);
        recordingslist.appendChild(li);
    });
}
The RecorderJS demo seems to be working correctly on FF 42 [macosx].
There is an open issue on the recorder.js GitHub repo about Firefox creating 0-second WAV files: issue #139, "Sometimes it is creating wav file of 0.0 duration on firefox".

Related

Change sample rate of AudioContext (getUserMedia)

I'm trying to record at 48000 Hz via getUserMedia, but without luck. The returned audio MediaStream is at 44100 Hz. How can I set this to 48000 Hz?
Here are snippets of my code:
var startUsermedia = this.startUsermedia;
navigator.getUserMedia({
    audio: true
    //sampleRate: 48000
}, startUsermedia, function (e) {
    console.log('No live audio input: ' + e);
});
The startUsermedia function:
startUsermedia: function (stream) {
    var input = audio_context.createMediaStreamSource(stream);
    console.log('Media stream created.');
    // Uncomment if you want the audio to feed back directly
    //input.connect(audio_context.destination);
    //__log('Input connected to audio context destination.');
    recorder = new Recorder(input);
    console.log('Recorder initialised.');
},
I tried changing the sampleRate property of the AudioContext, but no luck.
How can I change the sampleRate to 48000 Hz?
EDIT : We are also now okay with a flash solution that can record and export wav files at 48000Hz
As far as I know, there is no way to change the sample rate within an audio context. The sample rate will usually be the sample rate of your recording device and will stay that way. So you will not be able to write something like this:
var input = audio_context.createMediaStreamSource(stream);
var resampler = new Resampler(44100, 48000);
input.connect(resampler);
resampler.connect(audio_context.destination);
However, if you want to take your audio stream, resample it, and then send it to the backend (or do something else with it outside of the Web Audio API), you can use an external sample-rate converter (e.g. https://github.com/taisel/XAudioJS/blob/master/resampler.js).
var resampler = new Resampler(44100, 48000, 1, 2229);

function startUsermedia(stream) {
    var input = audio_context.createMediaStreamSource(stream);
    console.log('Media stream created.');
    recorder = audio_context.createScriptProcessor(2048);
    recorder.onaudioprocess = recorderProcess;
    input.connect(recorder); // feed the mic into the script processor
    recorder.connect(audio_context.destination);
}

function recorderProcess(e) {
    var buffer = e.inputBuffer.getChannelData(0);
    var resampled = resampler.resampler(buffer);
    // --> do sth with the resampled data, for instance send it to the server
}
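To illustrate the idea behind such an external converter, here is a minimal linear-interpolation resampler. This is a simplified sketch, not the XAudioJS implementation (which uses its own buffering), and it does no low-pass filtering, so downsampling with it would alias:

```javascript
// Minimal linear-interpolation resampler: converts a Float32Array
// of samples from one rate to another. Sketch only - no filtering.
function resample(samples, fromRate, toRate) {
    var ratio = fromRate / toRate;
    var outLength = Math.round(samples.length / ratio);
    var out = new Float32Array(outLength);
    for (var i = 0; i < outLength; i++) {
        var pos = i * ratio;               // fractional position in the input
        var left = Math.floor(pos);
        var right = Math.min(left + 1, samples.length - 1);
        var frac = pos - left;
        // Interpolate between the two nearest input samples
        out[i] = samples[left] * (1 - frac) + samples[right] * frac;
    }
    return out;
}
```

This also explains the otherwise magic-looking output buffer size of 2229 above: a 2048-sample block at 44100 Hz becomes 2048 × 48000 / 44100 ≈ 2229 samples at 48000 Hz.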
It looks like there is an open bug about the inability to set the sampling rate:
https://github.com/WebAudio/web-audio-api/issues/300
There's also a Chrome issue:
https://bugs.chromium.org/p/chromium/issues/detail?id=432248
I checked the latest Chromium code and there is nothing in there that lets you set the sampling rate.
Edit: Seems like it has been implemented in Chrome, but is broken currently - see the comments in the Chromium issue.
It's been added to Chrome:
var ctx = new (window.AudioContext || window.webkitAudioContext)({ sampleRate:16000});
https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/AudioContext
audioContext = new AudioContext({ sampleRate: 48000 });
Simply set the sample rate when creating the AudioContext object. This worked for me.
NOTE: This answer is outdated.
You can't. The sample rate of the AudioContext is set by the browser/device and there is nothing you can do to change it. In fact, you will find that 44.1 kHz on your machine might be 48 kHz on mine. It varies with whatever the OS picks by default.
Also remember that not all hardware is capable of all sample rates.
You can use an OfflineAudioContext to essentially render your audio buffer to a different sample rate (but this is a batch operation).
So you would record your recording using the normal audio context, and then use an OfflineAudioContext with a different sample rate to render your buffer. There is an example on the Mozilla page.
It is now in the spec but not yet implemented in Chromium.
Also in bugs.chromium.org, "Status: Available" does not mean it is implemented. It just means that nobody is working on it and that it is available for anyone who wants to work on it. So "Available" means "Not assigned".

Cannot extract local video file duration in mobile browser

I am using a file input to capture recorded video from a user's mobile device. What I want to do is then read that file somehow and determine whether it is over a certain duration (30 seconds in this case). If it is over that duration, then the file should not be allowed to be uploaded to the server. If it is under the duration, then it is okay.
I can accurately detect the duration of the file in javascript on desktop, but not on mobile, which is what I need. This is my code:
onEndRecord = function (e) {
    var file = e.target.files[0];
    var videoElement = document.createElement('video');
    document.body.appendChild(videoElement);
    var fileURL = window.URL.createObjectURL(file);
    videoElement.addEventListener('loadeddata', function (e) {
        console.log('loadeddata', e.target.duration);
    });
    videoElement.onload = function () { // binding onload event
        console.log('onload', videoElement.duration);
    };
    videoElement.src = fileURL;
}
Anybody know how to get this information? The duration just reports as zero on mobile.
I've also tried running it through the file reader api:
readBlob = function (file) {
    console.log('readBlob', file);
    var reader = new FileReader();
    reader.onload = function (e) {
        console.log('reader load');
        var player = document.getElementById('videoReader');
        player.addEventListener('loadeddata', function (e) {
            console.log('loadeddata', e.target.duration);
            player.play();
        });
        var fileURL = window.URL.createObjectURL(file);
        player.src = fileURL;
    };
    reader.readAsDataURL(file);
}
What is happening, I believe, is that the loadedmetadata event (or the loadeddata event, as in your question) just does not fire on the mobile devices you are testing, hence the duration is not available for reading and is reported as 0. Have a look here for the events linked to the HTML5 media element spec. Note that you could use the loadstart event of the media element rather than the onload event for fine-tuning your web application.
Typically on iOS the event will fire on user interaction... not before (as with the canplay event). This is a limitation intended to reduce the bandwidth consumption of users on paid data plans for their mobile devices. This is described here for Apple; the same generally goes for Android.
Dealing with the Web Audio API you could get the duration through the buffer received from the decodeAudioData method. Here is some information on the subject.
You can read this information server side with PHP or Java, but that would not fit your design well.
So either you could have the user play back the recorded sample before uploading, to get access to the duration, or, if you know the average bitrate at which the video was recorded and the file size (File API), you could approximate the duration.
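The bitrate-based approximation is just arithmetic: size in bits divided by bits per second. A sketch (the function name is mine, purely illustrative):

```javascript
// Estimate media duration (in seconds) from file size and average
// bitrate. Only an approximation: container overhead and variable
// bitrate encoding both skew the result.
function estimateDurationSeconds(fileSizeBytes, avgBitrateKbps) {
    var bits = fileSizeBytes * 8;
    return bits / (avgBitrateKbps * 1000);
}

// e.g. a 3.75 MB file recorded at ~1000 kbps is about 30 seconds:
// estimateDurationSeconds(3750000, 1000) === 30
```

With the File API, fileSizeBytes is just `file.size` from the input's file object, so the 30-second check becomes a simple comparison before upload.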
Solved this by using FFmpeg's FFprobe. We download just a small amount of the video - about 4 KB is enough - and then read the metadata. For QuickTime files, however, the metadata is at the end of the video, so you have to swap the beginning for the end. This was done using a modified version of qt-faststart:
https://github.com/danielgtaylor/qtfaststart

web audio in firefox

I am trying to build a web app that visualises and controls the source audio. It works brilliantly in Chrome but completely breaks in Firefox; it won't even play the audio. Here is the code:
var audio = new Audio();
audio.src='track.mp3';
audio.controls = true;
audio.loop = false;
audio.autoplay = false;
window.addEventListener("load", initPlayer, false);
function initPlayer() {
    $("#player").append(audio);
    context = new AudioContext();
    analyser = context.createAnalyser();
    canvas = document.getElementById("vis");
    ctx = canvas.getContext("2d");
    source = context.createMediaElementSource(audio);
    source.connect(analyser);
    analyser.connect(context.destination);
}
The line that breaks everything is:
source = context.createMediaElementSource(audio);
After adding this line the player just hangs at 0:00 in Firefox. I have done my research and have come across CORS, but as far as I can understand this should be irrelevant, as the file is kept on the same server.
Please help
You have to serve the audio correctly with a server so that MIME types are set, so run it from localhost rather than file:///..../track.mp3
We used to have a bug in Firefox where MediaElementSourceNode did not work properly in some case. It's now fixed (I believe the fix is in Aurora and Nightly, at the time of writing).
Sorry about that.

html audio tag, duration always infinity

I've been working on using the html audio tag to play some audio files. The audio plays alright, but the duration property of the audio tag is always returning infinity.
I tried the accepted answer to this question but with the same result. Tested with Chrome, IE and Firefox.
Is this a bug with the audio tag, or am I missing something?
Some of the code I'm using to play the audio files.
JavaScript function called when the play button is pressed:
function playPlayerV2(src) {
    document.getElementById("audioplayerV2").addEventListener("loadedmetadata", function (_event) {
        console.log(player.duration);
    });
    var player = document.getElementById("audioplayerV2");
    player.src = src;
    player.load();
    player.play();
}
the audio tag in html
<audio controls="true" id="audioplayerV2" style="display: none;" preload="auto">
Note: I'm hiding the standard audio player with the intent of using a custom layout and driving the player via JavaScript; this does not seem to be related to my problem.
Try this:
var getDuration = function (url, next) {
    var _player = new Audio(url);
    _player.addEventListener("durationchange", function (e) {
        if (this.duration != Infinity) {
            var duration = this.duration;
            _player.remove();
            next(duration);
        }
    }, false);
    _player.load();
    _player.currentTime = 24 * 60 * 60; // fake big time
    _player.volume = 0;
    _player.play();
    // waiting...
};
getDuration('/path/to/audio/file', function (duration) {
    console.log(duration);
});
I think this is due to a Chrome bug. Until it's fixed:
if (video.duration === Infinity) {
    video.currentTime = 10000000;
    setTimeout(() => {
        video.currentTime = 0; // to reset the time, so it starts at the beginning
    }, 1000);
}
let duration = video.duration;
This works for me:
const audio = document.getElementById("audioplayer");
audio.addEventListener('loadedmetadata', () => {
    if (audio.duration === Infinity) {
        audio.currentTime = 1e101;
        audio.addEventListener('timeupdate', getDuration);
    }
});

function getDuration() {
    audio.currentTime = 0;
    audio.removeEventListener('timeupdate', getDuration);
    console.log(audio.duration);
}
In case you control the server and can make it send proper media headers - this is what helped the OP.
I faced this problem with files stored in Google Drive when getting them in Mobile version of Chrome. I cannot control Google Drive response and I have to somehow deal with it.
I don't have a solution that satisfies me yet, but I tried the idea from both posted answers - which basically is the same: make audio/video object to seek the real end of the resource. After Chrome finds the real end position - it gives you the duration. However the result is unsatisfying.
What this hack really does is force Chrome to load the resource into memory completely. So, if the resource is too big, or the connection is too slow, you end up waiting a long time for the file to be downloaded behind the scenes. And you have no control over that file - it is handled by Chrome, and once it decides that it is no longer needed it will dispose of it, so the bandwidth may be spent inefficiently.
So, in case you can load the file yourself - it is better to download it (e.g. as blob) and feed it to your audio/video control.
If this is a Twilio mp3, try the .wav version. The mp3 is coming across as a stream and it fools the audio players.
To use the .wav version, just change the format of the source URL from .mp3 to .wav (or leave it off; wav is the default).
Note - the wav file is 4x larger, so that's the downside to switching.
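The size difference follows directly from WAV being uncompressed PCM, so it can be estimated up front. A quick sanity-check helper (purely illustrative, not part of any Twilio API):

```javascript
// Size in bytes of an uncompressed PCM WAV payload (header excluded):
// sampleRate * (bitsPerSample / 8) * channels * seconds.
function wavPayloadBytes(sampleRate, bitsPerSample, channels, seconds) {
    return sampleRate * (bitsPerSample / 8) * channels * seconds;
}

// One minute of 8 kHz, 16-bit mono (typical telephony audio) is
// wavPayloadBytes(8000, 16, 1, 60) === 960000 bytes, roughly 4x the
// size of the same audio as a compressed mp3 - hence the tradeoff.
```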
Not a direct answer, but in case anyone using blobs came here: I managed to fix it using a package called webm-duration-fix.
import fixWebmDuration from "webm-duration-fix";
...
fixedBlob = await fixWebmDuration(blob);
...
// If you want to modify the video file completely, you can use the "webmFixDuration" package. Other methods apply at the display level only, on the video tag; with this method, the complete video file is modified.
webmFixDuration github example
mediaRecorder.onstop = async () => {
    const duration = Date.now() - startTime;
    const buggyBlob = new Blob(mediaParts, { type: 'video/webm' });
    const fixedBlob = await webmFixDuration(buggyBlob, duration);
    displayResult(fixedBlob);
};

Unable to play sound in Google Chrome using a MediaStreamAudioSourceNode

I've been toying with WebRTC but I'm completely unable to play a simple audio stream after properly granting rights to the browser to use the input device.
I just try to connect the input device to the context destination, but it doesn't work.
This snippet isn't working and I think it should:
function success(stream) {
    var audioContext = new webkitAudioContext();
    var mediaStreamSource = audioContext.createMediaStreamSource(stream);
    mediaStreamSource.connect(audioContext.destination);
}

navigator.webkitGetUserMedia({ audio: true, video: false }, success);
This doesn't seem to capture any sound from my working microphone, but if I use a simple <audio> tag and create a blob URL, the code suddenly starts working.
function success(stream) {
    audio = document.querySelector('audio');
    audio.src = window.URL.createObjectURL(stream);
    audio.play();
}

navigator.webkitGetUserMedia({ audio: true, video: false }, success);
Also, not a single of these demos seems to be working for me: http://webaudiodemos.appspot.com/.
Fiddle for the first snippet: http://jsfiddle.net/AvMtt/
Fiddle for the second snippet: http://jsfiddle.net/vxeDg/
Using Chrome 28.0.1500.71 beta-m on Windows 7x64.
I have a single input device, and two output devices (speakers, headsets). Every device is using the same sample rate.
This question is almost 6 years old, but for anyone who stumbles across it, the modern version of this looks something like:
function success(stream) {
    let audioContext = new AudioContext();
    let mediaStreamSource = audioContext.createMediaStreamSource(stream);
    mediaStreamSource.connect(audioContext.destination);
}

navigator.mediaDevices.getUserMedia({ audio: true, video: false })
    .then(success)
    .catch((e) => {
        console.dir(e);
    });
And it appears to work, based on https://jsfiddle.net/jmcker/g3j1yo85
