I come here hoping that you lovely folks here on SO can help me out with a bit of a problem that I'm having.
Specifically, every time I attempt to use the decodeAudioData method of a webkitAudioContext, it always triggers the error handler with a null error. This is the code that I'm currently using:
var sounds = {};
var context = new webkitAudioContext();

function loadSound(soundName) {
    var request = new XMLHttpRequest();
    request.open('GET', soundName);
    request.responseType = 'arraybuffer';
    request.onload = function() {
        context.decodeAudioData(this.response, function(buf) {
            sounds[soundName] = buf;
        }, function(err) {
            console.log("err(decodeAudioData): " + err);
        });
    };
    request.send();
}
At this point, it constantly logs error messages to the console reading err(decodeAudioData): null (the wording is simply how I chose to log it).
In any case, any idea why this might be going on?
I'm using Chrome Canary, v20.0.1121.0, to try and get something working. But, obviously, it's not working! So, any idea what I might be able to do? If any new information is needed, let me know, and I'll update as necessary.
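For completeness, here is a slightly more defensive variant of the loader I've tried (a sketch; it assumes the same sounds map and context as above). Checking the HTTP status and the byte length first at least rules out a bad or empty response as the cause:

// Defensive loader sketch: verify the response before decoding.
function loadSoundChecked(soundName) {
    var request = new XMLHttpRequest();
    request.open('GET', soundName);
    request.responseType = 'arraybuffer';
    request.onload = function() {
        if (request.status !== 200 || !request.response ||
                request.response.byteLength === 0) {
            console.log('bad response for ' + soundName + ': ' + request.status);
            return;
        }
        context.decodeAudioData(request.response, function(buf) {
            sounds[soundName] = buf;
        }, function(err) {
            console.log('err(decodeAudioData): ' + err);
        });
    };
    request.send();
}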
The real reason is that both createBuffer and decodeAudioData currently have a bug and throw a vague DOM exception 12 for files they should normally be able to play.
That said, this is new and evolving technology, and we should be thankful for the Web Audio API even as it stands now, since it's a small miracle that it happened to us at all.
What they are missing is stream syncing on a header boundary, which any reasonable decoder of a streaming audio format should start with.
MP3 and many AAC/ADTS files are streaming file formats. Streaming means you can cut them anywhere, or insert or append anything (various tags, even album artwork); a decoder shouldn't care about unknown data. It should just seek until it finds a header it knows and can decode.
I threw together this temporary workaround, which seeks to the nearest frame-header start and passes in the data from that offset only.
MP3 and MP2 begin the header of every audio frame (roughly every 200 bytes) with the sync bits 0xFFE, and AAC (ADTS) uses the 0xFFF syncword, which exists precisely for this purpose. Both will therefore sync on 0xFFE.
Here is the code I currently use to play files that previously wouldn't play.
What I hate is that ArrayBuffer doesn't have subarray() like its typed-array children, which would return a different view at a different offset instead of the whole new copy that slice() returns. If only the Web Audio API accepted typed arrays as input; unfortunately, the only way to get an ArrayBuffer back seems to be a huge slice() copy.
Thankfully, usually only one or two seeks are needed.
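To illustrate the copy-versus-view distinction (a standalone sketch, not part of the workaround below):

var ab = new ArrayBuffer(1024);
var u8 = new Uint8Array(ab);

var view = u8.subarray(512); // a view onto the same memory: no copy
var copy = ab.slice(512);    // a brand-new 512-byte buffer: full copy

// decodeAudioData() takes an ArrayBuffer, not a typed-array view,
// so the slice() copy is unavoidable here.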
Forcing the Web Audio API not to be picky about files
node = {};
node.url = 'usual_mp3_with_tags_or_album_artwork.mp3';

function syncStream(node) { // should be done by the API itself (and hopefully will be)
    var buf8 = new Uint8Array(node.buf);
    buf8.indexOf = Array.prototype.indexOf; // typed arrays had no indexOf of their own back then
    var i = node.sync, b = buf8;
    while (1) {
        node.retry++;
        i = b.indexOf(0xFF, i);
        // note the parentheses around the mask: & binds more loosely than ==
        if (i == -1 || (b[i + 1] & 0xE0) == 0xE0) break;
        i++;
    }
    if (i != -1) {
        var tmp = node.buf.slice(i); // careful: slice() returns a copy
        node.buf = null;             // drop the old buffer so it can be collected
        node.buf = tmp;
        node.sync = i;
        return true;
    }
    return false;
}
function decode(node) {
    try {
        context.decodeAudioData(node.buf,
            function(decoded) {
                node.source = context.createBufferSource();
                node.source.connect(context.destination);
                node.source.buffer = decoded;
                node.source.noteOn(0); // noteOn() is the old name for start()
            },
            function() { // only on error, attempt to sync on a frame boundary
                if (syncStream(node)) decode(node);
            });
    } catch (e) {
        log('decode exception', e.message);
    }
}
function playSound(node) {
    node.xhr = new XMLHttpRequest();
    node.xhr.onload = function() {
        node.buf = node.xhr.response;
        node.sync = 0;
        node.retry = 0;
        decode(node);
    };
    node.xhr.open("GET", node.url, true);
    node.xhr.responseType = "arraybuffer";
    node.xhr.send();
}
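With the node object set up at the top of the snippet, playback is then just:

playSound(node); // fetch, sync to a frame boundary if needed, decode, play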
I was using webkitAudioContext with Chrome 19. Today I upgraded to Chrome 20 and hit the same problem as you.
I tried another MP3 file and it worked. The only difference between the two files was the cover art embedded in the broken one.
I removed the cover and it worked again.
My team is adapting the sipml5 library to create a html5 softphone for use in our organization. The full repository is here: https://github.com/L1kMakes/sipml5-ng . We have the code working well; audio and video calls work flawlessly. In the original code we forked from (which was from like 2012) screen sharing was accomplished with a browser plugin, but HTML 5 and WebRTC have changed to allow this to be done with just JavaScript now.
I am having difficulty adapting the code to accommodate this. I am starting with the code here on line 828: https://github.com/L1kMakes/sipml5-ng/blob/master/src/tinyMEDIA/src/tmedia_session_jsep.js This works, though without audio. That makes sense as the only possible audio stream from a screen share is the screen audio, not the mic audio. I am attempting to initialize an audio stream from getUserMedia, grab a video stream from getDisplayMedia, and present that to the client as a single mediaStream. Here's my adapted code:
if ( this.e_type == tmedia_type_e.SCREEN_SHARE ) {
// Plugin-less screen share using WebRTC requires "getDisplayMedia" instead of "getUserMedia"
// Because of this, audio constraints become limited, and we have to use async to deal with
// the promise variable for the mediastream. This is a change since Chrome 71. We are able
// to use the .then aspect of the promise to call a second mediaStream, then attach the audio
// from that to the video of our second screenshare mediaStream, enabling plugin-less screen
// sharing with audio.
let o_stream = null;
let o_streamAudio = null;
let o_streamVideo = null;
let o_streamAudioTrack = null;
let o_streamVideoTrack = null;
try {
navigator.mediaDevices.getDisplayMedia(
{
audio: false,
video: !!( this.e_type.i_id & tmedia_type_e.VIDEO.i_id ) ? o_video_constraints : false
}
).then(o_streamVideo => {
o_streamVideoTrack = o_streamVideo.getVideoTracks()[0];
navigator.mediaDevices.getUserMedia(
{
audio: o_audio_constraints,
video: false
}
).then(o_streamAudio => {
o_streamAudioTrack = o_streamAudio.getAudioTracks()[0];
o_stream = new MediaStream( [ o_streamVideoTrack , o_streamAudioTrack ] );
tmedia_session_jsep01.onGetUserMediaSuccess(o_stream, This);
});
});
} catch ( s_error ) {
tmedia_session_jsep01.onGetUserMediaError(s_error, This);
}
} else {
try {
navigator.mediaDevices.getUserMedia(
{
audio: (this.e_type == tmedia_type_e.SCREEN_SHARE) ? false : !!(this.e_type.i_id & tmedia_type_e.AUDIO.i_id) ? o_audio_constraints : false,
video: !!(this.e_type.i_id & tmedia_type_e.VIDEO.i_id) ? o_video_constraints : false // "SCREEN_SHARE" contains "VIDEO" flag -> (VIDEO & SCREEN_SHARE) = VIDEO
}
).then(o_stream => {
tmedia_session_jsep01.onGetUserMediaSuccess(o_stream, This);
});
} catch (s_error ) {
tmedia_session_jsep01.onGetUserMediaError(s_error, This);
}
}
My understanding is that o_stream should represent the resolved mediaStream tracks, not a promise, when doing a screen share. On the other end, we are using the client "MicroSIP." When making a video call, when the call is placed, I get my video preview locally in our web app; when the call is answered, the MicroSIP client gets a green square for a second, then resolves to my video. When I make a screen share call, my local web app sees the local preview of the screen share, but upon answering the call, my MicroSIP client just gets a green square and never gets the actual screen share.
The video constraints for both are the same. If I add debugging output to get more descriptive of what is actually in the media streams, they appear identical as far as I can tell. I made a test video call and a test screen share call, captured debug logs from each and held them side by side in notepad++...all appears to be identical save for the explicit debug describing the traversal down the permission request tree with "GetUserMedia" and "GetDisplayMedia." I can't really post the debug logs here as cleaning them up of information from my organization would leave them pretty barren. Save for the extra debug output on the "getDisplayMedia" call before "getUserMedia", timestamps, and uniqueID's related to individual calls, the log files are identical.
I am wondering if the media streams are not resolving from their promises before the "then" is completed, but asynchronous JavaScript and promises are still a bit over my head. I do not believe I should convert this function to async, but I have nothing else to debug here; the mediaStream is working, as I can see it locally, but I'm stumped on figuring out what is going on with the remote send.
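(For reference, here is an async/await rendering of the same flow, just a sketch reusing the names above. One thing it highlights: a synchronous try/catch around a .then() chain, as in my code, cannot catch rejections raised inside the chain, whereas with await it can.)

// Sketch: grab the screen video first, then a mic-audio stream, merge them.
async function getScreenShareStream(o_video_constraints, o_audio_constraints) {
    const o_streamVideo = await navigator.mediaDevices.getDisplayMedia({
        audio: false,
        video: o_video_constraints
    });
    const o_streamAudio = await navigator.mediaDevices.getUserMedia({
        audio: o_audio_constraints,
        video: false
    });
    return new MediaStream([
        o_streamVideo.getVideoTracks()[0],
        o_streamAudio.getAudioTracks()[0]
    ]);
}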
The solution was... nothing; the code was fine. It turns out the recipient SIP client we were using had an issue where it simply aborts if it receives video larger than 640x480.
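If you need to accommodate such a client rather than replace it, capping the capture resolution in the video constraints should work. A sketch (the constraint object name is hypothetical):

// Hypothetical constraints object: keep the screen capture at or
// below 640x480 for picky recipients.
const o_video_constraints = {
    width:  { max: 640 },
    height: { max: 480 }
};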
I have a JavaScript audio player with skip forward/back 10 second buttons. I do this by setting the currentTime of my audio element:
function Player(skipTime)
{
this.skipTime = skipTime;
this.waitLoad = false;
// initialise main narration audio
this.narration = new Audio(getFileName(dynamicNarration));
this.narration.preload = "auto";
this.narration.addEventListener('canplaythrough', () => { this.loaded(); });
this.narration.addEventListener('timeupdate', () => { this.seek(); });
this.narration.addEventListener('ended', () => { this.ended(); });
this.narration.addEventListener('waiting', () => { this.audioWaiting(); });
this.narration.addEventListener('playing', () => { this.loaded(); });
}
Player.prototype = {
rew: function rew()
{
if (!this.waitLoad) {
this.skip(-this.skipTime);
}
},
ffw: function ffw()
{
if (!this.waitLoad) {
this.skip(this.skipTime);
}
},
skip: function skip(amount)
{
const curTime = this.narration.currentTime;
const newTime = curTime + amount;
console.log(`Changing currentTime (${curTime}) to ${newTime}`);
this.narration.currentTime = newTime;
console.log(`Result: currentTime = ${this.narration.currentTime}`);
},
loaded: function loaded()
{
if (this.waitLoad) {
this.waitLoad = false;
playButton.removeClass('loading');
}
},
audioWaiting: function audioWaiting()
{
if (!this.waitLoad) {
this.waitLoad = true;
playButton.addClass('loading');
}
},
}
(I'm including here some of the event listeners I'm attaching because previously I'd debugged a similar problem as being down to conflicts in event listeners. Having thoroughly debugged event listeners this time though, I don't think that's the root of the problem.)
Though this all works fine on my local copy, when I test an online version I get the following results:
Chrome: resets play position to 0. Final console line reads Result: currentTime = 0.
Safari: doesn't change play position at all. Final console.log line gives a value for currentTime equal to newTime (even though the play position actually doesn't change).
Firefox: skipping forward works; skipping backwards interrupts the audio for a few seconds, then it starts playing again from a couple of seconds before where the playhead had been. In both cases, final console.log line gives a value for currentTime equal to newTime
The issue must have something to do with the way the audio is loaded. I tried adding another console log line to show the start and end values of buffered.
In Chrome, the buffered range extends to about 2 seconds past the current play position; in Safari it extends to ~170 seconds; and in Firefox it seems to buffer the full audio length.
In each case, however, the buffered range starts at 0.
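(For anyone reproducing this, the buffered ranges can be dumped with something like the following sketch:)

// Log every buffered time range of the narration element.
const b = this.narration.buffered;
for (let i = 0; i < b.length; i++) {
    console.log(`buffered[${i}]: ${b.start(i)} to ${b.end(i)}`);
}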
Does anyone have any idea what might be going wrong?
There are some requirements for properly loading an audio file and using its properties.
The response serving the file needs to have the following headers:
accept-ranges: bytes
Content-Length: BYTE_LENGTH_OF_YOUR_FILE
Content-Range: bytes 0-BYTE_LENGTH_OF_YOUR_FILE/BYTE_LENGTH_OF_YOUR_FILE
content-type: audio/mp3
My colleagues and I struggled with this for a few days, and this is what finally worked.
[Screenshot: response headers for an audio file]
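If you control the server, a minimal sketch of serving the files with range support (here with Express, purely as an assumed setup) could look like this:

// Express's static middleware answers Range requests with 206 partial
// responses and headers like the above; acceptRanges is on by default.
const express = require('express');
const app = express();

app.use('/audio', express.static('audio', { acceptRanges: true }));
app.listen(8080);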
If the browser has not loaded the audio, the audio cannot be seeked. Before loading, the browser knows nothing about your audio file and therefore tries to play it from the start; for all it knows, the audio could be only one second long or even shorter.
Solution
You have to wait for the loadedmetadata event; after it fires, you can play your audio from any time position, because the browser then knows all the relevant information about your audio file.
Change your code as follows:
function Player(skipTime)
{
this.skipTime = skipTime;
// initialise main narration audio
this.narration = new Audio(getFileName(dynamicNarration));
this.narration.preload = "auto";
this.narration.addEventListener('canplaythrough', () => { this.loaded(); });
this.narration.addEventListener('timeupdate', () => { this.seek(); });
this.narration.addEventListener('ended', () => { this.ended(); });
this.narration.addEventListener('waiting', () => { this.audioWaiting(); });
this.narration.addEventListener('playing', () => { this.loaded(); });
this.narration.addEventListener('loadedmetadata', () => {playButton.removeClass('loading');});
playButton.addClass('loading');
}
Player.prototype =
{
rew: function()
{
this.skip(-this.skipTime);
},
ffw: function()
{
this.skip(this.skipTime);
},
skip: function(amount)
{
var curTime = this.narration.currentTime;
var newTime = curTime + amount;
console.log(`Changing currentTime (${curTime}) to ${newTime}`);
this.narration.currentTime = newTime;
console.log(`Result: currentTime = ${this.narration.currentTime}`);
}
};
But if you do not want to wait long for the audio to load, you have one more option: convert all your audio files to data URL format, which looks like this:
var data = "data:audio/mp3;base64,...
But in this case you have to wait for the page to load even longer than for one audio file, whereas waiting for loadedmetadata involves only the metadata and is faster.
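(A sketch of producing such a data URL at runtime, assuming a fetch-capable browser and a placeholder narration.mp3:)

// Fetch the file and convert it to a base64 data: URL via FileReader.
fetch('narration.mp3')
    .then(res => res.blob())
    .then(blob => new Promise((resolve, reject) => {
        const reader = new FileReader();
        reader.onload = () => resolve(reader.result); // e.g. "data:audio/mpeg;base64,..."
        reader.onerror = reject;
        reader.readAsDataURL(blob);
    }))
    .then(dataUrl => {
        const audio = new Audio(dataUrl); // seekable immediately, no range requests
    });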
This solved my issue...
private refreshSrc() {
const src = this.media.src;
this.media.src = '';
this.media.src = src;
}
I found a solution to my problem, if not exactly an explanation.
My hosting provider uses a CDN, for which it must replace resources' URLs with those of a different domain. The URLs of my audio resources are dynamically constructed by JS, because there's a random element to them; as such, the deployment process that replaces URLs wasn't catching those of my audio files. To get around this, I manually excluded the audio files from the CDN, meaning I could refer to them using relative file paths.
This was how things stood when I was having this issue.
Then, due to a separate issue, I took a different approach: I got the audio files back on the CDN and wrote a function to extract the domain name I needed to use to retrieve the files. When I did that, suddenly I found that all my problems to do with setting currentTime had disappeared. Somehow, not having the files on the CDN was severely interfering with the browser's ability to load them in an orderly manner.
If anyone can volunteer an explanation for why this might have been, I'd be very curious to hear it...
Edit
I've been working on another project which involves streaming audio, this time also with PWA support, so I had to implement a caching mechanism in my service worker for audio files. Through this guide I learned all about the pitfalls of range requests, and understand now that failing to serve correct responses to requests with range headers will break seeking on some browsers.
It seems that in the above case, when I excluded my files from the CDN they were served from somewhere that didn't support range headers. When I moved them back on the CDN this was fixed, as it must have been built with explicit support for streaming media.
Here is a good explanation of correct responses to range requests. But for anyone having this issue while using a third-party hosting service, it suffices to know that they probably do not support range headers for streaming media. If you want to verify this is the case, you can query the audio object's duration: at least in Safari's case, the duration is set to Infinity when the browser can't successfully make a range request, and at that point seeking is disabled.
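(A sketch of that check; 'narration.mp3' is a placeholder URL:)

// Probe whether range requests are being honoured (Safari behaviour
// described above).
const probe = new Audio('narration.mp3');
probe.addEventListener('loadedmetadata', () => {
    if (probe.duration === Infinity) {
        console.warn('Range requests likely unsupported; seeking will be disabled.');
    }
});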
I have an array of about 60 audio files, (~3min each). I loop through this array - for each item I create an OfflineAudioContext and then do some filtering and processing like so:
var request = new XMLHttpRequest();
request.open('GET', audioFile.source, true);
request.responseType = "arraybuffer";
request.onload = function() {
    context.decodeAudioData(request.response, function(buffer) {
        audioFileBuffer = buffer;
        offlineContext = new OfflineAudioContext(1, buffer.length, buffer.sampleRate);
        // do some processing
        // do some checks
    });
};
request.send();
Even without any processing or 'checks', this will cause the browser to crash at around the 30-file mark. I've tried going through the array slowly (button clicks for each item) but the browser still crashes around this threshold.
After the processing & checks are complete the offlineContext and anything used to create it are no longer needed - is this still taking up memory somewhere and causing the browser to crash?
EDIT: I changed the code to test more specific areas, and it appears that OfflineAudioContext crashes only Chrome. The following test completes all 1000 runs in Opera, Firefox & Safari, but crashes at ~170 in Chrome.
for (var i = 0; i < 1000; i++) {
var off = new webkitOfflineAudioContext(1, 1764000, 44100);
console.log(i);
}
Chrome gives the error: "Uncaught NotSupportedError: Failed to construct 'OfflineAudioContext': OfflineAudioContext(1, 1764000, 44100)" and then will crash if the page is refreshed
Without seeing all the surrounding code, I can't tell. Can you remove the decodeAudioData calls and just create 30+ OfflineAudioContexts of the given lengths and see if it has the same issues? (i.e. don't load the buffers).
OfflineAudioContext was crashing my page when I accidentally created one that was 24 times too long.
I think if you create too many, or create one that is too long, it will immediately crash because it doesn't have enough memory.
Try garbage collecting, or perhaps doing them one at a time in sequence.
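A sketch of the one-at-a-time approach, in browsers where decodeAudioData and startRendering return promises (it assumes the audioFiles array and context from the question):

// Render the files strictly in sequence, so each OfflineAudioContext can
// be garbage-collected before the next one is created.
function processSequentially(files, i) {
    if (i >= files.length) return;
    fetch(files[i].source)
        .then(res => res.arrayBuffer())
        .then(data => context.decodeAudioData(data))
        .then(buffer => {
            const off = new OfflineAudioContext(1, buffer.length, buffer.sampleRate);
            const src = off.createBufferSource();
            src.buffer = buffer;
            src.connect(off.destination);
            src.start(0);
            return off.startRendering();
        })
        .then(rendered => {
            // ...do the processing and checks on `rendered` here...
            processSequentially(files, i + 1); // only now create the next context
        });
}

processSequentially(audioFiles, 0);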
We have a set of HTML blocks -- say around 50 of them -- which are iteratively parsed and have Audio objects dynamically added:
var SomeAudioWrapper = function(name) {
this.internal_player = new Audio();
this.internal_player.src = this.determineSrcFromName(name);
// ultimately an MP3
this.play = function() {
if (someOtherConditionsAreMet()) {
this.internal_player.play();
}
}
}
Suppose we generate about 40 to 80 of these on page load, but always the same set for a particular configuration. In all browsers tested, this basic strategy appears to work: the audio files load and play successfully.
In IE's 9 and 10, a transient bug surfaces. On occasion, calling .play() on the inner Audio object fails. Upon inspection, the inner Audio object has a .error.code of 4 (MEDIA_ERR_SRC_NOT_SUPPORTED). The file's .duration shows NaN.
However, this only happens occasionally, and to some random subset of the audio files. E.g., usually file_abc.mp3 plays, but sometimes it generates the error. The network monitor shows a successful download in either case. And attempting to reload the file via the console also fails; no request appears in IE's network monitor:
var a = new Audio();
a.src = "the_broken_file.mp3";
a.play(); // fails
a.error.code; // 4
Even appending a query value fails to refetch the audio or trigger any network requests:
var a = new Audio();
a.src = "the_broken_file.mp3?v=12345";
a.play(); // fails
a.error.code; // 4
However, attempting to load the broken audio file in a new tab using the same code works: the "unsupported src" plays perfectly.
Are there any resource limits we could be hitting? (Maybe the "unsupported" audio finishes downloading late?) Are there any known bugs? Workarounds?
I think we can pretty easily detect when a file fails. For other compatibility reasons we run a loop to check audio progress and completion stats to prevent progression through the app (an assessment) until the audio is complete. We could easily look for .error values -- but if we find one, what do we do about it!?
Addendum: I just found a related question (IE 9/10/11 sound file limit) that suggests there's an undocumented limit of 41 -- not sure whether that's a limit of "41 requests for audio files", "41 in-memory audio objects", or what. I have yet to find any M$ documentation on the matter -- or known solutions.
Have you seen these pages on the audio file limits within IE? These are specific to Sound.js, but the information may be applicable to your issue:
https://github.com/CreateJS/SoundJS/issues/40 ...
Possible solution as mentioned in the last comment: "control the maximum number of audio tags depending on the platform and reuse these instead of recreating them"
Additional Info: http://community.createjs.com/kb/faq/soundjs-faq (see the section entitled “I load a lot of sounds, why am running into errors in Internet Explorer?”)
I have not experienced this problem in Edge or IE11. But I wrote a JavaScript file to run some tests, looping through 200 audio files and seeing what happens. What I found is that for IE9 and IE10 the limit is shared across ALL tabs. So you are not even guaranteed to be able to load 41 files if other tabs have audio open.
The app that I am working on has a custom sound manager. Our solution is to disable preloading audio for IE9 and IE10 (just load on demand) and then when the onended or onpause callback gets triggered, to run:
this.src = '';
This frees up slots in IE's audio pool. (One warning: clearing src this way may trigger a request to the page the user is currently on.) When the sound manager's play method is called again, set the src and play it.
I haven't tested this code, but I wrote something similar that works. What I think you could do for your implementation, is resolve the issue by using a solution like this:
var isIE = window.navigator.userAgent.match(/MSIE (9|10)/);
var SomeAudioWrapper = function(name) {
var src = this.determineSrcFromName(name);
this.internal_player = new Audio();
// If the browser is IE9 or IE10, remove the src when the
// audio is paused or done playing. Otherwise, set the src
// at the start.
if (isIE) {
this.internal_player.onended = function() {
this.src = '';
};
this.internal_player.onpause = this.internal_player.onended;
} else {
this.internal_player.src = src;
}
this.play = function() {
if (someOtherConditionsAreMet()) {
// If the browser is IE, set the src before playing.
if (isIE) {
this.internal_player.src = src;
}
this.internal_player.play();
}
}
}
How can I know, in XUL, whether the network is (dis)connected?
--update
Using:
function observe(aSubject, aTopic, aState) {
if (aTopic == "network:offline-status-changed") {
write("STATUS CHANGED!");
}
}
var os = Components.classes["@mozilla.org/observer-service;1"].getService(Components.interfaces.nsIObserverService);
os.addObserver(observe, "network:offline-status-changed", false);
and the preference:
pref("network.manage-offline-status", true);
it's not working. There's a bug report here, but I don't think it's related.
--
Actually, I think it's not possible to be notified: even in Firefox we're never notified, and the user needs to manually check "Work Offline" if they want the browser to know that it's offline.
--
Screenshot of my Firefox about:config filtered for the string "offline"; unfortunately, there is no "network.manage-offline-status":
You should be able to use navigator.onLine. Here is the help page
https://developer.mozilla.org/en/Online_and_offline_events
navigator.onLine is a property that maintains a true/false value (true for online, false for offline). This property is updated whenever the user switches into "Offline Mode" by selecting the corresponding menu item (File -> Work Offline in Firefox).
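(The property pairs with the online/offline events; a quick sketch:)

// React to connectivity changes instead of polling navigator.onLine.
window.addEventListener('online',  function () { console.log('back online');  });
window.addEventListener('offline', function () { console.log('gone offline'); });
console.log('currently ' + (navigator.onLine ? 'online' : 'offline'));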
Another solution (as commented by #Neil):
Components.classes["@mozilla.org/observer-service;1"]
.getService(Components.interfaces.nsIObserverService)
.addObserver(myFunction, "network:offline-status-changed", false);
The best way I found is to use the following JavaScript code, which behaves like a ping: make the test against some big websites and assume that if none of them answers, the network must be disconnected.
var ping = {
    img: null,        // set this to the URL of an image on the domain to test
    imgPreload: null,
    timer: null,
    init: function() {
        var sess = new Date();
        var nocache = sess.getTime();                // cache-buster
        var imguri = ping.img + "?time=" + nocache;
        ping.imgPreload = new Image();               // no "var" here: assigning a property
        ping.imgPreload.onload = function() {
            clearTimeout(ping.timer);
            ping.timer = null;
            alert("Domain is available");
        };
        ping.imgPreload.src = imguri;
        ping.timer = setTimeout(ping.fail_to_ping, 60000); // function ref, not an eval string
    },
    fail_to_ping: function() {
        clearTimeout(ping.timer);
        ping.timer = null;
        ping.imgPreload = null;
        alert("Ping to domain failed!");
    }
};
(from http://crynobone.com/ci/index.php/archive/view/852)
--update
But, as it's not a reliable solution (you can't rely on the image staying on the website forever), the best solution might be to develop a new XPCOM component.
Eh... as per HTML5 (read: ECMAScript 5), the on-/offline events are available.
See it here at Mozilla Hacks
Edit 20/4/2011:
I just came across an update for this answer while watching a podcast from MS MIX11:
http://channel9.msdn.com/Events/MIX/MIX11/HTM14 (around 43:36). The presenter talks about the window.navigator.onLine property, using it to detect whether the browser (and the computer) is online. He then uses the online event to do something when the connection comes back.
This method is only available in modern browsers, however, so IE 8 and below have to poll for the connection.
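(A polling fallback sketch for such browsers; navigator.onLine itself goes back much further than the events, so polling it suffices:)

// Poll navigator.onLine for browsers without online/offline events.
var wasOnline = navigator.onLine;
setInterval(function () {
    if (navigator.onLine !== wasOnline) {
        wasOnline = navigator.onLine;
        alert(wasOnline ? "online again" : "connection lost");
    }
}, 5000);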