I'm developing a Firefox extension, and I'd like it to play a notification sound when an event occurs. However, despite following the instructions from "Play audio from firefox extension's data directory", the sound doesn't play. The sound file is stored in my data directory. Here is my code:
var data = require("sdk/self").data; // needed for data.url() below
var pageworker = require("sdk/page-worker");
let { setTimeout } = require("sdk/timers"); // setTimeout is not enabled by default in SDK add-ons

function playSound(sound) {
    console.log("playing sound: " + sound);
    var soundPlayer = pageworker.Page({
        contentURL: data.url("blank.html"),
        // This is where it should play, but it doesn't, even if I remove the console.log.
        // The script does work if I type it into the JavaScript console, but with the
        // file name replaced by a file:/// URL.
        contentScript: "new Audio('notification-1.mp3').play();console.log('the audio did run')",
    });
    setTimeout(function() { soundPlayer.destroy(); }, 5000); // destroy the sound player after 5 seconds
}
Although both console.logs are called, no audio ever plays. This happens both in the built XPI and when running with cfx.
As noted in the comments, use an absolute URL in the contentScript string (note the inner quotes, so the generated script contains a quoted string literal):
"new Audio('" + data.url("notification-1.mp3") + "').play();"
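Put together, a minimal sketch of that fix might look like this. The helper function and its name are mine, not part of the SDK; the key point is that the mp3 reference embedded in the content script must be an absolute resource: URL obtained from data.url().

```javascript
// Build the one-line content script that plays a sound inside the hidden
// page worker. The URL must already be absolute (e.g. from data.url()).
function buildPlayScript(absoluteUrl) {
    return "new Audio('" + absoluteUrl + "').play();";
}

// Inside the add-on (Add-on SDK requires, shown for context):
// var data = require("sdk/self").data;
// var pageworker = require("sdk/page-worker");
// var player = pageworker.Page({
//     contentURL: data.url("blank.html"),
//     contentScript: buildPlayScript(data.url("notification-1.mp3"))
// });
```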
Related
https://github.com/BuildFire/sdk/wiki/BuildFire-Audio-Player-Service
In index.html:
function Player() {} //make it global
Player.instance = buildfire.services.media.audioPlayer;
In the triggering function inside player.js:
let track = {};
track.title = "Given Title";
track.url = <insert url here>;
track.image = <insert img.jpg url here>;
Player.instance.play(track);
When initializing the Media Player UI in player.js:
Player.instance.onEvent(function (e) {
    console.log(e.event);
    // other code goes here
});
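To narrow down which events actually arrive, one option is to route them through a small dispatcher and log anything unhandled. This is a generic sketch of my own, not a BuildFire API; the event names are the ones used in this question.

```javascript
// Generic dispatcher sketch (not a BuildFire API): route each incoming
// player event to a named handler, and report whether it was handled so
// missing events like "timeUpdate" are easy to spot in the log.
function routePlayerEvent(e, handlers) {
    var handler = handlers[e.event];
    if (handler) {
        handler(e.data);
        return true;
    }
    return false; // e.g. "timeUpdate" having no handler, or never firing
}

// Hooked up inside onEvent:
// Player.instance.onEvent(function (e) {
//     if (!routePlayerEvent(e, myHandlers)) {
//         console.log("unhandled event: " + e.event);
//     }
// });
```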
Right now, on the offline tester, my media player works perfectly, updating the time and applying the changes when I pause or play. Unfortunately, when I move it to the online tester and the actual app, neither my phone nor my client's phone receives the "timeUpdate" event. To confirm: with the code above, I DO get the "play" and "pause" events, but not the "timeUpdate" event.
Can anyone confirm if this happens for them too, or if there are any fixes?
I think the problem you're having is not with the code, but with the server where the audio is hosted. Try hosting the audio on an S3 or other cloud-storage bucket instead of bundling it into the app. (Sorry, can't comment yet.)
I am attempting to play an audio blob within Safari, and it plays for only a fraction of a second; I never hear any audio. The media element fires a "paused" event a tiny fraction of a second into playback (for example, 0.038s).
The blob is recorded in Chrome. Playback works just fine in Chrome and Firefox.
Also, the duration Safari reports for the media is much shorter than it should be. For example, one recording is 7.739 seconds: Chrome reports the correct duration, but Safari shows 1.584. Another had a duration of 9.96, but Safari reported 6.552.
I have made sure this is not an issue with Safari preventing playback that isn't initiated by the user, so playback starts on a tap. I have also tried different MIME types: mpeg, and webm with the h264 and vp8 codecs.
I have made sure that the download blob is the same size in safari as it is on chrome.
I have looked through a number of similar posts, including Loading audio via a Blob URL fails in Safari (see the answer by #lastmjs), where a demo is provided. The demo does work, and I am doing more or less what is shown, so I suspect the problem is on the recording side.
Recorder:
self.mediaRecorder = new MediaRecorder(stream,{'audio' : {'sampleRate' : 22000}});
...assemble the chunks...
self.audioBlob = new Blob(self.audioChunks, {type: 'audio/webm; codecs=vp8'});
...upload the blob to cloud (S3)...
Player:
...in the success handler that downloads blob...
self.audioBlob = new Blob([data],{type: 'audio/webm'});
...I later prepare the element for playback...
let audioUrl = window.URL.createObjectURL(self.audioBlob);
let audioElement = document.createElement('audio');
let sourceElement = document.createElement('source');
audioElement.muted = true;
audioElement.appendChild(sourceElement);
sourceElement.src = audioUrl;
sourceElement.type = 'audio/webm';
document.body.appendChild(audioElement);
audioElement.load()
... when the user taps on a button...
self.audioElement.muted = false;
let playPromise = self.audioElement.play();
playPromise.then(()=>{
console.log("playing should have started: " + self.audioElement.muted + " - " + self.audioElement.paused);
});
...shortly after this - the paused event handler gets fired.
There are no error messages. I have tried this in Safari on Mac and on iOS; no errors. I also listen for the error event on the media element, and nothing fires. It just doesn't play for very long. I am clearly missing something. Again, capture and playback work great in Chrome, and playback works in Firefox, but playback in Safari just won't work. What should I try?
For everyone having the same problem: try changing 'audio/webm' to 'audio/wav'.
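One way to apply that suggestion without hard-coding a single type is to ask the browser which MIME string it claims to play and fall back to 'audio/wav'. This is only a sketch of the idea; canPlayType is passed in as a function so the selection logic itself has no browser dependency.

```javascript
// Return the first MIME type the supplied canPlayType function reports as
// playable ('maybe'/'probably'); fall back to 'audio/wav' as suggested above.
function pickAudioType(canPlayType, candidates) {
    for (var i = 0; i < candidates.length; i++) {
        if (canPlayType(candidates[i]) !== '') {
            return candidates[i];
        }
    }
    return 'audio/wav';
}

// In a page:
// var el = document.createElement('audio');
// var type = pickAudioType(el.canPlayType.bind(el), ['audio/webm', 'audio/wav']);
// self.audioBlob = new Blob([data], {type: type});
```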
We have a set of HTML blocks -- say around 50 of them -- which are iteratively parsed and have Audio objects dynamically added:
var SomeAudioWrapper = function(name) {
    this.internal_player = new Audio();
    this.internal_player.src = this.determineSrcFromName(name); // ultimately an MP3

    this.play = function() {
        if (someOtherConditionsAreMet()) {
            this.internal_player.play();
        }
    };
};
Suppose we generate about 40 to 80 of these on page load, but always the same set for a particular configuration. In all browsers tested, this basic strategy appears to work: the audio files load and play successfully.
In IE 9 and 10, a transient bug surfaces. On occasion, calling .play() on the inner Audio object fails. Upon inspection, the inner Audio object has an .error.code of 4 (MEDIA_ERR_SRC_NOT_SUPPORTED), and the file's .duration shows NaN.
However, this only happens occasionally, and to a seemingly random subset of the audio files. E.g., usually file_abc.mp3 plays, but sometimes it generates the error. The network monitor shows a successful download in either case. Attempting to reload the file via the console also fails, and no request appears in IE's network monitor:
var a = new Audio();
a.src = "the_broken_file.mp3";
a.play(); // fails
a.error.code; // 4
Even appending a query value fails to refetch the audio or trigger any network requests:
var a = new Audio();
a.src = "the_broken_file.mp3?v=12345";
a.play(); // fails
a.error.code; // 4
However, attempting to load the broken audio file in a new tab using the same code works: the "unsupported" src plays perfectly.
Are there any resource limits we could be hitting? (Maybe the "unsupported" audio finishes downloading late?) Are there any known bugs? Workarounds?
I think we can pretty easily detect when a file fails. For other compatibility reasons we run a loop to check audio progress and completion stats to prevent progression through the app (an assessment) until the audio is complete. We could easily look for .error values -- but if we find one, what do we do about it!?
Addendum: I just found a related question (IE 9/10/11 sound file limit) that suggests there's an undocumented limit of 41 -- not sure whether that's a limit of 41 requests for audio files, 41 in-memory Audio objects, or something else. I have yet to find any Microsoft documentation on the matter, or known solutions.
Have you seen these pages on the audio file limits within IE? These are specific to Sound.js, but the information may be applicable to your issue:
https://github.com/CreateJS/SoundJS/issues/40 ...
Possible solution as mentioned in the last comment: "control the maximum number of audio tags depending on the platform and reuse these instead of recreating them"
Additional Info: http://community.createjs.com/kb/faq/soundjs-faq (see the section entitled “I load a lot of sounds, why am running into errors in Internet Explorer?”)
I have not experienced this problem in Edge or IE11. But I wrote a JavaScript file to run some tests by looping through 200 audio files and seeing what happens. What I found is that for IE9 and IE10 the limit is shared across ALL tabs. So you are not even guaranteed to be able to load 41 files if other tabs have audio open.
The app that I am working on has a custom sound manager. Our solution is to disable preloading audio for IE9 and IE10 (just load on demand) and then when the onended or onpause callback gets triggered, to run:
this.src = '';
This frees up IE's count of held audio objects. One caveat: setting src to an empty string may trigger a request to the current page URL. When the sound manager's play method is called again, set the src and play it.
I haven't tested this code, but I wrote something similar that works. What I think you could do for your implementation, is resolve the issue by using a solution like this:
var isIE = window.navigator.userAgent.match(/MSIE (9|10)/);

var SomeAudioWrapper = function(name) {
    var src = this.determineSrcFromName(name);
    this.internal_player = new Audio();

    // If the browser is IE9 or IE10, remove the src when the
    // audio is paused or done playing. Otherwise, set the src
    // at the start.
    if (isIE) {
        this.internal_player.onended = function() {
            this.src = '';
        };
        this.internal_player.onpause = this.internal_player.onended;
    } else {
        this.internal_player.src = src;
    }

    this.play = function() {
        if (someOtherConditionsAreMet()) {
            // If the browser is IE, set the src before playing.
            if (isIE) {
                this.internal_player.src = src;
            }
            this.internal_player.play();
        }
    };
};
In my small HTML5 web-app, I want to play sounds in response to user actions. When the user clicks a button, in the onclick handler I play a sound like this:
var url = "assets/sounds/buzz" + (this.canPlayMP3 ? ".mp3" : ".ogg");
var sound = new Audio(url);
sound.load();
sound.play();
This works great on Firefox. Unfortunately, on an iPad (iPad 2 running iOS 5.1.1), I get a 2-second delay before the sound is played. This happens every time I play the sound sample, not just the first time.
The MP3 file is 9KB long. The iPad is connected to the network using exactly the same Wifi connection as the computer running Firefox.
How can I figure out what's going on?
You might want to create a single instance of the audio element for each sound:
var Sounds = {
cat: new Audio('/sounds/meow.ogg'),
bird: new Audio('/sounds/tweet.ogg')
};
Then you can play the same element over and over again:
function playSound(name) {
Sounds[name].currentTime = 0;
Sounds[name].play();
}
playSound('cat');
If iOS destroys your Audio objects, you could cache sound files in the cache manifest:
CACHE MANIFEST
# 2012-08-09:v1.3
NETWORK:
*
CACHE:
/sounds/meow.ogg
/sounds/tweet.ogg
How about moving the loading outside the handler, i.e. making the Audio object global/preloaded? Then, inside the handler, only call the play method.
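A sketch of that idea, with the Audio constructor injected as a factory so the registry logic itself is plain JavaScript. The names and file paths here are illustrative, not from the question.

```javascript
// Build a registry of preloaded sounds once, at page load. audioFactory is
// expected to be something like: function (url) { return new Audio(url); }
function createSoundBank(audioFactory, urls) {
    var bank = {};
    Object.keys(urls).forEach(function (name) {
        bank[name] = audioFactory(urls[name]); // created once, up front
    });
    return {
        play: function (name) {
            var sound = bank[name];
            sound.currentTime = 0; // rewind so repeated clicks replay from the start
            sound.play();
            return sound;
        }
    };
}

// At page load:
// var sounds = createSoundBank(function (u) { return new Audio(u); },
//                              { buzz: 'assets/sounds/buzz.mp3' });
// In the click handler, only: sounds.play('buzz');
```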
I have searched for a lot of getUserMedia demos and examples, but most only capture the camera, not the microphone. So I downloaded some examples and tried them on my own computer. Camera capture works, but when I changed
navigator.webkitGetUserMedia({video : true},gotStream);
to
navigator.webkitGetUserMedia({audio : true},gotStream);
the browser asks me to allow microphone access first, and then it fails at
document.getElementById("audio").src = window.webkitURL.createObjectURL(stream);
The message is :
GET blob:http%3A//localhost/a5077b7e-097a-4281-b444-8c1d3e327eb4 404 (Not Found)
This is my code: getUserMedia_simple_audio_test
Did I do something wrong? Or does getUserMedia only work for the camera right now?
It is currently not available in Google Chrome. See Issue 112367.
You can see in the demo that it will always throw an error saying:
GET blob:http%3A//whatever.it.is/b0058260-9579-419b-b409-18024ef7c6da 404 (Not Found)
And you can't listen to the microphone either with
{
video: true,
audio: true
}
It is currently supported in Chrome Canary. You need to type about:flags into the address bar then enable Web Audio Input.
The following code connects the audio input to the speakers. WATCH OUT FOR THE FEEDBACK!
<script>
    // this is to store a reference to the input so we can kill it later
    var liveSource;

    // creates an audio context and hooks up the audio input
    function connectAudioInToSpeakers() {
        var context = new webkitAudioContext();
        navigator.webkitGetUserMedia({audio: true}, function(stream) {
            console.log("Connected live audio input");
            liveSource = context.createMediaStreamSource(stream);
            liveSource.connect(context.destination);
        });
    }

    // disconnects the audio input
    function makeItStop() {
        console.log("killing audio!");
        liveSource.disconnect();
    }

    // run this when the page loads
    connectAudioInToSpeakers();
</script>
<input type="button" value="please make it stop!" onclick="makeItStop()"/>
Demo: http://jsfiddle.net/2mLtM/
It's working; you just need to add the toString parameter after audio : true.
Check this article: link