The following JavaScript running in canvas should play audio fine:
var audio = new Audio('tune.wav');
audio.play();
Most of the time it does work; the WAV is 24-bit, 14100 kbps, and plays fine on several machines, but on my laptop (Win7, Firefox 22.0) I get the error:
HTTP "Content-Type" of "x-unknown/unknown" is not supported. Load of media resource file:///C:/code/sound/tune.wav failed.
I'm aware that there are other libraries for playing sound, but I want to keep this pure JavaScript, and since it works fine on other machines it might be a hardware problem.
But I am able to play other audio files fine, so I'm not sure what's going wrong here. Any ideas?
Hmm. Based on my experience with the JS Audio element, you're missing a line:
var audio = new Audio('tune.wav');
audio.load();
audio.play();
I don't think that's causing the error though. Based upon the responses to this question:
Firefox won't play .WAV files using the HTML5 <audio> tag?
and the back-and-forth in this Bugzilla thread: https://bugzilla.mozilla.org/show_bug.cgi?id=524109 (comment 7),
it looks like Firefox simply doesn't support 24-bit WAVE files. 16-bit is probably a safer option.
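If you want the page to degrade gracefully in the meantime, a minimal sketch (the 'tune-16bit.wav' file name is illustrative) is to listen for the error event and fall back to a 16-bit copy of the file:
var audio = new Audio('tune.wav');
audio.addEventListener('error', function () {
    // Firefox refused the 24-bit file; retry with a 16-bit version.
    audio = new Audio('tune-16bit.wav');
    audio.play();
});
audio.play();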
I'm implementing a WebRTC Audio chat. I have everything working, and was initially using <audio> elements to output the audio, which worked fine.
But then I wanted to implement a "Speaking indicator" feature, and decided to go with AudioContext.
It works in Safari and Firefox, but not in Chrome: I just don't get any output.
This is my code:
const audioContext = new AudioContext();
// Create an audio source node from the stream received by the
// RTCPeerConnection with peerConnection.ontrack()
const audioSourceNode = audioContext.createMediaStreamSource(stream);
// Connect the audio source to the destination
audioSourceNode.connect(audioContext.destination);
Am I missing something? Do I need to somehow use an <audio> element to get sound on Chrome?
This is an old, known Chrome bug that still hasn't been fixed.
A common workaround is to create a muted <audio> element to get the audio flowing (it can be removed afterwards). See this answer for an example.
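A minimal sketch of that workaround, assuming stream is the remote MediaStream you received in peerConnection.ontrack:
const mutedAudio = new Audio();
mutedAudio.muted = true;          // keep it silent; it only exists to make Chrome pull the audio
mutedAudio.srcObject = stream;
mutedAudio.play();

// The AudioContext routing then works as in the question.
const audioContext = new AudioContext();
const audioSourceNode = audioContext.createMediaStreamSource(stream);
audioSourceNode.connect(audioContext.destination);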
I want to make JWPlayer buffer the whole video while in the "paused" state.
I used this, according to the JWPlayer API Reference:
playerInstance.getState("paused", function() {
    playerInstance.getBuffer("100");
});
I also tried:
if (playerInstance.getState("paused")) {
    playerInstance.getBuffer("100");
}
and, most recently:
playerInstance.on("bufferChange", function(callback) {
    console.log(callback.buffer);
    console.log(playerInstance.getBuffer());
});
This last one works with small MP4 video files, but not with large ones. And of course, I'm including the jQuery library in my code.
None of these works! I'm running this in Chrome and the video files are large. I know about Chrome's issues with downloading large video files, but surely there's some workaround to get past this? I'd appreciate any help, thanks.
I'll try to answer this since I cannot comment; I had the same desire to force-load/buffer the entire video so I could create a simulated highlight reel. From this post it looks like some hacks are needed, since Chrome tries to preload as little as possible (presumably to save bandwidth). Basically, use an AJAX call to download the file to the browser, then play the downloaded copy, as sketched below.
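A minimal sketch of that idea using fetch (the file name is illustrative, and the JW Player setup line is only indicative):
fetch('large-video.mp4')
    .then(function (response) { return response.blob(); })
    .then(function (blob) {
        var blobUrl = URL.createObjectURL(blob);
        // Point the player at the fully downloaded copy, e.g.:
        // playerInstance.setup({ file: blobUrl });
    });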
I'm looking for a solution to fully preload an HTML5 video so that I can play it through and seek to different times without any risk of buffering. I've seen solutions that involve using XHR to download the video file as a 'blob' and then constructing a URL to that blob using the createObjectURL method. This is the code example from the solution mentioned above:
var r = new XMLHttpRequest();
r.onload = function() {
    myVid.src = URL.createObjectURL(r.response);
    myVid.play();
};
if (myVid.canPlayType('video/mp4;codecs="avc1.42E01E, mp4a.40.2"')) {
    r.open("GET", "slide.mp4");
}
else {
    r.open("GET", "slide.webm");
}
r.responseType = "blob";
r.send();
This works for me in Chrome and Firefox, but not in Safari when using a video hosted on a CDN. This solution does work in Safari if I use a video hosted on the same server. I found this Safari bug, although I'm not sure if the bug is still valid. There's no mention of the Safari bug on the page with the above solution. I've seen another method which essentially pauses the video and waits for it to buffer to 100%, but Chrome doesn't seem to ever fully buffer the video.
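For reference, here's roughly what that pause-and-wait approach looks like (the element id is illustrative):
var video = document.getElementById('myVideo');
video.pause();
video.addEventListener('progress', function () {
    var buffered = video.buffered;
    if (buffered.length && buffered.end(buffered.length - 1) >= video.duration) {
        console.log('Fully buffered');   // Chrome never seems to get here for large files
    }
});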
I looked into PreloadJS, which apparently supports video preloading, but I couldn't find any working examples. I also looked into html5Preloader, but again I couldn't figure out what to do once the finish event was fired.
I'm not sure if it makes any difference, but I'm using Videogular to play my video, which needs to be fed a video URL. I suppose if I use a preloader library such as PreloadJS or html5Preloader, which I'm guessing would in turn use XHR for the video, I would need access to a new blob URL in my finished handler.
Has anyone come up with a video preloading solution that works in Safari? Thanks in advance.
It turns out the problem was being caused by the content type response header on the videos coming from Amazon S3. They were set to octet-stream, which Chrome and Firefox were able to handle, but Safari threw a media error 4. Changing the content type in the Amazon S3 admin site to 'video/mp4' solved the problem for me.
More info about Safari and octet-stream here in the 'Known issues' tab: http://caniuse.com/#feat=bloburls
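If you upload the files programmatically, a minimal sketch of setting the header with the AWS SDK for JavaScript (the bucket and key names are illustrative):
var AWS = require('aws-sdk');
var fs = require('fs');

var s3 = new AWS.S3();
s3.putObject({
    Bucket: 'my-video-bucket',               // illustrative bucket name
    Key: 'slide.mp4',
    Body: fs.createReadStream('slide.mp4'),
    ContentType: 'video/mp4'                 // so Safari sees video/mp4 instead of octet-stream
}, function (err) {
    if (err) console.error(err);
});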
I'm developing a webapp that (in part) records some audio using recorder.js, and sends it to a server. I'm trying to target Firefox, so I have to use this hack to keep the audio source from cutting off:
// Hack for a Firefox bug that stops input after a few seconds
window.source = audio_context.createMediaStreamSource(stream);
source.connect(audio_context.destination);
I think that this is causing audio to be played back through the computer speakers, but I'm not sure. I'm kind of a newbie when it comes to web audio. My goal is to eliminate the audio that is being played out of the speakers.
EDIT:
Here's a link to my JS file on Github: https://github.com/miller9904/Jonathan/blob/master/js/main.js#L87
You have to connect the source to the node through which you retrieve the data you're going to record. Replace this.node with whatever variable name you have assigned to the node you use for processing:
window.source.connect(this.node);
//this.node.connect(this.context.destination);
Edit: I just checked, and connecting to the destination is not compulsory. Also make sure your node variable does not get garbage collected (which I'm assuming is happening in your case, since recording stops after a few seconds).
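A minimal sketch of that setup (the ScriptProcessorNode stands in for whatever node your recorder uses; the buffer size and names are illustrative):
window.source = audio_context.createMediaStreamSource(stream);
// Keep a global reference so the processing node isn't garbage collected mid-recording.
window.processor = audio_context.createScriptProcessor(4096, 1, 1);
window.processor.onaudioprocess = function (e) {
    var samples = e.inputBuffer.getChannelData(0);
    // hand `samples` to your recording buffer here
};
window.source.connect(window.processor);
// The source is never connected to audio_context.destination, so nothing is
// played back through the speakers.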
I have this JavaScript audio player which plays MP3 files. On FF v23.0.1 (Mac) it doesn't work (the reason for this is explained everywhere, and here).
What I don't understand is: if I point the URL directly at the MP3 file, FF shows its own player and the song plays just fine. But when I use the JavaScript Audio API:
var audio = new Audio('/my-song.mp3'); // --> HTTP "Content-Type" of "audio/mpeg" not supported
audio.autoplay = true;
it doesn't work. Can someone explain to me why this is?
Thanks
The error is (note that I've translated it to English): HTTP "Content-Type" of "audio/mpeg" is not supported.
Your Firefox build does not seem to support MP3 yet.
The player that is shown when directly browsing the .mp3 might be just some plugin handling the Content-Type, such as QuickTime, VLC, etc... That won't fly when using that file in an <audio> element, though.
See the "Media formats supported..." article for information on what codecs are supported by what version of Firefox on what platform.