Here is my script:
<script>
var bgMusic = new Audio("../song.mp3");
var bgMusic2 = new Audio("");
</script>
IE10 on Windows Server 2008 R2 throws this error: Not implemented.
I tried IE10 on a normal desktop (Windows 8) and it does not throw this error.
Please help, thanks!
If sound support has been disabled (the default on Windows Server), then IE throws an error when trying to access the HTML5 Audio object.
You could use a try/catch block to test for audio support like this:
// Test whether the Audio object is available and which file types it supports:
var snd = null;
var mp3support = false;
var oggsupport = false;
try {
    snd = new Audio();
    if (snd.canPlayType('audio/ogg')) {
        // play your ogg file
        oggsupport = true;
    } else if (snd.canPlayType('audio/mpeg')) {
        // play your mp3 file ('audio/mpeg' is the registered MIME type for MP3)
        mp3support = true;
    }
} catch (e) {
    // Audio is unavailable (e.g. sound support disabled on Windows Server);
    // both flags stay false.
}
I have tried this and it works great.
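For completeness, a small usage sketch of acting on those flags afterwards (the file names here are placeholders, not from the original answer):
// Only attempt playback when the feature test above succeeded.
if (oggsupport) {
    snd.src = "notify.ogg"; // placeholder file name
    snd.play();
} else if (mp3support) {
    snd.src = "notify.mp3"; // placeholder file name
    snd.play();
}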
Related
I have created an application that you sing along with in the browser, using the JavaScript Web Audio API. It worked perfectly on iOS Safari and desktop Chrome, but the sound quality was poor on Android Chrome. To solve this, I tried changing the audio deviceId, but that still didn't work. Does anyone have information that might help?
One doubt: after recording, I send the file to the server and play it back on another page. I am wondering if this is causing the problem.
This is my code
function captureUserMedia(mediaConstraints, successCallback) {
    // getParameterByName, masterSound, mediaRecorder and audioFile are assumed
    // to be defined elsewhere on the page.
    navigator.mediaDevices.getUserMedia(mediaConstraints)
        .then(successCallback)
        .catch(function (error) {
            console.error("getUserMedia failed:", error);
        });
}
function record() {
    if (getParameterByName("startSec").length !== 0) {
        masterSound.currentTime = getParameterByName("startSec");
    }
    masterSound.play();
    var recordButton = document.querySelectorAll(".record")[0];
    if (recordButton.getAttribute("status") == "off") {
        recordButton.setAttribute("status", "on");
        recordButton.classList.add("stoped");
        var mediaConstraints;
        navigator.mediaDevices.enumerateDevices().then((devices) => {
            // mediaConstraints = {
            //     audio: {
            //         deviceId: {
            //             exact: devices[0].deviceId
            //         }
            //     },
            //     video: false
            // };
            mediaConstraints = {
                audio: true,
                video: false,
            };
            captureUserMedia(mediaConstraints, onMediaSuccess);
        });
    } else {
        recordButton.setAttribute("status", "off");
        recordButton.classList.remove("stoped");
        // MediaStream.stop() is deprecated; stop each track instead.
        mediaRecorder.stream.getTracks().forEach(function (track) {
            track.stop();
        });
        masterSound.pause();
    }
}
function onMediaSuccess(stream) {
    // Show a muted live preview of the microphone stream.
    var audio = document.createElement('audio');
    audio.controls = true;
    audio.muted = true;
    audio.srcObject = stream;
    audio.play();
    var audiosContainer = document.querySelectorAll(".audio_wrapper")[0];
    audiosContainer.appendChild(audio);
    audiosContainer.appendChild(document.createElement('hr'));
    // Record the stream with MediaStreamRecorder.js.
    mediaRecorder = new MediaStreamRecorder(stream);
    mediaRecorder.mimeType = 'audio/wav';
    mediaRecorder.stream = stream;
    mediaRecorder.recorderType = MediaRecorderWrapper;
    mediaRecorder.audioChannels = 1;
    mediaRecorder.start();
    mediaRecorder.ondataavailable = function (blob) {
        audioFile = blob;
        var blobURL = URL.createObjectURL(blob);
        document.querySelectorAll(".append_audio")[0].setAttribute("src", blobURL);
        // Defined but not used in this snippet.
        function blobToFile(theBlob, fileName) {
            theBlob.lastModifiedDate = new Date();
            theBlob.name = fileName;
            return theBlob;
        }
        submit();
        function submit() {
            var audioTest = new Audio(URL.createObjectURL(blob));
            audioTest.play();
        }
    };
}
When trying to build high-quality audio with getDisplayMedia, in the past I've passed in MediaStreamConstraints that remove some of the default processing on the input track:
stream = await navigator.mediaDevices.getDisplayMedia({
    video: true,
    audio: {
        channels: 2,
        autoGainControl: false,
        echoCancellation: false,
        noiseSuppression: false
    }
});
I'm still learning WebRTC myself, so I'm not sure whether these same properties can be passed when using getUserMedia and MediaStreamConstraints, but I thought I'd share in case it's helpful; a rough sketch of that idea follows. It also sounds like your issue might be about which input device is selected. Good luck!
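A hedged sketch of passing the same processing hints to getUserMedia (channelCount is the standard spelling of the channel constraint; whether a given browser honors each hint varies, so treat this as something to experiment with, not a guaranteed fix):
// Sketch only: equivalent audio-processing hints for getUserMedia.
// Run this inside an async function (or a module with top-level await).
const rawStream = await navigator.mediaDevices.getUserMedia({
    audio: {
        channelCount: 2,          // standard constraint name for channels
        autoGainControl: false,
        echoCancellation: false,
        noiseSuppression: false
    },
    video: false
});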
We had a similar issue where we were getting complaints about very low sound/gain - barely audible - with our HTML/JS recording client when running on Chrome on some Android devices.
We ended up buying an older Samsung phone (Galaxy A8) to easily replicate the issue.
The culprit was echoCancellation being set to false. With it disabled, we had very low volume on the recorded audio. The solution was to set echoCancellation to true.
We ended up removing the constraint altogether and relying on each browser's defaults (echoCancellation is enabled by default in Chrome, Safari and Firefox).
Worth mentioning that autoGainControl and noiseSuppression inherit the value of echoCancellation; more precisely, if you only set audio: {echoCancellation: true}, the other two constraints will also be set to true.
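A minimal sketch of what that looks like in practice (the surrounding handling is a placeholder, not part of the original answer):
// Set echoCancellation explicitly; per the note above, Chrome then also
// enables autoGainControl and noiseSuppression.
navigator.mediaDevices.getUserMedia({
    audio: { echoCancellation: true },
    video: false
}).then(function (stream) {
    // hand the stream to your recorder here
}).catch(function (err) {
    console.error("getUserMedia failed:", err);
});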
I created a websocket server in Python to handle notification events. Now I can receive notifications, but the problem is I can't play a sound because of the new autoplay policy change; if I play the sound using JavaScript it gives me a DOMException. Any suggestions please?
As far as I know, playing a sound is simple in HTML/JavaScript, like this example: https://stackoverflow.com/a/18628124/7514010
but it depends on the browser and on how you load and play the sound, so the issues are:
Some browsers wait until the user clicks something before letting you play it (you have to find a way around this; see the unlock sketch at the end of this answer)
In some cases browsers never let you play until the address uses SSL (meaning HTTPS behind your URL)
The loading may be late, so playback starts late or does not start at all.
So I usually do this:
HTML
<audio id="notifysound" src="notify.mp3" autobuffer preload="auto" style="visibility:hidden;width:0px;height:0px;z-index:-1;"></audio>
JAVASCRIPT (Generally)
var theSound = document.getElementById("notifysound");
theSound.play();
And the safest approach, if I want to be sure the sound plays when I send a notification, is:
JAVASCRIPT (In your case)
function notifyme(theTitle, theBody) {
    theTitle = theTitle || 'Title';
    theBody = theBody || "Hi. \nIt is notification!";
    var theSound = document.getElementById("notifysound");
    if ("Notification" in window && Notification) {
        if (window.Notification.permission !== "granted") {
            window.Notification.requestPermission().then((result) => {
                if (result != 'denied') {
                    return notifyme(theTitle, theBody);
                } else {
                    theSound.play();
                }
            });
        } else {
            theSound.play();
            try {
                var notification = new Notification(theTitle, {
                    icon: 'icon.png',
                    body: theBody
                });
                notification.onclick = function () {
                    window.focus();
                };
            }
            catch (err) {
                return;
            }
        }
    } else {
        theSound.play();
    }
}
(And just hope it plays, because even then the volume setting or some user customization can make it fail.)
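For the first issue in the list above (browsers that wait for a user click), a minimal sketch of "unlocking" the sound on the first interaction, after which programmatic play() calls are usually allowed:
// Unlock the notification sound on the first user click, so later
// theSound.play() calls from the websocket handler are not blocked.
document.addEventListener("click", function () {
    var theSound = document.getElementById("notifysound");
    theSound.play().then(function () {
        theSound.pause();
        theSound.currentTime = 0;
    }).catch(function (err) {
        console.warn("Audio unlock failed:", err);
    });
}, { once: true });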
To bypass the new autoplay policy:
Create a button that can play the sound, hide it, and trigger the sound with:
var event = new Event('click');
playBtn.dispatchEvent(event);
EDIT
assuming you have:
let audioData = 'data:audio/wav;base64,..ᴅᴀᴛᴀ...'; // or the src path
you can use this function to trigger the sound whenever you want, without appending or creating an element in the DOM:
function playSound() {
    let audioEl = document.createElement('audio');
    audioEl.src = audioData;
    let audioBtn = document.createElement('button');
    audioBtn.addEventListener('click', () => audioEl.play(), false);
    let event = new Event('click');
    audioBtn.dispatchEvent(event);
}
usage:
just call playSound()
EDIT 2
I re-tested my code and it doesn't work... weird. Most likely a programmatically dispatched click event does not count as a real user gesture, so the autoplay policy still blocks the play() call.
The javascript error is: Unhandled Promise Rejection: NotAllowedError: The request is not allowed by the user agent or the platform in the current context, possibly because the user denied permission.
My setup works across other browsers, desktop and mobile.
The way it works is:
have a flag first_audio_played = false;
add a touch event listener that plays some audio, and sets first_audio_played = true; (then removes the touch listener)
all subsequent audio checks if(first_audio_played) some_other_audio.play();
This way, only the first audio played requires direct user input; after that, all audio is free to be triggered by in-game events, timing, etc. (a sketch of this pattern follows below).
This appears to be the "rule" for audio across most browsers. Is the iOS "rule" that every audio needs to be triggered by user input, or is there some other step I'm missing?
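A minimal sketch of that flag approach (all names and file paths here are illustrative, not from the original post):
// Unlock audio with the first touch, then gate every later play() on the flag.
var some_first_audio = new Audio("tap.mp3");    // placeholder files
var some_other_audio = new Audio("event.mp3");
var first_audio_played = false;

function onFirstTouch() {
    some_first_audio.play();
    first_audio_played = true;
    document.removeEventListener("touchend", onFirstTouch);
}
document.addEventListener("touchend", onFirstTouch);

// Later, from in-game events or timers:
function playEventSound() {
    if (first_audio_played) {
        some_other_audio.play();
    }
}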
For my JavaScript game, sounds stopped working on iOS recently. They all have readyState=4, but only the sound I played on tap works; the others won't play. Maybe you could play all the sounds on the first tap, but the solution that works for me for now is to load all the sounds as Ajax ArrayBuffers and decode them with decodeAudioData(). Then, once you've played one sound from a user tap (on an element other than the body), they all play whenever you like.
Here is my working code, where the second approach is in the bottom branch. When I tap to play sound2, sound1 starts working as well.
<html>
<body>
<div id=all style='font-size:160%;background:#DDD' onclick="log('clicked');playSound(myAudio)">
Sound1 should be playing every couple seconds.
<br />Tap here to play sound1.
</div>
<div id=debug style='font-size:120%;' onclick="playSound(myAudio2)">
Tap here to play the sound2.
</div>
<script>
var url = "http://curtastic.com/drum.wav"
var url2 = "http://curtastic.com/gold.wav"
var myAudio, myAudio2
if(0)
{
var playSound = function(sound)
{
log("playSound() readyState="+sound.readyState)
log("gold readyState="+myAudio2.readyState)
sound.play()
}
var loadSound = function(url, callback)
{
var audio = new Audio(url)
audio.addEventListener('canplaythrough', function()
{
log('canplaythrough');
if(callback)
callback()
}, false)
audio.load()
if(audio.readyState > 3)
{
log('audio.readyState > 3');
if(callback)
callback()
}
return audio
}
myAudio = loadSound(url, startInterval)
myAudio2 = loadSound(url2)
}
else
{
var playSound = function(sound)
{
log("playSound()")
var source = audioContext.createBufferSource()
if(source)
{
source.buffer = sound
if(!source.start)
source.start = source.noteOn
if(source.start)
{
var gain = audioContext.createGain()
source.connect(gain)
gain.connect(audioContext.destination)
source.start()
}
}
}
var loadSound = function(url, callback)
{
log("start loading sound "+url)
var ajax = new XMLHttpRequest()
ajax.open("GET", url, true)
ajax.responseType = "arraybuffer"
ajax.onload = function()
{
audioContext.decodeAudioData(
ajax.response,
function(buffer)
{
log("loaded sound "+url)
log(buffer)
callback(buffer)
},
function(error)
{
log(error)
}
)
}
ajax.send()
}
var AudioContext = window.AudioContext || window.webkitAudioContext
var audioContext = new AudioContext()
loadSound(url, function(r) {myAudio = r; startInterval()})
loadSound(url2, function(r) {myAudio2 = r})
}
function startInterval()
{
log("startInterval()")
setInterval(function()
{
playSound(myAudio)
}, 2000)
}
function log(m)
{
console.log(m)
debug.innerHTML += m+"<br />"
}
</script>
</body>
</html>
You can use either [WKWebViewConfiguration setMediaTypesRequiringUserActionForPlayback:WKAudiovisualMediaTypeNone] or [UIWebView setMediaPlaybackRequiresUserAction:NO] depending on your WebView class (or Swift equivalent).
I use this function to play a sound:
var soundObject = null;
function PlaySound() {
    if (soundObject != null) {
        document.body.removeChild(soundObject);
        soundObject.removed = true;
        soundObject = null;
    }
    soundObject = document.createElement("embed");
    //soundObject.setAttribute("src", "../Sounds/beep-01a1.wav");
    soundObject.setAttribute("src", "../Sounds/Small Blink.mp3");
    soundObject.setAttribute("hidden", true);
    soundObject.setAttribute("autostart", true);
    document.body.appendChild(soundObject);
    // alert('hii sound');
}
It works fine in all browsers except the Microsoft Edge browser, where it shows a buzzer icon in the browser.
Any help is appreciated, thanks.
The Edge browser seemingly ignores the "hidden" attribute. Try this:
soundObject.setAttribute("style", "display:none");
I want to play background music on page load.
My code runs fine on all browsers except Safari.
Safari shows:
Undefined is not a constructor (evaluating 'new Audio()')
How can I fix this error on Safari?
var audio, playbtn, mutebtn, seek_bar;
function initAudioPlayer() {
    audio = new Audio();
    audio.src = "html/audio/di-evantile_behind-your-dream.mp3";
    audio.loop = true;
    audio.play();
    // Set object references
    playbtn = document.getElementById("playpausebtn");
    mutebtn = document.getElementById("mutebtn");
    // Add event handling
    playbtn.addEventListener("click", playPause);
    mutebtn.addEventListener("click", mute);
    // Functions
    function playPause() {
        if (audio.paused) {
            audio.play();
            playbtn.style.background = "url(html/images/pause.png) no-repeat";
        } else {
            audio.pause();
            playbtn.style.background = "url(html/images/play.png) no-repeat";
        }
    }
    function mute() {
        if (audio.muted) {
            audio.muted = false;
            mutebtn.style.background = "url(html/images/speaker.png) no-repeat";
        } else {
            audio.muted = true;
            mutebtn.style.background = "url(html/images/speaker_muted.png) no-repeat";
        }
    }
}
// Play music on page load
window.addEventListener("load", initAudioPlayer);
You seem to have a strange comment in your JavaScript. I had to remove "## Heading ##" in order for your script to run on my machine.
Regarding Safari:
Safari for Windows was discontinued back in 2012. I wouldn't be surprised if all sorts of HTML5 goodies didn't work in the Windows version.
It's working for me on a Mac OSX system with Safari 9.1.1 (11601.6.17).
I also tried using Safari for Windows as you described, and I can confirm that it doesn't work because the Audio object is not supported.
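If you need the page to degrade gracefully on browsers without the Audio constructor (such as the discontinued Safari for Windows), one hedged option is to feature-detect before initializing the player:
// Feature-detect the HTML5 Audio constructor before wiring up the player,
// so unsupported browsers fail quietly instead of throwing on page load.
window.addEventListener("load", function () {
    if (typeof window.Audio === "function") {
        initAudioPlayer();
    } else {
        console.warn("HTML5 Audio is not supported in this browser.");
    }
});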