The following script reads the audio from the user's microphone and renders an oscilloscope on an HTML canvas.
The source is taken from an example on the Mozilla Developer Network: Visualizations with Web Audio API
And here is the fiddle: http://jsfiddle.net/b7j8pktp/
mozGetUserMedia
(note: the code has no fork mechanism for different browsers; it works only with Firefox)
It works fine for a few seconds and then immediately stops rendering.
Whereas this works totally stable: http://mdn.github.io/voice-change-o-matic/
The problem can be reduced to the following code. The microphone activation icon (next to the address bar in Firefox) disappears after about 5 seconds:
navigator.mozGetUserMedia({audio: true},
function() {}, function() {} );
(http://jsfiddle.net/b7j8pktp/2/)
This is a known bug in Firefox. Just take the stream from the getUserMedia call and hook it up to the window like so:
navigator.mozGetUserMedia({audio: true}, function(stream) {
window.stream = stream;
// rest of the code
}, function err() {
// handle error
});
Hopefully we can get it fixed soon. The problem is that we are failing to add a reference to the stream when we make the AudioContext.createMediaStreamSource call, so nothing references the stream anymore once the getUserMedia callback returns, and it is collected by the cycle collector the next time it runs, that is, a couple of seconds later.
You can follow along in https://bugzilla.mozilla.org/show_bug.cgi?id=934512.
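Until that fix lands, a minimal sketch of the workaround looks like the following (the audioCtx, source and analyser names are illustrative, not from the original fiddle); the key point is keeping the stream, and ideally the source node, referenced on a long-lived object:
navigator.mozGetUserMedia({ audio: true }, function (stream) {
  window.stream = stream; // hold a reference so the cycle collector cannot reclaim the stream

  var audioCtx = new AudioContext();
  var source = audioCtx.createMediaStreamSource(stream);
  window.source = source; // also keep the source node alive

  var analyser = audioCtx.createAnalyser();
  source.connect(analyser);
  // ...draw the oscilloscope from analyser.getByteTimeDomainData(...)
}, function (err) {
  console.error('getUserMedia failed:', err);
});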
Related
Is it possible to apply constraints to a running live audio track? It doesn't seem to work for me, at least on Chrome v80.
Suppose I have a stream:
const stream = await navigator.mediaDevices.getUserMedia({
  audio: {
    autoGainControl: true,
    channelCount: 2,
    echoCancellation: true,
    noiseSuppression: true
  },
  video: false
});
Now, later on I want to change some of those parameters:
for (const track of stream.getAudioTracks()) {
track.applyConstraints({
autoGainControl: false,
echoCancellation: false,
noiseSuppression: false
});
}
This has no effect. If I call track.getConstraints(), I see my new constraints but audibly they have no effect until I reload the page and apply them from the beginning. Additionally, when I call track.getSettings(), I see that my new constraints haven't been applied.
I have also tried calling track.enabled = false before applying the constraints, with track.enabled = true afterwards, with no luck.
Any advice on how to get this to work without making a fresh call to getUserMedia()?
SO user jib, who works on the Firefox and adapter.js projects, wrote a blog post in 2017 about this exact feature.
Here is how they applied the constraints to the track:
async function apply(c) {
await track.applyConstraints(Object.assign(track.getSettings(), c));
update();
}
c is an object with the particular constraints to add.
They do it this way because any properties omitted from the MediaTrackConstraints dictionary get reset to their defaults when it is applied.
Now, your solution should have worked too, at least for the properties you did set.
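For instance, with jib's apply() helper above (and assuming the track variable from the fiddle), toggling a single property while keeping the rest of the current settings would look like this; the setEchoCancellation name is just an illustration:
async function setEchoCancellation(enabled) {
  // Merge the current settings with the one property we want to change,
  // so nothing else gets reset to its default.
  const merged = Object.assign({}, track.getSettings(), { echoCancellation: enabled });
  await track.applyConstraints(merged);
  console.log('echoCancellation is now', track.getSettings().echoCancellation);
}

setEchoCancellation(false).catch(console.error);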
So, using this fiddle, I tried the few UAs I have on my macOS machine:
Chrome
As you reported, settings are not applied.
Here is the issue tracking the upcoming implementation.
From this issue's comments you can also find a workaround, which involves requesting a new MediaStream from the same deviceId as the one you got, and applying the desired constraints to it.
Here is a fork of jib's fiddle with such a workaround. Note that the deviceId is obtained from track.getSettings()
async function apply(c) {
track.stop(); // required
const new_constraints = Object.assign(track.getSettings(), c);
const new_stream = await gUM({ audio: new_constraints });
updateSpectrum( audio.srcObject = new_stream );
track = new_stream.getAudioTracks()[0];
update();
}
Firefox
Works seamlessly.
Safari
Badly crashes. On my machine, running the original fiddle (with only the spectrum tweak) completely crashes gUM for the whole browser:
- The current stream is stopped
- Any attempt to get a new stream fails, until the whole app is restarted
The forked fiddle we made for Chrome at least doesn't crash, but it doesn't seem to produce any audible change either...
I have a problem understanding the documentation for the WebExtensions notifications.onClicked event.
Ultimately, I'm trying to get the text of the notification copied to the clipboard when you click on it. However, right now I'm having trouble understanding the callback, and where I have to register the notifications.onClicked listener.
At the moment, I don't know why the notifications.onClicked listener does nothing.
My code (all the code needed to demonstrate the problem as a WebExtension Firefox add-on):
manifest.json
{
"description": "Test Webextension",
"manifest_version": 2,
"name": "Σ",
"version": "1.0",
"permissions": [
"<all_urls>",
"notifications",
"webRequest"
],
"background": {
"scripts": ["background.js"]
}
}
background.js
'use strict';
function logURL(requestDetails) {
notify("Testmessage");
chrome.notifications.onClicked.addListener(function() {
console.log("TEST TEST");
});
}
function notify(notifyMessage) {
var options = {
type: "basic",
iconUrl: chrome.extension.getURL("icons/photo.png"),
title: "",
message: notifyMessage
};
chrome.notifications.create("ID123", options);
}
chrome.webRequest.onBeforeRequest.addListener(
logURL, {
urls: ["<all_urls>"]
}
);
First, you need to be testing this in Firefox 47.0+, as support for chrome.notifications.onClicked was added in version 47.0. While this is probably not your problem, it is one contributing possibility.
There are multiple issues with your code. Some are in your code, but primarily you are running into a Firefox bug.
Firefox Bug:
Your primary issue is that you are running into a Firefox bug where Firefox gets confused if you try to create notifications too rapidly. Thus, I have implemented a notification queue and rate-limited the creation of notifications. What counts as "too rapidly" is probably both OS and CPU dependent, so you are best off erring on the side of caution and setting the delay between calls to chrome.notifications.create() to a higher value. In the code below, the delay is 500ms. I have added a note regarding this issue to the chrome.notifications.create() page on MDN and to the Chrome incompatibilities page.
Adding multiple copies of the same listener:
The main thing you are doing wrong in your code is that you are adding an anonymous function as a listener, using chrome.notifications.onClicked.addListener(), multiple times to the same event. This is a generic issue with event handlers. When you use an anonymous function, it is a different actual function each time you try to add it, so the same functionality (in multiple identical functions) gets added multiple times. You should not add functions that do the exact same thing multiple times to the same event. Doing so is almost always an error in your program and results in unexpected operation.
In this case, the multiple functions would have ended up outputting multiple lines of TEST TEST to the console each time the user clicked on a notification. The number of lines output per click would increase by one for each web request that resulted in a call to logURL.
The way to prevent this is to be sure to add the listener only once. If you are using an anonymous function, you can only do this by making sure you execute the addListener (or addEventListener) call once, usually by adding the listener only from your main code (not from within a function), or from a function that is itself only called once. Alternately, you can name/define your listener function in the global scope (or another scope accessible to all places where you try to add the listener), e.g. function myListener(){...}. Then, when you add myListener, you are always referring to the exact same function, and JavaScript automatically prevents you from adding it in the same way to the same event more than once.
It should be noted that if you are trying to add an anonymous function as a listener from another listener, you are almost always doing something wrong. Adding copies of identical anonymous listeners multiple times to the same event is a common error.
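To make the difference concrete, here is a minimal sketch (the onNotificationClicked name is illustrative): define the listener once, at the top level of the background script, never inside another handler:
// Named listener defined in the global scope; registered exactly once at load time.
function onNotificationClicked(id) {
  console.log('Notification clicked:', id);
}
chrome.notifications.onClicked.addListener(onNotificationClicked);

// Wrong: registering an anonymous listener from inside another handler adds a new,
// distinct copy of it on every web request.
chrome.webRequest.onBeforeRequest.addListener(function (details) {
  // chrome.notifications.onClicked.addListener(function () { ... }); // don't do this
}, { urls: ["<all_urls>"] });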
Access to the notification text:
While you do not implement anything regarding using the text of the notification, you state that you want to add the text of the notification to the clipboard when the user clicks on the notification. You cannot obtain the notification text from any portion of the chrome.notifications API. Thus, you have to store that information yourself. The code below implements an Object to do that, so the text can be accessed in the chrome.notifications.onClicked handler.
Example code:
The code below implements what I believe you desire. It just creates notifications and, when one is clicked, accesses the notification text in the chrome.notifications.onClicked listener. It does not implement the part about putting the text into the clipboard, as that was not actually implemented in the code in your question. I have added liberal comments to the code to explain what is happening and provided quite a bit of console.log() output to help show what is going on. I have tested it in both Firefox Developer Edition (currently v51.0a2) and Google Chrome.
background.js (no changes to your manifest.json):
'use strict';
//* For testing, open the Browser Console
var isFirefox = window.InstallTrigger?true:false;
try{
if(isFirefox){ //Only do this in Firefox
//Alert is not supported in Firefox. This forces the Browser Console open.
//This abuse of a misfeature works in FF49.0b+, not in FF48
alert('Open the Browser Console.');
}
}catch(e){
//alert throws an error in Firefox versions below 49
console.log('Alert threw an error. Probably Firefox version below 49.');
}
//*
//Firefox gets confused if we try to create notifications too fast (this is a bug in
// Firefox). So, for Firefox, we rate limit showing the notifications.
// The maximum rate possible (minimum delay) is probably OS and CPU speed dependent.
// Thus, you should err on the side of caution and make the delay longer.
// No delay is needed in Chrome.
var notificationRateLimit = isFirefox ? 500:0;//Firefox: Only one notification every 500ms
var notificationRateLimitTimeout=-1; //Timeout for notification rate limit
var sentNotifications={};
var notificationsQueue=[];
var notificationIconUrl = chrome.extension.getURL("icons/photo.png");
function logURL(requestDetails) {
//console.log('webRequest.onBeforeRequest URL:' + requestDetails.url);
//NOTE: In Chrome, a webRequest is issued to obtain the icon for the notification.
// If Chrome finds the icon, that webRequest for the icon is only issued twice.
// However, if the icon does not exist, then this sets up an infinite loop which
// will peg one CPU at maximum utilization.
// Thus, you should not notify for the icon URL.
// You should consider excluding from notification all URLs from within your
// own extension.
if(requestDetails.url !== notificationIconUrl ){
notify('webRequest URL: ' + requestDetails.url);
}
//Your Original code in the Question:
//Unconditionally adding an anonymous notifications.onClicked listener
// here would result in multiple lines of 'TEST TEST' output for each click
// on a notification. You should add the listener only once.
}
function notify(notifyMessage) {
//Add the message to the notifications queue.
notificationsQueue.push(notifyMessage);
console.log('Notification added to queue. message:' + notifyMessage);
if(notificationsQueue.length == 1){
//If this is the only notification in the queue, send it.
showNotificationQueueWithRateLimit();
}
//If the notificationsQueue has additional entries, they will get
// shown when the current notification has completed being shown.
}
function showNotificationQueueWithRateLimit(){
if(notificationRateLimitTimeout===-1){
//There is no current delay active, so immediately send the notification.
showNextNotification();
}
//If there is a delay active, we don't need to do anything as the notification
// will be sent when it gets processed out of the queue.
}
function showNextNotification() {
notificationRateLimitTimeout=-1; //Indicate that there is no current timeout running.
if(notificationsQueue.length === 0){
return; //Nothing in queue
}
//Indicate that there will be a timeout running.
// Needed because we set the timeout in the notifications.create callback function.
notificationRateLimitTimeout=-2;
//Get the next notification from the queue
let notifyMessage = notificationsQueue.shift();
console.log('Showing notification message:' + notifyMessage);
//Set our standard options
let options = {
type: "basic",
//If the icon does not exist an error is generated in Chrome, but not Firefox.
// In Chrome a webRequest is generated to fetch the icon. Thus, we need to know
// the iconUrl in the webRequest handler, and not notify for that URL.
iconUrl: notificationIconUrl,
title: "",
message: notifyMessage
};
//If you want multiple notifications shown at the same time, your message ID must be
// unique (at least within your extension).
//Creating a notification with the same ID causes the prior notification to be
// destroyed and the new one created in its place (not just the text being replaced).
//Use the following two lines if you want only one notification at a time. If you are
// actually going to notify on each webRequest (rather than doing so just being a way
// to test), you should probably only have one notification as they will rapidly be
// off the screen for many pages.
//let myId = 'ID123';
//chrome.notifications.create(myId,options,function(id){
//If you want multiple notifications without having to create a unique ID for each one,
// then let the ID be created for you by using the following line:
chrome.notifications.create(options,function(id){
//In this callback the notification has not necessarily actually been shown yet,
// just that the notification ID has been created and the notification is in the
// process of being shown.
console.log('Notification created, id=' + id + ':: message:' + notifyMessage);
logIfError();
//Remember the text so we can get it later
sentNotifications[id] = {
message: notifyMessage
}
//Show the next notification in the FIFO queue after a rate limiting delay
// This is called unconditionally in order to start the delay should another
// notification be queued, even if one is not in the queue now.
notificationRateLimitTimeout = setTimeout(showNextNotification
,notificationRateLimit);
});
}
function logIfError(){
if(chrome.runtime.lastError){
let message =chrome.runtime.lastError.message;
console.log('Error: ' + message);
}
}
chrome.webRequest.onBeforeRequest.addListener(
logURL, {
urls: ["<all_urls>"]
}
);
//Add the notifications.onClicked anonymous listener only once:
// Personally, I consider it better practice to use a named function that
// is defined in the global scope. Doing so prevents inadvertently adding
// it multiple times. Although, your code should be written such that you
// don't do that anyway.
chrome.notifications.onClicked.addListener(function(id) {
//We can not get the notification text from here, just the ID. Thus, we
// have to use the text which was remembered.
console.log('Clicked notification message text: ', sentNotifications[id].message);
//In Firefox the notification is automatically cleared when it is clicked.
// If you want the same functionality in Chrome, you will need to clear() it
// yourself:
//Always do this instead of only when not in Firefox so that it remains consistent
// Even if Firefox changes to match Chrome.
chrome.notifications.clear(id);
//This is the last place we use the text of the notification, so we delete it
// from sentNotifications so we don't have a memory leak.
delete sentNotifications[id];
});
//Test the notifications directly without the need to have webRequests:
notify('Background.js loaded');
notify('Second notification');
In the process of working on this, I found multiple incompatibilities between Chrome and Firefox. I am in the process of updating MDN to mention the incompatibilities in the documentation on MDN.
I've got a simple video stream working via getUserMedia, but I would like to handle the case where the webcam I'm streaming from becomes disconnected or unavailable. So I've found the oninactive event on the stream object passed to the successCallback function. I would also like to restart the video stream when exactly the same webcam/media device is plugged back in.
Code example:
navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia;
navigator.getUserMedia(constraints, function successCallback(stream) {
  this.video.src = URL.createObjectURL(stream);
  stream.oninactive = function (error) {
    // this handler runs when the device becomes unavailable.
    this.onStreamInactive(error, stream);
  }.bind(this);
}.bind(this), function errorCallback() {});
Based on the example above, how can I:
Detect a recently connected media device
Check whether it is the same device I was streaming from
A better way would be to use MediaDevices.ondevicechange, as mentioned in the other answer in this thread, but it is still behind a flag in Chrome. Instead of using ondevicechange to enumerate devices, poll MediaDevices.enumerateDevices() at a regular interval when you start the call, and at the end of every poll interval compare the list of devices you get with the list from the previous poll. This way you know which devices were added or removed during the call.
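A minimal sketch of that polling approach (the 5-second interval and the diffing by deviceId are illustrative choices, not part of the original answer) could look like this:
// Poll enumerateDevices() and diff the result against the previous poll by deviceId.
let previousIds = new Set();

async function pollDevices() {
  const devices = await navigator.mediaDevices.enumerateDevices();
  const currentIds = new Set(devices.map(d => d.deviceId));

  const added = devices.filter(d => !previousIds.has(d.deviceId));
  const removedIds = [...previousIds].filter(id => !currentIds.has(id));

  if (added.length) console.log('Devices added:', added);
  if (removedIds.length) console.log('Device ids removed:', removedIds);

  previousIds = currentIds;
}

// Start polling when the call starts; call clearInterval(pollTimer) when it ends.
const pollTimer = setInterval(pollDevices, 5000);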
A little late to answer, but it looks like you can use MediaDevices.ondevicechange to attach an event handler, and then in the event handler you can query MediaDevices.enumerateDevices() to get the full list. Then you inspect the list of devices, identify the one that was recently added by comparing it against a cached list you kept, and compare its properties to a record you kept of the current device's properties. The links have more thorough examples.
Adapted from the ondevicechange reference page
navigator.mediaDevices.ondevicechange = function(event) {
navigator.mediaDevices.enumerateDevices()
.then(function(devices) {
devices.forEach(function(device) {
console.log(device);
// check if this is the device that was disconnected
});
});
}
Note that the type of the device objects returned by enumerateDevices is described here
Browser Support
It looks like it's pretty patchy as of writing this. See this related question: Audio devices plugin and plugout event on chrome browser for further discussion, but the short story is for Chrome you'll need to enable the "Experimental Web Platform features" flag.
I want to play a single audio file (mp3) and my only problem is media length.
It works just fine on Android 5.0.1, but on 4.4.2/4.4.4 it doesn't work!
With the native implementation I get a value, but it's incorrect, and if I use the Media plugin API (from Phonegap), media.duration is undefined and media.getDuration() returns -1.
I'm only trying to get the duration after the loadedmetadata event has fired, so this should not be the problem.
The native implementation is done through js with new Audio(), no DOM element involved.
The file is stored on sdcard, and src looks like file:///storage/sdcard/audio.mp3. Everything else regarding html5 audio api works, but duration.
Are there any solutions to fix this?
Thanks to @tawpie's answer, I figured out a workaround for the issue I'm having.
That setInterval made me think about my custom seekbar being updated (correctly) while the audio is playing; since I was using the audio duration value to calculate its width, it follows that the duration works once the media file's play method has been fired.
The problem is that the loadedmetadata event doesn't return the correct duration value (in some browsers, like the Android WebView), but after the audio has played for at least 1s the duration is updated and you can use it.
So you can forget about loadedmetadata event and jump straight to canplay event and from there you can make something like this:
var myAudio = new Audio();
myAudio.src = 'file://mnt/sdcard/audio.mp3';
myAudio.load();
myAudio.correctDuration = null;
myAudio.addEventListener('canplay', function(){
myAudio.play();
myAudio.muted = true;
setTimeout(function(){
myAudio.pause();
myAudio.currentTime = 0;
myAudio.muted = false;
myAudio.correctDuration = myAudio.duration;
},1000);
});
...of course, you can use volume = 0.0/1.0 instead of mute.
Another method would be to create a helper function (in my case, an AngularJS service) which takes your src value, uses the code above, and returns the correctDuration. This one is preferred if you have listeners on the audio's timeupdate event which change the DOM.
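A minimal sketch of such a helper (the getCorrectDuration name and the Promise wrapper are my additions, not from the original answer) could be:
// Loads the file muted, plays it for ~1s, then resolves with the now-correct duration.
function getCorrectDuration(src) {
  return new Promise(function (resolve, reject) {
    var probe = new Audio();
    probe.src = src;
    probe.muted = true;
    probe.addEventListener('canplay', function () {
      probe.play();
      setTimeout(function () {
        probe.pause();
        resolve(probe.duration); // duration is usable after ~1s of playback
      }, 1000);
    });
    probe.addEventListener('error', reject);
    probe.load();
  });
}

// Usage:
getCorrectDuration('file:///storage/sdcard/audio.mp3').then(function (duration) {
  console.log('duration:', duration);
});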
The Media plugin works exactly the same way: if the audio hasn't played for at least 1s, you cannot use the getDuration() method or the duration property inside an interval/timeout wrapper to get the correct duration.
I think the video element behaves similarly. I'll test it these days.
Hope this workaround helps!
Try Media.node.duration. That works on Windows... For what it's worth, as long as getDuration is called in an interval, I don't have any problems on Android 4.4. But I'm using just the Media plugin, new Media(src, onSuccess, onError, playbackStatus), and not the HTML5 player.
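A small sketch of polling getDuration() from an interval with the Phonegap/Cordova Media plugin (the 500ms polling interval and the file path are arbitrary choices) might look like:
// Poll getDuration() until the plugin reports a real value (it returns -1 until known).
var media = new Media('file:///storage/sdcard/audio.mp3',
  function onSuccess() { console.log('playback finished'); },
  function onError(err) { console.log('media error: ', err); });

media.play();

var durationPoll = setInterval(function () {
  var duration = media.getDuration();
  if (duration > -1) {
    console.log('duration: ' + duration + ' seconds');
    clearInterval(durationPoll);
  }
}, 500);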
Hardcoded values. It's a pain, but you can do this if the files are local.
I ran into an issue where chrome was reporting different duration values than other browsers, and this is where we landed. I know it's not really a solution, but it works.
OR... you can use some external process to generate a JSON file of duration times, and reference those values at runtime.
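For instance, a sketch of that lookup (the durations.json name and its shape are hypothetical, and the file would be generated offline, e.g. with ffprobe):
// durations.json, generated offline: { "audio.mp3": 213.4, "other.mp3": 97.1 }
fetch('durations.json')
  .then(function (res) { return res.json(); })
  .then(function (durations) {
    var duration = durations['audio.mp3']; // look up the known duration at runtime
    console.log('duration:', duration);
  });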
For the sake of reference:
audio.addEventListener('durationchange', function(e) {
console.log(e.target.duration); //FIRST 0, THEN REAL DURATION
});
worked for me.
Credit: this Stack Overflow question
I'm getting the audio/video duration of a file without appending it to the screen. Using the same code, getting the video duration works as expected on both platforms. But when using audio files, the duration is reported as 0 on Android, while it works on a desktop computer.
// Only working on Desktop
var audio = new Audio(url);
// Hide audio player
// player.appendChild(audio);
audio.addEventListener('loadedmetadata', function() {
alert(audio.duration);
});
And the below code is working:
// Working on Desktop and Android
var video = document.createElement('video');
video.src = url;
// Hide video
// player.appendChild(video);
video.addEventListener('loadedmetadata', function() {
alert(video.duration);
});
There is a different approach you can try, but if duration doesn't work on your device (which IMO is a bug), then it's likely this doesn't either; worth a shot though:
audio.seekable.end(audio.seekable.length-1);
or even
audio.buffered.end(audio.buffered.length-1);
though the latter depends on the content being loaded, which in this case probably won't help.
EDIT: Using the durationchange event is much easier. First 0 is output, but as soon as the file is loaded (that's where loadedmetadata fails, I guess) the updated, real duration will be output.
audio.addEventListener('durationchange', function(e) {
console.log(e.target.duration); //FIRST 0, THEN REAL DURATION
});
OLD WAY (ABOVE IS MUCH FASTER)
Looks like this "bug" (if it is actually a real bug) is still around. Chrome (40) for Android still outputs 0 as the audio file's duration. Researching the web didn't get me a solution, but I found out the bug also occurs on iOS. I figured I should post my fix here for you guys.
While audio.duration outputs 0, logging the audio object shows that the duration is right there in the object. All this happens in the loadedmetadata event.
audio.addEventListener('loadedmetadata', function(e) {
console.log(e.target.duration); //0
});
If you log audio.duration in the timeupdate event, though, the real duration is output. To only output it once, you could do something like:
var fix = true;
audio.addEventListener('timeupdate', function(e) {
if(fix === true) {
console.log(e.target.duration); //REAL DURATION
fix = false;
}
console.log(e.target.currentTime); //UPDATED TIME POSITION
});
I'm not sure why all this is happening. But let's be happy it's nothing serious.