<video>.playbackRate not working on Firefox when using MediaElementAudioSourceNode - javascript

As stated in the title, I have been running into an issue with the HTMLVideoElement when it is connected to the Web Audio API in Firefox.
The following sample gives a minimal example reproducing the issue:
var video = document.getElementById('video');
var ctx = new AudioContext();

// Re-route the video's audio through the Web Audio graph.
var sourceNode = ctx.createMediaElementSource(video);
sourceNode.connect(ctx.destination);

// In Firefox, this setter stops having any effect once the element is connected.
video.playbackRate = 3;
video.play();
As soon as the video element is connected to the audio pipeline, the playbackRate setter no longer has any effect.
I've been looking for a way to set this value somewhere on the AudioContext or MediaElementAudioSourceNode objects, but those classes do not seem to handle playback rate on their own.
Note that this sample works fine in Chrome, and I don't really see what the problem is here.
Thanks

Already reported on Firefox's bug tracker: https://bugzilla.mozilla.org/show_bug.cgi?id=966247
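Until the bug is fixed, one hedged workaround sketch (it only helps if you need the adjusted-rate audio alone, since it abandons the video element and its sync): decode the track into an AudioBuffer and play it through an AudioBufferSourceNode, whose own playbackRate AudioParam the audio graph does honor. The 'music.mp3' URL is illustrative.
var ctx = new AudioContext();

// Fetch and decode the audio track once, outside the media element.
fetch('music.mp3')                               // illustrative asset URL
  .then(function (res) { return res.arrayBuffer(); })
  .then(function (buf) { return ctx.decodeAudioData(buf); })
  .then(function (audioBuffer) {
    var src = ctx.createBufferSource();
    src.buffer = audioBuffer;
    src.playbackRate.value = 3;                  // honored by the graph itself
    src.connect(ctx.destination);
    src.start();
  });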

Related

Safari - A MediaStreamTrack ended due to a capture failure

Today I upgraded to macOS Big Sur 11.0.1 and Safari 14, and my website (one-to-one video chat based on WebRTC) stopped working on Safari. After 10 seconds of a video call, the following console error appears: "A MediaStreamTrack ended due to a capture failure" and the other person can no longer see the video.
My code looks like this:
const userMedia = await navigator.mediaDevices.getUserMedia({
  video: true,
  audio: true,
});
if (userMedia != null) {
  userMedia.getTracks().forEach((track) => {
    otherRtcPeer.addTrack(track, userMedia);
  });
}
Is it a Safari bug or an implementation issue? And how to solve it?
After going through this guide, I made the following changes and resolved the issue:
- Keep the stream object in React state.
- Whenever the video element renders or re-renders, clone the stream object and assign the clone to the video element's srcObject.
- After capturing the picture, stop all media tracks in the stream.
This way, the error mentioned above is avoided (a sketch of these steps follows below).
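A minimal sketch of those three steps, with illustrative function names (the React state wiring is elided; the stream would live in component state):
var stream = null; // held in app state (e.g. React state)

async function attachCamera(videoEl) {
  stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  // Assign a clone on every render/re-render so the element gets fresh tracks.
  videoEl.srcObject = stream.clone();
}

function capturePhotoAndRelease(videoEl, canvas) {
  // Grab the current frame, then stop every track to release the camera.
  canvas.getContext('2d').drawImage(videoEl, 0, 0, canvas.width, canvas.height);
  stream.getTracks().forEach(function (track) { track.stop(); });
  stream = null;
}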
I was able to fix the issue on my end by styling a video element I'm using as a WebGL texture with display: block; opacity: 0 (instead of display: none).
Perhaps they removed the ability to play offscreen video textures on iOS 14/Big Sur.
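In other words (the element id is illustrative):
// Keep the texture <video> in the layout but invisible, instead of display: none.
var vid = document.getElementById('textureVideo'); // illustrative id
vid.style.display = 'block';
vid.style.opacity = '0';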

oddity about createMediaElementSource()

I was building an audio program and hit a stumbling block on the .createMediaElementSource method. I was able to solve the problem, but I do not quite know why the solution works.
In my HTML, I created an audio player: <audio id="myAudio"><source src="music.mp3"></audio>
Now in my JS:
context = new AudioContext();
audio = document.getElementById('myAudio');
source = context.createMediaElementSource(audio);
audio.play();
doesn't work. The audio element loads, but the song doesn't play and there is no sound.
However! This JS code works:
context = ...; //same as above
audio...;
source = context.createMediaElementSource(audio[0]);
audio.play();
All I changed was adding a [0] to the audio and the program suddenly works again. Since .getElementById doesn't return an array, I don't know why referring to audio as an array works, but just referring to audio does not.
A few months late, but in case others stumble upon this and want an answer:
This behaviour is described in the Web Audio API spec:
The createMediaElementSource method
Creates a MediaElementAudioSourceNode given an HTMLMediaElement. As a consequence of calling this method, audio playback from the HTMLMediaElement will be re-routed into the processing graph of the AudioContext.
Emphasis mine. Since the output from the audio element is now routed into the newly created MediaElementAudioSourceNode instance (instead of the original destination, usually your speakers), you need to route the output of the instance back to the original destination:
var audio = document.getElementById('myAudio');
var ctx = new AudioContext();
var src = ctx.createMediaElementSource(audio);
src.connect(ctx.destination); // connect the output of the source to your speakers
audio.play();
The reason it "worked" when you added [0] is that document.getElementById doesn't return an array, nor an element with a defined "0" property, so audio[0] is undefined. As such, you might as well have written ctx.createMediaElementSource(undefined), which doesn't re-route the audio from the #myAudio element.
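You can verify this in the console:
var audio = document.getElementById('myAudio');
console.log(audio[0]); // undefined - an element has no "0" property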

How do I get microphone data from AudioContext

So, I just found out that you can record sound using JavaScript. That's just awesome!
I instantly created a new project to do something of my own. However, as soon as I opened the source code of the example script, I found that there were no explanatory comments at all.
I started googling and found a long and interesting article about AudioContext that doesn't seem to be aware of recording at all (it only mentions remixing sounds), and an MDN article that contains all the information while successfully hiding the part I'm after.
I'm also aware of existing frameworks that deal with this (somehow, maybe). But if I wanted a sound recorder I'd just download one - I'm really curious how the thing works.
Now, not only am I unfamiliar with the coding part, I'm also curious how the whole thing works - do I get intensity at a specific time, much like on an oscilloscope?
Or can I already get a spectral analysis of the sample?
So, just to avoid any mistakes: please, could anyone explain the simplest and most straightforward way to get the input data using the above-mentioned API, and ideally provide code with explanatory comments?
If you just want to use the mic input as a source in the Web Audio API, the following code worked for me. It is based on: https://gist.github.com/jarlg/250decbbc50ce091f79e
// Legacy getUserMedia shim (modern code would use navigator.mediaDevices.getUserMedia).
navigator.getUserMedia = navigator.getUserMedia
  || navigator.webkitGetUserMedia
  || navigator.mozGetUserMedia;

navigator.getUserMedia({ video: false, audio: true }, callback, console.log);

function callback(stream) {
  ctx = new AudioContext();
  mic = ctx.createMediaStreamSource(stream); // the microphone as a source node
  spe = ctx.createAnalyser();                // analyser exposes time/frequency data
  spe.fftSize = 256;
  bufferLength = spe.frequencyBinCount;      // fftSize / 2 = 128 samples
  dataArray = new Uint8Array(bufferLength);
  spe.getByteTimeDomainData(dataArray);      // re-call this each frame in draw()
  mic.connect(spe);
  spe.connect(ctx.destination);              // optional: plays the mic back out loud
  draw();
}
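The snippet calls a draw() function that isn't shown. A minimal oscilloscope-style sketch, assuming a <canvas id="scope"> element on the page (the id is illustrative):
function draw() {
  requestAnimationFrame(draw);          // poll the analyser once per frame
  spe.getByteTimeDomainData(dataArray); // waveform samples, centered on 128

  var canvas = document.getElementById('scope'); // illustrative canvas id
  var c = canvas.getContext('2d');
  c.clearRect(0, 0, canvas.width, canvas.height);
  c.beginPath();
  for (var i = 0; i < bufferLength; i++) {
    var x = i * canvas.width / bufferLength;
    var y = (dataArray[i] / 255) * canvas.height;
    if (i === 0) { c.moveTo(x, y); } else { c.lineTo(x, y); }
  }
  c.stroke();
}
If you want a spectral analysis instead of the waveform, call spe.getByteFrequencyData(dataArray) in the same place; it fills the array with the magnitude of each frequency bin.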

How to correctly/cleanly draw SVG graphics using ProcessingJS?

I'm getting started with ProcessingJS and I'm currently playing with SVG a bit.
Unfortunately I've run into some strange behaviour when displaying SVG.
[screenshot of the glitched rendering omitted]
Here is the code that produces it:
<script src="http://ajax.googleapis.com/ajax/libs/jquery/2.0.0/jquery.min.js"></script>
<script src="http://cloud.github.com/downloads/processing-js/processing-js/processing-1.4.1.min.js"></script>
<script>
$(document).ready(function() {
$('body').append($('<canvas id="preview"><p>Your browser does not support the canvas tag.</p></canvas>'));
function onPJS(p) {
var glasses;
p.setup = function(){
glasses = p.loadShape("brighteyes.svg");
p.size(Math.floor(glasses.width), Math.floor(glasses.height)+Math.floor(glasses.height * .25));
//p.shape(glasses,0,0);
p.frameRate(1);
}
//*
p.draw = function() {
p.background(32);
p.shape(glasses, 0, 0);
console.log(p.mousePressed);//prints undefined
};
//*/
}
new Processing(document.getElementById("preview"), onPJS);
});
</script>
I'm experiencing this odd rendering (the renderer seems to place a vertex at 0,0 for the shape)
on OS X 10.8 in Chrome 26.0.1410.65 (but not in Safari 6.0 (8536.25)). You can run the code here.
How do I get rid of this weird rendering bug?
There is another unexpected thing happening: mousePressed prints undefined, but I might address that in a different question.
Processing's support for SVG is patchy. You should probably just report the bug at https://github.com/processing-js/processing-js/issues/new, and then use a PNG copy instead.
NB: renders OK on Firefox 35.
Edit: reported for you. https://github.com/processing-js/processing-js/issues/137
GoToLoop at the above link says that your problem with mousePressed is caused by the fact that, to avoid name collisions in JS, the boolean mousePressed is called __mousePressed in JS. They recommend that you use Java syntax to code your app and have it automatically translated into JS, to avoid these gotchas.
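If that note is right, the draw function in the sketch above would check the renamed field instead:
p.draw = function() {
  p.background(32);
  p.shape(glasses, 0, 0);
  // The boolean field collides with the mousePressed() handler in JS mode,
  // so Processing.js exposes it under this renamed property:
  console.log(p.__mousePressed); // true while a mouse button is held
};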
Result: just update Chrome, which you've probably done anyway now.

javascript stack overflow while drawing images

I am simply loading a ton of images (about 5000) into new Image() objects and drawing each of them onto a canvas by calling drawImage(image, 0, 0) on the canvas's 2D context.
This works totally fine in IE10, but as soon as I use Firefox I get a stack overflow error, because Firefox's memory usage somehow rises and rises until it overflows. Does anyone have an idea why? I think the GC doesn't really collect my images after drawing them onto the canvas. Even when I use only 100 Image objects and just cycle their src before drawing them, the memory usage keeps rising. I will test Chrome and Safari soon, but I still need a solution, because everyone is using "the best browser", Firefox.
EDIT:
function play() {
  // calculated iLag here
  // calculated window.FrameCtr here
  var iFrameRate = Math.round(1000 / 25);

  var oImage = new Image();
  oImage.onload = function () {
    renderImage(this);
  };
  // window.Video is an array of window.URL.createObjectURL(data) (about 500 items)
  oImage.src = window.Video[window.FrameCtr];
  oImage = null;

  setTimeout(
    function () {
      play();
    }, iFrameRate - iLag
  );
}

function renderImage(oImage) {
  $("#video")[0].getContext("2d").drawImage(oImage, 0, 0);
}
I loop this video (500 items, 25fps) 10 times, and Firefox isn't even able to play it once because of the stack overflow.
As I mentioned before, it works fine in IE10 and even better in Chrome, so I don't think the problem here is the recursion. Is there any other way to get binary data into a canvas than using an Image object and setting its src?
It is already noted as a bug in Firefox; you can see the bug report here. It shows a last modified date of 2010-09-17, but I am not sure whether it has been resolved in newer versions.
That said, I would guess newer versions of Firefox should not have that problem.
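As for the asker's last question (another way to get binary data onto a canvas without an Image element), here is a hedged sketch using createImageBitmap(), which decodes a Blob directly and lets you release the decoded pixels explicitly. Note that window.VideoBlobs holding the raw frame Blobs is an assumption; the original code only stores object URLs.
// Sketch: draw a frame from a Blob without an Image element or object URL.
// window.VideoBlobs (assumed) holds the raw Blob for each frame.
function renderFrame(iFrame) {
  createImageBitmap(window.VideoBlobs[iFrame]).then(function (bitmap) {
    $("#video")[0].getContext("2d").drawImage(bitmap, 0, 0);
    bitmap.close(); // explicitly release the decoded pixel data
  });
}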
