I am currently trying to implement an audio player for my Angular web application, following a tutorial on Google Developers and some help I found in "Can't seek video when playing from MediaSource".
The big difference in my case is that I want to stream the audio chunk-wise, so that the user does not have to download the entire file in order to start listening right away.
Listening to a track from start to finish already works: I simply download byte chunks from the server and append each chunk to the SourceBuffer as it arrives.
However, I am not able to implement the "seek" functionality.
I do not quite understand how to handle this on the client. At the moment I only work with mp3 files. I cannot find any example where seeking is explained.
I know that setting currentTime of an audio element will trigger a seeking event according to the Media Events doc.
We have:
this.audioObj = document.createElement('audio');
and a setter:
public seekTo(seconds) {
    this.logger.debug(`seekTo ${seconds.toFixed(2)} seconds`);
    // Setting currentTime will cause a "seeking" event to be emitted.
    this.audioObj.currentTime = seconds;
}
I think that I have to load the new data for the target position and append it to the SourceBuffer before I set currentTime. However, simply appending the new chunks the way I do now cannot work, because they would end up at the wrong position on the timeline.
How can I make this work?
If you've set the duration, you should be able to set the currentTime but there will be nothing to play. You can use the currentTime to inform your chunk fetcher which portion of the file you need and append it so the media element can continue to play.
Note that when using mp3, the SourceBuffer will be operating in sequence mode since there are no timestamps, which means that if you just blindly append bytes they will not be at the correct point in time - I believe you need to set timestampOffset to the relevant time for the frame in question (I haven't tried this myself).
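A rough sketch of that idea; here sourceBuffer is assumed to be the SourceBuffer already attached to the MediaSource feeding this.audioObj, and fetchChunkForTime is a hypothetical helper that asks your server for the mp3 bytes covering a given time (MSE itself does not provide the time-to-byte mapping):

this.audioObj.addEventListener('seeking', async () => {
    const t = this.audioObj.currentTime;
    const chunk = await fetchChunkForTime(t); // ArrayBuffer of mp3 frames for that position

    // In sequence mode appended frames land right after the previous append,
    // so tell the SourceBuffer where this chunk belongs on the timeline.
    // (A real implementation must wait for 'updateend' before touching
    // timestampOffset or calling appendBuffer while an append is in flight.)
    sourceBuffer.timestampOffset = t;
    sourceBuffer.appendBuffer(chunk);
});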
I'm developing an Electron app (JavaScript) for audio visualization. There is a Playlist() instance which receives the audio file paths the user wants to play. When the first audio file finishes, it plays the next one. So far so good. The app does intense computational work, extracting audio features from each channel, re-rendering canvases and animating plots. It does it beautifully.
The problem is: each time the app plays the next file, it gets slower, as if all the previous audio data were still held somewhere. I found the close() method of AudioContext() in the documentation:
"The close() method of the AudioContext Interface closes the audio context, releasing any system audio resources that it uses."
"An AudioContext can now be explicitly closed, thereby releasing any hardware resources associated with the AudioContext. Without this, developers had to depend on garbage collection of the AudioContext to release hardware resources."
I also have found this example of closing and restarting audio contexts:
https://github.com/mdn/webaudio-examples/blob/master/audiocontext-states/index.html
https://mdn.github.io/webaudio-examples/audiocontext-states/
The problem is that I use audioContext.createMediaElementSource(HTMLelementID), and it doesn't allow me to restart everything by recreating all the nodes like in the example. A simplified version of what I did before is:
class Audio {
    // everything is a property here for debugging reasons
    constructor(audioElementID, playlistObj) {
        this.audioContext = new AudioContext();
        this.audioElement = document.getElementById(audioElementID);
        this.track = this.audioContext.createMediaElementSource(this.audioElement);
        this.gainNode = this.audioContext.createGain();
        this.track.connect(this.gainNode);
        this.gainNode.connect(this.audioContext.destination);
        this.audioElement.addEventListener('ended', () => {
            // changes the src of the html element (audioElementID) and sets this.audioElement.currentTime to 0
            playlistObj.playNextTrack();
        });
    }
}
const audio = new Audio('audioID', playlist);
// playlist defined somewhere else
To implement the close() method I had to change it to this (just like in the example: a function that recreates everything again):
class Audio {
    constructor(audioElementID, playlistObj) {
        this.createAudioContext = () => {
            this.audioContext = new AudioContext();
            this.audioElement = document.getElementById(audioElementID);
            this.track = this.audioContext.createMediaElementSource(this.audioElement);
            this.gainNode = this.audioContext.createGain();
            this.track.connect(this.gainNode);
            this.gainNode.connect(this.audioContext.destination);
            this.audioElement.addEventListener('ended', () => {
                // changes the src of the html element (audioElementID) and sets this.audioElement.currentTime to 0
                playlistObj.playNextTrack();
            });
        };
        this.createAudioContext();
    }
}
In playlist.playNextTrack() I pause the audioElement, call audio.audioContext.close(), wait for it (it's a promise), call audio.createAudioContext() to recreate everything, and play. The logic fails at this.track = this.audioContext.createMediaElementSource(this.audioElement) with:
"Failed to execute 'createMediaElementSource' on 'BaseAudioContext': HTMLMediaElement already connected previously to a different MediaElementSourceNode, at Audio.createAudioContext"
In the example, the audio source is just a random oscillator and not an mp3 audio file.
I'm really stuck here. I don't know what to do. I'm not even sure if AudioContext() really holds data from all the previous audio files, causing this performance problem. And if so, how could I reconnect the HTMLMediaElement to the new node that audio.createAudioContext() creates? I've already tried audio.track.disconnect(), but it doesn't work (as it shouldn't, because here I'm disconnecting track from gainNode). And audioElement doesn't have a disconnect() method, as it's just an html element.
Any idea?
UPDATE:
I worked around the problem of recreating the audio context by deleting and recreating the html element. But the problem persists: the more new audio files are played, the slower the app gets. More precisely: the more new AudioContext() instances are created, the slower it gets (even if I close the previous one).
I'm really stuck here. I don't know what to do. I'm not even sure if AudioContext() really holds data from all the previous audio files, causing this performance problem.
No, it's really unlikely this is the case. The AudioContext sets up things like the sample rate, output destination, and the graph. That's all.
The close() method of the AudioContext Interface closes the audio context, releasing any system audio resources that it uses.
You're misunderstanding what this means. Those "system audio resources" are the sound devices. While the AudioContext is running, there is an audio device requested. This is particularly meaningful in low power environments, like mobile. Another example would be Bluetooth. If the AudioContext is kept running, your Bluetooth headset may just stay on. If the AudioContext is allowed to close, then the Bluetooth headset may go to sleep.
And if so, how could I reconnect the HTMLMediaElement to the new node that audio.createAudioContext() creates?
You don't. While it would be nice if the API supported this, it seems it doesn't. Simply create a new HTMLMediaElement.
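A minimal sketch of that approach, keeping one long-lived AudioContext and creating a fresh element plus a fresh MediaElementSourceNode per track (the names here are placeholders, not your actual code):

const audioContext = new AudioContext();
const gainNode = audioContext.createGain();
gainNode.connect(audioContext.destination);

let currentElement = null;

function playFile(src) {
    // Throw away the old element; its MediaElementSourceNode goes with it.
    if (currentElement) {
        currentElement.pause();
        currentElement.remove();
    }

    currentElement = document.createElement('audio');
    currentElement.src = src;
    document.body.appendChild(currentElement);

    const track = audioContext.createMediaElementSource(currentElement);
    track.connect(gainNode);
    currentElement.play();
}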
What you should do is properly profile your application to figure out where the slowdown is occurring. Use your developer tools. It might be faster, though, just to start commenting out sections of things that are running. We certainly can't tell you where the problem is, specifically, from the code you've shown.
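For example, a quick way to time the heavy per-track work is the Performance API; analyseTrack below is just a stand-in for whatever your feature-extraction call is:

// Wrap the suspected hot path and compare the numbers across several tracks.
performance.mark('analysis-start');
analyseTrack(currentTrack); // placeholder for your own code
performance.mark('analysis-end');
performance.measure('track analysis', 'analysis-start', 'analysis-end');

const entry = performance.getEntriesByName('track analysis').pop();
console.log(`analysis took ${entry.duration.toFixed(1)} ms`);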
I have a Python/Flask application that sends MJPEG video to an <img> element.
It works fine as far as the streaming goes, but I have a problem aborting the video. To abort the video on my phone (Android) I have to click a link to another page, or else the stream continues.
Currently I am controlling the stream with JavaScript, setting the "src" to either a URL for a static image from the cam, or a URL to the video stream.
But between the src changes I first change it to "#".
A problem with Flask is that while one client is receiving the stream (using a generator & yield), no other client can communicate with the server. This might be the source of the problem?!
So, with JavaScript I control the stream with the following code:
if (streaming == false) {
    document.getElementById(img_id).src = C_vidsource;
    streaming = true;
} else {
    var currDate = new Date();
    document.getElementById(img_id).src = "#";
    document.getElementById(img_id).src = C_statimage + "?" + currDate.getTime();
    streaming = false;
}
I control this using a simple
I think that Android's web browser differs from the one I am using on the computer. It seems like it tries to download content before changing anything on the page, so it lets the video stream continue until the new image is loaded. But the new image will not be loaded until the stream has stopped.
Is there a way to solve this?
Thanks!
Directly after I posted the question I found the solution.
I added a delay between the two src changes.
After:
document.getElementById(img_id).src = "";
I added
sleep(1000);
And sleep is a function I created (a very dirty one):
function sleep(ms) {
    var stoptime = Date.now() + ms;
    while (Date.now() < stoptime) { }
    return;
}
I guess that for a longer sleep this is not a good solution, but it solves my problem, or at least gives me a hint about what to search for.
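The busy-wait blocks the whole page for the full second; if the delay ever needs to be longer, a non-blocking version with setTimeout does the same job (same element and variables as above):

// Stop the stream first, then swap in the static image after a pause
// without freezing the page.
document.getElementById(img_id).src = "";
setTimeout(function () {
    var currDate = new Date();
    document.getElementById(img_id).src = C_statimage + "?" + currDate.getTime();
    streaming = false;
}, 1000);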
I'm developing a webapp that (in part) records some audio using recorder.js, and sends it to a server. I'm trying to target Firefox, so I have to use this hack to keep the audio source from cutting off:
// Hack for a Firefox bug that stops input after a few seconds
window.source = audio_context.createMediaStreamSource(stream);
source.connect(audio_context.destination);
I think that this is causing audio to be played back through the computer speakers, but I'm not sure. I'm kind of a newbie when it comes to web audio. My goal is to eliminate the audio that is being played out of the speakers.
EDIT:
Here's a link to my JS file on Github: https://github.com/miller9904/Jonathan/blob/master/js/main.js#L87
You have to connect the source to the node (the one through which you retrieve the data you are going to record); replace this.node with whatever variable name you have assigned to your processing node.
window.source.connect(this.node);
//this.node.connect(this.context.destination);
Edit: I just checked, and connecting to the destination is not compulsory. Also make sure your node variable does not get garbage collected (which I am assuming is happening in your case, since recording stops after a few seconds).
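A minimal sketch of that setup (not recorder.js itself; it assumes a ScriptProcessorNode is what does the recording, which was the usual approach at the time):

var audio_context = new AudioContext();

navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
    // Keep both references on a long-lived object (window here) so they
    // are not garbage collected while recording.
    window.source = audio_context.createMediaStreamSource(stream);
    window.processor = audio_context.createScriptProcessor(4096, 1, 1);

    window.processor.onaudioprocess = function (e) {
        var samples = e.inputBuffer.getChannelData(0);
        // hand `samples` to your recording buffer here
    };

    window.source.connect(window.processor);

    // If you find the node is not being pulled (or audio leaks to the
    // speakers), route it through a muted gain node instead of connecting
    // straight to the destination.
    var mute = audio_context.createGain();
    mute.gain.value = 0;
    window.processor.connect(mute);
    mute.connect(audio_context.destination);
});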
In the document.ready function, I have this:
audioElement = document.createElement('audio');
audioElement.setAttribute('src', 'http://www.mfiles.co.uk/mp3-downloads/Toccata-and-Fugue-Dm.mp3');

$('#ToggleStart').click(function () {
    audioElement.play();
});

$('#ToggleStop').click(function () {
    audioElement.pause();
});
The problem is that the MP3 is downloaded when the page loads, which causes significant load time since the MP3 is over 2MB. What I want is the MP3 to be streamed. Is this possible and if so, what do I need to change?
jsFiddle here
You're very close to getting it right. I've had a look at your JSFiddle and noticed that the audio does stream already (I can play the file before it's finished downloading). You can easily see this by checking the Network traffic in your browser:
Chrome displays 'partial content' but is playing the mp3 at the same time. Your specific problem seems to be that it is downloading and playing too early. So if we take a look at the spec we can see some options.
preload = "none" or "metadata" or "auto" or "" (empty string) or empty
Represents a hint to the UA about whether optimistic downloading of the audio stream itself or its metadata is considered worthwhile.
- "none": Hints to the UA that the user is not expected to need the audio stream, or that minimizing unnecessary traffic is desirable.
- "metadata": Hints to the UA that the user is not expected to need the audio stream, but that fetching its metadata (duration and so on) is desirable.
- "auto": Hints to the UA that optimistically downloading the entire audio stream is considered desirable.
As you're not displaying any information about the audio file we can ignore the metadata option; this means you want to set the preload="none" attribute. Therefore if you change your JSFiddle slightly to dynamically set this:
audioElement.setAttribute('preload', "none");
audioElement.setAttribute('src', 'http://www.mfiles.co.uk/mp3-downloads/Toccata-and-Fugue-Dm.mp3');
Here is a JSFiddle showing the result; if you bring up the Network tab in Chrome you'll see that the download doesn't start until you begin playing the mp3.
I'm making an audio heavy webpage. I've read that there are some issues with audio playback on certain systems that are solved by calling the load() method before the play() method, so I'm designing everything around that premise.
I'm clueless about the audio element, and I'm worried that the load() method is increasing the bandwidth consumption. This is what I'm doing:
var x = new Audio("x.mp3");

function playMe() {
    x.load();
    x.play();
}
It's my understanding that the audio file is downloaded when the x Audio object is created. My concern is if the load() method is downloading it again every time the play button is clicked.
Thanks for your time.
You can check for yourself whether the x.load() method re-downloads the file:
open your browser developer tools [e.g. in Chrome];
check the Network tab for activity on x.load() calls.
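If you prefer a programmatic check, the element fires a loadstart event each time it begins looking for the media resource, so a small addition to your own snippet shows how often a fetch actually starts (the Network tab still tells you whether the bytes come over the wire or from cache):

var x = new Audio("x.mp3");

// One log line per 'loadstart' = one resource-selection run.
x.addEventListener('loadstart', function () {
    console.log('load started at', new Date().toISOString());
});

function playMe() {
    x.load();
    x.play();
}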