HTML5 Web Audio - Slowed down audio playback cuts off early - javascript

I'm working on a web-based music sequencer/tracker, and I've noticed that in my sample playback routine, audio sources seem to exist only for the original duration of a sample, and that the Web Audio API doesn't seem to adjust the playback duration when I pitch-shift a sample. For instance, if I shift a note down an octave, the routine only plays the first half of the sound before cutting off. More intense downward pitch shifts result in even less of the sound playing, and while I can't confirm it, I suspect that speeding up the audio results in relatively long periods of silence before the sound exits the buffer.
Here's my audio playback routine at the moment. So far, a lot more work has gone into making sure other functions send the right data to this than into extending the functionality of this routine.
function playSound(buffer, pitch, dspEffect, dspValue, volume) {
    var source = audioEngine.createBufferSource();
    source.buffer = buffer;
    source.playbackRate.value = pitch;

    // Volume adjustment is handled before other DSP effects are added.
    var volumeAdjustment = audioEngine.createGain();
    source.connect(volumeAdjustment);

    // Very basic error trapping in case of bad volume input.
    if (volume >= 0 && volume <= 1) {
        volumeAdjustment.gain.value = volume;
    } else {
        volumeAdjustment.gain.value = 0.6;
    }

    switch (dspEffect) {
        case 'lowpass':
            var createLowPass = audioEngine.createBiquadFilter();
            volumeAdjustment.connect(createLowPass);
            createLowPass.connect(audioEngine.destination);
            createLowPass.type = 'lowpass';
            createLowPass.frequency.value = dspValue;
            break;
        // There are a couple of other optional DSP effects,
        // but so far they all operate in about the same fashion.
        default:
            // No recognised DSP effect: connect the gain node straight to the
            // output so the sample is still audible.
            volumeAdjustment.connect(audioEngine.destination);
            break;
    }

    source.start();
}
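For example, a call like the following (the values are illustrative, not from the original code) is expected to play the whole sample at half speed, i.e. one octave down:
playSound(someBuffer, 0.5, 'lowpass', 2000, 0.8);
In Chrome, a call like this currently stops about halfway through the sample.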
My intent is for samples to play back fully no matter how much pitch shifting is applied, and to limit the amount of pitch shifting allowed if necessary. I've found that appending several seconds of silence to a sound works around this issue, but it's cumbersome due to the large number of sounds I would need to process, and I'd prefer a code-based solution.
EDIT: Of the browsers I can test this in, this only appears to be an issue in Google Chrome. Samples play back fully in Firefox, Internet Explorer does not yet support the Web Audio API, and I do not have ready access to Safari or Opera. This definitely changes the nature of the help I'm looking for.

I've found that appending several seconds of silence to a sound works around this issue, but it's cumbersome due to the large amount of sounds I would need to process, and I'd prefer a code-based solution.
You could upload a sound file that is just several seconds of silence and append it to the actual audio file. Here is an SO answer that shows how to do this...
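If you would rather pad the buffers in code than edit the source files, here is a minimal sketch of that idea (it assumes the question's audioEngine context; the function name is illustrative):
function padWithSilence(buffer, extraSeconds) {
    var paddedLength = buffer.length + Math.ceil(extraSeconds * buffer.sampleRate);
    var padded = audioEngine.createBuffer(buffer.numberOfChannels, paddedLength, buffer.sampleRate);
    for (var channel = 0; channel < buffer.numberOfChannels; channel++) {
        // Copy the original samples; the rest of the new buffer is already silent.
        padded.getChannelData(channel).set(buffer.getChannelData(channel));
    }
    return padded;
}
Pass the padded buffer to playSound in place of the original one.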

Related

HTML5 Video: Don't stop playback time when no media is available

I'm developing an application where it is necessary that the video is at exactly the playback position it would be at if there were no shortage of data, even when such a shortage occurs.
e.g. video.currentTime == timeSinceTheVideoWasStarted
The tolerance is about 0.1s and it is also possible to increase or decrease the playback speed a little bit to match the time.
All the media data is cached in an ObjectURL, but decoding may be slow because the application is very computationally intensive.
I thought about setting the correct playback time in the playing event, but when data is present again and currentTime is updated, this will cause the next shortage of data.
I'm using the video element as a video texture source in WebGL as described here.
The fixed points are that the video is downloaded from a local server (for example http://localhost:8000/assets/foo.mp4) and displayed via WebGL. Your solutions may include video decoding methods other than the native <video>.
To reproduce it: do something CPU-intensive and play a video - it won't be smooth (as you would expect).
I hope you can help me.
EDIT:
The main thing I worry about is a situation I have already experienced many times during normal work: Windows itself is doing some synchronous disk IO, which makes everything wait, even the mouse cursor...
I'll try compensating for small time differences (< 1 s) by adjusting the playback speed.
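A minimal sketch of that compensation idea (assuming the <video> element is stored in video; the thresholds are illustrative, not from the question): compare currentTime with the elapsed wall-clock time on a timer, nudge playbackRate for small gaps, and seek only when the gap gets large.
var startTime = performance.now() / 1000;
setInterval(function () {
    var target = performance.now() / 1000 - startTime;
    var drift = target - video.currentTime;           // positive: the video is behind
    if (Math.abs(drift) > 1) {
        video.currentTime = target;                    // large gap: hard seek
    } else if (Math.abs(drift) > 0.1) {
        video.playbackRate = drift > 0 ? 1.1 : 0.9;    // small gap: speed up / slow down
    } else {
        video.playbackRate = 1;
    }
}, 250);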

Playing video frame-by-frame very slow on Chrome/Firefox but not on Safari

I have a use case that consists of seeking to specific frames in a video. The frequency of frame changes can be quite high, as it depends on mouse movements. My problem is that, depending on the browser, the updates can be very smooth (Safari) or introduce a lot of latency (Chrome/Firefox).
I came up with an example to test the differences between browsers using the following code (here is a codepen). I am using this video, which is 10 seconds long at 60 fps, and I play it frame by frame by changing its currentTime at an interval equal to the video's frame duration.
HTML
<video id="player" src="https://test-videos.co.uk/vids/bigbuckbunny/mp4/h264/360/Big_Buck_Bunny_360_10s_1MB.mp4" preload="auto"/>
JS
window.onload = () => {
    const player = document.getElementById("player")
    setInterval(() => {
        player.currentTime = (player.currentTime + 0.016) % 10
    }, 16)
}
The main problem, it seems to me, is that Chrome and Firefox simply handle video time changes differently than Safari, and my use case happens to be well supported in the latter. Is there any way to make this work smoothly in Chrome/Firefox as well? I checked a few video player libraries (e.g., plyr, react-player, video-js), but they are all based on the native HTML5 player.
For anyone having similar issues, I found out the problem has to do with the video encoding. The significant parameter is the keyframe interval, which, if I understand correctly, defines the size of the group of frames that a given frame refers to when it is rendered. The smaller this number, the faster a seek can be rendered, but I guess the files can grow bigger because less information is compressed.
You can modify this parameter using the -g and -keyint_min options in ffmpeg: https://ffmpeg.org/ffmpeg-codecs.html.
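For example, a re-encode that caps the GOP length at 30 frames could look something like this (an illustrative command, not taken from the original answer):
ffmpeg -i input.mp4 -g 30 -keyint_min 30 output.mp4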

How to keep a live MediaSource video stream in-sync?

I have a server application which renders a 30 FPS video stream then encodes and muxes it in real-time into a WebM Byte Stream.
On the client side, an HTML5 page opens a WebSocket to the server, which starts generating the stream when connection is accepted. After the header is delivered, each subsequent WebSocket frame consists of a single WebM SimpleBlock. A keyframe occurs every 15 frames and when this happens a new Cluster is started.
The client also creates a MediaSource, and on receiving a frame from the WS, appends the content to its active buffer. The <video> starts playback immediately after the first frame is appended.
Everything works reasonably well. My only issue is that the network jitter causes the playback position to drift from the actual time after a while. My current solution is to hook into the updateend event, check the difference between the video.currentTime and the timecode on the incoming Cluster and manually update the currentTime if it falls outside an acceptable range. Unfortunately, this causes a noticeable pause and jump in the playback which is rather unpleasant.
The solution also feels a bit odd: I know exactly where the latest keyframe is, yet I have to convert it into a whole second (as per the W3C spec) before I can pass it into currentTime, where the browser presumably has to then go around and find the nearest keyframe.
My question is this: is there a way to tell the Media Element to always seek to the latest keyframe available, or keep the playback time synchronised with the system clock time?
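For reference, the client-side flow described above looks roughly like the following sketch (the WebSocket URL and MIME type are illustrative assumptions; incoming frames are queued because appendBuffer throws if called while a previous append is still in progress):
var video = document.querySelector('video');
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
    var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
    var queue = [];

    sourceBuffer.addEventListener('updateend', function () {
        if (queue.length > 0 && !sourceBuffer.updating) {
            sourceBuffer.appendBuffer(queue.shift());
        }
    });

    var ws = new WebSocket('ws://localhost:8080/stream'); // illustrative URL
    ws.binaryType = 'arraybuffer';
    ws.onmessage = function (event) {
        if (sourceBuffer.updating || queue.length > 0) {
            queue.push(event.data);
        } else {
            sourceBuffer.appendBuffer(event.data);
        }
    };
});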
network jitter causes the playback position to drift
That's not your problem. If you are experiencing drop-outs in the stream, you aren't buffering enough before beginning playback; make sure playback has an appropriately sized buffer behind it, even if that means being a few seconds behind realtime (which is normal).
My current solution is to hook into the updateend event, check the difference between the video.currentTime and the timecode on the incoming Cluster
That's close to the correct method. I suggest you ignore the timecode of the incoming cluster and instead inspect your buffered time ranges. What you've received in the WebM cluster and what's actually been decoded are two different things.
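A minimal sketch of that approach (illustrative, not from the answer; the acceptable lag of one second is an assumption): look at the end of what has actually been buffered and only correct currentTime when playback has fallen too far behind it.
sourceBuffer.addEventListener('updateend', function () {
    if (video.buffered.length === 0) return;
    var bufferedEnd = video.buffered.end(video.buffered.length - 1);
    var lag = bufferedEnd - video.currentTime;
    if (lag > 1.0) {
        // Jump close to the live edge, leaving a little headroom to decode.
        video.currentTime = bufferedEnd - 0.5;
    }
});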
Unfortunately, this causes a noticeable pause and jump in the playback which is rather unpleasant.
How else would you do it? You can either jump to realtime, or you can increase the playback speed to catch up to realtime. Either way, if you want to catch up to realtime, you have to make up that lost time somehow.
The solution also feels a bit odd: I know exactly where the latest keyframe is
You may, but the player doesn't until that media is decoded. In any case, keyframe is irrelevant... you can seek to non-keyframe locations. The browser will decode ahead of P/B-frames as required.
I have to convert it into a whole second (as per the W3C spec) before I can pass it into currentTime
That's totally false. The currentTime is specified as a double. https://www.w3.org/TR/2011/WD-html5-20110113/video.html#dom-media-currenttime
My question is this: is there a way to tell the Media Element to always seek to the latest keyframe available, or keep the playback time synchronised with the system clock time?
It's going to play the latest buffered data automatically. You don't need to do anything. You're doing your job by ensuring media data lands in the buffer and setting the playback position as close to it as is reasonable. You can always advance it forward if network conditions change in a way that allows you to, but frankly it sounds as if you simply have broken code and a broken buffering strategy. Otherwise, playback would simply be smooth.
Catching up after falling behind is not going to happen automatically, nor should it. If the player pauses because the buffer has drained, the buffer needs to be built back up before playback can resume. That's the whole point of the buffer.
Furthermore, your expectation of keeping anything in time with the system clock is not a good idea and is unreasonable. Different devices have different refresh rates and will handle video at different rates. Just hit play and let it play. If you end up several seconds off, go ahead and set currentTime, but be very confident of what you've buffered before doing so.

When changing video currentTime too much, video fails?

I'm using a user drag event as well as keypresses to change the position in an HTML5 video element and then updating the video time accordingly using:
video.currentTime = toTime;
and then I am updating a canvas based on the video position by grabbing the current video frame and drawing it to the canvas.
Another factor is that I actually derive the video time from a frame number, i.e.:
framenumber = 123;
fps = 25;
toTime = framenumber / fps;
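A minimal sketch of that flow (the element ids and the fps value are illustrative assumptions, not from the question):
var video = document.getElementById('source-video');
var canvas = document.getElementById('preview');
var ctx = canvas.getContext('2d');
var fps = 25;

function showFrame(framenumber) {
    video.currentTime = framenumber / fps;
}

// Draw the frame to the canvas once the seek has actually completed.
video.addEventListener('seeked', function () {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
});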
Problem is, every so often it just fails. By fails I mean I lose the video; it just stops working altogether.
Most of the time it works great but sometimes it just fails, and not always at the same point either...
Any ideas would be much appreciated!
There were 2 answers to my question:
1. Encoding of the video files - basically, by controlling keyframes and sending the right video to the right browser I was able to solve a lot of problems. Using FFMPEG I changed the GOP length:
ffmpeg -g <frames> in my case, where <frames> is the number of frames desired between GOP points.
2. Using videojs to serve up the video seemed to solve a lot of problems and made for a smoother experience.

variable speed control for audio playback in the browser?

Is there a way to change the playback speed of audio in the browser? What is best to accomplish this task: HTML5 audio, Flash, or something else? Are there any specific libraries that would help with this?
Use the Web Audio API.
I answered your other question with code that applies here as well:
best way to loop audio in the browser?
Modify the code in the answer linked above as follows for a playback speed example.
Right below
source.loop = loopOnOff;
add
source.playbackRate.value = 1; // change this to anything between 0.10 and 10 (or larger/smaller) to test.
You can also run the HTML audio tag through the Web Audio API and add effects processing.
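A minimal sketch of both approaches (the element id is an illustrative assumption): setting playbackRate directly on the element, and routing the same element through the Web Audio API for additional processing.
var audio = document.getElementById('track');
audio.playbackRate = 1.5; // element-level speed control; 1 is normal speed

// Route the element through a Web Audio graph to add effects processing.
var context = new (window.AudioContext || window.webkitAudioContext)();
var mediaElementSource = context.createMediaElementSource(audio);
var gain = context.createGain();
mediaElementSource.connect(gain);
gain.connect(context.destination);
audio.play();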
Interesting question there,
HTML5 will have playback speed control built into the player.
A couple of noteworthy upcoming features are playbackRate and defaultPlaybackRate. As you can probably imagine, these fellas let us alter the speed and direction of playback. This functionality could be used for fast-forward and rewind functions or perhaps to allow users to tweak the playback speed so they can fit more podcasts into their day.
audio.playbackRate returns 1 at normal speed and acts as a multiplier applied to the rate of playback. For example, setting playbackRate to 2 would double the speed, while negative values such as -1 are meant to play the media backwards (browser support for negative rates is limited).
audio.defaultPlaybackRate is the rate at which the audio will play after you pause and restart the media (or issue any event for that matter).
A Flash player may also help, but it would be a customized one that you create yourself, with a stream buffer; you define the player speed once the buffer has content to play.
That sounds easy but will take a lot of effort.
Refer to the open-source VLC for a better idea; it is documented together with ffmpeg, which handles the audio and works as client software. In the browser this would be heavy - refer to it just to get an idea.
I hope this may help :)
