HTML5 Video - Percentage Loaded? - javascript

Does anyone know what event or property I need to query in order to get a percentage figure for how much of an HTML5 video has loaded? I want to draw a CSS-styled "loaded" bar whose width represents this figure, just like YouTube or any other video player.
So, just like YouTube, the video should start playing before the whole file has loaded, while giving the user feedback on how much of the video has loaded and how much is still left to load.
Just like the red bar on YouTube:

The progress event is fired when some data has been downloaded, up to roughly three times per second. The browser exposes the ranges of available media through the buffered property; a thorough guide is available in Media buffering, seeking, and time ranges on MDN.
Single load start
If the user doesn't skip through the video, the file will be loaded in one TimeRange and the buffered property will have one range:
------------------------------------------------------
|=============|                                      |
------------------------------------------------------
0             5                                     21
|             \_ this.buffered.end(0)
|
\_ this.buffered.start(0)
To know how big that range is, read it this way:
video.addEventListener('progress', function() {
  var loadedPercentage = this.buffered.end(0) / this.duration;
  ...
  // suggestion: don't use this, use what's below
});
Multiple load starts
If the user changes the playhead position while it's loading, a new request may be triggered. This causes the buffered property to be fragmented:
------------------------------------------------------
|===========|             |===========|              |
------------------------------------------------------
1           5             15          19             21
|           |             |           \_ this.buffered.end(1)
|           |             \_ this.buffered.start(1)
|           \_ this.buffered.end(0)
\_ this.buffered.start(0)
Notice how the number of buffered ranges changes.
Since the loaded data is no longer contiguous, a single "percentage loaded" figure doesn't make much sense anymore. What you want to know is which TimeRange the playhead is currently in and how much of it is loaded. In this example you get where the load bar should start (since it isn't 0) and where it should end.
video.addEventListener('progress', function() {
  var range = 0;
  var bf = this.buffered;
  var time = this.currentTime;
  while (!(bf.start(range) <= time && time <= bf.end(range))) {
    range += 1;
  }
  var loadStartPercentage = bf.start(range) / this.duration;
  var loadEndPercentage = bf.end(range) / this.duration;
  var loadPercentage = loadEndPercentage - loadStartPercentage;
  ...
});

The other answers didn't work for me, so I started digging into this problem and this is what I came up with. The solution uses jQuery to create a progress bar.
function loaded() {
  var v = document.getElementById('videoID');
  var r = v.buffered;
  var total = v.duration;
  var start = r.start(0);
  var end = r.end(0);
  $("#progressB").progressbar({value: (end / total) * 100});
}
$('#videoID').bind('progress', function() {
  loaded();
});
I hope this helps others as well

Percentage fix for the loaded string. Outputs something like "99% loaded" inside the #loaded element:
function loaded() {
  var v = document.getElementById('videoID');
  var r = v.buffered;
  var total = v.duration;
  var start = r.start(0);
  var end = r.end(0);
  var newValue = (end / total) * 100;
  var loader = newValue.toString().split(".");
  $('#loaded').html(loader[0] + '% loaded...');
  $("#progress").progressbar({
    value: newValue
  });
}

I think the best event for updating the buffered progress bar is timeupdate; it fires whenever the current time of the media changes.
Inside the handler you can use the buffered property like this:
audio.addEventListener('timeupdate', function () {
  if (this.duration) {
    let range = 0;
    let bf = this.buffered;
    let time = this.currentTime;
    while (!(bf.start(range) <= time && time <= bf.end(range))) {
      range += 1;
    }
    let loadStartPercentage = bf.start(range) / this.duration;
    let loadEndPercentage = bf.end(range) / this.duration;
    let loadPercentage = (loadEndPercentage - loadStartPercentage) * 100;
    // Update your progress bar DOM here
  }
});
The big advantage of this event is that it fires while the media is playing, whereas the progress event only fires when the browser downloads more data and notifies you about it.
So, just like on YouTube, the buffered percentage is only updated while the media is playing.

My answer improves on the others because you also want to update the buffer progress while the video is paused, which the progress event covers; the timeupdate event acts as a fallback for when progress doesn't fire, as it sometimes does.
$("#video").on("timeupdate progress", function(){
var video = document.getElementById("video");
var vidDur = video.duration;
for(var i = 0; i <= vidDur; i++){
var totBuffX = video.buffered.end(i);
var perBuff = totBuffX/vidDur*100;
$("#xVidBuffX").css("width", perBuff+"%");
}
});
You only need video.buffered.end(i).
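If all you want is the furthest point that has buffered so far (rather than the range around the playhead), a minimal plain-JS sketch, assuming a video element with id "video" and a bar element with id "xVidBuffX", could look like this:
var video = document.getElementById("video");
var bar = document.getElementById("xVidBuffX");
video.addEventListener("progress", function () {
  // buffered may be empty before any data has arrived
  if (this.buffered.length && this.duration) {
    // end of the last buffered range = furthest point downloaded so far
    var bufferedEnd = this.buffered.end(this.buffered.length - 1);
    bar.style.width = (bufferedEnd / this.duration * 100) + "%";
  }
});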

Related

How to get a precise timeupdate on a video to return up to 2 decimal places (milliseconds)?

I have a video element where I want to compare its current time to a variable with up to 2 decimal places.
$(video1).on('timeupdate', function(){
  var currentTime = Math.round(this.currentTime);
  var durationNum = Math.round(this.duration);
  var formattedCurrentTime = secondsToHms(currentTime);
  var formattedDurationTime = secondsToHms(durationNum);
  onTrackedVideoFram(formattedCurrentTime, formattedDurationTime);
  if (currentTime > 10) { $(".box1, .box2").hide(); }
  if (currentTime == choiceHeart) {
    console.log("HERE");
    video1[0].pause();
  }
});
This is from a tutorial file, so the only way I know to get timeupdate to return a value is through that Math.round call, which returns a rounded whole number.
BUT I want to trigger an event, like pausing the video at, say, 2.5 minutes (or 2 minutes 15 frames, by After Effects 30 fps standards).
The way I could do it is to create a variable early in the document like this:
var triggerEvent = 2.5;
So how do I make it understand that 2.5 means 2 and a half minutes, and have it pause the video at that point?
PS:
In order to pause the video I plan to use an if statement:
if (currentTime == triggerEvent) {
  video1[0].pause();
}
Here video1 references a variable that points to the div which holds the video.
You can use Media Fragments URI to play exactly the time slices that you request.
Normal Play Time can either be specified as seconds, with an optional fractional part to indicate milliseconds, or as colon-separated hours, minutes and seconds (again with an optional fraction).
For example, the code below will play ten seconds of the media before the pause event is dispatched on the HTMLMediaElement:
video.src = "/path/to/media#t=0,10"
See also
HTML5 audio streaming: precisely measure latency?
How to make range request for any portion of a media fragment?
You can use the media fragment identifier #t=00:00,02:30 concatenated to the "src" attribute value to make a range request for the specified temporal dimension of the media resource.
When the media fragment identifier is set on the "src" attribute or .src property of the HTMLMediaElement and .play() is called, exactly two minutes and thirty seconds (02:30, i.e. 150 seconds) of media will play before the pause event is dispatched when 02:30 is reached.
<label></label><br>
<video width="300" height="200" controls></video>
<script>
  const video = document.querySelector("video");
  const label = document.querySelector("label");
  const src = "http://mirrors.creativecommons.org/movingimages/webm/ASharedCulture_480p.webm";
  const timeSlice = "#t=00:00,02:30";
  let raf;
  video.src = src + timeSlice;
  video.oncanplay = video.play;
  video.onpause = function(event) {
    video.onmouseenter = video.onmouseleave = raf = null;
    // do stuff
    if (Math.floor(video.currentTime) === 60 * 2.5) {
      console.log(video.currentTime, video.currentTime / 60);
    }
  }
  function update(t) {
    let curr = Math.floor(video.currentTime);
    let message = `${curr} seconds played, ${(60 * 2.5) - curr} to play before pause event`;
    label.innerHTML = message;
    raf = window.requestAnimationFrame(update);
  }
  video.onmouseenter = function() {
    raf = requestAnimationFrame(update);
  }
  video.onmouseleave = function() {
    label.innerHTML = "";
    window.cancelAnimationFrame(raf);
  }
</script>
You are rounding the variable used in the comparison.
Rounding gives the closest integer:
0.4 gives 0
0.6 gives 1
To get a floating-point number with two decimals, you can do this:
var currentTime = parseFloat(this.currentTime);
currentTime = parseFloat(currentTime.toFixed(2));
var currentTime = 2.8349354;
currentTime = parseFloat(currentTime.toFixed(2));
console.log(currentTime);
So you can use just this:
var currentTime = parseFloat(this.currentTime.toFixed(2));
And the same thing for durationNum if you also need it to be a float with 2 decimals.
EDIT
You could also try this... It should work on all browsers.
var currentTime = Math.round(this.currentTime*100)/100;
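Note also that timeupdate only fires a few times per second, so an exact equality check against a target time may never be true. A minimal sketch that pauses once the playhead reaches (or passes) the target instead, reusing the asker's video1 and triggerEvent names and assuming 2.5 means minutes:
var triggerEvent = 2.5;                 // minutes, as in the question
var triggerSeconds = triggerEvent * 60; // 150 seconds
$(video1).on('timeupdate', function() {
  // pause the first time the playhead reaches or passes the target
  if (!this.paused && this.currentTime >= triggerSeconds) {
    this.pause();
  }
});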

Calculate or track the seeked / wound / spooled time for HTML5 video

I would like to measure the time skipped during video playback when a user seeks.
Using video.currentTime
At first it looks quite trivial:
Listen to seeking and get the currentTime A
Listen to seeked and get the currentTime B
B - A = seeked time
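A minimal sketch of that naive approach (assuming a video element called video):
var seekStartTime = null;
video.addEventListener('seeking', function () {
  // A: the time when the seek starts (in Chrome this is already the target time, see below)
  if (seekStartTime === null) seekStartTime = video.currentTime;
});
video.addEventListener('seeked', function () {
  // B: the time when the seek ends
  console.log('seeked', video.currentTime - seekStartTime, 'seconds');
  seekStartTime = null;
});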
When I do that in Chrome, the result is 0.
Why is that? If you listen to timeupdate (TU) it becomes obvious.
Simplified Sequence:
Press Play
TU: 1
TU: 2
TU: 3 // now I use the mouse to seek forward to 19
TU: 19
//chrome & ff fire pause in between, ie not
Seek Start: 19
TU: 19
Seek End: 19
TU: 19
//chrome & ff fire play, ie not
TU: 20
...
I know I can play dirty and save the currentTime somewhere, but not in a reliable way ;-)
Using TimeRanges (video.played)
I can use TimeRanges to calculate the amount of time that got seeked/skipped. But the problem is: TimeRanges come as an ordered, normalized list. So if the user jumps back and forth, the ranges get merged and ordered, which makes them unreliable for accurate tracking (see the sketch below).
Is there an easier, less complicated approach I just don't see?
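For reference, reading video.played looks like this (a minimal sketch; it only yields the merged, ordered totals, which is exactly the limitation described above):
function totalPlayedSeconds(video) {
  var played = video.played; // TimeRanges, already normalized and ordered
  var total = 0;
  for (var i = 0; i < played.length; i++) {
    total += played.end(i) - played.start(i);
  }
  return total;
}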
How I solved it:
I used a ring buffer (size 10) from https://stackoverflow.com/a/28038535/2588818 and push Math.round(video.currentTime) into it on every timeupdate.
On seeked I walk backwards (prev) through the ring buffer and take the first value that differs from the current time.
This works well enough to use it for tracking the skipped time.
var createRingBuffer = createRingBuffer || function(length) {
  /* https://stackoverflow.com/a/4774081 */
  var pointer = 0, buffer = [];
  return {
    ...
    push: function(item) {
      buffer[pointer] = item;
      pointer = (pointer + 1) % length;
      return item;
    },
    ...
    getFirstDifference: function() {
      // add length before taking the modulo so the index never goes negative
      var last_value = buffer[(pointer - 1 + length) % length],
          prev_value;
      for (var i = 1; i <= length; i++) {
        prev_value = buffer[(pointer - i + length) % length];
        // check for undefined or initialize the buffer beforehand
        if (prev_value === undefined || last_value === prev_value) {
          // keep looking
        } else {
          return prev_value;
        }
      }
      return last_value;
    },
    print: function() {
      console.log(JSON.stringify(buffer));
    }
  };
};
...
var seekTimes = createRingBuffer(10);
handleUpdateTime = function() {
  seekTimes.push(Math.round(video.currentTime));
},
handleSeekEnd = function(e) {
  seekTimes.print();
  console.log("%s - %s", seekTimes.getFirstDifference(), Math.round(video.currentTime));
},
...

How to manipulate the contents of an audio tag and create derivative audio tags from it?

On my webpage, I have an audio file inside of an <audio> tag.
<!DOCTYPE html>
<html>
<audio src="myTrack.mp3" controls preload="auto"></audio>
</html>
I want to chop up the file stored in this <audio> tag into multiple 10-second audio files that I could then insert into the webpage as their own audio files in separate <audio> tags.
Is it possible to do this in JavaScript?
Yes, of course this is possible! :)
Make sure the audio fulfills CORS requirements so we can load it with AJAX (loading from the same origin as the page will of course fulfill this).
Load the file as an ArrayBuffer and decode it with an AudioContext.
Calculate the number of segments and the length of each (I use a time-based length independent of channels below).
Split the main buffer into smaller buffers.
Create a file wrapper for each new buffer (below I made a simple WAVE wrapper for the demo).
Feed that as a Blob via an object URL to a new instance of the Audio element.
Keep track of the object URLs so you can free them when they are no longer needed (revokeObjectURL()); a cleanup sketch follows below.
One drawback is of course that you have to load the entire file into memory before processing it.
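For the last step, freeing an object URL is just a call to URL.revokeObjectURL(); a hypothetical cleanup sketch (the audioElement and url names are assumptions, not part of the demo below):
function releaseSegment(audioElement, url) {
  audioElement.pause();
  audioElement.removeAttribute("src"); // drop the element's reference to the Blob
  audioElement.load();
  URL.revokeObjectURL(url);            // free the memory backing the object URL
}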
Example
Hopefully the file I'm using for the demo will remain available through the current CDN that allows CORS usage (I own the copyright; feel free to use it for testing, but only for testing!! :) ). The loading and decoding can take some time depending on your system and connection, so please be patient...
Ideally you should use an asynchronous approach to splitting the buffers, but the demo only covers the steps needed to make the buffer segments available as new file fragments.
Also note that I did not take into consideration that the last segment may be shorter than the others (I use floor; you should use ceil for the segment count and cut the last block length short); see the adjustment sketched after the demo. I'll leave the rest as an exercise for the reader...
var actx = new (AudioContext || webkitAudioContext)(),
    url = "//dl.dropboxusercontent.com/s/7ttdz6xsoaqbzdl/war_demo.mp3";

// STEP 1: Load audio file using AJAX ----------------------------------
fetch(url).then(function(resp) { return resp.arrayBuffer() }).then(decode);

// STEP 2: Decode the audio file ---------------------------------------
function decode(buffer) {
  actx.decodeAudioData(buffer, split);
}

// STEP 3: Split the buffer --------------------------------------------
function split(abuffer) {
  // calc number of segments and segment length
  var channels = abuffer.numberOfChannels,
      duration = abuffer.duration,
      rate = abuffer.sampleRate,
      segmentLen = 10,
      count = Math.floor(duration / segmentLen),
      offset = 0,
      block = 10 * rate;
  while (count--) {
    var url = URL.createObjectURL(bufferToWave(abuffer, offset, block));
    var audio = new Audio(url);
    audio.controls = true;
    audio.volume = 0.75;
    document.body.appendChild(audio);
    offset += block;
  }
}
// Convert an audio-buffer segment to a Blob using WAVE representation
function bufferToWave(abuffer, offset, len) {
  var numOfChan = abuffer.numberOfChannels,
      length = len * numOfChan * 2 + 44,
      buffer = new ArrayBuffer(length),
      view = new DataView(buffer),
      channels = [], i, sample,
      pos = 0;

  // write WAVE header
  setUint32(0x46464952);                         // "RIFF"
  setUint32(length - 8);                         // file length - 8
  setUint32(0x45564157);                         // "WAVE"
  setUint32(0x20746d66);                         // "fmt " chunk
  setUint32(16);                                 // length = 16
  setUint16(1);                                  // PCM (uncompressed)
  setUint16(numOfChan);
  setUint32(abuffer.sampleRate);
  setUint32(abuffer.sampleRate * 2 * numOfChan); // avg. bytes/sec
  setUint16(numOfChan * 2);                      // block-align
  setUint16(16);                                 // 16-bit (hardcoded in this demo)
  setUint32(0x61746164);                         // "data" - chunk
  setUint32(length - pos - 4);                   // chunk length

  // write interleaved data
  for (i = 0; i < abuffer.numberOfChannels; i++)
    channels.push(abuffer.getChannelData(i));

  while (pos < length) {
    for (i = 0; i < numOfChan; i++) {            // interleave channels
      sample = Math.max(-1, Math.min(1, channels[i][offset]));           // clamp
      sample = (0.5 + sample < 0 ? sample * 32768 : sample * 32767) | 0; // scale to 16-bit signed int
      view.setInt16(pos, sample, true);          // write to data chunk
      pos += 2;
    }
    offset++;                                    // next source sample
  }

  // create Blob
  return new Blob([buffer], {type: "audio/wav"});

  function setUint16(data) {
    view.setUint16(pos, data, true);
    pos += 2;
  }

  function setUint32(data) {
    view.setUint32(pos, data, true);
    pos += 4;
  }
}
audio {display:block;margin-bottom:1px}
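If you do want to include the trailing partial segment mentioned above, a minimal adjustment inside split() could look like this (an assumed sketch layered on the demo, not part of the original answer):
// use ceil so a trailing partial segment is not dropped,
// and shorten the last block to whatever sample-frames remain
var count = Math.ceil(duration / segmentLen);
while (count--) {
  var blockLen = Math.min(block, abuffer.length - offset);
  var url = URL.createObjectURL(bufferToWave(abuffer, offset, blockLen));
  var audio = new Audio(url);
  audio.controls = true;
  document.body.appendChild(audio);
  offset += blockLen;
}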

HTML canvas: animating a sequence of images is slow on iPad

I have built a script which takes a sequence of images and displays them on a canvas element in an animation loop.
This works really well on my desktop, but on an iPad 3 (Retina) it is very slow. Could you suggest any way to improve the performance?
var videoItts = 0;
function playVideo() {
  if (videoItts < 92) {
    setTimeout(function() {
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      ctx.drawImage(imagesL[videoItts], 0, 0, 1024, 636);
      requestAnimationFrame(playVideo);
      videoItts++;
    }, 1000 / 22);
  }
}
requestAnimationFrame(playVideo);
The imagesL is an array of pre-loaded images.
I would suggest not mixing setTimeout and requestAnimationFrame. You can solve it using only requestAnimationFrame:
var startTime = Date.now();
var fps = 22;
var lastDrawnIndex = null;
var totalFrames = 92;
function drawVideo() {
  var currTime = Date.now();
  var currFrameIndex = Math.round((currTime - startTime) / (1000 / fps)) % totalFrames;
  // Since requestAnimationFrame usually fires at 60 fps,
  // we only need to draw the image if the frame to draw
  // actually has changed since the last call to requestAnimationFrame
  if (currFrameIndex !== lastDrawnIndex) {
    ctx.drawImage(imagesL[currFrameIndex], 0, 0, 1024, 636);
    lastDrawnIndex = currFrameIndex;
  }
  requestAnimationFrame(drawVideo);
}
requestAnimationFrame(drawVideo);
The idea is that on every call to requestAnimationFrame we calculate, based on the elapsed time and the desired animation frame rate, which frame index to draw. If it differs from the last drawn frame index, we draw it. Then we schedule drawVideo to be called on the next animation frame by calling requestAnimationFrame(drawVideo) at the end.
The code above loops frames 0-91 continuously at 22 fps. I removed the ctx.clearRect call; it is only needed if the frames contain transparency, so you might want to add it back.

Jump around in a video to times from an array

Following along the lines of Control start position and duration of play in HTML5 video, I am trying to make a video jump from one segment to the next automatically when each has finished playing. Each segment will have the same duration, and the start times for each segment will be in an array.
I can't seem to figure out how to loop through the array after addEventListener.
var video = document.getElementById('player1');
function settheVariables() {
  var videoStartTime = ["4","15","26","39"];
  for (var i = 0; i < videoStartTime.length; i++) {
    if (video.currentTime < videoStartTime[0]) {
      video.currentTime = videoStartTime[i];
    }
    durationTime = 5;
  }
  // This part works when I plug in numbers for videoStartTime.
  video.addEventListener('timeupdate', function() {
    if (this.currentTime > (/* videoStartTime[current index] + durationTime */)) {
      this.currentTime = (/* videoStartTime with next index */);
      video.play();
    }
  });
}
}
You need to change the values in your array to integers, not strings; otherwise you're not comparing apples to apples.
The updated and somewhat simplified sample below plays (initially from the start of the video) until the timestamp hits the current marker plus five seconds, then jumps to the next marker (and loops back around).
It doesn't cater for the user scrubbing the video themselves (it will trap as soon as they go more than 5 seconds past the start of the current section, but scrubbing backwards will confuse things a little). If you want control within those 5-second boundaries, you'll need to do some smarter examination of the timestamp versus the array to make sure you're where you're supposed to be.
Anyway, the code:
<script>
  var video = document.getElementById('player1');
  var videoStartTime = [4, 15, 26, 39];
  durationTime = 5;
  currentIndex = 0;
  video.addEventListener('timeupdate', function() {
    // console.log(this.currentTime);
    if (this.currentTime > (videoStartTime[currentIndex] + durationTime)) {
      currentIndex = (currentIndex + 1) % videoStartTime.length; // this just loops us back around
      this.currentTime = videoStartTime[currentIndex];
      // video.play(); // don't need this if the video is already playing
    }
  });
</script>
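If you also want playback to begin at the first marker rather than at 0, one possible addition (an assumption, not part of the original answer) is to seek there once the metadata is available:
video.addEventListener('loadedmetadata', function() {
  // jump to the first marker before playback begins
  this.currentTime = videoStartTime[currentIndex];
});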
