HTML5 video full preload in JavaScript

I have a high quality video which I cannot compress too much as it's going to be the base of a lot of image analysis whereby each frame will be redrawn into the canvas and then manipulated.
I'm trying to preload the whole thing before playing it as I can't have the video stop, buffer and continue. Is there an event which I can listen for which signifies that the whole video has preloaded before I commence playback?
Here's how I'm doing it in JS/jQuery:
this.canvas = this.el.find("canvas")[0];
this.video = this.el.find("video")[0];
this.ctx = this.canvas.getContext("2d");
this.video.autoplay = false;
this.video.addEventListener("play",this.draw)
this.video.addEventListener("timeupdate",this.draw)
this.video.addeventlistener("ended",this.trigger("complete",this))

This will load the entire video in JavaScript:
var r = new XMLHttpRequest();
r.onload = function() {
    // The response is a Blob; create an object URL for it and start playback
    myVid.src = URL.createObjectURL(r.response);
    myVid.play();
};

if (myVid.canPlayType('video/mp4;codecs="avc1.42E01E, mp4a.40.2"')) {
    r.open("GET", "slide.mp4");
} else {
    r.open("GET", "slide.webm");
}

r.responseType = "blob";
r.send();

canplaythrough is the event that should fire when enough data has downloaded to play without buffering.
From the Opera team's excellent (although maybe slightly dated now) resource Everything you need to know about HTML5 video and audio:
If the load is successful, whether using the src attribute or using source elements, then as data is being downloaded, progress events are fired. When enough data has been loaded to determine the video's dimensions and duration, a loadedmetadata event is fired. When enough data has been loaded to render a frame, the loadeddata event is fired. When enough data has been loaded to be able to play a little bit of the video, a canplay event is fired. When the browser determines that it can play through the whole video without stopping for downloading more data, a canplaythrough event is fired; this is also when the video starts playing if it has an autoplay attribute.
'canplaythrough' support matrix available here: https://caniuse.com/mdn-api_htmlmediaelement_canplaythrough_event
You can get around the support limitations by binding the load event to the same function, as it will also fire in those cases.
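As a minimal sketch of that approach (assuming this.video is the element from the question), you could hold playback until canplaythrough fires. Note that canplaythrough is the browser's estimate that it can play to the end without stalling, not a guarantee that the whole file is downloaded:
var vid = this.video;
vid.preload = "auto";
vid.addEventListener("canplaythrough", function onReady() {
    // remove the listener so a later seek does not restart playback
    vid.removeEventListener("canplaythrough", onReady);
    vid.play();
});
vid.load();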

1. Download the video using fetch
2. Convert the response to a blob
3. Create an object URL from the blob (e.g. blob:http://localhost:8080/df3c4336-2d9f-4ba9-9714-2e9e6b2b8888)
async function preloadVideo(src) {
    const res = await fetch(src);
    const blob = await res.blob();
    return URL.createObjectURL(blob);
}
Usage:
const video = document.createElement("video");
video.src = await preloadVideo("https://example.com/video.mp4");

Hope this helps:
var xhrReq = new XMLHttpRequest();
xhrReq.open('GET', 'yourVideoSrc', true);
xhrReq.responseType = 'blob';

xhrReq.onload = function() {
    if (this.status === 200) {
        var vid = URL.createObjectURL(this.response);
        video.src = vid;
    }
};

xhrReq.onerror = function() {
    console.log('err', arguments);
};

xhrReq.onprogress = function(e) {
    if (e.lengthComputable) {
        var percentComplete = ((e.loaded / e.total) * 100 | 0) + '%';
        console.log('progress: ', percentComplete);
    }
};

xhrReq.send();
And then, if your video src is on another domain, you have to handle CORS.
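Roughly, that means the server hosting the video has to allow your origin for the XHR above to succeed; a sketch of what is involved (the header value is an assumption about your deployment):
// The server serving the video must answer the XHR with a CORS header, e.g.:
//   Access-Control-Allow-Origin: https://your-app.example   (assumed origin)
// Without it, the cross-origin XHR fails. If you instead point the <video>
// element directly at the remote URL and draw its frames into a canvas,
// you would also need CORS-enabled fetching on the element:
video.crossOrigin = "anonymous";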

So far the most reliable solution we found was to play it and wait for the buffer to be fully loaded.
Which means if the video is long, you will have to wait for almost the entire video length.
That isn't cool, I know.
Wondering if someone has figured out some other magically reliable way of doing it (ideally using something like PreloadJS, which automatically falls back to Flash when HTML5 video isn't supported).
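For illustration, a rough sketch of that buffer-waiting approach (it assumes the browser keeps everything in a single growing buffered range, which is not guaranteed):
// Poll the buffered ranges until the last range reaches (almost) the duration.
function waitForFullBuffer(video, done) {
    var timer = setInterval(function () {
        var buffered = video.buffered;
        if (buffered.length && buffered.end(buffered.length - 1) >= video.duration - 0.5) {
            clearInterval(timer);
            done();
        }
    }, 500);
}
// usage: waitForFullBuffer(video, function () { video.play(); });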

You can use this nice plugin:
https://github.com/GianlucaGuarini/jquery.html5loader
In its API there is an onComplete event that is triggered when the plugin finishes loading all of the sources.

Does this work?
video.onloadeddata = function() {
    video.onseeked = function() {
        if (video.seekable.end(0) >= video.duration - 0.1) {
            alert("Video is all loaded!");
        } else {
            video.currentTime = video.buffered.end(0); // Seek ahead to force more buffering
        }
    };
    video.currentTime = 0; // first seek to trigger the event
};

Related

Using requested XHR data while actually loading

Maybe I just got it wrong but... I'm requesting "large" files via AJAX (180 MB - 500 MB). I thought that I'm able to fetch and use the data with the URL.createObjectURL method while it's actually loading? I need the requested data within 5 seconds, but it actually takes 16 seconds to load.
AJAX request
xhr.onload (worked within 5 seconds or faster, locally, but not live)
Within the onload (or progress, onreadystatechange (I tried)) I used URL.createObjectURL(xhr.response) to get the data
var nxtclp = new XMLHttpRequest();

nxtclp.onload = function() {
    get_src = URL.createObjectURL(nxtclp.response);
    that.preloadSource = get_src;
};

nxtclp.open("GET", "media/vid.mp4");
nxtclp.responseType = "blob";
nxtclp.send();
Is there any way to play back the data while it is still loading?
Use the autoplay attribute on the <video> element:
autoplay
A Boolean attribute; if specified, the video automatically begins to play back as soon as it can do so without stopping to finish loading the data.
<video controls autoplay src="http://mirrors.creativecommons.org/movingimages/webm/ScienceCommonsJesseDylan_240p.webm">
</video>
Alternatively, using JavaScript:
var video = document.createElement("video");
video.autoplay = true;
video.controls = true;

video.onloadedmetadata = (e) => {
    console.log(video.readyState);
    document.body.appendChild(video); // e.path is non-standard; append the element directly
};

video.src = "http://mirrors.creativecommons.org/movingimages/webm/ScienceCommonsJesseDylan_240p.webm";

How can I play audio elements in sync with key presses?

HTML
<textarea id="words" name="words"></textarea>
<audio id="type" src="type.mp3"></audio>
JS
document.getElementById('words').onkeydown = function() {
    document.getElementById('type').play();
};
I want type.mp3 to play any time I press a key, but it is not played in sync with the key press.
I am looking for a pure JS solution.
The audio media element depends on the buffering mechanism of the browser and may not play instantly when play is called.
To play sounds in sync with key presses you would have to use the Web Audio API instead, which allows you to play an in-memory buffer and therefore play it instantly.
Here is an example of how you can load and trigger the sound:
window.AudioContext = window.AudioContext || window.webkitAudioContext;

var request = new XMLHttpRequest(),
    url = "https://dl.dropboxusercontent.com/s/8fp1hnkwp215gfs/chirp.wav",
    actx = new AudioContext(),
    abuffer;

// load file via XHR
request.open("GET", url, true);
request.responseType = "arraybuffer";
request.onload = function() {
    // Asynchronously decode the audio file data in request.response
    actx.decodeAudioData(request.response,
        function(buffer) {
            if (buffer) {
                abuffer = buffer; // keep a reference to decoded buffer
                setup();          // setup handler
            }
        }
    );
};
request.send();

// setup key handler
function setup() {
    document.getElementById("txt").onkeydown = play;
}

// play sample - a new buffer source must be created each time
function play() {
    var src = actx.createBufferSource();
    src.buffer = abuffer;
    src.connect(actx.destination);
    src.start(0);
}
<textarea id=txt></textarea>
(Note: there seems to be a bug in Firefox at the time of this writing that reports a column which does not exist in the code at the send() call - if that is a problem, try the code in Chrome.)
JavaScript is an asynchronous and event-driven language, so you can't make a synchronous function.

What is the best audio file format to play audio from javascript in custom chromecast receiver

I have developed a simple custom chromecast receiver for a game.
In it I play short sounds from the receiver javascript using:
this.bounceSound = new Audio("paddle.ogg");
to create the audio object when the game is loaded, and then using:
this.bounceSound.play();
to play the sound when needed in the game.
This works fine in Chrome on my laptop, but when running the receiver on my Chromecast some sounds don't play and others are delayed.
Could this be a problem with my choice of sound format (.ogg) for the audio files?
If not, what else could be the problem?
Are there any best practices on the details of the sound files (frequency, bit depth, etc.)?
Thanks
Just for the record to avoid future confusion of other developers trying to load and play back multiple short sounds at the same time:
On Chromecast, the HTML video and audio tags can only support a
single active media element at a time.
(Source: https://plus.google.com/+LeonNicholls/posts/3Fq5jcbxipJ - make sure to read the rest, it contains also important information about limitations)
Only one audio element will be loaded, others get error code 4 (that was at least the case during my debugging sessions). The correct way of loading and playing back several short sounds is using the Web Audio API as explained by Leon Nicholls in his Google+ post I linked to above.
Simple Web Audio API Wrapper
I whipped up a crude replacement for the HTMLAudioElement in JavaScript that is based on the Web Audio API:
function WebAudio(src) {
    if (src) this.load(src);
}

WebAudio.prototype.audioContext = new AudioContext;

WebAudio.prototype.load = function(src) {
    if (src) this.src = src;
    console.log('Loading audio ' + this.src);
    var self = this;
    var request = new XMLHttpRequest;
    request.open("GET", this.src, true);
    request.responseType = "arraybuffer";
    request.onload = function() {
        self.audioContext.decodeAudioData(request.response, function(buffer) {
            if (!buffer) {
                if (self.onerror) self.onerror();
                return;
            }
            self.buffer = buffer;
            if (self.onload)
                self.onload(self);
        }, function(error) {
            self.onerror(error);
        });
    };
    request.send();
};

WebAudio.prototype.play = function() {
    var source = this.audioContext.createBufferSource();
    source.buffer = this.buffer;
    source.connect(this.audioContext.destination);
    source.start(0);
};
It can be used as follows:
var audio1 = new WebAudio('sounds/sound1.ogg');
audio1.onload = function() {
    audio1.play();
};

var audio2 = new WebAudio('sounds/sound2.ogg');
audio2.onload = function() {
    audio2.play();
};
You should download the sounds up front before you start the game. Also be aware that these sounds will then be stored in memory and Chromecast has very limited memory for that. Make sure these sounds are small and will all fit into memory.

Cannot extract local video file duration in mobile browser

I am using a file input to capture recorded video from a user's mobile device. What I want to do is then read that file somehow and determine whether it is over a certain duration (30 seconds in this case). If it is over that duration, then the file should not be allowed to be uploaded to the server. If it is under the duration, then it is okay.
I can accurately detect the duration of the file in javascript on desktop, but not on mobile, which is what I need. This is my code:
onEndRecord = function(e) {
    var file = e.target.files[0];
    var videoElement = document.createElement('video');
    document.body.appendChild(videoElement);

    var fileURL = window.URL.createObjectURL(file);

    videoElement.addEventListener('loadeddata', function(e) {
        console.log('loadeddata', e.target.duration);
    });

    videoElement.onload = function () { // binding onload event
        console.log('onload', videoElement.duration);
    };

    videoElement.src = fileURL;
}
Anybody know how to get this information? The duration just reports as zero on mobile.
I've also tried running it through the file reader api:
readBlob = function(file) {
    console.log('readBlob', file);
    var reader = new FileReader();

    reader.onload = function (e) {
        console.log('reader load');
        var player = document.getElementById('videoReader');

        player.addEventListener('loadeddata', function(e) {
            console.log('loadeddata', e.target.duration);
            player.play();
        });

        var fileURL = window.URL.createObjectURL(file);
        player.src = fileURL;
    };

    reader.readAsDataURL(file);
}
What is happening, I believe, is that the loadedmetadata event (or the loadeddata event as in your question) just does not fire on the mobile devices you are testing, hence the duration is not available for reading and is reported as 0. Have a look here for the events linked to the HTML5 media element specs. Note that you could use the loadstart event for the media element rather than the onload event for fine-tuning your web application.
Typically on iOS the event will fire on user interaction ... not before (as with the canplay event). This is a limitation to attempt to reduce bandwidth consumption of users on paid data plans for their mobile device. This is described here for Apple. The same generally goes for Android.
Dealing with the Web Audio API you could get the duration through the buffer received from the decodeAudioData method. Here is some information on the subject.
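A small sketch of that idea (the names are illustrative; it works when the browser can decode the recording's audio data, and `file` is the File from the input as in the question's code):
// Read the selected file into an ArrayBuffer and let the Web Audio API decode it;
// the decoded AudioBuffer exposes a reliable duration.
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var reader = new FileReader();
reader.onload = function () {
    audioCtx.decodeAudioData(reader.result, function (buffer) {
        console.log('duration:', buffer.duration);
    });
};
reader.readAsArrayBuffer(file); // `file` being the File from the input element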
You can read this information server side with PHP or Java but this would not work at best for your design.
So you could either get the user to play back the recorded sample before uploading to have access to the duration, or, if you know the average bitrate at which the video was recorded and the file size (File API), you could approximate the duration, as sketched below.
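The approximation could look roughly like this (the bitrate is an assumption you would have to supply from your recording settings, and `file` is again the File from the input):
// Rough duration estimate: size in bits divided by an assumed average bitrate.
var ASSUMED_BITRATE = 2000000; // bits per second; depends on the recording device
var approxSeconds = (file.size * 8) / ASSUMED_BITRATE; // file.size is in bytes
if (approxSeconds > 30) {
    console.log('Recording is probably longer than 30 seconds');
}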
Solved this by using FFmpeg's ffprobe. We download just a small amount of the video - about 4 KB is enough - and then read the metadata. For QuickTime, however, the metadata is at the end of the video, so you have to swap the beginning for the end. This was done using a modified version of qtfaststart:
https://github.com/danielgtaylor/qtfaststart

html audio tag, duration always infinity

I've been working on using the html audio tag to play some audio files. The audio plays alright, but the duration property of the audio tag is always returning infinity.
I tried the accepted answer to this question but with the same result. Tested with Chrome, IE and Firefox.
Is this a bug with the audio tag, or am I missing something?
Some of the code I'm using to play the audio files.
JavaScript function called when the play button is pressed:
function playPlayerV2(src) {
    document.getElementById("audioplayerV2").addEventListener("loadedmetadata", function (_event) {
        console.log(player.duration);
    });

    var player = document.getElementById("audioplayerV2");
    player.src = src;
    player.load();
    player.play();
}
The audio tag in HTML:
<audio controls="true" id="audioplayerV2" style="display: none;" preload="auto">
Note: I'm hiding the standard audio player with the intent of using a custom layout and making use of the player via JavaScript; this does not seem to be related to my problem.
Try this:
var getDuration = function (url, next) {
    var _player = new Audio(url);

    _player.addEventListener("durationchange", function (e) {
        if (this.duration != Infinity) {
            var duration = this.duration;
            _player.remove();
            next(duration);
        }
    }, false);

    _player.load();
    _player.currentTime = 24 * 60 * 60; // fake big time
    _player.volume = 0;
    _player.play();
    // waiting...
};
getDuration('/path/to/audio/file', function (duration) {
    console.log(duration);
});
I think this is due to a Chrome bug. Until it's fixed:
if (video.duration === Infinity) {
    video.currentTime = 10000000;
    setTimeout(() => {
        video.currentTime = 0; // to reset the time, so it starts at the beginning
    }, 1000);
}

let duration = video.duration;
This works for me:
const audio = document.getElementById("audioplayer");

audio.addEventListener('loadedmetadata', () => {
    if (audio.duration === Infinity) {
        audio.currentTime = 1e101;
        audio.addEventListener('timeupdate', getDuration);
    }
});

function getDuration() {
    audio.currentTime = 0;
    audio.removeEventListener('timeupdate', getDuration);
    console.log(audio.duration);
}
In case you control the server and can make it send proper media headers - this is what helped the OP.
I faced this problem with files stored in Google Drive when getting them in the mobile version of Chrome. I cannot control Google Drive's response and I have to somehow deal with it.
I don't have a solution that satisfies me yet, but I tried the idea from both posted answers - which is basically the same: make the audio/video object seek to the real end of the resource. After Chrome finds the real end position, it gives you the duration. However, the result is unsatisfying.
What this hack really does is force Chrome to load the resource into memory completely. So, if the resource is too big, or the connection is too slow, you end up waiting a long time for the file to be downloaded behind the scenes. And you have no control over that file - it is handled by Chrome, and once it decides that it is no longer needed, it will dispose of it, so the bandwidth may be spent inefficiently.
So, in case you can load the file yourself, it is better to download it (e.g. as a blob) and feed it to your audio/video control.
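A minimal sketch of that last suggestion (assuming the file is small enough to keep in memory; the URL is illustrative):
// Download the whole file first, then hand it to the media element as a blob URL.
fetch('/path/to/audio/file') // illustrative URL
    .then(function (res) { return res.blob(); })
    .then(function (blob) {
        var audio = new Audio(URL.createObjectURL(blob));
        audio.addEventListener('loadedmetadata', function () {
            console.log(audio.duration);
        });
    });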
If this is a Twilio mp3, try the .wav version. The mp3 is coming across as a stream and it fools the audio players.
To use the .wav version, just change the format of the source url from .mp3 to .wav (or leave it off, wav is the default)
Note - the wav file is 4x larger, so that's the downside to switching.
Not a direct answer, but in case anyone using blobs came here: I managed to fix it using a package called webm-duration-fix.
import fixWebmDuration from "webm-duration-fix";
...
fixedBlob = await fixWebmDuration(blob);
...
// If you want to modify the video file completely, you can use the "webmFixDuration" package. Other methods are applied at the display level only, on the video tag; with this method, the complete video file is modified.
webmFixDuration GitHub example:
mediaRecorder.onstop = async () => {
    const duration = Date.now() - startTime;
    const buggyBlob = new Blob(mediaParts, { type: 'video/webm' });
    const fixedBlob = await webmFixDuration(buggyBlob, duration);
    displayResult(fixedBlob);
};
