I am using a file input to capture recorded video from a user's mobile device. What I want to do is then read that file somehow and determine whether it is over a certain duration (30 seconds in this case). If it is over that duration, the file should not be allowed to be uploaded to the server. If it is under the duration, it is okay.
I can accurately detect the duration of the file in JavaScript on desktop, but not on mobile, which is what I need. This is my code:
onEndRecord = function (e) {
    var file = e.target.files[0];
    var videoElement = document.createElement('video');
    document.body.appendChild(videoElement);

    var fileURL = window.URL.createObjectURL(file);

    videoElement.addEventListener('loadeddata', function (e) {
        console.log('loadeddata', e.target.duration);
    });

    videoElement.onload = function () { // binding onload event
        console.log('onload', videoElement.duration);
    };

    videoElement.src = fileURL;
}
Anybody know how to get this information? The duration just reports as zero on mobile.
I've also tried running it through the FileReader API:
readBlob = function (file) {
    console.log('readBlob', file);
    var reader = new FileReader();

    reader.onload = function (e) {
        console.log('reader load');
        var player = document.getElementById('videoReader');
        player.addEventListener('loadeddata', function (e) {
            console.log('loadeddata', e.target.duration);
            player.play();
        });
        var fileURL = window.URL.createObjectURL(file);
        player.src = fileURL;
    }

    reader.readAsDataURL(file);
}
What is happening, I believe, is that the loadedmetadata event (or the loadeddata event, as in your question) simply does not fire on the mobile devices you are testing, hence the duration is not available for reading and is reported as 0. Have a look here for the events linked to the HTML5 media element spec. Note that you could use the loadstart event of the media element rather than the onload event for fine-tuning your web application.
Typically on iOS the event will fire on user interaction ... not before (as with the canplay event). This is a limitation to attempt to reduce bandwidth consumption of users on paid data plans for their mobile device. This is described here for Apple. The same generally goes for Android.
Dealing with the Web Audio API you could get the duration through the buffer received from the decodeAudioData method. Here is some information on the subject.
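For example, a minimal sketch of that idea, assuming the selected file is plain audio data the browser can decode (decodeAudioData generally will not parse video containers, so this is mainly useful for audio recordings):
var AudioCtx = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioCtx();
var reader = new FileReader();
reader.onload = function () {
    audioCtx.decodeAudioData(reader.result, function (buffer) {
        console.log('duration (s):', buffer.duration); // AudioBuffer.duration is in seconds
    }, function (err) {
        console.log('could not decode file', err);
    });
};
reader.readAsArrayBuffer(file); // `file` is the File from the input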
You can read this information server side with PHP or Java, but that may not fit your design well.
So either you could get the user to play back the recorded sample before uploading, so the duration becomes available, or, if you know the average bitrate at which the video was recorded and the file size (File API), you could approximate the duration.
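A rough sketch of that approximation; the 2 Mbit/s figure below is purely an assumed example, not something from the original question:
// file.size is in bytes (File API); bitrateBps is the assumed average
// recording bitrate in bits per second.
function approximateDuration(file, bitrateBps) {
    return (file.size * 8) / bitrateBps; // seconds
}

// e.g. assuming the device records at roughly 2 Mbit/s:
if (approximateDuration(file, 2 * 1000 * 1000) > 30) {
    console.log('Probably over 30 seconds - reject the upload');
}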
Solved this by using FFmpeg's ffprobe tool. We download just a small amount of the video - about 4 KB is enough - and then read the metadata. For QuickTime, however, the metadata is at the end of the video, so you have to move it to the front of the file first. This was done using a modified version of qt-faststart:
https://github.com/danielgtaylor/qtfaststart
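For reference, a server-side sketch of that probing step (Node.js here is an assumption; the answer above does not say what the backend is). It assumes the first few kilobytes of the upload, with the metadata already moved to the front, have been written to partialPath, and that ffprobe is on the PATH:
const { execFile } = require("child_process");

function probeDuration(partialPath, callback) {
    // Ask ffprobe for the container duration only, printed as a bare number
    execFile("ffprobe", [
        "-v", "error",
        "-show_entries", "format=duration",
        "-of", "default=noprint_wrappers=1:nokey=1",
        partialPath
    ], function (err, stdout) {
        if (err) return callback(err);
        callback(null, parseFloat(stdout)); // seconds; NaN if the metadata wasn't found
    });
}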
Related
I have an array of Blobs (binary data, really -- I can express it in whatever form is most efficient. I'm using Blobs for now, but maybe a Uint8Array or something would be better). Each Blob contains 1 second of audio/video data. Every second a new Blob is generated and appended to my array. So the code roughly looks like so:
var arrayOfBlobs = [];

setInterval(function () {
    arrayOfBlobs.push(nextChunk());
}, 1000);
My goal is to stream this audio/video data to an HTML5 video element. I know that a Blob URL can be generated and played like so:
var src = URL.createObjectURL(arrayOfBlobs[0]);
var video = document.getElementsByTagName("video")[0];
video.src = src;
Of course this only plays the first 1 second of video. I also assume I can trivially concatenate all of the Blobs currently in my array somehow to play more than one second:
// Something like this (untested)
var concatenatedBlob = new Blob(arrayOfBlobs);
var src = ...
However this will still eventually run out of data. As Blobs are immutable, I don't know how to keep appending data as it's received.
I'm certain this should be possible because YouTube and many other video streaming services utilize Blob URLs for video playback. How do they do it?
Solution
After some significant Googling I managed to find the missing piece to the puzzle: MediaSource
Effectively the process goes like this:
Create a MediaSource
Create an object URL from the MediaSource
Set the video's src to the object URL
On the sourceopen event, create a SourceBuffer
Use SourceBuffer.appendBuffer() to add all of your chunks to the video
This way you can keep adding new bits of video without changing the object URL.
Caveats
The SourceBuffer object is very picky about codecs. These have to be declared, and must be exact, or it won't work
You can only append one blob of video data to the SourceBuffer at a time, and you can't append a second blob until the first one has finished (asynchronously) processing
If you append too much data to the SourceBuffer without calling .remove() then you'll eventually run out of RAM and the video will stop playing. I hit this limit around 1 hour on my laptop
Example Code
Depending on your setup, some of this may be unnecessary (particularly the part where we build a queue of video data before we have a SourceBuffer then slowly append our queue using updateend). If you are able to wait until the SourceBuffer has been created to start grabbing video data, your code will look much nicer.
<html>
<head>
</head>
<body>
    <video id="video"></video>
    <script>
        // As before, I'm regularly grabbing blobs of video data
        // The implementation of "nextChunk" could be various things:
        //   - reading from a MediaRecorder
        //   - reading from an XMLHttpRequest
        //   - reading from a local webcam
        //   - generating the files on the fly in JavaScript
        //   - etc
        var arrayOfBlobs = [];

        setInterval(function () {
            arrayOfBlobs.push(nextChunk());
            // NEW: Try to flush our queue of video data to the video element
            appendToSourceBuffer();
        }, 1000);

        // 1. Create a `MediaSource`
        var mediaSource = new MediaSource();

        // 2. Create an object URL from the `MediaSource`
        var url = URL.createObjectURL(mediaSource);

        // 3. Set the video's `src` to the object URL
        var video = document.getElementById("video");
        video.src = url;

        // 4. On the `sourceopen` event, create a `SourceBuffer`
        var sourceBuffer = null;
        mediaSource.addEventListener("sourceopen", function () {
            // NOTE: Browsers are VERY picky about the codec being EXACTLY
            // right here. Make sure you know which codecs you're using!
            sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="opus,vp8"');

            // If we requested any video data prior to setting up the SourceBuffer,
            // we want to make sure we only append one chunk at a time
            sourceBuffer.addEventListener("updateend", appendToSourceBuffer);
        });

        // 5. Use `SourceBuffer.appendBuffer()` to add all of your chunks to the video
        function appendToSourceBuffer() {
            if (
                mediaSource.readyState === "open" &&
                sourceBuffer &&
                sourceBuffer.updating === false &&
                arrayOfBlobs.length > 0
            ) {
                // NOTE: appendBuffer() expects an ArrayBuffer or typed array,
                // so nextChunk() should yield those rather than actual Blob objects
                sourceBuffer.appendBuffer(arrayOfBlobs.shift());
            }

            // Limit the total buffer size to 20 minutes
            // This way we don't run out of RAM
            if (
                video.buffered.length &&
                video.buffered.end(0) - video.buffered.start(0) > 1200
            ) {
                sourceBuffer.remove(0, video.buffered.end(0) - 1200);
            }
        }
    </script>
</body>
</html>
As an added bonus this automatically gives you DVR functionality for live streams, because you're retaining 20 minutes of video data in your buffer (you can seek by simply using video.currentTime = ...)
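For instance, a small sketch of that seeking idea (illustrative, not part of the example above): jump back to just before the live edge of whatever is currently buffered:
function seekToLiveEdge() {
    if (video.buffered.length) {
        // buffered.end() of the last range is (roughly) the live edge
        video.currentTime = video.buffered.end(video.buffered.length - 1) - 0.1;
    }
}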
Adding to the previous answer...
make sure to add sourceBuffer.mode = 'sequence' in the MediaSource sourceopen event handler to ensure the data is appended in the order it is received. The default value is 'segments', which buffers until the next 'expected' timeframe is loaded.
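A short sketch of where that line goes, reusing the sourceopen handler from the answer above (the codec string is just the same assumed example):
mediaSource.addEventListener("sourceopen", function () {
    sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="opus,vp8"');
    sourceBuffer.mode = "sequence"; // append chunks in the order they arrive
});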
Additionally, make sure that you are not sending any packets with data.size === 0, and make sure there is no backlog ('stack') building up by clearing it on the broadcasting side - unless you want to record it as an entire video, in which case just make sure the broadcast video is small enough and your internet speed is fast. The smaller the size and the lower the resolution, the more likely you are to keep a realtime connection with a client, i.e. a video call.
For iOS the broadcast needs to be made from an iOS/macOS application, and be in mp4 format. The video chunk gets saved to the app's cache and then removed once it is sent to the server. A client can connect to the stream using either a web browser or an app on nearly any device.
I am trying to upload a video to the server, and on the client end I am reading it using FileReader's readAsBinaryString().
Now, my problem is, I don't know how to read duration of this video file.
If I try reading the file and assigning the reader's data to a video tag's source, then none of the events associated with the video tag are fired. I need to find the duration of the file uploaded on the client end.
Can somebody please suggest me something?
You can do something like this for that to work:
read the file as ArrayBuffer (this can be posted directly to server as a binary stream later)
wrap it in a Blob object
create an object URL for the blob
and finally set the url as the video source.
When the video object triggers the loadedmetadata event you should be able to read the duration.
You could use a data-URI too, but notice that browsers may apply size limits (as well as other disadvantages) to them, which matters when it comes to video files, and there is significant encoding/decoding overhead due to the Base64 process.
Example
Select a video file you know the browser can handle (in production you should of course filter accepted file types based on video.canPlayType()).
The duration will show after the above steps have been performed (no error handling is included in the example; adjust as needed).
var fileEl = document.querySelector("input");

fileEl.onchange = function (e) {
    var file = e.target.files[0],                              // selected file
        mime = file.type,                                      // store mime for later
        rd = new FileReader();                                 // create a FileReader

    rd.onload = function (e) {                                 // when file has read:
        var blob = new Blob([e.target.result], {type: mime}),  // create a blob of buffer
            url = (URL || webkitURL).createObjectURL(blob),    // create o-URL of blob
            video = document.createElement("video");           // create video element

        video.preload = "metadata";                            // preload setting
        video.addEventListener("loadedmetadata", function () { // when enough data loads
            document.querySelector("div")
                .innerHTML = "Duration: " + video.duration + "s"; // show duration
            (URL || webkitURL).revokeObjectURL(url);           // clean up
            // ... continue from here ...
        });
        video.src = url;                                       // start video load
    };

    rd.readAsArrayBuffer(file);                                // read file object
};
<input type="file"><br><div></div>
You can do something like below; the trick is to use readAsDataURL:
var reader = new FileReader();

reader.onload = function () {
    var media = new Audio(reader.result);
    media.onloadedmetadata = function () {
        media.duration; // this would give the duration of the video/audio file
    };
};

reader.readAsDataURL(file);
I'm trying to record a 48000Hz recording via getUserMedia, but without luck. The returned audio MediaStream returns 44100Hz. How can I set this to 48000Hz?
Here are snippets of my code:
var startUsermedia = this.startUsermedia;

navigator.getUserMedia({
    audio: true,
    //sampleRate: 48000
}, startUsermedia, function (e) {
    console.log('No live audio input: ' + e);
});
The startUsermedia function:
startUsermedia: function (stream) {
    var input = audio_context.createMediaStreamSource(stream);
    console.log('Media stream created.');

    // Uncomment if you want the audio to feedback directly
    //input.connect(audio_context.destination);
    //__log('Input connected to audio context destination.');

    recorder = new Recorder(input);
    console.log('Recorder initialised.');
},
I tried changing the property sampleRate of the AudioContext, but no luck.
How can I change the sampleRate to 48000Hz?
EDIT: We are also now okay with a Flash solution that can record and export WAV files at 48000Hz.
As far as I know, there is no way to change the sample rate within an audio context. The sample rate will usually be the sample rate of your recording device and will stay that way. So you will not be able to write something like this:
var input = audio_context.createMediaStreamSource(stream);
var resampler = new Resampler(44100, 48000);
input.connect(resampler);
resampler.connect(audio_context.destination);
However, if you want to take your audio stream, resample it and then send it to the backend (or do something else with it outside of the Web Audio API), you can use an external sample-rate converter (e.g. https://github.com/taisel/XAudioJS/blob/master/resampler.js).
var resampler = new Resampler(44100, 48000, 1, 2229);

function startUsermedia(stream) {
    var input = audio_context.createMediaStreamSource(stream);
    console.log('Media stream created.');

    recorder = audio_context.createScriptProcessor(2048);
    recorder.onaudioprocess = recorderProcess;
    recorder.connect(audio_context.destination);
}

function recorderProcess(e) {
    var buffer = e.inputBuffer.getChannelData(0);
    var resampled = resampler.resampler(buffer);
    //--> do something with the resampled data, for instance send it to the server
}
It looks like there is an open bug about the inability to set the sampling rate:
https://github.com/WebAudio/web-audio-api/issues/300
There's also a Chrome issue:
https://bugs.chromium.org/p/chromium/issues/detail?id=432248
I checked the latest Chromium code and there is nothing in there that lets you set the sampling rate.
Edit: Seems like it has been implemented in Chrome, but is broken currently - see the comments in the Chromium issue.
It's been added to Chrome:
var ctx = new (window.AudioContext || window.webkitAudioContext)({ sampleRate:16000});
https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/AudioContext
audioContext = new AudioContext({sampleRate: 48000})
Simply set the sample rate when creating the AudioContext object. This worked for me.
NOTE: This answer is outdated.
You can't. The sample rate of the AudioContext is set by the browser/device and there is nothing you can do to change it. In fact, you will find that 44.1kHz on your machine might be 48kHz on mine. It varies according to whatever the OS picks by default.
Also remember that not all hardware is capable of all sample rates.
You can use an OfflineAudioContext to essentially render your audio buffer to a different sample rate (but this is a batch operation).
So you would record your recording using the normal audio context, and then use an OfflineAudioContext with a different sample rate to render your buffer. There is an example on the Mozilla page.
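A minimal sketch of that approach, assuming recordedBuffer is an AudioBuffer you captured at the context's native rate and you want a 48000 Hz copy:
function resampleTo48k(recordedBuffer) {
    var offlineCtx = new OfflineAudioContext(
        recordedBuffer.numberOfChannels,
        Math.ceil(recordedBuffer.duration * 48000), // frame count at the target rate
        48000
    );
    var source = offlineCtx.createBufferSource();
    source.buffer = recordedBuffer;
    source.connect(offlineCtx.destination);
    source.start(0);
    return offlineCtx.startRendering(); // resolves with the 48 kHz AudioBuffer
}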
It is now in the spec but not yet implemented in Chromium.
Also in bugs.chromium.org, "Status: Available" does not mean it is implemented. It just means that nobody is working on it and that it is available for anyone who wants to work on it. So "Available" means "Not assigned".
JavaScript has the canPlayType method to test whether the browser can play a video file. But for more accurate results it needs a string such as 'video/mp4; codecs="avc1.66.13, mp4a.40.2"'. Is there any way for JavaScript to run a test directly on a video file to check that it will play, or alternatively to retrieve more accurate codec information with JavaScript, or perhaps even PHP?
There is something like that, it's called canplaythrough.
The canplaythrough event is fired when the user agent can play the media, and estimates that enough data has been loaded to play the media up to its end without having to stop for further buffering of content.
https://developer.mozilla.org/en-US/docs/Web/Events/canplaythrough
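As a sketch, you could point a throwaway video element at an object URL of the selected file and wait for that event (file here comes from an <input type="file"> and is illustrative):
var probe = document.createElement("video");
probe.preload = "auto";
probe.addEventListener("canplaythrough", function () {
    console.log("Browser estimates it can play this file through to the end");
    URL.revokeObjectURL(probe.src); // clean up
});
probe.src = URL.createObjectURL(file);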
There's also an error event that fires when the video fails to load or can't be played by the browser
var v = document.createElement('video'),
    s = document.createElement('source');

v.appendChild(s);
s.src = "simpsons.mp4";
s.type = "video/mp4";

s.addEventListener('error', function (ev) {
    // catch errors
}, false);
I've been working on using the html audio tag to play some audio files. The audio plays alright, but the duration property of the audio tag is always returning infinity.
I tried the accepted answer to this question but with the same result. Tested with Chrome, IE and Firefox.
Is this a bug with the audio tag, or am I missing something?
Some of the code I'm using to play the audio files.
JavaScript function called when the play button is pressed:
function playPlayerV2(src) {
    document.getElementById("audioplayerV2").addEventListener("loadedmetadata", function (_event) {
        console.log(player.duration);
    });

    var player = document.getElementById("audioplayerV2");
    player.src = src;
    player.load();
    player.play();
}
the audio tag in html
<audio controls="true" id="audioplayerV2" style="display: none;" preload="auto">
note: I'm hiding the standard audio player with the intent of using a custom layout and controlling the player via JavaScript; this does not seem to be related to my problem.
Try this:
var getDuration = function (url, next) {
    var _player = new Audio(url);
    _player.addEventListener("durationchange", function (e) {
        if (this.duration != Infinity) {
            var duration = this.duration;
            _player.remove();
            next(duration);
        }
    }, false);
    _player.load();
    _player.currentTime = 24 * 60 * 60; // fake big time
    _player.volume = 0;
    _player.play();
    // waiting...
};

getDuration('/path/to/audio/file', function (duration) {
    console.log(duration);
});
I think this is due to a Chrome bug. Until it's fixed:
if (video.duration === Infinity) {
    video.currentTime = 10000000;
    setTimeout(() => {
        video.currentTime = 0; // to reset the time, so it starts at the beginning
    }, 1000);
}

let duration = video.duration;
This works for me
const audio = document.getElementById("audioplayer");
audio.addEventListener('loadedmetadata', () => {
if (audio.duration === Infinity) {
audio.currentTime = 1e101
audio.addEventListener('timeupdate', getDuration)
}
})
function getDuration() {
audio.currentTime = 0
this.voice.removeEventListener('timeupdate', getDuration)
console.log(audio.duration)
},
In case you control the server and can make it send proper media headers - this is what helped the OP.
I faced this problem with files stored in Google Drive when fetching them in the mobile version of Chrome. I cannot control Google Drive's response and I have to somehow deal with it.
I don't have a solution that satisfies me yet, but I tried the idea from both posted answers, which is basically the same: make the audio/video object seek the real end of the resource. After Chrome finds the real end position, it gives you the duration. However, the result is unsatisfying.
What this hack really does is force Chrome to load the resource into memory completely. So, if the resource is too big, or the connection is too slow, you end up waiting a long time for the file to be downloaded behind the scenes. And you have no control over that file - it is handled by Chrome, and once it decides that it is no longer needed it will dispose of it, so the bandwidth may be spent inefficiently.
So, in case you can load the file yourself - it is better to download it (e.g. as blob) and feed it to your audio/video control.
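A rough sketch of that idea (the URL and element lookup below are illustrative assumptions): download the file yourself, then hand the media element an object URL of the complete Blob so the duration is known up front:
fetch("https://example.com/media/clip.mp3")
    .then(function (response) { return response.blob(); })
    .then(function (blob) {
        var audio = document.querySelector("audio");
        audio.addEventListener("loadedmetadata", function () {
            console.log("duration:", audio.duration); // finite, since the whole file is local
        });
        audio.src = URL.createObjectURL(blob);
    });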
If this is a Twilio mp3, try the .wav version. The mp3 is coming across as a stream and it fools the audio players.
To use the .wav version, just change the format of the source url from .mp3 to .wav (or leave it off, wav is the default)
Note - the wav file is 4x larger, so that's the downside to switching.
Not a direct answer, but in case anyone using blobs came here: I managed to fix it using a package called webm-duration-fix.
import fixWebmDuration from "webm-duration-fix";
...
fixedBlob = await fixWebmDuration(blob);
...
If you want to modify the video file completely, you can use the package webmFixDuration; other methods are applied only at the display level, on the video tag, whereas with this method the complete video file is modified.
webmFixDuration GitHub example:
mediaRecorder.onstop = async () => {
    const duration = Date.now() - startTime;
    const buggyBlob = new Blob(mediaParts, { type: 'video/webm' });
    const fixedBlob = await webmFixDuration(buggyBlob, duration);
    displayResult(fixedBlob);
};