JavaScript has the canPlayType method to test whether the browser can play a video file, but for more accurate results it needs a string such as 'video/mp4; codecs="avc1.66.13, mp4a.40.2"'. Is there any way for JavaScript to run a test directly on a video file to check that it will play, or alternatively to retrieve more accurate codec information with JavaScript, or perhaps even PHP?
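For reference, this is roughly how I'm calling it now (a minimal sketch; the codec string is only an example and would have to match the actual file):
var testVideo = document.createElement('video');
// canPlayType returns "", "maybe" or "probably" -- never a guarantee
var result = testVideo.canPlayType('video/mp4; codecs="avc1.66.13, mp4a.40.2"');
console.log(result);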
There is something like that: it's called canplaythrough.
The canplaythrough event is fired when the user agent can play the
media, and estimates that enough data has been loaded to play the
media up to its end without having to stop for further buffering of
content.
https://developer.mozilla.org/en-US/docs/Web/Events/canplaythrough
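A minimal sketch of listening for it (the file name is just a placeholder):
var v = document.createElement('video');
v.src = "movie.mp4"; // placeholder file name
v.addEventListener('canplaythrough', function () {
    // The browser estimates it can play the whole file without stalling
    console.log('canplaythrough fired -- the file should be playable');
}, false);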
There's also an error event that fires when the video fails to load or can't be played by the browser:
var v = document.createElement('video'),
    s = document.createElement('source');
v.appendChild(s);

s.src = "simpsons.mp4";
s.type = "video/mp4";

// The error event fires on the <source> element if the file can't be
// fetched or can't be decoded by the browser
s.addEventListener('error', function(ev) {
    // catch errors
}, false);
Related
HTML5 video stops playing roughly 100 seconds into playback. This happens with every video I try, no matter which method of loading the video I use.
Info
To keep this brief: this is essentially a video streaming application. There are many different videos, and each one starts at a different startTime when loaded.
All videos are properly encoded and support streaming in chunks. All videos are .MP4.
What I have tried:
Setting Video SRC Attribute To Video URL
I have tried setting the video src attribute to the URL pointing to the video, loading it, setting the startTime, and playing. Playback stops at about 100 seconds no matter where the startTime is set.
I can pause the video, wait a moment, and start playing it again; playback resumes, but it eventually stops again.
<video></video>
let video = document.querySelector( "video" );
video.src = "/media/S01E01.mp4";
video.load();
video.currentTime = 240;
video.play();
If I check how many seconds have been buffered, it shows that the entire video has been buffered, yet the issue still occurs.
video.buffered.end( 0 ) - video.buffered.start( 0 );
1383 // The Entire Length Of The Video In Seconds
Downloading The Entire Video Using Fetch & Assigning The SRC As A Blob
I thought this could be a buffering issue, so I tried downloading the entire file with fetch and assigning the response to the src as a blob URL. This ends with the same result: playback pauses/stops after about 100 seconds.
I can pause the video, wait a moment, and start playing it again; playback resumes, but it eventually stops again.
<video></video>
fetch( "/media/S01E01.mp4" )
.then( response => response.blob() )
.then( ( blob ) => {
let video = document.querySelector( "video" );
let src = URL.createObjectURL( blob );
video.src = src;
video.currentTime = 360;
video.play();
})
MediaStream API
I have tried using the MediaStream API, but because the video can start at a random start time I can't get it to work the way I would like. I can get the video to play if I request only the initial bytes of the video. If I attempt to request the entire video this way, the fetch request aborts before it can finish with the error:
DOMException: The Operation Was Aborted
I am still working on this approach by requesting the entire video in multiple requests, so I do not know the results yet. I am apprehensive about spending the time writing all of that code only for it to end in the same result.
Does anyone know why this is happening?
It is always good practice to monitor the video's loading/readiness state, e.g.:
<video></video>
fetch( "/media/S01E01.mp4" )
.then( response => response.blob() )
.then( ( blob ) => {
let video = document.querySelector( "video" );
let src = URL.createObjectURL( blob );
video.src = src;
video.currentTime = 360;
// monitor for errors
video.onerror = () => {
console.error(`Error ${video.error.code}; details: ${video.error.message}`);
}
// play the video as soon as the first frames are cached
video.oncanplay = () => {
video.play();
};
// or play when the entire video is cached
video.oncanplaythrough = () => {
video.play();
};
})
Additionally, try to monitor the video element itself:
readyState: https://developer.mozilla.org/en-US/docs/Web/API/HTMLMediaElement/readyState
networkState: https://developer.mozilla.org/en-US/docs/Web/API/HTMLMediaElement/networkState
the stalled event, to see if the server stops sending data: https://developer.mozilla.org/en-US/docs/Web/API/HTMLMediaElement/stalled_event
the waiting event, to see if the server is slow to serve data: https://developer.mozilla.org/en-US/docs/Web/API/HTMLMediaElement/waiting_event
These events and approaches should help you to get more insight into what is going on and why playback stops.
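A minimal sketch of wiring these up (reusing the video variable from the snippet above; the log messages are just placeholders):
// Log state changes to narrow down where and why playback stops
video.addEventListener('stalled', () => {
    console.warn('stalled: fetching data but none is arriving.',
                 'readyState =', video.readyState,
                 'networkState =', video.networkState);
});

video.addEventListener('waiting', () => {
    console.warn('waiting: next frame is not buffered yet.',
                 'buffered end =', video.buffered.length ? video.buffered.end(0) : 0);
});

video.addEventListener('pause', () => {
    console.log('pause event at', video.currentTime);
});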
Let me know if this helps.
Update based on comments
If you say that "in Chrome it locks up the entire page", it makes me think there is a problem with the video driver. Chrome uses codecs integrated into the browser, so we cannot blame codecs installed on the machine.
As a first step, I would recommend checking and updating the video driver for your video card.
Another option is to try the same code on a different machine. If you don't have one, you can boot a live Ubuntu session (without installing it) and run your code there. Alternatively, run a local HTTP server on your machine, open the webpage on your smartphone, and see if the video plays there.
I have an array of Blobs (binary data, really -- I can express it however it's most efficient. I'm using Blobs for now, but maybe a Uint8Array or something would be better). Each Blob contains 1 second of audio/video data. Every second a new Blob is generated and appended to my array. So the code roughly looks like so:
var arrayOfBlobs = [];

setInterval(function() {
    arrayOfBlobs.push(nextChunk());
}, 1000);
My goal is to stream this audio/video data to an HTML5 <video> element. I know that a Blob URL can be generated and played like so:
var src = URL.createObjectURL(arrayOfBlobs[0]);
var video = document.getElementsByTagName("video")[0];
video.src = src;
Of course this only plays the first 1 second of video. I also assume I can trivially concatenate all of the Blobs currently in my array somehow to play more than one second:
// Something like this (untested)
var concatenatedBlob = new Blob(arrayOfBlobs);
var src = ...
However this will still eventually run out of data. As Blobs are immutable, I don't know how to keep appending data as it's received.
I'm certain this should be possible because YouTube and many other video streaming services utilize Blob URLs for video playback. How do they do it?
Solution
After some significant Googling I managed to find the missing piece to the puzzle: MediaSource
Effectively the process goes like this:
Create a MediaSource
Create an object URL from the MediaSource
Set the video's src to the object URL
On the sourceopen event, create a SourceBuffer
Use SourceBuffer.appendBuffer() to add all of your chunks to the video
This way you can keep adding new bits of video without changing the object URL.
Caveats
The SourceBuffer object is very picky about codecs. These have to be declared, and must be exact, or it won't work
You can only append one blob of video data to the SourceBuffer at a time, and you can't append a second blob until the first one has finished (asynchronously) processing
If you append too much data to the SourceBuffer without calling .remove() then you'll eventually run out of RAM and the video will stop playing. I hit this limit around 1 hour on my laptop
Example Code
Depending on your setup, some of this may be unnecessary (particularly the part where we build a queue of video data before we have a SourceBuffer and then slowly drain that queue on updateend). If you are able to wait until the SourceBuffer has been created before grabbing video data, your code will look much nicer.
<html>
<head>
</head>
<body>
    <video id="video"></video>
    <script>
        // As before, I'm regularly grabbing chunks of video data
        // The implementation of "nextChunk" could be various things:
        //   - reading from a MediaRecorder
        //   - reading from an XMLHttpRequest
        //   - reading from a local webcam
        //   - generating the files on the fly in JavaScript
        //   - etc
        // NOTE: appendBuffer() below takes an ArrayBuffer or typed array,
        // so nextChunk() should produce ArrayBuffers (convert Blobs first if needed)
        var arrayOfBlobs = [];

        setInterval(function() {
            arrayOfBlobs.push(nextChunk());
            // NEW: Try to flush our queue of video data to the video element
            appendToSourceBuffer();
        }, 1000);

        // 1. Create a `MediaSource`
        var mediaSource = new MediaSource();

        // 2. Create an object URL from the `MediaSource`
        var url = URL.createObjectURL(mediaSource);

        // 3. Set the video's `src` to the object URL
        var video = document.getElementById("video");
        video.src = url;

        // 4. On the `sourceopen` event, create a `SourceBuffer`
        var sourceBuffer = null;
        mediaSource.addEventListener("sourceopen", function()
        {
            // NOTE: Browsers are VERY picky about the codec being EXACTLY
            // right here. Make sure you know which codecs you're using!
            sourceBuffer = mediaSource.addSourceBuffer("video/webm; codecs=\"opus,vp8\"");

            // If we requested any video data prior to setting up the SourceBuffer,
            // we want to make sure we only append one chunk at a time
            sourceBuffer.addEventListener("updateend", appendToSourceBuffer);
        });

        // 5. Use `SourceBuffer.appendBuffer()` to add all of your chunks to the video
        function appendToSourceBuffer()
        {
            // Only touch the SourceBuffer while it is idle
            if (
                mediaSource.readyState !== "open" ||
                !sourceBuffer ||
                sourceBuffer.updating
            )
            {
                return;
            }

            // Limit the total buffer size to 20 minutes
            // This way we don't run out of RAM
            if (
                video.buffered.length &&
                video.buffered.end(0) - video.buffered.start(0) > 1200
            )
            {
                sourceBuffer.remove(0, video.buffered.end(0) - 1200);
                // appendToSourceBuffer runs again on the next "updateend"
                return;
            }

            if (arrayOfBlobs.length)
            {
                sourceBuffer.appendBuffer(arrayOfBlobs.shift());
            }
        }
    </script>
</body>
</html>
As an added bonus, this automatically gives you DVR functionality for live streams, because you're retaining 20 minutes of video data in your buffer (you can seek simply by setting video.currentTime = ...).
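For example, a minimal sketch of jumping back to the live edge after seeking around (assuming the same video variable as in the example above):
// Seek to (just before) the newest buffered data, i.e. the "live edge"
function jumpToLive() {
    if (video.buffered.length) {
        video.currentTime = video.buffered.end(video.buffered.length - 1) - 0.5;
    }
}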
Adding to the previous answer...
Make sure to set sourceBuffer.mode = 'sequence' in the MediaSource sourceopen event handler to ensure the data is appended in the order it is received. The default value is 'segments', which buffers until the next 'expected' time range is loaded.
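A minimal sketch of where that line goes, following the structure of the example above:
mediaSource.addEventListener("sourceopen", function () {
    var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="opus,vp8"');
    // Append chunks strictly in arrival order instead of by embedded timestamps
    sourceBuffer.mode = "sequence";
    sourceBuffer.addEventListener("updateend", appendToSourceBuffer);
});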
Additionally, make sure that you are not sending any packets with data.size === 0, and that chunks do not pile up ('stack') by clearing the queue on the broadcasting side, unless you want to record the whole thing as one video, in which case just make sure the broadcast video is small enough and your internet connection is fast. The smaller the chunks and the lower the resolution, the more likely you are to keep a realtime connection with a client, e.g. a video call.
For iOS, the broadcast needs to be made from an iOS/macOS application and be in MP4 format. The video chunk gets saved to the app's cache and is then removed once it is sent to the server. A client can connect to the stream using either a web browser or an app on nearly any device.
I see a lot of questions about how to record audio, stop recording, and then play the audio or save it to a file, but none of this is what I want.
tl;dr Here's my question in a nutshell: "How can I immediately play audio recorded from the user's microphone?" That is, I don't want to save a recording and play it when the user hits a "Play" button, I don't want to save a recording to a file on the user's computer and I don't want to use WebRTC to stream audio anywhere. I just want to talk into my microphone and hear my voice come out the speakers.
All I'm trying to do is make a very simple "echo" page that just immediately plays back audio recorded from the mic. I started with a MediaRecorder object, but that wasn't working, and from what I can tell it's meant for recording full audio files, so I switched to an AudioContext-based approach.
A very simple page would just look like this:
<!DOCTYPE html>
<head>
<script type="text/javascript" src="mcve.js"></script>
</head>
<body>
<audio id="speaker" volume="1.0"></audio>
</body>
and the script looks like this:
if (navigator.mediaDevices) {
  var constraints = {audio: true};
  navigator.mediaDevices.getUserMedia(constraints).then(
    function (stream) {
      var context = new AudioContext();
      var source = context.createMediaStreamSource(stream);
      var proc = context.createScriptProcessor(2048, 2, 2);
      source.connect(proc);
      proc.onaudioprocess = function(e) {
        console.log("audio data collected");
        let audioData = new Blob(e.inputBuffer.getChannelData(0), {type: 'audio/ogg' } )
          || new Blob(new Float32Array(2048), {type: 'audio/ogg'});
        var speaker = document.getElementById('speaker');
        let url = URL.createObjectURL(audioData);
        speaker.src = url;
        speaker.load();
        speaker.play().then(
          () => { console.log("Playback success!"); },
          (error) => { console.log("Playback failure... ", error); }
        );
      };
    }
  ).catch( (error) => {
    console.error("couldn't get user media.");
  });
}
It can record non-trivial audio data (i.e. not every collection winds up as a Blob made from the new Float32Array(2048) call), but it can't play it back. It never hits the "could not get user media" catch, but it always hits the "Playback Failure..." catch. The error prints like this:
DOMException [NotSupportedError: "The media resource indicated by the src attribute or assigned media provider object was not suitable."
code: 9
nsresult: 0x806e0003]
Additionally, the message Media resource blob:null/<long uuid> could not be decoded. is printed to the console repeatedly.
There are two things that could be going on here, near as I can tell (maybe both):
I'm not encoding the audio. I'm not sure if this is a problem, since I thought that data collected from the mic came with 'ogg' encoding automagically, and I've tried leaving the type property of my Blobs blank to no avail. If this is what's wrong, I don't know how to encode a chunk of audio given to me by the audioprocess event, and that's what I need to know.
An <audio> element is fundamentally incapable of playing audio fragments, even if properly encoded. Maybe by not having a full file, there's some missing or extraneous metadata that violates encoding standards and is preventing the browser from understanding me. If this is the case, maybe I need a different element, or even an entirely scripted solution. Or perhaps I'm supposed to construct a file-like object in-place for each chunk of audio data?
I've built this code on examples from MDN and SO answers, and I should mention I've tested my mic at this example demo and it appears to work perfectly.
The ultimate goal here is to stream this audio through a websocket to a server and relay it to other users. I DON'T want to use WebRTC if at all possible, because I don't want to limit myself to only web clients - once it's working okay, I'll make a desktop client as well.
Check the example https://jsfiddle.net/greggman/g88v7p8c/ from https://stackoverflow.com/a/38280110/351900
It is required to be run from HTTPS.
navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia;

var aCtx;
var analyser;
var microphone;

if (navigator.getUserMedia) {
  navigator.getUserMedia(
    {audio: true},
    function(stream) {
      aCtx = new AudioContext();
      // Wrap the microphone stream in a Web Audio source node...
      microphone = aCtx.createMediaStreamSource(stream);
      var destination = aCtx.destination;
      // ...and wire it straight to the speakers: instant echo, no <audio> element needed
      microphone.connect(destination);
    },
    function(){ console.log("Error 003."); }
  );
}
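navigator.getUserMedia is deprecated; a minimal sketch of the same idea with the current navigator.mediaDevices API (my own adaptation, not from the linked fiddle) would be:
// Same echo effect with the promise-based API
navigator.mediaDevices.getUserMedia({ audio: true })
  .then(stream => {
    const ctx = new AudioContext();
    const mic = ctx.createMediaStreamSource(stream);
    mic.connect(ctx.destination); // microphone straight to the speakers
  })
  .catch(err => console.error("Could not get user media:", err));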
Is there a global way to detect when audio is playing or starts playing in the browser?
Something along the lines of if (window.mediaPlaying()) {...
without having the code tied to a specific element?
EDIT: What's important here is to be able to detect ANY audio, no matter where it comes from: an iframe, a video, the Web Audio API, etc.
No one should use this but it works.
Basically the only way that I found to access the entire window's audio is using MediaDevices.getDisplayMedia().
From there, the MediaStream can be fed into an AnalyserNode, which can be used to check whether the audio volume is greater than zero.
Only works in Chrome and maybe Edge (Only tested in Chrome 80 on Linux)
JSFiddle with <video>, <audio> and YouTube!
Important bits of code (cannot post in a working snippet because of the Feature Policies on the snippet iframe):
var audioCtx = new AudioContext();
var analyser = audioCtx.createAnalyser();
var bufferLength = analyser.fftSize;
var dataArray = new Float32Array(bufferLength);

window.isAudioPlaying = () => {
  analyser.getFloatTimeDomainData(dataArray);
  for (var i = 0; i < bufferLength; i++) {
    if (dataArray[i] != 0) return true;
  }
  return false;
}

navigator.mediaDevices.getDisplayMedia({
  video: true,
  audio: true
})
.then(stream => {
  if (stream.getAudioTracks().length > 0) {
    var source = audioCtx.createMediaStreamSource(stream);
    source.connect(analyser);
    document.body.classList.add('ready');
  } else {
    console.log('Failed to get stream. Audio not shared or browser not supported');
  }
}).catch(err => console.log("Unable to open capture: ", err));
I read all the MDN docs about the Web Audio API, but I didn't find any global flag on window that shows whether audio is playing. However, I found a tricky way that shows whether ANY <audio> or <video> element on the page is playing, though it does not cover the Web Audio API:
const allAudio = Array.from( document.querySelectorAll('audio') );
const allVideo = Array.from( document.querySelectorAll('video') );
const isPlaying = [...allAudio, ...allVideo].some(item => !item.paused);
Now, using the isPlaying flag, we can detect whether any audio or video element is playing in the browser.
There is a playbackState property (https://developer.mozilla.org/en-US/docs/Web/API/MediaSession/playbackState), but not all browsers support it.
if(navigator.mediaSession.playbackState === "playing"){...
I was looking for a solution on Google, but I didn't find anything yet.
Maybe you could check some value that only changes when audio is playing. If you have a button that starts playing the audio file, you could be sure the audio is playing by adding an event listener to that button...
Maybe something like adding an event listener to the audio tag? If I remember correctly, the audio element has a paused property...
Also, you may want to check this topic: HTML5 check if audio is playing?
I just found it five seconds ago, haha.
I am using a file input to capture recorded video from a user's mobile device. What I want to do is then read that file somehow and determine whether it is over a certain duration (30 seconds in this case). If it is over that duration, the file should not be allowed to be uploaded to the server. If it is under the duration, then it is okay.
I can accurately detect the duration of the file in javascript on desktop, but not on mobile, which is what I need. This is my code:
onEndRecord = function(e) {
  var file = e.target.files[0];

  var videoElement = document.createElement('video');
  document.body.appendChild(videoElement);

  var fileURL = window.URL.createObjectURL(file);

  videoElement.addEventListener('loadeddata', function(e) {
    console.log('loadeddata', e.target.duration);
  });

  videoElement.onload = function () { // binding onload event
    console.log('onload', videoElement.duration);
  };

  videoElement.src = fileURL;
}
Anybody know how to get this information? The duration just reports as zero on mobile.
I've also tried running it through the FileReader API:
readBlob = function(file) {
  console.log('readBlob', file);

  var reader = new FileReader();

  reader.onload = function (e) {
    console.log('reader load');

    var player = document.getElementById('videoReader');
    player.addEventListener('loadeddata', function(e) {
      console.log('loadeddata', e.target.duration);
      player.play();
    });

    var fileURL = window.URL.createObjectURL(file);
    player.src = fileURL;
  };

  reader.readAsDataURL(file);
}
What I believe is happening is that the loadedmetadata event (or the loadeddata event, as in your question) just does not fire on the mobile devices you are testing, so the duration is not available for reading and is reported as 0. Have a look here for the events defined in the HTML5 media element specs. Note that you could use the loadstart event of the media element, rather than the onload event, for fine-tuning your web application.
Typically, on iOS the event will only fire on user interaction, not before (the same goes for the canplay event). This is a limitation intended to reduce bandwidth consumption for users on paid mobile data plans. This is described here by Apple. The same generally goes for Android.
With the Web Audio API you could get the duration from the buffer returned by the decodeAudioData method. Here is some information on the subject.
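A minimal sketch of that approach (an assumption on my part; it only works if the browser can decode the file's audio track):
function getDurationViaWebAudio(file) {
  const reader = new FileReader();
  reader.onload = function () {
    const audioCtx = new AudioContext();
    audioCtx.decodeAudioData(reader.result, function (buffer) {
      // buffer.duration is the decoded length in seconds
      console.log('duration (s):', buffer.duration);
    }, function (err) {
      console.error('could not decode file', err);
    });
  };
  reader.readAsArrayBuffer(file);
}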
You could read this information server side with PHP or Java, but that would not fit your design well.
So you could either get the user to play back the recorded sample before uploading (to have access to the duration), or, if you know the average bitrate at which the video was recorded and the file size (File API), you could approximate the duration.
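A rough sketch of the bitrate approximation (the bitrate value is just an assumed example and must match your recording settings):
// duration [s] is approximately size [bits] / bitrate [bits per second]
function approximateDuration(file, averageBitrateBps) {
  return (file.size * 8) / averageBitrateBps;
}

// e.g. with an assumed average of 2 Mbit/s:
// approximateDuration(file, 2 * 1000 * 1000);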
Solved this by using FFmpeg's ffprobe tool. We download just a small amount of the video (about 4 KB is enough) and then read the metadata. For QuickTime/MP4, however, the metadata (the moov atom) sits at the end of the file rather than the beginning, so it has to be moved to the front first. This was done using a modified version of qt-faststart:
https://github.com/danielgtaylor/qtfaststart
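For reference, a minimal Node.js sketch of reading the duration with ffprobe (the file path is just a placeholder; ffprobe must be installed and on the PATH):
const { execFile } = require("child_process");

// Ask ffprobe for the container metadata as JSON and pull out the duration
execFile(
  "ffprobe",
  ["-v", "quiet", "-print_format", "json", "-show_format", "partial-download.mp4"],
  (err, stdout) => {
    if (err) return console.error("ffprobe failed:", err);
    const info = JSON.parse(stdout);
    console.log("duration (s):", parseFloat(info.format.duration));
  }
);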