HTML5: Manually "playing" a video inside a canvas in iOS Safari - javascript
So, here's what I'm trying to do:
I want to load a video into a video element, but not have it played in the "normal" way.
Using a timed interval, calculated according to the movie's framerate, I want on each iteration to
A. Manually advance the video one 'frame' (or as close as possible to that).
B. Draw that frame into a canvas.
Thereby having the video "play" inside the canvas.
Here's some code:
<video width="400" height="224" id="mainVideo" src="urltovideo.mp4" type="video/mp4"></video>
<canvas width="400" height="224" id="videoCanvas"></canvas>
<script type="text/javascript">
var videoDom = document.querySelector("#mainVideo");
var videoCanvas = document.querySelector("#videoCanvas");
var videoCtx = null;
var interval = null;
videoDom.addEventListener('canplay',function() {
// The video's framerate is 24fps, so we should move one frame each 1000/24=41.66 ms
interval = setInterval(function() { doVideoCanvas(); }, 41.66);
});
videoDom.addEventListener('loadeddata',function() {
videoCtx = videoCanvas.getContext('2d');
});
function doVideoCanvas() {
videoCtx.drawImage(videoDom,0,0);
//AFAIK and seen, currentTime is in seconds
videoDom.currentTime += 0.0416;
}
</script>
This works perfectly in Google Chrome, but it doesn't work in Safari on an iPhone;
the video frames do not get drawn to the canvas at all.
I made sure that:
The video events I hooked into do get triggered (I did 'alerts' and they were shown).
I have control over the canvas (did a 'fillRect' and it filled).
[I also tried specifying dimensions in the drawImage call - it didn't help]
Is drawImage with a video object simply not supported in iPhone Safari...?
Even if I manage to find a way to capture the video frames, there are also some other issues in the iPhone's browser:
Access to the currentTime property of the video is only granted once the video has started playing (in the standard way). I thought about maybe somehow "playing it hidden" and then capturing, but didn't manage to do that. I also thought of maybe starting the video and then immediately pausing it;
There doesn't seem to be any way to forcibly start playback of a video in iOS Safari. video.play() and autoplay don't seem to do anything. Only if the user taps the "play circle" on the video does the video start playing (taking over the whole screen, as usual with videos in the iPhone's browser).
Once the video plays - the currentTime property does advance, on the video itself. When you pause the video and go back to the normal HTML page, you can see the frames on the video element changing, though at a slow pace (unlike in Google Chrome, where the rate seems smooth enough to make it look like it's playing) - on the iPhone it looks like a rate of maybe 2-3 frames per second. It stays the same even if I change the interval timing; I guess there's a minimum timer interval that the browser on the iPhone can handle.
"Bonus question" :)
- When the frames on the video element progress from the event, the circled "play button" is visible on the video element (since it is not actually 'playing'). Is there any way to hide it and make it invisible?
This has been tested on an iPhone 3GS (with both 3G and Wi-Fi) and an iPhone 4, both running iOS 5, with the same results as described.
Unfortunately I don't have an iOS device to test this, but I don't think you need to actually capture the video frames in the way that you're attempting using the currentTime property. The usual process looks something like this:
create a video element without controls (omit the controls attribute)
hide the element
when the video is playing draw it to the canvas on an interval
As you've discovered, iOS devices do not support autoplay on HTML5 video (or indeed audio) but you can create a separate control to initiate playback of the video using the play() method.
This approach should solve the issue you're having with the play button being visible since in this case you are actually playing the video.
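A minimal sketch of that flow, reusing the question's element ids (mainVideo, videoCanvas) and its 24 fps figure; the #playButton element is hypothetical, standing in for whatever separate control you give the user to start playback:

```javascript
// Derive the draw interval from the frame rate (pure, testable on its own).
function frameIntervalMs(fps) {
  return 1000 / fps;
}

if (typeof document !== 'undefined') {
  var video = document.querySelector('#mainVideo');   // no "controls" attribute
  var canvas = document.querySelector('#videoCanvas');
  var ctx = canvas.getContext('2d');

  // Hide the element rather than removing it; it must stay in the DOM.
  video.style.display = 'none';

  var drawTimer = null;
  video.addEventListener('play', function () {
    drawTimer = setInterval(function () {
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    }, frameIntervalMs(24));
  });
  video.addEventListener('pause', function () { clearInterval(drawTimer); });
  video.addEventListener('ended', function () { clearInterval(drawTimer); });

  // iOS requires a user gesture, so wire play() to a visible button.
  document.querySelector('#playButton').addEventListener('click', function () {
    video.play();
  });
}
```

Because the video is genuinely playing, currentTime advances by itself and no manual seeking is needed.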
I don't believe the loadeddata event is fired on iOS; try the loadedmetadata event instead. I also found it necessary on iOS to call the videoDom.load() method after setting videoDom.src.
For my use case, I need to do a "dRAF" (double requestAnimationFrame) after the seeked event to ensure something was actually drawn to the canvas rather than a transparent rectangle.
Try something like:
videoDom.onloadedmetadata = () => {
videoCanvas.height = videoDom.videoHeight
videoCanvas.width = videoDom.videoWidth
videoDom.currentTime = 0
}
videoDom.onseeked = () => {
// delay the drawImage call, otherwise we get an empty videoCanvas on iOS
// see https://stackoverflow.com/questions/44145740/how-does-double-requestanimationframe-work
window.requestAnimationFrame(() => {
window.requestAnimationFrame(() => {
videoCanvas.getContext('2d').drawImage(videoDom, 0, 0, videoCanvas.width, videoCanvas.height)
videoDom.currentTime += 0.0416
})
})
}
videoDom.load()
From this gist
Related
seamlessly change video resolution while video is playing
I am writing code that changes video resolution depending on the current screen size. When the fullscreen button is clicked I check the screen size, and if it is bigger than 1280px a 1080p video is used instead of 720p. I do that by changing the src of the video element. Unfortunately, this causes a delay of a second or more, because the higher-resolution video needs to load first. How can I create a seamless transition between the two resolutions? YouTube or Facebook videos sometimes change resolution depending on your network conditions, and it is seamless in terms of delay. This is my basic code. I use the plyr library:

<video id="main-video" playsinline poster="/assets/img/video.png" class="element-video">
<source id="main-video-source" src="/assets/img/video.mp4" type="video/mp4" size="1080">
</video>

var player = new Plyr('#main-video', {controls: ['play-large', 'play', 'progress', 'current-time', 'mute', 'volume', 'settings', 'fullscreen']});
player.on('enterfullscreen', event => {
  var videoPlayer = document.getElementById("main-video");
  var currentTime = videoPlayer.currentTime;
  if (window.devicePixelRatio * window.innerWidth > 1280) {
    videoPlayer.src = "video.mp4";
  } else {
    videoPlayer.src = "video-720.mp4";
  }
  videoPlayer.currentTime = currentTime;
  videoPlayer.play();
});
As Joel says, using Adaptive Bit Rate (ABR) streaming is currently the easiest approach to get the effect you are looking for. See here for more info on ABR and an example of how to view it in action: https://stackoverflow.com/a/42365034/334402 Most video player clients support ABR and will give the type of smooth(ish...) transition you see on services like YouTube or Netflix when they step through different resolutions. Having more resolutions, or 'steps', may make the transition smoother, so it may be worth experimenting to find what is acceptable for your use case. Also, since you already have at least two resolution versions of the video, the extra server-side overhead is not too great in your case.
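For completeness, a rough sketch of what the client side might look like with hls.js (a common ABR player); the manifest path /assets/img/video.m3u8 is hypothetical and would have to be generated server-side from the existing renditions:

```javascript
// Pure helper mirroring the question's size check: which rendition height
// would be preferred for a given CSS width and device pixel ratio.
function preferredHeight(cssWidth, pixelRatio) {
  return cssWidth * pixelRatio > 1280 ? 1080 : 720;
}

if (typeof document !== 'undefined' && typeof Hls !== 'undefined' && Hls.isSupported()) {
  var video = document.getElementById('main-video');
  var hls = new Hls();
  hls.loadSource('/assets/img/video.m3u8'); // hypothetical HLS manifest
  hls.attachMedia(video);
  // From here hls.js measures bandwidth and switches between the
  // renditions listed in the manifest without reloading the element.
}
```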
How do I listen to resolution change from media stream
When I call getUserMedia() I can get my current video resolution by using onloadedmetadata on the srcObject. However, when the video changes its resolution for some reason - in my example the user rotates the camera to portrait - the old video resolution becomes obsolete and I would like to get the new one. This is how my initial fetch of the resolution looks:

this._localVideo.onloadedmetadata = (e) => {
  e.srcElement.videoWidth;
  e.srcElement.videoHeight;
};

But how do I know when these values change? EDIT: It turns out re-setting srcObject on the video element triggers onloadedmetadata again with updated values.
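The edit's workaround aside, the change can also be observed directly: an HTMLVideoElement fires a resize event whenever its videoWidth/videoHeight change. A sketch (the localVideo id is hypothetical):

```javascript
// Pure helper: format a resolution for logging (testable outside a browser).
function formatResolution(width, height) {
  return width + 'x' + height;
}

if (typeof document !== 'undefined') {
  var video = document.getElementById('localVideo'); // hypothetical id
  video.addEventListener('resize', function () {
    // Fires whenever videoWidth/videoHeight change, e.g. on camera rotation.
    console.log('new resolution: ' + formatResolution(video.videoWidth, video.videoHeight));
  });
}
```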
Create.js and Animate CC -> Navigate to a frame in a "Bitmap video"
I am working on what will hopefully be my first create.js project in Animate CC at my current employer. I am trying to load an .mp4 into a Bitmap object, using:

HTML5ElementForVideo = document.createElement('video');
HTML5ElementForVideo.src = 'bridge-animation-resized-794x652.mp4';
HTML5ElementForVideo.autoplay = false;
video = new createjs.Bitmap(HTML5ElementForVideo);
video.x = 110.00;
video.y = 42.5;
stage.addChild(video);

...which works okay, and as expected. In the video, there is a series of steps which we would like the user to be able to go between, using "Previous" and "Next" step buttons. I assumed that, navigation-wise, I would be able to use something along the lines of:

[video].gotoAndPlay(x)

to move to the right frame in the video. However, this does not seem to work or be supported? I only seem to be able to play or stop the video with .play and .pause. Any suggestions, please? Dave
The video is not an Animate or CreateJS object - only the Bitmap drawing it to the stage is. Check out the HTML video API for how to control it; to set the position, use the currentTime property. Hope that helps!
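A sketch of how the Previous/Next navigation could then be wired up with currentTime; the stepTimes array and the prev/next button ids are hypothetical placeholders for your own step markers and controls:

```javascript
// Pure helper: keep the step index inside [0, total - 1].
function clampStep(index, total) {
  return Math.min(Math.max(index, 0), total - 1);
}

var stepTimes = [0, 2.5, 5.0, 8.75]; // hypothetical step timestamps, in seconds
var currentStep = 0;

function goToStep(videoEl, index) {
  currentStep = clampStep(index, stepTimes.length);
  videoEl.pause();
  videoEl.currentTime = stepTimes[currentStep]; // seek to the step's frame
}

if (typeof document !== 'undefined') {
  var videoEl = document.createElement('video');
  videoEl.src = 'bridge-animation-resized-794x652.mp4';
  document.getElementById('next').onclick = function () { goToStep(videoEl, currentStep + 1); };
  document.getElementById('prev').onclick = function () { goToStep(videoEl, currentStep - 1); };
}
```

The CreateJS Bitmap keeps drawing whatever frame the underlying video element is showing, so seeking the element is enough.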
Control Netflix player using JavaScript
I would like to make a Chrome extension that is able to control the Netflix player. The current Netflix player is written in HTML5 as far as I can tell, so I was wondering if there is a way to control the player, e.g. play, pause, volume control and changing the position of the video. I've tried using this to control the play and pause functions, and it works:

document.getElementsByClassName("player-control-button player-play-pause")[0].click();

I've also tried using the following, but then I just get an error saying that videoPlayer() isn't a function:

netflix.cadmium.objects.videoPlayer();

Is there something similar I can do to change the volume and the position of the video? Thanks!
First get the <video> element as a variable, e.g. by:

media = document.getElementById("netflixVideoPlayer");

After that you can control the volume. Mute the sound: media.volume = 0; turn the volume to 100%: media.volume = 1; turn it to 60%: media.volume = 0.6. Start the video: media.play(); (media elements have no start() method). Pause the video: media.pause();
context drawImage / canvas toDataURL lagging significantly on opera mobile
I have a bit of JavaScript that runs on an Android tablet, running Opera Mobile 12. The tablet is attached to the wall in an office, so it's used by everyone who works in this office as a timekeeping system (i.e. they use it to clock in/clock out of work). This JavaScript takes a photo of the user when a certain event is raised, then converts this photo to a data URL so it can be sent to a server. This all runs in the background, though - the video and canvas elements are both set to display:none;. Here's the code that handles the webcam interaction:

var Webcam = function() {
  var video = document.getElementById('video');
  var stream_webcam = function(stream) {
    video.addEventListener('loadedmetadata', function() {
      video.play();
    }, false);
    video.src = window.URL.createObjectURL(stream);
  }
  // start recording
  if (navigator.getUserMedia)
    navigator.getUserMedia({video: true}, stream_webcam);
  else
    _error("No navigator.getUserMedia support detected!");
  var w = {};
  w.snap = function(callback) {
    var canvas = document.getElementById('canvas');
    canvas.getContext('2d').drawImage(video, 0, 0, canvas.width, canvas.height);
    // delay is incurred in writing to canvas on opera mobile
    setTimeout(function() {
      callback(canvas.toDataURL());
    }, 1);
  };
  return w;
}

w.snap(callback) gets called when the user presses a certain button. The problem is that there is a significant lag between the call to drawImage and an image actually being drawn to the canvas. Here's the problem that occurs:

User 1 uses the tablet, presses the button, has their photo "snapped" and sent to the server. The server receives a blank, transparent image. (It's not the same issue as "html5 canvas toDataURL returns blank image" - the data being sent to the server is substantially less for the first image, because it's blank.)
A few minutes later, user 2 uses the tablet, presses the button, and has their photo "snapped". The server receives the photo of user 1.
Later, user 3 does the same thing, and the server gets a photo of user 2.

So I'm guessing the call to toDataURL() is returning old data because drawImage() hasn't finished drawing yet. (I read somewhere that it's async, but I can't confirm, and can't find any way of attaching an event to when it finishes drawing.) I've tried:

changing the timeout in snap() to a variety of values. The highest I tried was 2000, but the same issue occurred. (If anyone suggests higher numbers, I'm happy to try them, but I don't want to go much higher, because if I go too high there's potential for user 3 to have their photo taken before I've even processed user 2's photo, which means I might lose it entirely!)
having the canvas draw something when the page first loads, but that didn't fix it - when user 1 was photographed, the server received whatever photo was taken when the page loaded, instead of the photo of user 1.
"getting" the canvas element again, using getElementById, before calling toDataURL.

A few more notes:

This works fine on my computer (MacBook Air) in Chrome and Opera.
I can't reliably replicate it using a debugger (linking to the tablet using Opera Dragonfly).
AFAIK Opera Mobile is the only Android browser that supports getUserMedia and can thus take photos.
If I remove the display:none; from the video and canvas, it works correctly (on the tablet). The issue only occurs when the video and canvas have display:none; set.

Because the page is a single-page app with no scrolling or zooming required, a possible workaround is to move the video and canvas below the page fold, then disable scrolling with JavaScript (i.e. scroll to the top whenever the user scrolls). But I would rather have a proper fix than a workaround, if possible!
I don't have a very good solution, but I would suggest not handling an invisible canvas through the DOM by setting it invisible. Instead of

var canvas = document.getElementById('canvas');

try

var canvas = document.createElement('canvas');
canvas.width = video.width;
canvas.height = video.height;
canvas.getContext('2d').drawImage(video, 0, 0);

This way it's separate from the DOM. Also, why not use canvas.toDataURL("image/jpg")? EDIT: Also, if you're designing it, have you tried the other browsers out there? What's restricting you to Opera over the other browsers available, or over using PhoneGap? EDIT2: Thinking about it, canvas also has two other options for getting that photo into place that you would want to look into. Those two being:

var imgData = canvas.getContext('2d').getImageData(0, 0, canvas.width, canvas.height);
-or-
canvas.getContext('2d').putImageData(imgData, 0, 0);

These two ignore any scaling you've done to the context, but I've found that they are much more direct and often faster. They are a solid alternative to toDataURL and drawImage, but the image data you put and get from these is encoded as a flat array in the form [r1, g1, b1, a1, r2, g2, b2, a2, ...]. You can find documentation for them here:
http://www.w3schools.com/tags/canvas_putimagedata.asp
http://www.w3schools.com/tags/canvas_getimagedata.asp
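A sketch tying those suggestions together - an off-DOM canvas plus getImageData/putImageData. The only browser-independent part is the RGBA offset math, so that is factored out; the video element id is taken from the question:

```javascript
// Pure helper: byte offset of pixel (x, y) in a flat RGBA ImageData buffer.
function pixelOffset(x, y, width) {
  return (y * width + x) * 4;
}

if (typeof document !== 'undefined') {
  var video = document.getElementById('video');
  var canvas = document.createElement('canvas'); // off-DOM, as suggested above
  canvas.width = video.videoWidth;   // intrinsic frame size
  canvas.height = video.videoHeight;
  var ctx = canvas.getContext('2d');
  ctx.drawImage(video, 0, 0);

  // Read the raw pixels back; imgData.data is [r,g,b,a, r,g,b,a, ...].
  var imgData = ctx.getImageData(0, 0, canvas.width, canvas.height);
  var topLeftRed = imgData.data[pixelOffset(0, 0, canvas.width)];
  console.log('red channel of top-left pixel: ' + topLeftRed);

  // ...and write them to another canvas without re-encoding.
  var copy = document.createElement('canvas');
  copy.width = canvas.width;
  copy.height = canvas.height;
  copy.getContext('2d').putImageData(imgData, 0, 0);
}
```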
Some versions of Android have problems with toDataURL(). Maybe this is the solution? http://code.google.com/p/todataurl-png-js/wiki/FirstSteps The script: http://todataurl-png-js.googlecode.com/svn/trunk/todataurl.js Paste this in your head: <script src="todataurl.js"></script>