Control Netflix player using JavaScript

I would like to make a Chrome extension that is able to control the Netflix player.
The current Netflix player is written in HTML5 as far as I can tell, so I was wondering if there is a way to control the player, e.g. play, pause, volume control and changing the position of the video.
I've tried using this to control the playing and pausing functions and it works.
document.getElementsByClassName("player-control-button player-play-pause")[0].click();
I've also tried using the following, but I just get an error saying that videoPlayer() isn't a function:
netflix.cadmium.objects.videoPlayer();
Is there something similar I can do to change the volume and the position of the video?
Thanks!

First, get the <video> element as a variable, e.g.:
media = document.getElementById("netflixVideoPlayer");
After that you can control the volume.
To mute the sound:
media.volume = 0
Turn the volume to 100%:
media.volume = 1
Turn the volume to 60%:
media.volume = 0.6
Start the video:
media.play();
Pause the video:
media.pause();
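The question also asks about changing the position of the video. The same element exposes the standard currentTime property (in seconds), so a minimal sketch, assuming media still refers to the <video> element from above, would be the following; note that Netflix's own player logic may override or reject direct seeks on the raw element:
media.currentTime = 300;   // jump to the 5-minute mark
media.currentTime += 10;   // skip forward 10 seconds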

Related

seamlessly change video resolution while video is playing

I am writing code that changes the video resolution depending on the current screen size. When the fullscreen button is clicked, I check the screen size, and if it is bigger than 1280px, a 1080p video is used instead of the 720p one.
I do that by changing the src of the video element. Unfortunately, this causes a delay of a second or more, because the higher-resolution video needs to load first.
How can I create a seamless transition between the two resolutions? YouTube and Facebook videos sometimes change resolution depending on your network conditions, and the switch is seamless in terms of delay.
This is my basic code. I use the Plyr library:
html
<video id="main-video" playsinline poster="/assets/img/video.png" class="element-video">
<source id="main-video-source" src="/assets/img/video.mp4" type="video/mp4" size="1080">
</video>
js
var player = new Plyr('#main-video', {controls: ['play-large', 'play', 'progress', 'current-time', 'mute', 'volume', 'settings', 'fullscreen']});
player.on('enterfullscreen', event => {
    var videoPlayer = document.getElementById("main-video");
    if (window.devicePixelRatio * window.innerWidth > 1280) {
        var currentTime = videoPlayer.currentTime;
        videoPlayer.src = "video.mp4";
        videoPlayer.currentTime = currentTime;
        videoPlayer.play();
    } else {
        var currentTime = videoPlayer.currentTime;
        videoPlayer.src = "video-720.mp4";
        videoPlayer.currentTime = currentTime;
        videoPlayer.play();
    }
});
As Joel says, using Adaptive Bit Rate (ABR) streaming is currently the easiest way to get the effect you are looking for. See here for more info on ABR and an example of how to view it in action: https://stackoverflow.com/a/42365034/334402
Most video player clients will support ABR, and will give the type of smooth(ish...) transition you see on services like YouTube or Netflix when it steps through different resolutions. Having more different resolutions or 'steps' may make it smoother so it may be worth experimenting to find what is acceptable for your use case.
Also, as you already have at least two resolution versions of the video any extra server side overhead is not too great for your case.
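As a rough sketch of what client-side ABR playback can look like, assuming the video has been repackaged into an HLS rendition set (the manifest URL below is a placeholder) and that the hls.js library is included on the page:
// hls.js switches between renditions automatically based on measured bandwidth.
var video = document.getElementById('main-video');
if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('/assets/video/video.m3u8'); // placeholder manifest URL
    hls.attachMedia(video);
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari plays HLS natively
    video.src = '/assets/video/video.m3u8';
}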

How do I listen to resolution changes from a media stream

When I call getUserMedia() I can get my current video resolution by using onloadedmetadata on the srcObject.
However, when the video changes its resolution for some reason (in my example the user rotates the camera to portrait), the old video resolution becomes obsolete and I would like to get the new one.
This is what my initial fetch of the resolution looks like:
this._localVideo.onloadedmetadata = (e) => {
    e.srcElement.videoWidth;
    e.srcElement.videoHeight;
};
but how do I know if these values are changing?
EDIT: Turns out re-setting srcObject on the video element triggers onloadedmetadata again with updated values.
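Building on that EDIT, a minimal sketch (the refreshStream helper is hypothetical) that re-assigns srcObject whenever the resolution may have changed, so loadedmetadata fires again with the updated dimensions:
const video = this._localVideo; // the <video> element from the question

video.onloadedmetadata = (e) => {
    console.log('resolution:', e.srcElement.videoWidth, 'x', e.srcElement.videoHeight);
};

// Hypothetical helper: call this when the resolution may have changed
// (e.g. after the camera rotates) to trigger loadedmetadata again.
function refreshStream(stream) {
    video.srcObject = stream;
}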

Create.js and Animate CC -> Navigate to a frame in a "Bitmap video"

I am working on what will hopefully be my first create.js project in Animate CC at my current employer.
I am trying to load in a .mp4 into a Bitmap object, using
HTML5ElementForVideo = document.createElement('video');
HTML5ElementForVideo.src = 'bridge-animation-resized-794x652.mp4';
HTML5ElementForVideo.autoplay = false;
video = new createjs.Bitmap(HTML5ElementForVideo);
video.x = 110.00;
video.y = 42.5;
stage.addChild(video);
...which works okay, as expected. In the video there is a series of steps, which we would like the user to be able to move between using "Previous" and "Next" step buttons.
I assumed that navigation wise, I would be able to use something along the lines of:
[video].gotoAndPlay(x)
to move to the right frame in the video. However, this does not seem to work or be supported? I only seem to be able to play the video or stop it with .play and .pause.
Any suggestions, please?
Dave
The video is not an Animate or CreateJS object - only the Bitmap drawing it to the stage is. Check out the HTML Video API for how to control it; to set the position, use the currentTime property.
Hope that helps!
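For the "Previous"/"Next" step buttons, a minimal sketch of that idea (the 24fps value and the gotoVideoFrame helper are assumptions, not CreateJS API) that converts a frame index into seconds and seeks the underlying <video> element:
// Assumes the video was exported at 24fps; use the real frame rate.
var fps = 24;

function gotoVideoFrame(frame) {
    // Seek the underlying HTML5 <video> element, not the createjs.Bitmap.
    HTML5ElementForVideo.currentTime = frame / fps;
    HTML5ElementForVideo.play();
}

// e.g. jump to the (hypothetical) frame where a step begins:
gotoVideoFrame(240);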

Autoplay once in HTML 5 and/or video.js

I want an embedded video (MP4) to autoplay once only in video.js, then either return to a poster or the first frame, so a click will start playback a second time.
Currently it's autoplaying but stopping on the last frame, and the only way I can replay it is by dragging the play bar to the left (except in Chrome, where it plays from the start with a click).
Here's how you can go back to the first frame.
Suppose your video.js player object is named "player":
player.on("ended", function () {
this.currentTime(0); //move to start
}
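If you would rather return to the poster instead of the first frame, one option (a sketch against the plain <video> element rather than the video.js API) is to call load(), which resets the element to its initial poster state:
var video = document.querySelector("video");
video.addEventListener("ended", function () {
    video.load(); // reloads the element, so the poster is shown again
});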

HTML5: Manually "playing" a video inside a canvas, in iOS Safari

So, here's what I'm trying to do:
I want to load a video into a video element, but not have it played in the "normal" way.
Using a timed interval, calculated according to the movie's framerate, I want on each iteration to
A. Manually advance the video one 'frame' (or as close as possible to that).
B. Draw that frame into a canvas.
Thereby having the video "play" inside the canvas.
Here's some code:
<video width="400" height="224" id="mainVideo" src="urltovideo.mp4" type="video/mp4"></video>
<canvas width="400" height="224" id="videoCanvas"></canvas>
<script type="text/javascript">
var videoDom = document.querySelector("#mainVideo");
var videoCanvas = document.querySelector("#videoCanvas");
var videoCtx = null;
var interval = null;

videoDom.addEventListener('canplay', function() {
    // The video's framerate is 24fps, so we should move one frame each 1000/24 = 41.66 ms
    interval = setInterval(function() { doVideoCanvas(); }, 41.66);
});

videoDom.addEventListener('loadeddata', function() {
    videoCtx = videoCanvas.getContext('2d');
});

function doVideoCanvas() {
    videoCtx.drawImage(videoDom, 0, 0);
    // AFAIK and seen, currentTime is in seconds
    videoDom.currentTime += 0.0416;
}
</script>
This works perfectly in Google Chrome, but it doesn't work in the iPhone's Safari; the video frames do not get drawn to the canvas at all.
I made sure that:
The video events I hooked into do get triggered (I did 'alerts' and they were shown).
I have control over the canvas (did a 'fillRect' and it filled).
[I also tried specifying dimensions in the drawImage - it didn't help]
Is drawImage with a video object not applicable at all in iPhone Safari...?
Even if I manage to find a way to capture the video frames, there are also some other issues in the iPhone's browser:
Access to the currentTime property of the video is only granted once the video has started playing (in a standard way). I thought about maybe somehow "playing it hidden" and then capturing, but didn't manage to do that. I also thought of maybe starting the video and then immediately stopping it.
There doesn't seem to be any way to forcefully start playing a video in iOS Safari. video.play() or autoplay doesn't seem to do anything. The video only starts playing if the user taps the "play circle" on it (taking over the whole screen, as usual with videos in the iPhone's browser).
Once the video plays, the currentTime property does get advanced on the video itself. When you pause the video and go back to the normal HTML page, you can see the frames on the video element changing, though at a slow pace (unlike in Google Chrome, where the rate seems smooth enough to make it look like it's playing); on the iPhone it looks like a rate of roughly 2-3 frames per second. It stays the same even if I change the interval timing, so I guess there's a minimum interval that the iPhone's browser can handle.
"Bonus question" :)
- When the frames on the video element progress from the event, the circle "play button" is visible on the video element (since it is not actually 'playing'). Is there any way to hide it and make it invisible?
This has been tested on an iPhone 3GS (with both 3G and WiFi) and an iPhone 4, both running iOS 5, with the same results as described.
Unfortunately I don't have an iOS device to test this, but I don't think you need to actually capture the video frames in the way that you're attempting using the currentTime property. The usual process looks something like this:
create a video element without controls (omit the controls attribute)
hide the element
when the video is playing draw it to the canvas on an interval
As you've discovered, iOS devices do not support autoplay on HTML5 video (or indeed audio) but you can create a separate control to initiate playback of the video using the play() method.
This approach should solve the issue you're having with the play button being visible since in this case you are actually playing the video.
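A rough sketch of that flow, reusing the #mainVideo and #videoCanvas elements from the question (the #playButton control is hypothetical, and this is untested on iOS, per the caveat above):
var videoDom = document.querySelector("#mainVideo");
var videoCanvas = document.querySelector("#videoCanvas");
var videoCtx = videoCanvas.getContext("2d");

// Hide the video; only the canvas will be visible.
videoDom.style.display = "none";

// iOS only starts playback from a user gesture, so wire play() to a button.
document.querySelector("#playButton").addEventListener("click", function () {
    videoDom.play();
});

// While the hidden video is playing, copy frames onto the canvas.
videoDom.addEventListener("play", function () {
    var drawInterval = setInterval(function () {
        if (videoDom.paused || videoDom.ended) {
            clearInterval(drawInterval);
            return;
        }
        videoCtx.drawImage(videoDom, 0, 0, videoCanvas.width, videoCanvas.height);
    }, 1000 / 24); // roughly one draw per frame at 24fps
});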
I don't believe the loadeddata event is called on iOS; try the loadedmetadata event instead. I also found it necessary on iOS to call the videoDom.load() method after setting videoDom.src.
For my use case, I need to do a "dRAF" (double requestAnimationFrame) after the seeked event to ensure something was actually drawn to the canvas rather than a transparent rectangle.
Try something like:
videoDom.onloadedmetadata = () => {
    videoCanvas.height = videoDom.videoHeight
    videoCanvas.width = videoDom.videoWidth
    videoDom.currentTime = 0
}

videoDom.onseeked = () => {
    // delay the drawImage call, otherwise we get an empty videoCanvas on iOS
    // see https://stackoverflow.com/questions/44145740/how-does-double-requestanimationframe-work
    window.requestAnimationFrame(() => {
        window.requestAnimationFrame(() => {
            videoCanvas.getContext('2d').drawImage(videoDom, 0, 0, videoCanvas.width, videoCanvas.height)
            videoDom.currentTime += 0.0416
        })
    })
}

videoDom.load()
From this gist
