I have a Netflix account and I have peeked under the hood at its video player running inside Google Chrome. Netflix calls its video player "Cadmium", and the JavaScript exposes all the functions and event handlers you might expect, such as play, stop, pause, mute, etc. I'm building a little Chrome extension that would enable me to call these Cadmium player functions, but the hard part for me is figuring out how to create an instance of the player so I can start making calls. The JavaScript is large, complex, and somewhat obscure. Once I can create an instance of that player, I'm thinking that making calls into the functions will be easy.
Here is a relevant chunk of js:
muteOn: function() {
    this.savedVolume = this.getVolume(),
    this.updateVolumeDisplay(0),
    this.scrubber.updatePercent(0),
    this.muted = !0,
    this.videoPlayer.setMuted(this.muted)
}
In Chrome dev tools I can set a breakpoint inside that block, and execution hits the breakpoint when I click the Mute button on the Netflix video player. The Netflix js is (unsurprisingly) heavily obfuscated via method renaming. I tried stepping through the code in the debugger and ended up down a hundred rabbit holes, never able to find my way to the top of the stack so that I could make that same call (at the top of the stack) to simulate the user clicking the mute button. I also tried the approach of programmatically clicking the mute button on the UI player, which would meet my needs equally well, but they have serious defensive mechanisms in there, spinning me like a top.
Since there are over 100K lines of javascript, and I'm uncertain which chunks exactly would be relevant for this post, I would like to suggest that you load Netflix in Chrome, open dev tools, play a movie, and inspect the pause or mute button. Interacting with those video player controls takes you into the maze of javascript which I'm trying to see how I can tap into to control aspects of the player programmatically (just from dev tools is fine for now). Another important thing I need to figure out is how to query the video player to determine the current elapsed time of the playing video.
Any ideas how I can crack this nut? (Thanks in advance!)
Using Chrome, I get playback using HTML5 video.
Once you get hold of the <video> element, you can use the HTML5 video API:
Get the <video> element
var video = document.evaluate('//*[@id="70143639"]/video', document).iterateNext()
70143639 is the id of the video, as in https://www.netflix.com/watch/70143639
Remaining time (HH:mm)
document.evaluate('//*[@id="netflix-player"]/div[4]/section[1]/label', document).iterateNext().innerHTML
Elapsed time (seconds)
video.currentTime
Elapsed time updates
video.addEventListener("timeupdate",
    function(e) {
        console.debug("Seconds elapsed: ", e.timeStamp/1000/60);
    }
);
Note that I don't get the same results as with video.currentTime; you may need to use an offset based on the difference. It may also be something explained in the spec: https://www.w3.org/TR/html5/embedded-content-0.html
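If the offset proves unreliable, a simpler sketch is to read currentTime directly from the element inside the handler rather than deriving a time from the event:
video.addEventListener("timeupdate", function () {
    // currentTime is the elapsed media time in seconds
    console.debug("Seconds elapsed: ", video.currentTime);
});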
Play
video.play();
Pause
video.pause();
Go back and forth in time
Courtesy of rebelliard:
netflix.cadmium.UiEvents.events.resize[1].scope.events.dragend[1].handler(null, {value: 600, pointerEventData: {playing: false}});
where 600 is the number of seconds to seek.
Note that I ran into "Whoops, something went wrong..." using this:
video.currentTime += 60;
Even with pause and play calls. This is what this demo page does; you may need to read the full spec on seeking.
Mute and get muted status
video.muted = true
Like video.currentTime, this is a writable property.
Summary
I have a playlist of videos that I'm looping through, switching out the video source at the end of each video, so there is only ever one video element. However, Chrome seems to keep the underlying web player alive even though React switches out the source and changes the element on the DOM, and the player stays available after the video has ended. As of Chrome v92, Chrome will not allow you to create a new video when too many players are open, even if the videos have ended. So, if this playlist loops all day, it easily exceeds the limit Chrome allows. Then Chrome disables the video and gives me this error:
Blocked attempt to create a WebMediaPlayer as there are too many WebMediaPlayers already in existence
My source looks a little like this:
import { useRef } from 'react';

const VideoPlayer = ({videoSource}) => {
  const videoRef = useRef(null);
  return (
    <video
      ref={videoRef}
      className="video-player"
      src={videoSource}
      autoPlay
    />
  );
};
Then I switch out the video source on each new video. This screenshot is what the players look like in Chrome:
All of the videos continue to exist even though they have all ended. React doesn't seem to automatically handle disposing of the video's WebMediaPlayer, even though I'm just reusing the same video element.
I've tried removing the source at the end of the video and reloading like many sites suggest. However, this removes the source for the video element and stops the video from playing altogether.
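For reference, the remove-and-reload cleanup I tried looks roughly like this (the handler name is just for illustration; videoRef is the ref from the component above):
const handleEnded = () => {
  const video = videoRef.current;
  if (video) {
    video.pause();
    video.removeAttribute('src'); // drop the current source
    video.load();                 // ask the element to release its media resource
  }
};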
Does anyone know the best practice for reusing a video element with ReactJS that will allow the WebMediaPlayer to be correctly disposed of in garbage collection?
With the release of macOS High Sierra*, one of the new features in Safari is that videos on websites will not autoplay anymore and scripts can't start them either, just like on iOS. As a user, I like the feature, but as a developer it puts a problem before me: I have an in-browser HTML5 game that contains video. The videos do not get played automatically anymore unless the user changes their settings. This messes up the game flow.
My question is, can I somehow use the players' interaction with the game as a trigger for the video to start playing automatically, even if said activity is not directly linked to the video element?
I cannot use jQuery or other frameworks because of a constraint my employer has put on our development. The one exception is pixi.js, which, among all other animations, we are also using to play our videos inside a pixi container.
*The same restriction also applies on Mobile Chrome.
Yes, you can bind to events that are not triggered directly on the video element:
btn.onclick = e => vid.play();
<button id="btn">play</button><br>
<video id="vid" src="https://dl.dropboxusercontent.com/s/bch2j17v6ny4ako/movie720p.mp4"></video>
So you can replace this button with any other splash screen requesting a user click, and you'll be granted access to play the video.
But to keep this ability, you must call the video's play method at least once inside the event handler itself.
Not working:
btn.onclick = e => {
// won't work, we're not in the event handler anymore
setTimeout(()=> vid.play().catch(console.error), 5000);
}
<button id="btn">play</button><br>
<video id="vid" src="https://dl.dropboxusercontent.com/s/bch2j17v6ny4ako/movie720p.mp4"></video>
Proper fix:
btn.onclick = e => {
vid.play().then(()=>vid.pause()); // grants full access to the video
setTimeout(()=> vid.play().catch(console.error), 5000);
}
<button id="btn">play</button><br>
<video id="vid" src="https://dl.dropboxusercontent.com/s/bch2j17v6ny4ako/movie720p.mp4"></video>
PS: here is the list of trusted events as defined by the specs; I'm not sure if Safari limits itself to these, nor whether it includes all of them.
Important note regarding Chrome and preparing multiple MediaElements
Chrome has a long-standing bug, caused by its limit on simultaneous requests per host, which affects MediaElements playing in the page, limiting their number to 6.
This means that you cannot use the method above to prepare more than 6 different MediaElements in your page.
At least two workarounds exist though:
It seems that once a MediaElement has been marked as user-approved, it keeps this state even if you change its src. So you could prepare the maximum number of MediaElements up front and then change their src as needed.
The Web Audio API, while also subject to this user-gesture requirement, can play any number of audio sources once allowed. So, thanks to the decodeAudioData() method, you could load all your audio resources as AudioBuffers, even the audio tracks of video media, whose image stream could simply be displayed in a muted <video> element in parallel with the AudioBuffer.
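A minimal sketch of that second workaround (the function names and URL are only placeholders; decodeAudioData is used in its promise form):
var ctx = new (window.AudioContext || window.webkitAudioContext)();

function loadBuffer(url) {
    // Fetch and decode up front; this can be done for any number of resources.
    return fetch(url)
        .then(function (response) { return response.arrayBuffer(); })
        .then(function (data) { return ctx.decodeAudioData(data); });
}

function playBuffer(buffer) {
    // Once the context is allowed, any number of sources can be started.
    var source = ctx.createBufferSource();
    source.buffer = buffer;
    source.connect(ctx.destination);
    source.start(0);
}

// Unlock the context once from a user gesture, then play buffers whenever needed:
btn.onclick = function () { ctx.resume(); };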
In my case I was combining transparent video (with audio) with GSAP animation. The solution from Kaiido works perfectly!
First, on user interaction, start and pause the video:
videoPlayer.play().then(() => videoPlayer.pause());
After that you can play it whenever you want. Like this:
const tl = gsap.timeline();
tl.from('.element', {scale: 0, duration: 5});
tl.add(() => videoPlayer.play());
Video will play after the scale animation :).
Tested in Chrome and in Safari on iPhone.
My website plays background music with autoplay. I made it use my custom controls for play and pause. Now, I'd like to set the initial state according to what is actually going on: if the music is about to play for real, it should show the pause icon; otherwise (e.g. on mobile) the play icon.
I would use audio.paused boolean value, but it's always false before the audio is loaded.
I would use audio.autoplay value, but it's always true for me, even on devices that don't support it.
Is there any clean way to know whether the audio will be played? I would like to keep it in sync with the autoplay attribute, so that if I decided to remove it, the state would show the play icon in the beginning.
Just playing or even buffering songs isn't especially fair when there is the slightest chance people are on the site for other reasons, for example to check for updates or to share the link. People can be on the page over a mobile network with limited bandwidth, and downloads of that size shouldn't start sneakily behind their back.
Edit: a few additional references
Here is an overview of the reasons not to have music on autoplay.
And, in contrast, a website I personally like a lot that makes great use of background music on autoplay.
But if you are already building your own player and want that to be a feature of the page, setting that player to autoplay would not only devalue your own work but totally break your design. Instead you could just trust that people who want to hear the music will identify your audio player and use it.
To fully implement your custom player GUI you may want to listen for all audio events on the player element and update your view accordingly. The event you are looking for is "canplaythrough" but you probably want to react to at least most of the other events too. Those events are:
playing
waiting
seeking
seeked
ended
loadedmetadata
loadeddata
canplay
canplaythrough
timeupdate
play
pause
ratechange
volumechange
suspend
emptied
stalled
You currently may do something along the lines of
view.showPlayButton();
player.play();
but that breaks as soon as you want to toggle your player in some other way, or something else happens (for example playback stalls). So better to listen to the events and update your GUI in one place, and control the playback (like starting / stopping the player) in another.
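A minimal sketch of that separation, reusing the view and player names from above (the button elements are assumed to exist somewhere in the page):
// The GUI is updated only from media events, whatever caused the state change.
player.addEventListener("playing", function () { view.showPauseButton(); });
player.addEventListener("pause", function () { view.showPlayButton(); });
player.addEventListener("stalled", function () { view.showPlayButton(); });

// The controls only drive playback and never touch the view directly.
playButton.onclick = function () { player.play(); };
pauseButton.onclick = function () { player.pause(); };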
I am working on a piece of functionality explained below.
Once a video is loaded, or on a button click, it has to play in reverse. The target devices are smartphones and tablets. I am trying to do this using the HTML5 <video> tag and read about the playbackRate attribute. Further down the line, I am using the HTML5 code to generate my mobile apps with PhoneGap.
Here's a link I found interesting but partially working -
http://www.w3.org/2010/05/video/mediaevents.html
The playbackRate, when changed, doesn't do anything except push the video back by whatever number is given there.
I read here that when playbackRate is set to -1, reverse playback should happen.
I am not clear on how to exactly implement this. Does playbackRate actually do this reversing, or should I opt for doing something else?
I am almost there: I found a working example here. But it is not working when I try it on my system. I am not sure where I am going wrong.
This is an old thread so I'm not sure how useful the following will be.
The Mozilla developer notes point out that:
Negative values [for playbackRate] don't currently play the media backwards.
Same goes for Chrome, but Safari on Mac works.
This page from the W3C is great for observing and understanding events:
http://www.w3.org/2010/05/video/mediaevents.html
I took the jsfiddle you mentioned and added some extra controls for fast forward/rewind speeds:
http://jsfiddle.net/uvLgbqoa/
However, though this works fine for me with Chrome/Firefox/Safari, I should point out that it doesn't really address your key questions.
Firstly, the approach assumes that negative playback rates don't work (which at the time I write this, is largely true AFAICS). Instead, it fakes it by calculating and setting the current time in the video:
function rewind(rewindSpeed) {
    clearInterval(intervalRewind);
    var startSystemTime = new Date().getTime();
    var startVideoTime = video.currentTime;
    intervalRewind = setInterval(function() {
        video.playbackRate = 1.0;
        if (video.currentTime == 0) {
            // Reached the start of the video: stop rewinding.
            clearInterval(intervalRewind);
            video.pause();
        } else {
            // Step the playhead backwards based on the wall-clock time elapsed.
            var elapsed = new Date().getTime() - startSystemTime;
            log.textContent = 'Rewind Elapsed: ' + elapsed.toFixed(3);
            video.currentTime = Math.max(startVideoTime - elapsed * rewindSpeed / 1000.0, 0);
        }
    }, 30);
}
Chrome handles this quite seamlessly, even playing snippets of audio as it goes.
Secondly, I think you want to play the video in reverse as soon as the page loads. I can't imagine a use case for this, but the first issue I see is that the whole video will need to be downloaded prior to playback, so you'll need to wait; yet the download probably won't happen until you start playing. So you could set the currentTime to near the end, wait for the canplay event, and then start playing in reverse. Even then, this seems very awkward.
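Ignoring the buffering concern, a rough sketch of that (reusing the rewind() function above) could be:
video.addEventListener("loadedmetadata", function () {
    video.currentTime = video.duration; // jump to (near) the end
    video.addEventListener("canplay", function () {
        rewind(1.0); // start the interval-based rewind at 1x
    }, { once: true });
});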
I think there are these broad options:
Use a native video widget rather than HTML. I'm guessing (without checking) that the native API supports reverse playback.
Generate a proper reversed video and play it as normal. For example, on a server somewhere use a program like ffmpeg to reverse the video. Then your app downloads the video and plays it normally, which looks to the user like reverse.
Assuming it really does make sense to have an application that plays a video in reverse when you load it, then I'd personally go for #2.
I'm working with HTML5, making an audio player.
I would like to know how I can add a buffer bar so that the user can see how much of the song has loaded.
I've tried using several properties that I saw in some tutorials, but none have worked, and I cannot find anything about this specific topic.
This is the audio player that I'm trying to edit.
I wish someone could guide me on how to edit the code to do it, or, otherwise, recommend tutorials, documentation or any information.
There's no guarantee that all of this is going to work on all browsers...
...however, assume, for the rest of this post that:
var audio = new Audio();
audio.src = "//example.com/some-really-long-song.mp3";
"canplay" is the first useful event, in terms of being able to play the song.
It means that part of the song is ready to go, and it's at least enough to be able to hear something, if you hit the play button.
Alternatively "canplaythrough" is the browser's guess at whether you can start playing the song right now, and it will run without stopping (based on amount of data left to download, and how fast the song is currently downloading).
audio.addEventListener("canplay", function () { audio.play(); });
"durationchange" is an event which should fire when a duration changes.
For file-types which don't have metadata, or where the metadata isn't supported, the whole duration might not be available as the file starts loading.
In these cases, audio.duration might be updated by the browser downloading the first bit of the media, and then the last, and making an educated guess at the length of the file, or the duration might come from the browser having to load more and more of the file, and increasing the duration as it goes. Where we are in terms of support, I don't know.
audio.addEventListener("durationchange", function () {
    player.time.duration.update(audio.duration);
});
"progress" is an event which fires ~250ms as data is coming in (while downloading the file).
progress tells you that updates have been made to data which is available to seek through right now.
This makes it a good event to use to check your audio.buffered objects, in order to update the "loaded" portion of your progress bar.
audio.addEventListener("progress", function () {
    player.progressBar.update(audio.buffered.start(0), audio.buffered.end(0));
});
Then you can use "timeupdate" to deal with playback.
timeupdate will update a few times a second, as the song is playing (as audio.currentTime moves forward).
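Following the same pattern as above (the player object and its playhead method are hypothetical), the handler might look like:
audio.addEventListener("timeupdate", function () {
    // currentTime / duration drive the "played" portion of your progress bar.
    player.playhead.update(audio.currentTime, audio.duration);
});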