I have successfully used the Froogaloop library to monitor embedded Vimeo video players on various sites (for a feature of my Chrome extension).
I'd like to do the same thing on vimeo.com itself, but the video players on vimeo.com pages are not embedded within iframes, so Froogaloop isn't able to interface with them.
Can anybody tell me how to get the current playing time and wire up play/pause event listeners for a (non-iframed) video on vimeo.com?
After some further research I came up with the following basic solution:
// Once the vimeo.com player <object> is ready
// (this snippet is assumed to run inside a function, hence the early return):

// find vimeo player(s)
var vimeoPagePlayers = document.querySelectorAll('object[type*=flash][data*=moogaloop]');
if (vimeoPagePlayers.length === 0) {
    return;
}

// attach event listeners to player(s)
for (var i = 0; i < vimeoPagePlayers.length; i++) {
    var player = vimeoPagePlayers[i];
    // sometimes .api_addEventListener is not available immediately after
    // the <object> element is created; we do not account for this here
    player.api_addEventListener('onProgress', 'onVimeoPageProgress');
    player.api_addEventListener('onPause', 'onVimeoPagePause');
}

function onVimeoPagePause() {
    // video has been paused or has completed
}

function onVimeoPageProgress(data) {
    // video is playing
    var seconds;
    if (typeof data === 'number') {
        // on the vimeo.com homepage, data is a number (float)
        // equal to the current video position in seconds
        seconds = data;
    } else {
        // on vimeo.com/* pages, data is a hash
        // of values { seconds, percent, duration }
        seconds = data.seconds;
    }
    // do something with the seconds value
}
This may not work for HTML5-mode videos on vimeo.com (untested).
It also does not work when a vimeo.com page replaces the existing player with another one via DOM manipulation. I believe those situations could be handled by something like a MutationObserver.
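A rough sketch of that idea (untested against vimeo.com; it assumes replacement players still match the same selector and that api_addEventListener is available by the time the node is inserted):
// watch the page for newly inserted moogaloop <object> elements
// and attach the same listeners to them
var observer = new MutationObserver(function (mutations) {
    mutations.forEach(function (mutation) {
        for (var i = 0; i < mutation.addedNodes.length; i++) {
            var node = mutation.addedNodes[i];
            if (node.nodeType !== 1) {
                continue; // skip text nodes and the like
            }
            var players = node.matches('object[type*=flash][data*=moogaloop]')
                ? [node]
                : node.querySelectorAll('object[type*=flash][data*=moogaloop]');
            for (var j = 0; j < players.length; j++) {
                players[j].api_addEventListener('onProgress', 'onVimeoPageProgress');
                players[j].api_addEventListener('onPause', 'onVimeoPagePause');
            }
        }
    });
});
observer.observe(document.body, { childList: true, subtree: true });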
I have a function that loads an audio track and sets its current time based on what the user was listening to the last time they had the page open. The track and the progress into the audio are stored in localStorage. It works perfectly on localhost, but when I put it on a PHP server, the audio loads from the start.
While trying to find the source of the problem, I found that the audio loads at the stored current time, then jumps back to 0 when it starts playing. Also, the audio file sometimes loads twice, as shown in the Network panel of the Chrome developer tools (possibly because of how the browser natively loads video/audio files). This is frustrating because there are no other signs of a problem: the track simply starts from 0 instead of the stored time. I also tried setting the time inside an oncanplaythrough handler, but that doesn't help either.
The code I currently have is below (note that it is inside a function, and time and src are its parameters):
// set the volume to 1
audio.volume = 1;
// set the current time to 0
audio.currentTime = 0;
if (muteSound === false) {
    // if the user didn't mute the sound, play it
    audio.play().catch(() => {});
}
// set the name of the music (for display)
let nameOfMusic = audioChoices.indexOf(src);
// if it exists, then set the name of the currently playing music
if (nameOfMusic !== -1) {
    localStorage.currentPlay = music[nameOfMusic];
}
// if the time is not 0, restore the position the user left off on
// (time is the parameter that holds the user's last position; it is 0 when
// the program started a new soundtrack after the previous one ended)
if (time !== 0) {
    audio.currentTime = Number(localStorage.currentTime);
}
// set the currently playing source to the source of the audio
localStorage.setItem('currentPlaySrc', src);
// reset the stored time to 0 (the localStorage value, not the audio's actual
// currentTime). This might be the problem, since the music loads twice and the
// first load could reset it to 0 -- however, removing this didn't fix it either.
localStorage.setItem('currentTime', '0');
// set an interval that constantly writes the audio's current time to
// localStorage and checks whether the user muted the sound. It also checks
// whether the audio has ended and, if so, starts another track
// (addMusic is the function this code is contained in)
let check = window.setInterval(() => {
    localStorage.currentTime = audio.currentTime;
    if (muteSound === false && soundMutedSession === true) {
        soundMutedSession = false;
        audio.play().catch(() => {});
    }
    if (audio.currentTime >= audio.duration) {
        clearInterval(check);
        localStorage.setItem('currentTime', '0');
        let random = Math.floor(Math.random() * audioChoices.length);
        addMusic(audioChoices[random], 0);
    }
    if (muteSound === true) {
        audio.pause();
        soundMutedSession = true;
    }
}, 1);
It's interesting that I couldn't find any material on this. I don't think it comes down to a typo or a personal mistake, but could it have something to do with how the browser loads and renders video/audio content?
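For reference, one commonly suggested variation (only a sketch, untested against this setup) is to defer the seek until the metadata has loaded, since a currentTime set before that point can be silently reset:
// sketch: restore the stored position only once the browser knows the track's metadata
// (audio, time and the localStorage key are the ones from the snippet above)
audio.addEventListener('loadedmetadata', () => {
    if (time !== 0) {
        audio.currentTime = Number(localStorage.currentTime);
    }
}, { once: true });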
I'm building an online 'TV' which will use YouTube live-streams for multiple channels.
The channels are contained within tabs. The videos need to be stopped when changing tabs, otherwise you can hear the audio in the background.
Here's a link to the JSFiddle: https://jsfiddle.net/matlow/08k4csuh/
I've managed to turn 'Channel 1' off when changing to another channel with:
var iframe = document.getElementsByClassName("tvscreen")[0].contentWindow;
and
iframe.postMessage('{"event":"command","func":"pauseVideo","args":""}', '*');
inside the tab JavaScript for loop that also handles tabcontent[i].style.display = "none";.
I think I need to use that for loop to address each iframe instance... but I'm quite new to JavaScript, so I'm not quite sure how to achieve this.
It will also help to use iframe.postMessage('{"event":"command","func":"playVideo","args":""}', '*'); so the video plays automatically again when clicking on the relevant tab... but again I'm not quite sure how to implement this.
I've been working on this for a few days so if anyone had any tips or pointers I would really appreciate it!
Thanks for reading! :)
You are not using YouTube's API properly. See https://developers.google.com/youtube/iframe_api_reference
In your fiddle, programmatic play is not possible, because you can't know when the YouTube player is ready, as you are not the one initialising it. Your attempts to play the video might take place too early.
Programmatic pause (you managed to pause the first video) is possible thanks to enablejsapi=1 in the iframe src and the fact that the player is ready at that point.
Here's a fork of your fiddle - https://jsfiddle.net/raven0us/ancr2fgz
I added a couple of comments. Check those out.
// load the YouTube iframe API as soon as possible, taken from their docs
var tag = document.createElement('script');
tag.id = 'iframe-demo';
tag.src = 'https://www.youtube.com/iframe_api';
var firstScriptTag = document.getElementsByTagName('script')[0];
firstScriptTag.parentNode.insertBefore(tag, firstScriptTag);

// initialised players are kept here so we don't have to destroy and reinit them
var ytPlayers = {};

function mountChannel(channel) {
    var player;
    var iframeContainer = document.querySelectorAll('#' + channel + ' iframe');
    // if the channel & iframe we want to "mount" exist, check for playing iframes before doing anything else
    if (iframeContainer.length > 0) {
        // Object.keys() is ECMA 5+, sorry about this, but there is no easy way to check if an object is empty otherwise
        // alternatively, you could keep an array, but then you can't fetch a specific player as fast;
        // if you don't need that, an array is just as good because you only loop through the active players and pause them
        var activePlayersKeys = Object.keys(ytPlayers);
        if (activePlayersKeys.length > 0) { // if players exist in the pool, pause them
            for (var i = 0; i < activePlayersKeys.length; i++) {
                var activeChannel = activePlayersKeys[i];
                var activePlayer = ytPlayers[activeChannel];
                activePlayer.getIframe().classList.remove('playing'); // mark the pause by removing the class; styling only, not strictly necessary
                activePlayer.pauseVideo();
            }
        }
        // check if a player for this channel is already initialised; if so, just resume playback
        if (ytPlayers.hasOwnProperty(channel)) {
            ytPlayers[channel].playVideo();
        } else {
            var iframe = iframeContainer[0];
            player = new YT.Player(iframe, {
                events: {
                    'onReady': function (event) {
                        // event.target is the YT player
                        // get the actual iframe DOM node and mark it as playing via a class (styling purposes, not necessary)
                        event.target.getIframe().classList.add('playing');
                        // play the video
                        event.target.playVideo();
                        // the video may not autoplay every time in Chrome, despite its state being cued and this event firing;
                        // that happens due to a lot of factors
                    }
                    // you should also implement `onStateChange` in order to track video state changes
                    // caused by user actions directly via the YouTube controls -
                    // https://developers.google.com/youtube/iframe_api_reference#Events
                }
            });
            // append to the list
            ytPlayers[channel] = player;
        }
    }
}

// the YouTube API will call this when it's ready; only then attempt to "mount"
// the initial channel, by getting the element with id="defaultOpen" and clicking it
function onYouTubeIframeAPIReady() {
    document.getElementById("defaultOpen").click();
}
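The onStateChange handler mentioned in that comment isn't included in the fork; a rough sketch of what it might look like, sitting alongside onReady inside the events object (the class toggling mirrors the onReady handler and is only an assumption about the desired styling):
'onStateChange': function (event) {
    // sketch: keep the 'playing' class in sync when the user uses YouTube's own controls
    var iframe = event.target.getIframe();
    if (event.data === YT.PlayerState.PLAYING) {
        iframe.classList.add('playing');
    } else if (event.data === YT.PlayerState.PAUSED || event.data === YT.PlayerState.ENDED) {
        iframe.classList.remove('playing');
    }
}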
This is the first time I've worked with YouTube's iframe API, but this approach seems reasonable.
I am working with an iframe that contains code we receive from a third party. This third-party code contains a canvas with a game created using Phaser.
I am looking for a way to mute the sound that this game plays at some point.
We usually do it this way:
function mute(node) {
    // search for media elements within the iframe and
    // attempt to mute each one (both <video> and <audio>)
    const videoEls = node.getElementsByTagName('video');
    for (let i = 0; i < videoEls.length; i += 1) {
        videoEls[i].muted = true;
    }
    const audioEls = node.getElementsByTagName('audio');
    for (let j = 0; j < audioEls.length; j += 1) {
        audioEls[j].muted = true;
    }
}
After some research I found out that you can play sound in a web page using new Audio([url]) and then calling the play method on the created object.
The issue with the mute function we use is that it does not pick up sounds created with new Audio([url]).
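For illustration (the URL is hypothetical), a sound created this way is never attached to the DOM, so the tag-based search above cannot see it:
// this element lives only in memory, never in the document tree,
// so node.getElementsByTagName('audio') inside the iframe won't find it
const sound = new Audio('https://example.com/some-effect.mp3');
sound.play();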
Is there a way, from the containing page, to list all the Audio objects that have been created within a document? Or is it simply impossible, meaning this is a way to play audio that the iframe's container has no ability to mute?
No, there is no way.
Not only can they use non-appended <audio> elements, as you guessed, but they can also use the Web Audio API (which I think Phaser does), and in neither case do you have a way of accessing the audio from outside if they didn't expose such an option.
Your best move would be to ask the developer of this game to expose an API through which you would be able to control this.
For instance, it could be a query parameter in the URL (https://thegame.url?muted=true) or an API based on the Message API, where you'd be able to call iframe.contentWindow.postMessage({muted: true}, '*') from your own page.
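A rough sketch of what that message-based option could look like, assuming the game developer agrees to add it (the message shape is an assumption, and the listener side supposes the Phaser game object is reachable in that scope):
// in your own page: ask the embedded game to mute itself
const iframe = document.querySelector('iframe');
iframe.contentWindow.postMessage({ muted: true }, '*');

// inside the game's page: listen for the request and flip the global mute flag
window.addEventListener('message', (event) => {
    if (event.data && typeof event.data.muted === 'boolean') {
        // game.sound.mute is Phaser's global sound-manager mute flag;
        // we assume `game` is accessible here
        game.sound.mute = event.data.muted;
    }
});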
I have an app that tracks video views and integrates them with other marketing activities. To do that, I needed to keep track of how long a person watches an HTML5 video and post it back to my app (via an API). I'm using the videojs player, but really this is just a wrapper around the HTML5 video API for this attribute. Various videos can be loaded depending on what page the person is watching, so I needed a solution that worked regardless of video length.
The problem I had: as a video plays, the API reports back roughly every 300 ms, and I didn't want to hit my API that often, so I needed a way to keep track of the last time I posted. After digging around, I couldn't find an answer, so in case someone else has a similar need, my solution is below.
I decided to post the video-viewing results every 5 seconds, but since there is no guarantee that currentTime will report back at exactly 5 seconds, the value needs to be rounded to the closest whole integer.
On my video wrapper div, I've added a data attribute called data-time-last-push. I store the rounded time every time I push, and check whether we have exceeded the interval before posting again.
HTML
<div id="video-wrapper" data-time-last-push="0">
JavaScript
Bind the videojs container to the timeupdate event.
var vid = videojs("video-container", {}, function() {
    this.on('timeupdate', videoTracker);
});
The function that does the tracking and the AJAX post:
var videoTracker = function() {
    var player = this;
    var last_push, wrapper, current;
    wrapper = $('#video-wrapper');
    last_push = wrapper.attr("data-time-last-push");
    current = Math.round(player.currentTime());
    // you could make the 5 here a variable or your own interval...
    if (current % 5 === 0) {
        if (current > last_push) {
            // do your AJAX post here...
            wrapper.attr("data-time-last-push", current);
            console.log('currentTime = ' + player.currentTime());
            console.log(' duration: ' + player.duration());
        }
    }
};
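As a minimal sketch of what could replace the "do your AJAX post here" comment, using jQuery (the endpoint URL and payload shape are hypothetical):
// hypothetical endpoint and payload -- adapt to your own API
$.post('/api/video-progress', {
    video: player.currentSrc(),
    seconds: current,
    duration: player.duration()
});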
Note: I tried to put together a jsfiddle to show it working, but ran into HTTPS issues because the sample videos don't work over secure connections.
I have streaming audio in WAVE format being played through Firefox's HTML5 capabilities. Because some of our users complained about choppy audio, we decided to come up with a strategy for measuring whether the audio was arriving fast enough to sustain playback. We first tried the canplaythrough event, but apparently the browser is too optimistic about it and it did not quite work. Here is some code that I am proposing, using jQuery. I am a JS/jQuery beginner, so I wanted to see if anyone had any better ideas.
$(document).ready(function() {
    var previous_buffer = 0;
    var start_time = (new Date()).getTime();

    // function to check how fast the buffer is filling
    function checkBufferSpeed() {
        // if we haven't started buffering yet, return
        if (this.buffered.length < 1) {
            return;
        }
        // get the current buffer status
        var current_buffer = this.buffered.end(0);
        console.log("current_buffer:" + current_buffer);
        // if we get the same current buffer twice, the browser is done buffering
        if (current_buffer > 0 && current_buffer == previous_buffer) {
            this.play();
            $("#audio").off("progress");
            return;
        }
        // get the time spent so far, in seconds
        var time_spent = ((new Date()).getTime() - start_time) / 1000;
        console.log("time_spent:" + time_spent);
        // if we buffered more seconds of audio than seconds spent, play
        if (current_buffer > time_spent) {
            this.play();
            $("#audio").off("progress");
            return;
        }
        previous_buffer = current_buffer;
    }

    $("#audio").on("progress", checkBufferSpeed);
});
Also, the reason I am not checking the audio element's duration property is that Firefox seems to always report it as infinite unless the stream is already cached.
Also, Firefox seems to always fire two progress events even if the stream is fully available in the cache, which is why I check whether I see the same buffer value twice and, if so, just play the sound.
Is there a chance that the browser will never fire progress events?
Are there any enhancements I could make to this code to make it better?
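One possible safeguard against the case where progress never fires (only a sketch, not verified against this stream setup): arm a fallback timer that starts playback if no progress event arrives within a few seconds, and cancel it as soon as one does.
// sketch: assumes it runs inside the same $(document).ready block as above
var fallback = setTimeout(function() {
    $("#audio").off("progress");
    $("#audio")[0].play();
}, 3000);

// clear the fallback as soon as a real progress event shows up
$("#audio").one("progress", function() {
    clearTimeout(fallback);
});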