When changing video currentTime too much, video fails?

I'm using a user drag event as well as keypresses to change the position in an HTML5 video element, updating the video time with:
video.currentTime = toTime;
I then update a canvas based on the video position by grabbing the current video frame and drawing it to the canvas.
Another detail is that I actually derive the video time from a frame number, i.e.:
framenumber = 123;
fps = 25;
toTime = framenumber / fps;
Problem is, every so often it just fails. By "fails" I mean I lose the video entirely; it just stops working altogether.
Most of the time it works great, but sometimes it just fails, and not always at the same point either...
Any ideas would be much appreciated!

There were two answers to my question:
1. Encoding of the video files: by controlling keyframes and sending the right video to the right browser I was able to solve a lot of problems. Using FFmpeg I changed the GOP length with ffmpeg -i input.mp4 -g <frames> output.mp4, where <frames> is the desired number of frames between keyframes.
2. Using videojs to serve up the video also seemed to solve a lot of problems and made for a smoother experience.

Related

Playing video frame-by-frame very slow on Chrome/Firefox but not on Safari

I have a use case that consists in seeking specific frames in a video. The frequency of frame changes can be quite high as it depends on mouse movements. My problem is that depending on the browser the updates can be very smooth (Safari) or introduce a lot of latency (Chrome/Firefox).
I came up with an example to test the differences between browsers using the following code (here is a codepen). I am using this video, which is 10 seconds long at 60fps, and I play it frame-by-frame by changing its currentTime at an interval equal to the video's frame duration.
HTML
<video id="player" src="https://test-videos.co.uk/vids/bigbuckbunny/mp4/h264/360/Big_Buck_Bunny_360_10s_1MB.mp4" preload="auto"></video>
JS
window.onload = () => {
  const player = document.getElementById("player")
  setInterval(() => {
    // advance by ~one frame (1/60 s) and wrap at the 10 s mark
    player.currentTime = (player.currentTime + 0.016) % 10
  }, 16)
}
The main problem seems to be that Chrome and Firefox simply handle video time changes differently than Safari, and my use case happens to be well supported in the latter. Would there be any way to make this work smoothly in Chrome/Firefox as well? I checked a few video player libraries (e.g., plyr, react-player, video-js) but they are all based on the native HTML5 player.
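One way to reduce the queued-seek latency in Chrome/Firefox (a sketch under assumptions, not a guaranteed fix) is to pace seeks on the seeked event rather than on a fixed setInterval, so a new currentTime is only assigned after the previous seek has finished decoding. The element id and frame rate are taken from the example above; the helper name is mine.

```javascript
// Pure helper: compute the next seek target, wrapping at the duration.
function nextFrameTime(current, fps, duration) {
  return (current + 1 / fps) % duration;
}

// Browser wiring (guarded so the helper stays usable outside a browser).
if (typeof document !== "undefined") {
  const player = document.getElementById("player"); // id from the example above
  const fps = 60;
  const step = () => {
    player.currentTime = nextFrameTime(player.currentTime, fps, player.duration);
  };
  // Only issue the next seek once the previous one has completed, so
  // seeks never pile up behind a slow keyframe lookup in the decoder.
  player.addEventListener("seeked", step);
  player.addEventListener("loadedmetadata", step); // kick off the loop
}
```

Pacing on seeked trades a perfectly regular frame interval for smoothness: slow seeks simply take longer instead of accumulating in a queue.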
For anyone having similar issues, I found out the problem has to do with the video encoding. The significant parameter is the keyframe interval which, if I got it correctly, defines the size of the group of frames that a specific frame refers to when it is decoded. The smaller this number, the faster a seek can be rendered, but the file can grow bigger because less information is compressed away.
You can modify this parameter using the -g and -keyint_min options with ffmpeg: https://ffmpeg.org/ffmpeg-codecs.html.

Automatic Thumbnails/Screenshots for Chapter in HTML Video

I found some examples where people used a canvas and JavaScript to take multiple screenshots of a running video.
You can see these examples here or here.
The code sets a time interval, draws the current frame to a canvas and uses this to create a screenshot.
I am wondering if it would be possible to use a similar technique to automatically create a kind of preview for the chapters of a video.
This would require grabbing a number of screenshots before the video starts playing.
I failed to implement this, so I would like to know if it is possible at all.
I know that one could use pre-taken screenshots for the chapters, but I wanted to automate this process.
Thanks in advance for your answers.
This could be done in theory by jumping to specific times in the video (say every 10 seconds) using video.currentTime, waiting for the frame to be available (the seeked event fires once a seek has completed), drawing the frame to a canvas (context.drawImage) and storing it in some way (say an array of images with image.src = canvas.toDataURL()).
However, this process will take time, because at least the relevant parts of the video need to be loaded in the browser before a frame can be grabbed. The video is also not playable during the process, as it is being skipped to different frames.
This behavior is usually not acceptable, but it really depends on your specific use case.
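The steps above can be sketched as follows; the element id, the 10-second chapter spacing and the helper names are illustrative assumptions, not from the original answer:

```javascript
// Pure helper: the chapter times to capture, one every `interval` seconds.
function chapterTimes(duration, interval) {
  const times = [];
  for (let t = 0; t < duration; t += interval) times.push(t);
  return times;
}

// Browser wiring: seek to each chapter time, wait for "seeked", grab the frame.
if (typeof document !== "undefined") {
  const video = document.getElementById("video");   // assumed id
  const canvas = document.createElement("canvas");
  const ctx = canvas.getContext("2d");
  const thumbnails = [];

  video.addEventListener("loadedmetadata", () => {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    const queue = chapterTimes(video.duration, 10); // assumed chapter spacing
    const next = () => {
      if (queue.length === 0) return;    // all chapters captured
      video.currentTime = queue.shift(); // triggers a seek
    };
    video.addEventListener("seeked", () => {
      ctx.drawImage(video, 0, 0);        // draw the frame we just seeked to
      thumbnails.push(canvas.toDataURL("image/jpeg"));
      next();                            // move on to the next chapter
    });
    next();
  });
}
```

Note that each capture only proceeds after the previous seeked event, so the frames are grabbed in order even though seeking is asynchronous.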

How to keep a live MediaSource video stream in-sync?

I have a server application which renders a 30 FPS video stream then encodes and muxes it in real-time into a WebM Byte Stream.
On the client side, an HTML5 page opens a WebSocket to the server, which starts generating the stream when the connection is accepted. After the header is delivered, each subsequent WebSocket frame consists of a single WebM SimpleBlock. A keyframe occurs every 15 frames, and when this happens a new Cluster is started.
The client also creates a MediaSource, and on receiving a frame from the WS, appends the content to its active buffer. The <video> starts playback immediately after the first frame is appended.
Everything works reasonably well. My only issue is that the network jitter causes the playback position to drift from the actual time after a while. My current solution is to hook into the updateend event, check the difference between the video.currentTime and the timecode on the incoming Cluster and manually update the currentTime if it falls outside an acceptable range. Unfortunately, this causes a noticeable pause and jump in the playback which is rather unpleasant.
The solution also feels a bit odd: I know exactly where the latest keyframe is, yet I have to convert it into a whole second (as per the W3C spec) before I can pass it into currentTime, where the browser presumably has to then go around and find the nearest keyframe.
My question is this: is there a way to tell the Media Element to always seek to the latest keyframe available, or keep the playback time synchronised with the system clock time?
network jitter causes the playback position to drift
That's not your problem. If you are experiencing drop-outs in the stream, you aren't buffering enough before beginning playback; give playback an appropriately sized buffer, even if that leaves you a few seconds behind realtime (which is normal).
My current solution is to hook into the updateend event, check the difference between the video.currentTime and the timecode on the incoming Cluster
That's close to the correct method. I suggest you ignore the timecode of incoming cluster and instead inspect your buffered time ranges. What you've received on the WebM cluster, and what's been decoded are two different things.
Unfortunately, this causes a noticeable pause and jump in the playback which is rather unpleasant.
How else would you do it? You can either jump to realtime, or you can increase playback speed to catch up to realtime. Either way, if you want to catch up to realtime, you have to skip in time to do that.
The solution also feels a bit odd: I know exactly where the latest keyframe is
You may, but the player doesn't until that media is decoded. In any case, the keyframe is irrelevant... you can seek to non-keyframe locations. The browser will decode ahead from the preceding keyframe through the P/B-frames as required.
I have to convert it into a whole second (as per the W3C spec) before I can pass it into currentTime
That's totally false. The currentTime is specified as a double. https://www.w3.org/TR/2011/WD-html5-20110113/video.html#dom-media-currenttime
My question is this: is there a way to tell the Media Element to always seek to the latest keyframe available, or keep the playback time synchronised with the system clock time?
It's going to play the last buffer automatically. You don't need to do anything. You're doing your job by ensuring media data lands in the buffer and setting playback as close to that as reasonable. You can always advance it forward if a network condition changes that allows you to do this, but frankly it sounds as if you just have broken code and a broken buffering strategy. Otherwise, playback would be simply smooth.
Catching up if fallen behind is not going to happen automatically, and nor should it. If the player pauses due to the buffer being drained, a buffer needs to be built back up again before playback can resume. That's the whole point of the buffer.
Furthermore, your expectation of keeping anything in-time with the system clock is not a good idea and is unreasonable. Different devices have different refresh rates, will handle video at different rates. Just hit play and let it play. If you end up being several seconds off, go ahead and set currentTime, but be very confident of what you've buffered before doing so.
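The "increase playback speed to catch up" option mentioned above can be sketched like this, comparing the playhead against the end of the buffered range rather than the incoming cluster's timecode; the thresholds, rates and element id are illustrative assumptions:

```javascript
// Pure helper: pick a playback rate given how far behind the buffered edge we are.
function catchUpRate(lagSeconds) {
  if (lagSeconds > 3.0) return 1.25; // well behind: play faster to catch up
  if (lagSeconds < 1.0) return 1.0;  // close enough: normal speed
  return 1.1;                        // mildly behind: creep forward
}

// Browser wiring: check the buffered ranges a few times per second.
if (typeof document !== "undefined") {
  const video = document.getElementById("live"); // assumed id
  setInterval(() => {
    if (video.buffered.length === 0) return; // nothing decoded yet
    const bufferedEdge = video.buffered.end(video.buffered.length - 1);
    video.playbackRate = catchUpRate(bufferedEdge - video.currentTime);
  }, 250);
}
```

Nudging playbackRate avoids the audible jump that setting currentTime causes, at the cost of taking a few seconds to converge.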

How can i increase/decrease the decibel values of an audio file in html5?

Hi, I am building a web app where I will be playing an audio file. I want to have a slider with a scale in decibels (dB). On moving the slider up and down, the dB value changes and the audio file should play at that level.
Currently I am using the <audio> tag in HTML5 to play the audio files in the browser. Is there a default function or API that serves this purpose, or do I need to read the sound file and write the logic to change the dB value myself?
I would like this feature to work across IE, Chrome, Safari and Firefox.
Please can someone suggest the right approach to achieve this.
Thanks & Regards,
Siva.
SEE EDIT BELOW!
According to W3 - http://www.w3.org/html/wg/drafts/html/master/single-page.html#effective-media-volume - it is possible to do this quite easily.
Using an audio tag, such as:
<audio id="myAudio">
<source src="audio.ogg" type="audio/ogg">
</audio>
Using JavaScript, you can do:
audio = document.getElementById("myAudio");
audio.volume = 0.5; // half of maximum
// 0 = silent
// 0.5 = half of maximum
// 1 = full volume
Since you want to add a slider, using jQuery UI you could add a slider with its slide/stop events and then set the volume accordingly.
I hope this helped ;)
Edit:
After reading again that you want decibels, not a volume fraction, I realised that what you want is not really possible. It is extremely hard, if not impossible, to work with absolute decibel values for an audio file in HTML5. This is because:
The original file can be mastered at any level, and it is not possible to tell the absolute loudness of a file from the element alone.
There are always hardware limitations, and there comes a point where you cannot raise the level any further.
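One caveat to the edit above: while absolute decibel levels are indeed out of reach, a relative change in dB maps cleanly onto the volume property, since a gain of x dB corresponds to an amplitude ratio of 10^(x/20). A minimal sketch, assuming a slider ranging from -60 dB (quiet) to 0 dB (full) and hypothetical element ids:

```javascript
// Map a relative gain in dB onto the 0..1 range of audio.volume.
// 0 dB = full volume, -6 dB ≈ half amplitude, -20 dB = one tenth.
function dbToVolume(db) {
  const v = Math.pow(10, db / 20);    // amplitude ratio for a dB change
  return Math.min(1, Math.max(0, v)); // clamp to the valid volume range
}

// Browser wiring: slider range -60..0 and both ids are assumptions.
if (typeof document !== "undefined") {
  const audio = document.getElementById("myAudio");
  const slider = document.getElementById("dbSlider");
  slider.addEventListener("input", () => {
    audio.volume = dbToVolume(Number(slider.value));
  });
}
```

This only attenuates relative to the element's full volume; boosting above 0 dB would need the Web Audio API's GainNode, which has weaker browser support.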

sync audio to a picture carousel

Is it possible to sync audio to a picture carousel without using a canvas? I'm currently using setInterval to drive the carousel. I don't want to use Flash, and if possible and not too hard to do, I want to avoid the canvas as well.
Have you tried a canvas?
Assuming you're using an <audio> element, you can get its currentTime property. This will tell you where the playhead is. You can use this information to change the carousel when the audio passes a certain position, or you can set it to change the audio position when the carousel changes. Whatever works better for what you're doing.
What do you mean by realize the carousel?
I would set a specific timestamp for when you want each image to show. So at 12 seconds you show the corresponding image.
You might also want to make sure that the audio is loaded. You can listen for the canplaythrough event to check whether enough has loaded to play through, or you can compare audio.buffered.end(0) against the length of the audio file to make sure the entire thing has loaded.
Here is a good article for more on this: http://html5doctor.com/html5-audio-the-state-of-play/
When the audio is loaded you call a function, pass in the timestamp, and use it as the setInterval delay. So at timestamp * 1000 milliseconds the function will fire and show the image assigned to that timestamp.
You could also try using audio.currentTime and firing then; more on that in the article above as well. Let me know if you have a more specific question.
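The audio.currentTime approach can be sketched as follows; the timestamps, element ids and markup are illustrative assumptions:

```javascript
// Pure helper: index of the slide that should be visible at `time`.
// `timestamps` is a sorted list of start times, one per slide.
function slideIndexAt(time, timestamps) {
  let index = 0;
  for (let i = 0; i < timestamps.length; i++) {
    if (time >= timestamps[i]) index = i;
  }
  return index;
}

// Browser wiring: swap the visible image as the audio plays.
if (typeof document !== "undefined") {
  const audio = document.getElementById("narration"); // assumed id
  const slides = document.querySelectorAll(".slide"); // assumed markup
  const starts = [0, 12, 25, 40];                     // assumed start times (s)
  audio.addEventListener("timeupdate", () => {        // fires ~4x per second
    const current = slideIndexAt(audio.currentTime, starts);
    slides.forEach((img, i) => {
      img.style.display = i === current ? "block" : "none";
    });
  });
}
```

Driving the carousel from timeupdate keeps it in sync even after the user seeks or pauses, which a free-running setInterval cannot do.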
