I am working with React and Node. My project has a requirement to merge videos and play them in a player. Is it possible to do this either on the React side using a canvas, or on the back end using some module other than ffmpeg?
I just want to show a preview, nothing else. Is that possible?
Any help would be much appreciated.
Right now I am playing the videos one by one in the player:
{videos.map((url) => <ReactPlayer key={url} url={url} />)}
Now what I want to do is draw the frames onto a canvas. For that I am playing all the URLs at once, but how can I play them in series, holding back the next video's canvas frames until the previous one completes?
To achieve continuous playback in browsers across multiple input video files, there is no need for server-side processing. Serving the video files (we'll call them chunks from now on) as static files is enough.
The .appendBytes() method for a video player's playback buffer was first introduced in Flash to allow switching between video streams (or chunks of different quality). It was particularly useful for live video, where the next video file doesn't exist yet when playback starts. It also allowed video chunks of different resolutions to play one after the other seamlessly which, at the time, didn't work in any other player, including VLC (and I don't think ffmpeg had it either).
HTML5 browsers offer the same capability through Media Source Extensions: the SourceBuffer.appendBuffer() method adds video chunks to the buffer of the currently playing video.
It lets you pre-load whatever content you need, hands-on, with full control of what gets played and when (you are in control of buffering and of what comes next at all times).
You can start dumping video chunks into the playing buffer at any time.
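For example, a minimal MSE sketch looks like the following. It assumes the chunks are fragmented MP4 files served statically (the URLs and codec string below are illustrative) and appends them one after the other into a single playback buffer:

```javascript
// Sketch: play several video chunks back to back with Media Source Extensions.
// Assumes fragmented MP4 chunks that all share the same codec string; the
// URLs and the codec string are illustrative, not real endpoints.
const chunkUrls = ['/videos/chunk0.mp4', '/videos/chunk1.mp4', '/videos/chunk2.mp4'];
const mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';

const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
  for (const url of chunkUrls) {
    const data = await (await fetch(url)).arrayBuffer();
    sourceBuffer.appendBuffer(data);
    // Wait until the buffer has consumed this chunk before appending the next.
    await new Promise((resolve) =>
      sourceBuffer.addEventListener('updateend', resolve, { once: true })
    );
  }
  mediaSource.endOfStream();
});
```

The video element plays the appended chunks as one continuous stream, so the player never restarts between files.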
On the server side, ffmpeg cannot solve your problem the way the browser can: you would have to handle very difficult corner cases where the video resolution, codecs, audio channels, etc. differ between files. A browser-only solution is also vastly superior to remuxing or re-encoding video on the server.
MDN: https://developer.mozilla.org/en-US/docs/Web/API/SourceBuffer/appendBuffer
This will solve all your problems related to video playback of multiple source files.
If you want to do this on the back end, as stated in the comment, you will likely need to include ffmpeg. There are libraries that make it simpler, though, such as fluent-ffmpeg.
Assuming you already have the files on your Node server, you can use something like:
const ffmpeg = require('fluent-ffmpeg');

ffmpeg('/path/to/input1.avi')
  .input('/path/to/input2.avi')
  .on('error', function(err) {
    console.log('An error occurred: ' + err.message);
  })
  .on('end', function() {
    console.log('Merging finished!');
  })
  .mergeToFile('/path/to/merged.avi', '/path/to/tempDir');
Note that this is a simple example taken directly from https://github.com/fluent-ffmpeg/node-fluent-ffmpeg#mergetofilefilename-tmpdir-concatenate-multiple-inputs.
Make sure you read through the prerequisites since it requires ffmpeg to be installed and configured.
There are other aspects that you might need to consider, such as differing codecs and resolutions. Hopefully this gives you a good starting point though.
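If the inputs do differ, one option (a sketch only, using the same fluent-ffmpeg API as above with illustrative paths; check the library docs for how output options interact with mergeToFile) is to force common output settings during the merge:

```javascript
// Sketch: force a common codec and resolution while merging, so inputs with
// differing codecs/resolutions don't break the concatenation.
// Paths are illustrative; ffmpeg must be installed and on the PATH.
const ffmpeg = require('fluent-ffmpeg');

ffmpeg('/path/to/input1.avi')
  .input('/path/to/input2.avi')
  .videoCodec('libx264')   // re-encode both inputs to one video codec
  .audioCodec('aac')       // ...and one audio codec
  .size('1280x720')        // ...and one resolution
  .on('error', (err) => console.error('An error occurred:', err.message))
  .on('end', () => console.log('Merging finished!'))
  .mergeToFile('/path/to/merged.mp4', '/path/to/tempDir');
```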
Pretty sure most, if not all, video manipulation libraries use ffmpeg under the hood.
Seeing as you're a React dev, you might appreciate Remotion for manipulating the videos as needed. It doesn't run in the front end, but it does have some useful features.
I'll try my best to explain this as well as I can:
I have programmed an art installation (an interactive animation with three.js). It runs very smoothly on my laptop, but not on older machines and on almost no mobile devices. Is there any way to run the JavaScript animation on a performant machine/server but display/render it on a website, so it also runs on less performant devices? I was thinking about something like what is done for cloud gaming (as in shadow.tech) or services like parsec.app, maybe...
Any ideas/hints?
I've thought about the same concept before too. The answer to your question (rendering three.js on a server and displaying it on an older machine) will be something like this:
You are going to have to render three.js on a server, basically live-stream it to the client, then capture the client's controls and send them back to the server to apply them. Latency is going to be a pain, so write your code nice and neat with a lot of error handling. Come to think of it, Google Stadia does the same thing when you play games through YouTube.
Server Architecture
So roughly thinking about it I would build my renderer backend like this:
First, create a room with multiple instances of your three.js code (i.e. index.js, index1.js, index2.js, etc.). Then decide whether you want to do this with the straightforward record-and-stream method, or by capturing the stream directly from the canvas and broadcasting it.
Recorded and Stream method
This means you create the room, render and display it on the server machine, then capture what is showing on the screen using something like OBS Studio and broadcast it to the client. I don't recommend this method.
OR
captureStream() method
After creating your room on your server, run the three.js code, then access the canvas you are rendering the three.js code into like this:
var canvas = document.querySelector('canvas');
var video = document.querySelector('video');
// Optional frames per second argument.
var stream = canvas.captureStream(60);
// Set the source of the <video> element to be the stream from the <canvas>.
video.srcObject = stream;
Then use something like WebRTC to broadcast it to the client.
This is an example site you can look at: https://webrtc.github.io/samples/src/content/capture/canvas-pc/
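A minimal sketch of the broadcasting side, assuming an app-specific signaling channel (the sendToViewer() helper below is hypothetical and stands in for whatever signaling you use):

```javascript
// Sketch: send the captured canvas stream to a viewer over WebRTC.
// sendToViewer() is a hypothetical signaling helper; exchanging the
// offer/answer and ICE candidates is app-specific and not shown here.
const canvas = document.querySelector('canvas');
const pc = new RTCPeerConnection();

// Attach every track of the canvas stream to the peer connection.
const stream = canvas.captureStream(60);
stream.getTracks().forEach((track) => pc.addTrack(track, stream));

pc.onicecandidate = ({ candidate }) => {
  if (candidate) sendToViewer({ candidate }); // forward ICE candidates
};

async function startBroadcast() {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToViewer({ offer }); // viewer answers; then call pc.setRemoteDescription()
}
startBroadcast();
```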
Honestly, don't ditch the client-interaction idea. Instead, write code to detect latency beforehand, then decide whether to enable interaction for that specific client.
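A rough sketch of such a latency probe (the /ping endpoint and the 100 ms threshold are illustrative assumptions, not part of any real API):

```javascript
// Sketch: estimate round-trip latency before enabling remote interaction.
// '/ping' is a hypothetical endpoint on the streaming server that responds
// immediately; the 100 ms cutoff is an arbitrary example.
async function measureLatency(samples = 5) {
  let total = 0;
  for (let i = 0; i < samples; i++) {
    const start = performance.now();
    await fetch('/ping', { cache: 'no-store' });
    total += performance.now() - start;
  }
  return total / samples; // average round trip in milliseconds
}

measureLatency().then((rtt) => {
  const enableInteraction = rtt < 100;
  console.log(`average RTT ${rtt.toFixed(1)} ms, interaction: ${enableInteraction}`);
});
```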
Hope this helps
I'm working with MPEG-DASH to live stream between MP4Box and dash.js. The problem arises on the client side: when the user pauses for too long, the playhead tries to read a point the buffer has already passed.
I can't find how to get a live skin without play/pause or a seek bar. I've tried video.js to customise the interface with the standard configuration, but the CSS configuration doesn't give me what I need. I'd like to know if there is an easy open-source way to display a live stream.
Thanks a lot,
Massimo
I've been researching for this for a minute but I'm not getting a straight answer to my exact question. I'm trying to understand the process behind video players switching video quality (480p, 720p, 1080p, etc.).
In order to achieve that, I am first asking: is this more of a front-end thing or a back-end thing? And to illustrate the answer, does one:
A) Upload one video file to a server (S3/Google Cloud) at the highest quality, then use a video tag in HTML and add a property to specify which quality is desired?
B) Upload one video file to a server (S3/Google Cloud) at the highest quality, then use JS to control the playback quality?
C) Split highest quality uploaded file into multiple files with different streaming quality using server code, then use front-end JS to choose which quality is needed?
D) Realize that this is way more work than it seems and should leave it to professional video streaming services, such as JWPlayer?
Or am I not seeing another option that's possible without a streaming service, without actually building a streaming service?
If the answer is pretty much D, what service do you recommend?
Note: I know YouTube and Vimeo can handle this but I'm not trying to have that kind of overhead.
It is answer 'C' as noted in the comments, and maybe partly answer 'D' also.
You need a video streaming server that supports one of the popular adaptive bitrate (ABR) streaming protocols, DASH or HLS. ABR lets the client device or player download the video in chunks, e.g. 10-second chunks, and select each next chunk from the bitrate most appropriate to the current network conditions.
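For example, with HLS the server exposes a master playlist that lists one variant stream per bitrate; the player then fetches chunks from whichever variant matches its current bandwidth (the URIs, bandwidths, and resolutions below are illustrative):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=842x480
480p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p/index.m3u8
```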
There are open-source streaming servers available, such as GStreamer, and licensed ones like Wowza, which you can use if you want to host the videos yourself.
For some example of ABR see this answer: https://stackoverflow.com/a/42365034/334402
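To sketch the front-end half of answer 'C': the open-source hls.js library handles the quality switching for you (the manifest URL below is illustrative):

```javascript
// Sketch: play an ABR (HLS) stream in the browser with hls.js, which
// switches between quality variants automatically based on bandwidth.
// The manifest URL is illustrative.
import Hls from 'hls.js';

const video = document.querySelector('video');
const manifestUrl = 'https://example.com/video/master.m3u8';

if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(manifestUrl);
  hls.attachMedia(video); // hls.js now picks the right quality per chunk
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = manifestUrl; // Safari plays HLS natively
}
```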
I am currently building a desktop app with electron, in which I include html5 <video/> tags.
For my first implementation, I was setting the src with a file URI (ex: 'file:///Users/bobby/Desktop/video.mp4').
The problem was that performance was awful (really long load times; the video took at least 2 seconds to respond to each click on the time bar). In my current implementation, to fix those issues, I launch, in parallel with my app, a static file server on localhost to serve the video files. I think this solution is really dirty and overkill.
Is there a clean way to feed the video tags with data read from a node fs stream? I was thinking about overriding the partial requests mechanism with a callback which would return my data, but from my current reading and understanding of the API, this is not possible.
The performance issues you're having are most likely related to the file size, encoding depth, or resolution. Big videos load more slowly and play more slowly. Seeking is also usually slower because the player has to spend time buffering from the file system before it can start or resume playing.
Try re-encoding at lower bit rates and resolutions to see if performance improves. It's usually better to upscale than downscale. Additionally, experiment with how far apart you space your keyframes. More frequent keyframes will usually increase file size but can make seeking faster.
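As a sketch, a re-encode along those lines with the ffmpeg CLI might look like this (the values are starting points to experiment with, not recommendations, and ffmpeg must be installed):

```shell
# Sketch: re-encode at a lower bitrate and resolution with more frequent
# keyframes. -b:v sets the video bitrate, the scale filter downscales to
# 1280 pixels wide while keeping the aspect ratio, and -g sets the keyframe
# interval in frames (30 is about one keyframe per second at 30 fps).
ffmpeg -i input.mp4 \
  -c:v libx264 -b:v 2M \
  -vf "scale=1280:-2" \
  -g 30 \
  -c:a aac output.mp4
```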
I was given 2 MP3 files, one that is 4.5Mb and one that is 5.6Mb. I was instructed to have them play on a website I am managing. I have found a nice, clean looking CSS based jQuery audio player.
My question is: is this the right solution for files that big? I am not sure whether the player preloads the file or streams it (if that is the correct terminology). I don't deal much with audio players and such...
This player is from happyworm.com/jquery/jplayer/latest/demo-01.htm
Is there another approach I should take to get this to play properly? I don't want it to have to buffer, making the visitor wait, or to slow page loading, etc. I want it to play cleanly and not affect the visitor's session on the site.
The name is a bit misleading: the MP3 playing is done in a Flash component, as in all other similar players. The jQuery part of it is the control and customization of the player (which is very nice; I'm not saying anything against the product).
The player should be capable of playing an MP3 file while it loads. It's not going to be real streaming (because you can't skip to arbitrary positions), but it should work out all right.
Make sure you test the buffering yourself, using a big MP3 file. Remember to encode the MP3 files according to the player's encoding guidelines, because otherwise the files will act up, especially in older players.