I am working on an app that has a media server, and this media server provides me the URLs of different video files in DASH (.mpd) format. I went through react-native-video: on Android it uses ExoPlayer, which already supports DASH.
I have tested ExoPlayer on Android and it works, but on the other hand react-native-video uses AVPlayer on iOS, which doesn't have any support for DASH.
I spent some time looking for a solution on the iOS side that would also work with React Native on both Android and iOS, but didn't find a proper way. I found two candidates, but neither works for both platforms.
dash.js - https://github.com/Dash-Industry-Forum/dash.js/wiki
Google Shaka Player - https://github.com/google/shaka-player
dash.js only supports the web, and Shaka Player has an embedded solution available for iOS, but I would have to bridge it before I could use it.
I am looking for a quick workaround in React Native. Is there any player that can play DASH on both platforms, or any other workaround that could work for me?
A key reason you may not see as much attention for DASH players on iOS as you might expect is that current Apple App Store rules require HLS on iOS devices for any video over 10 minutes, assuming your app is available on mobile networks, which is nearly always the case:
2.5.7 Video streaming content over a cellular network longer than 10 minutes must use HTTP Live Streaming and include a baseline 192 kbps HTTP Live stream.
(https://developer.apple.com/app-store/review/guidelines/)
This is also a key reason most video streams at this time are made available in both HLS and DASH.
Note that the CMAF format promises to eventually allow you to store and serve a single copy of your content, by effectively making the segmented video streams the same and using different HLS and DASH 'index' or 'manifest' files to refer to them. Because of some differences in the way encryption has been done in the past, and the time it will take for all devices and players to support the new format and the newly agreed encryption support, in practice nearly all encrypted streams will be provided as both HLS and DASH for some time.
(https://developer.apple.com/documentation/http_live_streaming/about_the_common_media_application_format_with_http_live_streaming)
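In practice this usually means pointing each platform at the rendition its native player supports: DASH for ExoPlayer on Android and HLS for AVPlayer on iOS. A minimal react-native-video sketch, assuming the media server can also expose an HLS (.m3u8) version of the same content (all URLs below are placeholders):

import React from 'react';
import { Platform } from 'react-native';
import Video from 'react-native-video';

// Same content in two packagings (placeholder URLs):
// DASH for Android (ExoPlayer), HLS for iOS (AVPlayer).
const DASH_URL = 'https://example.com/video/stream.mpd';
const HLS_URL = 'https://example.com/video/stream.m3u8';

export default function PlayerScreen() {
  return (
    <Video
      source={{ uri: Platform.OS === 'ios' ? HLS_URL : DASH_URL }}
      style={{ width: '100%', aspectRatio: 16 / 9 }}
      controls
      resizeMode="contain"
    />
  );
}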
I want to stream a real-time video feed coming from UDP into an HTML video tag.
I did some research, but I got a lot of information and I am struggling to get a clear overview of what I can and can't do.
The video feed uses the H.264 and AAC codecs in an MP4 container and has a 3840x2160 (4K) resolution. I'd like to play it in Chrome (latest version).
As I understand it so far, the HTML video tag can natively read H.264/AAC videos. I got it working with the video directly on my server (I'm using Meteor JS + React).
I learnt to use FFmpeg to stream a UDP feed that VLC can read, and then I used FFserver (I know it's deprecated) to create an HTTP stream that is also readable by VLC, but not by the HTML video tag.
So... my question is: can the HTML video tag natively read a video stream over HTTP?
I've seen a lot of discussions about HLS and DASH, but I didn't understand if (and why) they're mandatory.
I read a post about someone creating an HLS m3u8 playlist using only FFmpeg; is that a viable solution?
FFserver configuration
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 20
MaxClients 10
MaxBandwidth 100000
<Feed feed.ffm>
File /tmp/feed.ffm
FileMaxSize 1g
ACL allow 127.0.0.1
</Feed>
<Stream stream.mpeg>
Feed feed.ffm
Format mpeg
AudioCodec aac
AudioBitRate 256
AudioChannels 1
VideoCodec libx264
VideoBitRate 10000            # Total random here
VideoBitRateRange 5000-15000  # And here...
VideoFrameRate 30
VideoQMin 1
VideoQMax 50
VideoSize 3840x2160
VideoBufferSize 20000         # Not sure either
AVOptionVideo flags +global_header
</Stream>
I had to specify QMin and QMax to avoid an error message, but I don't really understand what they are.
FFmpeg command line
ffmpeg -re -i bbb_sunflower_2160p_30fps_normal.mp4 -strict -2 -r 30 -vcodec libx264 http://localhost:8090/feed.ffm
This works with VLC. I'm working with a file on my computer before moving to a UDP stream.
Media support in browsers is a constantly changing landscape, so it is worth having some places to look for the latest view.
The table at this link is generally up to date in my experience:
https://developer.mozilla.org/en-US/docs/Web/HTML/Supported_media_formats
You'll notice that the table includes both the codec and the container - e.g. H.264 in MP4. This is important to understand, as a codec may be supported by your browser but not in the container you want.
For the containers and codecs supported, the HTML5 video tag will support HTTP streams, or more accurately HTTP downloads. Most servers and browsers support downloading the video file in chunks so that you can start viewing before the video is fully downloaded.
For better performance across different device types and network conditions, video is often delivered via an Adaptive Bit Rate (ABR) protocol such as HLS or DASH. ABR also lets the client device or player download the video in chunks, e.g. 10-second chunks, but the server provides each chunk in multiple different bit rate versions. The player can then select the next chunk from the bit rate most appropriate to the current network conditions. See some more info in this answer also:
https://stackoverflow.com/a/42365034/334402
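To make the ABR case concrete, here is a minimal sketch of playing a DASH stream in the browser with dash.js; it assumes the dash.js library is loaded on the page, a <video id="player"> element exists, and the manifest URL is a placeholder:

// Minimal dash.js setup (placeholder manifest URL).
var url = 'https://example.com/video/manifest.mpd';
var video = document.querySelector('#player');
var player = dashjs.MediaPlayer().create();
// third argument = autoplay; the player handles bitrate switching automatically
player.initialize(video, url, true);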
I've been doing JavaScript programming for some time, but it's always been related to data updating, saving, manipulating, etc.
I have no idea how something like an in-browser audio player gets audio (especially live, streaming audio) from the internet and plays it out of my computer speakers.
How does this happen in Javascript?
For example, how does a website deliver live audio to my speakers using Javascript? http://player.streamtheworld.com/liveplayer.php?callsign=WVIEAM
The live audio is not much different from pre-recorded audio... it's just played back as it's received, and when live it's encoded as it's recorded.
In browsers these days, the most basic form of streaming audio is a simple <audio> tag. By changing the src attribute from a file to a stream, you're up and running:
<audio src="http://cdn.audiopump.co/waug/main_mp3_256k" />
The browser doesn't know or care in this case that the audio is a live stream. All it knows is that there's some media data that it's fetching via HTTP, and playing back while it comes in.
If your browser compatibility requirements allow it, it is preferable to use the MediaSource API, which gives you more control (such as switching to a different quality stream mid-stream, like in HLS) and ensures that the browser doesn't try to cache what is effectively an infinitely sized file.
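As a rough illustration of that approach, here is a Media Source Extensions sketch for a live audio stream; it assumes a chunked MP3 stream over HTTP (placeholder URL) and a browser that accepts 'audio/mpeg' source buffers:

// Rough sketch: feed a live audio stream through Media Source Extensions.
// Assumes an <audio> element on the page, a placeholder chunked-MP3 URL,
// and a browser that accepts 'audio/mpeg' source buffers.
const audio = document.querySelector('audio');
const mediaSource = new MediaSource();
audio.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer = mediaSource.addSourceBuffer('audio/mpeg');
  const response = await fetch('https://example.com/live-stream.mp3');
  const reader = response.body.getReader();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // wait for the previous append to finish before appending the next chunk
    if (sourceBuffer.updating) {
      await new Promise((resolve) =>
        sourceBuffer.addEventListener('updateend', resolve, { once: true })
      );
    }
    sourceBuffer.appendBuffer(value);
    // a real player would also evict old data (sourceBuffer.remove) to cap memory
  }
});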
For example, how does a website deliver live audio to my speakers using Javascript? http://player.streamtheworld.com/liveplayer.php?callsign=WVIEAM
This particular site is run by Triton Digital, and they still use Flash. Many sites still do this as a holdover from a time when HTML5 audio was not widely supported. There is little reason to do this today.
Other reasons to use Flash include incompatible server protocols. If your streaming server is using RTMP, you're stuck with Flash as browsers don't speak RTMP.
There used to be an issue with streaming AAC in-browser due to browsers not properly handling AAC wrapped in ADTS. (This encapsulation is required for streaming AAC in most situations.) Most browsers have resolved this, but I suspect that this is the reason Triton Digital is still using their Flash solution. By using Flash, they can play AAC/ADTS streams.
I'm a bit at my wits' end here.
I want to stream a live video broadcast to a web browser.
Currently I use ffmpeg to stream a DirectShow live source as a WebM stream to node.js, which then forwards the stream to the HTTP request from the <video> element. So far everything works.
live source -> ffmpeg -> POST [webm] -> node.js -> GET [webm] -> video tag
My problem is that the source clock and the web client's clock don't exactly match each other (not that surprising). For video this is not a problem: dropping or duplicating a frame every now and then is not noticeable. With audio, however, it is another issue. From what I've been able to figure out so far, Chrome (or any other browser) does not perform any form of audio resampling compensation (e.g. swr_set_compensation from ffmpeg) to compensate for this mismatch. Instead I get quite audible audio distortion (a loud beep) when the playback buffer runs out of samples.
My question is whether it is possible to achieve proper playback (with audio) of a live source in a web browser.
I haven't tried using Silverlight or Flash for playback yet. Would that possibly work better?
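To illustrate the pipeline above, the node.js piece might look roughly like the following sketch (route names and port are made up; it ignores reconnects and doesn't replay the WebM header to clients that join after the stream has started):

// Simplified relay sketch: ffmpeg POSTs a WebM stream to /ingest and each
// browser <video> GETs it back from /live.
var http = require('http');
var clients = [];

http.createServer(function (req, res) {
  if (req.method === 'POST' && req.url === '/ingest') {
    req.on('data', function (chunk) {
      clients.forEach(function (client) { client.write(chunk); });
    });
    req.on('end', function () { res.end(); });
  } else if (req.method === 'GET' && req.url === '/live') {
    res.writeHead(200, { 'Content-Type': 'video/webm' });
    clients.push(res);
    req.on('close', function () {
      clients = clients.filter(function (c) { return c !== res; });
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8000);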
Live media (audio and/or video) streaming to a web browser has been possible for a couple of years, though it is still making progress as of today. It is the next big thing for media on the web, and many platforms like YouTube are already on board.
A typical live media streaming scenario is:
audio/video feed > transcoding > streaming > player
At each step you have several technological possibilities available. However I should already mention here that the road to live media streaming is paved with proprietary technologies.
audio/video feed: either raw or a very lightly compressed media format, which cannot be uploaded to the Internet as such; you need to transcode it. You may have to use a capture device like a PCI Express card or a USB/Thunderbolt device to get your cam feed onto a computer.
transcoding: you have software (ffmpeg, Flash Media Live Encoder, Wirecast) or hardware solutions (streamingmedia.com has a wealth of information on the subject, like here). H.264/AAC is the current professional standard for media, and streams are often transcoded into multiple renditions (bitrates) to suit different network conditions.
streaming: you most likely need to target multiple devices to deliver your live stream, and not all devices support the same streaming protocol. HLS works on Apple devices and Android > 4.1; HDS or RTMP works in Flash, Smooth Streaming in Silverlight. You cannot reach all devices with one protocol, so in this case you would need a streaming server like Wowza or Red5. A streaming server takes a transcoded live stream as input and prepares it for cross-device delivery while sustaining a massive number of simultaneous connections (over a thousand is not uncommon nowadays). It can also add functionality like DVR or DRM. As of today the effort is around HTTP adaptive bitrate delivery. Large companies add CDN support for global delivery.
player: to display your live stream with various options like a custom layout, closed captions, ads, a chat module and more. Flash has been leading the market up until now for live media streaming on desktop. You can use HTML5 video on iOS and Android where HLS is supported.
Coming in fast is MPEG-DASH, and it works live with HTML5 video. There is a JS lib that supports live. I have tested it and it works, though I may not use it for a production scenario just yet, as it is still a bit clunky (on-demand support is better) and browser support is narrow at the moment (as of 8/30/13, desktop Chrome, desktop Internet Explorer 11, and mobile Chrome Beta for Android are the only browsers supported).
I cannot comment much on your solution because I have not used node.js for streaming but it sounds like an interesting effort. A typical solution I would use relating to your case:
Device > ffmpeg (H264/AAC) > Wowza > Hybrid player (Flash + HTML5).
Instead of Wowza you could use Red5 (free/open source, but not much activity as of late). You can also look into the Nginx RTMP module, which supports HLS and MPEG-DASH on top of RTMP.
For Flash I use Strobe from Adobe, which supports live streaming and is easy to set up, with a fallback to HTML5 where Flash is not supported. I use the SWFObject lib to detect Flash support and feed an HLS URL to an HTML5 video tag for mobile devices. You can use RTSP for Android < 4.1 and other mobile devices.
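As a rough sketch of that detection/fallback step (the SWF path, stream URLs and element id are placeholders, not an exact setup):

// Flash-first with HTML5/HLS fallback, using SWFObject for detection.
if (swfobject.hasFlashPlayerVersion('10.1')) {
  // Flash available: embed Strobe Media Playback and point it at the RTMP stream
  swfobject.embedSWF(
    'StrobeMediaPlayback.swf',                 // player SWF
    'player',                                  // id of the placeholder div
    '640', '360', '10.1', null,
    { src: 'rtmp://example.com/live/stream' }, // flashvars
    { allowFullScreen: 'true' },               // params
    {}                                         // attributes
  );
} else {
  // No Flash: fall back to an HTML5 video tag fed with the HLS rendition
  var video = document.createElement('video');
  video.src = 'http://example.com/live/stream.m3u8';
  video.controls = true;
  document.getElementById('player').appendChild(video);
}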
Another thing I should mention is real-time communications. For video/audio conferencing you could have a look at WebRTC. Those two articles should get you on the right track: here and here. WebRTC will work great for one-to-few, one-to-one, and few-to-few scenarios. If you need to support more concurrent connections you can have a look at Licode or TokBox.
I want to be able to record videos with audio using HTML and JavaScript.
After some research I can get video streaming with getUserMedia. There is also WebRTC for recording, but as far as I understood it's not yet implemented in desktop browsers (only mobile browsers support it). So right now I can just capture video, but I can't save it to a server or record it.
What other options do I have? Does anyone know a good Flash or HTML5 alternative that allows me to capture video with audio, save it to a server, and also enforce a maximum recording time?
Full disclosure: I work for Ziggeo.
When it comes to WebRTC, here is the rundown for browsers supporting it:
on Chrome and Opera, you have to record audio and video separately and encode them yourself in JS; then send them to your servers and transcode them (using e.g. ffmpeg) to MP4s and other target formats
on Firefox, you can get a WebM object with video and audio combined and send it to your servers (a rough sketch of this path follows below).
For all other browsers, and older versions of the ones mentioned, you'd need to fall back to Flash recording, which is usually based on RTMP and FLV.
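For the Firefox path above, a rough sketch using getUserMedia plus MediaRecorder, which is what produces the combined WebM object; the /upload endpoint is a placeholder:

// Capture camera + microphone, record a combined WebM blob, then upload it.
async function recordAndUpload(durationMs) {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
  const chunks = [];

  recorder.ondataavailable = function (e) { chunks.push(e.data); };
  const stopped = new Promise(function (resolve) { recorder.onstop = resolve; });

  recorder.start();
  await new Promise(function (resolve) { setTimeout(resolve, durationMs); });
  recorder.stop();
  await stopped;

  const blob = new Blob(chunks, { type: 'video/webm' });
  const form = new FormData();
  form.append('video', blob, 'recording.webm');
  await fetch('/upload', { method: 'POST', body: form });
}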
I'm looking for a web video player with which I can keep full compatibility with iOS devices: iPad, iPhone, etc. (so I would exclude all Flash video players).
Until now I've used Flowplayer, but I have some problems:
the main problem is that with FLV files the video starts playing very quickly, but I don't have any compatibility with iOS devices. Using MP4 files instead, I have full compatibility with iOS devices, but before the video plays I have to wait until the entire content of the file has been loaded (a few minutes).
So my question is: is there a video format that lets me start playback very quickly and is also compatible with iOS devices?
Not talking about HTML5 just yet, let's assume you are first interested in supporting most users and legacy devices. Unless you are using an embedded player (such as Flash), there is nothing inherent in all browsers and/or JavaScript that gives you a standardized way to play a video across these devices. If you simply link to the video file, you are asking the device to natively download and decode it. This is why playback typically does not begin until the entire file has been downloaded, and it depends on each device having a compatible MIME type configured for the file, pointing to a player that the browser can invoke to handle it.

When you use something like Flowplayer, these Flash applications can usually begin playing video before it is fully downloaded because they know how to download the video from your server over HTTP, and once they have received enough of the video stream (buffered the video), they can begin playing it.

Currently your best option is to keep using something like what you have been using for most devices, and have a separate link to the MP4 for iOS devices. If you just re-encode any videos you already have in FLV or whatever older formats you have been using to MP4, you should be able to play them in a current version of any Flash-based player, as Flash will work with those files as well as its legacy formats.
I've found this resource and it seems to be very good: http://code.google.com/p/php-mobile-detect/