Live Streaming to Browser - javascript

I'm a bit at my wits' end here.
I want to stream a live video broadcast to a web browser.
Currently I use ffmpeg to stream a DirectShow live source as a webm stream to node.js, which then forwards the stream to the HTTP request from the <video> element. So far everything works.
live source -> ffmpeg -> POST [webm] -> node.js -> GET [webm] -> video tag
My problem is that the source clock and the web client's clock don't exactly match each other (not that surprising). For video this is not a problem; dropping or duplicating a frame every now and then is not noticeable. However, with audio it is another issue. From what I've been able to figure out so far, Chrome (or any other browser) does not perform any form of audio resampling compensation (e.g. swr_set_compensation from ffmpeg) to compensate for this mismatch. Instead I get quite audible audio distortions (a loud beep) when the playback buffer runs out of samples.
My question is whether it is possible to achieve proper playback (with audio) of a live source in a web browser?
I haven't tried using Silverlight or Flash for playback yet. Would that possibly work better?
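For context, a minimal sketch of the kind of node.js relay described above; the routes, port and late-joiner handling are simplified and hypothetical:

// ffmpeg POSTs the webm stream to /input, and each browser
// <video src="/stream"> GETs the bytes as they arrive.
const http = require('http');

let clients = [];

http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/input') {
    // ffmpeg pushes the live webm stream here
    req.on('data', (chunk) => clients.forEach((c) => c.write(chunk)));
    req.on('end', () => res.end());
  } else if (req.method === 'GET' && req.url === '/stream') {
    res.writeHead(200, { 'Content-Type': 'video/webm' });
    clients.push(res);
    res.on('close', () => { clients = clients.filter((c) => c !== res); });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);
// Caveat: a client that joins mid-stream never sees the webm header, so a
// real relay has to cache and replay the initialization bytes to late joiners.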

Live media (audio and/or video) streaming to a web browser has been possible for a couple of years, though it is still making progress as of today. It is the next big thing for media on the web, and many platforms like YouTube are already on board.
A typical live media streaming scenario is:
audio/video feed > transcoding > streaming > player
At each step you have several technological possibilities available. However, I should mention up front that the road to live media streaming is paved with proprietary technologies.
audio/video feed: the feed is either raw or in a very lightly compressed media format and cannot be uploaded to the Internet as such; you need to transcode it. You may have to use a grabbing device like a PCI Express card or a USB/Thunderbolt device to get your camera onto a computer.
transcoding: you have software (ffmpeg, Flash Media Live Encoder, Wirecast) or hardware solutions (streamingmedia.com has a wealth of information on the subject, like here). H264/AAC is the current professional media standard, and streams are often transcoded to multiple renditions (bitrates) to suit different network conditions.
streaming: you most likely need to target multiple devices to deliver your live stream, and not all devices support the same streaming protocol. HLS works on Apple devices and Android > 4.1; HDS or RTMP work in Flash, Smooth Streaming in Silverlight. You cannot reach all devices with one protocol, so in this case you need a streaming server like Wowza or Red5. A streaming server takes a transcoded live stream as input and prepares it for cross-device delivery while sustaining a massive number of simultaneous connections (over a thousand is not uncommon nowadays). It can also add functionality like DVR or DRM. As of today the effort is centered on HTTP adaptive bitrate delivery, and large companies add CDN support for global delivery.
player: to display your live stream with various options like custom layout, closed captions, ads, chat module and more. Flash has been leading the market up until now for live media streaming on desktop. You can use HTML5 video for iOS and Android where HLS is supported.
Coming in fast is MPEG-DASH, which works live with HTML5 video. There is a JS lib that supports live playback. I have tested it and it works, though I might not use it for a production scenario just yet, as it is still a bit clunky (on-demand support is better) and browser support is narrow at the moment (as of 8/30/13, desktop Chrome, desktop Internet Explorer 11, and mobile Chrome Beta for Android are the only browsers supported).
I cannot comment much on your solution because I have not used node.js for streaming, but it sounds like an interesting effort. A typical solution I would use for your case:
Device > ffmpeg (H264/AAC) > Wowza > Hybrid player (Flash + HTML5).
Instead of Wowza you could use Red5 (free/open source, but not much activity as of late). You can also look into the nginx RTMP module, which supports HLS and MPEG-DASH on top of RTMP.
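As an illustration of the ffmpeg step in that pipeline, a hedged sketch that drives it from node.js; the device names, the ingest application ("live") and the stream key are placeholders for whatever your Wowza/Red5/nginx-rtmp setup expects:

// Spawn ffmpeg to transcode a DirectShow source to H264/AAC and push it
// over RTMP to the streaming server's ingest point (all names placeholders).
const { spawn } = require('child_process');

const ffmpeg = spawn('ffmpeg', [
  '-f', 'dshow',                           // DirectShow capture on Windows
  '-i', 'video=Camera:audio=Microphone',   // placeholder device names
  '-c:v', 'libx264', '-preset', 'veryfast',
  '-c:a', 'aac', '-b:a', '128k',
  '-f', 'flv',                             // RTMP carries FLV-wrapped H264/AAC
  'rtmp://localhost/live/myStream',        // ingest URL on the streaming server
]);

ffmpeg.stderr.on('data', (d) => process.stderr.write(d)); // ffmpeg logs to stderr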
For Flash I use Strobe Media Playback from Adobe, which supports live streaming and is easy to set up, with a fallback to HTML5 where Flash is not supported. I use the SWFObject lib to detect Flash support and feed an HLS URL to an HTML5 video tag for mobile devices. You can use RTSP for Android < 4.1 and other mobile devices.
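A sketch of that hybrid detection, assuming the SWFObject lib is loaded on the page; the player file, element id and stream URLs are placeholders:

// If a recent Flash player is present, embed Strobe Media Playback;
// otherwise fall back to an HTML5 video tag fed with an HLS URL.
if (swfobject.hasFlashPlayerVersion('10.1.0')) {
  swfobject.embedSWF('StrobeMediaPlayback.swf', 'player', '640', '360',
    '10.1.0', null, { src: 'rtmp://example.com/live/myStream', streamType: 'live' });
} else {
  var video = document.createElement('video');
  video.src = 'http://example.com/live/myStream/playlist.m3u8'; // HLS rendition
  video.controls = true;
  document.getElementById('player').appendChild(video);
}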
Another thing I should mention is real-time communications. For video/audio conferencing you could have a look at WebRTC. Those two articles should get you on the right track, here and here. WebRTC will work great for one-to-one, one-to-few, and few-to-few scenarios. If you need to support more concurrent connections you can have a look at Licode or TokBox.

Related

Play DASH or .mpd videos in react-native (iOS)

I am working on an app with a media server that provides me the URLs of different video files, and these videos are in DASH (.mpd) format. I went through react-native-video: it uses ExoPlayer on Android, which already supports DASH.
I have tested ExoPlayer on Android and it works, but on the other hand react-native-video uses AVPlayer on iOS, which has no DASH support.
I spent some time looking for a solution on the iOS side that could also work with React Native on both Android and iOS, but didn't find a proper way. I found two candidates, but neither works for both platforms:
dash.js - https://github.com/Dash-Industry-Forum/dash.js/wiki
Google Shaka Player - https://github.com/google/shaka-player
dash.js only has support for the web, and Shaka Player has an embedded solution available for iOS, which I would have to bridge before I could use it.
I am looking for a quick workaround in react-native. Is there any player available that can play DASH on both platforms, or any other workaround that could work for me?
A key reason you may not see as much attention paid to DASH players on iOS as you might expect is that current Apple App Store rules require you to use HLS on iOS devices for any video over 10 minutes, assuming your app is available on mobile networks, which is nearly always the case:
2.5.7 Video streaming content over a cellular network longer than 10 minutes must use HTTP Live Streaming and include a baseline 192 kbps HTTP Live stream.
(https://developer.apple.com/app-store/review/guidelines/)
This is a key reason most video streams at this time are made available as both HLS and DASH.
Note that the CMAF format promises to eventually let you store and serve only a single copy of your content, by effectively making the segmented video streams identical and having separate HLS and DASH 'index' or 'manifest' files refer to them. Because of some differences in the way encryption has been done in the past, and the time it will take for all devices and players to support the new format and the newly agreed encryption scheme, in practice nearly all encrypted streams will be provided as both HLS and DASH for some time.
(https://developer.apple.com/documentation/http_live_streaming/about_the_common_media_application_format_with_http_live_streaming)
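In practice, a common react-native workaround is to publish both renditions and pick per platform. A minimal sketch assuming react-native-video (the URLs are placeholders):

// AVPlayer plays the HLS playlist on iOS; ExoPlayer plays the DASH
// manifest on Android.
import React from 'react';
import { Platform } from 'react-native';
import Video from 'react-native-video';

const HLS_URL = 'https://example.com/stream/master.m3u8';   // placeholder
const DASH_URL = 'https://example.com/stream/manifest.mpd'; // placeholder

export default function LivePlayer() {
  return (
    <Video
      source={{ uri: Platform.OS === 'ios' ? HLS_URL : DASH_URL }}
      style={{ width: '100%', aspectRatio: 16 / 9 }}
      controls
    />
  );
}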

Comparing Media Source Extensions (MSE) with WebRTC

What are the fundamental differences between Media Source Extensions and WebRTC?
If I may project my own understanding for a moment: WebRTC includes RTCPeerConnection, which takes streams obtained as MediaStream objects and passes them into a protocol for streaming to connected peers of the application. It seems that under the hood WebRTC abstracts away a lot of the bigger issues, like codecs and transcoding. Would this be a correct assessment?
Where does Media Source Extensions fit into things? I have limited knowledge but have seen examples where developers are running adaptive streaming. Does MSE only deal with streams from your server?
Help would be much appreciated.
Unfortunately, these new browser-related protocols are being designed and developed by the W3C and IETF in a rather unorganized manner, not completely technically driven but reflecting battles between Apple, Google and Microsoft, all trying to standardize their own technologies. Similarly, different browsers choose to adopt only certain standards, or parts of standards, which makes developer life extremely hard.
I have implemented both Media Source Extensions and WebRTC, so I think I can answer your question:
Media Source Extensions is just a player inside the browser.
You create a MediaSource object
https://developer.mozilla.org/en-US/docs/Web/API/MediaSource
and assign it to your video element like this
video.src = URL.createObjectURL(mediaSource);
Then your JavaScript code can fetch media segments from somewhere (your server or a web server) and supply them to a SourceBuffer attached to the MediaSource, for playback.
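A minimal sketch of those steps, assuming webm segments are already available at placeholder URLs (a real player would fetch them according to a manifest):

// Create a MediaSource, attach it to the video element, then append
// fetched segments to a SourceBuffer.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer =
    mediaSource.addSourceBuffer('video/webm; codecs="vp8, vorbis"');
  for (let i = 0; i < 10; i++) {
    const segment =
      await fetch(`/segments/seg${i}.webm`).then((r) => r.arrayBuffer());
    // appendBuffer may not be called while the buffer is still updating
    if (sourceBuffer.updating) {
      await new Promise((resolve) =>
        sourceBuffer.addEventListener('updateend', resolve, { once: true }));
    }
    sourceBuffer.appendBuffer(segment);
  }
});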
WebRTC is not just a player; it is also a capture, encoding and sending mechanism. So it is a player too, but you use it a little differently from Media Source Extensions. Here you create another object: a MediaStream object
https://developer.mozilla.org/en-US/docs/Web/API/MediaStream
and assign it to your video element like this
video.srcObject = mediaStream;
Notice that in this case the mediaStream object is not created directly by you, but supplied to you by WebRTC APIs such as getUserMedia.
So, to summarize: in both cases you use the video element to play, but with Media Source Extensions you have to supply the media segments yourself, while with WebRTC you use the WebRTC APIs to supply the media. And, once again, with WebRTC you can also capture the user's webcam, encode the stream and send it to another browser to play, enabling p2p video chat, for example.
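A minimal sketch of that capture path; getUserMedia and srcObject are the standard APIs, and the RTCPeerConnection step is only hinted at in a comment:

// The browser, not your code, produces the MediaStream; you just attach it.
navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then((mediaStream) => {
    const video = document.querySelector('video');
    video.srcObject = mediaStream; // no object URL needed
    video.play();
    // the same stream's tracks could be handed to an RTCPeerConnection
    // via pc.addTrack(track, mediaStream) to send them to a peer
  })
  .catch((err) => console.error('getUserMedia failed:', err));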
Media Source Extensions browsers adoption: http://caniuse.com/#feat=mediasource
WebRTC browsers adoption: http://iswebrtcreadyyet.com/

How do in-browser audio players work?

I've been doing JavaScript programming for some time, but it's always been related to data updating, saving, manipulating, etc.
I have no idea how something like an in-browser audio player gets audio (especially live, streaming audio) from the internet and plays it out of my computer speakers.
How does this happen in Javascript?
For example, how does a website deliver live audio to my speakers using Javascript? http://player.streamtheworld.com/liveplayer.php?callsign=WVIEAM
The live audio is not much different from pre-recorded audio... it's just played back as it's received, and when live it's encoded as it's recorded.
In browsers these days, the most basic form of streaming audio is a simple <audio> tag. By changing the src attribute from a file to a stream, you're up and running:
<audio src="http://cdn.audiopump.co/waug/main_mp3_256k" />
The browser doesn't know or care in this case that the audio is a live stream. All it knows is that there's some media data that it's fetching via HTTP, and playing back while it comes in.
If the browsers you need to support allow it, it is preferable to use the MediaSource API, which gives you more control (such as switching to a different quality stream mid-stream, as in HLS) and ensures that the browser doesn't try to cache what is effectively an infinitely sized file.
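For audio the shape is the same as for video; a hedged sketch, where the mime type must be one your target browsers accept for SourceBuffers and onChunk is a hypothetical callback standing in for however the stream's chunks arrive:

// Same MediaSource pattern, audio flavor.
const audio = new Audio();
const mediaSource = new MediaSource();
audio.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const buffer = mediaSource.addSourceBuffer('audio/mpeg');
  // append each chunk as it arrives; the browser plays from the buffer
  // instead of trying to cache an endless file
  onChunk((chunk) => buffer.appendBuffer(chunk)); // onChunk is hypothetical
  audio.play();
});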
For example, how does a website deliver live audio to my speakers using Javascript? http://player.streamtheworld.com/liveplayer.php?callsign=WVIEAM
This particular site is run by Triton Digital, and they still use Flash. Many sites still do this as a holdover from a time when HTML5 audio was not widely supported. There is little reason to do this today.
Other reasons to use Flash include incompatible server protocols. If your streaming server uses RTMP, you're stuck with Flash, as browsers don't speak RTMP.
There used to be an issue with streaming AAC in-browser because browsers did not properly handle AAC wrapped in ADTS. (This encapsulation is required for streaming AAC in most situations.) Most browsers have resolved this, but I suspect this is why Triton Digital still uses its Flash solution: with Flash, they can play AAC/ADTS streams.

Javascript player for web video streaming compatible with iPad, iPhone, etc

I'm looking for a web video player with which I can keep full compatibility with iOS devices: iPad, iPhone, etc. (so I would exclude all Flash video players).
Until now I've used Flowplayer but I have some problems:
the main problem is that with flv files the video starts playing very quickly, but I don't have any compatibility with iOS devices. Using mp4 files instead, I have full compatibility with iOS devices, but before the video plays I have to wait for the whole file content to load (a few minutes).
So my question is: is there a video format that starts playing very quickly and is also compatible with iOS devices?
Not talking about HTML5 just yet, let's assume you are first interested in supporting most users and legacy devices. Unless you are using an embedded player (such as Flash), there is nothing inherent in all browsers and/or JavaScript that lets you play a video in a way that is standardized across these devices. If you simply link to the video file, you are asking the device to natively download and decode it, which depends on each device having a compatible MIME type configured for the file type, pointing to a player the browser can invoke to handle the file. This is why playback typically does not begin until the entire file is downloaded.
When you use something like Flowplayer, these Flash applications can usually begin playing video before it is fully downloaded, because the player knows how to fetch the video from your server over HTTP and can start playback once it has buffered enough of the stream.
Currently your best option is to keep using something like what you have been using for most devices, and have a separate link to the mp4 for iOS devices. If you re-encode any videos you already have in FLV or other older formats to mp4, you should be able to play them in a current version of any Flash-based player, as Flash handles those files as well as its legacy formats.
I've found this resource and it seems to be very good: http://code.google.com/p/php-mobile-detect/
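On the client side, a hedged sketch of the same split (the linked php-mobile-detect does this check server-side instead); the UA test is simplistic, the URLs are illustrative, and the Flowplayer call follows its classic flowplayer(elementId, swfPath, config) form:

// Serve the mp4 natively to iOS, keep the Flash player elsewhere.
var isIOS = /iPad|iPhone|iPod/.test(navigator.userAgent);

if (isIOS) {
  document.getElementById('player').innerHTML =
    '<video src="/videos/clip.mp4" controls></video>';
} else {
  flowplayer('player', '/flowplayer/flowplayer.swf', {
    clip: { url: '/videos/clip.flv' }
  });
}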

Video streaming over websockets using JavaScript

What is the fastest way to stream live video using JavaScript? Is WebSockets over TCP a fast enough protocol to stream a video of, say, 30fps?
Is WebSockets over TCP a fast enough protocol to stream a video of, say, 30fps?
Yes, it is; take a look at this project. WebSockets can easily handle HD video streaming. However, you should go for adaptive streaming; I explain here how you could implement it.
Currently we're working on a web-based instant messaging application with chat, file sharing and video/webcam support. With some bits and tricks we got media streaming through WebSockets (we used HTML5 Media Capture to get the stream from our webcams).
You need to build a stream API and a Media Stream Transceiver to control the related media processing and transport.
Media Source Extensions has also been proposed, which would allow for adaptive bitrate streaming implementations.
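A hedged sketch of the capture-and-send side of such a setup, using getUserMedia plus MediaRecorder to push encoded webm chunks over a WebSocket; the endpoint, mime type and timeslice are placeholders, and MediaRecorder support varies by browser:

// Capture the webcam, encode with MediaRecorder, and push each encoded
// chunk over a WebSocket.
const ws = new WebSocket('wss://example.com/ingest');

navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then((stream) => {
    const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
    recorder.ondataavailable = (event) => {
      if (event.data.size > 0 && ws.readyState === WebSocket.OPEN) {
        ws.send(event.data); // one encoded chunk per timeslice
      }
    };
    recorder.start(100); // emit a chunk roughly every 100 ms
  });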
To answer the question:
What is the fastest way to stream live video using JavaScript? Is WebSockets over TCP a fast enough protocol to stream a video of, say, 30fps?
Yes, WebSocket can be used to transmit at over 30 fps, and even 60 fps.
The main issue with WebSocket is that it is low-level and you have to deal with many other issues than just transmitting video chunks. All in all it's a great transport for video, and also for audio.
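To illustrate those "other issues", a minimal receive-side sketch that feeds WebSocket chunks into Media Source Extensions; queuing and append-readiness are entirely your problem (the URL and mime type are placeholders):

// Queue incoming WebSocket chunks and append them to the SourceBuffer
// only when it is ready.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sb = mediaSource.addSourceBuffer('video/webm; codecs="vp8, vorbis"');
  const queue = [];
  const ws = new WebSocket('wss://example.com/stream'); // placeholder URL
  ws.binaryType = 'arraybuffer';

  ws.onmessage = (event) => {
    queue.push(event.data);
    if (!sb.updating && queue.length) sb.appendBuffer(queue.shift());
  };
  sb.addEventListener('updateend', () => {
    if (queue.length && !sb.updating) sb.appendBuffer(queue.shift());
  });
});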
It's definitely conceivable, but I am not sure we're there yet. In the meantime, I'd recommend using something like Silverlight with IIS Smooth Streaming. Silverlight is plugin-based, but it works on Windows/OSX/Linux. Some day the HTML5 <video> element will be the way to go, but its support will be lacking for a little while.
