How do I stream multiple remote webcams to multiple users - javascript

I need to capture multiple users' webcams in the browser and then broadcast them to multiple users. This must include audio and video streams, with low- and high-quality options. Ideally I would not like to use Flash, but having Flash as a fallback is totally fine.
When the webcams are broadcast to the users, this must work on as many platforms as possible.

Related

Can a web browser use multiple audio sources concurrently?

I have a computer with multiple audio-out ports on the sound card. Can I use the Web Audio API, or something similar, to play different audio sources through different output ports?
For example, in a browser, load 3 tracks and send each track to a different audio output port on the sound card.
Hmm. The ChannelSplitterNode seems to offer finer-grained control over audio channels, but I'm not sure that's what I need. Would it turn one multichannel audio source into multiple mono outputs?
https://developer.mozilla.org/en-US/docs/Web/API/ChannelSplitterNode
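A minimal sketch of the ChannelSplitterNode approach, assuming a hypothetical multichannel file; note that the splitter only separates channels within one AudioContext, so routing to different physical output ports would need something like HTMLMediaElement.setSinkId where supported:

// Minimal sketch: split a multichannel buffer into mono channels.
// 'multichannel.wav' is a hypothetical asset.
const ctx = new AudioContext();
fetch('multichannel.wav')
  .then(res => res.arrayBuffer())
  .then(buf => ctx.decodeAudioData(buf))
  .then(audioBuffer => {
    const source = ctx.createBufferSource();
    source.buffer = audioBuffer;
    // One splitter output per channel of the source.
    const splitter = ctx.createChannelSplitter(audioBuffer.numberOfChannels);
    source.connect(splitter);
    // Each splitter output is mono; as a demo, route channel 0 to the
    // default destination. Other channels could feed separate graphs.
    splitter.connect(ctx.destination, 0);
    source.start();
  });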

Supporting HEVC/H.265 videos in Electron

Chrome/Electron don't normally support H.265 videos. I want the user to be able to play .mov files recorded by Apple devices in my app. How can I do this?
Possible approaches:
Use ffmpeg to transcode to H.264 in real-time, which will be resource intensive (not ideal)
Use WebAssembly to render the video in a <canvas> tag (not ideal)
Fork and manually add H.265 codec support to Chrome/Electron like https://github.com/AAAhs/electron-hevc (overkill)
Use a native module that renders the video in a <canvas> or BrowserView
We ran into a similar problem on a project where users of the iOS app were sending video messages to browser users. In the end, the easiest solution was to record the video in the H.264 codec on the device in the first place. An example from Apple's documentation:
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/capturing_video_in_alternative_formats

How to have different video players for different user accounts in online video streaming

I am taking a virtual classroom course. Classroom videos are recorded and then streamed through a website, with the recordings stored in the cloud. I have a user login and password, but video player A, which the vendor uses to stream, isn't compatible with my device. I tried many fixes suggested by their customer support, but to no avail. When this issue occurred the first time, they switched me from player A to player B, and player B worked fine on my device. But they later switched back to player A without telling me, and when I asked them to change my player back to B, they said it was not possible due to technical issues. What could those technical limitations be? Any guesses? (I am just availing myself of their services.)
Note: there are many other user accounts and devices on which their streaming service works fine.

Comparing Media Source Extensions (MSE) with WebRTC

What are the fundamental differences between Media Source Extensions and WebRTC?
If I may project my own understanding for a moment: WebRTC includes RTCPeerConnection, which handles getting streams from Media Streams and passing them into a protocol for streaming to the application's connected peers. It seems that under the hood WebRTC abstracts away a lot of the bigger issues, like codecs and transcoding. Would this be a correct assessment?
Where do Media Source Extensions fit into things? I have limited knowledge, but I have seen examples where developers use them for adaptive streaming. Does MSE only deal with streams from your server?
Help would be much appreciated.
Unfortunately, these new browser-related protocols are being designed and developed by the W3C and IETF in a rather disorganized manner, not entirely technically driven but reflecting battles between Apple, Google, and Microsoft, each trying to standardize its own technologies. Similarly, different browsers choose to adopt only certain standards, or parts of standards, which makes developers' lives extremely hard.
I have implemented both Media Source Extensions and WebRTC, so I think I can answer your question:
Media Source Extensions is just a player inside the browser.
You create a MediaSource object
https://developer.mozilla.org/en-US/docs/Web/API/MediaSource
and assign it to your video element like this
video.src = URL.createObjectURL(mediaSource);
Then your JavaScript code can fetch media segments from somewhere (your server or a web server) and supply them to a SourceBuffer attached to the MediaSource for playback.
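For illustration, a minimal sketch of that flow; the segment URL and MIME type here are assumptions and must match your actual container and codecs:

// Minimal MSE sketch: create a MediaSource, attach a SourceBuffer,
// fetch one media segment, and append it for playback.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const mime = 'video/webm; codecs="vp8, vorbis"'; // must match your media
  const sourceBuffer = mediaSource.addSourceBuffer(mime);
  const response = await fetch('/segments/init.webm'); // hypothetical endpoint
  sourceBuffer.appendBuffer(await response.arrayBuffer());
});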
WebRTC is not just a player; it is also a capture, encoding, and sending mechanism. So it is a player too, but you use it a little differently from Media Source Extensions. Here you create a different object: a MediaStream
https://developer.mozilla.org/en-US/docs/Web/API/MediaStream
and assign it to your video element like this
video.srcObject = mediaStream;
Notice that in this case you do not create the mediaStream object yourself; it is supplied to you by WebRTC APIs such as getUserMedia.
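For example, a minimal sketch of attaching a webcam stream obtained from getUserMedia (the constraints are illustrative):

// Minimal sketch: obtain a MediaStream from the webcam and attach it.
const video = document.querySelector('video');
navigator.mediaDevices
  .getUserMedia({ video: true, audio: true })
  .then(mediaStream => {
    video.srcObject = mediaStream; // assign the stream object directly
    return video.play();
  })
  .catch(err => console.error('Could not access webcam:', err));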
So, to summarize: in both cases you use the video element to play, but with Media Source Extensions you have to supply the media segments yourself, while with WebRTC you use the WebRTC API to supply the media. And, once again, with WebRTC you can also capture the user's webcam, encode it, and send it to another browser to play, enabling p2p video chat, for example.
Media Source Extensions browsers adoption: http://caniuse.com/#feat=mediasource
WebRTC browsers adoption: http://iswebrtcreadyyet.com/

Live Streaming to Browser

I'm a bit at my wits' end here.
I want to stream a live video broadcast to a web browser.
Currently I use ffmpeg to stream a DirectShow live source as a WebM stream to node.js, which then forwards the stream to the HTTP request from the <video> element. So far everything works.
live source -> ffmpeg -> POST [webm] -> node.js -> GET [webm] -> video tag
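A minimal sketch of that relay, assuming hypothetical /ingest and /live routes (and ignoring details like re-sending the WebM header to clients that join late):

// Minimal relay: ffmpeg POSTs a WebM stream to /ingest, and each
// browser <video> element GETs it back from /live.
const http = require('http');
let clients = [];

http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/ingest') {
    req.on('data', chunk => clients.forEach(c => c.write(chunk)));
    req.on('end', () => res.end());
  } else if (req.method === 'GET' && req.url === '/live') {
    res.writeHead(200, { 'Content-Type': 'video/webm' });
    clients.push(res);
    req.on('close', () => { clients = clients.filter(c => c !== res); });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);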
My problem is that the source clock and the web client's clock don't exactly match (not that surprising). For video this is not a problem: dropping or duplicating a frame every now and then is not noticeable. Audio, however, is another issue. From what I've been able to figure out so far, Chrome (like other browsers) does not perform any form of audio resampling compensation (e.g. swr_set_compensation from ffmpeg) to correct the mismatch. Instead I get quite audible distortion (a loud beep) when the playback buffer runs out of samples.
My question is whether it is possible to achieve proper playback (with audio) of a live source in a web browser.
I haven't tried using Silverlight or Flash for playback yet. Would that possibly work better?
Live media (audio and/or video) streaming to a web browser has been possible for a couple of years, though it is still making progress as of today. It is the next big thing for media on the web, and platforms like YouTube are already on board.
A typical live media streaming scenario is:
audio/video feed > transcoding > streaming > player
At each step you have several technological possibilities available. However, I should mention up front that the road to live media streaming is paved with proprietary technologies.
audio/video feed: a raw or very lightly compressed media format that cannot be uploaded to the Internet as such; you need to transcode it. You may have to use a grabbing device like a PCI Express card or a USB/Thunderbolt device to get your camera's feed onto a computer.
transcoding: you have software (ffmpeg, Flash Media Live Encoder, Wirecast) or hardware solutions (streamingmedia.com has a wealth of information on the subject, like here). H264/AAC is the current professional media standard, and streams are often transcoded to multiple renditions (bitrates) to suit different network conditions.
streaming: you most likely need to target multiple devices to deliver your live stream. Not all devices support the same streaming protocol: HLS works on Apple devices and Android > 4.1, HDS and RTMP work in Flash, and Smooth Streaming in Silverlight. You cannot reach all devices with one protocol, so in this case you need a streaming server like Wowza or Red5. A streaming server takes a transcoded live stream as input and prepares it for cross-device delivery while sustaining a massive number of simultaneous connections (over a thousand is not uncommon nowadays). It can also add functionality like DVR or DRM. As of today the effort is centered on HTTP adaptive-bitrate delivery, and large companies add CDN support for global delivery.
player: displays your live stream, with various options like custom layouts, closed captions, ads, a chat module, and more. Flash has led the market for live media streaming on desktop up until now. You can use HTML5 video for iOS and Android where HLS is supported.
Coming on fast is MPEG-DASH, which works live with HTML5 video. There is a JS library that supports live streaming. I have tested it and it works, though I would not use it for a production scenario just yet, as it is still a bit clunky (on-demand support is better) and browser support is narrow at the moment (as of 8/30/13, desktop Chrome, desktop Internet Explorer 11, and mobile Chrome Beta for Android are the only browsers supported).
I cannot comment much on your solution because I have not used node.js for streaming, but it sounds like an interesting effort. A typical pipeline I would use for your case:
Device > ffmpeg (H264/AAC) > Wowza > Hybrid player (Flash + HTML5).
Instead of Wowza you could use Red5 (free/open source, though not much activity as of late). You can also look into the Nginx RTMP module, which supports HLS and MPEG-DASH on top of RTMP.
For Flash I use Strobe Media Playback from Adobe, which supports live streaming and is easy to set up, with a fallback to HTML5 where Flash is not supported. I use the SWFObject library to detect Flash support and feed an HLS URL to an HTML5 video tag for mobile devices. You can use RTSP for Android < 4.1 and other mobile devices.
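As a rough illustration, the detection-and-fallback logic might look like this; the player file, stream URLs, and version string are all assumptions:

// Hybrid player sketch: embed Strobe via SWFObject when Flash is
// available, otherwise fall back to an HTML5 <video> with an HLS URL.
if (swfobject.hasFlashPlayerVersion('10.1')) {
  swfobject.embedSWF(
    'StrobeMediaPlayback.swf',                // hypothetical player path
    'player', '640', '360', '10.1', null,
    { src: 'rtmp://example.com/live/stream' } // hypothetical flashvars
  );
} else {
  const video = document.createElement('video');
  video.src = 'http://example.com/live/stream.m3u8'; // hypothetical HLS URL
  video.controls = true;
  document.getElementById('player').appendChild(video);
}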
Another thing I should mention is real-time communications. For video/audio conferencing you could have a look at WebRTC; those two articles should get you on the right track (here and here). WebRTC works great for one-to-one, one-to-few, and few-to-few scenarios. If you need to support more concurrent connections, you can have a look at Licode or TokBox.
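To round this out, a minimal sketch of the WebRTC offer side; sendToPeer is a placeholder for whatever signaling channel you provide, and the STUN server is an assumption:

// Minimal WebRTC offer-side sketch. sendToPeer() stands in for your
// own signaling transport (WebSocket, etc.).
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
});

pc.onicecandidate = e => { if (e.candidate) sendToPeer({ ice: e.candidate }); };
pc.ontrack = e => { document.querySelector('video').srcObject = e.streams[0]; };

navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then(stream => {
    stream.getTracks().forEach(track => pc.addTrack(track, stream));
    return pc.createOffer();
  })
  .then(offer => pc.setLocalDescription(offer))
  .then(() => sendToPeer({ sdp: pc.localDescription }));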
