HTML audio player for mobile browsers with stream as source - javascript

I use VLC media player to stream some content (audio). Right now I stream it to port 80 (http) of my PC. On a different device I open VLC and can listen to the stream by using http://192.168.0.78/ (the IP of the source computer). It plays well also on a phone when I use the VLC app to listen to the content.
Now, I try to implement a player on a website which takes the URL of the stream as src, so the client just opens the website and plays the stream (this has many logistic advantages). A minimal example can be viewed here. The stream runs fine in desktop browsers but does not run in mobile browsers. Unfortunately there is no error message or anything else that could indicate the source of the problem.
Things I tried to do:
Using different encodings of the stream in the VLC (MP3, OGG etc.)
Using third-party HTML players like Media element, jPlayer, audio.js, Muses Radio Player and many more. All of them work fine in desktop browsers, but are buggy, if they work at all, in mobile browsers
It works in a desktop browser when simulating a mobile browser through F12 -> Responsive Design Mode (in Firefox, for example)
The site in which I host the player runs on HTTPS, whereas the source of the stream is HTTP. At this point I am not aware of a problem this could cause, but I am still mentioning it here.
Does anyone have experience with HTML players on mobile devices which take a stream as source?

The site in which I host the player runs on HTTPS, whereas the source of the stream is HTTP. At this point I am not aware of a problem this could cause, but I am still mentioning it here.
That is in fact the problem. If you open up your browser's developer tools, you will see error messages related to this. Pages in secure contexts can no longer load data from an insecure context.
Ideally, you need to serve your stream via HTTPS. Otherwise, you'll have to serve your page via HTTP.
Also note that you do have this same problem on desktop browsers.
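If serving the stream itself over HTTPS isn't practical, another option is to proxy it through the same HTTPS origin that serves the page. Below is a minimal node.js sketch of that idea, assuming you already have certificate files for the site; the port, file names and the stream address (taken from the question) are illustrative, not a drop-in solution.
// Hypothetical reverse proxy: exposes the insecure VLC stream on the page's
// own HTTPS origin so the browser no longer blocks it as mixed content.
const https = require('https');
const http = require('http');
const fs = require('fs');
const options = {
  key: fs.readFileSync('site.key'),   // your existing certificate/key files
  cert: fs.readFileSync('site.crt'),
};
https.createServer(options, (req, res) => {
  // Forward every request to the HTTP stream source and relay the bytes as-is.
  http.get('http://192.168.0.78/', (streamRes) => {
    res.writeHead(streamRes.statusCode, streamRes.headers);
    streamRes.pipe(res);
  }).on('error', () => res.end());
}).listen(8443);
The player's src would then point at https://your-site:8443/ instead of the plain HTTP address.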

Related

How to work around lack of WebRTC support in a browser?

Some browsers (the mobile Mi Browser, for instance) don't support WebRTC - they have no RTCPeerConnection API. So the users of your WebRTC web app have to open it in another browser.
Is there a way to make your WebRTC app work without an explicit browser-change action from the user, especially on a mobile device?
I tried to investigate the following:
Deep link. It looks like we can't redirect the user to another browser using a deep link (I haven't found a Chrome deep link for mobile).
Send WebRTC sources to the browser / use a third-party WebRTC lib. This won't work either; you need WebRTC support in the browser's source code.
WebRTC is a framework based on a set of standards. It includes not only the capability to get information about the user's input/output devices, but also a set of network protocols based on UDP (from obtaining the client's IP to transferring arbitrary data through a data channel using the SCTP protocol). So, as you may already guess, it is impossible to support in a browser that doesn't ship it, which is why point (2) will not work.
As for point (1 - open Chrome): on iOS there is a custom protocol to open a URL in Chrome, "googlechromes://stackoverflow.com", but it's better to explicitly tell the user that the current browser doesn't support the required functionality, and to provide links to a list of popular browsers for download (Chrome, Firefox, etc.); those sites will then redirect the user to the proper store to download the native app.
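As a rough illustration of that approach, you can feature-detect WebRTC and show the hint yourself; the message text below is just a placeholder, not recommended wording.
// Detect WebRTC support and, if it is missing, tell the user to switch
// browsers instead of failing silently.
function hasWebRTC() {
  return typeof window.RTCPeerConnection === 'function' &&
         typeof navigator.mediaDevices !== 'undefined' &&
         typeof navigator.mediaDevices.getUserMedia === 'function';
}
if (!hasWebRTC()) {
  // Optionally try the iOS Chrome scheme mentioned above; otherwise just
  // show download links for browsers that ship WebRTC.
  var msg = document.createElement('p');
  msg.textContent = 'Your browser does not support WebRTC. Please open this page in Chrome or Firefox.';
  document.body.appendChild(msg);
}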

HTML5 audio tag does not work in Android - Chrome when created in JS?

I'm using a .mp3 file; the .mp3 file plays okay when viewed directly and also when embedded using the HTML5 audio tag, however when the HTML5 audio tag is created in JS it does not play! (very strange)
I do not have this issue in any other browser/device, for example Desktop - Chrome works perfectly.
sound = document.createElement('audio');
sound.setAttribute('src', 'sound.mp3');
sound.play();
I've tested sound.canPlayType('audio/mpeg') and this produces true (so it is supported).
Perhaps there's a bug in Android - Chrome? (it is the latest version)
It looks like this is an intended feature that spans more than just the Chrome browser. User interaction is required to get media elements to play.
Blink and WebKit have a setting for requiring a “user gesture” to play or pause an audio or video element, which is enabled in Opera for Android, Chrome for Android, the default Android browser, Safari for iOS and probably other browsers. This makes some sense, since mobile devices are used in public and in bed, where unsolicited sound from random Web sites could be a nuisance. Also, autoplaying video ads would waste bandwidth. Block Quote from 'blog.foolip.org'
Duplicate Threads from Other Users
Autoplay audio on mobile safari
How can I autoplay media in ios 4.2.1
Autoplay audio with ios 5 workaround?
Current Status
Developers have requested the deletion of 'mediaPlaybackRequiresUserGesture' which was reviewed and denied (for now). "We're going to gather some data about how users react to autoplaying videos in order to decide whether to keep this restriction."
Upon further inspection I found this...
"I misunderstood the outcome of the discussion (removing mediaPlaybackRequiresUserGesture) surrounding this topic. We need to keep this code in order to not break google.com while gathering data about this feature."
Google.com relies on the feature being disabled, otherwise it breaks (they didn't say what it breaks).
Original Bug Report
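In practice, the way to satisfy this restriction is to call play() from inside a user-triggered event handler. A minimal sketch (the button id is made up for illustration):
// Create the audio element up front, but only start playback from a
// tap/click, which the mobile browsers above require.
var sound = document.createElement('audio');
sound.setAttribute('src', 'sound.mp3');
document.body.appendChild(sound);
document.getElementById('playButton').addEventListener('click', function () {
  sound.play(); // allowed here because it runs inside a user gesture
});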
Try appending it to the document body.
document.body.appendChild(sound);
Note, though, that mobile devices may still not automatically play the audio or video. If you are targeting mobile devices, autoplay is considered bad practice since it can consume bandwidth, so it may be worth adding controls.
sound.setAttribute('controls', 'true');
OK, well, now that we know it won't work with audio, the only path left to you is to switch to the Web Audio API. You'll need to load the mp3 into an ArrayBuffer (e.g. using an XHR), then pass that to the decodeAudioData method, which gets you an AudioBuffer that you can play back at will from an AudioBufferSourceNode.
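A rough sketch of that approach, kept deliberately minimal (the file name comes from the question; error handling is omitted):
// Fetch the mp3 as an ArrayBuffer via XHR, decode it with the Web Audio API,
// and play it back from an AudioBufferSourceNode.
var ctx = new (window.AudioContext || window.webkitAudioContext)();
var xhr = new XMLHttpRequest();
xhr.open('GET', 'sound.mp3', true);
xhr.responseType = 'arraybuffer';
xhr.onload = function () {
  ctx.decodeAudioData(xhr.response, function (buffer) {
    var source = ctx.createBufferSource();
    source.buffer = buffer;            // the decoded AudioBuffer
    source.connect(ctx.destination);
    source.start(0);                   // source nodes are one-shot; create a new one per playback
  });
};
xhr.send();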
Not every browser on every platform can play the MP3 audio format. Generally, I would recommend providing two <source> elements within your audio element, one for the MP3 format and another for the Ogg Vorbis format.
You can read more here: https://developer.mozilla.org/en-US/docs/Web/HTML/Supported_media_formats
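For example, building the same element in script to match the rest of this question (the .ogg file name is illustrative):
// Offer both MP3 and Ogg Vorbis so the browser can pick a format it supports.
var sound = document.createElement('audio');
var mp3 = document.createElement('source');
mp3.src = 'sound.mp3';
mp3.type = 'audio/mpeg';
var ogg = document.createElement('source');
ogg.src = 'sound.ogg';
ogg.type = 'audio/ogg';
sound.appendChild(mp3);
sound.appendChild(ogg);
sound.setAttribute('controls', 'true');
document.body.appendChild(sound);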

Live Streaming to Browser

I'm a bit at my wits end here.
I want to stream a live video broadcast to a web browser.
Currently I use ffmpeg to stream a directshow live source as a webm stream to node.js which then forwards the stream to the http request from the <video> element. So far everything works.
live source -> ffmpeg -> POST [webm] -> node.js -> GET [webm] -> video tag
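Roughly, the node.js step looks like this (a simplified sketch with an illustrative port, not my actual code):
// One ffmpeg POST is fanned out to every connected <video> client.
// Late joiners would also need the WebM initialization segment resent,
// which is omitted here.
var http = require('http');
var clients = [];
http.createServer(function (req, res) {
  if (req.method === 'POST') {
    // ffmpeg pushes the live webm stream here
    req.on('data', function (chunk) { clients.forEach(function (c) { c.write(chunk); }); });
    req.on('end', function () { clients.forEach(function (c) { c.end(); }); });
  } else {
    // the <video> element's GET request lands here
    res.writeHead(200, { 'Content-Type': 'video/webm' });
    clients.push(res);
    req.on('close', function () { clients = clients.filter(function (c) { return c !== res; }); });
  }
}).listen(8000);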
My problem is that the source clock and the web client's clock don't exactly match each other (not that surprising). For video this is not a problem; dropping or duplicating a frame every now and then is not noticeable. With audio, however, it is another issue. From what I've been able to figure out so far, Chrome (or any other browser) does not perform any form of audio resampling compensation (e.g. swr_set_compensation from ffmpeg) to compensate for this mismatch. Instead I get quite audible audio distortions (a loud beep) when the playback buffer runs out of samples.
My question is whether it is possible to achieve proper playback (with audio) of a live source in a web browser?
I haven't tried using silverlight or flash for playback yet. Would that possibly work better?
Live media (audio or/and video) streaming to a web browser has been possible for a couple of years though it is still making progress as of today. It is the next big thing for media on the web and many platforms like Youtube are already on board.
A typical live media streaming scenario is:
audio/video feed > transcoding > streaming > player
At each step you have several technological possibilities available. However I should already mention here that the road to live media streaming is paved with proprietary technologies.
audio/video feed: either raw or a very lightly compressed media format that cannot be uploaded to the Internet as such. You need to transcode it. You may have to use a grabbing device like a PCI Express card or a USB/Thunderbolt device to get your camera feed onto a computer.
transcoding: you have software (ffmpeg, Flash media live encoder, Wirecast) or hardware solutions (streamingmedia.com has a wealth of information on the subject like here). H264/AAC is the current media professional standard and streams are often transcoded to multiple renditions (bitrate) to suit different network conditions.
streaming: you most likely need to target multiple devices to deliver your live stream. Not all devices support the same streaming protocol. HLS works on Apple devices and Android > 4.1. HDS or RTMP works in Flash, Smooth Streaming in Silverlight. You cannot reach all devices with one protocol, so in this case you would need a streaming server like Wowza or Red5. A streaming server takes a transcoded live stream as input and prepares it for cross-device delivery while sustaining a massive number of simultaneous connections (over a thousand is not uncommon nowadays). It can also add functionality like DVR or DRM. As of today the effort is around HTTP adaptive bitrate delivery. Large companies add CDN support for global delivery.
player: to display your live stream with various options like custom layout, closed captions, ads, chat module and more. Flash has been leading the market up until now for live media streaming on desktop. You can use HTML5 video for iOS and Android where HLS is supported.
Coming in fast is MPEG DASH, and it works live with HTML5 video. There is a JS lib that supports live. I have tested it and it works, though I may not use it for a production scenario just yet as it is still a bit clunky (on-demand support is better) and browser support is narrow at the moment (as of 8/30/13, desktop Chrome, desktop Internet Explorer 11, and mobile Chrome Beta for Android are the only browsers supported).
I cannot comment much on your solution because I have not used node.js for streaming but it sounds like an interesting effort. A typical solution I would use relating to your case:
Device > ffmpeg (H264/AAC) > Wowza > Hybrid player (Flash + HTML5).
Instead of Wowza you could use Red5 (free/open source - but not much activity as of late). You can also look into Nginx RTMP module which supports HLS and MPEG DASH on top of RTMP.
For Flash I use Strobe from Adobe, which supports live streaming and is easy to set up, with a fallback to HTML5 where Flash is not supported. I use the SWFObject lib to detect Flash support and feed an HLS URL to an HTML5 video tag for mobile devices (see the sketch below). You can use RTSP for Android < 4.1 and other mobile devices.
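A rough sketch of that detection logic (the SWF, stream URLs and element id are placeholders):
// If Flash is available, embed the Strobe-based player; otherwise fall back
// to an HTML5 video tag pointed at the HLS rendition for mobile devices.
if (swfobject.hasFlashPlayerVersion('10.1')) {
  swfobject.embedSWF('StrobeMediaPlayback.swf', 'player', '640', '360', '10.1',
    null, { src: 'rtmp://example.com/live/stream' }, { allowFullScreen: 'true' });
} else {
  var video = document.createElement('video');
  video.src = 'http://example.com/live/stream/playlist.m3u8'; // HLS URL
  video.controls = true;
  document.getElementById('player').appendChild(video);
}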
Another thing I should mention is real time communications. For video/audio conferencing you could have a look at WebRTC. Those 2 articles should get you on the right track. Here and here. WebRTC will work great for one to few, one to one, few to few. If you need to support more concurrent connections you can have a look at Licode or tokbox.

HTML5 Video Not Working, IE11

I am on Windows Server 2012. I uploaded a site that works fine on my laptop, but has problems when I test it in the server's IE 11.0.9600.16384.
I have this code to insert html5 videos in the site
document.getElementById("videogal").innerHTML=' ';
elemv.src=mplv[0];
document.getElementById("videogal").appendChild(elemv);
videogal is a div
elemv is a global var:
var elemv = document.createElement("video");
mplv is an array that contains literals, such as "myFolder/myvideo.mp4".
As I said, it works fine in my laptop, but not in server's IE. IE's console says Not Implemented and marks this line elemv.src=mplv[0];
I don't know how to handle that. I alerted mplv[0] and it has the proper value.
Most important: will this bug appear in clients' browsers also, or is it just local to the server? I connect to the server as Admin and have turned off all of IE's security restrictions.
Windows Server does not by default include certain "desktop" functionality, which includes the ability to play HTML5 video in Internet Explorer. Typically you wouldn't want users using server resources to play video. The "Desktop Experience" feature can be installed to add that functionality.
Installing this has no bearing on another machine's ability to play video served from this server. Other browsers (e.g. Chrome) running on the server have their own capability to play video and are unaffected by this feature being installed or not.
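If you also want the page itself to degrade gracefully on machines like this instead of throwing, a small feature check before assigning the source is one option; this is only a sketch built around the code from the question.
// Guard against environments (e.g. Server without Desktop Experience) where
// the <video> element exists but media playback is not implemented.
var supportsVideo = false;
try {
  supportsVideo = !!(elemv.canPlayType && elemv.canPlayType('video/mp4'));
} catch (e) {
  // canPlayType itself may throw "Not Implemented" here
}
if (supportsVideo) {
  elemv.src = mplv[0];
  document.getElementById('videogal').appendChild(elemv);
} else {
  document.getElementById('videogal').textContent = 'HTML5 video is not supported in this environment.';
}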

HTML5 base64 encoded audio on mobile devices

I'm writing a platform with an audio playback component. Audio is uploaded to the server as a wav/mp3/ogg file, and then (like the rest of our media) converted to base64 and stored in our redis database.
To play the audio back at the client side we make an AJAX request to the server for the base64-encoded audio. We have a desktop version that complements the mobile application; at the moment audio playback works like this:
recording.sound = new Audio("data:audio/ogg;base64," + recording.audio);
recording.sound.play(); // this works
Today we started our tests on mobile devices, and have so far been unable to get it working, even on mobile browsers that apparently support HTML5 audio.
Any ideas? Or if this is not possible, is there a different approach we can take? Ideally there should be a mobile compatible version of the web app, and there has to be a phonegap version.
The reason might not be a technical one here, from Apple developer site:
In Safari on iOS (for all devices, including iPad), where the user may be on a cellular network and be charged per data unit, preload and autoplay are disabled. No data is loaded until the user initiates it. This means the JavaScript play() and load() methods are also inactive until the user initiates playback, unless the play() or load() method is triggered by user action. In other words, a user-initiated Play button works, but an onLoad="play()" event does not.
The same applies to Android devices.
read more here: Safari HTML5 Audio and Video Guide
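So for the mobile/PhoneGap version, the same data-URI playback should work once it is moved into a user-initiated handler, roughly like this (the button id is made up for illustration):
// Build the audio object as before, but call play() only from a tap/click,
// which Safari on iOS and Android browsers treat as user-initiated.
recording.sound = new Audio('data:audio/ogg;base64,' + recording.audio);
document.getElementById('playBtn').addEventListener('click', function () {
  recording.sound.play();
});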
But „audio/wav“ doesn't exist. See spec here: http://www.iana.org/assignments/media-types/audio
You should use „audio/vnd.dts“ for .wav file, „audio/mpeg“ for .mp3 file and „audio/ogg“ for .ogg file...
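For example, picking the data-URI prefix based on the uploaded file's extension, using the mapping suggested above (the helper name is made up):
// Map the uploaded file's extension to a MIME type before building the data URI.
var mimeByExtension = {
  wav: 'audio/vnd.dts',
  mp3: 'audio/mpeg',
  ogg: 'audio/ogg'
};
function makeDataUri(extension, base64Audio) {
  return 'data:' + mimeByExtension[extension] + ';base64,' + base64Audio;
}
recording.sound = new Audio(makeDataUri('mp3', recording.audio));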
OK, try StackOverflow search, see:
https://stackoverflow.com/search?q=audio+codec+support+mobile+devices+html5
or https://stackoverflow.com/search?q=audio+codec+support+mobile+devices+html
or try Google
Some search results, that might be useful:
In search for a library that knows to detect support for specific Audio Video format files
or html5 vs flash - full comparison chart anywhere?
