I've been doing JavaScript programming for some time, but it's always been related to updating, saving, and manipulating data.
I have no idea how something like an in-browser audio player gets audio (especially live, streaming audio) from the internet and plays it out of my computer speakers.
How does this happen in Javascript?
For example, how does a website deliver live audio to my speakers using Javascript? http://player.streamtheworld.com/liveplayer.php?callsign=WVIEAM
The live audio is not much different from pre-recorded audio... it's just played back as it's received, and when live it's encoded as it's recorded.
In browsers these days, the most basic form of streaming audio is a simple <audio> tag. By changing the src attribute from a file to a stream, you're up and running:
<audio src="http://cdn.audiopump.co/waug/main_mp3_256k" />
The browser doesn't know or care in this case that the audio is a live stream. All it knows is that there's some media data that it's fetching via HTTP, and playing back while it comes in.
If your browser compatibility requirements allow it, it's preferable to use the MediaSource API, which gives you more control (such as switching to a different quality stream mid-stream, as in HLS) and ensures that the browser doesn't try to cache what is effectively an infinitely sized file.
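For illustration, a minimal MediaSource sketch (whether a given browser accepts 'audio/mpeg' in a SourceBuffer varies, and a real player would also need error handling; the stream URL is the one from above):

var audio = document.querySelector('audio');
var mediaSource = new MediaSource();
audio.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
    var sourceBuffer = mediaSource.addSourceBuffer('audio/mpeg');
    fetch('http://cdn.audiopump.co/waug/main_mp3_256k').then(function (response) {
        var reader = response.body.getReader();
        (function pump() {
            reader.read().then(function (result) {
                if (result.done) return;
                sourceBuffer.appendBuffer(result.value);
                // appendBuffer is asynchronous; wait for it to finish
                // before appending the next chunk.
                sourceBuffer.addEventListener('updateend', pump, { once: true });
            });
        })();
    });
});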
For example, how does a website deliver live audio to my speakers using Javascript? http://player.streamtheworld.com/liveplayer.php?callsign=WVIEAM
This particular site is run by Triton Digital, and they still use Flash. Many sites still do this as a holdover from a time when HTML5 audio was not widely supported. There is little reason to do this today.
Other reasons to use Flash include incompatible server protocols. If your streaming server is using RTMP, you're stuck with Flash as browsers don't speak RTMP.
There used to be an issue with streaming AAC in-browser due to browsers not properly handling AAC wrapped in ADTS. (This encapsulation is required for streaming AAC in most situations.) Most browsers have resolved this, but I suspect that this is the reason Triton Digital is still using their Flash solution. By using Flash, they can play AAC/ADTS streams.
Related
I'm using a .mp3 file. It plays fine when viewed directly and also when embedded using the HTML5 audio tag; however, when I create the HTML5 audio tag in JS, it does not play! (very strange)
I do not have this issue in any other browser/device; for example, desktop Chrome works perfectly.
var sound = document.createElement('audio');
sound.setAttribute('src', 'sound.mp3');
sound.play(); // works on desktop Chrome, silent on Android Chrome
I've tested sound.canPlayType('audio/mpeg') and it returns a truthy value (so the format is supported).
Perhaps there's a bug in Android - Chrome? (it is the latest version)
Looks like this is an intended feature that spans more than just the Chrome browser: user interaction is required to get media elements to play.
"Blink and WebKit have a setting for requiring a 'user gesture' to play or pause an audio or video element, which is enabled in Opera for Android, Chrome for Android, the default Android browser, Safari for iOS and probably other browsers. This makes some sense, since mobile devices are used in public and in bed, where unsolicited sound from random Web sites could be a nuisance. Also, autoplaying video ads would waste bandwidth." (quoted from blog.foolip.org)
Duplicate Threads from Other Users
Autoplay audio on mobile safari
How can I autoplay media in ios 4.2.1
Autoplay audio with ios 5 workaround?
Current Status
Developers have requested the removal of 'mediaPlaybackRequiresUserGesture', which was reviewed and denied (for now). "We're going to gather some data about how users react to autoplaying videos in order to decide whether to keep this restriction."
Upon further inspection I found this...
"I misunderstood the outcome of the discussion (removing mediaPlaybackRequiresUserGesture) surrounding this topic. We need to keep this code in order to not break google.com while gathering data about this feature."
Google.com relies on this restriction staying in place; removing it would break something on their end (they didn't say what).
Original Bug Report
Try appending it to the document body.
document.body.appendChild(sound);
Note, though, that mobile devices may still refuse to play the audio automatically. If you are targeting mobile devices, autoplay is considered bad practice anyway, since it can consume bandwidth, so it may be worth adding controls:
sound.setAttribute('controls', 'true');
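The usual workaround for the user-gesture requirement is to start playback from a click or touch handler; a sketch, where 'playButton' is a hypothetical element on the page:

document.getElementById('playButton').addEventListener('click', function () {
    // Calling play() inside a user gesture unlocks the media element
    // on mobile browsers.
    sound.play();
});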
OK, well, now that we know it won't work with an <audio> element, the only path left to you is the Web Audio API. You'll need to load the mp3 into an ArrayBuffer (e.g. using an XHR), then pass that to the decodeAudioData method, which gets you an AudioBuffer that you can play back at will from an AudioBufferSourceNode.
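A rough sketch of that approach, assuming the same sound.mp3 file (older browsers may need the webkit prefix):

var context = new (window.AudioContext || window.webkitAudioContext)();
var request = new XMLHttpRequest();
request.open('GET', 'sound.mp3', true);
request.responseType = 'arraybuffer';
request.onload = function () {
    context.decodeAudioData(request.response, function (buffer) {
        var source = context.createBufferSource();
        source.buffer = buffer;               // the decoded AudioBuffer
        source.connect(context.destination);  // wire it to the speakers
        source.start(0);                      // play immediately
    });
};
request.send();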
Not every browser on every platform can play the mp3 format. As a general recommendation, you should provide two <source> elements within your audio element: one for mp3, and another for Ogg Vorbis.
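For example (the file names are placeholders):

<audio controls>
    <source src="sound.mp3" type="audio/mpeg">
    <source src="sound.ogg" type="audio/ogg">
</audio>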
You can read more here: https://developer.mozilla.org/en-US/docs/Web/HTML/Supported_media_formats
I'm a bit at my wits' end here.
I want to stream a live video broadcast to a web browser.
Currently I use ffmpeg to stream a directshow live source as a webm stream to node.js which then forwards the stream to the http request from the <video> element. So far everything works.
live source -> ffmpeg -> POST [webm] -> node.js -> GET [webm] -> video tag
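For reference, the node.js hop is essentially a fan-out relay; a stripped-down sketch of what I'm doing (no buffering, headers, or error handling):

var http = require('http');
var clients = [];

http.createServer(function (req, res) {
    if (req.method === 'POST') {
        // ffmpeg POSTs the webm stream here...
        req.on('data', function (chunk) {
            clients.forEach(function (client) {
                client.write(chunk); // ...and each chunk is fanned out
            });
        });
    } else {
        // ...while each <video> element GETs it here.
        res.writeHead(200, { 'Content-Type': 'video/webm' });
        clients.push(res);
    }
}).listen(8080);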
My problem is that the source clock and the web client's clock don't exactly match each other (not that surprising). For video this is not a problem: dropping or duplicating a frame every now and then is not noticeable. Audio, however, is another issue. From what I've been able to figure out so far, Chrome (like any other browser) does not perform any form of audio resampling compensation (e.g. swr_set_compensation from ffmpeg) to compensate for this mismatch. Instead I get quite audible distortions (a loud beep) when the playback buffer runs out of samples.
My question is whether it is possible to achieve proper playback (with audio) of a live source in a web browser?
I haven't tried using Silverlight or Flash for playback yet. Would that possibly work better?
Live media (audio and/or video) streaming to a web browser has been possible for a couple of years, though it is still making progress as of today. It is the next big thing for media on the web, and many platforms like YouTube are already on board.
A typical live media streaming scenario is:
audio/video feed > transcoding > streaming > player
At each step you have several technological possibilities available. However, I should mention up front that the road to live media streaming is paved with proprietary technologies.
audio/video feed: the feed is raw or very lightly compressed and cannot be uploaded to the Internet as such; you need to transcode it. You may have to use a capture device like a PCI Express card or a USB/Thunderbolt device to get your camera onto a computer.
transcoding: you have software (ffmpeg, Flash Media Live Encoder, Wirecast) or hardware solutions (streamingmedia.com has a wealth of information on the subject, like here). H264/AAC is the current media professional standard, and streams are often transcoded into multiple renditions (bitrates) to suit different network conditions.
streaming: you most likely need to target multiple devices to deliver your live stream, and not all devices support the same streaming protocol. HLS works on Apple devices and Android > 4.1; HDS or RTMP work in Flash; Smooth Streaming works in Silverlight. You cannot reach all devices with one protocol, so in this case you would need a streaming server like Wowza or Red5. A streaming server takes a transcoded live stream as input and prepares it for cross-device delivery while sustaining a massive number of simultaneous connections (over a thousand is not uncommon nowadays). It can also add functionality like DVR or DRM. As of today the effort is centered on HTTP adaptive bitrate delivery, and large companies add CDN support for global delivery.
player: displays your live stream with various options like custom layout, closed captions, ads, a chat module and more. Flash has been leading the market for live media streaming on desktop up until now. You can use HTML5 video on iOS and Android where HLS is supported.
Coming in fast is MPEG-DASH, which works live with HTML5 video. There is a JS lib that supports live streams. I have tested it and it works, though I would not use it for a production scenario just yet, as it is still a bit clunky (on-demand support is better) and browser support is narrow at the moment (as of 8/30/13, desktop Chrome, desktop Internet Explorer 11, and mobile Chrome Beta for Android are the only browsers supported).
I cannot comment much on your solution because I have not used node.js for streaming, but it sounds like an interesting effort. A typical solution I would use for your case:
Device > ffmpeg (H264/AAC) > Wowza > Hybrid player (Flash + HTML5).
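As an illustration, the first two hops might look like the following ffmpeg invocation; a sketch only, where the device names, bitrates, and RTMP URL are all placeholders:

ffmpeg -f dshow -i video="Camera":audio="Microphone" \
    -c:v libx264 -preset veryfast -b:v 1500k \
    -c:a aac -strict experimental -b:a 128k \
    -f flv rtmp://wowza.example.com/live/mystream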
Instead of Wowza you could use Red5 (free/open source, though not much activity as of late). You can also look into the Nginx RTMP module, which supports HLS and MPEG-DASH on top of RTMP.
For Flash I use Strobe from Adobe, which supports live streaming and is easy to set up, with a fallback to HTML5 where Flash is not supported. I use the SWFObject lib to detect Flash support and feed an HLS URL to an HTML5 video tag for mobile devices. You can use RTSP for Android < 4.1 and other mobile devices.
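A sketch of that hybrid detection, where the SWF name, element id, and stream URLs are placeholders:

if (swfobject.hasFlashPlayerVersion('10.1')) {
    // Desktop with Flash: embed Strobe Media Playback pointed at the stream.
    swfobject.embedSWF('StrobeMediaPlayback.swf', 'player', '640', '360', '10.1',
        null, { src: 'rtmp://example.com/live/mystream' }, null, null);
} else {
    // No Flash (mobile): fall back to HTML5 video with the HLS URL.
    var video = document.createElement('video');
    video.src = 'http://example.com/live/mystream/playlist.m3u8';
    video.controls = true;
    document.getElementById('player').appendChild(video);
}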
Another thing I should mention is real-time communications. For video/audio conferencing you could have a look at WebRTC; those two articles should get you on the right track (here and here). WebRTC will work great for one-to-few, one-to-one, and few-to-few scenarios. If you need to support more concurrent connections, have a look at Licode or TokBox.
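The first step of any WebRTC setup is capturing the local stream; a sketch using the prefixed getUserMedia API of the time:

navigator.getUserMedia = navigator.getUserMedia ||
    navigator.webkitGetUserMedia || navigator.mozGetUserMedia;

navigator.getUserMedia({ audio: true, video: true }, function (stream) {
    // Show a local preview of the captured feed.
    var video = document.querySelector('video');
    video.src = URL.createObjectURL(stream);
    video.play();
}, function (err) {
    console.error('getUserMedia failed:', err);
});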
I'm writing a platform with an audio playback component. Audio is uploaded to the server as a wav/mp3/ogg file, and then (like the rest of our media) converted to base64 and stored within our Redis database.
To play the audio back on the client side, we make an AJAX request to the server for the base64-encoded audio. We have a desktop version that complements the mobile application; at the moment audio playback works like this:
recording.sound = new Audio("data:audio/ogg;base64," + recording.audio);
recording.sound.play(); // this works
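For context, the AJAX fetch looks roughly like this; the endpoint and JSON shape are hypothetical:

var xhr = new XMLHttpRequest();
xhr.open('GET', '/recordings/42', true);
xhr.onload = function () {
    var recording = JSON.parse(xhr.responseText);
    recording.sound = new Audio('data:audio/ogg;base64,' + recording.audio);
    recording.sound.play(); // works on desktop, silent on mobile
};
xhr.send();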
Today we started our tests on mobile devices, and have so far been unable to get it working, even on mobile browsers that apparently support HTML5 audio.
Any ideas? Or if this is not possible, is there a different approach we can take? Ideally there should be a mobile-compatible version of the web app, and there has to be a PhoneGap version.
The reason might not be a technical one here. From the Apple developer site:
In Safari on iOS (for all devices, including iPad), where the user may be on a cellular network and be charged per data unit, preload and autoplay are disabled. No data is loaded until the user initiates it. This means the JavaScript play() and load() methods are also inactive until the user initiates playback, unless the play() or load() method is triggered by user action. In other words, a user-initiated Play button works, but an onLoad="play()" event does not.
The same applies to Android devices.
Read more here: Safari HTML5 Audio and Video Guide
But "audio/wav" isn't an IANA-registered type; see the registry here: http://www.iana.org/assignments/media-types/audio
In practice, browsers generally accept "audio/wav" or "audio/x-wav" for .wav files anyway; use "audio/mpeg" for .mp3 files and "audio/ogg" for .ogg files...
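A quick way to check what a given browser accepts is canPlayType, which returns "", "maybe", or "probably":

var probe = document.createElement('audio');
console.log(probe.canPlayType('audio/mpeg')); // .mp3
console.log(probe.canPlayType('audio/ogg'));  // .ogg
console.log(probe.canPlayType('audio/wav'));  // .wav (non-registered but widely recognized)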
OK, try StackOverflow search, see:
https://stackoverflow.com/search?q=audio+codec+support+mobile+devices+html5
or https://stackoverflow.com/search?q=audio+codec+support+mobile+devices+html
or try Google
Some search results, that might be useful:
In search for a library that knows to detect support for specific Audio Video format files
or html5 vs flash - full comparison chart anywhere?
I would like to make a web application where people can add recorded sounds and samples to a timeline.
I want it to output one sound file (approximately 3 min long) which will be sent to the server.
Now I would like to do this with the HTML5 Audio API, and found out that I could do it with an AudioContext. But AudioContext is only supported in Chrome.
Now, I dislike Flash, and I wanted to ask if there is any way to do this with HTML5 with decent browser support (so the newest Chrome, IE and Firefox).
Some thoughts:
I could record audio using the HTML5 audio API and the user can add it to the timeline. When finished, the client will upload all the audio files to the server, including their positions on the timeline. The server can combine the audio files into one file. Now this is a solution, but I would prefer to do this kind of work on the client side.
At the moment, in my opinion, it is not possible (if it has to be HTML5 and supported by IE and Firefox); see the list of browsers that support the Audio API:
Browsers with Audio API Support. But this information could be outdated already (these browsers update so frequently).
You could wait, serve only Chrome first, and hope the other browsers catch up (IE might be a problem). Or you could use Java (if you don't like Flash). The other technology out there is Silverlight, but it is "dead", so I wouldn't recommend it.
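If you do go Chrome-only for now, the mix-down could happen client-side with an OfflineAudioContext; a sketch, where 'clips' is a hypothetical array of decoded AudioBuffers with timeline offsets in seconds:

// Render a ~3-minute stereo timeline to a single AudioBuffer offline.
var OfflineCtx = window.OfflineAudioContext || window.webkitOfflineAudioContext;
var offline = new OfflineCtx(2, 44100 * 180, 44100);

clips.forEach(function (clip) {
    var source = offline.createBufferSource();
    source.buffer = clip.buffer;         // a decoded AudioBuffer
    source.connect(offline.destination);
    source.start(clip.startTime);        // position on the timeline
});

offline.oncomplete = function (e) {
    var mixed = e.renderedBuffer;        // the combined result
    // encode and upload 'mixed' to the server from here
};
offline.startRendering();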
I hope my input helps a bit.
This isn't another one of those "How can I record audio in the browser?" questions... I know that the HTML5 Stream API is around the corner and Flash can already access the user's microphone and camera. I'm simply wondering, as a Javascript developer with little knowledge of Flash, if anyone has developed a JS library that hooks into Flash's device capabilities for recording but sends the results back to javascript (presumably using ExternalInterface).
In other words... libraries like SoundManager2 utilize a Flash fallback for audio playback, but they don't seem to allow for recording. Has anyone written a JS library that uses an invisible Flash movie to allow audio recording?
This does most of what you're looking for:
https://code.google.com/p/wami-recorder/
It records audio and sends it to a server via an HTTP POST (avoiding the need for a Flash Media Server.) A JavaScript API is available via ExternalInterface.
I'm not sure why you'd want the audio bytes in JavaScript, but it would probably be easy to modify it to do that too.
Unfortunately, you can't really do Flash audio recording in the browser only. The Flash audio interfaces are all designed (surprise surprise) to talk to a Flash media server (or Red5): there is no interface to store recorded audio data locally or pass it to JavaScript.
Once you have Red5/FMS set up, you can control the recording process from JavaScript: you can start/stop/play back the audio stream to/from the server. However, for security reasons you have to have a Flash movie that is a minimum of 216 x 138 pixels (see http://blog.natebeck.net/2009/01/tip-of-the-day-tricks-of-the-mic-settings-panel/ for a writeup), otherwise the settings manager won't be shown; this prevents people from hiding an audio-recording Flash widget on a page and eavesdropping.
So no, no invisible flash controlled from javascript.