I'm trying to find a solution for my website, which desperately needs an upgrade. Currently I am using Flash-based players to stream content from Shoutcast, which works tolerably well, but unfortunately it leaves mobile users behind.
Ideally, I would love to have an HTML5/Javascript player that can play Shoutcast (or other) streams online. I've tried jPlayer, which seems pretty good for playing individual files, but have been unsuccessful in trying to get it to work with Shoutcast and Icecast.
Does anybody have any suggestions on where to start?
You can use jPlayer to play your Shoutcast stream using the HTML5 native player. It works quite well on most browsers/platforms except Android; in that case jPlayer falls back to its Flash player.
You need to specify your audio type as 'mp3' (AAC streams do not work, so make sure you are pointing at a straight ICY MP3 stream). I have used 'mp3', but you can also try the type 'stream'.
You need to change the url of your stream slightly:
Normal: http://yourserver.com:8000/listen.pls
jPlayer: http://yourserver.com:8000/;listen.pls
(note the added semicolon; on Shoutcast servers this requests the raw stream instead of the HTML status page)
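For reference, here is a minimal sketch of wiring that URL into jPlayer; the selector, swfPath, and stream URL are placeholders you would replace with your own:

$("#player").jPlayer({
    ready: function () {
        // Point jPlayer at the Shoutcast stream (note the trailing semicolon)
        $(this).jPlayer("setMedia", {
            mp3: "http://yourserver.com:8000/;"
        }).jPlayer("play");
    },
    supplied: "mp3",        // we are supplying a straight MP3 stream
    swfPath: "/js",         // where the Flash fallback .swf lives
    solution: "html, flash" // prefer HTML5, fall back to Flash
});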
Here is the documentation from the jPlayer site that should help as well.
http://www.jplayer.org/latest/demo-08/
Happy Streaming!
So I've been battling with an issue for a while now: I've been trying to get the audio playing through a device's speakers using the MediaDevices singleton object in Javascript.
I'm trying to achieve something similar to Shazam or Snapchat, where they can capture both the audio playing from the device's speakers and the audio from the microphone, so that I could, for instance, record a video with my microphone and camera while also capturing the sound playing in the background.
I'd like to know if this is possible, because I tried using devices with a kind of "audiooutput" and a deviceId of "default" (accessible through the enumerateDevices method), assuming that was the device's output speaker, but I still get the audio from the microphone even when using the "audiooutput" device kind.
Note: I'm able to combine multiple audio nodes successfully.
I'm not asking for a do-it-for-me answer, just a theory of how to go about implementing it, if it's possible in Javascript.
Thanks in advance :D
No, this isn't possible via the Web Audio API. Not all platforms even support this capability.
On some platforms, you can use getDisplayMedia() and get audio with it, but compatibility isn't great right now.
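If you want to experiment on the platforms that do support it, a rough sketch (assuming a browser where getDisplayMedia() can capture system or tab audio, such as Chrome) looks like this:

async function captureSystemAudio() {
  // The spec requires requesting video; audio capture is best-effort
  const stream = await navigator.mediaDevices.getDisplayMedia({
    video: true,
    audio: true
  });
  const audioTracks = stream.getAudioTracks();
  if (audioTracks.length === 0) {
    throw new Error("This browser/platform did not grant system audio");
  }
  // Return only the audio, e.g. to feed into an AudioContext
  return new MediaStream(audioTracks);
}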
I've been doing Javascript programming for some time, but it's always been related to updating, saving, and manipulating data.
I have no idea how something like an in-browser audio player gets audio (especially live, streaming audio) from the internet and plays it out of my computer speakers.
How does this happen in Javascript?
For example, how does a website deliver live audio to my speakers using Javascript? http://player.streamtheworld.com/liveplayer.php?callsign=WVIEAM
The live audio is not much different from pre-recorded audio... it's just played back as it's received, and when live it's encoded as it's recorded.
In browsers these days, the most basic form of streaming audio is a simple <audio> tag. By changing the src attribute from a file to a stream, you're up and running:
<audio src="http://cdn.audiopump.co/waug/main_mp3_256k" controls></audio>
The browser doesn't know or care in this case that the audio is a live stream. All it knows is that there's some media data that it's fetching via HTTP, and playing back while it comes in.
If your browser compatibility is good, it is preferable to use the MediaSource API, which gives you more control (such as switching to a different quality stream mid-stream, as in HLS) and ensures that the browser doesn't try to cache what is effectively an infinitely sized file.
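A bare-bones sketch of that approach, assuming a browser with MediaSource support; the stream URL is a placeholder, and the codec string must be one the browser accepts (check MediaSource.isTypeSupported() first):

const audio = document.querySelector("audio");
const mediaSource = new MediaSource();
audio.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", async () => {
  // The codec string must be one the browser supports
  const sourceBuffer = mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');
  const response = await fetch("https://example.com/stream"); // placeholder URL
  const reader = response.body.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // Wait for each append to finish before appending the next chunk
    await new Promise((resolve) => {
      sourceBuffer.addEventListener("updateend", resolve, { once: true });
      sourceBuffer.appendBuffer(value);
    });
  }
});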
For example, how does a website deliver live audio to my speakers using Javascript? http://player.streamtheworld.com/liveplayer.php?callsign=WVIEAM
This particular site is run by Triton Digital, and they still use Flash. Many sites still do this as a holdover from a time when HTML5 audio was not widely supported, but there is little reason to do it today.
Other reasons to use Flash include incompatible server protocols. If your streaming server is using RTMP, you're stuck with Flash as browsers don't speak RTMP.
There used to be an issue with streaming AAC in-browser due to browsers not properly handling AAC wrapped in ADTS. (This encapsulation is required for streaming AAC in most situations.) Most browsers have resolved this, but I suspect that this is the reason Triton Digital is still using their Flash solution. By using Flash, they can play AAC/ADTS streams.
So far I have found Red5, but I can't get it to run (no video arrives at the server side), so I was looking for a Flash-based getUserMedia and found https://github.com/addyosmani/getUserMedia.js. But how do I get the video to the server? IE10 doesn't support WebRTC, which http://lynckia.com/licode/ is built on.
getUserMedia is a WebRTC API. It doesn't exist on IE10.
Your alternatives are:
Go with a Flash-based solution (Red5, Wowza, etc.)
Use a plugin for WebRTC on IE (check out this one: https://bloggeek.me/temasys-free-webrtc-plugin/)
Use Ziggeo (they should be able to use WebRTC or Flash automatically for you, taking care of all relevant transcoding and format changes necessary to play back the recorded stream). CameraTag (virtually the same at first glance) was also suggested in another answer, and there are likely more.
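Whichever route you take, you would typically feature-detect first and only fall back when getUserMedia is missing; a rough sketch (using the vendor prefixes of that era) looks like this:

// Normalize the vendor-prefixed variants of that era
navigator.getUserMedia = navigator.getUserMedia ||
  navigator.webkitGetUserMedia ||
  navigator.mozGetUserMedia;

if (navigator.getUserMedia) {
  navigator.getUserMedia(
    { video: true, audio: true },
    function (stream) {
      // Got a MediaStream: attach it to a <video> or send it over WebRTC
    },
    function (err) {
      console.error("getUserMedia failed:", err);
    }
  );
} else {
  // IE10 path: fall back to a Flash-based recorder or a WebRTC plugin
}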
I would like to make a web application where people can add recorded sounds and samples to a timeline.
I want it to output one sound file (approximately 3 minutes long), which will be sent to the server.
Now, I would like to do this with the HTML5 Audio API and found out that I could do this with an AudioContext, but AudioContext is only supported in Chrome.
I dislike Flash, so I wanted to ask if there is any way to do this with HTML5 with decent browser support (the newest Chrome, IE, and Firefox).
Some thoughts:
I could record audio using the HTML5 audio API, and the user can add it to the timeline. When finished, the client uploads all the audio files to the server, along with their positions on the timeline, and the server combines them into one file. This is a solution, but I would prefer to do this kind of work on the client side, along the lines of the sketch below.
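For the client-side route, here is a rough sketch of what I have in mind, using an OfflineAudioContext to render the timeline into a single buffer (assuming the clips are already decoded into AudioBuffers; all names are placeholders):

// clips: array of { buffer: AudioBuffer, startTime: seconds on the timeline }
function mixTimeline(clips, sampleRate) {
  // 2 channels, 3 minutes of output, rendered faster than real time
  var ctx = new OfflineAudioContext(2, sampleRate * 180, sampleRate);
  clips.forEach(function (clip) {
    var src = ctx.createBufferSource();
    src.buffer = clip.buffer;
    src.connect(ctx.destination);
    src.start(clip.startTime); // schedule the clip at its timeline position
  });
  // Resolves with the mixed AudioBuffer (older Chrome uses the oncomplete event)
  return ctx.startRendering();
}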
At the moment, in my opinion, it is not possible (if it has to be HTML5 and supported by IE and Firefox); see this list of browsers with Audio API support. But this information could already be outdated (these browsers update so frequently).
You could wait: serve only Chrome first and hope the other browsers catch up (IE might be a problem). Or you could use Java (if you don't like Flash). The other technology out there is Silverlight, but it is "dead", so I wouldn't recommend it.
I hope my input helps a bit.
I need to play videos from Youtube as MP4.
My system (an Android Phone) does not support Flash.
It can play MP4 videos.
So I'm thinking of creating a web page that plays videos from Youtube.
I need to get MP4 videos to play in my page.
What approaches are there?
Take a look at this related question, where the OP has figured out a hacky way to get the necessary ID and checksum parameters for the get_video "API".
Streaming Youtube Videos
He successfully played back the stream in the VideoView component.
You can use the fmt parameter to choose which format you want; see Wikipedia for a list: http://en.wikipedia.org/wiki/YouTube#Quality_and_codecs
Good luck.
Did you actually try viewing videos? I have the Droid Eris, which shipped with a YouTube app. Try going to the mobile version of YouTube and see if that works out.
Google created Android. Google owns YouTube. Transitivity says you shouldn't have any problems.
Why not consider using the mobile YouTube interface?
YouTube provides alternative RTSP (Real Time Streaming Protocol) links for existing videos; dig into the YouTube API and try to get them.
Once you have a link, you can try playing the video by invoking the built-in player.
Cheers.