Setup web audio api source node from soundcloud - javascript

I would like to know if there is any way to create a source node (https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#MediaElementAudioSourceNode) from a SoundCloud track.
I'm OK with the Web Audio API, but new to the SoundCloud SDK; as far as I understand, it relies on SoundManager 2. So maybe there are some options available from SoundManager 2?
Regards

You can request a track and then use its stream_url property as the src of an audio element, which can in turn be wrapped in a MediaElementAudioSourceNode.
Here's some example code:
var context = new webkitAudioContext(),
    audio = new Audio(),
    source,
    // `stream_url` you'd get from
    // requesting http://api.soundcloud.com/tracks/6981096.json
    url = 'http://api.soundcloud.com/tracks/6981096/stream' +
          '?client_id=YOUR_CLIENT_ID';

audio.src = url;
source = context.createMediaElementSource(audio);
source.connect(context.destination);
source.mediaElement.play();
Here's the example live: http://jsbin.com/ikixot/1/edit

Related

javascript new audioContext mono to Stereo not working

I need to convert the mp3 files I play in a project from mono to stereo with the Web Audio API, but I can't do that when the sound is created with a1 = new Audio('/1.mp3');. My entire system is built around this approach. Is there a way to convert a sound created with new Audio('/1.mp3'); to stereo, or to convert all the sounds playing on the page to stereo?
var a1 = new Audio(`/1.mp3`);
a1.volume = .5;
a1.play()
I am using a simple code structure as above.
https://stackoverflow.com/a/54990085/15929287
I couldn't adapt the above answer to my case. I have found no way to convert the sound I created with new Audio() to stereo. In the example in the link, an oscillator is used as the source. I'm just trying to do something where I can adjust the mono/stereo setting. I need your help.
The linked answer assumes the mono audio source is an oscillator created with the Web Audio API, but it is also possible to use an audio element as the source by wrapping it in a MediaElementAudioSourceNode.
The audio graph would then need to be wired up like this:
// Wrap the audio element so it can feed the Web Audio graph
const audio = new Audio('/1.mp3');
const audioCtx = new AudioContext();
const source = audioCtx.createMediaElementSource(audio);

// One gain node per output channel, merged back into a stereo signal
const gainNodeL = audioCtx.createGain();
const gainNodeR = audioCtx.createGain();
const merger = audioCtx.createChannelMerger(2);

source.connect(gainNodeL);
source.connect(gainNodeR);
gainNodeL.connect(merger, 0, 0); // left channel
gainNodeR.connect(merger, 0, 1); // right channel
merger.connect(audioCtx.destination);
Please note that it's probably still necessary to call resume() on the AudioContext in response to a user gesture to make sure the AudioContext is running.
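As a minimal sketch of doing that (the button id here is made up purely for illustration; any user gesture handler will do):
document.querySelector('#play-button').addEventListener('click', () => {
    // Hypothetical play button: resume the context on the user gesture
    if (audioCtx.state === 'suspended') {
        audioCtx.resume();
    }
    audio.play();
});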

Javascript - Streaming Audio On The Fly (Web Audio API & XHR)

I have a simple XMLHttpRequest that fetches an audio file; when the fetch finishes, it decodes the audio and plays it.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/some url/', true);
xhr.responseType = 'arraybuffer';
xhr.onload = function() {
    decode(xhr.response);
}.bind(this);
xhr.send(null);
The problem with this, however, is that the file is only decoded after the request has finished downloading. Is there an approach for streaming the audio without having to wait for it to finish downloading, and without using <audio> tags?
You still need an HTML5 Audio object, but instead of using it directly you can wrap it in a MediaElementAudioSourceNode to take advantage of the Web Audio API.
Excerpt from here
Rather than going the usual path of loading a sound directly by issuing an XMLHttpRequest and then decoding the buffer, you can use the media stream audio source node (MediaElementAudioSourceNode) to create nodes that behave much like audio source nodes (AudioSourceNode), but wrap an existing <audio> tag. Once we have this node connected to our audio graph, we can use our knowledge of the Web Audio API to do great things. This small example applies a low-pass filter to the <audio> tag:
Sample Code:
window.addEventListener('load', onLoad, false);

var context = new webkitAudioContext();

function onLoad() {
    var audio = new Audio();
    var source = context.createMediaElementSource(audio);
    var filter = context.createBiquadFilter();
    filter.type = 'lowpass'; // older code used the filter.LOWPASS constant
    filter.frequency.value = 440;
    // Connect the audio graph: audio element -> filter -> speakers
    source.connect(filter);
    filter.connect(context.destination);
    audio.src = 'http://example.com/the.mp3';
    audio.play();
}

How to play HLS stream (or other video stream) obtained from WebRTC?

There are JavaScript libraries for playing HLS streams natively in browsers, for example https://github.com/dailymotion/hls.js
The example usage from the documentation looks like this:
<script src="dist/hls.js"></script>
<video id="video"></video>
<script>
  if (Hls.isSupported()) {
    var video = document.getElementById('video');
    var hls = new Hls();
    hls.loadSource('http://www.streambox.fr/playlists/test_001/stream.m3u8');
    hls.attachMedia(video);
    hls.on(Hls.Events.MANIFEST_PARSED, function() {
      video.play();
    });
  }
</script>
What I would like to do is replace the URL (http://www.streambox.fr/playlists/test_001/stream.m3u8) with a Blob or ArrayBuffer received over an RTCDataChannel.
So imagine I'm creating a video stream on the fly in the browser (the data for this stream IS NOT a video stream created using getUserMedia, but data obtained from other peers over an RTCDataChannel); can I play it back immediately as the data is written to the buffer?
If you want to take an incoming stream and 'feed' it into the browser's HTML5 video player, you can use the Media Source Extensions mechanism (MSE). This will allow you to play it back immediately, as I think you want.
The MSE spec is available here online: http://w3c.github.io/media-source/
The following link provides a good overview/intro:
http://updates.html5rocks.com/2011/11/Stream-video-using-the-MediaSource-API
An outline example for your case:
.
.
.
<div>
  <video id="myStreamedVideo" width="320" height="240"></video>
</div>
.
.
.
The JavaScript pseudocode - it won't run like this, obviously, but it should give the general idea:
// Get the video element
var videoElement = document.getElementById('myStreamedVideo');

// Create a 'MediaSource' and associate it with this video
var mediaSource = new MediaSource();
videoElement.src = window.URL.createObjectURL(mediaSource);

// Add a listener to the MediaSource object to check for
// the source being opened. In this function you can add your
// code to get the received video chunks from the socket
mediaSource.addEventListener('sourceopen', function(e) {
    // Set the format of the source video
    var mediaSourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vorbis,vp8"');

    // Get your video from the web
    while (not the end of your video) {
        ...
        // Receive some video packets from the web socket
        ...
        // Add the received packets to the media source buffer
        mediaSourceBuffer.appendBuffer(receivedVideoPackets);

        // You can also add the received data to a file here if wanted.
    }
});
One thing to note: MSE is relatively recent, and while it is now supported by the latest versions of all major browsers (I think), it is still a new feature and not everyone may have it, so it is worth checking first that the user's browser supports it. A good, up-to-date summary of current support is here:
http://www.jwplayer.com/html5/mediasource/
And the code to check whether it is supported (https://developer.mozilla.org/en-US/docs/Web/API/MediaSource#Browser_compatibility):
// e.g. mimeCodec = 'video/webm; codecs="vorbis,vp8"'
if ('MediaSource' in window && MediaSource.isTypeSupported(mimeCodec)) {
    // Safe to use Media Source Extensions here
}

Connecting MediaElementAudioSourceNode to AudioContext.destination doesn't work

Here's a fiddle to show the problem. Basically, whenever the createMediaElementSource method of an AudioContext object is called, the output of the audio element is re-routed into the returned MediaElementAudioSourceNode. This is all fine and according to spec; however, when I then try to reconnect the output to the speakers (using the destination of the AudioContext), nothing happens.
Am I missing something obvious here? Maybe it has to do with cross-domain audio files? I just couldn't find any information on the topic on Google, and didn't see a note of it in the specs.
Code from the fiddle is:
var a = new Audio();
a.src = "http://webaudioapi.com/samples/audio-tag/chrono.mp3";
a.controls = true;
a.loop = true;
a.autoplay = true;
document.body.appendChild(a);
var ctx = new AudioContext();
// PROBLEM HERE
var shouldBreak = true;
var src;
if (shouldBreak) {
    // this one stops playback
    // it should redirect output from the audio element to the MediaElementAudioSourceNode
    // but src.connect(ctx.destination) does not fix it
    src = ctx.createMediaElementSource(a);
    src.connect(ctx.destination);
}
Yes, the Web Audio API requires that the audio adhere to the Same-Origin Policy. If the audio you're attempting to play is not from the same origin then the appropriate Access-Control headers are required. The resource in your example does not have the required headers.
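As a sketch, assuming the file is served with an Access-Control-Allow-Origin header that permits your origin, you would also mark the element as CORS-enabled before creating the source node:
var a = new Audio();
a.crossOrigin = "anonymous"; // request the file with CORS
a.src = "http://webaudioapi.com/samples/audio-tag/chrono.mp3";

var ctx = new AudioContext();
var src = ctx.createMediaElementSource(a);
// Works only if the response carries e.g. Access-Control-Allow-Origin: *
src.connect(ctx.destination);
a.play();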

Web Audio API MediaElementSource node and SoundCloud not working with effects?

I am using a SoundCloud URL as audio.src. It only plays the unprocessed version when I run it through the delay chain I have.
Here is the fiddle:
http://jsfiddle.net/ehsanziya/nwaH3/
var context = new webkitAudioContext();
var audio = new Audio(); // creates an HTML5 Audio element
var url = 'http://api.soundcloud.com/tracks/33925813/stream' +
          '?client_id=c625af85886c1a833e8fe3d740af753c';
// wraps the SoundCloud stream in the audio element
audio.src = url;
var source = context.createMediaElementSource(audio);

var input = context.createGainNode();
var output = context.createGainNode();
var fb = context.createGainNode();
fb.gain.value = 0.4;
var delay = context.createDelayNode();
delay.delayTime.value = 0.5;

// dry
source.connect(input);
input.connect(output);

// wet
input.connect(delay);
delay.connect(fb);
fb.connect(delay);
delay.connect(output);

source.mediaElement.play();
The chain works with an oscillator node.
What is the reason for this?
And is there any other way of processing streaming sound from SoundCloud with the Web Audio API?
You need to wait for the canplaythrough event on your audio element to fire before you can use it with createMediaElementSource.
So just add the event listener, and wait until the callback fires before you assign source = context.createMediaElementSource(audio); and make all of your connections.
Here's an updated jsFiddle that'll do what you want: http://jsfiddle.net/nwaH3/3/
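As a rough sketch of that approach, using the same stream URL as in the question (a simplified chain, not necessarily identical to the fiddle):
var context = new webkitAudioContext();
var audio = new Audio();
audio.src = 'http://api.soundcloud.com/tracks/33925813/stream' +
            '?client_id=c625af85886c1a833e8fe3d740af753c';

audio.addEventListener('canplaythrough', function () {
    // Only create the source node and wire the graph once the stream is ready
    var source = context.createMediaElementSource(audio);
    var delay = context.createDelayNode();
    delay.delayTime.value = 0.5;

    source.connect(context.destination); // dry
    source.connect(delay);               // wet
    delay.connect(context.destination);

    source.mediaElement.play();
}, false);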
