howler.js play sound at a specific position - javascript

Hi, I would like to start playing a sound at a specific position. The API says I can use .pos, but it doesn't start where I would like it to start. The sound lasts 13 seconds.
My HTML:
<h1>Audio</h1>
<h3>2:00</h3>
<h3>3:00</h3>
<h3>4:00</h3>
<h3>10:00</h3>
My JavaScript:
var son = [false, "0000"];
sound = new Howl({
  urls: ['http://goldfirestudios.com/proj/howlerjs/sound.mp3'],
  autoplay: false
});

function playSequence(events, valeur) {
  // if there is no sound playing, play the track at a certain position
  var playSound = function(valeur) {
    if (son[0] == true) {
      sound.stop();
      son[0] = false;
    } else {
      son[0] = true;
      sound.pos = parseInt(valeur[0]); // position at 2 sec
      sound.play();
    }
  };
  playSound(valeur);
}

// play the sound on click
$("a.val2").on('click', function(events) {
  valeur = $(this).text();
  playSequence(events, valeur);
});

pos is a method, not a property. For best compatibility in 1.1.x, you would want to play the sound and set the position in the play callback, as follows (this is fixed in the 2.0 beta branch):
sound.play(function(id) {
  sound.pos(valeur[0], id);
});
Here's how you would do it in 2.0 (check the 2.0 branch at https://github.com/goldfire/howler.js):
var id = sound.play();
sound.seek(valeur[0], id);
If you are only playing one sound, there is no need to pass the id, but when using 1.1.x it is best practice to change the position after playing, as shown above.
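Put together, here is a minimal sketch of the question's click handler rewritten against the 2.0 API (assuming the same son flag and a.val2 links from the question; note that 2.0 renames urls to src):
var son = [false, "0000"];
var sound = new Howl({
  src: ['http://goldfirestudios.com/proj/howlerjs/sound.mp3'] // 2.0 uses `src`, not `urls`
});

$("a.val2").on('click', function() {
  if (son[0]) {
    sound.stop();
    son[0] = false;
  } else {
    son[0] = true;
    var valeur = $(this).text();      // e.g. "2:00"
    var id = sound.play();            // play() returns the sound id in 2.0
    sound.seek(parseInt(valeur), id); // jump to the requested second
  }
});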

I was having issues playing a sound at a specific time as well: it worked on the first play of a sound, but if I tried to go back and play that sound again, calling seek immediately wouldn't work.
I just figured out a way to consistently make it work using howler v2:
function playSoundAtPosition(sound, pos) {
  sound.once("play", () => {
    sound.seek(pos);
  });
  sound.play();
}
This makes use of the once method, which registers a callback for the play event and then immediately removes itself after firing. I believe this helps because it waits until it actually knows the sound is playing before seeking to the desired time.
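For instance (hypothetical positions), repeated plays now seek reliably:
playSoundAtPosition(sound, 2);  // starts 2 seconds in
// later, after the sound has been stopped or has ended:
playSoundAtPosition(sound, 10); // also starts where requested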
I know this is an old question but hopefully it's useful to someone if they come across this!

You can use Sprites!
var sound = new Howl({
  src: ['sounds.webm', 'sounds.mp3'],
  sprite: {
    blast: [0, 3000],
    laser: [4000, 1000],
    winner: [6000, 5000]
  }
});
// Shoot the laser!
sound.play('laser');
:)
I found it at this link: How to Create and use Sprites
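Applied to this question, a hypothetical sprite map for the 13-second file (segment names and durations are mine; each entry is [offset ms, duration ms]) might look like:
var sound = new Howl({
  src: ['http://goldfirestudios.com/proj/howlerjs/sound.mp3'],
  sprite: {
    fromTwo: [2000, 11000],  // start at 0:02, play the remaining 11 s
    fromFour: [4000, 9000],  // start at 0:04, play the remaining 9 s
    fromTen: [10000, 3000]   // start at 0:10, play the remaining 3 s
  }
});
sound.play('fromTwo');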

Related

Javascript play audio when loaded

I have code for a volume slider. It only plays sound once you have changed the volume after the slider loads. For example, if the default value of the slider is 50, it won't play sound right after loading the audio file; you need to move the value higher or lower before any sound comes out. What can I do in JavaScript so it starts playing the sound as soon as it is loaded?
Full code: https://www.w3schools.com/code/tryit.asp?filename=FQZVZUX36JFD
JS:
$("#volume").slider({
min: 0,
max: 100,
value: 50,
range: "min",
slide: function(event, ui) {
setVolume(ui.value / 100);
}
});
var myMedia = document.createElement('audio');
$('#player').append(myMedia);
myMedia.id = "myMedia";
playAudio('http://listen.shoutcast.com/newfm64aac-', 0);
function playAudio(fileName, myVolume) {
myMedia.src = fileName;
myMedia.setAttribute('loop', 'loop');
setVolume(myVolume);
myMedia.play();
}
function setVolume(myVolume) {
var myMedia = document.getElementById('myMedia');
myMedia.volume = myVolume;
}
Google recently released a Chrome update that prevents initial playing or autoplaying of media elements (note: not the issue in this case, but a potential cause in similar ones).
Here, you're setting the initial volume to zero with this call:
playAudio('http://listen.shoutcast.com/newfm64aac-', 0);
To have it start at the slider's default of 50, pass 0.5 instead; volume is a 0-1 value, matching the ui.value / 100 conversion in the slide handler:
playAudio('http://listen.shoutcast.com/newfm64aac-', 0.5);
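If Chrome's autoplay policy does block that initial play() call, a common workaround (a minimal sketch, assuming a modern browser where media .play() returns a Promise) is to retry on the first user gesture:
myMedia.play().catch(function() {
  // autoplay was blocked; wait for the first interaction, then retry
  document.addEventListener('click', function startOnGesture() {
    myMedia.play();
    document.removeEventListener('click', startOnGesture);
  });
});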

HTML5 audio streaming: precisely measure latency?

I'm building a cross-platform web app where audio is generated on-the-fly on the server and live streamed to a browser client, probably via the HTML5 audio element. On the browser, I'll have Javascript-driven animations that must precisely sync with the played audio. "Precise" means that the audio and animation must be within a second of each other, and hopefully within 250ms (think lip-syncing). For various reasons, I can't do the audio and animation on the server and live-stream the resulting video.
Ideally, there would be little or no latency between the audio generation on the server and the audio playback on the browser, but my understanding is that latency will be difficult to control and probably in the 3-7 second range (browser-, environment-, network- and phase-of-the-moon-dependent). I can handle that, though, if I can precisely measure the actual latency on-the-fly so that my browser Javascript knows when to present the proper animated frame.
So, I need to precisely measure the latency between my handing audio to the streaming server (Icecast?) and the audio coming out of the speakers on the computer running the browser. Some blue-sky possibilities:
Add metadata to the audio stream, and parse it from the playing audio (I understand this isn't possible using the standard audio element)
Add brief periods of pure silence to the audio, and then detect them on the browser (can audio elements yield the actual audio samples?)
Query the server and the browser as to the various buffer depths
Decode the streamed audio in Javascript and then grab the metadata
Any thoughts as to how I could do this?
Utilize the timeupdate event of the <audio> element, which is fired three to four times per second, to perform precise animations during streaming of media by checking the .currentTime of the <audio> element. Animations or transitions can then be started or stopped up to several times per second.
If the browser supports it, you can use fetch() to request the audio resource, return response.body.getReader() from .then() to get a ReadableStream of the resource, create a new MediaSource object, and set the .src of an <audio> element or new Audio() to an object URL of the MediaSource. Append the first chunk of the stream to a sourceBuffer of the MediaSource (with .mode set to "sequence") inside the .then() chained to .read(), and append the remaining chunks to the sourceBuffer on the sourceBuffer's updateend events.
If fetch() and response.body.getReader() are not available in the browser, you can still use the timeupdate or progress events of the <audio> element to check .currentTime and start or stop animations or transitions at the required second of the streaming media's playback.
Use the canplay event of the <audio> element to start playing the media once the stream has accumulated adequate buffers in the MediaSource to proceed with playback.
To perform precise animations, you can use an object whose keys are numbers corresponding to the .currentTime values of the <audio> element at which an animation should occur, and whose values are the CSS property values to apply to the element being animated.
In the JavaScript below, animations occur every twenty seconds, beginning at 0, and every sixty seconds until the media playback has concluded.
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
  <meta charset="utf-8" />
  <title></title>
  <style>
    body {
      width: 90vw;
      height: 90vh;
      background: #000;
      transition: background 1s;
    }
    span {
      font-family: Georgia;
      font-size: 36px;
      opacity: 0;
    }
  </style>
</head>
<body>
  <audio controls></audio>
  <br>
  <span></span>
  <script type="text/javascript">
    window.onload = function() {
      var url = "/path/to/audio";
      // given 240 seconds total duration of audio
      // 240 / 12 = 20
      // keys correspond to `<audio>` `.currentTime`,
      // values correspond to the color to set on the element
      var colors = {
        0: "red",
        20: "blue",
        40: "green",
        60: "yellow",
        80: "orange",
        100: "purple",
        120: "violet",
        140: "brown",
        160: "tan",
        180: "gold",
        200: "sienna",
        220: "skyblue"
      };
      var body = document.querySelector("body");
      var mediaSource = new MediaSource();
      var audio = document.querySelector("audio");
      var span = document.querySelector("span");
      var color = window.getComputedStyle(body)
        .getPropertyValue("background-color");
      // console.log(mediaSource.readyState); // closed
      var mimecodec = "audio/mpeg";

      audio.oncanplay = function() {
        this.play();
      };

      audio.ontimeupdate = function() {
        // 240 / 12 = 20
        var curr = Math.round(this.currentTime);
        if (colors.hasOwnProperty(curr)) {
          // set `color` to `colors[curr]`
          color = colors[curr];
        }
        // animate `<span>` every 60 seconds
        if (curr % 60 === 0 && span.innerHTML === "") {
          var t = curr / 60;
          span.innerHTML = t + " minute" + (t === 1 ? "" : "s")
            + " of " + Math.round(this.duration) / 60
            + " minutes of audio";
          span.animate([{
            opacity: 0
          }, {
            opacity: 1
          }, {
            opacity: 0
          }], {
            duration: 2500,
            iterations: 1
          })
          .onfinish = function() {
            span.innerHTML = "";
          };
        }
        // change `background-color` of `body` every 20 seconds
        body.style.backgroundColor = color;
        console.log("current time:", curr
          , "current background color:", color
          , "duration:", this.duration);
      };

      // set `<audio>` `.src` to `mediaSource`
      audio.src = URL.createObjectURL(mediaSource);
      mediaSource.addEventListener("sourceopen", sourceOpen);

      function sourceOpen(event) {
        // if the media type is supported by `mediaSource`,
        // fetch the resource, begin reading the stream,
        // and append the stream chunks to `sourceBuffer`
        if (MediaSource.isTypeSupported(mimecodec)) {
          var sourceBuffer = mediaSource.addSourceBuffer(mimecodec);
          // set `sourceBuffer` `.mode` to `"sequence"`
          sourceBuffer.mode = "sequence";
          fetch(url)
            // return the `ReadableStream` reader of `response`
            .then(response => response.body.getReader())
            .then(reader => {
              var processStream = (data) => {
                if (data.done) {
                  return;
                }
                // append chunk of stream to `sourceBuffer`
                sourceBuffer.appendBuffer(data.value);
              };
              // at `sourceBuffer` `updateend`, call `reader.read()`
              // to read the next chunk of the stream and append it
              // to `sourceBuffer`
              sourceBuffer.addEventListener("updateend", function() {
                reader.read().then(processStream);
              });
              // start processing the stream
              reader.read().then(processStream);
              // when `reader` is closed, the read of the stream
              // is complete
              return reader.closed.then(() => {
                // signal end of stream to `mediaSource`
                mediaSource.endOfStream();
                return mediaSource.readyState;
              });
            })
            // do stuff when `reader.closed` resolves and the
            // `mediaSource` stream has ended
            .then(msg => console.log(msg));
        }
        // if `mimecodec` is not supported by `MediaSource`
        else {
          alert(mimecodec + " not supported");
        }
      }
    }
  </script>
</body>
</html>
plnkr http://plnkr.co/edit/fIm1Qp?p=preview
There is no way for you to measure the latency directly, but any audio element generates events like 'playing' when it has just started playing (fired quite often), 'stalled' when streaming stops, and 'waiting' while data is loading. So what you can do is manipulate your animation based on these events: pause it while stalled or waiting fires, then resume it once playing fires again.
I'd also advise you to check the other events that might affect your flow (error, for example, would be important for you), as in the sketch below.
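A minimal sketch of that event-driven approach (pauseAnimation and resumeAnimation are hypothetical stand-ins for your animation code):
var audio = document.querySelector("audio");

// hypothetical animation controls
function pauseAnimation() { /* freeze on the current frame */ }
function resumeAnimation(t) { /* re-sync to audio time t and continue */ }

audio.addEventListener("stalled", pauseAnimation);
audio.addEventListener("waiting", pauseAnimation);
audio.addEventListener("playing", function() {
  resumeAnimation(audio.currentTime);
});
audio.addEventListener("error", function(e) {
  console.error("audio element error", e); // handle failures explicitly
});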
https://developer.mozilla.org/en-US/docs/Web/API/HTMLAudioElement
What I would try first is to create a timestamp with performance.now(), process the data, and record it into a Blob with the new MediaRecorder API.
The recorder will ask the user for access to their audio input; that can be a problem for your app, but it looks mandatory for measuring the real latency.
Once that is done, there are many ways to measure the actual latency between generation and actual rendering; basically, you listen for a known sound event.
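A rough sketch of the timing side (getUserMedia and MediaRecorder are the real APIs; detecting the marker sound in the recorded chunk is left as a stub):
var sentAt = performance.now(); // when the known marker sound was handed to the stream

navigator.mediaDevices.getUserMedia({ audio: true }).then(function(micStream) {
  var recorder = new MediaRecorder(micStream);
  recorder.ondataavailable = function(event) {
    // event.data is a Blob of captured audio; scanning it for the marker
    // (e.g. a tone burst after silence) is left as an exercise
    var heardAt = performance.now();
    console.log("latency upper bound (ms):", heardAt - sentAt);
  };
  recorder.start(250); // deliver a chunk every 250 ms
});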
For further reference and example:
Recorder demo
https://github.com/mdn/web-dictaphone/
https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder_API/Using_the_MediaRecorder_API

New state with Phaser

I'm trying to switch state with a collision, so when the player hits another sprite it should switch state, but it doesn't.
First, I'm declaring the player and enterDoor sprites in create:
playerSprite = this.game.add.sprite(50, 1700, 'player-front');
player = new Player(playerSprite);
this.game.physics.enable(player, Phaser.Physics.ARCADE);
enterDoor = this.game.add.sprite(332, 830, 'player-back');
playerDoor = new Player(enterDoor);
this.game.physics.enable(playerDoor, Phaser.Physics.ARCADE);
Then I'm checking for the overlap in update:
this.game.physics.arcade.overlap(player, playerDoor, this.enterHouse, null, this);
And enterHouse is another function:
enterHouse: function() {
  this.state.start('Menu');
}
What am I doin' wrong?
So with the code above I wasn't able to get the overlap to trigger at all. After disabling moves on the player sprite's body, the overlap was triggered.
player.body.moves = false;
Your enterHouse function doesn't need to accept the two sprites, and can be left as-is.
What I don't know is why this is necessary.
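For reference, a minimal sketch of the state with that one-line fix in place (the sprite keys and Player wrapper are from the question):
create: function() {
  playerSprite = this.game.add.sprite(50, 1700, 'player-front');
  player = new Player(playerSprite);
  this.game.physics.enable(player, Phaser.Physics.ARCADE);
  player.body.moves = false; // the fix; why it's needed is unclear

  enterDoor = this.game.add.sprite(332, 830, 'player-back');
  playerDoor = new Player(enterDoor);
  this.game.physics.enable(playerDoor, Phaser.Physics.ARCADE);
},

update: function() {
  this.game.physics.arcade.overlap(player, playerDoor, this.enterHouse, null, this);
},

enterHouse: function() {
  this.state.start('Menu');
}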

Cordova 3.4.0, how to Media.setVolume()?

I've added sounds to my Cordova app (3.4.0), but I can't set the volume unless I play the sound.
Here is the code that makes it work:
var myMedia = new Media("file:///android_asset/www/sounds/button.mp3");
myMedia.play();
myMedia.stop();
myMedia.setVolume("0.2");
I also tried the code below, but it doesn't work either:
var myMedia = new Media("file:///android_asset/www/sounds/button.mp3", function() {
  this.setVolume("0.5");
});
Is there a cleaner way to do this?
I set the volume to 0 before playing, then incrementally increase it to the required volume. But you shouldn't need to bother with the fade-in. Something like this should work:
mediaPlayer = new Media(localSoundsPath + sound_file_name);
mediaPlayer.setVolume(0);
mediaPlayer.play();
mediaPlayer.setVolume(0.2);
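On why the question's second attempt fails: the Media plugin's second constructor argument is a success callback that runs after a play/record/stop action completes, not at creation time, and this inside it is not bound to the Media instance. A sketch wrapping the working pattern in a helper (the function name is mine):
function playAtVolume(src, volume) {
  var media = new Media(src, null, function(err) {
    console.log("media error: " + err.code);
  });
  media.setVolume(0);      // avoid a full-volume blip before the next call takes effect
  media.play();
  media.setVolume(volume); // now the volume change applies
  return media;
}

var button = playAtVolume("file:///android_asset/www/sounds/button.mp3", 0.2);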

progress bar for playing videos

I'm trying to add a progress bar that shows how far a video is into its duration, but one that can be placed outside the video itself (like on another part of the screen). I've been looking around for some time and have only found ones that show the loading progress, which is not what I need. Can anyone point me to one, or even supply one themselves? Just in case it's needed, here's the script for the video:
var numb = $(this).index(),
    videos = ['images/talking1.m4v', 'images/talking2.m4v', 'images/talking1.m4v', 'images/talking2.m4v', 'images/talking1.m4v', 'images/talking2.m4v'],
    myVideo = document.getElementById('myVid');

myVideo.src = videos[numb];
myVideo.load();
setTimeout(function() {
  myVideo.play();
}, 200);
You could bind an event listener to the timeupdate event:
myVideo.addEventListener("timeupdate", function() {
  // if the video is loaded and duration is known
  if (!isNaN(this.duration)) {
    var percent_complete = this.currentTime / this.duration;
    // use percent_complete to draw a progress bar
  }
});
Pick a max length for your progress bar, multiply it by percent_complete (which is between 0 and 1), and use that product as the current length of the bar.
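For example, driving a plain <div> placed anywhere on the page (a sketch; the #progress-bar element and the 300 px maximum width are assumptions):
var bar = document.getElementById('progress-bar'); // e.g. <div id="progress-bar"></div> styled with a fixed height and background
var MAX_WIDTH = 300; // px, the bar's full length

myVideo.addEventListener("timeupdate", function() {
  if (!isNaN(this.duration)) {
    var percent_complete = this.currentTime / this.duration; // 0 to 1
    bar.style.width = (percent_complete * MAX_WIDTH) + "px";
  }
});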
