Sometimes you want to play an audio file (for example, a notification sound on a button click) for a fixed duration, say 2 or 5 seconds, regardless of how long or short the file itself is.
First, create a DOM Audio object by passing it the path to the audio file.
Then set loop to true, call play(), and finally call pause() after the desired time with the help of setTimeout().
function play(audio_path, time_in_millisec) {
  let beep = new Audio(audio_path);
  beep.loop = true;
  beep.play();
  setTimeout(() => { beep.pause(); }, time_in_millisec);
}
play('beep.mp3', 2000);
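One caveat: in modern browsers play() returns a promise that can reject under autoplay policies (for example, when there has been no prior user gesture), so it is worth handling. A sketch of a variant that does this, with the Audio object passed in so the timing logic stands alone:

```javascript
// Variant of the helper above that handles autoplay-policy
// rejections and returns the timer id so the caller can cancel.
function playFor(audio, time_in_millisec) {
  audio.loop = true;
  // play() returns a promise in modern browsers; it rejects if
  // the browser blocks autoplay. Promise.resolve() also covers
  // very old browsers where play() returns undefined.
  Promise.resolve(audio.play()).catch(() => { /* blocked: ignore */ });
  return setTimeout(() => { audio.pause(); }, time_in_millisec);
}
```

Usage is the same idea as above: `playFor(new Audio('beep.mp3'), 2000);`.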
Did you want to code this yourself, or are you looking for a library to make it easier? If the latter, you might want to check out WadJS:
let song = new Wad({ source: 'https://www.myserver.com/audio/mySong.wav' });
song.play({
  env: {
    // The duration in seconds. It defaults to the length of the audio
    // file, but you can set it lower to stop the sound after a fixed time.
    hold: 10
  }
});
I have a StackBlitz minimal code example which illustrates the problem. For brevity, I've also placed the problematic code below.
Whenever the user clicks on one of a number of tracks, I want the Audio component to immediately play that track (think Spotify, etc.).
Currently, the source of the audio component updates, but the track does not play unless the user triple-clicks.
I'm not sure why a triple click is needed for a successful play() call.
const changeSource = newSource => {
  setSource(newSource);
  togglePlayPause();
};
Here is the setSource hook, which is initialised to a default track:
const [source, setSource] = useState(
  'https://www.soundhelix.com/examples/mp3/SoundHelix-Song-4.mp3'
);
const togglePlayPause = () => {
  const prevValue = isPlaying;
  setIsPlaying(!prevValue);
  if (!prevValue) {
    audioPlayer.current.play();
  } else {
    audioPlayer.current.pause();
  }
};
It'll do that unless you preload your audio data. The <audio> element has to go out, fetch the data into the browser, process it, and only then begin to play. There is a way to try to preload the data: setting <audio preload="auto"> makes the browser "attempt" to preload tracks on page load. But when you set the source dynamically, the browser can't fetch the data until it knows what the source is. You can get around this, a little, with the autoplay attribute, which causes the browser to begin fetching the source as soon as it is set. (But then, it will also start playing it right away as well.)
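Another way to act on this is to wait until enough data has buffered before calling play(). A sketch with a hypothetical helper (the element is passed in, so the idea is framework-agnostic; the name playWhenReady is ours):

```javascript
// Hypothetical helper: swap the source and play once enough data
// has buffered. Assumes `audioEl` is an <audio> element (or anything
// exposing the same load/play/event API).
function playWhenReady(audioEl, newSrc) {
  audioEl.src = newSrc;
  audioEl.load(); // start fetching the new source immediately
  audioEl.addEventListener('canplaythrough', function onReady() {
    audioEl.removeEventListener('canplaythrough', onReady);
    audioEl.play();
  });
}
```

The one-shot listener matters: canplaythrough can fire again after seeking, and you don't want a stale handler restarting playback.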
I have a function that loads an audio file and sets the track's current time based on where the user left off the last time they had the page open. The track and the progress into it are stored in localStorage. It works perfectly on localhost, but when I put it on a PHP server, the audio loads from the start.
Trying to find the source of the problem, I found that it loads at the stored time, then jumps to 0 when it starts playing. Also, the audio file sometimes loads twice, as shown in the Network tab of Chrome's developer tools (possibly because of how browsers natively load video/audio files). This is a frustrating problem because there is no sign of anything wrong, except that playback starts from 0 instead of the saved time. I tried wrapping the code in a canplaythrough handler, but that doesn't do any good either.
The code I currently have is below (note that it is in a function, and time and src are parameters):
// set the volume to 1
audio.volume = 1;
// start from the beginning by default
audio.currentTime = 0;
if (muteSound === false) {
  // if the user didn't mute the sound, play it
  audio.play().catch(() => {});
}
// look up the name of the track (for display)
let nameOfMusic = audioChoices.indexOf(src);
// if it exists, store the name of the currently playing track
if (nameOfMusic !== -1) {
  localStorage.currentPlay = music[nameOfMusic];
}
// `time` is the parameter holding where the user left off. If it
// isn't 0 (i.e. we're resuming rather than starting a new
// soundtrack), restore the saved position.
if (time !== 0) {
  audio.currentTime = Number(localStorage.currentTime);
}
// store the source of the currently playing audio
localStorage.setItem('currentPlaySrc', src);
// reset the stored time to 0 (the localStorage value, not the
// audio's actual currentTime). This might be the problem: the music
// loads twice, so the second load may pick up the 0 written by the
// first. However, removing this line didn't fix it.
localStorage.setItem('currentTime', '0');
// an interval that keeps the stored time in sync with the audio's
// currentTime, honours mute/unmute, and, when the track ends, picks
// a random new one (addMusic is the function this code lives in).
let check = window.setInterval(() => {
  localStorage.currentTime = audio.currentTime;
  if (muteSound === false && soundMutedSession === true) {
    soundMutedSession = false;
    audio.play().catch(() => {});
  }
  if (audio.currentTime >= audio.duration) {
    clearInterval(check);
    localStorage.setItem('currentTime', '0');
    let random = Math.floor(Math.random() * audioChoices.length);
    addMusic(audioChoices[random], 0);
  }
  if (muteSound === true) {
    audio.pause();
    soundMutedSession = true;
  }
}, 1);
It's interesting that I couldn't find any material on this. I don't think it has anything to do with typos or a mistake on my part, but could it possibly have to do with how the browser loads and renders video/audio content?
So I can record something with the following code
let rec = new Recorder("filename.mp4").record();
// Stop recording after approximately 3 seconds
setTimeout(() => {
rec.stop((err) => {
// NOTE: In a real situation, handle possible errors here
// Play the file after recording has stopped
new Player("filename.mp4")
.play()
.on('ended', () => {
// Enable button again after playback finishes
this.setState({disabled: false});
});
});
}, 3000);
I can record audio perfectly well, but how can I access, list, or delete recorded files like filename.mp4? Where are they saved on Android or iOS? Will they remain there after each update to the app, each recompilation, etc.?
According to the react-native-audio-toolkit documentation, you can get the file path with the fsPath property.
fsPath - String (read only)
Get the filesystem path of file being recorded to. Available after
prepare() call has invoked its callback successfully.
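A minimal sketch of how that might look, given the constraint the docs state (fsPath is only valid after prepare()'s callback has fired). The helper name and callback parameter are ours:

```javascript
// Sketch: read the recorder's filesystem path inside prepare()'s
// callback, then start recording. `onPath` receives the absolute
// path (e.g. to log it, or to save it for later cleanup/listing).
function prepareAndRecord(rec, onPath) {
  rec.prepare((err) => {
    if (err) return;       // in a real app, handle the error
    onPath(rec.fsPath);    // valid only after prepare() succeeds
    rec.record();
  });
}
```

Usage would be along the lines of `prepareAndRecord(new Recorder('filename.mp4'), p => console.log(p));`, after which you can hand the path to a filesystem library to list or delete the file.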
I am trying to make something where sound samples are chosen randomly at intervals so that the song evolves and is different each time it's listened to. HTML Audio was not sufficient for this, because the timing was imprecise, so I am experimenting with Web Audio, but it seems quite complicated. For now, I just want to know how to make a new audio file play at 16 seconds exactly, or 32 seconds, etc. I came across something like this
playSound.start(audioContext.currentTime + numb);
But as of now I cannot make it work.
var audioContext = new audioContextCheck();
function audioFileLoader(fileDirectory) {
  var soundObj = {};
  soundObj.fileDirectory = fileDirectory;
  var getSound = new XMLHttpRequest();
  getSound.open("GET", soundObj.fileDirectory, true);
  getSound.responseType = "arraybuffer";
  getSound.onload = function() {
    audioContext.decodeAudioData(getSound.response, function(buffer) {
      soundObj.soundToPlay = buffer;
    });
  };
  getSound.send();
  soundObj.play = function(volumeVal, pitchVal) {
    var volume = audioContext.createGain();
    volume.gain.value = volumeVal;
    var playSound = audioContext.createBufferSource();
    playSound.playbackRate.value = pitchVal;
    playSound.buffer = soundObj.soundToPlay;
    playSound.connect(volume);
    volume.connect(audioContext.destination);
    playSound.start(audioContext.currentTime);
  };
  return soundObj;
}
var harp1 = audioFileLoader("IRELAND/harp1.ogg");
var harp2 = audioFileLoader("IRELAND/harp2.ogg");
function keyPressed() {
  harp1.play(.5, 2);
  harp2.start(audioContext.currentTime + 7.5);
}
window.addEventListener("keydown", keyPressed, false);
You see, I am trying to make harp2.ogg play immediately when harp1.ogg finishes. Eventually I want to be able to choose the next file randomly, but for now I just need to make it happen at all. How can I make harp2.ogg play exactly 7.5 seconds after harp1.ogg begins (or, better yet, is there a way to trigger it when harp1 ends, without a gap in the audio)? Help appreciated, thanks!
WebAudio should be able to start audio very precisely using start(time), down to the nearest sample time. If it doesn't, it's because the audio data from decodeAudioData doesn't contain the data you expected, or it's a bug in your browser.
It looks like when you call keyPressed, you want to trigger both songs to start playing: one immediately, and the other 7.5 seconds later.
The function to play the songs is soundObj.play and it needs to take an additional argument, which is the audioContext time to play the song. Something like: soundObj.play = function(volumeVal, pitchVal, startTime) {...}
The function keyPressed() block should look something like this:
harp1.play(.5, 2, 0);
harp2.play(1, 1, audioContext.currentTime + 7.5);
audioContext.resume();
audioContext.resume() starts the actual audio (or rather, starts the audio graph's clock so that the things you've scheduled actually run).
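Putting that together, the body of the modified soundObj.play might look like the sketch below, pulled out into a standalone function for clarity (the name playBufferAt and the "0 means now" convention are ours):

```javascript
// Sketch: schedule a decoded AudioBuffer on the context's clock.
// `startTime` is in AudioContext time; 0 (or omitted) means
// "as soon as possible".
function playBufferAt(audioContext, buffer, volumeVal, pitchVal, startTime) {
  var volume = audioContext.createGain();
  volume.gain.value = volumeVal;
  var source = audioContext.createBufferSource();
  source.playbackRate.value = pitchVal;
  source.buffer = buffer;
  source.connect(volume);
  volume.connect(audioContext.destination);
  source.start(startTime || audioContext.currentTime);
  return source;
}
```

With soundObj.play forwarding its arguments to something like this, keyPressed can schedule both harps against the same clock, which is what makes the sample-accurate 7.5-second offset possible.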
I have an app that tracks video views and integrates them with other marketing activities. In doing so, I needed to keep track of how long a person watches an HTML5 video and post it back to my app (via an API). I'm using the video.js player, but really this is just a wrapper around HTML5's native API for this. Various videos can be loaded depending on what page the user is watching, so I needed a solution that works regardless of video length.
The problem I had: as a video plays, the API reports back roughly every 300 ms, and I didn't want to hit my API that often. So I needed a way to keep track of the last time I posted. After digging around, I couldn't find an answer, so in case someone else has a similar need, my solution is below.
I decided to post my video-viewing results every 5 seconds, but since there is no guarantee that currentTime will report back at exactly 5 seconds, I round to the closest whole integer.
On my video wrapper div, I added a data attribute called data-time-last-push. I record the rounded time every time I push, and check whether we have exceeded the interval before posting again.
HTML
<div id="video-wrapper" data-time-last-push="0">
Javascript
Bind the videojs container to the timeupdate property.
var vid = videojs("video-container", {}, function() {
  this.on('timeupdate', videoTracker);
});
The function that does the AJAX post:
var videoTracker = function() {
  var player = this;
  var last_push, wrapper, current;
  wrapper = $('#video-wrapper');
  // data attributes come back as strings, so parse before comparing
  last_push = parseInt(wrapper.attr("data-time-last-push"), 10);
  current = Math.round(player.currentTime());
  // you could make the 5 here a variable for your own interval...
  if (current % 5 === 0) {
    if (current > last_push) {
      // do your AJAX post here...
      wrapper.attr("data-time-last-push", current);
      console.log('currentTime = ' + player.currentTime());
      console.log('duration: ' + player.duration());
    }
  }
};
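The gating logic can also be pulled out into a small pure helper (a sketch; the names are ours), which makes the throttle easy to reason about independently of the player and the DOM:

```javascript
// Decide whether to post: only on a multiple of the interval, and
// only if we haven't already posted for this (or a later) tick.
function shouldPush(currentSeconds, lastPushSeconds, intervalSeconds) {
  return currentSeconds % intervalSeconds === 0 &&
         currentSeconds > lastPushSeconds;
}
```

The second condition is what absorbs the ~300 ms timeupdate firing rate: several events land on the same rounded second, but only the first one past the last recorded push triggers a post.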
Note: I tried to set up a JSFiddle to show it working, but ran into trouble because the sample videos don't load over HTTPS.