Streaming a growing file using the MediaSource API - javascript

I have an .mp4 file that is still downloading. I would like to stream the downloading file into a video element using the MediaSource API. How would I do this?
const NUM_CHUNKS = 5;

var video = document.querySelector('video');
video.src = video.webkitMediaSourceURL;

video.addEventListener('webkitsourceopen', function(e) {
  var chunkSize = Math.ceil(file.size / NUM_CHUNKS);

  // Slice the video into NUM_CHUNKS and append each to the media element.
  for (var i = 0; i < NUM_CHUNKS; ++i) {
    var startByte = chunkSize * i;

    // file is a video file.
    var chunk = file.slice(startByte, startByte + chunkSize);

    var reader = new FileReader();
    reader.onload = (function(idx) {
      return function(e) {
        video.webkitSourceAppend(new Uint8Array(e.target.result));
        logger.log('appending chunk:' + idx);
        if (idx == NUM_CHUNKS - 1) {
          video.webkitSourceEndOfStream(HTMLMediaElement.EOS_NO_ERROR);
        }
      };
    })(i);

    reader.readAsArrayBuffer(chunk);
  }
}, false);
How would I dynamically change NUM_CHUNKS, and slice the video?

The code you're using from Eric Bidelman chops up a video that the browser has already fully downloaded, just to demonstrate how the API works. In reality, you'd slice the video on the server, and the client would download each chunk in order, probably with an AJAX request.
I'd first suggest you try your .mp4 in the demo code you have, because MediaSource seems pretty picky about the format of the video files it accepts. See Steven Robertson's answer about how to create an mp4 that'll work.
Then it's up to you whether you want to slice the video manually beforehand or do it dynamically on the server (how will vary depending on your server). The JavaScript client shouldn't care how many chunks there are or how large each chunk is, as long as they're fed in order (and I think the spec even allows some amount of out-of-order appending).

webkitMediaSourceURL is now outdated in Chrome; createObjectURL() needs to be used instead.
The patch here, HTMLMediaElement to the new OO MediaSource API, gave me some pointers as to what I needed to update in my code.
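For reference, here is a minimal sketch of how the same flow looks with the current object-oriented MediaSource API. The chunk URLs and the codec string are assumptions for illustration, and the file must be an MSE-compatible (fragmented) MP4:

var video = document.querySelector('video');
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource); // replaces video.webkitMediaSourceURL

mediaSource.addEventListener('sourceopen', function () {
  var sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
  var chunkUrls = ['/chunks/0.mp4', '/chunks/1.mp4', '/chunks/2.mp4']; // hypothetical server-side slices
  var index = 0;

  function appendNext() {
    if (index >= chunkUrls.length) {
      mediaSource.endOfStream();
      return;
    }
    fetch(chunkUrls[index++])
      .then(function (response) { return response.arrayBuffer(); })
      .then(function (buffer) {
        // Wait for each append to finish before fetching the next chunk.
        sourceBuffer.addEventListener('updateend', appendNext, { once: true });
        sourceBuffer.appendBuffer(buffer);
      });
  }

  appendNext();
});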

Related

MIX Wav file and export it with Web Audio API

I'm a web developer from Japan.
This is my first question on Stack Overflow.
I'm creating a simple music web application now.
I am a complete beginner at making a music program, so I am struggling to implement it.
After various investigations, I concluded that using the Web Audio API was the best choice,
so I decided to use it.
▼ What I want to achieve
Multiple wav files loaded with the Web Audio API can be grouped into one wav file, which can then be downloaded from the browser.
For example, load multiple wav files such as guitar, drums and piano,
edit them in the browser, and finally output them as one wav file.
Then we can download that edited wav file from the browser and play it in iTunes.
▼ Question
Is it possible to achieve these requirements using just the Web Audio API,
or do we need to use another library?
I checked Record.js on GitHub, but development stopped about 2~3 years ago, it has many open issues, and I cannot get support, so I decided not to use it.
I also checked the similar question Web audio API: scheduling sounds and exporting the mix,
but since the information is old, I do not know if it still applies.
Thanks.
Hi and welcome to Stack Overflow!
Is it possible to achieve this just using the Web Audio API?
In terms of merging/mixing the files together, this is perfectly achievable! This article goes through many (if not all) of the steps you will need to carry out the task you described.
Each file you want to upload can be loaded into an AudioBufferSourceNode (examples are explained in the article linked above). Here is an example of setting up a buffer source once the audio data has been loaded:
play: function (data, callback) {
  // create audio node and play buffer
  var me = this,
      source = this.context.createBufferSource(),
      gainNode = this.context.createGain();

  if (!source.start) { source.start = source.noteOn; }
  if (!source.stop) { source.stop = source.noteOff; }

  source.connect(gainNode);
  gainNode.connect(this.context.destination);
  source.buffer = data;
  source.loop = true;
  source.startTime = this.context.currentTime; // important for later!
  source.start(0);
  return source;
}
There are also specific nodes already designed for your mixing purposes, like the ChannelMergerNode (which combines multiple mono channels into a new channel buffer). Use these if you don't want to deal with the signal processing yourself in JavaScript; they will also be faster, since the Web Audio objects are native compiled code already inside the browser.
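As a rough illustration of the ChannelMergerNode, here is a minimal sketch that merges two mono tracks into one stereo stream; the decodeTrack() helper and the file names are placeholders, not part of the article above:

const ctx = new (window.AudioContext || window.webkitAudioContext)();
const merger = ctx.createChannelMerger(2); // two inputs combined into one 2-channel output
merger.connect(ctx.destination);

async function decodeTrack(url) {
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();
  return ctx.decodeAudioData(arrayBuffer);
}

Promise.all([decodeTrack('guitar.wav'), decodeTrack('piano.wav')]).then(([guitar, piano]) => {
  const guitarSource = ctx.createBufferSource();
  const pianoSource = ctx.createBufferSource();
  guitarSource.buffer = guitar;
  pianoSource.buffer = piano;

  // Route the first track to the left channel (input 0) and the second to the right (input 1).
  guitarSource.connect(merger, 0, 0);
  pianoSource.connect(merger, 0, 1);

  guitarSource.start(0);
  pianoSource.start(0);
});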
Following the complete guide linked above, there are also options to export the file (as a .wav in the demo case) using the following code:
var rate = 22050;

function exportWAV(type, before, after) {
  if (!before) { before = 0; }
  if (!after) { after = 0; }

  var channel = 0,
      buffers = [];
  for (channel = 0; channel < numChannels; channel++) {
    buffers.push(mergeBuffers(recBuffers[channel], recLength));
  }

  var i = 0,
      offset = 0,
      newbuffers = [];

  for (channel = 0; channel < numChannels; channel += 1) {
    offset = 0;
    newbuffers[channel] = new Float32Array(before + recLength + after);
    if (before > 0) {
      for (i = 0; i < before; i += 1) {
        newbuffers[channel].set([0], offset);
        offset += 1;
      }
    }
    newbuffers[channel].set(buffers[channel], offset);
    offset += buffers[channel].length;
    if (after > 0) {
      for (i = 0; i < after; i += 1) {
        newbuffers[channel].set([0], offset);
        offset += 1;
      }
    }
  }

  if (numChannels === 2) {
    var interleaved = interleave(newbuffers[0], newbuffers[1]);
  } else {
    var interleaved = newbuffers[0];
  }

  var downsampledBuffer = downsampleBuffer(interleaved, rate);
  var dataview = encodeWAV(downsampledBuffer, rate);
  var audioBlob = new Blob([dataview], { type: type });

  this.postMessage(audioBlob);
}
So I think the Web Audio API has everything you could want for this purpose! It could be challenging depending on your web development experience, but it's a skill definitely worth learning!
Do we need to use another library?
If you can, I think it's definitely worth trying it with the Web Audio API, as you'll almost certainly get the best processing speed, but there are other libraries such as Pizzicato.js, just to name one. I'm sure you will find plenty of others.
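If you go the Web Audio route, one approach worth knowing about is OfflineAudioContext, which renders a graph faster than real time into an AudioBuffer. A minimal sketch, assuming you already have decoded AudioBuffers that share a sample rate (the mixed result could then be fed to WAV-encoding code like the exportWAV example above):

// Mix several decoded AudioBuffers into a single stereo buffer, offline.
async function mixBuffers(buffers, sampleRate) {
  const longest = Math.max(...buffers.map(b => b.length));
  const offlineCtx = new OfflineAudioContext(2, longest, sampleRate);

  buffers.forEach(buffer => {
    const source = offlineCtx.createBufferSource();
    source.buffer = buffer;
    source.connect(offlineCtx.destination);
    source.start(0); // all tracks start together; offset them here to arrange the mix
  });

  // Resolves with the mixed AudioBuffer once rendering completes.
  return offlineCtx.startRendering();
}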

How to get duration of video when I am using filereader to read the video file?

I am trying to upload a video to a server, and on the client end I am reading it using FileReader's readAsBinaryString().
Now, my problem is that I don't know how to read the duration of this video file.
If I try reading the file and assigning the reader's data to a video tag's source, then none of the events associated with the video tag are fired. I need to find the duration of the uploaded file on the client end.
Can somebody please suggest something?
You can do something like this for that to work:
- read the file as an ArrayBuffer (this can be posted directly to the server as a binary stream later)
- wrap it in a Blob object
- create an object URL for the blob
- and finally set the URL as the video source.
When the video object triggers the loadedmetadata event, you should be able to read the duration.
You could use a data-URI too, but note that browsers may apply size limits (among other disadvantages), which matters for video files, and there is a significant encoding/decoding overhead due to the Base-64 process.
Example
Select a video file you know the browser can handle (in production you should of course filter accepted file types based on video.canPlayType()).
The duration will show after the above steps have been performed (no error handling is included in the example; adjust as needed).
var fileEl = document.querySelector("input");

fileEl.onchange = function(e) {
  var file = e.target.files[0],   // selected file
      mime = file.type,           // store mime for later
      rd = new FileReader();      // create a FileReader

  rd.onload = function(e) {       // when the file has been read:
    var blob = new Blob([e.target.result], {type: mime}),  // create a blob from the buffer
        url = (URL || webkitURL).createObjectURL(blob),    // create an object URL for the blob
        video = document.createElement("video");           // create video element

    video.preload = "metadata";   // preload setting

    video.addEventListener("loadedmetadata", function() {  // when enough data has loaded
      document.querySelector("div")
        .innerHTML = "Duration: " + video.duration + "s";  // show duration
      (URL || webkitURL).revokeObjectURL(url);             // clean up
      // ... continue from here ...
    });

    video.src = url;              // start video load
  };

  rd.readAsArrayBuffer(file);     // read file object
};
<input type="file"><br><div></div>
You can do something like below; the trick is to use readAsDataURL:
var reader = new FileReader();

reader.onload = function() {
  var media = new Audio(reader.result);
  media.onloadedmetadata = function() {
    media.duration; // this would give the duration of the video/audio file
  };
};

reader.readAsDataURL(file);
Fiddle Demo

Store result of type ArrayBuffer in an array in Javascript

I am using the HTML5 FileReader and File API to make an offline music player.
This also includes a basic playlist feature.
Now, when the user selects multiple files, I am retrieving those files as ArrayBuffers.
The problem is, I want to store these returned files in a normal array so that they can be used in the playlist later.
How can I achieve that in JavaScript?
function load_files(){
  var files = document.getElementById('file').files;
  var k = files.length;

  for (var i = 0; i < k; i++) {
    var reader = new FileReader();
    reader.onload = function(e) {
      playlist[i] = this.result;
    };
    reader.readAsArrayBuffer(this.files[i]);
    alert(song_counter);
    initSound(playlist[song_counter]);
  }
}
You cannot just store an ArrayBuffer in an array and re-read the file from it. The buffer is used to load a fixed length of bytes and keep them coming so that you don't run out of bytes to play.
Instead you should read all the bytes from the ArrayBuffer into a byteArray and store the byteArrays in an array. You can then replay all the songs from the byte arrays.
If you already have the songs locally (since your question doesn't state where you get the files from) you can just store the path to the file and then reload the file from that.
I hope this makes sense.
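A minimal sketch of that idea, keeping each file's bytes in an array with one slot per file; note the let binding so every onload handler keeps its own index (the playlist array and initSound call here just mirror the names in the question, not a library API):

function load_files() {
  var files = document.getElementById('file').files;
  var playlist = [];

  for (let i = 0; i < files.length; i++) {   // let gives each iteration its own i
    var reader = new FileReader();
    reader.onload = function(e) {
      playlist[i] = e.target.result;         // store this file's ArrayBuffer at its own index
      if (playlist.filter(Boolean).length === files.length) {
        initSound(playlist[0]);              // everything has been read; start with the first track
      }
    };
    reader.readAsArrayBuffer(files[i]);
  }
}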

html audio tag, duration always infinity

I've been working on using the HTML audio tag to play some audio files. The audio plays fine, but the duration property of the audio tag always returns Infinity.
I tried the accepted answer to this question but with the same result. Tested with Chrome, IE and Firefox.
Is this a bug with the audio tag, or am I missing something?
Some of the code I'm using to play the audio files.
JavaScript function called when the play button is pressed:
function playPlayerV2(src) {
  document.getElementById("audioplayerV2").addEventListener("loadedmetadata", function (_event) {
    console.log(player.duration);
  });

  var player = document.getElementById("audioplayer");
  player.src = "source";
  player.load();
  player.play();
}
the audio tag in html
<audio controls="true" id="audioplayerV2" style="display: none;" preload="auto">
Note: I'm hiding the standard audio player with the intent of using a custom layout and controlling the player via JavaScript; this does not seem to be related to my problem.
Try this:
var getDuration = function (url, next) {
  var _player = new Audio(url);

  _player.addEventListener("durationchange", function (e) {
    if (this.duration != Infinity) {
      var duration = this.duration;
      _player.remove();
      next(duration);
    }
  }, false);

  _player.load();
  _player.currentTime = 24 * 60 * 60; // fake big time
  _player.volume = 0;
  _player.play();
  // waiting...
};

getDuration('/path/to/audio/file', function (duration) {
  console.log(duration);
});
I think this is due to a Chrome bug. Until it's fixed:
if (video.duration === Infinity) {
  video.currentTime = 10000000;
  setTimeout(() => {
    video.currentTime = 0; // to reset the time, so it starts at the beginning
  }, 1000);
}
let duration = video.duration;
This works for me:
const audio = document.getElementById("audioplayer");

audio.addEventListener('loadedmetadata', () => {
  if (audio.duration === Infinity) {
    audio.currentTime = 1e101;
    audio.addEventListener('timeupdate', getDuration);
  }
});

function getDuration() {
  audio.currentTime = 0;
  audio.removeEventListener('timeupdate', getDuration);
  console.log(audio.duration);
}
If you control the server and can make it send proper media headers - this is what helped the OP.
I faced this problem with files stored in Google Drive when getting them in the mobile version of Chrome. I cannot control the Google Drive response, and I have to deal with it somehow.
I don't have a solution that satisfies me yet, but I tried the idea from both posted answers - which is basically the same: make the audio/video object seek to the real end of the resource. After Chrome finds the real end position, it gives you the duration. However, the result is unsatisfying.
What this hack really does is force Chrome to load the resource into memory completely. So, if the resource is too big or the connection is too slow, you end up waiting a long time for the file to be downloaded behind the scenes. And you have no control over that file - it is handled by Chrome, and once it decides it is no longer needed, it will dispose of it, so the bandwidth may be spent inefficiently.
So, if you can load the file yourself, it is better to download it (e.g. as a blob) and feed it to your audio/video control.
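A minimal sketch of that approach, assuming the file is reachable with fetch (the URL below is a placeholder):

// Download the whole file yourself, then hand the browser a local blob,
// so the duration is known up front and the bytes are not fetched twice.
fetch('https://example.com/audio.mp3')
  .then(response => response.blob())
  .then(blob => {
    const audio = document.querySelector('audio');
    audio.src = URL.createObjectURL(blob);
    audio.addEventListener('loadedmetadata', () => {
      console.log('duration:', audio.duration); // finite once the blob is local
    });
  });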
If this is a Twilio mp3, try the .wav version. The mp3 comes across as a stream, which fools the audio players.
To use the .wav version, just change the format in the source URL from .mp3 to .wav (or leave it off; wav is the default).
Note - the wav file is 4x larger, so that's the downside to switching.
Not a direct answer, but in case anyone using blobs came here: I managed to fix it using a package called webm-duration-fix
import fixWebmDuration from "webm-duration-fix";
...
fixedBlob = await fixWebmDuration(blob);
...
If you want to modify the video file completely, you can use the webmFixDuration package. Other methods are applied only at the display level, on the video tag; with this method, the complete video file is modified.
webmFixDuration GitHub example:
mediaRecorder.onstop = async () => {
  const duration = Date.now() - startTime;
  const buggyBlob = new Blob(mediaParts, { type: 'video/webm' });
  const fixedBlob = await webmFixDuration(buggyBlob, duration);
  displayResult(fixedBlob);
};

How to get the size and duration of an mp3 file?

I need to calculate the total length of an mp3 file.
Currently I am using a PHP class which I found at http://www.zedwood.com/article/php-calculate-duration-of-mp3.
This works perfectly if the mp3 file is on the same server,
but if I have a URL from another site it throws an error. Please help me.
Is there any JavaScript/jQuery function to get the length of an mp3 file?
<?php
include("mp3.class.php");

$f = 'http://cdn.enjoypur.vc/upload_file/5570/5738/5739/7924/Blue%20Eyes%20-%20Yo%20Yo%20Honey%20Singh%20(PagalWorld.com)%20-192Kbps%20.mp3';

$m = new mp3file($f);
$a = $m->get_metadata();

if ($a['Encoding'] == 'Unknown')
    echo "?";
else if ($a['Encoding'] == 'VBR')
    print_r($a);
else if ($a['Encoding'] == 'CBR')
    print_r($a);
unset($a);
?>
Here's how you can get mp3 duration using Web Audio API:
const mp3file = 'https://raw.githubusercontent.com/prof3ssorSt3v3/media-sample-files/master/doorbell.mp3'
const audioContext = new (window.AudioContext || window.webkitAudioContext)()

const request = new XMLHttpRequest()
request.open('GET', mp3file, true)
request.responseType = 'arraybuffer'
request.onload = function() {
  audioContext.decodeAudioData(request.response, function(buffer) {
    let duration = buffer.duration
    console.log(duration)
    document.write(duration)
  })
}
request.send()
There is actually a library that can run client-side, attempting to fetch just enough of the MP3 to read the ID3 tags:
http://github.com/aadsm/JavaScript-ID3-Reader
or try the HTML File API:
http://lostechies.com/derickbailey/2013/09/23/getting-audio-file-information-with-htmls-file-api-and-audio-element/
Perhaps the simplest solution is to use the audio HTML element to get the duration, and to obtain the size directly from the File object returned by the file input. A code example of this approach is shown below.
One downside of this and all the other solutions presented so far is the 10-20 second delay before the audio tag's durationchange event fires when loading large (e.g. > 200MB) files. Clearly there is a faster way to get this info, because the duration is shown immediately when the file is opened in the browser via a file:///... URL.
function checkMp3SizeAndDuration()
{
  var files = document.getElementById('upload-file').files;
  var file = files[0];
  if (file.size > MAX_FILE_SIZE) {
    return;
  }

  var reader = new FileReader();
  var audio = document.createElement('audio');

  reader.onload = function (e) {
    audio.src = e.target.result;
    audio.addEventListener('durationchange', function() {
      console.log("durationchange: " + audio.duration);
    }, false);
    audio.addEventListener('error', function() {
      alert("Cannot get duration of this file.");
    }, false);
  };

  reader.readAsDataURL(file);
}
Having not been able to find something that was fast and didn't require a bunch of extra boilerplate code, I tweaked an existing server side javascript utility to run directly in the browser. Demo code is available at: https://github.com/eric-gilbertson/fast-mp3-duration
A famous and very useful library you can use is MP3 SPI,
and the code is also very simple:
import java.io.File;
import java.util.Map;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioSystem;
File file = new File("filename.mp3");
AudioFileFormat baseFileFormat = AudioSystem.getAudioFileFormat(file);
Map properties = baseFileFormat.properties();
Long duration = (Long) properties.get("duration");
Use the getID3() PHP library, which works for VBR files as well.
This link will help you: sourceforge.net.
It is very actively developed.
