Google just changed how Chrome preloads audio and video; see: https://googlechrome.github.io/samples/media/preload-metadata
My understanding is that simply setting the preload attribute to auto should fix the problem, but I have been unable to make it work:
https://jsfiddle.net/NinoSkopac/f4zscrdy/1/
let mp3 = 'https://s3-staging.read2me.online/audio/5a745d88483d86.76121223.mp3';
let audio = new Audio(mp3);
audio.preload = 'auto';
audio.play();
https://jsfiddle.net/NinoSkopac/rst8aspm/
<audio src="https://s3-staging.read2me.online/audio/5a745d88483d86.76121223.mp3" preload="auto" autoplay></audio>
Both of these will stop playing within a minute on Chrome 64 and Chrome 65-dev (other browsers and older Chromes are unaffected). I have replicated this issue on Mac, Windows and Android.
While debugging, I attached listeners for all media events to the JS object (e.g. audio.addEventListener('timeupdate', () => { console.log('timeupdate') })). At first the events fired like this:
progress
timeupdate
progress
timeupdate
[...]
Later like this:
timeupdate
timeupdate
timeupdate
[...]
When playback stopped, I got a handful of error events, and dumping audio.error returned:
PIPELINE_ERROR_DECODE: Failed to send audio packet for decoding: timestamp=81763265 duration=26122 size=201 side_data_size=0 is_key_frame=1 encrypted=0 discard_padding (ms)=(0, 0)
How do I fix this? Is this a Chrome bug?
UPDATE:
OGG plays fine: https://jsfiddle.net/NinoSkopac/2hktqcqt/1/
This does seem to be a Chrome bug: https://bugs.chromium.org/p/chromium/issues/detail?id=794782
A similar error on Github: https://github.com/video-dev/hls.js/issues/1529
UPDATE 2:
chrome://media-internals/ reveals the same pipeline decode error (screenshot not reproduced here).
UPDATE 3:
This issue has been fixed in Chrome 65.
After a couple of days of trial and error and research, I have confirmed what doesn't and does work.
Doesn't work
mp3wrap
mp3wrap output.mp3 *.mp3
the output file is still corrupted and halts
ffmpeg
ffmpeg -i "concat:0.mp3|1.mp3" -acodec copy output.mp3
the output file is still corrupted and halts
Does work
mp3val with -f argument
Simply concatenate/implode your audio binaries (in PHP I do implode('', $audioBinaries)) and then run mp3val -f concatenated-audio-file.mp3. The -f argument is essential: it means "try to fix errors".
How to install mp3val?
On macOS: brew install mp3val
On Debian/Ubuntu: apt-get install mp3val
I was facing the same issue with the concatenation technique. With ffmpeg it works fine; try ffmpeg with this command:
ffmpeg -f concat -i "{textfile}" -c:v copy -ab 48k -y "{output}"
The text file contains the list of input files, one per line.
Related
I have created a simple Phaser 3 test app (in Typescript, with rollup to transpile) and am using Capacitor to convert it into an iOS app on my Mac.
This is the relevant part of the app:
function preload () {
    this.load.audio('boom', ['boom.mp3', 'boom.ogg', './boom-44100.ogg', './boom-44100.mp3']);
    this.load.image('wizball', './wizball.png');
}
function create () {
    const image = this.add.image(400, 300, 'wizball').setInteractive();
    this.boom = this.sound.add('boom');
    image.on('pointerup', () => {
        this.boom.play();
    });
}
The app shows an image of a wizball, and if you click it, you hear "boom".
When I run it:
With npm run watch, using http://localhost:10001 in a browser on my Mac, it works fine;
By loading index.html in the dist/ dir in a browser on my Mac, it works fine;
By loading https://garion.org/soundtest-ts/ on either my Mac or my iPad, it works fine;
But when I use Capacitor to build an iOS app in Xcode, clicking the image gives no sound at all.
These are the steps to generate the iOS app:
npm i
npm run watch
npx cap add ios
npx cap copy ios
npx cap open ios
The console log in Xcode shows the following error:
Error: There is no audio asset with key "boom" in the audio cache
⚡️ URL: capacitor://localhost/game.js
I find this strange, because the image asset can be found just fine. In the ios/App/public/ dir, both boom.mp3 and wizball.png are present.
I have put the full code, with steps to reproduce, here: https://github.com/joostvunderink/soundtest. You will need Node 10+ and Xcode (with at least one virtual device configured) to build the iOS app.
What am I overlooking?
Disable web audio in your game config by adding this to the bottom of it:
let game = new Phaser.Game({
    ...
    audio: {
        disableWebAudio: true
    }
});
Warning:
Disabling web audio makes Phaser fall back to HTML5 audio, which can make your game lag compared with Web Audio.
Other workarounds for this problem:
Use external audio files; web audio still works if the audio files are not loaded from the internal assets (I still can't find out why).
Use a native audio/media plugin to play the audio in the Phaser/Capacitor app.
I had the same problem. I found out that on the iOS platform only, the Phaser loader can't parse the ArrayBuffer on sound load. That's why I load the audio with a simple fetch and add it to Phaser with scene.sound.decodeAudio.
Something like that:
fetch(sound.url)
    .then((response) => response.arrayBuffer())
    .then((response) => {
        scene.sound.decodeAudio({
            data: response,
            key: 'audioKey',
        });
    });
Let me explain the issue I am facing with my code.
This is my JS file for use with PhantomJS. It simply tells PhantomJS to open a page, take screenshots of it, and write them to stdout.
var page = require("webpage").create();
page.viewportSize = { width: 640, height: 480 };
page.open("http://www.goodboydigital.com/pixijs/examples/12-2/", function() {
    setInterval(function() {
        page.render("/dev/stdout", { format: "png" });
    }, 25);
});
And this is the cmd command I'm running to receive the captured images in ffmpeg in Windows Command Prompt.
phantomjs runner.js | ffmpeg -y -c:v png -f image2pipe -r 25 -t 10 -i - -c:v libx264 -pix_fmt yuv420p -movflags +faststart dragon.mp4
This command successfully starts the PhantomJS and ffmpeg processes, but nothing happens for quite some time; after 15 minutes it fails with this error:
"Failed to reallocate parser buffer"
That's it. I took this code from this site, where the developer claims it works:
https://mindthecode.com/recording-a-website-with-phantomjs-and-ffmpeg/
It could be related to stdout: ffmpeg reads it as stdin through the pipe, and after taking continuous images the buffer fills up and produces the error.
You can review this in a well-organized canvas recording application, "puppeteer-recorder", built on Node.js:
https://github.com/clipisode/puppeteer-recorder
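One way to sidestep the stdout pipe entirely is to render numbered frames to disk and let ffmpeg read the image sequence afterwards, so no parser buffer fills up. A sketch of the idea (the PhantomJS part is shown as comments, since the `webpage` module only exists inside PhantomJS):

```javascript
// Sketch: render numbered frames to disk instead of piping every PNG through
// stdout, then encode the sequence with:
//   ffmpeg -r 25 -i frames/frame-%05d.png -c:v libx264 -pix_fmt yuv420p out.mp4
function frameName(i) {
  return 'frames/frame-' + String(i).padStart(5, '0') + '.png';
}

// Inside the PhantomJS runner (PhantomJS-only API, not runnable under Node):
// var page = require('webpage').create();
// var i = 0;
// page.open(url, function () {
//   setInterval(function () {
//     page.render(frameName(i++), { format: 'png' });
//   }, 40); // 40 ms per frame ≈ 25 fps
// });
```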
I understand emscripten is a super powerful way to compile C code to Javascript.
Is it possible to use this for video, capture the webcam and stream this over RTMP using something like the rtmpdump library?
rtmpdump can be recompiled to javascript using Emscripten. However, that does not guarantee that the recompiled code is capable of executing within a Javascript environment in the way that the RTMP spec requires (namely the requirement for TCP).
Steps used to recompile rtmpdump with Emscripten:
Obtain latest portable emscripten tools:
Obtain rtmpdump source:
git clone git://git.ffmpeg.org/rtmpdump
Clear make cache
make clean
Set C compiler to CC in Makefile
Edit the rtmpdump Makefile on line 5 to the following:
CC=$(CROSS_COMPILE)cc
Run emmake to create bytecode from make output:
emmake make CRYPTO=
(Per rtmpdump README, I opted to use 'CRYPTO=' to build without SSL support as it was giving errors)
Run emcc to compile and link resulting bytecode into javascript:
emcc -O1 ./librtmp/*.o rtmpdump.o -o rtmpdump.js
Run the recompiled rtmpdump.js:
chmod 755 rtmpdump.js
node rtmpdump.js -r rtmp://127.0.0.1/live/STREAM_NAME
Of course, we will need a live RTMP stream to test against.
Steps to create live RTMP stream:
Obtain latest node-rtsp-rtmp-server:
git clone https://github.com/iizukanao/node-rtsp-rtmp-server.git
Add an mp4 to livestream over RTMP:
(Using Big Buck Bunny as our test video)
cd node-rtsp-rtmp-server/
npm install -d
cd file/
wget http://download.blender.org/peach/bigbuckbunny_movies/BigBuckBunny_320x180.mp4
Start the RTMP server
sudo coffee server.coffee
Publish mp4 to RTMP server with ffmpeg
ffmpeg -re -i /node-rtsp-rtmp-server/file/BigBuckBunny_320x180.mp4 -c:v copy -c:a copy -f flv rtmp://localhost/live/STREAM_NAME
Observations
You should be able to confirm that the RTMP stream is successfully published by connecting with something like VLC Media Player. Once we confirm the stream is properly running, we can test rtmpdump.js with:
node rtmpdump.js -r rtmp://127.0.0.1/live/STREAM_NAME -o out.flv
However, we immediately encounter:
ERROR: RTMP_Connect0, failed to connect socket. 113 (Host is unreachable)
Conclusion
While my answer explores a path to recompiling rtmpdump and its supporting library (librtmp) to Javascript, it does not produce a working implementation.
Some quick research concludes that RTMP relies on TCP for transmission from server to client. Javascript, by nature, confines communication to XHR and WebSocket requests only. The steps I outlined for recompilation of rtmpdump produce XHR requests for the RTMP_Connect0 method, which are HTTP-based (i.e. not TCP). It may be possible to rewrite an RTMP client to use WebSockets and bridge those connections to TCP with something like websockify; however, if successful, you would trade RTMP's dependency on Flash for a dependency on websockify if you intend to consume an RTMP stream. Producing a Flash-less RTMP client does not appear to be a simple matter of recompiling RTMP to Javascript, as the transport mechanism (TCP) must be accounted for.
Notes
For anyone looking to pick up on this work, be aware that testing against a remote stream from a browser running a theoretically proper rtmp implementation in Javascript would require that CORS is enabled on the remote host due to Same-Origin-Policy. See: https://github.com/Bilibili/flv.js/blob/master/docs/cors.md
I am trying to get this example to work. It works fine when I click the link, but when I download the HTML file to my local machine and try the same, it throws this error:
Not allowed to load local resource: blob:null/6771f68d-c4b8-49a1-8352-f2c277ddfbd4
The line of code that seems to be causing the issue is this,
video.src = window.URL.createObjectURL(mediaSource);
This line sets the source of the video element to the MediaSource object. I have tried various permutations without much luck.
I am using Chrome Version 28.0.1500.72 m, which is the latest stable release.
I would appreciate any pointers.
As @dandavis said, "run it from http:, not file:".
I'm posting this as an answer for the sake of organization.
For starters:
Running your project from http means having an HTTP server (such as Apache or a simple Node http-server) and serving your project via http://localhost.
Install http-server globally using npm (provided you have Node.js installed beforehand). Navigate to your file folder in CMD and type http-server. Your app should run at localhost:8080.
I'm playing around with the File and FileSystem APIs in HTML5. Apparently you need the --unlimited-quota-for-files flag turned on for them to work, but I can't work out how to do this. Can anyone tell me?
http://www.html5rocks.com/tutorials/file/filesystem/
Just close Chrome and run it from command line like this
google-chrome --unlimited-quota-for-files
Thanks, that was nearly there. On Windows you need to cd into the directory where chrome.exe lives and run: chrome --unlimited-quota-for-files