Capture, modify and then output audio in Electron - JavaScript

I'm trying to capture, modify and finally output audio in Node with Electron (Mac OS X). These are the steps:
Capture the audio before it's output, possibly via CoreAudio.
Modify the audio stream/buffer via the Web Audio API.
Output the modified buffer to the sound device.
I've tried node-core-audio, but the most I can achieve is a rapid glitching sound. Beyond that, I haven't been able to find a good solution for audio I/O.
How can I achieve this without sacrificing sound quality?

I'm not sure exactly what you want to accomplish, but on macOS this is not yet possible natively. I came across the same problem of recording system sound on macOS, and I finally found a solution: using Soundflower and JavaScript with Electron, I was able to record the system audio. It is not exactly what you want; I modified this audio stream by combining it with a video stream of the screen, then displayed it to the user. My solution is written up in a detailed blog post, which I think is better than posting all the long steps here on Stack Overflow.
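To sketch what such a pipeline can look like once Soundflower is routing the system output: the Electron renderer captures the virtual device with getUserMedia, the Web Audio API modifies the stream, and the context's destination plays it back. This is only a rough sketch under those assumptions; the device-label match and the gain node are placeholders for whatever processing you actually need:
// Rough sketch (assumes Soundflower is installed and set as the system
// output, and this runs in an Electron renderer with media access allowed).
async function pipeSystemAudio() {
  // Find the virtual device that carries the system audio.
  const devices = await navigator.mediaDevices.enumerateDevices();
  const input = devices.find(d =>
    d.kind === 'audioinput' && d.label.includes('Soundflower')); // assumed label
  if (!input) throw new Error('Soundflower input not found');

  // 1. Capture the audio before it reaches the speakers.
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: { deviceId: { exact: input.deviceId } }
  });

  // 2. Modify the stream with Web Audio API nodes.
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  const gain = ctx.createGain();
  gain.gain.value = 0.5; // arbitrary example modification

  // 3. Output the modified audio to the sound device.
  source.connect(gain);
  gain.connect(ctx.destination);
}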

Related

How to generate an audio waveform from an HTML5 web video?

Given a plain web video, e.g.:
<video src="my-video.mp4"></video>
How could I generate its audio waveform?
Is it possible to generate the waveform without playing the video?
Notes:
I'm interested in the available APIs to achieve this, not a "how to render a waveform to a canvas" guide.
Plain JavaScript, please. No libraries.
You might try AudioContext.decodeAudioData, though I can't tell whether video media are well supported; it will probably depend a lot on the codecs and the browsers.
I am a bit surprised to see that Firefox, Chrome and Safari all accept it with an MP4 containing mp4a (AAC) audio.
If it works, then you can use an OfflineAudioContext to analyze your audio data as fast as possible and generate your waveform data.
See also MDN article on Visualizations with Web Audio API
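If decoding works, a rough sketch of the whole pipeline could look like this (the fetch-based loading and the 600-bucket peak reduction are illustrative choices, not part of any spec):
// Sketch: fetch the video file, decode its audio track, and reduce it to
// peak values suitable for drawing a waveform. No playback involved.
async function waveformData(url, buckets = 600) {
  const response = await fetch(url); // e.g. 'my-video.mp4'
  const encoded = await response.arrayBuffer();

  // decodeAudioData extracts and decodes the audio track, if supported.
  const ctx = new OfflineAudioContext(1, 1, 44100);
  const audio = await ctx.decodeAudioData(encoded);

  // Downsample channel 0 to one peak value per bucket.
  const samples = audio.getChannelData(0);
  const blockSize = Math.floor(samples.length / buckets);
  const peaks = new Float32Array(buckets);
  for (let i = 0; i < buckets; i++) {
    let max = 0;
    for (let j = i * blockSize; j < (i + 1) * blockSize; j++) {
      max = Math.max(max, Math.abs(samples[j]));
    }
    peaks[i] = max;
  }
  return peaks; // values in 0..1, ready for your own rendering
}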
If you use the Web Audio API, you will render on the client side.
You will not control the quality of the experience, because rendering may cause memory and time issues for clients.
Instead, you can generate an image of the waveform using FFmpeg on the backend and pass it to the front end.
It is pretty simple.
https://trac.ffmpeg.org/wiki/Waveform
example:
ffmpeg -i C:\test.mp3 -filter_complex "showwavespic=s=640x120" -frames:v 1 C:\waveform.png

play all video types in one browser - using any way possible

I have a problem that most of the internet has with HTML5 video: there is no way to play all video types in any one browser, so you need to include many different formats to make sure every browser can play a given video. This is a big problem that one of my clients has, and he has pushed it to me.
I can only read from the server to get the videos, and there is only one version of any video on the server. The person running the server has refused to convert any videos before saving them, and people upload their videos to this server from every type of device, so there are about 7 different formats. No browser supports all 7. I need some way, and I mean any way, to stream these videos in real time from the web.
I have found ways to convert them in JavaScript, but it takes around 2 minutes to convert a 30-second video. I have tried adding plugins, but I have not found any plugin that can do it. My current thought, which I am testing now: use Internet Explorer with Windows Media Player to play the videos, or QuickTime with Safari.
Maybe there is some magical plugin that can play all video MIME types, like VLC, but VLC is no longer supported in Chrome for some reason, and I cannot find it for Firefox. Or maybe I can add the codecs to Chrome or Firefox? Maybe draw the video on a canvas and stream the audio magically? Or sacrifice my first-born son? Heck, I'll even throw in the second kid.
If someone can help me I would be so happy, and I think a lot of the internet would be too.

Can jPlayer do live RTMP streaming yet?

I'm working on a website for a podcast that does live video and audio streaming. We're currently stuck using JWPlayer AND jPlayer to do it.
The reason is that past attempts to get jPlayer to play live RTMP streams have never worked. Live audio, on the other hand, works fine.
I've seen plenty of demos for RTMP video streaming, but they're streaming a static, prerecorded file, not a live feed.
Here's a sample stream I'm trying to use: rtmp://pclix-channel.videocdn.scaleengine.net/pclix-channel/live/mp4:priestly (we're using ScaleEngine, so if it works with this it'll work with ours).
It won't seem to play at all. No errors in console that I can find, nothing.
I can't find any recent information saying one way or another if it's even currently able to support it. So I'm not sure if I'm missing something, or if it's currently futile and I should just find another free, open-source solution (we don't want JWPlayer). All we need is a player composed of a video window and a JavaScript API; we already have a custom CSS skin for it.
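For reference, here is roughly the configuration I have been trying, based on jPlayer's documented rtmpv media type (the swfPath is a placeholder for wherever Jplayer.swf lives in our setup):
// Sketch of a jPlayer RTMP setup (jPlayer 2.x; RTMP only works through
// its Flash solution, so "flash" must come first in the solution list).
$('#jquery_jplayer_1').jPlayer({
  ready: function () {
    $(this).jPlayer('setMedia', {
      // The sample live stream from above:
      rtmpv: 'rtmp://pclix-channel.videocdn.scaleengine.net/pclix-channel/live/mp4:priestly'
    });
  },
  supplied: 'rtmpv',
  solution: 'flash, html',
  swfPath: '/js' // assumed path to Jplayer.swf
});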

Is it possible to play synthesized sound in the browser using JavaScript?

I just came across a Nintendo emulator written entirely in JavaScript on the interwebs, but it doesn't have sound. It got me thinking: is there any way to synthesize sound in the browser using JavaScript and then play it? If it's not possible in general, are there any Safari/Opera/Firefox/IE/etc. extensions that would make it possible?
I am not asking about techniques for synthesizing sound, just techniques for playing sounds that have been synthesized by code running in the browser.
I would imagine your best bet is to have JavaScript talk to Flash using ExternalInterface (http://www.adobe.com/devnet/flash/articles/external_interface.html). Flash now has a way of transferring data between the sound buffers and a general-purpose ByteArray class.
http://www.adobe.com/devnet/flash/articles/dynamic_sound_generation/
You can develop Flash for free using the Flex SDK http://www.adobe.com/products/flex/.
Most developers use SoundManager 2 when they want to add sound to their application with JavaScript. It has hooks so JavaScript can interact with Flash 8 and 9 features. I am not sure if it has exposed the ability to work with byte data, which I guess is what you are after; I never had to deal with that.
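For what it's worth, basic SoundManager 2 usage looks roughly like this (the paths and file name are placeholders, and note it plays existing files rather than the synthesized byte data the question is after):
// Rough SoundManager 2 sketch; it does not synthesize audio.
soundManager.setup({
  url: '/swf/', // assumed location of SM2's Flash movies
  onready: function () {
    soundManager.createSound({
      id: 'demoSound',
      url: '/audio/demo.mp3' // hypothetical file
    }).play();
  }
});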
It turns out the author of the NES emulator has found a dynamic audio library:
https://github.com/bfirsh/dynamicaudio.js
I haven't tried it, but the docs look promising:
var dynamicaudio = new DynamicAudio({'swf': '/static/dynamicaudio.swf'});

// Plays an array of floating point audio samples in the range -1.0 to 1.0
dynamicaudio.write(samples);
In theory it should be possible to synthesize the sounds and then get the browser to play them using a data URL.
In practice, Steven Wittens has actually produced a demo of this technique. The encodeAudio8bit and encodeAudio16bit functions are where the magic happens.
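A minimal sketch of that data URL technique, with an inline sine-wave generator standing in for real synthesis (all the constants are illustrative; Wittens' encode functions are the real reference):
// Sketch: synthesize a sine wave, wrap it in an 8-bit mono WAV header,
// and play it through an audio element via a data URL.
function playTone(freq, seconds) {
  var rate = 8000; // sample rate in Hz
  var n = Math.floor(rate * seconds);
  var data = '';
  for (var i = 0; i < n; i++) {
    // 8-bit unsigned PCM: 128 is silence, 0..255 is full scale.
    data += String.fromCharCode(
      128 + Math.round(127 * Math.sin(2 * Math.PI * freq * i / rate)));
  }

  // Little-endian helpers for the RIFF header fields.
  function le32(v) {
    return String.fromCharCode(v & 255, (v >> 8) & 255,
                               (v >> 16) & 255, (v >> 24) & 255);
  }
  function le16(v) { return String.fromCharCode(v & 255, (v >> 8) & 255); }

  var header =
    'RIFF' + le32(36 + n) + 'WAVE' +
    'fmt ' + le32(16) + le16(1) /* PCM */ + le16(1) /* mono */ +
    le32(rate) + le32(rate) /* byte rate */ + le16(1) /* block align */ +
    le16(8) /* bits per sample */ +
    'data' + le32(n);

  new Audio('data:audio/wav;base64,' + btoa(header + data)).play();
}
playTone(440, 1); // one second of A4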

HTML5 Local Storage of audio element source - is it possible?

I've been experimenting with the audio and local storage features of html5 of late and have run into something that has me stumped.
I'd like to be able to cache or store the source of the audio element locally to enable speedier and offline playback. The problem is I can't see how this is possible with the current implementation.
I have tried the following using WebKit:
Creating a manifest file to set up local caching, but the audio file appears not to be a cacheable item, maybe due to the way it is streamed or something.
I have also attempted to use JavaScript to put an audio object into local storage, but the size of the MP3 makes this impossible due to memory issues (I think).
I have tried using a data URI with base64 to make the HTML itself an audio transport that can be cached, but again the file size makes this prohibitive. Also, the audio element does not seem to like this in WebKit (it works fine in Mozilla).
I have tried several methods of putting the data into the local database store, suffering the same issues as in the other cases.
I'd love to hear any other ideas anyone may have as to how I could achieve my goal of offline playback using caching/local storage in WebKit.
I've been trying to do this myself on iOS (for iPhone/iPad), but it refuses to cache audio files offline, even if they're in the cache manifest.
It does not error; it simply pretends to have played the audio element if invoked via JavaScript without a control. If it's embedded with a control, it displays an alternate control that says "Cannot play audio file." It works fine if the application is able to go online.
It seems not to cache the audio at all; playing another sound resource seems to clear the previous resource from memory, which is pretty worthless functionality even when online.
I have experimented with base64 encoding the audio as data URI's. This works in Safari on the desktop (at least for fairly short samples of around 20-30k that I've been using) but seems not to be supported at all on iOS - it silently does nothing, which is highly annoying.
I don't know about other vendors - Google Chrome used to not support data URI's for audio but perhaps they fixed it... - it seems like it's not possible for now though.
Update: Minor discrepancy with iPhone OS 3.x (tested with 3.1.2): If an audio element is specified in an offline web app but it doesn't have a control, it displays a non-interactive control with a non-animated spinner on it (which it definitely shouldn't do). I assume this is fixed in iOS 4.x (which should be out next week).
So it's been a while since I asked this question, and I thought I'd give some info about how we solved it. Basically, we encoded the audio data into PNGs using a technique similar to this:
http://audioscene.org/scene-files/yury/pngencoding/sample.html
Then we cached the image on the mobile device using HTML5 local storage and accessed it as needed. The PNGs were pretty big, but this worked for us.
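For anyone curious, the decoding half of that trick looks roughly like this, assuming the encoder wrote one audio byte per red-channel value with alpha held at 255 (premultiplied alpha would otherwise corrupt the bytes); the function name and MIME type are placeholders:
// Sketch: read audio bytes back out of a PNG's red channel and rebuild
// a playable data URL. Assumes the image length matches the audio data.
function decodeAudioFromPng(imgUrl, mimeType, callback) {
  var img = new Image();
  img.onload = function () {
    var canvas = document.createElement('canvas');
    canvas.width = img.width;
    canvas.height = img.height;
    var ctx = canvas.getContext('2d');
    ctx.drawImage(img, 0, 0);
    var pixels = ctx.getImageData(0, 0, img.width, img.height).data;
    var bytes = '';
    for (var i = 0; i < pixels.length; i += 4) { // red only; skip G, B, alpha
      bytes += String.fromCharCode(pixels[i]);
    }
    callback('data:' + mimeType + ';base64,' + btoa(bytes));
  };
  img.src = imgUrl; // e.g. a data URL pulled from localStorage
}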
I spent a while trying to do this for a game I'm making, and since, as far as I can tell, browsers (Firefox and Chrome) still don't support caching of audio elements, I thought I'd post the solution I found.
There is a workaround described here: http://dougx.net/plunder/index.php#code
I can confirm it works pretty well, but it is probably better suited to smaller files. As he describes here (http://dougx.net/plunder/GameSounds.txt), you encode the audio as base64 strings and give them a data:audio/ogg;base64 header (or whichever audio format is compatible), which HTML5 audio can then read in. Because this is just a string, the browser will cache it.
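In outline, that workaround looks something like this (the key scheme and the fetch-based loader are my own illustration; the original ships the strings pre-encoded):
// Sketch: cache an audio file as a base64 data URL in localStorage and
// play it back from there, so no network request is needed later.
async function cachedAudio(url) {
  var key = 'audio:' + url; // arbitrary key scheme
  var dataUrl = localStorage.getItem(key);
  if (!dataUrl) {
    var blob = await (await fetch(url)).blob();
    dataUrl = await new Promise(function (resolve) {
      var reader = new FileReader();
      reader.onload = function () { resolve(reader.result); }; // data:audio/...;base64,...
      reader.readAsDataURL(blob);
    });
    localStorage.setItem(key, dataUrl); // beware the ~5 MB quota
  }
  return new Audio(dataUrl);
}
// Usage: cachedAudio('/sounds/jump.ogg').then(function (a) { a.play(); });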
I guess it would be preferable to get the manifest approach working, since this feels like the most relevant mechanism for locally caching the file.
What happens if you alter the audio file's HTTP headers, e.g. Content-Type and Expires? Does the browser do something different if the file extension is changed?
I see you've had no luck so far.
You might want to take a look at JAI (JavaScript Audio Interface) ("the world's first javascript interface for web <audio>"). Or get in touch with Alastair MacDonald, who wrote it.
Failing that, the HTML5 Doctor may be able to assist.
Adding video and audio files to local storage works with iOS 4.3.
I just added a video and an audio file to the manifest, and they both got downloaded to offline storage on the iPad.
