Using nodejs to stream microphone from one client to others - javascript

I am trying to take audio recorded by one client and send it to other connected clients in real time, the objective being a sort of "broadcast". I have read many explanations to help guide me, with no luck.
Currently I have the audio being written to file like so:
var fileWriter = new wav.FileWriter(outFile, {
  channels: 1,
  sampleRate: 44100,
  bitDepth: 16
});

client.on('stream', function(stream, meta) {
  stream.pipe(fileWriter);
  stream.on('end', function() {
    fileWriter.end();
    console.log('wrote to file ' + outFile);
  });
});
As you can see, I'm currently using BinaryJS to send the audio data to the server, at which point I pipe the stream to the FileWriter.
I then tried to read the file and pipe it to the response:
app.get('/audio', function(req, res) {
  fs.createReadStream(__dirname + '/demo.wav').pipe(res);
});
As I'm sure you've already noticed, this doesn't work. I thought that (maybe) while the file was still being written, playback would include any content appended to it. That didn't happen: playback stopped at whatever point the file had reached when the client requested it.
I am unsure how to pass the stream data to the requesting clients in real time. Being completely new to Node.js, I am not sure of the methods and terms used for this procedure and have been unable to find any direct working examples.

Related

How to create a live media stream with Javascript

I want to create a live audio stream from one device to a Node server, which can then broadcast that live feed to several front ends.
I have searched extensively for this and have really hit a wall, so I am hoping somebody out there can help.
I am able to get my audio input from the window.navigator.getUserMedia API.
getAudioInput() {
  const constraints = {
    video: false,
    audio: {deviceId: this.state.deviceId ? {exact: this.state.deviceId} : undefined},
  };

  window.navigator.getUserMedia(
    constraints,
    this.initializeRecorder,
    this.handleError
  );
}
This then passes the stream to the initializeRecorder function, which utilises the AudioContext API and createMediaStreamSource:
initializeRecorder = (stream) => {
  const audioContext = window.AudioContext;
  const context = new audioContext();
  const audioInput = context.createMediaStreamSource(stream);
  const bufferSize = 2048;
  // create a javascript node
  const recorder = context.createScriptProcessor(bufferSize, 1, 1);
  // specify the processing function
  recorder.onaudioprocess = this.recorderProcess;
  // connect stream to our recorder
  audioInput.connect(recorder);
  // connect our recorder to the previous destination
  recorder.connect(context.destination);
}
In my recorderProcess function, I now have an AudioProcessingEvent object that I can stream.
Currently I am emitting the audio event as a stream via a socket connection like so:
recorderProcess = (e) => {
  const left = e.inputBuffer.getChannelData(0);
  this.socket.emit('stream', this.convertFloat32ToInt16(left));
}
Is this the best or only way to do this? Is there a better way, such as using fs.createReadStream and then posting to an endpoint via Axios? As far as I can tell, that will only work with a file, as opposed to a continuous live stream.
Server
I have a very simple socket server running on top of Express. Currently I listen for the stream event and then emit that same input back out:
io.on('connection', (client) => {
  client.on('stream', (stream) => {
    client.emit('stream', stream);
  });
});
Not sure how scalable this is but if you have a better suggestion, I'm very open to it.
Client
Now this is where I am really stuck:
On my client I am listening for the stream event and want to listen to the stream as audio output in my browser. I have a function that receives the event but am stuck as to how I can use the arrayBuffer object that is being returned.
retrieveAudioStream = () => {
  this.socket.on('stream', (buffer) => {
    // ... how can I listen to the buffer as audio?
  });
}
Is the way I am streaming audio the best / only way I can upload to the node server?
How can I listen to the arrayBuffer object that is being returned on my client side?
Is the way I am streaming audio the best / only way I can upload to the node server?
Not really the best, but I have seen worse. It is not the only way either, though using WebSockets is considered reasonable here, since you want things to be "live" rather than sending an HTTP POST request every few seconds.
How can I listen to the arrayBuffer object that is being returned on my client side?
You can try BaseAudioContext.decodeAudioData to listen to the streamed data; the example in the documentation is pretty simple.
From the code snippets you provide, I assume you want to build something from scratch to learn how things work.
In that case, you can try the MediaStream Recording API along with a WebSocket server that sends the chunks to X clients so they can reproduce the audio, etc.
It would also make sense to invest time in the WebRTC API, to learn how to stream from one client to another.
Also take a look at the links below for some useful information.
(stackoverflow) Get live streaming audio from NodeJS server to clients
(github) video-conference-webrtc
twitch.tv tech stack article
rtc.io

NodeJs Microsoft Azure Storage SDK Download File to Stream

I just started working with the Microsoft Azure Storage SDK for NodeJS (https://github.com/Azure/azure-storage-node) and have already successfully uploaded my first PDF files to the cloud storage.
Now I am looking through the documentation for a way to download my files into a Node Buffer (so I don't have to use fs.createWriteStream). However, the documentation gives no example of how this works. All it says is "There are also several ways to download files. For example, getFileToStream downloads the file to a stream:", and the only example shown uses fs.createWriteStream, which I don't want to use.
I was also not able to find anything on Google that really helped me, so I was wondering if anybody has experience with this and could share a code sample?
The getFileToStream function needs a writable stream as a parameter. If you want all the data written to a Buffer instead of a file, you just need to create a custom writable stream.
const { Writable } = require('stream');

let bufferArray = [];
const myWriteStream = new Writable({
  write(chunk, encoding, callback) {
    bufferArray.push(...chunk);
    callback();
  }
});

myWriteStream.on('finish', function () {
  // all the data is stored inside this dataBuffer
  let dataBuffer = Buffer.from(bufferArray);
});
Then pass myWriteStream to the getFileToStream function:
fileService.getFileToStream('taskshare', 'taskdirectory', 'taskfile', myWriteStream, function(error, result, response) {
  if (!error) {
    // file retrieved
  }
});

write to a wav file while streaming node js

Is there a way to force .pipe on a stream to write to a file at every certain time or size interval?
Basically, I am using socket.io-stream; from the browser I am sending a buffer with audio, which I send with emit:
Browser
c.onaudioprocess = function(o) {
  var input = o.inputBuffer.getChannelData(0);
  stream1.write(new ss.Buffer(convertFloat32ToInt16(input)));
}
Server (nodejs)
var fileWriter = new wav.FileWriter('/tmp/demo.wav', {
  channels: 1,
  sampleRate: 44100,
  bitDepth: 16
});

ss(socket).on('client-stream-request', function(stream) {
  stream.pipe(fileWriter);
});
The problem I have is that the file demo.wav is only written when I finish the stream, i.e. when I stop the microphone. But I would like it to be written continuously, as I will be doing speech recognition with Google. If I call Google's speech recognition using pipe, the chunks are too small and Google is not able to recognize them. Any ideas?
Looking over the Node stream API, it looks like you should be able to add an options parameter to the pipe function.
Try
stream.pipe(fileWriter, { end: false });

Stream broadcast image through socket.io-stream

Right now I have code that sends an image or any other file from the browser to the server; the server then broadcasts the stream to all other sockets. It all works fine, but the part I have no clue how to implement is the receiving end in the browser.
Browser send:
var file = files[0];
var stream = ss.createStream();
ss(socket).emit('file', stream, { size: file.size });
ss.createBlobReadStream(file).pipe(stream);
Server:
ss(socket).on('file', function(stream, data) {
  stream.pipe(fs.createWriteStream('file.jpg'));
  var newStream = ss.createStream();
  ss(socket).emit('file', newStream);
  stream.pipe(newStream);
});
Browser Receive:
ss(socket).on('file', function(stream, data) {
  // Implement this
});
I'm not exactly sure what to do with the incoming stream in order to use the image I just sent, whether displaying it or any other task. I appreciate any help you can give me about this, or any pointers to anything wrong in this code. Thanks in advance.
You can get a Uint8Array by listening for data events on the browser's receiving stream. I still don't know what to do after that, though.
ss(socket).on('file', function(stream, data) {
  stream.on('data', function(uint8array) {
    // do something
  });
});
The other events I would expect to see don't seem to show up in the client received stream.
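Once those data events arrive, the first step is assembling the Uint8Array chunks into one contiguous array; in the browser the result could then be wrapped in a Blob and displayed via URL.createObjectURL (that last step is an assumption and is only sketched in comments, since it needs a DOM):

```javascript
// Concatenate an array of Uint8Array chunks into one contiguous array.
function concatChunks(chunks) {
  const total = chunks.reduce((n, c) => n + c.length, 0);
  const out = new Uint8Array(total);
  let offset = 0;
  for (const c of chunks) {
    out.set(c, offset);
    offset += c.length;
  }
  return out;
}

// In the browser one would then do something like (untested assumption):
//   const blob = new Blob([concatChunks(chunks)], { type: 'image/jpeg' });
//   img.src = URL.createObjectURL(blob);
```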

Use an Image Collection in Meteor

I'm building a Meteor app that communicates with a desktop client via HTTP requests with https://github.com/crazytoad/meteor-collectionapi
The desktop client generates images at irregular time intervals, and I want the Meteor site to display only the most recently generated image (ideally in real time). My initial idea was to send a PUT request to a singleton collection with the base64 image data, but I don't know how to turn that data into an image in the web browser. Note: the images are all pretty small (much less than 1 MB), so using GridFS should be unnecessary.
I realize this idea could be completely wrong, so if I'm completely on the wrong track, please suggest a better course of action.
You'll need to write middleware to serve your images with the proper MIME type. Example:
WebApp.connectHandlers.stack.splice(0, 0, {
  route: '/imageserver',
  handle: function(req, res, next) {
    // Assuming the path is /imageserver/:id, here you get the :id
    var iid = req.url.split('/')[1];
    var item = Images.findOne(iid);
    if (!item) {
      // Image not found
      res.writeHead(404);
      res.end('File not found');
      return;
    }
    // Image found
    res.writeHead(200, {
      'Content-Type': item.type,
    });
    res.write(new Buffer(item.data, 'base64'));
    res.end();
  },
});
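The base64 handling in that middleware is plain Buffer work (new Buffer(item.data, 'base64') above; Buffer.from is the modern equivalent). A quick round-trip showing what the desktop client's encoding and the server's decoding would look like, using placeholder bytes:

```javascript
// Encode raw image bytes to base64, as the desktop client might store them...
const raw = Buffer.from([0x89, 0x50, 0x4e, 0x47]); // PNG magic bytes
const stored = raw.toString('base64');

// ...and decode them back when serving (Buffer.from replaces new Buffer).
const served = Buffer.from(stored, 'base64');
console.log(served.equals(raw)); // prints true
```

The browser side then only needs an img tag pointing at the route, e.g. /imageserver/:id, and the middleware serves the decoded bytes with the correct Content-Type.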
