Stream broadcast image through socket.io-stream - javascript

Right now I have code that sends an image or any other file from the browser to the server; the server then broadcasts the stream to all other sockets. It all works fine, but the part I have no clue how to implement is the receiving end in the browser.
Browser send:
var file = files[0]
var stream = ss.createStream()
ss(socket).emit('file', stream, { size: file.size })
ss.createBlobReadStream(file).pipe(stream)
Server:
ss(socket).on('file', function(stream, data) {
  stream.pipe(fs.createWriteStream('file.jpg'))
  var newStream = ss.createStream()
  ss(socket).emit('file', newStream)
  stream.pipe(newStream)
})
Browser Receive:
ss(socket).on('file', function(stream, data) {
  // Implement this
})
I'm not exactly sure what to do with the incoming stream in order to use the image I just sent, whether that's displaying it or any other task. I'd appreciate any help you can give me, or a pointer to anything wrong in this code. Thanks in advance.

You can get a Uint8Array by listening for data events on the browser receive side. I still don't know what to do with it then, though.
ss(socket).on('file', function(stream, data) {
  stream.on("data", function(uint8array) { /* do something */ })
})
The other events I would expect to see don't seem to show up in the client's received stream.
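One approach (a sketch, not tested against socket.io-stream's exact chunk types) is to buffer the data events and turn the result into a Blob once the stream ends; the collectChunks helper and the image/jpeg MIME type are my assumptions, not part of the library:

```javascript
// Buffer every chunk from the stream, then hand a Blob to a callback.
// Assumes each 'data' event delivers a Uint8Array, as observed above.
function collectChunks(stream, onDone) {
  var chunks = [];
  stream.on('data', function (chunk) {
    chunks.push(chunk);
  });
  stream.on('end', function () {
    // The MIME type is an assumption; send it in the metadata if it varies.
    onDone(new Blob(chunks, { type: 'image/jpeg' }));
  });
}

// Browser-only wiring: display the reassembled image.
if (typeof ss !== 'undefined') {
  ss(socket).on('file', function (stream, data) {
    collectChunks(stream, function (blob) {
      var img = document.createElement('img');
      img.src = URL.createObjectURL(blob);
      document.body.appendChild(img);
    });
  });
}
```

The same Blob could just as well be saved via an anchor element instead of displayed, depending on the task.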

Related

How to download csv file from server using stream

I'm trying to download a huge CSV file from the server, which is being generated on the fly.
I'm returning ResponseEntity<StreamingResponseBody> asynchronously, so as soon as I have part of my data I return it.
this is my controller code:
StreamingResponseBody streamingResponseBody = out -> {
    csvService.exportToCsvBySessionId(applicationId, sessionIdsInRange, out, tags);
};
return ResponseEntity.ok()
    .headers(csvService.getHeaders(CSV_FILE_NAME))
    .body(streamingResponseBody);
In the headers I'm adding:
produces: text/csv;
Content-Disposition: attachment; filename=%s.csv;
On the client side I'm using the Aurelia framework and sending the request using HttpClient (fetch):
public getFraudAlertsCsv() {
  this.serverProxy.fetch(`/sessions/fraud/csv`)
    .then(response => {
      this.logger.debug('waiting for response');
      return response.blob();
    })
    .then((blob: Blob) => this.downloadCsv(blob, `Fraud_Alerts_${new Date()}.csv`))
    .catch((err) => {
      this.logger.error("Failed to get appSessionId sessions csv file", err);
    });
}
Even though I can see in the network panel that my request is starting to receive a response (its size increases), there is no popup window asking to download the file, and the log doesn't print "waiting for response".
Instead, the whole file is downloaded only once the entire response has arrived (when the server closes the stream).
I want to show the progress of the download. How can I do it?
I think fetch doesn't support a progress API yet, so you may want to use traditional XHR and its progress event:
xhr.onprogress = function updateProgress(oEvent) {
  if (oEvent.lengthComputable) {
    var percentComplete = oEvent.loaded / oEvent.total * 100;
    // ...
  } else {
    // Unable to compute progress information since the total size is unknown
  }
}
Note: code example taken from MDN page https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/Using_XMLHttpRequest#Monitoring_progress
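For what it's worth, fetch does expose the response body as a ReadableStream, so progress can also be tracked without XHR by reading chunks manually. A sketch (it relies on the server sending a Content-Length header; otherwise total stays 0):

```javascript
// Read the response incrementally, reporting progress per chunk.
async function fetchWithProgress(url, onProgress) {
  const response = await fetch(url);
  const total = Number(response.headers.get('Content-Length')) || 0;
  const reader = response.body.getReader();
  const chunks = [];
  let loaded = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    loaded += value.length;
    onProgress(loaded, total); // total is 0 when the length is unknown
  }
  return new Blob(chunks);
}
```

The resulting Blob can then be handed to whatever download helper the page already uses.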

How to create a live media stream with Javascript

I want to create a live audio stream from one device to a Node server, which can then broadcast that live feed to several front ends.
I have searched extensively for this and have really hit a wall so hoping somebody out there can help.
I am able to get my audio input from the window.navigator.getUserMedia API.
getAudioInput() {
  const constraints = {
    video: false,
    audio: { deviceId: this.state.deviceId ? { exact: this.state.deviceId } : undefined },
  };
  window.navigator.getUserMedia(
    constraints,
    this.initializeRecorder,
    this.handleError
  );
}
This then passes the stream to the initializeRecorder function, which uses the AudioContext API to create a MediaStreamSource:
initializeRecorder = (stream) => {
  const audioContext = window.AudioContext;
  const context = new audioContext();
  const audioInput = context.createMediaStreamSource(stream);
  const bufferSize = 2048;
  // create a javascript node
  const recorder = context.createScriptProcessor(bufferSize, 1, 1);
  // specify the processing function
  recorder.onaudioprocess = this.recorderProcess;
  // connect stream to our recorder
  audioInput.connect(recorder);
  // connect our recorder to the previous destination
  recorder.connect(context.destination);
}
In my recorderProcess function, I now have an AudioProcessingEvent object which I can stream.
Currently I am emitting the audio event as a stream via a socket connection like so:
recorderProcess = (e) => {
  const left = e.inputBuffer.getChannelData(0);
  this.socket.emit('stream', this.convertFloat32ToInt16(left))
}
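The convertFloat32ToInt16 helper isn't shown in the question; a common implementation (a sketch; returning the underlying ArrayBuffer is my assumption) clamps each sample and scales it to the Int16 range:

```javascript
// Clamp each Float32 sample to [-1, 1] and scale to the Int16 range.
function convertFloat32ToInt16(float32Array) {
  const int16 = new Int16Array(float32Array.length);
  for (let i = 0; i < float32Array.length; i++) {
    const s = Math.max(-1, Math.min(1, float32Array[i]));
    int16[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return int16.buffer; // socket.io can transmit ArrayBuffers as-is
}
```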
Is this the best or only way to do this? Would it be better to use fs.createReadStream and then post to an endpoint via Axios? As far as I can tell, that will only work with a file, as opposed to a continuous live stream.
Server
I have a very simple socket server running on top of Express. Currently I listen for the stream event and then emit that same input back out:
io.on('connection', (client) => {
  client.on('stream', (stream) => {
    client.emit('stream', stream)
  });
});
Not sure how scalable this is but if you have a better suggestion, I'm very open to it.
Client
Now this is where I am really stuck:
On my client I am listening for the stream event and want to listen to the stream as audio output in my browser. I have a function that receives the event but am stuck as to how I can use the arrayBuffer object that is being returned.
retrieveAudioStream = () => {
  this.socket.on('stream', (buffer) => {
    // ... how can I listen to the buffer as audio
  })
}
Is the way I am streaming audio the best / only way I can upload to the node server?
How can I listen to the arrayBuffer object that is being returned on my client side?
Is the way I am streaming audio the best / only way I can upload to the node server?
Not really the best, but I have seen worse. It's not the only way either, but using WebSockets is considered OK here, since you want things to be "live" rather than sending an HTTP POST request every 5 seconds.
How can I listen to the arrayBuffer object that is being returned on my client side?
You can try BaseAudioContext.decodeAudioData to listen to the streamed data; the example there is pretty simple.
From the code snippets you provided, I assume you want to build something from scratch to learn how things work.
In that case, you can try the MediaStream Recording API along with a WebSocket server that sends the chunks to X clients so they can reproduce the audio, etc.
It would also make sense to invest time into the WebRTC API, to learn how to stream from one client to another.
Also take a look at the links below for some useful information.
(stackoverflow) Get live streaming audio from NodeJS server to clients
(github) video-conference-webrtc
twitch.tv tech stack article
rtc.io
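One caveat: decodeAudioData expects encoded audio (WAV, MP3, etc.), while the recorder in the question emits raw Int16 PCM. A sketch for playing those raw chunks directly converts each one back to Float32 and wraps it in an AudioBuffer; the mono 44.1 kHz assumption matches the sender:

```javascript
// Convert Int16 PCM samples back to the Float32 range [-1, 1].
function int16ToFloat32(int16Array) {
  const float32 = new Float32Array(int16Array.length);
  for (let i = 0; i < int16Array.length; i++) {
    float32[i] = int16Array[i] / 0x8000;
  }
  return float32;
}

// Browser-only: wrap a chunk in an AudioBuffer and play it.
// Sample rate and channel count are assumptions matching the sender.
function playChunk(context, arrayBuffer) {
  const samples = int16ToFloat32(new Int16Array(arrayBuffer));
  const buffer = context.createBuffer(1, samples.length, 44100);
  buffer.getChannelData(0).set(samples);
  const source = context.createBufferSource();
  source.buffer = buffer;
  source.connect(context.destination);
  source.start();
}
```

Note that naively starting each chunk on arrival produces gaps and clicks; a real player schedules chunks back-to-back against context.currentTime.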

Using nodejs to stream microphone from one client to others

I am trying to take audio recorded by one client and send it to other connected clients in realtime. The objective being a sort of "broadcast". I have read many explanations to help guide me, with no luck.
Currently I have the audio being written to file like so:
var fileWriter = new wav.FileWriter(outFile, {
  channels: 1,
  sampleRate: 44100,
  bitDepth: 16
});

client.on('stream', function(stream, meta) {
  stream.pipe(fileWriter);
  stream.on('end', function() {
    fileWriter.end();
    console.log('wrote to file ' + outFile);
  });
});
As you can see, I'm currently using BinaryJS to send the audio data to the server, at which point I pipe the stream to the FileWriter.
I then tried to read the file and pipe it to the response
app.get('/audio', function(req, res) {
  fs.createReadStream(__dirname + '/demo.wav').pipe(res);
})
As I'm sure you've already noticed, this doesn't work. I thought that (maybe) while the file was being constructed, it would also play back any content appended to it. That didn't happen; it played up to the point at which the client requested the file, then ended.
I am unsure how to pass the stream data in real time to the clients requesting it. Being completely new to Node.js, I am not sure of the methods and terms used for this procedure and have been unable to find any direct working examples.
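For the real-time part, one common pattern (a sketch; the event name and the socket.io-style API are my assumptions, since BinaryJS exposes a different interface) is to relay each incoming chunk to every other connected client instead of going through a file:

```javascript
// Relay a chunk to every connected client except the sender.
function broadcastChunk(clients, sender, chunk) {
  for (const client of clients) {
    if (client !== sender) client.emit('audio-chunk', chunk);
  }
}

// Wiring sketch for a socket.io-style server:
// io.on('connection', (socket) => {
//   socket.on('audio-chunk', (chunk) => {
//     socket.broadcast.emit('audio-chunk', chunk); // everyone but the sender
//   });
// });
```

Each receiving client then queues the chunks for playback as they arrive, rather than waiting for a complete file.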

Save json object as a text file

I am using an API for a Twitch.tv streaming bot called DeepBot.
Here is the link to it on github https://github.com/DeepBot-API/client-websocket
My goal is to create a text document listing all the information pulled from the bot using the command api|get_users|. The bot's response is always a json object. How can I take the json object from the bot and save it as a text file?
Edit: My code
var WebSocket = require('ws');
var ws = new WebSocket('ws://Ip and Port/');

ws.on('open', function () {
  console.log('sending API registration');
  ws.send('api|register|SECRET');
});

ws.on('close', function close() {
  console.log('disconnected');
});

ws.on('message', function (message) {
  console.log('Received: ' + message);
});

ws.on('open', function () {
  ws.send('api|get_users|');
});
Well, that depends on what your setup is. You posted this under javascript, so I guess you are either:
using a browser to make the websocket connection, in which case there is no direct way to save a file on the client, but in HTML5 you can store key/value pairs with local storage;
using Node.js (server-side JavaScript), in which case the code is as below;
or some other setup that I can't guess, in which case you might tell us a little more about it.
In browser with HTML5 capabilities:
// where msg is an object returned from the API
localStorage.setItem('Some key', JSON.stringify(msg));
In Node JS
var fs = require("fs"); // fs is built into Node.js, no install needed
// where msg is an object returned from the API
fs.writeFile("some-file.json", JSON.stringify(msg), function (err) {
  if (err) throw err;
});
Edit: OK, thanks for clearing that up.
I believe Blag's solution is the way to go.
Good luck with your project!
If it's for a client-side JS save:
Create a file in memory for user to download, not through server
and
Convert JS object to JSON string
are what you need. (I haven't tested it, but it will look like this:)
var j = {"name":"binchen"};
var s = JSON.stringify(j);
window.location = 'data:text/plain;charset=utf-8,'+encodeURIComponent(s);
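A data: URL gets unwieldy for large objects. An alternative sketch uses a Blob and a temporary anchor element with the download attribute; the function names and filename here are illustrative:

```javascript
// Serialize an object into a JSON Blob.
function toJsonBlob(obj) {
  return new Blob([JSON.stringify(obj, null, 2)], { type: 'application/json' });
}

// Browser-only: trigger a named download for the Blob.
function downloadJson(obj, filename) {
  const url = URL.createObjectURL(toJsonBlob(obj));
  const a = document.createElement('a');
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url);
}

// usage (in a browser): downloadJson(msg, 'users.json');
```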

Want to send images using node.js and socket.io in android

I am creating a chat app between two users. I can already do simple text chat between different users using node.js and socket.io. The problem now is that I have to send images in the chat application, and after searching for a whole day I have not been able to find a working node.js example for sending an image in a chat app. So I want to know: is it possible to send images using node.js? Here is my simple node.js code for sending a text message from one user to another:
socket.on('privateMessage', function(data) {
  socket.get('name', function (err, name) {
    if (!err) {
      // Get the user from the list by name to obtain its socket,
      // then emit the newPrivateMessage event. To be clear: every
      // client connection has its own unique Socket object, and we
      // need that object to communicate with each client. The socket
      // variable in this scope is the sender, but the receiver's
      // socket is not known here; get it from the list saved when
      // the connectMe handler was called by each user.
      onLine[data.to].emit('newPrivateMessage', {from: name, msg: data.msg, type: 'Private Msg'})
    }
  });
});
You can use the Base64 version of your image and send it like this:
onLine[data.to].emit('newPrivateMessage',{from:name, img:data.img.toString('base64'), type:'Private Msg'})
...and then on the client side receive it and create an image:
socket.on("newPrivateMessage", function(data) {
  if (data.img) {
    var img = new Image();
    img.src = 'data:image/jpeg;base64,' + data.img;
    // Do whatever you want with your image.
  }
});
UPDATE
The following is a snippet taken from the link in my comment below. As you can see, it takes the image from the input, reads it, and sends it to the server. After that you can send the same data from the server to another client.
For the full example, please read the article.
JavaScript (client)
...
$('#imageFile').on('change', function(e) {
  var file = e.originalEvent.target.files[0],
      reader = new FileReader();
  reader.onload = function(evt) {
    var jsonObject = {
      'imageData': evt.target.result
    };
    // send a custom socket message to server
    socket.emit('user image', jsonObject);
  };
  reader.readAsDataURL(file);
});
...
HTML
...
Image file: <input type="file" id="imageFile" /><br/>
...
UPDATE 2
Here is one example I have found:
Java (client)
File file = new File("path/to/the/image");
try {
  FileInputStream imageInFile = new FileInputStream(file);
  byte imageData[] = new byte[(int) file.length()];
  imageInFile.read(imageData);
  // Converting Image byte array into Base64 String
  String imageDataString = Base64.encodeBase64URLSafeString(imageData);
} catch (...) {
  ...
}
The above snippet shows how to read the file and encode the data into a base64 string. So then you can send it just like a string (I assume).
Here is the complete example: How to convert Image to String and String to Image in Java?
Also I have found encodeToString function of Base64.Encoder (java.util package), which you can use.
The easiest way I can think of is to simply Base64-encode the image and send it through the text pipe. You would need to distinguish text and image messages with header information (maybe send a JSON object?).
