Web Audio - streaming a file from server to client - JavaScript

I'm trying to stream audio from a server containing an audio file to a client using BinaryJS. My code was inspired by the code in this question: Playing PCM stream from Web Audio API on Node.js
Here's what my server code looks like:
// create a BinaryServer using BinaryJS
var BinaryServer = require('binaryjs').BinaryServer;
// gotta be able to access the filesystem
var fs = require('fs');
// create our server listening on a specific port
var server = BinaryServer({port: 8080});
// do this when a client makes a request
server.on('connection', function(client){
    // get the audio file
    var file = fs.createReadStream(__dirname + '/JDillaLife.mp3');
    // convert to int16
    var len = file.length;
    var buf = new Int16Array(len);
    while(len--){
        buf[len] = data[len]*0xFFFF;
    }
    var Stream = client.send();
    Stream.write(buf.buffer);
    console.log("Server contacted and file sent");
});
console.log('Server running on port 8080');
And my client code:
var Speaker = require('speaker');
var speaker = new Speaker({
    channels: 2,
    bitDepth: 32,
    sampleRate: 44100,
    signed: true
});
var BinaryClient = require('binaryjs').BinaryClient;
var client = BinaryClient('http://localhost:8080');
client.on('stream', function(stream, meta){
    stream.on('data', function(data){
        speaker.write(data);
    });
});
This is a very rough draft and I'm almost certain it won't play nicely right away, but right now it throws an error when I run it that seems to take issue with the line var buf = new Int16Array(len);, and I'm not sure why this is happening. It says there's a "type error," but I'm not sure why that would happen when assigning a new object to an empty variable. I'm new to JS (and untyped languages in general), so is this an issue with what I'm assigning?

I think the issue here is that you're accessing file.length, while file is a Stream object, which I don't think has a length property. So what you're doing is basically saying
new Int16Array(undefined);
and hence the type error.
fs.createReadStream is documented here: https://nodejs.org/api/fs.html#fs_fs_createreadstream_path_options
You can use the Stream object to read data in chunks using stream.read(256), as documented here: https://nodejs.org/api/stream.html#stream_readable_read_size
Hopefully that gives you enough pointers to proceed!

Related

JavaScript play arraybuffer as audio. Need help to solve "decodeaudiodata unable to decode audio data"

I have a .NET Core WebSocket server that receives a live audio stream from client A, and I need to stream this live audio to client B (a browser). So I've received the byte array from client A and sent it to client B (the browser). (The byte array is correct, as I can convert it into a .wav file and play it without a problem.)
In client B (the browser), I try to decode the array buffer into an audio buffer so it can be sent to the output and played.
mediastreamhandler.SendArraySegToAllAsync is where I start to send the byte array from the server to client B. I use the send-to-all method first; later it will be modified to send data by matching WebSocket connection ID.
private async Task Echo(HttpContext context, WebSocket webSocket)
{
    Debug.WriteLine("Start Echo between Websocket server & client");
    var buffer = new byte[1024 * 4];
    WebSocketReceiveResult result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
    while (!result.CloseStatus.HasValue)
    {
        await webSocket.SendAsync(new ArraySegment<byte>(buffer, 0, result.Count), result.MessageType, result.EndOfMessage, CancellationToken.None);
        result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
        await mediastreamhandler.SendArraySegToAllAsync(new ArraySegment<byte>(buffer, 0, result.Count));
    }
    Debug.WriteLine("Close Echo");
    await webSocket.CloseAsync(result.CloseStatus.Value, result.CloseStatusDescription, CancellationToken.None);
}
I then receive the audio byte array through websocket.onmessage in JavaScript, and pass the byte array on to be decoded and played. But here it says "unable to decode data", while Firefox says the content format was unknown (do I need to reformat the byte array I receive?). The byte array itself is fine, because I've used the same bytes to create a .wav file locally and play it without any problem.
var ctx = new AudioContext();
function playSound(arrBuff) {
    var myAudioBuffer;
    var src = ctx.createBufferSource();
    ctx.decodeAudioData(arrBuff, function (buffer) {
        myAudioBuffer = buffer;
    });
    src.buffer = myAudioBuffer;
    src.connect(ctx.destination);
    src.start();
}
I then tried another method to decode and play the audio; this time it played some white noise instead of streaming the audio from client A.
var ctx = new AudioContext();
function playSound(arrBuff) {
    var myAudioBuffer;
    var src = ctx.createBufferSource();
    myAudioBuffer = ctx.createBuffer(1, arrBuff.byteLength, 8000);
    var nowBuffering = myAudioBuffer.getChannelData(0);
    for (var i = 0; i < arrBuff.byteLength; i++) {
        nowBuffering[i] = arrBuff[i];
    }
    src.buffer = myAudioBuffer;
    src.connect(ctx.destination);
    src.start();
}
I think I really need some help here, guys. I've been trying to play this array buffer for weeks and still haven't had a breakthrough. I'm stuck, and not sure what I've done wrong. Could you kindly guide me, or suggest any other approach? Thanks very much in advance, I really mean it.
decodeAudioData() requires complete files, so it can't be used to decode partial chunks of data as they are received from a websocket. If you can stream Opus audio files over your websocket, you can play back with an available WebAssembly decoder. See:
https://fetch-stream-audio.anthum.com/
https://github.com/AnthumChris/fetch-stream-audio
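If a complete file is eventually available (for example, a short clip sent over several websocket messages), a hedged workaround is to buffer every chunk, concatenate, and only then decode. Note also that in the question's first snippet, src.buffer is assigned outside the decodeAudioData callback, before decoding has finished; playback must start inside the callback. A sketch (the concatenation half is plain JavaScript; the playback half is browser-only):

```javascript
// Collect every ArrayBuffer chunk, then concatenate into one ArrayBuffer.
function concatChunks(chunks) {
  var total = chunks.reduce(function (n, c) { return n + c.byteLength; }, 0);
  var out = new Uint8Array(total);
  var offset = 0;
  chunks.forEach(function (c) {
    out.set(new Uint8Array(c), offset);
    offset += c.byteLength;
  });
  return out.buffer;
}

// Browser-only half: decode the complete buffer and start playback inside
// the async callback (setting src.buffer outside it runs too early).
function playComplete(ctx, completeBuffer) {
  ctx.decodeAudioData(completeBuffer, function (audioBuffer) {
    var src = ctx.createBufferSource();
    src.buffer = audioBuffer;
    src.connect(ctx.destination);
    src.start();
  });
}
```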
I've solved the issue months ago, just here to post my solution.
Steps:
Server receives the payload string from Twilio.
Send the payload string from the server to the client (browser).
public async Task SendMessageAsync(WebSocket socket, string message)
{
    if (socket.State != WebSocketState.Open)
        return;
    await socket.SendAsync(buffer: new ArraySegment<byte>(array: Encoding.ASCII.GetBytes(message),
                                                          offset: 0,
                                                          count: message.Length),
                           messageType: WebSocketMessageType.Text,
                           endOfMessage: true,
                           cancellationToken: CancellationToken.None);
}
Add a WAV header to the payload string on the client side before playing the audio.
function playSound(payloadBase64) {
    /* You can generate the wav header here --> https://codepen.io/mxfh/pen/mWLMrJ */
    var Base64wavheader = "UklGRgAAAABXQVZFZm10IBIAAAAHAAEAQB8AAEAfAAABAAgAAABmYWN0BAAAAAAAAABkYXRh";
    var audio = new Audio('data:audio/wav;base64,' + Base64wavheader + payloadBase64);
    audio.play();
}
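For anyone wondering what that base64 header actually contains, decoding it (Node.js shown here, just for inspection) reveals a RIFF/WAVE header declaring format code 7 (mu-law), one channel, 8000 Hz, which matches the audio format Twilio media streams deliver:

```javascript
// Decode the base64 WAV header and read a few fields out of it.
var header = Buffer.from(
  'UklGRgAAAABXQVZFZm10IBIAAAAHAAEAQB8AAEAfAAABAAgAAABmYWN0BAAAAAAAAABkYXRh',
  'base64');

console.log(header.slice(0, 4).toString('ascii'));  // "RIFF"
console.log(header.slice(8, 12).toString('ascii')); // "WAVE"
console.log(header.readUInt16LE(20));               // 7 (mu-law format code)
console.log(header.readUInt16LE(22));               // 1 (channels)
console.log(header.readUInt32LE(24));               // 8000 (sample rate)
```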

Socket.io server crash after client reload/leave page

I have a strange situation. I'm writing a web application for streaming RTSP from IP cameras. Everything is fine, except that when a client watching the stream reloads or leaves the page, the server crashes (not always).
Here's the crash output:
zlib.js:499
var newReq = self._handle.write(flushFlag,
^
TypeError: Cannot read property 'write' of null
at Zlib.callback (zlib.js:499:33)
Server side:
const app = require('express')();
const child_process = require('child_process');
const server = require('http').Server(app);
const io = require('socket.io')(server);
io.of('/stream').on('connection', function(socket) {
    let room;
    socket.on('startStream', function(camuid) {
        room = camuid;
        let stream;
        socket.join(room);
        if (processes.indexOf(camuid) == -1) {
            processes.push(camuid);
            logic.Onvif.getStreamLink(camuid).then(rtsp_link => {
                stream = child_process.spawn("ffmpeg", [
                    '-y',
                    '-re',
                    "-i", rtsp_link,
                    '-f', 'mjpeg',
                    '-'
                ], {
                    detached: false
                });
                stream.stdout.on('data', (data) => {
                    const frame = Object.values(new Uint8Array(data));
                    socket.nsp.to(room).emit('camfeed', frame);
                });
            }).catch((e) => {
                socket.nsp.to(room).emit('streamError', true);
            });
        }
    });
});
On the client side:
const io = require('socket.io-client');
socket = io.connect(`http://${location.hostname}:8088/stream`);
socket.on('connect', function() {
    socket.emit('startStream', $this.$props.id);
});
socket.on('camfeed', function (data) {
    // Here the data is displayed on canvas
});
I tried to find out which part of the server code causes this behaviour, with no luck so far.
I tried putting the functions, listeners and emitters in try{}catch and console.log-ing every step to see where it stops, but the output is not always the same.
So I started looking for a solution and found a GitHub issue saying that zlib is responsible for compressing data to gzip before sending, and that the error is caused by trying to process data that doesn't exist. On the other hand, I don't have zlib installed as a dependency, and as far as I know the zlib package isn't used anymore because Node has this functionality built in. I searched node_modules and found the minizlib package, but have no clue whose dependency that is.
I believe you need to use
socket.on("disconnect", function(){
    // delete all user data here
});
so it stops using the objects you created.
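In this case, "the objects you created" most importantly includes the spawned ffmpeg process: if it keeps emitting 'data' events after the socket's underlying compression handle is torn down, you can get exactly this kind of null-handle crash. A hedged sketch of the bookkeeping (the streams registry and helper names are illustrative, not from the question):

```javascript
// Track one ffmpeg child process per room so it can be stopped on disconnect.
var streams = {};

function registerStream(room, proc) {
  streams[room] = proc;
}

// Kill and forget the process; returns true if something was cleaned up.
function cleanupStream(room) {
  var proc = streams[room];
  if (proc) {
    proc.kill('SIGTERM'); // stop ffmpeg so nothing writes to a dead socket
    delete streams[room];
    return true;
  }
  return false;
}

// Inside the connection handler:
// socket.on('disconnect', function () { cleanupStream(room); });
```

You would call registerStream(room, stream) right after child_process.spawn, so the disconnect handler always has something to kill.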

Download a JSON file from a link using Node.js

I need to write an application with Node.js which, given a link to a JSON file, e.g. http://data.phishtank.com/data/online-valid.json (the link doesn't open the file, it starts a download), simply downloads the object and then prints it out. How can this be achieved? This is what I have so far, and it doesn't seem to be working:
var checkIfPhishing = function(urlToPrint){
    var http = require('http');
    var fs = require('fs');
    var file = fs.createWriteStream("SiteObject.json");
    var request = http.get(urlToPrint, function(response) {
        response.pipe(file);
    });
    var siteObj = fs.readFileSync("SiteObject.json");
    console.log(siteObj);
};
Thank you!
You cannot mix up async and sync reads and writes.
In your case you start streaming the data from the other server to yours, but then you immediately start the sync read. This blocks the thread, so the stream will only be processed after you've read your 0-byte file...
So you need to store the stream's data in a variable first, then log it once the stream finishes.
So simply do something like this:
var data = "";
var request = http.get(urlToPrint, function(response) {
    response.on("data", chunk => data += chunk)
            .on("end", () => console.log(data));
});
Store the asynchronously provided chunks of the stream in a variable; when the stream ends, log that string.
If you want to store it too:
var http = require('http');
var fs = require('fs');

function checkIfPhishing(urlToPrint){
    var file = fs.createWriteStream("SiteObject.json");
    var request = http.get(urlToPrint, function(response) {
        file.on("finish", function(){
            console.log(fs.readFileSync("SiteObject.json", {encoding: "utf8"}));
        });
        response.pipe(file);
    });
}
This is exactly like your code, but it waits for the stream to finish before reading synchronously...
However, note that the sync read will slow the whole thing down, so you might stream directly to the console or a browser instead.

createBlockBlob and commitBlobBlocks create empty files in BlobStorage

I'm developing a web app that can upload large files into Azure Blob Storage.
As a backend, I am using Windows Azure Mobile Services (the web app will generate content for mobile devices) in Node.js.
My client can successfully send chunks of data to the backend, and everything looks fine, but at the end the uploaded file is empty. The data upload was prepared by following this tutorial: it works perfectly when the file is small enough to be uploaded in a single request. The process fails when the file needs to be broken into chunks. It uses the ReadableStreamBuffer from the tutorial.
Can somebody help me?
Here the code:
Back-end : createBlobBlockFromStream
[...]
//Get references
var azure = require('azure');
var qs = require('querystring');
var appSettings = require('mobileservice-config').appSettings;
var accountName = appSettings.STORAGE_NAME;
var accountKey = appSettings.STORAGE_KEY;
var host = accountName + '.blob.core.windows.net';
var container = "zips";
//console.log(request.body);
var blobName = request.body.file;
var blobExt = request.body.ext;
var blockId = request.body.blockId;
var data = new Buffer(request.body.data, "base64");
var stream = new ReadableStreamBuffer(data);
var streamLen = stream.size();
var blobFull = blobName+"."+blobExt;
console.log("BlobFull: "+blobFull+"; id: "+blockId+"; len: "+streamLen+"; "+stream);
var blobService = azure.createBlobService(accountName, accountKey, host);
//console.log("blockId: "+blockId+"; container: "+container+";\nblobFull: "+blobFull+"streamLen: "+streamLen);
blobService.createBlobBlockFromStream(blockId, container, blobFull, stream, streamLen,
    function(error, response){
        if(error){
            request.respond(statusCodes.INTERNAL_SERVER_ERROR, error);
        } else {
            request.respond(statusCodes.OK, {message : "block created"});
        }
    });
[...]
Back-end: commitBlobBlock
[...]
var azure = require('azure');
var qs = require('querystring');
var appSettings = require('mobileservice-config').appSettings;
var accountName = appSettings.STORAGE_NAME;
var accountKey = appSettings.STORAGE_KEY;
var host = accountName + '.blob.core.windows.net';
var container = "zips";
var blobName = request.body.file;
var blobExt = request.body.ext;
var blobFull = blobName+"."+blobExt;
var blockIdList = request.body.blockList;
console.log("blobFull: "+blobFull+"; blockIdList: "+JSON.stringify(blockIdList));
var blobService = azure.createBlobService(accountName, accountKey, host);
blobService.commitBlobBlocks(container, blobFull, blockIdList, function(error, result){
    if(error){
        request.respond(statusCodes.INTERNAL_SERVER_ERROR, error);
    } else {
        request.respond(statusCodes.OK, result);
        blobService.listBlobBlocks(container, blobFull);
    }
});
[...]
The second method returns the correct list of blockIds, so I think the second part of the process works fine. I think it is the first method that fails to write the data inside the blocks, as if it creates empty blocks.
On the client side, I read the file as an ArrayBuffer using the FileReader JS API.
Then I convert it into a Base64-encoded string using the following code. This approach works perfectly if I create the blob in a single call, which is fine for small files.
[...]
// data contains the ArrayBuffer read by the FileReader API
var requestData = new Uint8Array(data);
var binary = "";
for (var i = 0; i < requestData.length; i++) {
    binary += String.fromCharCode(requestData[i]);
}
[...]
Any idea?
Thank you,
Ric
Which version of the Azure Storage Node.js SDK are you using? It looks like you might be using an older version; if so I would recommend upgrading to the latest (0.3.0 as of this writing). We’ve improved many areas with the new library, including blob upload; you might be hitting a bug that has already been fixed. Note that there may be breaking changes between versions.
Download the latest Node.js Module (code is also on Github)
https://www.npmjs.org/package/azure-storage
Read our blog post: Microsoft Azure Storage Client Module for Node.js v. 0.2.0 http://blogs.msdn.com/b/windowsazurestorage/archive/2014/06/26/microsoft-azure-storage-client-module-for-node-js-v-0-2-0.aspx
If that’s not the issue, can you check a Fiddler trace (or equivalent) to see if the raw data blocks are being sent to the service?
Not too sure if you're still suffering from this problem, but I was experiencing the exact same thing and came across this while looking for a solution. Well, I found one and thought I'd share.
My problem was not with how I push the block but with how I committed it. My little proxy server had no knowledge of prior commits; it just pushes the data it's sent and commits it. The trouble is, I wasn't providing the commit message with the previously committed blocks, so it was overwriting them with the current commit each time.
So my solution:
var opts = {
    UncommittedBlocks: [IdOfJustCommitedBlock],
    CommittedBlocks: [IdsOfPreviouslyCommittedBlocks]
};

blobService.commitBlobBlocks('containerName', 'blobName', opts, function(e, r){});
For me, the bit that broke everything was the format of the opts object: I wasn't providing an array of previously committed block names. It's also worth noting that I had to base64-decode the existing block names, as:
blobService.listBlobBlocks('containerName', 'fileName', 'type IE committed', fn)
returns an object for each block with the name base64-encoded.
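A small sketch of that decoding step (assuming the listing callback yields blocks with a Name property; the exact shape may differ between SDK versions):

```javascript
// Decode the base64-encoded block names returned by listBlobBlocks so they
// can be reused in the CommittedBlocks array of the next commit.
function decodeBlockNames(blocks) {
  return blocks.map(function (block) {
    return Buffer.from(block.Name, 'base64').toString('utf8');
  });
}
```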
Just for completeness, here's how I push my blocks (req is from the Express route):
var blobId = blobService.getBlockId('blobName', 'lengthOfPreviouslyCommittedArray + 1 as Int');
var length = req.headers['content-length'];
blobService.createBlobBlockFromStream(blobId, 'containerName', 'blobName', req, length, fn);
Also, with the upload I had a strange issue where the content-length header caused it to break, so I had to delete it from the req.headers object.
Hope this helps and is detailed enough.

Play audio from client when message is received from socket.io - node.js

I've searched for a while for a solution to this problem, but haven't found much.
My goal is to receive a message from a UDP client, which the server receives and forwards to a web client, which plays an audio clip each time a message is received. However, for some reason the audio will not play. If I open the page straight from my directory, the audio plays, but if I try to access it through localhost it fails to load. Does anyone know of a solution?
Here is the client side javascript.
var mySound = new Audio('/public/audio/Bloom.mp3');
mySound.load();
var socket = io.connect('http://localhost');
socket.on('message', function(data){
    console.log(data);
    $('#content').text(data);
    mySound.play();
    //document.getElementById('audiotag1').play();
});
This page is served by server.js, a Node.js file using socket.io and Express. I don't receive any errors from my console.log.
Here is the server.js
var app = require('express')()
  , server = require('http').Server(app)
  , io = require('socket.io')(server)
  , dgram = require('dgram');

var httpPort = 1234;
var udpPort = 5000;

server.listen(httpPort);
app.use(server.express.static( __dirname + '/public'));

app.get('/', function(request, response){
    var ipAddress = request.socket.remoteAddress;
    console.log("New express connection from: " + ipAddress);
    response.sendfile(__dirname + '/public/index.html');
});

var udpSocket = dgram.createSocket('udp4', function(msgBuffer){
    var jsonMessage = JSON.parse(msgBuffer);
    io.sockets.emit('message', JSON.stringify(jsonMessage));
});
udpSocket.bind(udpPort, '127.0.0.1');
You can go to this link to see the error Chrome gives:
http://postimg.org/image/xkv7a2kwb/
Does anyone have any ideas on how to fix this?
This error usually occurs when you have incorrect Express config for static files.
From your client-side JavaScript I can see that you store your static files in public folder.
So somewhere on the server-side you should have something like this:
app.use('/public', express.static('/path/to/public'));
This way, an http://localhost/public/audio/Bloom.mp3 request will be served with the /path/to/public/audio/Bloom.mp3 file.
Caveat:
Since Express 4 there is "no more app.use(app.router)" and "All routing methods will be added in the order in which they appear". This means that you have to put the above line of code before first app.get(...), app.post(...), etc.
Hope this information will help someone.
We can upload the tone into cloud storage and then use that source like below:
playAudio() {
    let audio = new Audio();
    audio.src = 'https://res.cloudinary.com/Apple_Notification.mp3';
    audio.load();
    audio.play();
}
Trigger this function on the client side whenever a new message comes in.