When trying to send document scans as binary data over a WebSocket, the first block of code below works fairly well. Since the first byte contains information about what exactly to do with the blob, the image data starts at an offset of one byte; and because blob.slice() appears to return a copy of the original blob rather than simply reading from it, this is probably not the most efficient way to process the blob, since it copies the entire blob minus one byte.
socket.onmessage = function(evt) {
  if (evt.data instanceof Blob) {
    // The first byte is a command byte; read it via a one-byte slice.
    evt.data.slice(0, 1).arrayBuffer()
      .then((b) => {
        let v = new DataView(b),
            r = v.getUint8(0);
        // Do something based on r and then display the scan.
        let objectURL = URL.createObjectURL(evt.data.slice(1));
        imgscan.onload = () => {
          URL.revokeObjectURL(objectURL);
        };
        imgscan.src = objectURL;
      });
  }
};
If the WebSocket's binaryType is changed to arraybuffer, it's a bit easier to read the first byte, and no copy appears to be made; but I don't understand how to get the image data out of the buffer in order to display it. That is, I don't see which method to apply to the DataView to get the raw binary data of the image. Would you please explain or point me to the correct documentation? Thank you.
This SO question seems similar but wasn't helpful to me.
socket.onmessage = function(evt) {
  if (evt.data instanceof ArrayBuffer) {
    let v = new DataView(evt.data),
        r = v.getUint8(0);
    // How to get the raw binary image data out of
    // the array buffer starting at the first byte?
  }
};
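One possible sketch, assuming the remaining bytes are the image itself and reusing the imgscan element from the first snippet (the image/png type is an assumption): create a typed-array view over the same buffer starting at byte 1 (no copy), then wrap it in a Blob for display.
socket.onmessage = function(evt) {
  if (evt.data instanceof ArrayBuffer) {
    let v = new DataView(evt.data),
        r = v.getUint8(0);
    // View over the same buffer, skipping the command byte; no copy is made here.
    let imageBytes = new Uint8Array(evt.data, 1);
    // Wrapping the view in a Blob (the Blob constructor does copy) to get an object URL.
    let blob = new Blob([imageBytes], { type: 'image/png' }); // MIME type is an assumption
    let objectURL = URL.createObjectURL(blob);
    imgscan.onload = () => {
      URL.revokeObjectURL(objectURL);
    };
    imgscan.src = objectURL;
  }
};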
I've got a blob of audio data that is confirmed to play in the browser, but it fails to play after storing, retrieving, and converting that same data. I've tried a few methods without success, each time getting the error:
Uncaught (in promise) DOMException: Failed to load because no supported source was found
Hasura notes that bytea data must be passed in as a String, so I tried a couple of things.
Converting the blob to base64 stores fine, but retrieving and playing the data doesn't work. I've tried doing the conversions in the browser to base64 and then back into a blob. I think the data just doesn't store properly as bytea if I convert it to base64 first:
// Storing bytea data as base64 string
const arrayBuffer = await blob.arrayBuffer();
const byteArray = new Uint8Array(arrayBuffer);
const charArray = Array.from(byteArray, (x: number) => String.fromCharCode(x));
const encodedString = window.btoa(charArray.join(''));
hasuraRequest....
  `
  mutation SaveAudioBlob ($input: String) {
    insert_testerooey_one(
      object: {
        blubberz: $input
      }
    ) {
      id
      blubberz
    }
  }
  `,
  { input: encodedString }
);
// Decoding bytea data
const decodedString = window.atob(encodedString);
const decodedByteArray = new Uint8Array(decodedString.length).map((_, i) =>
  decodedString.charCodeAt(i)
);
const decodedBlob = new Blob([decodedByteArray.buffer], { type: 'audio/mpeg' });
const audio4 = new Audio();
audio4.src = URL.createObjectURL(decodedBlob);
audio4.play();
Then I came across a Github issue (https://github.com/hasura/graphql-engine/issues/3336) suggesting the use of a computed field to convert the bytea data to base64, so I tried using that instead of my decoding attempt, only to be met with the same error:
CREATE OR REPLACE FUNCTION public.content_base64(mm testerooey)
RETURNS text
LANGUAGE sql
STABLE
AS $function$
SELECT encode(mm.blobberz, 'base64')
$function$
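For context, once a function like this is registered with Hasura as a computed field (assuming it is exposed under the name content_base64 on the testerooey table), it would be queried like an ordinary field, e.g.:
query GetAudio {
  testerooey {
    id
    content_base64
  }
}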
It seemed like a base64 string was not the way to store bytea data, so I tried converting the data to a hex string before storing it. It seems to store OK, but on retrieval the data doesn't play, and I think it's a similar problem to storing it as base64:
// Encoding to hex string
const arrayBuffer = await blob.arrayBuffer();
const byteArray = new Uint8Array(arrayBuffer);
const hexString = Array.from(byteArray, (byte) =>
  byte.toString(16).padStart(2, '0')
).join('');
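For reference, the decode in the opposite direction might look roughly like this (a sketch, not from the original post: fetchedValue is a hypothetical name for the string returned on retrieval, and stripping a possible "\x" prefix, which Postgres adds to bytea output, is an assumption):
// Sketch: turn a hex string (possibly "\x"-prefixed bytea output) back into a playable Blob.
const hex = fetchedValue.startsWith('\\x') ? fetchedValue.slice(2) : fetchedValue;
const bytes = new Uint8Array(hex.length / 2);
for (let i = 0; i < bytes.length; i++) {
  bytes[i] = parseInt(hex.slice(i * 2, i * 2 + 2), 16);
}
const blobFromHex = new Blob([bytes], { type: 'audio/mpeg' });
const audio = new Audio(URL.createObjectURL(blobFromHex));
audio.play();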
But the decoded data didn't work either, regardless of whether I used the computed field method or my own conversion code. So am I just not converting it correctly? Is my line of thinking wrong? What am I doing wrong?
I've got it working if I just convert to base64 and store it as a text field, but I'd prefer to store it as bytea because that takes up less space. I think something is wrong with how the data is stored, retrieved, or converted, but I don't know where. I know the blob itself is fine because I can play audio with it when it is first generated; it only breaks after fetching its stored value and attempting to convert it. Any ideas?
Also, I'd really rather not store the file in another service like S3, even if that would be drastically simpler.
Checking MDN, I see there used to be a BlobBuilder and that I could call blobBuilder.append to keep adding data to a blob, but according to MDN, BlobBuilder is deprecated in favor of the Blob constructor. Unfortunately, the Blob constructor requires all of the data to be in memory at construction time, and my data is too large to be in memory at construction time. Looking at the File API, I see nothing there either.
Is there a way to generate a large amount of data client side and put it in a blob? For example, say I wanted to render a 16k by 16k image; uncompressed, that's a 1 gig image.
I have an algorithm that can generate it one or a few scan lines at a time, but I need a way to write those scan lines into a file/blob and then, when finished, use the standard approach to let the user download that blob. I can't seem to find an API that lets me stream data into a blob.
The only thing I can think of is that, apparently, I can make a Blob from Blobs, so I suppose I could write each part of the image to a separate blob and then pass all of those blobs to the Blob constructor to get one big blob.
Is that the only solution? Seems kind of um .... strange. Though if it works then ¯\_(ツ)_/¯
Someone voted to close as they don't understand the question. Here's another explanation.
Write 4 gig to a blob
const arrays = [];
for (let i = 0; i < 4096; ++i) {
  arrays.push(new Uint8Array(1024 * 1024)); // 1 meg
}
// arrays now holds 4 gig of data
const blob = new Blob(arrays);
The code above will crash because the browser will kill the page for using too much memory. Using BlobBuilder I could have done something like
const builder = new BlobBuilder();
for (let i = 0; i < 4096; ++i) {
  const data = new Uint8Array(1024 * 1024); // 1 meg
  builder.append(data);
}
const blob = builder.getBlob(...);
That would not have run out of memory because there is never more than 1 meg of data around at once; the browser can flush the data being appended to the BlobBuilder out to disk.
What's the new way to write 4 gig to a blob? Is it only to write lots of small blobs and then use those to generate a larger one, or is there some more traditional way, where traditional means streaming into some object/file/blob/storage?
As you know, the data that the blob will contain must be ready to pass to the constructor. Let us take the example from MDN:
var aFileParts = ['<a id="a"><b id="b">hey!</b></a>'];
var oMyBlob = new Blob(aFileParts, {type : 'text/html'});
Now, we have two options:
We can append data to the array, and then convert it to a blob:
var aFileParts = ['<a id="a"><b id="b">hey!</b></a>'];
aFileParts.push('<p>How are you?</p>');
var oMyBlob = new Blob(aFileParts, {type : 'text/html'});
Alternatively, we can use blobs to create the blob:
var oMyOtherBlob = new Blob([], {type: 'text/html'});
oMyOtherBlob = new Blob([oMyOtherBlob, '<a id="a"><b id="b">hey!</b></a>'], {type : 'text/html'});
oMyOtherBlob = new Blob([oMyOtherBlob, '<p>How are you?</p>'], {type : 'text/html'});
You may build your own BlobBuilder encapsulating that... given that appending to an array seems to lead you to run out of memory, let us encapsulate the second option:
var MyBlobBuilder = function() {
  var blob = new Blob([], {type: 'text/html'});

  this.append = function(src) {
    blob = new Blob([blob, src], {type: 'text/html'});
  };

  this.getBlob = function() {
    return blob;
  };
};
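Usage would then mirror the BlobBuilder loop from your question (a sketch reusing the 4096 x 1 meg example):
var builder = new MyBlobBuilder();
for (let i = 0; i < 4096; ++i) {
  builder.append(new Uint8Array(1024 * 1024)); // append 1 meg at a time
}
var blob = builder.getBlob();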
Note: I tested this with your code (replacing BlobBuilder with MyBlobBuilder) and it did not run out of memory on my machine (Windows 10, Chrome 67, 8 GB RAM, Intel Core i3, 64-bit).
How do you convert a blob that was received in a WebSocket binary message to a Float32Array (or other typed arrays: Uint32Array, Uint16Array, etc.)?
I've tried to use the FileReader but the 'result' takes WAY too long to become available. The result MUST be available on the next received WebSocket message.
If I could get the WebSocket to receive an ArrayBuffer instead of a Blob, that would work. How can I do that?
Found the solution; it was easy. The WebSocket binaryType defaults to 'blob'; change it to 'arraybuffer' and then converting the data to other typed arrays is fast.
var ws = new WebSocket(...);
ws.binaryType = 'arraybuffer';
ws.onmessage = wsevent;
The message handler might look like this:
var floatArray;

function wsevent(event) {
  if (event.data instanceof ArrayBuffer) {
    floatArray = new Float32Array(event.data);
    return;
  }
  // ...handle other ws messages
}
In my code I typically send the binary data in one message and then the next text message would use the binary data.
I am making an application where I take mic data from the inputBuffer, and I want to stream it to another client and play it. However, I cannot get it working.
My recording/capturing works fine, so I will skip to the relevant parts of the code:
function recorderProcess(e) {
  var left = e.inputBuffer.getChannelData(0);
  var convert = convertFloat32ToInt16(left);
  window.stream.write(convert);
  var src = window.URL.createObjectURL(lcm);
  playsound(convert);
  ss(socket).emit('file', convert, { size: src.size }, currentgame);
  ss.createBlobReadStream(convert).pipe(window.stream);
  //ss.createReadStream(f).pipe(widnow.stream);
}
function playsound(raw) {
  console.log("now playing a sound, that starts with", new Uint8Array(raw.slice(0, 10)));
  context.decodeAudioData(raw, function (buffer) {
    if (!buffer) {
      console.error("failed to decode:", "buffer null");
      return;
    }
    var source = context.createBufferSource();
    source.buffer = buffer;
    source.connect(context.destination);
    source.start(0);
    console.log("started...");
  }, function (error) {
    console.error("failed to decode:", error);
  });
}
I am able to successfully create an array buffer using the convertFloat32ToInt16 function; however, when I use the playsound function I get a "null" error, meaning that the array buffer will not decode into an audio stream? Has anyone else had this issue? I have scoured the internet and found no answer on how to do this. I am trying to play it this way because ultimately I will be streaming from client to client, so I will be sending array buffers via sockets.
Thanks in advance.
If I'm understanding this correctly (there are some missing pieces in your code sample)...
decodeAudioData can only decode things like MP3 or WAV. It looks like you're passing it a raw Int16Array or Uint16Array. Because the underlying ArrayBuffer isn't a format that decodeAudioData understands, it gives up.
I think what you want to do is something like this:
function playsound( raw ) {
  // i'll assume you know how to convert in this direction
  // since you have convertFloat32ToInt16
  var buffer = convertInt16ToFloat32( raw ),
      src = context.createBufferSource(),
      audioBuffer = context.createBuffer( 1, buffer.length, context.sampleRate );

  audioBuffer.getChannelData( 0 ).set( buffer );
  src.buffer = audioBuffer;
  src.connect( context.destination );
  src.start( 0 );
}
Basically, you already have a way to create the raw Float32Array that the Web Audio API likes, so there's no need to decode (and you can't decode anyway, since your data isn't a valid file format). So you just convert back to Float32Array, create your own AudioBuffer, write in the data from buffer, and go from there.
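The snippet above assumes a convertInt16ToFloat32 helper that isn't shown in the post; a minimal sketch (assuming the usual normalization of signed 16-bit PCM samples by dividing by 0x8000) might look like this:
function convertInt16ToFloat32(int16Array) {
  var float32Array = new Float32Array(int16Array.length);
  for (var i = 0; i < int16Array.length; i++) {
    // Map signed 16-bit samples (-32768..32767) into the -1..1 range Web Audio expects.
    float32Array[i] = int16Array[i] / 0x8000;
  }
  return float32Array;
}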
For converting from float32 to unsigned int16 you can multiply each float32 value by 0xffff (which is the 16-bit max value), and for int16 to float32 do the reverse, which means dividing by 0xffff. The audio should be fine then.
I am new to Stack Overflow. I should write this as a comment, but due to a lack of reputation points I can't; that's why I have to write it as an answer. Sorry for the inconvenience.
I am trying to encode and decode an image. I am using the FileReader's readAsDataURL method to convert the image to base64. Then to convert it back I have tried using readAsBinaryString() and atob() with no luck. Is there another way to persist images without base64 encoding them?
readAsBinaryString()
Starts reading the contents of the specified Blob, which may be a File. When the read operation is finished, the readyState will become DONE, and the onloadend callback, if any, will be called. At that time, the result attribute contains the raw binary data from the file.
Any idea what I'm doing wrong here?
Sample Code
http://jsfiddle.net/qL86Z/3/
$("#base64Button").on("click", function () {
var file = $("#base64File")[0].files[0]
var reader = new FileReader();
// callback for readAsDataURL
reader.onload = function (encodedFile) {
console.log("reader.onload");
var base64Image = encodedFile.srcElement.result.split("data:image/jpeg;base64,")[1];
var blob = new Blob([base64Image],{type:"image/jpeg"});
var reader2 = new FileReader();
// callback for readAsBinaryString
reader2.onloadend = function(decoded) {
console.log("reader2.onloadend");
console.log(decoded); // this should contain binary format of the image
// console.log(URL.createObjectURL(decoded.binary)); // Doesn't work
};
reader2.readAsBinaryString(blob);
// console.log(URL.createObjectURL(atob(base64Image))); // Doesn't work
};
reader.readAsDataURL(file);
console.log(URL.createObjectURL(file)); // Works
});
Thanks!
After some more research, I found the answer here.
I basically needed to wrap the raw binary in an ArrayBuffer, converting the binary string's characters to their character codes.
This is the code that was missing:
var binaryImg = atob(base64Image);
var length = binaryImg.length;
var ab = new ArrayBuffer(length);
var ua = new Uint8Array(ab);
for (var i = 0; i < length; i++) {
  ua[i] = binaryImg.charCodeAt(i);
}
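From there, the Uint8Array can be wrapped in a Blob and handed to URL.createObjectURL, which does accept a Blob (a sketch; the image/jpeg type matches the earlier snippet, and the img selector is hypothetical):
var decodedBlob = new Blob([ua], {type: "image/jpeg"});
var objectURL = URL.createObjectURL(decodedBlob);
// e.g. document.querySelector("img").src = objectURL; // hypothetical img element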
The full sample code is here
URL.createObjectURL expects a Blob (which can be a File) as its argument. Not a string. That's why URL.createObjectURL(file) works.
Instead, you are creating a FileReader reader that reads file as a data URL, then you use that data URL to create another Blob (which actually contains the base64 text rather than the original image bytes). And then you even create a reader2 to get a binary string from the just-constructed blob. However, neither the base64Image URL string part (even if decoded with atob) nor the decoded.binary string is a valid argument to URL.createObjectURL!