I have a binary string (from a REST API) that is the content of an Excel file.
PK\x03\x04\x14\x00\x06\00\x00\x00���N�0\x10E�H�C�-#\b5��\x12*Q>�ēƪc[�ii����\x10B�\x15j7�\x12��{2��h�nm���ƻR\f����U^\x1B7/���%�\x17\x19�rZY�\x14\x1B#1\x19__�f�\x00�q��R4D�AJ�\x1Ah\x15\x16>����V\x11�ƹ\f�Z�9����NV ...
What I want is to put this content in a FileReader object. I tried converting the content to a Blob and using readAsBinaryString, but it doesn't work.
Maybe I missed something.
However, when I use an input type=file, it works with this example:
$("#input").on("change", (e) => {
selectedFile = e.target.files[0];
});
let fileReader = new FileReader();
fileReader.readAsBinaryString(selectedFile);
fileReader.onload = (event)=>{
let data = event.target.result;
let workbook = XLSX.read(data,{type:"binary"});
}
What I would like is for selectedFile to reflect the binary string, without having to go through an input type=file.
Thanks for your help.
You can create a Blob object from the binary string, then create a File object from the Blob, and finally read the File with a FileReader object.
var binaryString = "PK\x03\x04\x14\x00\x06\00\x00\x00���N�0\x10E�H�C�-#\b5��\x12*Q>�ēƪc[�ii����\x10B�\x15j7�\x12��{2��h�nm���ƻR\f����U^\x1B7/���%�\x17\x19�rZY�\x14\x1B#1\x19__�f�\x00�q��R4D�AJ�\x1Ah\x15\x16>����V\x11�ƹ\f�Z�9����NV ...";
// create a Blob object
var blob = new Blob([binaryString], { type: "application/vnd.ms-excel" });
// create a File object from the Blob
var file = new File([blob], "file.xlsx");
// create a FileReader object
var reader = new FileReader();
// use the readAsArrayBuffer method to read the file
reader.readAsArrayBuffer(file);
// when the reading is done, log the result
reader.onloadend = function () {
  console.log(reader.result);
};
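One caveat worth noting: a Blob built from a JavaScript string is encoded as UTF-8, so bytes above 0x7F in a raw binary string can get mangled on the way through Blob/File. If the end goal is simply to parse the data with SheetJS (as in the question), it may be easier to skip the FileReader round-trip entirely. A minimal sketch, assuming the same XLSX global from the question and that binaryString holds the raw response body:
var binaryString = "..."; // the raw response body from the REST API

// Option A: let SheetJS parse the binary string directly
var workbookA = XLSX.read(binaryString, { type: "binary" });

// Option B: convert the binary string to bytes first, then parse it as an array
var bytes = Uint8Array.from(binaryString, function (c) { return c.charCodeAt(0); });
var workbookB = XLSX.read(bytes, { type: "array" });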
Related
I'm trying to read a local file into an ArrayBuffer using the FileReader API, like this
let reader = new FileReader();
reader.onload = function(e) {
  let arrayBuffer = new Uint8Array(reader.result);
  console.log(arrayBuffer);
}
reader.readAsArrayBuffer(new File([], 'data.txt'));
But I'm getting an empty arrayBuffer
How can I read this local file as an ArrayBuffer in my browser?
Thank you.
new File([], 'data.txt') only creates an empty in-memory File object; it does not open data.txt from disk, which is why the buffer comes back empty. You cannot read a file by pathname through a browser. You need to have the user interact with the file system and choose a file before you can read its content.
const readFile = e => {
  const file = e.target.files[0]
  let reader = new FileReader();
  reader.onload = function(e) {
    let arrayBuffer = new Uint8Array(reader.result);
    console.log(arrayBuffer);
  }
  reader.readAsArrayBuffer(file);
}

document.querySelector("#fileItem").onchange = readFile
<input id="fileItem" type="file">
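If the file lives on your web server rather than on the user's disk, you don't need FileReader or user interaction at all; you can fetch it over HTTP. A rough sketch, assuming data.txt is served at a URL relative to the page:
// Fetch a server-hosted file and read its bytes as an ArrayBuffer
fetch('data.txt')
  .then((response) => response.arrayBuffer())
  .then((buffer) => {
    let arrayBuffer = new Uint8Array(buffer);
    console.log(arrayBuffer);
  });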
I'm trying to export data from an Angular 6 web application.
I have an array of strings, where each string is a CSV line, formatted like this:
var csvLines = ['val1,val2\n', 'val3,val4\n'...];
Once I've added all the data I need to the array, I write it to the console:
This looks fine...
Now I want to convert it to a Blob and download it as a .csv file.
The download is fine, but the format of the output is wrong.
When I run the following code:
const blob = new Blob([csvLines], {type: 'text/csv;encoding:utf-8'});
const reader = new FileReader();
reader.onload = () => {
  console.log(reader.result);
};
reader.readAsText(blob);
I get output where every line after the first starts with a comma, which messes up my CSV.
Can anyone tell me why this is happening and perhaps how to disable the comma appending?
I have tried creating the Blob with text/plain as the MIME type and without the encoding, but the commas are still appended.
Because you are passing csvLines as [csvLines] to new Blob(..), you are passing an array that contains an array. The inner array is converted to a string, and Array.prototype.toString joins its elements with commas.
Just use new Blob(csvLines, { type: 'text/csv;encoding:utf-8' }); and you should be fine.
const csvLines = ['val1,val2\n', 'val3,val4\n'];
const blob = new Blob(csvLines, { type: 'text/csv;encoding:utf-8' });
const reader = new FileReader();
reader.onload = () => {
  console.log(reader.result);
};
reader.readAsText(blob);
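You can see the comma behaviour directly in the console: a Blob part that is not a string, Blob, or buffer is stringified, and stringifying an array joins its elements with commas.
const csvLines = ['val1,val2\n', 'val3,val4\n'];
// [csvLines] makes the inner array a single Blob part, and that part is stringified.
// Array.prototype.toString joins elements with commas, so every line after the
// first ends up prefixed with a comma:
console.log(String([csvLines])); // "val1,val2\n,val3,val4\n"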
I know that, in order to convert a Blob object to a readable format (a URL) in JavaScript, I should use the createObjectURL() method, right?
Example:
var blob = new Blob(["Example"], { type: "text/plain" });
url = window.URL.createObjectURL(blob);
My question is:
Is it possible to get the raw binary content of a BLOB ? so, I can get something like :
"01000101 01111000 01100001 01101101 01110000 01101100 01100101" // "Example" in binary .
Convert the blob to an ArrayBuffer (two methods are shown below). Create an ArrayBufferView (an Int8Array in this case), spread it into an array, and then map each number to its binary representation using Number.prototype.toString() with a radix of 2 - .toString(2).
Method 1 - Use the Blob.arrayBuffer() instance method to get a promise that resolves with the ArrayBuffer:
const blobToBinary = async (blob) => {
  const buffer = await blob.arrayBuffer();
  const view = new Int8Array(buffer);
  return [...view].map((n) => n.toString(2)).join(' ');
};
const blob = new Blob(["Example"], { type: "text/plain" });
blobToBinary(blob).then(console.log);
Method 2 - Extract the data from the blob using a FileReader. To get an ArrayBuffer use FileReader.readAsArrayBuffer().
const blob = new Blob(["Example"], { type: "text/plain" });
const reader = new FileReader();
reader.addEventListener("loadend", function() {
  const view = new Int8Array(reader.result);
  const bin = [...view].map((n) => n.toString(2)).join(' ');
  console.log(bin);
});
reader.readAsArrayBuffer(blob);
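If you want the output zero-padded to 8 digits per byte, like the example in the question, use an unsigned view and padStart; with Int8Array, bytes above 127 come out as negative numbers. A small variation on the same idea:
const blobToBinary = async (blob) => {
  const buffer = await blob.arrayBuffer();
  const view = new Uint8Array(buffer); // unsigned, so no minus signs
  return [...view]
    .map((n) => n.toString(2).padStart(8, "0")) // always 8 binary digits per byte
    .join(" ");
};

const blob = new Blob(["Example"], { type: "text/plain" });
blobToBinary(blob).then(console.log); // "01000101 01111000 01100001 ..."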
You can use a FileReader to get the contents of the BLOB as a byte array:
var reader = new FileReader();
reader.readAsArrayBuffer(blob);
reader.onloadend = (event) => {
  // The contents of the BLOB are in reader.result:
  console.log(reader.result);
}
https://developer.mozilla.org/en-US/docs/Web/API/FileReader
https://developer.mozilla.org/en-US/docs/Web/API/Blob#Example_for_extracting_data_from_a_Blob
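Note that reader.result here is an ArrayBuffer, which many consoles print as ArrayBuffer {}. To actually get at the bytes, wrap it in a typed-array view; a short sketch, using the same "Example" blob as above:
var reader = new FileReader();
reader.onloadend = (event) => {
  // Wrap the ArrayBuffer in a view to see the individual byte values
  var bytes = new Uint8Array(reader.result);
  console.log(Array.from(bytes)); // for "Example": [69, 120, 97, 109, 112, 108, 101]
};
reader.readAsArrayBuffer(blob);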
For learning purposes, I want to use the HTML input tag to select a JPEG image, retrieve the File object, load it with FileReader, and use the retrieved image string (base64) to create a new Blob/File.
The service can upload the original file retrieved from the input just fine. However, using my newFile, the file gets corrupted and the file size is somehow larger.
I figure I'm doing something wrong with the Blob constructor?
I'm using Angular 2 with TypeScript.
<input type="file" (change)="onFileChanged($event)">
onFileChanged(event){
  if (event.target.files && event.target.files[0]) {
    let file = event.target.files[0];
    let newFile;
    let fr = new FileReader();
    fr.onload = (event:any)=>{
      let base64 = event.target.result
      let img = base64.split(',')[1]
      let blob = new Blob([window.atob(img)],{type:'image/jpeg'})
      newFile = this.blobToFile(blob,'test')
    }
    fr.readAsDataURL(file)
    console.log(file)
    console.log(newFile)
    this.service.upload(newFile).subscribe()
  }
}
blobToFile(blob: Blob, fileName: string): File {
  let b: any = blob;
  b.lastModified = moment.now();
  b.lastModifiedDate = new Date();
  b.name = fileName;
  b.webkitRelativePath = "";
  return <File>blob
}
EDIT:
After finding out that FileReader is asynchronous, I've adjusted it a little, and indeed the problem is with the Blob constructor.
Logging the target.result of both the original file and the new one revealed that the base64 has been transmuted. Any ideas why?
if (event.target.files && event.target.files[0]) {
  let file = event.target.files[0];
  let base64: string = null;
  if (/^image\//.test(file.type)) {
    let reader = new FileReader();
    reader.onload = (e: any) => {
      console.log(e.target)
      base64 = e.target.result
      let img = base64.split(',')[1];
      let blob = new Blob([img], { type: 'image/jpeg' })
      console.log(blob);
      let fr = new FileReader()
      fr.onload = (event: any) => {
        console.log(event.target)
      }
      fr.readAsDataURL(blob)
    }
    reader.readAsDataURL(file);
  }
}
Modify your function like this. Because FileReader is asynchronous, you need to process the result inside the onload callback; here you are uploading the file outside of onload, at which point newFile is still undefined (or whatever initial value it contains).
onFileChanged(event){
  if (event.target.files && event.target.files[0]) {
    let file = event.target.files[0];
    let newFile;
    let fr = new FileReader();
    fr.onload = (event:any)=>{
      let base64 = event.target.result
      let img = base64.split(',')[1]
      let blob = new Blob([window.atob(img)],{type:'image/jpeg'})
      newFile = this.blobToFile(blob,'test')
      this.service.upload(newFile).subscribe()
    }
    fr.readAsDataURL(file)
    console.log(file)
    console.log(newFile) // Either prints undefined or whatever initial value it contains
  }
}
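Even with the upload moved inside onload, new Blob([window.atob(img)]) will still re-encode the decoded string as UTF-8, which would explain why the uploaded file comes out larger and corrupted. A possible follow-up sketch of the onload body, converting the decoded data to a Uint8Array first and building the result with the standard File constructor instead of the blobToFile cast (the 'test.jpg' name is just an example):
fr.onload = (event: any) => {
  const base64 = event.target.result;
  const img = base64.split(',')[1];
  // atob returns a "binary string"; copy its char codes into real bytes so the
  // File is built from raw data instead of a re-encoded UTF-8 string
  const binary = window.atob(img);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  const newFile = new File([bytes], 'test.jpg', { type: 'image/jpeg' });
  this.service.upload(newFile).subscribe();
};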
I suspect this part of your code:
onFileChanged(event){
  if (event.target.files && event.target.files[0]) {
    let file = event.target.files[0];
    let newFile;
    let fr = new FileReader();
    fr.onload = (event:any)=>{
      let base64 = event.target.result
      let img = base64.split(',')[1]
      let blob = new Blob([window.atob(img)],{type:'image/jpeg'})
      newFile = this.blobToFile(blob,'test')
    }
    fr.readAsDataURL(file)
    console.log(file)
    console.log(newFile)
    this.service.upload(newFile).subscribe()
  }
}
onFileChanged(event) and (event: any) refer to two different objects: the event in onFileChanged is the change event of the input, while the event in fr.onload is the FileReader's load event. Shadowing the outer event like this is confusing and makes it easy to reference the wrong one.
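If that shadowing is the concern, renaming the inner parameter makes it clear which event is which; a small sketch:
onFileChanged(changeEvent) {
  if (changeEvent.target.files && changeEvent.target.files[0]) {
    const file = changeEvent.target.files[0];
    const fr = new FileReader();
    fr.onload = (loadEvent: any) => {
      // loadEvent belongs to the FileReader, changeEvent to the <input>
      const base64 = loadEvent.target.result;
      console.log(base64);
    };
    fr.readAsDataURL(file);
  }
}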
var reader = new FileReader();
var rawData = new ArrayBuffer();
//console.log(1);
reader.onload = function(e) {
  var rawData = e.target.result; // binary data
  console.log(rawData);
}
I want to explicitly see the raw binary data as a text string. Is that possible? Because the only thing I see when logging is:
ArrayBuffer {}
You can try:
console.log(String.fromCharCode.apply(null, new Uint16Array(rawData)));
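Two caveats with that approach: Uint16Array treats the buffer as UTF-16 code units, and Function.prototype.apply can hit argument-count limits on large buffers. If the file is ordinary UTF-8 text, TextDecoder is a simpler alternative; a sketch of the same onload handler:
reader.onload = function (e) {
  const rawData = e.target.result; // ArrayBuffer
  const text = new TextDecoder("utf-8").decode(rawData);
  console.log(text); // the buffer rendered as a text string
};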
This is what I needed:
reader.readAsBinaryString(file);
Then the data is available raw.