SheetJS XLSX.read ends up with Page Unresponsive - javascript

I am trying to read an xlsx file of about 3000 KB and convert it to JSON using SheetJS, but when I read the file I get Chrome's 'Page Unresponsive' error. Clicking "Wait" on the dialog every time I upload a file is not ideal.
How can I resolve this issue?
onChange(event) {
  this.file = event.target.files ? event.target.files[0] : null;
  this.uploaded = true;
  const reader = new FileReader();
  reader.readAsArrayBuffer(this.file);
  reader.onload = function () {
    const data = new Uint8Array(reader.result);
    const wb = XLSX.read(data, { type: 'array', sheets: ['Sheet1', 'Sheet2'] });
    this.summary = XLSX.utils.sheet_to_json(wb.Sheets.Summary);
    this.weeklyDetails = XLSX.utils.sheet_to_json(wb.Sheets['Weekly Details']);
    console.log(this.weeklyDetails);
  };
},
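One common fix: XLSX.read is CPU-bound and blocks the main thread for large files, which is exactly what triggers Chrome's dialog. Moving the parse into a Web Worker keeps the page responsive. Below is a minimal sketch; the worker file name and the CDN URL are assumptions, not part of the question's code.

```javascript
// xlsx.worker.js — runs the parse off the main thread (sketch; file name assumed):
// importScripts('https://cdn.sheetjs.com/xlsx-latest/package/dist/xlsx.full.min.js');
// self.onmessage = ({ data }) => {
//   const wb = XLSX.read(new Uint8Array(data), { type: 'array' });
//   self.postMessage({
//     summary: XLSX.utils.sheet_to_json(wb.Sheets.Summary),
//     weeklyDetails: XLSX.utils.sheet_to_json(wb.Sheets['Weekly Details']),
//   });
// };

// Main thread: hand the bytes to the worker and resolve with the parsed JSON.
function parseInWorker(file) {
  return new Promise((resolve, reject) => {
    const worker = new Worker('xlsx.worker.js');
    worker.onmessage = (e) => { resolve(e.data); worker.terminate(); };
    worker.onerror = reject;
    // Transfer the buffer instead of copying it — cheap even for large files.
    file.arrayBuffer().then((buf) => worker.postMessage(buf, [buf]));
  });
}
```

In onChange you would then call `parseInWorker(this.file)` and assign the resolved `summary` and `weeklyDetails` to the component.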

Related

readAsDataUrl converting to octet-stream instead of pdf

I am getting blob data from a RESTful endpoint and then need to convert it to a base64 string with file type application/pdf; however, it converts to application/octet-stream instead.
Here's what my code does:
const getBytesData = () => {
  if (user) {
    getInvoicePDFStringByInvoiceId('129', user.user.token).then((data) => {
      if (data.blob) {
        const reader = new FileReader();
        reader.readAsDataURL(data.blob);
        reader.onloadend = () => {
          var base64data = reader.result.replace('octet-stream', 'pdf');
          console.log('Pdf loaded:- ', base64data);
          setPDFLoaded(base64data);
          return;
        };
      } else {
        console.log('Error happened from API = ', data.error, data.message);
      }
    });
  }
};
Can someone help me understand what could solve this issue?
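For reference, the usual cause is that the Blob returned by the API carries no MIME type, so readAsDataURL labels the result application/octet-stream. A sketch of two ways around it (the helper names here are illustrative, not from the question's code):

```javascript
// Re-wrap the bytes in a Blob that declares the correct MIME type, then feed
// that to FileReader.readAsDataURL as before. Re-wrapping only sets metadata;
// the bytes themselves are not copied eagerly.
function asPdfBlob(blob) {
  return new Blob([blob], { type: 'application/pdf' });
}

// Alternatively, rewrite just the MIME portion of an existing data URL header,
// which is what the string replace in the question approximates.
function withMimeType(dataUrl, mime) {
  return dataUrl.replace(/^data:[^;,]*/, `data:${mime}`);
}
```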

Vue.js excel to json

I tried to read Excel files with Vue.js, but as soon as I read the file the memory usage skyrockets to around 5 GB of RAM, even though the Excel file is fairly small. I need to convert the file to JSON.
In the Vue method that handles the Excel file I tried every `type` option I saw in the documentation, but each one gives a different error. I saw a similar question here but still could not solve this.
What I got for each `type`:
base64: "TypeError: input.replace is not a function"
binary: "TypeError: x.charCodeAt is not a function"
string: "TypeError: data.match is not a function"
array: this is the one that makes memory usage climb to 5 GB
Also, when I tried to use a new FileReader as shown in the documentation, the reader.onload function never ran.
In the actual template I tried two things. When I use the buffer type it seems to work, but all the functions return an empty array, as if the file were empty, which it is not. Both ways did the same thing.
<v-file-input
  v-on:change="displayFile($event)"
  v-model="file">
</v-file-input>
<input type="file" name="xlfile" id="xlf" v-on:change="displayFile($event)" />
displayFile: function (event) {
  // console.log(event.target.files[0])
  // const file = event.target.files[0]
  // const workbook = XLSX.read(file, {
  //   type: 'string'
  // })
  // console.log(workbook, workbook.SheetNames)
  // const res = XLSX.read(file)
  // console.log(res)
  // const res = XLSX.read(this.file)
  // console.log(res)
  console.log(this.file)
  this.file.text().then(text => {
    const fileType = this.file.type
    console.log(fileType)
    // this.PropceseMethod(this.file, fileType)
  })
  const reader = new FileReader()
  reader.onload = (data) => {
    console.log('HERE')
    console.log(data)
    const workbook = XLSX.read(data, {
      type: 'buffer'
    })
    console.log(workbook)
    workbook.SheetNames.forEach(function (sheetName) {
      console.log(sheetName)
      console.log(workbook.Sheets[sheetName])
      // Here is your object
      const XLRowObject = XLSX.utils.sheet_to_row_object_array(workbook.Sheets[sheetName])
      console.log(XLSX.utils.sheet_to_json(workbook.Sheets[sheetName]))
      console.log(XLRowObject)
      const jsonObject = JSON.stringify(XLRowObject)
      console.log(jsonObject)
    })
  }
  reader.onerror = function (ex) {
    console.log(ex)
  }
  reader.readAsText(this.file)
}
To manage this I had to change the way I read the file. When I used readAsBinaryString, paired with type: 'binary', it worked. Note that this function reads only the first sheet:
fileToJson (e) {
  const file = e.target.files[0]
  /* Boilerplate to set up FileReader */
  const reader = new FileReader()
  reader.onload = (e) => {
    /* Parse data */
    const bstr = e.target.result
    const wb = XLSX.read(bstr, { type: 'binary' })
    /* Get first worksheet */
    const wsname = wb.SheetNames[0]
    const ws = wb.Sheets[wsname]
    /* Convert array of arrays */
    const data = XLSX.utils.sheet_to_json(ws, { header: 1 })
    /* Update state */
    this.data = data
    const header = data.shift()
  }
  reader.readAsBinaryString(file)
}
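Since `sheet_to_json(ws, { header: 1 })` returns an array of arrays, re-keying the body rows by the shifted header row can be done with a small pure helper. This is a sketch, not part of SheetJS:

```javascript
// Turn [[header...], [row...], ...] into an array of keyed row objects,
// using the first row as the column names.
function rowsToObjects(rows) {
  const [header, ...body] = rows;
  return body.map((row) =>
    Object.fromEntries(header.map((key, i) => [key, row[i]]))
  );
}

// rowsToObjects([['name', 'age'], ['Ann', 41]]) → [{ name: 'Ann', age: 41 }]
```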
This code worked for me in a Vue CLI App:
// Important: the import statement must reference the full.min.js build only.
import XLSX from '../../../node_modules/xlsx/dist/xlsx.full.min.js'

var reader = new FileReader()
reader.onload = function (e) {
  var data = e.target.result
  var workbook = XLSX.read(data, { type: 'binary' })
  let sheetName = workbook.SheetNames[0]
  let worksheet = workbook.Sheets[sheetName]
  let rowObject = XLSX.utils.sheet_to_row_object_array(worksheet)
  const finalJsonData = JSON.stringify(rowObject, undefined, 4)
  console.log(finalJsonData)
}
reader.readAsBinaryString(this.excelFile)
With my final JSON Output as:
[
  {
    "email": "test5#test.com",
    "password": "password",
    "full_name": "Some Name 5",
    "mobile": 9897675463
  },
  {
    "email": "test6#test.com",
    "password": "password",
    "full_name": "Some Name 6",
    "mobile": 9897675463
  },
  ...
]
And my Excel file as:

How do I read a JSON file in Angular?

I am trying to load a JSON file from local disk and use its data to fill a FabricJS canvas. I am having trouble getting the data out of the file.
This is what I have so far.
app.html
<input type="file" accept=".json" id="fileInput" (change)="loadFile($event)"/>
app.ts
loadFile(event) {
  const eventObj: MSInputMethodContext = <MSInputMethodContext> event;
  const target: HTMLInputElement = <HTMLInputElement> eventObj.target;
  const files: FileList = target.files;
  this.file = files[0];
  const reader = new FileReader();
  reader.readAsText(this.file, 'utf8');
  this.canvas.loadFromJSON(this.file, this.canvas.renderAll.bind(this.canvas), function (o, object) {
    console.log(o, object);
  });
}
Any thoughts on how I can make this work?
FileReader has an async API.
You must register a callback on the onload event to get the data:
loadFile(event) {
  const eventObj: MSInputMethodContext = <MSInputMethodContext> event;
  const target: HTMLInputElement = <HTMLInputElement> eventObj.target;
  const files: FileList = target.files;
  this.file = files[0];
  const reader = new FileReader();
  reader.onload = () => {
    // Arrow function keeps `this` bound to the component
    this.canvas.loadFromJSON(reader.result, this.canvas.renderAll.bind(this.canvas), function (o, object) {
      console.log(o, object);
    });
  };
  reader.readAsText(this.file, 'utf8');
}
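On modern browsers the FileReader boilerplate can be skipped entirely: File inherits Blob's text() method, which returns a promise for the file's contents. A sketch of the same load using it:

```javascript
// Read a JSON file and parse it; Blob.text() resolves with the file's text.
async function readJsonFile(file) {
  const text = await file.text();
  return JSON.parse(text);
}
```

You would then call `readJsonFile(files[0])` and pass the resolved object to `this.canvas.loadFromJSON`.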

Reactjs - Can't base64 encode file from react-dropzone

I am using react-dropzone to handle file upload on my website. When successfully loading a file, the dropzone triggers the following callback:
onDrop: function (acceptedFiles, rejectedFiles) {
  myFile = acceptedFiles[0];
  console.log('Accepted files: ', myFile);
}
I would like to base64 encode this file. When doing:
var base64data = Base64.encode(myFile)
console.log("base64 data: ", base64data) // => base64 data: W29iamVjdCBGaWxlXQ==W29iamVjdCBGaWxlXQ==
Regardless of the file uploaded, it always prints out the same string.
Am I missing something? I need to base64 encode this file (always images).
This JS Bin is a working example of converting a File to base64: http://jsbin.com/piqiqecuxo/1/edit?js,console,output . The main addition is reading the file with a FileReader: FileReader.readAsDataURL() produces a base64-encoded data URL.
document.getElementById('button').addEventListener('click', function () {
  var files = document.getElementById('file').files;
  if (files.length > 0) {
    getBase64(files[0]);
  }
});

function getBase64(file) {
  var reader = new FileReader();
  reader.readAsDataURL(file);
  reader.onload = function () {
    console.log(reader.result);
  };
  reader.onerror = function (error) {
    console.log('Error: ', error);
  };
}
If you want it in a neat method that works with async / await, you can do it this way:
const getBase64 = async (file: Blob): Promise<string | undefined> => {
  const reader = new FileReader();
  reader.readAsDataURL(file);
  return new Promise((resolve, reject) => {
    reader.onload = () => resolve(reader.result as any);
    reader.onerror = (error) => reject(error);
  });
};

Reading multiple files with FileReader using .map()

I have an array of files and I need to format each file into JSON (an object at this point) along with a bunch of other info. Here's what I have tried:
const uploadData = Files.map((file) => {
  const fr = new FileReader();
  fr.readAsArrayBuffer(file);
  fr.onload = (event) => {
    const fileData = event.target.result;
    return {
      query: {
        documentName: file.name,
        personId: personId,
        serviceId: serviceID,
        documentFile: fileData,
      }
    }
  }
})
I want to use immutable techniques. I have a guess why this doesn't work but no idea how to fix it: I think .map() does not wait for the FileReader to finish reading and thus returns only an array of undefined values. I tried to use an IIFE but was unsuccessful.
const uploadData = [];
Files.forEach((file) => {
  const fr = new FileReader();
  fr.readAsArrayBuffer(file);
  fr.onload = (event) => {
    const fileData = event.target.result;
    uploadData.push({
      query: {
        documentName: file.name,
        personId: personId,
        serviceId: serviceID,
        documentFile: fileData,
      }
    });
  };
});
Since the return occurs inside the callback, you should use a local variable and push the data.
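If you do want the immutable .map() version, wrap each read in a Promise and await them all. A sketch using Blob.arrayBuffer() (inherited by File), which avoids the FileReader callback entirely; personId and serviceID are the question's variables, assumed to be in scope:

```javascript
// Map each file to a promise for its upload entry, then await all of them.
// Promise.all preserves the input order, so the result matches Files.
async function buildUploadData(files, personId, serviceID) {
  return Promise.all(
    files.map(async (file) => ({
      query: {
        documentName: file.name,
        personId: personId,
        serviceId: serviceID,
        documentFile: await file.arrayBuffer(),
      },
    }))
  );
}
```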