Reading xlsx file in Angular 5 - javascript

I want to read an xlsx file, but I only want to read the first three records (rows), because reading all the records will crash the browser. I need your help to find a way to read just those first three rows.
P.S.: I don't want to keep the data in memory while parsing the xlsx file.
I am using this right now:
fileChange(event) {
  const fileList: FileList = event.target.files;
  if (fileList.length > 0) {
    const file: File = fileList[0];
    const reader = new FileReader();
    reader.onload = e => {
      // Convert the ArrayBuffer to a binary string for XLSX.read
      const arrayBuffer = reader.result,
        data = new Uint8Array(arrayBuffer),
        arr = new Array();
      for (let i = 0; i !== data.length; ++i) {
        arr[i] = String.fromCharCode(data[i]);
      }
      const bstr = arr.join("");
      const workbook: XLSX.WorkBook = XLSX.read(bstr, { type: "binary" });
      const firstSheetName: string = workbook.SheetNames[0];
      const worksheet: XLSX.WorkSheet = workbook.Sheets[firstSheetName];
      this.setXlsxData(XLSX.utils.sheet_to_json(worksheet));
    };
    reader.readAsArrayBuffer(file);
  }
}

setXlsxData(data: Array<any>) {
  this.headers = Object.keys(data[0]);
  this.xlsxData = data;
}

Try this library. I have written and read some xlsx files with it, and it works smoothly.
(Apologies, I couldn't write this as a comment.)
https://www.npmjs.com/package/xlsx
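A minimal sketch of how the first three data rows could be read with this package, assuming its sheetRows parsing option (it limits how many rows are parsed; FileReader still loads the whole file):

function readFirstRows(file) {
  const reader = new FileReader();
  reader.onload = (e) => {
    // sheetRows: 4 parses only the header row plus three data rows
    const workbook = XLSX.read(new Uint8Array(e.target.result), { type: "array", sheetRows: 4 });
    const firstSheet = workbook.Sheets[workbook.SheetNames[0]];
    console.log(XLSX.utils.sheet_to_json(firstSheet));
  };
  reader.readAsArrayBuffer(file);
}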

Related

Read CSV File In React

I'm uploading and then reading a CSV file, but I'm facing an issue while splitting it: some columns in the CSV contain ',', so when I split on ',' I don't get the full column value. Please suggest a proper way to handle this. Thanks.
const readCsv = (file) => {
  const reader = new FileReader();
  reader.readAsBinaryString(file);
  reader.addEventListener('load', function (e) {
    const data = e.target.result;
    let parsedata = [];
    let newLinebrk = data.split('\n');
    for (let i = 0; i < newLinebrk.length; i++) {
      parsedata.push(newLinebrk[i].split(','));
    }
    console.log("parsedData: ", parsedata);
  });
};
CSV:
column 1 column2
test lorem, ipsum, dummy/text
after splitting:
['test', 'lorem', 'ipsum', 'dummy/text']
So by doing that, I'm unable to get the full column value when it contains a comma in the string.
In my case, I used Papa Parse, which fulfills all my requirements.
const readCsv = (file) => {
  const reader = new FileReader();
  reader.readAsBinaryString(file);
  reader.addEventListener('load', function (e) {
    const data = e.target.result;
    Papaparse.parse(data, {
      complete: function (results) {
        console.log("results: ", results.data);
      },
    });
  });
};
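A small follow-up sketch (the CSV content and the header: true option here are my own illustration, assuming the papaparse package): fields wrapped in quotes keep their embedded commas, which is exactly what a plain split(',') loses.

import Papa from 'papaparse';

// Values containing commas must be quoted in the CSV source.
const csv = 'column 1,column2\ntest,"lorem, ipsum, dummy/text"';

const results = Papa.parse(csv, { header: true });
console.log(results.data);
// [{ "column 1": "test", "column2": "lorem, ipsum, dummy/text" }]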

how to read portion of large excel file to prevent browser crashing using filereader and xlsx

I am able to read/parse Excel files on the front end using the code below with FileReader and the xlsx package. However, for very large files this crashes the browser. I only need to read the first few rows; how can I achieve this?
Working code:
const xlsxParse = (file) => {
  var reader = new FileReader();
  reader.onload = function (e) {
    var data = e.target.result;
    let readedData = XLSX.read(data, { type: 'binary' });
    const wsname = readedData.SheetNames[0];
    const ws = readedData.Sheets[wsname];
    /* Convert array to json */
    const dataParse = XLSX.utils.sheet_to_json(ws, { header: 1 });
  };
  reader.readAsBinaryString(file);
}
My attempt to read the first few rows (not working):
const xlsxParse = (file) => {
  var reader = new FileReader();
  reader.onprogress = (e) => {
    var data = e.target.result;
    let readedData = XLSX.read(data, { type: 'binary' });
    if (readedData) {
      const wsname = readedData.SheetNames[0];
      const ws = readedData.Sheets[wsname];
      /* Convert array to json */
      const dataParse = XLSX.utils.sheet_to_json(ws, { header: 1 });
      console.log('dataParse', dataParse);
      if (dataParse.length > 3) {
        reader.abort();
      }
    }
  };
  reader.readAsBinaryString(file);
}
Thanks.
This can help you out. Basically, the sheetRows option tells the parser to read only the first n rows:
handleFile = (file /*:File*/) => {
  /* Boilerplate to set up FileReader */
  const reader = new FileReader();
  const rABS = !!reader.readAsBinaryString;
  reader.onload = e => {
    /* Parse data, limiting the workbook to the first 100 rows */
    const bstr = e.target.result;
    const wb = XLSX.read(bstr, { type: rABS ? "binary" : "array", sheetRows: 100 });
    /* Get first worksheet */
    const wsname = wb.SheetNames[0];
    const ws = wb.Sheets[wsname];
    /* Convert to an array of row objects */
    const data = XLSX.utils.sheet_to_json(ws);
    const tableColumns = Object.keys(data[0]);
    console.log('data', data);
  };
  if (rABS) reader.readAsBinaryString(file);
  else reader.readAsArrayBuffer(file);
};
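For the original goal of only the first three rows, a small tweak of the same call should do it (the exact numbers here are my own illustration):

// sheetRows counts the header row, so 4 keeps the header plus three data rows.
const wb = XLSX.read(bstr, { type: "binary", sheetRows: 4 });
const firstRows = XLSX.utils.sheet_to_json(wb.Sheets[wb.SheetNames[0]]);

Keep in mind that sheetRows only limits how much of the workbook is parsed; FileReader still loads the entire file into memory.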

How to decode a base64 string properly in javascript

I tried to convert a base64 string, generated from a pdf file using FileReader.readAsDataURL(), back to its original format. In Node.js I did it like this, and it regenerated the pdf in its initial state:
filebuffer = "data:application/pdf;base64,JVBERi0xLjQKJSDi48/..........."
let base64file = fileBuffer.split(';base64,').pop();
fs.writeFileSync('download.pdf',base64file,{encoding:'base64'},function(err){
if(err === null){
console.log("file created");
return;
}
else{
console.log(err);
return;
}
})
But I tried to do it in HTML + JavaScript this way, and the resulting pdf was empty (no text in it):
let stringval = "data:application/pdf;base64,JVBERi0xLjQKJSDi48/..........."
let encodedString = stringval.split(';base64,').pop();
let data = atob(encodedString);
let blob = new Blob([data]);
// //if you need a literal File object
let file = new File([blob], "filename");
link.href = URL.createObjectURL(file);
link.download = 'filename';
I was capturing the file and converting it to a base64 string this way:
captureFile: function () {
  event.preventDefault();
  const file = event.target.files[0];
  $("#labelinput1").html(file.name);
  const reader = new window.FileReader();
  reader.readAsDataURL(file);
  reader.onloadend = () => {
    var x = reader.result.toString();
    App.buffer2 = x;
    console.log("buffer", App.buffer2);
  };
}
Then, after clicking a button, I added the buffer to the IPFS node:
addfile: async function () {
  if (App.buffer2 === null) return;
  App.node = await window.Ipfs.create();
  App.node.add(App.buffer2, function (errx, resipfs) {
    if (errx === null) {
      console.log(resipfs[0].hash);
      App.buffer2 = null;
      return App.showInfo(resipfs[0].hash);
    } else {
      return App.showError(errx.message.toString() + errx.stack.toString());
    }
  });
}
Using the IPFS hash, I can get back the base64-encoded string. I retrieved the string this way:
ipfsfiledownload: async function () {
  var filebuffer = await App.node.cat(hashtext);
  var stringval = filebuffer.toString();
  // convert this string to the main file
}
I used Truffle Petshop and wrote those functions on top of it. Here is an IPFS hash: QmfSefUiwjV44hpfnHyUngGATyHm9M4vN3PzF1mpe59Nn1. You can try out this hash value in Node.js with this code:
const IPFS = require('ipfs');
const fs = require('fs');

const main = async () => {
  const node = await IPFS.create();
  var fileBuffer = await node.cat('QmfSefUiwjV44hpfnHyUngGATyHm9M4vN3PzF1mpe59Nn1');
  fileBuffer = fileBuffer.toString();
  let base64file = fileBuffer.split(';base64,').pop();
  fs.writeFileSync('download.pdf', base64file, { encoding: 'base64' }, function (err) {
    if (err === null) {
      console.log("file created");
      return;
    } else {
      console.log(err);
      return;
    }
  });
};

main();
You can find the full code here.
What am I doing wrong, and how can I solve it?
After converting the base64 string using atob(), I converted it to a Uint8Array, then created the blob and the file. It seems to work now.
Here is the full code:
ipfsfiledownload: async function () {
  var hashtext = document.getElementById("id_ipfshash").value; // getting the IPFS hash
  var link = document.getElementById("downloadLink");
  if (hashtext === null) return;
  var filebuffer = await App.node.cat(hashtext); // getting the data URL string from IPFS
  var stringval = filebuffer.toString();
  console.log(stringval);
  let encodedString = stringval.split(',')[1]; // getting the base64 payload
  let mimetype = stringval.split(',')[0].split(':')[1].split(';')[0]; // getting the mime type
  let data = atob(encodedString); // base64 to binary string
  var ab = new ArrayBuffer(data.length);
  var ia = new Uint8Array(ab);
  // converting to Uint8Array
  for (var i = 0; i < data.length; i++) {
    ia[i] = data.charCodeAt(i);
  }
  let blob = new Blob([ia], { "type": mimetype });
  let filename = 'filename.' + App.getExtension(mimetype);
  let file = new File([blob], filename);
  link.href = window.URL.createObjectURL(file);
  link.download = filename;
  link.click();
}
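As a side note, a shorter route that I believe works in modern browsers is to let fetch decode the data URL; this is my own sketch, not part of the original answer:

// Sketch: fetch() accepts data: URLs, and Response.blob() decodes the base64
// payload with the correct MIME type, replacing the manual atob/Uint8Array step.
async function dataUrlToObjectUrl(dataUrl) {
  const response = await fetch(dataUrl);
  const blob = await response.blob();
  return URL.createObjectURL(blob);
}

// Usage: link.href = await dataUrlToObjectUrl(stringval);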

how to Convert Unstructured excel sheet Data into json object and to display in table format

I need to display unstructured Excel sheet data in table format, including the empty cells.
Right now I am displaying a table for the unstructured data, but the empty cells are not kept in the columns, so the table ends up unstructured as well.
onFileChange(ev) {
  let workBook = null;
  let jsonData = null;
  const reader = new FileReader();
  const file = ev.target.files[0];
  reader.onload = (event) => {
    const data = reader.result;
    workBook = XLSX.read(data, { type: 'binary' });
    jsonData = workBook.SheetNames.reduce((initial, name) => {
      const sheet = workBook.Sheets[name];
      initial[name] = XLSX.utils.sheet_to_json(sheet);
      return initial;
    }, {});
    this.columnsArr = [];
    this.data = Object.keys(jsonData).map(key => jsonData[key]);
    console.log(this.data);
    this.data.forEach((elm, index) => {
      this.item = elm;
      for (var firstKey in this.item[0]) break;
      this.emtKey = firstKey;
    });
    // this.output = dataString.slice(0, 300);
    // this.setDownload(dataString);
  };
  reader.readAsBinaryString(file);
}
I need to display the table exactly the same as the Excel sheet data.
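No answer is recorded here, but as a hedged sketch: sheet_to_json drops empty cells by default, and its defval option (my suggestion, not from the thread) fills them so the columns keep their positions:

// Sketch: defval keeps blank cells as empty strings so each row has every
// column, and header: 1 yields an array of arrays shaped like the sheet.
const rows = XLSX.utils.sheet_to_json(sheet, { header: 1, defval: '' });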

How to read excel file in angularjs?

I have tried to read an Excel file following this tutorial:
http://code.psjinx.com/xlsx.js/
But I have failed to read the Excel file because of an undefined value at the highlighted line below. I have tried it in IE11.
var reader = new FileReader();
reader.onload = function (e) {
  var data = e.target.result;
  var workbook = XLSX.read(data, {
    type: 'binary'
  });
  obj.sheets = XLSXReader.utils.parseWorkbook(workbook, readCells, toJSON);
  handler(obj);
};
**reader.readAsBinaryString(file)**;
The following answer describes the case where you load the xlsx file from a server. For uploading there is other code.
OPTION 1: This is the procedure used in the Alasql library (see the files 15utility.js and 84from.js for an example):
readBinaryFile(filename, true, function (data) {
  var workbook = X.read(data, { type: 'binary' });
  // do what you need with the parsed xlsx
});

// Binary reading procedure
// path - path to the file
// asy - true = async / false = sync
var readBinaryFile = utils.loadBinaryFile = function (path, asy, success, error) {
  if (typeof exports == 'object') {
    // For Node.js
    var fs = require('fs');
    var data = fs.readFileSync(path);
    var arr = new Array();
    for (var i = 0; i != data.length; ++i) arr[i] = String.fromCharCode(data[i]);
    success(arr.join(""));
  } else {
    // For browser
    var xhr = new XMLHttpRequest();
    xhr.open("GET", path, asy); // async
    xhr.responseType = "arraybuffer";
    xhr.onload = function () {
      var data = new Uint8Array(xhr.response);
      var arr = new Array();
      for (var i = 0; i != data.length; ++i) arr[i] = String.fromCharCode(data[i]);
      success(arr.join(""));
    };
    xhr.send();
  }
};
OPTION 2: you can use the Alasql library itself, which is probably the easier option:
alasql('SELECT * FROM XLSX("myfile.xlsx", {headers: true, sheetid: "Sheet2", range: "A1:D100"})',
  [], function (data) {
    console.log(data);
  });
See the example here (simple Excel reading demo) or here (d3.js demo from Excel).
