I'm uploading and then reading a CSV file, but I'm running into an issue while splitting it. Some column values in the CSV contain ',', so when I split each line on ',' I don't get the full column value. Please suggest a proper way to handle this. Thanks.
const readCsv = (file) => {
  const reader = new FileReader();
  reader.addEventListener('load', function (e) {
    const data = e.target.result;
    let parsedata = [];
    let newLinebrk = data.split('\n');
    for (let i = 0; i < newLinebrk.length; i++) {
      // Naive split: breaks whenever a field value itself contains a comma
      parsedata.push(newLinebrk[i].split(','));
    }
    console.log("parsedData: ", parsedata);
  });
  // readAsBinaryString() returns undefined; the content arrives in the 'load' event
  reader.readAsBinaryString(file);
};
CSV:
column 1 | column2
test     | lorem, ipsum, dummy/text
after splitting:
['test', 'lorem', 'ipsum', 'dummy/text']
So with this approach I'm unable to get the full column value when it contains a comma.
In my case, I used Papa Parse, which fulfills all my requirements.
import Papaparse from 'papaparse';

const readCsv = (file) => {
  const reader = new FileReader();
  reader.readAsBinaryString(file);
  reader.addEventListener('load', function (e) {
    const data = e.target.result;
    // Papa Parse understands quoted fields, so commas inside a value stay in that value
    Papaparse.parse(data, {
      complete: function (results) {
        console.log("results: ", results.data);
      },
    });
  });
};
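If the first row of the file holds the column names, Papa Parse can also return each row as an object keyed by those names. Here is a small sketch (readCsvWithHeaders is a hypothetical variant of the function above; header, skipEmptyLines, and results.meta.fields are standard Papa Parse config/result fields):
const readCsvWithHeaders = (file) => {
  const reader = new FileReader();
  reader.addEventListener('load', (e) => {
    Papaparse.parse(e.target.result, {
      header: true,          // use the first row as field names
      skipEmptyLines: true,  // ignore trailing blank lines
      complete: (results) => {
        // Each row is an object, e.g. { "column 1": "test", "column2": "lorem, ipsum, dummy/text" }
        console.log("rows: ", results.data);
        console.log("fields: ", results.meta.fields);
      },
    });
  });
  reader.readAsBinaryString(file);
};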
I have implemented drag-and-drop file upload in my React project. On every drag and drop of a file into the drop zone, I take the file, read its name and its data, convert the data to base64 using JavaScript's FileReader() and readAsDataURL(), and update the state, which I then need to send to the backend.
How do I append a number to the filename if a file with the same name already exists in the state?
e.g. file(1).csv or file 2.csv
Main state:
this.state = {
  Files: [],
};
Function that gets triggered on every drag and drop of a file:
FileHandling = (files) => {
  files.forEach((file) => {
    const reader = new FileReader();
    reader.readAsDataURL(file);
    reader.onload = () => {
      const CompleteData = {
        fileData: reader.result,
        fileName: file.name,
      };
      this.setState({
        Files: [...this.state.Files, CompleteData],
      });
    };
  });
};
You can use any naming pattern you want on the commented line.
getUniqueName = (fileName, index = 0) => {
  let checkName = fileName, ext = '';
  if (index) {
    // Split off the extension so the counter goes before it
    if (checkName.indexOf('.') > -1) {
      let tokens = checkName.split('.');
      ext = '.' + tokens.pop();
      checkName = tokens.join('.');
    }
    // Make any pattern here
    checkName = `${checkName} (${index})${ext}`;
  }
  const nameExists = this.state.Files.filter(f => f.fileName === checkName).length > 0;
  // Recurse with the next index until a free name is found
  return nameExists ? this.getUniqueName(fileName, index + 1) : checkName;
};
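For example, assuming this.state.Files already contains entries named export.csv and export (1).csv, the call would resolve like this (a hypothetical walk-through):
this.getUniqueName('export.csv');
// index 0: 'export.csv' exists     -> try again
// index 1: 'export (1).csv' exists -> try again
// index 2: 'export (2).csv' is free -> returned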
FileHandling = (files) => {
  files.forEach((file) => {
    const reader = new FileReader();
    reader.readAsDataURL(file);
    reader.onload = () => {
      const CompleteData = {
        fileData: reader.result,
        fileName: this.getUniqueName(file.name),
      };
      this.setState({
        Files: [...this.state.Files, CompleteData],
      });
    };
  });
};
You can check this.state.Files beforehand. A recursive function can be used here. Imagine you load a file named export.csv. A second export.csv would be renamed export_1.csv. But if you only checked the base name for a third export.csv, the verification would again produce export_1.csv, which already exists => error!
The best approach is:
const checkNameOfTheFile = (newFileName) => {
  // Ex 'export.csv'
  const counter = this.state.Files.filter(f => f.fileName === newFileName).length;
  // If counter >= 2, an error has already slipped into the files,
  // because it means 2 files already share the same name
  if (counter >= 2) {
    throw 'Error: duplicate name already present';
  }
  if (counter === 0) {
    return newFileName;
  }
  if (counter === 1) {
    const newName = `${newFileName.split('.')[0]}_${counter}.${newFileName.split('.')[1]}`;
    // Returns e.g. export_1.csv
    // We need to check that export_1.csv has not already been taken.
    // If so, the new name would be export_1_1.csv; not really pretty,
    // but it can be changed easily in this function.
    return checkNameOfTheFile(newName);
  }
};
const CompleteData = {
  fileData: reader.result,
  fileName: checkNameOfTheFile(file.name),
};
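To illustrate, assuming the drops happen one after another, a hypothetical trace looks like this:
// state: ['export.csv']                 -> checkNameOfTheFile('export.csv') returns 'export_1.csv'
// state: ['export.csv', 'export_1.csv'] -> checkNameOfTheFile('export.csv') returns 'export_1_1.csv'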
Determine whether there is a matching file. If so, determine whether it is already a duplicate (i.e. already has a number at the end). If it does, find that number and increment it; otherwise just append a '1'. If there are no matching files, don't do anything.
FileHandling = (files) => {
  files.forEach((file) => {
    const reader = new FileReader();
    reader.readAsDataURL(file);
    reader.onload = () => {
      let existingFiles = this.state.Files.filter(x => x.fileName === file.name);
      let fileName = file.name;
      if (existingFiles.length > 0) {
        // Strip the extension from the existing name and look for a trailing number
        let oldFileName = existingFiles[0].fileName.split('.')[0];
        let oldFileNumberMatch = oldFileName.match(/\d+$/);
        // If a number is already there, increment it; otherwise start at 1
        let fileNumber = oldFileNumberMatch ? parseInt(oldFileNumberMatch[0], 10) + 1 : 1;
        // file.name may have an extension: split it off, add the number to the base name, and rejoin
        let [base, ...extParts] = file.name.split('.');
        let ext = extParts.length ? '.' + extParts.join('.') : '';
        fileName = `${base}${fileNumber}${ext}`;
      }
      const CompleteData = {
        fileData: reader.result,
        fileName: fileName,
      };
      this.setState({
        Files: [...this.state.Files, CompleteData],
      });
    };
  });
};
OK, it's not exactly in the format of the question, but it seems to me that it can help.
Below is some simple JS code that goes through a list of strings representing file names and computes the next available name.
If the file 'file.png' already exists, 'file(1).png' is returned.
If 'file(1).png' also exists, 'file(2).png' is returned, and so on.
let fileList = ['fa.png', 'fb.mp4', 'fc.jpeg', 'fa(1).png'];

const getName = (fileName) => {
  let [name, end] = fileName.split('.');
  let num = 0;
  let curName = `${name}.${end}`;
  // Keep bumping the counter until the candidate name is not in the list
  let exists = fileList.filter(f => f === curName).length;
  while (exists) {
    console.log('curName:', curName, 'exists:', exists, 'num:', num);
    curName = `${name}(${++num}).${end}`;
    exists = fileList.filter(f => f === curName).length;
  }
  return curName;
};
console.log(getName('fa.png'));
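With the sample fileList above, the call resolves like this (traced by hand):
// 'fa.png' exists    -> try 'fa(1).png'
// 'fa(1).png' exists -> try 'fa(2).png'
// 'fa(2).png' is free, so getName('fa.png') returns 'fa(2).png'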
I am able to read/parse Excel files on the front end using the code below with FileReader and the xlsx package. However, for very large files this will crash the browser. I only need to read the first few rows; how can I achieve this?
Working code:
const xlsxParse = (file) => {
  var reader = new FileReader();
  reader.onload = function (e) {
    var data = e.target.result;
    let readedData = XLSX.read(data, { type: 'binary' });
    const wsname = readedData.SheetNames[0];
    const ws = readedData.Sheets[wsname];
    /* Convert the sheet to an array of arrays */
    const dataParse = XLSX.utils.sheet_to_json(ws, { header: 1 });
  };
  reader.readAsBinaryString(file);
};
My attempt to read the first few rows (not working):
const xlsxParse = (file) => {
  var reader = new FileReader();
  reader.onprogress = (e) => {
    var data = e.target.result;
    let readedData = XLSX.read(data, { type: 'binary' });
    if (readedData) {
      const wsname = readedData.SheetNames[0];
      const ws = readedData.Sheets[wsname];
      /* Convert the sheet to an array of arrays */
      const dataParse = XLSX.utils.sheet_to_json(ws, { header: 1 });
      console.log('dataParse', dataParse);
      if (dataParse.length > 3) {
        reader.abort();
      }
    }
  };
  reader.readAsBinaryString(file);
};
thanks
This can help you out. Basically, the sheetRows option limits parsing to the first n rows:
handleFile = (file /*:File*/) => {
  /* Boilerplate to set up FileReader */
  const reader = new FileReader();
  const rABS = !!reader.readAsBinaryString;
  reader.onload = e => {
    /* Parse data, reading only the first 100 rows */
    const bstr = e.target.result;
    const wb = XLSX.read(bstr, { type: rABS ? "binary" : "array", sheetRows: 100 });
    /* Get first worksheet */
    const wsname = wb.SheetNames[0];
    const ws = wb.Sheets[wsname];
    /* Convert the sheet to an array of row objects */
    const data = XLSX.utils.sheet_to_json(ws);
    const tableColumns = Object.keys(data[0]);
    console.log('data', data);
  };
  if (rABS) reader.readAsBinaryString(file);
  else reader.readAsArrayBuffer(file);
};
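Applied to the original xlsxParse above, a minimal sketch that keeps only the first few rows might look like this (xlsxParseFirstRows is a hypothetical name; it assumes a browser with Blob.arrayBuffer and a SheetJS version that supports the sheetRows parsing option):
const xlsxParseFirstRows = async (file, maxRows = 4) => {
  // The whole file is still read into memory, but SheetJS only parses the first rows
  const buffer = await file.arrayBuffer();
  const wb = XLSX.read(new Uint8Array(buffer), { type: 'array', sheetRows: maxRows });
  const ws = wb.Sheets[wb.SheetNames[0]];
  return XLSX.utils.sheet_to_json(ws, { header: 1 });
};
Note that sheetRows only limits how many rows end up in the parsed worksheet; it does not avoid reading the file itself into memory.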
I was trying to convert a blob to base64, and I found my way around, but while waiting for the result from the function displayBase64String, the map call in submitOffre returns an empty result even though console.log prints some data.
I'd appreciate any solution. Here is my code:
submitOffre = (saleData) => {
  debugger;
  var result = base64Service.displayBase64String(saleData);
  console.log("========", result);
  const rs = result.map(value => value.file); // Doesn't work.
  console.log(rs); // rs is empty
};
class Base64Service {
  blobToBase64 = (blob, callback) => {
    var reader = new FileReader();
    reader.onload = function () {
      var dataUrl = reader.result;
      var base64 = dataUrl.split(',')[1];
      callback(base64);
    };
    reader.readAsDataURL(blob);
  }

  displayBase64String(formProps) {
    const result = [];
    // The callbacks fire later, so result is still empty when this returns
    Object.entries(formProps.imageToUpload).forEach(([key, value]) => {
      this.blobToBase64(value, (data) => {
        result.push({ "file": `data:${value.type};base64,${data}` });
      });
    });
    return result;
  }
}

export default new Base64Service();
Something like this might help:
I've modified your code a bit, just to show you the basic pattern.
If you're doing more than one image at a time, you will need to use Promise.all to keep track of more than one promise at once (see the sketch after the code below).
submitOffre = async (saleData) => { // SEE THE async KEYWORD
  debugger;
  var result = await blobToBase64(saleData); // SEE THE await KEYWORD
  console.log("========", result);
  // result is the base64 string itself here, so there is no array to map over
  console.log(result);
};
I'll treat it as if you were converting only one image.
blobToBase64 = (blob, callback) => new Promise((resolve, reject) => {
  var reader = new FileReader();
  reader.onload = function () {
    var dataUrl = reader.result;
    var base64 = dataUrl.split(',')[1];
    if (callback) callback(base64); // keep the old callback API working, if one was passed
    resolve(base64); // NOTE THE resolve() FUNCTION TO RETURN SOME VALUE TO THE await
  };
  reader.onerror = () => reject(reader.error);
  reader.readAsDataURL(blob);
});
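For several images at once, the same pattern extends with Promise.all. Here is a minimal sketch of how the question's displayBase64String could be rewritten on top of the promise-based blobToBase64 above (the formProps.imageToUpload shape is taken from the question):
displayBase64String = async (formProps) => {
  // Convert every blob in parallel and wait for all of them to finish
  const blobs = Object.values(formProps.imageToUpload);
  const files = await Promise.all(blobs.map(value =>
    this.blobToBase64(value).then(data => ({ file: `data:${value.type};base64,${data}` }))
  ));
  return files; // an array of { file: 'data:<type>;base64,...' } objects
};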
I want to read an xlsx file, but I only want to read the first three records, because if I read all the records the browser will crash. I need help finding a way to read just the first three records (rows).
P.S.: I don't want to keep all the data in memory while parsing the xlsx file.
I am using this right now:
fileChange(event) {
  const fileList: FileList = event.target.files;
  if (fileList.length > 0) {
    const file: File = fileList[0];
    const reader = new FileReader();
    reader.onload = e => {
      const arrayBuffer = reader.result,
        data = new Uint8Array(arrayBuffer),
        arr = new Array();
      // Convert the raw bytes to a binary string that XLSX.read understands
      for (let i = 0; i !== data.length; ++i) {
        arr[i] = String.fromCharCode(data[i]);
      }
      const bstr = arr.join("");
      const workbook: XLSX.WorkBook = XLSX.read(bstr, { type: "binary" });
      const firstSheetName: string = workbook.SheetNames[0];
      const worksheet: XLSX.WorkSheet = workbook.Sheets[firstSheetName];
      this.setXlsxData(XLSX.utils.sheet_to_json(worksheet));
    };
    reader.readAsArrayBuffer(file);
  }
}

setXlsxData(data: Array<any>) {
  this.headers = Object.keys(data[0]);
  this.xlsxData = data;
}
Try this library; I have written and read some xlsx files with it, and it works smoothly.
https://www.npmjs.com/package/xlsx
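For the original requirement of only three rows, the library's sheetRows option (already shown in an earlier answer above) can be dropped into the existing fileChange handler; a small sketch of the changed XLSX.read call, under that assumption:
const workbook: XLSX.WorkBook = XLSX.read(bstr, { type: "binary", sheetRows: 4 });
// sheetRows: 4 keeps the header row plus the first three data rows;
// the file itself is still read into memory before parsing, though.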
I have a demo web page made with HTML + JavaScript, and I want to know how to read a local CSV file line by line so that I can extract data from it.
Without jQuery:
const $output = document.getElementById('output');
document.getElementById('file').onchange = function () {
  var file = this.files[0];
  var reader = new FileReader();
  reader.onload = function (progressEvent) {
    // Entire file
    const text = this.result;
    $output.innerText = text;
    // By lines
    var lines = text.split('\n');
    for (var line = 0; line < lines.length; line++) {
      console.log(lines[line]);
    }
  };
  reader.readAsText(file);
};
<input type="file" name="file" id="file">
<div id='output'>
...
</div>
Remember to put your javascript code after the file field is rendered.
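If you prefer not to rely on script ordering, the wiring can be deferred until the DOM is ready; a small sketch of the same handler wrapped in a DOMContentLoaded listener:
document.addEventListener('DOMContentLoaded', () => {
  const $output = document.getElementById('output');
  document.getElementById('file').onchange = function () {
    // ... same FileReader logic as above ...
  };
});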
Using ES6, the JavaScript becomes a little cleaner:
handleFiles(input) {
  const file = input.target.files[0];
  const reader = new FileReader();
  reader.onload = (event) => {
    const text = event.target.result;
    const allLines = text.split(/\r\n|\n/);
    // Reading line by line
    allLines.forEach((line) => {
      console.log(line);
    });
  };
  reader.onerror = (event) => {
    alert(event.target.error.name);
  };
  reader.readAsText(file);
}
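A small usage note, assuming the same file input markup as in the previous answer: the handler just needs to be attached to the input's change event.
document.getElementById('file').addEventListener('change', handleFiles);
// handleFiles receives the change event, so input.target.files[0] is the selected file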