How do I read a JSON file in Angular?

I am trying to load a JSON file from the local disk and use its data to fill a FabricJS canvas. I am having trouble getting the data out of the file.
This is what I have so far.
app.html
<input type="file" accept=".json" id="fileInput" (change)="loadFile($event)"/>
app.ts
loadFile(event) {
  const eventObj: MSInputMethodContext = <MSInputMethodContext> event;
  const target: HTMLInputElement = <HTMLInputElement> eventObj.target;
  const files: FileList = target.files;
  this.file = files[0];
  const reader = new FileReader();
  reader.readAsText(this.file, 'utf8');
  this.canvas.loadFromJSON(this.file, this.canvas.renderAll.bind(this.canvas), function (o, object) {
    console.log(o, object);
  });
}
Any thoughts on how I can make this work?

FileReader has an asynchronous API.
You must register a callback on the onload event to get the data, and use an arrow function so that this still refers to the component inside the callback.
loadFile(event) {
  const eventObj: MSInputMethodContext = <MSInputMethodContext> event;
  const target: HTMLInputElement = <HTMLInputElement> eventObj.target;
  const files: FileList = target.files;
  this.file = files[0];
  const reader = new FileReader();
  // Arrow function keeps `this` bound to the component.
  reader.onload = () => {
    this.canvas.loadFromJSON(reader.result, this.canvas.renderAll.bind(this.canvas), function (o, object) {
      console.log(o, object);
    });
  };
  reader.readAsText(this.file, 'utf8');
}
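If you prefer async/await, the read can also be wrapped in a Promise. A minimal sketch (readJsonFile is just an illustrative helper, not part of any library):

private readJsonFile(file: File): Promise<string> {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onerror = reject;
    reader.onload = () => resolve(reader.result as string);
    reader.readAsText(file, 'utf8');
  });
}

async loadFile(event: Event) {
  const input = event.target as HTMLInputElement;
  if (!input.files || input.files.length === 0) { return; }
  const json = await this.readJsonFile(input.files[0]);
  this.canvas.loadFromJSON(json, this.canvas.renderAll.bind(this.canvas));
}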

Related

SheetJS XLSX.read ends up with Page Unresponsive

I am trying to read an xlsx file of about 3000 KB and convert it to JSON using SheetJS, but when I read the file I get a Chrome 'Page Unresponsive' error. Clicking Wait on the error every time I upload a file is not ideal.
How can I resolve this issue?
onChange(event) {
  this.file = event.target.files ? event.target.files[0] : null;
  this.uploaded = true;
  const reader = new FileReader();
  reader.readAsArrayBuffer(this.file);
  reader.onload = function () {
    const data = new Uint8Array(reader.result);
    const wb = XLSX.read(data, { type: 'array', sheets: ['Sheet1', 'Sheet2'] });
    this.summary = XLSX.utils.sheet_to_json(wb.Sheets.Summary);
    this.weeklyDetails = XLSX.utils.sheet_to_json(wb.Sheets['Weekly Details']);
    console.log(this.weeklyDetails);
  };
},

Is it possible to convert a blob file into a base64Data in Javascript (Ionic,Angular)?

async FileZip() {
  const code = await fetch("./assets/input.txt")
  var blob = await downloadZip([code]).blob()
  console.log(blob);

  function blobToBase64(blob: Blob): Observable<string> {
    return new Observable<string>(observer => {
      const reader = new FileReader();
      reader.onerror = observer.error;
      reader.onabort = observer.error;
      reader.onload = () => observer.next(reader.result as string);
      reader.onloadend = observer.complete;
      FileSharer.share({
        filename: "input.zip",
        base64Data: //base64datawillbehere ,
        contentType: 'application/zip'
      });
      reader.readAsDataURL(blob);
    })
  }
}
I am pretty new to Ionic and app development.
I have compressed a text file into a zip blob using the client-zip library; downloadZip() gives me a zip blob as shown above.
I want to share this file as a zip using Capacitor FileSharer. But to use that plugin, it seems I have to convert the zip blob into base64 data.
Can anyone tell me how to do it? Or is it even possible?
Please forgive me if you find my question too basic; as I said, I am pretty new to JavaScript.
Consider using the following function:
function blobToBase64(blob: Blob): Observable<string> {
  return new Observable<string>(observer => {
    const reader = new FileReader();
    reader.onerror = observer.error;
    reader.onabort = observer.error;
    reader.onload = () => observer.next(reader.result as string);
    reader.onloadend = observer.complete;
    reader.readAsDataURL(blob);
  })
}
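For completeness, one possible way to consume this Observable, mirroring the FileSharer call from the question (a sketch, not a full component):

blobToBase64(blob).subscribe(dataUrl => {
  // reader.result is a data URL such as "data:application/zip;base64,...",
  // so strip the prefix if the plugin expects the raw base64 payload.
  const base64Data = dataUrl.split(',')[1];
  FileSharer.share({
    filename: "input.zip",
    base64Data,
    contentType: 'application/zip'
  });
});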
Try modifying your code as demonstrated below. (I haven't changed the previous answer, as it might be useful for others who want to implement such operations with an Observable; in your case I would recommend using a Promise.)
ngOnInit(): void {
  this.fileZip();
}

private blobToBase64(blob: Blob): Promise<string> {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onerror = reject;
    reader.onabort = reject;
    reader.onload = () => resolve(reader.result as string);
    reader.readAsDataURL(blob);
  })
}

private async fileZip(): Promise<void> {
  const code = await fetch("./assets/input.txt")
  const blob = await downloadZip([code]).blob();
  const base64Data = await this.blobToBase64(blob);
  await FileSharer.share({
    filename: "input.zip",
    base64Data: base64Data,
    contentType: "application/zip",
  })
}
You can try this code:
async fileZip() {
  const code = await fetch("./assets/input.txt");
  var blob = await downloadZip([code]).blob()
  console.log(blob);
  var reader = new FileReader();
  reader.readAsDataURL(blob);
  reader.onloadend = () => {
    const result = reader.result as string;
    // The data URL looks like "data:application/zip;base64,<data>",
    // so everything after the comma is the raw base64 payload.
    const base64Data = result.split(',')[1];
    console.log(base64Data)
    FileSharer.share({
      filename: "json.zip",
      base64Data,
      contentType: 'application/zip'
    });
  }
}

Vue.js excel to json

I tried to read Excel files with Vue.js, but as soon as I read the file the memory usage skyrockets to around 5 GB of RAM, even though the Excel file is fairly small. I need to convert the file to JSON.
In the Vue method that handles the Excel file I tried every type option I saw in the documentation, but each one gives a different error.
I saw a similar question here but still could not solve this. When I tried:
base64: "TypeError: input.replace is not a function"
binary: "TypeError: x.charCodeAt is not a function"
string: "TypeError: data.match is not a function"
array: this is the one that makes memory usage reach 5 GB
Also, when I tried to use a new FileReader as shown in the documentation, the reader.onload function never ran.
In the actual template I tried two things. When I use the buffer type it seems to work, but all the functions return an empty array, as if the file were empty, though it is not. Both approaches did the same thing.
<v-file-input
  v-on:change="displayFile($event)"
  v-model="file">
</v-file-input>

<input type="file" name="xlfile" id="xlf" v-on:change="displayFile($event)" />

displayFile: function (event) {
  // console.log(event.target.files[0])
  // const file = event.target.files[0]
  // const workbook = XLSX.read(file, {
  //   type: 'string'
  // })
  // console.log(workbook, workbook.SheetNames)
  // const res = XLSX.read(file)
  // console.log(res)
  // const res = XLSX.read(this.file)
  // console.log(res)
  console.log(this.file)
  this.file.text().then(text => {
    const fileType = this.file.type
    console.log(fileType)
    // this.PropceseMethod(this.file, fileType)
  })
  const reader = new FileReader()
  reader.onload = (data) => {
    console.log('HERE')
    console.log(data)
    const workbook = XLSX.read(data, {
      type: 'buffer'
    })
    console.log(workbook)
    workbook.SheetNames.forEach(function (sheetName) {
      console.log(sheetName)
      console.log(workbook.Sheets[sheetName])
      // Here is your object
      const XLRowObject = XLSX.utils.sheet_to_row_object_array(workbook.Sheets[sheetName])
      console.log(XLSX.utils.sheet_to_json(workbook.Sheets[sheetName]))
      console.log(XLRowObject)
      const jsonObject = JSON.stringify(XLRowObject)
      console.log(jsonObject)
    })
  }
  reader.onerror = function (ex) {
    console.log(ex)
  }
  reader.readAsText(this.file)
}
To manage this I had to change the way I was reading the file.
When I used readAsBinaryString it worked, paired with the 'binary' type in XLSX.read.
Note that this function reads only the first sheet; a sketch for reading every sheet follows the code below.
fileToJson (e) {
  const file = e.target.files[0]
  /* Boilerplate to set up FileReader */
  const reader = new FileReader()
  reader.onload = (e) => {
    /* Parse data */
    const bstr = e.target.result
    const wb = XLSX.read(bstr, { type: 'binary' })
    /* Get first worksheet */
    const wsname = wb.SheetNames[0]
    const ws = wb.Sheets[wsname]
    /* Convert array of arrays */
    const data = XLSX.utils.sheet_to_json(ws, { header: 1 })
    /* Update state */
    this.data = data
    const header = data.shift()
  }
  reader.readAsBinaryString(file)
}
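If you need every sheet rather than just the first one, the same workbook can be looped over with wb.SheetNames. A minimal sketch, assuming the same XLSX import and 'binary' read as above (fileToJsonAllSheets and this.sheets are illustrative names, not from the original code):

fileToJsonAllSheets (e) {
  const file = e.target.files[0]
  const reader = new FileReader()
  reader.onload = (ev) => {
    const wb = XLSX.read(ev.target.result, { type: 'binary' })
    // Build an object keyed by sheet name, each value an array of row objects.
    const result = {}
    wb.SheetNames.forEach((name) => {
      result[name] = XLSX.utils.sheet_to_json(wb.Sheets[name])
    })
    this.sheets = result // hypothetical data property
  }
  reader.readAsBinaryString(file)
}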
This code worked for me in a Vue CLI App:
// Important that import statement must get the full.min.js file only.
import XLSX from '../../../node_modules/xlsx/dist/xlsx.full.min.js'
var reader = new FileReader()
reader.onload = function (e) {
  var data = e.target.result
  var workbook = XLSX.read(data, { type: 'binary' })
  let sheetName = workbook.SheetNames[0]
  let worksheet = workbook.Sheets[sheetName]
  let rowObject = XLSX.utils.sheet_to_row_object_array(worksheet)
  const finalJsonData = JSON.stringify(rowObject, undefined, 4)
  console.log(finalJsonData)
}
reader.readAsBinaryString(this.excelFile)
With my final JSON Output as:
[
{
"email": "test5#test.com",
"password": "password",
"full_name": "Some Name 5",
"mobile": 9897675463
},
{
"email": "test6#test.com",
"password": "password",
"full_name": "Some Name 6",
"mobile": 9897675463
},
...
...
]
And my Excel file looked like this (screenshot not included here).

Angular 6 - Upload files

I am trying to post files (video and thumbnail) and an object to the server, but I have an issue because of the structure the backend expects. When I do it from Postman, this is what it looks like, and it works:
{'name': ['Blabla'], 'user': ['8c3a636c-9d08-453d-9e59-7a0ec93200c4'], 'file': [<InMemoryUploadedFile: SampleVideo_1280x720_1mb.mp4 (video/mp4)>]}>
I am having trouble passing the file like this and don't know how to do it. I tried the following:
videoFile: File[] = [];
thumbnailFile: File[] = [];
files: File[] = [];

readVideoUrl(event: any) {
  this.videoFile = [];
  const eventObj: MSInputMethodContext = <MSInputMethodContext> event;
  const target: HTMLInputElement = <HTMLInputElement> eventObj.target;
  const files: FileList = target.files;
  if (files) {
    this.videoFile.push(files[0]);
    this.videoModel.name = files[0].name;
  }
  if (event.target.files && event.target.files[0]) {
    var reader = new FileReader();
    reader.onload = (event: ProgressEvent) => {
      this.videoUrl = (<FileReader>event.target).result;
    }
    reader.readAsDataURL(event.target.files[0]);
  }
}

readThumbUrl(event: any) {
  this.thumbnailFile = [];
  const eventObj: MSInputMethodContext = <MSInputMethodContext> event;
  const target: HTMLInputElement = <HTMLInputElement> eventObj.target;
  const files: FileList = target.files;
  if (files) {
    this.thumbnailFile.push(files[0]);
  }
  if (event.target.files && event.target.files[0]) {
    var reader = new FileReader();
    reader.onload = (event: ProgressEvent) => {
      this.thumbUrl = (<FileReader>event.target).result;
    }
    reader.readAsDataURL(event.target.files[0]);
  }
}
I pass the model and files:
this.campaignService.createVideo(this.videoModel, this.files)
  .subscribe(
    (response: any) => {
    },
    (error) => {
      console.log(error);
    }
  );
And here is the issue: how can I create the structure from above with FormData? I used to do it like this:
postMultipart(url: string, data: any, files: File[]) {
  const formData: FormData = new FormData();
  // I understand this stringify is part of the issue,
  // just put the code as it is at the moment.
  formData.append('data', JSON.stringify(data));
  for (const file of files) {
    formData.append(file.name, file);
  }
  const result = this.http.post(url, formData)
    .pipe(map((response: Response) => {
      return response;
      // }
    }),
    catchError(response => this.handleError(response))
  );
  return result;
}
But this passes everything as a string, and the backend is not expecting that. How can I get this (this is just the video; the thumbnail is an image):
{'file': [<InMemoryUploadedFile: SampleVideo_1280x720_1mb.mp4 (video/mp4)>]}>
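One way the FormData could be assembled to match that structure is to append each field of the model under its own key instead of stringifying the whole object, and to append the files under a single 'file' key. This is only a sketch, assuming the backend reads 'name', 'user', and 'file' as separate multipart fields:

postMultipart(url: string, data: any, files: File[]) {
  const formData: FormData = new FormData();
  // Append each scalar field under its own key so the backend
  // receives real form fields rather than one JSON string.
  Object.keys(data).forEach(key => formData.append(key, data[key]));
  // Append every file under the same 'file' key; the third argument sets the filename.
  for (const file of files) {
    formData.append('file', file, file.name);
  }
  return this.http.post(url, formData)
    .pipe(catchError(response => this.handleError(response)));
}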

Reading multiple files with FileReader using .map()

I have an array of files and I need to format each file into JSON (an object at this point) together with a bunch of other info. Here's what I have tried:
const uploadData = Files.map((file) => {
  const fr = new FileReader();
  fr.readAsArrayBuffer(file);
  fr.onload = (event) => {
    const fileData = event.target.result;
    return {
      query: {
        documentName: file.name,
        personId: personId,
        serviceId: serviceID,
        documentFile: fileData,
      }
    }
  }
})
I want to use immutable techniques. I have a guess why this doesn't work but no idea how to fix it: I think .map() does not wait for the FileReader to finish and thus returns only an array of undefined values. I tried to use an IIFE but was unsuccessful.
const uploadData = [];
Files.forEach((file) => {
  const fr = new FileReader();
  fr.readAsArrayBuffer(file);
  fr.onload = (event) => {
    const fileData = event.target.result;
    uploadData.push({
      query: {
        documentName: file.name,
        personId: personId,
        serviceId: serviceID,
        documentFile: fileData,
      }
    })
  }
})
Since the return occurs inside the callback, you should use a local variable and push the data into it.
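If uploadData has to be complete before it is used, another option (not the answerer's exact suggestion, just a sketch using the same field names as the question) is to wrap each read in a Promise and wait for all of them with Promise.all:

const readFile = (file) =>
  new Promise((resolve, reject) => {
    const fr = new FileReader();
    fr.onerror = reject;
    fr.onload = (event) => resolve({
      query: {
        documentName: file.name,
        personId: personId,
        serviceId: serviceID,
        documentFile: event.target.result,
      }
    });
    fr.readAsArrayBuffer(file);
  });

Promise.all(Files.map(readFile)).then((uploadData) => {
  // uploadData is now a fully populated array of query objects.
  console.log(uploadData);
});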
