Spread operator not working with module function - javascript

I have this function in a module to load files as base64 along with their attributes:
function getFiles(obj, type) {
    let destinazione = type;
    let files = obj.files;
    let imagesDocs = [];
    files.forEach(file => {
        let reader = new FileReader();
        reader.onload = function(ev) {
            let result = this.result;
            let f = {
                name: file.name,
                size: file.size,
                type: file.type,
                data: result
            };
            imagesDocs.push(f);
        };
        reader.readAsDataURL(file);
    });
    return imagesDocs;
}
export default getFiles;
In another module I import the fileArray module and use it as:
const resultArray = getFiles(e.target, type);
let newArray = [...resultArray];
console.log(newArray);
console.log shows me an empty array, while resultArray contains several objects.
I also tried:
let numbers = [1, 2, 3, 4];
let newnumbers = [...numbers];
and it works fine.
Why?
Update:
Using Anson Miu's code I solved it.
The files variable, being a FileList, must be converted to an array. I did it with [...files].
Thanks to all.
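For anyone hitting the same thing, a minimal sketch of that conversion (the input selector is illustrative):
// a FileList is array-like and iterable, but has no map/forEach of its own
const files = document.querySelector('input[type=file]').files;
const viaSpread = [...files];        // spread into a real array
const viaFrom = Array.from(files);   // equivalent alternative
viaSpread.map(file => file.name);    // array methods now work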

The problem is that reader.onload is an asynchronous process.
What happens is that you loop through all the files, then you return the still-empty result (imagesDocs), and only some time later is imagesDocs populated.
You need to handle the process asynchronously, using callbacks, async/await, and/or promises.
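For illustration, the callback route could look roughly like this (a sketch, not the asker's code; done is an assumed callback parameter):
function getFiles(obj, done) {
    const files = [...obj.files]; // FileList -> array
    const imagesDocs = [];
    files.forEach(file => {
        const reader = new FileReader();
        reader.onload = function () {
            imagesDocs.push({ name: file.name, size: file.size, type: file.type, data: this.result });
            // fire the callback only once every file has been read
            if (imagesDocs.length === files.length) done(imagesDocs);
        };
        reader.readAsDataURL(file);
    });
}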

reader.onload is an asynchronous callback, so it is possible that the imageDocs array is not populated when you return from your getFiles function.
I suggest extracting the logic for reading a file into a separate function that returns a Promise:
function readFile(file) {
    return new Promise((resolve, reject) => {
        const reader = new FileReader();
        reader.onload = function(ev) {
            const result = this.result;
            resolve({
                name: file.name,
                size: file.size,
                type: file.type,
                data: result,
            });
        };
        reader.readAsDataURL(file);
    });
}
Then update getFiles to be an asynchronous function that maps the files to Promises and returns a Promise that resolves to the imageDocs:
async function getFiles(obj, type) {
    const destinazione = type;
    // obj.files is a FileList, which has no .map, so spread it into an array first
    const files = [...obj.files];
    const imageDocs = await Promise.all(files.map(readFile));
    return imageDocs;
}
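A possible call site, assuming it lives inside an async function (a sketch, not part of the original answer):
const resultArray = await getFiles(e.target, type);
const newArray = [...resultArray]; // populated, because the Promise has already resolved
console.log(newArray);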

I think you need to
export default getFiles;


Blob to Base64 in javascript not returning anything from FileReader

I am using FileReader in typescript to convert a blob to a base64 image that will then be displayed in the template of my application.
adaptResultToBase64(res: Blob): string {
    let imageToDisplay: string | ArrayBuffer | null = '';
    const reader = new FileReader();
    reader.onloadend = function () {
        imageToDisplay = reader.result;
        return imageToDisplay;
    };
    reader.readAsDataURL(res);
    return imageToDisplay;
}
While the data logged inside the reader.onloadend function displays the base64 string, I cannot pass it out of the function.
I have tried adding a callback, but where it is called elsewhere it returns nothing but an empty string.
Please check this code:
<input type="file" id="file">
<button id="click">click</button>
let data: string | ArrayBuffer;
document.getElementById('file').onchange = function (e: Event) {
    let files: FileList | null = (<HTMLInputElement>e.target).files;
    let reader: FileReader = new FileReader();
    reader.onload = function (e: ProgressEvent<FileReader>) {
        console.log(e.target.result);
        data = e.target.result;
    };
    if (files.length > 0) {
        reader.readAsDataURL(files?.[0]);
    }
};
document.getElementById('click').onclick = function () {
    console.log(data); // result if present, otherwise null is returned
};
Using a separate method, shown below. The return value is a Promise.
function adaptResultToBase64(res: Blob): Promise<string> {
    let reader: FileReader = new FileReader();
    return new Promise((resolve, reject) => {
        reader.onloadend = () => {
            resolve(reader.result as string);
        };
        reader.onerror = () => {
            reject("Error reading file.");
        };
        reader.readAsDataURL(res);
    });
}
To get the result:
adaptResultToBase64(/* Blob value */)
    .then(resp => console.log(resp))
    .catch(error => console.log(error));
See MDN and learn.javascript.ru for specifics on Promises.
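Equivalently, inside an async function you could consume it with await (someBlob is a placeholder):
const base64 = await adaptResultToBase64(someBlob);
console.log(base64);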
The basic result I needed: I did not realise that reader.onload is actually a callback for reader.readAsDataURL and everything inside it finishes asynchronously.
adaptResultToBase64(res: Blob) {
    const reader = new FileReader();
    reader.onload = function () {
        // Was missing code here that needed to be called asynchronously.
        adapterToNewObject(reader.result.toString());
    };
    reader.readAsDataURL(res);
}
I was performing this in Angular, so for anyone else who runs into this problem, here it is using Angular syntax:
In your class:
export class Component {
    adaptedResult: Result;
    getBase64() {
        this.http.get().subscribe((result: Blob) => {
            const reader = new FileReader();
            reader.onload = () => {
                this.adaptedResult = this.adapter(reader.result); // Assign or use reader.result value; this is an example of using an adapter function.
            };
            reader.readAsDataURL(result);
        });
    }
    adapter(base64: string) {
        return {
            name: 'image',
            image: base64
        };
    }
}

NodeJS CSVReader - createReadStream with csv-parse returns empty array

I have a CSVReader class that takes two parameters (filePath and model). The file path is the path to the .csv file, and the model is the model that the .csv will be built on. The data will be stored in the output array. It has a problem: when I return the array, it is empty. But when I console.log the array, the data is there. Can someone help me?
Index.js
const parse = require('csv-parse')
const fs = require('fs');
const output = [];
class CSVReader {
    static GetRecord(filePath, model) {
        fs.createReadStream(filePath)
            .pipe(parse({columns: false, delimiter: ',', trim: true, skip_empty_lines: true}))
            .on('readable', function () {
                let record
                while (record = this.read()) {
                    let city = model.create(record)
                    output.push(record)
                }
            })
            .on('end', function () {
                //console.log(output);
            })
        return output;
    }
}
module.exports = CSVReader;
I used Jest to test the file, but the test fails: expected 6, but received [].
Index.test.js
const CSVReader = require('../src/Index');
const City = require('../src/Models/City')
test('Can Read CSV File', () => {
    let filePath = 'data/worldcities.csv';
    let records = CSVReader.GetRecord(filePath, City);
    expect(records.length).toBe(6);
});
tl;dr:
This is an async call, so you cannot just return the response and expect it to work. You need to use the Promise API and an async function.
What makes this async?
All of the Node.js fs API is async (excluding the fs.*Sync functions).
How can I use Promises?
You can return a Promise from your function, passing it an executor callback:
return new Promise((resolve, reject) => { /* callback */ });
Fixing the code
// all of this is fine
const parse = require('csv-parse')
const fs = require('fs');
// remove the const output = []; as this will cause problems
class CSVReader{
// this needs to be async
static async GetRecord(filePath, model){
// return a promise
return new Promise((resolve, reject) => {
// assign output here (https://stackoverflow.com/a/66402114/14133230)
const output = [];
fs.createReadStream(filePath)
.pipe(parse({columns: false, delimiter: ',', trim: true, skip_empty_lines: true}))
.on('readable', function (){
let record
while (record = this.read()){
let city = model.create(record)
output.push(record)
}
// you may need to call WriteStream#close() here
})
.on('end', function (){
// output is available here
// resolve the promise
resolve(output);
})
});
}
}
module.exports = CSVReader;
Using the new Promise-based function
const CSVReader = require('../src/Index');
const City = require('../src/Models/City')
test('Can Read CSV File', () => {
    let filePath = 'data/worldcities.csv';
    let records = CSVReader.GetRecord(filePath, City); // type Promise
    // return the promise so Jest waits for it to settle
    return records.then((response) => { // when the promise fulfills
        expect(response.length).toBe(6);
    });
});
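An async test body reads a little more directly and behaves the same (a sketch, assuming Jest):
test('Can Read CSV File', async () => {
    const records = await CSVReader.GetRecord('data/worldcities.csv', City);
    expect(records.length).toBe(6);
});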
First recommendation: get rid of the globally defined const output = [] in your library class. This will cause a lot of confusion for whoever uses your static method:
you'll end up accumulating the results of previous calls on there.
Also, if you don't need to handle huge CSV files, you can go for the synchronous version and get rid of the complexity of the async loading:
const parse = require('csv-parse/lib/sync');
const fs = require('fs');
class CSVReader {
    static GetRecord(filePath, model) {
        const csvContents = fs.readFileSync(filePath);
        const output = parse(csvContents, {columns: false, delimiter: ',', trim: true, skip_empty_lines: true})
            .map(function (record) {
                return model.create(record);
            });
        return output;
    }
}
module.exports = CSVReader;
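With the synchronous version the original test works unchanged, because the records come back as a plain array (assuming the same fixture file):
const records = CSVReader.GetRecord('data/worldcities.csv', City);
console.log(records.length); // 6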
Another potentially helpful restructuring could come from attaching the factory pattern (which you currently aim to implement with CSVReader) directly to the base implementation of your Models. This will make it clearer which Models are usable with the CSVReader and how the implementation of a Model is structured to be usable with the pattern.
The base CSVModel provides the general listFromCSV functionality and modelFromRecord instantiation.
src/Models/CSVModel.js
const parse = require('csv-parse/lib/sync');
const fs = require('fs');
class CSVModel {
    static modelFromRecord(record) {
        var model = new this();
        model.record = record;
        return model;
    }
    static listFromCSV(filePath) {
        const csvContents = fs.readFileSync(filePath);
        return parse(csvContents, {columns: false, delimiter: ',', trim: true, skip_empty_lines: true})
            .map((record) => {
                return this.modelFromRecord(record);
            });
    }
}
module.exports = CSVModel;
The City model inherits the listFromCSV factory from CSVModel and specifies how the CSV data maps to local model properties.
src/Models/City.js
const CSVModel = require('./CSVModel');
class City extends CSVModel {
    static modelFromRecord(record) {
        var model = super.modelFromRecord(record);
        // assign properties of the CSV row to dedicated model properties
        [model.name, model.id, model.asd] = [record[0], record[1], record[2]];
        // or classical
        model.name = record[0];
        // ...
        return model;
    }
}
module.exports = City;
This allows you to use the concrete Model class to instantiate a list from a CSV.
test/csvmodel.test.js
const City = require('../src/Models/City');
test('Can Read City list', () => {
    let filePath = 'data/worldcities.csv';
    let records = City.listFromCSV(filePath);
    expect(records.length).toBe(6);
});

Rxjs chain to convert observable<File> to observable<string> (base64)

I have a working code to convert my File object to base64:
let reader = new FileReader();
reader.readAsDataURL(myFile);
reader.onload = () => {
    let resultStrOrArrayBuf = reader.result;
    if (!(resultStrOrArrayBuf instanceof ArrayBuffer)) {
        // ..do something with resultStrOrArrayBuf
    }
};
However, I now have to integrate this part into an existing rxjs chain. In the chain I receive the File object and would like to continue with the base64 result of the conversion. However, the conversion is done via the onload event. Is there some way to convert this event to a new observable and pass it down the chain?
There is no out-of-the-box method to convert this loaded event to an rxjs observable; you will have to make your own operator.
export const dataUrlToObs = myFile => new Observable<string | ArrayBuffer>(subscriber => {
    const reader = new FileReader();
    reader.readAsDataURL(myFile);
    reader.onload = () => { subscriber.next(reader.result); subscriber.complete(); };
    reader.onerror = () => subscriber.error(reader.error);
    return () => reader.abort(); // cancel function in case you unsubscribe from the obs
});
It can later be used like this:
..chain
switchMap(myFile => dataUrlToObs(myFile)),
tap(resultStrOrArrayBuf => {
    if (!(resultStrOrArrayBuf instanceof ArrayBuffer)) {
        // ..do something with resultStrOrArrayBuf
    }
})
Consider the following helper function, which accepts a Blob as a parameter and returns an Observable<string>:
function blobToBase64(blob: Blob): Observable<string> {
    return new Observable<string>(observer => {
        const reader = new FileReader();
        // wrap the observer and reader calls in arrow functions so they keep their `this` binding
        reader.onerror = err => observer.error(err);
        reader.onabort = err => observer.error(err);
        reader.onload = () => observer.next(reader.result as string);
        reader.onloadend = () => observer.complete();
        reader.readAsDataURL(blob);
        return {
            unsubscribe: () => reader.abort()
        };
    });
}
Usage:
declare const fileObservable: Observable<File>;
fileObservable
    .pipe(switchMap(blobToBase64))
    .subscribe(base64 => console.log(base64));
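If the chain should survive read failures, one option is to catch per file inside the switchMap (a sketch; EMPTY simply drops the failed file):
import { EMPTY } from 'rxjs';
import { catchError, switchMap } from 'rxjs/operators';

fileObservable
    .pipe(
        switchMap(file => blobToBase64(file).pipe(
            catchError(() => EMPTY) // skip files that fail to read, keep the chain alive
        ))
    )
    .subscribe(base64 => console.log(base64));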

how to turn multiple files into base64 string?

I have a component like this:
<input type="file" multiple @change="tobase64Handler($event)">
<script>
data() {
    return {
        files: [],
    }
},
methods: {
    tobase64Handler(event) {
        // the code
    }
}
</script>
and I would like to turn all of the selected files into base64 strings, something like this:
files: [
    {
        selectedFile: 'ajsdgfauywdljasvdajsgvdasdo1u2ydfouayvsdlj2vo8ayasd...'
    },
    {
        selectedFile: 'askdhgoiydvywdljasvdajsgvdasdo1u2ydfoakjgsfdjagswsd...'
    },
    {
        selectedFile: '12edashjvlsljasvdajsgvdasdo1u2ydfouayvsdlj2vo8ayfsd...'
    },
]
How do I achieve that?
You can loop through the files, call a helper method toBase64 for each one, push all the Promises to an array, and resolve all of them:
toBase64(file) {
    return new Promise((resolve, reject) => {
        const reader = new FileReader();
        reader.readAsDataURL(file);
        reader.onload = () => resolve(reader.result);
        reader.onerror = error => reject(error);
    });
},
async tobase64Handler(files) {
    const filePathsPromises = [];
    files.forEach(file => {
        filePathsPromises.push(this.toBase64(file));
    });
    const filePaths = await Promise.all(filePathsPromises);
    const mappedFiles = filePaths.map((base64File) => ({ selectedFile: base64File }));
    return mappedFiles;
}
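To wire this into the component from the question, a thin wrapper bound in the template could convert the FileList to an array and store the result (a sketch; handleChange is an assumed name):
// template: <input type="file" multiple @change="handleChange($event)">
async handleChange(event) {
    this.files = await this.tobase64Handler([...event.target.files]);
}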
This should do the trick: JSON.stringify(files). If you later want to turn it back into the original array, object, or value, you'd do JSON.parse(files).
UPDATE: turns out I was wrong and JSON.stringify/JSON.parse don't work with files (thanks for the info, @patrick evans).
I found this answer, which seems better (the one by @ahmed hamdy).

Lag by taking an instance of an array using es6

I get an empty array when I log a spread copy of an array!
onSelectMultipleImage = (event): Promise<any> => {
    return new Promise(resolve => {
        const files = <File[]>event.target.files;
        let file: File;
        let counter = -1;
        const response = [];
        while (file = files[++counter]) {
            const reader: FileReader = new FileReader();
            reader.onload = ((f) => {
                return () => {
                    response.push({
                        file: f,
                        base64: reader.result
                    })
                }
            })(file);
            reader.readAsDataURL(file);
            console.log(counter);
            if (counter == files.length - 1) {
                console.log('inside the while');
                resolve(response);
            }
        }
    });
};
onImagesSelect = async (event: Event) => {
    this.newImages = await this.helper.onSelectMultipleImage(event) || [];
    console.log(this.newImages, "Original"); // [file: File, base64: "base 64 long string..."]
    console.log([...this.newImages], "instance"); // [] empty array
    setTimeout(() => { console.log([...this.newImages], 'instance') }, 2000); // [file: File, base64: "base 64 long string..."]
};
So why am I getting these results? Is it something caused by the base64 data inside the array? If yes, what is the solution?
It doesn't wait for reader.onload to complete, so resolve(response) is called before response.push runs for every file. (The non-spread console.log looks populated only because the console evaluates the array lazily, after the pushes have happened.)
You can create a promise to read one file and resolve them together with Promise.all, like the following code:
readFile = (file: File): Promise<any> => {
    return new Promise(resolve => {
        const reader: FileReader = new FileReader();
        reader.onload = (f) => {
            resolve({
                file: f,
                base64: reader.result
            });
        };
        reader.readAsDataURL(file);
    });
};
onSelectMultipleImage = (event): Promise<any> => {
    // event.target.files is a FileList, so convert it to an array before mapping
    const files = Array.from(<FileList>event.target.files);
    return Promise.all(files.map(file => readFile(file)));
};
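With this version the spread in onImagesSelect works immediately, because the awaited Promise resolves only after every file has been read:
onImagesSelect = async (event: Event) => {
    this.newImages = await this.helper.onSelectMultipleImage(event) || [];
    console.log([...this.newImages], 'instance'); // now populated
};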
