JavaScript Promises with FileReader()

I have the following HTML code:
<input type='file' multiple>
And here's my JS code:
var inputFiles = document.getElementsByTagName("input")[0];
inputFiles.onchange = function(){
    var fr = new FileReader();
    for(var i = 0; i < inputFiles.files.length; i++){
        fr.onload = function(){
            console.log(i) // Prints "0, 3, 2, 1" in case of 4 chosen files
        }
        fr.readAsDataURL(inputFiles.files[i]);
    }
}
So my question is, how can I make this loop synchronous? That is, first wait for the file to finish loading, then move on to the next file. Someone told me to use JS Promises, but I can't make it work. Here's what I'm trying:
var inputFiles = document.getElementsByTagName("input")[0];
inputFiles.onchange = function(){
    for(var i = 0; i < inputFiles.files.length; i++){
        var fr = new FileReader();
        var test = new Promise(function(resolve, reject){
            console.log(i) // Prints 0, 1, 2, 3 just as expected
            resolve(fr.readAsDataURL(inputFiles.files[i]));
        });
        test.then(function(){
            fr.onload = function(){
                console.log(i); // Prints only 3
            }
        });
    }
}
Thanks in advance...

We modified Mido's answer to get it to work, as follows:
function readFile(file){
    return new Promise((resolve, reject) => {
        var fr = new FileReader();
        fr.onload = () => {
            resolve(fr.result)
        };
        fr.onerror = reject;
        fr.readAsText(file.blob);
    });
}
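A quick usage sketch (my assumption: each argument carries its Blob under a blob property, as the snippet above implies):
// Hypothetical usage; the { blob: ... } object shape mirrors the snippet above
readFile({ blob: new Blob(["hello"], { type: "text/plain" }) })
    .then(text => console.log(text)) // "hello"
    .catch(err => console.error("read failed:", err));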

If you want to do it sequentially (not synchronously) using Promises, you could do something like:
var inputFiles = document.getElementsByTagName("input")[0];
inputFiles.onchange = function(){
    var promise = Promise.resolve();
    // FileList has no .map, so convert it to an array first; reassign
    // `promise` so each read is chained after the previous one
    Array.from(inputFiles.files).forEach(file => {
        promise = promise.then(() => pFileReader(file));
    });
    promise.then(() => console.log('all done...'));
}
function pFileReader(file){
    return new Promise((resolve, reject) => {
        var fr = new FileReader();
        fr.onload = resolve; // CHANGE to whatever function you want which would eventually call resolve
        fr.onerror = reject;
        fr.readAsDataURL(file);
    });
}
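If order doesn't matter, the same helper can also run the reads in parallel (a sketch; each promise resolves with the reader's load event):
Promise.all(Array.from(inputFiles.files).map(pFileReader))
    .then(events => console.log('all done...', events.length));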

Preface: This answer, originally written in 2015, shows wrapping FileReader in a promise. That's still a perfectly valid way to do the readAsDataURL operation the question asked about, but if you were going to use readAsText or readAsArrayBuffer (in general, new code shouldn't use the older readAsBinaryString), you'd want to use the File object's built-in promise-based methods text or arrayBuffer instead (or possibly stream if you want to do inline processing of the data as it flows through), all of which are inherited from Blob.
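For instance, a minimal sketch of those built-in methods (assuming file is an entry from an input's files list):
// File inherits text() and arrayBuffer() from Blob; both return promises
async function readWithBlobMethods(file) {
    const text = await file.text();           // contents as a string
    const buffer = await file.arrayBuffer();  // raw bytes as an ArrayBuffer
    return { text, byteLength: buffer.byteLength };
}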
The nature of FileReader is that you cannot make its operation synchronous.
I suspect you don't really need or want it to be synchronous, just that you want to get the resulting URLs correctly. The person suggesting using promises was probably right, but not because promises make the process synchronous (they don't), but because they provide standardized semantics for dealing with asynchronous operations (whether in parallel or in series):
Using promises, you'd start with a promise wrapper for readAsDataURL (I'm using ES2015+ here, but you can convert it to ES5 with a promise library instead):
function readAsDataURL(file) {
    return new Promise((resolve, reject) => {
        const fr = new FileReader();
        fr.onerror = reject;
        fr.onload = () => {
            resolve(fr.result);
        };
        fr.readAsDataURL(file);
    });
}
Then you'd use the promise-based operations I describe in this answer to read those in parallel:
Promise.all(Array.prototype.map.call(inputFiles.files, readAsDataURL))
    .then(urls => {
        // ...use `urls` (an array) here...
    })
    .catch(error => {
        // ...handle/report error...
    });
...or in series:
let p = Promise.resolve();
for (const file of inputFiles.files) {
    p = p.then(() => readAsDataURL(file).then(url => {
        // ...use `url` here...
    }));
}
p.catch(error => {
    // ...handle/report error...
});
Inside an ES2017 async function, you could use await. It doesn't do much for the parallel version:
// Inside an `async` function
try {
    const urls = await Promise.all(Array.prototype.map.call(inputFiles.files, readAsDataURL));
} catch (error) {
    // ...handle/report error...
}
...but it makes the series version simpler and clearer:
// Inside an `async` function
try {
    for (const file of inputFiles.files) {
        const url = await readAsDataURL(file);
        // ...use `url` here...
    }
} catch (error) {
    // ...handle/report error...
}
Without promises, you'd do this by keeping track of how many outstanding operations you have so you know when you're done:
const inputFiles = document.getElementsByTagName("input")[0];
inputFiles.onchange = () => {
    const data = [];   // The results
    let pending = 0;   // How many outstanding operations we have
    // Schedule reading all the files (this finishes before the first onload
    // callback is allowed to be executed). Note that the use of `let` in the
    // `for` loop is important, `var` would not work correctly.
    for (let index = 0; index < inputFiles.files.length; ++index) {
        const file = inputFiles.files[index];
        // Read this file, remember it in `data` using the same index
        // as the file entry
        const fr = new FileReader();
        fr.onload = () => {
            data[index] = fr.result;
            --pending;
            if (pending == 0) {
                // All requests are complete, you're done
            }
        };
        fr.readAsDataURL(file);
        ++pending;
    }
};
Or if you want for some reason to read the files sequentially (but still asynchronously), you can do that by scheduling the next call only when the previous one is complete:
// Note: This assumes there is at least one file, if that
// assumption isn't valid, you'll need to add an up-front check
const inputFiles = document.getElementsByTagName("input")[0];
inputFiles.onchange = () => {
    let index = 0;
    readNext();
    function readNext() {
        const file = inputFiles.files[index++];
        const fr = new FileReader();
        fr.onload = () => {
            // use fr.result here
            if (index < inputFiles.files.length) {
                // More to do, start loading the next one
                readNext();
            }
        };
        fr.readAsDataURL(file);
    }
};

I've upgraded Jens Lincke's answer by adding a working example and introducing async/await syntax:
function readFile(file) {
    return new Promise((resolve, reject) => {
        let fr = new FileReader();
        fr.onload = x => resolve(fr.result);
        fr.onerror = reject;
        fr.readAsDataURL(file); // or readAsText(file) to get raw content
    });
}
async function load(e) {
    for (let [i, f] of [...e.target.files].entries()) {
        msg.innerHTML += `<h1>File ${i}: ${f.name}</h1>`;
        let p = document.createElement("pre");
        p.innerText += await readFile(f);
        msg.appendChild(p);
    }
}
<input type="file" onchange="load(event)" multiple />
<div id="msg"></div>

Promisified FileReader
/**
 * Promisified FileReader
 * More info https://developer.mozilla.org/en-US/docs/Web/API/FileReader
 * @param {*} file
 * @param {*} method: readAsArrayBuffer, readAsBinaryString, readAsDataURL, readAsText
 */
export const readFile = (file = {}, method = 'readAsText') => {
    const reader = new FileReader()
    return new Promise((resolve, reject) => {
        reader[method](file)
        reader.onload = () => {
            resolve(reader)
        }
        reader.onerror = (error) => reject(error)
    })
}
Usage
const file = new File(["foo"], "foo.txt", {
    type: "text/plain",
});
// Text
const resp1 = await readFile(file)
console.log(resp1.result)
// DataURL
const resp2 = await readFile(file, 'readAsDataURL')
console.log(resp2.result)

Using promises can make this much more elegant:
// Opens a file dialog, waits till the user selects a file, and resolves
// with the data URL of the selected file
async function pick() {
    var filepicker = document.createElement("input");
    filepicker.setAttribute("type", "file");
    filepicker.click();
    return new Promise((resolve, reject) => {
        filepicker.addEventListener("change", e => {
            var reader = new FileReader();
            reader.addEventListener('load', file => resolve(file.target.result));
            reader.addEventListener('error', reject);
            reader.readAsDataURL(e.target.files[0]);
        });
    });
}
// Only call this function on a user event
window.onclick = async function() {
    var file = await pick();
    console.log(file);
}

Here is another modification to Jens' answer (piggybacking off Mido's answer) to additionally check the file size:
function readFileBase64(file, max_size){
    const max_size_bytes = max_size * 1048576;
    return new Promise((resolve, reject) => {
        if (file.size > max_size_bytes) {
            console.log("file is too big at " + (file.size / 1048576) + "MB");
            reject("file exceeds max size of " + max_size + "MB");
        }
        else {
            var fr = new FileReader();
            fr.onload = () => {
                resolve(fr.result);
            };
            fr.onerror = reject;
            fr.readAsDataURL(file);
        }
    });
}
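A usage sketch (my assumption: input stands in for a file input element, and 5 is the cap in MB):
// Hypothetical usage: reject any chosen file larger than 5 MB
readFileBase64(input.files[0], 5)
    .then(dataUrl => console.log(dataUrl.slice(0, 40) + "..."))
    .catch(err => console.error(err));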

We can use a callback function to get the reader.result:
function myDisplay(some) {
    document.getElementById('demo').innerHTML = some;
}
function read(file, callback) {
    const reader = new FileReader();
    reader.onload = () => {
        callback(reader.result);
    };
    reader.readAsText(file);
}
// When you pass a function as an argument, remember not to use parentheses.
read(this.files[0], myDisplay);

Related

Promise in background sync

I have a promise which returns an array of objects from IndexedDB:
const syncData = () => {
    return new Promise((resolve, reject) => {
        var all_form_obj = [];
        var db = self.indexedDB.open('Test');
        db.onsuccess = function(event) {
            var db = this.result;
            // Table new_form
            var count_object_store_new_form = this.result.transaction("new_form").objectStore("new_form").count();
            count_object_store_new_form.onsuccess = function(event) {
                if (count_object_store_new_form.result > 0) {
                    db.transaction("new_form").objectStore("new_form").getAll().onsuccess = function(event) {
                        var old_form_arr = event.target.result;
                        for (element in old_form_arr) {
                            all_form_obj.push(old_form_arr[element]);
                        }
                    }
                }
            }
            // Table old_form
            var count_object_store_old_form = this.result.transaction("old_form").objectStore("old_form").count();
            count_object_store_old_form.onsuccess = function(event) {
                if (count_object_store_old_form.result > 0) {
                    db.transaction("old_form").objectStore("old_form").getAll().onsuccess = function(event) {
                        var old_form_arr = event.target.result;
                        for (element in old_form_arr) {
                            all_form_obj.push(old_form_arr[element]);
                        }
                    }
                }
            }
        }
        db.onerror = function(err) {
            reject(err);
        }
        resolve(all_form_obj)
    })
};
After I resolve my array, I call the promise in the sync event:
self.addEventListener('sync', function(event) {
    if (event.tag == 'sync_event') {
        event.waitUntil(
            syncData()
                .then((form_arr) => {
                    console.log(form_arr)
                    for (form in form_arr) {
                        console.log(form_arr)
                    }
                }).catch((err) => console.log(err))
        );
    }
});
In the 'then' of my promise syncData I log to the console twice. The first console.log appears in the console (my array of objects), but the second one, inside the for...in loop, doesn't appear, and I don't understand why.
My goal is to loop through each object and send it to my database with fetch, but the problem is that the code in the loop doesn't run.
I think resolve is not placed where it should be; the reason the second console.log shows nothing is that form_arr is [] at the moment resolve runs. I simplified your code to demonstrate why it ends up []: db.onsuccess and db.onerror are merely defined as callbacks that run later, so resolve(all_form_obj) executes immediately, before either of them fires. To fix this, place resolve inside db.onsuccess and reject inside db.onerror.
const syncData = () => {
    return new Promise((resolve, reject) => {
        var all_form_obj = [];
        var db = self.indexedDB.open('Test');
        db.onsuccess = function(event) {
            // ... will be called async
            resolve(all_form_obj)
        }
        db.onerror = function(err) {
            // ... will be called async
        }
        // remove resolve here
    })
};
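Going a step further, the two getAll() reads are themselves asynchronous, so resolving in db.onsuccess alone can still race them. A fuller sketch (my assumption of the intended fix, keeping the original store names) promisifies each read and resolves only after both complete:
const syncData = () => new Promise((resolve, reject) => {
    const request = self.indexedDB.open('Test');
    request.onerror = () => reject(request.error);
    request.onsuccess = function() {
        const db = this.result;
        // Promisify a single objectStore.getAll() call
        const readAll = storeName => new Promise((res, rej) => {
            const getAll = db.transaction(storeName).objectStore(storeName).getAll();
            getAll.onsuccess = () => res(getAll.result);
            getAll.onerror = () => rej(getAll.error);
        });
        // Resolve only once both stores have been read
        Promise.all([readAll('new_form'), readAll('old_form')])
            .then(([new_forms, old_forms]) => resolve([...new_forms, ...old_forms]))
            .catch(reject);
    };
});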

Promise is still being returned after await when converting image to base64 string

I have the method below that converts an image to a base64 string:
const toBase64 = file => new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.readAsDataURL(file);
    reader.onload = () => resolve(reader.result);
    reader.onerror = error => reject(error);
});
Then I use this function like below:
let base_strings = files.map(async file => {
    let image = await toBase64(file);
    console.dir(image); // this logs the base64 string
    return image;
});
but when I try to access my base_strings array, I still get promises instead of the resolved strings. I am not exactly sure what I am doing wrong; I need to be able to get my base64 strings in the array.
Try using Promise.all() like this:
let base_strings = files.map(file => {
    return toBase64(file);
});
const myImages = await Promise.all(base_strings);
The map function returns an array of promises, so you need to use Promise.all, i.e. await Promise.all(files.map(...)). async/await does not pause forEach or map; this still produces an array of pending promises:
let base_strings = files.map(async file => {
    let image = await toBase64(file);
    console.dir(image); // this logs the base64 string
    return image;
});
You can do that with a simple for loop, but it must be inside an async function, like this:
const getFiles = async (files) => {
    const base_strings = [];
    for (let i = 0; i < files.length; i++) {
        let image = await toBase64(files[i]);
        console.dir(image); // this logs the base64 string
        base_strings.push(image);
    }
    return base_strings;
}
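A usage sketch (assuming this runs inside an async function; input stands in for your file input element):
// Hypothetical usage with a FileList spread into an array
const base_strings = await getFiles([...input.files]);
console.log(base_strings.length + ' data URLs ready');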

Does fetch-API ReadableStream work with promise.then(...)?

I'm trying to use the Fetch API readable streams to download multiple files.
const files = [.....]; // my file objects
const promises = [];
for (let i = 0; i < files.length; i += 1) {
    const file = files[i];
    promises.push(
        fetch(file.uri)
            .then(response => {
                const reader = response.body.getReader();
                return new ReadableStream({
                    async start(controller) {
                        while (true) {
                            const { done, value } = await reader.read();
                            // When no more data needs to be consumed, break the reading
                            if (done) {
                                break;
                            }
                            if (!file.content) {
                                file.content = value;
                            }
                            else {
                                file.content += value;
                            }
                            // Enqueue the next data chunk into our target stream
                            controller.enqueue(value);
                        }
                        // Close the stream
                        controller.close();
                        reader.releaseLock();
                    }
                });
            })
            .then(rs => new Response(rs))
    );
}
return Promise.all(promises).then(() => {
    // do something else once all the files are downloaded
    console.log('All file content downloaded');
});
The idea here is for the file objects to only have a URI. Then this code will add a content field. After this point I can do something else.
However, in practice, the code sequence is incorrect; namely, the following two lines run immediately after each other:
return new ReadableStream({...});
console.log('All file content downloaded');
The code isn't waiting until the file content has been downloaded; the log above is printed ahead of time. Only afterwards do I see the code enter the while (true) loop, i.e. where the file content is actually streamed.
I'm obviously misunderstanding a fundamental concept. How can I wait for the file content to be downloaded and then do something else? i.e. How do streams work with the promise.then() model.
I found the simplest solution was to create my own promise.
const files = [.....]; // my file objects
const promises = [];
for (let i = 0; i < files.length; i += 1) {
    const file = files[i];
    promises.push(
        fetch(file.uri)
            .then(response => {
                return new Promise((resolve, reject) => {
                    const reader = response.body.getReader();
                    const stream = new ReadableStream({
                        async start(controller) {
                            while (true) {
                                const { done, value } = await reader.read();
                                // When no more data needs to be consumed, break the reading
                                if (done) {
                                    break;
                                }
                                if (!file.content) {
                                    file.content = value;
                                }
                                else {
                                    file.content += value;
                                }
                                // Enqueue the next data chunk into our target stream
                                controller.enqueue(value);
                            }
                            // Close the stream
                            controller.close();
                            reader.releaseLock();
                            // Resolve only once the whole body has been consumed
                            resolve(stream);
                        }
                    });
                });
            })
            .then(rs => new Response(rs))
    );
}
return Promise.all(promises).then(() => {
    // do something else once all the files are downloaded
    console.log('All file content downloaded');
});
Caveat: The error scenario should also be gracefully handled.
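For instance, one way to handle it (a sketch under the same structure, with the accumulation step elided) is to wrap the read loop so failures reject the outer promise and surface through Promise.all:
fetch(file.uri)
    .then(response => new Promise((resolve, reject) => {
        const reader = response.body.getReader();
        (async () => {
            try {
                while (true) {
                    const { done, value } = await reader.read();
                    if (done) break;
                    // ...accumulate value into file.content as before...
                }
                resolve();
            } catch (err) {
                reject(err); // surfaces through Promise.all's .catch
            } finally {
                reader.releaseLock();
            }
        })();
    }))
    .catch(err => console.error('Download failed:', err));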

Why doesn't awaiting a promise wait for it to resolve?

I am trying to learn to use async/await properly, but I am kind of confused about it.
In the snippets, I am trying to build an array of objects containing the info I need about the files I am uploading in the component. The problem is that the objects in this.fileInfo are not actually waiting for the promise to return the encoded images: when I console.log this.fileInfo, the key image is a ZoneAwarePromise whose value is undefined. Can you please help me fix this?
Function build()
async build(e) {
    let files = e.target.files;
    this.j = Array.from(files);
    this.fileInfo = await this.j.reduce((acc, cur) => [
        ...acc, {
            name: cur.name.replace(/^.*\\/, ""),
            sizeunit: this.getSize(cur.size),
            extention: cur.name.split(/\.(?=[^\.]+$)/).slice(-1).pop().toString(),
            image: this.previewFile(cur)
        }
    ], []);
    console.log(await this.fileInfo);
}
Promise
async previewFile(file) {
    const reader = new FileReader();
    reader.readAsDataURL(file);
    reader.onload = () => {
        return new Promise((res) => {
            res(reader.result)
        }).then(res => res);
    };
}
You are not awaiting anything in this function: async previewFile(file).
Perhaps you assume that returning a new Promise somewhere in your code will make it behave as a Promise. In this particular case it won't work, because the return happens inside a delegate (onload) that isn't executed within the scope of your previewFile() function.
You can lose the async modifier, because you can return a Promise instead:
previewFileAsync(file) {
    // the async modifier keyword is not necessary,
    // because we don't need to await anything.
    return new Promise((res) => {
        const reader = new FileReader();
        reader.readAsDataURL(file);
        reader.onload = () => res(reader.result);
    });
}
When you call this, you can await it inside your loop:
async buildAsync(e) {
    let files = e.target.files;
    for (let i = 0; i < files.length; i++) {
        const file = files[i];
        const preview = await previewFileAsync(file);
        // Do something with preview here...
    }
}
Of course, you can execute a range of promises to allow for some sort of concurrency, but this will help get you started.
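For example, a concurrent sketch of the same idea:
// Inside an async function: start every preview at once, then await them all
const previews = await Promise.all([...files].map(f => previewFileAsync(f)));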
I added the Async suffix to your method so a caller knows that this can be awaited. It does not do anything special, but it helps clarify your code. You can use whatever suffix you think is right. I'm just used to the Async suffix.
Edit
Stackblitz example of async logic

How to convert a method that returns a promise via the $q library to use an ES6 Promise (AngularJS app to Angular 4+)

Since the ES6 Promise does not have a deferred object, I'm confused about how to convert this to work with ES6 Promises. One solution I was looking at is to add a deferred object manually to the Promise constructor; see the 'No deferred' section for the example code.
Reason: I am converting the AngularJS app to Angular 4 and using this to better understand how to work with ES6 Promises. I was going to add the deferred object, but thought that was too much of a workaround.
function generateImages() {
    console.log('generateImagesCalled');
    // set up promises
    var fullDeferred = $q.defer();
    var thumbDeferred = $q.defer();
    var resolveFullBlob = blob => fullDeferred.resolve(blob);
    var resolveThumbBlob = blob => thumbDeferred.resolve(blob);
    var displayPicture = (url) => {
        var image = new Image();
        image.src = url;
        // Generate thumb
        var maxThumbDimension = THUMB_IMAGE_SPECS.maxDimension;
        var thumbCanvas = _getScaledCanvas(image, maxThumbDimension);
        thumbCanvas.toBlob(resolveThumbBlob, 'image/jpeg', THUMB_IMAGE_SPECS.quality);
        // Generate full
        var maxFullDimension = FULL_IMAGE_SPECS.maxDimension;
        var fullCanvas = _getScaledCanvas(image, maxFullDimension);
        fullCanvas.toBlob(resolveFullBlob, 'image/jpeg', FULL_IMAGE_SPECS.quality);
    }
    var reader = new FileReader();
    reader.onload = (e) => {
        displayPicture(e.target.result);
    }
    reader.readAsDataURL(vm.currentFile);
    return $q.all([fullDeferred.promise, thumbDeferred.promise]).then(results => {
        console.log(results);
        return {
            full: results[0],
            thumb: results[1]
        };
    });
}
In some special cases resolve and reject can be exposed in order to replicate deferreds:
let pResolve;
let pReject;
const p = new Promise((resolve, reject) => {
    pResolve = resolve;
    pReject = reject;
});
Or a deferred object can be formed:
const deferred = {};
deferred.promise = new Promise((resolve, reject) => {
    Object.assign(deferred, { resolve, reject });
});
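A brief usage sketch of that deferred:
// Producer: something resolves the deferred later...
setTimeout(() => deferred.resolve('done'), 100);
// Consumer: ...while callers await its promise as usual
deferred.promise.then(value => console.log(value)); // "done"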
Since deferreds aren't a built-in Promise feature and tend to be an antipattern, it's preferable to solve this with the constructor function alone where applicable.
In the code above, displayPicture and the fact that the promises cannot be formed instantly but only on the load event limit how the situation can be handled efficiently. It can be improved by sticking to promises for events (also beneficial for a later conversion to async/await):
const readerPromise = new Promise(resolve => reader.onload = resolve);
reader.readAsDataURL(vm.currentFile);
const blobPromises = readerPromise.then(e => {
    const url = e.target.result;
    const fullBlobPromise = new Promise(resolve => {
        ...
        fullCanvas.toBlob(resolve, ...);
    });
    const thumbBlobPromise = ...;
    return Promise.all([fullBlobPromise, thumbBlobPromise]);
});
The promises should also make use of Angular digests in order to provide the same control flow. Since the promise internals don't rely on Angular services, a single digest at the end is enough:
return blobPromises.then(
    () => $rootScope.$apply(),
    err => {
        $rootScope.$apply();
        throw err;
    }
);
// or
return $q.resolve(blobPromises);
In ES6 it would be:
let resolve1 = new Promise((resolve, reject) => {
    if (error) { // `error` stands in for whatever failure condition you have
        reject();
    } else {
        resolve();
    }
});
Do the same for each promise you want to resolve; then, to run multiple promises together, use:
Promise.all([resolve1, resolve2]).then(values => {
    console.log(values);
});
Or you can declare your array of promises and then resolve it:
let resolvedPromisesArray = [Promise.resolve(val1), Promise.resolve(val2)];
let p = Promise.all(resolvedPromisesArray);
It will fail if any one of the promises is rejected, and will only run then if they all succeed. ;)
https://codecraft.tv/courses/angular/es6-typescript/promises/
