Invalid state error in Chrome when using FileReader - javascript

I'm simply trying to use FileReader to display image files, but when I try to use more than one image I get "InvalidStateError: DOM Exception 11". In Firefox, however, it works fine.
Here's my code
function addImages(images) {
    var reader = new FileReader();
    reader.onload = function () {
        $("#images").append('<img src="' + this.result + '"/><br/>');
    };
    for (var count = 0; count < images.length; count++) {
        reader.readAsDataURL(images[count]);
    }
}

function uploadImagesToBrowser(e) {
    addImages(e.target.files);
}

$("#imagefiles").on("change", uploadImagesToBrowser);

Unfortunately, that doesn't work. As mentioned by janje, you will need to create a new FileReader per iteration. Also remember to create a closure when binding the event handler, because of the "creating functions in a loop" problem in JavaScript.
Eric Bidelman's post is a rather good source:
function handleFileSelect(evt) {
    var files = evt.target.files; // FileList object

    // Loop through the FileList and render image files as thumbnails.
    for (var i = 0, f; f = files[i]; i++) {
        var reader = new FileReader();

        // Closure to capture the file information.
        reader.onload = (function (theFile) {
            return function (e) {
                // Render thumbnail.
            };
        })(f);

        // Read in the image file as a data URL.
        reader.readAsDataURL(f);
    }
}

You have to create a new FileReader (and a new variable to store it in) for every readAsDataURL call, i.e. one per loop iteration.
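For reference, a minimal sketch of how the asker's addImages could be rewritten along those lines, reusing the question's #images container and jQuery. Because the handler only uses this.result (the reader that fired), the per-file closure from the snippet above isn't even needed here; it becomes necessary only if you also want the File object (for example its name) inside the handler.

function addImages(images) {
    for (var count = 0; count < images.length; count++) {
        // a fresh reader per file avoids the InvalidStateError
        var reader = new FileReader();
        reader.onload = function () {
            // `this` is the reader whose read just finished
            $("#images").append('<img src="' + this.result + '"/><br/>');
        };
        reader.readAsDataURL(images[count]);
    }
}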

Related

How to pass image data as base64 to a global variable after the input changes, as happened with an input field in an async function case

I am very new to the concepts of async, await, and promises, so please bear with me.
I have a global variable var base64. How do I fill this global variable with the base64-encoded value? Inside the addEventListener callback, reader.result holds the encoded value; how can I pass it out to the global variable? I have tried a lot, but since this seems to be asynchronous the behavior is quite different from what I expected and I am not getting the result I want.
I have an input field like this:
<input type="file" id="photo" accept="image/*" required @change="UploadPhoto()" />
The change handler UploadPhoto() is as follows (I took this from the Mozilla docs):
function UploadPhoto() {
    const file = document.querySelector('#photo').files[0];
    const reader = new FileReader();

    reader.addEventListener("load", function () {
        // convert image file to base64 string
        console.log(reader.result);
    }, false);

    if (file) {
        reader.readAsDataURL(file);
    }
}
I would also be grateful if you could explain how reader.addEventListener and reader.readAsDataURL(file) work in this asynchronous case. I googled a lot but couldn't find an article with a detailed explanation of these two functions.
Wanting a global (in this case) is a symptom of not having the tools you need to do without it.
If you really needed a global, you could get it this way...
let b64Encoded;

function UploadPhoto() {
    const file = document.querySelector('#photo').files[0];
    const reader = new FileReader();
    reader.addEventListener("load", function () {
        // reader.result is already a base64-encoded data URL, so just store it
        b64Encoded = reader.result;
    }, false);
    if (file) {
        reader.readAsDataURL(file);
    }
}
But, as you suppose, the modern way to do it is to use promises. This will make the code simpler to understand, and will prove an even better decision when you get to actually posting the encoded file...
First wrap the file reader in a promise. Here's a fine post on the subject, and here's the punchline (slightly modernized)
async function readAsDataURL(file) {
    return new Promise((resolve, reject) => {
        const fr = new FileReader();
        fr.onerror = reject;
        fr.onload = () => {
            resolve(fr.result);
        };
        fr.readAsDataURL(file);
    });
}
Use it like this:
// we don't need this anymore :-)
// let b64Encoded;

async function UploadPhoto() {
    const file = document.querySelector('#photo').files[0];
    const result = await readAsDataURL(file);
    const b64Encoded = result; // already a base64 data URL, and a stack var, as it should be!
    // insert real (promise-based) network code here
    return myNetworkLibrary.post(b64Encoded);
}
The addEventListener method on FileReader is part of the old-school async style. It says: the file reader is going to do I/O off the main thread, and it will tell you when it's done (that's the "event" part) by calling a function you specify (that's the "listener" part), which you pass in (that's the "add" part).
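For completeness, a sketch of calling the promise wrapper above with error handling. The #photo selector comes from the question, and the console calls are just placeholders:

async function UploadPhoto() {
    const file = document.querySelector('#photo').files[0];
    if (!file) return;                             // nothing selected yet
    try {
        const dataUrl = await readAsDataURL(file); // the wrapper defined above
        console.log(dataUrl);                      // "data:image/...;base64,..."
    } catch (err) {
        console.error('Could not read file', err); // fr.onerror rejected the promise
    }
}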

How to correctly set new Image in the following situation?

I'm trying to read the width and height of the files I get after clicking the Browse button:
for (let i = 0; i < this.uploadingPanoramas.length; i++) {
    const img = new Image() // eslint-disable-line
    console.log('file', this.uploadingPanoramas[i].file)
    img.src = this.uploadingPanoramas[i].file
    console.log('img', img)
    img.onload(() => {
        console.log('width', img.width)
    })
}
console.log('img', img) logs: <img src="[object File]">
So img.onload doesn't actually work, because apparently I'm not getting the image.
What's the correct way of doing this?
EDIT:
this.uploadingPanoramas is an array of objects, each containing a file:
[{
    file: File,
    progress: 0
}, {
    file: File,
    progress: 0
}]
As this.uploadingPanoramas[i].file is of type File, you can use FileReader to generate a data URL with the readAsDataURL() method.
The FileReader object lets web applications asynchronously read the contents of files (or raw data buffers) stored on the user's computer, using File or Blob objects to specify the file or data to read.
var reader = new FileReader();
reader.onload = function (e) {
    img.src = e.target.result;
};
reader.readAsDataURL(this.uploadingPanoramas[i].file);
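A sketch of how this might look inside the asker's loop, assuming this.uploadingPanoramas has the shape shown in the EDIT. Note that onload is a property to assign, not a method to call:

for (let i = 0; i < this.uploadingPanoramas.length; i++) {
    const file = this.uploadingPanoramas[i].file;
    const reader = new FileReader();
    reader.onload = function (e) {
        const img = new Image();
        img.onload = function () {
            // assign a handler to onload instead of calling it
            console.log('width', img.width, 'height', img.height);
        };
        img.src = e.target.result; // the data URL produced by readAsDataURL
    };
    reader.readAsDataURL(file);
}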

Reading multiple files synchronously in JavaScript using FileReader

I have a for loop iterating over a number of files.
I have to read the first line of each file and add it to, let's say, a Map with the file name as the key and the first line of that file as the value.
I am using FileReader to read the file, but it is asynchronous.
When I open a stream to read the file, the loop gets incremented before I am done reading the file and adding my desired entry to the map.
I need a synchronous operation, i.e. read the first line, add it to the Map, and only then increment the loop and proceed with the next file.
for (var i = 0; i < files.length; i++) {
    var file = files[i];
    var reader = new FileReader();
    reader.onload = function (progressEvent) {
        var lines = progressEvent.target.result.split('\n');
        firstLine = lines[0];
        alert('FirstLine ' + firstLine);
        //add to Map here
    }
    reader.readAsText(file);
}
How can I modify the code to achieve this?
You can use promises and let them run in the order you create them using reduce.
The below code shows how it could be done this way, and you can take a look at this simple JSFiddle that demos the idea.
//create a function that returns a promise
function readFileAndAddToMap(file) {
    return new Promise(function (resolve, reject) {
        var reader = new FileReader();
        reader.onload = function (progressEvent) {
            var lines = progressEvent.target.result.split('\n');
            var firstLine = lines[0];
            console.log('FirstLine ' + firstLine);
            //add to Map here
            resolve();
        };
        reader.onerror = function (error) {
            reject(error);
        };
        reader.readAsText(file);
    });
}
//create an array of read operations as functions, so nothing starts yet
var reads = [];
for (var i = 0; i < files.length; i++) {
    //push a function that, when called, starts the read and returns its promise
    reads.push(readFileAndAddToMap.bind(null, files[i]));
}

//use reduce to chain them so they run one after another, in order
reads.reduce(function (cur, next) {
    return cur.then(next);
}, Promise.resolve()).then(function () {
    //all files read and executed!
}).catch(function (error) {
    //handle potential error
});
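If async/await is available, a sketch that reads the files strictly one after another and fills the Map the asker describes. readFirstLine and buildFirstLineMap are hypothetical helper names, not part of the code above:

// resolves with the first line of a file
function readFirstLine(file) {
    return new Promise(function (resolve, reject) {
        var reader = new FileReader();
        reader.onload = function (e) {
            resolve(e.target.result.split('\n')[0]);
        };
        reader.onerror = reject;
        reader.readAsText(file);
    });
}

async function buildFirstLineMap(files) {
    var map = new Map();
    for (var i = 0; i < files.length; i++) {
        // await makes each read finish before the next one starts
        map.set(files[i].name, await readFirstLine(files[i]));
    }
    return map;
}

// usage: buildFirstLineMap(files).then(function (map) { /* map is ready here */ });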
I was facing the same issue; what I did was remove the for loop and use a recursive function instead. That way I was able to control the sequence of FileReader calls.
Below I have modified your code based on this approach. Feel free to ask in the comments if you need more clarity.
var attachmentI = { i: files.length };

function UploadMe() {
    attachmentI.i--;
    if (attachmentI.i > -1) {
        var file = files[attachmentI.i];
        var reader = new FileReader();
        reader.onload = function (progressEvent) {
            var lines = progressEvent.target.result.split('\n');
            var firstLine = lines[0];
            alert('FirstLine ' + firstLine);
            //add to Map here, then read the next file
            UploadMe();
        };
        reader.readAsText(file);
    }
}
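Note that the snippet above walks the file list from the last entry to the first. If the original order matters, a sketch of the same recursive idea counting upwards could look like this:

// same recursive idea, but starting at index 0 so files are read in their original order
var index = 0;
function uploadNext() {
    if (index >= files.length) return; // all files processed
    var file = files[index++];
    var reader = new FileReader();
    reader.onload = function (progressEvent) {
        var firstLine = progressEvent.target.result.split('\n')[0];
        console.log('FirstLine ' + firstLine);
        //add to Map here, then move on to the next file
        uploadNext();
    };
    reader.readAsText(file);
}
uploadNext();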

jQuery onload function stopped for loop, passing for loop x inside function

I'm passing image files from an XMLHttpRequest to this function readfiles(files) using dataTransfer.
What I'm trying to do is preview the images and the image file names at the same time, in one line of code inside the reader.onload() function.
Because there will be more than one file passed to the function, I threw them into a for loop.
The problem is that when I try to preview the images with readAsDataURL it works, but the file names cannot be previewed. I think the reader.onload() function stops the for loop from looping through the image files.
Here's my code
function readfiles(files) {
    var x;
    for (x = 0; x < files.length; x = x + 1) {
        var reader = new FileReader();
        reader.readAsDataURL(files[x]);
        reader.onload = function (e) {
            console.log(e.target.result);
            console.log(files[x].name);
        }
    }
}
I have been searching for a solution for about 5 hours now; any help is appreciated!
ROX's answer is not right. In his case, you will see that it will output the same file name 4 times. What you need is a closure which will essentially keep the right context on each iteration. You can achieve this as follows. Check the fiddle http://jsfiddle.net/cy03fc8x/.
function readfiles(files) {
    for (var x = 0; x < files.length; x = x + 1) {
        var file = files[x];
        (function (file) { //this is a closure which we use to ensure each iteration has the right version of the variable 'file'
            var reader = new FileReader();
            reader.readAsDataURL(file);
            reader.onload = function (e) {
                console.log(e.target.result);
                console.log(file.name);
            }
        })(file); //on each iteration, pass in the current file to the closure so that it can be used within
    }
}
Because onload runs later, by that moment x has already reached the number of files. For example, if you have 4 files, x will be 4 when onload executes, and files[4] is undefined.
So keep a reference to your current file:
function readfiles(files) {
    for (var x = 0; x < files.length; x = x + 1) {
        // keep a reference to the current file on this iteration
        var file = files[x];
        // create a closure and execute it immediately
        (function (file) {
            var reader = new FileReader();
            reader.readAsDataURL(file);
            reader.onload = function (e) {
                console.log(file.name);
            }
        }(file)); // pass the `file` to the function
    }
}
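As an aside, in modern JavaScript the same problem disappears if the loop variable is declared with let, because each iteration then gets its own binding of x. A sketch equivalent to the snippets above:

function readfiles(files) {
    for (let x = 0; x < files.length; x++) {
        const reader = new FileReader();
        reader.onload = function (e) {
            console.log(e.target.result);
            console.log(files[x].name); // `x` is block-scoped, so this is the right file
        };
        reader.readAsDataURL(files[x]);
    }
}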

Clarification required on Javascript in HTML5 fileAPI for image preview

I was looking at the new HTML5 File API for showing a preview of an image before it is uploaded. I googled for some code and almost every example had the same structure, almost the same code. I don't mind copying, particularly when it works, but I need to understand it. So I tried to understand the code, but I am stuck on one area and need someone to explain that small part:
The code refers to an HTML form input field, and when a file is selected it shows the preview image in an img tag. Nothing fancy. Simple. Here it is after removing all the noise:
$('input[type=file]').change(function (e) {
    var elem = $(this);
    var file = e.target.files[0];
    var reader = new FileReader();

    //Part I could not understand starts here
    reader.onload = (function (theFile) {
        return function (e) {
            var image_file = e.target.result;
            $('#img_id').attr('src', image_file);
        };
    })(file);
    reader.readAsDataURL(file);
    //Up to here
});
I think that reader.onload just needs to be assigned a plain event handler, so I replaced the entire section marked above with:
reader.readAsDataURL(file);
reader.onload = function (e) {
    var image_file = e.target.result;
    //#img_id is the id of an img tag
    $('#img_id').attr('src', image_file);
};
And it worked as I expected.
QUESTION: What is the original code doing that the simplified version above is missing? I understand that it is a function expression returning a function and then calling it… but for what? There is just too much of the original code copied around in tutorials and the like, with no good explanation. Please explain. Thanks.
Sure.
The function-wrapped function here serves the specific purpose of remembering which file you were looking at.
This might be less of a problem using your exact codebase, but if you had a multiple-upload widget, and you wanted to display a row of previews:
var my_files = [].slice.call(file_input.files),
    file, reader,
    i = 0, l = my_files.length;

for (; i < l; i += 1) {
    file = my_files[i];
    reader = new FileReader();
    // always put the handler first (upside down), because this is async;
    // if written normally and the operation finishes first (ie: cached response)
    // then the handler never gets called
    reader.onload = function (e) {
        var blob_url = e.target.result,
            img = new Image();
        img.src = blob_url;
        document.body.appendChild(img);
    };
    reader.readAsDataURL(file);
}
That should all work fine, and it's clean and readable. ...except...
The issue that was being resolved is simply this:
We're dealing with async handlers, which means that the value of file isn't necessarily the same when the callback fires as it was before...
There are a number of ways to potentially solve that.
Pretty much all of them that don't generate random ids/sequence-numbers/time-based hashes to check against on return, rely on closure.
And why would I want to create a whole management-system, when I could wrap it in a function and be done?
var save_file_reference_and_return_new_handler = function (given_file) {
    return function (e) {
        var blob_url = e.target.result,
            file_name = given_file.name;
        //...
    };
};
So if you had that at the top of the function (above the loop), you could say:
reader = new FileReader();
reader.onload = save_file_reference_and_return_new_handler(file);
reader.readAsDataURL(file);
And now it will work just fine.
Of course, JS people don't always feel compelled to write named functions, just to store one item in closure to remember later...
reader.onload = (function (current_file) {
    return function (e) {
        var blob_url = e.target.result,
            file_name = current_file.name;
    };
}(file));
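Putting the pieces together, a sketch of a complete change handler for a hypothetical multiple-file input, in the same jQuery style as the question. The appended img elements and the title attribute are just illustrative:

$('input[type=file]').change(function (e) {
    var files = [].slice.call(e.target.files);
    for (var i = 0; i < files.length; i += 1) {
        var reader = new FileReader();
        // the IIFE stores a reference to the file this particular read belongs to
        reader.onload = (function (current_file) {
            return function (ev) {
                var img = new Image();
                img.src = ev.target.result;    // the data URL
                img.title = current_file.name; // the remembered file name
                document.body.appendChild(img);
            };
        }(files[i]));
        reader.readAsDataURL(files[i]);
    }
});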
