I'm currently trying to upload a file to my server, but I'm not really sure how to do this with readAsArrayBuffer. It works if I use readAsBinaryString.
If I try to console.log the result, it only shows 'arrayBuffer: {}'.
After I've tried to upload it, I can see in the POST that only an empty object was sent. If I use readAsBinaryString, I see a bunch of binary data.
var file = document.getElementById('my_file').files[0],
    reader = new FileReader();

reader.onloadend = function (e) {
    console.log(e.target.result);
    $scope.image = e.target.result;
};

reader.readAsArrayBuffer(file);
How can I see my file, so I know it's working when using readAsArrayBuffer?
If more code is needed let me know! Thanks.
According to the ArrayBuffer documentation:
You can not directly manipulate the contents of an ArrayBuffer; instead, you create one of the typed array objects or a DataView object which represents the buffer in a specific format, and use that to read and write the contents of the buffer.
So, as I commented before, console.log probably doesn't know how to represent the buffer and therefore simply outputs arrayBuffer: {}.
If you want to show something in the console, you need to use a typed array or a DataView. For example, using an Int8Array:
reader.onloadend = function (e) {
    console.log(e);
    console.log(new Int8Array(e.target.result));
};
See demo
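To actually send the buffer to the server (the goal in the original question), here is a minimal sketch building on the snippet from the question; it assumes a hypothetical /upload endpoint and uses fetch in place of whatever AJAX helper you already have:

reader.onloadend = function (e) {
    var arrayBuffer = e.target.result;
    // Wrap the ArrayBuffer in a Blob and POST it; /upload is a placeholder URL.
    fetch('/upload', {
        method: 'POST',
        headers: { 'Content-Type': file.type || 'application/octet-stream' },
        body: new Blob([arrayBuffer], { type: file.type })
    }).then(function (response) {
        console.log('Upload status:', response.status);
    });
};
reader.readAsArrayBuffer(file);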
If you want to upload an image, then you have to convert it into base64 format. You can do it either by using a canvas element or by using FileReader. If you are using FileReader, then you have to use readAsDataURL().
You can refer to MDN for this:
https://developer.mozilla.org/en-US/docs/Web/API/FileReader.readAsDataURL
You can also use a canvas element:
Convert an image to canvas that is already loaded
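A minimal sketch of the canvas route, assuming the image is already loaded into an <img> element with the hypothetical id 'my_image':

var img = document.getElementById('my_image');
var canvas = document.createElement('canvas');
canvas.width = img.naturalWidth;
canvas.height = img.naturalHeight;
// Draw the loaded image onto the canvas, then export it as a base64 data URL.
canvas.getContext('2d').drawImage(img, 0, 0);
var dataUrl = canvas.toDataURL('image/png');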
I'm just learning how to use the FileReader now, and I duplicated an example I found online to experiment with, but for some reason the FileReader always returns an empty string.
First, I have an HTML form for the user to select a file, which then calls the script:
<input type="file" id="filelist" onchange="selectfile()">
Here's the script:
function selectfile() {
    myFile = document.getElementById("filelist").files[0];
    reader = new FileReader();
    reader.readAsText(myFile);
    myResult = reader.result;
    alert(myFile.name);
    alert(myResult);
    alert(reader.error);
}
I have tried this with a number of different text files I typed up in Notepad, and in every case the results are the same. I'm only ever submitting one file through the html form.
The 3 alerts are for testing.
It displays the file name correctly.
It displays an empty string for the result.
It displays null for the error, so it's not getting an error.
I searched around to see if there was something obvious here already, but couldn't find anything that seemed to point me in the right direction.
Thoughts?
The FileReader object is not ready yet when you access its result: readAsText is asynchronous. You need to add an onload event listener to the reader, then call readAsText, and access the file contents from inside the callback function.
MDN docs - https://developer.mozilla.org/en-US/docs/Web/API/FileReader/result
function selectfile() {
    myFile = document.getElementById("filelist").files[0];
    reader = new FileReader();
    reader.onload = () => {
        // The result is only accessible once the FileReader has loaded.
        myResult = reader.result;
        alert(myFile.name);
        alert(myResult);
        alert(reader.error);
    };
    reader.readAsText(myFile);
}
<input type="file" id="filelist" onchange="selectfile()">
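If you prefer to make the asynchrony explicit, a small optional sketch (not part of the original answer) is to wrap the FileReader in a Promise:

function readFileAsText(file) {
    return new Promise(function (resolve, reject) {
        var reader = new FileReader();
        reader.onload = function () { resolve(reader.result); };
        reader.onerror = function () { reject(reader.error); };
        reader.readAsText(file);
    });
}

// Usage (e.g. inside selectfile):
// readFileAsText(document.getElementById("filelist").files[0]).then(alert);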
I am reading a local CSV file using a web UI, and the HTML5 FileReader interface to handle the local file stream. This works great.
However, sometimes I want the file being read to be updated continuously, after the initial load. I am having problems, and I think it might have something to do with the FileReader API. Specifically, after the initial file load, I maintain a reference to the file. Then, when I detect that the size of the file has increased, I slice off the new part of the file, and get a new Blob object. However, there appears to be no data in these new Blobs.
I am using PapaParse to handle the CSV parsing, though I don't think that is the source of the problem (though it may be).
The source code is too voluminous to post here, but here is some pseudocode:
var reader = new FileReader();
reader.onload = loadChunk;

var file = null;

function readLocalFile(event) {
    file = event.target.files[0];
    // code that divides the file up into chunks.
    // for each chunk:
    readChunk(chunk);
}

function readChunk(chunk) {
    reader.readAsText(chunk);
}

function loadChunk(event) {
    return event.target.result;
}

// this is run when the file size has increased
function readUpdatedFile(oldLength, newLength) {
    var newData = file.slice(oldLength, newLength);
    readChunk(newData);
}
The output of loadChunk when the file is first loading is a string, but after the file has been updated it is a blank string. I am not sure if the problem is with my slice method, or if there is something going on with FileReader that I am not aware of.
The spec for File objects shouldn't allow this: http://www.w3.org/TR/FileAPI/#file -- it's supposed to be like a snapshot.
The fact that you can detect that the size has changed is probably a shortcoming of an implementation.
I have an HTML5/JavaScript app which uses
<input type="file" accept="image/*;capture=camera" onchange="gotPhoto(this)">
to capture a camera image. Because my app needs to be runnable offline, how do I save the File (https://developer.mozilla.org/en-US/docs/Web/API/File) object in local storage, such that it can be retrieved later for an AJAX upload?
I'm grabbing the file object from the input using:
function gotPhoto(element) {
var file = element.files[0];
//I want to save 'file' to local storage here :-(
}
I can Stringify the object and save it, but when I restore it, it is no longer recognised as a File object, and thus can't be used to grab the file content.
I have a feeling it can't be done, but am open to suggestions.
FWIW My workaround is to read the file contents at store time and save the full contents to local storage. This works, but quickly consumes local storage since each file is a 1MB plus photograph.
You cannot serialize a File API object.
Not that it helps with the specific problem, but: although I haven't used this myself, there appear to be ways (not yet supported by most browsers) to store the offline image data to files, so that they can be restored afterward when the user is online again, instead of using localStorage.
Convert it to base64 and then save it.
function gotPhoto(element) {
    var file = element.files[0];
    var reader = new FileReader();
    reader.onload = function () {
        // reader.result is a data URL, e.g. "data:image/png;base64,...."
        localStorage["file"] = reader.result;
    };
    reader.readAsDataURL(file);
}
// Saved to localStorage

function getPhoto() {
    var dataUrl = localStorage["file"];
    var dataUrlParts = dataUrl.split(",");
    // "data:image/png;base64" -> "image/png"
    var fileFormat = dataUrlParts[0].split(":")[1].split(";")[0];
    // Decode the base64 payload back into bytes before rebuilding the File.
    var byteString = atob(dataUrlParts[1]);
    var bytes = new Uint8Array(byteString.length);
    for (var i = 0; i < byteString.length; i++) {
        bytes[i] = byteString.charCodeAt(i);
    }
    var file = new File([bytes], "file name here", {type: fileFormat});
    return file;
}
// Retrieved File object
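For the later AJAX upload mentioned in the question, here is a minimal usage sketch; the /upload endpoint and the "photo" field name are placeholders, not part of the original answer:

var restored = getPhoto();
var formData = new FormData();
formData.append("photo", restored, restored.name);

fetch("/upload", { method: "POST", body: formData })
    .then(function (response) {
        console.log("Upload finished with status", response.status);
    });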
Here is a workaround that I got working with the code below. I'm aware that in your edit you talked about localStorage, but I wanted to share how I actually implemented that workaround. I like to attach the handler to body so that even if the element is added afterwards via AJAX, the "change" event will still be handled.
See my example here: http://jsfiddle.net/x11joex11/9g8NN/
If you run the JSFiddle example twice you will see it remembers the image.
My approach does use jQuery. This approach also demonstrates the image is actually there to prove it worked.
HTML:
<input class="classhere" type="file" name="logo" id="logo" />
<div class="imagearea"></div>
JS:
$(document).ready(function () {
    // You might want to check whether localStorage.theImage is set here.
    var img = new Image();
    img.src = localStorage.theImage;
    $('.imagearea').html(img);

    $("body").on("change", ".classhere", function () {
        // Equivalent of getElementById.
        var fileInput = $(this)[0]; // the underlying DOM element, since the jQuery object is array-like.
        var file = fileInput.files[0]; // there is only one file since the input is not a multiple type.
        var reader = new FileReader();
        reader.onload = function (e) {
            // Create a new image.
            var img = new Image();
            img.src = reader.result;
            localStorage.theImage = reader.result; // stores the image in localStorage
            $(".imagearea").html(img);
        };
        reader.readAsDataURL(file); // attempts to read the file in question.
    });
});
This approach uses the HTML5 File API (FileReader) to read the image and put it into a new JavaScript img object. The key here is readAsDataURL. If you use the Chrome inspector you will notice the images are stored in base64 encoding.
The reader is asynchronous, which is why it uses the onload callback. So make sure any important code that requires the image is inside onload, or else you may get unexpected results.
You could use this lib:
https://github.com/carlo/jquery-base64
then do something similar to this:
// Set file
var baseFile = $.base64.encode(fileObject);
window.localStorage.setItem("file", baseFile);

// Get file
var outFile = window.localStorage.getItem("file");
Another solution would be to use JSON (I prefer this method),
using: http://code.google.com/p/jquery-json/
// Set file
window.localStorage.setItem("file", $.toJSON(fileObject));

// Get file
var outFile = $.evalJSON(window.localStorage.getItem("file"));
I don't think there is a direct way to stringify and then deserialize the string back into the object you're interested in. But as a workaround, you can store the image paths in your local storage and load the images by retrieving those URLs. The advantages are that you will never run out of storage space and you can store 1000 times more files there. Saving an image or any other file as a string in local storage is never a wise decision.
Create an object in the global scope, e.g.:
var attmap = new Object();
After you are done with the file selection, put your files into the attmap variable as below, then stringify it:
attmap[file.name] = attachmentBody;
JSON.stringify(attmap)
Then you can send it to the controller via a hidden input (or similar) and use it after deserializing:
(Map<String, String>)JSON.deserialize(attachments, Map<String,String>.class);
You can then create your files from those values in a for loop:
EncodingUtil.base64Decode(CurrentMapValue);
FYI: this solution also covers multiple file selection.
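A minimal client-side sketch of the map-building step, assuming readAsDataURL is used to produce attachmentBody and a hidden input with the hypothetical id "attachments" carries the serialized map (both assumptions, not from the original answer):

var attmap = {};

function onFilesSelected(event) {
    // Read each selected file as a data URL and store it keyed by file name.
    Array.prototype.forEach.call(event.target.files, function (file) {
        var reader = new FileReader();
        reader.onload = function () {
            attmap[file.name] = reader.result; // attachmentBody (assumed to be a data URL)
            // Serialize the whole map into the hidden input for submission.
            document.getElementById("attachments").value = JSON.stringify(attmap);
        };
        reader.readAsDataURL(file);
    });
}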
You could do something like this:
// fileObj = new File(); from a file input
// Note: Buffer is a Node.js API; in the browser this requires a bundler polyfill.
const buffer = Buffer.from(await new Response(fileObj).arrayBuffer());
const dataUrl = `data:${fileObj.type};base64,${buffer.toString("base64")}`;
localStorage.setItem('dataUrl', dataUrl);
then you can do:
document.getElementById('image').src = localStorage.getItem('dataUrl');
Using the FileReader API, it is possible to show a preview of the file by reading it with readAsDataURL.
What I am trying to do is:
The user selects a file
A preview is shown, so that the user has some feedback.
If the user is satisfied, he submits the data to the backend.
Implementing step 3 can be done by re-reading the file with readAsBinaryString, but this looks problematic because the data could have disappeared or changed on disk. So what I would like is to convert the data returned from readAsDataURL to the format returned by readAsBinaryString. How can I do this?
Another alternative would be to submit the data to the backend as returned by readAsDataURL, but I would like to avoid that, since that would require special handling on the backend in my case.
Like CBroe said, you don't need to read the file twice.
JS :
handleFileSelectThumbFile(evt) {
    var files = evt.target.files;
    var file = files[0];

    // This grabs the file extension; the actual MIME type is available in file.type.
    var thumbMIME = files[0]['name'].split('.').pop();

    if (files && file) {
        var reader = new FileReader();

        reader.onload = function (readerEvt) {
            // Split readerEvt.target.result on ','.
            // You can send the binaryString variable to the server;
            // it is already base64 encoded.
            var binaryString = readerEvt.target.result.split(',')[1];

            // Set the image preview to the uploaded image.
            $('.img-preview').prop('src', readerEvt.target.result);
        }.bind(this);

        reader.readAsDataURL(file);
    }
}
HTML :
<input type="file" onChange={this.handleFileSelectThumbFile} required/>
<img src='http://placehold.it/300' class='img-preview'/>
You can also read the MIME type from the first part of readerEvt.target.result. See CBroe's comment above.
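If you genuinely need the same string that readAsBinaryString would have produced, a minimal sketch is to decode the base64 payload with atob:

// dataUrl is the value produced by readAsDataURL, e.g. "data:image/png;base64,...."
function dataUrlToBinaryString(dataUrl) {
    var base64Payload = dataUrl.split(',')[1];
    // atob decodes base64 into the same "binary string" format
    // that readAsBinaryString would have returned.
    return atob(base64Payload);
}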
I need to serialize a File object from a file input, so that the object can be saved, parsed back to a file object, and then read using the FileReader object.
Does anyone know if this is possible in Google Chrome?
I think the problem lies in the protection of the file.path property. Webkit browsers hide this property, so I am guessing when you serialize it, the path is removed.
Then of course, the FileReader is unable to read it without path information.
Here is an example:
var files = uploadControl.files[0];
var dataFile = JSON.stringify(files);
var newFile = JSON.parse(dataFile);

var reader = new FileReader();
reader.onload = function (event) {
    var fileContents = event.target.result;
};
reader.readAsText(newFile);
Nothing happens. The reader is not loaded. If I pass the JSON object, it doesn't work either.
As a matter of principle, what you are asking for will not be possible. If it were possible to have some text which represented the file object, then you could construct that text from scratch, unserialize it, and thus get access to files the user did not grant permission to.
(The exception to this is if the representative text is some sort of robustly-secret bitstring (cryptographically signed, or a sparse key in a lookup table) that only the browser could have given out in the first place — but I expect if that feature existed, it would be documented and you would have found it already.)