There are two ways I can upload files using Ajax (XHR2). First, I can read the file content as an ArrayBuffer or binary string and then simply send it using the XHR send method. For example, as shown here:
function uploadFile(img, file) {
    const reader = new FileReader();
    const xhr = new XMLHttpRequest();
    xhr.upload.addEventListener("progress", function(e) {
        if (e.lengthComputable) {
            const percentage = Math.round((e.loaded * 100) / e.total);
            // Do something with percentage
        }
    });
    xhr.upload.addEventListener("load", (e) => console.log('Do something more'));
    xhr.open("POST", "some-url");
    xhr.overrideMimeType('text/plain; charset=x-user-defined-binary');
    reader.onload = function(evt) {
        xhr.send(evt.target.result);
    };
    reader.readAsBinaryString(file);
}
Second, I can use FormData to upload my file as shown here:
var formData = new FormData();
// HTML file input, chosen by user
formData.append("userfile", fileInputElement.files[0]);
var request = new XMLHttpRequest();
request.open("POST", "some-url");
request.send(formData);
Are the two methods equivalent? Is there any advantage of using FileReader instead of FormData? Is one more performant than the other?
First, there is a third option you omitted, which is to send the File directly through xhr.send(file), just like you did with the ArrayBuffer.
That being said, there is no advantage whatsoever to first reading the file into memory through FileReader.
When doing a file upload from a File on disk, the browser doesn't load the full file into memory but streams it through the request. This is how you can upload gigs of data even though they wouldn't fit in memory. This is also friendlier to the disk, since it allows other processes to access it between each chunk instead of locking it.
When reading the File through a FileReader, you are asking the browser to read the full file into memory, and when you then send it through XHR, the data from memory is used. You are thus limited by the available memory, bloating it for no good reason, and even asking the CPU to do work when the data could have gone from the disk to the network card almost directly.
As to the difference between formData.append('file', file); xhr.send(formData); and xhr.send(file): basically only the request headers. The former will wrap the request as a multipart/form-data enctype request, while the latter will send the file as-is.
So you'd handle both requests differently on the receiving end.
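For reference, a minimal sketch of that third option (the URL and field handling are assumptions, mirroring the question's examples):

function uploadFileDirect(file) {
    var xhr = new XMLHttpRequest();
    xhr.upload.addEventListener("progress", function(e) {
        if (e.lengthComputable) {
            console.log(Math.round((e.loaded * 100) / e.total) + "%");
        }
    });
    xhr.open("POST", "some-url");
    // The request body is the raw file bytes; the browser streams the
    // File from disk and sets Content-Type from the file's MIME type.
    xhr.send(file);
}

On the receiving end you'd read the raw request body (for example php://input in PHP) rather than a multipart form field.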
Having an HTML form that is submitted via POST (the user clicking the submit button).
Furthermore, having an image that is read via JavaScript from a canvas object (getImageData()).
Question:
How to "inject" this image data into the HTML form, so that it gets uploaded as Content-Type: multipart/form-data and can be processed by the existing PHP framework's data-extraction logic?
Example from an <input type="file"> upload captured with Chrome in a POST request; it should look like this:
------WebKitFormBoundaryBBAQ5B4Ax1NgxFmD
Content-Disposition: form-data; name="images"; filename="fooimage.png"
Content-Type: image/png
Problem:
I know how to upload it in a separate request (via Ajax, separate from the form). I know how to upload it as base64 data and process it manually in the form.
But I do not know how to send the image data along with the existing form so that it looks to the server-side PHP scripts exactly the same as an image sent via <input type="file">.
Reason: Symfony's FileUpload object checks whether a file was uploaded via the POST form, and fails if I manually instantiate the object with the data.
So the best solution would be (in regard to a lot of other things, like testing and avoiding unnecessary logic) if the data were passed the same way as a regular form upload. How to do this?
Thanks!
You can use a FormData object to get the values of your form, and then append a blob version of your canvas into the FormData.
This blob will be seen as a file by the server.
When this answer was first written, not all browsers supported the native canvas.toBlob() method, and the implementations that existed were inconsistent.
All major browsers now support toBlob, and you can find a polyfill on MDN for older browsers.
// the function to create and send our FormData
var send = function(form, url, canvas, filename, type, quality, callback) {
    canvas.toBlob(function(blob) {
        var formData = form ? new FormData(form) : new FormData();
        formData.append('file', blob, filename);
        var xhr = new XMLHttpRequest();
        xhr.onload = callback;
        xhr.open('POST', url);
        xhr.send(formData);
    }, type, quality);
};
// How to use it //
var form = document.querySelector('form'), // the form to construct the FormData, can be null or undefined to send only the image
url = 'http://example.com/upload.php', // required, the url where we'll send it
canvas = document.querySelector('canvas'), // required, the canvas to send
filename = (new Date()).getTime() + '.jpg',// required, a filename
type = 'image/jpeg', // optional, the algorithm to encode the canvas. If omitted defaults to 'image/png'
quality = .5, // optional, if the type is set to jpeg, 0-1 parameter that sets the quality of the encoding
callback = function(e) {console.log(this.response);}; // optional, a callback once the xhr finished
send(form, url, canvas, filename, type, quality, callback);
The PHP side would then be:
if ( isset( $_FILES["file"] ) ) {
    $dir = 'some/dir/';
    $blob = file_get_contents($_FILES["file"]['tmp_name']);
    file_put_contents($dir.$_FILES["file"]["name"], $blob);
}
The goal is to send a list of items to retrieve from the server, which queries the file system or a database and ideally returns an array of binary files.
My current methodology is to base64 encode binaries into a JSON array.
// Extend Array class methods to NodeList data type
Object.getOwnPropertyNames( Array.prototype ).forEach( function(method){
    if (method !== 'length')
        NodeList.prototype[method] = Array.prototype[method];
});
// Find People
var nodeList = document.querySelectorAll('div.people');
var people = nodeList.map(function(el){ return el.dataset.personid; });
// Search Server
var request = new XMLHttpRequest();
var skip_cache = '?' + new Date().getTime();
request.open('POST', 'cgi.script' + skip_cache );
request.responseType = 'json';
request.onload = function(){
    // response handler
    /* receives json from server, including an array of base64 files for each person;
       loop over the array, append base64 values to images (data URI), and prepare other HTML elements */
};
request.send( JSON.stringify({ids:people}) );
The bottleneck is the size of the request; I'm already breaking the list of people into multiple requests. Base64 is only ~37% larger in its uncompressed state (every 3 bytes become 4 ASCII characters, plus padding), but when you have 1k files to download, that overhead adds up.
The goal is to reduce the size per request (without sacrificing time), which comes down to either a better compression method (LZMA vs. gzip) or a better data format (binary over ASCII).
Is it possible to transfer multiple binary files at once, or even embed them directly in JSON without side effects? As a preventative measure I've never attempted this, given the side effects it might have on older technology.
I need to create a small browser-based application that helps users download/save, and possibly print to the default printer, a large number of files from a webserver we have no control over (but we have all the URIs beforehand).
These files are hidden behind a "single sign-on" (SSO) that can only be performed via a browser and requires a user. Hence, it must be a browser-based solution, where we piggyback on to the session established by the SSO.
The users' platform is Windows 7.
The point is to save the users from going through a lot of clicks per file (download, where to save, etc.) when they need to perform this operation (daily).
At this point all the files are PDF, but that might change in the future.
A browser-agnostic solution is preferred (and I assume more robust when it comes to future browser updates).
But we can base it on a particular browser if needed.
How would you do this from Javascript?
As the comments to my question say, this isn't really allowed by browsers for security reasons.
My workaround for now (only tested using IE11) is to manually change the security settings of the user's browser, and then download the files as a blob into a JavaScript variable using AJAX, followed by an upload of the same blob to my own server, again using AJAX.
"My own server" is a Django site created for this purpose, which also knows which files to download for the day and provides the JavaScript needed. The user goes to this site to initiate the daily download after performing the SSO in a separate browser tab.
On the server I can then perform whatever operations needed for said files.
Many thanks to this post https://stackoverflow.com/a/13887220/833320 for the handling of binary data in AJAX.
1) In IE, add the involved sites to the "Local Intranet Zone", and enable "Access data sources across domains" for this zone to overcome the CORS protection otherwise preventing this.
Of course, consider the security consequences involved in this...
2) In javascript (browser), download the file as a blob and POST the resulting data to my own server:
var x = new XMLHttpRequest();
x.onload = function() {
    // Create a form
    var fd = new FormData();
    fd.append('csrfmiddlewaretoken', '{{ csrf_token }}'); // Needed by Django
    fd.append('file', x.response); // x.response is a Blob object
    // Upload to your server
    var y = new XMLHttpRequest();
    y.onload = function() {
        alert('File uploaded!');
    };
    y.open('POST', '/django/upload/');
    y.send(fd);
};
x.open('GET', 'https://external.url', true);
x.responseType = 'blob'; // <-- This is necessary!
x.send();
3) Finally (in Django view for '/django/upload/'), receive the uploaded data and save as file - or whatever...
filedata = request.FILES['file'].read()
with open('filename', 'wb') as f:
    f.write(filedata)
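For completeness, a minimal sketch of how that snippet might sit inside a Django function-based view (the view name and the response are assumptions):

from django.http import HttpResponse

def upload(request):
    # 'file' matches the field name appended to the FormData above
    filedata = request.FILES['file'].read()
    with open('filename', 'wb') as f:
        f.write(filedata)
    return HttpResponse('OK')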
Thanks all, for your comments.
And yes, the real solution would be to overcome the SSO (which requires the user), so it all could be done by the server itself.
But at least I learned a little about getting/posting binary data using modern XMLHttpRequests. :)
Actually, I had a similar problem: I wanted to download a binary file (an image), store it, and then use it when I need it, so I decided to download it with a Fetch API GET call:
const imageAddress = 'an-address-to-my-image.jpg'; // sample address
fetch(imageAddress)
    .then(res => res.blob()) // <-- blob() must be called as a method
    .then(blobToBase64)
    .then(base64FinalAnswer => console.log(base64FinalAnswer));
blobToBase64 is a helper function that converts a binary blob to a base64 data string:
const blobToBase64 = blob => {
    const reader = new FileReader();
    reader.readAsDataURL(blob);
    return new Promise(resolve => {
        reader.onloadend = () => {
            resolve(reader.result);
        };
    });
};
In the end, I have the base64FinalAnswer and I can do anything with it.
I've got a WebGL application which requires me to load a lot of x,y vertex data, but to minimise bandwidth usage I want to also compress the data (on a one off basis) using gzip.
Below is the code I will use to load in the data. I want to retrieve data from a server and pass it straight into a Float32Array.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'array.gz', true);
xhr.responseType = 'arraybuffer';
xhr.onload = function(data) {
    console.log("loaded");
    var dataArray = new Float32Array(this.response);
};
xhr.onprogress = function(e) {
};
xhr.onerror = function(error) {
    console.log('error!');
};
xhr.send();
Now my problem isn't linked to the code directly, but to the file format supported. In what format (e.g. CSV, JSON, XML) does the data need to be, before being gzipped, so that this method can properly consume it?
I've played around and found that if I JSON.stringify a Float32Array and place the content in a file, then load it in, it works fine. However, loading all my uncompressed data into a JavaScript array just to copy its contents back into a new file to be compressed isn't very feasible. So I'm really looking for an alternative way to do this (assuming this file format is the only one supported).
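For what it's worth, one way to produce such a file without going through JSON at runtime is a one-off Node.js preprocessing step; this is just a sketch, and the file names are placeholders. It writes the floats as raw binary (little-endian in practice), which is exactly what new Float32Array(arrayBuffer) expects on the other end:

// One-off Node.js script: JSON array of floats -> gzipped raw Float32 binary
const fs = require('fs');
const zlib = require('zlib');

const floats = JSON.parse(fs.readFileSync('vertices.json', 'utf8'));
const raw = Buffer.from(new Float32Array(floats).buffer);
fs.writeFileSync('array.gz', zlib.gzipSync(raw));

Note the server has to send the file with a Content-Encoding: gzip header; the browser then decompresses it transparently, so this.response in the onload handler above is already the raw ArrayBuffer.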
I currently have a Django formset with a dynamic number of forms. Each form has a file field.
While this works, I need a way to allow multiple files to be selected at once. I can do that with <input type="file" multiple="multiple">, and I can read the file data using FileReader. But how can I pass these files to the Django formset instead of form-0-file, form-1-file, etc.?
I can think of one way: replace the FileField with a TextField and pass around the base64-encoded file as text, but I'm not sure it's a good solution.
Just use the multiple attribute and read request.FILES to get all of the uploaded files.
base64 encoding probably won't help.
Using multiple='multiple' is not tied to a formset or a single form; it works natively with Django. So if you plan to have a single form instead of a formset, just put the multiple attribute on the input and then access request.FILES to get all of the uploaded files, as sketched below.
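A minimal sketch of what that might look like (the field name userfile and the handler are placeholders), with the template input declared as <input type="file" name="userfile" multiple>:

def upload(request):
    # getlist returns every file submitted under the same field name
    for f in request.FILES.getlist('userfile'):
        handle_uploaded_file(f)  # placeholder for your own handling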
You should store the images as files (here are some good answers on how to do that).
I already tried to store images that way, but it has a lot of problems:
Every time you go to that page, it will start loading the image again, and people don't want to load the same image every time.
It will use a lot of bandwidth.
If you are using a free web hosting service, you will spend all of your bandwidth in a couple of hours, or once you store 50 images, which means your site will be down for the rest of the month; even services that advertise unlimited hosting and bandwidth enforce a monthly cap.
I recently had a problem where I had to implement a solution where a user can upload a document, stream it to the server, and, without storing it on the server, post the stream to a SOAP server.
The way I implemented it is as follows:
I wanted to upload the file via AJAX in order to show the progress of the upload.
This is my solution (please note this only caters for one file at a time, but it might be a starting point).
JavaScript:
First declare an object for the FormData - This will be used to send any additional info along with the files:
var formData = new FormData();
Secondly append all the data you would like to send to the server:
formData.append("documentDescription", $("#documentDescription textarea").val());
formData.append("afile", file.files[0]);
Now create a new instance of XMLHttpRequest:
var xhr = new XMLHttpRequest();
This is the rest of the code that got everything working:
xhr.open("POST", "UploadDocumentURL", true);
xhr.upload.onprogress = function(e) {
if (e.lengthComputable) {
var percentComplete = (e.loaded / e.total) * 100;
$('.progress .bar').css('width', percentComplete + '%').attr('aria-valuenow', percentComplete);
}
}
xhr.onload = function() {
if (this.status == 200) {
var resp = JSON.parse(this.response);
if (resp.type === "error") {
notify.add(resp.type, "Error", resp.message, 3000, true);
} else {
notify.add(resp.type, "Success", resp.message, 3000);
}
}
;
};
xhr.send(formData);
PHP
$documentName = $_POST["documentDescription"];
$fileName = $_FILES['afile']['name'];
$fileType = $_FILES['afile']['type'];
$ext = pathinfo($fileName, PATHINFO_EXTENSION);
$fileName = pathinfo($fileName, PATHINFO_FILENAME);
$fileContent = file_get_contents($_FILES['afile']['tmp_name']);
You will now have the binary data on the server.
You should be able to make this work for multiple files by looping through file.files in JavaScript, as sketched below.
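For example, a minimal sketch of that loop (the field name is an assumption, using PHP's array convention):

var formData = new FormData();
// Appending each file under "afile[]" makes PHP expose them
// as arrays under $_FILES['afile'] on the server.
for (var i = 0; i < file.files.length; i++) {
    formData.append("afile[]", file.files[i]);
}
xhr.send(formData);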
Hope you can apply this to your problem.