Blueimp File Upload - Multiple Uploads Directly to S3

After searching for the past couple of days, I've found nearly 30 different people asking this same question, and I haven't found an answer. A couple reported that they found a solution but didn't share it, so I'm hoping someone can answer it here.
Using blueimp jQuery File Upload, how can you upload multiple files directly to Amazon S3?
Problem: S3 accepts only one file per request.
Solution: Use blueimp jQuery File Upload to send separate requests for each file.
Roadblock: I cannot figure out how to make blueimp jQuery File Upload do this.
Solution Attempt
The guide is here: https://github.com/blueimp/jQuery-File-Upload/wiki/Upload-directly-to-S3
S3 currently requires more form fields than the ones shown in the guide. Here's a trimmed-down version of my code:
$(':file').each(function(i, el) {
    var fileInput = $(el);
    var form = fileInput.parents('form:first');
    fileInput.fileupload({
        forceIframeTransport: true,
        autoUpload: true,
        singleFileUploads: true, // default anyway
        add: function(event, data) {
            var files = data.files || [];
            var nextInQueue = files[0]; // this is a queue, not a list
            console.log('nextInQueue:', nextInQueue);
            if (!nextInQueue) return;
            var fileData = {name: nextInQueue.name, mime: nextInQueue.type, size: nextInQueue.size};
            $.ajax({
                url: '/s3stuff',
                type: 'POST',
                dataType: 'json',
                data: {'file': fileData},
                async: false,
                success: function(res) {
                    form.find('input[name="key"]').val(res.key);
                    form.find('input[name="AWSAccessKeyId"]').val(res.AWSAccessKeyId);
                    form.find('input[name="policy"]').val(res.policy);
                    form.find('input[name="signature"]').val(res.signature);
                    form.find('input[name="acl"]').val(res.acl);
                    form.find('input[name="success_action_status"]').val(res.success_action_status);
                    form.find('input[name="Content-Type"]').val(nextInQueue.type);
                    form.attr('action', res.url);
                    data.submit(); // why is this submitting all files at once?
                }
            });
        },
        fail: function(err) {
            console.log('err:', err.stack);
        }
    });
});
Error
When I try to upload a single file, it works great! But when I try to upload multiple files, S3 returns this 400 error:
<?xml version="1.0" encoding="UTF-8"?>
<Error>
  <Code>InvalidArgument</Code>
  <Message>POST requires exactly one file upload per request.</Message>
  <ArgumentName>file</ArgumentName>
  <ArgumentValue>2</ArgumentValue>
</Error>
Analysis
The blueimp jQuery File Upload documentation says this (under add):
If the singleFileUploads option is enabled (which is the default), the add callback will be called once for each file in the selection for XHR file uploads, with a data.files array length of one, as each file is uploaded individually.
This is where I'm stuck:
If "each file is uploaded individually" to S3, why does S3 say otherwise?
Also, regardless of the number of files, the add function runs only once. (I verify this with the console.log statement.) I see 2 likely reasons for this:
The plugin stops further submissions after one fails.
The plugin is not submitting files individually (for whatever reason).
Is my code incorrect, am I missing an option in the plugin, or does the plugin not support multiple direct uploads to S3?
Update
Here's an html file for quick testing:
http://pastebin.com/mUBgr4MP

Author of jQuery File Upload here.
The reason the files are not submitted individually is the forceIframeTransport option, which is set to true in your example.
The documentation for the singleFileUploads option states the following:
By default, each file of a selection is uploaded using an individual request for XHR type uploads.
While the documentation for the forceIframeTransport option states the following:
Set this option to true to force iframe transport uploads, even if the browser is capable of XHR file uploads.
So, the solution is to enable CORS on the S3 bucket and to not enable the forceIframeTransport option.
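A minimal sketch of what that looks like, reusing the /s3stuff signing endpoint and the hidden form fields from the question above (this illustrates the corrected options; it is not a tested, complete example):
$(':file').each(function(i, el) {
    var fileInput = $(el);
    var form = fileInput.parents('form:first');
    fileInput.fileupload({
        // forceIframeTransport removed: with XHR uploads and the default
        // singleFileUploads: true, each file is sent as its own POST to S3
        autoUpload: true,
        add: function(event, data) {
            var file = data.files[0];
            $.post('/s3stuff', {file: {name: file.name, mime: file.type, size: file.size}}, function(res) {
                // fill the hidden inputs with the signed policy for this one file
                form.find('input[name="key"]').val(res.key);
                form.find('input[name="AWSAccessKeyId"]').val(res.AWSAccessKeyId);
                form.find('input[name="policy"]').val(res.policy);
                form.find('input[name="signature"]').val(res.signature);
                form.attr('action', res.url);
                data.submit(); // one request per file, straight to the bucket
            }, 'json');
        }
    });
});
The bucket itself also needs a CORS rule allowing POST from your origin, since the browser now talks to S3 directly via XHR.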
By the way, most of the integration examples are user contributions (including the S3 upload guides).
Here's another one which uses the S3 CORS feature and might help you out (I haven't tested it, though):
http://pjambet.github.io/blog/direct-upload-to-s3

Related

Getting null file name in php - jquery file upload

I'm using the jQuery File Upload plugin by blueimp for uploading files.
My client-side options are:
dataType: 'json',
autoUpload: true,
formData: [],
fileInput: $("#fileupload"),
acceptFileTypes: /(\.|\/)(jpe?g|png)$/i,
maxFileSize: 999000,
maxChunkSize: 0,
multipart: false,
When I tried uploading files without the last two options, small files (around 450-500 KB) were uploaded successfully, while larger files (around 800 KB) were not (the request was sent, but the response never arrived, and no errors were recorded in PHP, Apache, or JavaScript).
After adding the last two options, the response did reach the 'fileuploaddone' event on the client side, but it was actually an error response: the file name arrived as NULL in my PHP backend, which then reported an invalid file type.
I'm making API calls to a PHP backend, and when I do the same using Postman (a Google Chrome plugin for making API calls), everything works fine.
P.S. My file name is transmitted via the Content-Disposition header, but I'm not sure what exactly that is.
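For what it's worth, here is a rough illustration (plain XMLHttpRequest, not the plugin's internals) of what a non-multipart upload looks like: the raw file bytes are the entire POST body and the file name travels only in that Content-Disposition request header, so a PHP backend has to read the header and php://input rather than $_FILES (the '/upload.php' endpoint is a placeholder):
// Illustration only: with multipart: false there are no form fields at all.
var file = document.getElementById('fileupload').files[0];
var xhr = new XMLHttpRequest();
xhr.open('POST', '/upload.php'); // placeholder endpoint
// The file name travels only in this header.
xhr.setRequestHeader('Content-Disposition',
    'attachment; filename="' + encodeURIComponent(file.name) + '"');
xhr.send(file); // raw file bytes as the body, so PHP's $_FILES stays empty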

Append formdata for multiple file upload [duplicate]

I need to upload files using Ajax, and it has to be supported in IE9. I was using FormData as mentioned here. My code looks like this:
var files = new FormData();
jQuery.each($('#file')[0].files, function (i, file) {
    files.append('file', file);
});
$.ajax({
    type: "POST",
    url: '/url',
    cache: false,
    contentType: false,
    processData: false,
    data: files,
    ...
});
This works fine in Safari and Firefox, but fails in IE9, as FormData is not supported there. I tried sending just the file by setting:
data: $('#file')[0].files[0]
contentType: 'multipart/form-data'
This fails because the data is sent URL-encoded and cannot be parsed on the Java side. Any help or pointers on how to solve this would be greatly appreciated. I need something that works across all browsers.
EDIT: I do not need any upload progress bar as the files are usually small. I do not need to upload multiple files. I just need a single file upload.
Unfortunately, you cannot use Ajax (XMLHttpRequest, in other words) for sending files in IE9, but you can implement similar behavior using an <iframe/> with a <form method="post" enctype="multipart/form-data"/> that contains an <input type="file"/>, which sends a user-chosen file the "natural" way. You can use JavaScript to call form.submit(), then poll that <iframe/> from the parent document to check whether the file upload is done.
jQuery has a lot of cool plugins for getting this job done, there is my favorite one, for example.
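For illustration, here is a minimal sketch of that hidden-iframe approach, assuming the '#file' input from your snippet sits inside a form and that '/url' accepts a normal multipart POST (everything else is made up for the example). Note that the server should answer with a text/html content type, or IE will offer to download the response:
// Rough sketch of the iframe fallback; '#file' and '/url' come from the question.
var $iframe = $('<iframe name="upload-frame" style="display:none"></iframe>').appendTo('body');
var $form = $('#file').closest('form').attr({
    action: '/url',
    method: 'post',
    enctype: 'multipart/form-data',
    target: 'upload-frame' // post into the hidden iframe instead of navigating the page
});
$iframe.on('load', function () {
    // When the iframe loads, the server's response is its document body.
    console.log('upload finished:', $iframe.contents().find('body').text());
});
$form.submit();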

How to add files that were loaded earlier to DropZone.js

Got a really interesting (for me) problem.
I have the dropzone.js plugin installed, and now I need to put some files there... from PHP.
What I am trying to do:
A PHP script detects that there are some files (in a directory) that were uploaded earlier (for example, a few days ago). (I know the names of these files.)
After that, I have to pass these files to my JavaScript, which will add them to the Dropzone so the user can see the files they uploaded earlier.
And all of this using Ajax.
I understand what to do for step 1 (I can find those files), but how do I pass them to JS and then add them to the Dropzone?
Or am I thinking about this wrong? Please help.
Dropzone has a wiki page explaining that.
Here is how I've recently done it by getting file URLs from a REST API:
$.get('http://api.to.return.files', function(data) {
    $(data.photos).each(function(i, photo) {
        var mockFile = { name: photo.name, size: photo.size, accepted: true, id: photo.id };
        myDropzone.emit("addedfile", mockFile);
        myDropzone.emit("thumbnail", mockFile, photo.url);
        myDropzone.emit("complete", mockFile);
        myDropzone.files.push(mockFile);
    });
});
If you already have your file URLs in the script, use them directly instead of the API response I use here.
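For example, with the URLs already in hand (the file names, sizes, and paths below are placeholders), the same emit calls work without the $.get:
// Same idea with local data instead of an API call; values are placeholders.
var existing = [
    { name: 'photo-1.jpg', size: 12345, url: '/uploads/photo-1.jpg' },
    { name: 'photo-2.jpg', size: 67890, url: '/uploads/photo-2.jpg' }
];
$.each(existing, function(i, f) {
    var mockFile = { name: f.name, size: f.size, accepted: true };
    myDropzone.emit("addedfile", mockFile);
    myDropzone.emit("thumbnail", mockFile, f.url);
    myDropzone.emit("complete", mockFile);
    myDropzone.files.push(mockFile);
});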

FineUploader batch upload with one invalid object

When a user uploads a batch of files with the FineUploader JavaScript plugin, and one of these files has an invalid extension, the whole upload fails. I would like the process to proceed for all valid files. Is this possible?
Yes, it is possible to have Fine Uploader simply ignore an invalid file, instead of stopping the entire upload. This is covered in the documentation (in several places) such as in the validation options section.
For example:
var uploader = new qq.FineUploader({
    ...
    validation: {
        ...
        stopOnFirstInvalidFile: false
    }
});

How to get file contentType using file uploader in IE9

I'm trying to do a simple task: upload a file with the valums file uploader (or Fine Uploader) in an MVC3 application, save it in a database, and let the user download it again (with an action returning FileContentResult). To do that, I need the contentType of the uploaded file.
IE9 uses the "UploadHandlerForm" methods in the valums file uploader (I'm using version 2.1.2), where I can't get the contentType.
When I'm using IE10, for example, the plugin uploads using UploadHandlerXhr, so I can get the content type and post it to the server like this:
_upload: function(id, params)
{
    ...
    var file = this._files[id];
    var type = (file.type != null ? file.type : '');
    ...
    // and then, add it to be posted to the server:
    xhr.setRequestHeader("X-File-Type", type);
}
Is there any way I can get the contentType of a file from a file input with JavaScript in older browsers (like IE9)?
It's not clear what you are trying to do here at all. Are you trying to send the content-type of the file in a separate request? If so, why? The content-type is part of each multipart-encoded (MPE) request. Just examine the Content-Type header of the multipart part that contains the file data.
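For reference, this is roughly what one part of such a request looks like on the wire (the boundary, field name, and file name are illustrative); the part's own Content-Type header carries the file's content type:
POST /upload HTTP/1.1
Content-Type: multipart/form-data; boundary=----boundary123

------boundary123
Content-Disposition: form-data; name="qqfile"; filename="photo.jpg"
Content-Type: image/jpeg

...raw file bytes...
------boundary123--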
Also, don't access variables/functions that start with an underscore. Those are not part of the API and may change or be removed at any time. In the future, I hope to prevent access to these internals entirely.
