I'm using the jQuery File Upload plugin by blueimp for uploading files.
My client-side options are:
dataType: 'json',
autoUpload: true,
formData: [],
fileInput: $("#fileupload"),
acceptFileTypes: /(\.|\/)(jpe?g|png)$/i,
maxFileSize: 999000,
maxChunkSize: 0,
multipart: false,
When I tried uploading files without the last two options, small files (around 450-500 KB) were uploaded successfully, but larger files (around 800 KB) were not: the request was sent, but the response never arrived, and no errors were recorded in PHP, Apache, or JavaScript.
After adding the last two options, the response reached the 'fileuploaddone' event on the client side, but it was actually an error response: my PHP backend received the file name as NULL and reported an invalid file type.
I'm making API calls to the PHP backend, and when I make the same calls using Postman (a Google Chrome plugin for making API calls), everything works fine.
P.S. My file name is transmitted via the Content-Disposition header, but I'm not sure exactly what that is.
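From what I can tell, that header carries the name in a shape roughly like this (the file name here is just a placeholder):

Content-Disposition: attachment; filename="picture.jpg"

With multipart: false, the backend would have to read the name from this header instead of $_FILES, which might explain the NULL file name if the header never reaches PHP.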
Related
I was writing an end-to-end test case for a site where the user is supposed to upload a Python file. I want to test what happens if I rename a PNG file with a .py extension and upload it. When I did this manually, the website gave the error:
ERROR: You have submitted a file factorial.py of type py but the contents are of image/png type
However, when I uploaded it through Cypress using the cypress-file-upload plugin, it gave me the following error:
ERROR: You have submitted a file factorial.py of type py but the contents are of application/octet-stream type
Why is this so? Am I missing something, or am I getting a different message because of the way the plugin uploads the file? Can I imitate the manual file upload through Cypress (that is, get exactly the same image/png error)?
Update
I also checked with an MP3 file. Manual upload was giving the error:
ERROR: You have submitted a file factorial.py of type py but the contents are of audio/mpeg type
whereas the Cypress upload was giving the same application/octet-stream error.
Also, my code to upload the file looks like this:
cy.get('input[name="docfile"]').attachFile(filePath)
cy.contains('input','Upload').click()
Explicitly adding a MIME type also doesn't work and still gives the same error:
cy.get('input[name="docfile"]').attachFile({ filePath: _filePath, mimeType : _mimeType })
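One thing I'm considering as a workaround (a minimal sketch; the fixture name and MIME type are just placeholders) is bypassing the plugin and attaching the file by hand via a DataTransfer, so the File object carries an explicit type:

cy.fixture('factorial.py', 'binary')
  .then(Cypress.Blob.binaryStringToBlob)
  .then((blob) => {
    // Build a File with an explicit MIME type and put it on the input
    const file = new File([blob], 'factorial.py', { type: 'image/png' });
    const dataTransfer = new DataTransfer();
    dataTransfer.items.add(file);
    cy.get('input[name="docfile"]').then(($input) => {
      $input[0].files = dataTransfer.files;
      $input[0].dispatchEvent(new Event('change', { bubbles: true }));
    });
  });

Though if the server sniffs the actual bytes, I'd expect the result to depend on whether the binary content survives the fixture round-trip intact.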
After searching for the past couple of days, I've found nearly 30 different people asking this same question, and I haven't found an answer. A couple reported that they found a solution but did not provide it, so I'm hoping someone can answer it here.
Using blueimp jQuery File Upload, how can you upload multiple files directly to Amazon S3?
Problem: S3 accepts only one file per request.
Solution: Use blueimp jQuery File Upload to send separate requests for each file.
Roadblock: I cannot figure out how to make blueimp jQuery File Upload do this.
Solution Attempt
The guide is here: https://github.com/blueimp/jQuery-File-Upload/wiki/Upload-directly-to-S3
S3 currently requires more form fields than the ones shown in the guide. Here's a trimmed-down version of my code:
$(':file').each(function(i, el) {
var fileInput = $(el);
var form = fileInput.parents('form:first');
fileInput.fileupload({
forceIframeTransport: true,
autoUpload: true,
singleFileUploads: true, //default anyway
add: function(event, data) {
var files = data.files || [];
var nextInQueue = files[0]; //this is a queue, not a list
console.log('nextInQueue:', nextInQueue);
if (!nextInQueue) return;
var fileData = {name: nextInQueue.name, mime: nextInQueue.type, size: nextInQueue.size};
$.ajax({
url: '/s3stuff',
type: 'POST',
dataType: 'json',
data: {'file': fileData},
async: false,
success: function(res) {
form.find('input[name="key"]').val(res.key);
form.find('input[name="AWSAccessKeyId"]').val(res.AWSAccessKeyId);
form.find('input[name="policy"]').val(res.policy);
form.find('input[name="signature"]').val(res.signature);
form.find('input[name="acl"]').val(res.acl);
form.find('input[name="success_action_status"]').val(res.success_action_status);
form.find('input[name="Content-Type"]').val(nextInQueue.type);
form.attr('action', res.url);
data.submit(); //why is this submitting all files at once?
}
});
},
fail: function(err) {
console.log('err:', err.stack);
}
});
});
Error
When I try to upload a single file, it works great! But when I try to upload multiple files, S3 returns this 400 error:
<?xml version="1.0" encoding="UTF-8"?>
<Error>
<Code>InvalidArgument</Code>
<Message>POST requires exactly one file upload per request.</Message>
<ArgumentName>file</ArgumentName>
<ArgumentValue>2</ArgumentValue>
</Error>
Analysis
The blueimp jQuery File Upload documentation says this (under add):
If the singleFileUploads option is enabled (which is the default), the add callback will be called once for each file in the selection for XHR file uploads, with a data.files array length of one, as each file is uploaded individually.
This is where I'm stuck:
If "each file is uploaded individually" to S3, why does S3 say otherwise?
Also, regardless of the number of files, the add function runs only once. (I verified this with the console.log statement.) I see two likely reasons for this:
The plugin stops further submissions after one fails.
The plugin is not submitting files individually (for whatever reason).
Is my code incorrect, am I missing an option in the plugin, or does the plugin not support multiple direct uploads to S3?
Update
Here's an html file for quick testing:
http://pastebin.com/mUBgr4MP
Author of jQuery File Upload here.
The reason the files are not submitted individually is the forceIframeTransport option, which is set to true in your example.
The documentation for the singleFileUploads option states the following:
By default, each file of a selection is uploaded using an individual request for XHR type uploads.
While the documentation for the forceIframeTransport option states the following:
Set this option to true to force iframe transport uploads, even if the browser is capable of XHR file uploads.
So, the solution is to enable CORS on the S3 bucket and to not enable the forceIframeTransport option.
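A minimal sketch of the corrected initialization, reusing the variable names from your code (the add handler stays as you have it):

fileInput.fileupload({
    // forceIframeTransport removed: XHR uploads send one request per file
    autoUpload: true,
    singleFileUploads: true, // the default: one XHR request per file
    add: function(event, data) {
        // fetch the signature fields as before, then:
        data.submit();
    }
});

The CORS part is configured on the bucket itself, not in the plugin options.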
By the way, most of the integration examples are user contributions (including the S3 upload guides).
Here's another one which uses the S3 CORS feature and might help you out (I haven't tested it, though):
http://pjambet.github.io/blog/direct-upload-to-s3
I am using the AWS S3 JavaScript SDK to upload files to my S3 bucket via the browser. I had no problem fetching files or uploading small and even huge files with multipart upload under normal conditions.
The issue I faced was while uploading a huge file when the connection dropped partway through. After the connection returned, the request was resent for the remaining parts, but it failed.
I have attached a screenshot of the failed requests.
Any reason why this fails, or any way this can be handled/resolved?
When you are uploading a large amount of data, you can try the ManagedUpload class for multipart uploading. You need to specify the part size, however. Sample code from the documentation:
var upload = new AWS.S3.ManagedUpload({
partSize: 10 * 1024 * 1024, queueSize: 1,
params: {Bucket: 'bucket', Key: 'key', Body: stream}
});
Here partSize (a Number, 5 MB by default) is the size in bytes of each individual part to be uploaded.
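A minimal usage sketch with those settings (the progress handler is optional):

// Report progress, then kick off the upload
upload.on('httpUploadProgress', function(progress) {
    console.log('Uploaded', progress.loaded, 'of', progress.total, 'bytes');
});
upload.send(function(err, data) {
    if (err) {
        console.error('Upload failed:', err); // e.g. the connection dropped mid-part
    } else {
        console.log('Upload finished at', data.Location);
    }
});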
There's also an open-source project on GitHub, AWS S3 Multipart Upload from Browser, written in JavaScript and PHP, which uploads huge files directly to Amazon S3 in chunks of 5 MB, so uploads are resumable and recover easily from errors.
To use that project you might have to use PHP, and there's also a limit on the maximum upload size per file, so please do have a look at it.
In my form, there is a file field:
<input type="file" name="file">
<input type="button" value="upload">
I want to upload an image file to the server, but I'm not able to upload it. Is there any solution in JavaScript only? Please help me find the answer.
Thanks!
This isn't possible with JavaScript alone; you will need a server-side language to process the actual upload of the file.
This will do the job in Firefox; older IE (before IE 10) does not support FormData, so for those browsers you'll need another approach.
<script type="text/javascript" src="http://code.jquery.com/jquery-latest.min.js"></script>
<form id="data">
<fieldset>
<div>Asset File: <input id="image_file" name="image_file[]" type="file" /></div>
<div><input type="submit" value="Submit"></div>
</fieldset>
</form>
<script>
$(function() {
$("form#data").submit(function(event){
event.preventDefault();
var url = 'http://server.com/upload';
var image_file = $('#image_file').get(0).files[0];
var formData = new FormData();
formData.append("image_file", image_file);
$.ajax({
url: url,
type: 'POST',
data: formData,
async: false,       // synchronous request: execution blocks until it completes
cache: false,
contentType: false, // let the browser set multipart/form-data and its boundary
processData: false, // send the FormData object as-is instead of serializing it
success: function (status) {
// do something on success
}
});
return false;
});
});
</script>
Your website's files are stored on the server. JavaScript runs on the client side.
The images must be stored somewhere, but client-side JavaScript only has access to the user's browser window (or tab) and the user's cache. You cannot write files to server storage using only client-side JavaScript.
The ways images are uploaded and downloaded:
Download:
You send a request to the server using JavaScript (Ajax, for example). In this request you say: "GET http://my-site/images/cool-dog.png". This is a static request that tries to access images/cool-dog.png from your public folder on the server. Every server has an option that lets you determine which folder contains the files served for static requests.
A STATIC request is one that tries to access a file with an extension (cool-dog.png).
Upload:
As we know, anybody can write client-side JavaScript in the console: every major browser has debugging tools. And anybody can send any kind of request to your server, from Postman for example.
It would be bad to accept all of those requests. Somebody may want to upload a 100 GB file at maximum speed, which could hurt application and server performance or even be used to attack the application.
You need server-side logic to determine which files are acceptable to your server and where they must be stored, since JavaScript only knows about client-side storage (cookies, localStorage, sessionStorage, cache, etc.).
Upload process is:
You send a request from the client-side JavaScript to the server side, for example: POST http://my-site/uploadImage with the image data.
The server side accepts that request, for example: Router.post('uploadImage', function() { ...server-side code... }). Since http://my-site/uploadImage is a dynamic path (there is no file extension), the server-side router waits for this kind of request; when one arrives, the server-side code must check the file size, the extension (.png or .dll), and so on. If the file is good for your application, you can then write it to a server location, as in the sketch below.
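Here is a minimal sketch of that server-side step, assuming Node.js with Express and the multer middleware (the route path and the image_file field name follow the examples above):

const express = require('express');
const multer = require('multer');

// Accept uploads into a server-side folder, rejecting anything over 5 MB
const upload = multer({
    dest: 'uploads/',
    limits: { fileSize: 5 * 1024 * 1024 }
});

const app = express();

// Dynamic path: no file extension, so the router (not the static file
// handler) decides whether and where the file gets stored
app.post('/uploadImage', upload.single('image_file'), (req, res) => {
    // Further checks (extension, content type) would go here
    res.json({ name: req.file.originalname, size: req.file.size });
});

app.listen(3000);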
Different server-side languages and technologies have different methods for processing requests and writing files.
If you want to extend your knowledge, you need to learn at least one server-side language and technology. You cannot build flexible applications with client-side logic alone.
If you are already familiar with JavaScript, you may want to learn Node.js. This site contains great tutorials for Node.js beginners.
PHP is also easy to learn if you can find some good tutorials.
Here is a tutorial on how to upload files using Node.js,
and here is another for PHP.
I'm using the jquery-file-upload plugin to upload some files.
I bind to the add callback, where I collect the files to upload, and on form submit I issue the 'send' command to begin the upload.
The problem I'm facing is that the fail callback is called after a successful upload with the following error message:
"Uploaded bytes exceed file size"
Can anyone explain what this error means and why I keep getting it?
I appreciate the help. Thanks!
I was getting this error as well, but I had set the dataType value to 'json'.
I removed the dataType option and stopped getting this error.
This only worked for me because I was not getting JSON back.
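In plugin terms the change was simply this (a sketch; whether it applies depends on what your server actually returns):

$('#fileupload').fileupload({
    // dataType: 'json', // removed: the server was not returning JSON,
    // so forcing JSON parsing made the upload report a failure
    autoUpload: true
});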
I was also getting this kind of error, and it took me a whole day to find the cause of the problem. I changed a lot of things along the way, among them:
I explicitly set the "upload_tmp_dir" directive in php.ini - it was irrelevant.
I checked and increased the values of both the "upload_max_filesize" and "post_max_size" directives in php.ini - they were also not the cause.
I created all the folders in my document root directory, "images/" and "images/temp/" - this didn't help either, though the folders are necessary.
The problem was that there was a separate route for the AJAX POST request for the preliminary image upload (a Silex/Symfony application), and in the example I copied, the author (deliberately) left out the "url" option from the request, as in:
$('#fileupload').fileupload({
dataType: 'json',
url: '/post/upload', // this was missing
replaceFileInput: false,
fileInput: $('input:file'), ...
At the same time, the form this element belonged to had its own route set to "/post/", and with the "url" option missing, the plug-in evidently took the URL from the parent form.
As a result, no controller was called, the uploaded file went unprocessed, and the text of the error was misleading.
PS. This "auto substitution" by the plug-in might also affect the request method (PUT or POST), as I used both in my controllers - "PUT" for the form and "POST" for the file upload handling - but I didn't check.
I tried reviewing all the available messages in the console, and I found the error this way:
always: function(e, data) {
    console.log(data.jqXHR.responseJSON.files[0].error);
}
In my case, I got the error because the file I was uploading exceeded the upload_max_filesize directive in php.ini.
Note that the following won't give you the exact error:
console.log(data.messages);
Make sure your form only has the inputs that AWS expects. Do not add extra form fields of your own; or if you do, remove them with JavaScript before the AWS upload begins, as in the sketch below.
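For example (a sketch; the extra field name is hypothetical, and form and data are as in the question's code):

// Strip any field S3 doesn't expect before submitting the upload
form.find('input[name="my_custom_field"]').remove();
data.submit();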