I've successfully implemented the uploading of directory structures with Fine Uploader using the dragAndDrop reportDirectoryPaths option. Each uploaded file has its qqpath property to signify the path from which it came.
var exampleUploader = new qq.FineUploader({
    element: document.getElementById('fine-uploader'),
    template: 'qq-template-manual-trigger',
    request: {
        endpoint: '/api/UploadDocuments',
        params: {
            param1: "Test1",
            param2: "Test2"
        }
    },
    callbacks: {
        onComplete: function (id, name, responseJSON, xhr) {
            console.log("onComplete");
        },
        // onAllComplete receives arrays of succeeded and failed file IDs
        onAllComplete: function (succeeded, failed) {
            console.log("onAllComplete");
        }
    },
    multiple: true,
    dragAndDrop: {
        reportDirectoryPaths: true
    },
    autoUpload: true,
    debug: false
});
There is, however, one problem: the files are uploaded one by one, but the ajax requests sometimes fire at the same time. If two files with the same directory structure reach the service at the exact same moment, the directories might be created twice. Is there a way to issue each ajax request only after the previous one succeeds? In other words, is there a way to upload the files sequentially rather than concurrently (instead of a whole batch of files being passed into the service at once)?
The problem you are seeing is an issue with your server, not with anything client/browser-side. It's shortsighted to limit your frontend to one request at a time; that creates a completely unnecessary bottleneck. Your server should sort this out.
One approach is to key uploaded files by the UUID assigned by Fine Uploader and sort out the storage hierarchy later. If you don't want to trust the UUID supplied by Fine Uploader, you can always generate your own server-side and return it with the response; Fine Uploader will use that value instead for all subsequent requests related to that specific file.
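For example, with a traditional upload endpoint the override can be expressed through the documented newUuid response property. A sketch of the JSON reply (the GUID value is just a placeholder):

{
    "success": true,
    "newUuid": "aa112233-4455-6677-8899-aabbccddeeff"
}

Fine Uploader then uses the returned value as that file's UUID for every follow-up request, so the server-side key and the client-side key stay in sync.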
Another approach is for your server to simply check whether the directory for the target file already exists before creating it.
Keep in mind that this "qqPath" property you are depending on only exists in Chrome/Opera, and only when a directory is dropped.
If you want to upload files one by one using Fine Uploader, the easiest way is to limit the maximum number of connections to 1.
var exampleUploader = new qq.FineUploader({
    element: document.getElementById('fine-uploader'),
    template: 'qq-template-manual-trigger',
    request: {
        endpoint: '/api/UploadDocuments',
        params: {
            param1: "Test1",
            param2: "Test2"
        }
    },
    callbacks: {
        onComplete: function (id, name, responseJSON, xhr) {
            console.log("onComplete");
        },
        // onAllComplete receives arrays of succeeded and failed file IDs
        onAllComplete: function (succeeded, failed) {
            console.log("onAllComplete");
        }
    },
    maxConnections: 1, // upload one file at a time
    multiple: true,
    dragAndDrop: {
        reportDirectoryPaths: true
    },
    autoUpload: true,
    debug: false
});
By default, the maxConnections parameter is 3.
I can find plenty of documentation on how to use chunked file uploads with various APIs and libraries, but I am unable to find how to use Dropzone.js's chunked file upload with just plain PHP.
The documentation is very minimal. I cannot add any libraries or APIs other than jQuery on the client side (JavaScript).
My question is: how do you use Dropzone.js's new chunked file upload feature with PHP only on the server side? (Client-side code is appreciated for setup.)
Here is the code I've attempted so far:
Client-side .js file:
var myDropzone = new Dropzone("div#formDiv", {
    url: "uploadform.php",
    params: function (files, xhr, chunk) {
        if (chunk) {
            // extra POST fields sent along with each chunk
            return {
                dzUuid: chunk.file.upload.uuid,
                dzChunkIndex: chunk.index,
                dzTotalFileSize: chunk.file.size,
                dzCurrentChunkSize: chunk.dataBlock.data.size,
                dzTotalChunkCount: chunk.file.upload.totalChunkCount,
                dzChunkByteOffset: chunk.index * this.options.chunkSize,
                dzChunkSize: this.options.chunkSize,
                dzFilename: chunk.file.name
            };
        }
    },
    method: "post",
    timeout: 600000,
    maxFilesize: 1024, // in megabytes
    parallelUploads: 1,
    chunking: true,
    forceChunking: true,
    chunkSize: 1000000,
    parallelChunkUploads: true,
    retryChunks: true,
    retryChunksLimit: 3,
    chunksUploaded: function (file, done) {
        // All chunks have uploaded successfully
        done();
    },
    error: function (file, message, xhr) {
        alert(message);
    }
});
Here is the PHP (server):
foreach ($_POST as $key => $value)
{
    _log('key:' . $key . '; Value:' . $value);
}
The above code shows nothing (_log() just echoes it to the screen and logs it in a text file).
I looked at the send/receive headers and it only sends one call to the server.
I've verified that the file drag/drop zone is setup correctly by Dropzone.js using the developer tools console in the browser.
Edit: Here is the documentation for chunked uploads: https://gitlab.com/meno/dropzone/wikis/faq#chunked-uploads
Chunked uploads
Dropzone offers the possibility to upload files in chunks. The relevant configuration options for this feature are:
chunking, which should be set to true
forceChunking, if true, will always send a file in chunks, even if it is only one chunk
chunkSize, in bytes
parallelChunkUploads, if true, the chunks will be uploaded simultaneously
retryChunks, if true, the library will retry to upload a chunk if it fails
retryChunksLimit, defaults to 3
Then there are two important callbacks. The first one is params, which can be a function that receives files, xhr and chunk as arguments. If chunking is enabled, you know that files only contains that one file, and chunk is the object holding all the information about the current chunk. Example:
var chunk = {
file: file,
index: 0,
status: Dropzone.UPLOADING,
progress: 0.4
}
See the documentation for that parameter for more information, or look at the source code for the default implementation.
The second important callback is chunksUploaded, which gets the file that finished uploading and the done function as the second argument. Do whatever you need to do in that function to tell the server that the file finished uploading, and invoke the done() function when ready.
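As a rough sketch of that second callback (my own illustration, not from the docs: the completed flag and the second request to uploadform.php are assumptions about one possible server contract):

chunksUploaded: function (file, done) {
    // Ask the server to assemble the stored parts, then signal Dropzone.
    $.post("uploadform.php", {
        dzuuid: file.upload.uuid,
        dztotalchunkcount: file.upload.totalChunkCount,
        completed: 1 // hypothetical flag the PHP script checks before reassembling
    }).done(function () {
        done(); // tells Dropzone the whole file has been processed
    });
}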
I found that I had a number of problems. One was that I wasn't using the most recent version of Dropzone.js.
Another was that I wasn't checking $_FILES and $_POST for the variables and file parts.
Once I got those fixed, I changed the JavaScript to:
var myDropzone = new Dropzone("div#formDiv", {
    url: "uploadform.php",
    method: "post",
    timeout: 180000,
    maxFilesize: 1024,
    parallelUploads: 1,
    chunking: true,
    forceChunking: true,
    chunkSize: 256000,
    parallelChunkUploads: true,
    retryChunks: true,
    retryChunksLimit: 3
});
By removing the params function, I get the default values. These values include the chunk index and the maximum number of chunks.
From there I was able to get the chunked files in the PHP script using:
$_FILES['file']['name']
$_FILES['file']['tmp_name']
$_POST['dzchunkindex']
$_POST['dztotalchunkcount']
After that, it was just a matter of checking in the PHP script that all the file parts had been uploaded and then re-assembling them, which is handled in many tutorials that are easy to find on the internet.
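For completeness, a minimal sketch of that reassembly step (assumptions: the chunk arrives in $_FILES['file'], the default dzuuid/dzchunkindex/dztotalchunkcount fields are posted, and a writable uploads/ directory exists):

<?php
// Store each incoming chunk under a per-file temp directory.
$uuid  = basename($_POST['dzuuid']);
$index = (int) $_POST['dzchunkindex'];
$total = (int) $_POST['dztotalchunkcount'];

$chunkDir = sys_get_temp_dir() . '/chunks_' . $uuid;
if (!is_dir($chunkDir)) {
    mkdir($chunkDir, 0700, true);
}
move_uploaded_file($_FILES['file']['tmp_name'], $chunkDir . '/' . $index . '.part');

// With parallelChunkUploads the last request is not necessarily the last
// chunk, so count the stored parts instead of trusting the index.
if (count(glob($chunkDir . '/*.part')) === $total) {
    $out = fopen('uploads/' . basename($_FILES['file']['name']), 'wb');
    for ($i = 0; $i < $total; $i++) {
        fwrite($out, file_get_contents($chunkDir . '/' . $i . '.part'));
        unlink($chunkDir . '/' . $i . '.part');
    }
    fclose($out);
    rmdir($chunkDir);
}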
I used the "blobProperties -> name" property for the fine-uploader as stated in the following answer: https://stackoverflow.com/a/36453417/6746874
It is supposed to set the name for the uploaded file to the result from a function (if a function is set to this property).
So I created a function to get called, which calls a function from an asp.net controller where I create a GUID.
Here is a code example for my javascript code:
blobProperties: { // set each file's name to a server-generated GUID
    name: function (fileId) {
        return new Promise(function (resolve, reject) {
            // retrieve the file name for this file from the controller
            $.ajax({
                type: "GET",
                url: "/controller/action",
                headers: { "cache-control": "no-cache" },
                success: function (result) {
                    uploader.setUuid(fileId, result);
                    resolve(result);
                },
                error: function (error) { // $.ajax takes "error", not "failure"
                    alert("Failure");
                    reject(error);
                }
            });
        });
    }
}
It calls the action via ajax, and if successful it sets the current file's UUID and name to the returned value.
This works great, but only if the file is not chunked.
If the file is chunked, the action gets called multiple times, and for every chunk a new GUID is created as the file name and UUID. This ends in an invalid state, because when Fine Uploader tries to combine the chunks, Azure returns error code 400 with the message: The specified block list is invalid.
My question is: is it intentional behaviour that this is called for every chunk? If yes, how can I prevent it from being called multiple times per file?
P.S. The linked answer states that it should only be called once.
I want to create/save a .json file locally in ExtJS with information from the DOM. Usually, to POST, DELETE, GET or PUT .json query packets, the following method is used:
Ext.Ajax.request({
    url: GlobalInfo.apiURL + 'api/grades/postsub',
    method: 'POST',
    params: {
        SubjectName: newSubjectName,
        SkillID: newSubjSkill
    },
    success: function () {
        Ext.MessageBox.alert('Status', 'Success');
    },
    failure: function () {
        Ext.Msg.alert('Status', 'You failed me! :o');
    }
});
Is there a similar method I can use to create a local .json file on the hard drive of the user when they click a 'Save' button for example?
Create a store/model which uses a client proxy. Ext has a few built-in client proxies:
Local storage
In memory
Session storage
If none of these work for your target browsers, then you will likely need to write your own.
The Sencha docs on proxies should help.
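Note that browsers will not let a page silently write a .json file to the user's hard drive, so client-side storage is the closest equivalent. A minimal sketch with the localstorage proxy (assuming ExtJS 4+; the Grade model name is made up for illustration):

Ext.define('Grade', {
    extend: 'Ext.data.Model',
    fields: ['SubjectName', 'SkillID'],
    proxy: {
        type: 'localstorage', // built-in client proxy backed by window.localStorage
        id: 'grades'
    }
});

// e.g. in the Save button handler:
var grade = Ext.create('Grade', {
    SubjectName: newSubjectName,
    SkillID: newSubjSkill
});
grade.save(); // persists locally; no server request is made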
I've been beating my head over this for quite a while now, so I think it's time I reach out for help. I have some already existing code that uses the jQuery File Uploader plugin, allowing me to upload files to my webserver. The trouble I am having is listing files that already exist on the web server.
Here is my initialization code that runs at the client:
$('#fileupload').fileupload({
    disableImageResize: false,
    url: '/api/upload',
    done: function (e, data) {
        // data is checked here
    }
});
// Load existing files:
$('#fileupload').addClass('fileupload-processing');
$.ajax({
    url: $('#fileupload').fileupload('option', 'url'),
    dataType: 'json',
    context: $('#fileupload')[0],
    data: { action: "FileList", blob: "uts", path: "Unit 14/Binaries/" }
}).always(function (e, data) {
    $(this).removeClass('fileupload-processing');
}).done(function (result) {
    $(this).fileupload('option', 'done')
        .call(this, $.Event('done'), { result: result });
});
Now, I am trying to return a list of pre-existing files from the server side that matches the JSON response format shown in the documentation. My ASP.NET code on the server side is as follows (with two bogus files called "Something" and "SomethingElse", using my FilesStatus class):
// Get a list of files from the server
private void FileList(HttpContext hc)
{
    var serializer = new System.Web.Script.Serialization.JavaScriptSerializer();
    List<FilesStatus> fs_list = new List<FilesStatus>();
    fs_list.Add(new FilesStatus("Something", 124));
    fs_list.Add(new FilesStatus("SomethingElse", 124));
    HttpContext.Current.Response.AddHeader("Pragma", "no-cache");
    HttpContext.Current.Response.AddHeader("Cache-Control", "private, no-cache");
    hc.Response.AddHeader("Content-Disposition", "inline; filename=\"files.json\"");
    var result = new { files = fs_list.ToArray() };
    hc.Response.Write(serializer.Serialize(result));
    hc.Response.ContentType = "application/json";
    HttpContext.Current.Response.StatusCode = 200;
}
In the "done" function of the AJAX code, I see what I believe is the proper response on the client side. Here, you can see the format in my Javascript debugger (i.e., a top level "files" that is an array):
These files do not get populated into the file list, though. The code I marked "// data is checked here" in the main done() function shows that the array can be accessed as data.result.files, NOT data.files. I can change .call(this, $.Event('done'), { result: result }); to .call(this, $.Event('done'), { files: result.files }); so that data.files is the location of the file array, but this does not solve the problem. I can't seem to get any pre-existing files to load into the list.
Does anyone see what I am doing wrong? Happy holidays.
What happens when you change the line:
hc.Response.Write(serializer.Serialize(result));
into
hc.Response.Write(serializer.Serialize(fs_list.ToArray()));
It looks like the serializer is taking the variable name into account when you are serializing your file descriptions. The 'result' JSON object should disappear from the response.
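To make that concrete, a sketch of the two response bodies (assuming FilesStatus serializes to name/size pairs):

// serializer.Serialize(result) wraps the array in an object:
{ "files": [ { "name": "Something", "size": 124 }, { "name": "SomethingElse", "size": 124 } ] }

// serializer.Serialize(fs_list.ToArray()) yields a bare array:
[ { "name": "Something", "size": 124 }, { "name": "SomethingElse", "size": 124 } ]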
I was overwriting the done() method where I have:
done: function (e, data) {
    // data is checked here
}
I did this for debugging, but apparently it prevents the pre-existing file list from being loaded and the download template from being called.
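A sketch of a less intrusive way to keep that debugging hook: the plugin also exposes its callbacks as widget events, so a fileuploaddone listener fires without replacing the default done handler.

$('#fileupload').fileupload({
    disableImageResize: false,
    url: '/api/upload'
    // no "done" option, so the default download-template rendering still runs
});

// Listen to the widget event instead of overriding the callback option.
$('#fileupload').on('fileuploaddone', function (e, data) {
    console.log(data.result); // inspect the response without blocking the UI
});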
There is a great multiple-file upload script out there, totally Flash-free:
http://valums.com/ajax-upload/
Now, while it works for me, I'd like some code to be executed when the upload is done. I think the script provides this functionality; in the examples it has this:
onComplete: function(id, fileName, responseJSON){}
but I'm not sure how I'd use it - I just need to have some code executed after the upload is successfully finished.
Sorry if it's a script-specific question, looked to me like it may be general js knowledge.
The {} denotes the function body. Here it is expanded:
onComplete: function(id, fileName, responseJSON){
//this is where you do something on completion.
//you have access to the id, the filename and the JSON.
}
So, for example:
var uploader = new qq.FileUploader({
    element: document.getElementById('file-uploader'),
    action: '/server-side.upload',
    // additional data to send, name-value pairs
    params: {
        param1: 'value1',
        param2: 'value2'
    },
    onComplete: function (id, fileName, responseJSON) {
        alert(fileName + ' was successfully uploaded');
    }
});
When the upload has completed, the uploader will execute the onComplete function, passing in the id, the file name and the parsed JSON response.
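Since the server's reply arrives as responseJSON, the completion handler can branch on it. A small sketch (the success property follows the {"success": true} convention this uploader's examples use; the redirect URL is made up):

onComplete: function (id, fileName, responseJSON) {
    if (responseJSON.success) {
        // run whatever should happen after a successful upload,
        // e.g. refresh a file list or redirect
        window.location.href = '/upload-finished';
    } else {
        alert('Upload of ' + fileName + ' failed');
    }
}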