I've been beating my head over this for quite a while now, so I think it's time I reach out for help. I have some already existing code that uses the jQuery File Uploader plugin, allowing me to upload files to my webserver. The trouble I am having is listing files that already exist on the web server.
Here is my initialization code that runs at the client:
$('#fileupload').fileupload({
    disableImageResize: false,
    url: '/api/upload',
    done: function (e, data) {
        // data is checked here
    }
});
// Load existing files:
$('#fileupload').addClass('fileupload-processing');
$.ajax({
    url: $('#fileupload').fileupload('option', 'url'),
    dataType: 'json',
    context: $('#fileupload')[0],
    data: { action: "FileList", blob: "uts", path: "Unit 14/Binaries/" }
}).always(function (e, data) {
    $(this).removeClass('fileupload-processing');
}).done(function (result) {
    $(this).fileupload('option', 'done')
        .call(this, $.Event('done'), { result: result });
});
Now, I am trying to return a list of pre-existing files from the server side that matches the JSON response format shown in the documentation. My ASP.NET code on the server side is as follows (it returns two bogus files called "Something" and "SomethingElse" using my FilesStatus class).
// Get a list of files from the server
private void FileList(HttpContext hc)
{
    var serializer = new System.Web.Script.Serialization.JavaScriptSerializer();
    List<FilesStatus> fs_list = new List<FilesStatus>();
    fs_list.Add(new FilesStatus("Something", 124));
    fs_list.Add(new FilesStatus("SomethingElse", 124));
    // Use the passed-in context consistently instead of mixing it with HttpContext.Current
    hc.Response.AddHeader("Pragma", "no-cache");
    hc.Response.AddHeader("Cache-Control", "private, no-cache");
    hc.Response.AddHeader("Content-Disposition", "inline; filename=\"files.json\"");
    hc.Response.ContentType = "application/json";
    var result = new { files = fs_list.ToArray() };
    hc.Response.Write(serializer.Serialize(result));
    hc.Response.StatusCode = 200;
}
In the "done" function of the AJAX code, I see what I believe is the proper response on the client side. Here, you can see the format in my Javascript debugger (i.e., a top level "files" that is an array):
These files do not get populated into the file list, though. The code that I marked "// data is checked here" in the main done() function shows that the array can be accessed as "data.result.files", NOT "data.files". I can change ".call(this, $.Event('done'), { result: result });" to ".call(this, $.Event('done'), { files: result.files });" so that "data.files" is the location of the file array, but this does not solve the problem. I can't seem to get any pre-existing files to load into the list.
Does anyone see what I am doing wrong? Happy holidays.
What happens when you change the line:
hc.Response.Write(serializer.Serialize(result));
to:
hc.Response.Write(serializer.Serialize(fs_list.ToArray()));
It looks like the serializer is including the wrapper object when serializing your file descriptions; serializing the array directly makes that extra level disappear from the response.
I was overwriting the done() callback where I have:
done: function (e, data) { /* data is checked here */ }
I did this for debugging, but apparently it blocks the pre-existing file list from being loaded and the download template from being rendered.
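If you still want a debugging hook without losing the default behaviour, one option is to wrap the plugin's default done callback rather than replace it. This is only a sketch: composeDone is a hypothetical helper, not plugin API.

```javascript
// Hypothetical helper: run a debug hook first, then delegate to the default 'done'
function composeDone(defaultDone, debugHook) {
    return function (e, data) {
        debugHook(e, data);                      // inspect the response first
        return defaultDone.call(this, e, data);  // then let the default render the file list
    };
}

// Usage sketch: grab the default handler before overriding the option
// var defaultDone = $('#fileupload').fileupload('option', 'done');
// $('#fileupload').fileupload('option', 'done',
//     composeDone(defaultDone, function (e, data) { console.log(data.result.files); }));
```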
I'm trying to figure out a way to rename a file after upload using dropzone.js, so that I can later delete it just by sending the correct name.
What I have right now:
this.on("success", function(file, responseText) {
console.log(responseText); // responseText contains actual file name after server modifications
});
addRemoveLinks: true,
this.on("removedfile", function(file) {
var name = file.name;
$.ajax({
type: 'POST',
url: 'delete_file.html',
data: {
'file-name': name,
},
});
});
As you can see, in my AJAX data I am sending the initial file name, not the one that is actually on the server (e.g. if there are multiple files with the same name, the server will rename them).
I have been thinking of changing the previewElement name on success:
file.previewElement.querySelector(".name").textContent = responseText;
and then referring to this in the AJAX call, but it doesn't look like an elegant approach.
Another alternative would be to create a map with <file, new_name> entries, but I'm not sure that isn't overkill.
How would you recommend accessing the new file name after upload?
The cleanest option I came up with is using the file.xhr.response value, which holds the new name, instead of file.name.
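A minimal sketch of that idea, assuming the server replies with the final file name as the response body (resolveServerName is a hypothetical helper, not Dropzone API):

```javascript
// Prefer the server-assigned name from the raw XHR response; fall back to the
// original name for files that were never successfully uploaded.
function resolveServerName(file) {
    return (file.xhr && file.xhr.response) || file.name;
}

// Usage sketch inside the Dropzone init:
// this.on("removedfile", function (file) {
//     $.post("delete_file.html", { "file-name": resolveServerName(file) });
// });
```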
I want a local file (on the server) to be downloaded by the user. The user first kicks off the file creation by pressing a button, and once the file is ready, he should be able to click a link or a button to download the file.
Creating the file has not been a problem, as I simply send an AJAX call to my backend, which looks like:
@POST
@Path("/createFile")
@Produces("application/text")
@Consumes("application/json")
public String createFile(String argsFromPage) {
    /* File creation code here */
    return "Path of file created";
}
Now that the file is created, all I want is to create a link the user can click to download it. For now, the file can be either a binary or a CSV file. I have made several attempts, but without any success:
<button onclick='create_file()'>Create</button>
function create_file() {
    $.ajax({
        method: "POST",
        url: ".path/to/backend/service",
        contentType: "application/json",
        data: JSON.stringify({
            param1: val1
        })
    }).done(function(data) {
        console.log(data);
    });
}
Now, once the file has been created, is it possible to create a download link? Better still, is it possible to invoke the download as soon as the file is created? Should this be done in the browser, or on the back end?
Follow Up
Once the file has been downloaded, how can I delete it from the server? Is there any way to ensure that the file download has completed?
To create a link to the file you can just create an a element in the DOM within the done() handler. Try this:
function create_file() {
    $.ajax({
        method: "POST",
        url: ".path/to/backend/service",
        contentType: "application/json",
        data: JSON.stringify({ param1: val1 }) // I assume 'val1' is declared in a higher scope?
    }).done(function(path) {
        $('#someContainer').append($('<a></a>').attr('href', path).text('Click here to download'));
    });
}
Note that the anchor's href is built from the path returned by the server. Also note that it would be better to return JSON from the AJAX request, as it avoids issues with whitespace, although the above should still work given the code sample you provided.
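If the returned path might contain characters that are special in HTML, it's safer to escape it before building markup. A sketch with a hypothetical helper (downloadLinkHtml is not part of the answer's code):

```javascript
// Build the download link markup from the returned path, escaping HTML-special characters
function downloadLinkHtml(path) {
    var escaped = String(path)
        .replace(/&/g, "&amp;")
        .replace(/"/g, "&quot;")
        .replace(/</g, "&lt;");
    return '<a href="' + escaped + '" download>Click here to download</a>';
}

console.log(downloadLinkHtml("files/report.csv"));
// <a href="files/report.csv" download>Click here to download</a>
```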
I used the "blobProperties -> name" property for the fine-uploader as stated in the following answer: https://stackoverflow.com/a/36453417/6746874
It is supposed to set the name of the uploaded file to the result of a function (if a function is assigned to this property).
So I created a function that calls an ASP.NET controller action where I create a GUID.
Here is my JavaScript code:
blobProperties: { // Set the names of the files to a server-generated GUID
    name: function (fileId) {
        return new Promise(function (resolve) {
            // retrieve file name for this file from controller
            $.ajax({
                type: "GET",
                url: "/controller/action",
                headers: { "cache-control": "no-cache" },
                success: function (result) {
                    uploader.setUuid(fileId, result);
                    resolve(result);
                },
                error: function (error) { // note: $.ajax uses 'error', not 'failure'
                    alert("Failure");
                }
            });
        });
    }
}
It calls the action via AJAX and, if successful, sets the current file's UUID and name to the returned value.
This works great, but only if the file is not chunked.
If the file is chunked, the action gets called multiple times, and for every chunk a new GUID is created as the filename and UUID. So it gets into an invalid state, because when Fine Uploader tries to combine the chunks, Azure returns error code 400 with the message: The specified block list is invalid.
My question is: is this behaviour, calling it for every chunk, intentional? If yes, how can I prevent it from being called multiple times per file?
P.S. The linked answer states that it should only be called once.
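One hedged workaround, if the name function really is invoked once per chunk, is to cache the result per file so the server round-trip happens only once (nameForFile and its cache are hypothetical helpers, not Fine Uploader API):

```javascript
// Cache the server-generated name per fileId so repeated calls reuse the first result
var namePromises = {};
function nameForFile(fileId, fetchName) {
    if (!(fileId in namePromises)) {
        namePromises[fileId] = fetchName(fileId); // only the first call hits the server
    }
    return namePromises[fileId];
}

// Usage sketch inside blobProperties.name:
// name: function (fileId) { return nameForFile(fileId, fetchGuidFromController); }
```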
I've successfully implemented the uploading of directory structures with Fine Uploader using the dragAndDrop: reportDirectoryPaths option. Each file that is uploaded has its qqPath property to signify the path from which it came.
var exampleUploader = new qq.FineUploader({
    element: document.getElementById('fine-uploader'),
    template: 'qq-template-manual-trigger',
    request: {
        endpoint: '/api/UploadDocuments',
        params: {
            param1: "Test1",
            param2: "Test2"
        }
    },
    callbacks: {
        onComplete: function (id, name, responseJSON, xhr) {
            console.log("onComplete");
        },
        onAllComplete: function (id, name, responseJSON, xhr) {
            console.log("onAllComplete");
        }
    },
    multiple: true,
    dragAndDrop: {
        reportDirectoryPaths: true
    },
    autoUpload: true,
    debug: false
});
There is, however, one problem: the files are uploaded one by one, but the AJAX requests sometimes fire at the same time. If two files with the same directory structure are passed to the service at the exact same time, the directories might be created twice. Is there a way to only start the next AJAX request on success of the previous one? In other words, is there a way to upload the files sequentially, rather than all at once (a whole bunch of files being passed into the service at the same time)?
Thanks in advance,
aj
The problem you are seeing is an issue with your server, not with anything client/browser-side. It's shortsighted to limit your frontend to one request at a time; this presents a completely unnecessary bottleneck. Your server should sort all of this out.
One approach is to key uploaded files by the UUID assigned by Fine Uploader and sort out the storage hierarchy later. If you don't want to trust the UUID supplied by Fine Uploader, you can always generate your own server-side, return it with the response, and Fine Uploader will use it for all other requests related to that specific file.
Another approach is for your server to simply check whether the directory already exists for the target file.
Keep in mind that this "qqPath" property you are depending on only exists in Chrome/Opera, and only when a directory is dropped.
If you want to upload files one by one using Fine Uploader, the easiest way is to limit the maximum number of connections to 1.
var exampleUploader = new qq.FineUploader({
    element: document.getElementById('fine-uploader'),
    template: 'qq-template-manual-trigger',
    request: {
        endpoint: '/api/UploadDocuments',
        params: {
            param1: "Test1",
            param2: "Test2"
        }
    },
    callbacks: {
        onComplete: function (id, name, responseJSON, xhr) {
            console.log("onComplete");
        },
        onAllComplete: function (id, name, responseJSON, xhr) {
            console.log("onAllComplete");
        }
    },
    maxConnections: 1,
    multiple: true,
    dragAndDrop: {
        reportDirectoryPaths: true
    },
    autoUpload: true,
    debug: false
});
By default, the maxConnections parameter is 3.
I'm using Google App Engine for a backend service, and I'm trying to upload a file using an AJAX POST and their Blobstore API. I got that part working. If you are not familiar with the service, it's quite simple: uploading via the Blobstore API is a two-step process. You need to get an upload URL and then upload to that URL.
Now, I'm implementing an editor, medium.com-like.
The thing is, this plugin needs an endpoint for the upload. As my endpoint is not static and I need to update that URL each time, I have prepared an API in the backend that responds with a JSON file containing that URL. I'm trying to do an AJAX request to get that URL, but I'm getting an error, as the POST request is made to a bad URL.
This is the POST request:
INFO 2014-10-19 08:58:22,355 module.py:659] default: "POST /admin/%5Bobject%20Object%5D HTTP/1.1" 200 2594
And this is my JavaScript code:
function getURL(callback) {
    return $.ajax({
        type: "GET",
        url: "/admin/upload_url",
        dataType: "json",
        success: callback
    });
}
$('.editable').mediumInsert({
    editor: editor,
    addons: {
        images: {
            imagesUploadScript: getURL().done(function(json) { return json['url']; })
        },
        embeds: {
            oembedProxy: 'http://medium.iframe.ly/api/oembed?iframe=1'
        }
    }
});
I guess I'm doing something wrong with the AJAX return value, but if I console.log it, I get the result I want. I've read this answer and tried to apply it, but I didn't manage to get it working.
Thanks for your time and your help! :)
If someone ever has the same problem, this is the way I solved it. If you are reading this and you know a better one, please share; every bit of help is appreciated.
var url; // Set a global variable

// Define the AJAX call
function AJAXURL() {
    return $.ajax({
        type: "GET",
        url: "/admin/upload_url",
        success: function(response) {
            // Sets the global variable
            url = response['url'];
        }
    });
}
// Gets a first upload URL via an AJAX call while everything else keeps loading
AJAXURL();
$('#editable').mediumInsert({
    editor: editor,
    addons: {
        images: {
            imagesUploadScript: function getURL() {
                // makes a request to grab a new url
                AJAXURL();
                // but returns the old url in the meanwhile
                return url;
            }
        },
        embeds: {
            urlPlaceholder: 'YouTube or Vimeo Link to video',
            oembedProxy: 'http://medium.iframe.ly/api/oembed?iframe=1'
        }
    }
});
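A small refinement of the same idea that avoids the global variable: keep the latest URL in a closure and refresh it after every read. This is only a sketch; makeUrlProvider is a hypothetical helper, and fetchUrl stands in for the AJAX call, taking a callback.

```javascript
// Keep the most recent upload URL in a closure, refreshing it after each use
function makeUrlProvider(fetchUrl) {
    var current = null;
    function refresh() {
        fetchUrl(function (url) { current = url; }); // store whatever comes back
    }
    refresh(); // prime the first URL up front
    return function () {
        var url = current; // hand out the URL fetched earlier
        refresh();         // and immediately request a fresh one for the next upload
        return url;
    };
}

// Usage sketch:
// var nextUploadUrl = makeUrlProvider(function (cb) {
//     $.getJSON('/admin/upload_url', function (response) { cb(response['url']); });
// });
// ... imagesUploadScript: nextUploadUrl ...
```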