I'm using dropzone.js to upload certain files to my server. The problem is that sometimes the server isn't able to keep up with the connections and refuses some uploads, so they fail and get marked red with an x. I would like to automatically retry after a certain amount of time, or at least give the user the ability to restart the upload manually.
Is there a built-in feature in dropzone.js for this, an easy enough way to implement it myself, or is there a better tool for this kind of upload (drag/drop, preview, ajax, etc.)?
My solution doesn't change the Dropzone library and is only a few lines long. I try the file upload twice, because the first (failed) request sets a new cookie-based CSRF token:
var isFirstTry = true;

myDropzone = new Dropzone(document.body, {
    init: function() {
        this.on("sending", function(file, xhr, formData) {
            xhr.setRequestHeader("X-CSRF", Cookies.get('CSRF'));
        });
    },
    error: function(file, errorMessage, xhr) {
        if (errorMessage && errorMessage.status === 405 && file && isFirstTry) {
            isFirstTry = false;
            // remove the failed item from the preview
            this.removeFile(file);
            // duplicate the File object so the "complete" hook doesn't touch the new preview element
            file = new File([file], file.name, { type: file.type });
            this.uploadFile(file);
        }
    },
    // other configs
});
After the error event, Dropzone fires the complete event no matter what the result was. On complete, Dropzone sets the status element to complete, which hides the progress bar. To prevent this, copy the File object: the complete hook then doesn't touch the new preview element.
One small modification to dropzone.js is required to make things look pretty, but otherwise it's just a directive.
My dropzone now retries until it succeeds (infinitely for now; see the sketch at the end of this answer for capping that). A little more work is required to reset the progress bars, but this should be enough to get you somewhere (if you still care about this).
The edit to dropzone.js is (in the beautified version):
success: function(file) {
    file.previewElement.classList.remove("dz-error");
    return file.previewElement.classList.add("dz-success");
}
Where I've added the remove line. This changes Xs to ticks when a file successfully uploads.
The angular directive follows:
.directive('dropZone', function($rootScope) {
    return function ($scope, element, attr) {
        var myDropZone = element.dropzone({
            url: "api/ImageUpload",
            maxFilesize: 100,
            paramName: "uploadfile",
            maxThumbnailFilesize: 5,
            autoProcessQueue: false,
            parallelUploads: 99999,
            uploadMultiple: false,
            // this is my identifier so my backend can index the images together
            params: {identifier: $scope.identifier},
            // I seem to need to do this when a file is added, otherwise it doesn't update
            init: function() {
                this.on("addedfile", function(file) { $rootScope.$digest(); });
            }
        });
        // grabbing the dropzone object and putting it somewhere the controller can reach it
        $scope.dropZone = myDropZone.context.dropzone;
        // what we use to work out if we're _really_ complete
        $scope.errors = [];
        // here is our retry mechanism
        myDropZone.context.dropzone.addEventListener("error", function(file, errorMessage, xhr) {
            // log our failure so we don't accidentally complete
            // (only once per file, so a later success can clear it again)
            if ($scope.errors.indexOf(file.name) === -1) {
                $scope.errors.push(file.name);
            }
            // retry!
            myDropZone.context.dropzone.uploadFile(file);
        });
        myDropZone.context.dropzone.addEventListener("success", function(file, response) {
            // remove from the error list once "success" (_not_ "complete")
            var idx = $scope.errors.indexOf(file.name);
            if (idx !== -1) {
                $scope.errors.splice(idx, 1);
            }
        });
        // this gets called multiple times because of our "retry"
        myDropZone.context.dropzone.addEventListener("queuecomplete", function() {
            // if we're here AND have no errors we're done
            if ($scope.errors.length === 0) {
                // this is my callback to the controller to state we're all done
                $scope.uploadComplete();
            }
        });
    };
});
Not sure if all that myDropZone.context.dropzone stuff is necessary; I kinda suck at JavaScript and spend a lot of my time console.log()ing objects and examining them in the debugger. That's where I found the dropzone component; perhaps there is an easier way?
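To cap the retries mentioned above (the directive currently retries forever), one option is a per-file attempt counter in the error handler. Here is a rough, untested sketch that would replace the error listener above (maxRetries is a made-up name):
var maxRetries = 3; // made-up limit, tune to taste

myDropZone.context.dropzone.addEventListener("error", function(file, errorMessage, xhr) {
    // count how often this particular file has failed
    file.retryCount = (file.retryCount || 0) + 1;
    if (file.retryCount <= maxRetries) {
        // log our failure so we don't accidentally complete
        if ($scope.errors.indexOf(file.name) === -1) {
            $scope.errors.push(file.name);
        }
        // retry this file
        myDropZone.context.dropzone.uploadFile(file);
    }
    // after maxRetries failed attempts the file simply stays marked as failed
});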
My situation was a little bit different than Quibblesome's, but my solution was based on their answer... so thanks Quibblesome!!!!
In my situation the ajax wasn't failing, so dropzone.addEventListener("error", function(file, errorMessage, xhr) was never getting triggered. So I changed Quibblesome's solution a bit to work even if Dropzone triggered success.
var fileObj;

clientDetailsDropZone = new Dropzone($("#clientDetailsDropZoneArea").get(0), {
    init: function()
    {
        var dropzone = this; // keep a reference, "this" is different inside the nested callbacks below
        this.on("success", function(e, response)
        {
            // IF THE USER'S auth token is expired..... retry the upload automatically
            if (!authToken.isValid)
            {
                // Get a new token and then automatically re-upload -- getFileAuthToken is another ajax call that authorizes the user
                getFileAuthToken(function f(e) {
                    dropzone.uploadFile(fileObj);
                });
            }
            else
            {   // They had a good token and the upload worked
                alert("yay your upload worked!");
            }
        });
        this.on("addedfile", function(file) {
            fileObj = file;
        });
    }
});
I am using the code below to open a link on a button click. The link points to a Controller method responsible for downloading an Excel file.
// Button to download table data
$("#btnDownloadCIRResults").click(function (e) {
    var All_Recs = $("#cbShowAllRecords").prop("checked") ? "YES" : "NO";
    DisplayStatusMessageWind("Downloading report, please wait...", MessageType.Info, true, false);
    // DownloadCIRemediationTable(string AllRecords)
    window.location = '/AskNow/DownloadCIRemediationTable?AllRecords=' + All_Recs;
    DisplayStatusMessageWind("Report downloaded successfully.", MessageType.Success, false, true, 1000);
    e.preventDefault();
});
The Controller method queries a DB table, converts it to an Excel workbook and returns a file as the download result. Everything works as expected; however, since this is a time-consuming process, I want to improve the user experience and update this code to show a wait message while the file is being downloaded.
The DisplayStatusMessageWind() method shows a wait message. However, it doesn't know or care about the load complete event of the window.location = '/AskNow/DownloadCIRemediationTable?AllRecords=' + All_Recs; code.
How can I make the completion message appear only after the file download is completed:
DisplayStatusMessageWind("Report downloaded successfully.", MessageType.Success, false, true, 1000);
By assigning a new location with window.location = "<NEWURL>"; you're asynchronously requesting that the current page be replaced. What happens is that the next line (the second DisplayStatusMessageWind() call) is executed immediately. Only when all events have been handled is the page finally replaced. The new page (URL) will then load and you'll have no control whatsoever over how or what happens next.
What you should do instead is use window.open("<NEWURL>", '_blank') and then, on the new page, send a signal via localStorage, which can be read and written by all pages of the same origin. These are just hints; writing the actual code is your job.
On this page, in the on("click") handler:
// local scope
var ukey;

// polling function
function waitOtherIsReady()
{
    // localStorage only stores strings, so compare against "true"
    if (localStorage.getItem(ukey) === "true")
    {
        // other page experienced ready event
        localStorage.removeItem(ukey); // clean-up
        // TODO: do your stuff
    } else {
        setTimeout(waitOtherIsReady, 500);
    }
}

// create unique key and deposit it in localStorage
ukey = "report_" + Math.random().toString(16);
localStorage.setItem(ukey, false);

// pass key to other page
window.open("URL?ukey=" + ukey, "_blank");

// start polling until flag is flipped to true
setTimeout(waitOtherIsReady, 500);
On the other page:
$(() => {
    // get ukey from URL
    var ukey = new URL(window.location.href).searchParams.get("ukey");
    // page is now ready, flip flag to signal ready event
    localStorage.setItem(ukey, true);
});
Can anyone tell me why this error happens and how to solve it?
When I reload the page (cmd+r) the image request works just fine.
The error from browser console:
http://xxxxx.meteor.com/cfs/files/images2/aEaqKMjLQZcDPHLS8 503 (Service Unavailable)
What happens:
Image is uploaded (GridFS)
URL is added to the user profile
img src="{{imgsrc}}" is reactively updated, which triggers an HTTP request.
The request URL is correct, but the result is 503.
Reloading the page makes the error go away.
Code for uploading image
Template.profileedit.events({
    'change .myFileInput': function(event, template) {
        var files = event.target.files;
        for (var i = 0, ln = files.length; i < ln; i++) {
            Images.insert(files[i], function (err, fileObj) {
                var userId = Meteor.userId();
                var imagesURL = {
                    "profile.image": "/cfs/files/images2/" + fileObj._id
                };
                Meteor.users.update(userId, {$set: imagesURL});
            });
        }
    }
});
Template helper
Template.profileedit.helpers({
    imgsrc: function() {
        var user = Meteor.users.findOne({"_id": Meteor.userId()});
        return Meteor.user().profile.image;
    }
});
HTML that displays the image:
<template name="profileedit">
    <img src="{{imgsrc}}" style="width:400px">
</template>
Other stuff
The problem is the same both locally and on meteor.com.
insecure and autopublish are active (these are just learning steps).
Thanks
Jesper
I would imagine that you are running into a problem where Mongo is not ready to retrieve the object yet. My guess is that the reactive update is happening before your image is actually available for retrieval, but by the time you hit refresh it is there and ready to be retrieved, which is why that workaround works.
There are a few ways around this. For example, https://github.com/CollectionFS/Meteor-CollectionFS basically allows you to set a "waiting" image which is displayed while your file is not ready and which reactively updates once your image can be retrieved. You could also do this yourself with the implementation you've shown.
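A rough, untested sketch of that waiting-image idea, based on the helper from the question: it assumes the insert callback stores fileObj._id in a (made-up) profile.imageId field instead of a hard-coded path, and that /images/waiting.png is a placeholder image you provide:
Template.profileedit.helpers({
    imgsrc: function() {
        var user = Meteor.user();
        if (!user || !user.profile || !user.profile.imageId) {
            return "/images/waiting.png"; // nothing uploaded yet
        }
        var image = Images.findOne(user.profile.imageId);
        // In CollectionFS, url() is reactive and returns null until the file
        // has actually been stored, so the helper re-runs on its own once the
        // real URL becomes available.
        var url = image && image.url();
        return url || "/images/waiting.png";
    }
});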
I have a backend with header-based authentication and I want to download a file from it. With cookie-based authentication it is simple: you render a link to the file into an <a href...> and let the browser handle the rest.
I have this code:
$http.get(url).then(function (response) {
    // do something
}, function (response) {
    alert("ERROR");
});
I have HTTP header injection set up, so in this case $http.get will populate the auth headers for us. What I want to do is feed this response to $window, so that from the user's perspective it looks like a usual file download.
Is there any way of doing this? Any other options are welcome.
I assume what you want is to push the file to the client without them needing to click anything else on the page. You can cause this to occur without needing to mess with $window at all.
Here is an example of how I pushed a zip file created in the browser that contained multiple selected regions of a canvas:
var downloader = angular.element('<a>download</a>');
downloader.attr('href', "data:application/zip;base64," + content);
downloader.attr('download', 'my.zip');

var ev = $document[0].createEvent("MouseEvent");
ev.initMouseEvent(
    "click",
    true /* bubble */, true /* cancelable */,
    window, null,
    0, 0, 0, 0, /* coordinates */
    false, false, false, false, /* modifier keys */
    0 /* left */, null
);
downloader[0].dispatchEvent(ev);
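Applied to the original question, where the response comes back from $http.get with the injected auth headers, the same anchor trick can be fed from a Blob. A rough, untested sketch (report.pdf is a placeholder file name; responseType: 'blob' and the download attribute need a reasonably modern browser):
$http.get(url, { responseType: 'blob' }).then(function (response) {
    // wrap the binary response in an object URL the browser can download from
    var objectUrl = (window.URL || window.webkitURL).createObjectURL(response.data);

    var downloader = angular.element('<a>download</a>');
    downloader.attr('href', objectUrl);
    downloader.attr('download', 'report.pdf'); // placeholder file name
    document.body.appendChild(downloader[0]);
    downloader[0].click(); // current browsers accept a plain click()
    document.body.removeChild(downloader[0]);

    (window.URL || window.webkitURL).revokeObjectURL(objectUrl);
}, function (response) {
    alert("ERROR");
});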
I am using fine uploader to handle the uploading of files in a web application I have. Is there some sort of callback for when the last file has finished processing? I found the onComplete callback, but this is fired when every file completes. I need to know when all files are done. Does anyone know of a way to do this with fine uploader?
I don't think fine uploader provides what you're asking for, but it's easy to do this yourself. You can increment a count in the onSubmit callback, and decrement it in the onComplete callback. When the count reaches 0 in onComplete, that means all the files have processed.
Ended up going this route:
var uploader = $("#upload").fineUploader({
request: {
endpoint: postUrl
},
template: uploadTemplate
}).on('submit',function(){
fileCount++;
}).on('complete', function (event, id, name, responseJSON) {
fileCount--;
if(fileCount == 0){
alert("done!");
}
});
I'm working on an application that allows the user to send a file using a form (a POST request), and that executes a series of GET requests while that file is being uploaded to gather information about the state of the upload.
It works fine in IE and Firefox, but not so much in Chrome and Safari.
The problem is that even though send() is called on the XMLHttpRequest object, nothing is being requested as can be seen in Fiddler.
To be more specific, an event handler is placed on the "submit" event of the form, which places a timeout function call on the window:
window.setTimeout(startPolling, 10);
and in this "startPolling" function a sequence is started that keeps firing GET requests to receive status updates from a web service; it returns text/json that can be used to update the UI.
Is this a limitation (perhaps security-wise?) on WebKit based browsers? Is this a Chrome bug? (I'm seeing the same behaviour in Safari though).
I am having the exact same problem. At the moment I use an iframe, which the form is targeted at. That allows the XHR requests to be executed while posting. While that does work, it doesn't degrade gracefully if someone disables JavaScript (I couldn't load the next page outside the iframe without JS). So if someone has a nicer solution, I would be grateful to hear it.
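For reference, the hidden-iframe workaround boils down to posting the form into an invisible iframe so the main page, and its XHR polling, keeps running. A rough, untested sketch (upload-target is a made-up name):
$(function () {
    // create a hidden iframe and point the upload form at it,
    // so submitting the form no longer navigates the main page away
    var $target = $('<iframe>', {
        name: 'upload-target',
        css: { display: 'none' }
    }).appendTo('body');

    var $form = $('form[enctype="multipart/form-data"]').attr('target', 'upload-target');

    $form.submit(function () {
        // once the server response lands in the iframe, the upload is finished;
        // stop the polling here and update or redirect the main page as needed
        $target.one('load', function () {
            // e.g. window.location = nextPageUrl; // nextPageUrl is a made-up name
        });
    });
});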
Here is the jQuery script for reference:
$(function() {
    $('form[enctype="multipart/form-data"]').submit(function() {
        // Prevent multiple submits
        if ($.data(this, 'submitted')) return false;

        var freq = 500; // frequency of update in ms
        var progress_url = '{% url checker_progress %}'; // ajax view serving progress info

        $("#progressbar").progressbar({
            value: 0
        });

        $("#progress").overlay({
            top: 272,
            api: true,
            closeOnClick: false,
            closeOnEsc: false,
            expose: {
                color: '#333',
                loadSpeed: 1000,
                opacity: 0.9
            }
        }).load();

        // Update progress bar
        function update_progress_info() {
            $.getJSON(progress_url, {}, function(data, status) {
                if (data) {
                    var progress = parseInt(data.progress, 10);
                    $('#progressbar div').animate({width: progress + '%'}, {queue: false, duration: 500, easing: "swing"});
                }
                window.setTimeout(update_progress_info, freq);
            });
        }
        window.setTimeout(update_progress_info, freq);

        $.data(this, 'submitted', true); // mark form as submitted.
    });
});