I'm trying to implement an upload form and return the upload status to the user using XHR. Everything seems to be implemented correctly, but when uploading, the callbacks fire far too quickly and report a much higher percentage than has actually been transferred.
With files under ~20 MB, I get a callback almost immediately showing over 99%, while the upload continues to churn away in the background for some time.
See the screengrab below showing the console for a 74 MB file. It was taken a couple of seconds after the upload was initialised, and the upload continued for another ~60 seconds (notice just 3 callbacks registering (loaded total) (calculated percentage), with the AJAX upload still running, as shown by the throbber).
Has anyone experienced this and managed to get an accurate representation of upload status?
(The 'load' event triggers correctly after the upload process.)
Here's my code:
$(this).ajaxSubmit({
    target: '#output',
    beforeSubmit: showRequest,
    xhr: function() {
        var myXhr = $.ajaxSettings.xhr(); // declare locally; was an implicit global
        if (myXhr.upload) {
            console.log('have xhr');
            myXhr.upload.addEventListener('progress', function(ev) {
                if (ev.lengthComputable) {
                    console.log(ev.loaded + " " + ev.total);
                    console.log((ev.loaded / ev.total) * 100 + "%");
                }
            }, false);
        }
        return myXhr;
    },
    dataType: 'json',
    success: afterSuccess
});
There are several reports of the same behavior - incorrect progress reporting on file upload - caused by antivirus software scanning the files to be uploaded. My guess is that some part of the antivirus tries to compensate for the possible delay (caused by the scan) and fails to do it properly.
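If something on the client machine (the OS network stack, a proxy, or the antivirus) buffers the whole request, the progress events report bytes handed to that local buffer, not bytes received by the server. A minimal defensive sketch, assuming a hypothetical updateBar() UI helper: cap what the progress events show, and only report 100% on the load event. This goes inside the xhr factory from the question:

// Sketch only: updateBar() is a hypothetical UI helper, not from the original code.
myXhr.upload.addEventListener('progress', function(ev) {
    if (ev.lengthComputable) {
        var pct = (ev.loaded / ev.total) * 100;
        updateBar(Math.min(pct, 95)); // progress may be ahead of the wire, so cap it
    }
}, false);
myXhr.addEventListener('load', function() {
    updateBar(100); // only the load event proves the server received everything
}, false);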
I had the same issue recently. I think your AJAX call simply returns before your file finishes uploading. To work around this, load back what you uploaded and listen for its load event. For example, if you are uploading an image (using jQuery):
var loadCheck = $('<img src="' + uploadedUrl + '">').hide().appendTo('body');
loadCheck.on('load', updateProgressBar); // pass the function reference, don't invoke it
Of course you can implement this for other file types, and incorporate an $.each iteration.
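For several images, a sketch of that $.each variant (uploadedUrls is an assumed array of the URLs your server returned):

// Sketch: only report done once every uploaded image loads back from the server.
var pending = uploadedUrls.length;
$.each(uploadedUrls, function(i, url) {
    $('<img src="' + url + '">').hide().appendTo('body').on('load', function() {
        pending--;
        if (pending === 0) {
            updateProgressBar(); // every file confirmed readable on the server
        }
    });
});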
Related
I've written a JavaScript uploader which takes images and uploads them to a server. I've stripped the code down to the bare minimum, but I still get a leak in Firefox, and I can't see why.
The script obtains a list of file objects called files from an HTML form and then incrementally runs through that list, uploading each file to the server.
The JavaScript code is as follows:
function UploadFile(file) {
    var form_data = new FormData();
    form_data.append('file', file);
    $.ajax({
        url: 'Upload.php',
        dataType: 'text',
        cache: false,
        contentType: false,
        processData: false,
        data: form_data,
        type: 'post',
        success: function() {
            console.log("SUCCESS!");
            upload_counter = upload_counter + 1;
            UploadFile(files[upload_counter]); //Self calling function
            form_data = null; //Clearing form data memory
            file = null;
        },
        error: function(error) {
            console.log("ERROR!");
        }
    });
}
The function is started again by calling itself from within the success callback of the AJAX call, which keeps everything linear and simple; note how it increments to the next index in the files array to access the next file object.
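(As an aside: as written, the final success call runs UploadFile(files[upload_counter]) one index past the end of the array, posting an undefined file. A sketch of a guarded success callback, using the same globals as above:

success: function() {
    console.log("SUCCESS!");
    upload_counter = upload_counter + 1;
    if (upload_counter < files.length) {
        UploadFile(files[upload_counter]); // next file in the chain
    } else {
        console.log("All uploads finished"); // end of the chain, no extra request
    }
    form_data = null;
    file = null;
},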
The PHP code isn't relevant of course, but suffice to say it handles the upload fine.
Using the Windows memory monitor, the memory used by Firefox is as follows:
1 image -> 320 MB
10 images -> 500 MB
20 images -> 720 MB
30 images -> 960 MB
40 images -> 1.1 GB
140 images -> 1.6 GB
150 images -> 1.7 GB
Clearly this is a problem, and eventually Firefox will crash. The files are quite large (around 10 MB each), so by 150 images around 1.5 GB has been uploaded.
Why is it leaking memory?
I should add:
this doesn't happen in Chrome, Edge, or Opera; their memory usage doesn't change during upload.
I have also tested this in Firefox safe mode with all add-ons/extensions disabled.
As a result of the fact that it only occurs in Firefox, I've submitted a bug here:
https://bugzilla.mozilla.org/show_bug.cgi?id=1302657
This is just a theory, but it might be a problem with Firefox not handling the recursion in your AJAX calls correctly. Since it keeps uploading files again and again, it might not free the memory until it finishes, and if you consume all the memory before that, it will crash.
I think it might be worth trying something like this:
for (var i = 0; i < files.length; i++) { // note: files.length, not file.length
    UploadFile(files[i]);
}
And in your UploadFile function, get rid of these 2 lines:
upload_counter = upload_counter + 1;
UploadFile(files[upload_counter]); //Self calling function
Since the "for loop" will already take care of iterate through the files, and with that, you will remove the recursivity in your success callback function.
You might be worried because doing it like that will make the file upload completely asynchronous and it might end up consuming more memory, but since most web browsers limit the number of parallel HTTP requests to up to 6 connections, then based on your benchmarks, you will never be using more than 500MB of RAM (assuming that this approach works for you).
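If you want parallelism with an explicit cap instead of relying on the browser's connection limit, here is a sketch of a small pump (MAX_PARALLEL is an assumed tuning knob; files is your existing array):

// Sketch: keep at most MAX_PARALLEL uploads in flight at once.
var MAX_PARALLEL = 3; // assumed limit, tune as needed
var next = 0;
function pump() {
    if (next >= files.length) return; // queue drained
    var form_data = new FormData();
    form_data.append('file', files[next++]);
    $.ajax({
        url: 'Upload.php',
        type: 'post',
        data: form_data,
        contentType: false,
        processData: false,
        complete: pump // start the next file when this slot frees up
    });
}
for (var i = 0; i < MAX_PARALLEL; i++) pump();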
It took me days to stumble on this answer.
The problem/bug lies entirely with Firebug, the commonly used developer tool add-on. With it disabled, Firefox actually manages the memory fine.
Failed to clear temp storage: It was determined that certain files are unsafe for access within a Web application, or that too many calls are being made on file resources. SecurityError
I'm getting this error in the console. I have a script named script.js which makes AJAX calls to retrieve data from PHP.
Any idea why?
Here's my jQuery script
$(document).ready(function() {
    var loading = false;
    var docHeight = $(window).height();

    $('.timeline').css({minHeight: docHeight});

    function get_tl_post() {
        if (loading == false) {
            loading = true;
            $.ajax({
                type: "POST",
                url: "timeline.php",
                data: "data=instagram",
                beforeSend: function() {
                    $('.loader').fadeIn("slow");
                },
                complete: function() {
                    loading = false;
                    $('.loader').fadeOut("slow");
                },
                success: function(data) {
                    if (data == "error") {
                        get_tl_post();
                    }
                    $(data).hide().appendTo(".timeline").fadeIn(1000);
                }
            });
        }
    }

    $(window).scroll(function() {
        if ($(window).scrollTop() == $(document).height() - $(window).height()) {
            get_tl_post();
        }
    });
});
This is due to network mapping of your resources.
In other words, you might have added a workspace folder in your Chrome dev tools.
When you make changes to some files, the browser sends the request through to the file system. This works fine for a while; however, in some scenarios you remove your network mapping.
Then, when you try to open that web page in the browser, it might or might not ask to remap the network resources, and it will still try to update the file system.
And that's when you get this error.
There is nothing wrong with your script.
The only solution may be removing the cache and then restarting your system.
If the problem still persists, you can simply reinstall Chrome and this should be fixed.
Moreover, network mapping can sometimes cause several other issues as well.
For example, ballooning a CSS file to a whopping 75 MB or more. So take precautions when playing with network mapping.
Optionally, if you are on a Mac, or even on Windows with sh commands available:
sudo find / -type f -size +50000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
Run this in your terminal to find the culprit files over 50 MB; you can then remove them.
Note: the command above finds every individual file larger than 50 MB and prints each one to your terminal.
If I were to guess, I would say your timeline.php script is always returning "error", so you are making too many recursive calls and the browser blocks them.
Try to eliminate the recursive function call and see if that will fix the problem.
Remove the following 3 lines and try again:
if (data == "error")
{
get_tl_post();
}
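With those lines gone, the success callback would just append the response; a minimal sketch:

success: function(data) {
    $(data).hide().appendTo(".timeline").fadeIn(1000); // no recursive retry
}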
If your AJAX call fails for some reason, this could lead to too many recursive calls of get_tl_post().
I suggest you use the error property for error handling, to avoid recursively calling your function without bound. An idea would be a policy like: "if the request fails or the data contains errors, wait for an amount of time, then retry; after X retries, show an error and stop requesting".
Below is an example of untested code, in order to show you the idea:
var attempts = 0;

$.ajax({
    //Rest of properties
    success: function(data) {
        if (data == "error") {
            if (attempts < 3) {
                setTimeout(function() {
                    get_tl_post();
                    ++attempts;
                }, 2000);
            } else {
                //output failure here.
            }
        }
        //Rest of code....
    }
});
I am using Fine Uploader to handle file uploads in a web application I have. Is there some sort of callback for when the last file has finished processing? I found the onComplete callback, but that fires when each file completes. I need to know when all files are done. Does anyone know of a way to do this with Fine Uploader?
I don't think fine uploader provides what you're asking for, but it's easy to do this yourself. You can increment a count in the onSubmit callback, and decrement it in the onComplete callback. When the count reaches 0 in onComplete, that means all the files have processed.
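A sketch of that counter idea using the onSubmit/onComplete callbacks named above (the qq.FineUploader constructor shape here is from memory, so verify it against the docs for your version):

// Sketch: count up in onSubmit, down in onComplete; zero means everything finished.
var inFlight = 0;
var uploader = new qq.FineUploader({
    element: document.getElementById('upload'),
    callbacks: {
        onSubmit: function(id, name) { inFlight++; },
        onComplete: function(id, name, responseJSON) {
            inFlight--;
            if (inFlight === 0) {
                console.log('all files done'); // the last file just finished
            }
        }
    }
});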
Ended up going this route:
var fileCount = 0; // track files still in flight
var uploader = $("#upload").fineUploader({
    request: {
        endpoint: postUrl
    },
    template: uploadTemplate
}).on('submit', function() {
    fileCount++;
}).on('complete', function(event, id, name, responseJSON) {
    fileCount--;
    if (fileCount == 0) {
        alert("done!");
    }
});
I'm working on an application that allows the user to send a file using a form (a POST request), and that executes a series of GET requests while that file is being uploaded to gather information about the state of the upload.
It works fine in IE and Firefox, but not so much in Chrome and Safari.
The problem is that even though send() is called on the XMLHttpRequest object, nothing is actually requested, as can be seen in Fiddler.
To be more specific, an event handler is placed on the "submit" event of the form, which places a timeout function call on the window:
window.setTimeout(startPolling, 10);
and in this function "startPolling" a sequence is started that keeps firing GET requests to receive status updates from a web service returning text/json, which is used to update the UI.
Is this a limitation (perhaps security-wise?) of WebKit-based browsers? Is this a Chrome bug? (I'm seeing the same behaviour in Safari, though.)
I am having the exact same problem. At the moment I use an iframe, which the form targets. That allows the XHR requests to execute while posting. While that does work, it doesn't degrade gracefully if someone disables JavaScript (I couldn't load the next page outside the iframe without JS). So if someone has a nicer solution, I would be grateful to hear it.
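A sketch of that hidden-iframe workaround (the name upload_target is an assumption, and update_progress_info/freq come from the script below):

// Sketch: post the form into a hidden iframe so the page stays put and polling keeps running.
$('<iframe name="upload_target">').hide().appendTo('body'); // the POST response lands here
$('form[enctype=multipart/form-data]')
    .attr('target', 'upload_target') // the POST no longer navigates the page
    .submit(function() {
        window.setTimeout(update_progress_info, freq); // polling keeps firing
    });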
Here's the jQuery script for reference:
$(function() {
    $('form[enctype=multipart/form-data]').submit(function() {
        // Prevent multiple submits
        if ($.data(this, 'submitted')) return false;

        var freq = 500; // frequency of update in ms
        var progress_url = '{% url checker_progress %}'; // ajax view serving progress info

        $("#progressbar").progressbar({
            value: 0
        });

        $("#progress").overlay({
            top: 272,
            api: true,
            closeOnClick: false,
            closeOnEsc: false,
            expose: {
                color: '#333',
                loadSpeed: 1000,
                opacity: 0.9
            }
        }).load();

        // Update progress bar
        function update_progress_info() {
            $.getJSON(progress_url, {}, function(data, status) {
                if (data) {
                    var progress = parseInt(data.progress, 10);
                    $('#progressbar div').animate(
                        {width: progress + '%'},
                        {queue: false, duration: 500, easing: "swing"} // queue must be a boolean, not the string 'false'
                    );
                }
                window.setTimeout(update_progress_info, freq);
            });
        }

        window.setTimeout(update_progress_info, freq);
        $.data(this, 'submitted', true); // mark form as submitted
    });
});