I've written a JavaScript uploader which takes images and uploads them to a server. I've stripped the code down to the bare minimum, but I still get a memory leak in Firefox, and I can't see why.
The script obtains a list of file objects called files from an HTML form, then incrementally runs through that list and uploads each file to a server.
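For context, files and upload_counter are globals; a minimal sketch of the assumed setup (the input id is hypothetical):

// Hypothetical form setup: <input type="file" id="file_input" multiple>
var files = document.getElementById('file_input').files; // FileList from the form
var upload_counter = 0; // index of the file currently being uploaded

UploadFile(files[upload_counter]); // kick off the first upload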
The JavaScript code is as follows:
function UploadFile(file) {
    var form_data = new FormData();
    form_data.append('file', file);
    $.ajax({
        url: 'Upload.php',
        dataType: 'text',
        cache: false,
        contentType: false,
        processData: false,
        data: form_data,
        type: 'post',
        success: function() {
            console.log("SUCCESS!");
            upload_counter = upload_counter + 1;
            UploadFile(files[upload_counter]); // Self-calling function
            form_data = null; // Clearing form data memory
            file = null;
        },
        error: function(error) {
            console.log("ERROR!");
        }
    });
}
The function restarts itself from within the success callback of the AJAX call, which keeps everything linear and simple; note how it increments to the next index in the files array to access the next file object.
The PHP code isn't relevant, of course, but suffice it to say it handles the upload fine.
According to the Windows memory monitor, Firefox's memory usage is as follows:
1 image -> 320 MB
10 images -> 500 MB
20 images -> 720 MB
30 images -> 960 MB
40 images -> 1.1 GB
140 images -> 1.6 GB
150 images -> 1.7 GB
Clearly this is a problem, and eventually Firefox will crash. The files are quite large (around 10 MB each), so by 150 images around 1.5 GB has been uploaded.
Why is it leaking memory?
I should add:
This doesn't happen in Chrome, Edge, or Opera; their memory usage doesn't change during upload.
I have tested this in Firefox safe mode with all add-ons/extensions disabled as well.
Since it only occurs in Firefox, I've submitted a bug here:
https://bugzilla.mozilla.org/show_bug.cgi?id=1302657
This is just a theory, but it might be a problem with Firefox not correctly handling the recursion of your AJAX calls. Since it keeps uploading files again and again, it might not free the memory until it finishes, and if you consume all the memory before that, it will crash.
I think it might be worth trying something like this:
for (var i = 0; i < files.length; i++) {
    UploadFile(files[i]);
}
And in your UploadFile function, get rid of these 2 lines:
upload_counter = upload_counter + 1;
UploadFile(files[upload_counter]); // Self-calling function
Since the "for loop" will already take care of iterate through the files, and with that, you will remove the recursivity in your success callback function.
You might be worried that doing it this way makes the file uploads completely asynchronous and could end up consuming more memory, but since most web browsers limit the number of parallel HTTP requests to around 6 connections per host, then based on your benchmarks you should never be using more than about 500 MB of RAM (assuming this approach works for you).
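If you do want an explicit cap rather than relying on the browser's per-host connection limit, a minimal sketch of a concurrency-limited queue could look like this (MAX_PARALLEL and uploadNext are my names, not part of the original code):

var MAX_PARALLEL = 4; // assumed cap, below typical per-host browser limits
var next = 0;         // index of the next file to upload

function uploadNext() {
    if (next >= files.length) return; // nothing left to queue
    var form_data = new FormData();
    form_data.append('file', files[next++]);
    $.ajax({
        url: 'Upload.php',
        type: 'post',
        data: form_data,
        cache: false,
        contentType: false,
        processData: false,
        success: uploadNext, // a slot freed up, start the next file
        error: uploadNext    // keep the queue moving even on failure
    });
}

// Prime the pool with MAX_PARALLEL concurrent uploads
for (var i = 0; i < MAX_PARALLEL; i++) {
    uploadNext();
}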
It took me days to stumble on this answer.
The problem/bug lies entirely with Firebug, the commonly used developer tools add-on. With it disabled, Firefox actually manages the memory fine.
Related
Failed to clear temp storage: It was determined that certain files are unsafe for access within a Web application, or that too many calls are being made on file resources. SecurityError
I'm getting this error in the console. I have a script named script.js which makes AJAX calls to retrieve data from PHP.
Any idea why?
Here's my jQuery script:
$(document).ready(function() {
    var loading = false;
    var docHeight = $(window).height();

    $('.timeline').css({minHeight: docHeight});

    function get_tl_post() {
        if (loading == false) {
            loading = true;
            $.ajax({
                type: "POST",
                url: "timeline.php",
                data: "data=instagram",
                beforeSend: function() {
                    $('.loader').fadeIn("slow");
                },
                complete: function() {
                    loading = false;
                    $('.loader').fadeOut("slow");
                },
                success: function(data) {
                    if (data == "error") {
                        get_tl_post();
                    }
                    $(data).hide().appendTo(".timeline").fadeIn(1000);
                }
            });
        }
    }

    $(window).scroll(function() {
        if ($(window).scrollTop() == $(document).height() - $(window).height()) {
            get_tl_post();
        }
    });
});
This is due to network mapping of your resources.
In other words, you might have added a workspace folder in your Chrome dev tools.
When you make changes to some files, the requests go to the file system. This works fine for a while; however, in some scenarios you remove your network mapping.
Then, when you try to open that web page in the browser, it may or may not ask for remapping of the network resources and will still try to update the file system.
And that's when you get this error.
There is nothing wrong with your script.
The only solution may be to clear the cache and then restart the system.
If the problem still persists, you can simply reinstall Chrome, and this should be fixed.
Moreover, network mapping can sometimes cause several other issues as well.
For example, it can balloon a CSS file to a whopping 75 MB or more. So take precautions when playing with network mapping.
Optionally, if you are on a Mac, or even on Windows with sh commands available:
sudo find / -type f -size +50000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
Run this in your terminal to find the culprit: any individual file over 50 MB. You can then remove them.
Note: the above command finds all individual files larger than 50 MB and prints them to your terminal one by one.
If I were to guess, I would say your timeline.php script is always returning "error", so you are making too many recursive calls and the browser blocks them.
Try to eliminate the recursive function call and see if that fixes the problem.
Remove the following lines and try again:
if (data == "error")
{
get_tl_post();
}
If your AJAX call fails for some reason, this could lead to too many recursive calls of get_tl_post().
I suggest using the error property for error handling, to avoid situations where your function calls itself recursively. An idea could be to set a policy like: "if the request fails or the data contains errors, wait for some time, then retry. If X retries have been made, show an error code and stop requesting."
Below is an example of untested code to show you the idea:
var attempts = 0;

$.ajax({
    // Rest of properties
    success: function(data) {
        if (data == "error") {
            if (attempts < 3) {
                setTimeout(function() {
                    get_tl_post();
                    ++attempts;
                }, 2000);
            } else {
                // Output failure here.
            }
        }
        // Rest of code....
    }
});
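A matching sketch for transport-level failures, using the error callback in the same way (also untested, and assuming the same attempts counter):

$.ajax({
    // Same properties as above, plus:
    error: function(xhr, status) {
        if (attempts < 3) {
            ++attempts;
            setTimeout(get_tl_post, 2000); // retry the request after a pause
        } else {
            console.log("Request failed: " + status); // give up and report
        }
    }
});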
I'm trying to implement an upload form that reports upload status back to the user using xhr. Everything seems to be implemented correctly, but when uploading, the callbacks occur way too quickly and report a much higher percentage than has actually completed.
With files under ~20 MB, I get a callback immediately showing over 99%, while the upload continues to churn away in the background for some time.
A screengrab of the console from a 74 MB file, taken a couple of seconds after the upload was initialised, showed just 3 callbacks registering (loaded, total size, calculated percentage) while the AJAX upload continued with the throbber for another ~60 seconds.
Has anyone experienced this and managed to get an accurate representation of upload status?
(the 'load' event triggers correctly after the upload process)
Here's my code:
$(this).ajaxSubmit({
    target: '#output',
    beforeSubmit: showRequest,
    xhr: function() {
        var myXhr = $.ajaxSettings.xhr(); // declared with var to avoid an implicit global
        if (myXhr.upload) {
            console.log('have xhr');
            myXhr.upload.addEventListener('progress', function(ev) {
                if (ev.lengthComputable) {
                    console.log(ev.loaded + " " + ev.total);
                    console.log((ev.loaded / ev.total) * 100 + "%");
                }
            }, false);
        }
        return myXhr;
    },
    dataType: 'json',
    success: afterSuccess
});
There are several reports of the same behavior (incorrect progress reporting on file upload) caused by antivirus software checking the files to be uploaded. My guess is that some part of the antivirus attempts to compensate for the possible delay caused by the check, and fails to do it properly.
I had the same issue recently. I think your AJAX call simply returns before your file has finished uploading. To work around this, load back what you uploaded and listen for its load event. For example, if you are uploading an image (using jQuery):
var loadCheck = $('<img src="' + uploadedUrl + '">').hide().appendTo('body');
loadCheck.on('load', updateProgressBar); // pass the function reference, don't invoke it
Of course, you can implement this for other file types and incorporate an $.each iteration.
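For instance, for several images, a sketch like this counts load events against the total (uploadedUrls and the updateProgressBar signature are hypothetical):

var loadedCount = 0;
$.each(uploadedUrls, function (i, url) {
    $('<img>').hide().appendTo('body')
        .on('load', function () {
            loadedCount++; // one more upload verified
            updateProgressBar(loadedCount / uploadedUrls.length * 100);
        })
        .attr('src', url); // set src last so the load handler is already bound
});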
I'm working on a client-side web application that makes heavy use of JavaScript and AJAX to provide the required functionality.
This is not a problem for most browsers (Chrome, Firefox, ...), but in Internet Explorer the performance is a major issue.
It takes less than a second to load the page initially, even on Internet Explorer. But upon refreshing the page it can take anywhere between 1 and 20 seconds to load and display the page.
It's hard to post code since the application is divided into multiple files. I can only explain the intended behaviour.
The application initializes two content containers, one for static content and one for dynamic content. Each content container is populated via AJAX and updates DOM elements via the innerHTML property.
The first time it takes less than a second to build the page. Subsequent refreshes take significantly longer.
What changes between the initial loading of the page and the refreshing of the page to explain this enormous performance drop? Do I need to uninitialize something on unloading the page?
Communication.request = function (method, target, async, data, callback) {
    var types = ['string', 'string', 'boolean', 'null', 'null']; // Parameter types
    if (data) { // Data was provided
        types[3] = 'string'; // Data must be a string
    }
    if (callback) { // Callback was provided
        types[4] = 'function'; // Callback must be a function
    }
    if (Utils.hasParams(arguments, types)) { // All the required parameters were provided and of the right type
        var request = new XMLHttpRequest(); // Create a new request
        request.open(method, target, async); // Open the request
        if (callback) { // Callback was provided
            request.handleCallback(callback); // Register the callback (custom helper, not a standard XMLHttpRequest method)
        }
        if (data) { // Data was provided
            var contentType = 'application/x-www-form-urlencoded'; // Prepare the content type
            request.setRequestHeader('Content-Type', contentType); // Add a content type request header
        }
        request.send(data); // Send the request
    }
};
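For reference, a hypothetical call with this signature (the URL, payload, and element id are made up) would look like:

Communication.request('POST', 'content.php', true, 'section=static', function (response) {
    document.getElementById('static-content').innerHTML = response; // populate a container
});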
The problem appeared to be related to the number of concurrent connections. Depending on the connection and the type of web server, Internet Explorer limits this to 2 or 4 concurrent connections.
After clamping the number of connections to 2, the problem ceased to occur. Other browsers appear to have higher limits, though I have limited those to 4 just in case.
Also, the number of concurrent messages is the number of messages in flight at any given time. This was previously unlimited, and that made Internet Explorer quite sad :-(
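A minimal sketch of such clamping, wrapping the Communication.request helper from the question in a queue (MAX_CONNECTIONS, scheduleRequest, and the assumption that the registered callback receives the response text are all mine):

var MAX_CONNECTIONS = 2; // IE-safe cap, per the findings above
var inFlight = 0;        // requests currently on the wire
var pending = [];        // requests waiting for a free slot

function scheduleRequest(method, target, data, callback) {
    pending.push([method, target, data, callback]);
    pump();
}

function pump() {
    while (inFlight < MAX_CONNECTIONS && pending.length > 0) {
        launch(pending.shift());
    }
}

function launch(args) {
    inFlight++;
    Communication.request(args[0], args[1], true, args[2], function (response) {
        inFlight--;                         // a slot has freed up
        if (args[3]) { args[3](response); }
        pump();                             // start the next queued request
    });
}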
The pushState method accepts a state object. Firefox's documentation says the maximum size of this object is 640 kB. Do the specs define the smallest maximum size a browser may implement? Can I reasonably expect major browsers to provide me with at least 100 kB?
EDIT: I tested it out with Chrome, and it was still working for state objects over 1 MB.
The specification doesn't set out a limit; however, the various browsers do have their own limits.
Firefox's is well documented and, as you said, it's 640 kB ("as much RAM as anybody will ever need").
I couldn't find Chrome's or Internet Explorer's listed anywhere, but some quick testing shows:
Chrome working at least up to 10 MB (and possibly further),
IE hitting the limit at 1 MB (in IE11, which is all I have handy).
So, to summarise for the people of the future:
The history.state object size limit is: 640 kB for Firefox, 1 MB for Internet Explorer 11, and at least 10 MB for Chrome.
EDIT: Versions tested: IE: 11, Chrome: 33, Firefox: Irrelevant as they document the max size on MDN for you :).
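If you want to probe a given browser yourself, a quick test sketch (it overwrites the current history entry, and assumes the browser throws when the limit is exceeded, as Firefox documents):

function probeStateLimit() {
    var size = 1024; // start at 1 kB worth of characters
    try {
        while (size <= 50 * 1024 * 1024) { // give up past ~50 MB
            history.replaceState(new Array(size + 1).join('x'), document.title);
            size *= 2;
        }
    } catch (e) {
        // The last doubling exceeded the limit
    }
    return size / 2; // largest size that succeeded, in characters
}
console.log('state limit is at least ' + probeStateLimit() + ' characters');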
No. The normative document here is http://www.whatwg.org/specs/web-apps/current-work/multipage/history.html#dom-history-pushstate and it doesn't even mention a limit for the data size. A different limit is suggested, however:
User agents may limit the number of state objects added to the session history per page.
As you can see from this example, the specification generally avoids mentioning any hard limits and leaves them to the discretion of browser makers. So even if the spec is revised at some point in the future to consider the possibility of data size limits, it is unlikely to give you a real number. Instead it will be "big enough for common use cases".
I can only see that MDN says Firefox imposes a size limit of 640 kB; I don't know about other browsers.
https://developer.mozilla.org/en-US/docs/DOM/Manipulating_the_browser_history
Painstakingly, I found that my page was exceeding the character limit on IE11. I did a substring operation to get an exact character count, since I couldn't find it documented anywhere. The answer is that (at least on IE11) 524282 characters can be passed to pushState/replaceState.
I handled that via the following code:
function pushState(data, title, url) {
    if (data.length > 524282) {
        // Can't push the data to the History API; pass null instead
        history.pushState(null, title, url);
        history.replaceState(null, title, url);
    } else {
        history.pushState(data, title, url);
        history.replaceState(data, title, url);
    }
    document.title = title;
}
I call beforeNavigate to save any current position information or state changes made by the user before loading the new content via an AJAX request.
function beforeNavigate() {
    if ($("#container").html().length <= 524282) {
        // Save the current state to history before navigating via AJAX
        history.replaceState($("#container").html(), document.title, window.location.pathname);
    }
}
Handle the back and forward buttons by listening for popstate. If we passed a null value for data, then e.state will be null and we need to load the stored URL via an AJAX request.
window.addEventListener('popstate', function (e) {
    if (e.state != null) {
        $("#container").html(e.state);
    } else {
        // Load the partial page into the container (a full HTML page is not returned by the AJAX query for my site)
        $.ajax({
            url: location.href,
            type: "GET",
            success: function (data) {
                $('#container').html(data);
            }
        });
    }
});