I am having a problem where I am submitting a file as part of a standard HTML form. The file uploads, however the process seems to be stuck in a never-ending loop... so the file uploads over and over.
The form is submitted via jQuery, i.e. $('myform').submit(); and isn't an AJAX request. Looking in the Chrome network console, the request is "cancelled" and 0 bytes are transmitted.
What is causing this loop?
After about an hour of searching around, I upgraded Chrome and the console showed a warning about a JavaScript interval: timer = setInterval(count_seconds, 1000).
This function was simply counting the number of elapsed seconds. I've never come across this before, but it seems that Chrome prevents the submission of the form while an interval is actively running?!
I have now cleared the interval with clearInterval(timer) (and set timer = null; for good measure) before submitting the form, and that has fixed the issue.
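A minimal sketch of what the fix looks like (the selector and function names are illustrative, not my exact code):

var timer = setInterval(count_seconds, 1000); // elapsed-seconds counter, as before

function submitMyForm() {
    // Stop the interval before the (non-AJAX) form submission,
    // otherwise Chrome appears to cancel the request
    clearInterval(timer);
    timer = null; // for good measure

    $('#myform').submit();
}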
Answering my own question to save others the headache, but if anyone can explain this odd (new?) behaviour then that'd be great.
Here's the setup:
Very simple form, just a name field, plus two ActiveStorage attachment fields square_image and landscape_image.
Both fields have direct_upload: true set.
If all form fields are filled out, including the files, the submit works exactly right, no problem.
However, if you fill out only the name field, leaving any of the file fields blank, I get an invalid_request every time.
This only happens in Safari.
The debug logs from nginx reveal "client prematurely closed stream: only 548 out of 953 bytes of request body received".
It doesn't seem to be an nginx issue, because I get a similar experience locally via pow (the connection just hangs for a long time, presumably because pow is waiting for the data that never arrives).
Has anyone else experienced this, or have any ideas about how to debug this? I can't figure out where to go from here.
Rails 5.2.0
Safari 11.1 (13605.1.33.1.2)
This is indeed a bug in WebKit. It has allegedly been fixed, but at this point in time the bug still affects Safari. https://trac.webkit.org/changeset/230963/webkit
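Until a fixed Safari ships, a commonly suggested client-side workaround is to disable any empty file inputs just before the form is submitted, so Safari never tries to serialize them into the multipart body. A rough sketch (the form selector is a placeholder for whatever your form is):

document.querySelector('#my_form').addEventListener('submit', function () {
    // Disable every file input with no file selected so Safari does not
    // include an empty multipart part in the request body
    this.querySelectorAll('input[type=file]').forEach(function (input) {
        if (input.files.length === 0) {
            input.disabled = true;
        }
    });
});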
I have an XPage that submits data using partial refresh. I have noticed that if I leave the webpage open for a very long time and then submit a comment, the partial refresh returns nothing after the submit, leaving the area blank, and the comment is not saved.
When I look in developer tools I can see the request being submitted with status 200, but nothing is returned.
I have several interval timers running on the page, so there should be no scope timeouts, and all users are anonymous. These "request timers" seem to work fine even before I submit the comment.
I also use hijackPartialRefresh to log the request but it does not return any errors.
I am using server page persistence: keep pages in memory with max 16 pages in memory. (don't ask)
What could be the cause of this, and is there anything I can do to make sure my comments are submitted, or at least show an error (instead of a blank area)?
Thanks
Thomas
Recently my Apache server's CPU has been at 100% (all day, that is). I think I have found my issue.
On a page, a user has the ability to click a button and send a POST request to a PHP file on my server. Sometimes the PHP file takes a LONG (and I mean VERY long) time to respond or sometimes does not respond at all:
function buttonFunction() {
    $.post("http://ipaddress/core/file.php", {username: username, password: pword, coins: coins}, function(data) {
        // Stuff
    });
}
Hypothesis 1:
I believe that sometimes people might click this button again while it is still waiting for the result/response from file.php from the previous click, therefore causing two simultaneous PHP processes on the Apache server and higher CPU usage (I think that's what happens; correct me if I'm wrong, because I'm new to this server stuff).
Hypothesis 2:
Another thing that may be causing the high CPU usage, I believe, is the user refreshing the page while it is still waiting for the result/response from file.php. After 12 seconds with no response/result, I show a message saying "Please refresh if this takes too long." In that case, after refreshing the page, the user once again sends a POST request to file.php while the old one may still be running - again causing higher CPU usage (again, I think that's what happens; correct me if I'm wrong, because I'm new to this server stuff).
Reasoning:
I'm saying this because my site may report that there are only 12 people online (and probably 12 people sending the POST requests), yet when I run the top command in PuTTY to see what processes are currently running, it shows 30-40+ processes (some having run for as long as 17 minutes).
So, is there a way that I can abort the request on a refresh (if it's still going on), or on a click of the button (again, if the request is still going on)? Also, can somebody either confirm or deny whether my hypotheses (especially hypothesis 2) are correct - whether those actually ARE causing the high CPU? Furthermore, if anybody has an idea for a more efficient way to go about sending these requests, it would be highly appreciated.
Edit 1:
I can fix the possible issue stated in my first hypothesis (rough sketch below). However, can somebody please either confirm or deny whether my second hypothesis is true/valid?
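For reference, here's roughly how I plan to guard against the duplicate-click case from hypothesis 1 - keeping a handle on the pending request and aborting it before starting a new one (untested sketch, names as in the snippet above):

var pendingRequest = null;

function buttonFunction() {
    // Abort the previous request if it is still in flight,
    // so each user only has one file.php request pending at a time
    if (pendingRequest) {
        pendingRequest.abort();
    }

    pendingRequest = $.post("http://ipaddress/core/file.php", {username: username, password: pword, coins: coins}, function(data) {
        // Stuff
    }).always(function() {
        pendingRequest = null;
    });
}

Note that abort() only stops the browser from waiting; whether the PHP script on the server actually stops depends on when PHP notices the closed connection (and on ignore_user_abort).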
Summary:
I am attempting to pass base64 encoded image data via a form field input. My code works fine on all browsers I've tested it on, but there is a severe amount of CPU lag, post submit, on Google Chrome - the length of which is proportional to the length of the data submitted.
Details:
What I'm Doing:
I have an SVG editor on my site in which users may create images to be saved to their profile. Once the user finishes their work, they click 'save' - which kicks off some javascript to convert the SVG into an encoded data string via canvas.toDataURL(), store it in a hidden input field, submit the form, and return the user to an overview of their designs.
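In outline, the save step looks something like this (element ids are simplified, and the SVG-to-canvas drawing step is omitted):

function saveDesign() {
    var canvas = document.getElementById('render_canvas');  // canvas the SVG was drawn onto
    var dataString = canvas.toDataURL('image/png');         // base64 encoded image data

    // Stash the encoded image in the hidden input and submit the form normally
    document.getElementById('design_data').value = dataString;
    document.getElementById('design_form').submit();
}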
What's the problem?
The code itself seems to be functioning without issue in both Firefox and Google Chrome. Firefox page loads take 1-2 seconds, regardless of the data string size. However, on Google Chrome, the time it takes to load the 'overview' page is proportional to the size of the data string submitted in the hidden field.
For example, if I truncate the data string at various lengths, I receive different page load times:
Test Image 1:
5000 chars - 1.78 sec
50000 chars - 8.24 sec
73198 chars - 11.67 sec (Not truncated)
Test Image 2:
5000 chars - 1.92 sec
50000 chars - 8.79 sec
307466 chars - 42.24 sec (Not truncated)
My Question:
The delay is unacceptable (as most images will be at least 100k in size); does anyone know what's going on with Google Chrome?
I would like to reiterate that the server responds with the same speed, regardless of browser; it is definitely a client-side, browser specific issue with Google Chrome.
I would also appreciate alternative suggestions. I've spent some time attempting to fool the browser into thinking the data was a file upload (by changing the text input to a file input field and then manually trying to form the data and submit it via JavaScript), but I can't seem to get Django to recognize the falsified file (so it errors out, believing that no file was uploaded).
Summary
Google Chrome seems to have a problem handling large amounts of data when said data is placed into an actual input field. I suspect it's an issue with Chrome attempting to clean up the memory used to display the data.
Details
I was able to achieve a workaround by doing away with the client-side form entirely and submitting the data via a JavaScript XMLHttpRequest (as I had touched on at the end of my question), then redirecting the user to the next page in the AJAX callback.
I could never get Django to recognize a manually formed FileField object (as multipart/form-data), but I was able to get it to accept a manually formed CharField string (which was my base64 encoded canvas data).
Because the data is never placed into an input field, Google Chrome responds without delay.
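A rough sketch of what that looks like (URLs and field names are placeholders from my setup, and the CSRF token handling is omitted for brevity):

function saveDesign() {
    var dataString = canvas.toDataURL('image/png'); // base64 encoded canvas data

    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/designs/save/'); // Django view that accepts the CharField string
    xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    xhr.onload = function () {
        // Redirect once the server has stored the design, so the huge
        // string never has to live inside an input field on the page
        window.location.href = '/designs/';
    };
    xhr.send('design_data=' + encodeURIComponent(dataString));
}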
I hope that helps anyone who may run across a similar issue.
I was also having the exact same problem and was searching for a solution.
In my case there was no such problem for the initial few runs of the page.
Then it suddenly started to lag, eating up a large amount of memory, which in turn made my whole system run very slowly.
I tried on another PC and, as expected, there was no problem submitting the big SVG data for the first few runs, but later it also started showing the same lagging problem.
After reading your post I am planning to use jQuery's AJAX for posting the data. I hope this will solve the issue.
I've got a bunch of JavaScript working on this page so that users can fill out the form, which includes a file upload field. They can add these forms to a "queue", which is just a series of iframes with the form data moved into them. With the click of a button it will go through each form and submit them one at a time. When each form is submitted I show a loading gif to indicate that there is activity. When the processing page is finished it will spit some jQuery back at the iframe and give a success or error message.

This works great so long as the files are not too large. It seems that larger files (near 1GB) result in a condition where the jQuery from the processing page never shows up in the iframe. This is disastrous because the submitting page will not continue to submit forms unless it gets some sort of response. Also the user is left with a spinning image that never goes away, and they are unsure if even one large file has actually uploaded.

I've tried setting max_execution_time and max_input_time to an hour, but this doesn't help at all. I'm currently using jQuery/JavaScript to loop through each form and submit it (a simplified sketch of the queue logic is below). Can anyone tell me why this is happening and/or how to resolve this issue?
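Roughly, the queue logic looks like this (heavily simplified; the spinner handling and the jQuery returned by the processing page are omitted, and the function/selector names are illustrative):

var queuedForms = $('iframe.queued-upload').map(function () {
    return $(this).contents().find('form')[0];
}).get();

function submitNext(index) {
    if (index >= queuedForms.length) {
        return; // queue finished
    }
    // Submit this form (including its file input); the processing page is
    // expected to respond into the iframe and call window.parent.uploadDone(index).
    // With files near 1GB that response never arrives, so the queue stalls here.
    queuedForms[index].submit();
}

window.uploadDone = function (index) {
    submitNext(index + 1);
};

submitNext(0);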
You can set the timeout for the jQuery AJAX request to be longer. From the documentation:
timeout (Number)
Set a timeout (in milliseconds) for the request. This will override any global timeout set with $.ajaxSetup(). The timeout period starts at the point the $.ajax call is made; if several other requests are in progress and the browser has no connections available, it is possible for a request to time out before it can be sent. In jQuery 1.4.x and below, the XMLHttpRequest object will be in an invalid state if the request times out; accessing any object members may throw an exception. In Firefox 3.0+ only, script and JSONP requests cannot be cancelled by a timeout; the script will run even if it arrives after the timeout period.
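For example (the form id and processing URL are placeholders; the one-hour value mirrors the max_execution_time you've already raised):

var formData = new FormData(document.getElementById('upload_form')); // includes the file input

$.ajax({
    url: 'process_upload.php',  // the processing page
    type: 'POST',
    data: formData,
    processData: false,         // let the browser build the multipart body
    contentType: false,
    timeout: 3600000,           // one hour, in milliseconds
    success: function (response) {
        // show the success message for this queue item
    },
    error: function (jqXHR, textStatus) {
        // textStatus is "timeout" when the limit is exceeded, so the queue
        // can show an error instead of a spinner that never stops
    }
});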