400 error when trying to submit a Rails form via AJAX - javascript

Here's the setup:
Very simple form, just a name field, plus two ActiveStorage attachment fields square_image and landscape_image.
Both fields have direct_upload: true set.
If all form fields are filled out, including the files, the submit works exactly as expected, no problem.
However, if you fill out only the name field, leaving either of the file fields blank, I get an invalid_request every time.
This only happens in Safari.
The debug logs from nginx reveal "client prematurely closed stream: only 548 out of 953 bytes of request body received".
It doesn't seem to be an nginx issue, because I get a similar experience locally via pow (the connection just hangs for a long time, presumably because pow is waiting for the data that never arrives).
Has anyone else experienced this, or have any ideas about how to debug this? I can't figure out where to go from here.
Rails 5.2.0
Safari 11.1 (13605.1.33.1.2)

This is indeed a bug in WebKit. It has allegedly been fixed, but at this point in time the bug still affects Safari. https://trac.webkit.org/changeset/230963/webkit
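Until that fix ships, a common workaround is to disable any empty file inputs just before submit, so no empty multipart file part is ever serialized. A minimal sketch (the form id my-form is a placeholder):

document.getElementById("my-form").addEventListener("submit", function () {
    // Disabled inputs are excluded from form serialization, so Safari
    // never has to encode the empty file part that triggers the bug.
    this.querySelectorAll("input[type=file]").forEach(function (input) {
        if (input.files.length === 0) {
            input.disabled = true;
        }
    });
});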

Related

Form submit cancelled and looping

I am having a problem where I am submitting a file as part of a standard HTML form, and the file uploads; however, this process seems to be stuck in a never-ending loop, so the file uploads over and over.
The form is submitted via jQuery, i.e. $('#myform').submit();, and isn't an AJAX request. Looking in the Chrome network console, the request is "cancelled" and 0 bytes are transmitted.
What is causing this loop?
After about an hour of searching around, I upgraded Chrome and the console showed a warning about a JavaScript interval: timer = setInterval(count_seconds, 1000).
This function was simply counting the number of elapsed seconds. I've never come across this before, but it seems that Chrome prevents the submission of the form while there is an interval actively running?!
I now clear the interval using clearInterval(timer) (and set timer = null; for good measure) before submitting the form, and that has fixed the issue.
Answering my own question to save others the headache, but if anyone can explain this odd (new?) behaviour then that'd be great.
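For reference, a minimal sketch of the fix described above, using the names from the question:

var seconds = 0;

function count_seconds() {
    seconds += 1;
}

var timer = setInterval(count_seconds, 1000);

// Before submitting, stop the timer using the id returned by setInterval.
clearInterval(timer);
timer = null;
$('#myform').submit();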

No errors when submitting comment, leaving blank area if webpage is idle for long time

I have an XPage that submits data using a partial refresh. I have noticed that if I leave the webpage open for a very long time and then submit a comment, the partial refresh returns nothing after the submit, leaving the area blank, and the comment is not saved.
When I look in developer tools I can see the request being submitted with status 200, but nothing is returned.
I have several interval timers running on the page, so there should be no scope timeouts, and all users are anonymous. These "request timers" seem to work fine even right before I submit the comment.
I also use hijackPartialRefresh to log the request but it does not return any errors.
I am using server page persistence: keep pages in memory with max 16 pages in memory. (don't ask)
What could be the cause of this, and is there anything I can do to make sure my comments are submitted, or at least get an error instead of a blank area?
Thanks
Thomas

Javascript Abort POST Request on Retry/Refresh (HIGH CPU Usage)

Recently my Apache server's CPU has been at 100% all day. I think I have found my issue.
On a page, a user has the ability to click a button and send a POST request to a PHP file on my server. Sometimes the PHP file takes a LONG (and I mean VERY long) time to respond, or does not respond at all:
function buttonFunction() {
    $.post("http://ipaddress/core/file.php", { username: username, password: pword, coins: coins }, function (data) {
        // Stuff
    });
}
Hypothesis 1:
I believe that people sometimes click this button again while it is still waiting on the response from file.php from the previous click, causing two simultaneous PHP processes on the Apache server and therefore higher CPU usage (I think that's what happens; correct me if I'm wrong, because I'm new to this server stuff).
Hypothesis 2:
Another thing that may be causing the high CPU usage is the user refreshing the page while it is still waiting on the response from file.php. After 12 seconds with no response, I show a message saying "Please refresh if this takes too long." In that case, after refreshing the page, the user once again sends a POST request to file.php while the old one may still be running, again causing higher CPU usage.
Reasoning:
I'm saying this because my site may report that there are only 12 people online (and probably 12 people sending the POST requests), yet when I run the top command over SSH (PuTTY) to see what processes are currently running, it shows 30-40+ processes, some running for as long as 17 minutes.
So, is there a way I can abort the request on a refresh (if it's still in flight) or on a click of the button (again, if the request is still in flight)? Can somebody confirm or deny whether my hypotheses (especially hypothesis 2) are correct, i.e. whether they actually ARE causing the high CPU usage? Furthermore, if anybody has an idea for a more efficient way to send these requests, it would be highly appreciated.
Edit 1:
I can fix the possible issue stated in my first hypothesis. However, can somebody please confirm or deny whether my second hypothesis is valid?
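For the button-click case, one approach is to keep the jqXHR handle that $.post() returns and abort any request still in flight before starting a new one. A sketch reusing the names from the snippet above; note that abort() only cancels the browser-side request, and the PHP script may keep running on the server unless it checks for a dropped connection:

var pendingRequest = null;

function buttonFunction() {
    // Cancel the previous request, if any, so clicks cannot stack up.
    if (pendingRequest) {
        pendingRequest.abort();
    }
    pendingRequest = $.post("http://ipaddress/core/file.php",
        { username: username, password: pword, coins: coins },
        function (data) {
            // Stuff
        }
    ).always(function () {
        pendingRequest = null; // clear the handle once the request settles
    });
}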

Google Chrome Lags on Large Form Data Submissions

Summary:
I am attempting to pass base64 encoded image data via a form field input. My code works fine on all browsers I've tested it on, but there is a severe amount of CPU lag after submit on Google Chrome, the duration of which is proportional to the length of the data submitted.
Details:
What I'm Doing:
I have an SVG editor on my site in which users may create images to be saved to their profile. Once the user finishes their work, they click 'save' - which kicks off some javascript to convert the SVG into an encoded data string via canvas.toDataURL(), store it in a hidden input field, submit the form, and return the user to an overview of their designs.
What's the problem?
The code, itself, seems to be functioning without an issue across both Firefox and Google Chrome. Firefox page loads take 1-2 seconds, regardless of the data_string size. However, on Google Chrome, the time it takes to load the 'overview' page is proportional to the size of the data string submitted in the hidden field.
For example, if I truncate the data string at various lengths, I receive different page load times:
Test Image 1:
5000 chars - 1.78 sec
50000 chars - 8.24 sec
73198 chars - 11.67 sec (Not truncated)
Test Image 2:
5000 chars - 1.92 sec
50000 chars - 8.79 sec
307466 chars - 42.24 sec (Not truncated)
My Question:
The delay is unacceptable (as most images will be at least 100k in size); does anyone know what's going on with Google Chrome?
I would like to reiterate that the server responds with the same speed, regardless of browser; it is definitely a client-side, browser specific issue with Google Chrome.
I would also appreciate alternative suggestions. I've spent some time attempting to fool the browser into thinking the data was a file upload (by changing the text input to a file input field and then manually trying to form the data and submit it via JavaScript), but I can't seem to get Django to recognize the falsified file, so it errors out believing that no file was uploaded.
Summary
Google Chrome seems to have a problem handling large amounts of data when said data is placed into an actual input field. I suspect it's an issue with Chrome attempting to clean up the memory used to display the data.
Details
I was able to achieve a workaround by doing away with the client-side form entirely and submitting the data via a JavaScript XMLHttpRequest (as I had touched on at the end of my question), then redirecting the user to the next page in the AJAX callback.
I could never get Django to recognize a manually formed FileField object (as multipart/form-data), but I was able to get it to accept a manually formed CharField string (which was my base64 encoded canvas data).
Because the data is never placed into an input field, Google Chrome responds without delay.
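A minimal sketch of that workaround (the URL /designs/save/ and the field name image_data are placeholders; on the server the string is read as an ordinary Django CharField):

var xhr = new XMLHttpRequest();
xhr.open("POST", "/designs/save/");
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.onload = function () {
    // Redirect once the server has stored the image data.
    window.location.href = "/designs/";
};
// The base64 string goes straight into the request body,
// never into a visible input field.
xhr.send("image_data=" + encodeURIComponent(canvas.toDataURL()));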
I hope that helps anyone who may run across a similar issue.
I was also having the exact same problem and was searching for a solution.
In my case there was no such problem for the first few runs of the page.
Then it suddenly started to lag, eating up a large amount of memory, which in turn made my whole system run very slowly.
I tried on another PC: as expected, there was no problem submitting the large SVG data for the first few runs, but later it showed the same lagging problem.
After reading your post I am planning to use jQuery's AJAX for posting the data. I hope this will solve the issue.

Form values getting lost in IE8 but Firefox and IE9 work

I ran into a scenario where I saw unexpected behavior only in the IE8 browser; IE9 and Firefox work fine. The behavior went like this:
User populated a form
On purpose - user leaves a mandatory field blank
User clicked "Submit button" and browser sent a POST request
Expected behavior: an error message is shown along with the data that was already provided; only the mandatory field should be left blank, since nothing was entered in step 2. Instead, I get the error message with the previous data lost, i.e. an empty form.
And note this only happens in IE8. Any suggestions?
I am going to answer this question myself. Here's what happened in my scenario: it was a double-click problem. But I only clicked the button once, so how did that happen? A programmer who worked on this project handled the form submit by triggering another submit using JavaScript. But then how did this work in Firefox and IE9+?
I used Fiddler to dig into this and noticed that in IE8, two requests are sent to the server. IE9 and Firefox handle this scenario correctly (i.e. they detect the duplicate submit) and send only one POST request instead of two.
Technologies used: Spring Framework 2.0, JSP, HTML, JavaScript
Why the data is lost also has to do with the server: Spring modifies the session attributes while processing a request (specifically, a formObject that is temporarily removed and re-added). When another request arrives at the same time, it goes through a different pipeline (handleInvalidSubmit), which ends up creating a new formObject and thus destroying the old data.
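As a client-side safety net, a simple guard can swallow the duplicate submit before it ever reaches the server. A sketch using old-style event assignment so it also works in IE8 (the form id my-form is a placeholder):

var submitted = false;

document.getElementById("my-form").onsubmit = function () {
    if (submitted) {
        // A submit is already in flight; drop the duplicate.
        return false;
    }
    submitted = true;
    return true;
};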
Hope this will help others :)
