Google Chrome Lags on Large Form Data Submissions - javascript

Summary:
I am attempting to pass base64-encoded image data via a form field input. My code works fine in all browsers I've tested, but on Google Chrome there is a severe amount of CPU lag after submit, the length of which is proportional to the amount of data submitted.
Details:
What I'm Doing:
I have an SVG editor on my site in which users may create images to be saved to their profile. Once the user finishes their work, they click 'save', which kicks off some JavaScript to convert the SVG into an encoded data string via canvas.toDataURL(), store it in a hidden input field, submit the form, and return the user to an overview of their designs.
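A minimal sketch of that save flow (the element IDs and canvas setup here are hypothetical):

function saveDesign() {
    // Assumes the SVG has already been drawn onto this canvas element
    var canvas = document.getElementById('editor-canvas'); // hypothetical ID
    var dataString = canvas.toDataURL('image/png'); // base64-encoded image data

    document.getElementById('design-data').value = dataString; // hidden input
    document.getElementById('save-form').submit(); // ordinary, non-AJAX submit
}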
What's the problem?
The code itself seems to function without issue in both Firefox and Google Chrome. Firefox page loads take 1-2 seconds regardless of the data_string size. On Google Chrome, however, the time it takes to load the 'overview' page is proportional to the size of the data string submitted in the hidden field.
For example, if I truncate the data string at various lengths, I receive different page load times:
Test Image 1:
5000 chars - 1.78 sec
50000 chars - 8.24 sec
73198 chars - 11.67 sec (Not truncated)
Test Image 2:
5000 chars - 1.92 sec
50000 chars - 8.79 sec
307466 chars - 42.24 sec (Not truncated)
My Question:
The delay is unacceptable (as most images will be at least 100k in size); does anyone know what's going on with Google Chrome?
I would like to reiterate that the server responds with the same speed regardless of browser; it is definitely a client-side, browser-specific issue with Google Chrome.
I would also appreciate alternative suggestions. I've spent some time attempting to fool the browser into thinking the data was a file upload (by changing the text input to a file input field, then manually forming the data and submitting it via JavaScript), but I can't seem to get Django to recognize the falsified file (it errors out, believing that no file was uploaded).

Summary
Google Chrome seems to have a problem handling large amounts of data when said data is placed into an actual input field. I suspect it's an issue with Chrome attempting to clean up the memory used to display the data.
Details
I was able to achieve a workaround by doing away with the client-side form entirely and submitting the data via a JavaScript XMLHttpRequest (as I had touched on at the end of my question), then redirecting the user to the next page in the AJAX callback.
I could never get Django to recognize a manually formed FileField object (as multipart/form-data), but I was able to get it to accept a manually formed CharField string (which was my base64-encoded canvas data).
Because the data is never placed into an input field, Google Chrome responds without delay.
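A minimal sketch of that workaround, with hypothetical URLs and field names (Django's CSRF token handling is elided):

// POST the base64 string directly so it never touches an input field,
// then redirect in the callback, as described above.
function submitDesign(dataString) {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/save_design/'); // hypothetical endpoint
    xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    xhr.onload = function () {
        if (xhr.status === 200) {
            window.location.href = '/designs/'; // hypothetical overview page
        }
    };
    // Sent as an ordinary form field, which Django reads as a plain string
    xhr.send('svg_data=' + encodeURIComponent(dataString));
}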
I hope that helps anyone who may run across a similar issue.

I was having the exact same problem and was searching for a solution.
In my case there was no such problem for the first few runs of the page.
Then it suddenly started to lag, eating up a large amount of memory, which in turn made my whole system run very slowly.
I tried on another PC; as expected, there was no problem submitting the large SVG data for the first few runs, but later it showed the same lagging problem.
After reading your post I am planning to use jQuery's AJAX for posting the data. I hope this will solve the issue.
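For what it's worth, a jQuery version of that idea might look like this (the URL and field name are hypothetical):

// Post the data with jQuery instead of placing it in a form field
var dataString = document.querySelector('canvas').toDataURL();
$.post('/save_design/', { svg_data: dataString }, function () {
    window.location.href = '/designs/'; // move on once the save succeeds
});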

Related

How to get data from an ESP8266 WiFi AccessPoint without refreshing the HTML page

Currently I am getting data from a hardware device (charge/load controller) over WiFi; it has an ESP8266 configured as an AccessPoint.
The WiFi is set up to ignore all requests from the computer and just send its data once per second.
The data is a single string representing about 20 JavaScript variables...
var xx1="text1";
var xx2="text2"; etc...
I get the data by refreshing the HTML5 page, processing it with JavaScript, and logging it to localStorage.
It all works well, except I can only refresh at about a 3-second interval minimum for reliable, consistent data logging. The browser (Firefox) takes a while to complete a refresh.
Q. Is there a way I can capture every 'data send' using JavaScript without a page refresh? That way I can log just the periodic strings I choose, from 1 second to xxx seconds.
I suspect I might need to install some library component to access with my JavaScript? I would need to embed this into my HTML file if possible, or have it reside in the same folder.
I have been learning JS for about 2 weeks now, learning mostly from examples and my mistakes.
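For reference, a minimal sketch of polling without a page refresh; the device address and path are assumptions (ESP8266 access points commonly default to 192.168.4.1):

// Poll the access point once per second instead of refreshing the page
setInterval(function () {
    fetch('http://192.168.4.1/data') // hypothetical endpoint serving the var string
        .then(function (response) { return response.text(); })
        .then(function (text) {
            // Parse out xx1="text1" with a regex rather than eval()
            var match = text.match(/var xx1="([^"]*)";/);
            if (match) {
                localStorage.setItem('xx1_' + Date.now(), match[1]);
            }
        });
}, 1000);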

Form submit cancelled and looping

I am having a problem where I am submitting a file as part of a standard HTML form, and the file uploads; however, this process seems to be stuck in a never-ending loop, so the file uploads over and over.
The form is submitted via jQuery, i.e. $('#myform').submit();, and isn't an AJAX request. Looking in the Chrome network console, the request is "cancelled" and 0 bytes are transmitted.
What is causing this loop?
After about an hour of searching around, I upgraded Chrome, and the console showed a warning about a JavaScript interval: timer = setInterval(count_seconds, 1000).
This function was simply counting the number of elapsed seconds. I've never come across this before, but it seems that Chrome prevents the submission of the form while an interval is actively running?!
I have now cleared the interval using clearInterval(timer) (note that clearInterval takes the interval ID, not the callback) and set timer = null; for good measure, before submitting the form, and that has fixed the issue.
Answering my own question to save others the headache, but if anyone can explain this odd (new?) behaviour then that'd be great.
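A minimal sketch of the fix, using the names from the question:

var seconds = 0;
function count_seconds() { seconds += 1; } // the counter from the question

var timer = setInterval(count_seconds, 1000);

$('#myform').submit(function () {
    clearInterval(timer); // pass the interval ID, not the callback function
    timer = null;         // for good measure, as noted above
});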

400 Error when Trying to submit rails form via AJAX

Here's the setup:
Very simple form, just a name field, plus two ActiveStorage attachment fields square_image and landscape_image.
Both fields have direct_upload: true set.
If all form fields are filled out, including the files, the submit works exactly right, no problem.
However, if you fill out only the name field, leaving any of the file fields blank, I get an invalid_request every time.
This only happens in Safari.
The debug logs from nginx reveal client prematurely closed stream: only 548 out of 953 bytes of request body received.
It doesn't seem to be an nginx issue, because I get a similar experience locally via pow (the connection just hangs for a long time, presumably because pow is waiting for the data that never arrives).
Has anyone else experienced this, or have any ideas about how to debug this? I can't figure out where to go from here.
Rails 5.2.0
Safari 11.1 (13605.1.33.1.2)
This is indeed a bug in WebKit. It has allegedly been fixed, but at this point in time the bug still affects Safari. https://trac.webkit.org/changeset/230963/webkit

What is the right way to call dynamic content (currently using ajax) inside a cached page?

We have a news website where we cache a complete article page.
There are 4 areas that need to continue to be dynamic on that page:
View Counter: We add +1 to the view_counts of that article when the page loads.
Header: In the header of the website we check whether session->id exists; if it does, we display Welcome [Name], My Profile / Logout, and if not we show Register / Login.
Comments: We display the comments made on that article.
Track User Behavior: We track every single action made by users on the site.
Now the only way we could think of doing this is through AJAX calls:
$('#usercheck').load(<?php echo "'" . base_url() . "ajax/check_header'"; ?>);
And so on.
This is creating a massive load on the CPU, but what would be the right/alternative way of approaching this?
Please see the attached graph. [graph image not included]
First of all, you do not have to use AJAX for every piece of dynamic content; in the case of comments especially, you may as well load them via an iframe.
That way, you are not relying on JavaScript to make the request.
It may even work for the counter.
However, your problem is neither JavaScript nor the database server, based on what I can see from your graph. It seems to me you have some heavy PHP controllers; maybe you are loading a heavy framework just to have $session->id checked.
Further, what do you mean by "we track every single action"? How do you track it? Are you sending an AJAX request for every little thing, or are you debouncing the events with JS and only sending a batch every 30 seconds or so, as sketched below?
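A minimal sketch of that batching idea (the endpoint and payload shape are hypothetical):

// Queue user actions client-side and flush them in one request
// every 30 seconds instead of sending one request per action
var eventQueue = [];

function trackAction(action) {
    eventQueue.push({ action: action, at: Date.now() });
}

setInterval(function () {
    if (eventQueue.length === 0) return;
    var batch = eventQueue.splice(0, eventQueue.length); // take and clear the queue
    navigator.sendBeacon('/track_batch', JSON.stringify(batch)); // hypothetical endpoint
}, 30000);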
My advice is that you consider the size of the PHP code you are calling and slim it down as much as you can, even to zero if it seems feasible (by leveraging localStorage to keep track of your user session after the first login), and maybe load the counter and the comments in alternative ways.
For example, I infer you are only checking the counter once per page load, ignoring subsequent loads by other users while the current user is reading the article, so your counter may be out of date once in a while, depending on your traffic.
I'm going to explain it better: your page has n views, so when I load it, you request n and then display n+1 to me. While I'm reading, the same page gets requested and viewed x times by other users. Your counter on the server has surely been updated to n+x, but the counter on my page still says "n views".
So, what's the point in being picky and showing n+1 to me and then not updating it, thus being off by x?
So, first of all, the counter controller should be as slim as possible; and what if you loaded it within an iframe that auto-updates without AJAX?
How to refresh an iframe not using javascript?
That would keep the counter up-to-date, you may render it with PHP just once per page view, and then just statically serve the resulting HTML file.

google finance zoom in zoom out graph logic

I'm looking for the logic behind a zoomable graph like Google Finance. I know there are off-the-shelf components that do exactly that, but I am looking for a basic example that explains the logic.
Whoever writes things like that basically has two choices.
1. Load a lot of data, and show only a little bit. When the user changes the zoom, use the data we weren't showing before. Basically, we load all of the data at page-load time so the JavaScript can use it later. This is easier to write, but slow; sometimes you have to load tons of data to do it.
2. Load only the data you need. When the user interacts with the page, make AJAX requests back to the server to load the new data you need.
2a. When you load new data, store everything you've loaded so far, so that you don't need to make more AJAX requests if the user returns to an older zoom setting.
1 + 2. Load only the data you need, then show the page. Then immediately load everything else, but don't show it until/unless they change the zoom settings.
Of these, 2 and 2a are likely the best choices, while #1 is the "get it done quicker" approach; a sketch of 2a follows below.
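A minimal sketch of option 2a, caching each zoom level's data after its first fetch (the endpoint and response shape are hypothetical):

var zoomCache = {};

function getZoomData(zoomLevel, callback) {
    if (zoomCache[zoomLevel]) {
        callback(zoomCache[zoomLevel]); // no AJAX needed when revisiting a zoom level
        return;
    }
    fetch('/prices?zoom=' + encodeURIComponent(zoomLevel)) // hypothetical endpoint
        .then(function (response) { return response.json(); })
        .then(function (data) {
            zoomCache[zoomLevel] = data; // remember for later
            callback(data);
        });
}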
Google Chrome (and browsers based on Chromium) has developer tools with a network panel that lets you see what happens.
When you load a quote and then change the zoom, you will see a new data request. For example:
https://www.google.com/finance/getprices?q=AA&x=NYSE&i=1800&p=30d&f=d,c,v,o,h,l&df=cpct&auto=1&ts=1382233772497
It makes a new request for each "zoom level", which is necessary because the larger time windows (1 yr, 5 yr) show data at coarser granularity (1 day and 1 week, respectively).
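For illustration, the zoom-to-granularity mapping can be as simple as a lookup table; the 30d entry echoes the i (interval in seconds) and p (period) parameters of the example request above, while the other entries are assumptions:

var zoomParams = {
    '30d': { i: 1800,   p: '30d' }, // 30 days at 30-minute bars (as in the URL above)
    '1y':  { i: 86400,  p: '1Y'  }, // 1 year at daily bars (assumed)
    '5y':  { i: 604800, p: '5Y'  }  // 5 years at weekly bars (assumed)
};

function buildPriceUrl(zoom) {
    var p = zoomParams[zoom];
    return '/finance/getprices?q=AA&x=NYSE&i=' + p.i + '&p=' + p.p; // hypothetical path
}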
