I have a fairly simple HTML page that relies on jQuery to process a JSON object after a form has been submitted and then organize the data into a large table (upwards of 1,000 rows and 50 columns, depending on the criteria selected in the form).
The request goes to a PHP page, which does some fairly heavy computation to build the JSON array and sends it back to be processed by jQuery. The JS then organizes the data into an HTML table.
The number of objects returned in the JSON array varies depending on the settings chosen in the form, but the total load time of the HTML page seems to grow much faster than linearly as that number increases.
I know that each JSON object is fairly large, but I need all of the information that is returned so it's not really an option to pare it down any further.
My problem is that I'm unable to figure out where in my code the slowdown is occurring.
Is it the size of the JSON array?
What I've Tried:
Speeding up the PHP/MySQL queries (helped minimally).
Timing the PHP script by writing to a *.txt file when the script begins and ends. Even in the worst-case scenario (most options selected on the HTML form), the total time through json_encode never exceeds 4-5 seconds.
Using ob_start("ob_gzhandler"); prior to the json_encode (didn't seem to make any difference).
Removing the JS plugins that sort the table columns (I originally used sorttable.js and switched to stupidtable to speed things up, but that didn't seem to help).
Using Chrome's heap snapshot to identify any memory leaks and to find the total size of the JSON array. Honestly, I'm not really sure what I'm looking at.
I appreciate any help you can give me. I've spent the last day searching through Google and StackOverflow, but I'm currently at a loss.
Thank you
Use the dev tools (F12) to see how long the request from the server takes to come back. Then add some logging in your JS between the different operations after the response arrives, to see where the bottleneck might be. Trying to display 1,000+ table rows in the DOM all at once is going to hurt performance. I would consider a JS table API such as DataTables, which only keeps so many table rows in the DOM at once and which you can feed the whole JSON object into to page it. A sketch of both ideas follows below.
https://datatables.net/
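For illustration, here is a minimal sketch of both ideas, assuming the form posts to a results.php endpoint and that the JSON objects have fields like name and value (all hypothetical names):

    // Time the server round trip and the DOM work separately, then let
    // DataTables keep only one page of rows in the DOM at a time.
    $('#search-form').on('submit', function (e) {
        e.preventDefault();

        console.time('server');                 // PHP + network time
        $.getJSON('results.php', $(this).serialize(), function (rows) {
            console.timeEnd('server');

            console.time('render');             // DOM / table-building time
            $('#results').DataTable({
                destroy: true,                  // allow re-running the search
                data: rows,
                columns: [                      // hypothetical field names
                    { data: 'name' },
                    { data: 'value' }
                ],
                pageLength: 50,                 // only 50 <tr> in the DOM at once
                deferRender: true               // build rows only when shown
            });
            console.timeEnd('render');
        });
    });

With deferRender enabled, DataTables only creates the table rows for the page currently being viewed, which is usually where the bulk of the time goes with 1,000+ rows.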
Related
I'm inexperienced with JavaScript and don't know how to optimize things for my situation.
I've written an autocomplete function using the JQueryUI Autocomplete plugin. The source for the completion is a JSON array, holding a few thousand items, that I load from my same server. This autocomplete will be attached to a search box that will be on every page of my site, so it'll get populated a lot; I don't want to request the same array every time anyone hits any page. The completion depends on database values, so I can't just put the array in static form in the code. However, it doesn't have to be perfectly synced; caching it for some amount of time would be fine.
Right now, I'm loading the array with $.getJSON. It seems that using an actual remote source is meant to be an AJAX thing where the server does the actual search itself as you type; I think this is probably overkill given that there are only a few thousand, rather than millions, of items--I don't want to fire a zillion requests every time someone types into the search box.
What is the correct way of handling this? I'm totally unfamiliar with how caching would work in JS, or if there's some built-in way to accomplish a similar thing.
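One possible approach (a sketch only, assuming a /terms.json endpoint and a 10-minute cache lifetime, both made up here) is to cache the list in sessionStorage and filter it locally with jQuery UI's built-in filter:

    // Fetch the completion list once per session (or after the TTL expires)
    // and reuse it across page loads; filter locally so typing never hits
    // the server. The endpoint, cache key and TTL are assumptions.
    var CACHE_KEY = 'autocomplete-terms';
    var TTL_MS = 10 * 60 * 1000;

    function getTerms() {
        var cached = sessionStorage.getItem(CACHE_KEY);
        if (cached) {
            var parsed = JSON.parse(cached);
            if (Date.now() - parsed.savedAt < TTL_MS) {
                return $.Deferred().resolve(parsed.terms).promise();
            }
        }
        return $.getJSON('/terms.json').then(function (terms) {
            sessionStorage.setItem(CACHE_KEY, JSON.stringify({
                savedAt: Date.now(),
                terms: terms
            }));
            return terms;
        });
    }

    $('#search').autocomplete({
        minLength: 2,
        source: function (request, response) {
            getTerms().then(function (terms) {
                response($.ui.autocomplete.filter(terms, request.term));
            });
        }
    });

Swapping sessionStorage for localStorage would keep the cache across tabs and browser restarts; either way the array is only requested when the cached copy is missing or stale.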
I am using the DataTables jQuery plugin to display a table, with Node.js making the query to the database. It currently takes 18 seconds to display 20,000 rows of data.
When I query the database directly, it takes less than a second to return the data.
First of all, even if it is possible to show 20K records in a grid, your user will definitely not want to see all of them in one go. Instead, show a small set of data in the UI grid and provide search, sort, and pagination options.
Go for both server-side (Node.js) and client-side (jQuery) pagination. Fetch data from the database one page at a time and let the client request one page at a time. Make the page size configurable. (A sketch of this follows below.)
If server-side performance (especially the database query) is not a concern, then fetch all 20K records in one go. That reduces complexity on the server side by avoiding pagination. However, still implement pagination on the client side to avoid rendering issues and to improve usability. Note that by fetching all the data at once, the memory consumption of both the server and client-side processes will increase.
Even if it works well today, it will break if your data grows over time. If it won't and you still want to fetch all 20K records and display them in one go, then go ahead, good luck :)
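As a rough sketch of the server-side option, assuming an Express app, the mysql2 driver, and a table called items (all hypothetical), DataTables' serverSide mode sends start/length parameters that you can translate into a LIMIT clause:

    // Client: ask DataTables to request one page at a time.
    $('#grid').DataTable({
        serverSide: true,
        processing: true,
        pageLength: 25,                                   // configurable page size
        ajax: { url: '/api/rows', type: 'GET' },
        columns: [{ data: 'id' }, { data: 'name' }]       // hypothetical columns
    });

    // Server (hypothetical setup: Express + mysql2): fetch only one page.
    var express = require('express');
    var mysql = require('mysql2');
    var app = express();
    var db = mysql.createConnection({ /* connection details omitted */ });

    app.get('/api/rows', function (req, res) {
        var start = parseInt(req.query.start, 10) || 0;   // row offset from DataTables
        var length = parseInt(req.query.length, 10) || 25;
        db.query('SELECT id, name FROM items LIMIT ?, ?', [start, length], function (err, rows) {
            if (err) return res.status(500).end();
            db.query('SELECT COUNT(*) AS total FROM items', function (err2, count) {
                if (err2) return res.status(500).end();
                res.json({
                    draw: parseInt(req.query.draw, 10),   // echoed back for DataTables
                    recordsTotal: count[0].total,
                    recordsFiltered: count[0].total,      // no search filter applied here
                    data: rows
                });
            });
        });
    });

    app.listen(3000);

The search and ordering parameters that DataTables sends can be handled the same way once you need them.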
What kind of data?
Rather than displaying 20K rows of detail, summarize the data and show no more than a screenful. You could do this with some kind of graph, or with summary rows, or...
And you could have links on the consolidated data for those who need to drill down to the details.
This seems to be a UI issue.
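As a hypothetical illustration of that idea, assuming each record has a category field and a detail page exists at /details (both assumptions):

    // Collapse the raw rows into one summary row per category, with a link
    // that drills down to the detail view for that category.
    function summarize(rows) {
        var counts = {};
        $.each(rows, function (i, row) {
            counts[row.category] = (counts[row.category] || 0) + 1;
        });

        var $tbody = $('#summary tbody').empty();
        $.each(counts, function (category, count) {
            $tbody.append(
                '<tr><td>' + category + '</td><td>' + count + '</td>' +
                '<td><a href="/details?category=' + encodeURIComponent(category) +
                '">drill down</a></td></tr>');
        });
    }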
I am building a page which, depending on a date range, displays between 0 and a couple hundred rows. When the user enters the page it loads and displays all rows; the user can then filter the data to his needs. This seems reasonably fast in Chrome, but IE8 becomes quite slow at some point. (Unfortunately, IE8 is the browser that counts.)
Say I need all the data at page load, but only want to display a subset. What's the best way to do that?
1.) Build a DOM string and add only the needed rows to the "real" DOM.
2.) Save the data in localStorage.
3.) Take the needed data from the server-produced JSON object.
4.) ???
Or is it always better to hit the server with a specified query and return only the needed data?
On page load, render all the rows in the DOM and keep the necessary fields of the data in a JSON array.
When the filter criteria change, filter the data in the JSON, and then, using the unique identifiers in the JSON, hide the non-matching rows in the table (only hide, not remove). This way you won't have to re-render existing rows.
If you choose the AJAX way though, the fastest approach is to render the HTML on the server side and then simply replace the content of the table with it. That way the browser renders the markup from the given string and you don't have to iterate through a JSON array and render it row by row. Its drawbacks may be network latency and bandwidth.
Hope this helps you decide.
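A minimal sketch of the hide-don't-re-render idea, assuming each record has an id and a matchesFilter(row) predicate supplied by your filter UI (both names made up):

    // Render every row once with a data-id attribute; filtering afterwards
    // only toggles visibility, it never rebuilds the rows.
    function renderOnce(rows) {
        var html = '';
        $.each(rows, function (i, row) {
            html += '<tr data-id="' + row.id + '"><td>' + row.date +
                    '</td><td>' + row.amount + '</td></tr>';
        });
        $('#report tbody').html(html);
    }

    function applyFilter(rows, matchesFilter) {
        $.each(rows, function (i, row) {
            $('#report tbody tr[data-id="' + row.id + '"]')
                .toggle(matchesFilter(row));    // hide or show, never re-create
        });
    }

Building the markup as one string and assigning it with .html() once is also considerably kinder to IE8 than appending row by row.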
I have a huge (4 MB) array of data on the server side. Right now I'm using jQuery's getJSON method to get the array data and render the whole array in the browser. It turns out getting the array data is too slow. The main time is probably spent on JSON parsing, though maybe not.
What is the best/fastest way to get such array kind of data from server?
Four megabytes is a lot of data to be sending to the client side all in one go. Rather than trying to speed up how fast JavaScript can process JSON, I suspect your best bet will be to figure out some tactics to break the data up a bit more (so you can work with less at a time).
I mean, do you really need all of it at once? It's probably worth looking into adding some server-side filtering to the JSON being returned, so as to limit it to only the data needed for whatever your app is supposed to be doing.
For example, if you're planning to display a massive list of products, it may be worth just loading the first 50-100, then making another call to load the next 50-100 as the user scrolls down the page, and so on. A rough sketch of that approach is below.
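A rough sketch of that chunked loading, assuming the server can accept offset and limit parameters on a /products.json endpoint (made-up names):

    // Load the list in chunks instead of one 4 MB response, fetching more
    // as the user nears the bottom of the page.
    var PAGE_SIZE = 100;
    var offset = 0;
    var loading = false;

    function loadNextChunk() {
        if (loading) return;
        loading = true;
        $.getJSON('/products.json', { offset: offset, limit: PAGE_SIZE }, function (items) {
            $.each(items, function (i, item) {
                $('#list').append('<li>' + item.name + '</li>');   // hypothetical field
            });
            offset += items.length;
            loading = false;
        });
    }

    loadNextChunk();
    $(window).on('scroll', function () {
        if ($(window).scrollTop() + $(window).height() > $(document).height() - 200) {
            loadNextChunk();
        }
    });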
I am continually pulling in a list of entries (over 200) from a remote server and showing them in a table (many pieces of data per row). The data changes and fluctuates over time. What's the most efficient way in JS to render this to the screen? As the data changes, do I simply wipe my DOM and re-render with the new data, or do I keep a list of ids and update just the correct tr DOM entries?
It's already sorted and filtered to be the correct information (server side), so it's just displaying it to the user that I'm concerned with.
Any suggestions would be helpful. I am using jQuery for my requests but that can be changed. I need to know the quickest (latency wise) way of pulling this large dataset and displaying it in a tabular format.
Edit: Pagination is not possible in this scenario.
Edit2: The backend is Python (WSGI) running behind a load balancer and is capable of serving 600 reqs/second. The data is only available as JSON and the backend is more of an API.
Are you using HTML tables? Some time ago I came across a Microsoft blog post from the IE team stating that rendering tables is slow. Very slow. Re-rendering the whole table every time will probably be slower than updating its values, but that is just my guess.
In fact, my opinion here is that updating values will be faster than re-rendering elements in the page. To optimize your latency, you could keep a hash of row elements keyed by record id, so you don't have to look up the DOM every time you update the values. A sketch of that approach is below.
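A small sketch of that, assuming each record has an id and a price field and the data is polled from /api/entries every five seconds (all assumptions):

    // Build the rows once and keep a map from record id to its <tr>; on each
    // poll, rewrite only the cells that can change instead of re-rendering.
    var rowsById = {};

    function buildTable(records) {
        var $tbody = $('#live tbody').empty();
        $.each(records, function (i, rec) {
            var $tr = $('<tr><td class="name"></td><td class="price"></td></tr>');
            $tr.find('.name').text(rec.name);
            $tr.find('.price').text(rec.price);
            rowsById[rec.id] = $tr;
            $tbody.append($tr);
        });
    }

    function updateTable(records) {
        $.each(records, function (i, rec) {
            var $tr = rowsById[rec.id];
            if ($tr) {
                $tr.find('.price').text(rec.price);   // touch only this row
            }
        });
    }

    $.getJSON('/api/entries', buildTable);            // initial render (hypothetical endpoint)
    setInterval(function () {
        $.getJSON('/api/entries', updateTable);
    }, 5000);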
I quite like the 'Live Image Search' which provides more data as you scroll down.
If your data list is getting really large, consider not displaying all of that data to the user at one time (if that is possible in your situation). Not only will the user get lost looking at that much data, but the browser will have to render the whole list and slow you down.
You don't mention what server-side technology you are using. If you are in .NET, there are a couple of ASP.NET controls that use a data pager (like the GridView). There is also a PagedDataSource object you might look into (which can be used with any ASP.NET control that has a DataSource property). Both will break up your data into pages, and only the viewed page will be rendered at one time. This decreases latency dramatically.