I am continually pulling a list of entries (over 200) from a remote server and showing them in a table (many pieces of data per row). The data changes and fluctuates over time. What's the most efficient way in JS to render this to the screen? As the data changes, do I simply wipe my DOM and re-render with the new data, or do I keep a list of ids so I can update the correct tr DOM entries?
It's already sorted and filtered to be the correct information (server side), so displaying it to the user is all I am concerned with.
Any suggestions would be helpful. I am using jQuery for my requests, but that can be changed. I need to know the quickest (latency-wise) way of pulling this large dataset and displaying it in a tabular format.
Edit: Pagination is not possible in this scenario.
Edit2: The backend is Python (WSGI) running in a load balancer and is capable of flushing out 600 reqs / second. The data is only available as JSON and the backend is more of an API.
Are you using HTML tables? Some time ago I came across a Microsoft blog post from the IE team stating that rendering tables is slow. Very slow. Re-rendering the whole table every time will probably be slower than updating its values, but that is just my guess.
In fact, my opinion here is that updating values will be faster than re-rendering elements in the page. To optimize your latency, you could keep a hash of row objects mapped by id, so you don't have to look up the DOM every time you update the values.
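A minimal sketch of that idea, assuming each record has a stable id field; the #data-table selector and the name/value fields are placeholders, not from the question:

```
// Cache of <tr> elements keyed by record id, so updates never query the DOM.
var rowCache = {};

function renderRows(records) {
  var $tbody = $('#data-table tbody'); // placeholder selector

  records.forEach(function (rec) {
    var $row = rowCache[rec.id];

    if (!$row) {
      // First time we see this id: build the row once and cache it.
      $row = $('<tr>')
        .append($('<td class="name">'))
        .append($('<td class="value">'));
      rowCache[rec.id] = $row;
      $tbody.append($row);
    }

    // Subsequent updates only touch the cells whose content changed.
    $row.find('td.name').text(rec.name);
    $row.find('td.value').text(rec.value);
  });
}
```

Rows that disappear from the feed would still need to be removed, but the core point is that the DOM is only touched for cells that actually change.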
I quite like the 'Live Image Search' approach, which provides more data as you scroll down.
If your data list is getting really large, consider not displaying all of that data to the user at one time (if that is possible in your situation). Not only will the user get lost looking at that much data, but the browser will have to render the whole list and slow you down.
You don't mention what server-side technology you are using. If you are in .NET, there are a couple of ASP.NET controls that use a data pager (like a GridView). There is also a PagedDataSource object you might look into (which can be used with any ASP.NET control that has a DataSource property). Both will break your data into pages, and only the viewed page will be rendered at one time. This decreases latency dramatically.
Related
Using ag-grid-community 22.1.1 version in Angular 7
If we go by the official docs, the client-side model should load only the records available or set in the pagination size. But that does not happen. When the browser makes a request, it waits until all the records are loaded and the response is returned before the view starts rendering.
Can someone explain whether my understanding of the wording below is wrong?
Here are more detailed rules of thumb.
If you are not sure, use default Client-Side. The grid can handle massive amounts of data (100k+ rows). The grid will only render what's visible on the screen (40 rows approximately, depending on your screen size) even if you have thousands of rows returned from your server. You will not kill the grid with too much data - rather your browser will run out of memory before the grid gets into problems. So if you are unsure, go with Client-Side Row Model first and only change if you need to. With Client-Side, you get sorting, filtering, grouping, pivoting and aggregation all done for you by the grid. All of the examples in the documentation use the Client-Side model unless specified otherwise.
Link to official docs explaining different row models and when to use what.
Based on that, if I am expecting my API to return 500 records and my [paginationPageSize]="40",
shouldn't it load and render 40 records, while in the background it can still load all the remaining records into the browser cache? But it looks like it is waiting for the whole set of records to load into the browser cache and then starts rendering, which is impacting the performance.
The line below is the one confusing me the most:
The grid will only render what's visible on the screen (40 rows approximately, depending on your screen size)
ag-grid, and for that matter any grid, in client-side model will fetch all the records first and then start rendering in the browser. But it will only render the number of rows that fit into the visible view. This is for a very good reason.
Consider a scenario where the user of your application is searching/filtering on a certain field in the grid. If the grid does not have all the data with it at that time (it is still fetching from the server in the background), it may return Not Found even though matching record(s) exist in the data which is yet to come. The same problem would exist for sorting, grouping, etc.
This link in the ag-grid docs states it clearly:
By default the grid expects you to provide all the data up front. In other words, your application loads the full set of data into the client and then passes it in its entirety to the grid. This is in contrast to Server-Side Data where the data is mostly kept on the server and loaded into the grid in parts.
Now, the reason it renders only 40 or so records in the view is that creating and rendering HTML for all rows would make the page very slow or unusable.
You need to opt for a server-side model if you wish to fetch data in chunks from the server. But then it involves more work to implement filtering, sorting, etc.
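For reference, a minimal sketch of that trade-off using the community Infinite Row Model; the '/api/records' endpoint, the column names and the response shape are assumptions, not from the original post:

```
// Hedged sketch: fetch rows in blocks instead of loading everything up front.
var gridOptions = {
  columnDefs: [{ field: 'name' }, { field: 'value' }], // placeholder columns
  rowModelType: 'infinite',   // blocks are requested lazily as the user scrolls
  cacheBlockSize: 40,         // rows requested per block
  datasource: {
    getRows: function (params) {
      fetch('/api/records?start=' + params.startRow + '&end=' + params.endRow)
        .then(function (res) { return res.json(); })
        .then(function (data) {
          // data.rows is this block; data.total lets the grid size the scrollbar
          params.successCallback(data.rows, data.total);
        })
        .catch(function () { params.failCallback(); });
    }
  }
};
```

getRows also receives params.sortModel and params.filterModel, which is exactly the extra work mentioned above: the server now has to apply sorting and filtering itself.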
I am using the DataTables jQuery plugin to display the table, with Node.js making a query to the database. It is currently taking 18s to display 20,000 rows of data.
When I directly query the database it takes less than a second to get the data.
First of all, even if it is possible to show 20K records in a grid, your user will definitely not be interested in seeing all of them in one go. Rather, show a small set of data in the UI grid and provide search, sort and pagination options.
Go for both server-side (Node.js) and client-side (jQuery) pagination. Fetch data from the database one page at a time and let the client request one page at a time. Make the page size configurable.
If server-side performance (especially the database query) is not a concern, then fetch all 20K rows in one go. This reduces complexity on the server side by avoiding pagination. However, still implement pagination on the client side to avoid rendering issues and to improve usability. Note that, by fetching all data in one go, memory consumption of both the server and client-side processes will increase.
Even if it works well today, it will definitely break if there is a possibility that your data grows over time. If not, and you still want to fetch all 20K records and display them in one go, then go ahead, good luck :)
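As a rough illustration of the server-side pagination option above, here is a minimal DataTables configuration; the '/api/rows' endpoint and the column names are placeholders, and the Node.js side still has to translate the start/length/search parameters into a LIMIT/OFFSET query:

```
// Hedged sketch: let DataTables drive server-side pagination.
$(function () {
  $('#grid').DataTable({
    processing: true,   // show a "processing" indicator between requests
    serverSide: true,   // DataTables sends draw/start/length/search params
    pageLength: 50,     // configurable page size, as suggested above
    ajax: '/api/rows',  // endpoint must echo draw and return recordsTotal,
                        // recordsFiltered and data for the requested page
    columns: [{ data: 'id' }, { data: 'name' }, { data: 'value' }]
  });
});
```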
What kind of data?
Rather than display 20K rows of details, summarize the data and show no more than a screenful of data. You could do this with some kind of graph. Or summarizing the data. Or...
And you could have links on the consolidated data for those who need to drill down to the details.
This seems to be a UI issue.
I have a fairly simplistic HTML page that relies on jQuery to process a JSON object after a form has been submitted and then organize the data in a large table (upwards of 1000+ rows depending on the criteria selected in the form, and 50 columns).
The request goes to a PHP page which does some fairly heavy computations to build the JSON array and send it back to be processed by jQuery. The JS organizes the data into an HTML table.
The number of objects returned in the JSON array varies depending on the settings chosen in the form, but the total load time of the HTML page seems to increase exponentially with it.
I know that each JSON object is fairly large, but I need all of the information that is returned so it's not really an option to pare it down any further.
My problem is that I'm unable to figure out where in my code the slowdown is occurring.
Is it the size of the JSON array?
What I've Tried:
- Speeding up the PHP/MySQL selections (helped minimally).
- Timed the PHP script by writing to a *.txt file when the script begins and ends. Even in the worst-case scenario (most options selected on the HTML form) the total time never exceeds 4-5 seconds to process json_encode.
- Used ob_start("ob_gzhandler"); prior to the json_encode (didn't seem to make any difference).
- Took out the JS plugins used to sort the table columns (originally used sorttable.js and changed to stupidtable to speed things up, but that didn't seem to work).
- Tried using Chrome's heap snapshot to identify any memory leaks and to find the total size of the JSON array. Honestly, I'm not really sure what I'm looking at??
I appreciate any help you can give me. I've spent the last day searching through Google and StackOverflow, but I'm currently at a loss.
Thank you
Use the dev tools (F12) to see how long the request to the server takes to come back. Add some logging in your JS between the different operations after it comes back, to see where the bottleneck might be. Trying to display 1,000+ table rows in the DOM all at once is going to affect performance. I would consider a JS table API, such as DataTables, that will only show so many table rows at once in the DOM - something you could feed the whole JSON object into and let it page it.
https://datatables.net/
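A rough sketch of that kind of logging, assuming the form posts to a '/report.php' endpoint and buildTable() stands in for the existing rendering code (both names are placeholders):

```
// Time the request and the DOM rendering separately to see
// which side is the real bottleneck.
console.time('ajax');
$.getJSON('/report.php', $('#report-form').serialize(), function (rows) {
  console.timeEnd('ajax');      // network + PHP time

  console.time('render');
  buildTable(rows);             // existing code that builds the <table>
  console.timeEnd('render');    // pure DOM/JS time
});
```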
I've been pondering moving our current admin system over to a JS framework for a while and I tested out AngularJS today. I really like how powerful it is. I created a demo application (source: https://github.com/andyhmltn/portfolio-viewer) that has a list of 'items' and displays them in a paginated list that you can order/search in realtime.
The problem that I'm having is figuring out how I would replicate this kind of behaviour with a larger data set. Ideally, I want to have a table of items that's sortable/searchable and paginated that's all in realtime.
The part that concerns me is that this table will have 10,000+ records at least. Currently that's no problem, as it's a PHP file that limits the query to the current page and appends any search options to the end. The demo above only has about 15-20 records in it. I'm wondering how hard it would be to do the same thing with such a large number of records without pulling all of them into one JSON request at once, as that would be incredibly slow.
Does anyone have any ideas?
I'm used to handling large datasets in JavaScript, and I would suggest you:
use pagination (either server-side or client-side, depending on the actual volume of your data, see below)
use Crossfilter.js to group your records and adopt a several-levels architecture in your GUI (records per month, double click, records per day for the clicked month, etc.); see the sketch after this answer
An indicator I often use is the following:
rowsAmount x columnsAmount x dataManipulationsPerRow
Also, consider the fact that handling large datasets and displaying them are two very different things.
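A minimal sketch of the two-level Crossfilter.js idea, assuming each record has a 'date' string like '2014-03-27'; renderSummary() and renderDetail() are placeholders for whatever draws the GUI:

```
var cf      = crossfilter(records);
var byMonth = cf.dimension(function (d) { return d.date.slice(0, 7); }); // '2014-03'
var byDay   = cf.dimension(function (d) { return d.date; });

// Level 1: one row per month with a record count.
var perMonth = byMonth.group().reduceCount();
renderSummary(perMonth.all());            // [{ key: '2014-03', value: 412 }, ...]

// Level 2: on double click, narrow to that month and list its days.
function drillDown(month) {
  byMonth.filter(month);                  // filters every other dimension
  renderDetail(byDay.group().reduceCount().all());
}
```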
Indeed, pulling so many rows in one request would be a killer. Fortunately, Angular has the ng-grid component, which can do server-side paging (among many other things). Instructions are provided in the given link.
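Independently of the grid component, the request pattern looks roughly like this in AngularJS; the 'app' module, the '/api/items' endpoint and the parameter names are assumptions:

```
// Hedged sketch: ask the server for one page at a time instead of the whole dataset.
angular.module('app').controller('ItemsCtrl', function ($scope, $http) {
  $scope.items = [];
  $scope.total = 0;

  $scope.load = function (page, search, sortBy) {
    $http.get('/api/items', {
      params: { page: page, pageSize: 50, q: search, sort: sortBy }
    }).then(function (response) {
      $scope.items = response.data.items;  // only the current page
      $scope.total = response.data.total;  // lets the UI draw the pager
    });
  };

  $scope.load(1);
});
```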
I am developing a special grid solution for a software product in JavaScript. The data is collected with a PHP script on the server side and pushed to JavaScript as a JSON array. In my script I have to parse this array and render the grid rows. And here is my problem: if I receive, for example, 4000 rows, JavaScript renders them very fast, but I think the bottleneck is the browser...
My question is: is it possible to render only the visible parts? I need to scroll to reach the other information, but the browser does not need to render it if it is not visible. Is it possible to render something outside of the viewport?
I need to set widths and positions, and this is only possible once I have added the new element to the viewport, which is very slow with a huge mass of data... How could I solve this problem?
The solution here might be to paginate your data on the client side. That way, you can sort your object array with JS and simply insert a section of the data into the DOM at a time (sketched below).
Client-side pagination library options have been discussed here.
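A minimal sketch of that idea without any library; the '#grid' selector, the page size and the row fields are placeholders:

```
// Keep all rows in memory, only put one page in the DOM.
var PAGE_SIZE = 50;

function renderPage(allRows, pageIndex) {
  var start = pageIndex * PAGE_SIZE;
  var page  = allRows.slice(start, start + PAGE_SIZE);

  var html = page.map(function (row) {
    return '<tr><td>' + row.name + '</td><td>' + row.value + '</td></tr>';
  }).join('');

  // One DOM write per page change instead of one per row.
  document.querySelector('#grid tbody').innerHTML = html;
}
```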
DOM updates are the slowest part of the chain. Process the result in memory and insert it into the DOM in one go if you can.
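For example, a minimal version of that "insert in one go" idea, with the row field names assumed for illustration:

```
// Build the rows off-document in a DocumentFragment, then append it once.
function appendRows(tbody, rows) {
  var fragment = document.createDocumentFragment();

  rows.forEach(function (row) {
    var tr = document.createElement('tr');
    tr.appendChild(document.createElement('td')).textContent = row.name;
    tr.appendChild(document.createElement('td')).textContent = row.value;
    fragment.appendChild(tr);
  });

  tbody.appendChild(fragment); // single reflow instead of one per row
}
```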