I am trying to render data in table form on the front end in Angular, and so far I have used ng-grid and ui-grid.
But they seem to slow performance down, as there are a lot of rows, sometimes 200+ per grid, and more than 20 grids on a single page.
So the browser hangs most of the time. I just need to show the data; no operations need to be performed on it, it just needs to be rendered.
Can anybody suggest another way to render this much data in grid form?
There are a few options for how to handle this in ui-grid.
There is an option in ui-grid, flatEntityAccess, that improves binding performance a bit.
Many people put their grids in tabs, and use an ng-if around the tab so that it only renders the grid if that tab is active.
If you need all the grids in a list down the page, presumably not all of them are visible to a user at the same time (you couldn't have 20 grids all visible; probably only one or two are visible at a time). You can potentially detect the page scroll and use an ng-if on each grid so that only those that are actually visible are being rendered.
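A minimal sketch of the scroll-detection idea (the helper name and buffer value are my own, not ui-grid API):

```javascript
// Returns true when a grid whose top edge sits at elementTop (px from the
// top of the document) overlaps the viewport, with an extra buffer so grids
// start rendering slightly before they scroll into view.
function isNearViewport(elementTop, elementHeight, scrollTop, viewportHeight, buffer) {
  var viewTop = scrollTop - buffer;
  var viewBottom = scrollTop + viewportHeight + buffer;
  return elementTop < viewBottom && elementTop + elementHeight > viewTop;
}

// In a window scroll handler you would set a per-grid flag from this and
// guard each grid in the template with:
//   <div ng-if="visible[gridId]"><div ui-grid="gridOptions[gridId]"></div></div>
```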
Ultimately the problem is that if you have 10 columns and perhaps 15 rows visible in each grid, that's immediately 150 watchers per grid; with 20 grids you've got 3,000+ watchers on your page, and all of them have to be evaluated whenever you scroll any of the grids. Note that the watchers per grid are proportional to the number of visible rows, the number of visible columns, and the number of grids that are rendered. Ultimately you need to reduce one of those factors if you want it to go faster.
Lastly, you may want to check your version: rc20 or even the current unstable release is faster than older release candidates.
Use one-time binding in your grid rows: {{::data.id}}. This will boost page performance, as the watches are removed once the first binding is done.
But remember: if the model changes afterwards, you will not see those changes in your view when you use one-time binding.
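For example, in a ui-grid column definition you can put the one-time binding in a custom cellTemplate (the field names here are illustrative):

```javascript
// One-time bindings ({{:: }}) in ui-grid cell templates: each cell is
// rendered once and its watch is then dropped, so scrolling stays cheap.
var columnDefs = [
  { field: 'id',   cellTemplate: '<div class="ui-grid-cell-contents">{{::row.entity.id}}</div>' },
  { field: 'name', cellTemplate: '<div class="ui-grid-cell-contents">{{::row.entity.name}}</div>' }
];

// In a plain ng-repeat table the same idea is:
//   <tr ng-repeat="item in ::items"><td>{{::item.id}}</td></tr>
```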
One-time binding was introduced in Angular 1.3, so you need Angular 1.3 or above.
If you are using a version below 1.3, then you probably have to change your design: rather than showing more than 20 grids, show only one at a time based on a select dropdown.
You can also try the ng-table directive, but with 20 grids of more than 200 rows each, even that becomes slow.
Angular applications become slow once they have more than about 2,000 watches. In your case there are at least 20 * 200 = 4,000 watches, and even more if you have more grids, rows, or columns. So either change your design or use one-time binding.
Related
I have a table in the UI with 5 columns. The columns contain different charts rendered by different directives, e.g. a custom progress bar, a chart for margin, a chart for target achieved, etc.
It is working fine currently.
Problem: the page takes around 10s to render on load (the backend takes 800-900ms to return the data). When I reduce the number of chart columns from 5 to 1, the render time drops drastically to 2.5s, and it increases by 1-2 seconds for each column I add back.
I am looking for a way to run the 5 directives in parallel so that the page renders in around 3-4 seconds.
Thanks in advance !!
Looking at your problem, I have the following findings to share.
Directives always run in parallel unless they are nested inside each other. Assuming that in your scenario they are in an ng-repeat loop, they will run in parallel.
Avoid fetching all 25 objects in one go (use pagination or a similar pattern); this saves time at the server level.
Avoid rendering all the charts in the DOM at the same time; DOM rendering is the most expensive part of the load cycle. Try rendering all 5 but displaying one with display: block and the rest with display: none, then show them one by one. This avoids freezing the DOM. Also try the limitTo filter in ng-repeat.
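A sketch of the staged-reveal idea, pairing a growing limit with limitTo (the helper and the step size of 5 are my own choices):

```javascript
// Staged rendering: keep a renderLimit on the scope and raise it step by
// step, with the template only repeating over the first renderLimit rows:
//   <tr ng-repeat="row in rows | limitTo: renderLimit"> ... charts ... </tr>
// nextLimit is a pure helper returning the next cap without overshooting.
function nextLimit(current, step, total) {
  return Math.min(current + step, total);
}

// In the controller you would call this from a $timeout/$interval until
// renderLimit === rows.length, revealing charts a few rows at a time
// instead of building the whole DOM in one frame.
```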
It would help if you could share your DOM and JS code.
I am working with a large amount of data (1M-5M rows), and rows in the grid should be groupable, sortable, and filterable. Since ag-grid can populate the table quickly enough, I use the in-memory row model to satisfy these requirements.
However, when I click a column header to sort all rows by that column, it takes some time. Moreover, clicking columns repeatedly while rows are still being sorted may crash the grid as well as the browser.
Are there any ways to prevent user from clicking on columns (disable sorting, show loading overlay, or something like this)?
I have tried using the beforeSortChanged and afterSortChanged events to show an overlay or modify DOM elements (to grey the grid out a little and show a loading spinner), but it doesn't work properly: the beforeSortChanged handler seems to stall for a moment and only then executes.
ag-grid is used as a component inside the Ember framework.
How about using onCellClicked, which is an attribute of the columnDefs? It should work in a similar manner to what you are looking for with beforeSortChanged.
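A sketch of what that could look like (the busy flag and startSlowSort are made up for illustration; note that onCellClicked fires on cell clicks, so you may still need a separate header-click hook for sorting itself):

```javascript
// Guard ag-grid interactions with a busy flag: clicks arriving while a
// long-running sort is in flight are swallowed instead of queuing up.
var busy = false;

function startSlowSort(done) {
  // Stand-in for the expensive operation: show an overlay, sort, then done()
  done();
}

var columnDefs = [{
  headerName: 'Price',
  field: 'price',
  onCellClicked: function (params) {
    if (busy) { return 'ignored'; }   // drop clicks while sorting
    busy = true;
    startSlowSort(function () { busy = false; });
    return 'handled';
  }
}];
```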
We've got a <table> of data which live-syncs between users, similar to Google Spreadsheets. For this we are using Angular.js and syncing the data via long polling (not the most scalable approach, but fine for our needs: a private web app with a set limit of users).
The table has a mixture of static generated values, inputs, checkboxes, select elements in the cells.
The user is able to sort, filter and scroll through the table / columns.
The <table> has about 26 columns and can have up to 500 rows, so that could be 13,000 (26 * 500) elements which need to be synced back and forth. After about 1,500 elements we've found that Angular a) takes ages to render the table and b) becomes unresponsive when data is synced.
Apart from the obvious (have fewer elements), which is not an option from a UX point of view as all the data needs to be displayed to the user, is there a prescribed workaround for this? Am I using the right tool for the job with Angular?
I was playing with the idea of :
A) Making all the data 'read-sync only' (it would still receive updates, but wouldn't constantly push them) instead of read + write sync, so binding is only one-way, and only making an element two-way when it is clicked, i.e. when you click on an input it becomes a two-way element, and when you're done it goes back to read-sync only.
B) Implementing a kind of infinite-scroll pagination that would, say, load 20 rows at a time: as you scrolled down it would unload the first 20 rows and add the next 20; you could then scroll back up if needed and it would reload the first 20 and unload the second 20. My only concern is that this might be complex to implement, and I'm also not sure how it would work with filtering and sorting.
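A sketch of idea (A), tracking the single cell that is currently in two-way edit mode (the names are made up; the template snippet in the comments shows where it would plug in):

```javascript
// Only one cell at a time is a "real" two-way bound input; every other cell
// is a cheap read-only span that still picks up server-pushed updates.
// Template idea:
//   <td ng-click="edit.startEdit(row, col)">
//     <span ng-if="!edit.isEditing(row, col)">{{cell.value}}</span>
//     <input ng-if="edit.isEditing(row, col)" ng-model="cell.value"
//            ng-blur="edit.stopEdit()">
//   </td>
function makeEditTracker() {
  var editing = null; // 'row:col' key of the single cell in edit mode
  return {
    startEdit: function (row, col) { editing = row + ':' + col; },
    stopEdit: function () { editing = null; },
    isEditing: function (row, col) { return editing === row + ':' + col; }
  };
}
```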
Are either of these approaches applicable to overcoming two-way data-binding limits, or is there another approach I've overlooked?
An app I am working on has a 50-row x 100-column grid, where each cell is a div containing one textbox, with one outer div containing the entire grid to handle scrolling. Exact dimensions can vary, but that is a typical maximum size. So that works out to about 5,000 divs, each with its own textbox, for a total of 10,000 elements.
The contents of the grid need to be updated via ajax calls to load data for different periods.
My question is, which of these 2 approaches would be more efficient:
1) Have the ajax call return the fully formatted HTML for the grid, and simply set the innerHTML of the containing div to the new contents.
2) Have the ajax call return JSON or similar, then use JavaScript/jQuery to loop through and update the value of each grid cell in the existing divs. A few columns might need to be added or deleted in this case, but the number of rows will remain constant.
For smaller grids/tables, returning the complete HTML works great and requires very little client-side JS, but I have heard about performance issues when manipulating large numbers of DOM elements. With the huge number of attributes and properties associated with each element, I can see how creating/destroying thousands of them could add a lot of overhead. So I thought I would ask for advice here before deciding which way to go. Thanks.
I've been through the very same situation. For tables of a certain size and complexity, it is much faster to render pregenerated HTML over and over again than to hunt for and update the data, but you wind up with a server communication lag to contend with.
The key here is "manipulate". If you're just painting the grid, it's not very costly in terms of time unless a lot of events fire on completion, such as event bindings and so forth. Even those aren't too bad. At some point there are diminishing returns, and single-cell updates become more tolerable.
It really depends on what the users do with the grid. If they are updating many rows/cells, you might want a bulk-update function that applies changes to multiple selected rows at once, so they don't have to wait for each individual cell change to refresh. This was our solution for large tables: individual cell updates were too time-consuming when making lots of changes, and reloading the whole table from the server wasn't too bad if done only once per batch of updates.
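For the "repaint" path, one cheap client-side variant is to build the whole grid's markup as a single string from the JSON payload and assign it to innerHTML in one operation, instead of touching thousands of cells individually (renderGrid is illustrative; HTML escaping is omitted for brevity):

```javascript
// Build the grid markup as one string, then hand it to the browser in a
// single innerHTML assignment: one reflow instead of thousands of DOM ops.
function renderGrid(rows) {
  var html = ['<table>'];
  for (var r = 0; r < rows.length; r++) {
    html.push('<tr>');
    for (var c = 0; c < rows[r].length; c++) {
      html.push('<td><input value="' + rows[r][c] + '"></td>');
    }
    html.push('</tr>');
  }
  html.push('</table>');
  return html.join('');
}

// Usage after the ajax call returns JSON:
//   document.getElementById('grid').innerHTML = renderGrid(data);
```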
So I'm trying to create an infinite scrolling table using AngularJS, similar to this: http://jsfiddle.net/vojtajina/U7Bz9/
The problem I'm having is that in the jsfiddle example, if I keep scrolling until I have a million results, the browser is going to slow to a crawl, wouldn't it? There would then be 1,000,000 results in $scope.items. It would be better if I only ever had, for example, 1,000 results at a time in $scope.items, with the results I was viewing falling within those 1,000.
Example use case: page loads and I see the first 10 results (out of 1,000,000). Even though I only see 10, the first 1000 results are actually loaded. I then scroll to the very bottom of the list to see the last 10 items. If I scroll back up to the top again, I would expect that the top 10 results will have to be loaded again from the server.
I did a project with ExtJS that had a similar situation: an infinite scrolling list with several thousand results. The ExtJS way to handle this was to load the current page of results and pre-load a couple of extra pages as well, but at any one time only 10 pages of results were stored locally.
So I guess my question is how I would go about implementing this in AngularJS. I know I haven't provided much code, so I'm not expecting a fully coded answer, but at least some advice on which direction to go.
A couple of things:
"Infinite scrolling" to "1,000,000" rows is likely to have issues regardless of the framework, just because you've created millions and millions of DOM nodes (presuming you have more than one element per record).
The implementation you're looking at, <li ng-repeat="item in items">{{item.foo}}</li> or anything like it, will have issues very quickly for one big reason: {{item.foo}} or any ngBind like that sets up a $watch on the field, which creates a lot of overhead in the form of function references, etc. So while 10,000 small objects in an array isn't that bad, the 10,000-20,000 additional function references across those 10,000 items will be.
What you'd want to do in this case would be create a directive that handles the adding and removing of DOM elements that are "too far" out of view as well as keeping the data up to date. That should mitigate most performance issues you might have.
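The core of such a directive is a pure calculation of which slice of the data should currently be in the DOM; a sketch (the fixed row height and buffer size are my own assumptions):

```javascript
// Given the scroll position, work out which rows should exist in the DOM.
// The directive would render only rows[start..end), absolutely positioned
// inside a spacer div of height totalRows * rowHeight so the scrollbar
// still reflects the full dataset.
function visibleWindow(scrollTop, viewportHeight, rowHeight, totalRows, buffer) {
  var start = Math.max(0, Math.floor(scrollTop / rowHeight) - buffer);
  var end = Math.min(totalRows, Math.ceil((scrollTop + viewportHeight) / rowHeight) + buffer);
  return { start: start, end: end };
}
```

On each scroll event the directive recomputes the window, removes elements that fell out of it, and adds the ones that entered, so DOM size stays constant no matter how far the user scrolls.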
... good infinite scrolling isn't simple, I'm sorry to say.
I like the angular-ui implementation ui-scroll...
https://github.com/angular-ui/ui-scroll
... over ngInfiniteScroll. The main difference between ui-scroll and a standard infinite scroll is that previous elements are removed as they leave the viewport.
From their readme...
The common way to present to the user a list of data elements of undefined length is to start with a small portion at the top of the list - just enough to fill the space on the page. Additional rows are appended to the bottom of the list as the user scrolls down the list.
The problem with this approach is that even though rows at the top of the list become invisible as they scroll out of the view, they are still a part of the page and still consume resources. As the user scrolls down the list grows and the web app slows down.
This becomes a real problem if the html representing a row has event handlers and/or angular watchers attached. A web app of an average complexity can easily introduce 20 watchers per row. Which for a list of 100 rows gives you total of 2000 watchers and a sluggish app.
Additionally, ui-scroll is actively maintained.
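A minimal datasource sketch, assuming the get(index, count, success) contract from the readme (index conventions, e.g. 1-based or negative while scrolling up, vary by version, so the clamping here is illustrative; the in-memory array stands in for a server call):

```javascript
// A ui-scroll datasource backed by an in-memory array. ui-scroll asks for
// chunks via get(index, count, success); returning fewer items than asked
// signals the start/end of the dataset.
var allItems = [];
for (var i = 0; i < 100000; i++) { allItems.push({ id: i }); }

var datasource = {
  get: function (index, count, success) {
    var start = Math.max(0, index);
    var end = Math.min(allItems.length, index + count);
    success(start < end ? allItems.slice(start, end) : []);
  }
};

// Template usage would look like:
//   <div ui-scroll="item in datasource">{{item.id}}</div>
```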
It seems that http://kamilkp.github.io/angular-vs-repeat would be what you are looking for. It is a virtual scrolling directive.
So turns out that the ng-grid module for AngularJS has pretty much exactly what I needed. If you look at the examples page, the Server-Side Processing Example is also an infinite scrolling list that only loads the data that is needed.
Thanks to those who commented and answered anyway.
You may try using ng-infinite-scroll :
http://binarymuse.github.io/ngInfiniteScroll/
Check out virtualRepeat from Angular Material.
It dynamically reuses the rows visible in the viewport area, just like ui-scroll.