Too many two-way data bindings in angular.js - workaround - javascript

We've got a <table> of data which live-syncs between users, similar to Google Spreadsheets. For this we are using Angular.js and syncing the data with long polling (not the most scalable approach, but fine for our needs: a private web app with a set limit of users).
The table has a mixture of static generated values, inputs, checkboxes, select elements in the cells.
The user is able to sort, filter and scroll through the table / columns.
The <table> has about 26 columns and can have up to 500 rows, so that could be 13,000 (26 × 500) elements which need to be synced back and forth. Past about 1,500 elements we've found that Angular a) takes ages to render the table and b) becomes unresponsive when data is synced.
Apart from the obvious (have fewer elements), which is not an option from a UX point of view because all the data needs to be displayed to the user, is there a prescribed workaround for this? Am I using the right tool for the job with Angular?
I was playing with the idea of:
A) Making all the data 'read-sync only' (it would still receive updates, but wouldn't constantly push them) instead of read + write sync, so bindings would be one-way by default, and only making an element two-way when it is clicked, i.e. when you click on an input it becomes a two-way element, and when you're done it goes back to read-sync only.
B) Implementing a kind of infinite-scroll pagination: load, say, 20 rows at a time; as you scroll down it would unload the first 20 rows and add the next 20, and you could scroll back up if needed and it would reload the first 20 and unload the second 20. My only concerns are that it might be complex to implement, and I'm not sure how it would work with the filtering and sorting.
Are either of these approaches applicable to overcoming two-way data binding limits, or is there another approach I've overlooked?
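Approach B boils down to a windowing calculation. Here is a minimal framework-free sketch (the function name, fixed row height, and buffer size are assumptions for illustration, not from any library):

```javascript
// Given the scroll offset, a fixed row height, and a buffer of extra
// rows above and below the viewport, work out which slice of the full
// data set should actually be rendered.
function visibleSlice(scrollTop, rowHeight, viewportHeight, totalRows, buffer) {
  var first = Math.floor(scrollTop / rowHeight) - buffer;
  var last = Math.ceil((scrollTop + viewportHeight) / rowHeight) + buffer;
  return {
    start: Math.max(0, first),
    end: Math.min(totalRows, last)
  };
}

// In an Angular controller you would then bind only the slice:
//   $scope.rows = allRows.slice(s.start, s.end);
// so ng-repeat creates watchers for ~40 rows instead of 500.
```

On each scroll event you recompute the slice and reassign $scope.rows; filtering and sorting would be applied to allRows before slicing, which keeps both features working.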

Related

Best practice to force the browser to only render user-visible elements?

A particular page on our site loads with thousands of divs, each about 1000px × ~1500px (a printable page); each div displays additional elements, a basic table, etc., but can vary in height.
Render time can be several minutes depending on PC performance.
Using tools like webix which can load millions of rows proves the render process is taking up most of the loading time, but doesn't work well for non-tabular data.
Using Angular JS to create infinite scroll lists is possible. But this also doesn't work well with varying height elements.
All solutions I have found so far lose the browser's find feature, which our users commonly use, so we will probably have to develop our own search tool.
Yes we could add pagination, or some sort of way of breaking down the data, but users still need to review all the data regardless of how it's broken down.
The same data (10,000 pages, 30 MB), once exported to PDF, loads in less than 1 second.
I think the best solution will be the combination of a few different ideas.

best way to render data in tables in angular

I am trying to render data in table form on the front-end in Angular, and so far I've used ng-grid and ui-grid.
But they seem to slow down performance, as there can be a lot of rows, sometimes 200+ per grid with more than 20 grids on a single page.
So the browser hangs most of the time. I just need to show the data; no operations need to be performed on it, it just needs to be rendered.
So please help if you have any other idea for rendering this much data in grid form.
There are a few options for how to handle this in ui-grid.
There is a flatEntityAccess option in ui-grid that improves the binding performance a bit.
Many people put their grids in tabs, and use an ng-if around the tab so that it only renders the grid if that tab is active.
If you need all the grids in a list down the page, presumably not all the grids are visible to a user at the same time (you couldn't have 20 grids all visible; probably only one or two are visible at a time). You can potentially detect the page scroll and use an ng-if on the grid so that only those that are actually visible are being rendered.
Ultimately the problem is that if you have 10 columns and perhaps 15 rows visible in each grid, that's immediately 150 watchers per grid; if you then have 20 grids, you've got 3,000 watchers on your page, and all of them have to be evaluated whenever you scroll any of the grids. Note that the watchers per grid are correlated to the number of visible rows, the number of visible columns, and the number of grids that are rendered. Ultimately you need to change one of those factors if you want it to go faster.
Lastly, you may want to check version - rc20 or even current unstable is faster than older release candidates.
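The scroll-detection idea above reduces to a simple overlap test; a hedged sketch (the function and parameter names are my own, not a ui-grid API):

```javascript
// Decide whether a grid's bounding box overlaps the viewport, so an
// ng-if can skip rendering off-screen grids entirely. All positions
// are in pixels from the top of the page.
function isGridVisible(gridTop, gridHeight, scrollTop, viewportHeight) {
  var gridBottom = gridTop + gridHeight;
  var viewBottom = scrollTop + viewportHeight;
  // Visible if the two ranges overlap at all
  return gridBottom > scrollTop && gridTop < viewBottom;
}
```

A scroll handler would call this per grid and set a flag on the scope that the ng-if reads, so only one or two grids' watchers exist at any moment.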
Use one-time binding in your grid rows: {{::data.id}}. This will boost your page performance, as the watches are removed after the first binding is done.
But remember: if you make any model changes, you cannot see them in your view when you use one-time binding.
One-time binding was introduced in Angular 1.3, so this only applies if you are using Angular 1.3 or above.
If you are using a version below 1.3 then you probably have to change your design. Rather than showing more than 20 grids, show only one at a time based on a select drop-down.
You can also try the ng-table directive. But even that becomes slow if you are using 20 grids with more than 200 rows each.
Angular applications become slow once they have more than about 2,000 watches. In your case there are 20 × 200 = 4,000 watches, and even more if you have more grids and rows. So better to change your design or use one-time binding.
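As a concrete illustration of the one-time binding advice above (the field names and collection here are made up for the example, not from ui-grid):

```html
<!-- Read-only cells bound once with :: keep no watcher after the first digest -->
<tr ng-repeat="row in rows">
  <td>{{::row.id}}</td>
  <td>{{::row.name}}</td>
  <!-- A cell the user edits still needs a live, two-way binding -->
  <td><input ng-model="row.status"></td>
</tr>
```

Only the editable column keeps contributing to the watcher count, so a 200-row grid drops from hundreds of watchers to roughly one per row.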

ajax data - replacing html vs updating value attributes

An app I am working on has a 50-row × 100-col grid, where each cell is a div containing one textbox, with one outer div containing the entire grid to handle scrolling. Exact dimensions can vary, but that is a typical maximum size. So that works out to about 5,000 divs, each with its own textbox, for a total of 10,000 elements.
The contents of the grid need to be updated via ajax calls to load data for different periods.
My question is, which of these 2 approaches would be more efficient:
1) Have the ajax call return the full formatted HTML for the grid, and simply set the innerHTML of the containing div to the new contents.
2) Have the ajax call return JSON or similar, then use JavaScript/jQuery to loop through and update the value of each grid cell in the existing divs. A few columns might need to be added or deleted in this case, but the number of rows will remain constant.
For smaller grids/tables, returning the complete HTML works great and requires very little client-side JS code, but I have heard about performance issues when manipulating large numbers of DOM elements. With the huge number of attributes and properties associated with each element, I can see how creating/destroying thousands of them could add up to a lot of overhead. So I thought I would ask for advice here before deciding which way to go on this. Thanks.
I've been through the very same situation. For tables of a certain size and complexity, it is much faster to render pregenerated HTML over and over again than to hunt for and update the data, but you wind up with a server communication lag to contend with.
The key here is "manipulate". If you're just painting the grid, it's not very costly in terms of time unless a lot of things fire on completion, like event bindings and so forth. Even those aren't too bad. At some point there is a diminishing return, and single-cell updates become more tolerable.
It really depends on what the users do with the grid. If they are updating many rows / cells, you might want a bulk update function for applying changes to multiple selected rows at once so they don't have to wait for each individual cell change to refresh. This was our solution for large tables as individual cell updates when making lots of changes were too time consuming and reloading the whole table from the server was not too bad if you're only doing it once per batch of updates.
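A framework-free sketch of option 2's update pass (all names here are illustrative): diff the old and new cell values first, so the DOM is only touched where something actually changed.

```javascript
// Given the previous and the freshly fetched cell values (flat arrays
// in row-major order), return only the indices whose value changed.
function changedCells(oldValues, newValues) {
  var changed = [];
  for (var i = 0; i < newValues.length; i++) {
    if (oldValues[i] !== newValues[i]) {
      changed.push(i);
    }
  }
  return changed;
}

// The update loop then writes only to the changed inputs, e.g.:
//   changedCells(prev, next).forEach(function (i) {
//     inputs[i].value = next[i];
//   });
```

When only a fraction of the 10,000 elements change per period, this keeps DOM writes proportional to the change set rather than to the grid size, avoiding the destroy/recreate cost of the innerHTML swap.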

AngularJS Infinite Scrolling with lots of data

So I'm trying to create an infinite scrolling table using AngularJS, similar to this: http://jsfiddle.net/vojtajina/U7Bz9/
The problem I'm having is that in the jsfiddle example, if I keep scrolling until I have a million results, the browser is going to slow to a crawl, isn't it? Because there would now be 1,000,000 results in $scope.items. It would be better if I only ever had, for example, 1000 results at a time inside $scope.items, and the results I was viewing happened to be within those 1000.
Example use case: page loads and I see the first 10 results (out of 1,000,000). Even though I only see 10, the first 1000 results are actually loaded. I then scroll to the very bottom of the list to see the last 10 items. If I scroll back up to the top again, I would expect that the top 10 results will have to be loaded again from the server.
I have a project I did with ExtJS that had a similar situation: an infinite scrolling list with several thousand results in it. The ExtJS way to handle this was to load the current page of results, then pre-load a couple of extra pages of results as well. At any one time, though, there were only ever 10 pages of results stored locally.
So I guess my question is how I would go about implementing this in AngularJS. I know I haven't provided much code, so I'm not expecting people to just give the coded answer, but at least some advice on which direction to go.
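The ExtJS-style bounded cache described above can be sketched as a small paging helper (the names and the page radius are assumptions for illustration):

```javascript
// Keep only the pages within `radius` of the current page and drop the
// rest, so local memory stays bounded no matter how far the user scrolls.
function pagesToKeep(currentPage, radius, totalPages) {
  var pages = [];
  var first = Math.max(0, currentPage - radius);
  var last = Math.min(totalPages - 1, currentPage + radius);
  for (var p = first; p <= last; p++) {
    pages.push(p);
  }
  return pages;
}
```

On each scroll you would fetch any page in this window that isn't cached yet and evict cached pages that fall outside it; scrolling back to the top then triggers a re-fetch of page 0, exactly as the use case describes.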
A couple of things:
"Infinite scrolling" to "1,000,000" rows is likely to have issues regardless of the framework, just because you've created millions and millions of DOM nodes (presuming you have more than one element in each record)
The implementation you're looking at doing with <li ng-repeat="item in items">{{item.foo}}</li> or anything like that will have issues very quickly for one big reason: {{item.foo}} or any ngBind like that will set up a $watch on that field, which creates a lot of overhead in the form of function references, etc. So while 10,000 small objects in an "array" isn't going to be that bad... 10,000-20,000 additional function references across those 10,000 items will be.
What you'd want to do in this case would be create a directive that handles the adding and removing of DOM elements that are "too far" out of view as well as keeping the data up to date. That should mitigate most performance issues you might have.
... good infinite scrolling isn't simple, I'm sorry to say.
I like the angular-ui implementation ui-scroll...
https://github.com/angular-ui/ui-scroll
... over ngInfiniteScroll. The main difference with ui-scroll from a standard infinite scroll is that previous elements are removed when leaving the viewport.
From their readme...
The common way to present to the user a list of data elements of undefined length is to start with a small portion at the top of the list - just enough to fill the space on the page. Additional rows are appended to the bottom of the list as the user scrolls down the list.
The problem with this approach is that even though rows at the top of the list become invisible as they scroll out of the view, they are still a part of the page and still consume resources. As the user scrolls down the list grows and the web app slows down.
This becomes a real problem if the html representing a row has event handlers and/or angular watchers attached. A web app of an average complexity can easily introduce 20 watchers per row. Which for a list of 100 rows gives you total of 2000 watchers and a sluggish app.
Additionally, ui-scroll is actively maintained.
It seems that http://kamilkp.github.io/angular-vs-repeat would be what you are looking for. It is a virtual scrolling directive.
So it turns out that the ng-grid module for AngularJS has pretty much exactly what I needed. If you look at the examples page, the Server-Side Processing Example is also an infinite scrolling list that only loads the data that is needed.
Thanks to those who commented and answered anyway.
Latest URL : ng-grid
You may try using ng-infinite-scroll :
http://binarymuse.github.io/ngInfiniteScroll/
Check out virtualRepeat from Angular Material
It implements dynamic reuse of the rows visible in the viewport area, just like ui-scroll.

Can anyone recommend a well performing interface to allow the user to organize a large number of items in HTML?

Currently for "group" management you can click the name of the group in a list of available groups and it will take you to a page with two side by side multi-select list boxes. Between the two list boxes are Add and Remove buttons. You select all the "users" from the left list and click 'Add' and they will appear in the right list, and vice versa. This works fairly well for a small amount of data.
The problem arises when you start having thousands of users. Not only is it difficult and time-consuming to search through (despite having a 'filter' at the top that narrows results based on a string), but you will eventually reach a point where your computer's power can no longer keep up with the number of list items, and the whole browser starts to lag horrendously.
Is there a better interface idea for managing this? Or are there any well known tricks to make it perform better and/or be easier to use when there are many 'items' in the lists?
Implement an Ajax function that hooks on keydown and checks the characters the user has typed into the search/filter box so far (server-side). When the search results drop below 50, push those to the browser for display.
Alternatively, you can use a jQuery UI Autocomplete plugin, and set the minimum number of characters to 3 to trigger the search. This will limit the number of list items that are pushed to the browser.
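A rough sketch of the "only push results when the set is small" idea described above (the user list, the 3-character minimum, and the 50-row threshold are illustrative assumptions, not from jQuery UI):

```javascript
// Return matches only once the query is specific enough and the result
// set is small enough to render cheaply; otherwise return nothing and
// let the UI keep prompting the user to narrow the search.
function filterUsers(users, query, maxResults) {
  if (query.length < 3) return [];            // mirror the 3-character minimum
  var q = query.toLowerCase();
  var matches = users.filter(function (u) {
    return u.toLowerCase().indexOf(q) !== -1;
  });
  return matches.length <= maxResults ? matches : [];
}
```

Server-side you would run the same check against the database and only ship the rows to the browser when the count drops below the threshold, keeping the list boxes small at all times.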
I would get away from using the native list box in your browser and implement a solution in HTML/CSS using lists or tables (depending on your needs). Then you can use JavaScript and AJAX to pull only the subset of data you need. Watch the user's actions and pull the next 50 records before they actually get to them. That will give the illusion of all of the records being loaded at runtime.
The iPhone does this kind of thing to preserve memory for its TableViews. I would take that idea and apply it to your case.
I'd say you hit the nail on the head with the word 'filter'. I'm not the hugest fan of parallel multi-selects like what you are describing, but that is almost beside the point: whatever UX element you use, you are going to run into a problem given thousands of items. Thus, filtering. Filtering with a search string is a fine solution, but I suspect searching by name is not the fastest way to get to the users that the admin wants here. What else do you know about the users? How are they grouped?
For example, if these users were students at a high school, we would know some metadata about them: What grade are they in? How old are they? What stream of studies are they in? What is their GPA? Providing filtering on these pieces of metadata is one way of limiting the number of students available at a time. If you have too many to start with, and it is causing performance problems, consider just limiting them, have a button to load more, and only show 100 at a time.
Update: the last point here is essentially what Jesse is proposing below.
