Suppose a web page located at /items that lists all the available items.
When the user clicks on an item, the browser shows /items/:itemId and presents that item's details.
Of course, the items list could be very large.
That's why I'm using the infinite scroll feature from Ionic (the Angular-based framework I'm using).
However, as many developers know, handling the "back button" from /items/:itemId to /items is tricky, since the Angular controller is destroyed during the page transition and then rebuilt when the list page is loaded again.
This loses both the scroll position and the exact set of items that had been loaded before the user clicked through to an item's detail.
I have thought about some solutions, but I hesitate, since each has a drawback:
Storing the actual loaded items (the complete items) in $cacheFactory (Angular's cache) or localStorage just before the item detail page is shown.
When the back button is pressed on the detail page, the /items controller can initialize its data from the cache, and the scroll position can then easily be "remembered".
The drawback is that the cached data may be stale...
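A minimal sketch of this first approach, assuming an Ionic/Angular 1.x app (the module name, cache keys, and service name are all illustrative):

    angular.module('app').factory('listState', function ($cacheFactory) {
      var cache = $cacheFactory('itemsList');   // in-memory, lives as long as the app
      return {
        save: function (items, scrollTop) {
          cache.put('items', items);
          cache.put('scrollTop', scrollTop);
        },
        restore: function () {
          return {
            items: cache.get('items') || [],
            scrollTop: cache.get('scrollTop') || 0
          };
        }
      };
    });

The /items controller would call listState.save($scope.items, currentOffset) before navigating to the detail, restore() on init, and then re-apply the offset with something like $ionicScrollDelegate.scrollTo(0, state.scrollTop) once the list has rendered.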
"No need to store any loaded items in cache!" Just store in cache or localStorage the actual number of chunks loaded.
For instance, my REST API allows to retrieve items 10 by 10.
If the user's was loaded up to the second chunk (up to 20 elements so), a variable in cache would contain this value and the controller could then be init with all the 20 items initially.
Drawback is that it would involve several requests to server. Indeed, if the user loaded 30 chunks, the controller would need to make 30 calls to server...
I could customize the size of chunk processed by the server, but one item is heavy (several long texts etc.. lists of anything), explaining why I limited to a relatively small number.
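If the server can accept a variable page size, the restore could at least be collapsed into a single request. A sketch, assuming the API takes offset/limit query parameters (the endpoint and parameter names are hypothetical):

    // Remember only how many chunks were loaded...
    var chunks = parseInt(localStorage.getItem('chunksLoaded') || '1', 10);

    // ...then restore everything the user had seen in one call.
    $http.get('/api/items', { params: { offset: 0, limit: chunks * 10 } })
      .then(function (response) {
        $scope.items = response.data;
      });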
Using bidirectional scrolling (top and bottom), so that there are ALWAYS only 10 items in the DOM. I would just need to store the number of the last chunk loaded in order to reload it.
Drawback: I would have to write the directive myself (which is feasible), since no such directive currently exists for Angular or Ionic.
And I imagine bidirectional scrolling is only useful for very, very large lists, say more than 2,000 rows, in order to lighten the DOM.
I don't expect more than 400 rows in the majority of my cases...
What is a good practice? Maybe another solution?
This is kind of a cheat, but what about using a Modal (http://ionicframework.com/docs/api/service/$ionicModal/) to render the detail?
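A minimal sketch of that idea using the $ionicModal service (the template path and scope property names are illustrative):

    // Render the detail in a modal so the list controller, its data and its
    // scroll position all survive; the list stays in the DOM behind the modal.
    $ionicModal.fromTemplateUrl('templates/item-detail.html', {
      scope: $scope
    }).then(function (modal) {
      $scope.detailModal = modal;
    });

    $scope.showDetail = function (item) {
      $scope.selectedItem = item;   // the modal's template binds to this
      $scope.detailModal.show();
    };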
Working with a Sencha Touch application, I am having the following problem, which I think has been a bug for a long time.
When the user filters from a searchfield and only 1 or 2 records are found, the iPad screen sometimes shows no items: it stays blank, and a slight scroll movement is needed to make the values appear.
Does anyone know this bug, and is there a solution?
Thank you.
If I'm correct, in order to reproduce the problem you have to open a long, paged list, swipe down until you are at least on the second page, then call store.setFilter(...) and store.load() on the list's store.
The calls to setFilter and load do not reset the page number to 1, so you are asking for the second (or third) page of the filtered data.
If your server side works like mine, you'll get an empty result set with a total count of 1 or 2 records.
You can track the problem down yourself by debugging from the Chrome dev tools (or whatever suits you best) and checking the start, limit and page parameters included in the data request issued by the offending store load.
One possible, simple solution is to call store.loadPage(1) instead of load().
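Mirroring the calls from the question, the fix looks like this:

    // store.load() after a filter still requests whatever page the list was on:
    store.setFilter(myFilterFn);
    store.load();        // sends page=2 (or 3), so the result set comes back empty

    // Resetting the paging explicitly avoids the blank screen:
    store.setFilter(myFilterFn);
    store.loadPage(1);   // sends page=1/start=0 along with the filtered request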
Good luck!
There is a similar question here, but it does not appear to apply to my circumstances.
I'm building an app using Angular-Material. I have tabular data of about 5,000 rows, and I'm loading them in to a virtual-repeat container.
If you're unfamiliar: in short, this limits the rendered rows to as many as fit in the viewport, and dynamically loads data in and out as the user scrolls, dramatically decreasing the page load. It's fantastic! (I've gone from a ~30s load time to under 1s.)
However, I have a cell in each row that pulls in an image from Facebook, looking like:
<div style="background-image:url(https://graph.facebook.com/120945717945722/picture?width=200&height=200&)" class="avatar"></div>
When I scroll down the page these images load fine, but as rows are removed from the DOM it seems the images are not cached; when I scroll up and down the table, they load from scratch again and again.
How can I ensure that they're downloaded once and then cached properly?
To clarify: each repeated item has a different image, but the image for a given item doesn't change (they all come from the Facebook avatar associated with a particular organisation).
It seems that, today, accessing the picture through the Graph API gives you a pointer to the "current" resource. So the URL in your style rule is a method for retrieving the current picture. The Graph URL is a no-cache URL because it redirects to the current resource (i.e. it isn't the image itself).
So it seems difficult for the browser to cache it without forcing the assumption that the image has not changed.
Depending on your situation, proxying the request through your app server would give you some cache control. You could forgo the cache-header manipulation and use the proxy in conjunction with Angular's $cacheFactory to keep those objects in memory for the lifetime of the app, or until you remove them from the cache you create. Definitely use caution when caching into memory, given the amount of data you can potentially have.
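A sketch of the $cacheFactory variant (the proxy endpoint is hypothetical; your server would fetch and return the Facebook image there):

    var avatars = $cacheFactory('avatars');

    // Fetch each avatar once per app lifetime; $http replays the cached
    // response on subsequent calls instead of hitting the network.
    function avatarUrl(orgId) {
      return $http.get('/proxy/fb-avatar/' + orgId, {
        cache: avatars,
        responseType: 'blob'
      }).then(function (res) {
        return URL.createObjectURL(res.data);  // a stable local URL for the div
      });
    }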
I am developing a gallery for huge pictures, sounds and videos (= items). The page shows a lot of galleries, and each gallery contains a lot of items. Each item is pulled via Ajax at the moment the user asks for it (by clicking "next", for example). At the moment I am caching all viewed items for the lifetime of the page in an extra div container, so that no item needs to be requested via Ajax a second time. But that is only a small speed advantage, if even a noticeable one.
I would love to do two things:
The user views the first gallery, and every item he has viewed is cached. While he is idle, perhaps watching a video or the current image, I want my script to load all the other items of the page, one after the other, and move them into the cache. But as soon as the user clicks to view the next item, this caching process needs to pause; it has priority 2. Once the requests the user actually asked for are finished, my caching process should continue. How can I achieve this?
And how can I make the caching stop once, say, 25 MB of cached items have been received? How can I see the RAM used by a div containing the items?
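JavaScript cannot read the RAM occupied by a div, but you can approximate it by summing the byte size of the responses you cache. A sketch of a pausable prefetch queue (all names are illustrative):

    var cache = {};
    var queue = [];                  // URLs of items not yet cached
    var paused = false;
    var bytesCached = 0;
    var LIMIT = 25 * 1024 * 1024;    // stop around 25 MB of payload

    function prefetchNext() {
      if (paused || queue.length === 0 || bytesCached >= LIMIT) return;
      var url = queue.shift();
      var xhr = new XMLHttpRequest();
      xhr.open('GET', url);
      xhr.responseType = 'blob';
      xhr.onload = function () {
        bytesCached += xhr.response.size;          // approximate, payload only
        cache[url] = URL.createObjectURL(xhr.response);
        prefetchNext();                             // one request at a time
      };
      xhr.send();
    }

    function pause()  { paused = true; }            // call on user interaction
    function resume() { paused = false; prefetchNext(); }

Because only one request is in flight at a time, calling pause() when the user clicks effectively gives his request priority, and resume() picks the queue back up once the page is idle again.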
Hopefully I have explained what I want, and somebody has good, understandable ideas for me. :)
Best!
Falk
I've got an Angular app integrated with the UI Bootstrap project, and I'm using a regular dropdown inside a modal dialog.
With a dropdown containing 750 records, when one of the items is selected and "Ok" or "Cancel" is clicked, the modal and the overlay fade out without any delay.
Here's the plunker: Modal dialog with 750 records
With a dropdown containing around 10k+ records, when one of the items is selected from the list, clicking "Ok" or "Cancel" does not hide the modal dialog right away; instead there is an 8-10 second delay on Chrome. I've not tested on IE yet.
Here's the plunker: Modal dialog with 10k+ records
Question: why am I taking a performance hit with more data?
You are slowing the entire browser down by grabbing the DOM by the neck and pouring 10,000 <option> nodes down its throat. You need to lazy load your data somehow. Ever noticed on sites like Twitter and Facebook that when you scroll to the bottom of the page they begin loading more records from the server? Good apps will also garbage collect old records that have been scrolled out of view.
When you scroll through your Facebook news feed, it's not loading all your friends' posts since 2007 into the browser at the same time. Once a maximum number of posts exists in the DOM, Facebook starts removing the oldest ones you scrolled past to make room for more, and grabs fresh posts from the server so you can continue scrolling. You can even see your browser scroll bar jump up as you scroll down, because more posts are being added to the DOM.
No browser is going to be able to handle that much data. The browser is not a database. I'm amazed your plunker with 10k records is as performant as it is! Haha. A dropdown is not the way to display that data. You're going to have to sit down and think of a better way to show it to the user. My first thought is to provide a filterable list that initially contains, say, the top 25 most selected options, where typing in a search field loads a new list from the server matching the search criteria. Only you will know what your users actually want, but I assure you it's not a dropdown list with 10k+ options.
Example:
Notice how the browser scroll bar jumps up a bit when it gets to the bottom: Twitter reaches the bottom and then loads more data to scroll through. It will eventually start cleaning up data at the top of the page as well, if I scroll far enough.
Modern browsers can handle a lot, but 10,000+ <option> nodes is pushing it overboard.
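A sketch of that load-more-on-scroll pattern ("/api/records" and appendRecords are hypothetical):

    var page = 1, loading = false;

    $(window).on('scroll', function () {
      var nearBottom = $(window).scrollTop() + $(window).height()
                     >= $(document).height() - 200;   // within 200px of the end
      if (nearBottom && !loading) {
        loading = true;
        $.getJSON('/api/records', { page: ++page }, function (records) {
          appendRecords(records);   // render the batch; optionally drop old rows
          loading = false;
        });
      }
    });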
The browser can handle a large number of values in a dropdown list, but a dropdown isn't meant for such a task. Not to mention that users will have a hard time selecting the appropriate value, even if you sort the options alphabetically.
You would be much better off using a text input instead of a dropdown.
jQueryUI has some nice autocomplete features that would not only improve the performance of your web application but also make the user experience much more bearable. I would any day prefer typing out one of the 10,000 options offered to me over hunting for it in a dropdown with a mouse.
Here's an example on jsfiddle with ~8.5k records for a performance test.
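For reference, a remote-source autocomplete along these lines might look like this (the endpoint and field names are assumptions):

    // Let the server do the filtering instead of stuffing 10k nodes in the DOM.
    $('#item-search').autocomplete({
      minLength: 2,                        // wait for 2 characters before querying
      source: function (request, response) {
        $.getJSON('/api/options', { term: request.term }, response);
      },
      select: function (event, ui) {
        $('#item-id').val(ui.item.value);  // keep the chosen id in a hidden field
      }
    });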
Let me quickly make a few points:
It is a usability bug to make someone scroll through 10K records. Imagine a user going through 10K options to select the one they want; not a good idea.
Performance issue:
If the options were rendered from the back-end in a traditional (non-Angular) way, the page would just take time to load, but after that the performance wouldn't be such an issue.
Since you are using AngularJS with ng-options, the options are populated on the front-end and all the data sits in Angular's scope. To perform two-way binding, Angular does dirty-checking on every 'digest cycle', which loops through each and every data element in $scope and causes that delay.
Solution:
Use Select2's "Loading Remote Data". Select2 is a jQuery-based replacement for select boxes.
Consider using AngularUI's Select2 wrapper instead of using it directly.
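Roughly what the "Loading Remote Data" setup looks like in the Select2 3.x versions the AngularUI wrapper targets (the URL and response shape are assumptions):

    $('#items').select2({
      minimumInputLength: 2,        // don't hit the server until 2 chars typed
      ajax: {
        url: '/api/items/search',
        dataType: 'json',
        data: function (term, page) {
          return { q: term, page: page };      // query params for the server
        },
        results: function (data, page) {
          // Select2 expects {id, text} objects plus a "more" paging flag.
          return { results: data.items, more: data.more };
        }
      }
    });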
I have a webpage which contains a table for displaying a large amount of data (on average from 2,000 to 10,000 rows).
This page takes a long time to load/render, which is understandable.
The problem is that while the page is loading, the PC's memory usage skyrockets (500 MB in use by Internet Explorer on my test system) and the whole PC grinds to a halt until it has finished, which can take a minute or two. IE hangs until it is complete, and switching to another running program is just as bad.
I need to fix this, and ideally I want to accomplish two things:
1) Load individual parts of the page separately, so the page can render initially without the large data table. A loading div will be placed there until it is ready.
2) Don't use up so much memory or local resources while rendering, so that users can at least use a different tab/application at the same time.
How would I go about doing both, or either, of these?
I'm an applications programmer by trade, so I am still a little fuzzy on what I can do in a web environment.
Cheers all.
Regarding the first part, it's called Ajax: display the page without the table (or with an empty table), then use Ajax requests to fetch the data (as HTML or any other data format) and display it.
Regarding the second part, you want something called lazy loading: loading data only when the user needs it, i.e. when it is in the visible part of the document. You can look at this question for a DataGrid library capable of handling millions of rows.
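A sketch of the Ajax part ("/data/rows" and the row fields are hypothetical): the page ships with a placeholder div, and the table is filled in after the initial render.

    $(function () {
      $.getJSON('/data/rows', function (rows) {
        var html = $.map(rows.slice(0, 100), function (r) {  // first slice only
          return '<tr><td>' + r.name + '</td><td>' + r.value + '</td></tr>';
        }).join('');
        $('#grid tbody').html(html);
        $('#grid-loading').hide();   // the loading div shown in the meantime
      });
    });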
Two basic options:
Pagination
Lazy loading (load as the user scrolls down). See this jQuery plugin.
You could try a couple of things:
Loading data asynchronously
and
Paging