There is a similar question here, but it does not appear to apply to my circumstances.
I'm building an app using Angular-Material. I have tabular data of about 5,000 rows, which I'm loading into a virtual-repeat container.
If you're unfamiliar: in short, this limits the rendered rows to however many fit in the viewport, and dynamically loads data in and out as the user scrolls, dramatically decreasing page load time. It's fantastic! (I've gone from a ~30s load time to under 1s.)
However, I have a cell in each row that pulls in an image from Facebook, looking like:
<div style="background-image:url(https://graph.facebook.com/120945717945722/picture?width=200&height=200&)" class="avatar"></div>
When I scroll down the page these images load fine, but as they're removed from the DOM it seems that they're not cached; when I scroll up and down the repeated table, the images load from scratch again and again.
How can I ensure that they're downloaded once and then cached properly?
To clarify, each repeated item has a different image, but the image for each item doesn't change (they all come from the Facebook avatar associated with a particular organisation).
It seems that, today, accessing the picture through the Graph API gives you a pointer to the "current" resource rather than the image itself. So the URL in your style rule is a method for retrieving whatever the current picture is. The Graph URL is effectively a no-cache URL because it redirects to the current resource (i.e. it isn't the resource itself).
So it seems difficult to get the browser to cache it without forcing the assumption that the image has not changed.
Depending on your situation, proxying the request through your app server would give you some cache control. You could forgo the cache-header manipulation and use the proxy in conjunction with Angular's $cacheFactory to keep those objects in memory for the life of the app, or until you remove them from the cache you create. Definitely use caution when caching into memory, given the amount of data you can potentially have.
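A minimal sketch of that idea, assuming a hypothetical /proxy/avatar/:id endpoint on your app server (the service name and response shape are invented for illustration, not part of your setup):

// Hedged sketch: resolve an organisation id to an avatar URL through your
// own proxy, and keep the result in memory with $cacheFactory so repeated
// virtual-repeat renders never re-request it.
angular.module('app').factory('avatarService', function ($http, $q, $cacheFactory) {
  // Cap the cache so 5,000 rows can't grow memory without bound.
  var cache = $cacheFactory('avatars', { capacity: 500 });

  return {
    getAvatarUrl: function (orgId) {
      var cached = cache.get(orgId);
      if (cached) {
        return $q.when(cached); // already fetched once; serve from memory
      }
      return $http.get('/proxy/avatar/' + orgId).then(function (response) {
        cache.put(orgId, response.data.url);
        return response.data.url;
      });
    }
  };
});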
The problem of making a grid into a buffered grid is not resolved. I have applied all of the techniques being suggested, but the problem in the grid still exists. How can I apply the BufferedRenderer plugin or buffered-grid behaviour to my existing ExtJS grid, which loads about 20,000 records?
The only thing I can tell you to try is a buffered store: try setting buffered: true on your store.
That allows the Store to prefetch pages of Records and cache them in a page cache, and then satisfy loading requirements from this page cache.
To use buffered Stores, initiate the process by loading the first page. The number of rows rendered is determined automatically, and the range of pages needed to keep the cache primed for scrolling is requested and cached. Example:
myStore.loadPage(1); // Load page 1
A BufferedRenderer is instantiated which will monitor the scrolling in the grid, and refresh the view's rows from the page cache as needed. It will also pull new data into the page cache when scrolling of the view draws upon data near either end of the prefetched data.
The margins which trigger view refreshing from the prefetched data are Ext.grid.plugin.BufferedRenderer.numFromEdge, Ext.grid.plugin.BufferedRenderer.leadingBufferZone and Ext.grid.plugin.BufferedRenderer.trailingBufferZone.
The margins which trigger loading more data into the page cache are leadingBufferZone and trailingBufferZone.
By default, only 5 pages of data (in addition to the pages which cover the visible region) are cached in the page cache, with old pages being evicted from the cache as the view moves down through the dataset. This is controlled by the purgePageCount setting.
Setting this value to zero means that no pages are ever scrolled out of the page cache, and that eventually the whole dataset may become present in the page cache. This is sometimes desirable as long as datasets do not reach astronomical proportions.
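Putting those pieces together, a hedged sketch for ExtJS 4.2 (the model name, proxy URL and buffer-zone sizes are placeholders, not taken from your grid):

// Sketch: a buffered store feeding a grid through the bufferedrenderer plugin.
var myStore = Ext.create('Ext.data.Store', {
    model: 'MyModel',
    buffered: true,      // enable the page cache
    pageSize: 100,       // rows fetched per request
    purgePageCount: 5,   // pages kept beyond the visible region (0 = keep all)
    proxy: {
        type: 'ajax',
        url: '/records',
        reader: { type: 'json', root: 'rows', totalProperty: 'total' }
    }
});

Ext.create('Ext.grid.Panel', {
    store: myStore,
    plugins: [{ ptype: 'bufferedrenderer', leadingBufferZone: 50, trailingBufferZone: 20 }],
    columns: [ /* your columns */ ],
    renderTo: Ext.getBody()
});

myStore.loadPage(1); // kick off buffering by loading the first page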
I had the same problem on ExtJS 5 and solved it with a buffered store, but in your version of Ext there isn't a specific class like in mine, so maybe you can do it with this property.
Here you can read about buffered stores in ExtJS 5: http://docs.sencha.com/extjs/5.1/5.1.2-apidocs/#!/api/Ext.data.BufferedStore
My problem is rather simple, but I haven't found a simple solution for it.
I would like to keep one div element from reloading while navigating to other pages, so this div would stay in the same spot and not refresh, even as I go from page to page on my site. The div holds a SoundCloud player, and the idea is to keep it playing the same song while navigating through other pages. The point is, everything else should be able to reload; only this one div should not.
How do I do this in practice?
The only way to accomplish something like that would be to load the pages via ajax, instead of a full round trip to the server.
If you reload the whole page, then you reload the whole page and you can't keep part of it.
You can either:
Use frames (obsolete) to display two pages at once
Only ever display one page, but use XMLHttpRequest to fetch new data from the server, use the DOM to change the content of the page (leaving that one div alone), and use the History API (pushState and friends) to map the changes you make with JS to real URLs. Those URLs, when requested directly, should cause the server to serve a page matching the one you built with your client-side modifications. (There is a rough sketch of this below.)
You might also want to look into single-page-app frameworks, like Angular, to help you quickly establish the kind of front-end functionality you're looking for.
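As a hedged sketch of the XMLHttpRequest/History API option (written with fetch for brevity, though the XMLHttpRequest version works the same way; the #content id and data-internal attribute are inventions for the example):

// Intercept internal links, swap only the main content area, and keep the
// URL in sync with pushState; the player's div is never touched, so it
// keeps playing across "page" changes.
document.addEventListener('click', function (e) {
  var link = e.target.closest('a[data-internal]');
  if (!link) return;
  e.preventDefault();
  loadPage(link.href, true);
});

function loadPage(url, push) {
  fetch(url)
    .then(function (res) { return res.text(); })
    .then(function (html) {
      var doc = new DOMParser().parseFromString(html, 'text/html');
      document.getElementById('content').innerHTML =
        doc.getElementById('content').innerHTML;
      if (push) history.pushState({ url: url }, '', url);
    });
}

// Handle the back/forward buttons without a full reload.
window.addEventListener('popstate', function (e) {
  if (e.state && e.state.url) loadPage(e.state.url, false);
});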
Let's suppose a web page located at /items, listing all the available items.
When the user clicks on one item, the browser shows /items/:itemId and presents the item's details.
Of course, the items list could be very large.
That's why I'm using the scroll infinite feature from Ionic (Framework I'm using based on Angular).
However, as many developers know, handling the "back button" from /items/:itemId back to /items is tricky, since the Angular controller is destroyed during the page transition and then rebuilt when the list page is loaded again,
therefore losing the scroll position and the exact set of items already loaded before clicking through to an item's detail.
I have thought about some solutions, but I hesitate since each has a drawback:
Storing the actual loaded items (the complete items) in $cacheFactory (Angular's cache) or localStorage just before the item detail is about to be shown.
Thus, when the back button is pressed (on the detail page), the /items controller can init its data from the cache, and besides, the scroll position could then easily be "remembered".
Drawback is that the data in cache may be stale...
"No need to store any loaded items in cache!" Just store in cache or localStorage the actual number of chunks loaded.
For instance, my REST API allows retrieving items 10 by 10.
If the user's list was loaded up to the second chunk (up to 20 elements, that is), a variable in cache would contain this value, and the controller could then be initialized with all 20 items.
Drawback is that it would involve several requests to the server. Indeed, if the user had loaded 30 chunks, the controller would need to make 30 calls to the server...
I could customize the size of chunk processed by the server, but one item is heavy (several long texts, lists of anything, etc.), which explains why I limited it to a relatively small number.
Using bidirectional scrolling (top and bottom), so that there are ALWAYS only 10 items in the DOM. I would just need to store the number of the last chunk already loaded, in order to reload it.
Drawback: I would have to write the directive myself (possible, though), since no such directive currently exists in Angular (nor in Ionic).
And I imagine that bidirectional is useful only when dealing with very very large list: more than 2000 rows, in order to lighten the DOM.
I don't expect more than 400 rows in a majority of my cases...
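For what it's worth, a hedged sketch of the second option, remembering only the chunk count and scroll offset (the Items service, its loadChunks method, the storage key and the .item-list selector are all placeholders invented for the example):

// On leaving the list, store how far the user got; on returning, replay it.
angular.module('app').controller('ItemsCtrl', function ($scope, $window, $location, Items) {
  var saved = angular.fromJson($window.sessionStorage.getItem('itemsState')) ||
              { chunks: 1, scrollTop: 0 };

  // Re-fetch every chunk up to the saved one (this is the several-requests drawback).
  Items.loadChunks(saved.chunks).then(function (items) {
    $scope.items = items;
    $scope.loadedChunks = saved.chunks;
    // Restore saved.scrollTop here once the list has rendered.
  });

  $scope.openItem = function (item) {
    $window.sessionStorage.setItem('itemsState', angular.toJson({
      chunks: $scope.loadedChunks,
      scrollTop: document.querySelector('.item-list').scrollTop
    }));
    $location.path('/items/' + item.id);
  };
});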
What is a good practice? Maybe another solution?
This is kind of a cheat, but what about using a Modal (http://ionicframework.com/docs/api/service/$ionicModal/) to render the detail?
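If that fits, a minimal hedged sketch with $ionicModal, assuming it is injected into the list's controller and that 'item-detail.html' is your detail template. Since the list controller is never destroyed, its scroll position and loaded items survive for free:

// Open the detail in a modal over the list instead of navigating away.
$scope.showDetail = function (item) {
  $scope.selectedItem = item;
  $ionicModal.fromTemplateUrl('item-detail.html', {
    scope: $scope,
    animation: 'slide-in-up'
  }).then(function (modal) {
    $scope.detailModal = modal;
    modal.show();
  });
};

$scope.closeDetail = function () {
  $scope.detailModal.hide();
  $scope.detailModal.remove(); // free the modal when done
};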
Thank you in advance to anyone who attempts to help me with this.
I have a form to which I am adding checkboxes via appendChild(), as selections for the user to choose from, based on a bunch of criteria.
When the user checks any of these boxes, clicks the 'continue' button to post the selection to another page, and then clicks the back button, the checkboxes the user checked have been forgotten by the browser (they are no longer checked).
If I use PHP to write the checkboxes, or simply have static checkboxes, then after the same sequence (check boxes, click 'continue', click back) the selected checkboxes are remembered (still checked).
My question is:
Why does the browser forget the selections the user made when I create the checkboxes with appendChild(),
yet the same browser will remember the selections the user made when using static checkboxes?
What is it about appendChild() that prevents the same browser from remembering the checked selections?
<div id="mydiv">here is where the checkboxes are going</div>
<script type="text/javascript">
var newInput = document.createElement("input");
newInput.id = "mycheckboxid";
newInput.name = "mycheckboxname";
newInput.type = "checkbox";
document.getElementById('mydiv').appendChild(newInput);
</script>
The browser may "forget" dynamic changes to the DOM because different browsers use different strategies for caching web pages. When you hit the back button, the idea is that the browser can display its cached copy rather than re-request the page from the original web server.
It can accomplish this in (at least) two ways:
The browser caches the DOM itself of a page upon leaving it. Upon revisit (forward or back) dynamic changes will persist.
The browser caches only the original HTML of the page at load time (prior to any dynamic changes). This has the effect of losing those dynamic changes--further modification to the DOM with appendChild() or innerHTML is not recorded.
Note that some browsers additionally keep modified form data, and others do not. If your goal is 99+% compatibility across all browsers, then you have some work to do.
To work around this you need to persist the state somehow. You have a few options:
Save data about the modifications to the page to localStorage (see the sketch after this list). Use a key that is generated randomly on first page load and then kept in the page, so that the state changes will only apply to that instance of the page. On page load, if this key already exists, read the change data out and re-apply the changes. Older browsers do not support local storage.
Do the prior thing with cookies. I don't recommend this, as it has the drawback of proliferating cookies. Cookies are sent and received in every request (including ajax ones), so you would be bloating the data being transmitted on every request. Old browsers would work fine with this.
Abandon your dynamic change model and make the changes occur through a post to the server. Then the page will contain the modified html when pulled from the browser's cache. You probably don't want this, but I thought I'd point it out for completeness' sake.
Save data about the modifications to the page via ajax behind the scenes to the server. This is not the same as actually round-tripping each change like the previous item: you still make changes dynamically, but you post an "advisement" file to the server, and on each page load you request any adjustment data back. This is similar to the first suggestion, but using the remote server as your storage. It does create extra network traffic on each page load (plus the advisement data itself, which would not normally be sent), though the traffic can be minimal since it concerns just this page. A clever system architecture, however, could use this information to persist a user's unsubmitted form data across computers and over time in a way that could be very handy (let's say your user does 199 out of a 200-question survey and then his power goes out; with this scheme he has a chance of painlessly continuing later exactly where he left off!).
Make your Continue button open a new browser window, preserving the original page intact.
Make your Continue button post the data without leaving the page, preserving it intact. You could do a simple lightbox-style overlay.
If the lightbox-style overlay will not work but you really have to display a new page and don't want it to be in a new window, then redesign your site to work similarly to Gmail: pages change only through JavaScript, using #hash tags at the end of URLs to control behavior. This can be difficult, but there are libraries out there that can accomplish it. (For some browsers one has to resort to polling to see if the hashtag has changed.)
The basic idea is that when you click a link that points to the same page but has a tag on it, such as #about, the browser will initiate a page-load event and push a new context onto the history forward/back stack, but will not actually load a new page. You then parse the updated URL for the hash code (which maps to some kind of command) and carry it out. Through careful choice of the proper hash codes for each link, you can hide and display the appropriate content dynamically through JavaScript, and it will appear as if the person is navigating around a real web site.
The reason you do all this is that, because the page never loads, you not only can maintain state in JavaScript, you can maintain your DOM state as well: you simply hide the page that was modified by the user, and when the back event occurs that means to visit that page again, you display it, and it is exactly how the user left it.
Advantage: if your site requires JavaScript to operate, then you are not taking a risk by using even more JavaScript to accomplish this. Disadvantage: completely changing the architecture of your site is a LOT of work and can be difficult to get working on older browsers. To get started with this see Unique URLs. You might try out the jQuery hashchange plugin. If your web site has wide distribution you will want to be sure to address search engine optimization and web usability issues. You may want to see this SO page on detecting back button hash changes.
Use the same strategy as in the prior point but instead of doing it with hashtags, use the new HTML5 history.pushState() and history.replaceState() methods--see Mozilla browser history.
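As a hedged sketch of the first option applied to your checkboxes (the storage key and the 'continue' button id are placeholders, and it assumes the dynamic boxes have already been recreated before the restore runs):

// Save the checked state of the dynamic checkboxes before leaving the page.
function saveCheckboxState() {
  var state = {};
  var boxes = document.querySelectorAll('#mydiv input[type=checkbox]');
  for (var i = 0; i < boxes.length; i++) {
    state[boxes[i].name] = boxes[i].checked;
  }
  localStorage.setItem('checkboxState', JSON.stringify(state));
}

// Re-apply the saved state after the boxes exist again.
function restoreCheckboxState() {
  var state = JSON.parse(localStorage.getItem('checkboxState') || '{}');
  for (var name in state) {
    var box = document.querySelector('#mydiv input[name="' + name + '"]');
    if (box) box.checked = state[name];
  }
}

document.getElementById('continue').addEventListener('click', saveCheckboxState);
// 'pageshow' also fires when the page is restored from the back/forward cache.
window.addEventListener('pageshow', restoreCheckboxState);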
If your goal is not 99% compatibility across 99% of the browsers in use, then please let us know what you are aiming at, as there may be a shortcut possible.
Update: added an option #8
Scripting pages doesn't stop at changing what is displayed; it includes state management.
This means scripted state changes, such as scripted page transitions (pages that navigate internally), content panes, popover menus, style changes and, of course, form input and selections, are all the responsibility of the scripter.
So, in answer to "why": it is because you did not manage the page state you scripted.
If you want your page to work as you seem to expect, you can manage the page-state changes you script yourself, use a JS library that manages page (or, in your case, form) state, or use HTTP(S) client/server state management and load the session state (in your case, just the form state) at the server.
I have a webpage which contains a table for displaying a large amount of data (on average from 2,000 to 10,000 rows).
This page takes a long time to load/render. Which is understandable.
The problem is, while the page is loading, the PC's memory usage skyrockets (500 MB in use by iexplore on my test system) and the whole PC grinds to a halt until it has finished, which can take a minute or two. IE hangs until it is complete, and switching to another running program is just as bad.
I need to fix this, and ideally I want to accomplish 2 things:
1) Load individual parts of the page separately, so the page can render initially without the large data table. A loading div will be placed there until it is ready.
2) Don't use up so much memory or local resources while rendering, so the user can at least use a different tab/application at the same time.
How would I go about doing both or either of these?
I'm an applications programmer by trade, so I am still a little fuzzy on what I can do in a web environment.
Cheers all.
Regarding the first part, it's called Ajax: display the page without the table (or with an empty table), then use Ajax requests to fetch the data (as HTML or any other data format) and display it.
Regarding the second part, you want something called lazy loading: loading data only when the user needs it, i.e. when it's in the visible part of the document. You can look at this question for a DataGrid library capable of handling millions of rows.
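A hedged sketch of the Ajax idea, with batching thrown in so rendering never freezes the tab (the endpoint, element ids and row fields are placeholders):

// Render the page with a 'loading' placeholder, fetch the rows
// asynchronously, then append them in small batches.
function loadTable() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/data/rows', true);
  xhr.onload = function () {
    document.getElementById('loading').style.display = 'none';
    appendBatch(JSON.parse(xhr.responseText), 0);
  };
  xhr.send();
}

// Append 200 rows at a time, yielding to the browser between batches so
// the UI stays responsive while thousands of rows render.
function appendBatch(rows, start) {
  var tbody = document.getElementById('data-table').tBodies[0];
  var fragment = document.createDocumentFragment();
  var end = Math.min(start + 200, rows.length);
  for (var i = start; i < end; i++) {
    var tr = document.createElement('tr');
    tr.innerHTML = '<td>' + rows[i].name + '</td><td>' + rows[i].value + '</td>';
    fragment.appendChild(tr);
  }
  tbody.appendChild(fragment);
  if (end < rows.length) {
    setTimeout(function () { appendBatch(rows, end); }, 0);
  }
}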
Two basic options:
Pagination
Lazy loading (load as the user scrolls down). See this jQuery plugin, and the rough sketch below.
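For the lazy-loading route, a minimal hedged sketch of what such a plugin does under the hood (the endpoint, page size and renderRows function are placeholders):

var nextPage = 1, loading = false;

// Fetch the next page of rows whenever the user nears the bottom of the page.
window.addEventListener('scroll', function () {
  var nearBottom = window.innerHeight + window.pageYOffset >=
                   document.body.offsetHeight - 200;
  if (!nearBottom || loading) return;
  loading = true;
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/data/rows?page=' + nextPage + '&size=100', true);
  xhr.onload = function () {
    renderRows(JSON.parse(xhr.responseText)); // append rows however fits your table
    nextPage++;
    loading = false;
  };
  xhr.send();
});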
You could try a couple of things:
Loading data Asynchronously
and
Paging