I am developing a gallery for huge pictures, sounds and videos (= items). The page shows a great many galleries, and each gallery contains a lot of items. All items are pulled via AJAX at the moment the user asks for them (by clicking "next", for example). At the moment I am caching all viewed items for the lifetime of the page in an extra div container, so that no item needs to be requested via AJAX a second time. But that is only a small, barely noticeable speed advantage.
I would love to do two things:
The user is viewing the first gallery. All items he has viewed are cached. While he is not doing anything, perhaps watching a video or looking at the current image, I want my script to load all the other items of the page, one after the other, and move them into the cache. But as soon as the user clicks to view the next item, this caching process needs to pause; it has priority 2. Once the requests the user asked for have finished, my caching process should continue. How can I achieve this?
And how can I tell the caching to stop once, say, 25 MB of cached items have been received? How can I see the RAM used by a div containing the items?
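To illustrate, here is a rough sketch of what I imagine. loadItem stands in for my existing AJAX call; since I assume I cannot read the div's RAM usage directly, I would simply count response bytes (e.g. from the Content-Length header) as an approximation:

    // Prefetch items one at a time, pause while the user is waiting
    // for something, and stop once a byte budget is reached.
    var prefetchQueue = [];                 // URLs of items not yet cached
    var cachedBytes = 0;
    var BYTE_LIMIT = 25 * 1024 * 1024;      // ~25 MB budget
    var userBusy = false;                   // true while a user request runs

    function prefetchNext() {
        if (userBusy || cachedBytes >= BYTE_LIMIT || prefetchQueue.length === 0) {
            return; // paused, budget reached, or nothing left to do
        }
        var url = prefetchQueue.shift();
        loadItem(url).then(function (sizeInBytes) {
            cachedBytes += sizeInBytes;
            prefetchNext();                 // continue with the next item
        });
    }

    // Called when the user clicks "next": user requests take priority 1.
    function onUserRequest(url) {
        userBusy = true;
        loadItem(url).then(function () {
            userBusy = false;
            prefetchNext();                 // resume background caching
        });
    }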
Hopefully I have explained what I want, and somebody has some good, understandable ideas for me. :)
Best!
Falk
I'm building a page on my WordPress site where I am pulling multiple posts of the custom post type people (URL: site.com/people/name-of-person) into a page called people (URL: site.com/people). It is designed to be a list of people.
In my list of people, each person has a thumbnail; upon clicking it, the person's profile is displayed underneath via JavaScript. This pulls in their name, job position, favourite quote, details about them, some Instagram photos and a larger profile image, so you can quickly click through each person's profile, navigate around, etc.
Now I'm thinking about how to structure the page. Does it make more sense to have all the text for each person (title, job position, etc.) already loaded into the page? But this text would be an almost exact duplicate of the individual post for that person (which I don't intend to link to directly anywhere on the site; visitors will always be directed to /people).
Will this structure, with its duplicated content, have a bad effect on SEO? Or should I not worry too much about it?
Thanks in advance,
Craig
Loading every profile up front obviously takes more time and makes the page load slowly, which in turn causes trouble for users with very slow internet connections or limited data.
Instead, keep the data on the page minimal, and when the user clicks on a thumbnail, redirect them to another page where you provide the info about that person.
OR
use AJAX to fetch each profile only when its thumbnail is clicked.
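For example (the action name and element IDs are placeholders; the PHP side would need a matching wp_ajax_ handler):

    // Load a person's profile only when their thumbnail is clicked.
    $('.person-thumbnail').on('click', function () {
        var personId = $(this).data('person-id');
        $('#profile-container').html('Loading...');
        $.post('/wp-admin/admin-ajax.php', {
            action: 'get_person_profile',   // placeholder WordPress AJAX action
            id: personId
        }, function (html) {
            $('#profile-container').html(html);
        });
    });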
I find that my Ionic/Angular app constantly crashes after ~5 minutes of usage on my iPhone 6. It doesn’t crash when I use it on my computer’s browser.
I've been trying to troubleshoot the issue but have not been able to do so successfully. In my app, I have 48 images. In one round, I show 4 images; when the round starts, the user sees all 4 and can then pick which image he wants to guess on. In my JavaScript, this is represented by the $scope.locations variable, an array containing 4 objects. One attribute of each object is the link to the image to display. In my template, I loop through the 4 images using ng-repeat.
After the user is done with an image, they go back to the home page and that object is removed from $scope.locations; the template then runs through ng-repeat once again.
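For reference, the relevant pieces look roughly like this (simplified; the names are mine):

    // Each round shows 4 objects like these:
    $scope.locations = [
        { name: 'Place A', imageUrl: 'img/a.jpg' },
        { name: 'Place B', imageUrl: 'img/b.jpg' }
        // ...4 objects per round in total
    ];

    // Template: <img ng-repeat="loc in locations" ng-src="{{loc.imageUrl}}">

    // When the user finishes with an image, its object is removed
    // and ng-repeat re-renders the remaining ones:
    $scope.finishLocation = function (index) {
        $scope.locations.splice(index, 1);
    };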
My hypothesis is that even though the image is not visually on the screen, it is still in the DOM and there are still some listeners present for the object that has been removed. Is this a valid concern? If so, how do I resolve this?
I'm thinking that this is why the app is crashing so often.
Yes, this is a valid concern. The image is still present and can even be resized when it fails to display; I know this from personal experience with the side effects of blocking ads and turning on a bunch of accessibility settings. Getting attributes from such an image tends not to work, for the obvious reason, but the URL is still there.
Let's suppose a web page located at /items, listing all the available items.
When the user clicks on one item, the browser shows /items/:itemId and presents the item's details.
Of course, the items list could be very large.
That's why I'm using the infinite scroll feature from Ionic (the Angular-based framework I'm using).
However, as many developers know, handling the "back button" from /items/:itemId to /items is tricky, since the Angular controller is destroyed during the page transition and then rebuilt when the list page is loaded again, losing the scroll position and the exact items already loaded before clicking through to an item's details.
I have thought about some solutions, but I hesitate since each has a drawback:
Storing the actual loaded items (the complete items) in $cacheFactory (Angular's cache) or localStorage just before the item details are about to be shown. When the back button is pressed on the detail page, the /items controller can initialise its data from the cache, and the scroll position can then easily be "remembered" too (a rough sketch follows this list). The drawback is that the data in the cache may be stale...
"No need to store any loaded items in cache!" Just store in cache or localStorage the actual number of chunks loaded.
For instance, my REST API allows to retrieve items 10 by 10.
If the user's was loaded up to the second chunk (up to 20 elements so), a variable in cache would contain this value and the controller could then be init with all the 20 items initially.
Drawback is that it would involve several requests to server. Indeed, if the user loaded 30 chunks, the controller would need to make 30 calls to server...
I could customise the chunk size processed by the server, but one item is heavy (several long texts, lists of things, etc.), which is why I limited it to a relatively small number.
Using bidirectional scrolling (top and bottom), so that there are ALWAYS only 10 items in the DOM. I would just need to store the number of the last chunk loaded, in order to reload it. The drawback is that I would have to write the directive myself (possible, admittedly), since no such directive currently exists in Angular or Ionic. And I imagine that bidirectional scrolling is only useful for very, very large lists, more than 2,000 rows or so, to lighten the DOM; I don't expect more than 400 rows in the majority of my cases...
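To make option 1 concrete, here is roughly what I have in mind (names are made up, and the scroll handling is simplified):

    app.controller('ItemsCtrl', function ($scope, $cacheFactory, $ionicScrollDelegate) {
        // Get or create a dedicated cache for the list.
        var cache = $cacheFactory.get('itemsList') || $cacheFactory('itemsList');

        // Coming back from a detail page: restore the items loaded so far.
        $scope.items = cache.get('items') || [];

        $scope.openDetail = function (item) {
            // Stash everything just before navigating to /items/:itemId.
            cache.put('items', $scope.items);
            cache.put('scroll', $ionicScrollDelegate.getScrollPosition());
            // ...then navigate to the detail state...
        };
    });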
What is a good practice? Maybe another solution?
This is kind of a cheat, but what about using a Modal (http://ionicframework.com/docs/api/service/$ionicModal/) to render the detail?
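Roughly like this (the template name is just an example); the list controller, its data, and its scroll position are never destroyed because you never leave the state:

    $ionicModal.fromTemplateUrl('item-detail.html', {
        scope: $scope,
        animation: 'slide-in-up'
    }).then(function (modal) {
        $scope.detailModal = modal;
    });

    $scope.showDetail = function (item) {
        $scope.selectedItem = item;
        $scope.detailModal.show();
    };

    $scope.closeDetail = function () {
        $scope.detailModal.hide();
    };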
I've got an Angular app that I integrated with the UI Bootstrap project, and I'm using a regular modal dialog. With a dropdown containing 750 records, when one of the items is selected and "Ok" or "Cancel" is clicked, the modal and the overlay fade out without any delay.
Here's the plunker: Modal dialog with 750 records
With the dropdown containing around 10k+ records, when one of the items is selected from the list, clicking "Ok" or "Cancel" does not hide the modal dialog right away; instead there is an 8-10 second delay on Chrome. I've not tested on IE yet.
Here's the plunker: Modal dialog with 10k+ records
Question: Why am I getting a performance hit with more data?
You are slowing the entire browser down by grabbing the DOM by the neck and pouring 10,000 <option> nodes down its throat. You need to lazy load your data somehow. Ever noticed on sites like Twitter and Facebook that when you scroll to the bottom of the page they begin loading more records from the server? Good apps will also garbage-collect old records that have scrolled out of view.
When you scroll through your Facebook news feed, it isn't loading every friend's post since 2007 into the browser at the same time. Once a maximum number of posts exists in the DOM, Facebook starts removing the oldest ones you scrolled past to make room for more, and grabs fresh posts from the server so you can continue scrolling. You can even see your browser's scroll bar jump up as you scroll down, because more posts are being added to the DOM.
No browser is going to handle that much data well; the browser is not a database. I'm amazed your plunker with 10k records is as performant as it is! Haha. A dropdown is not the way to display that data. You're going to have to sit down and think of a better way to show it to the user. My first thought is to provide a filterable list that initially contains the top 25 most-selected options or so; typing in a search field then causes it to load a new list from the server matching the search criteria (see the sketch below). Only you will know what your users actually want, but I assure you it's not a dropdown list with 10k+ options.
Example (scrolling a Twitter feed): notice how the browser scroll bar jumps up a bit when it gets to the bottom. Twitter reaches the bottom and then loads more data to scroll through. It will eventually start cleaning up data at the top of the page as well if I scroll far enough.
Modern browsers can handle a lot, but 10,000+ <option> nodes is pushing it overboard.
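To make the filterable-list idea concrete, here is a rough Angular sketch (the /api/options endpoint and the limit of 25 are made up):

    app.controller('OptionSearchCtrl', function ($scope, $http) {
        $scope.results = [];

        // Bound to an input, e.g.:
        // <input ng-model="query" ng-change="search(query)"
        //        ng-model-options="{ debounce: 300 }">
        $scope.search = function (query) {
            if (!query || query.length < 2) {
                $scope.results = [];
                return;
            }
            // Ask the server for only the best matches.
            $http.get('/api/options', { params: { q: query, limit: 25 } })
                .then(function (response) {
                    $scope.results = response.data;
                });
        };
    });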
The browser can handle a large number of values in a dropdown list, but a dropdown list isn't meant for such a task. Not to mention the users will have a hard time selecting an appropriate value, even if you sort them alphabetically.
You would be much better off using an input text box instead of a dropdown.
jQueryUI has some nice autocomplete features that would not only improve the performance of your web application but also make the user experience much more bearable. I would any day prefer to type out one of the 10,000 options than hunt for it in a dropdown with the mouse.
Here's an example on jsfiddle with ~8.5k records for a performance test.
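For instance, assuming the options are already available as a plain array:

    // jQuery UI autocomplete over a local array of ~10k strings.
    // Only the entries matching the typed text are rendered, so the
    // DOM stays small no matter how large the source list is.
    var options = [ /* ...your 10,000 values... */ ];

    $('#my-input').autocomplete({
        source: options,
        minLength: 2    // don't search until 2 characters are typed
    });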
Let me quickly make a few points:
It is a usability bug to make users scroll through 10K records. Imagine someone going through 10K options to select the one they want; not a good idea.
Performance issue:
If the options were rendered from the back-end in a traditional (non-Angular) way, the page would just take time to load, but after that performance wouldn't be such an issue.
Since you are using AngularJS with ng-options, the options are populated on the front-end and you have all the data in Angular's scope. To perform two-way binding, Angular does dirty-checking on each digest cycle, which loops through every data element on $scope and causes that delay.
Solution:
Use Select2's "Loading Remote Data" feature. Select2 is a jQuery-based replacement for select boxes.
Consider using AngularUI's Select2 wrapper rather than using Select2 directly.
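A minimal example of the remote loading in plain Select2 (3.x style; the URL and response shape are placeholders, and the AngularUI wrapper accepts similar options):

    $('#item-select').select2({
        minimumInputLength: 2,          // wait for 2 characters before searching
        ajax: {
            url: '/api/options',        // placeholder endpoint
            dataType: 'json',
            data: function (term, page) {
                return { q: term, page: page };
            },
            results: function (data, page) {
                // Select2 expects { results: [{ id: ..., text: ... }, ...] }
                return { results: data };
            }
        }
    });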
I have a webpage which contains a table for displaying a large amount of data (on average from 2,000 to 10,000 rows).
This page takes a long time to load/render. Which is understandable.
The problem is that while the page is loading, the PC's memory usage skyrockets (500 MB in use by Internet Explorer on my test system) and the whole machine grinds to a halt until it has finished, which can take a minute or two. IE hangs until it is complete, and switching to another running program is just as bad.
I need to fix this, and ideally I want to accomplish two things:
1) Load individual parts of the page separately, so the page can render initially without the large data table. A loading div will be placed there until it is ready.
2) Not use up so much memory or local resources while rendering, so users can at least use a different tab or application at the same time.
How would I go about doing both or either of these?
I'm an applications programmer by trade, so I am still a little fuzzy on what I can do in a web environment.
Cheers all.
Regarding the first part, this is what AJAX is for: display the page without the table (or with an empty table), then use AJAX requests to fetch the data (as HTML or any other format) and display it.
Regarding the second part, you want something called lazy loading: loading data only when the user needs it, i.e. when it is in the visible part of the document. You can look at this question for a DataGrid library capable of handling millions of rows.
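The skeleton of the AJAX part is something like this (the endpoint and element IDs are placeholders):

    // Render the page with a loading message in place of the table,
    // then pull the rows in after the initial page load.
    $(function () {
        $('#table-container').html('<div class="loading">Loading data...</div>');

        $.ajax({
            url: '/data/table-rows',    // placeholder endpoint returning HTML
            success: function (html) {
                $('#table-container').html(html);
            },
            error: function () {
                $('#table-container').html('<div class="error">Failed to load.</div>');
            }
        });
    });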
Two basic options:
Pagination
Lazy loading (load more rows as the user scrolls down); see this jQuery plugin.
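Stripped to its core, the scroll-based approach looks like this (loadNextPage is a placeholder for your own AJAX call, assumed to return a jQuery promise):

    // Fetch the next page of rows when the user nears the bottom.
    var loading = false;

    $(window).on('scroll', function () {
        var nearBottom = $(window).scrollTop() + $(window).height()
                         >= $(document).height() - 200;
        if (nearBottom && !loading) {
            loading = true;
            loadNextPage().always(function () {
                loading = false;
            });
        }
    });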
You could try a couple of things: loading the data asynchronously, and paging.