Pagination performance in AngularJS applications - javascript

I have written an AngularJS application that lists a couple of thousand records from the database. I then created front-end pagination using the limitTo filter.
Thus I load all records at once and paginate them dynamically, and the result is:
DOM loading: ~5 s
page loading: ~5.2 s
DOM loading takes far too long. What can I do about it?
I was thinking about back-end pagination instead, but then I couldn't do dynamic searching across all records, dynamic ordering, etc., because I'd slice the list at the PHP level before the JavaScript loads.
Any other solutions?
view.tpl:
some HTML
<script type="text/javascript">
var data = {$data};
</script>
<script src="controller.js"></script>
That's how I pass the variable from the PHP controller to the AngularJS controller, so the page source contains a huge data array.

Have you tried formatting the SQL result on the back-end side (a compact JSON response)? Maybe you can use web storage to store all the results and make a single request to fetch them.
https://github.com/fredricrylander/angular-webstorage

DOM loading time depends (or at least should depend) only on the elements you actually show. If you have 5000 objects but show only 10, it should be fast.
You could try the UI Bootstrap pagination component: https://angular-ui.github.io/bootstrap/
Or you can do it manually: using a filter in the HTML on big arrays is often a bad idea; instead, create a filteredArray in the controller and update it only when needed.
If you fix the number of items per page (e.g. you want 200), back-end pagination will not improve your DOM render time.
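The filteredArray idea can be sketched as a plain helper that filters and slices once, outside the digest cycle (the function and its matching logic are illustrative, not from the question's code):

```javascript
// Minimal sketch: keep the full array in the controller, but bind the
// view to a precomputed page slice instead of running limitTo/filter
// on every $digest.
function pageSlice(rows, query, page, pageSize) {
  var q = (query || '').toLowerCase();
  var filtered = q
    ? rows.filter(function (row) {
        // Crude match against the whole record; a real app would
        // check specific fields instead.
        return JSON.stringify(row).toLowerCase().indexOf(q) !== -1;
      })
    : rows;
  return {
    pageCount: Math.ceil(filtered.length / pageSize),
    rows: filtered.slice(page * pageSize, (page + 1) * pageSize)
  };
}

// In an AngularJS controller you would call this from ng-change /
// ng-click handlers and assign the result to $scope.visibleRows,
// so the DOM only ever holds pageSize elements.
```

This keeps dynamic search and ordering fully client-side while the DOM stays small.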

Related

Django: fastest way to update the data that was once sent from template to view

I am working on a data visualisation app that will allow the user to filter the data he sees by various criteria.
I want to keep as much logic as possible on the Python/Django side, like this:
1. Data is passed from the Django view to the template.
2. On the front end, the user filters the data through various controls: dropdowns, sliders etc.
3. The control inputs are sent back to the Django view (via an AJAX POST request?), which returns the filtered data to the template.
4. The template - the visualization - is updated with the filtered data.
Is this a good approach? My concern is that a lot of data will be flying around and the app might become unresponsive.
Another, possibly faster idea is to filter the data on the client side in JavaScript, but I would really like to leverage the great Python data-munging libraries instead.
If you want to use the DRF API, go with it. A lot of websites have filtering features. I'd suggest taking a look at the django_filter package; it's possible to integrate it with DRF.
The worst thing about filtering on the client side is that you can't use pagination. If you have 500+ objects to filter, a JavaScript filtering function is what will really make your app slow.
At the same time, if you have 20-30 objects to filter and this number won't grow, then you can go with JS only and a single endpoint: getAll()
A common approach is to set up a JavaScript on_change handler and construct GET requests like this (example from a real project):
https://yourbackend.com/api/product/?status=not_published,published,inactive&search=132&moderation_status=declined,on_moderation,not_ready&ordering=desc&price_max=1000&page=1
DRF + django_filters will handle that just fine, with a minimum of your own code involved.
A well-known pitfall on the JS side is making requests without a timeout, e.g. the user types text and a request is sent on every keyup event, or moves a slider and a flood of requests goes out. You'll want to send the request only after the user stops, e.g. 300 ms after the last change. See this question for reference.
Sure, there's one more point: your database has to be normalised and have proper indexes. But you only need to look at that side if your SQL queries are really slow.
Summing up: I'd choose a thin JS layer and do most of the work on the back end.
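The debounced handler and the GET-request construction described above might look like this (a sketch; the helper names and the fetch call in the comment are illustrative, not from the project):

```javascript
// Wrap a handler so it fires only after the user pauses for waitMs.
function debounce(fn, waitMs) {
  var timer = null;
  return function () {
    var args = arguments, self = this;
    clearTimeout(timer);
    timer = setTimeout(function () { fn.apply(self, args); }, waitMs);
  };
}

// Build the GET query string from the current control values,
// skipping empty filters.
function buildFilterUrl(base, params) {
  var parts = Object.keys(params)
    .filter(function (k) { return params[k] !== '' && params[k] != null; })
    .map(function (k) {
      return encodeURIComponent(k) + '=' + encodeURIComponent(params[k]);
    });
  return parts.length ? base + '?' + parts.join('&') : base;
}

// Usage: at most one request per 300 ms pause, e.g.
//   input.addEventListener('input', debounce(function () {
//     fetch(buildFilterUrl('/api/product/', { search: input.value, page: 1 }));
//   }, 300));
```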

How to ng-include a partial view that requires a lot of data (parameters) to be rendered?

I have an Angular.js web UI for editing complex and large mathematical objects. I'm trying to build a view that displays results for such an object, so the edited object needs to be sent to the back end, and the back end computes a partial view based on its data.
The ordinary (easy) way of doing so would be to use the ngInclude directive:
<div ng-include="'.../resultView?data=[JSON_stringified_object_here]'"></div>
This works. However, the object can be quite big in terms of the number of characters in its JSON representation (it contains a lot of floating point numbers, dates, etc.), so I'm afraid of running into practical limits on the length of a query string.
Instead, I'd rather send the object as the payload of the GET (or even POST?) request. I'm just not sure how to accomplish this the Angular way. Is there a way to do so?
Worst case, I can also live with a solution that displays a "Compute" button which then fetches the partial view by calling a function that uses $http. How would I include this view in the DOM in that case?
I'd appreciate any hints on how people would tackle this problem.
EDIT: The view can look quite different depending on the (dynamic) type of the mathematical object and its computed results. Thus, rendering a static view and then filling data won't work.
What I would do is use ng-include to include a static templated page and then make an $http call to fetch the data and populate it onto the templated page.
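One way to decide between the two approaches is to measure the encoded payload first; the ~2000-character limit below is a common rule of thumb for URL length, not a guarantee from any spec, and the endpoint in the comment is hypothetical:

```javascript
// Returns true when the object is too big to travel safely in a
// query string and should go in a POST body instead.
function tooLongForQueryString(obj, limit) {
  var encoded = encodeURIComponent(JSON.stringify(obj));
  return encoded.length > (limit || 2000);
}

// If it is too long, POST the object and bind the response onto a
// statically included template (AngularJS sketch):
//
//   $http.post('/resultView/data', mathObject).then(function (res) {
//     $scope.results = res.data;   // template reads from $scope.results
//     $scope.showResults = true;   // <div ng-if="showResults"
//   });                            //      ng-include="'resultView.html'">
```

Since the view differs per object type, the template URL in ng-include could itself be a scope expression chosen from the response's type field.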

How to populate nodes(of a specific template) based on JSON data on different pages, and still Filter

So, I am working on a project where, from a JSON response, I have to create a list. Each item in the list shares a common template; only the data, which differs per item, is populated from the JSON response.
About the JSON: it is an AJAX response in which I pass start:0 and rows:9999 as parameters, so as to get all the data.
The point of all the above is that with this approach, dynamically creating the list from the JSON response creates thousands of nodes, which affects the initial page load time. In general, it takes around 10 seconds to populate around 6k nodes.
The only solution that seems doable to me is to break the result into pieces and show a maximum of, say, 10 results per page. That boils down to using pagination, so on every page request I do an AJAX call for the next 10 values and populate them.
But the problem lies in filtering the data when the list is not full (start:0, rows:20), as it was earlier with all the nodes/data (start:0, rows:9999).
This affects the filtered results: filtering is done by hiding the nodes that don't satisfy the filter criteria, so if not all nodes are in the list, the filter will not show the same results as before.
I am confused about how to proceed and still achieve filtering. How can the list be populated across different pages from a single AJAX request, with only one page shown at a time? Any suggestions are appreciated.
If you want to display the data in the form of a tree, you can use the jsTree framework, which is based on jQuery.
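Another way around the "filter only sees loaded nodes" problem from the question is to send the filter criteria to the server together with the paging parameters, so every page is filtered against the full dataset. A sketch under that assumption (the fetchFn stands in for the real AJAX call; parameter names mirror the question's start/rows):

```javascript
// Keep one fetched window in memory and render only that page;
// re-request from the server whenever the filter changes.
var state = { start: 0, rows: 10, filter: '', items: [], total: 0 };

function loadPage(page, filter, fetchFn) {
  state.start = page * state.rows;
  state.filter = filter;
  // fetchFn stands in for the real $.ajax call; it must return
  // { items: [...], total: n } for the requested window.
  var response = fetchFn({ start: state.start, rows: state.rows, filter: filter });
  state.items = response.items;
  state.total = response.total;
  return state.items;   // hand these to the node template: max `rows` nodes
}
```

Since the server applies the filter before slicing, the paginated view and the filtered view stay consistent, and the DOM never holds more than one page of nodes.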

Where to render Ajax search results in an object oriented approach using Coldfusion?

I'm updating a ColdFusion 8/MySQL site with a fairly complex search from "spaghetti to object" (i.e. separate view, controller and process; no framework, everything handled by jQuery Mobile).
I need to run the search query through jQuery AJAX and am now posting the search form to my searchProcess.cfc, which does the database queries.
Question:
I'm not sure where to render the results?
The results will be fairly complex (a database with a few million records, rows of 40 fields) and should end up in either a single-result or a multiple-result layout file.
I was thinking of constructing the files inside the cfc and handing them back via cfsavecontent, but I'm reading everywhere that this is a no-no...
What are the alternatives, then?
I could set up a template_single.cfm and template_multi.cfm, pass back pure search results as the AJAX response, and then fire another AJAX call from the success handler to fetch the template and render the output of this second call. This seems awfully complicated, plus I don't see where my pagination fits in without passing around large datasets.
So I'm looking for some advice on how to handle search results in an object-oriented way.
Thanks for the input!
EDIT:
After a few more hours of googling, I'm currently looking at the following option:
1.) Run a single database query that returns paginated results - as per here
2.) Send the data, 0-25 records, back to AJAX as JSON
3.) Use a cf/js template in a loop (length 1 or length 25) - as per here
This would mean transferring only 1-25 raw records as JSON, and if I render in the success handler, I don't have to make an HTTP request for another template.
Does this approach make sense?
First off, I see nothing wrong with putting display logic in a .cfc that is specifically for display logic. I know it's not strict MVC, but depending on what you're doing, it can work just fine. What I don't like is .cfcs that do any sort of output within the function; I always hand data back from the function.
/end opinion
For this particular problem, I second the EDIT idea of setting up the view as almost all HTML/jQuery with AJAX calls for paginated recordsets. As for single vs. multiple results, I'd go with separate jQuery functions depending on which one you need. The nice thing about this is that the multiple-recordset display could call the single-record display to view one record (while still retaining the full recordset in the DOM).
In both cases, I highly recommend getting a JavaScript dump function (it helps so much in visualizing the DOM data).
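The client-side templating step from the EDIT can be sketched as a plain row-template function applied in a loop over the returned JSON, so no second HTTP round-trip is needed (field names are placeholders for the real columns, and a production version should HTML-escape the values):

```javascript
// Turn one JSON record into a table row. In real code, escape the
// field values before interpolating them into HTML.
function renderRow(record) {
  return '<tr><td>' + record.id + '</td><td>' + record.name + '</td></tr>';
}

// Apply the template over the whole 1-25 record page.
function renderRows(records) {
  return records.map(renderRow).join('');
}

// In the jQuery AJAX success handler:
//   success: function (json) {
//     $('#results tbody').html(renderRows(json.records));
//   }
```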
PS. If anybody finds any newer/better JS Dump functions that work like cfdump, please, please, please let me know!

Efficient way to display lots of information in Javascript

I am continually pulling a list of over 200 entries from a remote server and showing them in a table (many pieces of data per row). The data changes and fluctuates over time. What's the most efficient way in JS to render this to the screen? As the data changes, do I simply wipe the DOM and re-render with the new data, or do I keep a list of ids and update just the correct tr DOM entries?
It's already sorted and filtered to be the correct information (server side), so displaying it to the user is all I am concerned with.
Any suggestions would be helpful. I am using jQuery for my requests, but that can be changed. I need to know the quickest (latency-wise) way of pulling this large dataset and displaying it in tabular format.
Edit: Pagination is not possible in this scenario.
Edit2: The backend is Python (WSGI) running in a load balancer and is capable of flushing out 600 reqs / second. The data is only available as JSON and the backend is more of an API.
Are you using HTML tables? Some time ago I came across a Microsoft blog post from the IE team stating that rendering tables is slow. Very slow. Re-rendering the whole table every time will probably be slower than updating its values, but that is just my guess.
In fact, my opinion here is that updating values will be faster than re-rendering elements on the page. To optimize your latency, you could keep a hash of objects mapped by id, so you don't have to look up the DOM every time you update the values.
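The id-keyed hash idea can be sketched like this (a minimal version; the comparison via JSON.stringify and the renderRow/jQuery usage in the comment are illustrative assumptions):

```javascript
// Cache of the last rendered record per id, so each poll only touches
// the rows that actually changed instead of wiping the whole table.
var rowCache = {};

function diffRecords(records) {
  var added = [], changed = [];
  records.forEach(function (rec) {
    var prev = rowCache[rec.id];
    if (!prev) {
      added.push(rec.id);
    } else if (JSON.stringify(prev) !== JSON.stringify(rec)) {
      changed.push(rec.id);
    }
    rowCache[rec.id] = rec;
  });
  return { added: added, changed: changed };
}

// Only the ids in added/changed need DOM work, e.g. with jQuery:
//   diff.changed.forEach(function (id) {
//     $('#row-' + id).replaceWith(renderRow(rowCache[id]));
//   });
```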
I quite like the 'Live Image Search' approach, which loads more data as you scroll down.
If your data list is getting really large, consider not displaying all of it to the user at one time (if that is possible in your situation). Not only will the user get lost looking at that much data, but the browser will have to render the whole list, slowing you down.
You don't mention what server-side technology you are using. If you are on .NET, there are a couple of ASP.NET controls that use a data pager (like a GridView). There is also a PagedDataSource object you might look into (usable with any ASP.NET control that has a DataSource property). Both break your data into pages, and only the viewed page is rendered at a time, which decreases latency dramatically.