I've been pondering moving our current admin system over to a JS framework for a while and I tested out AngularJS today. I really like how powerful it is. I created a demo application (source: https://github.com/andyhmltn/portfolio-viewer) that has a list of 'items' and displays them in a paginated list that you can order/search in realtime.
The problem that I'm having is figuring out how I would replicate this kind of behaviour with a larger data set. Ideally, I want to have a table of items that's sortable/searchable and paginated that's all in realtime.
The part that concerns me is that this table will have at least 10,000+ records. Currently that's no problem, as a PHP file limits the query to the current page and appends any search options to the end. The demo above only has about 15-20 records in it. I'm wondering how hard it would be to do the same thing with such a large number of records without pulling all of them down in one JSON request at once, as that would be incredibly slow.
Does anyone have any ideas?
I'm used to handling large datasets in JavaScript, and I would suggest that you:
use pagination (either server-side or client-side, depending on the actual volume of your data; see below)
use Crossfilter.js to group your records and adopt a multi-level architecture in your GUI (records per month, then double-click a month to see records per day for that month, etc.), as in the sketch below
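For the Crossfilter.js suggestion, here is a minimal sketch of that two-level idea; the month/day fields and the renderSummary/drillDown helpers are assumptions about your data and UI, not part of Crossfilter itself:

// records is assumed to look like [{ month: '2014-01', day: '2014-01-15', ... }, ...]
var cf = crossfilter(records);
var byMonth = cf.dimension(function (d) { return d.month; });
var byDay   = cf.dimension(function (d) { return d.day; });
var perMonth = byMonth.group().reduceCount();
var perDay   = byDay.group().reduceCount();

// Top level of the GUI: one row per month.
renderSummary(perMonth.top(Infinity));   // e.g. [{ key: '2014-01', value: 412 }, ...]

// On double-click of a month row: narrow to that month and show per-day counts.
function drillDown(month) {
  byMonth.filterExact(month);
  renderDetail(perDay.top(Infinity));    // only days from the clicked month remain
}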
An indicator I often use is the following :
rowsAmount x columnsAmount x dataManipulationsPerRow
For example, 10,000 rows with 10 columns and one manipulation per row is already 100,000 operations per refresh; the larger that product, the stronger the case for server-side pagination and pre-aggregation.
Also, consider the fact that handling large datasets and displaying them are two very different things.
Indeed pulling so many rows in one request would be a killer. Fortunately Angular has the ng-grid component that can do server-side paging (among many other things). Instructions are provided in the given link.
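If it helps, this is roughly the shape of server-side paging with ng-grid in an AngularJS controller; treat it as a sketch rather than a drop-in, and note that the /api/items endpoint and its response shape are assumptions:

var app = angular.module('myApp', ['ngGrid']);

app.controller('ItemsCtrl', function ($scope, $http) {
  $scope.myData = [];
  $scope.totalServerItems = 0;
  $scope.pagingOptions = { pageSizes: [25, 50, 100], pageSize: 25, currentPage: 1 };

  function getPage() {
    $http.get('/api/items', {
      params: {
        page: $scope.pagingOptions.currentPage,
        pageSize: $scope.pagingOptions.pageSize
      }
    }).then(function (res) {
      $scope.myData = res.data.rows;             // only the current page's rows
      $scope.totalServerItems = res.data.total;  // total row count, for the pager
    });
  }

  getPage();
  $scope.$watch('pagingOptions', getPage, true); // re-fetch when the user pages

  $scope.gridOptions = {
    data: 'myData',
    enablePaging: true,
    showFooter: true,
    totalServerItems: 'totalServerItems',
    pagingOptions: $scope.pagingOptions
  };
});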
I am using the DataTables jQuery plugin to display the table, with Node.js making a query to the database. It is currently taking 18s to display 20,000 rows of data, yet when I query the database directly it takes less than a second to return the data.
First of all, even if it is possible to show 20K records in a grid, your user will definitely not be interested in seeing all of them in one go. Rather, show a small set of data in the UI grid and provide search, sort and pagination options.
Go for both server-side (Node.js) and client-side (jQuery) pagination. Fetch data from the database one page at a time and let the client request one page at a time. Make the page size configurable. A rough sketch of the server side is below.
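A hedged Express sketch of that server side: page and pageSize come from the client and only that slice is queried. db.query stands in for whatever database client you actually use, and the route and table names are made up:

var express = require('express');
var app = express();

// GET /items?page=3&pageSize=25  ->  one page of rows plus the total count
app.get('/items', function (req, res) {
  var pageSize = Math.min(parseInt(req.query.pageSize, 10) || 25, 100); // configurable, capped
  var page     = Math.max(parseInt(req.query.page, 10) || 1, 1);
  var offset   = (page - 1) * pageSize;

  db.query('SELECT COUNT(*) AS total FROM items', function (err, countRows) {
    if (err) return res.sendStatus(500);
    db.query('SELECT * FROM items ORDER BY id LIMIT ? OFFSET ?', [pageSize, offset],
      function (err2, rows) {
        if (err2) return res.sendStatus(500);
        res.json({ rows: rows, total: countRows[0].total });
      });
  });
});

app.listen(3000);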
If server-side performance (especially the database query) is not a concern, then fetch all 20K records in one go; this reduces complexity on the server side by avoiding pagination. However, still implement pagination on the client side to avoid rendering issues and to improve usability (see the DataTables sketch below). Note that by fetching all the data in one go, the memory consumption of both the server and client-side processes will increase.
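If you do go the fetch-everything route, you can at least let DataTables paginate and defer row rendering on the client. A short sketch; the column names and the /items/all URL are assumptions:

// rows is the full 20K-record array fetched in one go
$.getJSON('/items/all', function (rows) {
  $('#grid').DataTable({
    data: rows,
    columns: [{ data: 'id' }, { data: 'name' }, { data: 'createdAt' }],
    deferRender: true,   // create <tr> nodes only for the pages actually viewed
    pageLength: 50
  });
});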
If there is any possibility that your data grows over time, what works well today will break later. If not, and you still want to fetch all 20K records and display them in one go, then go ahead, good luck :)
What kind of data?
Rather than display 20K rows of detail, summarize the data and show no more than a screenful of it. You could do this with some kind of graph, or by aggregating the data into summary rows.
And you could have links on the consolidated data for those who need to drill down to the details.
This seems to be a UI issue.
I am getting a list of items as JSON (through AJAX), creating the required markup in JS and appending it to the view. On the other hand, I have seen a few examples which do not use this practice and instead send the complete or partially complete markup through AJAX and then simply append it to the document, so the markup is clearly being generated on the server.
So I am curious which is the better approach and why. One thing I can clearly see is that the latter approach does not expose the JSON structure to the UI.
You're right about exposing the JSON structure, but perhaps the main thing to consider is loading time, both client side and server side. If the server is dedicated and powerful enough, generating the markup on the server will be fast and will keep the site snappy on the client side.
If not, you'll want the client to do as much of the processing as possible. Generating markup on the client side is not a bad thing; the popular AngularJS framework builds markup on the fly, client side, and is very effective.
The choice depends on several factors, such as:
what we are trying to achieve (the requirement must be crystal clear)
how much work is involved with each choice on its own
which choice will have better performance
and, in a few scenarios, which choice will get the job done quickest (where performance is traded off against delivery)
Let's take an example to explain this better.
1) Create a table to display all the details of employees.
Here, since this is a standalone table, we don't have to worry about passing JSON and then writing logic in JS to build the table. This can be done quickly without the need for JS. -- HTML wins
2) Create a table to display all the details of employees. Also build a dynamic bar graph that displays the details of an employee group when any row in the table is clicked.
Here the scenario is to build a table, and on click of any table row, get the employee group and generate a bar graph showing the total number of employees in that group (or whatever else you need).
If we had built our table with just HTML, then to get the data for the bar graph we would either need to make another server call OR loop over all the tr's and extract the data cell by cell, which is a pain.
If you use JSON to dynamically build the table structure, you can use the same JSON data to build the graph quickly, without any extra overhead, as the data is already there to manipulate. -- JSON wins
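A small sketch of that "JSON wins" case: the same array drives both the table markup and the per-group summary for the graph. The field names, the /employees URL and drawBarGraph() are placeholders, not real APIs:

$.getJSON('/employees', function (employees) {
  // Build the table from the JSON we received.
  var rows = employees.map(function (e) {
    return '<tr data-group="' + e.group + '"><td>' + e.name + '</td><td>' + e.group + '</td></tr>';
  });
  $('#employees tbody').html(rows.join(''));

  // On row click: no second server call and no scraping <td>s, just reuse the array.
  $('#employees').on('click', 'tr', function () {
    var group = $(this).data('group');
    var total = employees.filter(function (e) { return e.group === group; }).length;
    drawBarGraph(group, total);   // hand off to whatever charting code you use
  });
});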
This is just a small example, and the concept is entirely debatable; it comes down to the developer's choice and what they are comfortable with, along with the other factors mentioned above. I think you get the big picture. Hope this is helpful.
I have an app that uses a large database to fill in Google Maps and charts data. This ends up being about 5,000 lines per column and about 20 columns. The issue I am running into is whether to put this data in the view template, which makes my source code several thousand lines long, or to generate a JavaScript file for each instance and include it in the view. The issue with that method is that I am generating files with no way to delete them from the webroot folder (short of a cron job that goes through and deletes old ones). I was wondering what the solution to this is.
Of course, but as a developer you are responsible for delivering fast websites, and you cannot fetch all of the data. For example, when using Google Maps it is common practice to display a limited amount of data according to the displayed area (rectangle). When using charts, you should respond only with data that has already been aggregated.
There is almost never a reason to display all the data to the user. Ask yourself: what will you do when faced with 5,000 lines of data on a page? Do you need it all at once? No.
Use AJAX to fetch only the rows you need right now.
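For the Google Maps case mentioned above, a hedged sketch: whenever the viewport settles, request only the points inside the visible rectangle. The /api/points endpoint, its parameter names and renderMarkers() are assumptions:

google.maps.event.addListener(map, 'idle', function () {
  var bounds = map.getBounds();
  var ne = bounds.getNorthEast();
  var sw = bounds.getSouthWest();

  $.getJSON('/api/points', {
    north: ne.lat(), east: ne.lng(),
    south: sw.lat(), west: sw.lng()
  }, function (points) {
    renderMarkers(points);   // draw markers for just this viewport
  });
});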
I am new to Knockout.js and already like it very much.
Say I'm implementing a web blog and want to add/edit/delete blog post comments using Knockout.js. For this purpose I define a Comment viewmodel with subject, text and tags (in my real application I need many more fields, more like 10 to 20).
After the comment has been edited by the user and posted to the server, I want to refresh it on screen with the latest values (including those that came from the server, say a timestamp). It appears that I need observable (not just plain) properties for every listed field, otherwise the values will not be refreshed on the user's screen after the postback.
Now, if I have 20 observables per comment and there are 50? 100? comments on the screen, will it slow the browser down much? What about mobile devices? If so, is there another way to achieve what I want?
The other possible option is to use viewmodels only for the comment being edited. In this case I somehow need to "unbind" the other viewmodels from their HTML elements, e.g. delete them and render them again, but I can't see a nice solution for that.
Interesting question.
The short, simple answer is no.
Browser performance is not really an issue unless you are specifically developing an application that is known or expected to be performance-heavy.
A browser is well designed to handle very large amounts of data, be it downloading new data from a server or rendering DOM elements. I would say a browser could handle over 1,000 comments (an educated guess).
Take a look at a Google application (such as calendar) - they tend to process huge amounts of data.
This use case scenario sounds like a perfect match for the mapping plugin:
// Every time data is received from the server:
ko.mapping.fromJS(data, viewModel);
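The usual pattern (a sketch, with the /comments URL assumed) is to create the viewmodel from the first response and then feed every later response into that same viewmodel, so only the changed observables update on screen:

var viewModel;

$.getJSON('/comments', function (data) {
  viewModel = ko.mapping.fromJS(data);   // every field becomes an observable
  ko.applyBindings(viewModel);
});

// On every subsequent response from the server (e.g. after a comment is saved):
function refreshFromServer(data) {
  ko.mapping.fromJS(data, viewModel);    // updates the existing observables in place
}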
And if you ever get into performance issues, the Viewmodel plugin claims to be several times faster specifically for the task of updating your viewmodel from an updated model.
ko.viewmodel.updateFromModel(viewmodel, updatedModel);
I am continually pulling in a list of over 200 entries from a remote server and showing them in a table (many pieces of data per row). The data changes and fluctuates over time. What's the most efficient way in JS to render this to the screen? As the data changes, do I simply wipe my DOM and re-render with the new data, or do I keep a list of ids and update just the correct tr DOM entries?
It's already sorted and filtered into the correct information (server side), so it's just displaying it to the user that I'm concerned with.
Any suggestions would be helpful. I am using jQuery for my requests, but that can be changed. I need to know the quickest (latency-wise) way of pulling this large dataset and displaying it in a tabular format.
Edit: Pagination is not possible in this scenario.
Edit2: The backend is Python (WSGI) running behind a load balancer and is capable of serving 600 requests/second. The data is only available as JSON and the backend is more of an API.
Are you using HTML tables? Some time ago I came across a Microsoft blog post from the IE team stating that rendering tables is slow. Very slow. Re-rendering the whole table every time will probably be slower than updating its values, but that is just my guess.
In fact, my opinion here is that updating values will be faster than re-rendering elements on the page. To optimize your latency, you could keep a hash of objects mapped by id, so you don't have to look up the DOM every time you update the values.
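One way to do that (a sketch; the entry fields and the #grid table are assumptions): build each row once, keep it in a map keyed by id, and on every poll only touch the cells whose values actually changed:

var rowsById = {};
var tbody = document.querySelector('#grid tbody');

function render(entries) {
  entries.forEach(function (entry) {
    var row = rowsById[entry.id];
    if (!row) {
      // First time we see this id: create the row once and remember it.
      row = document.createElement('tr');
      row.innerHTML = '<td class="name"></td><td class="value"></td>';
      rowsById[entry.id] = row;
      tbody.appendChild(row);
    }
    // Afterwards, only update cells whose text actually changed.
    var nameCell  = row.cells[0];
    var valueCell = row.cells[1];
    if (nameCell.textContent !== entry.name) nameCell.textContent = entry.name;
    if (valueCell.textContent !== String(entry.value)) valueCell.textContent = entry.value;
  });
}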
I quite like the 'Live Image Search' approach, which provides more data as you scroll down.
If your data list is getting really large, consider not displaying all of that data to the user at one time (if that is possible in your situation). Not only will the user get lost looking at that much data, but the browser will have to render the whole list and slow you down.
You don't mention what server-side technology you are using. If you are in .NET, there are a couple of ASP.NET controls that use a data pager (like the GridView). There is also a PagedDataSource object you might look into (which can be used with any ASP.NET control that has a DataSource property). Both will break your data into pages, and only the viewed page will be rendered at a time. This decreases latency dramatically.