Best way to handle large amounts of data - JavaScript

Currently I am experiencing speed issues with parts of my application that load large amounts of data into a reporting table. The data in the reporting table is pulled from multiple tables by some complex, but required, queries.
Besides optimizing the code, my question is how do you personally handle large amounts of data that need to be displayed to the user, and what is the best practice?
Currently I am processing all the data beforehand and then generating a table via a data-table JavaScript library.
Things I know:
The user doesn't need to see all the data at once
The user needs to be able to search through all the data
The user needs to be able to filter through the data
Is the best way really just to show a loading spinner, load only a small portion of the data when the page first loads, and then retrieve the rest of the data through Ajax?
I feel like there has to be a better way
Thanks,

I think you're answering your own question a bit. Yes, it is better not to deliver the whole database to the user at once; this is why any RDBMS supports things like LIMIT. Your three criteria exactly match what a database system can do for you: queries of small subsets of data (i.e. pages), optionally filtered or matched against a search query.
For simplicity of the front end, you can make the first page load via AJAX as well, though having it pre-rendered does make the page feel more responsive. Having said that, there are many existing solutions to this problem; some template engines and JS front-end frameworks (e.g. Vue.js SSR) support server-side pre-rendering.
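For illustration, here is a minimal sketch of that pattern, assuming a Node/Express backend and a hypothetical `reports` table (your stack and schema will differ):

```js
// server.js -- a paginated, searchable endpoint (sketch).
// `db` stands in for whatever database client you use; it is assumed
// to expose a query(sql, params) method returning a promise of rows.
const express = require('express');
const db = require('./db'); // hypothetical wrapper around your DB driver

const app = express();

app.get('/api/reports', async (req, res) => {
  const page = Math.max(parseInt(req.query.page, 10) || 1, 1);
  const pageSize = 50;
  const search = `%${req.query.q || ''}%`;

  // LIMIT/OFFSET does the paging; the WHERE clause covers the search.
  const rows = await db.query(
    'SELECT * FROM reports WHERE name LIKE ? ORDER BY created_at DESC LIMIT ? OFFSET ?',
    [search, pageSize, (page - 1) * pageSize]
  );
  res.json(rows);
});

app.listen(3000);
```

The front end then asks for one page at a time, e.g. `fetch('/api/reports?page=2&q=invoice')`, and feeds the JSON to the table library instead of rendering the full dataset up front.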

Related

How to create pagination if the API has not provided any option (data items per page)?

All I get from an API is an array of objects; there are no pages or anything I can use to request only part of the data. I wonder if there is still a good way to create pagination on my client?
(I'm using React.)
Thank you
Since you can't save on network data transfers or parsing (although some lazy parsing algorithm might help), paging would improve the memory footprint of all the DOM nodes created and the time it takes to lay out the page.
If you are really conscious about those aspects I would consider using something like react-window to lazy-render data.
The primary consideration for paging is saving on network transfers. The trick of lazy-rendering nodes makes sense on really large datasets.
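For the React case above, a minimal sketch of client-side paging, assuming the API hands back the whole array up front (the `id`/`name` fields are made up):

```jsx
import { useState } from 'react';

const PAGE_SIZE = 25;

function PagedList({ items }) {
  const [page, setPage] = useState(0);
  const pageCount = Math.ceil(items.length / PAGE_SIZE);
  // Only the current slice hits the DOM; the full array stays in memory,
  // which is exactly the trade-off described above.
  const visible = items.slice(page * PAGE_SIZE, (page + 1) * PAGE_SIZE);

  return (
    <div>
      <ul>
        {visible.map(item => <li key={item.id}>{item.name}</li>)}
      </ul>
      <button disabled={page === 0} onClick={() => setPage(page - 1)}>Prev</button>
      <button disabled={page >= pageCount - 1} onClick={() => setPage(page + 1)}>Next</button>
    </div>
  );
}

export default PagedList;
```

react-window does the same slicing implicitly, driven by scroll position rather than page buttons.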

AngularJS: Load all data at once as JSON vs. using a database and only load parts. What is better?

I have to display products which are stored in an ERP on a webpage. The ERP can produce an XML or JSON file which would include all products. The webpage needs functions like pagination, sorting, or filtering by attributes. At the moment I think the easiest way would be to just load the entire file in AngularJS and then iterate over all items and work with that. The number of products is limited to ~500. The reason I think this is the easiest is that the client changes the information on a daily basis; this way I don't have to write an import/synchronization process for a database.
But I am a bit worried about performance. Sorting, filtering, pagination etc. are all things that would be very fast with a database (probably MongoDB since the datastructure is quite simple).
Can I expect serious performance problems? Is this doable? Or should I put a database between ERP and frontend that does the heavy lifting?
The only way to be sure is to test it out yourself; that is what I would do. The answer is very often "it depends..". You say it's around ~500 products; if each product only has a product name, then a database would be overkill. Angular is perfectly comfortable with that amount. But if each product has tons of properties and nested data, the file itself could be too large to load on every page load. So, it depends..
I would do this:
Export the file from the ERP as JSON
Create a boilerplate angular app
Put the JSON file as a resource file
Create a simple repeater and throw those objects out into the DOM
Now you can easily experiment with filtering, sorting, pagination, and so on, and test whether the browser performance and load time are what you are looking for.
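As a sketch of that experiment (field names invented), AngularJS's built-in `filter`, `orderBy` and `limitTo` filters already give you search, sorting and a crude form of paging:

```html
<div ng-app="demo" ng-controller="ProductsCtrl">
  <input ng-model="search" placeholder="Filter products...">
  <select ng-model="sortKey">
    <option value="name">Name</option>
    <option value="price">Price</option>
  </select>
  <ul>
    <!-- filter + orderBy + limitTo cover search, sort and paging -->
    <li ng-repeat="p in products | filter:search | orderBy:sortKey | limitTo:50">
      {{p.name}} ({{p.price}})
    </li>
  </ul>
</div>
<script>
  // Assumes AngularJS 1.x is loaded on the page.
  angular.module('demo', []).controller('ProductsCtrl', function ($scope, $http) {
    // products.json is the file exported from the ERP (hypothetical path)
    $http.get('products.json').then(function (res) {
      $scope.products = res.data;
    });
  });
</script>
```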
I think there will be no performance problem in data transfer, since there are only 500 elements in the JSON file. But maybe you'll experience performance problems showing those 500 elements with AngularJS.
Instead of pagination, you can check out 'Infinite Scroll' with AngularJS, a good solution for result-showing performance. Check this article: http://www.williambrownstreet.net/blog/2013/07/angularjs-my-solution-to-the-ng-repeat-performance-problem/
And no, I don't agree with having a database in the middle. You would be trying to use it as a cache, but you'll have more problems than solutions, because with only 500 elements you would not gain any performance at all. And another added problem: database maintenance ;)
Cheers

Social network architecture decision

As I can't orient myself freely in the topic of building dynamic sites, it is quite hard for me to google this. So I'll try to explain the problem to you.
I'm developing a simple social network. I've built a basic PHP API represented by files like "get_profile.php", "add_post.php", etc., using the POST method to pass data. Then I fetch the data with JS AJAX (the PHP functions return it as JSON), which means I get all the data I need to show on a page only after the page has loaded. That slows down page loading, and I feel like this structure is really wrong.
I hope you'll explain to me how to build a proper structure, or at least give me some links to read. Thanks.
Populate the HTML with the (minimum) required data on the server side and load all other necessary data on the client side using AJAX (as you already do).
In any case, I would profile your application to find the most important bottlenecks. Do you parallelize AJAX requests?
Facebook, for example, doesn't populate its HTML with the actual data on the server side, but provides the rough structure, which is later filled using AJAX requests.
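On the parallelization point: if the pre-rendered structure needs several independent pieces of data, fire the requests together rather than in sequence. A sketch using the endpoints from the question (`renderProfile`/`renderPosts` stand in for whatever fills the page):

```js
// Both requests go out at once; the page fills in as soon as the
// slower of the two finishes, instead of waiting for a serial chain.
Promise.all([
  fetch('get_profile.php', {
    method: 'POST',
    body: new URLSearchParams({ user_id: '42' }),
  }).then(res => res.json()),
  fetch('get_posts.php', {
    method: 'POST',
    body: new URLSearchParams({ user_id: '42' }),
  }).then(res => res.json()),
]).then(([profile, posts]) => {
  renderProfile(profile);
  renderPosts(posts);
});
```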
If I understood your architecture right, it sounds ok.
Advice
Making your architecture similar to this allows you to deliver templates for the page structure that you then populate with data from your AJAX requests. This also makes your server faster, since it doesn't have to render the HTML.
Be careful with the number of requests you make, though; if each client makes a lot of them, you will have a problem.
Try to break your application into different major pieces and treat each one in turn. This will allow you to separate them into modules later on. This practice is also referred to as a micro-services architecture.
After you've broken them down, try to figure out user interactions and patterns. This will help you design your database and model in a way that lets you easily optimise for the most frequent use cases.
The way of the pros.
You should study how Facebook does things. They are quite open about it.
For example, the BigPipe method is the fastest I have seen for loading a page.
Also, I think you should read a bit about RESTful applications and SOA type architectures.

Sorting with JavaScript or with a server request? [duplicate]

Possible Duplicate:
Sorting on the server or on the client?
As part of a project for my university, I have to make a website, let's say like a forum.
So there will be posts, many of them. Every post has a like bar, comments, some text, some buttons, etc. Also, the user will be able to sort the posts that appear on a page based on criteria like date, name, popularity, etc.
My question is how I should implement the sorting: 1. with JavaScript on the browser's side, or 2. with a form or something and a new request to the server (in which case the server has to send back the posts sorted)?
Thank you in advance.
There are pros and cons to both.
Generally speaking, if you already have all of the data available in the client anyhow, you will provide a more responsive user experience sorting on the client.
If you have to fetch extra records that you would otherwise not fetch to sort client-side, there's a great chance that you are bloating the download to the client beyond the optimal point, and a sort on the server-side via Ajax would be better.
That's a huge "it depends". Is there paging involved? What's the max size of the data set? Is it only the records on the client's current page that need to be sorted?
Server side sorting is better for:
Large data set
Faster initial page load
Accessibility for those not running JavaScript
Complex view business logic
Resilience to concurrent changes
Client side sorting is better for:
Small data set
Faster subsequent page loads
The sort criteria are user-selectable or numerous (see the sketch below)
Once you have this feature, you can add filters and pagination easily
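A sketch of the client-side option, with a comparator per user-selectable criterion (the field names are invented):

```js
// One comparator per sort option the user can pick.
const comparators = {
  date: (a, b) => new Date(b.createdAt) - new Date(a.createdAt), // newest first
  name: (a, b) => a.title.localeCompare(b.title),
  popularity: (a, b) => b.likes - a.likes,
};

function sortPosts(posts, criterion) {
  // Copy first so the original order is preserved for re-sorting.
  return [...posts].sort(comparators[criterion]);
}

// e.g. sortPosts(posts, 'popularity') when the user clicks that option
```

Adding a new criterion is one more entry in the map, which is why numerous or user-selectable sort keys favour the client side.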
Related question:
Sorting on the server or on the client?
Related answer:
The important thing to remember is that while balancing the load between powerful clients and the server may be a good idea in theory, only the server can maintain an index which is updated on every insert. Whatever the client does, it's starting with a non-indexed unsorted set of data.
If it is possible/practical to sort elements on the client side, that would be the best solution (it reduces server requests). However, this is often not the case.
It happened to us. We sorted the data on the client side. Then a new requirement arrived: we needed to put the sorted data in a report. So, instead of translating the sorted data directly into the report datasource (which server-side sorting would have achieved), we were required to capture the sort details (table to sort, column to sort, sort order) from the client-side activity (on the sort-column event), send them to the server when the print-report button is pressed, and then do the sorting on the server side. A lot of work.
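In that situation the client effectively has to ship its sort state to the server. A sketch of what the hand-off might look like (the hook and endpoint names are hypothetical):

```js
// Track the sort the user last applied in the table UI.
let sortState = { column: 'date', order: 'desc' };

// Call this from your table widget's sort-column event.
function onSortColumn(column, order) {
  sortState = { column, order };
}

// When the print-report button is pressed, send the state along so the
// server can reproduce the same ordering in the report datasource.
document.querySelector('#print-report').addEventListener('click', () => {
  fetch('/print_report?' + new URLSearchParams(sortState));
});
```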

Opinion Regarding Filtering of Content using JS

I'm working on a project, and there is some debate about how some JS filtering should be implemented, so I would like to ask you guys for some input on this.
Today we have a site that displays a long list of repeated entries of data, and some JS filtering would be nice for users navigating through it. The usual stuff: keyword, order, date, price, etc. The question is not whether to use JS, which is obvious, but the origin of the data. One person argues that the HTML itself should be used and that the JS should parse through it to apply the user's desired filtering. Another argues that we should use JSON generated on the server, and that the JSON should be the data's origin.
What do you guys think about this? What are the pros and cons?
As a final request, I would like you to be as informative as possible, since your answers will be used and referenced by all of us in the company. (Yes, that is how much we trust you! :)
The right action is a matter of taste and system architecture, as well as utility.
I would go with dynamically generated pages with JS and JSON. These days I think you can safely assume that most browsers have JavaScript enabled; however, you may need to make provisions for crawlers (GoogleBot, Bing, Ask, etc.), as they may not fully execute all JS and hence may not index the page unless you figure out some kind of exception for supporting them.
Using JS+JSON also means that support for mobile devices is done client side, without the web server having to create anything special.
Doing DOM manipulation as the alternative would not be my best friend, as the logic of the page control and layout gets split between two places: partly in the view controller on the web server, and partly in the JavaScript. It is, in my opinion, better to have it in one place and have the view controller only generate JSON and serve the root pages, etc.
However, this is a matter of taste, and I'm not sure I could say there is one correct and best solution.
I think it's a lot cleaner if the data is delivered in JSON and the presentation HTML or view of that data is then generated from that JSON with JavaScript.
This fits the more classic style of keeping core data structures separate from views. In this manner you can generate all types of views without having to constantly munge/revise the way you store, access and manipulate the data. You can even build classes and methods to develop a clean interface on your data that is entirely independent of how that data is displayed.
The only issue I see with that is if the browser doesn't support JavaScript and that browser is a desired viewer. In that case, you have to include a default HTML version from the server that will obviously not be manipulated, and the JSON will be ignored.
The middle ground is that you include both JSON and the "default", initial HTML view of that data in rendered HTML. The view comes up quickly and non-JS browsers can see something useful. But, then any future manipulation of the view (sorting, for example) uses the JSON data and generates a new clean view from the JSON data. No data is then ever "parsed" from the HTML view.
In larger projects, this also can facilitate the separation of presentation from data manipulation so different people may work on creating HTML views vs. manipulate the data (like sorting).
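A sketch of that separation: the server delivers only JSON, and a small render function turns it into the current view (element ids and fields are illustrative; escape real data before injecting it into innerHTML):

```js
// Data arrives as JSON; the HTML view is generated from it and
// never parsed back out of the DOM.
async function loadProducts() {
  const res = await fetch('/api/products.json');
  const products = await res.json();
  render(products);

  // Re-sorting works on the JSON, then re-renders a fresh view.
  document.querySelector('#sort-price').addEventListener('click', () => {
    render([...products].sort((a, b) => a.price - b.price));
  });
}

function render(products) {
  document.querySelector('#product-list').innerHTML = products
    .map(p => `<li>${p.name}: ${p.price}</li>`)
    .join('');
}

loadProducts();
```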
I would make multiple ajax calls to the server and have it return the sorted/filtered data. If your server backend is fast, then it won't be very taxing, and you could even cache the data between requests.
If you only have 50-100 items, then it would be reasonable to send it all to the client and have JavaScript sort and filter it.
Some considerations to help make the decision
Is the information sensitive and unique? (This voids any benefit of caching from my first point.)
What is the most common request that will happen and are you optimizing for that?
How much data is there? (tens of rows, hundreds, thousands, millions)?
Does your site have to work with JavaScript turned off (supporting older browsers)?
Is your development team more comfortable doing this in the front-end or back-end?
The answer is that it depends on your situation.
