Create a dynamic datatable dashboard with dynamic values in cells - javascript

My website needs to display a dashboard (built from a datatable) on a webpage, in which the data evolves really quickly (every 1 to 2 seconds). The data is received from a RabbitMQ message queue.
I have successfully achieved this with Dash. Everything works well, but I have one major problem: Dash seems to be really slow when a large amount of data (approximately 300 columns and 5000 rows in my case) is updated regularly.
As far as I can tell, most of the slowness comes from the fact that the values pushed into the datatable arrive as a whole dictionary rather than as in-cell modifications, so the JavaScript re-renders everything, loads slowly, and is not that stable.
Which leads me to this question: what would be the best way to achieve the goals mentioned above?
I thought that using Django and writing a custom JavaScript datatable could do the job, but I would like to get some advice and ideas before starting to code that (really) time-consuming solution.

Related

Best way to Handle large amounts of Data

Currently I am experiencing speed issues with parts of my application that load large amounts of data into a reporting table. The data in the reporting table is pulled from multiple tables by running some complex, but required, queries.
Besides optimizing the code, my question is how do you personally handle large amounts of data that need to be displayed to the user, and what is the best practice?
Currently I am processing all the data before hand and then generating a table via the data table javascript library.
Things I know:
The user doesn't need to see all the data at once
The user needs to be able to search through all the data
The user needs to be able to filter through the data
Is the best way really just to use a loading spinner, load only a small portion of the data when the page first loads, and then retrieve the rest via Ajax?
I feel like there has to be a better way
Thanks,
I think you're answering your own question a bit. Yes, it is better to not deliver the whole database to the user at once, this is why any RDBMS supports things like LIMIT. Your three criteria exactly match what a database system can do for you – queries of small subsets of data (i.e. pages), optionally filtered or matched based on a search query.
For simplicity of the front end, you can make the first page load via AJAX as well, though having it pre-rendered does make the page feel more responsive. Having said that, there are many existing solutions to this problem; some template engines and JS front-end frameworks (Vue.js SSR) support server-side pre-render.
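The LIMIT-style paging described above can be sketched as a small server-side helper. This is a minimal illustration; the function name and the response shape are my own, not from any particular framework:

```javascript
// Return one page of rows plus the metadata the front end needs to
// render a pager. `rows` stands in for the result of a filtered
// database query; in a real app you would push LIMIT/OFFSET (and the
// search/filter conditions) into the SQL itself instead of slicing
// an in-memory array.
function paginate(rows, page, pageSize) {
  const total = rows.length;
  const offset = page * pageSize;          // same role as SQL OFFSET
  return {
    page,
    pageSize,
    total,
    totalPages: Math.ceil(total / pageSize),
    rows: rows.slice(offset, offset + pageSize), // same role as SQL LIMIT
  };
}

// An Ajax endpoint would then answer requests like
// GET /report?page=3&pageSize=50 with JSON.stringify(paginate(...)).
```

The `total`/`totalPages` fields are what lets the front end draw the pager without ever receiving the full dataset.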

multiple users updating the one form

I have a website that manages real-world tabletop games via a PHP, jQuery, Bootstrap and MySQL setup.
It has been running very well for a number of years, but I am implementing a team game concept, which allows 2 "captains" to manage the pairings at the same time. The page itself does what I want it to do when one captain does all the data entry, but it is not really optimal for both to be doing it at the same time.
Once both players for a game have been selected, the row turns green.
The goal is that as a Captain selects a player from a drop down box, it should somehow update the other captains screen and vice versa.
Should I have some kind of timer going, and every X seconds refresh the page, form, etc? Has anyone done something similar to this in the past?
I am thinking of having a table in my database with each field on the form, and when it was last updated, then I could loop through the table and only update the most recent ones, but I feel this could be an extra layer that just may be over complicating it.
Any pointers would be appreciated
Refreshing the whole page is definitely possible, but I wouldn't recommend it.
You could execute an Ajax call every x seconds with setInterval(), requesting the current data from the server and checking whether anything has changed. You would also have to send new data back to the server whenever a captain changes a field.
A better approach would be to use sockets. They synchronize data across different browsers (almost) instantly, without the need to constantly poll the server.
You can take a look at socket.io for more information. It is a JavaScript package that makes the implementation of sockets fairly simple.
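The synchronization itself, including the "when was it last updated" idea from the question, boils down to a last-write-wins merge of field updates. Here is a sketch of that logic, independent of the transport; the event name, state shape, and update object are assumptions for illustration, with socket.io you would simply emit and receive these update objects on a socket:

```javascript
// Apply a field update coming from the other captain, keeping the
// newer value when both captains touched the same field.
// `state` maps field name -> { value, updatedAt } (a unix timestamp).
function applyRemoteUpdate(state, update) {
  const current = state[update.field];
  if (!current || update.updatedAt >= current.updatedAt) {
    state[update.field] = { value: update.value, updatedAt: update.updatedAt };
    return true;   // update accepted: re-render this field
  }
  return false;    // our local change is newer: ignore the stale update
}

// With socket.io the glue would look roughly like:
//   socket.on('pairing-update', u => {
//     if (applyRemoteUpdate(state, u)) renderField(u.field);
//   });
//   // on local change: applyRemoteUpdate(state, u); socket.emit('pairing-update', u);
```

Keeping this merge rule on both ends means it does not matter in which order the two captains' updates arrive.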

Angular.Js Performance, large dataset, ng-repeat, html table with filters and two way binding

So I have a simple page layout which includes a panel of filters and an HTML table of records using ng-repeat. I am using MVC5 and an AngularJS controller.
I may have to deal with up to 100000 records.
Filters will occur for most of the columns including dates and text fields
The records need to deal with two way binding (user has to select records which will be returned to the server).
I'd like to get opinions on the best design ideas for this, i.e.:
Would you load all the data to the browser upfront? If not, when would more data be requested from the server?
If all upfront, should two arrays be held, one for display and one with all the data?
Does AngularJS have limitations with what I am trying to do? Should I be using something else?
I've read limitTo and track by can be useful for filtering large datasets but would like to get others' thoughts.
I recently ran into a similar issue with ~60k items: filterable, expandable, full of icons in each entry and stuff like that. It was extremely heavy, and even though our team implemented some performance enhancements (like filtering, track by, limitTo, pagination) it was still quite a mess, especially in IE (even IE11), which we unfortunately have to support.
Of the aforementioned enhancements, pagination helped the most (as Nitishkumar Singh also suggests) but still wasn't enough for a smooth UX. Nitishkumar's answer sums up perfectly each point you asked about, so I would just like to point you towards React (very good documentation imho) and ngReact, which will help you achieve what you wish. Our team started to look into React and a possible integration into our already extensive AngularJS project(s) and realized it is quite a common thing to do. You will find several add-ons (such as ngReact, angular2react, react2angular, etc.) that help with the integration.
This is a codepen I worked on to test some features of React while learning how it actually works. I am no expert on React, but after a few days of digging and learning I could come up with a solution that now loads 3*20k items with several features and runs smoothly even on IE9.
My answer is not supposed to be an 'I suggest React because it is so cool', especially since I am no expert on React either; I just wanted to share this quite recent (actually ongoing) experience and how we overcame the problem.
At the very end we ended up with this tiny snippet in our template (check the codepen for full, just had to copy some code):
ReactDOM.render(
  <Header parents={parentArray} suppliers={supplierArray} bsrs={bsrArray}/>,
  document.getElementById('app')
);
Some further reading on AngularJS + React which I found useful:
https://blog.logentries.com/2016/02/combining-angularjs-and-reactjs-for-better-applications/
Can angular and react play together?
https://www.quora.com/Why-would-someone-combine-AngularJS-with-ReactJS-when-they-do-the-same-thing
I would say you have thought about this really well. I will answer your questions one by one:
No, loading all data upfront won't work; the client browser will hang or crash. You should implement pagination and fetch the data in chunks. If possible, don't hold too many rows in browser memory at once, since that will slow down your application in any case.
Maintaining two versions won't help; it will simply increase complexity and maintenance for the array. You will end up writing more code than expected.
I wouldn't say Angular has a limitation here, as loading 100000 rows at once won't work in any framework, such as React, Vue, etc.
Yes, you are right: limitTo and track by are the best options for Angular in the case of a large dataset.
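The "filter first, then cap what ng-repeat renders" pipeline can be sketched in plain JavaScript. The function name and return shape are illustrative; in an AngularJS template this is essentially what `items | filter:query | limitTo:pageSize` does:

```javascript
// Filter a large dataset by a text query across all string columns,
// then return only the slice the table actually renders, plus the
// total match count for a "showing N of M" label.
function filterAndLimit(rows, query, limit) {
  const q = query.trim().toLowerCase();
  const matched = q === ''
    ? rows
    : rows.filter(row =>
        Object.values(row).some(v =>
          typeof v === 'string' && v.toLowerCase().includes(q)));
  return { total: matched.length, visible: matched.slice(0, limit) };
}
```

On the template side, `track by row.id` then lets Angular reuse DOM nodes instead of rebuilding them whenever the visible slice changes, which is where much of the ng-repeat cost goes.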

how much data can charts js handle

For my application, I am making a get request of thousands of data points.
When I use Chart.js to display the data, it takes a long time to render and I experience lag. I also noticed that the x-axis labels for each data point don't render properly, so they had to be omitted.
I like the sleek design and UI of the graphs, but I cannot get it to work well for my use case. Is Chart.js not meant to be used with large data sets? Is there another library like Chart.js that can handle large data sets while also being free?
If you want to handle big data you should use Highcharts.
It can easily handle several million data points without much delay.
Another option to consider is ZingChart. It is free as a branded version, but renders large amounts of data quickly while still maintaining flexibility in customization. If you are looking for a sleek design and UI, ZingChart allows the user to change just about every size, shape, and color to match your taste.
Full disclosure, I am on the ZingChart team. However, we developed a speed test tool that I think you will find helpful in testing your number of data points, regardless of which library you end up selecting. Note that some of these libraries will use up all your browser memory, so proceed with caution in some cases.
I had the same problem; Chart.js seems to be unable to handle large data sets. The best alternative I've found is https://github.com/danvk/dygraphs . You could also try http://canvasjs.com/ although it is commercial.
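Whichever library you pick, rendering fewer points than you fetched often helps more than switching libraries. A minimal every-nth downsampler, as a generic sketch rather than part of any chart library's API (newer Chart.js versions also ship a built-in decimation plugin, but thinning the data yourself works everywhere):

```javascript
// Keep at most `maxPoints` points by taking every nth sample.
// Good enough for showing trends; use min/max bucketing instead if
// the chart must not miss short spikes between the kept samples.
function downsample(points, maxPoints) {
  if (points.length <= maxPoints) return points;
  const step = Math.ceil(points.length / maxPoints);
  const out = [];
  for (let i = 0; i < points.length; i += step) out.push(points[i]);
  return out;
}
```

Feeding the chart `downsample(data, 500)` instead of tens of thousands of raw points also sidesteps the crowded x-axis label problem from the question.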
Have a look at LightningChart JS. It is made with WebGL. It can render:
1 million data points in ~80 ms in a line chart
10 million data points in ~800 ms
Those figures are for static data, measured on my PC (AMD Ryzen, NVIDIA GTX 1060).
For scrolling streaming data, the performance is even more impressive: dozens of millions of points, and with some configurations in the Firefox browser, over 100 million points.
There is a chart performance tester application.
Full disclosure: I work with the team making this chart.

Handling large amounts of data in Angular

I'm going to build a SPA with Angular. The app will be composed of three tabs. Every tab will hold large amounts of data (basically 100-200 rows and various text fields, drop downs etc).
I'm facing objections from my colleagues to building this as a real SPA; they would like to separate it into three completely independent Angular applications living in the same ASP.NET MVC website.
My question is: Will holding such data on the client side cause browser or rendering issues? Are they right thinking this is dangerous?
I'd check to see how big a memory hit this is with a real data set and see if it would tax the resources available to most mobile devices.
My first thought would be 100 rows of data, with 100 columns each of 100 byte strings, would mean ~1MB of RAM per tab. Even my old phone would have enough RAM to handle that.
"Dangerous"? What can they do to manipulate that data in harmful ways?
The ~1MB estimate is based on raw data; if the page uses a lot of widgets that scale with the data, it can quickly become a nightmare unless data is loaded on demand.
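The back-of-the-envelope estimate above is easy to reproduce; note it covers the raw string payload only and deliberately ignores per-object and DOM/widget overhead, which in practice can dominate:

```javascript
// Rough raw-data footprint: rows * columns * average bytes per cell.
function estimateBytes(rows, columns, bytesPerCell) {
  return rows * columns * bytesPerCell;
}

// 100 rows x 100 columns of 100-byte strings per tab:
// estimateBytes(100, 100, 100) === 1000000, i.e. ~1 MB.
```

Running it against a realistic sample of your own data (longest rows, all three tabs) is a quick way to settle the argument with actual numbers.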
