How to reduce memory footprint for dygraphs graph - javascript

I'm using dygraphs to produce a canvas-based stock chart in a web trading platform I'm developing -- source here. Data is delivered to the web client via a WebSocket connection.
If you look at the source, you'll see I'm appending data for the chart to an array, chartData, as data comes in over the socket (line 100), and then I'm passing the data to the dygraphs chart via updateOptions (line 111), causing the chart to redraw itself using the most current data.
This works fine, and it performs well. However, after about an hour, by which time maybe 10,000 data items have been appended to the chart, the page (I'm using Chrome) crashes, probably due to memory usage. The data is stored in both the chart AND the array (chartData), so I imagine that's a good chunk of memory for one web page. Plus, I'm using ExtJS, which is a hog :)
Does anyone have suggestions on how I can reduce the memory footprint for the chart?

Besides the obvious, "Don't use Ext", I can offer several guesses, but nothing definitive.
If, as I assume, much of the data being used is not in a currently viewable portion of the chart, then perhaps you can simply remove it. After you have enough data to fill the chart, every time you add n records to the end, splice off n records from the beginning.
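A minimal sketch of that sliding window around the poster's chartData array, assuming the data reaches the chart via updateOptions with the file option as in the question (MAX_POINTS is an assumed cap to tune):

    // Keep memory bounded: append the new point, drop the oldest overflow.
    var MAX_POINTS = 10000;

    function onNewPoint(point) {
      chartData.push(point);
      if (chartData.length > MAX_POINTS) {
        // Splice off as many old records from the front as were appended.
        chartData.splice(0, chartData.length - MAX_POINTS);
      }
      chart.updateOptions({ file: chartData });
    }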
If the data is coming in faster than you can comfortably render it (unlikely, but possible), swap rendered charts in and out: collect the incoming data into a group; at a set interval, clone that group into your rendering area and use it to render a new chart; when rendering is complete, place it in the DOM, discarding the old one. A sketch follows.
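A hedged sketch of that double buffering, assuming two chart containers (#chartA, #chartB) and explicit width/height in the dygraphs options so the hidden container can still be drawn into:

    // Batch incoming points, redraw on a timer into the hidden container,
    // then flip which container is visible.
    var buffer = [];
    socket.onmessage = function (e) { buffer.push(JSON.parse(e.data)); };

    var front = document.getElementById('chartA'); // visible
    var back  = document.getElementById('chartB'); // visibility: hidden
    var backGraph = null;

    setInterval(function () {
      if (buffer.length === 0) return;                // nothing new this tick
      chartData = chartData.concat(buffer.splice(0)); // drain the batch
      if (backGraph) backGraph.destroy();             // free the stale graph
      backGraph = new Dygraph(back, chartData, options);
      back.style.visibility = 'visible';              // swap the containers
      front.style.visibility = 'hidden';
      var tmp = front; front = back; back = tmp;
    }, 1000);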
But simply removing older data might solve many of your problems...
... especially if you get rid of Ext.

Related

what should be the initial dataset from millions of data points for stock line highcharts

I want to prepare a stock line chart with Highcharts, like this example: https://www.highcharts.com/demo/stock/lazy-loading
In the given example, when you load the chart for the first time, it calls https://demo-live-data.highcharts.com/aapl-historical.json and fetches some points, to be precise 0-165 records (if you check the network tab and the Ajax call). At the same time, the All option is selected in the time-range tool.
If you drill down further or go to any specific time range, it always fetches more data from the server.
Question: if you have millions of data points, say spanning the years 2000 to 2022, then what are you going to display for the All option? What should the initial data set, result, or filter be?
NOTE: I will have millions of data points from 2000 to 2022 and onward. When I load the chart for the first time, which of these millions of points should come back from the server?
Just for reference, you can check an example of the time-series data I'm going to have in the mock-data=>i.js folder/file, which is NOT being used anywhere in the example below as of now.
Highcharts 1.7 million points example: https://stackblitz.com/edit/js-wng4y6?file=index.js
P.S.: I'm new to Highcharts Stock and I can't seem to find a proper explanation anywhere. I'm trying to reach out to the community for further help.
Server-side data grouping should be done based on the range for which you are trying to group the data, so All by itself means nothing; in your case it spans 22 years (2000 to 2022).
For the data grouping you might also take the chart size into account (the client-side dataGrouping feature in Highcharts Stock does this by default). When the relevant information is passed to the server, it should return a set of grouped data points.
You can find more about the grouping logic in the API options, where the approximation method applied inside each group is documented.
https://api.highcharts.com/highstock/series.area.dataGrouping.approximation
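For illustration, a minimal sketch of the client-side equivalent using the documented dataGrouping options (the units and the 'average' choice here are assumptions to adapt):

    // Client-side grouping: Highcharts Stock collapses the points that fall
    // into each time unit using the chosen approximation.
    Highcharts.stockChart('container', {
      series: [{
        type: 'area',
        data: points, // assumed [timestamp, value] pairs from the server
        dataGrouping: {
          enabled: true,
          approximation: 'average', // or 'open', 'high', 'low', 'close', 'ohlc'
          units: [
            ['day', [1]],
            ['week', [1]],
            ['month', [1, 3, 6]]
          ]
        }
      }]
    });

A server-side endpoint can mirror the same idea: pick a bucket size from the requested range and the chart's pixel width, and return one aggregated point per bucket.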
Sending this much data to Highcharts to be processed is asking for issues. I highly recommend setting up a local Highcharts export server (something they support) and doing this within your own system. See it here
This is very important when it comes to security as well (if your data is sensitive): having it race across the internet to Highcharts and then back to you leaves it open to the world.
From here, you can also specify the start and end time of each render, and have that change based on user input. Personally, I would generally display the last 5 days or so, and then if someone wanted to, they could pull the slider all the way back for the last significant amount of time.
But, to answer your question: when you send a data object to an export server, either a local one or the Highcharts server, you get back a base64 image that you can embed directly in your UI.
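A hedged sketch of that round trip, assuming the node-based highcharts-export-server running locally on its default port (the endpoint shape follows its documented POST API; adjust to your deployment):

    // Ask the export server for a base64 PNG of a chart built server-side.
    fetch('http://localhost:7801/', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        infile: { series: [{ data: aggregatedPoints }] }, // the chart options
        type: 'png',
        b64: true // return a base64 string instead of a binary file
      })
    })
      .then(function (res) { return res.text(); })
      .then(function (b64) {
        // Embed the rendered chart directly in the UI.
        document.getElementById('chart-img').src = 'data:image/png;base64,' + b64;
      });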

Angular rendering 80+ charts using highcharts-ng

I've got a performance issue/question. I'm rendering about 80 area charts on a single page using Angular and highcharts-ng. When the data is loaded and Angular binds it to the charts, my browser does not respond for about 2 seconds. Maybe that's not a huge amount of time, but still...
What is the reason? Is it an Angular-to-chart binding issue, or just chart rendering by Highcharts?
Is there a way to make it a little faster, or at least to stop it from hanging the browser?
EDIT:
The response comes from the server really fast and the data size is quite small. When I turn the charts off (ng-if="false"), the rest of the data loads really fast without any performance issues.
Each area chart has at most 12 data points.
Old post, but in case anyone else encounters something similar: watch out for what you put inside the data objects in highcharts-ng, because everything gets $watched. In my case I put arrays of objects into the chart data so I could view details when a graph segment was clicked. However, this caused horrendous performance issues as the digest cycle became huge. I solved it by putting a function that returns the array into the chart data, instead of the array itself (sketched below).
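A minimal sketch of that workaround; the config shape and the getDetails name are assumptions. The point is that Angular's deep watch (angular.equals) ignores function-valued properties, so the heavy array never gets walked during a digest:

    // Bad: the huge detail array rides along in the watched chart config,
    // so every digest cycle traverses all of it.
    //   series: [{ data: points, details: hugeArrayOfObjects }]

    // Better: hide it behind a function, which the deep watch skips.
    $scope.chartConfig = {
      series: [{
        data: points, // small numeric array, cheap to watch
        getDetails: function () { return hugeArrayOfObjects; } // assumed name
      }]
    };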
At the API/server end
I also have an Angular app that had similar performance issues (one difference: I'm using Highcharts, but without the highcharts-ng directive). On the Angular and server-call end, when there is too much data it takes time to get it onto the page: the API response itself comes back fast, but the request is effectively stalled while the data is loaded into the page and handed to Highcharts (or any other DOM markup).
My response data had many fields (timestamp, value, average, etc.). I optimized the query response to return only the required fields, which made it faster.
To make chart rendering faster:
Highcharts recently released the Boost module (boost.js), which improves the performance of various Highcharts chart types. It's a module that allows quick loading of hundreds of thousands of data points in Highcharts; a sketch of enabling it is below.
See here: Highcharts Boost.js module official page
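A minimal sketch of turning Boost on, assuming the boost.js module script is loaded alongside Highcharts (the threshold here is an assumed tuning value):

    // Boost renders large series with WebGL instead of SVG once a series
    // crosses boostThreshold points.
    Highcharts.chart('container', {
      boost: {
        useGPUTranslations: true
      },
      series: [{
        type: 'area',
        boostThreshold: 5000, // switch this series to boost mode above 5k points
        data: largePointArray
      }]
    });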

Where to put/generate large javascript data files in front end with cakephp

I have an app that uses a large database to fill in Google Maps and chart data. This ends up being about 5,000 lines per column and about 20 columns. The issue I'm running into is whether to put this data in the view template, which makes my source code several thousand lines long, or to generate a JavaScript file for each instance and include it in the view. The problem with that method is that I'm generating files with no way to delete them from the webroot folder (without a cron job to go through and delete old ones). I was wondering what the solution for this is.
Of course, but as a developer you are responsible for delivering fast websites; you cannot fetch all of the data. For example, when using Google Maps it is common practice to display a limited amount of data according to the displayed area (rectangle). When using charts, you should respond only with already-aggregated data.
There is almost never a reason to display all the data to the user. Ask yourself what you would do when you saw 5,000 rows of data on a website: do you need it all at once? No.
Use AJAX to fetch only the rows you need right now.
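For the map case, a hedged sketch of bounded fetching with the Google Maps JavaScript API and jQuery (the /markers endpoint and its bounding-box parameters are assumptions standing in for your CakePHP controller):

    // Fetch only the rows inside the rectangle the user can currently see,
    // refreshing whenever the map stops moving.
    google.maps.event.addListener(map, 'idle', function () {
      var b = map.getBounds();
      $.getJSON('/markers', {
        north: b.getNorthEast().lat(),
        east:  b.getNorthEast().lng(),
        south: b.getSouthWest().lat(),
        west:  b.getSouthWest().lng()
      }).done(function (rows) {
        renderMarkers(rows); // assumed helper that draws just these rows
      });
    });

This also sidesteps the generated-file problem entirely: nothing is written to webroot, so there is nothing to clean up.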

AngularJS - Large sets of data

I've been pondering moving our current admin system over to a JS framework for a while and I tested out AngularJS today. I really like how powerful it is. I created a demo application (source: https://github.com/andyhmltn/portfolio-viewer) that has a list of 'items' and displays them in a paginated list that you can order/search in realtime.
The problem that I'm having is figuring out how I would replicate this kind of behaviour with a larger data set. Ideally, I want to have a table of items that's sortable/searchable and paginated that's all in realtime.
The part that concerns me is that this table will have at least 10,000 records. Currently that's no problem, as a PHP file limits the query to the current page and appends any search options to the end. The demo above only has about 15-20 records in it. I'm wondering how hard it would be to do the same thing with such a large number of records without pulling all of them into one JSON request at once, as that would be incredibly slow.
Does anyone have any ideas?
I'm used to handling large datasets in JavaScript, and I would suggest that you:
use pagination (either server-side or client-side, depending on the actual volume of your data; see below)
use Crossfilter.js to group your records and adopt a several-levels architecture in your GUI (records per month; on double-click, records per day for the clicked month; etc.), as sketched after this answer
An indicator I often use is the following:
rowsAmount x columnsAmount x dataManipulationsPerRow
Also, consider the fact that handling large datasets and displaying them are two very different things.
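A hedged sketch of the Crossfilter grouping idea, assuming records shaped like {date: Date, value: Number} (the month/day accessors are illustrative):

    // Top level: one aggregated row per month.
    var cf = crossfilter(records);
    var byMonth = cf.dimension(function (d) {
      return new Date(d.date.getFullYear(), d.date.getMonth(), 1).getTime();
    });
    var perMonth = byMonth.group().reduceSum(function (d) { return d.value; });
    var monthRows = perMonth.all(); // render these first

    // Drill-down: restrict to the clicked month, then regroup per day.
    // (Groups on byDay observe the filter applied on byMonth.)
    function drillInto(monthStart, monthEnd) {
      byMonth.filterRange([monthStart, monthEnd]);
      var byDay = cf.dimension(function (d) {
        return new Date(d.date.getFullYear(), d.date.getMonth(), d.date.getDate()).getTime();
      });
      return byDay.group().reduceSum(function (d) { return d.value; }).all();
    }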
Indeed, pulling so many rows in one request would be a killer. Fortunately, Angular has the ng-grid component, which can do server-side paging (among many other things). Instructions are provided in the given link.
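A hedged sketch following the shape of the ng-grid server-side paging example (the /api/items endpoint and its response fields are assumptions):

    // Only one page of rows crosses the wire per request.
    $scope.pagingOptions = { pageSizes: [50], pageSize: 50, currentPage: 1 };
    $scope.totalServerItems = 0;

    function getPage(pageSize, page) {
      $http.get('/api/items', { params: { limit: pageSize, page: page } })
        .then(function (res) {
          $scope.myData = res.data.rows;            // just this page
          $scope.totalServerItems = res.data.total; // lets the pager size itself
        });
    }

    // Refetch whenever the user flips pages or changes the page size.
    $scope.$watch('pagingOptions', function (newVal, oldVal) {
      if (newVal !== oldVal) getPage(newVal.pageSize, newVal.currentPage);
    }, true);

    $scope.gridOptions = {
      data: 'myData',
      enablePaging: true,
      showFooter: true,
      totalServerItems: 'totalServerItems',
      pagingOptions: $scope.pagingOptions
    };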

Efficient way to display lots of information in Javascript

I am continually pulling in a list of over 200 entries from a remote server and showing them in a table (many pieces of data per row). The data changes and fluctuates over time. What's the most efficient way in JS to render this to the screen? As the data changes, do I simply wipe my DOM and re-render with the new data, or do I keep a list of ids that lets me update just the correct tr DOM entries?
It's already sorted and filtered to be the correct information (server-side), so it's just displaying it to the user that I'm concerned with.
Any suggestions would be helpful. I'm using jQuery for my requests, but that can be changed. I need to know the quickest (latency-wise) way of pulling this large dataset and displaying it in a tabular format.
Edit: Pagination is not possible in this scenario.
Edit2: The backend is Python (WSGI) running behind a load balancer and is capable of serving 600 requests/second. The data is only available as JSON, and the backend is more of an API.
Are you using HTML tables? Some time ago I came across a Microsoft blog post from the IE team stating that rendering tables is slow. Very slow. Re-rendering the whole table every time will probably be slower than updating its values, but that is just my guess.
In fact, my opinion here is that updating values will be faster than re-rendering elements on the page. To optimize your latency, you could keep a hash of row objects mapped by id, so you don't have to look up the DOM every time you update the values.
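A minimal sketch of that cache, assuming each record carries an id and an ordered fields array (both names are illustrative):

    // id -> cached <tr>, so updates never query the DOM.
    var rowCache = {};
    var tbody = document.getElementById('data-body'); // assumed table body

    function upsertRow(record) {
      var tr = rowCache[record.id];
      if (!tr) {
        tr = document.createElement('tr');
        record.fields.forEach(function () { tr.insertCell(); });
        tbody.appendChild(tr);
        rowCache[record.id] = tr;
      }
      record.fields.forEach(function (value, i) {
        // Touch the DOM only when a value actually changed.
        if (tr.cells[i].textContent !== String(value)) {
          tr.cells[i].textContent = String(value);
        }
      });
    }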
I quite like the 'live image search' pattern, which loads more data as you scroll down.
If your data list is getting really large, consider not displaying all of it to the user at one time (if your situation allows it). Not only will the user get lost looking at that much data, but the browser will have to render the whole list, which slows you down.
You don't mention what server-side technology you are using. If you are on .NET, there are a couple of ASP.NET controls that use a data pager (like the GridView). There is also a PagedDataSource object you might look into (it can be used with any ASP.NET control that has a DataSource property). Both break your data up into pages, and only the viewed page is rendered at a time. This decreases latency dramatically.
