I'm running into an interesting problem with LightningChart where it seems to be corrupting or otherwise decimating my data depending on how far the data is from the DateOrigin of the chart. My data is 1000 samples per second, and I am trying to display 1-2 weeks of data at a time. I am using the ChartXY class, the x-axis type is set to "linear-highPrecision" (which should have 1 ms accuracy, and that is all I need), and I am creating a LineSeries whose data pattern is 'ProgressiveX' with regularProgressiveStep: true.
Here's what it looks like when the data is plotted near the DateOrigin.
Here's what it looks like zoomed in on the data near the DateOrigin.
That looks fantastic! And LightningChart is doing exactly what I want!
However, I would like this data to be offset correctly to show its true absolute time.
Here's what it looks like when I offset this data by 14 days. My code to set the relative offset looks like this.
ds.addArrayY(curve.data, 1, 14 * 24 * 60 * 60 * 1000) // step = 1 ms, start = +14 days
Ok, it looks alright zoomed out, but what if we zoom in?
It's gone haywire! It looks like the X values are being coerced to some larger step of the X axis, and it gets worse the further you go from the DateOrigin. My fear is that this is some built-in behavior of the engine and I am expecting too much; however, the axis claims 1 ms resolution, so I expect that to be respected.
Here's how I create the chart.
// Create a Line Chart.
const PumpsChart = lightningChart().ChartXY({
    // Set the chart into the div with id 'PumpsChart'.
    // The chart's size will automatically adjust to the div's size.
    theme: Themes.lightGradient,
    container: 'PumpsChart',
    defaultAxisX: {
        type: 'linear-highPrecision'
    }
})
    .setTitle('') // Empty chart title
    .setTitleFont(new FontSettings({
        family: waveChartFontFamily,
        size: 20
    }))
    .setMouseInteractionWheelZoom(false)

axisPumpsChartDateTime = PumpsChart.getDefaultAxisX()
    .setTickStrategy(
        AxisTickStrategies.DateTime,
        (tickStrategy) => tickStrategy.setDateOrigin(waveDateOrigin))

axisPumpsChartPressurePSI = PumpsChart.getDefaultAxisY()
    .setTitle("Pressure (PSI)")
    .setInterval(0, 10000, 0, true)
Here's how I create the LineSeries:
newDataSeries = targetChart.chart.addLineSeries({
    yAxis: targetChart.axis,
    dataPattern: {
        pattern: 'ProgressiveX',
        regularProgressiveStep: true,
    }
});
Here's how I add data to the chart:
ds.addArrayY(curve.data, 1, 14 * 24 * 60 * 60 * 1000) // step = 1 ms, start = +14 days
I would prefer not to use AxisTickStrategies.Time over AxisTickStrategies.DateTime, for a few reasons: my data spans weeks and the Time strategy's roughly 100-hour span is too little; millisecond resolution is all I need, nothing finer; and I need to present my data in relative rather than absolute time.
Hopefully there's some parameter that I'm missing that I can adjust to achieve this.
EDIT
Well, this corruption is also happening with the Time tick strategy when the data is offset -636 hours relative to the origin. I tried this with and without 'ProgressiveX' set as dataPattern.pattern.
EDIT 2
Well, I even tried downsampling to 20 samples per second and changed back to AxisTickStrategies.DateTime; it's still "squishing" all the points to this magic 0.25-second interval for some reason.
I tried to produce a reference application for a similar situation: DateTime ticks with high-resolution time data (1000 Hz). The snippet below should run right here in Stack Overflow, generating a rather large set of test data (it might take a while) and using a date origin to show it with a high zoom range.
Does this work for you? If yes, then maybe there is something off in your application which you could spot with the help of this reference application.
const { lightningChart, AxisTickStrategies } = lcjs
const { createProgressiveTraceGenerator } = xydata

createProgressiveTraceGenerator()
    .setNumberOfPoints(1000 * 60 * 60 * 4)
    .generate()
    .toPromise()
    .then((data) => {
        // Just for generating the test data set
        const dataStart = Date.now()
        return data.map((p) => ({ x: dataStart - 1000 * 60 * 60 * 4 + p.x, y: p.y }))
    })
    .then((data) => {
        // 1. Offset data X by date origin
        const dateOrigin = new Date()
        const tDateOrigin = dateOrigin.getTime()
        data = data.map((p) => ({ x: p.x - tDateOrigin, y: p.y }))

        const chart = lightningChart().ChartXY({ disableAnimations: true })
        const axisX = chart
            .getDefaultAxisX()
            .setTickStrategy(AxisTickStrategies.DateTime, (ticks) =>
                // 2. Inform the tick strategy of the date origin
                ticks.setDateOrigin(dateOrigin),
            )
        const series = chart.addLineSeries({
            dataPattern: {
                pattern: 'ProgressiveX',
                regularProgressiveStep: true,
            },
        })
        series.add(data)
        chart.addLegendBox().add(chart)
    })
<script src="https://unpkg.com/#arction/lcjs#3.4.0/dist/lcjs.iife.js"></script>
<script src="https://unpkg.com/#arction/xydata#1.2.1/dist/xydata.iife.js"></script>
Looking back at your question, perhaps the original issue was with the tick formatting? When you zoom in this example, it ultimately shows just seconds on each tick rather than a full timestamp. Was this the problem? If yes, then the issue is about formatting axis ticks, and I'll have to see what can be done about that.
Could it be 32-bit floating point precision loss? Two weeks in seconds amounts to 1,209,600, and log2(1,209,600) ≈ 20.2, i.e. 21 bits of storage space. The mantissa of 32-bit binary floats is just 23 bits, leaving you with 2 bits for the fractional part, which would explain the 0.25 increments.
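You can sanity-check this in plain JavaScript, since Math.fround rounds a number to the nearest representable 32-bit float (a quick sketch of the arithmetic only, not LightningChart's actual internals):
// Two weeks in seconds, with small fractional offsets pushed through float32
const twoWeeksInSeconds = 14 * 24 * 60 * 60;         // 1,209,600
console.log(Math.fround(twoWeeksInSeconds + 0.001)); // 1209600 (sub-millisecond offset lost entirely)
console.log(Math.fround(twoWeeksInSeconds + 0.1));   // 1209600.125
console.log(Math.fround(twoWeeksInSeconds + 0.3));   // 1209600.25
// Directly representable steps at this magnitude are 1/8 s (the implicit
// leading mantissa bit buys one more bit than the rough count above); further
// float32 arithmetic in the rendering pipeline could plausibly coarsen that
// to the observed 0.25 s.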
So if I'm correct, you would need the X position precision bumped to 64-bit floats. You're already using the "linear-highPrecision" axis mode, though, which would have been my first guess at a solution; so as far as I can tell, it doesn't actually increase the data precision to 64 bits, only the axis's. Unless there's another solution I've missed, you would probably need to split your data into separate series.
EDIT: Now that I've looked it up, I'm not actually sure that's a problem on LightningChart's end. OpenGL ES 3.0 (on which WebGL 2 relies, and in turn LightningChart) requires that "high-precision" float parameters be stored in the binary32 IEEE 754 standard float format.
Official spec, p. 53:
highp floating point values are stored in IEEE 754 single precision floating point format. Mediump and lowp floating point values have minimum range and precision requirements as detailed below and have maximum range and precision as defined by IEEE 754.
So given that information, it seems to be a bug in LCjs caused by a WebGL technical limitation.
Related
Following the c3js documentation, there is no option for a bubble chart. One workaround is to set up a scatter plot and specify the point radius, but then all of the bubbles will be the same size.
point = {
    r: function(d) {
        var num = d.value;
        return num;
    }
};
Returning the point's value from r solves that problem, but now the question is how to handle very high or very low values. For example, if there is a 1,000,000 value, the whole chart will be covered by one bubble. Are there any simple workarounds for that?
First of all, set r to return the square root of your chosen variable, e.g. return Math.sqrt(num); that way a circle representing a data point 100 times the size of another has 100, not 10,000, times the area (area = πr² and all that).
If the numbers are still too big, use a linear scale to restrict them to a usable size:
rscale = d3.scale.linear().domain([1,1000]).range([0,10])
and then return rscale(Math.sqrt(num)).
If your problem is representing large and small values on the same chart, so that small values don't disappear and large values don't exceed the chart size, look at using a d3 log scale:
rscale = d3.scale.log().base(10).domain([1,1000]).range([0,10])
Of course, on a log scale the areas aren't linearly proportional any more, so whether the sqrt step is necessary is debatable. If you drop it, just remember to adjust the domain to account for this: change it to domain([1,1000000]).
If you don't know the size of your numbers beforehand, it is worthwhile looping through your dataset to pick out the min and max to plug into the domain: domain([your_min, your_max]). My examples above all assume a max of one million.
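Putting the sqrt, scale, and min/max steps together, a sketch (d3 v3 API as above; data and d.value are hypothetical stand-ins for your dataset):
// Build the radius scale from the actual data extent instead of hard-coding it
var values = data.map(function(d) { return d.value; });
var rscale = d3.scale.linear()
    .domain([Math.sqrt(d3.min(values)), Math.sqrt(d3.max(values))])
    .range([2, 10]); // a 2px floor keeps the smallest bubbles visible

var point = {
    r: function(d) {
        return rscale(Math.sqrt(d.value)); // sqrt first, then scale to pixels
    }
};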
Here's an example I forked on jsFiddle; numbers from a few hundred to over a hundred thousand are displayed using a log scale, and all are visible while the differences remain obvious:
http://jsfiddle.net/m9gcno5n/
I'm experiencing some undesired behavior when using the built-in data grouping for Highstock charts. The result appears to be some sort of conflict between the underlying data-grouping logic (only grouping visible points?) and the extremes/navigator.
Some context:
I have a chart where the data points occur roughly every 2 minutes. Aside from viewing the points individually, the user has the option to select a number of different groupings for the data points (15-minute intervals, hourly, and daily). The options for this are:
const dataGrouping = {
    enabled: true,
    forced: true,
    approximation: 'sum',
    units: [['minute', [15]]]
};
The units are swapped out depending on which option is selected, and this appears to be working as intended. Depending on which grouping the user has selected, I then force the navigator to snap to the nearest acceptable interval. For example, for the 15-minute interval:
const newMin = moment(extremes.min).minute(15 * Math.round(moment(extremes.min).startOf('minute').minute() / 15)).startOf('minute').valueOf();
let newMax = moment(extremes.max).minute(15 * Math.round(moment(extremes.max).startOf('minute').minute() / 15)).startOf('minute').valueOf() - 1000;
if (newMax > extremes.dataMax) {
    newMax = extremes.dataMax;
}
if (newMin !== extremes.min || newMax !== extremes.max) {
    this.chart.xAxis[0].setExtremes(newMin, newMax, true, true, { trigger: 'adjExtremes' });
}
This also appears to work as expected, with one small flaw: the right-most point on the graph trails off incorrectly. For instance, if the right-most point is at 3:00 PM and is meant to be an aggregation of the values between 3:00 PM and 3:15 PM, the 3:00 PM point will only sum the points between 3:00 PM and 3:01 PM, because 3:01-3:15 are not technically visible on the graph (the data is there, just not in the visible range).
Bad Right-Most Point
I've tried a number of different things to no avail. Subtracting a second from the max extreme removes the point, so it can no longer show an incorrect value on hover, but the line still trails off to the bottom. Given the nature of the issue, I believe that no matter what I set the extreme to, I will continue to have the problem. The Highstock demo example for data grouping doesn't appear to have this issue (points right at the edge can be seen going up, even when the point/group isn't drawn on the graph).
My thought is that the correct solution would be to force Highstock to group points as long as the data exists, even if the points wouldn't technically be drawn on screen (see the sketch below). I've tried turning off the various thresholds, but that also has no effect.
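For what it's worth, a sketch of what that could look like: later Highstock releases (not the v4.x line) document a dataGrouping.groupAll flag for grouping over all points rather than only the visible ones. Treat the flag name and its availability in your version as assumptions to verify against the API docs:
const dataGrouping = {
    enabled: true,
    forced: true,
    approximation: 'sum',
    units: [['minute', [15]]],
    groupAll: true // group using all data, not only the points in the visible range
};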
I'm using Highstock (v4.2.3) to present data in a StockChart with a number of different Y axes, all plotted against time on the X axis. The data has gaps in it, and I'd like to depict those gaps. But when I turn on gapSize (with any value other than zero), there's a weird quirk that causes line rendering issues: when using the navigator to zoom in on certain date ranges (not all), in some cases (whose pattern I've yet to discern) the chart fails to fully render the line across the entire x axis.
This annotated screenshot depicts the issue.
When I turn gapSize off (or explicitly set it to zero), this problem goes away. Note that the gaps themselves appear correctly on the chart (when navigating to a date range that doesn't present the line rendering issue).
plotOptions: {
    series: { gapSize: 2 }
}
Any ideas?
jsFiddle with your issue:
http://jsfiddle.net/2N52H/109/
As you can read in our API:
http://api.highcharts.com/highstock#plotOptions.line.gapSize
A gap size of 5 means that if the distance between two points is greater than five times that of the two closest points, the graph will be broken.
As far as I know, your data has random gaps, so you never know the distance between the two closest points. For example, if you have data roughly every hour but the two closest points are only 15 minutes apart and gapSize is set to 2, the line will be broken everywhere except between those closest points.
When you zoom, the closest distance among the visible points can change, so the gaps change as well.
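A worked illustration of the rule quoted above (hypothetical values): with gapSize: 2, a break is drawn wherever the distance between neighbours exceeds 2 × the closest distance among the visible points.
var MIN = 60 * 1000; // one minute in milliseconds
var xs = [0, 15 * MIN, 75 * MIN, 135 * MIN]; // closest pair is 15 minutes apart
var closest = 15 * MIN;
var gapSize = 2;
for (var i = 1; i < xs.length; i++) {
    var dist = xs[i] - xs[i - 1];
    // 15 min -> line; both 60 min gaps -> break, since 60 > 2 * 15
    console.log((dist / MIN) + ' min -> ' + (dist > gapSize * closest ? 'break' : 'line'));
}
Zoom in far enough that the 15-minute pair leaves the visible range, so the closest distance becomes 60 minutes, and those same gaps stop being drawn as breaks.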
See this example:
http://jsfiddle.net/2N52H/111/
Maybe you can use xAxis.ordinal parameter to visualise your gaps:
http://api.highcharts.com/highstock#xAxis.ordinal
You can also change the standard functionality by using a wrapper. Here you can read about it:
http://www.highcharts.com/docs/extending-highcharts/extending-highcharts
For example, you can change the gappedPath function:
(function(H) {
    H.wrap(H.Series.prototype, 'gappedPath', function(proceed) {
        var gapSize = this.options.gapSize,
            xAxis = this.xAxis,
            points = this.points.slice(),
            i = points.length - 1;
        if (gapSize && i > 0) { // #5008
            // extension for ordinal breaks
            while (i--) {
                if (points[i + 1].x - points[i].x > gapSize) {
                    points.splice( // insert after this one
                        i + 1,
                        0,
                        { isNull: true }
                    );
                }
            }
        }
        return this.getGraphPath(points);
    });
}(Highcharts));
example:
http://jsfiddle.net/2N52H/113/
Kind regards.
In d3, if you want to create an axis you might do something like this:
var xAxis = d3.svg.axis()
.scale(x)
where x is a scale function. I understand that the domain of x defines the start and ending values for the ticks. I'm having trouble understanding how the range of x changes the resulting axis. What does the domain map to in the context of an axis?
Think about what one must do to create a visual representation of any data set. You must convert each data point (e.g. 1 million dollars) into a point on the screen. If your data has a minimum value of $0 and maximum value of $1000000, you have a domain of 0 to 1000000. Now to represent your data on a computer screen you must convert each data point (e.g. $25) into a number of pixels. You could try a simple 1 to 1 linear conversion ($25 converts to 25 pixels on the screen), in which case your range would be the same as your domain = 0 to 1000000. But this would require a bloody big screen. More likely we have an idea of how large we want the graphic to appear on the screen, so we set our range accordingly (e.g. 0 to 600).
The d3 scale function converts each data point in your dataset into a corresponding value within your range. That enables it to be presented on the screen. The previous example is a simple conversion so the d3.scale() function is not doing much for you, but spend some time converting data points into a visual representation and you will quickly discover some situations where the scale function is doing a lot of work for you.
In the particular case of an axis, the scale function is doing exactly the same thing. It is doing the conversion (to pixels) for each 'tick' and placing them on the screen.
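A concrete version of the example above (d3 v3 API, to match the question): dollar amounts in [0, 1000000] mapped onto [0, 600] pixels.
// Data space (dollars) in the domain, screen space (pixels) in the range
var x = d3.scale.linear()
    .domain([0, 1000000])
    .range([0, 600]);

console.log(x(25));      // 0.015: $25 sits a fraction of a pixel from the left edge
console.log(x(500000));  // 300: halfway through the data, halfway across the screen

// The axis uses exactly the same mapping to position each tick
var xAxis = d3.svg.axis().scale(x);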
Is there any possibility to limit the number of integer labels a d3.svg.axis displays on the graph? Take for instance this graph: there are only 5 sizes here, [0, 1, 2, 3, 4], yet ticks are also displayed at 0.5, 1.5, 2.5, and 3.5.
You should be able to use d3.format instead of writing your own format function for this.
d3.svg.axis()
.tickFormat(d3.format("d"));
You can also use tickFormat on your scale, which the axis will automatically use by default.
I've realised that it is enough to hide these values rather than completely exclude them. This can be done using tickFormat (https://github.com/mbostock/d3/wiki/Formatting#wiki-d3_format), like so:
d3.svg.axis()
    .tickFormat(function(e) {
        if (Math.floor(e) !== e) {
            return;
        }
        return e;
    });
The use of d3.format("d") as tickFormat might not work in all scenarios (see this bug). On Windows Phone devices (and probably others), imprecise calculations may lead to situations where ticks are not correctly detected as integer numbers, i.e. a tick of "3" may be internally stored as 2.9999999999997, which for d3.format("d") is not an integer number.
As a result, the axis is not displayed at all. This behaviour becomes especially apparent with tick ranges up to 10.
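A quick reproduction of that failure mode (d3 v3 semantics, where the "d" format yields an empty string for non-integers):
var f = d3.format("d");
console.log(f(3));               // "3"
console.log(f(2.9999999999997)); // "" (the label, and with it the tick, vanishes)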
One possible solution is to tolerate the machine error by specifying an epsilon value. The values do not have to be exact integers this way, but may differ from the nearest integer by up to epsilon (which should be well below your data's resolution):
var epsilon = Math.pow(10, -7);
var axis = d3.svg.axis().tickFormat(function(d) {
    if (((d - Math.floor(d)) > epsilon) && ((Math.ceil(d) - d) > epsilon))
        return;
    return d;
});
Another solution, which is also mentioned in other threads, is to specify a fixed tick count and then round the ticks (which should now be "almost" integers) to the nearest integer. You need to know the maximum value of your data for this to work:
var axis = d3.svg.axis().ticks(dataMax).tickFormat(d3.format(".0f"))