Change the domain of my line chart at runtime - JavaScript

In my code I am loading a JSON file of more than 900 records. These records represent the data emitted by some machines. I'm drawing a line chart, and the keys in this JSON represent the names of the machines.
This is the structure of my JSON:
{"AF3":3605.1496928113393,"AF4":-6000.4375230516,"F3":1700.3827875419374,"F4":4822.544985821321,"F7":4903.330735023786,"F8":824.4048714773611,"FC5":3259.4071092472655,"FC6":4248.067359141752,"O1":3714.5106599153364,"O2":697.2904723891061,"P7":522.7300768483767,"P8":4050.79490288753,"T7":2939.896657485737,"T8":9.551935316881588}
Each line represents one machine, and I put a space between them so each machine can be seen separately. I am currently reading the data with the help of a counter called cont. All the data in the JSON lies between 0 and 5000, but I have modified some objects of the JSON so that the domain changes, and the new domain must then apply to all the lines.
For example, on line 106 of the JSON I set "AF3": 7000 (in this case the domain should become [0, 7000] for all the lines).
On line 300, "AF4": -1000 (in this case the domain should become [-1000, 7000] for all the lines).
I have modified some data on purpose to achieve this change. I would like all lines to be updated to this new domain, if possible with an animation.
How can I do it?
this is my code:
http://plnkr.co/edit/KVVyOYZ4CVjxeei7pd9H?p=preview

To update the domain across all the lines, we need to recalculate the domain before the new data gets pushed in.
Plunker: http://plnkr.co/edit/AHWVM3HT7TDAiINFRlN9?p=preview
// extent of the incoming data point across all machine ids
var newDomain = d3.extent(ids.map(function(d) {
    return aData[cont][d];
}));
// only ever widen the current domain, never shrink it
var oldDomain = y.domain();
newDomain[0] = newDomain[0] < oldDomain[0] ? newDomain[0] : oldDomain[0];
newDomain[1] = newDomain[1] > oldDomain[1] ? newDomain[1] : oldDomain[1];
y.domain(newDomain);
domain.text(y.domain());
With respect to the graph getting trimmed: the data manipulation (in your case, 14 arrays, a push and a shift operation on each array, plus the D3 transition) all has to happen within 1 ms, which may not be enough time. Unfortunately I don't have any resource to back this up; in case anyone can edit this answer to provide proof, please feel free.
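If you also want the rescale itself to animate, here is a minimal sketch of one way to do it. It assumes the usual D3 v3 pattern of a y scale, a yAxis generator and one path per machine carrying a values array; these names are illustrative, not taken from the plunker:
// transition the y axis and every machine's path to the widened domain
svg.select(".y.axis")
    .transition()
    .duration(500)
    .call(yAxis);
svg.selectAll("path.line")
    .transition()
    .duration(500)
    .attr("d", function(d) { return line(d.values); });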

Related

Check Validity of Data Before Update Phase in D3.js

I have data which updates every 10 seconds and I would like to check that all the data is valid before progressing with updates. I am currently getting false data intermittently which occurs as a negative number in one of the values. If one of the objects has a negative value then I don't trust the whole set and don't want to update any elements.
Ideally I don't want to update some items and then bail once the incorrect value occurs, but rather determine whether the whole set is good before updating anything.
I'm not sure how D3 can manage this, but I've tried the following and it seems to work. It doesn't seem particularly in keeping with the elegance of D3, though, so I think there's probably a more correct and better way to do it. But maybe not?!
var dataValid = true;
abcItems.each(function (d, i) {
    // mark the whole set invalid if any item has Number2 greater than Number1
    if (0 > d.Number1 - d.Number2) dataValid = false;
});
if (dataValid) {
    abcItems.each(function (d, i) {
        // updating elements here
    });
} else {
    console.log("negative value occurred");
}
Is there a better way to manage this through D3?
A little bit more context:
The data (JSON provided via a RESTful API) and visualisation (a bar chart) are updating every 10 seconds. The glitch in the API results in incorrect data once every hour or so at the most (sometimes it doesn't happen all day). The effect of the glitch is that the bars all change dramatically whereas the data should only change by ones or twos each iteration. In the next fetch of data 10 seconds later the data is fine and the visualisation comes right.
The data itself is always "well-formed" it's just that the values provided are incorrect. Therefore even during the glitch it is safe to bind the data to elements.
What I want to do, is skip the entire iteration and update phase if the data contains one of these negative values.
Perhaps also worth noting is that the items in the data are always the same, that is to say the only "enter" phase that occurs is on page load and there are no items that exit (though I do include these operations to capture any unexpected fluctuations in the data). The values for these items do change though.
Looking at your code, it seems you have already bound the dataset to your DOM elements (abcItems.each(...)).
Why not bail out of the update function when the data is not valid?
d3.json("bar-tooltip.json", function(dataset) {
if (!dataset.every(d => d.Number2 <= d.Number1)) return;
// do the update of the graph
});
The example assumes you call d3.json() from a function that is called every update interval, but you can use a different update method.
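For completeness, a minimal sketch of such a polling loop with the validity check up front. The file name and field names come from the snippets above; the update body and the 10-second interval are illustrative, and the single-argument d3.json callback mirrors the answer's code:
function update() {
    d3.json("bar-tooltip.json", function(dataset) {
        // skip this whole cycle if the set is missing or any item fails the sanity check
        if (!dataset || !dataset.every(function(d) { return d.Number2 <= d.Number1; })) return;
        // safe to rebind and update every element now
        abcItems.data(dataset)
            .transition()
            .attr("height", function(d) { return d.Number1 - d.Number2; }); // illustrative update
    });
}
setInterval(update, 10000); // every 10 seconds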

Cesium large number of entity updates

I am working on a project dealing with sensor data. In my backend everything is stored in a database which is polled by a controller and converted into KML to display on the Cesium globe. This poll happens every 5-10 seconds and contains around 4000-8000 objects (we store up to 1 minute's worth of data, so we are looking at somewhere around 20k-50k points). On top of this I have an update function, run every 5 seconds, which slowly fades the markers out.
To load the KML onto the map I use the following code:
var dataSource = new Cesium.KmlDataSource();
dataSource.load('link').then(function(value) {
    viewer.dataSources.add(dataSource);
});
In the color-update function I am iterating over all of the objects within the data source's entity collection and updating them like so (this is very inefficient):
var colorUpdate = Cesium.Color.fromAlpha(newColor, .4);
dataSource.entities.values[i].billboard.color = colorUpdate;
When I do an add or a color update I see a large amount of lag, and I was curious whether there is anything you would suggest to fix this? Generally I get a freeze-up for a few seconds. After 60 seconds of the data being on the map it gets removed like so (just a different if case within the color-update loop):
dataSource.entities.remove(dataSource.entities.values[i]);
Is there potentially a way to set a property for an entire entity collection, so that when the collection becomes 30 seconds old it updates the color to a new one? It seems that I just need to find a way to set a property for the entire collection rather than for individual entities. Does anyone know how to do that, or have a suggestion for something better?
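One direction worth trying is to batch the per-entity changes so Cesium only reacts once per sweep. This is a sketch rather than a tested answer: it relies on Cesium's EntityCollection.suspendEvents()/resumeEvents(), which queue change notifications until the batch is done, and getAgeInSeconds() is an illustrative helper you would write yourself:
var entities = dataSource.entities;
var fadedColor = Cesium.Color.fromAlpha(newColor, 0.4);
entities.suspendEvents();                                 // batch change notifications
for (var i = entities.values.length - 1; i >= 0; i--) {   // iterate backwards so removal is safe
    var entity = entities.values[i];
    var age = getAgeInSeconds(entity);                    // illustrative helper, not a Cesium API
    if (age >= 60) {
        entities.remove(entity);
    } else if (age >= 30) {
        entity.billboard.color = fadedColor;
    }
}
entities.resumeEvents();                                  // listeners see one combined change event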

Big data amounts with Highcharts / Highstock (async loading)

Since my data volume grows every day (right now > 200k MySQL rows in one week), the chart is very slow at loading. I guess the async loading method is the right way to go here (http://www.highcharts.com/stock/demo/lazy-loading).
I tried to implement it, but it doesn't work. So far I can provide my data with Python via URL parameters, e.g. http://www.url.de/data?start=1482848100&end=1483107000, but there are several things I don't understand in the example code:
1. If a period of "All Data" is chosen in the Navigator, then all data is provided by my server and loaded by the chart. So it's the same as what I do right now without lazy loading. What's the difference then?
2. Why is there a second getJSON() call without any URL parameters in the above-mentioned example code? It's the following URL, which is empty. What do I need it for? I don't understand it:
https://www.highcharts.com/samples/data/from-sql.php?callback=?
3. And which method of loading the data is better? This one: chart.series[0].setData(data); or the code below, which I use so far:
var ohlc = [],
    volume = [],
    dataLength = data.length,
    i = 0;
for (i; i < dataLength; i += 1) {
    ohlc.push([
        data[i]['0'],   // date
        data[i]['1_x'], // open
        data[i]['2_x'], // high
        data[i]['3'],   // low
        data[i]['4']    // close
    ]);
}
The idea behind the lazy-loading demo is that you only fetch the number of points that is actually necessary, so even if your data set contains 1.7 million points, you never load that many points into the chart.
It is based on the Highcharts demo. Instead of loading too many points, you request points that are already grouped on the server: you have 1.7 million daily points, you set the navigator to 'All' (time range 1998 - 2011), you don't need daily data, so the response contains monthly points instead. The gains are: a much smaller amount of data fetched (12 * 14 = 168 points instead of 1.7 million), no heavy data processing on the client side (parsing, grouping, etc.), and therefore lower memory and CPU usage for the client and faster chart loading.
The request for the data is in JSONP format (more information about its advantages here). So the URL actually has 3 params: the mandatory callback=? and the optional start=?&stop=?, which indicate the time range of the points and thus their density. The first request does not have the start/stop params because the server has some default values already set. After the navigator is moved, more detailed points are requested and loaded into the chart. This is the downside of lazy loading: after the navigator is moved, you request a new set of data, which means frequent data requests and possible interruptions due to network failures.
The answer to your last question depends on whether your data is already in the proper format. If it is, you can avoid looping over the data on the client side and load it into the chart directly. If the format is not correct, you have to preprocess the data so that the chart can visualize it correctly. Ideally you want the data to be in the right format when you receive it, so if you can, do the conversion on the server side.
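For reference, the core of the lazy-loading pattern is an afterSetExtremes handler on the x axis that refetches only the visible range. A trimmed sketch using your own URL scheme from the question; initialData stands for the coarse data set loaded up front, and the JSON is assumed to already be in [time, value, ...] point format:
function afterSetExtremes(e) {
    var chart = Highcharts.charts[0];
    chart.showLoading('Loading data from server...');
    // ask the server for just the visible range; it decides how densely to group the points
    $.getJSON('http://www.url.de/data?start=' + Math.round(e.min) +
              '&end=' + Math.round(e.max), function (data) {
        chart.series[0].setData(data);
        chart.hideLoading();
    });
}
Highcharts.stockChart('container', {
    navigator: { adaptToUpdatedData: false },                // keep the navigator series unchanged
    xAxis: { events: { afterSetExtremes: afterSetExtremes } },
    series: [{ data: initialData, dataGrouping: { enabled: false } }]
});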

Error parsing data when drawing d3 line chart

I'm trying to draw a difference chart (like this one) but using a more modular style.
So far, I've got as far as reading in two dummy CSV files, combining the data, generating a chart, and am now trying to just draw a single line from one part of the data, but I keep getting an error. The full code is available on bl.ocks.org.
The error is:
Error: Problem parsing d="M0,221.73913043478262LNaN,182.60869565217394LNaN,195.6521739130435LNaN,156.52173913043478L500,91.30434782608697L500,91.30434782608697LNaN,156.52173913043478LNaN,195.6521739130435LNaN,182.60869565217394L0,221.73913043478262Z"
which occurs when doing this:
g.select('.line').attr('d', line);
At that point (as seen in the console), data is:
[{"year":1999,"imports":15,"exports":19},{"year":2000,"imports":18,"exports":20},{"year":2001,"imports":17,"exports":30},{"year":2002,"imports":20,"exports":32},{"year":2003,"imports":25,"exports":9}]
xScale.range() is:
[0, 500]
xScale.domain() is:
[1999, 2003]
yScale.range() is:
[300, 0]
and yScale.domain() is:
[9, 32]
I'm guessing there's a simple error somewhere in there, meaning the wrong data is being used to draw the line, but after several hours trying to fix this, I can't see what I've done wrong.
You are using an ordinal scale, which doesn't interpolate between values. The domain of that scale consists of two elements, and it will map those to the two elements in the output range. That is, 1999 is mapped to 0 and 2003 is mapped to 500. For any other inputs, the scale will return NaN as the value isn't in its input domain.
You can fix this by specifying all the years you want mapped in the domain and the corresponding output values in the range. In your case, the easiest fix would be to use a linear scale, though, as that seems to be what you're assuming your current scale does. You would simply need to replace the definition of the scale and how the range is set. This is what I have done here.
Alternatively, you could use a time scale as that would give you potentially better labels.
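A sketch of what that scale swap might look like (assuming D3 v3-style scale APIs; the domain and range values come from the console output quoted above):
// linear scale: interpolates between years instead of mapping only known values
var xScale = d3.scale.linear()
    .domain(d3.extent(data, function(d) { return d.year; }))  // [1999, 2003]
    .range([0, 500]);
// alternative: a time scale gives nicer tick labels
var xTimeScale = d3.time.scale()
    .domain([new Date(1999, 0, 1), new Date(2003, 0, 1)])
    .range([0, 500]);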

problems with latest cluster force layout example

Based on this work: http://bl.ocks.org/mbostock/7882658
If I substitute the automatic node creation with the JSON.stringify() output of the automatically generated data, like this...
var nodes = [
{"cluster":2,"radius":1.6180680659922448},
{"cluster":0,"radius":3.3575295077569},
{"cluster":1,"radius":0.9569281165554346},
{"cluster":3,"radius":10.7245554165012}
];
...I get an exception "cannot read property x of undefined" on the line:
var x = d.x - cluster.x,
This is inside the cluster(alpha) function. So apparently the d3.map function that automatically generates the data is putting something into the structure that the JSON stringification has not captured? Maybe I am just overlooking something simple... help is appreciated. Thanks! Here is a fiddle to help out: http://jsfiddle.net/Nivaldo/FJ3qq/1/
I commented out the code that is not working. Another detail: the original code as I left it (except that I reduced the count of clusters and nodes) does not seem to be handling the right number of distinct clusters. It should paint 4 different ones but is only painting with 3 colors.
The problem is that nodes is not the only data structure that needs to be initialised -- clusters needs to be as well. In particular, specific nodes are assigned to specific cluster indices. If you don't do that, things will break.
To fix, do something like
nodes.forEach(function(d) { clusters[d.cluster] = d; });
Complete jsfiddle here.
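A slightly fuller sketch of that initialisation, following the original example's convention of keeping the largest node of each cluster as the cluster's representative (the cluster count of 4 matches the hard-coded nodes above):
var nodes = [
    {"cluster": 2, "radius": 1.6180680659922448},
    {"cluster": 0, "radius": 3.3575295077569},
    {"cluster": 1, "radius": 0.9569281165554346},
    {"cluster": 3, "radius": 10.7245554165012}
];
// clusters[i] must point at a node of cluster i, because cluster(alpha)
// reads cluster.x and cluster.y from it during the force ticks
var clusters = new Array(4);
nodes.forEach(function(d) {
    if (!clusters[d.cluster] || d.radius > clusters[d.cluster].radius) {
        clusters[d.cluster] = d;
    }
});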
