How to make d3 scales proportional even if domains are different - javascript

Here is the example data set:
var dataset = [
  { x: 0,   y: 1.188 },
  { x: 22,  y: 0.822 },
  { x: 72,  y: 1.366 },
  { x: 82,  y: 1.163 },
  { x: 111, y: 1.601 }
];
So, what I want, for example: the distance between 0.001 and 0.002 on the Y scale should be equal in pixels to the distance from 1 to 2 on the X scale of the chart.
I tried it with linear scales but couldn't achieve these proportions. I don't know if there is some built-in method for setting this up.
Here is an attempt:
http://jsbin.com/goxamemare/8/edit?js,output
So, if I get it correctly, in order to have equal ticks regardless of domain I need to take at least two things into consideration:
1. How many values (steps like 0.001, 0.002 for the Y scale, or 1, 2, 3 for the X scale) there are for X and Y respectively
2. To set one dimension, let's say width of the chart
So, that way I can work out:
height : number of values on Y = width : number of values on X scale
Which gives for example:
height : 810 = 360px : 300
height = 810 * 1.2
height = 972px
And so I get proportional ticks between Y and X. Right?
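The worked example above can be sketched in a few lines (the 810/300 step counts and the 360 px width are the question's assumed values):

```javascript
// Sketch: give one Y step the same pixel size as one X step.
// The step counts and width below are the question's example values.
var width = 360;    // chosen chart width in px
var xSteps = 300;   // number of steps on the X domain
var ySteps = 810;   // number of steps on the Y domain

var pxPerStep = width / xSteps;   // 1.2 px per step
var height = ySteps * pxPerStep;  // 972 px, so both axes use 1.2 px per step
```

With both scales built from the same `pxPerStep`, one unit of domain covers the same number of pixels on either axis.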

Related

react-chartjs-2: how to fix stepSize on y axis without knowing max range

I'm trying to fix stepSize to 1 on the y axis in my Line chart using:
const options = {
  scales: {
    y: {
      min: 0,
      stepSize: 1,
    },
    x: {},
  },
};
It starts from zero, but at intervals of 5 units. I know I can set the interval by specifying min AND max, but I'm getting the graph data dynamically and can't be sure about max.
Is there any other keyword for stepSize?
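Not part of the original thread, but a likely cause worth noting: in Chart.js 3+, `stepSize` is read from the scale's `ticks` object rather than the scale root, so a sketch (assuming Chart.js v3/v4) that fixes the interval without specifying `max` would be:

```javascript
// Assumption: Chart.js v3+. stepSize belongs under ticks;
// max can stay unset and Chart.js derives it from the data.
const options = {
  scales: {
    y: {
      min: 0,
      ticks: { stepSize: 1 },
    },
  },
};
```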

Having issues with D3 scale and data binding

I've attached a fiddle which has the JSON, JS, CSS and HTML: https://jsfiddle.net/mihirchronicles/f9mjqwoa/. The goal is to use D3 to show the range of low, medium and high Bitcoin circulation (volume) for given dates, and also to display the total circulation and the circulation for each year. I am having issues with data binding and scaling in D3. Any help would be great!
var width = 940,
    height = 600,
    tooltip = new CustomTooltip("bitcurve_tooltip", 240), // tooltip
    layout_gravity = -0.01, // gravity
    damper = 0.1,           // damping while moving nodes around
    nodes = [],             // empty nodes
    // these will be set in create_nodes and create_vis
    vis = null,
    force = null,
    circles = null,
    radius_scale = null,    // note: a comma here, not a semicolon, or the names below become implicit globals
    range,
    lowRange,
    aveRange,
    highRange;
// defining the center based on width and height
var center = {x: width / 2, y: height / 2};
// defining the area for all the years when split
var year_centers = {
  "2011": {x: width / 3, y: height / 2},
  "2012": {x: width / 2, y: height / 2},
  "2013": {x: 2 * width / 3, y: height / 2}
};

How to find the coordinate that is closest to the point of origin?

I know there are many many questions about sorting javascript arrays by multiple values, but none of the answers solved my problem.
I have an array of coordinates like:
  x |  y
----|----
 10 | 20
 12 | 18
 20 | 30
  5 | 40
100 |  2
How can I get the coordinate that is closest to the point of origin?
Calculate the distance of each point using
Math.sqrt( Math.pow(x, 2) + Math.pow(y, 2) );
Take the result that's the lowest
var points = [
  {x: 10, y: 20},
  {x: 12, y: 18},
  {x: 20, y: 30},
  {x: 5, y: 40},
  {x: 100, y: 2}
];
// squared distance from the origin (no sqrt needed for comparison)
function d(point) {
  return Math.pow(point.x, 2) + Math.pow(point.y, 2);
}
var closest = points.slice(1).reduce(function(min, p) {
  if (d(p) < min.d) {
    min.point = p;
    min.d = d(p); // keep the running minimum in sync, or later comparisons use a stale value
  }
  return min;
}, {point: points[0], d: d(points[0])}).point;
closest;
// {x: 12, y: 18}
You'll notice that we're skipping the Math.sqrt step here. As Mark Setchell points out, the square root is monotonic, so it doesn't change the ordering: we can still determine the closest point by finding the smallest x^2 + y^2 value.
For each x,y pair, square x, square y and add together. Smallest number is nearest to the origin.

Custom axis function in D3JS

I'm using D3JS and I want an axis in x with this kind of values: 125, 250, 500, 1000 ... up to 8000. So my values are multiplied by 2 each time.
I tried a quantize scale, but the axis doesn't support it.
How can I do this? Can I define a custom mathematical function like y = mx + b (where m = 2 in my case and b = 0) and use it in the axis?
Here you can see my code
The linear scale is pretty flexible if you pass in multiple values for the range and domain to create a polylinear scale:
var tickWidth = (ChartWidth - padding) / 7;
var xScale = d3.scale.linear()
    .domain([0, 125, 250, 500, 1000, 2000, 4000, 8000])
    .range(d3.range(8).map(function(d){ return d * tickWidth; }));
http://jsfiddle.net/h2juD/6/
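Outside of d3, the polylinear mapping this scale performs can be sketched in plain JavaScript (domain stops from the answer; the 50 px `tickWidth` is an assumed value):

```javascript
// Piecewise-linear interpolation between matching domain/range stops,
// mimicking what the polylinear d3 scale above computes.
var domain = [0, 125, 250, 500, 1000, 2000, 4000, 8000];
var tickWidth = 50; // assumed pixel width of one segment
var range = domain.map(function (_, i) { return i * tickWidth; });

function polylinear(x) {
  // find the segment containing x, then interpolate linearly inside it
  for (var i = 0; i < domain.length - 1; i++) {
    if (x <= domain[i + 1]) {
      var t = (x - domain[i]) / (domain[i + 1] - domain[i]);
      return range[i] + t * (range[i + 1] - range[i]);
    }
  }
  return range[range.length - 1];
}
// Each doubling of x now advances one equal-width segment:
// polylinear(125) === 50, polylinear(250) === 100, polylinear(8000) === 350
```

This is why the axis ticks come out evenly spaced even though the domain grows geometrically: each domain segment is stretched or squeezed onto an equal-width range segment.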

Math equation for graph

I am working on a graphing class (in javascript) which uses canvas. This is just for experimental / learning purposes. Currently the graph scales correctly based on whatever height and width the canvas is set at. This is not a problem, and here is basically what I am doing to plot the correct coordinates [pseudo-code].
point[0] = [10, 15]
point[1] = [20, 10]
point[2] = [30, 20]
point[3] = [40, 15]
canvas width = 300
max x = 40
so for any given point:
position x = ( point[i][0] / max x ) * canvas width
simple enough. I get a ratio, then multiply it by the canvas width to plot it at the correct pixel.
The problem, however, is coming up with an equation that places the minimum value of x at 0 on the graph's x axis, while the maximum value stays at the far end of the graph (which it already does because of the 1:1 ratio in my current equation). Currently the minimum value of x (10 in the example) resides at 75px on the x axis, because of the 1:4 ratio being multiplied by the canvas width.
tldr / summary: I need to make a graph in which the minimum value is plotted at the beginning of the graph (0,0), and the maximum value is plotted at the end.
try calculating a value for pixel-width-per-point first.
e.g.
widthPerPoint = canvasWidth / (maxX - minX)
then your position can be normalised to zero by subtracting the minimum value:
position = widthPerPoint * (point[i][0] - minX)
for your first example
widthPerPoint = 300 / (40 - 10) = 10
position = 10 * (10 - 10) = 10 * 0 = 0
and for the others:
point[0] = [10, 15] -> 0
point[1] = [20, 10] -> 100
point[2] = [30, 20] -> 200
point[3] = [40, 15] -> 300
at least I think that'll work....
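The steps above can be made runnable (data and canvas width from the question):

```javascript
// Normalise x to [0, canvasWidth]: subtract minX, then scale.
var points = [[10, 15], [20, 10], [30, 20], [40, 15]];
var canvasWidth = 300;

var xs = points.map(function (p) { return p[0]; });
var minX = Math.min.apply(null, xs); // 10
var maxX = Math.max.apply(null, xs); // 40

var widthPerPoint = canvasWidth / (maxX - minX); // 10

var positions = points.map(function (p) {
  return widthPerPoint * (p[0] - minX);
});
// positions -> [0, 100, 200, 300]
```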
Just loop over your points and record what you find (sorry, I can't do algorithm programming in JavaScript; pseudo-Python is so much easier):
minimum = [infinity, infinity, -1]
maximum = [-infinity, -infinity, -1]
for point in points:
    if point.x > maximum.x and point.y > maximum.y:
        maximum = [point.x, point.y, point.index]
    if point.x < minimum.x and point.y < minimum.y:
        minimum = [point.x, point.y, point.index]
if maximum.index == -1:
    print 'No point is a maximum, so the points all lie in a horizontal line.'
    maximum = [points[0].x, points[0].y, 0]
    minimum = [points[0].x, points[0].y, 0]
You need to map linearly the range [min_x, max_x] to [0, width]. A point x = point[i][0] is mapped to
position(x) = width/(max_x - min_x) * (x - min_x).
Not sure I understand your question correctly. If so, this is the equation:
position x = (point[i][0] - min x) * canvas width / (max x - min x)
This way when point[i][0] is minimal (min x) your value is 0.0. and when it is maximal (max x) the value is canvas width, growing linearly.