Since yesterday I have been trying to find a formula (in JavaScript, though it's really just math) that returns the value corresponding to a given percentage of a range.
Example:
range A = [0, 10 ] - percentage 25% => value will be 2.5
range B = [0, 100] - percentage 50% => value will be 50
But how do I treat cases like these, where the range includes negative numbers?
case 1 = range [-5, 5 ] - percentage for example 50% => value will be 0
case 2 = range [-10, 0 ] - percentage for example 25% => value will be -7.5
case 3 = range [-11, -1] - percentage for example 30% => value will be ?
Try this:
const percentage = function(x, y, perc){
  // x is the start point and y is the end point of the range,
  // so (y - x) is the length of the range.
  // Dividing that by 100 gives 1% of the range
  // (because 100% is the full range between x and y).
  // Multiply by the desired percentage, then add x,
  // because the range starts at x.
  return ((y - x) / 100) * perc + x;
}
console.log(percentage(0, 10, 25));   // 2.5
console.log(percentage(0, 100, 50));  // 50
console.log(percentage(-5, 5, 50));   // 0
console.log(percentage(-10, 0, 25));  // -7.5
console.log(percentage(-11, -1, 30)); // -8
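For completeness, the inverse mapping (from a value back to its percentage) is the same formula solved for perc. This is a small sketch, not part of the original answer; `toPercentage` is a name I made up:

```javascript
// Inverse of percentage(): given a value inside [x, y],
// return the percentage of the range it corresponds to.
const toPercentage = (x, y, value) => ((value - x) * 100) / (y - x);

console.log(toPercentage(0, 10, 2.5));   // 25
console.log(toPercentage(-10, 0, -7.5)); // 25
console.log(toPercentage(-11, -1, -8));  // 30 (the answer to case 3)
```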
I'm building a scale for generating colors for a map, where various ranges of numbers correspond to colors. In the scale below, the a, b, and c values (along with the min and max) form "steps" or "buckets" corresponding to colors. Anything between -1 and min will be colored "#ffeda0" in this example, and anything between min and a will be colored differently:
return [
  "#e7e9e9",
  -1,
  "#ffeda0",
  min,
  "#fed976",
  a,
  "#feb24c",
  b,
  "#fd8d3c",
  c,
  "#fc4e2a",
  max,
  "#e31a1c"
];
This scale will always start at a minimum value, called min, and will end at a maximum value, max. I've currently got a simple linear scale like this:
const min = Math.min(...data);
const range = Math.max(...data) - min;
const step = range / 4;
return [
"#e7e9e9",
-1,
"#ffeda0",
min,
"#fed976",
min + step,
"#feb24c",
min + step * 2,
"#fd8d3c",
min + step * 3,
"#fc4e2a",
max,
"#e31a1c"
];
How could one write a function that, given a minimum, maximum, and the number of steps in a scale (in this case three) output the correct numbers for a logarithmic scale?
A logarithmic scale is linear in its logarithms. So:
const min = Math.min(...data);
const max = Math.max(...data);
const logmin = Math.log(min);
const logmax = Math.log(max);
const logrange = logmax - logmin;
const logstep = logrange / 4;
Now where the linear scale has
value = min + n * step;
the logarithmic scale has:
value = Math.exp(logmin + n * logstep);
This will only work if both min and max are positive, because you cannot take the logarithm of zero or a negative number, and because on a logarithmic scale each value is a constant positive factor times the previous one. You can make this work for a negative min and max by adjusting the signs.
You can do the calculations in another base, for example 10 (compute logrange and logstep from these base-10 logarithms in the same way):
const logmin = Math.log10(min);
const logmax = Math.log10(max);
value = Math.pow(10, logmin + n * logstep);
Alternatively, you can find the common factor from one step to the next by taking the n-th root of the quotient of the min and max values:
const min = Math.min(...data);
const max = Math.max(...data);
const fact = Math.pow(max / min, 1.0 / 4.0);
The values are then found by repeated multiplication:
value = prev_value * fact
or
value = min * Math.pow(fact, n);
Both methods might yield "untidy" numbers even if the scale should have nice round numbers, because the logarithms and n-th roots could produce inaccurate floating-point results. You might get better results if you pick a round factor and adjust your min/max values.
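Putting the pieces above together, here is a runnable sketch; the sample data array and the 4-step count are assumptions matching the question's linear example:

```javascript
// Logarithmic "buckets" between min and max (both must be positive).
const data = [1, 3, 7, 20, 55, 100];
const min = Math.min(...data);
const max = Math.max(...data);
const steps = 4;

const logmin = Math.log(min);
const logstep = (Math.log(max) - logmin) / steps;

// Bucket boundaries: min, then geometrically spaced values up to max.
const bounds = [];
for (let n = 0; n <= steps; n++) {
  bounds.push(Math.exp(logmin + n * logstep));
}
console.log(bounds); // ≈ [1, 3.162, 10, 31.62, 100]
```

Note how each boundary is a constant factor (here √10 ≈ 3.162) times the previous one, which is exactly the "common factor" view described above.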
I've got multiple numeric values with a min and a max. Assuming the minimum value should be visualized by a 5px circle and the maximum by a 50px circle, I need to calculate a scaling factor for all the other values.
50 / maxValue * value
...does not work well, because it does not take the min value into account.
const array = [5, 20, 50, 100]
const min = 5 // => 5px
const max = 100 // => 50px
d3-scale does exactly this for you.
You just need to create a scaleLinear, define the domain (range of input values), and range (corresponding desired output values).
Have a look at the snippet below for an illustration:
const array = [5, 20, 50, 100]
const min = 5 // => 5px
const max = 100 // => 50px
// define the scaling function
const size = d3.scaleLinear()
.domain([min, max])
.range([5, 50])
// call it for each value in the input array
array.forEach(function(val) {
console.log(val, Math.round(size(val)) + 'px')
})
<script src="https://cdnjs.cloudflare.com/ajax/libs/d3/5.7.0/d3.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/d3-scale@3.2.0/dist/d3-scale.min.js"></script>
const array = [5, 20, 50, 100]
const minpx = 5 // => 5px
const maxpx = 50 // => 50px
let max = Math.max(...array)
let min = Math.min(...array)
// (x - min) / (max - min)  → the ratio within the old range
// * (maxpx - minpx)        → scaled to the new range
// + minpx                  → plus the new offset
let transform = x => (x - min) / (max - min) * (maxpx - minpx) + minpx
console.log(array.map(transform))
Use the so-called linear interpolation (lerp) method.
The following function takes 3 parameters:
a: The starting value
b: The destination value
amount: The normalized value (a fraction between 0 and 1) that controls the linear interpolation
function mix(a, b, amount) {
return (1 - amount) * a + amount * b
}
Imagine a line from A to B and you put a circle on it:
A ----o--------------------------- B
If amount is equal to 1, the circle sits exactly at B.
If amount is equal to 0, the circle stays at A.
The closer amount is to 0, the closer the circle is to A; the closer it is to 1, the closer the circle is to B.
const array = [5, 20, 50, 100]
const minpx = 5 // => 5px
const maxpx = 50 // => 50px
const min = Math.min(...array) // 5
const max = Math.max(...array) // 100
function mix(a, b, amount) {
return (1 - amount) * a + amount * b
}
array.map(value => mix(minpx, maxpx, (value-min) / (max-min)))
// [ 5, 12.105263157894736, 26.31578947368421, 50 ]
I have a rectangle and would like to:
Get a random point on one (any) of the sides.
Get a random point on one (except for the previously picked) side.
My initial approach is to create arrays for each possible side.
var arr:Array = [[{x:0,y:0}, // Top
{x:width,y:0}], //
[{x:width,y:0}, // Right
{x:width,y:height}], //
[{x:width,y:height}, // Bottom
{x:0,y:height}], //
[{x:0,y:height}, // Left
{x:0,y:0}]]; //
Then, I get the sides.
rand is an instance of Rand and has the methods:
.next() which provides a random number between 0 and 1
.between(x,y) which returns a random number between x and y.
var firstSide:Array = arr[rand.next() * arr.length];
var secondSide:Array;
do {
secondSide = arr[rand.next() * arr.length];
} while(secondSide.equals(firstSide));
Finally, I calculate my points.
var pointOnFirstSide:Object = {x:rand.between(firstSide[0].x, firstSide[1].x),
                               y:rand.between(firstSide[0].y, firstSide[1].y)};
var pointOnSecondSide:Object = {x:rand.between(secondSide[0].x, secondSide[1].x),
                                y:rand.between(secondSide[0].y, secondSide[1].y)};
I don't think this is the most efficient way to solve this.
How would you do it?
Assuming we have the following interfaces and types:
interface Rand {
next(): number;
between(x: number, y: number): number;
}
interface Point {
x: number;
y: number;
}
type PointPair = readonly [Point, Point];
and taking you at your word in the comment that the procedure is: first randomly pick two sides, and then pick random points on those sides... first let's see what's involved in picking two sides at random:
const s1 = Math.floor(rand.between(0, arr.length));
const s2 = (Math.floor(rand.between(1, arr.length)) + s1) % arr.length;
s1 and s2 represent the indices of arr that we are choosing. The first one chooses a whole number between 0 and one less than the length of the array. We do this by picking a real number (okay, floating point number, whatever) between 0 and the length of the array, and then taking the floor of that real number. Since the length is 4, what we are doing is picking a real number uniformly between 0 and 4. One quarter of those numbers are between 0 and 1, another quarter between 1 and 2, another quarter between 2 and 3, and the last quarter are between 3 and 4. That means you have a 25% chance of choosing each of 0, 1, 2 and 3. (The chance of choosing 4 is essentially 0, or perhaps exactly 0 if rand is implemented in the normal way which excludes the upper bound).
For s2 we now pick a number uniformly between 1 and the length of the array. In this case, we are picking 1, 2, or 3 with a 33% chance each. We add that number to s1 and then take the remainder when dividing by 4. Think of what we are doing as starting on the first side s1, and then moving either 1, 2, or 3 sides (say) clockwise to pick the next side. This completely eliminates the possibility of choosing the same side twice.
Now let's see what's involved in randomly picking a point on a line segment (which can be defined as a PointPair, corresponding to the two ends p1 and p2 of the line segment) given a Rand instance:
function randomPointOnSide([p1, p2]: PointPair, rand: Rand): Point {
const frac = rand.next(); // between 0 and 1
return { x: (p2.x - p1.x) * frac + p1.x, y: (p2.y - p1.y) * frac + p1.y };
}
Here what we do is pick a single random number frac, representing how far along the way from p1 to p2 we want to go. If frac is 0, we pick p1. If frac is 1, we pick p2. If frac is 0.5, we pick halfway between p1 and p2. The general formula for this is a linear interpolation between p1 and p2 given frac.
Hopefully between the two of those, you can implement the algorithm you're looking for. Good luck!
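Here is a plain-JavaScript sketch stitching the two pieces together; the rectangle size and the Math.random-based rand implementation are my own assumptions, not part of the original answer:

```javascript
const rand = {
  next: () => Math.random(),                      // between 0 and 1
  between: (x, y) => x + (y - x) * Math.random(), // between x and y
};

const w = 100, h = 50;
const arr = [
  [{ x: 0, y: 0 }, { x: w, y: 0 }], // top
  [{ x: w, y: 0 }, { x: w, y: h }], // right
  [{ x: w, y: h }, { x: 0, y: h }], // bottom
  [{ x: 0, y: h }, { x: 0, y: 0 }], // left
];

// Pick two distinct sides, as described above.
const s1 = Math.floor(rand.between(0, arr.length));
const s2 = (Math.floor(rand.between(1, arr.length)) + s1) % arr.length;

// Linear interpolation between the side's endpoints.
function randomPointOnSide([p1, p2], rng) {
  const frac = rng.next();
  return { x: (p2.x - p1.x) * frac + p1.x, y: (p2.y - p1.y) * frac + p1.y };
}

console.log(randomPointOnSide(arr[s1], rand), randomPointOnSide(arr[s2], rand));
```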
jcalz already gave an excellent answer. Here is an alternate version for the variant I asked about in the comments: when you want the two points chosen uniformly over the perimeter, so that if your w : h ratio is 4 : 1, a point is four times as likely to lie on a horizontal side as on a vertical one. (This means the chance of hitting two opposite long sides is 24/45; two opposite short sides, 1/45; and one of each, 20/45, by a simple but slightly tedious calculation.)
const rand = {
next: () => Math. random (),
between: (lo, hi) => lo + (hi - lo) * Math .random (),
}
const vertices = (w, h) => [ {x: 0, y: h}, {x: w, y: h}, {x: w, y: 0}, {x: 0, y: 0} ]
const edges = ([v1, v2, v3, v4]) => [ [v1, v2], [v2, v3], [v3, v4], [v4, v1] ]
const randomPoint = ([v1, v2], rand) => ({
x: v1 .x + rand .next () * (v2 .x - v1 .x),
y: v1 .y + rand .next () * (v2 .y - v1 .y),
})
const getIndex = (w, h, x) => x < w ? 0 : x < w + h ? 1 : x < w + h + w ? 2 : 3
const twoPoints = (w, h, rand) => {
const es = edges (vertices (w, h) )
const perimeter = 2 * w + 2 * h
const r1 = rand .between (0, perimeter)
const idx1 = getIndex (w, h, r1)
const r2 = (
rand. between (0, perimeter - (idx1 % 2 == 0 ? w : h)) +
Math .ceil ((idx1 + 1) / 2) * w + Math .floor ((idx1 + 1) / 2) * h
) % perimeter
const idx2 = getIndex (w, h, r2)
return {p1: randomPoint (es [idx1], rand), p2: randomPoint (es [idx2], rand)}
}
console .log (
// Ten random pairs on rectangle with width 5 and height 2
Array (10) .fill () .map (() => twoPoints (5, 2, rand))
)
The only complicated bit in there is the calculation of r2. We calculate a random number between 0 and the total length of the remaining three sides, by adding all four sides together and subtracting off the length of the current side, width if idx is even, height if it's odd. Then we add it to the total length of the sides up to and including the index (where the ceil and floor calls simply count the number of horizontal and vertical sides, these values multiplied by the width and height, respectively, and added together) and finally take a floating-point modulus of the result with the perimeter. This is the same technique as in jcalz's answer, but made more complex by dealing with side lengths rather than simple counts.
I didn't make rand an instance of any class or interface, and in fact didn't write any TypeScript here, but you can add that yourself easily enough.
I have an array (1200 values) of numbers
[123, 145, 158, 133...]
I'd like to have a div for each value with a background color from red to green, red being the smallest number and green the largest.
The base setup looks like this: (templating with vuejs but unrelated to the problem)
const values = [123, 145, 158, 133...]; // 1200 values inside
const total = values.length;
<div
v-for="(val, i) in values"
:key="i"
:style="{backgroundColor: `rgb(${(100 - (val*100/total)) * 256}, ${(val*100/total) * 256}, 0)`}">
{{val}}
</div>
I'm no maths specialist, but since all my numbers are around 100, the generated rgb is nearly the same for every value (around a 12% yellowish color).
How can I give more weight to the difference between 137 and 147?
EDIT: final formula:
:style="{backgroundColor: `rgb(${(256/(maxValue-minValue) * (boule-maxValue) - 255)}, ${(256/20 * (boule-maxValue) + 255)}, 0)`}"
Check out this post: https://stats.stackexchange.com/questions/70801/how-to-normalize-data-to-0-1-range.
Basically you want to linearly rescale your values to another interval. You need your current min and max values from the array. Then define the new min' and max' which are the limits of the new interval. This would be [0, 255] in your case.
To do the transformation use the formula:
newvalue= (max'-min')/(max-min)*(value-max)+max'
As an example:
If your min value is 127 and max is 147, and you want to map 137. Then:
256/20 * (137-147) + 255 which results in 127.
If you want to map 130. Then:
256/20 * (130-147) + 255 = 37.4.
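The same transformation as a small JavaScript sketch; `rescale` is a name I made up, and it uses 255 as the span where the worked example above rounds to 256, so the numbers differ slightly:

```javascript
// newvalue = (max' - min') / (max - min) * (value - max) + max'
const rescale = (value, min, max, newMin, newMax) =>
  (newMax - newMin) / (max - min) * (value - max) + newMax;

console.log(rescale(137, 127, 147, 0, 255)); // 127.5
console.log(rescale(147, 127, 147, 0, 255)); // 255 (max maps to max')
console.log(rescale(127, 127, 147, 0, 255)); // 0   (min maps to min')
```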
It really depends on what meaning those values actually have.
However, you can try this: if your values are always bigger than 100 and always less than 150 (you can choose these numbers, of course), you can "stretch" your values using those bounds as minimum and maximum. Let's take 137 and 147 as examples:
(val-min) : (max-min) = x : 255
(137-100):(150-100) = x:255 -> 37:50 = x:255 -> 188
(147-100):(150-100) = x:255 -> 47:50 = x:255 -> 239
That is for the math. In the end, this is the calculation:
newValue = (val-min)*255/(max-min)
where min and max are your chosen values.
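As a quick runnable sketch of that calculation (`stretch` is my own name for it), with 100 and 150 as the chosen min and max:

```javascript
// newValue = (val - min) * 255 / (max - min)
const stretch = (val, min, max) => (val - min) * 255 / (max - min);

console.log(stretch(137, 100, 150)); // 188.7
console.log(stretch(147, 100, 150)); // 239.7
```

Truncating to integers gives the 188 and 239 from the worked example above.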
You could take a kind of magnifier for a range of data. In this example, the values between 20 and 30 are mapped to a two times greater range than the outside values inside of an interval of 0 ... 100.
function magnifier(value, start, end, factor) {
var middle = (start + end) / 2,
size = (end - start) * factor / 2,
left = middle - size,
right = middle + size;
if (value <= start) return value * left / start;
if (value <= end) return (value - start) * factor + left;
return (value - end) * (100 - right) / (100 - end) + right;
}
var i;
for (i = 0; i <= 100; i += 5) {
console.log(i, magnifier(i, 20, 30, 2));
}
I am working on a graphing class (in javascript) which uses canvas. This is just for experimental / learning purposes. Currently the graph scales correctly based on whatever height and width the canvas is set at. This is not a problem, and here is basically what I am doing to plot the correct coordinates [pseudo-code].
point[0] = [10, 15]
point[1] = [20, 10]
point[2] = [30, 20]
point[3] = [40, 15]
canvas width = 300
max x = 40
so for any given point:
position x = ( point[i][0] / max x ) * canvas width
simple enough. I get a ratio, then multiply it by the canvas width to plot it at the correct pixel.
The problem, however, is coming up with an equation that places the minimum value of x at 0 on the graph's x coordinate, while the maximum value stays at the end of the graph (which it already does, thanks to the 1:1 ratio in my current equation). Currently the minimum x value (10 in the example) lands at 75px, because of the 1:4 ratio being multiplied by the canvas width.
tldr / summary: I need to make a graph in which the minimum value is plotted at the beginning of the graph(0,0), and the maximum value plotted to the end.
try calculating a value for pixel-width-per-point first.
e.g.
widthPerPoint = canvasWidth / (maxX - minX)
then your position can be normalised to zero by subtracting the minimum value:
position = widthPerPoint * (point[i][0] - minX)
for your first example
widthPerPoint = 300 / (40 - 10) = 10
position = 10 * (point[i][0] - 10) = 10 * 0 = 0
and for the others:
point[0] = [10, 15] -> 0
point[1] = [20, 10] -> 100
point[2] = [30, 20] -> 200
point[3] = [40, 15] -> 300
at least I think that'll work....
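Here is the calculation above as a runnable sketch, using the question's sample points:

```javascript
const points = [[10, 15], [20, 10], [30, 20], [40, 15]];
const canvasWidth = 300;

const xs = points.map(p => p[0]);
const minX = Math.min(...xs);
const maxX = Math.max(...xs);

const widthPerPoint = canvasWidth / (maxX - minX); // 300 / 30 = 10
const positions = points.map(p => widthPerPoint * (p[0] - minX));
console.log(positions); // [0, 100, 200, 300]
```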
Just loop over your points and record what you find (sorry, I can't do algorithm programming in JavaScript; pseudo-Python is so much easier):
minimum = [infinity, infinity, -1]
maximum = [-infinity, -infinity, -1]
for point in points:
    if point.x > maximum.x and point.y > maximum.y:
        maximum = [point.x, point.y, point.index]
    if point.x < minimum.x and point.y < minimum.y:
        minimum = [point.x, point.y, point.index]
if maximum.index == -1:
    print 'No point is a maximum, so the points all lie in a horizontal line.'
    maximum = [points[0].x, points[0].y, 0]
    minimum = [points[0].x, points[0].y, 0]
You need to map linearly the range [min_x, max_x] to [0, width]. A point x = point[i][0] is mapped to
position(x) = width/(max_x - min_x) * (x - min_x).
Not sure I understand your question correctly. If so, this is the equation:
position x = (point[i][0] - min x) * canvas width / (max x - min x)
This way, when point[i][0] is minimal (min x) the value is 0.0, and when it is maximal (max x) the value is canvas width, growing linearly in between.