I've been writing an image processing program which applies effects through HTML5 canvas pixel manipulation. I've implemented Thresholding, Vintaging, and ColorGradient pixel manipulations, but frustratingly I cannot change the contrast of the image!
I've tried multiple solutions, but I always end up with too much brightness and too little contrast. I'm not planning to use any JavaScript libraries, since I'm trying to achieve these effects natively.
The basic pixel manipulation code:
var data = imageData.data;
for (var i = 0; i < data.length; i += 4) {
//Note: data[i], data[i+1], data[i+2] represent RGB respectively
data[i] = data[i];
data[i+1] = data[i+1];
data[i+2] = data[i+2];
}
Pixel manipulation example
Values are in RGB mode, which means data[i] is the Red channel. So data[i] = data[i] * 2; doubles the brightness of the Red channel of that pixel. Example:
var data = imageData.data;
for (var i = 0; i < data.length; i += 4) {
//Note: data[i], data[i+1], data[i+2] represent RGB respectively
//Doubles the brightness of each RGB channel
data[i] = data[i]*2;
data[i+1] = data[i+1]*2;
data[i+2] = data[i+2]*2;
}
*Note: I'm not asking you to write the code for me - that would just be asking for a favor! I'm asking for an algorithm (even pseudocode) that shows how contrast adjustment through pixel manipulation is possible!
I would be glad if someone can provide a good algorithm for Image Contrast in HTML5 canvas.
A faster option (based on Escher's approach) is:
function contrastImage(imgData, contrast){ //input range [-100..100]
var d = imgData.data;
contrast = (contrast/100) + 1; //convert to decimal & shift range: [0..2]
var intercept = 128 * (1 - contrast);
for(var i=0;i<d.length;i+=4){ //r,g,b,a
d[i] = d[i]*contrast + intercept;
d[i+1] = d[i+1]*contrast + intercept;
d[i+2] = d[i+2]*contrast + intercept;
}
return imgData;
}
The derivation is similar to the one below; this version is mathematically the same, but runs much faster.
Original answer
Here is a simplified version, with an explanation, of an approach already discussed (which was based on this article):
function contrastImage(imageData, contrast) { // contrast as an integer percent
var data = imageData.data; // original array modified, but canvas not updated
contrast *= 2.55; // or *= 255 / 100; scale integer percent to full range
var factor = (255 + contrast) / (255.01 - contrast); //use 255.01 rather than 255 to avoid a divide-by-zero error
for(var i=0;i<data.length;i+=4) //pixel values in 4-byte blocks (r,g,b,a)
{
data[i] = factor * (data[i] - 128) + 128; //r value
data[i+1] = factor * (data[i+1] - 128) + 128; //g value
data[i+2] = factor * (data[i+2] - 128) + 128; //b value
}
return imageData; //optional (e.g. for filter function chaining)
}
Notes
I have chosen to use a contrast range of +/- 100 instead of the original +/- 255. A percentage value seems more intuitive for users, or programmers who don't understand the underlying concepts. Also, my usage is always tied to UI controls; a range from -100% to +100% allows me to label and bind the control value directly instead of adjusting or explaining it.
This algorithm doesn't include range checking, even though the calculated values can far exceed the allowable range - this is because the array underlying the ImageData object is a Uint8ClampedArray. As MSDN explains, with a Uint8ClampedArray the range checking is handled for you:
"if you specified a value that is out of the range of [0,255], 0 or 255 will be set instead."
Usage
Note that while the underlying formula is fairly symmetric (it allows round-tripping), data is lost at high levels of filtering because pixels only allow integer values. For example, by the time you reduce contrast to extreme levels (-95% or so), all the pixels are basically a uniform medium gray (within a few digits of the midpoint value of 128). Turning the contrast back up again results in a flattened color range.
Also, order of operations is important when applying multiple contrast adjustments - boosted values "blow out" (exceed the clamped max value of 255) quickly, so increasing contrast sharply and then reducing it results in a darker image overall. Reducing contrast first and then increasing it doesn't lose as much data, because the highlight and shadow values get muted instead of clipped (see explanation below).
Generally speaking, when applying multiple filters it's better to start each operation with the original data and re-apply each adjustment in turn, rather than trying to reverse a previous change - at least for image quality. Performance speed or other demands may dictate differently for each situation.
Code Example:
function contrastImage(imageData, contrast) { // contrast input as a fraction; range [-1..1]
var data = imageData.data; // Note: original dataset modified directly!
contrast *= 255;
var factor = (contrast + 255) / (255.01 - contrast); //use 255.01 to avoid a divide-by-zero error.
for(var i=0;i<data.length;i+=4)
{
data[i] = factor * (data[i] - 128) + 128;
data[i+1] = factor * (data[i+1] - 128) + 128;
data[i+2] = factor * (data[i+2] - 128) + 128;
}
return imageData; //optional (e.g. for filter function chaining)
}
$(document).ready(function(){
var ctxOrigMinus100 = document.getElementById('canvOrigMinus100').getContext("2d");
var ctxOrigMinus50 = document.getElementById('canvOrigMinus50').getContext("2d");
var ctxOrig = document.getElementById('canvOrig').getContext("2d");
var ctxOrigPlus50 = document.getElementById('canvOrigPlus50').getContext("2d");
var ctxOrigPlus100 = document.getElementById('canvOrigPlus100').getContext("2d");
var ctxRoundMinus90 = document.getElementById('canvRoundMinus90').getContext("2d");
var ctxRoundMinus50 = document.getElementById('canvRoundMinus50').getContext("2d");
var ctxRound0 = document.getElementById('canvRound0').getContext("2d");
var ctxRoundPlus50 = document.getElementById('canvRoundPlus50').getContext("2d");
var ctxRoundPlus90 = document.getElementById('canvRoundPlus90').getContext("2d");
var img = new Image();
img.onload = function() {
//draw orig
ctxOrig.drawImage(img, 0, 0, img.width, img.height, 0, 0, 100, 100); //100 = canvas width, height
//reduce contrast
var origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, -.98);
ctxOrigMinus100.putImageData(origBits, 0, 0);
var origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, -.5);
ctxOrigMinus50.putImageData(origBits, 0, 0);
// add contrast
var origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, .5);
ctxOrigPlus50.putImageData(origBits, 0, 0);
var origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, .98);
ctxOrigPlus100.putImageData(origBits, 0, 0);
//round-trip, de-saturate first
origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, -.98);
contrastImage(origBits, .98);
ctxRoundMinus90.putImageData(origBits, 0, 0);
origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, -.5);
contrastImage(origBits, .5);
ctxRoundMinus50.putImageData(origBits, 0, 0);
//do nothing 100 times
origBits = ctxOrig.getImageData(0, 0, 100, 100);
for(var i=0;i<100;i++){
contrastImage(origBits, 0);
}
ctxRound0.putImageData(origBits, 0, 0);
//round-trip, saturate first
origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, .5);
contrastImage(origBits, -.5);
ctxRoundPlus50.putImageData(origBits, 0, 0);
origBits = ctxOrig.getImageData(0, 0, 100, 100);
contrastImage(origBits, .98);
contrastImage(origBits, -.98);
ctxRoundPlus90.putImageData(origBits, 0, 0);
};
img.src = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAGQAAABkCAMAAABHPGVmAAADAFBMVEX0RydFRjuPweRoak+awuFzbknzUj9SV0T0RRVtb1lcak7zUTSUmX94hnOYoHFacmpCPS9zwvOBkG5Zbl7xVCCVxuhnfnV7fFOnoqU3OTFQZVNNXlCmxeBtfWlhaFlbXUmfyOVrh3/vXUiBfUcrKCF9wu50fXB4fGVlcGJIUEvtSDCUvd2Up4uJnYBUX0A7QzjtW1tYZUlqY0U1Miivy+BfdXJ/hWtuc2eNjWSCkWFbYVVhcU6Pl2tsf13xVUxyueeir4N+lICIkXdIUjuxr7JyfH1iemiHhFT5Rx+7zN2Xn2KCf11jeFl1jYhSZ11OTkGJxe6Kt9ybrp6fqZBkfoZ/in53jntBTUN4h2FPTDFFRy+GwOeju9iVp35dWj1uh25gZWqNmF9JWUX3SjOvxNtyjZlYeJdqrNeMn3F2dFmNj1KWkktFV1W7vbd/l4ydkIVOXWDgVEjwWzlqb3WXq6/Bp6OHlJuWn5aKmotSZ2t/jVWRgFKmkU92eUF+rdaKrNCnsqVnh5BqeEyHh0POmzxiZDuotbqkrW+lnmvEo1BafqZ3gYnmZWKonFyZjlvvZ1Xev0hlu+5aksKyr5RLaI+YkmrXZmqYnFa3l0tXVDZpXzGFoqWlkZa3hYHrbmbXuFGLo5Z4hU3UqkdhbUBvbThxoM29u6SVoaNVbXziXlRVVVRZndPNsl1/uOHCsbB5mJWSjZE9PyWWs9egpX+st36lm3eysnJ6bTdxkb5gcYfSWlMfGhxpwPdloLp8mqWCi4++s344SVKih0aYfT6JcTy/TzwsLzZAMR+drM2/kJGrnpBafntHYnk1PUNfVC3SUCpfr+XTc4OGeXnjaHSlqV+sgzm9mpviSTlRg7Rrh7HmdXbCdXDSeW5AVWy2pWHZa1ynt5Hge5DNiouMgIbGtGq6w86+ZFmkV06+jzxOQyXWiKKzcGeFosRgkajdrzqddHOEcFN6SCTRig2dRCbjRiK1ahXMnLV6VzuEXFyeaiy8voxgPh/CfBK6lK6ciHeFa2q3ecQCAAAjI0lEQVRo3iRVf2gbZRjOGUroaO5iTq8mx6WGxDTBpGGNl+baFJOGpXhtLs4mzY+RxjvMTJNQAw5tTuJsjnCiKAiTsNSBoVFGRWpsyJBuheps/oiNIhEEHQQFR9Hi2lFq14J+6sMH33HHve/7vD+eV9KO7Jav6nTVfL6cFuhiZYU/UzmTNCvNbkIbJ1B0HJ12E74FrRdDFxgkgfgwBlkKrG36tFoEO+dbY9ZaGGJktJtrmUu+hdmg3Y0Nmp2LdaVIizCvFFPF4YLkpFy9WtTpdOl0viRUK2yFpyrOsEW5uIChLVxrIUaMC1oMR7ksgjMkjobi4MIQn5pEsr6cOoRkGBJpBRKbl5Z9pGFBHjQHzcqusuFQsmFRGXZ0+GJeslu+kS+lBSFP06UqTYtjcYof01hmUR/mXcPOzA6ZvCTJQFaGUfs9fivHWT1qJhDKJsjcEo4wiDZHtgKB5cRy4pI2Lp8YIeTmcaUTnocbIlxX9vMiL6xI2rvtok6odPg0neJTYpVyOQthftZ4aTmhZRBjKIubEMiPk4jMpIayIavJ6gkxrcCty0ubawGEJAPLm8trr1z6Y2lzOeHjxs5qzEbY7l6P1XVUfV0s6XQFnpLsttt5oZRKpdOlEptiKZ6OVobOElo1ksNa5zASRWUQTiYCATK7xLSWSDLRWtq8fP7y8iVfC/rCt7m5mUgktCSJYAEUw0zucY1RA/dSsDNJTcVYlk8W8nlJu3yjmq/k2U4lla+wPM+OrRSkQ2rvKIliWhLL4iiSg0xMNrAUWP5jOcEgy3ubfyQubyImtTVE5pAsYwqZOEaN+9WQWnY1MkBopNL+nv6+usslOhsuuLAyB5xUK4U0O9kQOwIrUvmVWmT67JgfHYI4DpJZTR61TKaIKqDQKqkFfhJqda6VWMJQT8TjUaNciOM4k9/jiQIoIDUyKpMW5vqdFCvWO7Qz6ehJ83MUKHw1nU5R5XQJpKo0VChIr8qGIpGoAhx/TVFTNJvgKdpUQNcDJB7KqlW5AOOJAJsej4eDojarAoIUCigalXlqNQXkV1w1jivtDidP0XXluK5U5PslN/JzVSGv69AlgRallamiTCPzN8FvEUXTFm2qFDab4n/4FX5U4YnW/jtNGyCnCkUi5Rv7J7vWmoKDIBXgA/mNoL/6Vkou18T6A5eSoni2KImAOUkL777zDl2ZpHlW2a8ag0wyK/RvfCrV9m7z5KTZtAGoVAoVsKzGEc4D2WzDGky9bTw8PT48OP0BtelRz79sOJnf2CuPycMs7FDW63SHH6AKZUk1JayUUg6xmE/RIp3v6ZH6QUw4JAtZVfrt6uHp4f7h4fGOCYP0tppN4RnycB6PCpqfHx9SbcMHB8Pzxwenyh0NET+LAxCEN2iZmFY67esPlFLRORhzshWJcEMQ0pNsp0E3XC44SQ0MQP5VyGZVezirbefLw9PmHPBCbuPjMj1w8y+iNWi4J+y16vXJ04Pjk8PTU/mOafHDOMOgCBlfDbonYkkYnqAqLFunpAVpv6QslIr5fl2p02nAohPuGzSARuc8HEoYxzzbwunB4f3Dwx9MG3pjb6Rpa9aioOa1SFkaj25sbOA/HACcfpXTjy6a3STqzRmC8t6ZCSrspGjKOQ7TKTa/kpcIc8JwOiUUHWyn46Rpe2wwGMRQNOddtKj1evzwPyv3bBsKPwE19fpapNau1dq9YS4yHNHrZcfg62ROFcJmPoyToyjmm0XNFioZg9c7rJi0jHlA5XnJbaGtK5fSOoF2pkSXE7bPzwQXnkdaS2uzGpCQ0D7gsq/XN3cjmjNNcM8Vd6PtI/k095pOZbOBF03r9g7iXTcbRnCvL240PBeG+5LJcLI7E0vyNGx3FouS2zeAcqXSAtxolHSlyZTDbggaZjEExGQfUls39Bv/wX9WymmN+vbubi1yNHyVwCHOmLGGrDs7W6YtbTz4qtsoY2QGr1zrtiyau65uzNJ9QFN0QckDWbnd1glCzzudfLpRSoNKwQ4i2Of1erVo3DKNqq2ZzNZWJpNdWHh11qRoHp3st4+O7o14VEME2AAtJsNkEa07+GoQz6CjBEaMeONds8VZX5+BKTEZc5b6K5WKZHclBeQxVS9R9QbLlh7ABTgYXJz2xlHES/R64xyOqhnjyz65mtNbd3ua+6/t3/7Tx3hVGu3CWhDDEHnw1ZlFjUmLoASGy71yi91icXVnZrousyuZhGkHOySp6srptK6fFTssC3YZLFLzU4Q9KCfPkV4ZITXOEoOG4MjbWEBrQvRz905O/j4+/uvtpVyLw02ake9f+uyDx37tlWUx4EOuWZCfkxOxriVMwV1XjIVFsFAc/LRkF+wsXXVSpEG3daZ6qKTTPTGxOGLADFovbuUG7d/b7YOG0dVVBjGFRu7t37129/j26nIrlPnm4t7e3lsXv1i1qTItBGxLwj
holGN9MaWSguszgEa9HoMtlKMgKbfTQk9HLDlZGoZddZ5PJgeDSoPh3PMEpsVDJqvfD5nuvHVxa1uVMRrunhxd++349wEDY3vrxfce/hffvfjW9RyDY5rxEa+bMLrNix++YVHOzFjqrgegv4Cyj0mK5XQqVUhVikJJl4cbIF8wOzY68rKBxI0hyLpz58qPP/64d+v8hT391tsv3W0eXTs+/P2vkNW298Sjjz7+6IXXby0FrmczS1oGnT6jwdxueczi+viN51wPzN1YLOkc7uFXwMQPT+VT9UYq1Uk56Tcd/evvw30yVGs0DHAmq/XOj9/dvAniffrTz/2m76/dP9n/4P63v/0C1RQfPf7UU0+evwWwd+VOxpRFNMSZabtl0R2LKWOuriVW78bqSWk4yVdA4XU6YbIkpNIdIPdAWeZ7+/ocBK5ZxVHT1pUrVy6++PATT1x49sIfR9Cfv9zfP7r2yLe/XbtXG3jomRfOn//0iYdv3rz54sVcNpdF1ZowEbbLqeQsDMOWJOARp8CknKlI2lVhuFzV0ZPJyR66KDbenbI7pwbtfpnMqIbufPLdd7euf3ThUYDH7kf+/Pnu/t1fHvn2659+L597/KnLTz70LHDy3i2MvP6NNYuMjstGCWPc7IorxfrMG10XHYu5YlSY/4cG841po47D+HG9E1oo0KPHOVbWa2vHtrYoZ1cL7dk/i3S1Zf2zpV5XuIaFWWCEhMboFKlQgy4GR9oMlk4lJQUdNg2Rzb+DJSaiKGJZZnQhLGwmakIk0YzgK43fW+K96Mv73PM83+d79yuC46Oui4Ov4+fe7r08aLVaKY/ZIzZ5hFX/3Jd96VQqFdsJ0BhJP7r819b48l/LG+W7xcmtv2+SOh1G+v2y2NmlpfX19NWrtfXixlKrydDtdEAadkfHdb1B393MNncjGYZ3Ua5zLnPvZcp1+RVPz2WzK5xhlC0nTlzKL72VWpRpCwnMSJPvFn/f3HrwYPmWXL67tfngJorRNAiMLcVsizLb0tWjp47WeYlDIUeUMHTb9d2Q/3VYXt3NXi8y8+mVDDOjqex89qJG1FvpabD2DjElT0mPHt1/J2Vbyi9qI5GAzmikfyoub8SLD36fjMvlc1vLAoSk6VjMlloEx+ATbETVFo1WtYaCQQPBGfT2jo57do6zdk9bEJwynxvVUJ9qBimP/ZNnPWaKp0zPSFtb9q+lF22FdMoWQTFdl5v+tlicOrJbPLIhL5Orx7/4AKVpiOVh8us7SzurL58dGZk/+05WyhkAc/LesN5ZYXeYfuxM5qAnYaV5CKfODc1Qva4fzQ2mSmmFVCo90NeXltkiibxWgLS7jd8Vt6bkxeLAuFwuV0/99iKKKYx0IKKVyVLppYJtafXl2v6q0Hy2/55er7effB9+r3cnX6ns5HIILqKGZnqtVPOFyxrrhbc9BGeyO021qoO1H76VAqtighIM/HqvOK6WF3fVW3GAjM29i2JGI4ZC8rLYrE0rW9859WLbTdX5tnmD3iC12+2Gjk5rs9XZ/EoyicyE8SszzJDGpTn36Qx18VXKYyXqDzrbDr38/FvrAAjEIgUU1SmM78nH1UeK8qkxgMSn5l5AhTLCeGltMWAA5GwbfA9ns47z54eHO5xWO1FpfbaZo5JEM8KLfEMMbh7kD0wM4i5PRUVDg/PaU4YDVZ/nYX5taCARoVEUbQfIlHp3V65Wg13lU3OvkUa3gqQFSAHUpPKnTp09O/9C9p0XgnrpyWBQX2m3VnSw3L3u6QzC4zyvwX24hhLjvS6R024iOnvYp1oO311Pr8siKJooYKggZTM+DhrK1YIS+dgXr9FuN0aDkkIBLIP52jlV2zbyzgc354PDQb1Tn9UbTuo7DHpv0jKNMJQGd/X09EAclMt87U1zM+Gsb2g5/OH30DAIhQwEMDSC6bDN+BPqsrL4FKAglIE/jMCg/SRaiAi1X4/FTq0ejar6g8HgS8Gs81qFw+AkOIKT5KZzCONjcCXP2M3UBGPydJorLrga8JaWQ4cvpbRpmRakBDAsgpLYpno8XrZbPr4Rl5eVqQdWIHYSihIJkAIkJkuNnIBTU1twPgtKCOt1R0OHk7Jy08dyAHFp+DBuxl24+NqFH03EhFnMtByUStbuLPohUZgvnU5BCpB4Ody9XD2uhqLEAaIQ2kgGUFKIRICsqhwhRygU0g9DFfVtQU5KcFxzstGCMHwpw2RwivLgLleltcHTA8mblL676UWtLQ9SSB2GPYSo1cehhrc3NgTIFytGYChoVIcJkHWbLLW0Cm2cVwHkZLb/ZIfD0MGBXdO5XCMiEvdQVCbDizW4xkU5r33SzZkqpJJ9dwFQeAiB0GkSxTbHAYKoNzaENsYnFxRQeQUWCJACJA9idvYfa82qVPP9Hef1Uj2UxaCXshyb81oQimGmcT4XHsKZixqJuaPC02B2Ur61S/nFRW3aFoE1D02E8doafwTcmtu+D6GAXe0KNIIaaR1Kk0JVtLLUal1Nq0qV7R/O9hv0wX4xUcmKuonmZNKC8JnqMP5DdRLHJ6hpymQ2TVhNImX1vnQ+pfUnYloSo4XNhbZvPR0vK0OeuH97ShivuXYIQwGvRlroYywC784RgLTWOvrrQw6DQRoNmaTRplwuyULjv2KSOMPnNAwvFlOueqmzouZQiUj5Jbyu/GRi1ibslHaArGyMlwNkbPv+2P8QzK3QoaQgZDbil9mqztTU1YWC8yrQ4Qi1OQzRqJezNLGsBRkc9IhEvDKsuXLY19MpxnsaKpW8WLmWT+W1fvSzAqrAsPZfjZGVsUegJkfG9rbVADk+0F6IKABCCpBYQutPxarOeOv62vqz2RBhCIWi99gDLOGVsBZLM/KvxzN4hQ9Xh8OEiBl9vfrDHoopaRGt5W0wXuhsDDWC8W6A/DL1CEBu7+2NwSzLAYIa3e0YDXbZZlHIfqmu7oTwR0mtNHveYSAIluCiFouXtUjqEaX4oEe5T3Slel9O5DtWKlaWKJWMr+TrtA22ij8xW1AABEJZGZsqFyDbe5CJAAnAl4QONjHpnw3QfmG4Hj/R6gxFVQSUJVoT5VggsF5vNMoifzItjJIrFZX68NFcY2mFSAknUabk6zuyfD7lRxMJhRFCAYhaXXYcOTJ2f+82QJDJdhQDiMJtpAsAIbW21b7n9u9nHdBGCJ4T7t+UlEjqJWy0FRnsMTOj35zO8NUlFvx0LgyeZapL9j12aTEPfpGJhO4hBH3jiUfKQMmtbbBLXo7MtaMK9wLmdrvpRIIkIwA5U7dfUuflvBIvwcLzWxq5psbG1qi3SYKIGkwNYj7D88pROKg9WVriOx328V/fuLtYuLNuI9GFQFeXTicoOQ5KIJPtsXIBsoAqumATGxWBBLil9ceunulTRSVNLEd0c1D1+lyp1wJXY2NjKfInZZ7o8TAininhw5mPfGLf6Wo4ud9Yk2ln87IIVFrRpQi4FSsDIAQZ2P5ne7IcQZBJgLTDq8xoTLymI7VaWX71cFVNTU2rlwWnWJMwuvWSRom3qUlSily4OPHxhJkwi0XKTObYR77SJ
6vDvup9N26sywppGdwpAW7puowAgevnW//s7amPy8tuBbCuBUyhcOv+WKBJmXZ9qepMXU1TjbeVhadvkkgsJgvrrav3llqaGv/jwHpik4bDaJk4WDk4g3/SUsemXTIgaa1wUIgSU8VVCbMXm40GlTQjUQ5unoxIEyNxkIEZQcI8eFBJRDYTe5Aa9DC2EEyGkRDnFh0HMxJnOHhYdlvih03ogcvr+977Xvt7SCSVejpz+tLpE5cuHX31hQ28658wa7ogmY/+hBtAZJvNxoucmO2CGBq77bfPD+mRX7wNvEVwQIQY7IZXyCcIuGfIi1JnUOaoy8W8pLxeisKZII4j8VYkOZP6e/nK0dj+WGye7u+3a3QMTZZX5/xq3i9pFR5iUOa0P0dhXL2N3d1d8LJ1jScABF5cHcUxCMEV+iQEcd+RI0eGPM6zLhas6/KcBUKUiWFZJNlqzdyKQOtxO3X7wTjUG+x4v52GY+1m0W1LLxISKE9o4fd5dBRBphp7f/48mUamLvgdEvzJARHuMMjufo0JQdrk8RivH4P1Aym8Hg8KowpSLIwrksslHyffJ5Mzlx/aNWPYNolN0MBkc32O2AAQUeYkiRcdF3oRRH8u+2uveRwxgIM5UEoSOzaO64EUzoeDNI7jJhxHqQHm4UPUfI0xMsdQnJkPskglF2+13gPGzf0TdnsVwyYARQNMlta/LsohLSd2JNi6GlebhXlZs43GrEGPNAa1nAzbKMpSm7gKRD7RLEvjB2BVRsBcRqd5wMxCH4eijE43hvyOb7WSM8nUiVPz4/MUjWF2FhohAStX14tuPtTTlmS5DTdOnAUmO1ONxslRvXVvUCtFCZilLHE9kJDuSZqmBV/4wAGjN9j1FqiBukwowzA6igWQOFCJpO6vlEoPdOMkoJB26KDK1Uzabcu721xNromSLBENcNdOttHs1QMIfKRGJYLowA0Uic5t0lg47DOZfB78OgWh5WQYIMEAWAyW8cXC1lYrF6kkU6VXUGtgVZIkgQpZrmcyX/2dkE2So/DAco3YQ2BPprIFAyy+yIsdWUsoqsj5h4c3rn4KY3AGNgnGoSEPzhiDThdjHvGgb8w6iqECSByuXC7+rXLrypXtbQxSi6QtglCtVpcyxWGlyIu1PAxfUcQ1q1X/PdsoIGAyQhRVrdaRVyXC7U7zBxNL9fCdMBv0GeFiUA3j6dvHDDhHnAz1MkYhj3K5HMBElleelUr7Ndi7LgtauLdEVtcTcz3pRaWjKpwoyrCNO13hm9MGa0F0dFStg1A3HOAsdTGU2AzXl27cuTN0dwiKPRbSF3LSO4DqxnSBAPWfCWBUoOReKT0bM5FljBTqNGkK11fTabca6oGCVlQUuVaw6hFDtpk1GKxTNW4DpmWb7EDGp/Oh4urq5j0fLVhYk8l5EULRbKb6+vqO7WPMFBWIIS/+g3xbWVm+XXkwUSZ1pqowYhGwKgbH60QxOhni1Q1ZUWprs1ZQvNB8bpietrbFD7zNFv0QdbuLajSdTqzes9CQK+GgxWO5ex4UcbogHVGXOQaeRRYWFkCT5eVvlcitAPiwbtewFrzbDdXrmXR6eHIyqn7gCaX2o2BFkN7m7D+SzDW0aSiK45FuDWmdHQqVOZewKl1bUeNji0StD6gIWoVZHT6YhZmIUKXthhqEboUimhU2NIjPVqrRbDo/zA9LFNRaq1OjQ8ZoUyxSRSxTCzLQgQw8mSf5khDuj/85Nzc5/2szhl2umWEFfpFZvygO5btFIc+ywUDACUosnJ+mV9DEThPR4KDw9aBkxSoklUrFyhPl8kTm+ciVNbV1c5YufazP45VA4Xl+kyzIP5RIVffZS8CAL2PBCOlyfxhVlq2NSJosD/FD4FnwP1k1aFnZFrQ4BwkTzN4GDCNMWDtmajdht5BKJXa+HIuBmstXX8FX6+WaOueCbfVtL5emi8K9/AWoy7Aid49+DsOLjphJMhwOG83xL/LiKk0Z2nTwgsgKm1iWFbigBIY+mJ04Sq9AiUX4etxAGDDM0I6BkmQydv9+qhy7fPV9Lbjy9Ls5tGcv5QwW02peVFk+H2HFqgdxlxlxIW6StIEU48DMkUPDrBgRI7xYPJpXBZblnAEuUF8P2YKFhECbcZwgDAaiGjMYkGQyNcuBGfZidW1tHaxxFOagBtMBSSqqebA9eFEc/QxDQ7JcJEBmKR++VnUrP2RNkgX+gsBrqqpybW2BIGw/OGgCRWkcNcAMNuhRjaT6k8l+nXT5z8jCYzUe796Td691nbY70+mixAqsoF3//DGEdEBBYGwy7ja63Uazy0zODMuRiMyKAg8PSdFeH5Se4yyNzbB+0QS9mzbhuAHSBScCKmYxqYlzO0ZWH6hbAI1SK+w1+TnOJ0kqOx3K2WxmsxnZDgwjKHHrouBGx8DrUVHUNJ5VoxCq9MTCWRiv5+nefXYaXYWbCLQdpSFZGzEk2a8HQMqZkSXr59Sfbu2BhmxuU1s6rRYlfjobisdJUk+W2WZ0zzLgAniQtNANTeZ1jKCqrCQxTK/Vus3i8VJ2B1SGWIWaaKqJOoYh/f2ppA6JlV9kqq/sPrz14cMN0LxuqXuZVlmh81suFBqIFxIJN2l069nSC99h7EsUCok+W/i7rMkaz7OzISm+Zq/H6rFSvVt2OVAUCo877NCPIpVkSldSicWen6neX2OYd+7ZxT1dXTcb/c2KT5kecyGusVyoUJqcHI+T40Bxk26y9HuykIj3QWP/ndEUoPBy1BfkOOaJ1ep1dHrs/pM1KEGbCMzRUgOQ1H2oiA4pZ15U41jLOhDS09N6CncGBVX6lkO2u8amwBwqlCDGS+PjpUk4PmWzY2EbtKq5Lxqv/VR4tpNp9vn8bxth/7HBgVOUfxftaMApO/w7NiHA+C+lPDFye8ktbO7muW96jvec9tNOSfFn9aUkQWan/mb7CjC+DiglwPb6NZXN2Yw2s+sbw2qsIgV9fuYkWDi+aOPhww0Y7QCndBtFOQAyvwVJ6iWpVACSybxfsnD1vIeb79zZ03r2LtiXxUEd0mGzkWQWKCQJiEICGFOPBuIJss9s3I5ke6MMo0pSlOH8nRxzzXut64TXjm00DHb6rXaqqWbL8pZ/NZphaGplGMdPDG+te7ZiH9r50qI70G0sWIOmRckm5+C4MoI5R30oBNmHNdTZOZ5jaGC1To6T3uCg7V617HbZaTLciXU25od7RHcLhe5l3qaQQ2s6d2OQi0Z3Ldp6TtHjOaIg78//8zzv874+r0qBBAZcC7t/3lV1B7ZDAT5k9gsT/f3fvkPcKYMSCPiPQ/d+r125fOUzQFy7d2fos2tgl2ywSDbJmffmZ7a+emfr9tQUMTUD57VrsgM89lKPtY/o0KoNhi5ktfFfSH59AEH5RTUXykgbfMEsCjRJWvsPywMD+qH7P94f0p/c+xh+w1+79tnKiV6vhxS22S7Zxi83ua0pwjr1XpJMWsnozNrt22trdCJh0arVcHWo1SMGAyj5/JOboGMBlGzeCn0Q9HkDwSDGyALNWfshuy4B5D7MD/3JncjKFWhCARXe2cBg
u+c65KasVoKYss5zSZKUo9FowmExWDTaPpLknobGmQUgmdWbUFN+/bfg3zDmcr4gbPcKOEtRouX7/tPDMpQR5YLvfhK5//NORGHAO5ABkCvpFmmdmrf209y8laZlwUJR/kQi4XFMcOq+Ptkx1mF4DiAgZfXzm4s3FxYePFyam9v27i6d2XGMlQUrIXNHrsu2yzYwvTIVISR6YCimt11ShBwVyS0iSVjnKfFCFKy04IGqBGnjsWjBptWesTGANFZByueri4q/tndzn875lhrLBSeGyaxfniBP02VlsVL8b0MGoAKDDf1XJMfHxwdACPfCPMmRVgKL1UlhDUoS1AszCt2Tp7XqsY4u6LGNIQ2gNFYbAHmwu5vdngv5MpnseaFQZAWWpeX6IWQrtDtsiDLmlWsDkFKXlbAr3tLrm1aOILa2OEqM4ZilXytQMJHNo6NXjSZNe7t2eqTLAp0DZKPRyABkQVGyu+vzBr2+LECceIxmWVYmxXgE0ngcGI+NlyMfIzblhU0x2OPtXdRFUiTnORErsiynfYGaZBh+tPObn64CBM5O1RbLtOc5JNPIZEANCFl48GDJp5Oyfy5JmQLuxATq4ICiBGczveLSK4Mj5UgZxlYMQfSXkAHXYZGUaY6VRfAVR7NUlGInzev8Rz8EOnWzIEXdMW0YG/MgGcUUyMJNWOezYV/uTy/v40EKVmeEgwNMxOJNCIzehuhdaRekFTBsegSW4HSrpa3TrIjZCy2RThKCwFKC32zmRwOBq4YujVarMYxZxhwO5CyT2cg0FhdXgbO7lPNmf8tJks9dcB5gWpllD/y4vxWP34mcuE4i6fSOawi5bHsb0uDHlXL8AisWRRxP4fbTOs3RMxTFTA6aJSgZXuPLXSOa2ZGOMaLLsYZkzvhMo1pdqsIOL1v1eXO5XCAg8RL4ixVEjLEf4H4RO4xXmvuRyE66XCvfH1gZiKy4SkfOIoYVcHveHcOK1nlGkEXGgK57e71hCVpnGo1GO93heBaigph5PrOxEQhWVxd3fcGsN5sN5SQ+WCg4CwyHsZj/4MBux+tYLKZc8aNKJd/cT1cqlRYGCZVKud2lFi7SXJKmorTADJrNowFUB6cjs0CZ7ph2PDs9oUBGR6VwAJQAwzfny+ay0vl5we8sOEWWZQpOv92d+hvH8BjcGBg8tZxYDMNL7pQ7FcdwcJkVahBBW4r+N9B1KdypM8wCY7YD9iWOiWkHYjYzvLQePqsuLYWDPigqWV/ouH4uOWO8hB3gnosiHnNX3PFUPv83DuNDBPA4IFL5UiqfKrkhMU6tgoWkpmgKvMW/rDMavaYRYLRr1ACZniBACWMeXR/8SKouhtBANRT05uY2T2Pn4RbP8347xtRxrHjgdldK+XQln4+XSuCfPfc+WMTddIMHL4rMqyT7hEwzfhQ1+zvDnZ1GowmUzLZ3EGqOcxAkIpgHzSgqrQer0BIOhEJeb/bY6TwOHRcKBdbPshQr4AVnCYLhSkf2Ins7f+xFajv7e5H9Wjzewi+KxboogrdkGSBAWb8aftloMqnaYZuihpNEYoIgkEl0chDtXJeCErre2RnS+ea2jefHwKryvB3HcZbFIOKxo2bFVXat1FZqOzVXrba/l95rHrqLuHhab6cpUpYTTJFB16/CwYKq22SabWvTal6Bo2r1UxxHIFSUYRhUCgRQs/kNVAoFje8aex8eb3vDGwEct4PhOISgcNS8U0una+VaJO3a29uvNA/jpzFMFE9BCksQFGowoC+HwzrdW8bu7tlu2Dtq2rXchJokAUJ7PDS4S1pnJoVBNByWjqHdva06vsj5AnZ+GSCAKcRKR0fNZtp1L11Jp9N7laP8Yct5Ua/j9WKdltlJVJhEdSYdnIyBEmgwmgAyotWSBEEokGhClsGV0JVmBAaF7zNiNKpUpuG/vAF+Y2N5GTg4GFaKV+IVmCXgtyYw4q14K3YRi4lFkfKzftSJ8uGQoVf1Yu8XqvdVbbPDj0LhUlthmbcmrUi0a/q2JzHICAkLnUAtI5Yuo2rYpHp415STspnl5WU7D1rckKz5eL4E+XUUz+f34TWkmZOJYX7MDlkYCIQ7vaHwrd7uL1Svtb/f1t72aJva8nRPsi/Z82oScTjW1jwO8BkdjXosFlBiUkHnfvPG8LZ3m8+cKRiwFMxteKTy7jxk8X7erUx1N++225cVtRu+oC7kfevWux9+qWprG36zrf2ZHm0HbFi0Pcnkq8jE2gwtOzy36SgV9chEAiDdw93Dqu2Hx5tStZGxL8MwcCscOwys0MAU5vK/drZ8luGrPshKb++tD69/ePf6h23XX+p5RtPToX2pJ9kDN/LCE1tPkU89BRtf5Q9UVrKv5/nHn3n8+ut3b7x53bTpDYfCqGLOf+3ceV5Xnv83NHyuC4fhM97NzVt3byhdzF++/u516JQ+Cn+xeTT56iNPwuORfwDmIxlqcXq9nQAAAABJRU5ErkJggg==";
});
canvas {width: 100px; height: 100px}
div {text-align:center; width:120px; float:left}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<div>
<canvas id="canvOrigMinus100" width="100" height="100"></canvas>
-98%
</div>
<div>
<canvas id="canvOrigMinus50" width="100" height="100"></canvas>
-50%
</div>
<div>
<canvas id="canvOrig" width="100" height="100"></canvas>
Original
</div>
<div>
<canvas id="canvOrigPlus50" width="100" height="100"></canvas>
+50%
</div>
<div>
<canvas id="canvOrigPlus100" width="100" height="100"></canvas>
+98%
</div>
<hr/>
<div style="clear:left">
<canvas id="canvRoundMinus90" width="100" height="100"></canvas>
Round-trip <br/> (-98%, +98%)
</div>
<div>
<canvas id="canvRoundMinus50" width="100" height="100"></canvas>
Round-trip <br/> (-50%, +50%)
</div>
<div>
<canvas id="canvRound0" width="100" height="100"></canvas>
Round-trip <br/> (0% 100x)
</div>
<div>
<canvas id="canvRoundPlus50" width="100" height="100"></canvas>
Round-trip <br/> (+50%, -50%)
</div>
<div>
<canvas id="canvRoundPlus90" width="100" height="100"></canvas>
Round-trip <br/> (+98%, -98%)
</div>
Explanation
(Disclaimer - I am not an image specialist or a mathematician. I am trying to provide a common-sense explanation with minimal technical details. Some hand-waving below, e.g. 255=256 to avoid indexing issues, and 127.5=128, for simplifying the numbers.)
Since, for a given pixel, the possible number of non-zero values for a color channel is 255, the "no-contrast", average value of a pixel is 128 (or 127, or 127.5 if you want to argue, but the difference is negligible). For purposes of this explanation, the amount of "contrast" is the distance from the current value to the average value (128). Adjusting the contrast means increasing or decreasing the difference between the current value and the average value.
The problem the algorithm solves then is to:
Choose a constant factor to adjust contrast by
For each color channel of each pixel, scale "contrast" (distance from average) by that constant factor
Or, as hinted at in the CSS spec, simply choosing the slope and intercept of a line:
<feFuncR type="linear" slope="[amount]" intercept="-(0.5 * [amount]) + 0.5"/>
Note the term type='linear'; we are doing linear contrast adjustment in RGB color space, as opposed to a quadratic scaling function, luminance-based adjustment, or histogram matching.
If you recall from geometry class, the formula for a line is y=mx+b. y is the final value we are after, the slope m is the contrast (or factor), x is the initial pixel value, and b is the intercept of the y-axis (x=0), which shifts the line vertically. Recall also that since the y-intercept is not at the origin (0,0), the formula can also be represented as y=m(x-a)+b, where a is the x-offset shifting the line horizontally.
For our purposes, this graph represents the input value (x-axis) and the result (y-axis). We already know that b, the y-intercept (for m=0, no contrast) must be 128 (which we can check against the 0.5 from the spec - 0.5 * the full range of 256 = 128). x is our original value, so all we need is to figure out the slope m and x-offset a.
First, the slope m is "rise over run", or (y2-y1)/(x2-x1) - so we need 2 points known to be on the desired line. Finding these points requires bringing a few things together:
Our function takes the shape of a line-intercept graph
The y-intercept is at b = 128 - regardless of the slope (contrast).
The maximum expected 'y' value is 255, and the minimum is 0
The range of possible 'x' values is 256
A neutral value should always stay neutral: 128 => 128 regardless of slope
A contrast adjustment of 0 should result in no change between input and output; that is, a 1:1 slope.
Taking all these together, we can deduce that regardless of the contrast (slope) applied, our resulting line will be centered at (and pivot around) 128,128. Since our y-intercept is non-zero, the x-intercept is also non-zero; we know the x-range is 256 wide and is centered in the middle, so it must be offset by half of the possible range: 256 / 2 = 128.
So now for y=m(x-a)+b, we know everything except m. Recall two more important points from geometry class:
Lines have the same slope even if their location changes; that is, m stays the same regardless of the values of a and b.
The slope of a line can be found using any 2 points on the line
To simplify the slope discussion, let's move the coordinate origin to the x-intercept (-128) and ignore a and b for a moment. Our original line will now pivot through (0,0), and we know a second point on the line lies the full range of both x (input) and y (output) away, at (255,255).
We'll let the new line pivot at (0,0), so we can use that as one of the points on the new line that will follow our final contrast slope m. The second point can be determined by moving the current end at (255,255) by some amount; since we are limited to a single input (contrast) and using a linear function, this second point will be moved equally in the x and y directions on our graph.
The (x,y) coordinates of the 4 possible new points will be 255 +/- contrast. Since increasing or decreasing both x and y would keep us on the original 1:1 line, let's just look at the two remaining combinations: (+x, -y) and (-x, +y).
The steeper line (-x, +y) is associated with a positive contrast adjustment; its (x,y) coordinates are (255 - contrast,255 + contrast). The coordinates of the shallower line (negative contrast) are found the same way. Notice that the biggest meaningful value of contrast will be 255 - the most that the initial point of (255,255) can be translated before resulting in a vertical line (full contrast, all black or white) or a horizontal line (no contrast, all gray).
So now we have the coordinates of two points on our new line - (0,0) and (255 - contrast,255 + contrast). We plug this into the slope equation, and then plug that into the full line equation, using all the parts from before:
y = m(x-a) + b
m = (y2-y1)/(x2-x1) =>
((255 + contrast) - 0)/((255 - contrast) - 0) =>
(255 + contrast)/(255 - contrast)
a = 128
b = 128
y = (255 + contrast)/(255 - contrast) * (x - 128) + 128 QED
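A quick numerical sanity check of that result (a throwaway sketch, not part of the filter itself):
// factor derived above: (255 + contrast) / (255 - contrast)
function factorFor(contrast) { return (255 + contrast) / (255 - contrast); }
console.log(factorFor(0)); // 1 -> zero contrast adjustment leaves every pixel unchanged
console.log(factorFor(128) * (128 - 128) + 128); // 128 -> the midpoint never moves, whatever the contrast
console.log(factorFor(128) * (200 - 128) + 128); // ~345 -> bright values push past 255 and get clamped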
The math-minded will notice that the resulting m or factor is a scalar (unitless) value; you can use any range you want for contrast as long as it matches the constant (255) in the factor calculation. For example, a contrast range of +/-100 and factor = (100 + contrast)/(100.01 - contrast), which is what I actually use to eliminate the step of scaling to 255; I just left 255 in the code at the top to simplify the explanation.
Note about the "magic" 259
The source article uses a "magic" 259, although the author admits he doesn't remember why:
"I can’t remember if I had calculated this myself or if I’ve read it in a book or online.".
259 should really be 255 or perhaps 256 - the number of possible non-zero values for each channel of each pixel. Note that in the original factor calculation, the outer 259/255 term works out to about 1.016; since the final values are whole integers anyway, it's effectively 1 and can be discarded. Actually using 255 for the constant in the denominator, though, introduces the possibility of a divide-by-zero error in the formula; adjusting to a slightly larger value (say, 259) avoids this issue without introducing significant error to the results. I chose to use 255.01 instead, as the error is lower and it (hopefully) seems less "magic" to a newcomer.
As far as I can tell though, it doesn't make much difference which you use - you get identical values except for minor, symmetric differences in a narrow band of low contrast values with a low positive contrast increase. I'd be curious to round-trip both versions repeatedly and compare to the original data, but this answer already took way too long. :)
After trying the answer by Schahriar SaffarShargh, I found it wasn't behaving like contrast should. I finally came across this algorithm, and it works like a charm!
For additional information about the algorithm, read this article and its comments section.
function contrastImage(imageData, contrast) {
var data = imageData.data;
var factor = (259 * (contrast + 255)) / (255 * (259 - contrast));
for(var i=0;i<data.length;i+=4)
{
data[i] = factor * (data[i] - 128) + 128;
data[i+1] = factor * (data[i+1] - 128) + 128;
data[i+2] = factor * (data[i+2] - 128) + 128;
}
return imageData;
}
Usage:
var newImageData = contrastImage(imageData, 30);
Hopefully this will be a time-saver for someone. Cheers!
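For completeness, wiring it into a canvas could look something like this (a minimal sketch; the canvas id is just a placeholder and the image is assumed to be drawn already):
var canvas = document.getElementById('myCanvas'); // hypothetical canvas with an image on it
var ctx = canvas.getContext('2d');
var imageData = ctx.getImageData(0, 0, canvas.width, canvas.height);
ctx.putImageData(contrastImage(imageData, 30), 0, 0); // apply +30 contrast and draw it back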
This JavaScript implementation complies with the SVG/CSS3 definition of "contrast" (and the following code will render your canvas image identically):
/*contrast filter function*/
//See definition at https://drafts.fxtf.org/filters/#contrastEquivalent
//pixels come from your getImageData() function call on your canvas image
contrast = function(pixels, value){
var d = pixels.data;
var intercept = 255*(-value/2 + 0.5);
for(var i=0;i<d.length;i+=4){
d[i] = d[i]*value + intercept;
d[i+1] = d[i+1]*value + intercept;
d[i+2] = d[i+2]*value + intercept;
//implement clamping in a separate function if using in production
if(d[i] > 255) d[i] = 255;
if(d[i+1] > 255) d[i+1] = 255;
if(d[i+2] > 255) d[i+2] = 255;
if(d[i] < 0) d[i] = 0;
if(d[i+1] < 0) d[i+1] = 0;
if(d[i+2] < 0) d[i+2] = 0;
}
return pixels;
}
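Usage might look like this (a sketch; following the CSS convention, a value of 1 leaves the image unchanged, 0 flattens it to gray, and values above 1 increase contrast):
var pixels = ctx.getImageData(0, 0, ctx.canvas.width, ctx.canvas.height); // assumes a 2d context `ctx`
ctx.putImageData(contrast(pixels, 1.4), 0, 0); // roughly equivalent to the CSS filter: contrast(1.4)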
I found out that you have to apply the effect by separating the darks and lights: technically, any pixel whose average ((R + G + B) / 3) is less than 127 counts as a "black" and anything above 127 counts as a "white". Then, for a given contrast level (say 10), you subtract that value from the blacks and add the same value to the whites!
Here is an example:
I have two pixels with RGB colors, [105,40,200] | [255,200,150]
So for my first pixel, 105 + 40 + 200 = 345 and 345/3 = 115,
and 115 is less than 127 (half of 255), so I consider the pixel closer to [0,0,0]. Therefore, if I want to apply -10 contrast, I take it away from each channel in proportion to the pixel's average:
I divide each channel's value by the pixel's average (115 in this case), multiply that by my contrast value, and subtract the result from that channel.
For example, take 105 (red) from my pixel: (105/115)*10 gives roughly 9 (you have to round it), and 105 - 9 = 96, so red becomes 96 after applying 10 contrast to a dark pixel.
Doing the same for the other channels, my pixel's values become [96,37,183]! (Note: the scale of contrast is up to you, but in the end you should map it to some scale like 1 to 255.)
For the lighter pixels I do the same, except I add the contrast value instead of subtracting it. If a channel reaches the limit of 255 (or 0), I stop adding (or subtracting) for that specific channel. So my second, lighter pixel becomes [255,210,157].
As you add more contrast it lightens the lighter colors and darkens the darker ones, which adds contrast to your picture!
Here is some sample JavaScript code (I haven't tried it yet):
var data = imageData.data;
for (var i = 0; i < data.length; i += 4) {
var contrast = 10;
var average = Math.round( ( data[i] + data[i+1] + data[i+2] ) / 3 );
if (average > 127){
data[i] += ( data[i]/average ) * contrast;
data[i+1] += ( data[i+1]/average ) * contrast;
data[i+2] += ( data[i+2]/average ) * contrast;
}else{
data[i] -= ( data[i]/average ) * contrast;
data[i+1] -= ( data[i+1]/average ) * contrast;
data[i+2] -= ( data[i+2]/average ) * contrast;
}
}
You can take a look at the OpenCV docs to see how you could accomplish this: Brightness and contrast adjustments.
Then there's the demo code:
double alpha; // Simple contrast control: value [1.0-3.0]
int beta; // Simple brightness control: value [0-100]
for( int y = 0; y < image.rows; y++ )
{
for( int x = 0; x < image.cols; x++ )
{
for( int c = 0; c < 3; c++ )
{
new_image.at<Vec3b>(y,x)[c] = saturate_cast<uchar>( alpha*( image.at<Vec3b>(y,x)[c] ) + beta );
}
}
}
which I imagine you are capable of translating to javascript.
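A rough JavaScript translation of that loop for canvas ImageData might look like this (a sketch; the function name is just illustrative, and alpha/beta follow the ranges in the OpenCV docs above):
function adjustBrightnessContrast(imageData, alpha, beta) { // alpha: contrast [1.0-3.0], beta: brightness [0-100]
var data = imageData.data; // Uint8ClampedArray, so out-of-range results are clamped to [0,255] automatically
for (var i = 0; i < data.length; i += 4) {
data[i] = alpha * data[i] + beta; // red
data[i + 1] = alpha * data[i + 1] + beta; // green
data[i + 2] = alpha * data[i + 2] + beta; // blue
}
return imageData;
}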
By vintaging I assume you're trying to apply LUTs. Recently I have been trying to add color treatments to canvas windows. If you want to actually apply LUTs to the canvas window, I believe you need to map the array that getImageData() returns against the RGB array of the LUT.
(From Light illusion)
As an example the start of a 1D LUT could look something like this:
Note: strictly speaking this is 3x 1D LUTs, as each colour (R,G,B) is a 1D LUT
R, G, B
3, 0, 0
5, 2, 1
7, 5, 3
9, 9, 9
Which means that:
For an input value of 0 for R, G, and B, the output is R=3, G=0, B=0
For an input value of 1 for R, G, and B, the output is R=5, G=2, B=1
For an input value of 2 for R, G, and B, the output is R=7, G=5, B=3
For an input value of 3 for R, G, and B, the output is R=9, G=9, B=9
Which is a weird LUT, but you see that for a given value of R, G, or B input, there is a given value of R, G, and B output.
So, if a pixel had an input value of 3, 1, 0 for RGB, the output pixel would be 9, 2, 0.
During this I also realized, after playing with imageData, that it returns a Uint8ClampedArray and that the values in that array are plain decimal integers (0-255). Most 3D LUT files list their values in hex, so you first have to do some kind of hex-to-decimal conversion on the LUT before all this mapping.
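As an illustration of that mapping (a minimal sketch, assuming lutR, lutG and lutB are 256-entry lookup arrays like the 1D LUT columns above, already converted to decimal):
function applyLut(imageData, lutR, lutG, lutB) {
var data = imageData.data;
for (var i = 0; i < data.length; i += 4) {
data[i] = lutR[data[i]]; // each channel's input value indexes its output value
data[i + 1] = lutG[data[i + 1]];
data[i + 2] = lutB[data[i + 2]];
}
return imageData;
}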
This is the formula you are looking for ...
var data = imageData.data;
if (contrast > 0) {
for(var i = 0; i < data.length; i += 4) {
data[i] += (255 - data[i]) * contrast / 255; // red
data[i + 1] += (255 - data[i + 1]) * contrast / 255; // green
data[i + 2] += (255 - data[i + 2]) * contrast / 255; // blue
}
} else if (contrast < 0) {
for (var i = 0; i < data.length; i += 4) {
data[i] += data[i] * (contrast) / 255; // red
data[i + 1] += data[i + 1] * (contrast) / 255; // green
data[i + 2] += data[i + 2] * (contrast) / 255; // blue
}
}
Hope it helps!
I have a rectangle and would like to:
Get a random point on one (any) of the sides.
Get a random point on one (except for the previously picked) side.
My initial approach is to create arrays for each possible side.
var arr:Array = [[{x:0,y:0}, // Top
{x:width,y:0}], //
[{x:width,y:0}, // Right
{x:width,y:height}], //
[{x:width,y:height}, // Bottom
{x:0,y:height}], //
[{x:0,y:height}, // Left
{x:0,y:0}]]; //
Then, I get the sides.
rand is an instance of Rand and has the methods:
.next() which provides a random number between 0 and 1
.between(x,y) which returns a random number between x and y.
var firstSide:Array = arr[rand.next() * arr.length];
var secondSide:Array;
do {
secondSide = arr[rand.next() * arr.length];
} while(secondSide.equals(firstSide));
Finally, I calculate my points.
var pointOnFirstSide:Object = {x:rand.between(firstSide[0].x, firstSide[1].x),
y:rand.between(firstSide[0].y, firstSide[1].y)};
var pointOnSecondSide:Object = {x:rand.between(secondSide[0].x, secondSide[1].x),
y:rand.between(secondSide[0].y, secondSide[1].y)};
I don't think this is the most efficient way to solve this.
How would you do it?
Assuming we have the following interfaces and types:
interface Rand {
next(): number;
between(x: number, y: number): number;
}
interface Point {
x: number;
y: number;
}
type PointPair = readonly [Point, Point];
and taking you at your word in the comment that the procedure is: first randomly pick two sides, and then pick random points on those sides... first let's see what's involved in picking two sides at random:
const s1 = Math.floor(rand.between(0, arr.length));
const s2 = (Math.floor(rand.between(1, arr.length)) + s1) % arr.length;
s1 and s2 represent the indices of arr that we are choosing. The first one chooses a whole number between 0 and one less than the length of the array. We do this by picking a real number (okay, floating point number, whatever) between 0 and the length of the array, and then taking the floor of that real number. Since the length is 4, what we are doing is picking a real number uniformly between 0 and 4. One quarter of those numbers are between 0 and 1, another quarter between 1 and 2, another quarter between 2 and 3, and the last quarter are between 3 and 4. That means you have a 25% chance of choosing each of 0, 1, 2 and 3. (The chance of choosing 4 is essentially 0, or perhaps exactly 0 if rand is implemented in the normal way which excludes the upper bound).
For s2 we now pick a number uniformly between 1 and the length of the array. In this case, we are picking 1, 2, or 3 with a 33% chance each. We add that number to s1 and then take the remainder when dividing by 4. Think of what we are doing as starting on the first side s1, and then moving either 1, 2, or 3 sides (say) clockwise to pick the next side. This completely eliminates the possibility of choosing the same side twice.
Now let's see what's involved in randomly picking a point on a line segment (which can be defined as a PointPair, corresponding to the two ends p1 and p2 of the line segment) given a Rand instance:
function randomPointOnSide([p1, p2]: PointPair, rand: Rand): Point {
const frac = rand.next(); // between 0 and 1
return { x: (p2.x - p1.x) * frac + p1.x, y: (p2.y - p1.y) * frac + p1.y };
}
Here what we do is pick a single random number frac, representing how far along the way from p1 to p2 we want to go. If frac is 0, we pick p1. If frac is 1, we pick p2. If frac is 0.5, we pick halfway between p1 and p2. The general formula for this is a linear interpolation between p1 and p2 given frac.
Hopefully between the two of those, you can implement the algorithm you're looking for. Good luck!
Link to code
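Putting the two pieces together might look something like this (a sketch under the assumptions above; arr is the array of four sides from the question and rand is a Rand instance):
// Returns two random points on two distinct sides of the rectangle
function twoRandomPoints(arr, rand) {
const s1 = Math.floor(rand.between(0, arr.length));
const s2 = (Math.floor(rand.between(1, arr.length)) + s1) % arr.length;
return [randomPointOnSide(arr[s1], rand), randomPointOnSide(arr[s2], rand)];
}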
jcalz already gave an excellent answer. Here is an alternate version for the variant I asked about in the comments: When you want your points uniformly chosen over two sides of the perimeter, so that if your w : h ratio was 4 : 1, the first point is four times as likely to lie on a horizontal side as a vertical one. (This means that the chance of hitting two opposite long sides is 24/45; two opposite short side, 1/45; and one of each, 20/45 -- by a simple but slightly tedious calculation.)
const rand = {
next: () => Math. random (),
between: (lo, hi) => lo + (hi - lo) * Math .random (),
}
const vertices = (w, h) => [ {x: 0, y: h}, {x: w, y: h}, {x: w, y: 0}, {x: 0, y: 0} ]
const edges = ([v1, v2, v3, v4]) => [ [v1, v2], [v2, v3], [v3, v4], [v4, v1] ]
const randomPoint = ([v1, v2], rand) => ({
x: v1 .x + rand .next () * (v2 .x - v1 .x),
y: v1 .y + rand .next () * (v2 .y - v1 .y),
})
const getIndex = (w, h, x) => x < w ? 0 : x < w + h ? 1 : x < w + h + w ? 2 : 3
const twoPoints = (w, h, rand) => {
const es = edges (vertices (w, h) )
const perimeter = 2 * w + 2 * h
const r1 = rand .between (0, perimeter)
const idx1 = getIndex (w, h, r1)
const r2 = (
rand. between (0, perimeter - (idx1 % 2 == 0 ? w : h)) +
Math .ceil ((idx1 + 1) / 2) * w + Math .floor ((idx1 + 1) / 2) * h
) % perimeter
const idx2 = getIndex (w, h, r2)
return {p1: randomPoint (es [idx1], rand), p2: randomPoint (es [idx2], rand)}
}
console .log (
// Ten random pairs on rectangle with width 5 and height 2
Array (10) .fill () .map (() => twoPoints (5, 2, rand))
)
The only complicated bit in there is the calculation of r2. We calculate a random number between 0 and the total length of the remaining three sides, by adding all four sides together and subtracting off the length of the current side, width if idx is even, height if it's odd. Then we add it to the total length of the sides up to and including the index (where the ceil and floor calls simply count the number of horizontal and vertical sides, these values multiplied by the width and height, respectively, and added together) and finally take a floating-point modulus of the result with the perimeter. This is the same technique as in jcalz's answer, but made more complex by dealing with side lengths rather than simple counts.
I didn't make rand an instance of any class or interface, and in fact didn't do any Typescript here, but you can add that yourself easily enough.
I have an array (1200 values) of numbers
[123, 145, 158, 133...]
I'd like to have a div for each value with a background color from red to green, red being the smallest number and green the largest.
The base setup looks like this: (templating with vuejs but unrelated to the problem)
const values = [123, 145, 158, 133...]; // 1200 values inside
const total = values.length;
<div
v-for="(val, i) in values"
:key="i"
:style="{backgroundColor: `rgb(${(100 - (val*100/total)) * 256}, ${(val*100/total) * 256}, 0)`}">
{{val}}
</div>
I'm not a maths specialist but since all my numbers are around 100, the rgb generated is the same. (around 12% yellowish color)
How can I give more weight to the difference between 137 and 147?
EDIT: final formula:
:style="{backgroundColor: `rgb(${(256/(maxValue-minValue) * (boule-maxValue) - 255)}, ${(256/20 * (boule-maxValue) + 255)}, 0)`}"
Checkout this post: https://stats.stackexchange.com/questions/70801/how-to-normalize-data-to-0-1-range.
Basically you want to linearly rescale your values to another interval. You need your current min and max values from the array. Then define the new min' and max' which are the limits of the new interval. This would be [0, 255] in your case.
To do the transformation use the formula:
newvalue= (max'-min')/(max-min)*(value-max)+max'
As an example:
If your min value is 127 and max is 147, and you want to map 137. Then:
256/20 * (137-147) + 255 which results in 127.
If you want to map 130. Then:
256/20 * (130-147) + 255 = 37.4.
It really depends on what meaning those values actually have
However, you can try this: if your values are always bigger than 100 and always less than 150 (you can choose these numbers, of course), you can "stretch" your values using those as minimum and maximum. Let's take 137 and 147 as examples:
(val-min) : (max-min) = x : 255
(137-100):(150-100) = x:255 -> 37:50 = x:255 -> 188
(147-100):(150-100) = x:255 -> 47:50 = x:255 -> 239
That is for the math. In the end, this is the calculation:
newValue = (val-min)*255/(max-min)
where min and max are your chosen values.
You could take a kind of magnifier for a range of data. In this example, the values between 20 and 30 are mapped to a two times greater range than the outside values inside of an interval of 0 ... 100.
function magnifier(value, start, end, factor) {
var middle = (start + end) / 2,
size = (end - start) * factor / 2,
left = middle - size,
right = middle + size;
if (value <= start) return value * left / start;
if (value <= end) return (value - start) * factor + left;
return (value - end) * (100 - right) / (100 - end) + right;
}
var i;
for (i = 0; i <= 100; i += 5) {
console.log(i, magnifier(i, 20, 30, 2));
}
.as-console-wrapper { max-height: 100% !important; top: 0; }
I have an element and I'm changing its colour with a timer:
r = 0.05;
delta = 0.05;
function changecol(){
if (r < 0.05 || r > 0.95) delta = delta * (-1);
r += delta;
col = '#'+(Math.floor(r*255)).toString(16)+(Math.floor(r*255)).toString(16)+(Math.floor(r*255)).toString(16);
svg_elem.style.fill = col;
setTimeout("changecol()", 50);
}
So the colour changes from white to black and back; r*255 keeps it between 00 and FF, and everything goes smoothly, but when it reaches black, it flickers to white and starts ascending again. Have I missed some error in my calculations?
here's a jFiddle with a demo: https://jsfiddle.net/b4f58bz2/
I ran console.debug on your jsfiddle and found that when it flickers, the result of (Math.floor(r*255)).toString(16) is 'c' in hex, just before '0'. The problem is that translated to a CSS color this becomes #ccc, which equals #cccccc - a very light color instead of a dark one. You have to pad the result of (Math.floor(r*255)).toString(16) with a leading 0 if its length is less than 2, like:
color = (Math.floor(r*255)).toString(16);
if (color.length < 2) {
color = "0" + color;
}
then just make:
col = '#' + color + color + color;
I hope this gets you on the right track.
To change from black to white you can simply use the Lightness part of HSL with 0% saturation, giving that as color argument:
Replacing setTimeout() with requestAnimationFrame() will also make the animation much smoother as this will sync to the monitor vblank updates. Just compensate by reducing delta to about half.
Example using a div
var l = 5;
var delta = 2.5;
(function changecol() {
if (l < 5 || l > 95) delta = -delta;
l += delta;
d.style.backgroundColor = "hsl(0,0%," + l + "%)";
requestAnimationFrame(changecol);
})();
<div id=d>Ping-pong fade</div>
This took a bit.
The problem is your conversion of toString(16), you aren't considering the possibility that your hex code is too short.
R = hex.length == 1 ? "0" + hex : hex;
Please see:
r = 0.05;
delta = 0.05;
function changecol() {
if (r < 0.05 || r > 0.95) delta = delta * (-1);
r += delta;
hex = (Math.floor(r * 255)).toString(16);
R = hex.length == 1 ? "0" + hex : hex;
col = '#' + R + R + R;
document.getElementById('test').style.backgroundColor = col;
setTimeout("changecol()", 300);
}
document.addEventListener('DOMContentLoaded', changecol, false);
<div id="test">
ADFASDF
</div>
<div id="test1">
ADFASDF
</div>
The other poster already mentioned your bug in encoding small values, so that #0C0C0C for example is encoded as #CCC which actually represents #CCCCCC.
Besides that your code has some more bad practices in it.
First: use local variables! Some may be "needed" as globals, but col, for example, has no reason at all to be made global. That's just environmental pollution.
Next: pass the function to setTimeout, not a string that has to be parsed.
setTimeout(changecol, 50);
It's faster, it's cleaner, it can be encapsulated, it can be minified, ... only benefits.
Then, with a little trick, you can rewrite your color-encoding to be way nicer:
var c = 255 * r;
var col = "#" + (0x1000000 | c << 16 | c << 8 | c).toString(16).substr(1);
the 0x1000000 is padding, so that the value always has 7 hex digits: a leading 1 followed by the three color channels (r, g, b), two digits each.
This value is converted to a hex string; the leading 1 from the padding is then removed by the substr(1). Therefore we always have exactly 6 hexadecimal digits, 2 per color channel, with no worrying about leading zeroes and such.
The bit operations also strip any fractional part that c might have.
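Wrapped up as a small helper, the trick looks like this (just an illustration of the encoding; the function name is hypothetical):
function grayToHex(c) { // c is a gray level 0-255, possibly fractional; the bit-ops truncate it
return "#" + (0x1000000 | c << 16 | c << 8 | c).toString(16).substr(1);
}
console.log(grayToHex(12)); // "#0c0c0c"
console.log(grayToHex(204.7)); // "#cccccc"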
And don't use body-onload to start this thing. If you really have to, use jQuery ($(changecol)) or search for a DOMContentLoaded implementation, ... or just put the script at the end of the body, knowing it will then execute after all the HTML is parsed and the DOM is built.
But imo. we can do even better than that. We can make this thing a function of time, instead of incrementing and decrementing the value, and giving it a padding of 5% so you don't go over the borders, ...
var startTime = Date.now();
var speed = 255 / 1000; //255 shades of gray / 1000ms
//this means every 2 seconds this animation completes
//a full cycle from black to white to black again.
//or you slow the speed down to `2*255 / 30000` for example
//1cycle per 30seconds
function changecol(){
//first we calculate the passed time since the start (in ms)
//and then we multiply it by the speed by wich the value should change
//the amount of color-steps per ms
var t = (Date.now() - startTime) * speed;
//now we get the fraction of 256 which defines the shade
/*
var c = Math.floor(t % 256);
*/
//since we're dealing with powers of 2 we can use a bit-mask to extract just
//the last 8 bit of the value and do basically the same, including the Math.floor()
var c = t & 0xFF;
//every other interval we have to change the direction, so that we fade back from white to black
//and don't start over from 0 to 255 (black to white)
/*
var interval = Math.floor(t / 256); //get the interval
if(interval % 2 === 1) //check whether it's odd
c = 255 - c; //revert the direction
*/
//but again we can do easyer by simply checking the 9th bit.
//if it's a 1, this is an odd interval. the 9th bit equals 0x100 (256).
if(t & 0x100) //mask only the 9th bit. returns either 256 (which is truthy) or 0 (false)
c = 0xFF - c; //revert the direction
var col = "#" + (0x1000000 | c << 16 | c << 8 | c).toString(16).substr(1);
test.style.color = col;
//using animation-frames to stay in sync with the rest of animation in the browser.
requestAnimationFrame(changecol);
}
changecol(); //start this thing, don't use body-onload
You could do something like this instead, that way you don't have to try and figure out the correct HEX code.
var whatColor = true;
fadeColor = function() {
var color = whatColor ? 'white' : 'black';
$('svg circle').css({fill: color});
whatColor = !whatColor;
setTimeout(function() {
fadeColor();
}, 1000);
}
fadeColor();
svg circle {
-webkit-transition: fill 1s;
transition: fill 1s;
fill: black;
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<svg class="mySVG" height="100" width="100">
<circle cx="50" cy="50" r="40" stroke="black" stroke-width="3" />
</svg>
I need to compute the difference between two hex color values so the output is a percentage value. The first thing I discarded was converting the hex value into a single decimal number, as the first channel would carry much higher weight than the last.
The second option is to compute the difference between each of the RGB values and then add them all. However, the difference between 0, 0, 0 and 30, 30, 30 is much lower than the one between 0, 0, 0 and 90, 0, 0.
This question recommends using YUV, but I can't figure out how to use it to establish the difference.
Also, this other question has a nice formula to compute the difference and output a RGB value, but it's not quite there.
For those just looking for a quick copy/paste, here's the code from this repo by antimatter15 (with some tweaks for ease of use):
function deltaE(rgbA, rgbB) {
let labA = rgb2lab(rgbA);
let labB = rgb2lab(rgbB);
let deltaL = labA[0] - labB[0];
let deltaA = labA[1] - labB[1];
let deltaB = labA[2] - labB[2];
let c1 = Math.sqrt(labA[1] * labA[1] + labA[2] * labA[2]);
let c2 = Math.sqrt(labB[1] * labB[1] + labB[2] * labB[2]);
let deltaC = c1 - c2;
let deltaH = deltaA * deltaA + deltaB * deltaB - deltaC * deltaC;
deltaH = deltaH < 0 ? 0 : Math.sqrt(deltaH);
let sc = 1.0 + 0.045 * c1;
let sh = 1.0 + 0.015 * c1;
let deltaLKlsl = deltaL / (1.0);
let deltaCkcsc = deltaC / (sc);
let deltaHkhsh = deltaH / (sh);
let i = deltaLKlsl * deltaLKlsl + deltaCkcsc * deltaCkcsc + deltaHkhsh * deltaHkhsh;
return i < 0 ? 0 : Math.sqrt(i);
}
function rgb2lab(rgb){
let r = rgb[0] / 255, g = rgb[1] / 255, b = rgb[2] / 255, x, y, z;
r = (r > 0.04045) ? Math.pow((r + 0.055) / 1.055, 2.4) : r / 12.92;
g = (g > 0.04045) ? Math.pow((g + 0.055) / 1.055, 2.4) : g / 12.92;
b = (b > 0.04045) ? Math.pow((b + 0.055) / 1.055, 2.4) : b / 12.92;
x = (r * 0.4124 + g * 0.3576 + b * 0.1805) / 0.95047;
y = (r * 0.2126 + g * 0.7152 + b * 0.0722) / 1.00000;
z = (r * 0.0193 + g * 0.1192 + b * 0.9505) / 1.08883;
x = (x > 0.008856) ? Math.pow(x, 1/3) : (7.787 * x) + 16/116;
y = (y > 0.008856) ? Math.pow(y, 1/3) : (7.787 * y) + 16/116;
z = (z > 0.008856) ? Math.pow(z, 1/3) : (7.787 * z) + 16/116;
return [(116 * y) - 16, 500 * (x - y), 200 * (y - z)]
}
To use it, just pass in two rgb arrays:
deltaE([128, 0, 255], [128, 0, 255]); // 0
deltaE([128, 0, 255], [128, 0, 230]); // 3.175
deltaE([128, 0, 255], [128, 0, 230]); // 21.434
deltaE([0, 0, 255], [255, 0, 0]); // 61.24
The above code is based on the 1994 version of Delta E.
The issue is that you want something like a distance in a 3-dimensional space, but the RGB representation is not intuitive at all: colors that are 'near' each other in RGB can look quite different.
Take for instance two shades of grey, c1: (120,120,120) and c2: (150,150,150), and now take c3: (160,140,140). c3 is closer to c2 than c1 is, yet it looks purplish, and to the eye the darker grey is much closer to grey than a purple is.
I would suggest you use HSV: a color is defined by a 'base' color (hue), the saturation, and the intensity (value). Colors with close hues are indeed very close. Colors with very different hues do not relate to one another (e.g. yellow and green), but might seem closer at (very) low saturation and/or (very) low intensity.
( At night all colors are alike. )
Since the hue is divided into 6 blocks, cyl = Math.floor( hue * 6 ) (for hue normalized to [0,1]) gives you the first step of your similarity evaluation: if both colors fall in the same block of the cylinder, they are quite close.
If they don't belong to the same block, they might still be (quite) close if (h2-h1) is small; compare it to 1/6. If (h2-h1) > 1/6, the colors are probably just too different.
Then you can be more precise with (s,v): colors are nearer if both have low/very low saturation and/or low intensity.
Play around with a color picker supporting both rgb and hsv until you know what you would like to have as a difference value. But be aware that you cannot have a 'true' similarity measure.
you have a rgb --> hsv javascript convertor here : http://axonflux.com/handy-rgb-to-hsl-and-rgb-to-hsv-color-model-c
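A rough sketch of that heuristic (the conversion is the standard RGB-to-HSV formula; the thresholds are just placeholders to tune by eye):
// Standard RGB (0-255) to HSV (h, s, v all in [0,1]) conversion
function rgbToHsv(r, g, b) {
r /= 255; g /= 255; b /= 255;
var max = Math.max(r, g, b), min = Math.min(r, g, b), d = max - min;
var h = 0;
if (d !== 0) {
if (max === r) h = ((g - b) / d + 6) % 6;
else if (max === g) h = (b - r) / d + 2;
else h = (r - g) / d + 4;
h /= 6;
}
return { h: h, s: max === 0 ? 0 : d / max, v: max };
}
// Crude similarity check along the lines described above
function roughlySimilar(rgb1, rgb2) {
var a = rgbToHsv(rgb1[0], rgb1[1], rgb1[2]);
var b = rgbToHsv(rgb2[0], rgb2[1], rgb2[2]);
var dh = Math.abs(a.h - b.h);
dh = Math.min(dh, 1 - dh); // hue wraps around the cylinder
if (a.s < 0.2 && b.s < 0.2) return true; // both nearly gray: hue barely matters
if (a.v < 0.2 && b.v < 0.2) return true; // both nearly black
return dh < 1 / 6; // within one hue block of each other
}
console.log(roughlySimilar([120, 120, 120], [150, 150, 150])); // true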
Just compute an Euclidean distance:
var c1 = [0, 0, 0],
c2 = [30, 30, 30],
c3 = [90, 0, 0],
distance = function(v1, v2){
var i,
d = 0;
for (i = 0; i < v1.length; i++) {
d += (v1[i] - v2[i])*(v1[i] - v2[i]);
}
return Math.sqrt(d);
};
console.log( distance(c1, c2), distance(c1, c3), distance(c2, c3) );
//will give you 51.96152422706632 90 73.48469228349535
I released an npm/Bower package for calculating the three CIE algorithms: de76, de94, and de00.
It's public domain and on Github:
http://zschuessler.github.io/DeltaE/
Here's a quickstart guide:
Install via npm
npm install delta-e
Usage
// Include library
var DeltaE = require('delta-e');
// Create two test LAB color objects to compare!
var color1 = {L: 36, A: 60, B: 41};
var color2 = {L: 100, A: 40, B: 90};
// 1976 formula
console.log(DeltaE.getDeltaE76(color1, color2));
// 1994 formula
console.log(DeltaE.getDeltaE94(color1, color2));
// 2000 formula
console.log(DeltaE.getDeltaE00(color1, color2));
You will need to convert to LAB color to use this library. d3.js has an excellent API for doing that - and I'm sure you can find something adhoc as well.
The 3rd rule for color comparisons on ColorWiki is "Never attempt to convert between color differences calculated by different equations through the use of averaging factors". This is because colors that are mathematically close to each other aren't always visually similar to us humans.
What you're looking for is probably delta-e, which is a single number that represents the 'distance' between two colors.
The most popular algorithms are listed below, with CIE76 (aka CIE 1976 or dE76) being the most popular.
CIE76
CMC l:c
dE94
dE2000
Each one goes about things in a different way, but for the most part they all require you to convert to a better (for comparison) color model than RGB.
Wikipedia has all the formulae: http://en.wikipedia.org/wiki/Color_difference
You can check your work with online color calculators:
CIE76
CMC l:c
Finally, it's not JavaScript, but there's an open-source C# library I started that will do some of these conversions and calculations: https://github.com/THEjoezack/ColorMine
Using Color.js:
let Color = await import("https://cdn.jsdelivr.net/npm/colorjs.io@0.0.5/dist/color.esm.js").then(m => m.default);
let color1 = new Color(`rgb(10,230,95)`);
let color2 = new Color(`rgb(100,20,130)`);
let colorDistance = color1.deltaE2000(color2);
A distance of 0 means the colors are identical, and a value of 100 means they're opposite.
deltaE2000 is the current industry standard (it's better than the 1994 version mentioned in the top answer), but Color.js also has other algorithms like deltaE76, deltaECMC and deltaEITP.