I've been writing an image processing program which applies effects through HTML5 canvas pixel processing. I've achieved thresholding, vintaging, and color-gradient pixel manipulations, but unbelievably I cannot change the contrast of the image!
I've tried multiple solutions, but I always end up with too much brightness and not enough of a contrast effect, and I'm not planning to use any JavaScript libraries since I'm trying to achieve these effects natively.
The basic pixel manipulation code:
var data = imageData.data;
for (var i = 0; i < data.length; i += 4) {
    // Note: data[i], data[i+1], data[i+2] represent the R, G and B channels respectively
    data[i] = data[i];
    data[i+1] = data[i+1];
    data[i+2] = data[i+2];
}
Pixel manipulation example
Values are in RGB order, which means data[i] is the red channel. So if you write data[i] = data[i] * 2; the red channel of that pixel becomes twice as bright. Example:
var data = imageData.data;
for (var i = 0; i < data.length; i += 4) {
    // Note: data[i], data[i+1], data[i+2] represent the R, G and B channels respectively
    // Doubles the value of each RGB channel, brightening the pixel
    data[i] = data[i] * 2;
    data[i+1] = data[i+1] * 2;
    data[i+2] = data[i+2] * 2;
}
Note: I'm not asking you to complete the code for me! I'm asking for an algorithm (even pseudocode) that shows how contrast can be adjusted through pixel manipulation.
I would be glad if someone could provide a good algorithm for image contrast in HTML5 canvas.
A faster option (based on Escher's approach) is:
function contrastImage(imgData, contrast) {  // input range [-100..100]
    var d = imgData.data;
    contrast = (contrast / 100) + 1;  // convert to decimal & shift range: [0..2]
    var intercept = 128 * (1 - contrast);
    for (var i = 0; i < d.length; i += 4) {  // r,g,b,a
        d[i] = d[i] * contrast + intercept;
        d[i+1] = d[i+1] * contrast + intercept;
        d[i+2] = d[i+2] * contrast + intercept;
    }
    return imgData;
}
Derivation similar to the below; this version is mathematically the same, but runs much faster.
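For context, a minimal usage sketch (the canvas id "myCanvas" is an assumption, not part of the answer):

var canvas = document.getElementById('myCanvas'); // hypothetical canvas that already holds the image
var ctx = canvas.getContext('2d');
var imgData = ctx.getImageData(0, 0, canvas.width, canvas.height);
contrastImage(imgData, 30);      // +30% contrast; pass a negative value to reduce contrast
ctx.putImageData(imgData, 0, 0); // write the adjusted pixels back to the canvas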
Original answer
Here is a simplified version with explanation of an approach already discussed (which was based on this article):
function contrastImage(imageData, contrast) {  // contrast as an integer percent
    var data = imageData.data;  // original array modified, but canvas not updated
    contrast *= 2.55;  // or *= 255 / 100; scale integer percent to full range
    var factor = (255 + contrast) / (255.01 - contrast);  // add .01 to avoid a divide-by-zero error
    for (var i = 0; i < data.length; i += 4) {  // pixel values in 4-byte blocks (r,g,b,a)
        data[i] = factor * (data[i] - 128) + 128;      // r value
        data[i+1] = factor * (data[i+1] - 128) + 128;  // g value
        data[i+2] = factor * (data[i+2] - 128) + 128;  // b value
    }
    return imageData;  // optional (e.g. for filter function chaining)
}
Notes
I have chosen to use a contrast range of +/- 100 instead of the original +/- 255. A percentage value seems more intuitive for users, or programmers who don't understand the underlying concepts. Also, my usage is always tied to UI controls; a range from -100% to +100% allows me to label and bind the control value directly instead of adjusting or explaining it.
This algorithm doesn't include range checking, even though the calculated values can far exceed the allowable range - this is because the array underlying the ImageData object is a Uint8ClampedArray. As MSDN explains, with a Uint8ClampedArray the range checking is handled for you:
"if you specified a value that is out of the range of [0,255], 0 or 255 will be set instead."
Usage
Note that while the underlying formula is fairly symmetric (allows round-tripping), data is lost at high levels of filtering because pixels only allow integer values. For example, by the time you de-saturate an image to extreme levels (>95% or so), all the pixels are basically a uniform medium gray (within a few digits of the average possible value of 128). Turning the contrast back up again results in a flattened color range.
Also, order of operations is important when applying multiple contrast adjustments - saturated values "blow out" (exceed the clamped max value of 255) quickly, meaning highly saturating and then de-saturating will result in a darker image overall. De-saturating and then saturating however doesn't have as much data loss, because the highlight and shadow values get muted, instead of clipped (see explanation below).
Generally speaking, when applying multiple filters it's better to start each operation with the original data and re-apply each adjustment in turn, rather than trying to reverse a previous change - at least for image quality. Performance speed or other demands may dictate differently for each situation.
Code Example:
function contrastImage(imageData, contrast) {  // contrast input range [-1..1]
    var data = imageData.data;  // Note: original dataset modified directly!
    contrast *= 255;
    var factor = (contrast + 255) / (255.01 - contrast);  // add .01 to avoid a divide-by-zero error
    for (var i = 0; i < data.length; i += 4) {
        data[i] = factor * (data[i] - 128) + 128;
        data[i+1] = factor * (data[i+1] - 128) + 128;
        data[i+2] = factor * (data[i+2] - 128) + 128;
    }
    return imageData;  // optional (e.g. for filter function chaining)
}
$(document).ready(function(){
    var ctxOrigMinus100 = document.getElementById('canvOrigMinus100').getContext("2d");
    var ctxOrigMinus50 = document.getElementById('canvOrigMinus50').getContext("2d");
    var ctxOrig = document.getElementById('canvOrig').getContext("2d");
    var ctxOrigPlus50 = document.getElementById('canvOrigPlus50').getContext("2d");
    var ctxOrigPlus100 = document.getElementById('canvOrigPlus100').getContext("2d");
    var ctxRoundMinus90 = document.getElementById('canvRoundMinus90').getContext("2d");
    var ctxRoundMinus50 = document.getElementById('canvRoundMinus50').getContext("2d");
    var ctxRound0 = document.getElementById('canvRound0').getContext("2d");
    var ctxRoundPlus50 = document.getElementById('canvRoundPlus50').getContext("2d");
    var ctxRoundPlus90 = document.getElementById('canvRoundPlus90').getContext("2d");

    var img = new Image();
    img.onload = function() {
        // draw original
        ctxOrig.drawImage(img, 0, 0, img.width, img.height, 0, 0, 100, 100); // 100 = canvas width, height

        // reduce contrast
        var origBits = ctxOrig.getImageData(0, 0, 100, 100);
        contrastImage(origBits, -.98);
        ctxOrigMinus100.putImageData(origBits, 0, 0);

        origBits = ctxOrig.getImageData(0, 0, 100, 100);
        contrastImage(origBits, -.5);
        ctxOrigMinus50.putImageData(origBits, 0, 0);

        // add contrast
        origBits = ctxOrig.getImageData(0, 0, 100, 100);
        contrastImage(origBits, .5);
        ctxOrigPlus50.putImageData(origBits, 0, 0);

        origBits = ctxOrig.getImageData(0, 0, 100, 100);
        contrastImage(origBits, .98);
        ctxOrigPlus100.putImageData(origBits, 0, 0);

        // round-trip, de-saturate first
        origBits = ctxOrig.getImageData(0, 0, 100, 100);
        contrastImage(origBits, -.98);
        contrastImage(origBits, .98);
        ctxRoundMinus90.putImageData(origBits, 0, 0);

        origBits = ctxOrig.getImageData(0, 0, 100, 100);
        contrastImage(origBits, -.5);
        contrastImage(origBits, .5);
        ctxRoundMinus50.putImageData(origBits, 0, 0);

        // do nothing 100 times
        origBits = ctxOrig.getImageData(0, 0, 100, 100);
        for (var i = 0; i < 100; i++) {
            contrastImage(origBits, 0);
        }
        ctxRound0.putImageData(origBits, 0, 0);

        // round-trip, saturate first
        origBits = ctxOrig.getImageData(0, 0, 100, 100);
        contrastImage(origBits, .5);
        contrastImage(origBits, -.5);
        ctxRoundPlus50.putImageData(origBits, 0, 0);

        origBits = ctxOrig.getImageData(0, 0, 100, 100);
        contrastImage(origBits, .98);
        contrastImage(origBits, -.98);
        ctxRoundPlus90.putImageData(origBits, 0, 0);
    };
img.src = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAGQAAABkCAMAAABHPGVmAAADAFBMVEX0RydFRjuPweRoak+awuFzbknzUj9SV0T0RRVtb1lcak7zUTSUmX94hnOYoHFacmpCPS9zwvOBkG5Zbl7xVCCVxuhnfnV7fFOnoqU3OTFQZVNNXlCmxeBtfWlhaFlbXUmfyOVrh3/vXUiBfUcrKCF9wu50fXB4fGVlcGJIUEvtSDCUvd2Up4uJnYBUX0A7QzjtW1tYZUlqY0U1Miivy+BfdXJ/hWtuc2eNjWSCkWFbYVVhcU6Pl2tsf13xVUxyueeir4N+lICIkXdIUjuxr7JyfH1iemiHhFT5Rx+7zN2Xn2KCf11jeFl1jYhSZ11OTkGJxe6Kt9ybrp6fqZBkfoZ/in53jntBTUN4h2FPTDFFRy+GwOeju9iVp35dWj1uh25gZWqNmF9JWUX3SjOvxNtyjZlYeJdqrNeMn3F2dFmNj1KWkktFV1W7vbd/l4ydkIVOXWDgVEjwWzlqb3WXq6/Bp6OHlJuWn5aKmotSZ2t/jVWRgFKmkU92eUF+rdaKrNCnsqVnh5BqeEyHh0POmzxiZDuotbqkrW+lnmvEo1BafqZ3gYnmZWKonFyZjlvvZ1Xev0hlu+5aksKyr5RLaI+YkmrXZmqYnFa3l0tXVDZpXzGFoqWlkZa3hYHrbmbXuFGLo5Z4hU3UqkdhbUBvbThxoM29u6SVoaNVbXziXlRVVVRZndPNsl1/uOHCsbB5mJWSjZE9PyWWs9egpX+st36lm3eysnJ6bTdxkb5gcYfSWlMfGhxpwPdloLp8mqWCi4++s344SVKih0aYfT6JcTy/TzwsLzZAMR+drM2/kJGrnpBafntHYnk1PUNfVC3SUCpfr+XTc4OGeXnjaHSlqV+sgzm9mpviSTlRg7Rrh7HmdXbCdXDSeW5AVWy2pWHZa1ynt5Hge5DNiouMgIbGtGq6w86+ZFmkV06+jzxOQyXWiKKzcGeFosRgkajdrzqddHOEcFN6SCTRig2dRCbjRiK1ahXMnLV6VzuEXFyeaiy8voxgPh/CfBK6lK6ciHeFa2q3ecQCAAAjI0lEQVRo3iRVf2gbZRjOGUroaO5iTq8mx6WGxDTBpGGNl+baFJOGpXhtLs4mzY+RxjvMTJNQAw5tTuJsjnCiKAiTsNSBoVFGRWpsyJBuheps/oiNIhEEHQQFR9Hi2lFq14J+6sMH33HHve/7vD+eV9KO7Jav6nTVfL6cFuhiZYU/UzmTNCvNbkIbJ1B0HJ12E74FrRdDFxgkgfgwBlkKrG36tFoEO+dbY9ZaGGJktJtrmUu+hdmg3Y0Nmp2LdaVIizCvFFPF4YLkpFy9WtTpdOl0viRUK2yFpyrOsEW5uIChLVxrIUaMC1oMR7ksgjMkjobi4MIQn5pEsr6cOoRkGBJpBRKbl5Z9pGFBHjQHzcqusuFQsmFRGXZ0+GJeslu+kS+lBSFP06UqTYtjcYof01hmUR/mXcPOzA6ZvCTJQFaGUfs9fivHWT1qJhDKJsjcEo4wiDZHtgKB5cRy4pI2Lp8YIeTmcaUTnocbIlxX9vMiL6xI2rvtok6odPg0neJTYpVyOQthftZ4aTmhZRBjKIubEMiPk4jMpIayIavJ6gkxrcCty0ubawGEJAPLm8trr1z6Y2lzOeHjxs5qzEbY7l6P1XVUfV0s6XQFnpLsttt5oZRKpdOlEptiKZ6OVobOElo1ksNa5zASRWUQTiYCATK7xLSWSDLRWtq8fP7y8iVfC/rCt7m5mUgktCSJYAEUw0zucY1RA/dSsDNJTcVYlk8W8nlJu3yjmq/k2U4lla+wPM+OrRSkQ2rvKIliWhLL4iiSg0xMNrAUWP5jOcEgy3ubfyQubyImtTVE5pAsYwqZOEaN+9WQWnY1MkBopNL+nv6+usslOhsuuLAyB5xUK4U0O9kQOwIrUvmVWmT67JgfHYI4DpJZTR61TKaIKqDQKqkFfhJqda6VWMJQT8TjUaNciOM4k9/jiQIoIDUyKpMW5vqdFCvWO7Qz6ehJ83MUKHw1nU5R5XQJpKo0VChIr8qGIpGoAhx/TVFTNJvgKdpUQNcDJB7KqlW5AOOJAJsej4eDojarAoIUCigalXlqNQXkV1w1jivtDidP0XXluK5U5PslN/JzVSGv69AlgRallamiTCPzN8FvEUXTFm2qFDab4n/4FX5U4YnW/jtNGyCnCkUi5Rv7J7vWmoKDIBXgA/mNoL/6Vkou18T6A5eSoni2KImAOUkL777zDl2ZpHlW2a8ag0wyK/RvfCrV9m7z5KTZtAGoVAoVsKzGEc4D2WzDGky9bTw8PT48OP0BtelRz79sOJnf2CuPycMs7FDW63SHH6AKZUk1JayUUg6xmE/RIp3v6ZH6QUw4JAtZVfrt6uHp4f7h4fGOCYP0tppN4RnycB6PCpqfHx9SbcMHB8Pzxwenyh0NET+LAxCEN2iZmFY67esPlFLRORhzshWJcEMQ0pNsp0E3XC44SQ0MQP5VyGZVezirbefLw9PmHPBCbuPjMj1w8y+iNWi4J+y16vXJ04Pjk8PTU/mOafHDOMOgCBlfDbonYkkYnqAqLFunpAVpv6QslIr5fl2p02nAohPuGzSARuc8HEoYxzzbwunB4f3Dwx9MG3pjb6Rpa9aioOa1SFkaj25sbOA/HACcfpXTjy6a3STqzRmC8t6ZCSrspGjKOQ7TKTa/kpcIc8JwOiUUHWyn46Rpe2wwGMRQNOddtKj1evzwPyv3bBsKPwE19fpapNau1dq9YS4yHNHrZcfg62ROFcJmPoyToyjmm0XNFioZg9c7rJi0jHlA5XnJbaGtK5fSOoF2pkSXE7bPzwQXnkdaS2uzGpCQ0D7gsq/XN3cjmjNNcM8Vd6PtI/k095pOZbOBF03r9g7iXTcbRnCvL240PBeG+5LJcLI7E0vyNGx3FouS2zeAcqXSAtxolHSlyZTDbggaZjEExGQfUls39Bv/wX9WymmN+vbubi1yNHyVwCHOmLGGrDs7W6YtbTz4qtsoY2QGr1zrtiyau65uzNJ9QFN0QckDWbnd1glCzzudfLpRSoNKwQ4i2Of1erVo3DKNqq2ZzNZWJpNdWHh11qRoHp3st4+O7o14VEME2AAtJsNkEa07+GoQz6CjBEaMeONds8VZX5+BKTEZc5b6K5WKZHclBeQxVS9R9QbLlh7ABTgYXJz2xlHES/R64xyOqhnjyz65mtNbd3ua+6/t3/7Tx3hVGu3CWhDDEHnw1ZlFjUmLoASGy71yi91icXVnZrousyuZhGkHOySp6srptK6fFTssC3YZLFLzU4Q9KCfPkV4ZITXOEoOG4MjbWEBrQvRz905O/j4+/uvtpVyLw02ake9f+uyDx37tlWUx4EOuWZCfkxOxriVMwV1XjIVFsFAc/LRkF+wsXXVSpEG3daZ6qKTTPTGxOGLADFovbuUG7d/b7YOG0dVVBjGFRu7t37129/j26nIrlPnm4t7e3lsXv1i1qTItBGxLwj
holGN9MaWSguszgEa9HoMtlKMgKbfTQk9HLDlZGoZddZ5PJgeDSoPh3PMEpsVDJqvfD5nuvHVxa1uVMRrunhxd++349wEDY3vrxfce/hffvfjW9RyDY5rxEa+bMLrNix++YVHOzFjqrgegv4Cyj0mK5XQqVUhVikJJl4cbIF8wOzY68rKBxI0hyLpz58qPP/64d+v8hT391tsv3W0eXTs+/P2vkNW298Sjjz7+6IXXby0FrmczS1oGnT6jwdxueczi+viN51wPzN1YLOkc7uFXwMQPT+VT9UYq1Uk56Tcd/evvw30yVGs0DHAmq/XOj9/dvAniffrTz/2m76/dP9n/4P63v/0C1RQfPf7UU0+evwWwd+VOxpRFNMSZabtl0R2LKWOuriVW78bqSWk4yVdA4XU6YbIkpNIdIPdAWeZ7+/ocBK5ZxVHT1pUrVy6++PATT1x49sIfR9Cfv9zfP7r2yLe/XbtXG3jomRfOn//0iYdv3rz54sVcNpdF1ZowEbbLqeQsDMOWJOARp8CknKlI2lVhuFzV0ZPJyR66KDbenbI7pwbtfpnMqIbufPLdd7euf3ThUYDH7kf+/Pnu/t1fHvn2659+L597/KnLTz70LHDy3i2MvP6NNYuMjstGCWPc7IorxfrMG10XHYu5YlSY/4cG841po47D+HG9E1oo0KPHOVbWa2vHtrYoZ1cL7dk/i3S1Zf2zpV5XuIaFWWCEhMboFKlQgy4GR9oMlk4lJQUdNg2Rzb+DJSaiKGJZZnQhLGwmakIk0YzgK43fW+K96Mv73PM83+d79yuC46Oui4Ov4+fe7r08aLVaKY/ZIzZ5hFX/3Jd96VQqFdsJ0BhJP7r819b48l/LG+W7xcmtv2+SOh1G+v2y2NmlpfX19NWrtfXixlKrydDtdEAadkfHdb1B393MNncjGYZ3Ua5zLnPvZcp1+RVPz2WzK5xhlC0nTlzKL72VWpRpCwnMSJPvFn/f3HrwYPmWXL67tfngJorRNAiMLcVsizLb0tWjp47WeYlDIUeUMHTb9d2Q/3VYXt3NXi8y8+mVDDOjqex89qJG1FvpabD2DjElT0mPHt1/J2Vbyi9qI5GAzmikfyoub8SLD36fjMvlc1vLAoSk6VjMlloEx+ATbETVFo1WtYaCQQPBGfT2jo57do6zdk9bEJwynxvVUJ9qBimP/ZNnPWaKp0zPSFtb9q+lF22FdMoWQTFdl5v+tlicOrJbPLIhL5Orx7/4AKVpiOVh8us7SzurL58dGZk/+05WyhkAc/LesN5ZYXeYfuxM5qAnYaV5CKfODc1Qva4fzQ2mSmmFVCo90NeXltkiibxWgLS7jd8Vt6bkxeLAuFwuV0/99iKKKYx0IKKVyVLppYJtafXl2v6q0Hy2/55er7effB9+r3cnX6ns5HIILqKGZnqtVPOFyxrrhbc9BGeyO021qoO1H76VAqtighIM/HqvOK6WF3fVW3GAjM29i2JGI4ZC8rLYrE0rW9859WLbTdX5tnmD3iC12+2Gjk5rs9XZ/EoyicyE8SszzJDGpTn36Qx18VXKYyXqDzrbDr38/FvrAAjEIgUU1SmM78nH1UeK8qkxgMSn5l5AhTLCeGltMWAA5GwbfA9ns47z54eHO5xWO1FpfbaZo5JEM8KLfEMMbh7kD0wM4i5PRUVDg/PaU4YDVZ/nYX5taCARoVEUbQfIlHp3V65Wg13lU3OvkUa3gqQFSAHUpPKnTp09O/9C9p0XgnrpyWBQX2m3VnSw3L3u6QzC4zyvwX24hhLjvS6R024iOnvYp1oO311Pr8siKJooYKggZTM+DhrK1YIS+dgXr9FuN0aDkkIBLIP52jlV2zbyzgc354PDQb1Tn9UbTuo7DHpv0jKNMJQGd/X09EAclMt87U1zM+Gsb2g5/OH30DAIhQwEMDSC6bDN+BPqsrL4FKAglIE/jMCg/SRaiAi1X4/FTq0ejar6g8HgS8Gs81qFw+AkOIKT5KZzCONjcCXP2M3UBGPydJorLrga8JaWQ4cvpbRpmRakBDAsgpLYpno8XrZbPr4Rl5eVqQdWIHYSihIJkAIkJkuNnIBTU1twPgtKCOt1R0OHk7Jy08dyAHFp+DBuxl24+NqFH03EhFnMtByUStbuLPohUZgvnU5BCpB4Ody9XD2uhqLEAaIQ2kgGUFKIRICsqhwhRygU0g9DFfVtQU5KcFxzstGCMHwpw2RwivLgLleltcHTA8mblL676UWtLQ9SSB2GPYSo1cehhrc3NgTIFytGYChoVIcJkHWbLLW0Cm2cVwHkZLb/ZIfD0MGBXdO5XCMiEvdQVCbDizW4xkU5r33SzZkqpJJ9dwFQeAiB0GkSxTbHAYKoNzaENsYnFxRQeQUWCJACJA9idvYfa82qVPP9Hef1Uj2UxaCXshyb81oQimGmcT4XHsKZixqJuaPC02B2Ur61S/nFRW3aFoE1D02E8doafwTcmtu+D6GAXe0KNIIaaR1Kk0JVtLLUal1Nq0qV7R/O9hv0wX4xUcmKuonmZNKC8JnqMP5DdRLHJ6hpymQ2TVhNImX1vnQ+pfUnYloSo4XNhbZvPR0vK0OeuH97ShivuXYIQwGvRlroYywC784RgLTWOvrrQw6DQRoNmaTRplwuyULjv2KSOMPnNAwvFlOueqmzouZQiUj5Jbyu/GRi1ibslHaArGyMlwNkbPv+2P8QzK3QoaQgZDbil9mqztTU1YWC8yrQ4Qi1OQzRqJezNLGsBRkc9IhEvDKsuXLY19MpxnsaKpW8WLmWT+W1fvSzAqrAsPZfjZGVsUegJkfG9rbVADk+0F6IKABCCpBYQutPxarOeOv62vqz2RBhCIWi99gDLOGVsBZLM/KvxzN4hQ9Xh8OEiBl9vfrDHoopaRGt5W0wXuhsDDWC8W6A/DL1CEBu7+2NwSzLAYIa3e0YDXbZZlHIfqmu7oTwR0mtNHveYSAIluCiFouXtUjqEaX4oEe5T3Slel9O5DtWKlaWKJWMr+TrtA22ij8xW1AABEJZGZsqFyDbe5CJAAnAl4QONjHpnw3QfmG4Hj/R6gxFVQSUJVoT5VggsF5vNMoifzItjJIrFZX68NFcY2mFSAknUabk6zuyfD7lRxMJhRFCAYhaXXYcOTJ2f+82QJDJdhQDiMJtpAsAIbW21b7n9u9nHdBGCJ4T7t+UlEjqJWy0FRnsMTOj35zO8NUlFvx0LgyeZapL9j12aTEPfpGJhO4hBH3jiUfKQMmtbbBLXo7MtaMK9wLmdrvpRIIkIwA5U7dfUuflvBIvwcLzWxq5psbG1qi3SYKIGkwNYj7D88pROKg9WVriOx328V/fuLtYuLNuI9GFQFeXTicoOQ5KIJPtsXIBsoAqumATGxWBBLil9ceunulTRSVNLEd0c1D1+lyp1wJXY2NjKfInZZ7o8TAininhw5mPfGLf6Wo4ud9Yk2ln87IIVFrRpQi4FSsDIAQZ2P5ne7IcQZBJgLTDq8xoTLymI7VaWX71cFVNTU2rlwWnWJMwuvWSRom3qUlSily4OPHxhJkwi0XKTObYR77SJ
6vDvup9N26sywppGdwpAW7puowAgevnW//s7amPy8tuBbCuBUyhcOv+WKBJmXZ9qepMXU1TjbeVhadvkkgsJgvrrav3llqaGv/jwHpik4bDaJk4WDk4g3/SUsemXTIgaa1wUIgSU8VVCbMXm40GlTQjUQ5unoxIEyNxkIEZQcI8eFBJRDYTe5Aa9DC2EEyGkRDnFh0HMxJnOHhYdlvih03ogcvr+977Xvt7SCSVejpz+tLpE5cuHX31hQ28658wa7ogmY/+hBtAZJvNxoucmO2CGBq77bfPD+mRX7wNvEVwQIQY7IZXyCcIuGfIi1JnUOaoy8W8pLxeisKZII4j8VYkOZP6e/nK0dj+WGye7u+3a3QMTZZX5/xq3i9pFR5iUOa0P0dhXL2N3d1d8LJ1jScABF5cHcUxCMEV+iQEcd+RI0eGPM6zLhas6/KcBUKUiWFZJNlqzdyKQOtxO3X7wTjUG+x4v52GY+1m0W1LLxISKE9o4fd5dBRBphp7f/48mUamLvgdEvzJARHuMMjufo0JQdrk8RivH4P1Aym8Hg8KowpSLIwrksslHyffJ5Mzlx/aNWPYNolN0MBkc32O2AAQUeYkiRcdF3oRRH8u+2uveRwxgIM5UEoSOzaO64EUzoeDNI7jJhxHqQHm4UPUfI0xMsdQnJkPskglF2+13gPGzf0TdnsVwyYARQNMlta/LsohLSd2JNi6GlebhXlZs43GrEGPNAa1nAzbKMpSm7gKRD7RLEvjB2BVRsBcRqd5wMxCH4eijE43hvyOb7WSM8nUiVPz4/MUjWF2FhohAStX14tuPtTTlmS5DTdOnAUmO1ONxslRvXVvUCtFCZilLHE9kJDuSZqmBV/4wAGjN9j1FqiBukwowzA6igWQOFCJpO6vlEoPdOMkoJB26KDK1Uzabcu721xNromSLBENcNdOttHs1QMIfKRGJYLowA0Uic5t0lg47DOZfB78OgWh5WQYIMEAWAyW8cXC1lYrF6kkU6VXUGtgVZIkgQpZrmcyX/2dkE2So/DAco3YQ2BPprIFAyy+yIsdWUsoqsj5h4c3rn4KY3AGNgnGoSEPzhiDThdjHvGgb8w6iqECSByuXC7+rXLrypXtbQxSi6QtglCtVpcyxWGlyIu1PAxfUcQ1q1X/PdsoIGAyQhRVrdaRVyXC7U7zBxNL9fCdMBv0GeFiUA3j6dvHDDhHnAz1MkYhj3K5HMBElleelUr7Ndi7LgtauLdEVtcTcz3pRaWjKpwoyrCNO13hm9MGa0F0dFStg1A3HOAsdTGU2AzXl27cuTN0dwiKPRbSF3LSO4DqxnSBAPWfCWBUoOReKT0bM5FljBTqNGkK11fTabca6oGCVlQUuVaw6hFDtpk1GKxTNW4DpmWb7EDGp/Oh4urq5j0fLVhYk8l5EULRbKb6+vqO7WPMFBWIIS/+g3xbWVm+XXkwUSZ1pqowYhGwKgbH60QxOhni1Q1ZUWprs1ZQvNB8bpietrbFD7zNFv0QdbuLajSdTqzes9CQK+GgxWO5ex4UcbogHVGXOQaeRRYWFkCT5eVvlcitAPiwbtewFrzbDdXrmXR6eHIyqn7gCaX2o2BFkN7m7D+SzDW0aSiK45FuDWmdHQqVOZewKl1bUeNji0StD6gIWoVZHT6YhZmIUKXthhqEboUimhU2NIjPVqrRbDo/zA9LFNRaq1OjQ8ZoUyxSRSxTCzLQgQw8mSf5khDuj/85Nzc5/2szhl2umWEFfpFZvygO5btFIc+ywUDACUosnJ+mV9DEThPR4KDw9aBkxSoklUrFyhPl8kTm+ciVNbV1c5YufazP45VA4Xl+kyzIP5RIVffZS8CAL2PBCOlyfxhVlq2NSJosD/FD4FnwP1k1aFnZFrQ4BwkTzN4GDCNMWDtmajdht5BKJXa+HIuBmstXX8FX6+WaOueCbfVtL5emi8K9/AWoy7Aid49+DsOLjphJMhwOG83xL/LiKk0Z2nTwgsgKm1iWFbigBIY+mJ04Sq9AiUX4etxAGDDM0I6BkmQydv9+qhy7fPV9Lbjy9Ls5tGcv5QwW02peVFk+H2HFqgdxlxlxIW6StIEU48DMkUPDrBgRI7xYPJpXBZblnAEuUF8P2YKFhECbcZwgDAaiGjMYkGQyNcuBGfZidW1tHaxxFOagBtMBSSqqebA9eFEc/QxDQ7JcJEBmKR++VnUrP2RNkgX+gsBrqqpybW2BIGw/OGgCRWkcNcAMNuhRjaT6k8l+nXT5z8jCYzUe796Td691nbY70+mixAqsoF3//DGEdEBBYGwy7ja63Uazy0zODMuRiMyKAg8PSdFeH5Se4yyNzbB+0QS9mzbhuAHSBScCKmYxqYlzO0ZWH6hbAI1SK+w1+TnOJ0kqOx3K2WxmsxnZDgwjKHHrouBGx8DrUVHUNJ5VoxCq9MTCWRiv5+nefXYaXYWbCLQdpSFZGzEk2a8HQMqZkSXr59Sfbu2BhmxuU1s6rRYlfjobisdJUk+W2WZ0zzLgAniQtNANTeZ1jKCqrCQxTK/Vus3i8VJ2B1SGWIWaaKqJOoYh/f2ppA6JlV9kqq/sPrz14cMN0LxuqXuZVlmh81suFBqIFxIJN2l069nSC99h7EsUCok+W/i7rMkaz7OzISm+Zq/H6rFSvVt2OVAUCo877NCPIpVkSldSicWen6neX2OYd+7ZxT1dXTcb/c2KT5kecyGusVyoUJqcHI+T40Bxk26y9HuykIj3QWP/ndEUoPBy1BfkOOaJ1ep1dHrs/pM1KEGbCMzRUgOQ1H2oiA4pZ15U41jLOhDS09N6CncGBVX6lkO2u8amwBwqlCDGS+PjpUk4PmWzY2EbtKq5Lxqv/VR4tpNp9vn8bxth/7HBgVOUfxftaMApO/w7NiHA+C+lPDFye8ktbO7muW96jvec9tNOSfFn9aUkQWan/mb7CjC+DiglwPb6NZXN2Yw2s+sbw2qsIgV9fuYkWDi+aOPhww0Y7QCndBtFOQAyvwVJ6iWpVACSybxfsnD1vIeb79zZ03r2LtiXxUEd0mGzkWQWKCQJiEICGFOPBuIJss9s3I5ke6MMo0pSlOH8nRxzzXut64TXjm00DHb6rXaqqWbL8pZ/NZphaGplGMdPDG+te7ZiH9r50qI70G0sWIOmRckm5+C4MoI5R30oBNmHNdTZOZ5jaGC1To6T3uCg7V617HbZaTLciXU25od7RHcLhe5l3qaQQ2s6d2OQi0Z3Ldp6TtHjOaIg78//8zzv874+r0qBBAZcC7t/3lV1B7ZDAT5k9gsT/f3fvkPcKYMSCPiPQ/d+r125fOUzQFy7d2fos2tgl2ywSDbJmffmZ7a+emfr9tQUMTUD57VrsgM89lKPtY/o0KoNhi5ktfFfSH59AEH5RTUXykgbfMEsCjRJWvsPywMD+qH7P94f0p/c+xh+w1+79tnKiV6vhxS22S7Zxi83ua0pwjr1XpJMWsnozNrt22trdCJh0arVcHWo1SMGAyj5/JOboGMBlGzeCn0Q9HkDwSDGyALNWfshuy4B5D7MD/3JncjKFWhCARXe2cBg
u+c65KasVoKYss5zSZKUo9FowmExWDTaPpLknobGmQUgmdWbUFN+/bfg3zDmcr4gbPcKOEtRouX7/tPDMpQR5YLvfhK5//NORGHAO5ABkCvpFmmdmrf209y8laZlwUJR/kQi4XFMcOq+Ptkx1mF4DiAgZfXzm4s3FxYePFyam9v27i6d2XGMlQUrIXNHrsu2yzYwvTIVISR6YCimt11ShBwVyS0iSVjnKfFCFKy04IGqBGnjsWjBptWesTGANFZByueri4q/tndzn875lhrLBSeGyaxfniBP02VlsVL8b0MGoAKDDf1XJMfHxwdACPfCPMmRVgKL1UlhDUoS1AszCt2Tp7XqsY4u6LGNIQ2gNFYbAHmwu5vdngv5MpnseaFQZAWWpeX6IWQrtDtsiDLmlWsDkFKXlbAr3tLrm1aOILa2OEqM4ZilXytQMJHNo6NXjSZNe7t2eqTLAp0DZKPRyABkQVGyu+vzBr2+LECceIxmWVYmxXgE0ngcGI+NlyMfIzblhU0x2OPtXdRFUiTnORErsiynfYGaZBh+tPObn64CBM5O1RbLtOc5JNPIZEANCFl48GDJp5Oyfy5JmQLuxATq4ICiBGczveLSK4Mj5UgZxlYMQfSXkAHXYZGUaY6VRfAVR7NUlGInzev8Rz8EOnWzIEXdMW0YG/MgGcUUyMJNWOezYV/uTy/v40EKVmeEgwNMxOJNCIzehuhdaRekFTBsegSW4HSrpa3TrIjZCy2RThKCwFKC32zmRwOBq4YujVarMYxZxhwO5CyT2cg0FhdXgbO7lPNmf8tJks9dcB5gWpllD/y4vxWP34mcuE4i6fSOawi5bHsb0uDHlXL8AisWRRxP4fbTOs3RMxTFTA6aJSgZXuPLXSOa2ZGOMaLLsYZkzvhMo1pdqsIOL1v1eXO5XCAg8RL4ixVEjLEf4H4RO4xXmvuRyE66XCvfH1gZiKy4SkfOIoYVcHveHcOK1nlGkEXGgK57e71hCVpnGo1GO93heBaigph5PrOxEQhWVxd3fcGsN5sN5SQ+WCg4CwyHsZj/4MBux+tYLKZc8aNKJd/cT1cqlRYGCZVKud2lFi7SXJKmorTADJrNowFUB6cjs0CZ7ph2PDs9oUBGR6VwAJQAwzfny+ay0vl5we8sOEWWZQpOv92d+hvH8BjcGBg8tZxYDMNL7pQ7FcdwcJkVahBBW4r+N9B1KdypM8wCY7YD9iWOiWkHYjYzvLQePqsuLYWDPigqWV/ouH4uOWO8hB3gnosiHnNX3PFUPv83DuNDBPA4IFL5UiqfKrkhMU6tgoWkpmgKvMW/rDMavaYRYLRr1ACZniBACWMeXR/8SKouhtBANRT05uY2T2Pn4RbP8347xtRxrHjgdldK+XQln4+XSuCfPfc+WMTddIMHL4rMqyT7hEwzfhQ1+zvDnZ1GowmUzLZ3EGqOcxAkIpgHzSgqrQer0BIOhEJeb/bY6TwOHRcKBdbPshQr4AVnCYLhSkf2Ins7f+xFajv7e5H9Wjzewi+KxboogrdkGSBAWb8aftloMqnaYZuihpNEYoIgkEl0chDtXJeCErre2RnS+ea2jefHwKryvB3HcZbFIOKxo2bFVXat1FZqOzVXrba/l95rHrqLuHhab6cpUpYTTJFB16/CwYKq22SabWvTal6Bo2r1UxxHIFSUYRhUCgRQs/kNVAoFje8aex8eb3vDGwEct4PhOISgcNS8U0una+VaJO3a29uvNA/jpzFMFE9BCksQFGowoC+HwzrdW8bu7tlu2Dtq2rXchJokAUJ7PDS4S1pnJoVBNByWjqHdva06vsj5AnZ+GSCAKcRKR0fNZtp1L11Jp9N7laP8Yct5Ua/j9WKdltlJVJhEdSYdnIyBEmgwmgAyotWSBEEokGhClsGV0JVmBAaF7zNiNKpUpuG/vAF+Y2N5GTg4GFaKV+IVmCXgtyYw4q14K3YRi4lFkfKzftSJ8uGQoVf1Yu8XqvdVbbPDj0LhUlthmbcmrUi0a/q2JzHICAkLnUAtI5Yuo2rYpHp415STspnl5WU7D1rckKz5eL4E+XUUz+f34TWkmZOJYX7MDlkYCIQ7vaHwrd7uL1Svtb/f1t72aJva8nRPsi/Z82oScTjW1jwO8BkdjXosFlBiUkHnfvPG8LZ3m8+cKRiwFMxteKTy7jxk8X7erUx1N++225cVtRu+oC7kfevWux9+qWprG36zrf2ZHm0HbFi0Pcnkq8jE2gwtOzy36SgV9chEAiDdw93Dqu2Hx5tStZGxL8MwcCscOwys0MAU5vK/drZ8luGrPshKb++tD69/ePf6h23XX+p5RtPToX2pJ9kDN/LCE1tPkU89BRtf5Q9UVrKv5/nHn3n8+ut3b7x53bTpDYfCqGLOf+3ceV5Xnv83NHyuC4fhM97NzVt3byhdzF++/u516JQ+Cn+xeTT56iNPwuORfwDmIxlqcXq9nQAAAABJRU5ErkJggg==";
});
canvas {width: 100px; height: 100px}
div {text-align:center; width:120px; float:left}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<div>
<canvas id="canvOrigMinus100" width="100" height="100"></canvas>
-98%
</div>
<div>
<canvas id="canvOrigMinus50" width="100" height="100"></canvas>
-50%
</div>
<div>
<canvas id="canvOrig" width="100" height="100"></canvas>
Original
</div>
<div>
<canvas id="canvOrigPlus50" width="100" height="100"></canvas>
+50%
</div>
<div>
<canvas id="canvOrigPlus100" width="100" height="100"></canvas>
+98%
</div>
<hr/>
<div style="clear:left">
<canvas id="canvRoundMinus90" width="100" height="100"></canvas>
Round-trip <br/> (-98%, +98%)
</div>
<div>
<canvas id="canvRoundMinus50" width="100" height="100"></canvas>
Round-trip <br/> (-50%, +50%)
</div>
<div>
<canvas id="canvRound0" width="100" height="100"></canvas>
Round-trip <br/> (0% 100x)
</div>
<div>
<canvas id="canvRoundPlus50" width="100" height="100"></canvas>
Round-trip <br/> (+50%, -50%)
</div>
<div>
<canvas id="canvRoundPlus90" width="100" height="100"></canvas>
Round-trip <br/> (+98%, -98%)
</div>
Explanation
(Disclaimer - I am not an image specialist or a mathematician. I am trying to provide a common-sense explanation with minimal technical details. Some hand-waving below, e.g. 255=256 to avoid indexing issues, and 127.5=128, for simplifying the numbers.)
Since, for a given pixel, the possible number of non-zero values for a color channel is 255, the "no-contrast" average value of a pixel is 128 (or 127, or 127.5 if you want to argue, but the difference is negligible). For purposes of this explanation, the amount of "contrast" is the distance from the current value to the average value (128). Adjusting the contrast means increasing or decreasing the difference between the current value and the average value.
The problem the algorithm solves then is to:
Choose a constant factor to adjust contrast by
For each color channel of each pixel, scale "contrast" (distance from average) by that constant factor
Or, as hinted at in the CSS spec, simply choosing the slope and intercept of a line:
<feFuncR type="linear" slope="[amount]" intercept="-(0.5 * [amount]) + 0.5"/>
Note the term type='linear'; we are doing linear contrast adjustment in RGB color space, as opposed to a quadratic scaling function, luminance-based adjustment, or histogram matching.
If you recall from geometry class, the formula for a line is y=mx+b. y is the final value we are after, the slope m is the contrast (or factor), x is the initial pixel value, and b is the intercept of the y-axis (x=0), which shifts the line vertically. Recall also that since the y-intercept is not at the origin (0,0), the formula can also be represented as y=m(x-a)+b, where a is the x-offset shifting the line horizontally.
For our purposes, this graph represents the input value (x-axis) and the result (y-axis). We already know that b, the y-intercept (for m=0, no contrast) must be 128 (which we can check against the 0.5 from the spec - 0.5 * the full range of 256 = 128). x is our original value, so all we need is to figure out the slope m and x-offset a.
First, the slope m is "rise over run", or (y2-y1)/(x2-x1) - so we need 2 points known to be on the desired line. Finding these points requires bringing a few things together:
Our function takes the shape of a line-intercept graph
The y-intercept is at b = 128 - regardless of the slope (contrast).
The maximum expected 'y' value is 255, and the minimum is 0
The range of possible 'x' values is 256
A neutral value should always stay neutral: 128 => 128 regardless of slope
A contrast adjustment of 0 should result in no change between input and output; that is, a 1:1 slope.
Taking all these together, we can deduce that regardless of the contrast (slope) applied, our resulting line will be centered at (and pivot around) 128,128. Since our y-intercept is non-zero, the x-intercept is also non-zero; we know the x-range is 256 wide and is centered in the middle, so it must be offset by half of the possible range: 256 / 2 = 128.
So now for y=m(x-a)+b, we know everything except m. Recall two more important points from geometry class:
Lines have the same slope even if their location changes; that is, m stays the same regardless of the values of a and b.
The slope of a line can be found using any 2 points on the line
To simplify the slope discussion, let's move the coordinate origin to the x-intercept (-128) and ignore a and b for a moment. Our original line will now pivot through (0,0), and we know a second point on the line lies the full range of both x (input) and y (output) away, at (255,255).
We'll let the new line pivot at (0,0), so we can use that as one of the points on the new line that will follow our final contrast slope m. The second point can be determined by moving the current end at (255,255) by some amount; since we are limited to a single input (contrast) and using a linear function, this second point will be moved equally in the x and y directions on our graph.
The (x,y) coordinates of the 4 possible new points will be 255 +/- contrast. Since increasing or decreasing both x and y would keep us on the original 1:1 line, let's just look at +x, -y and -x, +y as shown.
The steeper line (-x, +y) is associated with a positive contrast adjustment; its (x,y) coordinates are (255 - contrast, 255 + contrast). The coordinates of the shallower line (negative contrast) are found the same way. Notice that the biggest meaningful value of contrast will be 255 - the most that the initial point of (255,255) can be translated before resulting in a vertical line (full contrast, all black or white) or a horizontal line (no contrast, all gray).
So now we have the coordinates of two points on our new line - (0,0) and (255 - contrast,255 + contrast). We plug this into the slope equation, and then plug that into the full line equation, using all the parts from before:
y = m(x-a) + b
m = (y2-y1)/(x2-x1) =>
((255 + contrast) - 0)/((255 - contrast) - 0) =>
(255 + contrast)/(255 - contrast)
a = 128
b = 128
y = (255 + contrast)/(255 - contrast) * (x - 128) + 128 QED
The math-minded will notice that the resulting m or factor is a scalar (unitless) value; you can use any range you want for contrast as long as it matches the constant (255) in the factor calculation. For example, with a contrast range of +/-100, factor = (100 + contrast)/(100.01 - contrast), which is what I actually use to eliminate the step of scaling to 255; I just left 255 in the code at the top to simplify the explanation.
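As a quick sanity check of that claim (a sketch; the particular contrast value is arbitrary), the two formulas produce essentially the same slope for equivalent inputs:

var pct = 50;                                               // +50% contrast
var factor255 = (255 + pct * 2.55) / (255.01 - pct * 2.55); // percent scaled up to the 255 range
var factor100 = (100 + pct) / (100.01 - pct);               // percent used directly
console.log(factor255.toFixed(4), factor100.toFixed(4));    // 2.9998 vs 2.9994 - effectively the same slope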
Note about the "magic" 259
The source article uses a "magic" 259, although the author admits he doesn't remember why:
"I can’t remember if I had calculated this myself or if I’ve read it in a book or online.".
259 should really be 255 or perhaps 256 - the number of possible non-zero values for each channel of each pixel. Note that in the original factor calculation, 259/255 cancels out - technically 1.01, but final values are whole integers so 1 for all practical purposes. So this outer term can be discarded. Actually using 255 for the constant in the denominator, though, introduces the possibility of a Divide By Zero error in the formula; adjusting to a slightly larger value (say, 259) avoids this issue without introducing significant error to the results. I chose to use 255.01 instead as the error is lower and it (hopefully) seems less "magic" to a newcomer.
As far as I can tell though, it doesn't make much difference which you use - you get identical values except for minor, symmetric differences in a narrow band of low contrast values with a low positive contrast increase. I'd be curious to round-trip both versions repeatedly and compare to the original data, but this answer already took way too long. :)
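For the curious, a small sketch comparing the two denominators (the contrast value 128 is an arbitrary example):

var c = 128; // roughly a +50% adjustment on the 0..255 scale
var factor259 = (259 * (c + 255)) / (255 * (259 - c)); // the source article's version
var factor255 = (255 + c) / (255.01 - c);              // the version used above
console.log(factor259.toFixed(3), factor255.toFixed(3)); // ~2.970 vs ~3.015; after rounding and clamping, the pixel results differ only slightly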
After trying the answer by Schahriar SaffarShargh, I found it wasn't behaving the way contrast should behave. I finally came across this algorithm, and it works like a charm!
For additional information about the algorithm, read this article and its comments section.
function contrastImage(imageData, contrast) {
    var data = imageData.data;
    var factor = (259 * (contrast + 255)) / (255 * (259 - contrast));
    for (var i = 0; i < data.length; i += 4) {
        data[i] = factor * (data[i] - 128) + 128;
        data[i+1] = factor * (data[i+1] - 128) + 128;
        data[i+2] = factor * (data[i+2] - 128) + 128;
    }
    return imageData;
}
Usage:
var newImageData = contrastImage(imageData, 30);
Hopefully this will be a time-saver for someone. Cheers!
This JavaScript implementation complies with the SVG/CSS3 definition of "contrast" (the following code will render your canvas image identically to the CSS filter):
/* contrast filter function */
// See definition at https://drafts.fxtf.org/filters/#contrastEquivalent
// pixels come from your getImageData() function call on your canvas image
contrast = function(pixels, value) {
    var d = pixels.data;
    var intercept = 255 * (-value / 2 + 0.5);
    for (var i = 0; i < d.length; i += 4) {
        d[i] = d[i] * value + intercept;
        d[i+1] = d[i+1] * value + intercept;
        d[i+2] = d[i+2] * value + intercept;
        // implement clamping in a separate function if using in production
        if (d[i] > 255) d[i] = 255;
        if (d[i+1] > 255) d[i+1] = 255;
        if (d[i+2] > 255) d[i+2] = 255;
        if (d[i] < 0) d[i] = 0;
        if (d[i+1] < 0) d[i+1] = 0;
        if (d[i+2] < 0) d[i+2] = 0;
    }
    return pixels;
}
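A minimal usage sketch (the canvas id is an assumption); the value follows the CSS convention, where 1 leaves the image unchanged, 0.5 halves the contrast and 2 doubles it:

var ctx = document.getElementById('srcCanvas').getContext('2d'); // hypothetical canvas id
var pixels = ctx.getImageData(0, 0, ctx.canvas.width, ctx.canvas.height);
contrast(pixels, 1.5);           // same scale as the CSS filter contrast(1.5)
ctx.putImageData(pixels, 0, 0);

In browsers that support canvas filters you can get the same result without touching pixel data by setting ctx.filter = "contrast(150%)" before drawing.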
I found out that you have to apply the effect by separating the darks and lights: technically, any pixel whose RGB average ((R + G + B) / 3) is less than 127 counts as a dark and anything above 127 counts as a light. Then, for your chosen contrast level (say 10), you subtract that value from the darks and add the same value to the lights!
Here is an example:
I have two pixels with RGB colors, [105,40,200] | [255,200,150]
So I know that for my first pixel 105 + 40 + 200 = 345, and 345 / 3 = 115.
115 is less than half of 255 (about 127), so I consider the pixel closer to [0,0,0]; therefore, if I want to remove 10 contrast, I take part of that 10 away from each color in proportion to its share of the average.
That is, I divide each color's value by the pixel's average (115 in this case), multiply it by my contrast value, and subtract the result from that specific color.
For example, take the 105 (red) from my pixel: I divide it by the average of 115 and multiply by my contrast value of 10, (105/115)*10, which gives roughly 9 (you have to round it), and then take that 9 away from 105, so the red becomes 96 after applying 10 contrast to a dark pixel.
So carrying on, my pixel's values become [96,37,183]! (Note: the scale of contrast is up to you, but in the end you should map it to some range like 1 to 255.)
For the lighter pixels I do the same, except instead of subtracting the contrast value I add it, and if you reach the limit of 255 or 0 you stop adding or subtracting for that specific color. So my second, lighter pixel becomes [255,210,157].
As you add more contrast it will lighten the lighter colors and darken the darker and therefore adds contrast to your picture!
Here is some sample JavaScript code (I haven't tried it yet):
var data = imageData.data;
var contrast = 10; // contrast amount to add to lights / remove from darks
for (var i = 0; i < data.length; i += 4) {
    var average = Math.round((data[i] + data[i+1] + data[i+2]) / 3);
    if (average > 127) {
        // light pixel: push each channel toward white, weighted by its share of the average
        data[i] += (data[i] / average) * contrast;
        data[i+1] += (data[i+1] / average) * contrast;
        data[i+2] += (data[i+2] / average) * contrast;
    } else {
        // dark pixel: push each channel toward black
        data[i] -= (data[i] / average) * contrast;
        data[i+1] -= (data[i+1] / average) * contrast;
        data[i+2] -= (data[i+2] / average) * contrast;
    }
}
You can take a look at the OpenCV docs to see how you could accomplish this: Brightness and contrast adjustments.
Then there's the demo code:
double alpha; // Simple contrast control: value [1.0-3.0]
int beta;     // Simple brightness control: value [0-100]
for (int y = 0; y < image.rows; y++) {
    for (int x = 0; x < image.cols; x++) {
        for (int c = 0; c < 3; c++) {
            new_image.at<Vec3b>(y, x)[c] = saturate_cast<uchar>(alpha * (image.at<Vec3b>(y, x)[c]) + beta);
        }
    }
}
which I imagine you are capable of translating to JavaScript.
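For reference, a rough JavaScript equivalent of that loop is sketched below (my own translation, assuming the pixels come from getImageData); alpha and beta are the same knobs as in the OpenCV sample:

// Sketch of the OpenCV-style alpha/beta adjustment on canvas pixel data.
// alpha: contrast gain (e.g. 1.0-3.0), beta: brightness offset (e.g. 0-100).
function alphaBeta(imageData, alpha, beta) {
    var data = imageData.data;
    for (var i = 0; i < data.length; i += 4) {
        data[i] = data[i] * alpha + beta;     // red
        data[i+1] = data[i+1] * alpha + beta; // green
        data[i+2] = data[i+2] * alpha + beta; // blue
        // Uint8ClampedArray clamps to 0..255, mirroring saturate_cast<uchar>
    }
    return imageData;
}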
By vintaging I assume you're trying to apply LUTs. Recently I have been trying to add color treatments to canvas windows. If you want to actually apply LUTs to the canvas, I believe you need to map the array that imageData returns to the RGB array of the LUT.
(From Light illusion)
As an example the start of a 1D LUT could look something like this:
Note: strictly speaking this is 3x 1D LUTs, as each colour (R,G,B) is a 1D LUT
R, G, B
3, 0, 0
5, 2, 1
7, 5, 3
9, 9, 9
Which means that:
For an input value of 0 for R, G, and B, the output is R=3, G=0, B=0
For an input value of 1 for R, G, and B, the output is R=5, G=2, B=1
For an input value of 2 for R, G, and B, the output is R=7, G=5, B=3
For an input value of 3 for R, G, and B, the output is R=9, G=9, B=9
Which is a weird LUT, but you see that for a given value of R, G, or B input, there is a given value of R, G, and B output.
So, if a pixel had an input value of 3, 1, 0 for RGB, the output pixel would be 9, 2, 0.
During this I also realized, after playing with imageData, that it returns a Uint8ClampedArray whose values are plain decimal integers (0-255). Most 3D LUTs are specified in hex, so you first have to do some kind of hex-to-decimal conversion on the entire array before all this mapping.
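To make the mapping step concrete, here is a rough sketch of my own (not from Light Illusion) of applying three 1D LUTs to canvas pixel data; lutR, lutG and lutB are assumed to already be 256-entry arrays of 0-255 integers:

// Hypothetical helper: the LUT arrays are assumptions, one table per channel.
function applyLut(imageData, lutR, lutG, lutB) {
    var data = imageData.data;
    for (var i = 0; i < data.length; i += 4) {
        data[i] = lutR[data[i]];     // map red through its LUT
        data[i+1] = lutG[data[i+1]]; // map green
        data[i+2] = lutB[data[i+2]]; // map blue
    }
    return imageData;
}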
This is the formula you are looking for ...
var data = imageData.data;
if (contrast > 0) {
    for (var i = 0; i < data.length; i += 4) {
        data[i] += (255 - data[i]) * contrast / 255;     // red
        data[i+1] += (255 - data[i+1]) * contrast / 255; // green
        data[i+2] += (255 - data[i+2]) * contrast / 255; // blue
    }
} else if (contrast < 0) {
    for (var i = 0; i < data.length; i += 4) {
        data[i] += data[i] * contrast / 255;     // red
        data[i+1] += data[i+1] * contrast / 255; // green
        data[i+2] += data[i+2] * contrast / 255; // blue
    }
}
Hope it helps!
I have an array colors:
const colors = [
[0, 10, 56],
[40, 233, 247],
[50, 199, 70],
[255, 0, 0],
...
];
and a function reduceColor:
const reduceColor = pix => {
    let lowdist = Number.MAX_SAFE_INTEGER;
    let closestColor;
    for (let color of colors) {
        const dist = Math.abs(pix[0] - color[0]) +
                     Math.abs(pix[1] - color[1]) +
                     Math.abs(pix[2] - color[2]);
        if (dist < lowdist) {
            lowdist = dist;
            closestColor = color;
        }
    }
    return closestColor;
}
console.log(reduceColor([240, 10, 30]));
// [255, 0, 0]
Which works fine, but is slow in the context of a whole image.
Is there a way to, when supplied an array, check another array (made up of subarrays) for the closest subarray, without having to iterate and check over every subarray?
I don't think there is a better solution; you still have to check every color, but you could use Typed Arrays, which usually have faster implementations if you make good use of their built-in methods. Browsers typically use these typed arrays in audio processing or canvas rendering.
Definitely avoid plain JS Arrays, because they are not contiguous in memory; heavy operations on them can be very slow. Typed arrays are contiguous instead.
Also avoid Math methods, because they go through the abstraction of JS interpreter objects, so Math.abs can be quite slow (it has to handle big numbers, floating points, strings, null, NaN, etc.). Plain numerical operations can be faster in these cases.
For the solution, you could use a Uint32Array (MDN), by reinterpreting the colors as a 32-bit number. In hexadecimal we represent a byte with 2 hex digits, so 32-bit is 8 digits. We will store them in the array as:
-- RR GG BB
0x 00 00 00 00
function makeColor(r, g, b) {
    // the 32-bit color will be "0x00RRGGBB" in hex
    return (r << 16) + (g << 8) + b;
}
function parseColor(color) {
    // isolating the corresponding bits of each color channel
    return [(color & 0xff0000) >> 16, (color & 0xff00) >> 8, color & 0xff];
}
function makeColorArray(colors) {
    // creating the Uint32Array from the colors array
    return new Uint32Array(colors.map(c => makeColor(...c)));
}
function distanceRGB(p1, p2) {
let r1 = (p1 & 0xff0000) >> 16, // extract red p1
g1 = (p1 & 0xff00) >> 8, // extract green p1
b1 = p1 & 0xff, // extract blue p1
r2 = (p2 & 0xff0000) >> 16, // extract red p2
g2 = (p2 & 0xff00) >> 8, // extract green p2
b2 = p2 & 0xff, // extract blue p2
rd = r1 > r2 ? (r1 - r2) : (r2 - r1), // compute dist r
gd = g1 > g2 ? (g1 - g2) : (g2 - g1), // compute dist g
bd = b1 > b2 ? (b1 - b2) : (b2 - b1); // compute dist b
return rd + gd + bd; // sum them up
}
// raising the threshold can speed up the process
function findNearest(colors, distance, threshold) {
    return function(pixel) {
        let bestDist = 765, best = 0, curr; // 765 is the max distance
        for (const c of colors) {
            curr = distance(pixel, c);
            if (curr <= threshold) { return c; }
            else if (curr < bestDist) {
                best = c;
                bestDist = curr;
            }
        }
        return best;
    };
}
const colors = makeColorArray([
[0, 10, 56],
[40, 233, 247],
[50, 199, 70],
[255, 0, 0],
//... other colors
]);
const image = makeColorArray([
[240, 10, 30]
//... other pixels
])
const nearest = Array.from(image.map(findNearest(colors, distanceRGB, 0))).map(parseColor)
// ===== TEST =====
function randomColor() {
return [ Math.floor(Math.random() * 255), Math.floor(Math.random() * 255), Math.floor(Math.random() * 255) ];
}
testImages = [];
for (let i = 0; i < 1000000; ++i) { testImages.push(randomColor()); }
testImages = makeColorArray(testImages)
testColors = [];
for (let i = 0; i < 1000; ++i) { testColors.push(randomColor()); }
testColors = makeColorArray(testColors);
// START
let testNow = Date.now();
Array.from(testImages.map(findNearest(testColors, distanceRGB, 0))).map(parseColor)
console.log(Date.now() - testNow)
I've tested it over a million random pixels (1000x1000) with 4 colors and it takes less than a second (~250-450ms); with a thousand test colors it takes 12-15 seconds. The same experiment with a normal Array took 4-6 seconds with just 4 colors; I did not try the thousand-colors test (all on my PC, obviously).
Consider passing heavy work to a Worker (MDN) to avoid UI freezing.
I don't know if this is enough to be run on a bigger image. I'm sure it can be further optimized (maybe through Gray code hamming distance), but it is a good starting point anyway.
You might also be interested in a way to extract pixel values from images; just check ImageData (MDN) and how to retrieve it (Stack Overflow) from an <img> or <canvas> element.
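If it helps, here is a rough sketch of my own (an assumption about how you might wire real pixels in, not part of the answer above) repacking the RGBA bytes from getImageData into the 0x00RRGGBB numbers that makeColor and findNearest expect:

function imageDataToPacked(imageData) {
    var data = imageData.data;                   // RGBA bytes from getImageData
    var out = new Uint32Array(data.length / 4);
    for (var i = 0, p = 0; i < data.length; i += 4, p++) {
        out[p] = (data[i] << 16) + (data[i+1] << 8) + data[i+2]; // alpha is dropped
    }
    return out;
}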
So I have a 2D array created like this:
//Fill screen as blank
for (var x = 0; x < 500; x++) {
    screen[x] = [];
    for (var y = 0; y < 500; y++) {
        screen[x][y] = '#ffffff';
    }
}
And was wondering if there's an easy way to convert it to an ImageData object so I can display it on a canvas?
Flattening arrays
The first thing you'll have to learn is how to flatten a 2d array. You can use a nested loop and push to a new 1d array, but I prefer to use reduce and concat:
const concat = (xs, ys) => xs.concat(ys);
console.log(
[[1,2,3],[4,5,6]].reduce(concat)
)
Now you'll notice quickly enough that your matrix will be flipped. ImageData concatenates row by row, but your matrix is grouped by column (i.e. [x][y] instead of [y][x]). My advice is to flip your nested loop around :)
From "#ffffff" to [255, 255, 255, 255]
You now have the tool to create a 1d-array of hex codes (screen.reduce(concat)), but ImageData takes an Uint8ClampedArray of 0-255 values! Let's fix this:
const hexToRGBA = hexStr => [
parseInt(hexStr.substr(1, 2), 16),
parseInt(hexStr.substr(3, 2), 16),
parseInt(hexStr.substr(5, 2), 16),
255
];
console.log(
hexToRGBA("#ffffff")
);
Notice that I skip the first "#" char and hard-code the alpha value to 255.
Converting from hex to RGBA
We'll use map to convert the newly created 1d array at once:
screen.reduce(concat).map(hexToRGBA);
2d again?!
Back to square one... We're again stuck with an array of arrays:
[ [255, 255, 255, 255], [255, 255, 255, 255], /* ... */ ]
But wait... we already know how to fix this:
const flattenedRGBAValues = screen
.reduce(concat) // 1d list of hex codes
.map(hexToRGBA) // 1d list of [R, G, B, A] byte arrays
.reduce(concat); // 1d list of bytes
Putting the data on the canvas
This is the part that was linked to in the comments, but I'll include it so you can have a working example!
const hexPixels = [
["#ffffff", "#000000"],
["#000000", "#ffffff"]
];
const concat = (xs, ys) => xs.concat(ys);
const hexToRGBA = hexStr => [
parseInt(hexStr.substr(1, 2), 16),
parseInt(hexStr.substr(3, 2), 16),
parseInt(hexStr.substr(5, 2), 16),
255
];
const flattenedRGBAValues = hexPixels
.reduce(concat) // 1d list of hex codes
.map(hexToRGBA) // 1d list of [R, G, B, A] byte arrays
.reduce(concat); // 1d list of bytes
// Render on screen for demo
const cvs = document.createElement("canvas");
cvs.width = cvs.height = 2;
const ctx = cvs.getContext("2d");
const imgData = new ImageData(Uint8ClampedArray.from(flattenedRGBAValues), 2, 2);
ctx.putImageData(imgData, 0, 0);
document.body.appendChild(cvs);
canvas { width: 128px; height: 128px; image-rendering: pixelated; }
I suspect your example code is just that, an example, but just in case it isn't, there are easier ways to fill an area with a single color:
ctx.fillStyle = "#fff";
ctx.fillRect(0, 0, 500, 500);
But back to flattening the array. If performance is a factor you can do it in for example the following way:
(Side note: if possible, store the color information directly in the same type and byte order you intend to use - converting from string to number can be relatively costly when you deal with tens of thousands of pixels, and binary/numeric storage is also cheaper.)
Simply unwind/flatten the 2D array directly to a typed array:
var width = 500, height = 500;
var data32 = new Uint32Array(width * height); // create Uint32 view + underlying ArrayBuffer
for(var x, y = 0, p = 0; y < height; y++) { // outer loop represents rows
for(x = 0; x < width; x++) { // inner loop represents columns
data32[p++] = str2uint32(array[x][y]); // p = position in the 1D typed array
}
}
We also need to convert the string notation of the color to a number in little-endian order (format used by most consumer CPUs these days). Shift, AND and OR operations are multiple times faster than working on string parts, but if you can avoid strings at all that would be the ideal approach:
// str must be "#RRGGBB" with no alpha.
function str2uint32(str) {
var n = ("0x" + str.substr(1))|0; // to integer number
return 0xff000000 | // alpha (first byte)
(n << 16) | // blue (from last to second byte)
(n & 0xff00) | // green (keep position but mask)
(n >>> 16) // red (from first to last byte)
}
Here we first convert the string to a number - the |0 coerces it right away to an integer value, which also tells the compiler we intend to use the number as an integer in the following conversion.
Since we're most likely on a little-endian platform we have to shift, mask and OR around bytes to get the resulting number in the correct byte order (i.e. 0xAABBGGRR) and OR in an alpha channel as opaque (on a big-endian platform you would simply left-shift the entire value over 8 bits and OR in an alpha channel at the end).
Then finally create an ImageData object using the underlying ArrayBuffer we just filled and give it the Uint8ClampedArray view which ImageData requires (this has almost no overhead since the underlying ArrayBuffer is shared):
var idata = new ImageData(new Uint8ClampedArray(data32.buffer), width, height);
From here you can use context.putImageData(idata, x, y).
Example
Here filling with an orange color to make the conversion visible (if you get a different color than orange then you're on a big-endian platform :) ):
var width = 500, height = 500;
var data32 = new Uint32Array(width * height);
var screen = [];
// Fill with orange color
for(var x = 0; x < width; x++ ){
screen[x] = [];
for(var y = 0; y < height; y++ ){
screen[x][y] = "#ff7700"; // orange to check final byte-order
}
}
// Flatten array
for(var x, y = 0, p = 0; y < height; y++){
for(x = 0; x < width; x++) {
data32[p++] = str2uint32(screen[x][y]);
}
}
function str2uint32(str) {
var n = ("0x" + str.substr(1))|0;
return 0xff000000 | (n << 16) | (n & 0xff00) | (n >>> 16)
}
var idata = new ImageData(new Uint8ClampedArray(data32.buffer), width, height);
c.getContext("2d").putImageData(idata, 0, 0);
<canvas id=c width=500 height=500></canvas>
I wrote a JS script which performs various operations (e.g. summing a photo with a constant, square root, moving, applying filters) on pictures in a canvas. But for large images (e.g. 2000x2000 pixels), the script freezes/crashes the browser (tested on Firefox), and everything takes a long time.
function get_pixel (x, y, canvas)
{
var ctx = canvas.getContext("2d");
var imgData = ctx.getImageData(x, y, 1, 1);
return imgData.data;
}
function set_pixel (x, y, canvas, red, green, blue, alpha)
{
var ctx = canvas.getContext('2d');
var imgData = ctx.getImageData(0, 0, canvas.width, canvas.height),
pxData = imgData.data,
length = pxData.length;
var i = (x + y * canvas.width) * 4;
pxData[i] = red;
pxData[i + 1] = green;
pxData[i + 2] = blue;
pxData[i + 3] = alpha;
ctx.putImageData (imgData, 0, 0);
}
function sum (number, canvas1, canvas2)
{
show_button_normalization (false);
asyncLoop(
{
length : 5,
functionToLoop : function(loop, i){
setTimeout(function(){
asyncLoop(
{
length : 5,
functionToLoop : function(loop, i){
setTimeout(function(){
var pixel1 = get_pixel (i, j, canvas1);
var pixel2;
if (canvas2 != null)
{
pixel2 = get_pixel (i, j, canvas2);
}
else
{
pixel2 = new Array(4);
pixel2[0] = number;
pixel2[1] = number;
pixel2[2] = number;
pixel2[3] = number;
}
var pixel = new Array(4);
pixel[0] = parseInt (parseInt (pixel1[0]*0.5) + parseInt (pixel2[0]*0.5));
pixel[1] = parseInt (parseInt (pixel1[1]*0.5) + parseInt (pixel2[1]*0.5));
pixel[2] = parseInt (parseInt (pixel1[2]*0.5) + parseInt (pixel2[2]*0.5));
pixel[3] = parseInt (parseInt (pixel1[3]*0.5) + parseInt (pixel2[3]*0.5));
set_pixel (i, j, image1_a, pixel[0], pixel[1], pixel[2], pixel[3]);
loop();
},1000);
},
});
loop();
},1000);
},
});
/*for (var i=0; i<canvas1.width; i++)
{
for (var j=0; j<canvas1.height; j++)
{
var pixel1 = get_pixel (i, j, canvas1);
var pixel2;
if (canvas2 != null)
{
pixel2 = get_pixel (i, j, canvas2);
}
else
{
pixel2 = new Array(4);
pixel2[0] = number;
pixel2[1] = number;
pixel2[2] = number;
pixel2[3] = number;
}
var pixel = new Array(4);
pixel[0] = parseInt (parseInt (pixel1[0]*0.5) + parseInt (pixel2[0]*0.5));
pixel[1] = parseInt (parseInt (pixel1[1]*0.5) + parseInt (pixel2[1]*0.5));
pixel[2] = parseInt (parseInt (pixel1[2]*0.5) + parseInt (pixel2[2]*0.5));
pixel[3] = parseInt (parseInt (pixel1[3]*0.5) + parseInt (pixel2[3]*0.5));
set_pixel (i, j, image1_a, pixel[0], pixel[1], pixel[2], pixel[3]);
}
}*/
}
Is it possible to fix it?
Process pixels together!!
Looking at the code I would say that Firefox crashing and/or taking a long time is not a surprise at all. An image that is 2000 by 2000 pixels has 4 million pixels. I don't know what asyncLoop does but to me it looks like you are using timers to set groups of 5 pixels at a time. This is horrifically inefficient.
Problems with your code
Even looking at the commented code (which I assume is an alternative approach), you are processing the pixels with way too much overhead.
The pixel array you get from your get_pixel function is part of the object that getImageData returns. If you look at the details of getImageData and the returned imageData object, you will see that the array is a typed array of type Uint8ClampedArray.
That means most of the code you use to mix the pixels is redundant, as that work is done by JavaScript automatically when it assigns a number to any typed array.
pixel[0] = parseInt (parseInt (pixel1[0]*0.5) + parseInt (pixel2[0]*0.5));
Will be much quicker if you use
pixel[0] = (pixel1[0] + pixel2[0]) * 0.5; // a * n + b * n is the same as ( a+ b) *n
// with one less multiplication.
Standard simple image processing
But even then using a function call for each pixel adds a massive overhead to the basic operation you are performing. You should fetch all the pixels in one go and process them as two flat arrays.
Your sum function should look more like
function sum(number, canvas1, canvas2) {
    var i, data, ctx, imgData, data1;
    ctx = canvas1.getContext("2d");
    imgData = ctx.getImageData(0, 0, canvas1.width, canvas1.height);
    data = imgData.data; // get the array of pixels
    if (canvas2 === null) {
        i = data.length;
        number *= 0.5; // pre-calculate number
        while (i-- > 0) {
            data[i] = data[i] * 0.5 + number;
        }
    } else {
        if (canvas1.width !== canvas2.width || canvas1.height !== canvas2.height) {
            throw new RangeError("Canvas size miss-match, can not process data as requested.");
        }
        data1 = canvas2.getContext("2d").getImageData(0, 0, canvas2.width, canvas2.height).data;
        i = data.length;
        while (i-- > 0) {
            data[i] = (data[i] + data1[i]) * 0.5;
        }
    }
    ctx.putImageData(imgData, 0, 0); // put the new pixels back onto the canvas
}
Bit math is quicker
You can improve on that if you use a bit of bit manipulation. Using a 32 bit typed array you can divide then add four 8 bit values in parallel (4* approx quicker for pixel calculations).
Note that this method will round down by one value a little more often than it should, i.e. Math.floor((199 + 233) / 2) === 216 is true, while the method below will return 215. This can be corrected for by using the bottom bit of both inputs to add to the result. This completely eliminates the rounding error, but the processing cost in my view is not worth the improvement. I have included the fix as commented code.
Note this method will only work for a / n + b / m where n and m are equal to 2^p and p is an integer from 1 to 7 (in other words only if n and m are 2, 4, 8, 16, 32, 64 or 128), and you must mask out the bottom p bits of a and b.
Example performs C = C * 0.5 + C1 * 0.5 when C and C1 represent each R,G,B,A channel for canvas1 and canvas2
function sum(number, canvas1, canvas2) {
    var i, ctx, imgData, data32, data32A;
    // this number is used to remove the bottom bit of each color channel
    // The bottom bit is redundant as divide by 2 removes it
    const botBitMask = 0b11111110111111101111111011111110;
    // mask for rounding error (not used in this example)
    // const botBitMaskA = 0b00000001000000010000000100000001;
    ctx = canvas1.getContext("2d");
    imgData = ctx.getImageData(0, 0, canvas1.width, canvas1.height);
    // get reference to the 32-bit version of the pixel data
    data32 = new Uint32Array(imgData.data.buffer);
    i = data32.length; // get the length, which is 1/4 the size
    if (canvas2 === null) {
        number >>= 1; // divide by 2
        // fill the 4 channels RGBA
        number = (number << 24) + (number << 16) + (number << 8) + number;
        while (i-- > 0) {
            // Remove the bottom bit of each channel, then divide each channel by 2 using zero-fill right shift (>>>) and add number
            data32[i] = ((data32[i] & botBitMask) >>> 1) + number;
        }
    } else {
        if (canvas1.width !== canvas2.width || canvas1.height !== canvas2.height) {
            throw new RangeError("Canvas size miss-match, can not process data as requested.");
        }
        data32A = new Uint32Array(canvas2.getContext("2d").getImageData(0, 0, canvas2.width, canvas2.height).data.buffer);
        i = data32.length;
        while (i-- > 0) {
            // to fix the rounding error use the following line instead of the second one. Do the same for the loop above but optimise for number
            // data32[i] = (((data32[i] & botBitMask) >>> 1) + ((data32A[i] & botBitMask) >>> 1)) | ((data32[i] & botBitMaskA) | (data32A[i] & botBitMaskA))
            data32[i] = ((data32[i] & botBitMask) >>> 1) + ((data32A[i] & botBitMask) >>> 1);
        }
    }
    ctx.putImageData(imgData, 0, 0); // put the new pixels back onto the canvas
}
With all that you should not have any major problems. Though you will still have the page blocked while the image is being processed (depending on the machine and the image size it may take up to a second or 2)
Other solutions.
If you want to stop the image processing from blocking the page you can use a web worker and just send it the data for processing off the main thread. You can find out how to do that by searching Stack Overflow.
Or use WebGL to process the images.
And you have one final option. The canvas api uses the GPU to do all its rendering and if you understand the way blending and compositing works you can do a surprising amount of maths using the canvas.
For example you can multiply all pixels RGBA channels with a value 0-1 using the following.
// multiplies all channels in source canvas by val and returns the resulting canvas
// returns the can2 the result of each pixel
// R *= val;
// G *= val;
// B *= val;
// A *= val;
function multiplyPixels(val, source) {
    // need two working canvases. I create them here but if you are doing this
    // many times you should create them once and reuse them
    var can1 = document.createElement("canvas");
    var can2 = document.createElement("canvas");
    can1.width = can2.width = source.width;
    can1.height = can2.height = source.height;
    var ctx1 = can1.getContext("2d");
    var ctx2 = can2.getContext("2d");
    var chanMult = Math.round(255 * val);
    // clamp it to 0-255 inclusive
    chanMult = chanMult < 0 ? 0 : chanMult > 255 ? 255 : chanMult;
    ctx1.drawImage(source, 0, 0); // copy the source
    // multiply all RGB pixels by val
    ctx1.fillStyle = "rgba(" + chanMult + "," + chanMult + "," + chanMult + ",1)";
    ctx1.globalCompositeOperation = "multiply";
    ctx1.fillRect(0, 0, source.width, source.height);
    // now multiply the alpha channel by val. Clamp it to 0-1
    ctx2.globalAlpha = val < 0 ? 0 : val > 1 ? 1 : val;
    ctx2.drawImage(can1, 0, 0);
    return can2;
}
There are quite a few composite operations that you can use in combination to do multiplication, addition, subtraction and division. Note though that the accuracy is a little less than 8 bits, as addition and subtraction require weighted values to compensate for the blending's (automatic) multiplication. Also, the alpha channel must be handled separately from the RGB channels using globalAlpha and the compositing operations.
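As an illustration of the same idea for addition (my own sketch, not part of the answer above), the "lighter" composite operation sums the channel values of two canvases, clamping at 255:

// Sketch: per-channel addition of two same-sized canvases via the "lighter" composite op.
function addCanvases(canvasA, canvasB) {
    var out = document.createElement("canvas");
    out.width = canvasA.width;
    out.height = canvasA.height;
    var ctx = out.getContext("2d");
    ctx.drawImage(canvasA, 0, 0);
    ctx.globalCompositeOperation = "lighter"; // adds source and destination channel values, clamped at 255
    ctx.drawImage(canvasB, 0, 0);
    return out;
}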
Realtime
The processing you are doing is very simple, and a 2000 by 2000 pixel image can easily be processed in realtime. WebGL filter is an example of using WebGL to do image processing. Though the filter system is not modular and the code is very old school, it is a good backbone for WebGL filters and offers much higher quality results because it uses floating-point RGBA values.
What I want is to get the HEX or RGB average value from an image and set another div's background to that color.
So if I upload an image with a lot of red I would get something like #FF0000, just as an example.
Let me know if this is possible :)
Many thanks.
First, draw the image on a canvas:
function draw(img) {
    var canvas = document.createElement("canvas");
    var c = canvas.getContext('2d');
    c.width = canvas.width = img.width;
    c.height = canvas.height = img.height;
    c.clearRect(0, 0, c.width, c.height);
    c.drawImage(img, 0, 0, img.width, img.height);
    return c; // returns the context
}
You can now iterate over the image's pixels. A naive approach for color-detection is to simply count the frequency of each color in the image.
// returns a map counting the frequency of each color
// in the image on the canvas
function getColors(c) {
var col, colors = {};
var pixels, r, g, b, a;
r = g = b = a = 0;
pixels = c.getImageData(0, 0, c.width, c.height);
for (var i = 0, data = pixels.data; i < data.length; i += 4) {
r = data[i];
g = data[i + 1];
b = data[i + 2];
a = data[i + 3]; // alpha
// skip pixels >50% transparent
if (a < (255 / 2))
continue;
col = rgbToHex(r, g, b);
if (!colors[col])
colors[col] = 0;
colors[col]++;
}
return colors;
}
function rgbToHex(r, g, b) {
if (r > 255 || g > 255 || b > 255)
throw "Invalid color component";
return ((r << 16) | (g << 8) | b).toString(16);
}
getColors returns a map of color names and counts. Transparent pixels are skipped. It should be trivial to get the most-frequently seen color from this map.
If you literally want an average of each color component, you could easily get that from the results of getColors, too, but the results aren't likely to be very useful. This answer explains a much better approach.
You can use it all like this:
// nicely formats hex values
function pad(hex) {
return ("000000" + hex).slice(-6);
}
// see this example working in the fiddle below
var info = document.getElementById("info");
var img = document.getElementById("squares");
var colors = getColors(draw(img));
for (var hex in colors) {
info.innerHTML += "<li>" + pad(hex) + "->" + colors[hex];
}
See a working example.
Put image on canvas.
Get 2D context.
Loop through pixels, and store each r,g,b value. If you find the same, increment it once.
Loop through stored r,g,b values and take note of largest r,g,b value.
Convert r,g,b to hex.
This is only possible using the canvas tag as described here :
http://dev.opera.com/articles/view/html-5-canvas-the-basics/#pixelbasedmanipulation
Of course this is only available in newer browsers
You might consider using the convolution filters CSS allows you to apply. This might achieve the effect you're going for (assuming you want to present it back in the HTML), so you could display the image twice, one copy convolved.
That being said, this doesn't really work if you need the pixel information yourself for some purpose.
For finding that average color:
Put Image on Canvas
Resize image to 1px by 1px
Get the color of the resulting pixel(This pixel will be the calculated average)
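A minimal sketch of that approach (img is assumed to be an already-loaded image element; the browser's downscaling filter does the averaging, so treat the result as an approximation):

function averageColor(img) {
    var canvas = document.createElement("canvas");
    canvas.width = canvas.height = 1;
    var ctx = canvas.getContext("2d");
    ctx.drawImage(img, 0, 0, 1, 1);            // let the browser downscale the whole image into one pixel
    var p = ctx.getImageData(0, 0, 1, 1).data; // [r, g, b, a]
    return "rgb(" + p[0] + "," + p[1] + "," + p[2] + ")";
}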