There are three common ways to convert colour to grey-scale. It turns out that all three are probably wrong: none of them accounts for the fact that computers store the square root* of the brightness rather than the brightness itself. The reason for this peculiarity is explained in this video: https://www.youtube.com/watch?v=LKnqECcg6Gw

Here I show each of the three grey-scale conversion methods directly above a corrected version of the same method.

The first is the luminosity method, shown at the top of the screen. The formula is 0.21 R + 0.72 G + 0.07 B, which accounts for the fact that green appears brighter to the human eye than red, and red brighter than blue. The corrected method, shown immediately below it, uses sqrt(0.21 R^2 + 0.72 G^2 + 0.07 B^2).

The second is the average method, shown in the middle of the screen. It uses (R + G + B) / 3. The corrected method, shown immediately below it, uses sqrt((R^2 + G^2 + B^2) / 3).

The third is the lightness method, shown at the bottom of the screen. It uses (max(R, G, B) + min(R, G, B)) / 2. The corrected method, shown immediately below it, uses sqrt((max(R, G, B)^2 + min(R, G, B)^2) / 2).
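If it helps to see the arithmetic spelled out, here is a minimal Python sketch (not the actual Scratch blocks used in the project) of the three methods and their corrected counterparts, assuming 8-bit channel values and the simplified gamma of 2 described above:

# Three common grey-scale conversions and their corrected versions,
# assuming the stored channel values are (roughly) the square root of
# the true linear brightness, i.e. gamma = 2.

def luminosity(r, g, b):
    # Naive luminosity: weighted average of the stored (gamma-encoded) values.
    return 0.21 * r + 0.72 * g + 0.07 * b

def luminosity_corrected(r, g, b):
    # Corrected luminosity: weight in linear light, then re-encode.
    return (0.21 * r**2 + 0.72 * g**2 + 0.07 * b**2) ** 0.5

def average(r, g, b):
    # Naive average of the stored values.
    return (r + g + b) / 3

def average_corrected(r, g, b):
    # Corrected average: square, average, then square-root.
    return ((r**2 + g**2 + b**2) / 3) ** 0.5

def lightness(r, g, b):
    # Naive lightness: midpoint of the brightest and darkest channels.
    return (max(r, g, b) + min(r, g, b)) / 2

def lightness_corrected(r, g, b):
    # Corrected lightness: take the midpoint in linear light.
    return ((max(r, g, b)**2 + min(r, g, b)**2) / 2) ** 0.5

if __name__ == "__main__":
    # The corrected results come out brighter, because averaging the
    # stored (square-rooted) values under-weights the bright channels.
    r, g, b = 200, 100, 50
    print("luminosity:", luminosity(r, g, b), luminosity_corrected(r, g, b))
    print("average:   ", average(r, g, b), average_corrected(r, g, b))
    print("lightness: ", lightness(r, g, b), lightness_corrected(r, g, b))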
The video: https://www.youtube.com/watch?v=LKnqECcg6Gw
Probably incorrect colour to grey conversion methods: http://www.johndcook.com/blog/2009/08/24/algorithms-convert-color-grayscale/

*@MartinBraendli2 pointed out that it's not necessarily the square root of the brightness that is stored, but the gamma-th root. Gamma is typically between 1.8 (Mac) and 2.2 (PC); it would be the square root if gamma were exactly 2. I've added a slider that lets you set your gamma. The difference is pretty subtle.

@MartinBraendli2 also suggests that round( ( 0.2*r^γ + 0.52*g^γ + 0.28*b^γ )^(1/γ) ) should theoretically look better for luminosity on a gamma-corrected colour. I think he's right. I've added a Braendli Luminosity switch so that you can compare.
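For comparison, here is a sketch of that gamma-parameterised luminosity, again in Python rather than Scratch. The coefficients are the ones quoted above from @MartinBraendli2; the function name and the example values are just for illustration:

def braendli_luminosity(r, g, b, gamma=2.2):
    # Decode each channel to linear light with the given gamma, take the
    # weighted average, re-encode with the same gamma, and round to an
    # integer grey level. gamma is typically 2.2 (PC) or 1.8 (older Mac).
    linear = 0.2 * r**gamma + 0.52 * g**gamma + 0.28 * b**gamma
    return round(linear ** (1 / gamma))

# Example: the same orange pixel at the two common gamma values.
print(braendli_luminosity(255, 128, 0, gamma=2.2))
print(braendli_luminosity(255, 128, 0, gamma=1.8))

Because the coefficients sum to 1, the result always stays within the 0-255 channel range.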