Gamma really isn't too hard to understand. For the obvious reason of not getting eaten by wolves, your eyes are geared to be much more sensitive to small differences between dark colors than between bright ones (you don't need fine detail when you're staring into the sun looking for sunspots). That sensitivity has been graphed as a swoopy curve. Computer nerds came along and said, 'hey, we can use this property of the human eye to save space on our hard drives and over the internet, use fewer 0s and 1s.' This is gamma encoding: a secret code that spends more of the available numbers on the darks, where your eye actually notices, and fewer on the brights, where it doesn't. The math is just a power curve, roughly encoded = linear^(1/γ), with γ usually around 2.2.

Now you've got machines which work not on swoopy curves but on nice little equal stairsteps. More computer nerds come along and say, hey, no problem, we can decode the secret code (linear = encoded^γ), regain the big swoopy curve, and project it onto the machine's stairsteps. The problem is that not all machines use the same size stairsteps, i.e. every display has its own idea of what γ is. You can fiddle with your monitor to change the size of the stairsteps, making them a better or worse match for the particular nerd equations your operating system uses (the clicky buttons on the side of the monitor). You can also pick different nerd equations to better match your monitor's stairsteps in your color calibration settings (in system tools).

If you're painting a digital picture, you're working with those nice quantized digital stairsteps as you pick colors from 0 to 255 RGB, so it's good to know that your eye perceives things in analog and on a power curve. Really, though, people understand it intuitively anyway: you know when something doesn't 'look right'. It's more complicated than this, but that's my hack job at explaining what is actually going on.
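If you'd rather see it in code, here's a minimal sketch assuming a plain power-law gamma of 2.2 and a single 8-bit channel (real sRGB adds a small linear segment near black, which I'm ignoring here):

```python
# Sketch of gamma encoding/decoding for one 8-bit channel.
# Assumes a simple power-law gamma of 2.2 (sRGB's linear toe near
# black is left out for simplicity).

GAMMA = 2.2

def gamma_encode(linear):
    """Map a linear light value in [0, 1] to an encoded value in [0, 1]."""
    return linear ** (1.0 / GAMMA)

def gamma_decode(encoded):
    """Map an encoded value in [0, 1] back to linear light in [0, 1]."""
    return encoded ** GAMMA

def to_8bit(value):
    """Quantize a [0, 1] value onto the 0-255 'stairsteps'."""
    return round(value * 255)

if __name__ == "__main__":
    # Half of physical light intensity does NOT land on code 128:
    # the curve pushes it up, leaving more codes for the darks.
    print(to_8bit(gamma_encode(0.5)))          # ~186, not 128

    # Count how many of the 256 encoded steps describe the darkest
    # 10% of linear light -- far more than 10% of the codes.
    dark_codes = sum(1 for code in range(256)
                     if gamma_decode(code / 255) <= 0.10)
    print(dark_codes)                          # ~90 of the 256 codes
```

The two printouts are the whole point: the encoding bends the stairsteps so that most of them land in the shadows, which is exactly where your wolf-avoiding eyes want the detail.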