Gamma describes the relationship between the input signal level and how bright the displayed image is at that level. The higher the gamma, the darker and higher in contrast the displayed image will be. This turns out to be one of the most important factors in display performance. Unfortunately, the standards for high definition and standard definition television allow a range of gamma settings, which has created significant variation in display setup in the home. In practice, though, I find that a true 2.2 gamma repeatedly offers the most pleasing images.
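The power-law relationship above can be sketched in a few lines of Python. This is an illustrative example of the general gamma formula (output = signal ^ gamma, with both normalized to 0.0–1.0), not code from any particular display or instrument; the function name is my own.

```python
def display_luminance(signal: float, gamma: float) -> float:
    """Relative light output for a normalized input signal (0.0 to 1.0)."""
    return signal ** gamma

# The same 50% input signal at different gammas:
# higher gamma pushes midtones darker, raising apparent contrast.
for g in (1.8, 2.2, 2.4):
    print(f"gamma {g}: 50% input -> {display_luminance(0.5, g):.1%} output")
```

Running this shows a 50% signal landing noticeably darker at gamma 2.4 than at 1.8, which is exactly the darker, higher-contrast look described above.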
I believe this is because current color grading monitors are set up for a 2.2 gamma. The artists who set the look of a film are therefore using this gamma to set the scene contrast, and I find that deviating from their artistic vision is generally a mistake.
A real-world example is the preset gamma of 2.2 on the BVM-L231, Sony's color grading monitor at this time. An excerpt from page 53 of the operating manual showing this is reproduced below. ITU-R BT.709 is the HDTV standard used on Blu-ray, and Sony color grading monitors are among the industry's most prolific.
When you set gamma in your display, however, what appears to be a 2.2, or is labeled as such, may not be. To truly reproduce the 2.2 look, the display must follow the 2.2 curve from a one percent signal level all the way to 100 percent. If it cannot, a compromise must be made, and the best compromise is generally to follow the 2.2 curve as closely as possible from zero to 20 percent. This region matters most because the dark portion of the image controls the apparent contrast and shading of the picture, and most images contain very little information above a 40 percent signal level. Getting it right is tricky, though, since most instruments today will not measure light output accurately at these low levels, and the adaptive black level processing found in some displays will interfere with measurements taken from test patterns.
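To see why instruments struggle in the zero to 20 percent region, it helps to compute the target light levels a true 2.2 curve demands there. This is a hypothetical sketch using the standard power-law formula; the function name and the specific stimulus levels chosen are my own, not from any calibration tool.

```python
def gamma_target(signal_percent: float, gamma: float = 2.2) -> float:
    """Target light output (% of peak white) for an input signal (%)."""
    return 100.0 * (signal_percent / 100.0) ** gamma

# Target output for a true 2.2 curve at low stimulus levels.
for level in (1, 5, 10, 20):
    print(f"{level:>3}% signal -> {gamma_target(level):.4f}% of peak white")
```

The numbers make the measurement problem concrete: a 10% signal should produce well under 1% of peak light, and a 1% signal only a few thousandths of a percent, which is far below what many meters can read reliably.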