aliEnRIK: And how many people watch their tvs with gamma settings way out...........
There is a lot of leeway (gamma 2.2-2.5) built into the mastering, because they do not know what your viewing environment will be: a darker environment needs a higher gamma to compensate for the dark-surround effect. Since displays track gamma curves, the effect of higher contrast shows only in the darkest parts of the image. To accurately display gamma 2.2 from 5% white up to 100% white only requires 728:1 contrast, to display down to 1% white needs 25,119:1, and to display down to black needs infinite contrast. But the bottom 5% of the greyscale is critical in giving the image solidity and creating the illusion of image depth.
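Those contrast figures fall straight out of the power-law gamma curve: the on/off contrast needed to show a stimulus level at its correct relative luminance is stimulus^-gamma. A quick sketch, using the same percentages quoted above:

```python
# Contrast ratio required to accurately reproduce a given stimulus level
# on a pure power-law gamma curve: contrast = stimulus ** -gamma.
# Gamma 2.2 and the 5%/1% levels are the figures from the post above.

def required_contrast(stimulus: float, gamma: float = 2.2) -> float:
    """On/off contrast needed to show `stimulus` (a 0-1 fraction of
    full white) at its correct relative luminance."""
    return stimulus ** -gamma

for pct in (5, 1):
    print(f"{pct}% white needs ~{required_contrast(pct / 100):,.0f}:1")
# 5% white needs ~728:1, 1% white needs ~25,119:1
```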
As for how many people use the wrong gamma settings, it depends on the display type and viewing conditions. With a CRT you generally had no control over gamma, and depending on who you go by, CRTs had an average gamma of 2.2, 2.35 or 2.5. Flat panels often have a gamma of 2.2, which is fine in a living room, while a projector in a black home cinema looks better with a higher gamma. The choice of gamma setting comes down to this: higher gamma gives more image depth, punch in bright images and more colour saturation, but an overall less bright image; lower gamma gives a brighter image with more visible shadow detail. If your display's image looks natural and consistently gives the illusion of image depth (it looks like you could throw a ball through the screen or shout at the people on it), then the gamma is correct.
Display image quality can be measured by manufacturers, but they do not bother. I assume this is because it would be time consuming, and because a lack of consumer comprehension of what on earth the graphs indicated might render it a pointless exercise. So instead they use large numbers to impress, and simple ideas like "large contrast good, low contrast bad", which do not give a true indication of picture quality.
MTF (modulation transfer function) curves clearly indicate a display's ability to produce a sharp image with good clarity. They are calculated from the contrast of line pairs per degree of the viewer's visual arc, so you need to know display size and viewing distance. They can also be weighted into an overall image-quality figure, because the centre, middle and corners of the display differ in visual importance. The curve plots from large details to small details, i.e. from wide black-and-white lines to thin ones. MTF at the start to middle of the curve indicates contrast in large to mid-size details: how much objects stand out clearly from the background, how sharp the image looks. MTF at the middle to end of the curve indicates contrast in mid-size to fine details: how much texture of objects is visible, from skin tones to clothing texture, how detailed the image looks. MTF at the very end of the curve indicates absolute display resolution, the least critical indicator of apparent sharpness. The overall perception of sharpness is determined by MTF from 3 to 12 line pairs per degree of the viewer's visual arc. For camera image reproduction, MTF 50% is the point taken to be out of focus; MTF 35% is the point at which an area of the image is no longer within the zone of confusion, the part of the image that looks in focus due to depth perception; and MTF 5% is taken as the resolution limit of the display, although a trained observer can still distinguish objects down to MTF 0.05%.
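Since MTF is expressed per degree of visual arc, the same panel covers a different part of the curve depending on how big it is and how far away you sit. A small sketch of that geometry conversion (the screen width, resolution and distance are assumed example values, not figures from the post):

```python
import math

# Convert display geometry into the finest line-pair frequency the
# panel can physically draw, in line pairs per degree of visual arc.
# One line pair = one black line + one white line = two pixels.

def line_pairs_per_degree(screen_width_m: float, horiz_pixels: int,
                          distance_m: float) -> float:
    """Finest drawable spatial frequency, in lp/deg, for a viewer at
    `distance_m` from a panel `screen_width_m` wide."""
    pixel_pitch = screen_width_m / horiz_pixels
    lp_width = 2 * pixel_pitch
    deg_per_lp = math.degrees(2 * math.atan(lp_width / (2 * distance_m)))
    return 1 / deg_per_lp

# e.g. a 1.2 m wide, 1920-pixel panel viewed from 3 m: roughly 42 lp/deg,
# well past the 3-12 lp/deg band that dominates perceived sharpness
print(f"{line_pairs_per_degree(1.2, 1920, 3.0):.1f} lp/deg")
```

Sit closer and the lp/deg figure drops, which is why viewing distance has to be stated alongside any MTF claim.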
Gamma tracking covers which curve is tracked (2.2, 2.35 or 2.5) and how smoothly and accurately that curve is followed, particularly at the bottom dark end of the greyscale, where the display cannot faithfully follow the curve. Limited by MTF, it indicates the display's contrast and apparent image depth in bright to dark scenes. Poor gamma tracking leads to images that look good in some scenes and poor in others, due to shifting perceptions of image depth, and to scenes that look less natural because the contrast of various picture elements is exaggerated or under-emphasised. Gamma curves also indicate a display's suitability for different viewing environments.
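One common way to see how smoothly a curve is tracked is to compute the "point gamma" at each measured grey step: log of normalised luminance over log of stimulus. A minimal sketch, where the meter readings are made-up illustration values:

```python
import math

# Point gamma per grey step: gamma = log(Y / Y_white) / log(stimulus).
# A display tracking 2.2 perfectly gives 2.20 at every step; wobble in
# these numbers is exactly the poor tracking described above.
# The luminance readings below are assumed, not real measurements.

def point_gamma(stimulus: float, luminance: float, white: float) -> float:
    return math.log(luminance / white) / math.log(stimulus)

white = 100.0                                            # cd/m2 at 100% white
measured = {0.1: 0.72, 0.3: 6.9, 0.5: 21.8, 0.8: 61.0}   # assumed readings
for stim, lum in measured.items():
    print(f"{stim:.0%} stimulus: point gamma {point_gamma(stim, lum, white):.2f}")
```

Plotting point gamma against stimulus makes dips near black, where panels struggle most, immediately visible.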
Greyscale colour accuracy across the entire greyscale (black through shades of grey to white), primary colour coordinates, and colour decoder accuracy all combine to indicate whether the image has correct, natural-looking colours. Because the eye-brain perceives colours as ratios, inaccurate colours look more natural with the colour saturation/luminance control lowered: it reduces the size of the error, but also robs the image of colourfulness and some contrast (image depth). The eye-brain is most sensitive to colour errors in skin tones; accurate colour makes the people on screen look incredibly lifelike, as if they are actually standing in front of you, and makes the actresses look more beautiful.
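The saturation trade-off is easy to demonstrate numerically: desaturating pulls both the intended colour and the (wrong) displayed colour toward grey, so the absolute distance between them shrinks too. This is a simplified sketch, with assumed colour values and a plain RGB distance rather than a proper perceptual metric:

```python
# Desaturating a miscalibrated display shrinks the size of its colour
# errors, at the cost of colourfulness. Both colours and the simple
# blend-toward-grey model here are illustrative assumptions.

def desaturate(rgb, amount):
    """Blend a colour toward its own grey (channel average) by
    `amount`, where 0.0 is unchanged and 1.0 is fully grey."""
    grey = sum(rgb) / 3
    return tuple(c + (grey - c) * amount for c in rgb)

def error(a, b):
    """Plain Euclidean RGB distance (not a perceptual delta-E)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

target = (0.80, 0.55, 0.45)      # a skin-tone-like patch (assumed)
displayed = (0.85, 0.50, 0.40)   # what a miscalibrated display shows

for amt in (0.0, 0.5):
    e = error(desaturate(target, amt), desaturate(displayed, amt))
    print(f"desaturation {amt:.0%}: error {e:.3f}")
```

The error genuinely gets smaller, but so does the distance of every colour from grey, which is the lost colourfulness and image depth.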
Then you have whether the display is capable of showing the 8-bit greyscale on an individual pixel basis, or relies on temporal or spatial dithering to create the shade; whether the display can show changes in a pixel's 8-bit greyscale value on a frame-to-frame basis, or whether changes from one shade to another lag; and how often and for how long the pixels are illuminated per frame.
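Temporal dithering works by alternating the two nearest levels the panel can natively show, so their average over a few frames lands on the target shade. A toy sketch for a hypothetical 6-bit panel faking an 8-bit level (frame counts and levels are illustrative assumptions):

```python
# Temporal dithering sketch: a 6-bit panel approximates an 8-bit shade
# by alternating adjacent native levels across frames. One 6-bit step
# spans four 8-bit steps; the 4-frame cycle is an assumed example.

def temporal_dither(target_8bit: int, frames: int = 4) -> list[int]:
    """Per-frame 6-bit levels whose average matches the 8-bit target."""
    lo = target_8bit // 4                            # nearest 6-bit level below
    hi_frames = round((target_8bit - lo * 4) / 4 * frames)
    return [lo + 1] * hi_frames + [lo] * (frames - hi_frames)

levels = temporal_dither(130)    # 8-bit 130 sits between 6-bit 32 and 33
avg_8bit = sum(levels) / len(levels) * 4
print(levels, "-> average 8-bit level", avg_8bit)
```

The average is right, but the shade only exists over time, not on any single frame of any single pixel, which is exactly the distinction drawn above.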