Ken wrote:
Why, if D50 is the standard for viewing prints,
isn’t D50 the preferred temperature for monitors?
There is no single answer to that:
1) D50 _is_ the preferred CCT for the monitors of most people who use D50 viewing booths (people who deal with material that is going to be printed).
2) In the past there were no ("full spectrum") D65 fluorescent lamps, and filtering tungsten down to D65 is far less efficient than filtering it to D50.
3) The chromatic adaptation of human vision is still not fully understood/characterized.
4) People only rarely view prints under D65 (open shade on a sunny day); mostly they view them under 2850K … 4000K. So even 5000K is rather high for the purpose of simulating the actual viewing experience. But going lower than 5000K in the viewing booth is troublesome with respect to color management, and already 5000K looks a little yellowish (to many).
5) With monitors there are few practical reasons to limit the CCT of the white point to D50; a monitor can produce a D65 white just as easily. The only reason I can think of is the (supposed) need to match the color temperature of the viewing booth. D65 is, for many, a very pure white, while D50 is, for many, a slightly yellowish white, about the white you see when you look at ordinary xerox copy paper illuminated by direct sunlight.

If the monitor white point is set to D65, it can show that very pure white efficiently (all channels at maximum), and it can also show a fairly high-luminance D50 efficiently (mainly the blue channel is scaled down, and that has only a modest effect on luminance). But a monitor set to D50 cannot show the color of D65 at high luminance at all: both the red and green channels have to be scaled down, and that eats a lot of luminance. So a monitor set to D65 can show different kinds of images more efficiently than a monitor set to D50, where luminance often has to be sacrificed in favor of purer whites. (A small numerical sketch of this trade-off follows below the list.)
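To make that last point concrete, here is a small Python sketch (my own illustration, not part of Ken's question). It estimates the relative luminance a display can reach when it reproduces a white point other than its native one, assuming Rec.709/sRGB primaries and the usual primaries-plus-white-point construction of the RGB-to-XYZ matrix. Real monitors with other phosphors will give somewhat different numbers, but the asymmetry between the two directions remains.

import numpy as np

def xy_to_XYZ(x, y, Y=1.0):
    # Chromaticity (x, y) plus luminance Y -> tristimulus XYZ.
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

def rgb_to_xyz_matrix(primaries, white_xy):
    # RGB->XYZ matrix for a display with the given (x, y) primaries and
    # native white chromaticity: scale the primary columns so that
    # R = G = B = 1 reproduces the native white at Y = 1.
    P = np.column_stack([xy_to_XYZ(x, y) for x, y in primaries])
    S = np.linalg.solve(P, xy_to_XYZ(*white_xy))
    return P * S

def relative_white_luminance(native_xy, target_xy,
                             primaries=((0.64, 0.33), (0.30, 0.60), (0.15, 0.06))):
    # Highest luminance (native white = 1.0) at which this display can show
    # the target white chromaticity without clipping any channel.
    M = rgb_to_xyz_matrix(primaries, native_xy)
    rgb = np.linalg.solve(M, xy_to_XYZ(*target_xy))  # linear RGB for the target white
    rgb /= rgb.max()                                 # pin the largest channel at 1.0
    return (M @ rgb)[1]                              # Y of the scaled color

D65 = (0.3127, 0.3290)
D50 = (0.3457, 0.3585)

print("D65-native monitor showing D50 white:", round(relative_white_luminance(D65, D50), 2))
print("D50-native monitor showing D65 white:", round(relative_white_luminance(D50, D65), 2))

With these primaries the sketch prints roughly 0.85 for a D65-native monitor showing D50 white and roughly 0.72 for a D50-native monitor showing D65 white, so the D50-set monitor pays the larger luminance penalty, as described above.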
Timo Autiokari