Kibo informs me that Timo Autiokari stated that:
On Tue, 23 Nov 2004 10:50:44 +1100, wrote:
With photography, the intention is to produce a final image that is as similar as possible to what a human eye would’ve seen through the viewfinder,
For your information,
As it happens, I’ve designed imaging systems, so I’m quite familiar with the differences between the requirements of a scientific imaging system vs the requirements of a device intended to approximate what a human eye would see in the same situation. And for /your/ information, a photograph evokes only a very vague approximation of what an eye would’ve seen if it’d been in place of the camera. Even a gamma-corrected (i.e., non-linear) image is just another in a long string of compromises that make it a little easier to trick the human eye into perceiving the printed/displayed image as ‘real’.
whatever real-life scene the human eye is viewing, it happens that *linear* light (photons) will hit the sensors on the retina.
You’re ignoring the fact that most scientific imaging uses false-colouring *precisely because* the ‘true’ image would either be invisible, too dark, or too bright to be processed by a naked human eye. If the human eye were capable of perceiving, (for example), Doppler-shifted light from a star on the other side of the galaxy, we wouldn’t need space telescopes in the first place, would we? – We could just look out the window instead.

And the human eye can’t correctly image even fairly close stars – we perceive most stars as white, (even though they are strongly coloured), because their light is too dim for our colour vision to pick it up. Fortunately, scientific imaging systems can show us their *real* colour. Closer to home, scientific instruments create images via things like soft X-rays, or infrared light – situations where the capabilities & limitations of the human eye are completely irrelevant.

The particular scaling system, (whether it’s linear, log, exponential, bell-shaped or whatever), that’s optimal for scientific imaging has nothing whatever to do with how the eye perceives light, & everything to do with the physics of whatever it is that the device is intended to measure.
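To make that concrete, here’s a minimal sketch (plain Python; the function name & parameters are mine, purely for illustration) of the kind of scaling a scientific imager might apply: a log mapping that squeezes six decades of measured flux into a displayable 0–255 range – a choice dictated by the data, not by anything about the eye’s response:

```python
import math

def scale_for_display(values, lo=None, hi=None):
    """Map positive measurements spanning many orders of magnitude
    into the 0..255 display range using a log scale.  The scale is
    chosen to fit the physics of the data, not the eye's response."""
    lo = lo if lo is not None else min(values)
    hi = hi if hi is not None else max(values)
    span = math.log10(hi) - math.log10(lo)
    return [round(255 * (math.log10(v) - math.log10(lo)) / span)
            for v in values]

# Six decades of flux collapse into one monotonic displayable ramp;
# the faintest source maps to 0, the brightest to 255.
flux = [1e-3, 1e-2, 1e-1, 1.0, 1e1, 1e2, 1e3]
levels = scale_for_display(flux)
```

A linear mapping over the same data would leave everything below 1e1 indistinguishable from black – which is exactly why the scaling is picked to suit the measurement, not the viewer.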
Displays, however, are not capable of outputting very high luminance levels, but it so happens that the eye has the iris, so it can adapt to different brightness levels,
The eye does a hell of a lot more to deal with large contrast ranges than just adjust the iris. For example, the retina automatically performs an astonishingly similar analogue of darkroom or PS contrast masking to ‘correct’ for localised highlights in the visual field that would otherwise ‘blow out’, just as photographers do to ‘correct’ photos of sunsets or other scenes with contrast ranges too big to print or display.
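A crude toy of that masking effect – this assumes nothing about actual retinal wiring, it’s just a sketch in plain Python with names of my own invention – pulls each sample toward its local neighbourhood average, which compresses an isolated highlight the same way a darkroom contrast mask does:

```python
def local_adapt(scanline, radius=2, strength=0.7):
    """Toy analogue of local adaptation / contrast masking: compress
    each sample toward the average of its neighbourhood, taming
    isolated highlights while leaving flat regions untouched."""
    out = []
    n = len(scanline)
    for i, v in enumerate(scanline):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        neighbourhood = sum(scanline[lo:hi]) / (hi - lo)
        # Keep only (1 - strength) of the deviation from the local mean.
        out.append(neighbourhood + (v - neighbourhood) * (1 - strength))
    return out

# An isolated highlight in an otherwise dim scanline gets pulled
# well down toward its surroundings; a flat field is left alone.
compressed = local_adapt([10, 10, 1000, 10, 10])
```

The point is only that the compression is *local*: a global iris-style adjustment could never tame one highlight without darkening the whole field.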
therefore 1:1 linearity is not needed, just an overall linearity of the transfer function is enough. Nonlinearity in this path makes the image appear too dark or too bright in some portion of the tonal reproduction range.
For starters, the light output of a display isn’t even close to linear, nor should it be. If you actually look at the transfer graph for a calibrated monitor, you’ll find that the transfer curve is a power function – a gamma curve – not a straight line. It’s no harder to calibrate a monitor to a completely linear input-voltage-to-light-output relationship than to a 1.8 or 2.2 gamma curve, then run an extremely accurate linear greyscale gradient across it – but doing so would produce precisely the *perceived* non-linearity you’ve just mentioned. We gamma-correct monitors for the *exact purpose* of eliminating that non-linear perception.
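To illustrate the kind of power-law transfer involved (the 2.2 exponent is just the common assumption mentioned above, & the function names are mine):

```python
def gamma_encode(linear, gamma=2.2):
    """Encode a linear-light value (0..1) into a display signal
    using a power-law (gamma) curve."""
    return linear ** (1 / gamma)

def gamma_decode(signal, gamma=2.2):
    """Invert the encoding: recover linear light from the signal."""
    return signal ** gamma

# 18% linear reflectance (photographic mid-grey) encodes to roughly
# mid-signal -- which is why a gamma-encoded image "looks right"
# while a linearly-encoded one looks crushed in the shadows.
mid_signal = gamma_encode(0.18)
```

Note the round trip is exact: encode then decode returns the original linear value, so nothing is lost – the curve only redistributes the signal levels to suit perception.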
which requires a non-linear response.
False.
No, I’m afraid not. You would do well to read up on how the human eye works, as well as about scientific imaging techniques, because the stuff you’re saying is just plain wrong.
--
W
. | ,. w ,   "Some people are alive only because
\|/ \|/       it is illegal to kill them."
Perna condita delenda est