Can anyone clarify this issue with facts?

I believe that monitor calibration software is programmed for 8-bit color, because its purpose is to translate the monitor's color gamut to a printing gamut, and printing is an 8-bit process. That would suggest that when you work in 16-bit color, your monitor is not correctly calibrated.

When your 16-bit image is processed for printing, it takes two more uncalibrated software hits. The first hit converts the image to 8-bit color. The second hit converts the image to the gamut of a particular printer. When the 16-bit image is converted down to 8 bits, many users feel there is a loss of the color range or tonality they see on their monitor. That 8-bit image is the one the profiles will use to convert for printing. If one then has to readjust this 8-bit image to achieve calibrated, i.e. predictable, printed results, then why use 16-bit color in the first place?
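To make the "converted down to 8 bits" step concrete, here is a minimal Python sketch of that quantization. The scaling formula is the standard one for bit-depth reduction, not anything specific to Photoshop's converter: a 16-bit channel has 65,536 levels and an 8-bit channel only 256, so roughly 256 distinct 16-bit values collapse onto each 8-bit level.

```python
def to_8bit(v16: int) -> int:
    """Scale a 16-bit channel value (0-65535) down to 8 bits (0-255)."""
    return round(v16 * 255 / 65535)

# Two neighboring values in a smooth 16-bit gradient...
a, b = 30000, 30100
print(to_8bit(a), to_8bit(b))  # ...land on the same 8-bit level
```

This is where the perceived loss of tonality comes from: fine gradations that are distinct in 16 bit become identical after the conversion, regardless of how the monitor is calibrated.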