On 3/20/06, Jack Carroll <[EMAIL PROTECTED]> wrote:
> If I understand the implication of this correctly, doing gamma
> correction digitally requires enough resolution to distinguish between
> neighboring JNDs at the low-light end of the scale, while having enough
> signal range to reach the bright end of the scale. This naturally requires
> that linearity-corrected luminance have several more bits of resolution than
> luma does. I did essentially the same thing last year, in a driver board
> for a microwave attenuator.
> If linearity correction could be done with analog hardware, the DACs
> would only need enough bits to express the uncorrected data. Unfortunately,
> an analog arbitrary-power nonlinear circuit's bandwidth depends on
> instantaneous voltage, and can be as little as a few kHz at the low-voltage
> end, increasing to perhaps a few hundred kHz at maximum signal level. It
> also gets very noisy as signal level approaches zero.
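[The gap problem described above can be sketched numerically. This is a minimal illustration, assuming a plain power-law transfer with gamma 2.2 and 10 bits on both sides; the actual transfer curve and bit widths in the hardware under discussion may differ.]

```python
# Sketch: why linear-light luminance needs more bits than gamma-coded luma.
# Gamma-encode every 10-bit linear code into a 10-bit luma code (gamma 2.2
# assumed purely for illustration) and look at code coverage near black.
GAMMA = 2.2
BITS = 10
MAX = (1 << BITS) - 1  # 1023

def encode(linear_code):
    """Map a linear-light code to a luma code via the 1/gamma power curve."""
    return round(((linear_code / MAX) ** (1 / GAMMA)) * MAX)

reachable = sorted({encode(i) for i in range(MAX + 1)})
# The smallest nonzero linear code (1/1023) already lands around luma
# code 44, so the darkest luma codes between 0 and 44 can never be
# produced: with only 10 linear bits, the low end of the scale is full of
# gaps. That is why a linearity-corrected representation needs several
# more bits of resolution than the luma it feeds.
```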
Some of the analog stuff is over my head. But yes, if we were to represent luminance digitally, it would need something like 14 bits to avoid huge gaps at (I believe) the low end of the scale. Something like that. Fortunately, most gamma correction is luma to luma: the monitor has ABOUT the right curve, but not quite, so we apply a small residual gamma correction factor. 10 bits is more than enough for that.

_______________________________________________
Open-graphics mailing list
[email protected]
http://lists.duskglow.com/mailman/listinfo/open-graphics
List service provided by Duskglow Consulting, LLC (www.duskglow.com)
