John Culleton <[EMAIL PROTECTED]> wrote:
>
> So how do I determine which monitors, if any, can have adjustable
> gamma? BTW I specified 3.0 gamma in my XF86Config file but I can
> spot no difference in the test files. So my current Orion monitor
> (17") does not seem to adjust.

Monitors typically don't have adjustable gamma.  The gamma is a
property of the behaviour of the phosphors etc. and isn't adjustable
(except perhaps as a side-effect of manipulating overall brightness
and contrast).  But the native gamma of the monitor will change
over time as it ages.
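(For reference: "gamma" here describes the monitor's transfer curve,
roughly luminance = signal^gamma with the signal normalised to 0..1;
CRTs typically have a native gamma somewhere around 2.2-2.5.)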

However, most video cards have a lookup table (LUT) which can be
used to statically transform each of the R/G/B components.  By
default this LUT is loaded by the X server with a linear transform
(a gamma of 1.0), and the effective gamma of your video system is
the combination of that and the native gamma of your monitor.
You can manipulate the gamma of your system by playing with the
X server's idea of gamma, but the overall gamma will not be just
that number.
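As a rough sketch of what goes into such a table (my own
illustration; the function name is invented and real servers differ
in detail): each entry is the normalised input pushed through
x^(1/gamma) and scaled to the card's range, so gamma = 1.0 gives
the identity table:

    /* Sketch: build a gamma LUT the way an X server conceptually
     * does.  "size" is the card's table size (often 256 or 1024);
     * entries are 16-bit, as in the XF86VidMode interface.
     * gamma = 1.0 yields the default linear (identity) table.
     */
    #include <math.h>

    static void build_gamma_lut(unsigned short *lut, int size, double gamma)
    {
        int i;
        for (i = 0; i < size; i++) {
            double x = (double)i / (size - 1);              /* 0..1 */
            lut[i] = (unsigned short)(65535.0 * pow(x, 1.0 / gamma) + 0.5);
        }
    }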

XFree86 implements LUT manipulation through an X extension (the
XFree86-VidModeExtension), but only allows you to set a gamma for
red, green, and blue (it generates the LUT values internally).  The
basic command-line interface to this is xgamma, and KDE etc. have
added their own versions.
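From the shell that looks like `xgamma -gamma 1.4`, or per-channel
via -rgamma/-ggamma/-bgamma (running xgamma with no arguments just
prints the current values).  Programmatically it's a couple of
calls to the VidMode extension; a minimal, untested sketch, reusing
build_gamma_lut() from above:

    /* Sketch: load gamma ramps via the VidMode extension, which
     * is what xgamma itself sits on top of.
     */
    #include <stdlib.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/xf86vmode.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        int scr, size;
        unsigned short *r, *g, *b;

        if (!dpy)
            return 1;
        scr = DefaultScreen(dpy);
        if (!XF86VidModeGetGammaRampSize(dpy, scr, &size))
            return 1;
        r = malloc(size * sizeof *r);
        g = malloc(size * sizeof *g);
        b = malloc(size * sizeof *b);
        build_gamma_lut(r, size, 2.2);   /* from the sketch above */
        build_gamma_lut(g, size, 2.2);
        build_gamma_lut(b, size, 2.2);
        XF86VidModeSetGammaRamp(dpy, scr, size, r, g, b);
        XCloseDisplay(dpy);
        return 0;
    }

(Link with -lX11 -lXxf86vm.)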

This is relevant to Sven's question:
>
> If I set a reasonable gamma value on my X server, things look
> washed out and pale. Is that really desirable?

If you set `xgamma -gamma 2.2` then your system will NOT have a
gamma of 2.2 (unless your monitor has a native gamma of 1.0, which
is VERY unlikely).
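To put numbers on it (assuming, purely for illustration, a monitor
with a native gamma of 2.5): `xgamma -gamma 2.2` loads a LUT curve
of x^(1/2.2), the monitor then applies its own x^2.5, and the net
response is

        x^(2.5/2.2) ~= x^1.14

i.e. much closer to linear than to 2.2, which is exactly the
washed-out, pale look Sven describes.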

The "standard monitor" that is modelled by the sRGB colour space
(which is meant to describe the "average" [well-adjusted] PC monitor,
and is specified as the default colour space of the web) happens
to have a gamma of 2.2 (as well as a white colour of 6500K and a
bunch of other details).


What happens in Windows and MacOS with a colour-managed system
actually has two parts.  What is done when the monitor is calibrated
(e.g. with a colorimeter or spectrophotometer such as a ColorVision
Spyder, a basICColor Squid, or a Monaco Optix) is twofold:

First, calibration:
        The system is calibrated to a standard viewing condition.
        The brightness/contrast is adjusted to achieve reasonable
        black/white points, and the video card's LUTs are manipulated
        to achieve the target gamma/colour-temp/etc and ensure that
        R=G=B results in a fairly neutral colour.
        Note that the resulting LUT values are not defined by a
        simple gamma curve (the toy sketch just below pretends
        otherwise, purely to show the mechanics).
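In LUT terms, a toy approximation (real packages work from full
measured curves, and the native gamma values here are invented):
to land a channel with measured native gamma g_native on a target
of g_target, the correction curve is x^(g_target/g_native), giving
each channel its own table:

    /* Toy per-channel calibration: the LUT followed by the
     * monitor's native response nets out to the target gamma.
     * The "measured" native gammas are invented for illustration.
     */
    #include <math.h>

    static void calibrate_luts(unsigned short lut[3][256])
    {
        double native[3] = { 2.45, 2.50, 2.35 };  /* "measured" R,G,B */
        double target = 2.2;
        int ch, i;

        for (ch = 0; ch < 3; ch++)
            for (i = 0; i < 256; i++)
                lut[ch][i] = (unsigned short)
                    (65535.0 * pow(i / 255.0, target / native[ch]) + 0.5);
    }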

At this point non-colour-managed applications (e.g. window managers)
will have a consistent look across systems that have been calibrated
the same way.  Most people calibrate their systems to D6500 and a
gamma of 2.2, primarily because this is close to the natural behaviour
of most CRTs and thus the LUTs will be close to linear.
Unfortunately most video cards have only 8-bit LUTs, and the more
"aggressive" the curve the higher the probability that you will
introduce quantisation and posterisation of some colours.  In fact
when I calibrate the monitors on my Mac systems I specify "native"
gamma, in which case it measures and uses the monitor's own gamma
(but not all calibration software supports this).
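You can put numbers on that risk: with an 8-bit LUT there are only
256 output codes, and the further the curve is from linear the more
input levels collide where the curve is flat (and the more output
codes get skipped where it is steep).  A quick count (my own
illustration):

    /* Sketch: count how many distinct output levels survive an
     * 8-bit LUT loaded with an x^(1/gamma) curve.  The further
     * from gamma = 1.0, the more levels collide (posterisation).
     */
    #include <math.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        double gammas[] = { 1.0, 1.4, 2.2 };
        int n, i;

        for (n = 0; n < 3; n++) {
            unsigned char seen[256];
            int distinct = 0;

            memset(seen, 0, sizeof seen);
            for (i = 0; i < 256; i++) {
                int out = (int)(255.0 *
                    pow(i / 255.0, 1.0 / gammas[n]) + 0.5);
                if (!seen[out]) { seen[out] = 1; distinct++; }
            }
            printf("gamma %.1f: %d of 256 levels survive\n",
                   gammas[n], distinct);
        }
        return 0;
    }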

Using D6500 and gamma=2.2 also means that the calibrated system is
close to sRGB.  Because of its use in the web standards (and a few
other reasons) sRGB has become the default colour space for non-CM
files.

Second, profiling:
        The software and colorimeter/spectrophotometer measure the
        colour behaviour of the calibrated system and generate an
        ICC profile for it.  This profile is stored in the system
        as the default profile for the display, and is used by
        colour-managed applications such as Photoshop to transform
        images from their internal colour spaces to the monitor's
        colour space and thus give you an accurate view of the
        colours represented by the image's RGB (or CMYK/etc) data.
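In code terms that transform step looks roughly like this (a sketch
using Littlecms as an example library; the profile filename is made
up, and a real application would check every call for failure):

    /* Sketch: push 8-bit RGB pixels from the image's colour space
     * (sRGB here) into the monitor's space via its ICC profile,
     * as a colour-managed application would.
     */
    #include <lcms.h>

    void to_monitor_space(const unsigned char *src, unsigned char *dst,
                          unsigned int npixels)
    {
        cmsHPROFILE in  = cmsCreate_sRGBProfile();
        cmsHPROFILE out = cmsOpenProfileFromFile("monitor.icc", "r");
        cmsHTRANSFORM xf = cmsCreateTransform(in, TYPE_RGB_8,
                                              out, TYPE_RGB_8,
                                              INTENT_PERCEPTUAL, 0);

        cmsDoTransform(xf, (void *)src, dst, npixels);
        cmsDeleteTransform(xf);
        cmsCloseProfile(in);
        cmsCloseProfile(out);
    }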

Incidentally, the calibration/profiling software typically stores
the LUT data in a tag within the ICC profile.  This tag is not used
by Photoshop et al. but is used by the "LUT loader" when the user
logs in, to reset the calibration.  Unfortunately there are several
vendor-specific tags in use in the Windows environment, and each
vendor has its own LUT loader program.  On the Mac there is a
standard tag ('vcgt') and the OS does the LUT loading.

Note that because the native behaviour of monitors changes over
time (and will be different if you adjust almost ANY of the monitor's
controls - even physically moving the monitor can make a difference),
in environments where users care about the colour accuracy of the
system it's usual to recalibrate/re-profile the monitor on a regular
basis (e.g. weekly), and only after the monitor has warmed up for
30 minutes or so.

Does that description clear up anything for you?


Lack of support for this stuff in the Gimp et al. is the main
reason I moved to Macs (I have an IT background, but these days
work as a professional photographer).  I haven't given up the Gimp
entirely yet, but it's getting less and less use over time.
__
David Burren