hello everybody,

i'm evaluating some display calibration options and got confused, so i'm hoping somebody can shed some light on this:

i always assumed that the Nuke sRGB viewer LUT is made to match a standard sRGB monitor. now, the calibration software for the NEC monitor i'm using offers different target gamma curves, with a default of 2.2 and a custom option of sRGB. since nuke lists sRGB, i would expect sRGB to be the right target for nuke.
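(for context, this is my rough understanding of the two candidate transfer curves, as a little python sketch -- the helper names are just mine, not from nuke or the NEC software, so treat it as an illustration only:

def gamma22_to_linear(v):
    # pure power-law target: linear = encoded ** 2.2
    return v ** 2.2

def srgb_to_linear(v):
    # piecewise sRGB EOTF: linear segment near black, ~2.4 power above it
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# the two curves track each other closely in the mids and highlights
# and only really diverge in the shadows:
for v in (0.02, 0.05, 0.1, 0.5):
    print(v, round(gamma22_to_linear(v), 5), round(srgb_to_linear(v), 5))

so if i've got that right, the practical difference between the two calibration targets would mostly show up in shadow rendering.)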

however, 2.2 is the recommended default in pretty much all profiling software (x-rite, datacolor and spectraview), so it would appear that this is the setting most people use for a normal desktop photo/video workflow. that confuses me again, as i thought any color-dumb application (like final cut pro) would be designed to work on an sRGB monitor.

so, what do other people with a better understanding do?
calibrate to an sRGB monitor response curve for Nuke and gamma 2.2 for all other apps, and switch between the two? or am i missing something here?

would be grateful for any pointers
++ chris



