Hello,

I am trying to get an NVIDIA GeForce (a GTX 750 Ti in particular) to work
at 10-bit depth with a Dell U2713H (supposedly 10-bit input, 14-bit LUT,
8-bit + FRC panel; connected to the card's DisplayPort output) in Linux.

Without success so far.

Both the NVIDIA X Server Settings tool and the X.Org log confirm "30 bit
Depth", and GUI toolkits (Qt, GTK) show the color glitches as expected.
After patching GTK+ 2 and setting Qt's graphics system to "native", the
apps appear almost okay; just a few glitches remain here and there.
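
For reference, depth 30 is requested the usual way through the Screen
section of xorg.conf; a minimal sketch (the Identifier/Device names below
are placeholders and must match your own Device section):

  Section "Screen"
      Identifier   "Screen0"    # placeholder
      Device       "Device0"    # placeholder; must match your Device section
      DefaultDepth 30           # 10 bits per color component
      SubSection "Display"
          Depth 30
      EndSubSection
  EndSection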

But all of the above happens with an 8-bit LCD (Samsung XL-20) as well -
my assumption was that the NVIDIA driver would not allow 30-bit depth
unless a 10-bit display is connected.

"dispcal -v -yl -R" says the video card LUT appears to have 8-bit precision.

A 1024-step grey ramp (encoded as a 16-bit PNG) displayed in Krita (with
the OpenGL backend active) clearly shows only 8-bit precision.
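
In case someone wants to probe this at the OpenGL level: whether the GLX
stack offers a 10-bit-per-channel framebuffer config at all can be checked
like this (a sketch; it only tests availability, not what Krita actually
picks at runtime):

  /* glx10bit.c - count GLX framebuffer configs with 10 bits per channel.
   * Build: gcc glx10bit.c -o glx10bit -lX11 -lGL */
  #include <stdio.h>
  #include <X11/Xlib.h>
  #include <GL/glx.h>

  int main(void)
  {
      Display *dpy = XOpenDisplay(NULL);
      if (!dpy) {
          fprintf(stderr, "cannot open display\n");
          return 1;
      }
      int attribs[] = {
          GLX_RED_SIZE,   10,
          GLX_GREEN_SIZE, 10,
          GLX_BLUE_SIZE,  10,
          GLX_DOUBLEBUFFER, True,
          None
      };
      int n = 0;
      GLXFBConfig *cfgs =
          glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &n);
      printf("10-bit GLX framebuffer configs: %d\n", n);
      if (cfgs) XFree(cfgs);
      XCloseDisplay(dpy);
      return 0;
  }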


Finally, two questions:

x Has anybody got GeForce cards working at 10-bit depth with a 10-bit
monitor (native or FRC)?

x Does anybody use an 8-bit + FRC display (Dell U2713H, Asus PA279Q and
others) with e.g. a Quadro card at 10-bit depth? (I am wondering whether
the input of the display really is 10-bit or just 8-bit.)

regards,
Milan

-- 
http://milan-knizek.net/
About Linux and photography (Czech only)

