On Sun, Feb 24, 2002 at 07:58:14PM -0500, Leif Delgass wrote:

> Given this hardware limitation, most GL_MODULATE cases can produce
> incorrect results with alpha blending enabled.  Using software fallbacks
> for these cases could seriously impact performance in applications that
> make heavy use of these modes.  So my question is this: what's the best
> way to handle compliance v. performance?  I can use software rendering by
> default when alpha blending is enabled to always give correct results, and
> offer an environment variable switch to use the flawed hardware
> implementation.  If you have a fast enough processor, maybe you wouldn't
> notice so much if the fallback cases are used sparingly in an app.  Then
> gamers could enable the hardware implementation to get better performance,
> at the cost of some rendering problems.  There doesn't seem to be a GL API
> method for conveying that a core feature like this (not an extension) is
> broken or slow, so an environment var or config file option seems like the
> only alternative.  Of course the advantage of an environment var is that 
> it can be set per app.

Using an environment variable is an OK solution, but it is far from ideal.
I don't think setting this type of parameter in a global configuration file
is at all acceptable.  It doesn't do users any good if only root can change
"trivial" settings like this.

What would be ideal is to have something like .Xdefaults that could be used
to select parameters like this.  This would avoid the mess of environment
variables and still allow all settings to be (optionally) per-application.
Is there some way that DRI could trivially hook into settings from
.Xdefaults?

-- 
Tell that to the Marines!

_______________________________________________
Dri-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/dri-devel