Roland Scheidegger wrote:
When I was playing around with texenv (I'm trying to implement GL_EXT_blend_func_separate and GL_EXT_blend_equation_separate for the R200, though my attempts to modify texenv into a useful test for that were unsuccessful), I noticed that the radeon and r200 drivers announce GL_EXT_blend_color (because it's part of the imaging subset), but neither one actually implements it. Unsurprisingly, it doesn't work: some sort of default color seems to be used, though not the OpenGL default color for this extension.
I've looked through r200_reg.h and radeon_reg.h but couldn't find an obvious register to store the constant blend color. In fact, ATI says GL_EXT_blend_color isn't available on the R100, so maybe this needs a different approach. Any ideas?

Are you sure? It looks like the code is all there in r200_state.c:


void r200BlendFuncSeparate( ... )
{
   ...

   switch ( ... ) {
   ...
   case GL_CONSTANT_COLOR:
      b |= R200_SRC_BLEND_GL_CONST_COLOR;
      break;
   case GL_ONE_MINUS_CONSTANT_COLOR:
      b |= R200_SRC_BLEND_GL_ONE_MINUS_CONST_COLOR;
      break;
   case GL_CONSTANT_ALPHA:
      b |= R200_SRC_BLEND_GL_CONST_ALPHA;
      break;
   case GL_ONE_MINUS_CONSTANT_ALPHA:
      b |= R200_SRC_BLEND_GL_ONE_MINUS_CONST_ALPHA;
      break;
   ...
   }
   ...
}


Am I missing something -- isn't this setting the appropriate blend modes? Is the constant color not getting updated correctly?


Keith



--
_______________________________________________
Dri-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/dri-devel
