On Thu, Jan 24, 2002 at 12:55:28AM -0600, John Utz wrote:
> one last question before i knock off for the nite....
> 
> suppose one has two cards.
> 
> the first, the FooBarTech VisiBlaster, has special hardware for optimizing 
> the generation of very realistic clouds.
> 
> the second, the BazGrafix RadiantTurkeyBaster XL6000 AGP 4X Value Edition, 
> has special hardware for optimizing the generation of very realistic hair.
> 
> how would these get hooked up and taken advantage of? how many layers of
> abstraction get punctured by these nonstandard features? or am i asking
> the wrong question because they (every graphics chipset) are *all* non
> standard?
> 
> can somebody point me to examples in the code? i'd like to be pointed at
> examples of graphics generation(the hair and clouds stuff) and examples of
> hardware operation ( ie turning on DVD decoding or enabling the tv-out
> signal path ).

You'd probably implement an OpenGL extension. The NV_* extensions are
examples of nVidia-specific extensions. How closely this matches your
hardware depends on the API you design and the features of the
hardware. For it to be accepted as a standard part of OpenGL it would
have to be reviewed by the OpenGL Architecture Review Board (ARB). If
multiple vendors implemented the feature, the API might need to be
changed to standardize it. This is one of the current issues with
pixel shaders: the ATI and nVidia APIs are different, and they have to
converge on a common standard accepted by the ARB members. (There are
many more issues there, but that's one.)
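For illustration only, here is roughly what the application side of
such a vendor extension could look like. The extension name
"GL_FBT_cloud_render" and the entry point glCloudParameterfFBT() are
made up for the example; only the detection mechanism (glGetString()
plus glXGetProcAddressARB()) is the standard one, and a GL context is
assumed to already be current:

    /* Sketch: detect a hypothetical vendor extension and fetch its
     * entry point. Names marked "hypothetical" do not exist anywhere;
     * context setup is omitted. */
    #include <string.h>
    #include <stdio.h>
    #include <GL/gl.h>
    #include <GL/glx.h>

    typedef void (*PFNGLCLOUDPARAMETERFFBTPROC)(GLenum pname, GLfloat param);

    void use_cloud_extension_if_present(void)
    {
        const char *exts = (const char *) glGetString(GL_EXTENSIONS);

        if (exts && strstr(exts, "GL_FBT_cloud_render")) {
            PFNGLCLOUDPARAMETERFFBTPROC glCloudParameterfFBT =
                (PFNGLCLOUDPARAMETERFFBTPROC)
                    glXGetProcAddressARB((const GLubyte *) "glCloudParameterfFBT");

            if (glCloudParameterfFBT)
                glCloudParameterfFBT(0x8001 /* hypothetical pname */, 0.75f);
        } else {
            printf("no cloud hardware: fall back to a generic path\n");
        }
    }

Applications that want to stay portable do exactly this kind of
runtime check and keep a software or generic-GL fallback for hardware
that lacks the extension.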

Things like DVD decoding or TV-out are a different matter again. They
wouldn't be part of OpenGL at all. They might be exposed as X
extensions, and they might or might not use the DRM as part of their
implementation.
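As a rough illustration of the X-extension route (XVideo is picked
here only as an example of the kind of extension video-related
features often sit behind), detection from an Xlib client looks
something like this; actually driving DVD decode or TV-out would use
that extension's own API on top of it:

    /* Sketch: probe the X server for an extension from a client. */
    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        int opcode, event_base, error_base;

        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        if (XQueryExtension(dpy, "XVideo", &opcode, &event_base, &error_base))
            printf("XVideo extension present (major opcode %d)\n", opcode);
        else
            printf("XVideo extension not available\n");

        XCloseDisplay(dpy);
        return 0;
    }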

                                                - |Daryll
