Re: Proposed break in libGL / DRI driver ABI
Adam Jackson wrote:
> On Tuesday 05 April 2005 16:11, Brian Paul wrote:
>> Roland Mainz wrote:
>>> Another item would be to look into what's required to support
>>> visuals beyond 24-bit RGB (like 30-bit TrueColor visuals) ...
>>> someone on IRC (AFAIK ajax (if I don't mix-up the nicks again :))
>>> said that this may require an ABI change, too...
>>
>> I doubt an ABI change would be needed for that.
>
> Are you sure about this?

Yup, pretty sure. An ABI change at the libGL / driver interface isn't
needed. I don't know of any place in that interface where 8-bit color
is an issue. Please let me know if I'm wrong.

> I thought we treated channels as bytes everywhere, unless GLchan was
> defined to something bigger, and even then only for OSMesa. Even if
> it's not an ABI change, I suspect that growing GLchan beyond 8 bits
> while still preserving performance is non-trivial.

This is separate from Ian's ABI discussion. It's true that core Mesa
has to be recompiled to support 8, 16 or 32-bit color channels.
That's something I'd like to change in the future. It will be a lot
of work, but it can be done.

Currently, there aren't any hardware drivers that support more than
8-bit color channels. If we did want to support deeper channels in a
hardware driver, we'd have a lot of work to do in any case. One
approach would be to compile core Mesa for 16-bit channels, then
shift/drop bits in the driver whenever we write to a color buffer.
Of course, there's more to it than that, but it would be feasible.

As part of the GL_ARB_framebuffer_object work I'm doing, simultaneous
support for various channel sizes will be more doable.

>>> When I look at xc/extras/Mesa/src/mesa/main/config.h I see more
>>> items on my wishlist: Would it be possible to increase |MAX_WIDTH|
>>> and |MAX_HEIGHT| (and the matching texture limits of the software
>>> rasterizer) to 8192 to support larger displays (DMX, Xinerama and
>>> Xprint come to mind)?
>>
>> If you increase MAX_WIDTH/HEIGHT too far, you'll start to see
>> interpolation errors in triangle rasterization (the software
>> routines). The full explanation is long, but basically there needs
>> to be enough fractional bits in the GLfixed datatype to accommodate
>> interpolation across the full viewport width/height.
>>
>> In fact, I'm not sure we haven't already gone too far by setting
>> MAX_WIDTH/HEIGHT to 4096 while the GLfixed type only has 11
>> fractional bits. I haven't heard any reports of bad triangles so
>> far though. But there probably aren't too many people generating
>> 4Kx4K images.
>
> Yet. Big images are becoming a reality. DMX+glxproxy brings this
> real close to home.

I fully agree that there's a need to render larger images.

>> Before increasing MAX_WIDTH/HEIGHT, someone should do an analysis
>> of the interpolation issues to see what side-effects might pop up.
>
> Definitely.

>> Finally, Mesa has a number of scratch arrays that get dimensioned
>> to [MAX_WIDTH]. Some of those arrays/structs are rather large
>> already.
>
> I looked into allocating these dynamically, but there were one or
> two sticky points (mostly related to making scope act the same) so
> I dropped it. It could be done though.

A lot of these allocations are on the stack. Changing them to heap
allocations might cause some loss of performance too.

-Brian
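To make the shift/drop-bits approach concrete, here is a minimal
sketch in C, assuming a core built with 16-bit channels writing out
to an 8-bit-per-channel ARGB8888 buffer. The function name is
invented for illustration and is not an actual Mesa entry point:

    /* Core Mesa compiled with 16-bit channels: */
    typedef unsigned short GLchan;

    /* Driver-side span write: keep only the 8 most significant bits
     * of each 16-bit channel when storing to an ARGB8888 buffer. */
    static void
    write_rgba_span_argb8888(unsigned n, const GLchan rgba[][4],
                             unsigned int *dst)
    {
       unsigned i;
       for (i = 0; i < n; i++) {
          const unsigned char r = rgba[i][0] >> 8;
          const unsigned char g = rgba[i][1] >> 8;
          const unsigned char b = rgba[i][2] >> 8;
          const unsigned char a = rgba[i][3] >> 8;
          dst[i] = ((unsigned)a << 24) | ((unsigned)r << 16) |
                   ((unsigned)g << 8)  |  (unsigned)b;
       }
    }

The "more to it than that" part is everything this sketch skips:
readback has to expand 8-bit values back to 16 bits, and blending,
dithering and texel fetch paths all touch channel values too.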
Re: Proposed break in libGL / DRI driver ABI
On Tuesday 05 April 2005 16:11, Brian Paul wrote:
> Roland Mainz wrote:
> > Another item would be to look into what's required to support
> > visuals beyond 24-bit RGB (like 30-bit TrueColor visuals) ...
> > someone on IRC (AFAIK ajax (if I don't mix-up the nicks again :))
> > said that this may require an ABI change, too...
>
> I doubt an ABI change would be needed for that.

Are you sure about this? I thought we treated channels as bytes
everywhere, unless GLchan was defined to something bigger, and even
then only for OSMesa. Even if it's not an ABI change, I suspect that
growing GLchan beyond 8 bits while still preserving performance is
non-trivial.

> > When I look at xc/extras/Mesa/src/mesa/main/config.h I see more
> > items on my wishlist: Would it be possible to increase |MAX_WIDTH|
> > and |MAX_HEIGHT| (and the matching texture limits of the software
> > rasterizer) to 8192 to support larger displays (DMX, Xinerama and
> > Xprint come to mind)?
>
> If you increase MAX_WIDTH/HEIGHT too far, you'll start to see
> interpolation errors in triangle rasterization (the software
> routines). The full explanation is long, but basically there needs
> to be enough fractional bits in the GLfixed datatype to accommodate
> interpolation across the full viewport width/height.
>
> In fact, I'm not sure we haven't already gone too far by setting
> MAX_WIDTH/HEIGHT to 4096 while the GLfixed type only has 11
> fractional bits. I haven't heard any reports of bad triangles so
> far though. But there probably aren't too many people generating
> 4Kx4K images.

Yet. Big images are becoming a reality. DMX+glxproxy brings this real
close to home.

> Before increasing MAX_WIDTH/HEIGHT, someone should do an analysis of
> the interpolation issues to see what side-effects might pop up.

Definitely.

> Finally, Mesa has a number of scratch arrays that get dimensioned to
> [MAX_WIDTH]. Some of those arrays/structs are rather large already.

I looked into allocating these dynamically, but there were one or two
sticky points (mostly related to making scope act the same) so I
dropped it. It could be done though.

- ajax
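For context on "unless GLchan was defined to something bigger": the
channel type is a build-time choice in core Mesa. Roughly, as a
sketch of the idea rather than the verbatim header (type names as in
<GL/gl.h>):

    #if CHAN_BITS == 8
    typedef GLubyte GLchan;
    #define CHAN_MAX 255
    #elif CHAN_BITS == 16
    typedef GLushort GLchan;
    #define CHAN_MAX 65535
    #elif CHAN_BITS == 32
    typedef GLfloat GLchan;
    #define CHAN_MAX 1.0F
    #else
    #error "illegal number of color channel bits"
    #endif

Every color-touching path in the rasterizer is compiled against one
fixed GLchan, which is why supporting several channel sizes in a
single build is the hard part.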
Re: Proposed break in libGL / DRI driver ABI
Adam Jackson wrote:
> On Tuesday 05 April 2005 19:03, Ian Romanick wrote:
>> Adam Jackson wrote:
>>> I have another one: Hide all the functions that start with
>>> XF86DRI*, and expose them to the driver through a function table
>>> or glXGetProcAddress rather than by allowing the driver to call
>>> them directly. This will simplify the case where the X server is
>>> itself linked against libGL.
>>>
>>> Kevin tells me these functions were never intended to be public
>>> API anyway.
>>
>> The only functions that are still used by DRI_NEW_INTERFACE_ONLY
>> drivers are XF86DRICreateDrawable, XF86DRIDestroyDrawable, and
>> XF86DRIDestroyContext. It should be easy enough to eliminate
>> those, but something other than glXGetProcAddress might be
>> preferable.
>
> Yeah, I just threw out glXGetProcAddress as a suggestion. It's
> probably better to pass this table into the driver through the
> create context method. We can't eliminate the functionality of
> these calls (I don't think), but they should not be visible API
> from the perspective of the GL client.

Right. glXGetProcAddress() should not be used by libGL or the drivers
to get internal function pointers. There should be a new function for
that, if we're breaking the ABI.

-Brian
Re: Proposed break in libGL / DRI driver ABI
On Tuesday 05 April 2005 19:03, Ian Romanick wrote:
> Adam Jackson wrote:
> > I have another one: Hide all the functions that start with
> > XF86DRI*, and expose them to the driver through a function table
> > or glXGetProcAddress rather than by allowing the driver to call
> > them directly. This will simplify the case where the X server is
> > itself linked against libGL.
> >
> > Kevin tells me these functions were never intended to be public
> > API anyway.
>
> The only functions that are still used by DRI_NEW_INTERFACE_ONLY
> drivers are XF86DRICreateDrawable, XF86DRIDestroyDrawable, and
> XF86DRIDestroyContext. It should be easy enough to eliminate those,
> but something other than glXGetProcAddress might be preferable.

Yeah, I just threw out glXGetProcAddress as a suggestion. It's
probably better to pass this table into the driver through the create
context method. We can't eliminate the functionality of these calls
(I don't think), but they should not be visible API from the
perspective of the GL client.

- ajax
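The shape of that idea, sketched in C. All names here are
hypothetical, chosen for illustration; this is not the interface that
was eventually adopted:

    /* Table of libGL-internal callbacks covering the three remaining
     * XF86DRI* uses.  libGL fills this in; the driver never links
     * against the XF86DRI* symbols directly, so an X server linked
     * with libGL can supply its own table. */
    typedef struct __DRIcallbacksRec {
       int (*createDrawable)(void *dpy, int screen,
                             unsigned long drawable, void *hw_info);
       int (*destroyDrawable)(void *dpy, int screen,
                              unsigned long drawable);
       int (*destroyContext)(void *dpy, int screen,
                             unsigned long context_id);
    } __DRIcallbacks;

    /* The driver receives the table once, at context creation, and
     * stashes it in its private context struct: */
    void *driCreateContext(void *screen_private, const void *visual,
                           void *shared_private,
                           const __DRIcallbacks *callbacks);

Passing the table through the create-context method avoids a
string-based lookup entirely and keeps the symbols out of the
client-visible ABI, which is the point of the exercise.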
Re: Proposed break in libGL / DRI driver ABI
Nicolai Haehnle wrote:
> On Tuesday 05 April 2005 22:11, Brian Paul wrote:
>> If you increase MAX_WIDTH/HEIGHT too far, you'll start to see
>> interpolation errors in triangle rasterization (the software
>> routines). The full explanation is long, but basically there needs
>> to be enough fractional bits in the GLfixed datatype to accommodate
>> interpolation across the full viewport width/height.
>>
>> In fact, I'm not sure we haven't already gone too far by setting
>> MAX_WIDTH/HEIGHT to 4096 while the GLfixed type only has 11
>> fractional bits. I haven't heard any reports of bad triangles so
>> far though. But there probably aren't too many people generating
>> 4Kx4K images.
>>
>> Before increasing MAX_WIDTH/HEIGHT, someone should do an analysis
>> of the interpolation issues to see what side-effects might pop up.
>>
>> Finally, Mesa has a number of scratch arrays that get dimensioned
>> to [MAX_WIDTH]. Some of those arrays/structs are rather large
>> already.
>
> Slightly off-topic, but a thought that occurred to me in this regard
> was to tile rendering. Basically, do a logical divide of the
> framebuffer into rectangles of, say, 64x64 pixels. During
> rasterization, all primitives are split according to those tiles
> and rendered separately. This has some advantages:
>
> a) It could help reduce the interpolation issues you mentioned.
> It's obviously not a magic bullet, but it can avoid the need for
> insane precision in inner loops.
>
> b) Better control of the size of scratch structures, possibly even
> better caching behaviour.
>
> c) One could build a multi-threaded rasterizer (where work queues
> are per framebuffer tile), which is going to become all the more
> interesting once dual-core CPUs are widespread.

This would be FAR more work than simply addressing the interpolation
issue. There are lots of subtle conformance issues with the tiling
approach you suggest. Consider something simple like line stipples.

-Brian
Re: Proposed break in libGL / DRI driver ABI
Ian Romanick wrote:
> For X.org 6.9 / 7.0 I would like to break the existing libGL / DRI
> driver interface. There is a *LOT* of crap hanging around in both
> libGL and in the DRI drivers that exists *only* to maintain
> backwards compatibility with older versions of the interface. Since
> it's crap, I would very much like to flush it.
>
> I'd like to cut this stuff out for 7.0 for several main reasons:
>
> - A major release is a logical time to make breaks like this.
>
> - Bit rot. Sure, we /assume/ libGL and the DRI drivers still
> actually work with older versions, but how often does it actually
> get tested?
>
> - Code aesthetics. Because of the backwards compatibility mechanisms
> that are in place, especially in libGL, the code can be a bit hard
> to follow. Removing that code would, in a WAG estimate, eliminate at
> least a couple hundred lines of code. It would also eliminate a
> number of '#ifdef DRI_NEW_INTERFACE_ONLY' blocks.
>
> What I'm proposing goes a bit beyond '-DDRI_NEW_INTERFACE_ONLY=1',
> but that is a start. In include/GL/internal/dri_interface.h (in the
> Mesa tree) there are a number of methods that get converted to
> 'void *' if DRI_NEW_INTERFACE_ONLY is defined. I propose that we
> completely remove them from the structures and rename some of the
> remaining methods. For example, __DRIcontextRec::bindContext and
> __DRIcontextRec::bindContext2 would be removed, and
> __DRIcontextRec::bindContext3 would be renamed to
> __DRIcontextRec::bindContext.
>
> Additionally, there are a few libGL-private structures in
> src/glx/x11/glxclient.h that, due to binary compatibility issues
> with older versions of the interface, can't be changed. Eliminating
> support for those older interfaces would allow some significant
> cleaning in those structures. Basically, all of the stuff in
> glxclient.h with DEPRECATED in the name would be removed. Other,
> less important, changes could also be made to __GLXcontextRec.

Another item would be to look into what's required to support visuals
beyond 24-bit RGB (like 30-bit TrueColor visuals) ... someone on IRC
(AFAIK ajax (if I don't mix-up the nicks again :)) said that this may
require an ABI change, too...

When I look at xc/extras/Mesa/src/mesa/main/config.h I see more items
on my wishlist: Would it be possible to increase |MAX_WIDTH| and
|MAX_HEIGHT| (and the matching texture limits of the software
rasterizer) to 8192 to support larger displays (DMX, Xinerama and
Xprint come to mind)?

Bye,
Roland

--
[EMAIL PROTECTED]
MPEG specialist, C&&JAVA&&Sun&&Unix programmer
TEL +49 641 7950090
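A 30-bit TrueColor visual means 10 bits per color channel rather than
8, typically packed with 2 bits of padding into a 32-bit pixel. A
minimal illustration in C; the x2r10g10b10 layout is an assumption
made for the example, not a statement about any particular driver:

    #include <stdint.h>

    /* Pack three 10-bit channels (0..1023) into an x2r10g10b10
     * pixel; the top two bits are unused padding. */
    static uint32_t
    pack_x2r10g10b10(uint16_t r, uint16_t g, uint16_t b)
    {
       return ((uint32_t)(r & 0x3FF) << 20) |
              ((uint32_t)(g & 0x3FF) << 10) |
              ((uint32_t)(b & 0x3FF));
    }

The pixel is still 32 bits wide, which is why the question is whether
any of the interfaces actually bake in the 8-bits-per-channel
assumption rather than the pixel size.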
Re: Proposed break in libGL / DRI driver ABI
On Tuesday 05 April 2005 22:11, Brian Paul wrote:
> If you increase MAX_WIDTH/HEIGHT too far, you'll start to see
> interpolation errors in triangle rasterization (the software
> routines). The full explanation is long, but basically there needs
> to be enough fractional bits in the GLfixed datatype to accommodate
> interpolation across the full viewport width/height.
>
> In fact, I'm not sure we haven't already gone too far by setting
> MAX_WIDTH/HEIGHT to 4096 while the GLfixed type only has 11
> fractional bits. I haven't heard any reports of bad triangles so far
> though. But there probably aren't too many people generating 4Kx4K
> images.
>
> Before increasing MAX_WIDTH/HEIGHT, someone should do an analysis of
> the interpolation issues to see what side-effects might pop up.
>
> Finally, Mesa has a number of scratch arrays that get dimensioned to
> [MAX_WIDTH]. Some of those arrays/structs are rather large already.

Slightly off-topic, but a thought that occurred to me in this regard
was to tile rendering. Basically, do a logical divide of the
framebuffer into rectangles of, say, 64x64 pixels. During
rasterization, all primitives are split according to those tiles and
rendered separately. This has some advantages:

a) It could help reduce the interpolation issues you mentioned. It's
obviously not a magic bullet, but it can avoid the need for insane
precision in inner loops.

b) Better control of the size of scratch structures, possibly even
better caching behaviour.

c) One could build a multi-threaded rasterizer (where work queues are
per framebuffer tile), which is going to become all the more
interesting once dual-core CPUs are widespread.

cu,
Nicolai
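A sketch of the binning step such a tiler would need: compute each
triangle's screen bounding box and append it to the queue of every
64x64 tile the box overlaps, after which tiles can be rasterized
independently (one worker per tile queue). The Triangle and TileQueue
types and the queue function are hypothetical:

    #include <math.h>

    #define TILE_SIZE 64
    #define MIN2(a, b) ((a) < (b) ? (a) : (b))
    #define MAX2(a, b) ((a) > (b) ? (a) : (b))

    /* Hypothetical types, for illustration only. */
    typedef struct { float xmin, ymin, xmax, ymax; /* ... */ } Triangle;
    typedef struct TileQueue TileQueue;
    void tile_queue_push(TileQueue *q, const Triangle *tri);

    /* Assumes the triangle intersects the framebuffer. */
    static void
    bin_triangle(const Triangle *tri, TileQueue *queues,
                 int tiles_per_row, int fb_width, int fb_height)
    {
       /* Clamp the triangle's bounding box to the framebuffer. */
       const int x0 = MAX2(0, (int)floorf(tri->xmin));
       const int y0 = MAX2(0, (int)floorf(tri->ymin));
       const int x1 = MIN2(fb_width - 1, (int)ceilf(tri->xmax));
       const int y1 = MIN2(fb_height - 1, (int)ceilf(tri->ymax));
       int tx, ty;

       for (ty = y0 / TILE_SIZE; ty <= y1 / TILE_SIZE; ty++)
          for (tx = x0 / TILE_SIZE; tx <= x1 / TILE_SIZE; tx++)
             tile_queue_push(&queues[ty * tiles_per_row + tx], tri);
    }

Brian's line-stipple caveat elsewhere in the thread applies exactly
here: stipple position advances continuously along a line, so that
state has to be carried across tile boundaries, which is where the
simplicity of independent tiles ends.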
Re: Proposed break in libGL / DRI driver ABI
Adam Jackson wrote:
> I have another one: Hide all the functions that start with XF86DRI*,
> and expose them to the driver through a function table or
> glXGetProcAddress rather than by allowing the driver to call them
> directly. This will simplify the case where the X server is itself
> linked against libGL.
>
> Kevin tells me these functions were never intended to be public API
> anyway.

The only functions that are still used by DRI_NEW_INTERFACE_ONLY
drivers are XF86DRICreateDrawable, XF86DRIDestroyDrawable, and
XF86DRIDestroyContext. It should be easy enough to eliminate those,
but something other than glXGetProcAddress might be preferable.
Re: Proposed break in libGL / DRI driver ABI
On Tuesday 05 April 2005 14:06, Ian Romanick wrote:
> For X.org 6.9 / 7.0 I would like to break the existing libGL / DRI
> driver interface. There is a *LOT* of crap hanging around in both
> libGL and in the DRI drivers that exists *only* to maintain
> backwards compatibility with older versions of the interface. Since
> it's crap, I would very much like to flush it.
>
> I'd like to cut this stuff out for 7.0 for several main reasons:
>
> - A major release is a logical time to make breaks like this.
>
> - Bit rot. Sure, we /assume/ libGL and the DRI drivers still
> actually work with older versions, but how often does it actually
> get tested?
>
> - Code aesthetics. Because of the backwards compatibility mechanisms
> that are in place, especially in libGL, the code can be a bit hard
> to follow. Removing that code would, in a WAG estimate, eliminate at
> least a couple hundred lines of code. It would also eliminate a
> number of '#ifdef DRI_NEW_INTERFACE_ONLY' blocks.
>
> What I'm proposing goes a bit beyond '-DDRI_NEW_INTERFACE_ONLY=1',
> but that is a start. In include/GL/internal/dri_interface.h (in the
> Mesa tree) there are a number of methods that get converted to
> 'void *' if DRI_NEW_INTERFACE_ONLY is defined. I propose that we
> completely remove them from the structures and rename some of the
> remaining methods. For example, __DRIcontextRec::bindContext and
> __DRIcontextRec::bindContext2 would be removed, and
> __DRIcontextRec::bindContext3 would be renamed to
> __DRIcontextRec::bindContext.
>
> Additionally, there are a few libGL-private structures in
> src/glx/x11/glxclient.h that, due to binary compatibility issues
> with older versions of the interface, can't be changed. Eliminating
> support for those older interfaces would allow some significant
> cleaning in those structures. Basically, all of the stuff in
> glxclient.h with DEPRECATED in the name would be removed. Other,
> less important, changes could also be made to __GLXcontextRec.

I have another one: Hide all the functions that start with XF86DRI*,
and expose them to the driver through a function table or
glXGetProcAddress rather than by allowing the driver to call them
directly. This will simplify the case where the X server is itself
linked against libGL.

Kevin tells me these functions were never intended to be public API
anyway.

- ajax
Re: Proposed break in libGL / DRI driver ABI
Keith Whitwell wrote:
> Ian Romanick wrote:
>> For X.org 6.9 / 7.0 I would like to break the existing libGL / DRI
>> driver interface. There is a *LOT* of crap hanging around in both
>> libGL and in the DRI drivers that exists *only* to maintain
>> backwards compatibility with older versions of the interface.
>> Since it's crap, I would very much like to flush it.
>>
>> I'd like to cut this stuff out for 7.0 for several main reasons:
>>
>> - A major release is a logical time to make breaks like this.
>>
>> - Bit rot. Sure, we /assume/ libGL and the DRI drivers still
>> actually work with older versions, but how often does it actually
>> get tested?
>
> In fact, we know that they don't work, as backwards compatibility
> was broken in one of the recent 6.8.x releases, wasn't it?
>
> Given that is the case we might be able to take advantage of that
> and bring forward some of those changes - the old versions don't
> work anyway so there's absolutely no point keeping the code around
> for them...

The 6.8.x break was on the server-side *only*. I made some changes in
libglx that slightly broke the interface with the DDX. AFAIK, the
client-side interfaces /should/ still work. Like I said, though, I
don't know that it has been tested...
Re: Proposed break in libGL / DRI driver ABI
Roland Mainz wrote:
> Ian Romanick wrote:
>> For X.org 6.9 / 7.0 I would like to break the existing libGL / DRI
>> driver interface. There is a *LOT* of crap hanging around in both
>> libGL and in the DRI drivers that exists *only* to maintain
>> backwards compatibility with older versions of the interface.
>> Since it's crap, I would very much like to flush it.
>>
>> I'd like to cut this stuff out for 7.0 for several main reasons:
>>
>> - A major release is a logical time to make breaks like this.
>>
>> - Bit rot. Sure, we /assume/ libGL and the DRI drivers still
>> actually work with older versions, but how often does it actually
>> get tested?
>>
>> - Code aesthetics. Because of the backwards compatibility
>> mechanisms that are in place, especially in libGL, the code can be
>> a bit hard to follow. Removing that code would, in a WAG estimate,
>> eliminate at least a couple hundred lines of code. It would also
>> eliminate a number of '#ifdef DRI_NEW_INTERFACE_ONLY' blocks.
>>
>> What I'm proposing goes a bit beyond '-DDRI_NEW_INTERFACE_ONLY=1',
>> but that is a start. In include/GL/internal/dri_interface.h (in
>> the Mesa tree) there are a number of methods that get converted to
>> 'void *' if DRI_NEW_INTERFACE_ONLY is defined. I propose that we
>> completely remove them from the structures and rename some of the
>> remaining methods. For example, __DRIcontextRec::bindContext and
>> __DRIcontextRec::bindContext2 would be removed, and
>> __DRIcontextRec::bindContext3 would be renamed to
>> __DRIcontextRec::bindContext.
>>
>> Additionally, there are a few libGL-private structures in
>> src/glx/x11/glxclient.h that, due to binary compatibility issues
>> with older versions of the interface, can't be changed.
>> Eliminating support for those older interfaces would allow some
>> significant cleaning in those structures. Basically, all of the
>> stuff in glxclient.h with DEPRECATED in the name would be removed.
>> Other, less important, changes could also be made to
>> __GLXcontextRec.
>
> Another item would be to look into what's required to support
> visuals beyond 24-bit RGB (like 30-bit TrueColor visuals) ...
> someone on IRC (AFAIK ajax (if I don't mix-up the nicks again :))
> said that this may require an ABI change, too...

I doubt an ABI change would be needed for that.

> When I look at xc/extras/Mesa/src/mesa/main/config.h I see more
> items on my wishlist: Would it be possible to increase |MAX_WIDTH|
> and |MAX_HEIGHT| (and the matching texture limits of the software
> rasterizer) to 8192 to support larger displays (DMX, Xinerama and
> Xprint come to mind)?

If you increase MAX_WIDTH/HEIGHT too far, you'll start to see
interpolation errors in triangle rasterization (the software
routines). The full explanation is long, but basically there needs to
be enough fractional bits in the GLfixed datatype to accommodate
interpolation across the full viewport width/height.

In fact, I'm not sure we haven't already gone too far by setting
MAX_WIDTH/HEIGHT to 4096 while the GLfixed type only has 11
fractional bits. I haven't heard any reports of bad triangles so far
though. But there probably aren't too many people generating 4Kx4K
images.

Before increasing MAX_WIDTH/HEIGHT, someone should do an analysis of
the interpolation issues to see what side-effects might pop up.

Finally, Mesa has a number of scratch arrays that get dimensioned to
[MAX_WIDTH]. Some of those arrays/structs are rather large already.

-Brian
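A back-of-the-envelope check of that precision concern, assuming the
11 fractional bits Brian cites (so the smallest GLfixed increment is
1/2048):

    #include <stdio.h>

    int main(void)
    {
       const double step  = 1.0 / (1 << 11); /* smallest GLfixed increment */
       const int    width = 4096;            /* MAX_WIDTH */

       /* A per-pixel interpolant delta gets rounded to a multiple of
        * 'step', so it can be off by up to step/2; across the full
        * viewport that rounding error accumulates linearly. */
       printf("worst-case accumulated error: %f units\n",
              (step / 2.0) * width);  /* prints 1.000000 */
       return 0;
    }

In other words, an interpolated value can already drift by a full
unit (e.g., one 8-bit color step) edge-to-edge at 4096 pixels, and
raising MAX_WIDTH to 8192 would double that drift.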
Re: Proposed break in libGL / DRI driver ABI
Ian Romanick wrote:
> For X.org 6.9 / 7.0 I would like to break the existing libGL / DRI
> driver interface. There is a *LOT* of crap hanging around in both
> libGL and in the DRI drivers that exists *only* to maintain
> backwards compatibility with older versions of the interface. Since
> it's crap, I would very much like to flush it.
>
> I'd like to cut this stuff out for 7.0 for several main reasons:
>
> - A major release is a logical time to make breaks like this.
>
> - Bit rot. Sure, we /assume/ libGL and the DRI drivers still
> actually work with older versions, but how often does it actually
> get tested?

In fact, we know that they don't work, as backwards compatibility was
broken in one of the recent 6.8.x releases, wasn't it?

Given that is the case we might be able to take advantage of that and
bring forward some of those changes - the old versions don't work
anyway so there's absolutely no point keeping the code around for
them...

Keith
Proposed break in libGL / DRI driver ABI
For X.org 6.9 / 7.0 I would like to break the existing libGL / DRI
driver interface. There is a *LOT* of crap hanging around in both
libGL and in the DRI drivers that exists *only* to maintain backwards
compatibility with older versions of the interface. Since it's crap,
I would very much like to flush it.

I'd like to cut this stuff out for 7.0 for several main reasons:

- A major release is a logical time to make breaks like this.

- Bit rot. Sure, we /assume/ libGL and the DRI drivers still actually
work with older versions, but how often does it actually get tested?

- Code aesthetics. Because of the backwards compatibility mechanisms
that are in place, especially in libGL, the code can be a bit hard to
follow. Removing that code would, in a WAG estimate, eliminate at
least a couple hundred lines of code. It would also eliminate a
number of '#ifdef DRI_NEW_INTERFACE_ONLY' blocks.

What I'm proposing goes a bit beyond '-DDRI_NEW_INTERFACE_ONLY=1',
but that is a start. In include/GL/internal/dri_interface.h (in the
Mesa tree) there are a number of methods that get converted to
'void *' if DRI_NEW_INTERFACE_ONLY is defined. I propose that we
completely remove them from the structures and rename some of the
remaining methods. For example, __DRIcontextRec::bindContext and
__DRIcontextRec::bindContext2 would be removed, and
__DRIcontextRec::bindContext3 would be renamed to
__DRIcontextRec::bindContext.

Additionally, there are a few libGL-private structures in
src/glx/x11/glxclient.h that, due to binary compatibility issues with
older versions of the interface, can't be changed. Eliminating
support for those older interfaces would allow some significant
cleaning in those structures. Basically, all of the stuff in
glxclient.h with DEPRECATED in the name would be removed. Other, less
important, changes could also be made to __GLXcontextRec.
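The shape of the bindContext change Ian describes, as an abridged
illustration rather than the actual dri_interface.h contents; the
_before/_after struct tags and the (void) signatures are stand-ins:

    /* Before the break: deprecated slots linger only to keep the
     * struct layout compatible for old binaries; under
     * DRI_NEW_INTERFACE_ONLY they are already reduced to 'void *'. */
    struct __DRIcontextRec_before {
       void *bindContext;           /* deprecated */
       void *bindContext2;          /* deprecated */
       int (*bindContext3)(void);   /* current entry point;
                                       real arguments omitted */
       /* ... */
    };

    /* After the break: the old slots are removed outright and the
     * surviving method takes the canonical name. */
    struct __DRIcontextRec_after {
       int (*bindContext)(void);    /* was bindContext3 */
       /* ... */
    };

The same pattern would apply to the other methods that are currently
'void *'-ed out under DRI_NEW_INTERFACE_ONLY, and to the DEPRECATED
fields in glxclient.h.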