Ian Romanick wrote:
Felix Kühling wrote:
On Wed, 30 Jul 2003 09:20:28 -0700 Ian Romanick <[EMAIL PROTECTED]> wrote:
Felix Kühling wrote:
I see, in the C specification:

    const char *glXQueryExtensionsString(Display *dpy, int screen)
I don't mean what the GLX specification says to do. I mean what our code actually implements. Internally there is a *single* *global* table & extension string. So it's not even tracking it per-display. It's worse than that. :(
Yeah, I was just pointing out how extension tracking is specified.
Extensions should be tracked per-screen, but they are instead tracked per-display. This doesn't "matter" right now because we don't support the configuration that you describe (at least not as far as I know!). Each card would be its own display.
Maybe these configs don't work for one reason or another, but the configuration framework was designed with this in mind, and the code in dri_glx.c already handles the case of different drivers for different screens. I see two choices here: either glxextensions.c manages multiple screens itself, or the four bitfields (server/client support/only) are managed in __GLXscreenConfigsRec. In either case glXGetUsableExtensions would have to be told about the screen: a screen number in the first case, or a __GLXscreenConfigsRec pointer in the second.
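The second option could look roughly like the sketch below. The struct layout, field names, and the "usable" rule are illustrative assumptions for the sake of discussion, not the actual glxextensions.c code:

```c
#include <string.h>

/* Hypothetical per-screen extension state, modeling the idea of moving
 * the four global bitfields from glxextensions.c into
 * __GLXscreenConfigsRec.  Names and sizes are made up. */
#define GLX_EXT_BYTES 8   /* room for 64 extension bits */

struct glx_screen_configs {
    unsigned char server_support[GLX_EXT_BYTES];
    unsigned char client_support[GLX_EXT_BYTES];
    unsigned char client_only[GLX_EXT_BYTES];
    unsigned char server_only[GLX_EXT_BYTES];
};

/* Set one extension bit in a per-screen bitfield. */
static void set_bit(unsigned char *field, unsigned bit)
{
    field[bit / 8] |= (unsigned char)(1u << (bit % 8));
}

static int test_bit(const unsigned char *field, unsigned bit)
{
    return (field[bit / 8] >> (bit % 8)) & 1;
}

/* A glXGetUsableExtensions-style check, now told which screen it is
 * operating on: an extension is usable when both sides support it, or
 * when it is a client-only extension. */
static int extension_usable(const struct glx_screen_configs *psc,
                            unsigned bit)
{
    return (test_bit(psc->client_support, bit) &&
            test_bit(psc->server_support, bit)) ||
           test_bit(psc->client_only, bit);
}
```

With this shape, glXQueryExtensionsString would simply pass its screen's __GLXscreenConfigsRec pointer down instead of consulting a global table.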
Since glXGetUsableExtensions is only called from glXQueryExtensionsString (glxcmds.c, line 1416), that should be an easy change to make.
It gets more complicated with __glXEnableExtension. If it has to access per-screen extension information, it would need some sort of screen parameter too. Since it's called by the driver, this is a binary compatibility problem. Furthermore, it is called from __driRegisterExtensions, which doesn't know the screen itself.
It is a binary compatibility problem, but a minor one. Since no code with __glXEnableExtension has ever shipped with XFree86 (stable release or their CVS), our exposure is pretty low. Low enough that I wouldn't worry about it much. There is a pre-texmem code path that was used by the R200 driver that needs to be maintained. I'm not sure how to keep that working.
The quick and dirty solution would be a global screen pointer that indicates the screen currently being configured.
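That quick-and-dirty approach might look something like this; every name here is illustrative, and the real __glXEnableExtension entry point is only mimicked so the drivers' existing call signature could stay unchanged:

```c
/* Sketch of a file-scope "screen currently being configured" pointer
 * in glxextensions.c.  The wrapper functions and struct are invented
 * for illustration. */
struct screen_ext_state {
    unsigned long enabled_bits;
};

static struct screen_ext_state *current_screen = 0;

/* libGL would bracket each driver's initialization with these. */
void begin_screen_configuration(struct screen_ext_state *psc)
{
    current_screen = psc;
}

void end_screen_configuration(void)
{
    current_screen = 0;
}

/* Stand-in for __glXEnableExtension: same no-screen-parameter shape
 * the drivers already link against.  Returns 0 if called outside
 * screen setup, 1 on success. */
int glx_enable_extension_sketch(unsigned bit)
{
    if (current_screen == 0)
        return 0;
    current_screen->enabled_bits |= 1ul << bit;
    return 1;
}
```

The obvious downside is that it only works while the library initializes one screen at a time, which is why it is quick and dirty.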
A more invasive but more elegant solution is this:
I observed that glXQueryExtensionsString calls glXInitialize first, which in turn loads and initializes the DRI drivers (calls their createScreen functions). Thus, before an extension string is returned, all drivers are initialized. So why not register extensions in the driver's createScreen function? The only reason I can see is the call to glXRegisterExtensions in glXGetProcAddress. Is there a good reason for not calling glXInitialize in glXGetProcAddress instead?
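To make the proposal concrete, here is a minimal model of that ordering, with all names invented for illustration: each driver's createScreen hook declares its extensions, so by the time the initialization loop returns, every screen's set is known and the extension string can be built:

```c
/* Sketch of registering extensions from createScreen instead of a
 * separate __driRegisterExtensions pass.  All types and names here
 * are hypothetical. */
struct driver_screen {
    int screen_num;
    unsigned long extensions;   /* bitmask of enabled extensions */
};

typedef void (*create_screen_fn)(struct driver_screen *psc);

/* An example driver createScreen: besides per-screen setup, it
 * declares which extensions this screen's hardware supports. */
static void example_create_screen(struct driver_screen *psc)
{
    psc->extensions |= 1ul << 0;   /* e.g. a swap-control bit */
}

/* A glXInitialize-like loop: once this returns, every screen's
 * extension set is already populated, before any query string is
 * handed to the application. */
void initialize_all_screens(struct driver_screen *screens, int n,
                            create_screen_fn create)
{
    for (int i = 0; i < n; i++) {
        screens[i].screen_num = i;
        screens[i].extensions = 0;
        create(&screens[i]);
    }
}
```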
That's a really good idea. I think that solves most of the problems. Keith, do you have a problem with that change?
Not off the top of my head.
It's worth asking Brian about this, as he's had greater involvement in those paths than I.
And Ian's made a lot of changes since I've worked in that code. I'm not fully up to speed on it anymore.
You can't call __glXInitialize from within glXGetProcAddress because you don't have a Display pointer.
Earlier, Felix wrote:
> Do the __driRegisterExtensions functions in the drivers rely on
> being called during initialisation?
Yes.
The driver's __driRegisterExtensions() function can do two things:
1. Add new gl*() functions to the dispatch table. For example, if libGL doesn't know anything about the GL_ARB_vertex_buffer_object extension but the driver really does implement it, the driver can plug the glBindBufferARB(), etc. functions into the dispatch table so the app can use that extension.
2. The driver can register/enable new glX*() functions with libGL.
In either case, this has to be done before the user gets the results of glXGetProcAddressARB() or glXQueryExtensionsString().
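The first mechanism Brian describes can be modeled roughly as below. This is only a toy sketch of the idea, not the real _glapi dispatch code; the table, names, and lookup function are all invented:

```c
#include <string.h>

/* Toy model of a dispatch table: a driver that implements an
 * extension libGL doesn't know about installs a named entry point,
 * and the glXGetProcAddressARB-style lookup finds it afterwards. */
typedef void (*generic_func)(void);

#define MAX_SLOTS 16

static generic_func dispatch_table[MAX_SLOTS];
static const char  *slot_names[MAX_SLOTS];
static int          num_slots = 0;

/* What a driver's __driRegisterExtensions-style hook would do:
 * register a named function.  Returns the slot index, or -1 if the
 * table is full. */
int register_entry_point(const char *name, generic_func fn)
{
    if (num_slots >= MAX_SLOTS)
        return -1;
    slot_names[num_slots] = name;
    dispatch_table[num_slots] = fn;
    return num_slots++;
}

/* What the glXGetProcAddressARB-style query would consult, after the
 * drivers have had their chance to register. */
generic_func lookup_entry_point(const char *name)
{
    for (int i = 0; i < num_slots; i++)
        if (strcmp(slot_names[i], name) == 0)
            return dispatch_table[i];
    return 0;
}
```

This is exactly why the ordering matters: a lookup performed before the driver registers its functions returns nothing, which mirrors Brian's point that registration must happen before the app gets its answers.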
Earlier, Felix wrote, and Ian followed up with:
Consequently, __glXDisableExtension should never be called (or better, should not even exist), and the only way to disable an extension is to not enable it. Thus, if you don't want to enable the swap-interval extensions when the hardware can't support them (no IRQs), then you have to know whether IRQs work at the time __driRegisterExtensions is called. Is that possible?
Now there's an interesting point. The bigger problem is that the driver might not have a chance to call __glXDisableExtension until *after* the app has called glXQueryExtensionsString. At that point the extension string cannot be changed. I'm not sure what the right answer is here.
I don't know the answer to this either.
-Brian
_______________________________________________ Dri-devel mailing list [EMAIL PROTECTED] https://lists.sourceforge.net/lists/listinfo/dri-devel