Ian Romanick wrote:
> This patch set hits 4 repositories.  Woo-hoo!
> 
> The old DRI2GetBuffers protocol is replaced with DRI2GetBuffersWithFormat.
> This protocol adds a per-buffer format value.  This is a magic value that is
> opaque outside the driver.  The Intel driver simply sends the bits-per-pixel.
> Drivers with other requirements to make the correct allocation can send other
> data in the field.
> 
> The new function also behaves somewhat differently.  DRI2GetBuffers would
> create the requested set of buffers and destroy all previously existing
> buffers.  DRI2GetBuffersWithFormat creates the requested set of buffers, but
> it only destroys existing buffers on attachments in the requested set.
> Further, it only allocates (or destroys) buffers if the size or format has
> changed.  This allows a client to initially request { DRI2BufferBackLeft,
> DRI2BufferDepth }, and later request { DRI2BufferBackLeft,
> DRI2BufferFrontLeft, DRI2BufferDepth } without ill effect.
> 
> Since the buffer allocation is done piece-wise, it is difficult to implement
> the combined depth / stencil buffers.  Taking a note from
> GL_ARB_framebuffer_object, I added a new DRI2BufferDepthStencil attachment.
> 
> I have tested the following combinations with the listed result:
> 
>  * New libGL, old 3D driver, old X server, old 2D driver -> works just as
>    previously
>  * New libGL, new 3D driver, old X server, old 2D driver -> works just as
>    previously
>  * New libGL, new 3D driver, new X server, old 2D driver -> DRI2 fails to
>    initialize, uses the software rasterizer
>  * New libGL, new 3D driver, old X server, new 2D driver -> DRI2 fails to
>    initialize, uses the software rasterizer
>  * New libGL, new 3D driver, new X server, new 2D driver -> Works the way
>    we really want it to!  Front-buffer rendering works, but the fake front-
>    buffer is only allocated when it's needed.  JUST LIKE MAGIC!
> 
> The combination that is not tested is the old libGL / 3D driver with
> everything else new.  This should work.  The idea is that if the 2D
> driver receives format=0 in CreateBuffer, it should make the same
> allocation that CreateBuffers would have made for the attachment.

With your patches, xf86-video-intel does not handle allocations with
format=0 that way: I needed an extra (format != 0) ? format :
pDraw->depth in xf86-video-intel to get kwin working with a Mesa forced
to use DRI2GetBuffers.
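
The extra check amounts to the following. This is a sketch only; pick_depth is a hypothetical helper, with draw_depth standing in for pDraw->depth (the full diff is attached below):

```c
#include <assert.h>

/* Sketch of the fallback needed in I830DRI2CreateBuffer: format == 0,
 * as sent by an old-protocol DRI2GetBuffers request, means "use the
 * drawable's depth".  pick_depth is a hypothetical helper; draw_depth
 * stands in for pDraw->depth. */
static unsigned
pick_depth(unsigned format, unsigned draw_depth)
{
    return (format != 0) ? format : draw_depth;
}
```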

Additionally, in Mesa's src/mesa/drivers/dri/intel/intel_tex_image.c,
intelSetTexBuffer2 accesses color_rb[0] (the front buffer) without
is_front_buffer_rendering being set to true, so RGBA visuals don't work
when doing OpenGL compositing.

Finally, I noticed that when forcing Mesa to use DRI2GetBuffers, the
front buffer is requested without considering is_front_buffer_rendering.
In that case I get the "kwin renders to the real front buffer" problem.
This also happens in the DRI2GetBuffersWithFormat path when using the
condition from the DRI2GetBuffers path.
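
The condition I would expect in both paths looks roughly like this. It is a sketch only; build_attachments and the enum values are hypothetical illustrations, not Mesa's actual code or the real DRI2 attachment tokens:

```c
#include <assert.h>

/* Hypothetical sketch: request the (fake) front buffer only when
 * front-buffer rendering is actually in use, instead of requesting it
 * unconditionally.  The enum values are illustrative placeholders. */
enum { ATT_FRONT_LEFT, ATT_BACK_LEFT, ATT_DEPTH };

/* Fills `atts` with the attachments to request for a double-buffered
 * drawable and returns how many were written. */
static int
build_attachments(int is_front_buffer_rendering, int *atts)
{
    int n = 0;
    if (is_front_buffer_rendering)
        atts[n++] = ATT_FRONT_LEFT;  /* only when really needed */
    atts[n++] = ATT_BACK_LEFT;
    atts[n++] = ATT_DEPTH;
    return n;
}
```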

Completely unrelated to this (and your patch set): I think
__glXDisp_WaitX in the X server should call glxc->drawPriv->waitX
instead of glxc->drawPriv->waitGL.

Thanks for making OpenGL compositing with kwin work again on i965. I
attached patches for all of these problems, except for the differing
handling of is_front_buffer_rendering in intel_update_renderbuffers.

Regards,
  Pierre
commit 180da48d70f1fbb5db2ed69cfb5dcb715d48633e
Author: Pierre Willenbrock <[email protected]>
Date:   Sat Apr 25 22:35:40 2009 +0200

    Set is_front_buffer_rendering to GL_TRUE before touching color_rb[0]

diff --git a/src/mesa/drivers/dri/intel/intel_context.c b/src/mesa/drivers/dri/intel/intel_context.c
index eb224a8..ebf7177 100644
--- a/src/mesa/drivers/dri/intel/intel_context.c
+++ b/src/mesa/drivers/dri/intel/intel_context.c
@@ -739,6 +739,8 @@ intelInitContext(struct intel_context *intel,
     */
    intel->no_hw = getenv("INTEL_NO_HW") != NULL;
 
+   intel->is_front_buffer_rendering = GL_FALSE;
+
    return GL_TRUE;
 }
 
diff --git a/src/mesa/drivers/dri/intel/intel_tex_image.c b/src/mesa/drivers/dri/intel/intel_tex_image.c
index 1f192da..f1ccaf2 100644
--- a/src/mesa/drivers/dri/intel/intel_tex_image.c
+++ b/src/mesa/drivers/dri/intel/intel_tex_image.c
@@ -747,6 +747,8 @@ intelSetTexBuffer2(__DRIcontext *pDRICtx, GLint target,
    if (!intelObj)
       return;
 
+   intel->is_front_buffer_rendering = GL_TRUE;
+
    intel_update_renderbuffers(pDRICtx, dPriv);
 
    rb = intel_fb->color_rb[0];
commit b7070a39203cdc51d58dbaf10e3c817f31c6d4f9
Author: Pierre Willenbrock <[email protected]>
Date:   Sat Apr 25 22:58:20 2009 +0200

    format == 0 means "use default" in I830DRI2CreateBuffer

diff --git a/src/i830_dri.c b/src/i830_dri.c
index 4ab09c9..70b76ae 100644
--- a/src/i830_dri.c
+++ b/src/i830_dri.c
@@ -1677,7 +1677,7 @@ I830DRI2CreateBuffer(DrawablePtr pDraw, unsigned int attachment,
 	pPixmap = (*pScreen->CreatePixmap)(pScreen,
 					   pDraw->width,
 					   pDraw->height,
-					   format,
+					   (format != 0)?format:pDraw->depth,
 					   hint);
 
     }
commit d59e04611f7c9edd355034273ebabb28e3db1c47
Author: Pierre Willenbrock <[email protected]>
Date:   Sun Apr 19 21:15:22 2009 +0200

    Fix obvious copypasta

diff --git a/glx/glxcmds.c b/glx/glxcmds.c
index 45221d9..90cf817 100644
--- a/glx/glxcmds.c
+++ b/glx/glxcmds.c
@@ -799,8 +799,8 @@ int __glXDisp_WaitX(__GLXclientState *cl, GLbyte *pc)
 	    return error;
     }
 
-    if (glxc && glxc->drawPriv->waitGL)
-	(*glxc->drawPriv->waitGL)(glxc->drawPriv);
+    if (glxc && glxc->drawPriv->waitX)
+	(*glxc->drawPriv->waitX)(glxc->drawPriv);
 
     return Success;
 }
_______________________________________________
xorg-devel mailing list
[email protected]
http://lists.x.org/mailman/listinfo/xorg-devel
