Re: [Mesa-dev] [PATCH 1/3] meta: Implement sensible behavior for BlitFramebuffer with sRGB.

2012-08-01 Thread Kenneth Graunke
On 07/31/2012 06:04 PM, Eric Anholt wrote:
 Prior to the GL 4.2 spec, there was no guidance on how to implement
 BlitFramebuffer for sRGB.  Mesa chose to implement skipping decode and encode,
 providing a reasonable behavior for sRGB -> sRGB or RGB -> RGB, but providing
 something absurd for mixing and matching the two.
 
 In GL 4.2, some text appeared in the spec saying to skip decode (search for
 "no linearization").  The only non-absurd interpretation of that would be to
 have no encode (same as Mesa's current implementation), otherwise sRGB -> sRGB
 blits wouldn't work.
 
 However, it seems clear that you should be able to blit sRGB to RGB and RGB to
 sRGB and get appropriate conversion.  The ARB has been discussing the issue,
 and appears to agree.  So, instead implement the same behavior as gallium, and
 always do the decode if the texture is sRGB, and do the encode if the
 application asked for it.
 
 Breaks piglit fbo-srgb-blit, which was expecting our previous no-conversion
 behavior.
 ---
  src/mesa/drivers/common/meta.c |   14 +++++---------
  1 file changed, 5 insertions(+), 9 deletions(-)
 
 diff --git a/src/mesa/drivers/common/meta.c b/src/mesa/drivers/common/meta.c
 index 6846bbc..e35f0c8 100644
 --- a/src/mesa/drivers/common/meta.c
 +++ b/src/mesa/drivers/common/meta.c
 @@ -1357,7 +1357,6 @@ blitframebuffer_texture(struct gl_context *ctx,
   const GLenum wrapSSave = texObj->Sampler.WrapS;
   const GLenum wrapTSave = texObj->Sampler.WrapT;
   const GLenum srgbSave = texObj->Sampler.sRGBDecode;
 - const GLenum fbo_srgb_save = ctx->Color.sRGBEnabled;
   const GLenum target = texObj->Target;
 
   if (drawAtt->Texture == readAtt->Texture) {
 @@ -1390,14 +1389,14 @@ blitframebuffer_texture(struct gl_context *ctx,
   _mesa_TexParameteri(target, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
   _mesa_TexParameteri(target, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
  
 -  /* Always do our blits with no sRGB decode or encode.*/
 +  /* Always do sRGB decode on the read, and do sRGB encode on the write
 +  * if we've been asked to on this framebuffer by leaving
 +  * GL_FRAMEBUFFER_SRGB in place.
 +  */
if (ctx->Extensions.EXT_texture_sRGB_decode) {
   _mesa_TexParameteri(target, GL_TEXTURE_SRGB_DECODE_EXT,
 - GL_SKIP_DECODE_EXT);
 + GL_DECODE_EXT);
}

Wait, won't this break sRGB -> sRGB blits?

I guess I'd expect that:

1. Blitting from an RGBA source to an RGBA destination should be an exact
copy: no decoding/encoding should occur, regardless of any enable bits.

2. Blitting from an sRGBA source to an sRGBA destination should also be
an exact copy---both buffers should then contain the same pixel values.

These seem pretty clear, and also match Mesa's current behavior.

With your patch, if an application had set GL_FRAMEBUFFER_SRGB_EXT to
GL_TRUE, then I believe case 2 would result in the two buffers having
different values: non-decoded source values would get encoded.

One could argue that such an application is being stupid, but honestly
as confusing as these specs are, I don't know what stupid is anymore.

I feel like blits should follow these rules for mixed formats:

3. Blitting from an sRGBA source to an RGBA destination should
optionally decode the values based on whether the application has set
GL_TEXTURE_SRGB_DECODE_EXT to GL_DECODE_EXT or GL_SKIP_DECODE_EXT.  The
other knob, GL_FRAMEBUFFER_SRGB_EXT, should be ignored since it controls
writes and the destination is an ordinary format (RGBA).

4. Blitting from an RGBA source to an sRGBA destination should
optionally encode the values based on whether the application has set
GL_FRAMEBUFFER_SRGB_EXT to true or false.  The other knob,
GL_TEXTURE_SRGB_DECODE_EXT, should be ignored since it controls
reads/texturing and the source is not an sRGB format.

These four rules allow you to copy sRGB -> sRGB and RGB -> RGB unmodified,
while also allowing you to decode sRGB values into a linear buffer, or
encode linear values into an sRGB buffer.  That seems like it covers all
the cases anyone would want to do.
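
Here's a rough, untested sketch of the decision logic those four rules
imply (the src_is_srgb / dst_is_srgb booleans are placeholders for
whatever format checks meta.c would actually do, not real Mesa state;
this is just to illustrate the rules, not a proposed patch):

#include <stdbool.h>
#include <GL/gl.h>
#include <GL/glext.h>

/* Rules 1-4 above: matching formats copy exactly; sRGB -> linear is
 * controlled by the texture decode knob; linear -> sRGB is controlled
 * by GL_FRAMEBUFFER_SRGB.
 */
static void
choose_srgb_conversion(bool src_is_srgb, bool dst_is_srgb,
                       GLenum texture_srgb_decode,    /* GL_DECODE_EXT / GL_SKIP_DECODE_EXT */
                       bool framebuffer_srgb_enabled, /* GL_FRAMEBUFFER_SRGB_EXT enable */
                       bool *do_decode, bool *do_encode)
{
   if (src_is_srgb == dst_is_srgb) {
      /* Rules 1 & 2: exact copy, no conversion either way. */
      *do_decode = false;
      *do_encode = false;
   } else if (src_is_srgb) {
      /* Rule 3: sRGB -> linear, honor GL_TEXTURE_SRGB_DECODE_EXT only. */
      *do_decode = (texture_srgb_decode == GL_DECODE_EXT);
      *do_encode = false;
   } else {
      /* Rule 4: linear -> sRGB, honor GL_FRAMEBUFFER_SRGB_EXT only. */
      *do_decode = false;
      *do_encode = framebuffer_srgb_enabled;
   }
}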

 - if (ctx->Extensions.EXT_framebuffer_sRGB) {
 -_mesa_set_enable(ctx, GL_FRAMEBUFFER_SRGB_EXT, GL_FALSE);
 - }
  
   _mesa_TexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
   _mesa_set_enable(ctx, target, GL_TRUE);
 @@ -1463,9 +1462,6 @@ blitframebuffer_texture(struct gl_context *ctx,
if (ctx->Extensions.EXT_texture_sRGB_decode) {
   _mesa_TexParameteri(target, GL_TEXTURE_SRGB_DECODE_EXT, srgbSave);
}
 -  if (ctx->Extensions.EXT_framebuffer_sRGB && fbo_srgb_save) {
 - _mesa_set_enable(ctx, GL_FRAMEBUFFER_SRGB_EXT, GL_TRUE);
 -  }
 
   /* Done with color buffer */
   mask &= ~GL_COLOR_BUFFER_BIT;
 

Regardless of what we decide to do here, I also wonder if any updates
are necessary to the i965 BLT or blorp 

Re: [Mesa-dev] [PATCH 1/3] meta: Implement sensible behavior for BlitFramebuffer with sRGB.

2012-08-01 Thread Christoph Bumiller
On 01.08.2012 07:52, Kenneth Graunke wrote:
 On 07/31/2012 06:04 PM, Eric Anholt wrote:
 Prior to the GL 4.2 spec, there was no guidance on how to implement
 BlitFramebuffer for sRGB.  Mesa chose to implement skipping decode and encode,
 providing a reasonable behavior for sRGB -> sRGB or RGB -> RGB, but providing
 something absurd for mixing and matching the two.

 In GL 4.2, some text appeared in the spec saying to skip decode (search for
 "no linearization").  The only non-absurd interpretation of that would be to
 have no encode (same as Mesa's current implementation), otherwise sRGB -> sRGB
 blits wouldn't work.

 However, it seems clear that you should be able to blit sRGB to RGB and RGB to
 sRGB and get appropriate conversion.  The ARB has been discussing the issue,
 and appears to agree.  So, instead implement the same behavior as gallium, and
 always do the decode if the texture is sRGB, and do the encode if the
 application asked for it.


I was just about to revert that patch (gallium: specify resource_resolve
destination via a pipe_surface), because it does not match the behaviour
of the NVIDIA binary driver, and I discovered that I accidentally had
sRGB decode disabled (due to a previously unidentified, unset bit in the
sampler state that I didn't see the blob set for resolves), so disabling
sRGB encode (which is what the app had set) counteracted that and made
rendering look correct.
If I fix the decode setting AND honor the write setting, I get wrong
results again.

That is, an sRGB -> sRGB blit should never change the values, regardless
of the setting of GL_FRAMEBUFFER_SRGB.

EXT_framebuffer_sRGB suggests that only drawing (i.e. where you have
blending) should be affected:

Should sRGB framebuffer support affect the pixel path?

RESOLVED:  No.

sRGB conversion only applies to color reads for blending and
color writes.  Color reads for glReadPixels, glCopyPixels,
or glAccum have no sRGB conversion applied.

 Breaks piglit fbo-srgb-blit, which was expecting our previous no-conversion
 behavior.
 ---
  src/mesa/drivers/common/meta.c |   14 +++++---------
  1 file changed, 5 insertions(+), 9 deletions(-)

 diff --git a/src/mesa/drivers/common/meta.c b/src/mesa/drivers/common/meta.c
 index 6846bbc..e35f0c8 100644
 --- a/src/mesa/drivers/common/meta.c
 +++ b/src/mesa/drivers/common/meta.c
 @@ -1357,7 +1357,6 @@ blitframebuffer_texture(struct gl_context *ctx,
   const GLenum wrapSSave = texObj->Sampler.WrapS;
   const GLenum wrapTSave = texObj->Sampler.WrapT;
   const GLenum srgbSave = texObj->Sampler.sRGBDecode;
 - const GLenum fbo_srgb_save = ctx->Color.sRGBEnabled;
   const GLenum target = texObj->Target;
 
   if (drawAtt->Texture == readAtt->Texture) {
 @@ -1390,14 +1389,14 @@ blitframebuffer_texture(struct gl_context *ctx,
   _mesa_TexParameteri(target, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
   _mesa_TexParameteri(target, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
  
 - /* Always do our blits with no sRGB decode or encode.*/
 + /* Always do sRGB decode on the read, and do sRGB encode on the write
 +  * if we've been asked to on this framebuffer by leaving
 +  * GL_FRAMEBUFFER_SRGB in place.
 +  */
   if (ctx->Extensions.EXT_texture_sRGB_decode) {
  _mesa_TexParameteri(target, GL_TEXTURE_SRGB_DECODE_EXT,
 -GL_SKIP_DECODE_EXT);
 +GL_DECODE_EXT);
   }
 
 Wait, won't this break sRGB -> sRGB blits?
 
 I guess I'd expect that:
 
 1. Blitting from an RGBA source to an RGBA destination should be an exact
 copy: no decoding/encoding should occur, regardless of any enable bits.
 
 2. Blitting from an sRGBA source to an sRGBA destination should also be
 an exact copy---both buffers should then contain the same pixel values.
 
 These seem pretty clear, and also match Mesa's current behavior.
 
 With your patch, if an application had set GL_FRAMEBUFFER_SRGB_EXT to
 GL_TRUE, then I believe case 2 would result in the two buffers having
 different values: non-decoded source values would get encoded.
 
 One could argue that such an application is being stupid, but honestly
 as confusing as these specs are, I don't know what stupid is anymore.
 
 I feel like blits should follow these rules for mixed formats:
 
 3. Blitting from an sRGBA source to an RGBA destination should
 optionally decode the values based on whether the application has set
 GL_TEXTURE_SRGB_DECODE_EXT to GL_DECODE_EXT or GL_SKIP_DECODE_EXT.  The
 other knob, GL_FRAMEBUFFER_SRGB_EXT, should be ignored since it controls
 writes and the destination is an ordinary format (RGBA).
 
 4. Blitting from an RGBA source to an sRGBA destination should
 optionally encode the values based on whether the application has set
 GL_FRAMEBUFFER_SRGB_EXT to true or false.  The other knob,
 GL_TEXTURE_SRGB_DECODE_EXT, should be ignored since it controls
 reads/texturing and the source is not an 

Re: [Mesa-dev] [PATCH 1/3] meta: Implement sensible behavior for BlitFramebuffer with sRGB.

2012-08-01 Thread Christoph Bumiller
On 01.08.2012 11:59, Christoph Bumiller wrote:
 On 01.08.2012 07:52, Kenneth Graunke wrote:
 On 07/31/2012 06:04 PM, Eric Anholt wrote:
 Prior to the GL 4.2 spec, there was no guidance on how to implement
 BlitFramebuffer for sRGB.  Mesa chose to implement skipping decode and encode,
 providing a reasonable behavior for sRGB -> sRGB or RGB -> RGB, but providing
 something absurd for mixing and matching the two.

 In GL 4.2, some text appeared in the spec saying to skip decode (search for
 "no linearization").  The only non-absurd interpretation of that would be to
 have no encode (same as Mesa's current implementation), otherwise sRGB -> sRGB
 blits wouldn't work.

 However, it seems clear that you should be able to blit sRGB to RGB and RGB to
 sRGB and get appropriate conversion.  The ARB has been discussing the issue,
 and appears to agree.  So, instead implement the same behavior as gallium, and
 always do the decode if the texture is sRGB, and do the encode if the
 application asked for it.

 
 I was just about to revert that patch (gallium: specify resource_resolve
 destination via a pipe_surface), because it does not match the behaviour
 of the NVIDIA binary driver, and I discovered that I accidentally had
 sRGB decode disabled (due to a previously unidentified, unset bit in the
 sampler state that I didn't see the blob set for resolves), so disabling
 sRGB encode (which is what the app had set) counteracted that and made
 rendering look correct.
 If I fix the decode setting AND honor the write setting, I get wrong
 results again.

However, reading that part of the 4.2 spec (let me quote here):

When values are taken from the read buffer, no linearization is
performed even if the format of the buffer is SRGB.
When values are written to the draw buffers, blit operations bypass most
of the fragment pipeline. The only fragment operations which affect a
blit are the pixel ownership test, the scissor test, and sRGB conversion
(see section 4.1.8).

It looks like I accidentally did the *right* thing, too - never do
linearization == no sRGB decode on read. This seems a little odd though:
blits from sRGB to sRGB with FRAMEBUFFER_SRGB enabled would change the
values, which takes some getting used to ...
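
For example (my numbers, just applying the sRGB transfer function from
the EXT_framebuffer_sRGB spec): a stored, already-encoded value of 0.5
read with no linearization and then encoded again on write comes out as
roughly 0.735, so the destination no longer matches the source:

#include <math.h>
#include <stdio.h>

/* Linear -> sRGB encode, as given in the EXT_framebuffer_sRGB spec. */
static float
srgb_encode(float cl)
{
   if (cl < 0.0031308f)
      return 12.92f * cl;
   return 1.055f * powf(cl, 0.41666f) - 0.055f;
}

int
main(void)
{
   float stored = 0.5f;                  /* already sRGB-encoded in the source */
   printf("%f\n", srgb_encode(stored));  /* prints ~0.735, not 0.5 */
   return 0;
}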

This would also require another change to the gallium interface, making
the source never be an sRGB format, or at least stating that it is
supposed to be treated as linear.

 
 That is, an sRGB -> sRGB blit should never change the values, regardless
 of the setting of GL_FRAMEBUFFER_SRGB.
 
 EXT_framebuffer_sRGB suggests that only drawing (i.e. where you have
 blending) should be affected:
 
 Should sRGB framebuffer support affect the pixel path?
 
 RESOLVED:  No.
 
 sRGB conversion only applies to color reads for blending and
 color writes.  Color reads for glReadPixels, glCopyPixels,
 or glAccum have no sRGB conversion applied.
 
 Breaks piglit fbo-srgb-blit, which was expecting our previous no-conversion
 behavior.
 ---
  src/mesa/drivers/common/meta.c |   14 +++++---------
  1 file changed, 5 insertions(+), 9 deletions(-)

 diff --git a/src/mesa/drivers/common/meta.c b/src/mesa/drivers/common/meta.c
 index 6846bbc..e35f0c8 100644
 --- a/src/mesa/drivers/common/meta.c
 +++ b/src/mesa/drivers/common/meta.c
 @@ -1357,7 +1357,6 @@ blitframebuffer_texture(struct gl_context *ctx,
   const GLenum wrapSSave = texObj->Sampler.WrapS;
   const GLenum wrapTSave = texObj->Sampler.WrapT;
   const GLenum srgbSave = texObj->Sampler.sRGBDecode;
 - const GLenum fbo_srgb_save = ctx->Color.sRGBEnabled;
   const GLenum target = texObj->Target;
 
   if (drawAtt->Texture == readAtt->Texture) {
 @@ -1390,14 +1389,14 @@ blitframebuffer_texture(struct gl_context *ctx,
   _mesa_TexParameteri(target, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
   _mesa_TexParameteri(target, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
  
 -/* Always do our blits with no sRGB decode or encode.*/
 +/* Always do sRGB decode on the read, and do sRGB encode on the write
 +  * if we've been asked to on this framebuffer by leaving
 +  * GL_FRAMEBUFFER_SRGB in place.
 +  */
  if (ctx->Extensions.EXT_texture_sRGB_decode) {
 _mesa_TexParameteri(target, GL_TEXTURE_SRGB_DECODE_EXT,
 -   GL_SKIP_DECODE_EXT);
 +   GL_DECODE_EXT);
  }

 Wait, won't this break sRGB -> sRGB blits?

 I guess I'd expect that:

 1. Blitting from an RGBA source to an RGBA destination should be an exact
 copy: no decoding/encoding should occur, regardless of any enable bits.

 2. Blitting from an sRGBA source to an sRGBA destination should also be
 an exact copy---both buffers should then contain the same pixel values.

 These seem pretty clear, and also match Mesa's current behavior.

 With your patch, if an application had set GL_FRAMEBUFFER_SRGB_EXT to
 GL_TRUE, then I believe case 2 would result in the two buffers having
 different 

Re: [Mesa-dev] [PATCH 1/3] meta: Implement sensible behavior for BlitFramebuffer with sRGB.

2012-08-01 Thread Henri Verbeet
On 1 August 2012 03:04, Eric Anholt e...@anholt.net wrote:
 Prior to the GL 4.2 spec, there was no guidance on how to implement
 BlitFramebuffer for sRGB.  Mesa chose to implement skipping decode and encode,
 providing a reasonable behavior for sRGB -> sRGB or RGB -> RGB, but providing
 something absurd for mixing and matching the two.

 In GL 4.2, some text appeared in the spec saying to skip decode (search for
 "no linearization").  The only non-absurd interpretation of that would be to
 have no encode (same as Mesa's current implementation), otherwise sRGB -> sRGB
 blits wouldn't work.

 However, it seems clear that you should be able to blit sRGB to RGB and RGB to
 sRGB and get appropriate conversion.  The ARB has been discussing the issue,
 and appears to agree.  So, instead implement the same behavior as gallium, and
 always do the decode if the texture is sRGB, and do the encode if the
 application asked for it.

 Breaks piglit fbo-srgb-blit, which was expecting our previous no-conversion
 behavior.
 ---
Issue 12 in EXT_texture_sRGB requires no conversion on FBO blits,
because they're mostly specified in terms of CopyPixels.
EXT_framebuffer_sRGB has similar language in issue 8.


Re: [Mesa-dev] [PATCH 1/3] meta: Implement sensible behavior for BlitFramebuffer with sRGB.

2012-08-01 Thread Ian Romanick

On 08/01/2012 03:16 AM, Christoph Bumiller wrote:

On 01.08.2012 11:59, Christoph Bumiller wrote:

On 01.08.2012 07:52, Kenneth Graunke wrote:

On 07/31/2012 06:04 PM, Eric Anholt wrote:

Prior to the GL 4.2 spec, there was no guidance on how to implement
BlitFramebuffer for sRGB.  Mesa chose to implement skipping decode and encode,
providing a reasonable behavior for sRGB -> sRGB or RGB -> RGB, but providing
something absurd for mixing and matching the two.

In GL 4.2, some text appeared in the spec saying to skip decode (search for
"no linearization").  The only non-absurd interpretation of that would be to
have no encode (same as Mesa's current implementation), otherwise sRGB -> sRGB
blits wouldn't work.

However, it seems clear that you should be able to blit sRGB to RGB and RGB to
sRGB and get appropriate conversion.  The ARB has been discussing the issue,
and appears to agree.  So, instead implement the same behavior as gallium, and
always do the decode if the texture is sRGB, and do the encode if the
application asked for it.



I was just about to revert that patch (gallium: specify resource_resolve
destination via a pipe_surface), because it does not match the behaviour
of the NVIDIA binary driver, and I discovered that I accidentally had
sRGB decode disabled (due to a previously unidentified, unset bit in the
sampler state that I didn't see the blob set for resolves), so disabling
sRGB encode (which is what the app had set) counteracted that and made
rendering look correct.
If I fix the decode setting AND honor the write setting, I get wrong
results again.


However, reading that part of the 4.2 spec (let me quote here):

When values are taken from the read buffer, no linearization is
performed even if the format of the buffer is SRGB.
When values are written to the draw buffers, blit operations bypass most
of the fragment pipeline. The only fragment operations which affect a
blit are the pixel ownership test, the scissor test, and sRGB conversion
(see section 4.1.8).


There was discussion about this in Khronos recently.  The OpenGL 4.2
language is *broken*, and it will be changed in a future GL version.
Eric's patch changes Mesa to the behavior that will be specified... which is,
coincidentally, the only sane thing to do:  always linearize on read,
respect application settings on write.
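
In code terms (my paraphrase of that rule, with placeholder names rather
than real Mesa state), the policy boils down to something like:

#include <stdbool.h>

/* Sketch of "always linearize on read, respect application settings on
 * write"; src_is_srgb, dst_is_srgb and framebuffer_srgb_enabled stand in
 * for the real format/state queries.
 */
static void
srgb_blit_policy(bool src_is_srgb, bool dst_is_srgb,
                 bool framebuffer_srgb_enabled,
                 bool *do_decode, bool *do_encode)
{
   *do_decode = src_is_srgb;
   *do_encode = dst_is_srgb && framebuffer_srgb_enabled;
}

Note that with this rule an sRGB -> sRGB blit with GL_FRAMEBUFFER_SRGB
enabled is still value-preserving (up to rounding), since the decode on
read and the encode on write round-trip.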



It looks like I accidentally did the *right* thing, too - never do
linearization == no sRGB decode on read. This seems a little odd though,
blits from sRGB to sRGB with FRAMEBUFFER_SRGB enabled would change the
values, which takes some getting used to ...

This would also require another change to the gallium interface, making
the source never be an sRGB format, or at least stating that it is
supposed to be treated as linear.


That is, an sRGB -> sRGB blit should never change the values, regardless
of the setting of GL_FRAMEBUFFER_SRGB.

EXT_framebuffer_sRGB suggests that only drawing (i.e. where you have
blending) should be affected:

Should sRGB framebuffer support affect the pixel path?

 RESOLVED:  No.

 sRGB conversion only applies to color reads for blending and
 color writes.  Color reads for glReadPixels, glCopyPixels,
 or glAccum have no sRGB conversion applied.


Breaks piglit fbo-srgb-blit, which was expecting our previous no-conversion
behavior.
---
  src/mesa/drivers/common/meta.c |   14 +++++---------
  1 file changed, 5 insertions(+), 9 deletions(-)

diff --git a/src/mesa/drivers/common/meta.c b/src/mesa/drivers/common/meta.c
index 6846bbc..e35f0c8 100644
--- a/src/mesa/drivers/common/meta.c
+++ b/src/mesa/drivers/common/meta.c
@@ -1357,7 +1357,6 @@ blitframebuffer_texture(struct gl_context *ctx,
   const GLenum wrapSSave = texObj->Sampler.WrapS;
   const GLenum wrapTSave = texObj->Sampler.WrapT;
   const GLenum srgbSave = texObj->Sampler.sRGBDecode;
 - const GLenum fbo_srgb_save = ctx->Color.sRGBEnabled;
   const GLenum target = texObj->Target;

   if (drawAtt->Texture == readAtt->Texture) {
@@ -1390,14 +1389,14 @@ blitframebuffer_texture(struct gl_context *ctx,
   _mesa_TexParameteri(target, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
   _mesa_TexParameteri(target, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

-/* Always do our blits with no sRGB decode or encode.*/
+/* Always do sRGB decode on the read, and do sRGB encode on the write
+  * if we've been asked to on this framebuffer by leaving
+  * GL_FRAMEBUFFER_SRGB in place.
+  */
 if (ctx->Extensions.EXT_texture_sRGB_decode) {
_mesa_TexParameteri(target, GL_TEXTURE_SRGB_DECODE_EXT,
-   GL_SKIP_DECODE_EXT);
+   GL_DECODE_EXT);
 }


Wait, won't this break sRGB -> sRGB blits?

I guess I'd expect that:

1. Blitting from an RGBA source to an RGBA destination should be an exact
copy: no decoding/encoding should occur, regardless of any enable bits.

2. Blitting from an sRGBA 

[Mesa-dev] [PATCH 1/3] meta: Implement sensible behavior for BlitFramebuffer with sRGB.

2012-07-31 Thread Eric Anholt
Prior to the GL 4.2 spec, there was no guidance on how to implement
BlitFramebuffer for sRGB.  Mesa chose to implement skipping decode and encode,
providing a reasonable behavior for sRGB -> sRGB or RGB -> RGB, but providing
something absurd for mixing and matching the two.

In GL 4.2, some text appeared in the spec saying to skip decode (search for
"no linearization").  The only non-absurd interpretation of that would be to
have no encode (same as Mesa's current implementation), otherwise sRGB -> sRGB
blits wouldn't work.

However, it seems clear that you should be able to blit sRGB to RGB and RGB to
sRGB and get appropriate conversion.  The ARB has been discussing the issue,
and appears to agree.  So, instead implement the same behavior as gallium, and
always do the decode if the texture is sRGB, and do the encode if the
application asked for it.

Breaks piglit fbo-srgb-blit, which was expecting our previous no-conversion
behavior.
---
 src/mesa/drivers/common/meta.c |   14 +++++---------
 1 file changed, 5 insertions(+), 9 deletions(-)

diff --git a/src/mesa/drivers/common/meta.c b/src/mesa/drivers/common/meta.c
index 6846bbc..e35f0c8 100644
--- a/src/mesa/drivers/common/meta.c
+++ b/src/mesa/drivers/common/meta.c
@@ -1357,7 +1357,6 @@ blitframebuffer_texture(struct gl_context *ctx,
  const GLenum wrapSSave = texObj->Sampler.WrapS;
  const GLenum wrapTSave = texObj->Sampler.WrapT;
  const GLenum srgbSave = texObj->Sampler.sRGBDecode;
- const GLenum fbo_srgb_save = ctx->Color.sRGBEnabled;
  const GLenum target = texObj->Target;
 
  if (drawAtt->Texture == readAtt->Texture) {
@@ -1390,14 +1389,14 @@ blitframebuffer_texture(struct gl_context *ctx,
  _mesa_TexParameteri(target, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
  _mesa_TexParameteri(target, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
 
-/* Always do our blits with no sRGB decode or encode.*/
+/* Always do sRGB decode on the read, and do sRGB encode on the write
+  * if we've been asked to on this framebuffer by leaving
+  * GL_FRAMEBUFFER_SRGB in place.
+  */
 if (ctx->Extensions.EXT_texture_sRGB_decode) {
_mesa_TexParameteri(target, GL_TEXTURE_SRGB_DECODE_EXT,
-   GL_SKIP_DECODE_EXT);
+   GL_DECODE_EXT);
 }
- if (ctx->Extensions.EXT_framebuffer_sRGB) {
-_mesa_set_enable(ctx, GL_FRAMEBUFFER_SRGB_EXT, GL_FALSE);
- }
 
  _mesa_TexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
  _mesa_set_enable(ctx, target, GL_TRUE);
@@ -1463,9 +1462,6 @@ blitframebuffer_texture(struct gl_context *ctx,
 if (ctx->Extensions.EXT_texture_sRGB_decode) {
_mesa_TexParameteri(target, GL_TEXTURE_SRGB_DECODE_EXT, srgbSave);
 }
-if (ctx->Extensions.EXT_framebuffer_sRGB && fbo_srgb_save) {
-   _mesa_set_enable(ctx, GL_FRAMEBUFFER_SRGB_EXT, GL_TRUE);
-}
 
  /* Done with color buffer */
  mask &= ~GL_COLOR_BUFFER_BIT;
-- 
1.7.10.4
