Re: [Mesa-dev] GL 3.0 glDrawPixels(integer format)

2011-11-29 Thread Ian Romanick

On 11/27/2011 06:08 PM, Eric Anholt wrote:

While looking into MapRenderbuffer for glDrawPixels, I ended up
looking at integer again.  It looks like GL 3.0 has added sanity to
drawpixels for integer, which is to say that they've disallowed it.

I don't think there was any way to usefully use drawpixels of integer.
Assuming you want to output to an integer FBO, you have to have a
fragment shader bound.  You would also need to have the input to that
fragment shader be integer, or you're going to lose your precision.
But how does the glDrawPixels() input data get bound to some
user-defined shader input?

The fbo-integer-precision-drawpixels testcase currently binds a shader
that does gl_FragColor = gl_Color.  But if the output to an integer
FBO is written through a floating-point shader output, the result is
undefined (from GL_EXT_texture_integer):

The color components used for per-fragment operations and written into a
 color buffer are undefined:

   * for fragment shaders that write floating-point color components to an
 integer color buffer, or...

What should we do with that testcase?  If pre-3.0
GL_EXT_texture_integer allows drawpixels for integer, what behavior
would we actually want for making use of the user's shader?  (I'm
assuming that testcase passed for whatever code it was supposed to
test because that code just no-opped the shader.  Imagine the shader
doing some actual math on the color instead.)  I'm inclined to adopt
3.0 behavior generally, and assume that the 3.0 wording was a
correction after someone noticed that making drawpixels for integer
actually work was hopeless.


If other drivers report EXT_texture_integer with OpenGL 3.0 and generate 
the DrawPixels errors (per the 3.0 spec), then, in a practical sense, 
it's impossible for any application to use the DrawPixels functionality 
in EXT_texture_integer.


If there are no applications and there never will be any applications 
that use this quirk of the EXT, then I would opt for the 3.0 behavior.


I suspect that there may be apps that look for the EXT in the absence of 
3.0 to use integer textures.  We should continue to report the EXT.
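
For reference, the kind of check such an app would be doing is just the
classic extension-string probe (a rough sketch, not taken from any real
application):

#include <string.h>
#include <GL/gl.h>

/* Rough sketch of the pre-3.0 app pattern: probe the classic extension
 * string for GL_EXT_texture_integer regardless of the GL version.
 * (Naive strstr matching, which is fine for illustration.) */
static int has_ext_texture_integer(void)
{
   const char *exts = (const char *) glGetString(GL_EXTENSIONS);
   return exts && strstr(exts, "GL_EXT_texture_integer") != NULL;
}

If we stopped reporting the EXT, that check would fail even on hardware
that exposes integer textures through 3.0.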


I think this is a case where the EXT spec should have been updated to 
match 3.0 when 3.0 was released.  Alas.



Re: [Mesa-dev] GL 3.0 glDrawPixels(integer format)

2011-11-28 Thread Dave Airlie
 While looking into MapRenderbuffer for glDrawPixels, I ended up
 looking at integer again.  It looks like GL 3.0 has added sanity to
 drawpixels for integer, which is to say that they've disallowed it.

Well, I suppose the usual plan: test on the binary drivers, though I expect
it works on nvidia and fails on fglrx.

We also need to do something about the GL 3.0 reporting, since the
current code requires EXT_texture_integer, which is more than GL 3.0
actually requires.  Maybe we need a sublevel of EXT_texture_integer
support that drivers can report for GL 3.0 support without also reporting
full EXT_texture_integer support.
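
Something like a split flag, purely hypothetical naming, would let the
version computation ask for less than the string advertisement:

#include <GL/gl.h>

/* Hypothetical sketch only, not actual Mesa code or names: one flag
 * covering what GL 3.0 actually needs (integer texture formats), and a
 * second one gating the extra EXT behaviour (integer DrawPixels and
 * friends) plus the "GL_EXT_texture_integer" string advertisement. */
struct ext_texture_integer_caps {
   GLboolean gl30_integer_textures;    /* enough for the 3.0 version check */
   GLboolean full_ext_texture_integer; /* also advertise the EXT string */
};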

But I'm happy that GL 3.0 simply weeded out the stupid case, and that
nobody could possibly be using it for anything real.

Dave.


[Mesa-dev] GL 3.0 glDrawPixels(integer format)

2011-11-27 Thread Eric Anholt
While looking into MapRenderbuffer for glDrawPixels, I ended up
looking at integer again.  It looks like GL 3.0 has added sanity to
drawpixels for integer, which is to say that they've disallowed it.

I don't think there was any way to usefully use drawpixels of integer.
Assuming you want to output to an integer FBO, you have to have a
fragment shader bound.  You would also need to have the input to that
fragment shader be integer, or you're going to lose your precision.
But how does the glDrawPixels() input data get bound to some
user-defined shader input?
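
For concreteness, a shader that actually kept the precision would have to
look something like this (a GLSL 1.30-ish sketch, embedded here as a C
string), and there is no defined way for DrawPixels data to reach that
integer input:

/* Sketch of what an integer-preserving path would need: an ivec4
 * output into the integer color buffer, fed by an integer input.
 * Nothing in the spec routes glDrawPixels data to "color_in". */
static const char *integer_fs =
   "#version 130\n"
   "flat in ivec4 color_in;\n"   /* integer fragment inputs must be flat */
   "out ivec4 color_out;\n"      /* written to the integer color buffer */
   "void main() { color_out = color_in; }\n";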

The fbo-integer-precision-drawpixels testcase currently binds a shader
that does gl_FragColor = gl_Color.  But if the output to an integer
FBO is written through a floating-point shader output, the result is
undefined (from GL_EXT_texture_integer):

   The color components used for per-fragment operations and written into a
color buffer are undefined:

  * for fragment shaders that write floating-point color components to an
integer color buffer, or...
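
Schematically (my paraphrase, not the actual piglit source), the testcase
is doing something like the following, which is exactly the undefined case
quoted above:

#include <GL/gl.h>
#include <GL/glext.h>

/* Float-output passthrough shader, as in the testcase. */
static const char *passthrough_fs =
   "void main() { gl_FragColor = gl_Color; }\n";

/* Draw integer pixel data into an integer FBO through that shader; per
 * the EXT quote above, what lands in the buffer is undefined. */
static void draw_integer_pixels(GLuint prog, int w, int h,
                                const GLint *pixels)
{
   glUseProgram(prog);
   glDrawPixels(w, h, GL_RGBA_INTEGER_EXT, GL_INT, pixels);
}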

What should we do with that testcase?  If pre-3.0
GL_EXT_texture_integer allows drawpixels for integer, what behavior
would we actually want for making use of the user's shader?  (I'm
assuming that testcase passed for whatever code it was supposed to
test because that code just no-opped the shader.  Imagine the shader
doing some actual math on the color instead.)  I'm inclined to adopt
3.0 behavior generally, and assume that the 3.0 wording was a
correction after someone noticed that making drawpixels for integer
actually work was hopeless.
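
Adopting the 3.0 behavior means the app-visible result is just an error.
A sketch of my reading of the 3.0 wording, so hedge accordingly:

#include <assert.h>
#include <GL/gl.h>
#include <GL/glext.h>

/* Sketch of the GL 3.0 behavior from the app's point of view: integer
 * formats in DrawPixels are rejected outright instead of going through
 * whatever shader happens to be bound.  INVALID_OPERATION is my reading
 * of the 3.0 error wording. */
static void expect_integer_drawpixels_rejected(int w, int h,
                                               const GLint *pixels)
{
   while (glGetError() != GL_NO_ERROR)
      ;                                   /* flush any stale errors first */
   glDrawPixels(w, h, GL_RGBA_INTEGER, GL_INT, pixels);
   assert(glGetError() == GL_INVALID_OPERATION);
}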
