On 14.09.2015 18:00, Max Reitz wrote:
> On 09.09.2015 13:20, Gerd Hoffmann wrote:
>> Create a buffer for the vertex data and place vertexes
>> there at initialization time. Then just use the buffer
>> for each texture blit.
>>
>> Signed-off-by: Gerd Hoffmann <kra...@redhat.com>
>> ---
>>  include/ui/shader.h |  4 +++-
>>  ui/console-gl.c     |  7 ++++++-
>>  ui/shader.c         | 32 +++++++++++++++++++++++++++-----
>>  3 files changed, 36 insertions(+), 7 deletions(-)
>>
>> diff --git a/include/ui/shader.h b/include/ui/shader.h
>> index 8509596..f7d8618 100644
>> --- a/include/ui/shader.h
>> +++ b/include/ui/shader.h
>> @@ -3,7 +3,9 @@
>>
>>  #include <epoxy/gl.h>
>>
>> -void qemu_gl_run_texture_blit(GLint texture_blit_prog);
>> +GLuint qemu_gl_init_texture_blit(GLint texture_blit_prog);
>> +void qemu_gl_run_texture_blit(GLint texture_blit_prog,
>> +                              GLint texture_blit_vao);
>>
>>  GLuint qemu_gl_create_compile_shader(GLenum type, const GLchar *src);
>>  GLuint qemu_gl_create_link_program(GLuint vert, GLuint frag);
>> diff --git a/ui/console-gl.c b/ui/console-gl.c
>> index cb45cf8..baf397b 100644
>> --- a/ui/console-gl.c
>> +++ b/ui/console-gl.c
>> @@ -33,6 +33,7 @@
>>
>>  struct ConsoleGLState {
>>      GLint texture_blit_prog;
>> +    GLint texture_blit_vao;
>>  };
>>
>>  /* ---------------------------------------------------------------------- */
>>
>> @@ -47,6 +48,9 @@ ConsoleGLState *console_gl_init_context(void)
>>          exit(1);
>>      }
>>
>> +    gls->texture_blit_vao =
>> +        qemu_gl_init_texture_blit(gls->texture_blit_prog);
>> +
>>      return gls;
>>  }
>>
>> @@ -131,7 +135,8 @@ void surface_gl_render_texture(ConsoleGLState *gls,
>>      glClearColor(0.1f, 0.1f, 0.1f, 0.0f);
>>      glClear(GL_COLOR_BUFFER_BIT);
>>
>> -    qemu_gl_run_texture_blit(gls->texture_blit_prog);
>> +    qemu_gl_run_texture_blit(gls->texture_blit_prog,
>> +                             gls->texture_blit_vao);
>>  }
>>
>>  void surface_gl_destroy_texture(ConsoleGLState *gls,
>> diff --git a/ui/shader.c b/ui/shader.c
>> index 52a4632..ada1c4c 100644
>> --- a/ui/shader.c
>> +++ b/ui/shader.c
>> @@ -29,21 +29,43 @@
>>
>>  /* ---------------------------------------------------------------------- */
>>
>> -void qemu_gl_run_texture_blit(GLint texture_blit_prog)
>> +GLuint qemu_gl_init_texture_blit(GLint texture_blit_prog)
>>  {
>> -    GLfloat in_position[] = {
>> +    static const GLfloat in_position[] = {
>>          -1, -1,
>>           1, -1,
>>          -1,  1,
>>           1,  1,
>>      };
>>      GLint l_position;
>> +    GLuint vao, buffer;
>> +
>> +    glGenVertexArrays(1, &vao);
>> +    glBindVertexArray(vao);
>> +
>> +    /* this is the VBO that holds the vertex data */
>> +    glGenBuffers(1, &buffer);
>> +    glBindBuffer(GL_ARRAY_BUFFER, buffer);
>> +    glBufferData(GL_ARRAY_BUFFER, sizeof(in_position), in_position,
>> +                 GL_STATIC_DRAW);
>>
>> -    glUseProgram(texture_blit_prog);
>>      l_position = glGetAttribLocation(texture_blit_prog, "in_position");
>> -    glVertexAttribPointer(l_position, 2, GL_FLOAT, GL_FALSE, 0, in_position);
>> +    glVertexAttribPointer(l_position, 2, GL_FLOAT, GL_FALSE, 0, 0);
>>      glEnableVertexAttribArray(l_position);
>> -    glDrawArrays(GL_TRIANGLE_STRIP, l_position, 4);
>> +
>> +    glBindBuffer (GL_ARRAY_BUFFER, 0);
>> +    glBindVertexArray (0);
>> +    glDeleteBuffers (1, &buffer);
>
> Although I may be wrong, I don't think glVertexAttribPointer() loads the
> buffer data into the given vertex attribute.
>
> As far as I can see from the specification regarding vertex array
> objects, they only represent which data is to be used for vertex
> attributes ("All state related to the definition of data used by the
> vertex processor is encapsulated in a vertex array object.").
>
> glVertexAttribPointer():
> (1) determines the attribute's format,
> (2) binds a vertex attribute index to a shader attribute index,
> (3) binds the data to the vertex attribute.
>
> Step (3) (glBindVertexBuffer()) does not appear to load the data, but
> only define its origin ("The source of data for a generic vertex
> attribute may be determined by attaching a buffer object to a vertex
> array object with [glBindVertexBuffer()]").
>
> Therefore, I'm not sure whether deleting the buffer is right here. Maybe
> OpenGL uses reference counting here, too, so it remembers that the
> buffer is still in use by the VAO, and so the glDeleteBuffers()
> operation will only decrease its refcount, but not actually end up
> deleting it. But I don't think so (compare the documentation of
> glDeleteBuffers() with glDeleteShader(); the former says it will delete
> the buffer, whereas the latter explicitly mentions that the shader will
> not be deleted as long as it is attached to a program).
>
> All in all I don't think we should delete the buffer as long as it is in
> use by the VAO.
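Concretely, what I have in mind is to let the init function hand the
buffer name back to the caller and to delete it only together with the
VAO. A rough sketch (untested; the extra out-parameter and the
qemu_gl_fini_texture_blit() helper are made up for illustration):

#include <epoxy/gl.h>

GLuint qemu_gl_init_texture_blit(GLint texture_blit_prog, GLuint *vbo)
{
    static const GLfloat in_position[] = {
        -1, -1,
         1, -1,
        -1,  1,
         1,  1,
    };
    GLint l_position;
    GLuint vao;

    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);

    /* VBO with the vertex data, name now owned by the caller */
    glGenBuffers(1, vbo);
    glBindBuffer(GL_ARRAY_BUFFER, *vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(in_position), in_position,
                 GL_STATIC_DRAW);

    l_position = glGetAttribLocation(texture_blit_prog, "in_position");
    glVertexAttribPointer(l_position, 2, GL_FLOAT, GL_FALSE, 0, 0);
    glEnableVertexAttribArray(l_position);

    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindVertexArray(0);
    /* no glDeleteBuffers() here -- the VAO still refers to *vbo */

    return vao;
}

/* made-up teardown helper: drop VAO and buffer together */
void qemu_gl_fini_texture_blit(GLuint vao, GLuint vbo)
{
    glDeleteVertexArrays(1, &vao);
    glDeleteBuffers(1, &vbo);
}

ConsoleGLState would then grow a texture_blit_vbo field next to
texture_blit_vao, so the buffer stays alive for as long as the VAO is
in use.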
Oh, and I just noticed: With glDeleteBuffers(), I get a segmentation
fault when qemu exits (somewhere deep in fglrx_dri.so). Without it, the
segfault disappears.

Max

>> +
>> +    return vao;
>> +}
>> +
>> +void qemu_gl_run_texture_blit(GLint texture_blit_prog,
>> +                              GLint texture_blit_vao)
>> +{
>> +    glUseProgram(texture_blit_prog);
>> +    glBindVertexArray(texture_blit_vao);
>> +    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
>>  }
>>
>>  /* ---------------------------------------------------------------------- */
>>
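For what it's worth, one can also probe what the driver actually keeps
around after the glDeleteBuffers() call (a debugging sketch, untested;
assumes l_position is the attribute location queried in
qemu_gl_init_texture_blit()):

#include <epoxy/gl.h>
#include <stdio.h>

/* Re-bind the VAO and ask which buffer object backs the attribute.
 * A non-zero name suggests the VAO still references the deleted
 * buffer; zero means the attachment is gone. */
static void probe_vao_buffer(GLuint vao, GLuint l_position)
{
    GLint vbo = 0;

    glBindVertexArray(vao);
    glGetVertexAttribiv(l_position, GL_VERTEX_ATTRIB_ARRAY_BUFFER_BINDING,
                        &vbo);
    glBindVertexArray(0);

    printf("in_position is backed by buffer %d\n", vbo);
}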