Hi All,

I want to implement the following use case on an Intel Haswell CPU with HD
Graphics:

offscreen OpenGL rendering into an OpenGL 2D texture -> use
EGL_MESA_image_dma_buf_export to export the texture as a prime fd ->
vaCreateSurfaces from the prime fd -> use VA-API to hardware-accelerate
H.264 encoding
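
For the export step, this is a rough, untested sketch of what I have in mind
(egl_dpy, egl_ctx, and texture_id come from my existing GL setup; error
checking omitted, and I am assuming a single-plane RGBA texture):

#include <stdint.h>
#include <GL/gl.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>

/* Wrap a rendered GL texture in an EGLImage and export it as a dma-buf
 * (prime) fd via EGL_MESA_image_dma_buf_export. Returns the fd. */
static int export_texture_as_prime_fd(EGLDisplay egl_dpy, EGLContext egl_ctx,
                                      GLuint texture_id,
                                      EGLint *stride, EGLint *offset)
{
    /* Look up the extension entry points. */
    PFNEGLCREATEIMAGEKHRPROC create_image =
        (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
    PFNEGLEXPORTDMABUFIMAGEQUERYMESAPROC export_query =
        (PFNEGLEXPORTDMABUFIMAGEQUERYMESAPROC)
            eglGetProcAddress("eglExportDMABUFImageQueryMESA");
    PFNEGLEXPORTDMABUFIMAGEMESAPROC export_dmabuf =
        (PFNEGLEXPORTDMABUFIMAGEMESAPROC)
            eglGetProcAddress("eglExportDMABUFImageMESA");

    /* Wrap the GL texture in an EGLImage. */
    EGLImageKHR image = create_image(egl_dpy, egl_ctx, EGL_GL_TEXTURE_2D_KHR,
                                     (EGLClientBuffer)(uintptr_t)texture_id,
                                     NULL);

    /* Query the fourcc/plane layout, then export the dma-buf fd. */
    int fourcc, num_planes;
    EGLuint64KHR modifiers;
    export_query(egl_dpy, image, &fourcc, &num_planes, &modifiers);

    int fd = -1;
    export_dmabuf(egl_dpy, image, &fd, stride, offset);
    return fd;
}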

Is this supported by the latest libva and libva-intel-driver releases
(i.e., 1.7.0)? If so, is there an example I can start from?

I did read the sample application h264encode.c and thought it might be a
good starting point. However, the first major issue I ran into is that
vaCreateContext requires a list of pre-allocated VA surfaces, which are then
statically associated with the new context. With prime fds, though, the fd
changes from frame to frame, so a new VA surface has to be created for every
new fd. I don't see any API that can be used to dynamically add or remove VA
surfaces after the context has been created. Can you please give me some
suggestions? My current per-frame import attempt is sketched below.
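
For reference, the import I was planning looks roughly like this (untested;
the fourcc and render target format are guesses for my RGBA texture, and
whether the driver accepts a surface that was never in the render_targets
list given to vaCreateContext is exactly the part I am unsure about):

#include <va/va.h>

/* Wrap one exported prime fd in a new VA surface. */
static VASurfaceID import_prime_fd(VADisplay va_dpy, int fd,
                                   unsigned int width, unsigned int height,
                                   unsigned int stride, unsigned int offset)
{
    unsigned long buf = (unsigned long)fd;

    VASurfaceAttribExternalBuffers ext = {0};
    ext.pixel_format = VA_FOURCC_BGRX;  /* guess, must match the GL texture */
    ext.width = width;
    ext.height = height;
    ext.data_size = stride * height;
    ext.num_planes = 1;
    ext.pitches[0] = stride;
    ext.offsets[0] = offset;
    ext.buffers = &buf;
    ext.num_buffers = 1;

    VASurfaceAttrib attribs[2];
    attribs[0].type = VASurfaceAttribMemoryType;
    attribs[0].flags = VA_SURFACE_ATTRIB_SETTABLE;
    attribs[0].value.type = VAGenericValueTypeInteger;
    attribs[0].value.value.i = VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME;
    attribs[1].type = VASurfaceAttribExternalBufferDescriptor;
    attribs[1].flags = VA_SURFACE_ATTRIB_SETTABLE;
    attribs[1].value.type = VAGenericValueTypePointer;
    attribs[1].value.value.p = &ext;

    VASurfaceID surface = VA_INVALID_SURFACE;
    vaCreateSurfaces(va_dpy, VA_RT_FORMAT_RGB32, width, height,
                     &surface, 1, attribs, 2);
    return surface;
}

/* Per frame I would then do something like:
 *   VASurfaceID s = import_prime_fd(va_dpy, fd, w, h, stride, offset);
 *   vaBeginPicture(va_dpy, context, s);
 * ...but s was never in the render_targets array passed to vaCreateContext,
 * which is the problem described above. */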

Best,
Kristine
