On Wed, 9 Jan 2013 19:19:53 +0530 Arvind R arvin...@gmail.com said:
Hi all,
My understanding is that emotion gets the video backend to render RGBA
to the evas canvas that is then displayed by the ecore-evas backend.
Correct?
Actually it gets the video decoder (xine/gstreamer, etc.) to decode
If so, would it be possible, for instance, to use the xine backend to
render directly to the screen using whatever HW-acceleration is available to
On Wed, Jan 9, 2013 at 11:49 AM, Arvind R arvin...@gmail.com wrote:
Actually, it outputs YUV as well, which is then converted to RGB by the CPU.