Sottek, Matthew J wrote:
> Let me preface my comment with "I don't know a lot about OGL" so some
> further clarification may be needed.
>
> I am assuming that pbuffers are basically buffers that can be used
> as textures by OGL. I would then assume that the OGL driver would
> have some mapping of pbuffer id to the texture memory it represents;
> maybe this memory is in video memory, maybe it has been "swapped out",
> so to speak, by some texture manager, etc.

A pbuffer is (basically) just an off-screen window. You can do the same things to a pbuffer that you can do to a normal window, including copying its contents to a texture. There was a proposal to bring WGL_render_texture to GLX but, in light of other developments, there wasn't much interest. It *may* be resurrected at some point for completeness' sake, but I wouldn't hold my breath.


> So basically this copies data from an XvMC offscreen surface to an
> OGL offscreen surface to be used by OGL for normal rendering purposes.
> Seems easy enough... I expect anyone doing XvMC would use the DRM
> for direct access (or their own DRM equivalent), which would also
> be the same DRM used for OGL, and therefore whatever texture management
> needs to be done should be possible without much of a problem.

Well, except that, at least in the open-source DRI-based drivers, the texture memory manager doesn't live in the DRM (any more than malloc and free live in the kernel).


> My main problem with the concept is that it seems that a copy is not
> always required, and is costly at 24fps. For YUV packed surfaces at
> least, an XvMC surface could be directly used as a texture. Some way
> to associate an XvMC surface with a pbuffer without a copy seems
> like something that would have a large performance gain.
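For scale, the copy cost at 24fps can be sized with quick arithmetic. A back-of-envelope sketch; the 720x480 resolution, packed YUV 4:2:2 format, and 24 fps rate are illustrative assumptions, not figures from the thread:

```python
# Rough cost of one extra per-frame copy (illustrative numbers only).
WIDTH, HEIGHT = 720, 480     # assumed DVD-resolution source
BYTES_PER_PIXEL = 2          # packed YUV 4:2:2 (e.g. YUY2/UYVY)
FPS = 24

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
copy_rate = frame_bytes * FPS  # bytes per second moved by the copy

print(f"frame size: {frame_bytes / 1024:.0f} KiB")
print(f"copy bandwidth: {copy_rate / (1024 * 1024):.1f} MiB/s")
```

On hardware of that era, an extra 15-odd MiB/s of blit traffic per stream was noticeable but not prohibitive, which is part of why the copy-based design was considered acceptable.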

A copy *may* not always be required. There have been GLX extensions in the past (see my first message in this thread) that worked that way. However, as we discussed earlier, this doesn't seem to work so well with MPEG video files: the main problem is that you don't get the frames exactly in display order. You're stuck doing a copy either way.
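On the ordering point: MPEG streams carry B-frames after the future reference frame they predict from, so decode order differs from display order. A minimal sketch of the reordering a player must do (hypothetical frame tags and indices, not any real decoder API), which is why a decoded surface is held or copied rather than shown straight out of the decoder's working buffer:

```python
# Illustrative MPEG picture reordering. Each tuple is
# (frame type, display index); B-frames arrive from the decoder
# after the I/P reference they depend on.
decode_order = [("I", 0), ("P", 3), ("B", 1), ("B", 2),
                ("P", 6), ("B", 4), ("B", 5)]

def display_order(frames):
    """Reorder decoded frames by display index.

    A real player keeps a small queue of decoded surfaces for this,
    so the decoder's output surface can't be textured from in place.
    """
    return sorted(frames, key=lambda f: f[1])

print([tag for tag, _ in display_order(decode_order)])
```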


> Also, what is the goal exactly? Are you trying to allow video to be
> used as textures within a 3D-rendered scene, or are you trying to
> make it possible to do something like Xv, but using direct rendering
> and 3D hardware?
>
> If you are trying to do the latter, it seems far easier to just plug
> your XvMC extension into the 3D engine rather than into the overlay. I think
> you've done the equivalent with Xv already.

I think the goal is to be able to do both, although the idea of using MPEG video files as animated textures in a game is pretty cool. :)


_______________________________________________
Devel mailing list
[EMAIL PROTECTED]
http://XFree86.Org/mailman/listinfo/devel
