I'm still relatively new to graphics programming, and I had a quick
question (which ties closely to the question I asked a few weeks back
about creating a plugin to "render-to-video" for OSG).

As a newcomer to graphics programming--and as a user of an existing
toolkit--there are a few things you can take for granted (all the
specific windowing semantics, for example, that are local to your
system). For the last little while I've been toying with the idea of
really sitting down and writing a bit of code that will allow me to
render directly to both the screen and a video stream simultaneously,
just like the FRAPS (http://fraps.com) application does on Windows.

However, one of the things I'm not entirely sure about is how to get the
frame's color data. I've learned how to do the conversion from RGBA to
YUV, and I know about video container formats and encodings. What I'm
missing is how to hook into that "feed."
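
(For anyone else searching on the conversion half of this: the usual
approach is the standard BT.601 "studio swing" fixed-point
approximation. A minimal sketch -- the function name and rounding
constants here are just the common textbook form, not anything
OSG-specific:)

```c
#include <stdint.h>

/* Convert one RGB pixel to BT.601 "studio swing" Y'CbCr
 * (Y in [16,235], Cb/Cr in [16,240]) using the common
 * fixed-point approximation. */
void rgb_to_ycbcr(uint8_t r, uint8_t g, uint8_t b,
                  uint8_t *y, uint8_t *cb, uint8_t *cr)
{
    *y  = (uint8_t)((( 66 * r + 129 * g +  25 * b + 128) >> 8) +  16);
    *cb = (uint8_t)(((-38 * r -  74 * g + 112 * b + 128) >> 8) + 128);
    *cr = (uint8_t)(((112 * r -  94 * g -  18 * b + 128) >> 8) + 128);
}
```

(White maps to Y=235, Cb=Cr=128; black to Y=16, Cb=Cr=128.)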

Really what I need is just a few keywords to search on.

I've found something on Linux called "yukon" that actually takes a neat
(but slow) approach: it intercepts glXSwapBuffers and copies the data
before calling the "real" glXSwapBuffers. It does work (links to example
OSG videos below), but almost all of the work is done on the CPU.
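
(If it helps clarify for anyone replying: the interception style yukon
uses is basically an LD_PRELOAD interposer. A rough sketch of the idea,
built as a shared object and run with LD_PRELOAD=./libgrab.so -- the
library name and hard-coded 640x480 size are placeholders, and a real
version would query the drawable's size and hand the pixels off to an
encoder thread instead of blocking here:)

```c
/* Build: gcc -shared -fPIC -o libgrab.so grab.c -ldl -lGL */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <GL/gl.h>
#include <GL/glx.h>

void glXSwapBuffers(Display *dpy, GLXDrawable drawable)
{
    /* Look up the real glXSwapBuffers once, past our own symbol. */
    static void (*real_swap)(Display *, GLXDrawable) = 0;
    if (!real_swap)
        real_swap = (void (*)(Display *, GLXDrawable))
                        dlsym(RTLD_NEXT, "glXSwapBuffers");

    /* Read the back buffer before it gets swapped away. */
    enum { W = 640, H = 480 };          /* placeholder size */
    static GLubyte pixels[W * H * 4];
    glReadBuffer(GL_BACK);
    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glReadPixels(0, 0, W, H, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    /* ...convert to YUV / feed the encoder here... */

    real_swap(dpy, drawable);
}
```

That synchronous glReadPixels is exactly the "slow, on the CPU" part:
it stalls the pipeline until the whole frame has been copied back.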

At any rate, does anyone have any ideas on how/where I should start
researching to find the most efficient ways of "hooking" into the
color buffer (or, perhaps, using some alternative internal buffer a la
pbuffer, PBO, image, etc.) to grab that data?
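
(On the PBO idea specifically, the trick I've read about is
double-buffered asynchronous readback: glReadPixels into a bound
GL_PIXEL_PACK_BUFFER returns immediately, and you map the *previous*
frame's buffer while the current one transfers. A sketch under the
assumption of a current GL 2.1+ context -- the helper names are mine,
only the GL calls are real:)

```c
#include <string.h>
#include <GL/glew.h>   /* or load the GL 2.1 entry points by hand */

static GLuint pbo[2];
static int frame = 0;

void readback_init(int w, int h)
{
    glGenBuffers(2, pbo);
    for (int i = 0; i < 2; ++i) {
        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[i]);
        glBufferData(GL_PIXEL_PACK_BUFFER, w * h * 4, 0, GL_STREAM_READ);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

/* Call once per frame, just before the buffer swap.
 * 'out' receives the frame captured one frame ago (w*h*4 bytes). */
void readback_frame(int w, int h, unsigned char *out)
{
    int cur = frame % 2, prev = (frame + 1) % 2;

    /* Kick off an async read of this frame into the current PBO;
     * with a bound pack buffer the last argument is an offset. */
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[cur]);
    glReadBuffer(GL_BACK);
    glReadPixels(0, 0, w, h, GL_BGRA, GL_UNSIGNED_BYTE, 0);

    /* Map the PBO filled last frame; by now its DMA has finished,
     * so this map shouldn't stall. */
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[prev]);
    void *src = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);
    if (src) {
        memcpy(out, src, (size_t)w * h * 4);
        glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
    ++frame;
}
```

Whether that's actually the fastest route on current drivers is part of
what I'm hoping someone here can confirm.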

Thanks beforehand!

P.S. Here are some OSG (and Cal3D) videos if you're extra-bored, both
done on Linux:

http://cherustone.com/dude.mp4
http://cherustone.com/cally.mp4

_______________________________________________
osg-users mailing list
osg-users@openscenegraph.net
http://openscenegraph.net/mailman/listinfo/osg-users
http://www.openscenegraph.org/
