For an Augmented Reality application, I need to get a video (file or
stream, ideally H.264 encoded) into an OpenGL texture.
I want to overlay certain real-world objects in my AR application with
videos. So the video will not be full-screen; it should be played on a
(perspectively distorted) quad.
I guess I would have to update the OpenGL texture every frame, but
how do I get the video frames?

How can I achieve this?
MediaPlayer in combination with GLSurfaceView?
There is that FFmpeg solution, which looks rather hard and involved to
me (http://stackoverflow.com/questions/4676178/android-video-player-using-ndk-opengl-es-and-ffmpeg).
Do I have to go that way, or is there a simpler solution?
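
From what I've read, SurfaceTexture (API 14+) together with
MediaPlayer.setSurface() might be that simpler solution: MediaPlayer
decodes straight into a GL_TEXTURE_EXTERNAL_OES texture. Here is an
untested sketch of what I have in mind (the class name, the video-path
parameter, and the varying/uniform names in the shader are just my
placeholders):

import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.view.Surface;

public class VideoTexture implements SurfaceTexture.OnFrameAvailableListener {

    // Fragment shader for the quad; external textures need the OES extension
    public static final String FRAGMENT_SHADER =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "varying vec2 vTexCoord;\n" +
            "uniform samplerExternalOES sTexture;\n" +
            "void main() {\n" +
            "  gl_FragColor = texture2D(sTexture, vTexCoord);\n" +
            "}\n";

    private SurfaceTexture surfaceTexture;
    private MediaPlayer mediaPlayer;
    private int textureId;
    private volatile boolean frameAvailable = false;

    // Call on the GL thread, e.g. in GLSurfaceView.Renderer.onSurfaceCreated()
    public void init(String videoPath) throws java.io.IOException {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        textureId = tex[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        // MediaPlayer decodes H.264 (hardware-accelerated) into this texture
        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setOnFrameAvailableListener(this);
        Surface surface = new Surface(surfaceTexture);
        mediaPlayer = new MediaPlayer();
        mediaPlayer.setDataSource(videoPath);
        mediaPlayer.setSurface(surface);
        surface.release(); // MediaPlayer keeps its own reference
        mediaPlayer.prepare();
        mediaPlayer.start();
    }

    @Override // may be called on an arbitrary thread
    public void onFrameAvailable(SurfaceTexture st) {
        frameAvailable = true;
    }

    // Call at the start of onDrawFrame(), on the GL thread
    public void updateFrame() {
        if (frameAvailable) {
            frameAvailable = false;
            surfaceTexture.updateTexImage(); // latches the newest decoded frame
        }
    }
}

I assume I would also need SurfaceTexture.getTransformMatrix() to get
correct texture coordinates, since the decoder may deliver flipped or
padded frames. Would this work, or am I missing something?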
