In my experience with AVFoundation, it's really not ready for anything more than
simple playback apps using Core Animation / AVPlayerLayer.
As you have noticed, there is no visual context replacement, and no way to get
reasonably fast access to CVPixelBufferRefs or CVOpenGLTextureRefs at all.
On 12 Oct 2011, at 04:46, Robert Monaghan wrote:
Hi,
A quick word of warning: while QTKit is being deprecated in favor of
AVFoundation/Core Media, there is no third-party codec support in the new
APIs. I would consider this while you develop your app. I doubt that 3rd
party codec support for
Just so Apple knows we want this, I'll also file an enhancement request for a
QTVisualContext-like API; it's really useful for realtime video effects and
off-process video playback. The more people who file bug reports for the same
thing, the more Apple knows we want this.
On Oct 12, 2011, at
Hello, I need AVPlayer to output the video as a CVImageBufferRef in
real time instead of using AVPlayerLayer. I am able to do this with QTKit and
QTPixelBufferContextCreate; however, QTKit is said to be dead and AVFoundation
is the future, as well as being 64-bit.
So far what
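For what it's worth, the closest thing AVFoundation offers today for getting at raw frames is AVAssetReader. It decodes as fast as possible rather than in sync with a playback clock, so it is only a partial substitute for a QTVisualContext-style API, but it does hand you CVImageBufferRefs. An untested sketch (all names below are from the public AVFoundation/CoreVideo headers; the surrounding function is hypothetical):

```objc
// Sketch: pulling CVPixelBufferRefs out of a movie with AVAssetReader.
// Note this is file reading, not clock-synchronized playback.
#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

static void ReadPixelBuffers(NSURL *movieURL)
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
    AVAssetTrack *videoTrack =
        [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    NSError *error = nil;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset
                                                          error:&error];

    // Ask for YUV (4:2:0 bi-planar) output; BGRA is the other common choice.
    NSDictionary *settings = [NSDictionary
        dictionaryWithObject:[NSNumber numberWithUnsignedInt:
                                 kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                      forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    AVAssetReaderTrackOutput *output =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                   outputSettings:settings];
    [reader addOutput:output];
    [reader startReading];

    CMSampleBufferRef sample;
    while ((sample = [output copyNextSampleBuffer]) != NULL) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sample);
        // ... hand imageBuffer to the effects / IPC pipeline here ...
        CFRelease(sample);
    }
}
```

You would still have to drive your own timing against the sample buffers' presentation timestamps, which is exactly the part QTVisualContext used to do for us.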
Hi,
A quick word of warning: while QTKit is being deprecated in favor of
AVFoundation/Core Media, there is no third-party codec support in the new
APIs. I would consider this while you develop your app. I doubt that 3rd
party codec support for Core Media is going to arrive as fast (or as
complete) as
I understand this and already have things in place to account for it. I would
explain what my program does, but that would take a long time. Basically, the
reason I need a CVImageBufferRef in a YUV pixel format is that I am
doing off-process playback of video and just receiving the
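For the off-process part, one approach on 10.6+ is to back the pixel buffers with IOSurface and pass a mach port to the other process. An untested sketch, assuming the buffers were created with kCVPixelBufferIOSurfacePropertiesKey so they are IOSurface-backed (the wrapper function name is hypothetical; the CoreVideo/IOSurface calls are from the public headers):

```objc
// Sketch: exporting a decoded YUV frame to another process via IOSurface.
#import <CoreVideo/CoreVideo.h>
#import <IOSurface/IOSurface.h>

static mach_port_t MachPortForFrame(CVImageBufferRef imageBuffer)
{
    IOSurfaceRef surface = CVPixelBufferGetIOSurface(imageBuffer);
    if (surface == NULL)
        return MACH_PORT_NULL; // buffer is not IOSurface-backed

    // The returned send right can be handed to the other process over a
    // mach IPC channel; the receiver reconstructs the surface with
    // IOSurfaceLookupFromMachPort().
    return IOSurfaceCreateMachPort(surface);
}
```

The receiving side can then wrap the looked-up IOSurface back into a CVPixelBuffer, so no pixel data ever crosses the process boundary by copy.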