Hello, all ...

While all of this discussion on Xcode 4 is interesting (please, Apple, make 
multiple-window development work again), I've got an issue that I'm hoping 
someone could help with. I'm recording audio and video using AVFoundation, 
and I'm applying a GPU shader to the incoming video frames for effects. The 
problem is that after I read the pixels back from the GPU via 
glReadPixels(), I'm stuck on how to make a CMSampleBuffer out of them so I 
can write them with an AVAssetWriter. The examples I've seen confuse me 
because they only deal with video, whereas I'm recording video _and_ audio 
(though I'm not doing any processing to the audio).
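
To make the question concrete, here's the writer setup I've pieced together 
from the headers so far. The AVAssetWriterInputPixelBufferAdaptor route is 
my guess from the docs (it looks like it lets me skip building a video 
CMSampleBuffer entirely), and I've left the audio input's outputSettings 
nil on the assumption that that passes the captured sample buffers through 
untouched -- please correct me if any of this is off:

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

// outputURL is wherever the movie should land; ivar names below are mine.
NSError *error = nil;
AVAssetWriter *writer =
    [[AVAssetWriter alloc] initWithURL:outputURL
                              fileType:AVFileTypeQuickTimeMovie
                                 error:&error];

// Compressed H.264 video; dimensions should match my GL framebuffer.
NSDictionary *videoSettings =
    [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecH264,             AVVideoCodecKey,
        [NSNumber numberWithInt:640], AVVideoWidthKey,
        [NSNumber numberWithInt:480], AVVideoHeightKey,
        nil];
AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
videoInput.expectsMediaDataInRealTime = YES;

// The adaptor takes CVPixelBuffers directly, which (if I've read the docs
// right) means I never build a video CMSampleBuffer myself.
NSDictionary *pixelAttributes =
    [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
        (id)kCVPixelBufferPixelFormatTypeKey,
        nil];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoInput
                                   sourcePixelBufferAttributes:pixelAttributes];

// Audio is untouched, so nil outputSettings (passthrough) seems right.
AVAssetWriterInput *audioInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:nil];
audioInput.expectsMediaDataInRealTime = YES;

[writer addInput:videoInput];
[writer addInput:audioInput];
// ([writer startWriting] and startSessionAtSourceTime: happen on the
// first frame; omitted here.)

Then in the capture delegate I'd route the two streams like this 
(renderFrame: and appendProcessedFrameAtTime: are just my placeholder 
names; the latter is sketched further down):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (captureOutput == audioOutput) {
        // No processing on the audio path: append the buffer as-is.
        if (audioInput.readyForMoreMediaData)
            [audioInput appendSampleBuffer:sampleBuffer];
    } else {
        // Run the shader pass, then read the result back and hand it
        // to the adaptor, reusing the frame's original timestamp.
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        [self renderFrame:sampleBuffer];
        [self appendProcessedFrameAtTime:pts];
    }
}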

Could someone post some example code that illustrates how to get the pixels 
off the GPU (I'm assuming glReadPixels() is the best way to do this) and 
create a CMSampleBuffer from them? I'd really appreciate it :-)
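
In case it helps, here's my current attempt at that readback step, with the 
parts I'm unsure about flagged. frameWidth/frameHeight are my ivars for the 
FBO size; the RGBA-vs-BGRA channel order and the vertical flip (GL's origin 
is bottom-left) are both things I haven't solved yet:

#import <OpenGLES/ES2/gl.h>

- (void)appendProcessedFrameAtTime:(CMTime)presentationTime
{
    // Let the adaptor's pool allocate the buffer so its format matches
    // what the writer input expects.
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn err = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                      adaptor.pixelBufferPool,
                                                      &pixelBuffer);
    if (err != kCVReturnSuccess || pixelBuffer == NULL)
        return;

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // GL_RGBA/GL_UNSIGNED_BYTE is the read format glReadPixels() is
    // guaranteed to support, but the pool hands back 32BGRA, so the
    // red/blue swap still needs sorting out. This also assumes
    // bytes-per-row == width * 4; if CoreVideo pads rows, each row
    // would have to be copied individually instead.
    glReadPixels(0, 0, (GLsizei)frameWidth, (GLsizei)frameHeight,
                 GL_RGBA, GL_UNSIGNED_BYTE,
                 CVPixelBufferGetBaseAddress(pixelBuffer));

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    if (videoInput.readyForMoreMediaData)
        [adaptor appendPixelBuffer:pixelBuffer
              withPresentationTime:presentationTime];

    CVPixelBufferRelease(pixelBuffer);
}

Is appendPixelBuffer:withPresentationTime: actually the right call here, or 
do I still need to go through CMSampleBufferCreateForImageBuffer() and 
appendSampleBuffer: for the video path?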

Regards,

John
