Hi,

I would like to build a simple live video capture streaming application. My 
idea was to use AVCaptureVideoDataOutput to get the uncompressed frames from 
the capture session and then send that data over a socket connection.
On the receiving side I would then want to decompress the incoming frames 
using AVAssetReader. Unfortunately AVAssetReader, even though it states that 
it can be used to decode media data, requires an AVAsset as its input, which 
as far as I can tell can only be created from a URL. Is there any way to make 
that work with the current API?
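For reference, the capture side I have in mind looks roughly like the sketch 
below (a minimal, untested outline; the delegate class name and the BGRA 
pixel format choice are just placeholders, and the socket code is elided):

```swift
import AVFoundation

// Minimal capture-side sketch: grab uncompressed frames from the camera.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "video.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        // Ask for uncompressed BGRA pixel buffers (placeholder format choice).
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                    kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // Called once per captured frame; this is where the bytes would be
    // handed off to the socket code.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // ... serialize pixelBuffer and write it to the socket ...
        _ = pixelBuffer
    }
}
```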
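To make the problem concrete, this is the shape the receiving side would take 
with the current API (again just a sketch; the file path is a placeholder, 
which is exactly the limitation — I have no file, only buffers arriving over 
a socket):

```swift
import AVFoundation

// AVAssetReader as documented: it wants an AVAsset, built here from a URL.
let asset = AVURLAsset(url: URL(fileURLWithPath: "/tmp/movie.mov"))
let reader = try AVAssetReader(asset: asset)

guard let track = asset.tracks(withMediaType: .video).first else {
    fatalError("no video track")
}
// Ask the reader to hand back decoded BGRA frames.
let trackOutput = AVAssetReaderTrackOutput(
    track: track,
    outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                        kCVPixelFormatType_32BGRA])
reader.add(trackOutput)
reader.startReading()

// copyNextSampleBuffer() delivers decoded frames -- but only from a
// URL-backed asset, not from raw data coming in over a socket.
while let sample = trackOutput.copyNextSampleBuffer() {
    _ = sample
}
```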

Thank you,
Matthias
_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)

Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com

Help/Unsubscribe/Update your Subscription:
https://lists.apple.com/mailman/options/cocoa-dev/archive%40mail-archive.com
