Hi,

A quick word of warning: while QTKit is being deprecated in favor of AVFoundation/Core 
Media, there is no third-party codec support in the new APIs. I would consider 
this while you develop your app. I doubt that third-party codec support for Core 
Media is going to arrive as quickly (or as completely) as we would like.

bob.
(A QuickTime codec developer who would love to write AVFoundation/Core Media 
codecs, but can't.)


On Oct 11, 2011, at 7:22 PM, Mr. Gecko wrote:

> Hello, I need AVPlayer to output the video as a CVImageBufferRef in real time 
> instead of rendering through an AVPlayerLayer. I am able to do this with QTKit 
> and QTPixelBufferContextCreate; however, QTKit is said to be dead, AVFoundation 
> is the future, and AVFoundation is 64-bit.
> 
> So far, what I've come up with works, but it is not feasible: I cannot skip 
> around in time without also telling the AVAssetReader to change its position, 
> and it takes up more CPU than QTKit does.
> 
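As far as I know, an AVAssetReader can't be re-seeked once -startReading has been
called, so repositioning it generally means tearing the reader down and building a
fresh one with its timeRange set to the new start time. A rough sketch of that,
assuming assetReader, asset, videoTrack, and bufferOptions are the same objects
used in the code below (the method name here is made up):

    - (void)rebuildReaderAtTime:(CMTime)newTime
    {
        // Throw away the old reader; it cannot be restarted or repositioned.
        [assetReader cancelReading];
        [assetReader release];

        NSError *error = nil;
        assetReader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
        if (assetReader == nil) {
            NSLog(@"Unable to recreate asset reader: %@", [error localizedDescription]);
            return;
        }

        [assetReader addOutput:[AVAssetReaderTrackOutput
            assetReaderTrackOutputWithTrack:videoTrack
                             outputSettings:bufferOptions]];

        // timeRange must be set before -startReading; decoding then begins at newTime.
        [assetReader setTimeRange:CMTimeRangeMake(newTime, kCMTimePositiveInfinity)];
        [assetReader startReading];
    }

Called alongside [avMovie seekToTime:newTime], that at least keeps the player and
the reader in step, though it doesn't help with the CPU usage.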
> Another thought I had about using AVFoundation is that it should have 
> GPU-accelerated H.264 decoding, but this does not seem to be true in my current 
> testing with QuickTime X, as it still uses 33% CPU with a 1080p video (not sure 
> if that's good or bad).
> 
> Here is what I currently have; if anyone can improve it, please tell me how.
> 
> avMovie = [[AVPlayer alloc] initWithURL:
>               [NSURL fileURLWithPath:[theArguments objectAtIndex:0]]];
> AVPlayerItem *item = [avMovie currentItem];
> AVAsset *asset = [item asset];
> AVAssetTrack *videoTrack = nil;
> NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
> if ([tracks count] == 1) {
>     videoTrack = [tracks objectAtIndex:0];
>     NSError *error = nil;
>     assetReader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
>     if (assetReader == nil) {
>         NSLog(@"Unable to create asset reader: %@", [error localizedDescription]);
>     } else {
>         // 'yuvs' -- the CoreVideo name is kCVPixelFormatType_422YpCbCr8_yuvs.
>         NSMutableDictionary *bufferOptions = [NSMutableDictionary dictionary];
>         [bufferOptions setObject:[NSNumber numberWithInt:kYUVSPixelFormat]
>                           forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
>         [assetReader addOutput:[AVAssetReaderTrackOutput
>             assetReaderTrackOutputWithTrack:videoTrack
>                              outputSettings:bufferOptions]];
>         [assetReader startReading];
>     }
> }
> if (videoTrack != nil) {
>     NSLog(@"%f", (Float64)1 / [videoTrack nominalFrameRate]);
>     // One frame's duration, e.g. roughly 1001/30000 s (~33.4 ms) for a 29.97 fps
>     // track. Note that CMTimeMake truncates the float timescale to an int32_t.
>     CMTime frameDuration = CMTimeMake(1001, [videoTrack nominalFrameRate] * 1001);
>     [avMovie addPeriodicTimeObserverForInterval:frameDuration
>                                           queue:dispatch_queue_create("eventQueue", NULL)
>                                      usingBlock:^(CMTime time) {
>         dispatch_sync(dispatch_get_main_queue(), ^{
>             AVAssetReaderTrackOutput *output = [[assetReader outputs] objectAtIndex:0];
>             CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];
>             if (sampleBuffer != NULL) {
>                 CVImageBufferRef texture = CMSampleBufferGetImageBuffer(sampleBuffer);
> 
>                 // Do code with the CoreVideo image.
> 
>                 // Release the sample buffer (Copy rule); the image buffer from
>                 // CMSampleBufferGetImageBuffer follows the Get rule and must not
>                 // be released here.
>                 CFRelease(sampleBuffer);
>             }
>         });
>     }];
> }
> [avMovie setRate:1.0];
> 
> P.S. I know I have leaks in there somewhere; I plan to remove them once the 
> concept is fully working.
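One more note: if you can require a newer SDK at some point, AVPlayerItemVideoOutput
(added in OS X 10.8 / iOS 6) pulls pixel buffers straight from the AVPlayerItem that
AVPlayer is actually playing, which avoids keeping a separate AVAssetReader in sync
altogether. A minimal sketch, assuming that class is available to you and avMovie is
the same AVPlayer as above (CACurrentMediaTime comes from QuartzCore):

    NSDictionary *attributes = [NSDictionary dictionaryWithObject:
            [NSNumber numberWithUnsignedInt:kCVPixelFormatType_422YpCbCr8_yuvs]
        forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
    AVPlayerItemVideoOutput *videoOutput =
        [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
    [[avMovie currentItem] addOutput:videoOutput];

    // Then, from a CVDisplayLink (or other per-frame) callback:
    CMTime itemTime = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer =
            [videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
        // Do code with the CoreVideo image, then release it (Copy rule).
        CVPixelBufferRelease(pixelBuffer);
    }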

_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)

Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com

Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/cocoa-dev/archive%40mail-archive.com

This email sent to arch...@mail-archive.com