You want AVCaptureSession.
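
Something along these lines will get uncompressed frames delivered to your
code. This is only an untested sketch: it assumes ARC, a session property,
and that the receiving class adopts
AVCaptureVideoDataOutputSampleBufferDelegate.

#import <AVFoundation/AVFoundation.h>

- (void)startCapture
{
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetHigh;

    // Default video device (the built-in iSight); pick your FireWire camera
    // out of +[AVCaptureDevice devicesWithMediaType:] if you have several.
    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input && [self.session canAddInput:input])
        [self.session addInput:input];

    // A video data output hands you every frame as a CMSampleBuffer on the
    // queue you specify; no recording or compression involved.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                  @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("frame.queue", NULL)];
    if ([self.session canAddOutput:output])
        [self.session addOutput:output];

    [self.session startRunning];
}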

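Each frame then arrives in the delegate callback, where you can wrap it in a
CIImage, run it through Core Image for the contrast/edge enhancement, and
hand the result to a layer. Your grid and the user's measurement geometry can
live in separate CALayers (or an NSView) stacked on top, so you never touch
the video pixels to draw them. Again just a sketch; ciContext and videoLayer
are placeholder properties you would set up once, up front.

#import <QuartzCore/QuartzCore.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVImageBuffer:pixelBuffer];

    // Simple contrast boost; swap in whatever CIFilter chain you need.
    CIFilter *contrast = [CIFilter filterWithName:@"CIColorControls"];
    [contrast setValue:frame forKey:kCIInputImageKey];
    [contrast setValue:@1.2 forKey:kCIInputContrastKey];
    CIImage *enhanced = [contrast valueForKey:kCIOutputImageKey];

    // Render with a CIContext created up front, then push the result to the
    // video layer on the main thread. The grid and annotation layers sit
    // above this layer and are untouched here.
    CGImageRef rendered = [self.ciContext createCGImage:enhanced
                                               fromRect:[enhanced extent]];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.videoLayer.contents = (__bridge id)rendered;
        CGImageRelease(rendered);
    });
}
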
--Kyle Sluder

> On Nov 25, 2013, at 9:53 AM, Motti Shneor <su...@bezeqint.net> wrote:
> 
> Hello everyone. 
> 
> This may seem like a novice question, but I have scanned Apple's Mac Dev
> Center site and dived into all kinds of documentation, to no avail.
> It seems that some basic functionality that was once beautifully covered by
> the grand "QuickTime" API set has been split across so many parts of the
> system that I can't figure out how to do it anymore.
> I'm rewriting an application I wrote 12 years ago on Mac OS 9, using the
> QuickTime Sequence Grabber APIs and the first versions of Carbon.
> 
> My task:
> ---------
> I need to continuously grab frames from an IIDC/DCAM camera connected to
> the Mac via FireWire (IEEE 1394) and display them in the application
> window, scaled and somewhat enhanced (contrast, edges), with a grid overlay
> drawn on them. Further, I need to allow a user to draw geometrical objects
> on the live image, and to measure distances and curves by clicking over the
> live video view.
> ---------
> 
> I don't need to record video to disk or to compress the incoming video; I
> just need to display it at the best quality and frame rate I can. The
> camera is mounted on a microscope, and the live image is there so the
> operator can focus the microscope with on-screen feedback, or move the
> objective to search for some microscopic object.
> 
> The original implementation did this without problems on 1999-era Macs,
> using the QuickTime SG (Sequence Grabber) APIs for grabbing video and
> QuickDraw for drawing over the actual GWorlds. I also converted the color
> spaces by hand, and optimized here and there, until I was able to reach
> 15 fps with a 2-megapixel camera on a PowerPC G3 iMac of that time.
> 
> Now, not only can I not find an API set that will let me grab video from a
> camera, I find so many frameworks involved in video that I can't see the
> connection points between them. Embarrassingly complicated and incomplete
> APIs.
> 
> I know I'll need Quartz to draw over the image. I know CoreGraphics will
> be involved in the layering and CoreImage in the image enhancements. I
> don't know if I need CoreVideo, although it is about manipulating video as
> it is displayed. I don't know if I need QTKit or AVFoundation, or something
> else, or where OpenGL fits into the picture.
> 
> There is NO SAMPLE PROGRAM now, in the whole of Apple's developer site,
> that simply grabs frames from the iSight (the Mac's internal camera) and
> displays them! The last one I have (BrideOfMungGrab) no longer compiles
> with the Mac OS X 10.7 SDK. In the past, the same API was used for grabbing
> from the iSight, DCAM/IIDC cameras, and DV cameras; today I don't know.
> 
> I absolutely need a pointer, or I'm missing something as big as an
> elephant.
> 
> Help please?
> 
> 
> Motti Shneor, 
> ---
> ceterum censeo microsoftiem delendam esse
> ---
