Hi there

I have a few questions about which approach I should choose for developing a real-time video analysis app. I already have a solid knowledge of image analysis as well as Objective-C. The basic idea is an app that takes a real-time video signal, passes it through a set of filters (thresholding, segmentation) that count objects (e.g. beans), and shows the current count.

I'm not really sure what Core Video offers, but as far as I understand from reading the guide, do I need to think of a workflow like the following?

iSight signal - Core Video - buffer - vImage (convolutions on each frame?) - Core Video (compose an output video with an elliptical color outline around each bean?) - results?

Thanks in advance.
_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)

Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com

Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/cocoa-dev/archive%40mail-archive.com

This email sent to [EMAIL PROTECTED]