Hello, I am trying to write a program that will detect bright "blobs" of
light in an image and then track those blobs. It would be a Cocoa
version of OpenTouch at http://code.google.com/p/opentouch/. I am wondering
what the best way is to do this sort of image processing with the Cocoa
frameworks.

I have started this app and use QTKit Capture to grab video from the
webcam. I get my images through QTCaptureDecompressedVideoOutput as a
CIImage. I can apply some filters to the images and display them in an
OpenGL view, but I don't know how I should implement the blob tracking. From
experience, making an NSBitmapImageRep from the CIImage so I can work with
the image data is far too slow, so I can't use the blob detection
library used in OpenTouch. Is it possible, or recommended, to implement the
blob tracking as a CIFilter?
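
For context, here is roughly how I am getting frames now (a trimmed-down
sketch of my setup; the ivar names captureSession and videoOutput are just
what I happen to use, and error handling is omitted):

#import <QTKit/QTKit.h>
#import <QuartzCore/QuartzCore.h>

- (void)startCapture
{
    NSError *error = nil;
    captureSession = [[QTCaptureSession alloc] init];

    // Use the default camera as the input.
    QTCaptureDevice *device =
        [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    [device open:&error];
    QTCaptureDeviceInput *input =
        [[QTCaptureDeviceInput alloc] initWithDevice:device];
    [captureSession addInput:input error:&error];

    // The decompressed output delivers raw frames to its delegate.
    videoOutput = [[QTCaptureDecompressedVideoOutput alloc] init];
    [videoOutput setDelegate:self];
    [captureSession addOutput:videoOutput error:&error];

    [captureSession startRunning];
}

// Delegate callback: wrap each frame in a CIImage and hand it to my filters.
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    CIImage *frame = [CIImage imageWithCVImageBuffer:videoFrame];
    // ...apply CIFilters to 'frame' and draw it in the OpenGL view...
}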

I read through the CIColorTracking sample code, which is very close to what I
want to do. However, CIColorTracking reduces the areas of interest down
to a single location (where to place the duck). I am having trouble seeing how
it could be adapted to track more than one blob of light. Is it possible to
make a CIFilter whose output is an NSArray containing the points
where the blobs were found? I can see how it would be possible to reduce
the image to an alpha mask of the blobs, but I don't know how I would
extract the number of blobs and the location of each from that image. It
would also be desirable to get the size of each blob.
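
To be concrete, the only way I can think of to pull blob locations out of
such a mask is to render a small version of it into a bitmap with a CIContext
and then do a naive connected-components scan on the CPU, roughly like the
sketch below (the bitmap size, threshold, and the Blob struct are placeholders
I made up). I suspect this runs into the same slowness as the
NSBitmapImageRep route, which is why I am asking whether there is a better way.

#import <Foundation/Foundation.h>
#import <QuartzCore/QuartzCore.h>

typedef struct { float x, y; int pixelCount; } Blob;   // made-up struct

- (NSArray *)blobsInMaskImage:(CIImage *)mask context:(CIContext *)ciContext
{
    const int w = 160, h = 120;        // downscaled size, placeholder
    const uint8_t threshold = 128;     // placeholder brightness cutoff
    uint8_t *pixels = calloc(w * h * 4, 1);

    // Render the mask into an 8-bit ARGB buffer.
    [ciContext render:mask
             toBitmap:pixels
             rowBytes:w * 4
               bounds:CGRectMake(0, 0, w, h)
               format:kCIFormatARGB8
           colorSpace:NULL];

    int *labels = calloc(w * h, sizeof(int));
    NSMutableArray *blobs = [NSMutableArray array];

    // Flood fill from each unlabeled bright pixel; each fill is one blob.
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int i = y * w + x;
            if (labels[i] || pixels[i * 4 + 1] < threshold) // red channel
                continue;

            int sumX = 0, sumY = 0, count = 0;
            int *stack = malloc(w * h * sizeof(int)), top = 0;
            labels[i] = 1;
            stack[top++] = i;
            while (top > 0) {
                int p = stack[--top];
                int px = p % w, py = p / w;
                sumX += px; sumY += py; count++;

                // 4-connected neighbors, guarding the image edges.
                int dx[4] = {-1, 1, 0, 0}, dy[4] = {0, 0, -1, 1};
                for (int n = 0; n < 4; n++) {
                    int nx = px + dx[n], ny = py + dy[n];
                    if (nx < 0 || nx >= w || ny < 0 || ny >= h) continue;
                    int q = ny * w + nx;
                    if (labels[q] || pixels[q * 4 + 1] < threshold) continue;
                    labels[q] = 1;
                    stack[top++] = q;
                }
            }
            free(stack);

            // Centroid and pixel count give me location and a rough size.
            Blob b = { (float)sumX / count, (float)sumY / count, count };
            [blobs addObject:[NSValue value:&b withObjCType:@encode(Blob)]];
        }
    }

    free(labels);
    free(pixels);
    return blobs;
}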

I have done a lot of reading and don't seem to be getting anywhere. Some
advice on how to proceed would be greatly appreciated.

Thank You,
Bridger Maxwell