Hi, I've never done image (pixel) data processing in Cocoa before. I've heard that Quartz Composer and Core Image units are the way to go nowadays. But here comes a big "but": when I process the image I need random access to all pixels, because I want to do some Floyd-Steinberg-like dithering. Now:
http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Conceptual/CoreImaging/ci_custom_filters/ci_custom_filters.html#//apple_ref/doc/uid/TP30001185-CH207-TPXREF101 states:

"A kernel routine signature must return a vector (vec4) that contains the result of mapping the source pixel to a destination pixel. Core Image invokes a kernel routine once for each pixel. Keep in mind that your code can’t accumulate knowledge from pixel to pixel."

So I guess a custom Core Image filter is out of the question here. What would be the appropriate thing to look at when planning to do image data processing in Cocoa, given that I need random access to all of the data (mainly to diffuse quantization error to nearby pixels)?

thanks,
Lars
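P.S. To make it concrete, here is a rough sketch of the kind of loop I have in mind: Floyd-Steinberg dithering over the raw buffer of an NSBitmapImageRep. It assumes, purely for illustration, an 8-bit single-channel grayscale rep with no alpha; the function names are made up.

#import <Cocoa/Cocoa.h>

// Add err * weight / 16 to a pixel, clamped to 0..255.
static inline void AddError(unsigned char *p, int err, int weight)
{
    int v = *p + (err * weight) / 16;
    *p = (unsigned char)(v < 0 ? 0 : (v > 255 ? 255 : v));
}

// Floyd-Steinberg dithering on the raw buffer of a bitmap rep.
// Assumes an 8-bit, single-channel (grayscale, no alpha) rep.
static void DitherGrayscaleRep(NSBitmapImageRep *rep)
{
    NSInteger width    = rep.pixelsWide;
    NSInteger height   = rep.pixelsHigh;
    NSInteger rowBytes = rep.bytesPerRow;
    unsigned char *data = rep.bitmapData;   // direct, random access

    for (NSInteger y = 0; y < height; y++) {
        for (NSInteger x = 0; x < width; x++) {
            unsigned char *p = data + y * rowBytes + x;
            int oldVal = *p;
            int quant  = (oldVal < 128) ? 0 : 255;  // quantize to 1 bit
            int err    = oldVal - quant;            // quantization error
            *p = (unsigned char)quant;

            // Diffuse the error to the four classic neighbours
            // (right, below-left, below, below-right) with the
            // Floyd-Steinberg weights 7/16, 3/16, 5/16, 1/16.
            if (x + 1 < width)  AddError(p + 1, err, 7);
            if (y + 1 < height) {
                unsigned char *q = p + rowBytes;
                if (x > 0)          AddError(q - 1, err, 3);
                                    AddError(q,     err, 5);
                if (x + 1 < width)  AddError(q + 1, err, 1);
            }
        }
    }
}

To get a rep in that exact format I'd presumably have to create one myself and draw the source image into it first, but that is a separate question.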