I want to compute the alpha value of each pixel of an image.
What I am doing right now is creating an NSBitmapImageRep from the image and calling colorAtX:y: to get the alpha value from the color at that pixel. I need to do this for each and every pixel in the image.
I was just going through the WWDC 2008 session "916 - Getting Started with Instruments", where they worked on a sample image-enhancement app and had a similar case of computing the color at every pixel.

The presenter says the process can be optimized by first grabbing the imageRep's raw data and then reading the color values directly from that data.
        
So here's how I get the data:

unsigned char *data = [mImageRep bitmapData], *pixel;

Now, how do I access the per-pixel information from that pointer? Any ideas?
I'd be glad if someone could help me optimize this.
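
For what it's worth, here is the rough direction I'm guessing at, assuming the rep is 8 bits per sample, non-planar (meshed), and stores samples with alpha last (e.g. RGBA). I haven't verified that the indexing or those assumptions are right, so corrections are welcome:

// Sketch only: assumes 8-bit samples, non-planar data, alpha as the last sample.
// These assumptions should be checked against -bitsPerSample, -isPlanar,
// -hasAlpha and -bitmapFormat before relying on this.
unsigned char *data = [mImageRep bitmapData];
NSInteger width           = [mImageRep pixelsWide];
NSInteger height          = [mImageRep pixelsHigh];
NSInteger bytesPerRow     = [mImageRep bytesPerRow];    // may include row padding
NSInteger samplesPerPixel = [mImageRep samplesPerPixel];

for (NSInteger y = 0; y < height; y++) {
    unsigned char *row = data + y * bytesPerRow;
    for (NSInteger x = 0; x < width; x++) {
        unsigned char *pixel = row + x * samplesPerPixel;
        // Under the alpha-last assumption, this is the alpha sample (0-255).
        unsigned char alpha = pixel[samplesPerPixel - 1];
        // ... use alpha here, e.g. alpha / 255.0 for a 0.0-1.0 value
    }
}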
