AVFoundation vs Core Image for QR Codes?

2017-09-19 Thread Rick Mann
I'm trying to read Data Matrix codes (like QR Codes) from static images, but it 
seems Core Image can only read QR Codes. AVFoundation seems to have more 
bar-code reading capabilities, but that seems wholly focused on real-time video 
capture.

I'm astonished that the decoding isn't better decoupled from AVFoundation. Am I 
just missing something?
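For reference, the static-image route I mean is CIDetector, whose only barcode type is CIDetectorTypeQRCode; a minimal sketch (fileURL is a placeholder) finds QR codes in a still image but offers no Data Matrix option:

CIImage *image = [CIImage imageWithContentsOfURL:fileURL];
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeQRCode
                                          context:nil
                                          options:@{ CIDetectorAccuracy : CIDetectorAccuracyHigh }];
for (CIQRCodeFeature *feature in [detector featuresInImage:image]) {
    NSLog(@"Payload: %@", feature.messageString);
}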

-- 
Rick Mann
rm...@latencyzero.com




Writing a core image custom filter

2014-02-23 Thread Kevin Meaney
Apologies for the cross-posting from quartz-dev, but these days it is hard to work out which forum (the lists or the dev forums) and which discussion to post to.

I've been playing with creating a Core Image filter for OS X. I've got a simple chroma key filter working as I would like, except for two issues. The first issue is a compile warning which, if I ignore it, doesn't stop the filter from working. I'd still like it resolved, even if only so that I understand what is going on, or because it highlights something I'm doing wrong but non-critical.

The filter is defined like so in the header:

@interface YVSChromaKeyFilter : CIFilter

Because I'm currently using the filter in a simple stand-alone command-line tool, and I want to keep that option open, I'm not packaging the filter up into an image unit (Apple's recommended way to distribute image filters). Instead I'm using the alternate method, which calls registerFilterName:constructor:classAttributes: from the filter's +initialize method:

@implementation YVSChromaKeyFilter

+ (void)initialize
{
    if (self == [YVSChromaKeyFilter class])
    {
        NSArray *kernels = [CIKernel kernelsWithString:YVSChromaKeyFilterString];
        chromaKeyKernel = kernels[0];
        [CIFilter registerFilterName:@"YVSChromaKeyFilter"
                         constructor:(id<CIFilterConstructor>)self
                     classAttributes:@{
                         kCIAttributeFilterDisplayName : @"Simple Chroma Key.",
                         kCIAttributeFilterCategories : @[
                             kCICategoryColorAdjustment, kCICategoryVideo,
                             kCICategoryStillImage, kCICategoryInterlaced,
                             kCICategoryNonSquarePixels]
                     }];
    }
}

+(CIFilter *)filterWithName:(NSString *)name
{
   CIFilter  *filter;
   filter = [[YVSChromaKeyFilter alloc] init];
   return filter;
}

Note the cast in the constructor: argument of registerFilterName:constructor:classAttributes:. It was added to silence the compiler warning but really should not be needed. I tried adding the CIFilterConstructor protocol to the YVSChromaKeyFilter interface, but that made no difference. My filter gets created correctly when I set things up as above and do:

CIFilter *filter = [CIFilter filterWithName:@"YVSChromaKeyFilter"];

Now the protocol CIFilterConstructor interface is as follows:

@protocol CIFilterConstructor
- (CIFilter *)filterWithName:(NSString *)name;
@end

This confuses me, since I have to implement filterWithName: as a class method, not an instance method.

As I said, the filter works and does what I want, but I'm missing something here that I'd like clarified, ideally resolving things in a way that lets the ugly cast be removed.
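One way to avoid the cast entirely (a sketch of an alternative, not something from the docs) would be to register a small helper object instead of the class itself; since the protocol declares an instance method, an instance that implements filterWithName: conforms without any cast. The YVSChromaKeyFilterConstructor name below is illustrative only:

@interface YVSChromaKeyFilterConstructor : NSObject <CIFilterConstructor>
@end

@implementation YVSChromaKeyFilterConstructor
- (CIFilter *)filterWithName:(NSString *)name
{
    // Assuming ARC; add an autorelease here under manual retain/release.
    return [[YVSChromaKeyFilter alloc] init];
}
@end

// ...and in +initialize, pass a shared instance rather than the class:
static YVSChromaKeyFilterConstructor *constructor;
constructor = [[YVSChromaKeyFilterConstructor alloc] init];
NSDictionary *attributes = @{ kCIAttributeFilterDisplayName : @"Simple Chroma Key." };   // plus the category list, as above
[CIFilter registerFilterName:@"YVSChromaKeyFilter"
                 constructor:constructor
             classAttributes:attributes];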

I'll put the second issue into a new e-mail as I think putting it here now will 
just add confusion.

Kevin


Writing a custom core image filter part 2

2014-02-23 Thread Kevin Meaney
In the Core Image Programming Guide's discussion of writing a custom filter, one of the steps described is to write a custom attributes method called customAttributes. As a template I used the customAttributes method implemented in Apple's AVGreenScreenPlayer sample code, and so I have implemented my customAttributes method as follows:

- (NSDictionary *)customAttributes
{
    NSDictionary *inputColorProps;
    inputColorProps = @{ kCIAttributeClass : [CIColor class],
                         kCIAttributeDefault : YVSChromaKeyFilterDefaultInputColor,
                         kCIAttributeType : kCIAttributeTypeOpaqueColor };

    NSDictionary *inputDistanceProps;
    inputDistanceProps = @{ kCIAttributeClass : [NSNumber class],
                            kCIAttributeDefault : YVSChromaKeyFilterDefaultInputDistance,
                            kCIAttributeType : kCIAttributeTypeDistance };

    NSDictionary *inputSlopeWidthProps = @{ kCIAttributeClass : [NSNumber class],
                                            kCIAttributeDefault : YVSChromaKeyFilterDefaultInputSlopeWidth,
                                            kCIAttributeType : kCIAttributeTypeDistance };

    return @{ kCIInputColorKey : inputColor,
              @"inputDistance" : inputDistanceProps,
              @"inputSlopeWidth" : inputSlopeWidthProps };
}

Nothing else is described as being needed in relation to attributes in the Core Image Programming Guide. But when I call the following, where filter is a YVSChromaKeyFilter object:

NSDictionary *attribs = [filter attributes];

I get an exception thrown:

2014-02-23 16:19:00.623 chromakey[19946:303] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSMutableDictionary addEntriesFromDictionary:]: dictionary argument is not an NSDictionary'
*** First throw call stack:
(
    0   CoreFoundation     0x7fff8608e41c __exceptionPreprocess + 172
    1   libobjc.A.dylib    0x7fff808d1e75 objc_exception_throw + 43
    2   CoreFoundation     0x7fff85faabfc -[NSMutableDictionary addEntriesFromDictionary:] + 492
    3   CoreImage          0x7fff8492d4a9 -[CIFilter attributes] + 458
    4   chromakey          0x0001365d -[YVSChromaKeyImageProcessor run] + 1517
    5   chromakey          0x00014115 main + 149
    6   libdyld.dylib      0x7fff839af5fd start + 1
    7   ???                0x000f 0x0 + 15
)
libc++abi.dylib: terminating with uncaught exception of type NSException

So once again, I feel like I'm missing something. Is there something else I 
need to be doing?

Kevin


Re: Writing a custom core image filter part 2

2014-02-23 Thread Sandy McGuffog
In the return statement, did you not perhaps mean inputColorProps rather than 
inputColor?

Sandy

On Feb 24, 2014, at 12:10 AM, Kevin Meaney k...@yvs.eu.com wrote:

 inputColor,
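For what it's worth, the corrected return statement would presumably read:

return @{ kCIInputColorKey : inputColorProps,
          @"inputDistance" : inputDistanceProps,
          @"inputSlopeWidth" : inputSlopeWidthProps };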



Core Image

2012-06-15 Thread Luca Ciciriello
Hi All.
I'm porting some  CoreImage code from MacOS X to iOS and I'm unable to find the 
key KCGImagePropertyOrientation.

My doubt is: Is this key available in iOS?

I'm using iOS 5.1 with Xcode 4.3.3

Thanks in advance for any answer.

Luca.



Re: Core Image

2012-06-15 Thread Roland King
It's in the documentation as iOS 4.0 and later, and it's in the header file too:

grep kCGImagePropertyOrientation *
CGImageProperties.h:IMAGEIO_EXTERN const CFStringRef kCGImagePropertyOrientation IMAGEIO_AVAILABLE_STARTING(__MAC_10_4, __IPHONE_4_0);

Did you pick the right framework? It's normally, and indeed in this case is, 
listed at the top of the documentation for the symbol. 

On Jun 15, 2012, at 8:24 PM, Luca Ciciriello wrote:

 Hi All.
 I'm porting some  CoreImage code from MacOS X to iOS and I'm unable to find 
 the key KCGImagePropertyOrientation.
 
 My doubt is: Is this key available in iOS?
 
 I'm using iOS 5.1 with Xcode 4.3.3
 
 Thanks in advance for any answer.
 
 Luca.
 


Re: Core Image

2012-06-15 Thread Luca Ciciriello
I build against CoreImage.framework and include the <CoreImage/CoreImage.h> header.
The Base SDK is 5.1 and the Deployment Target is 5.1.

The error I get is "Use of undeclared identifier 'KCGImagePropertyOrientation'".

Is there some other header I have to include?

Luca.


On Jun 15, 2012, at 2:41 PM, Roland King wrote:

 It's in the documentation as iOS4.0 and later and it's in the header file too
 
 grep kCGImagePropertyOrientation *
 CGImageProperties.h:IMAGEIO_EXTERN const CFStringRef 
 kCGImagePropertyOrientation  IMAGEIO_AVAILABLE_STARTING(__MAC_10_4, 
 __IPHONE_4_0);
 
 Did you pick the right framework? It's normally, and indeed in this case is, 
 listed at the top of the documentation for the symbol. 
 
 On Jun 15, 2012, at 8:24 PM, Luca Ciciriello wrote:
 
 Hi All.
 I'm porting some  CoreImage code from MacOS X to iOS and I'm unable to find 
 the key KCGImagePropertyOrientation.
 
 My doubt is: Is this key available in iOS?
 
 I'm using iOS 5.1 with Xcode 4.3.3
 
 Thanks in advance for any answer.
 
 Luca.
 


Re: Core Image

2012-06-15 Thread Roland King
What framework does the documentation for kCGImagePropertyOrientation tell you 
to add? In my last mail I said it's right at the top of the documentation page, 
as it usually is for all such things. 

On Jun 15, 2012, at 9:00 PM, Luca Ciciriello wrote:

 I build using CoreImage.framework and including CoreImage/CoreImage.h  header
 The BASE SDK is 5.1 and the Deployment Target is 5.1
 
 The error I get is Use of undeclared identifier KCGImagePropertyOrientation.
 
 Is there some other header I've to include?
 
 Luca.
 
 
 On Jun 15, 2012, at 2:41 PM, Roland King wrote:
 
 It's in the documentation as iOS4.0 and later and it's in the header file too
 
 grep kCGImagePropertyOrientation *
 CGImageProperties.h:IMAGEIO_EXTERN const CFStringRef 
 kCGImagePropertyOrientation  IMAGEIO_AVAILABLE_STARTING(__MAC_10_4, 
 __IPHONE_4_0);
 
 Did you pick the right framework? It's normally, and indeed in this case is, 
 listed at the top of the documentation for the symbol. 
 
 On Jun 15, 2012, at 8:24 PM, Luca Ciciriello wrote:
 
 Hi All.
 I'm porting some  CoreImage code from MacOS X to iOS and I'm unable to find 
 the key KCGImagePropertyOrientation.
 
 My doubt is: Is this key available in iOS?
 
 I'm using iOS 5.1 with Xcode 4.3.3
 
 Thanks in advance for any answer.
 
 Luca.
 


Re: Core Image

2012-06-15 Thread Fritz Anderson

On Fri, June 15, 2012 7:24 am, Luca Ciciriello wrote:
 Hi All.
 I'm porting some  CoreImage code from MacOS X to iOS and I'm unable to
 find the key KCGImagePropertyOrientation.

I notice that you keep spelling it KCGImagePropertyOrientation. The proper
spelling is kCGImagePropertyOrientation, with a lower-case k.

-- F

-- 
Fritz Anderson
Xcode 4 Unleashed - Classics professors ask for it by name.
x4u.manoverboard.org
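For completeness, a sketch of typical usage once ImageIO is imported and linked (imageURL is a placeholder; manual retain/release style, with ARC notes in the comments):

#import <ImageIO/ImageIO.h>   // declares kCGImagePropertyOrientation; link ImageIO.framework

CGImageSourceRef source = CGImageSourceCreateWithURL((CFURLRef)imageURL, NULL);
if (source) {
    NSDictionary *props = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);   // use __bridge_transfer under ARC
    NSNumber *orientation = [props objectForKey:(NSString *)kCGImagePropertyOrientation];
    NSLog(@"Orientation: %@", orientation);   // nil means 1 (top-left) per the docs
    [props release];   // drop this line under ARC
    CFRelease(source);
}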




Re: Core Image

2012-06-15 Thread Luca Ciciriello
Yes this is the problem.

Thanks

Luca.
On Jun 15, 2012, at 4:03 PM, Fritz Anderson wrote:

 
 On Fri, June 15, 2012 7:24 am, Luca Ciciriello wrote:
 Hi All.
 I'm porting some  CoreImage code from MacOS X to iOS and I'm unable to
 find the key KCGImagePropertyOrientation.
 
 I notice that you keep spelling it KCGImagePropertyOrientation. The proper
 spelling is kCGImagePropertyOrientation, with a lower-case k.
 
-- F
 
 -- 
 Fritz Anderson
 Xcode 4 Unleashed - Classics professors ask for it by name.
 x4u.manoverboard.org
 
 
 



Re: Core Image

2012-06-15 Thread Richard Altenburg (Brainchild)
Op 15 jun. 2012, om 14:24 heeft Luca Ciciriello het volgende geschreven:

 I'm porting some  CoreImage code from MacOS X to iOS and I'm unable to find 
 the key KCGImagePropertyOrientation.
 My doubt is: Is this key available in iOS?
 I'm using iOS 5.1 with Xcode 4.3.3

My documentation search came up with this:

kCGImagePropertyOrientation
The intended display orientation of the image. If present, this key is a 
CFNumber value with the same value as defined by the TIFF and EXIF 
specifications. The value specifies where the origin (0,0) of the image is 
located, as shown in Table 1. If not present, a value of 1 is assumed.
Available in iOS 4.0 and later.
Declared in CGImageProperties.h.

What are you trying to accomplish?




Core Image Capture: ICScanner and brightness/contrast adjustments.

2011-10-05 Thread Robert Tillyard
Hello,

I'm struggling to find documentation on using ICScanner apart from the Scanner Browser example, which was really great and helped a lot, but the images that are scanned come out too light.

In some scanner apps the user can set the brightness/contrast then scan to get 
a better image.

Do I need to pass some values to the ICScanner to do this? Again, I can't find any documentation on this, or anything useful in the ICScanner.h, ICDevice.h or ICScannerFunctionalUnit.h headers.

I need to support 10.6 so I don't want to dump all of the code and use another 
framework unless I really have to.

Thanks, regards, Rob.


Determining bounds needed for a Core Image effect

2011-06-07 Thread Graham Cox
I'm using Core Image filters to apply real-time effects to vector objects.

I've run into the problem of determining just how much space I need to 
accommodate any given effect. Currently I just add a fixed percentage to the 
bounds I start with, but it's actually inadequate to do this for many effects, 
which frequently end up running off the edges of the space allotted.

The vector objects have a defined bounds which fully encloses all drawing that 
they do. When these objects are altered, that bounds is used to refresh just 
that part of the view as needed. When a CI Filter is applied, I use that 
bounds, multiply it by some scaling factor, and use that to create an offscreen 
image into which the vector object plus its CI effect is rendered. The 
resulting image is then drawn in the view. The needed space for a given effect 
varies depending on the effect and its parameters, but I see no way to compute 
that reliably. If I make the bounds some enormous scale-up of the original 
bounds to accommodate any potential effect, performance suffers dramatically 
because of all the wasted area of the view that has to be updated.

Is there any way to preflight a Core Image filter effect so I know how much 
space I'll need to draw it?


--Graham
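One idea, offered as an untested assumption rather than a documented guarantee: build the filter chain for the object's rendered image and ask the output CIImage for its extent; for many filters the extent grows to cover the effect's spread, and that rectangle could then size the offscreen buffer. Roughly (inputImage and radius are placeholders):

CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setValue:inputImage forKey:@"inputImage"];
[blur setValue:[NSNumber numberWithFloat:radius] forKey:@"inputRadius"];

CIImage *output = [blur valueForKey:@"outputImage"];
CGRect needed = [output extent];
// Some filters report an infinite extent; fall back to a clamped guess in that case.
if (CGRectIsInfinite(needed))
    needed = CGRectInset([inputImage extent], -4.0f * radius, -4.0f * radius);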




Core Image increases memory use a lot

2011-01-16 Thread Seth Willits


I have a CALayer subclass and I'm drawing into it using Core Image. I'm taking 
a CGImage, blurring, and adding a darkened radial gradient on top of it. Pretty 
simple. Now the weird thing is, using Core Image for drawing this image 
(instead of just drawing with no effects using CGContextDrawImage) increases my 
app's memory usage by 30 MB. That almost doubles the entire usage of my app. 

It's not like I'm leaking any memory here, so why the large permanent increase? 
I'd like to avoid it if I can because I'm getting criticized for using too 
much memory despite it not being my fault. :-p



- (void)drawInContext:(CGContextRef)theContext
{
    CIImage *image = [CIImage imageWithCGImage:(CGImageRef)self.backgroundImage];
    CGRect imageRect = [image extent];

    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:image forKey:@"inputImage"];
    [blur setValue:[NSNumber numberWithInt:2] forKey:@"inputRadius"];

    CIFilter *gradient = [CIFilter filterWithName:@"CIRadialGradient"];
    CGPoint center = CGPointMake(CGRectGetMidX(imageRect), CGRectGetMidY(imageRect));
    [gradient setValue:[CIVector vectorWithX:center.x Y:center.y] forKey:@"inputCenter"];
    [gradient setValue:[NSNumber numberWithInt:0] forKey:@"inputRadius0"];
    [gradient setValue:[NSNumber numberWithInt:600] forKey:@"inputRadius1"];
    [gradient setValue:[CIColor colorWithRed:0.0 green:0.0 blue:0.0 alpha:0.3] forKey:@"inputColor0"];
    [gradient setValue:[CIColor colorWithRed:0.0 green:0.0 blue:0.0 alpha:0.6] forKey:@"inputColor1"];

    CIContext *context = [CIContext contextWithCGContext:theContext options:nil];
    CGRect extent = [image extent];
    [context drawImage:[blur valueForKey:@"outputImage"] inRect:self.bounds fromRect:extent];
    [context drawImage:[gradient valueForKey:@"outputImage"] inRect:self.bounds fromRect:extent];
}



--
Seth Willits





Re: Core Image increases memory use a lot

2011-01-16 Thread Jeff Johnson
Hi Seth.

You might want to try putting an autorelease pool around your method. See the 
Tip in this document:

https://developer.apple.com/library/mac/#documentation/GraphicsImaging/Conceptual/CoreImaging/ci_tasks/ci_tasks.html%23//apple_ref/doc/uid/TP30001185-CH203-TPXREF101

I've used this tip before to fix what appeared to be a memory leak in my app.

-Jeff
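For reference, the wrapping the Tip describes would look roughly like this inside the method from your message:

- (void)drawInContext:(CGContextRef)theContext
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // ... the Core Image setup and drawing from the original message ...

    [pool drain];
}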


On Jan 16, 2011, at 3:03 AM, Seth Willits wrote:

 I have a CALayer subclass and I'm drawing into it using Core Image. I'm 
 taking a CGImage, blurring, and adding a darkened radial gradient on top of 
 it. Pretty simple. Now the weird thing is, using Core Image for drawing this 
 image (instead of just drawing with no effects using CGContextDrawImage) 
 increases my app's memory usage by 30 MB. That almost doubles the entire 
 usage of my app. 
 
 It's not like I'm leaking any memory here, so why the large permanent 
 increase? I'd like to avoid it if I can because I'm getting criticized for 
 using too much memory despite it not being my fault. :-p
 
 
 
   - (void)drawInContext:(CGContextRef)theContext
   {
   CIImage * image = [CIImage 
 imageWithCGImage:(CGImageRef)self.backgroundImage];
   CGRect imageRect = [image extent];
   
   CIFilter * blur = [CIFilter filterWithName:@CIGaussianBlur];
   [blur setValue:image forKey:@inputImage];
   [blur setValue:[NSNumber numberWithInt:2] 
 forKey:@inputRadius];
   
   CIFilter * gradient = [CIFilter 
 filterWithName:@CIRadialGradient];
   CGPoint center = CGPointMake(CGRectGetMidX(imageRect), 
 CGRectGetMidY(imageRect));
   [gradient setValue:[CIVector vectorWithX:center.x Y:center.y]   
 forKey:@inputCenter];
   [gradient setValue:[NSNumber numberWithInt:0.0] 
 forKey:@inputRadius0];
   [gradient setValue:[NSNumber numberWithInt:600.0]   
 forKey:@inputRadius1];
   [gradient setValue:[CIColor colorWithRed:0.0 green:0.0 blue:0.0 
 alpha:0.3]  forKey:@inputColor0];
   [gradient setValue:[CIColor colorWithRed:0.0 green:0.0 blue:0.0 
 alpha:0.6]  forKey:@inputColor1];
   
   CIContext * context = [CIContext 
 contextWithCGContext:theContext options:NULL];
   CGRect extent = [image extent];
   [context drawImage:[blur valueForKey:@outputImage] 
 inRect:self.bounds fromRect:extent];
   [context drawImage:[gradient valueForKey:@outputImage] 
 inRect:self.bounds fromRect:extent];
   }
 
 
 
 --
 Seth Willits



Re: Core Image increases memory use a lot

2011-01-16 Thread Seth Willits



That's not applicable here. I'm not drawing many, many things before returning to the run loop, so my memory usage isn't increasing due to repeated calls. It's just this single image. Just for giggles I tried it anyway, and as expected there's no difference at all.


--
Seth Willits






On Jan 16, 2011, at 5:32 AM, Jeff Johnson wrote:

 Hi Seth.
 
 You might want to try putting an autorelease pool around your method. See the 
 Tip in this document:
 
 https://developer.apple.com/library/mac/#documentation/GraphicsImaging/Conceptual/CoreImaging/ci_tasks/ci_tasks.html%23//apple_ref/doc/uid/TP30001185-CH203-TPXREF101
 
 I've used this tip before to fix what appeared to be a memory leak in my app.
 
 -Jeff
 
 
 On Jan 16, 2011, at 3:03 AM, Seth Willits wrote:
 
 I have a CALayer subclass and I'm drawing into it using Core Image. I'm 
 taking a CGImage, blurring, and adding a darkened radial gradient on top of 
 it. Pretty simple. Now the weird thing is, using Core Image for drawing this 
 image (instead of just drawing with no effects using CGContextDrawImage) 
 increases my app's memory usage by 30 MB. That almost doubles the entire 
 usage of my app. 
 
 It's not like I'm leaking any memory here, so why the large permanent 
 increase? I'd like to avoid it if I can because I'm getting criticized for 
 using too much memory despite it not being my fault. :-p
 
 
 
  - (void)drawInContext:(CGContextRef)theContext
  {
  CIImage * image = [CIImage 
 imageWithCGImage:(CGImageRef)self.backgroundImage];
  CGRect imageRect = [image extent];
  
  CIFilter * blur = [CIFilter filterWithName:@CIGaussianBlur];
  [blur setValue:image forKey:@inputImage];
  [blur setValue:[NSNumber numberWithInt:2] 
 forKey:@inputRadius];
  
  CIFilter * gradient = [CIFilter 
 filterWithName:@CIRadialGradient];
  CGPoint center = CGPointMake(CGRectGetMidX(imageRect), 
 CGRectGetMidY(imageRect));
  [gradient setValue:[CIVector vectorWithX:center.x Y:center.y]   
 forKey:@inputCenter];
  [gradient setValue:[NSNumber numberWithInt:0.0] 
 forKey:@inputRadius0];
  [gradient setValue:[NSNumber numberWithInt:600.0]   
 forKey:@inputRadius1];
  [gradient setValue:[CIColor colorWithRed:0.0 green:0.0 blue:0.0 
 alpha:0.3]  forKey:@inputColor0];
  [gradient setValue:[CIColor colorWithRed:0.0 green:0.0 blue:0.0 
 alpha:0.6]  forKey:@inputColor1];
  
  CIContext * context = [CIContext 
 contextWithCGContext:theContext options:NULL];
  CGRect extent = [image extent];
  [context drawImage:[blur valueForKey:@outputImage] 
 inRect:self.bounds fromRect:extent];
  [context drawImage:[gradient valueForKey:@outputImage] 
 inRect:self.bounds fromRect:extent];
  }
 
 
 
 --
 Seth Willits
 
 




Re: Core Image increases memory use a lot

2011-01-16 Thread Graham Cox

On 16/01/2011, at 8:03 PM, Seth Willits wrote:

 It's not like I'm leaking any memory here, so why the large permanent 
 increase? I'd like to avoid it if I can because I'm getting criticized for 
 using too much memory despite it not being my fault. :-p
 


Who complains about a 30MB memory usage increase these days? Who even notices? 
This is not Mac OS 9, it doesn't matter what your app's memory footprint is 
(within reason).

I expect Core Image is caching stuff for future use - blurs for example are 
expensive to set up. Might be worth comparing the time taken to execute the 
first time this is called with subsequent times - if there's a worthwhile speed 
up you can claim it as a feature - classic speed vs. memory tradeoff.

--Graham


Re: Core Image increases memory use a lot

2011-01-16 Thread Seth Willits
On Jan 16, 2011, at 1:43 PM, Graham Cox wrote:

 It's not like I'm leaking any memory here, so why the large permanent 
 increase? I'd like to avoid it if I can because I'm getting criticized for 
 using too much memory despite it not being my fault. :-p
 
 
 Who complains about a 30MB memory usage increase these days? Who even 
 notices? This is not Mac OS 9, it doesn't matter what your app's memory 
 footprint is (within reason).

That's 30 MB for *a single image*. It happens multiple times and adds up. Who 
notices? Me, but also my customers (really). The recently released rewrite of 
my app QuickPick uses Core Animation and Core Image. It's a document and 
application launcher. Should it be using 150 MB? No. That's clearly excessive. 
Why is it using 150 MB? A little bit is my slight misuse of CA thinking it's 
smarter than it is, but almost 100 MB is just from using Core Image instead of 
Core Graphics.

If I remove just the blur filter, the memory usage isn't any lower. I can 
achieve the same effect (without the blur) using CG/NS and memory usage doesn't 
increase at all, as I would expect. 


 I expect Core Image is caching stuff for future use...

I have the same thought, but if I kill off the filter and every other 
CI-related object, I'd expect that to go away. Creating new filters and 
executing it again isn't significantly faster. 



--
Seth Willits





Re: Core Image increases memory use a lot

2011-01-16 Thread Nick Zitzmann

On Jan 16, 2011, at 2:43 PM, Graham Cox wrote:

 Who complains about a 30MB memory usage increase these days? Who even 
 notices? This is not Mac OS 9, it doesn't matter what your app's memory 
 footprint is (within reason).

Two kinds of people:

1. Experienced users that watch Activity Monitor like a hawk.

2. Inexperienced users who blame your program for slowing down their computer 
once its memory usage spikes and forces the OS to start swapping. Not everyone 
is literate enough to understand swapping, and since the OS outside of Activity 
Monitor gives the user no indication that this is going on (unless the disk is 
nearly full), then they need a scapegoat.

Using available memory is normally a good thing, since loading things from disk 
takes longer than accessing them from memory. But it is a balancing game, 
because using too much memory has negative side effects. So yes, it does 
matter, especially in 64-bit applications, since at least 32-bit apps will 
crash if they go overboard.

Nick Zitzmann
http://www.chronosnet.com/





Applying a Core Image filter to a PDFView, WebView or any view

2009-10-21 Thread Uwe Dauernheim

Hej,

I tried to apply the following CIFilter to a PDFView, a WebView and/or a simple view. I don't get any compile errors, but nothing happens. Am I missing a certain initialization step?


#import <Quartz/Quartz.h>
#import <QuartzCore/CAAnimation.h>
#import <QuartzCore/CoreImage.h>
...
{
    NSArray *filters = nil;
    CIFilter *filter = [CIFilter filterWithName:@"CIPointillize"];
    [filter setDefaults];
    [filter setValue:[NSNumber numberWithFloat:4.0] forKey:@"inputRadius"];
    filters = [NSArray arrayWithObject:filter];

    [NSAnimationContext beginGrouping];
    [[NSAnimationContext currentContext] setDuration:1.5];
    [[webView animator] setContentFilters:filters]; // (a)
    [[pdfView animator] setContentFilters:filters]; // (b)
    [[view animator] setContentFilters:filters]; // (c)
    [NSAnimationContext endGrouping];
}

\\\Uwe
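A guess rather than a confirmed answer: content filters are rendered by Core Animation, so the views may need to be layer-backed before setContentFilters: has any visible effect, e.g.

// Assumption: the views are not yet layer-backed.
[webView setWantsLayer:YES];
[pdfView setWantsLayer:YES];
[view setWantsLayer:YES];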



Core Animation/Core Image Crash

2009-01-30 Thread Simon Haertel

Hi all,

In our application we have a window with a layer-backed NSView which  
has a content filter applied. The view also has a CAAnimation  
attached, which alters one of the filter's parameters.


Our problem is: When we repeatedly close and re-open the window, the  
application crashes. (Note that the window's isReleasedWhenClosed flag  
is not set!) We broke down our setup to the following code, which  
always causes a crash after opening and closing the window a few times:



@implementation MyView

- (void)drawRect:(NSRect)dirtyRect
{
    [[NSColor greenColor] set];
    NSRectFill([self bounds]);
}

@end

@implementation MyController

- (void)toggleWindow:(NSWindow *)window
{
    if ([window isVisible]) {
        [window close];
    } else {
        [window makeKeyAndOrderFront:self];
    }

    [self performSelector:@selector(toggleWindow:) withObject:window afterDelay:0.2];
}

- (void)awakeFromNib
{
    NSWindow *window = [[NSWindow alloc] initWithContentRect:NSMakeRect(0.0, 0.0, 500.0, 300.0)
                                                   styleMask:NSTitledWindowMask
                                                     backing:NSBackingStoreBuffered
                                                       defer:NO]; // not released

    [window setReleasedWhenClosed:NO];
    [window center];
    [[window contentView] setWantsLayer:YES];

    MyView *view = [[MyView alloc] initWithFrame:NSMakeRect(20.0, 20.0, 200.0, 100.0)];
    [[window contentView] addSubview:view];
    [view release];

    CIFilter *bloomFilter = [CIFilter filterWithName:@"CIBloom"];
    [bloomFilter setDefaults];
    [bloomFilter setName:@"bloomFilter"];
    [view setContentFilters:[NSArray arrayWithObject:bloomFilter]];

    CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"filters.bloomFilter.inputRadius"];
    animation.fromValue = [NSNumber numberWithFloat:0.0];
    animation.toValue = [NSNumber numberWithFloat:10.0];
    animation.repeatCount = FLT_MAX;
    [[view layer] addAnimation:animation forKey:@"theAnimation"];

    [self toggleWindow:window];
}

@end


We've already tried removing the filter and animation before the  
window closes (by listening to the NSWindowWillCloseNotification), but  
this only seems to delay the crash a bit.


Does anybody have a clue what's wrong here? Thanks in advance.

Simon Haertel


Use the CATransition to switch one layer from another layer with Core Image effects?

2008-08-02 Thread Cloud Strife
Hi everyone.
Maybe the idea in the subject line sounds a little crazy, but I really want to realize it.
For example, I created two layers with ImageA and ImageB. Now I am displaying ImageA in LayerA to the user. After the user clicks a button, I should display ImageB in LayerB with a Core Image transition effect like page curl, swipe, water ripple and so on.
Can anyone give me some guidance on how to do that? Thank you very much for any help.
Good Luck. :-)

-- 
Best regards.
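One possible route on the Mac (a sketch only; layerA, layerB and superlayer are placeholders, and CISwipeTransition is just an example): attach a Core Image transition filter to a CATransition and perform the layer swap inside it, letting Core Animation drive the filter's inputImage, inputTargetImage and inputTime.

CIFilter *swipe = [CIFilter filterWithName:@"CISwipeTransition"];
[swipe setDefaults];

CATransition *transition = [CATransition animation];
transition.filter = swipe;     // CIFilter-driven transitions are Mac-only
transition.duration = 1.0;

[superlayer addAnimation:transition forKey:@"transition"];
[superlayer replaceSublayer:layerA with:layerB];   // the structural change the transition animates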


Gaussian blur with core image, using CPU or GPU?

2008-05-23 Thread Jordan Woehr
First, I sent this once but I don't think it made it onto the list. I've
done a quick search of the archives and couldn't find it. I apologize in
advance if this ends up being a double post.

Hi everyone,

I'm trying to write a bilateral filter using Core Image, with the specific goal of having it perform the filtering on the GPU for high performance, as this will be for large 4D data sets.

I started out by reading the Core Image Programming Guide and, on the Writing Nonexecutable Filters page, came across this sentence:

Core Image assumes that the ROI coincides with the domain of
definition. This means that nonexecutable filters are not suited for
such effects as blur or distortion.

Does this mean that it is not possible to write a bilateral filter
which does the computations on the GPU?

I've looked at the Core Image gaussian filter. Thus far I cannot find
out whether it is executed on the CPU or GPU and I was wondering if
there is a way to determine this. Is there source code available for
this filter and if so where is it? I have had no luck finding it so
far.

Also, if anyone has experience with this type of filter and could
point me in the right direction for implementing it with Core Image it
would be much appreciated.

Lots of questions I know, but I hope someone can help.

Thank you,
Jordan


Re: Gaussian blur with core image, using CPU or GPU?

2008-05-23 Thread Vijay Malhan


On 23-May-08, at 7:41 PM, Jordan Woehr wrote:

First, I sent this once but I don't think it made it onto the list.  
I've
done a quick search of the archives and couldn't find it. I  
apologize in

advance if this end up being a double post.

Hi everyone,

I'm trying to write a bilateral filter using Core Image with the
specific goal of having it preform the filter on the GPU for high
performance as this will be for large 4d data sets.

I started out by reading the Core Image Programming Guide and came to
the Writing Nonexecutable Filters page and came across this
sentence:

Core Image assumes that the ROI coincides with the domain of
definition. This means that nonexecutable filters are not suited for
such effects as blur or distortion.

Does this mean that it is not possible to write a bilateral filter
which does the computations on the GPU?

I've looked at the Core Image gaussian filter. Thus far I cannot find
out whether it is executed on the CPU or GPU and I was wondering if
there is a way to determine this. Is there source code available for
this filter and if so where is it? I have had no luck finding it so
far.


Have you gone through the FunHouse sample app in the developer examples? I tried this for you: I used FunHouse to apply the Gaussian Blur (it uses Core Image) to a very heavy image. I had Activity Monitor open showing me the CPU usage. With FunHouse, there was no evident increase in CPU usage.

But then on the same heavy image I used Photoshop to apply a Gaussian Blur. There was a clear spike in CPU usage. So I think this kind of shows that the FunHouse implementation, which uses Core Image, uses the GPU for processing.

I'm using a MacBook Pro with an ATI Radeon X1600 graphics card. See if this helps.

There is another way of tiling the huge image data set for better processing. I don't remember it right now, but I'll come back to you with that.





Also, if anyone has experience with this type of filter and could
point me in the right direction for implementing it with Core Image it
would be much appreciated.

Lots of questions I know, but I hope someone can help.

Thank you,
Jordan


Re: Gaussian blur with core image, using CPU or GPU?

2008-05-23 Thread Paul Sargent
Just FYI, Core Image is normally dealt with on the Quartz list rather than here, but not to worry.


On 22 May 2008, at 22:36, Jordan Woehr wrote:

Does this mean that it is not possible to write a bilateral filter
which does the computations on the GPU?


Filters of variable kernel size are tricky (to say the least).

Apple's blur filter seems to generate different GPU programs depending  
on the radius parameter, but it does run on the GPU. It might also be  
multi-pass. You can see that varying the radius of the blur performs quite poorly compared to changing the image it's blurring.


If your filter has a fixed kernel size (that isn't too big), then it can normally be coded. The key problem is that the kernel language has no conditional branching, so loops of variable length aren't allowed; fixed-length loops can simply be unrolled.
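As an illustration (my own sketch, not Apple code): a fixed 3x3 box blur in the CIKernel language, with the neighbourhood unrolled by hand because variable-length loops aren't available.

static NSString *const BoxBlur3x3Kernel = @""
    "kernel vec4 boxBlur3x3(sampler src)\n"
    "{\n"
    "    vec2 dc = destCoord();\n"
    "    vec4 sum = vec4(0.0);\n"
    "    sum += sample(src, samplerTransform(src, dc + vec2(-1.0, -1.0)));\n"
    "    sum += sample(src, samplerTransform(src, dc + vec2( 0.0, -1.0)));\n"
    "    sum += sample(src, samplerTransform(src, dc + vec2( 1.0, -1.0)));\n"
    "    sum += sample(src, samplerTransform(src, dc + vec2(-1.0,  0.0)));\n"
    "    sum += sample(src, samplerTransform(src, dc + vec2( 0.0,  0.0)));\n"
    "    sum += sample(src, samplerTransform(src, dc + vec2( 1.0,  0.0)));\n"
    "    sum += sample(src, samplerTransform(src, dc + vec2(-1.0,  1.0)));\n"
    "    sum += sample(src, samplerTransform(src, dc + vec2( 0.0,  1.0)));\n"
    "    sum += sample(src, samplerTransform(src, dc + vec2( 1.0,  1.0)));\n"
    "    return sum / 9.0;\n"
    "}";

CIKernel *boxBlur = [[CIKernel kernelsWithString:BoxBlur3x3Kernel] objectAtIndex:0];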


You may have more luck coding for OpenGL rather than CoreImage, and  
using GLSL. Unfortunately Apple decided to remove the parts of GLSL  
that the hardware of the time didn't support when they created the  
CIKernel language. Now that hardware is more capable CI is stuck with  
those limitations.


That said, getting a OpenGL solution running will probably need you to  
be familiar with Pixel Buffer Objects (PBOs), Frame Buffer Objects  
(FBOs) and devising some way of putting your 4D data into textures  
(which you'd have to do for CoreImage anyway).


Another advantage of going the OpenGL route is that you'd be able to use 3D textures, rather than just the 2D images you'll be working with in CoreImage. At least that's only one dimension off.



Is there source code available for
this filter and if so where is it? I have had no luck finding it so
far.


You won't find it, it's in a framework that isn't open-source.



Blob Detection with Core Image

2008-05-06 Thread Bridger Maxwell
Hello, I am trying to write a program that will detect bright blobs of light in an image and then track those blobs. It would be a Cocoa version of OpenTouch at http://code.google.com/p/opentouch/. I am wondering about the best way to do this sort of image processing with the Cocoa frameworks.

I have started this app and use QTKit Capture to grab video from the
webcam. I get my images through QTCaptureDecompressedVideoOutput as a
CIImage. I can apply some filters to the images and display them in a
OpenGLView, but I don't know how I should implement the blob tracking. From
experience, making an NSBitmapImageRep from the CIImage so I can work with
the image data is far too slow, so I can't work with the blob detection
library used in OpenTouch. Is it possible, or recommended, to implement the
blob tracking as a CIFilter?

I read through CIColorTracking sample code, which is very close to what I
want to do. However, CIColorTracking simplifies the areas of interest down
to one location (where to place the duck). I am having trouble seeing how it
could be adapted to track more than one blob of light. Is it possible to
make a CIFilter that would have an output NSArray containing the points
where the blobs were found? I could see how it would be possible to simplify
the image down to an alpha mask of the blobs, but don't know how I would
extract the number of blobs and location of each from that image. Also,
getting the size of the blob would be desirable.

I have done a lot of reading and don't seem to be getting anywhere. Some
advice on how to proceed would be greatly appreciated.

Thank You,
Bridger Maxwell


Re: Blob Detection with Core Image

2008-05-06 Thread Bridger Maxwell
I think I was unclear on where I was lost. I didn't think that I would be
able to use the OpenTouch blob detection framework, because I couldn't pass
it a CIImage, and converting the CIImage to an NSBitMapImageRep was too
slow. The only way to pass the image data to the blob detection library was
through the function:

void computeBlobs(int *pixels);


Therefore I thought that I would have to work with the CIImage only, perhaps by making a CIFilter. How would the ObjC wrapper work? Oh, and I think the Cocoa app which you are seeing with the OpenTouch source is actually the one I am working on right now; I have access to the svn. :)
Thank You,
Bridger Maxwell

On Tue, May 6, 2008 at 3:50 AM, Mike Abdullah [EMAIL PROTECTED]
wrote:

 This seems an awful lot of work to me for little gain. If you check out
 the OpenTouch source, they have an example Cocoa app which really requires
 very little extra work. I think you'd be far better off writing an ObjC
 wrapper than creating your own entirely separate framework. Paweł would
 quite likely be happy to even incorporate it into the framework.

 Mike.


 On 6 May 2008, at 08:33, Bridger Maxwell wrote:

  Hello,I am trying to write a program that will detect bright blobs of
  light in an image and then track those blobs of light. I would be a
  Cocoa
  version of OpenTouch at http://code.google.com/p/opentouch/. I am
  wondering
  the best way to do this sort of image processing with Cocoa frameworks.
 
  I have a started this app and use QTKit Capture to grab video from the
  webcam. I get my images through QTCaptureDecompressedVideoOutput as a
  CIImage. I can apply some filters to the images and display them in a
  OpenGLView, but I don't know how I should implement the blob tracking.
  From
  experience, making an NSBitmapImageRep from the CIImage so I can work
  with
  the image data is far too slow, so I can't work with the blob detection
  library used in OpenTouch. Is it possible, or recommended, to implement
  the
  blob tracking as a CIFilter?
 
  I read through CIColorTracking sample code, which is very close to what
  I
  want to do. However, CIColorTracking simplifies the areas of interest
  down
  to one location (where to place the duck). I am having trouble seeing
  how it
  could be adapted to track more than one blob of light. Is it possible to
  make a CIFilter that would have an output NSArray containing the points
  where the blobs were found? I could see how it would be possible to
  simplify
  the image down to an alpha mask of the blobs, but don't know how I would
  extract the number of blobs and location of each from that image. Also,
  getting the size of the blob would be desirable.
 
  I have done a lot of reading and don't seem to be getting anywhere. Some
  advice on how to proceed would be greatly appreciated.
 
  Thank You,
  Bridger Maxwell

Re: Blob Detection with Core Image

2008-05-06 Thread Jean-Daniel Dupas

You have to properly configure your QTVideoContext to get this.

By default, most of the CoreVideo sample code uses  
QTOpenGLTextureContextCreate(), and so, you get CVOpenGLTextureRef.
If you want to retrieve CVPixelBuffers, you have to create your  
QTVisualContext using the QTPixelBufferContextCreate() function instead.
And when you create this kind of context, you have to configure the  
expected pixel format too.


The following sample shows you how to configure this kind of context and how to access pixel data:


http://developer.apple.com/samplecode/QTPixelBufferVCToCGImage/listing1.html
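A minimal sketch of the lock/read/unlock pattern (assuming a 32-bit BGRA pixel buffer; computeBlobs() is the OpenTouch entry point mentioned earlier in the thread, and pixelBuffer is a placeholder):

CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
int *pixels = (int *)CVPixelBufferGetBaseAddress(pixelBuffer);
computeBlobs(pixels);   // hand the raw pixels to the blob-detection routine
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);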



On 6 May 08, at 17:08, Raphael Sebbe wrote:

I understand, processing is made on CPU anyway. The overhead you get is because you duplicate (or redraw) the image before processing it.

I believe you actually get a CVImageBufferRef from QTKit, not a CIImage, which resides in memory (not VRAM, as it comes from a camera anyway). You could get access to pixel data that way:

CVPixelBufferLockBaseAddress
CVPixelBufferGetBaseAddress
CV...Unlock...

Avoiding unnecessary copy. This is not tested.

Raphael
