Thanks for your reply, there is still so much I need to learn.
I thought adding an autorelease pool wasn't necessary when garbage collection is turned on? I suppose creating the NSBitmapImageRep takes so long because I use seven CISourceOverCompositing filters to combine seven images into a single one. Like so:

-----
CIFilter *compose = [CIFilter filterWithName:@"CISourceOverCompositing"];
[compose setValue:image forKey:@"inputBackgroundImage"];
[compose setValue:radialBlur forKey:@"inputImage"];
CIImage *radialBlurFinal = [compose valueForKey:@"outputImage"];
radialBlurFinal = [radialBlurFinal imageByApplyingTransform:
    CGAffineTransformMakeTranslation(blurredImageRect.origin.x,
                                     blurredImageRect.origin.y)];

CIImage *normalImage = [image imageByApplyingTransform:
    CGAffineTransformMakeTranslation(normalImageRect.origin.x,
                                     normalImageRect.origin.y)];

[compose setValue:finalResult forKey:@"inputBackgroundImage"];
[compose setValue:normalImage forKey:@"inputImage"];
finalResult = [compose valueForKey:@"outputImage"];
-------
etc.

Is there a better way to do this? Like drawing into a context and somehow converting it to an image? How should I go about doing this?
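For what it's worth, here is a sketch of one way to fold the layers without creating a fresh compositing filter per pair, using CIImage's own -imageByCompositingOverImage: (the names `layers`, `offsets`, and `backgroundImage` are hypothetical placeholders, not from the code above):

-----
// Sketch, assuming `layers` is an NSArray of CIImage and `offsets` a
// plain C array of NSPoint giving each layer's translation.
CIImage *composite = backgroundImage;
for (NSUInteger i = 0; i < [layers count]; i++) {
    CIImage *layer = [layers objectAtIndex:i];
    layer = [layer imageByApplyingTransform:
        CGAffineTransformMakeTranslation(offsets[i].x, offsets[i].y)];
    // Source-over composite this layer onto the running result.
    composite = [layer imageByCompositingOverImage:composite];
}
-----

Note that this still builds the same lazy Core Image pipeline; the real processing cost is paid at render time either way.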

Thanks,
Jan


On 5 May 2009, at 07:44, cocoa-dev-requ...@lists.apple.com wrote:

I'm running a CIImage through some CIFilters, and I want to save the
returned CIImage to disk. Everything is very fast up until the point
where I try to actually save the image. I first create an
NSBitmapImageRep from my CIImage, get the needed NSData from that
object, and then save it to disk. Not only is this incredibly slow,
Instruments shows that memory fills up extremely fast as well. In fact,
when I feed a couple dozen files into my program, it takes longer and
longer to process them, and eventually the application just freezes.
Here's the relevant code:

for(NSString* file in fileArray) {
.....
//filter the image
CIImage* result = [self filterImage:source];

//saving to disk
NSBitmapImageRep* rep = [[NSBitmapImageRep alloc] initWithCIImage:result];
NSData* PNGData = [rep representationUsingType:NSPNGFileType
                                    properties:nil];
[PNGData writeToFile:targetPath atomically:NO];
....
}

Obviously there must be some better way to do this. And why does my memory fill up? The leak must be in these few lines. If anyone can point me in the right direction, that would be very much appreciated.

Memory is filling up because of autoreleased objects created during
processing. Add an autorelease pool around your code inside your loop.
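Applied to the loop above, that would look roughly like this (pre-ARC idiom; untested sketch):

for (NSString* file in fileArray) {
    // Collect everything autoreleased during this iteration.
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CIImage *result = [self filterImage:source];

    NSBitmapImageRep *rep =
        [[NSBitmapImageRep alloc] initWithCIImage:result];
    NSData *PNGData = [rep representationUsingType:NSPNGFileType
                                        properties:nil];
    [PNGData writeToFile:targetPath atomically:NO];
    [rep release];  // alloc'd above, so balance it (a no-op under GC)

    [pool drain];   // releases the iteration's autoreleased objects
}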

Everything is fast until you save the image because CoreImage is built
on lazy evaluation. When you apply filters and such, CoreImage doesn't
actually do any processing. All it does is build a pipeline. It's only
when you actually request the image data in some way, for example by
drawing it or by converting it to an NSBitmapImageRep, that CoreImage
actually does all the image processing and rendering. This code is
slow because that's where all the real work is actually being done.
You can't fix that unless you somehow lighten the load on CoreImage.
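One thing you can do is skip NSBitmapImageRep entirely and render through a CIContext you create once and reuse, then write the PNG with ImageIO. A sketch, assuming a `ciContext` created outside the loop and the `result`/`targetPath` variables from your code (requires ApplicationServices/ImageIO):

CGImageRef cgImage = [ciContext createCGImage:result
                                     fromRect:[result extent]];
NSURL *url = [NSURL fileURLWithPath:targetPath];
CGImageDestinationRef dest =
    CGImageDestinationCreateWithURL((CFURLRef)url, kUTTypePNG, 1, NULL);
if (dest) {
    CGImageDestinationAddImage(dest, cgImage, NULL);
    CGImageDestinationFinalize(dest);  // actually writes the file
    CFRelease(dest);
}
CGImageRelease(cgImage);

Reusing one CIContext matters: creating a context per image throws away Core Image's cached state each time.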

Mike

_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)
