On Sun, Jun 1, 2008 at 8:49 AM, Brian Christensen <[EMAIL PROTECTED]> wrote:

> On Jun 1, 2008, at 2:55, Stéphane Droux wrote:
>
>> When I run the program in MallocDebug, it seems that one of the threads
>> used by Core Animation to animate the layers doesn't release its memory.
>> So it looks like a bug in Core Animation. However, since animating
>> non-leaf layers is such a core feature of Core Animation, I guess there's
>> something missing in my code.
>
> Are you sure you're getting a leak? I ran your sample code in Instruments
> with the Object Alloc and Leaks tools and I didn't detect any leaks. Object
> allocation and memory usage remained constant. Whether l1 or l2 were being
> animated made no discernible difference.
>
> /brian
>

Brian,

You're right. When l1 is animated, the memory usage increases for a while and
then reaches a peak, after which it remains constant.

I think my example wasn't complex enough to reproduce my application's
behaviour. I've changed the timer function to do random animations, and this
time it really "leaks":

- (void)fromTimer:(NSTimer *)t
{
    // On every timer fire, give l2 a random frame and l1 a random opacity.
    l2.frame = CGRectMake((double)random() / RAND_MAX * 30,
                          (double)random() / RAND_MAX * 30,
                          (double)random() / RAND_MAX * 40,
                          (double)random() / RAND_MAX * 40);
    l1.opacity = (double)random() / RAND_MAX;
}
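
For completeness, the timer is scheduled in the usual way; the 0.1 second
interval and the target below are only illustrative values, not necessarily
what my real application uses:

[NSTimer scheduledTimerWithTimeInterval:0.1   // example interval only
                                 target:self
                               selector:@selector(fromTimer:)
                               userInfo:nil
                                repeats:YES];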


I ran it with Object Alloc and can see a trend of increasing memory usage:
the usage keeps fluctuating up and down, but the general trend is upward, and
it roughly doubled after a minute or so.

Could that be caused by some kind of caching in Core Animation?
If it is, is there a way to flush the cache?
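
In case it helps anyone reproduce this, one diagnostic I could try (just a
guess on my part, not a cache-flushing API) is to disable the implicit
animations for these property changes, to see whether the growth is tied to
the animation objects rather than the layers themselves:

- (void)fromTimer:(NSTimer *)t
{
    // Diagnostic variant: wrap the property changes in a transaction with
    // actions disabled, so no implicit animations are created. If memory
    // stays flat with this version, the growth presumably comes from the
    // implicit animations. (Purely a guess on my part.)
    [CATransaction begin];
    [CATransaction setDisableActions:YES];
    l2.frame = CGRectMake((double)random() / RAND_MAX * 30,
                          (double)random() / RAND_MAX * 30,
                          (double)random() / RAND_MAX * 40,
                          (double)random() / RAND_MAX * 40);
    l1.opacity = (double)random() / RAND_MAX;
    [CATransaction commit];
}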

Thanks
Stephane