Well, I'm a novice at Cocoa/Obj-C programming, so I might be wrong. But
I see that you're setting a new frame on l2 every time fromTimer: is
called, and you never release the old one. The following may solve it,
if that's the problem.

- (void) fromTimer: (NSTimer *) t
{
    [l2.frame release];
    l2.frame = CGRectMake((double)random() / RAND_MAX * 30,
                          (double)random() / RAND_MAX * 30,
                          (double)random() / RAND_MAX * 40,
                          (double)random() / RAND_MAX * 40);
    l1.opacity = (double)random() / RAND_MAX;
}

Or are you using the garbage collector?
By the way, shouldn't this be done via an accessor method, i.e. [l2 setFrame:...]?
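Something like this, untested, and assuming l1 and l2 are CALayer ivars:

```objc
- (void) fromTimer: (NSTimer *) t
{
    // Build the new frame first, then hand it to the setter.
    CGRect newFrame = CGRectMake((double)random() / RAND_MAX * 30,
                                 (double)random() / RAND_MAX * 30,
                                 (double)random() / RAND_MAX * 40,
                                 (double)random() / RAND_MAX * 40);
    [l2 setFrame:newFrame];
    [l1 setOpacity:(double)random() / RAND_MAX];
}
```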

-- 
Gustavo Eulalio
[EMAIL PROTECTED]

On Sun, Jun 1, 2008 at 6:03 AM, Stéphane Droux <[EMAIL PROTECTED]> wrote:
> On Sun, Jun 1, 2008 at 8:49 AM, Brian Christensen <[EMAIL PROTECTED]> wrote:
>
>> On Jun 1, 2008, at 2:55 , Stéphane Droux wrote:
>>
>>> When I run the program in MallocDebug, it seems that one of the
>>> threads used by Core Animation to animate the layers doesn't release
>>> its memory. So it looks like a bug in Core Animation. However, since
>>> animating non-leaf layers is such a core feature of Core Animation, I
>>> guess there's something missing in my code
>>>
>>
>> Are you sure you're getting a leak? I ran your sample code in Instruments
>> with the Object Alloc and Leaks tools and I didn't detect any leaks. Object
>> allocation and memory usage remained constant. Whether l1 or l2 were being
>> animated made no discernible difference.
>>
>> /brian
>>
>>
>
> Brian,
>
> You're right, when l1 is animated the memory usage increases for a while and
> then reaches a peak from which it remains constant.
>
> I think my example wasn't complex enough to reproduce my application
> behaviour.
> I've changed the timer function to do random animations and this time it
> really "leaks":
>
> - (void) fromTimer: (NSTimer *) t
> {
>     l2.frame = CGRectMake((double)random() / RAND_MAX * 30,
>                           (double)random() / RAND_MAX * 30,
>                           (double)random() / RAND_MAX * 40,
>                           (double)random() / RAND_MAX * 40);
>     l1.opacity = (double)random() / RAND_MAX;
> }
>
>
> I ran it in Object Alloc and can see a trend of increasing memory usage:
> the usage keeps fluctuating up and down, but the general trend is upward,
> and it roughly doubled after a minute or so.
>
> Could that be caused by some kind of caching in Core Animation?
> If it is, is there a way to flush the cache?
>
> Thanks
> Stephane
_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)

Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com

Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/cocoa-dev/archive%40mail-archive.com
