On Sat, May 31, 2008 at 11:14 PM, Brian Christensen [EMAIL PROTECTED]
wrote:
On May 31, 2008, at 16:39, Stéphane Droux [EMAIL PROTECTED] wrote:
When I run the program in MallocDebug, the memory usage goes up every
time the timer function is executed. If I change the animation to be on l2
instead
On Sun, Jun 1, 2008 at 8:49 AM, Brian Christensen [EMAIL PROTECTED] wrote:
On Jun 1, 2008, at 2:55, Stéphane Droux wrote:
When I run the program in MallocDebug, it seems that one of the threads
used
by Core Animation to animate the layers doesn't release its memory.
So it looks like a bug
On Sun, Jun 1, 2008 at 1:40 PM, Brian Christensen [EMAIL PROTECTED] wrote:
Even with this new code I'm still not observing any leaking. Are you using
garbage collection? With GC enabled you will observe fluctuations until the
collector gets a chance to free up unused memory, but even then
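To rule out collector lag while measuring, the collection can be forced before reading the numbers. A minimal sketch using the 10.5-era GC API; this only applies if the process is actually running with garbage collection enabled:

```objc
#import <Foundation/Foundation.h>

// Force an exhaustive collection pass so MallocDebug readings
// reflect genuinely live memory rather than uncollected garbage.
// (No effect in a non-GC process.)
[[NSGarbageCollector defaultCollector] collectExhaustively];
```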
On Sun, Jun 1, 2008 at 3:18 PM, Jonathan del Strother
[EMAIL PROTECTED] wrote:
If you kill the timer after it's been running for a while, does the
memory usage drop back down? Maybe the implicit animations are never
completing before a new one gets added, so they're just stacking up on
top
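One way to test that hypothesis, sketched with placeholder names (`timer` and `layer` are stand-ins for whatever the app actually uses):

```objc
// Stop driving new animations, then watch whether memory drops
// back once the pending ones complete.
[timer invalidate];
timer = nil;

// Implicit animations are attached to the layer under the animated
// property's key, so a growing key list would suggest they are
// stacking up rather than completing.
NSLog(@"pending animation keys: %@", [layer animationKeys]);
```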
On Sun, Jun 1, 2008 at 6:31 PM, Brian Christensen [EMAIL PROTECTED] wrote:
I would consider that to be expected behavior. If you aren't ever releasing
the layers you created, why would any of the relevant memory be freed? The
timer and the animations it is causing to be performed should not
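For reference, under manual retain/release the usual lifecycle looks something like the following sketch (`parentLayer` is a placeholder); the layer tree retains its sublayers, so a layer must be both removed from the hierarchy and released to be freed:

```objc
CALayer *layer = [[CALayer alloc] init];   // +1 from alloc
[parentLayer addSublayer:layer];           // superlayer retains it
// ... animate the layer ...
[layer removeFromSuperlayer];              // superlayer releases its reference
[layer release];                           // balance the alloc
```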
Hi,
I'm currently writing an app that creates a complex hierarchy of CALayers.
Every time I animate one of the layers, I get a memory leak.
In order to investigate the issue, I created a simple Cocoa project in which
I only added a CustomView and the following code for this view:
@implementation