Hi Thomas,

Fixing memory consumption problems is hard, but important: memory-efficient 
code is automatically faster in the long run as well.

Your issue sounds serious. However, I would start by trying to figure out what 
is happening at your coding level: somehow you (or something you use) must be 
holding on to too much memory. Questioning low-level memory management 
functionality should be the last resort, not the first.

There is SpaceTally, which you could run before and after executing part of 
your code and compare the results. Once something unexpected survives GC, 
there is the PointerFinder functionality (Inspector > Explore Pointers) to 
find out what holds onto those objects. But no matter what, it is hard.
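For example, something along these lines (from memory, so the exact selectors 
may differ slightly depending on your Pharo version):

```smalltalk
"Force a full garbage collection first, so that only live objects are counted."
Smalltalk garbageCollect.

"Write a per-class report (instance counts and total space) to a file;
 run this before and after your parsing step and diff the two reports
 to see which classes account for the growth."
SpaceTally new printSpaceAnalysis.
```

Whatever class shows an unexpected number of surviving instances is then a 
good candidate to inspect and explore pointers on.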

If you have some public code that you could share to demonstrate your problem, 
then we could try to help.

Sven

On 09 Apr 2014, at 12:54, Thomas Bany <mun.sys...@gmail.com> wrote:

> Hi,
> 
> My app is a parser/filter for binary files that produces a bunch of ASCII 
> files.
> 
> At the beginning of the parsing, the filtering step involves storing the 
> positions of 32 objects, each second, for a whole day. So that's 32 Arrays 
> with 86400 elements each.
> 
> During this step, the memory used by my image grows from 50 MB to ~500 MB. I 
> find this far too large, since I'm pretty sure my arrays are the largest 
> objects I create and they only weigh something like 300 KB.
> 
> The profiling of the app shows that the footprint of the "old memory" went up 
> by 350 MB, which I'm pretty sure is super bad. Maybe as a consequence, after 
> the parsing is finished, the memory footprint of the image stays at ~500 MB.
> 
> What tools do I have to find precisely where the memory usage explodes? For 
> example, is it possible to browse the "old memory" objects to see which ones 
> fail to get GC'ed?
> 
> Thanks in advance,
> 
> Thomas.
