Thanks!

That's exactly what I was looking for. There is a compare method I don't
quite understand, but I think I found what is going on.

I failed to grasp that an Array replies to #sizeInMemory with its own size
only, without the sizes of the objects it references. A single position
object weighs 96 bytes, which makes a whole Array weigh around 8 MB, and
the 32 arrays around 250 MB.
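For anyone hitting the same confusion, here is roughly how I checked it in a Playground. The deep-size loop is just a sketch I wrote myself, not a stock Pharo method, and the exact byte counts depend on the VM word size:

```smalltalk
"sizeInMemory is shallow: it counts only the receiver's own header and
 slots, not the objects those slots point to."
| positions shallow deep |
positions := Array new: 86400.
shallow := positions sizeInMemory.
"shallow is about 86400 pointer slots plus a header, i.e. a few hundred KB,
 nowhere near the ~8 MB the array really costs once its elements are counted."

"Rough deep size: add each element's own shallow size (one level only,
 no cycle handling - a sketch, not a general utility)."
deep := positions
	inject: shallow
	into: [ :sum :each |
		sum + (each ifNil: [ 0 ] ifNotNil: [ each sizeInMemory ]) ].
```

With ~96 bytes per position object, 86400 * 96 is the ~8 MB per array mentioned above, and 32 of those give the ~250 MB total.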

I'm not sure I can get around that aspect, since the computation is costly
and I need its output multiple times.

I will do further testing to see why the memory is not released at the
end of the execution.
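For the record, this is the kind of before/after check I plan to run, following Sven's suggestion. The selectors are the ones I found in my image (SpaceTally, PointerFinder); they may differ slightly between Pharo versions, and the suspect class name is of course just a placeholder:

```smalltalk
"Force a full GC, then dump a per-class space report to compare
 before and after the parsing run."
Smalltalk garbageCollect.
SpaceTally new printSpaceAnalysis.
"writes a report with instance counts and total bytes per class"

"If instances of some class survive GC unexpectedly, find what still
 references one of them (same as Inspector > Explore Pointers):"
(PointerFinder on: MySuspectClass someInstance) explore.
```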

Thanks again!



2014-04-09 15:19 GMT+02:00 Sven Van Caekenberghe <s...@stfx.eu>:

> Hi Thomas,
>
> Fixing memory consumption problems is hard, but important: memory
> efficient code is automatically faster in the long run as well.
>
> Your issue sounds serious. However, I would start by trying to figure out
> what is happening at your coding level: somehow you (or something you use)
> must be holding on to too much memory. Questioning low-level memory
> management functionality should be the last resort, not the first.
>
> There is SpaceTally that you could use before and after running part of
> your code. Once something unexpected survives GC, there is the
> PointerFinder functionality (Inspector > Explore Pointers) to find what
> holds onto objects. But no matter what, it is hard.
>
> If you have some public code that you could share to demonstrate your
> problem, then we could try to help.
>
> Sven
>
> On 09 Apr 2014, at 12:54, Thomas Bany <mun.sys...@gmail.com> wrote:
>
> > Hi,
> >
> > My app is a parser/filter for binary files that produces a bunch of
> ASCII files.
> >
> > At the beginning of the parsing, the filtering step involves storing
> the positions of 32 objects, once per second for a whole day. So that's 32
> Arrays with 86400 elements each.
> >
> > During this step, the memory used by my image grows from 50 MB to ~500 MB.
> I find that far too large, since I'm pretty sure my arrays are the largest
> objects I create and only weigh something like 300 kB.
> >
> > Profiling the app shows that the footprint of the "old memory" went up
> by 350 MB, which I'm pretty sure is super bad. Maybe as a consequence,
> after the parsing is finished, the memory footprint of the image stays at
> ~500 MB.
> >
> > What tools do I have to find where precisely the memory usage explodes?
> For example, is it possible to browse the "old memory" objects to see
> which ones fail to get GC'ed?
> >
> > Thanks in advance,
> >
> > Thomas.
>
>
>