I have found a "orphaned object snippet" elsewhere on the forum and run
this each time round the loop - this didn't print anything, the only object
was the file itself, which seemed to make sense - I'll try adding some
explicit close's too - just in case.
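
For reference, the check I'm running is roughly along these lines - just a
sketch that uses the C API's H5Fget_obj_count/H5Fget_obj_ids through the
file's hid_t; the function and variable names here are illustrative rather
than the exact snippet I found:

#include <hdf5.h>
#include <H5Cpp.h>
#include <iostream>
#include <vector>

// Report anything still open in the file besides the file handle itself.
// `file` is assumed to be an open H5::H5File.
void reportOpenObjects(H5::H5File &file)
{
    const unsigned kinds = H5F_OBJ_DATASET | H5F_OBJ_GROUP |
                           H5F_OBJ_DATATYPE | H5F_OBJ_ATTR;
    hid_t fid = file.getId();

    ssize_t count = H5Fget_obj_count(fid, kinds);
    if (count <= 0)
        return; // nothing left open besides the file

    std::vector<hid_t> ids(static_cast<size_t>(count));
    H5Fget_obj_ids(fid, kinds, ids.size(), ids.data());

    for (hid_t id : ids) {
        char name[256] = "";
        H5Iget_name(id, name, sizeof(name));
        std::cout << "still open: " << name
                  << " (type " << static_cast<int>(H5Iget_type(id)) << ")\n";
    }
}

Calling this at the end of each pass round the loop (and again just before
exiting main, as Mark suggested) should show whether the C++ wrappers are
leaving anything open when objects fall out of scope.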

Thanks,
- Jorj

On Thu, 13 Aug 2015 at 17:38 Miller, Mark C. <[email protected]> wrote:

> Hmm. Well, I have no experience with HDF5's C++ interface.
>
> My first thought when reading your description was... I've seen that
> before. It happens when I've forgotten to H5Xclose() all the objects I
> H5Xopened (groups, datasets, types, dataspaces, etc.).
>
> However, with C++, I presume the interface is designed to close objects
> when they fall out of scope (e.g. the destructor is called). So, looking
> at your code, even though I don't see any explicit calls to close objects
> previously opened, I assume that *should* be happening when the objects
> fall out of scope. But are you *certain* that *is* happening? Just before
> exiting main, you might want to call H5Fget_obj_count() to get some idea
> of how many objects the HDF5 library thinks are still open in the file.
> If you get a large number, that would suggest the problem is that the C++
> interface somehow isn't closing objects as they fall out of scope.
>
> That's all I can think of. Sorry if it's no help.
>
> Mark
>
>
> From: Hdf-forum <[email protected]> on behalf of Jorj
> Pimm <[email protected]>
> Reply-To: HDF Users Discussion List <[email protected]>
> Date: Thursday, August 13, 2015 9:21 AM
> To: "[email protected]" <[email protected]>
> Subject: [Hdf-forum] Growing memory usage in small HDF program
>
> Hello,
>
> I am writing an application which writes large data sets to HDF5 files,
> in fixed-size blocks, using the HDF C++ API (version 1.8.15, patch 1,
> built with MSVC 2013 x64).
>
> My application seems to quickly consume all the available memory on my
> system (win32 - around 5.9 GB), and then crashes whenever the system
> becomes stressed (Windows kills it as it has run out of memory).
>
> I have also tested the application on a Linux machine, where I saw
> similar results.
>
> I was under the impression that by using HDF5, the file would be brought
> in and out of memory in such a way that the library would only use a small
> working set - is this not true?
>
> I have experimented with HDF5 features such as flushing to disk,
> regularly closing and re-opening the file, garbage collection, and tuning
> chunking and caching settings, but haven't managed to get a stable
> working set.
>
> I've attached a minimal example; can anyone point out my mistake?
>
> Thanks,
> - Jorj
>