Matthew,

If you're using MSVC, it has some useful tools for pinpointing memory leaks.
See this article for an example:
http://www.david-amador.com/2010/10/tracking-memory-leaks-in-visual-studio/
I believe there are ways to restrict the dump to allocations made between
two specific points in your code, which would let you find memory that is
freed at exit but shouldn't be accumulating in the first place. Sorry, I
can't remember the specific calls for that, but they're in the same part
of the API.
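
If it helps, I believe it goes through the _CrtMemState checkpoint/difference
functions in <crtdbg.h>. A rough sketch, where do_suspect_work() just stands
in for whatever code you suspect (e.g. your open/read/close cycle):

#define _CRTDBG_MAP_ALLOC
#include <stdlib.h>
#include <crtdbg.h>

extern void do_suspect_work(void);     /* stand-in for the code under test */

void check_for_growth(void)
{
    _CrtMemState before, after, diff;

    _CrtMemCheckpoint(&before);        /* snapshot before the suspect code */
    do_suspect_work();
    _CrtMemCheckpoint(&after);         /* snapshot after it */

    /* _CrtMemDifference returns nonzero if the two states differ */
    if (_CrtMemDifference(&diff, &before, &after)) {
        _CrtMemDumpStatistics(&diff);          /* summary of the growth */
        _CrtMemDumpAllObjectsSince(&before);   /* allocations still live
                                                  since the checkpoint */
    }
}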

David

On Tue, Mar 29, 2016 at 10:39 AM, Xavier, Matthew <[email protected]>
wrote:

> I am developing on Windows, so valgrind isn’t available to me (and thus I
> don’t know the exact semantics of --show-reachable), but the memory
> profilers I’ve tried also don’t detect this ‘leak’. If HDF5 shuts down
> cleanly, it frees the conversion path cache. I found it because I was
> chasing a bug report consisting of “Why does heap memory grow by 50MB each
> time I open and then close a [file]?” (A file in this case is actually a
> more complicated structure and may cause hundreds of HDF5 files to be
> accessed.)
>
>
>
> I think that the presence of a vlen field is important. As I recall, the
> type comparison for compound types does not check what files the types were
> defined in, but if a vlen field is present, then the files in which the
> vlen field types were defined can make two otherwise identical compound
> types compare as different.
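>
> To make the shape concrete, here is roughly what one of these types looks
> like in my code (the names are just illustrative):
>
> #include "hdf5.h"
>
> typedef struct {          /* illustrative in-memory layout */
>     int   id;
>     hvl_t samples;        /* variable-length field */
> } record_t;
>
> static hid_t make_record_type(void)
> {
>     hid_t vlen = H5Tvlen_create(H5T_NATIVE_DOUBLE);
>     hid_t cmp  = H5Tcreate(H5T_COMPOUND, sizeof(record_t));
>
>     H5Tinsert(cmp, "id",      HOFFSET(record_t, id),      H5T_NATIVE_INT);
>     H5Tinsert(cmp, "samples", HOFFSET(record_t, samples), vlen);
>
>     H5Tclose(vlen);       /* H5Tinsert copies the member type */
>     return cmp;
> }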
>
>
>
> I have not used committed datatypes. I will take a look at that function
> and see if it changes the behavior.
>
>
>
> Matthew
>
>
>
> *From:* Hdf-forum [mailto:[email protected]] *On
> Behalf Of *Miller, Mark C.
> *Sent:* Tuesday, March 29, 2016 12:02 PM
> *To:* HDF Users Discussion List <[email protected]>
> *Subject:* Re: [Hdf-forum] H5T module 'leaks' conversion paths
>
>
>
> I use compound types a lot *but not* with vlen fields.
>
>
>
> I often run valgrind --leak-check=yes --show-reachable=yes on my HDF5 code
> and do not observe the leak you describe.
>
>
>
> But, I also 'commit' the types to the files they are defined in.
>
>
>
> I wonder if committing the types would make a difference in the memory
> behavior you are seeing?
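>
> For reference, a minimal sketch of what I mean by committing (the file,
> type, and member names here are just examples):
>
> #include "hdf5.h"
>
> int main(void)
> {
>     hid_t file = H5Fcreate("example.h5", H5F_ACC_TRUNC,
>                            H5P_DEFAULT, H5P_DEFAULT);
>
>     /* an ordinary compound type: two doubles, no vlen members */
>     hid_t cmp = H5Tcreate(H5T_COMPOUND, 2 * sizeof(double));
>     H5Tinsert(cmp, "x", 0,              H5T_NATIVE_DOUBLE);
>     H5Tinsert(cmp, "y", sizeof(double), H5T_NATIVE_DOUBLE);
>
>     /* commit (name) the datatype in the file it will be used with */
>     H5Tcommit2(file, "/point_type", cmp,
>                H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
>
>     H5Tclose(cmp);
>     H5Fclose(file);
>     return 0;
> }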
>
>
>
> Mark
>
>
>
> *From: *Hdf-forum <[email protected]> on behalf of
> "Xavier, Matthew" <[email protected]>
> *Reply-To: *HDF Users Discussion List <[email protected]>
> *Date: *Monday, March 28, 2016 2:15 PM
> *To: *"[email protected]" <[email protected]>
> *Subject: *[Hdf-forum] H5T module 'leaks' conversion paths
>
>
>
> Some of the datasets I have contain compound types with vlen fields. When
> I read these datasets, HDF5 creates new conversion ‘paths’ to convert
> between the file types and memory types involved. HDF5 caches these paths
> (see struct H5T_g defined in H5T.c).
>
>
>
> I’ve finally traced a memory ‘leak’ in my application to the unbounded
> growth of the conversion path cache. HDF5 treats vlen types as different if
> they come from different files, so I get a new set of conversion paths for
> every file I open, even if the types are actually identical.
>
>
>
> That would be fine, except that I can’t find a way to get rid of the
> cached paths when I close a file. There is a function provided for removing
> paths, H5Tunregister(pers, name, src_id, dst_id, func), but it does not
> work for compound types because of the way that the pers parameter is
> handled. If I pass H5T_PERS_HARD, no compound type conversions are removed
> because H5T_path_t.is_hard is set to false by H5T_path_find() when it falls
> back on the compound->compound soft conversion and generates a new path.
> Alternatively, if I pass H5T_PERS_DONTCARE or H5T_PERS_SOFT, H5Tunregister()
> removes the default compound->compound soft conversion and I can't read any
> more datasets because the library can't create conversion paths for them.
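>
> In code, the two attempts look roughly like this (file_type and mem_type
> stand for the datatype ids involved in the conversion, e.g. from
> H5Dget_type() and a matching in-memory definition):
>
> #include "hdf5.h"
>
> static void try_to_drop_paths(hid_t file_type, hid_t mem_type)
> {
>     /* Attempt 1: remove only hard paths for this pair.  Nothing happens,
>      * because the path H5T_path_find() generated for the compound
>      * conversion is flagged as soft (is_hard == FALSE). */
>     H5Tunregister(H5T_PERS_HARD, NULL, file_type, mem_type, NULL);
>
>     /* Attempt 2: match regardless of hard/soft.  This also removes the
>      * library's default compound->compound soft conversion, after which
>      * no new compound conversion paths can be built. */
>     H5Tunregister(H5T_PERS_DONTCARE, NULL, file_type, mem_type, NULL);
> }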
>
>
>
> Incidentally, I also discovered that the way the type comparison function
> determines file identity depends on pointers that are left dangling when a
> file is closed, which both complicated my minimal reproduction of the
> problem and undermines the file identity check. (When the same
> allocation is re-used from the free list for a different file, types can
> compare as the same even when they are from different files, which is
> contrary to the intent of the code.)
>
>
>
> I have attached a small program that reproduces the problem. It takes one
> argument: a path at which it can write a temporary file. Running it
> requires a custom build of HDF5 so that the test program can read the
> size of the path table. (Alternatively, you can comment out the relevant
> parts of the test program and inspect H5T_g.npaths with a debugger.)
>
>
>
> Has anyone else encountered and/or found a solution for this problem? I am
> already patching my own HDF5 builds to get Unicode file name support on
> Windows, so if I have to make code changes it’s not the end of the world.
>
>
>
> Thanks,
>
> Matthew Xavier
>
>
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5
