-----Original Message-----
From: Hdf-forum [mailto:[email protected]] On Behalf Of 
Kochunas, Brendan
Sent: Thursday, January 11, 2018 11:14 AM
To: [email protected]
Subject: [Hdf-forum] Heap Fragmentation in H5Zdeflate?

Hi,

  I have a detailed question about a problem I am encountering using the gzip 
filter in HDF5.

The problem manifests as a failure to allocate the output buffer in H5Zdeflate 
(line 135), which is reported as unavailable resources. I have debugged this quite 
a bit and have confirmed that the buffer being allocated is only a few MB, and 
that roughly 60 GB of memory is still available on the machine at the time the 
allocation fails. I strongly suspect heap fragmentation (i.e., there is no 
contiguous 1 MB block of memory available, even though plenty of total memory is free).

This only happens on one particular machine. I understand there is a lot of 
important subtlety to chunk size and chunk cache size, so I'm wondering whether 
anyone has encountered similar behavior, i.e., heap fragmentation triggered in 
H5Zdeflate, and whether anyone can offer guidance on resolving it. My next step 
is to experiment with the chunk cache settings (H5Pset_cache) described here:
https://support.hdfgroup.org/HDF5/doc/_topic/Chunking/Chunking_Tutorial_EOS13_2009.pdf

Also happy to provide more details as needed.


Thanks in advance,
-Brendan


_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5

Reply via email to