Patrick,
On Jan 16, 2015, at 3:29 PM, Patrick Weinandy
<[email protected]> wrote:
Hi All,
I'm currently on UNIX, using the hdf5-1.8.14 package (standard installation).
I have a .gh5 file with many datasets in this format:
DATATYPE  H5T_IEEE_F32LE
DATASPACE  SIMPLE { ( 133004 ) / ( H5S_UNLIMITED ) }
STORAGE_LAYOUT {
   CHUNKED ( 4096 )
   SIZE 3406 (156.200:1 COMPRESSION)
}
FILTERS {
   COMPRESSION DEFLATE { LEVEL 4 }
}
I'm simply trying to extract a dataset using the command
"./tools/h5dump/h5dump -d"; however, I keep getting this error:

h5dump error: unable to print data
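(For completeness, the full invocation looks something like the line below; the
file name is a placeholder, and the dataset path matches the /MEASURED output
further down:)

# dump a single dataset by its path (file name is a placeholder)
./tools/h5dump/h5dump -d /MEASURED data.gh5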
This error usually occurs when a compression filter is not available to HDF5.
The deflate filter (zlib compression) is enabled by default and will be
configured in if the libz.* libraries are present on the build system in the
/usr/lib directory. (I am assuming you are on a UNIX system.)
Could you please check the libhdf5.settings file found under the lib directory
of the HDF5 installation point? This is a text file. Check whether the line
shown below contains "deflate(zlib)":

I/O filters (external): deflate(zlib),szip(encoder)

If "deflate" is not there, you will need to rebuild HDF5 to get your data; but
first, please make sure that you have zlib on your system.
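For example, something along these lines (the installation paths here are
placeholders; --with-zlib is the standard configure option):

# check which filters the installed library was built with
grep "I/O filters" /path/to/hdf5-install/lib/libhdf5.settings

# if deflate(zlib) is missing, rebuild pointing configure at zlib
./configure --prefix=/path/to/hdf5-install --with-zlib=/usr
make && make install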
The output file is created but the data area is empty.
DATASET "/MEASURED" {
DATATYPE H5T_IEEE_F32LE
DATASPACE SIMPLE { ( 133004 ) / ( H5S_UNLIMITED ) }
DATA {
}
I have been able to run other variations and commands without any issues
(h5repack, h5stat, h5dump -a/-H/-n, etc.).
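(Presumably those succeed because header-only and metadata runs never have to
decode the compressed chunks, e.g.:)

# header information only -- no raw data is read, so deflate is not needed
./tools/h5dump/h5dump -H data.gh5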
When I check using "--enable-error-stack", this is the output:
*** glibc detected *** /training/config/hdf5/hdf5-1.8.14/tools/h5dump/.libs/lt-h5dump: double free or corruption (!prev): 0x0000000001a9e430 ***
======= Backtrace: =========
/lib64/libc.so.6[0x34e4275e76]
/lib64/libc.so.6[0x34e42789b3]
/training/config/hdf5/hdf5-1.8.14/tools/h5dump/.libs/lt-h5dump[0x407ef1]
/lib64/libc.so.6(__libc_start_main+0xfd)[0x34e421ed5d]
/training/config/hdf5/hdf5-1.8.14/tools/h5dump/.libs/lt-h5dump[0x4053f9]
======= Memory map: ========
...
....
Aborted (core dumped)
Unfortunately, I cannot reproduce this error when I use HDF5 built without
zlib. Could you please contact our Helpdesk ([email protected]) and send us
your file for further investigation?

Thank you!

Elena
I can create a brand-new file with no compression, copy the dataset from the
file I can't extract from into the new file, and then use h5dump on the new
file (so I don't think it's memory related). I'm leaning towards something
related to the original file's compression. I'm also unable to remove the
compression/filter on the file; I receive an error for each dataset: "file
cannot be read, deflate filter is not available".
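(For reference, these are the kinds of commands I mean; the file names are
placeholders:)

# copy one dataset into a fresh file
h5copy -i original.gh5 -o fresh.h5 -s /MEASURED -d /MEASURED

# attempt to remove all filters -- this is the step that errors out for me
h5repack -f NONE original.gh5 uncompressed.h5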
Any help/direction/insight is much appreciated.
Thank you
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]<mailto:[email protected]>
http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5