Hi Hei,

Yes, the HDF5 installation needs to include support for LZO compression in order
to read your file. The link you mentioned is the correct location to obtain the
source code for LZO:

   https://support.hdfgroup.org/services/contributions.html

You can get the HDF5-1.8 source code (version HDF5-1.8.19) from here:

   https://support.hdfgroup.org/HDF5/release/obtainsrc518.html

As of HDF5-1.8.11, we provide support for dynamically loaded filters, which makes
third-party filters such as LZO available at runtime.
See this page for details:

   https://support.hdfgroup.org/HDF5/doc/Advanced/DynamicallyLoadedFilters/
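
For example, once the LZO filter has been built as a shared-library plugin, you
can point HDF5 at it with the HDF5_PLUGIN_PATH environment variable, and h5dump
should then be able to decompress your datasets. (The plugin directory below is
only an illustration; use whatever location your plugin actually installs to.)

   $ export HDF5_PLUGIN_PATH=/usr/local/hdf5/lib/plugin   # directory containing the filter shared library
   $ h5dump -d /A/B/960 my.h5                             # the filter plugin is loaded at runtime when needed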

It should be fine to set optimization to -O3. You can set that in CFLAGS before
you build the software.
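
For example, when building HDF5-1.8.19 from source with the Autotools (the
install prefix below is just an example):

   $ cd hdf5-1.8.19
   $ CFLAGS="-O3" ./configure --prefix=/usr/local/hdf5-1.8.19 --with-zlib
   $ make
   $ make check
   $ make install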

-Barbara
[email protected]




From: Hei Chan [mailto:[email protected]]
Sent: Monday, October 23, 2017 9:11 PM
To: HDF Users Discussion List; Barbara Jones
Subject: Re: [Hdf-forum] h5dump error: unable to print data


Hi Barbara,

Thanks for your reply.

The information is very helpful.

I just yum-installed hdf5 1.8.5 from the EPEL repo, so I didn't get to configure
the build parameters.

I think the settings file explains why I couldn't h5dump my file:
                Extra libraries:  -lz -lm
         I/O filters (external): deflate(zlib)

h5dump -pH my.h5:
            FILTERS {
               UNKNOWN_FILTER {
                  FILTER_ID 305
                  COMMENT lzo
                  PARAMS { 1 }
               }

According to https://support.hdfgroup.org/services/contributions.html, 305 is
the filter ID for LZO.  I guess I have to compile HDF5 myself with LZO support.

Before I try compiling again, I noticed libhdf5.settings contains a mix of -O0,
-O2, and -O3.  Is it possible and safe to build with just -O3?
Compiling Options:
------------------
               Compilation Mode: production
                     C Compiler: /usr/bin/gcc ( gcc (GCC) 4.4.7 20120313 )
                         CFLAGS: -O0 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic

Thanks in advance.
On Monday, October 23, 2017, 10:57:11 PM GMT+8, Barbara Jones
<[email protected]> wrote:



Hello!

Do you get the same errors without the additional options? For example:  h5dump my.h5

A possibility is that your datasets are compressed with a filter that is not
included in your HDF5 library.

You can use the “h5dump -pH” option to see the properties of a dataset.  For
example:

$ h5dump -pH cmprss.h5
HDF5 "cmprss.h5" {
GROUP "/" {
   DATASET "Compressed_Data" {
      DATATYPE  H5T_STD_I32BE
      DATASPACE  SIMPLE { ( 100, 20 ) / ( 100, 20 ) }
      STORAGE_LAYOUT {
         CHUNKED ( 20, 20 )
         SIZE 520 (15.385:1 COMPRESSION)
      }
      FILTERS {
         COMPRESSION DEFLATE { LEVEL 6 }
      }
      FILLVALUE {
         FILL_TIME H5D_FILL_TIME_IFSET
         VALUE  H5D_FILL_VALUE_DEFAULT
      }
      ALLOCATION_TIME {
         H5D_ALLOC_TIME_INCR
      }
   }
}
}



The libhdf5.settings file, which is placed in the same directory as the HDF5
libraries when the library is built, lists the external filters that were used
to build it. For example:



$ ls
libhdf5.a      libhdf5_fortran.a  libhdf5_hl_cpp.a     libhdf5.settings  libz.a
libhdf5_cpp.a  libhdf5_hl.a       libhdf5hl_fortran.a  libsz.a
$ more libhdf5.settings
            SUMMARY OF THE HDF5 CONFIGURATION
            =================================

General Information:
-------------------
                   HDF5 Version: 1.10.1
                  Configured on: Thu Apr 27 10:15:49 CDT 2017

--- >8 cut ---

Features:
---------
                  Parallel HDF5: no
             High-level library: yes
                   Threadsafety: no
            Default API mapping: v110
With deprecated public symbols: yes
         I/O filters (external): deflate(zlib),szip(encoder)

--- >8 cut ---





If you cannot resolve the issue, you can send the file to the helpdesk, and we 
will take a look.



Thanks!

-Barbara

HDF Helpdesk: [email protected]



From: Hdf-forum [mailto:[email protected]] On Behalf Of Hei 
Chan
Sent: Sunday, October 22, 2017 2:58 AM
To: [email protected]<mailto:[email protected]>
Subject: [Hdf-forum] h5dump error: unable to print data



Hi,



I ran the following command:

$ h5dump -d /A/B/960 -d /A/C/D88 -d /A/E/D88 -b LE -o tmp.o my.h5 --enable-error-stack



And I got:

HDF5 "my.h5" {

DATASET "/A/B/960" {

   DATATYPE  H5T_COMPOUND {

      H5T_STD_U32LE "a";

      H5T_STD_I8LE "b";

      H5T_STD_I64LE "c";

   }

   DATASPACE  SIMPLE { ( 15265747 ) / ( H5S_UNLIMITED ) }

   DATA {

      h5dump error: unable to print data

   }

}

DATASET "/A/C/D88" {

   DATATYPE  H5T_COMPOUND {

      H5T_STD_I32LE "d";

      H5T_STD_U32LE "e";

      H5T_STD_U32LE "f";

      H5T_STD_U32LE "g";

      H5T_STD_I32LE "h";

      H5T_STD_I32LE "i";

      H5T_STD_I8LE "j";

      H5T_STD_I16LE "k";

      H5T_IEEE_F64LE "l";

      H5T_STD_U32LE "m";

      H5T_STD_I8LE "n";

      H5T_STD_I64LE "o";

      H5T_STD_U32LE "p";

   }

   DATASPACE  SIMPLE { ( 2114658 ) / ( H5S_UNLIMITED ) }

   DATA {

      h5dump error: unable to print data

   }

}

DATASET "/A/E/D88" {

   DATATYPE  H5T_COMPOUND {

      H5T_STD_I32LE "q";

      H5T_STD_U32LE "r";

      H5T_STD_U32LE "s";

      H5T_STD_I32LE "t";

      H5T_STD_I32LE "u";

      H5T_STD_I8LE "v";

      H5T_IEEE_F64LE "x";

      H5T_STD_I8LE "y";

      H5T_STD_I64LE "z";

   }

   DATASPACE  SIMPLE { ( 121815 ) / ( H5S_UNLIMITED ) }

   DATA {

      h5dump error: unable to print data

   }

}

}



Any idea what is going wrong?



Thanks in advance!
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5