Don, just a comment on sizing. H5D.getStorageSize returns the (allocated) dataset size in bytes, NOT the number of elements in your dataset. 'storageSZ' in your code is therefore not the number of floats in your dataset. Maybe you're doing stuff in code you haven't shown us that makes incorrect assumptions about sizes.
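
As a sketch only (assuming the HDF5DotNet bindings you appear to be using, and that the wrapper exposes H5S.getSimpleExtentNPoints and the usual close calls mirroring the C API), sizing the buffer from the dataspace instead of the storage size would look something like:

```csharp
// Hypothetical corrected version -- untested, assumes HDF5DotNet.
public float[] GetDepthArray(H5DataSetId dataset)
{
    H5DataTypeId datasetType = H5D.getType(dataset);
    H5DataTypeId nativeType = H5T.getNativeType(datasetType, H5T.Direction.ASCEND);
    H5DataSpaceId space = H5D.getSpace(dataset);
    try
    {
        // Element count from the dataspace, NOT bytes from getStorageSize.
        long nPoints = H5S.getSimpleExtentNPoints(space);
        float[] dArray = new float[nPoints];
        H5D.read(dataset, nativeType, new H5Array<float>(dArray));
        return dArray;
    }
    finally
    {
        // Release the ids acquired above on every call.
        H5S.close(space);
        H5T.close(nativeType);
        H5T.close(datasetType);
    }
}
```

Closing the type and dataspace ids on every iteration also rules out id leaks as a cause of trouble in a long recursive walk.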

 

Best, G.

 

 

From: [email protected] [mailto:[email protected]]
On Behalf Of Donald Brandon
Sent: Monday, September 19, 2011 2:22 PM
To: HDF Users Discussion List
Subject: [Hdf-forum] memory issues when reading multiple datasets

 

Hello all:

I have another problem that I could use some help with.  In a nutshell, I
have multiple datasets that I am trying to read arrays from and then work
with those arrays one by one.  The structure of the .h5 file looks kinda like this:

Group<root>
        Group<Group1>
                Group<GroupA>
                        Group<Group_a>
                                Dataset<1>
                        Group<group_b>
                                Dataset<1>
                Group<GroupB>
                        Group<Group_a>
                                Dataset<1>
                        Group<group_b>
                                Dataset<1>
        Group<Group2>
                Group<GroupA>
                        Group<Group_a>
                                Dataset<1>
                        Group<group_b>
                                Dataset<1>
                Group<GroupB>
                        Group<Group_a>
                                Dataset<1>
                        Group<group_b>
                                Dataset<1>

I am recursively going through each group and then executing code that looks
something like this:

        public float[] GetDepthArray(H5DataSetId dataset)
        {
            H5DataTypeId datasetType = H5D.getType(dataset);
            H5T.H5TClass datasetClass = H5T.getClass(datasetType);
            H5DataTypeId datasetNativeType = H5T.getNativeType(datasetType, H5T.Direction.ASCEND);

            long storageSZ = H5D.getStorageSize(dataset);

            float[] dArray = new float[storageSZ];

            H5D.read(dataset, datasetNativeType, new H5Array<float>(dArray));

            return dArray;
        }

After about the seventh iteration, the code will fail with an
AccessViolationException - "Attempted to read or write protected memory.
This is often an indication that other memory is corrupt."

Does anyone have advice as to whether I should be cleaning something up after each iteration that I'm not seeing?  My system RAM seems unfazed after each iteration, so at least it doesn't look like I'm leaking memory anywhere... 

Thanks for the help!
DB 



 
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
