Hey G:
Thanks for your input. Again, your comments made me look at things from
a different perspective, and I noticed that the value of storageSZ was
changing with each iteration, which I was not expecting. My final
outputs are two-dimensional arrays of a fixed size, so I can
predetermine the value I need for the array size, and I decided to
just go that route for now. I ran through my test code again with this
change, and every array was successfully created.
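In case it helps anyone else, here is roughly what that route looks
like. The 200 x 100 dimensions below are just placeholders for the
fixed size my output arrays actually have:

    // Sketch of the fixed-size route. ROWS and COLS are placeholder
    // values; substitute the known dimensions of your output arrays.
    const int ROWS = 200;   // placeholder
    const int COLS = 100;   // placeholder

    public float[] GetDepthArray(H5DataSetId dataset)
    {
        H5DataTypeId datasetType = H5D.getType(dataset);
        H5DataTypeId datasetNativeType =
            H5T.getNativeType(datasetType, H5T.Direction.ASCEND);

        // Allocate by element count, not by byte count.
        float[] dArray = new float[ROWS * COLS];
        H5D.read(dataset, datasetNativeType, new H5Array<float>(dArray));
        return dArray;
    }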
Matt - Thanks for your input as well. I think I am OK for now.
I just want to say, this forum is one of the best I have joined in
terms of people willing to help with the questions I pose.
Thanks, everyone, for your help.
Until next time,
DB
On 9/19/2011 4:48 PM, Gerd Heber wrote:
Don, just a comment on sizing. H5D.getStorageSize returns the
(allocated) dataset size in bytes and NOT the number of elements in
your dataset, so 'storageSZ' in your code is not the number of floats
in your dataset. Perhaps code that you haven't shown us makes
incorrect assumptions about sizes.
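To make the distinction concrete, a quick sketch (this assumes an
uncompressed float dataset, and that your HDF5DotNet build exposes
H5D.getSpace / H5S.getSimpleExtentDims as in the shipped examples;
check against your version):

    // The storage size is in bytes, NOT elements.
    long storageBytes = H5D.getStorageSize(dataset);
    long byteBasedCount = storageBytes / sizeof(float); // only safe if uncompressed

    // The robust route: ask the dataspace for its dimensions.
    H5DataSpaceId space = H5D.getSpace(dataset);
    long[] dims = H5S.getSimpleExtentDims(space);   // e.g. {rows, cols}
    H5S.close(space);

    long elementCount = 1;
    foreach (long dim in dims) elementCount *= dim;

    float[] dArray = new float[elementCount];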
Best, G.
*From:* [email protected]
[mailto:[email protected]] *On Behalf Of* Donald Brandon
*Sent:* Monday, September 19, 2011 2:22 PM
*To:* HDF Users Discussion List
*Subject:* [Hdf-forum] memory issues when reading multiple datasets
Hello all:
I have another problem that I could use some help with. In a
nutshell, I have multiple datasets that I am trying to read arrays
from and then work with those arrays one by one. The structure of the
.h5 file looks something like this:
Group<root>
  Group<Group1>
    Group<GroupA>
      Group<Group_a>
        Dataset<1>
      Group<group_b>
        Dataset<1>
    Group<GroupB>
      Group<Group_a>
        Dataset<1>
      Group<group_b>
        Dataset<1>
  Group<Group2>
    Group<GroupA>
      Group<Group_a>
        Dataset<1>
      Group<group_b>
        Dataset<1>
    Group<GroupB>
      Group<Group_a>
        Dataset<1>
      Group<group_b>
        Dataset<1>
I am recursively going through each group and then executing code that
looks something like this:
    public float[] GetDepthArray(H5DataSetId dataset)
    {
        // Query the dataset's type and its native in-memory equivalent.
        H5DataTypeId datasetType = H5D.getType(dataset);
        H5T.H5TClass datasetClass = H5T.getClass(datasetType);
        H5DataTypeId datasetNativeType =
            H5T.getNativeType(datasetType, H5T.Direction.ASCEND);

        // Size the buffer from the allocated storage size.
        long storageSZ = H5D.getStorageSize(dataset);
        float[] dArray = new float[storageSZ];

        // Read the entire dataset into the managed array.
        H5D.read(dataset, datasetNativeType, new H5Array<float>(dArray));
        return dArray;
    }
After about the seventh iteration, the code will fail with an
AccessViolationException - "Attempted to read or write protected
memory. This is often an indication that other memory is corrupt."
Does anyone have advice on whether I should be cleaning something up
after each iteration that I am not seeing? My system RAM seems
unfazed after each iteration, so at least it doesn't look like I am
leaking memory anywhere...
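If it is the type handles, I imagine the cleanup would look something
like this (just a guess on my part, assuming H5T.close wraps the C
library's H5Tclose):

    public float[] GetDepthArray(H5DataSetId dataset)
    {
        H5DataTypeId datasetType = H5D.getType(dataset);
        H5DataTypeId datasetNativeType =
            H5T.getNativeType(datasetType, H5T.Direction.ASCEND);
        try
        {
            long storageSZ = H5D.getStorageSize(dataset);
            float[] dArray = new float[storageSZ];
            H5D.read(dataset, datasetNativeType, new H5Array<float>(dArray));
            return dArray;
        }
        finally
        {
            // Release the type handles so they don't pile up per iteration.
            H5T.close(datasetNativeType);
            H5T.close(datasetType);
        }
    }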
Thanks for the help!
DB
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org