Aaron, how are you? You made an (incorrect) dataspace selection, but never
used it in the H5Dwrite call.
You're writing one record at a time(*), which means your selections in
memory and in the file must be singletons. Finally, you have to pass those
selections to H5Dwrite.
Attached is a fixed version of your program. I've tested it with HDF5DotNet
1.8.7, .NET4 64-bit.

Best, G.

(*) This is not very efficient. Consider using HDF5 packet tables:
http://www.hdfgroup.org/HDF5/doc/HL/H5PT_Intro.html



> -----Original Message-----
> From: [email protected] [mailto:hdf-forum-
> [email protected]] On Behalf Of Aaron Altman
> Sent: Thursday, September 01, 2011 5:13 PM
> To: [email protected]
> Subject: [Hdf-forum] HDF5DotNet memory leak when expanding with
> HD5.setExtent?
> 
> Hi all.  I'm collecting signals from some electronics instrumentation and
> trying to store them in an HDF5 file.  Without knowing how long my
> collection is going to go in advance, it seems like my best option is to
> expand the dataset as I need.  But it seems like doing this with a
> compound data type of more than about 10 doubles leads to a crash.  The
> crash looks like this:
> 
> C:\Users\awaltman\Documents\Visual Studio 2010\Projects\ConsoleApplication3\
> ConsoleApplication3\bin\Debug>ConsoleApplication3.exe
> 
> Unhandled Exception: System.AccessViolationException: Attempted to read or
> write protected memory. This is often an indication that other memory is
> corrupt.
>    at H5Dwrite(Int32 dataSetId, Int32 memType, Int32 memSpace, Int32
> fileSpace, Int32 xfer, Void* data)
>    at HDF5DotNet.H5D.write[Type](H5DataSetId dataSetId, H5DataTypeId
> memTypeId, H5DataSpaceId memSpaceId, H5DataSpaceId fileSpaceId,
> H5PropertyListId xferPropListId, H5Array`1 data)
>    at HDF5DotNet.H5D.write[Type](H5DataSetId dataSetId, H5DataTypeId
> memTypeId, H5Array`1 data)
>    at ConsoleApplication3.Program.Main(String[] args) in
> C:\Users\awaltman\documents\visual studio
> 2010\Projects\ConsoleApplication3\ConsoleApplication3\Program.cs:line 62
> 
> Source to reproduce the problem:
> 
> http://hdf-forum.184993.n3.nabble.com/file/n3302724/Program.cs Program.cs
> 
> The attached version tries to write 15000 columns, expanding the data set
> one at a time for each write, with 32 rows (double values) in each.  If
> you drop it down to rows = 8, it runs successfully and I can check out the
> results in an HDF viewer.
> 
> If there's another way I should be doing this, like tables, let me know.
> This seems to be what I have available through the HDF5DotNet API though.
> 
> Thanks,
> 
> Aaron Altman
> 
> --
> View this message in context: http://hdf-
> forum.184993.n3.nabble.com/HDF5DotNet-memory-leak-when-expanding-with-HD5-
> setExtent-tp3302724p3302724.html
> Sent from the hdf-forum mailing list archive at Nabble.com.
> 
> _______________________________________________
> Hdf-forum is for HDF software users discussion.
> [email protected]
> http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using HDF5DotNet;

namespace ConsoleApplication3
{
    class Program
    {
        static void Main(string[] args)
        {
            long rows = 32;
            long cols = 15000;
            long[] dims = new long[1];
            dims[0] = 1;
            long[] maxdims = new long[1];
            maxdims[0] = (long)H5S.H5SType.UNLIMITED;
            
            H5FileId logFile = H5F.create("test.h5", H5F.CreateMode.ACC_TRUNC);
            
            H5DataSpaceId insertionSpace = H5S.create_simple(1, dims, maxdims);

            H5DataTypeId rawPowerDataType = H5T.copy(H5T.H5Type.NATIVE_DOUBLE);
            /* Create a compound datatype corresponding to the NIDAQ MAX task. 
             * Individual elements of the compound datatype will be doubles. 
             * The elements of the compound datatype are named according to the
             * channel names specified in the MAX task.
             */
            H5DataTypeId powerDataRow = H5T.create(H5T.CreateClass.COMPOUND,
                (int)(sizeof(double) * rows));
            for (int i = 0; i < rows; i++)
            {
                H5T.insert(powerDataRow, "Namedrow" + i.ToString(),
                    sizeof(double) * i, rawPowerDataType);
            }

            /* Enable chunking with a chunk size of 4096 records. */
            H5PropertyListId createChunked =
                H5P.create(H5P.PropertyListClass.DATASET_CREATE);
            H5PropertyListId linkCreationDefaults =
                H5P.create(H5P.PropertyListClass.LINK_CREATE);
            H5PropertyListId accessCreationDefaults =
                H5P.create(H5P.PropertyListClass.DATASET_ACCESS);
            H5P.setChunk(createChunked, new long[1] { 4096 });

            /* Create a new dataset based on the compound type specific to this task.
             * Chunking is enabled on this dataset so it can be expanded at runtime as
             * more power data comes in.
             */
            H5DataSetId dataSetId = H5D.create(logFile, "/data", powerDataRow,
                insertionSpace, linkCreationDefaults, createChunked,
                accessCreationDefaults);

            Random randomDouble = new Random();
            double[] data = new double[rows];
            long currentRow = 0, newSize = 1;

            // The memory dataspace is a singleton: one record per write.
            H5DataSpaceId memSpace = H5S.create_simple(1, new long[1] { 1 },
                new long[1] { 1 });

            for (int i = 0; i < cols; i++) {
                for (int j = 0; j < rows; j++) {
                    data[j] = randomDouble.NextDouble();
                }

                H5D.setExtent(dataSetId, new long[1] { newSize++ });

                H5DataSpaceId fileSpace = H5D.getSpace(dataSetId);

                // Select the current row in the file: a singleton hyperslab.
                H5S.selectHyperslab(fileSpace, H5S.SelectOperator.SET,
                    new long[1] { currentRow++ }, new long[1] { 1 });

                // Write the one-record memory buffer to the selected row,
                // passing both the memory and file selections to H5Dwrite.
                H5D.write(dataSetId, powerDataRow, memSpace, fileSpace,
                    new H5PropertyListId(H5P.Template.DEFAULT),
                    new H5Array<double>(data));

                H5S.close(fileSpace);
            }

            H5S.close(memSpace);
            H5S.close(insertionSpace);
            H5D.close(dataSetId);
            H5T.close(rawPowerDataType);
            H5T.close(powerDataRow);
            H5P.close(createChunked);
            H5P.close(linkCreationDefaults);
            H5P.close(accessCreationDefaults);
            H5F.close(logFile);
            H5.Close();
        }
    }
}