Robert,

On Jan 14, 2015, at 8:01 AM, Robert van Loenhout <[email protected]> wrote:
> Hi,
>
> What would be the best way of storing in HDF5 a large number of arrays
> (millions) that all have the same size, when at the time of writing you
> do not yet know how many there will be?

Data arrays are stored in HDF5 as HDF5 datasets. There is no restriction on the number of datasets in an HDF5 file. Are you familiar with HDF5 datasets? If not, please take a look at the HDF5 Tutorial

  http://www.hdfgroup.org/HDF5/Tutor/

and the miscellaneous examples

  http://www.hdfgroup.org/HDF5/examples/api18-c.html

> Could you use a multi-dimensional array and write blocks to it?

What do you mean by a "block"? A sub-array? One can write or read a subset of an array. HDF5 subsets may be quite complex; the simplest subsets are a single array element (a point) or a sub-array (a hyperslab).

> Is there any way to read back how many blocks have been written to it?

There are functions to find out the current size of the dataset and the amount of space used by its data in the file. (A short C sketch illustrating both points follows at the end of this message.)

Elena
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal  The HDF Group  http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

> Regards,
> Robert
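For the "blocks" and "how many have been written" questions, here is a minimal sketch in C of one possible approach: a single chunked, extendible 2-D dataset with one unlimited dimension, where each fixed-size array is appended as a hyperslab row and the current extent tells you how many arrays have been written so far. The file name "arrays.h5", the dataset name "arrays", ARRAY_SIZE, CHUNK_ROWS and the fake data are placeholders for illustration, not values taken from the original question.

/* Sketch: append an unknown number of equally sized 1-D arrays as rows
 * of one extendible 2-D dataset, then query how many were written.     */
#include "hdf5.h"
#include <stdio.h>

#define ARRAY_SIZE 1000          /* every array has the same length (assumed) */
#define CHUNK_ROWS 64            /* chunking is required for an unlimited dim */

int main(void)
{
    hsize_t dims[2]    = {0, ARRAY_SIZE};                 /* start empty      */
    hsize_t maxdims[2] = {H5S_UNLIMITED, ARRAY_SIZE};     /* grow row-wise    */
    hsize_t chunk[2]   = {CHUNK_ROWS, ARRAY_SIZE};
    double  buf[ARRAY_SIZE];

    hid_t file  = H5Fcreate("arrays.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t space = H5Screate_simple(2, dims, maxdims);
    hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 2, chunk);
    hid_t dset  = H5Dcreate2(file, "arrays", H5T_NATIVE_DOUBLE, space,
                             H5P_DEFAULT, dcpl, H5P_DEFAULT);
    H5Sclose(space);

    /* Append arrays one by one; the total count need not be known up front. */
    for (hsize_t n = 0; n < 5; n++) {          /* 5 stands in for "millions"  */
        for (int i = 0; i < ARRAY_SIZE; i++)
            buf[i] = (double)n;                /* fake data                   */

        hsize_t newsize[2] = {n + 1, ARRAY_SIZE};
        H5Dset_extent(dset, newsize);          /* grow the dataset by one row */

        hsize_t start[2] = {n, 0};
        hsize_t count[2] = {1, ARRAY_SIZE};
        hid_t fspace = H5Dget_space(dset);
        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);
        hid_t mspace = H5Screate_simple(2, count, NULL);
        H5Dwrite(dset, H5T_NATIVE_DOUBLE, mspace, fspace, H5P_DEFAULT, buf);
        H5Sclose(mspace);
        H5Sclose(fspace);
    }

    /* "How many blocks have been written?" -> current extent and storage size. */
    hsize_t cur[2];
    hid_t fspace = H5Dget_space(dset);
    H5Sget_simple_extent_dims(fspace, cur, NULL);
    printf("arrays written: %llu, bytes on disk: %llu\n",
           (unsigned long long)cur[0],
           (unsigned long long)H5Dget_storage_size(dset));
    H5Sclose(fspace);

    H5Pclose(dcpl);
    H5Dclose(dset);
    H5Fclose(file);
    return 0;
}

The chunk size here is only a guess; with millions of rows you would choose CHUNK_ROWS (and possibly a compression filter on the same dataset creation property list) to match your typical access pattern, since every read or write touches whole chunks.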
