Dear all,

I am encountering an issue with HDF5/XDMF output using 
DataOut::write_hdf5_parallel() on large meshes. Up to a resolution of 
256×256×256, everything works correctly, but at *512×512×512* the program 
aborts with the following HDF5 error:

```
#014 HDF5-DIAG: Error ... unable to set chunk sizes
    major: Dataset
    minor: Bad value
#015 HDF5-DIAG: ... chunk size must be < 4GB
```
The full error log is appended at the end of this message.

From inspecting the source in data_out_base.cc, it appears that deal.II 
creates the HDF5 datasets with a chunked layout, with the chunk dimensions 
set internally. At 512³, the computed chunk size exceeds HDF5's hard limit 
of 4 GB per chunk, which leads to the failure.

Is there a recommended way for users to avoid this error when writing large 
output files? 

I would really appreciate any suggestions.
Thank you.

Best Regards,
JRK

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
--- 
You received this message because you are subscribed to the Google Groups 
"deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion visit 
https://groups.google.com/d/msgid/dealii/74b177a2-dd58-42de-8d1e-2c83609e530fn%40googlegroups.com.
HDF5-DIAG: Error detected in HDF5 (1.14.5) MPI-process 0:
  #000: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5D.c
 line 186 in H5Dcreate2(): unable to synchronously create dataset
    major: Dataset
    minor: Unable to create file
  #001: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5D.c
 line 135 in H5D__create_api_common(): unable to create dataset
    major: Dataset
    minor: Unable to create file
  #002: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5VLcallback.c
 line 1865 in H5VL_dataset_create(): dataset create failed
    major: Virtual Object Layer
    minor: Unable to create file
  #003: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5VLcallback.c
 line 1830 in H5VL__dataset_create(): dataset create failed
    major: Virtual Object Layer
    minor: Unable to create file
  #004: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5VLnative_dataset.c
 line 282 in H5VL__native_dataset_create(): unable to create dataset
    major: Dataset
    minor: Unable to initialize object
  #005: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5Dint.c
 line 351 in H5D__create_named(): unable to create and link to dataset
    major: Dataset
    minor: Unable to initialize object
  #006: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5Lint.c
 line 492 in H5L_link_object(): unable to create new link to object
    major: Links
    minor: Unable to initialize object
  #007: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5Lint.c
 line 732 in H5L__create_real(): can't insert link
    major: Links
    minor: Unable to insert object
  #008: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5Gtraverse.c
 line 824 in H5G_traverse(): internal path traversal failed
    major: Symbol table
    minor: Object not found
  #009: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5Gtraverse.c
 line 596 in H5G__traverse_real(): traversal operator failed
    major: Symbol table
    minor: Callback failed
  #010: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5Lint.c
 line 537 in H5L__link_cb(): unable to create object
    major: Links
    minor: Unable to initialize object
  #011: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5Oint.c
 line 2346 in H5O_obj_create(): unable to open object
    major: Object header
    minor: Can't open object
  #012: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5Doh.c
 line 273 in H5O__dset_create(): unable to create dataset
    major: Dataset
    minor: Unable to initialize object
  #013: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5Dint.c
 line 1304 in H5D__create(): unable to construct layout information
    major: Dataset
    minor: Unable to initialize object
  #014: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5Dchunk.c
 line 835 in H5D__chunk_construct(): unable to set chunk sizes
    major: Dataset
    minor: Bad value
  #015: 
/tmp/root/spack-stage/spack-stage-hdf5-1.14.5-26nmxvbzekiovzoy56uibsnolivaxi3e/spack-src/src/H5Dchunk.c
 line 798 in H5D__chunk_set_sizes(): chunk size must be < 4GB
    major: Dataset
    minor: Unable to initialize object

    ----------------------------------------------------
Exception on processing: 

--------------------------------------------------------
An error occurred in line <8330> of file 
</tmp/root/spack-stage/spack-stage-dealii-9.6.2-pz5z6ke2736dynce3mxjd6346tifhuyk/spack-src/source/base/data_out_base.cc>
 in function
    void dealii::{anonymous}::do_write_hdf5(const 
std::vector<dealii::DataOutBase::Patch<dim, spacedim> >&, const 
dealii::DataOutBase::DataOutFilter&, const dealii::DataOutBase::Hdf5Flags&, 
bool, const string&, const string&, MPI_Comm) [with int dim = 3; int spacedim = 
3; std::string = std::__cxx11::basic_string<char>; MPI_Comm = 
ompi_communicator_t*]
The violated condition was: 
    cell_dataset >= 0
Additional information: 
    An input/output error has occurred. There are a number of reasons why
    this may be happening, both for reading and writing operations.
    
    If this happens during an operation that tries to read data: First,
    you may be trying to read from a file that doesn't exist or that is
    not readable given its file permissions. Second, deal.II uses this
    error at times if it tries to read information from a file but where
    the information in the file does not correspond to the expected
    format. An example would be a truncated file, or a mesh file that
    contains not only sections that describe the vertices and cells, but
    also sections for additional data that deal.II does not understand.
    
    If this happens during an operation that tries to write data: you may
    be trying to write to a file to which file or directory permissions do
    not allow you to write. A typical example is where you specify an
    output file in a directory that does not exist.

Stacktrace:
-----------
#0  
/apps/spack/opt/spack/linux-rocky8-zen2/gcc-11.2.0/dealii-9.6.2-pz5z6ke2736dynce3mxjd6346tifhuyk/lib/libdeal_II.so.9.6.2:
 
#1  
/apps/spack/opt/spack/linux-rocky8-zen2/gcc-11.2.0/dealii-9.6.2-pz5z6ke2736dynce3mxjd6346tifhuyk/lib/libdeal_II.so.9.6.2:
 void dealii::DataOutBase::write_hdf5_parallel<3, 
3>(std::vector<dealii::DataOutBase::Patch<3, 3>, 
std::allocator<dealii::DataOutBase::Patch<3, 3> > > const&, 
dealii::DataOutBase::DataOutFilter const&, dealii::DataOutBase::Hdf5Flags 
const&, bool, std::__cxx11::basic_string<char, std::char_traits<char>, 
std::allocator<char> > const&, std::__cxx11::basic_string<char, 
std::char_traits<char>, std::allocator<char> > const&, ompi_communicator_t*)
#2  
/apps/spack/opt/spack/linux-rocky8-zen2/gcc-11.2.0/dealii-9.6.2-pz5z6ke2736dynce3mxjd6346tifhuyk/lib/libdeal_II.so.9.6.2:
 dealii::DataOutInterface<3, 
3>::write_hdf5_parallel(dealii::DataOutBase::DataOutFilter const&, bool, 
std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > 
const&, std::__cxx11::basic_string<char, std::char_traits<char>, 
std::allocator<char> > const&, ompi_communicator_t*) const
#3  ./build/main: CGSEM<3>::output_results_xdmf()
#4  ./build/main: CGSEM<3>::output_results()
#5  ./build/main: CGSEM<3>::run()
#6  ./build/main: main
--------------------------------------------------------

Aborting!
----------------------------------------------------
