On 9/16/25 05:04, Jalil Khan wrote:

From inspecting the source in data_out_base.cc, HDF5 uses chunked layout, with chunk dimensions set internally in deal.II. At 512³, the computed chunk size exceeds the HDF5 hard limit of 4 GB per chunk, which leads to the failure.

Is there a recommended way for users to avoid this error when writing large output files?

Jalil:
that likely falls in the category of "the person who wrote these HDF5 calls didn't think of this scenario, and nobody has run into it since then". Can you identify which line in data_out_base.cc specifically triggers the issue, and can you show the deal.II calls you make that run into it?

In the end, this ought to be fixed of course. Any help is obviously always welcome!

Best
 Wolfgang


--
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
To view this discussion visit 
https://groups.google.com/d/msgid/dealii/f1bd81c7-9629-460b-977f-9b2c627bdd59%40colostate.edu.
