Hi, I want to save the program state of a parallel program in an HDF5 file. For performance reasons I do not want to open/close the file each time I write the program state, but flush it instead.
Thus my main loop looks basically like this:
while more_work() {
    do_work()
    update_hdf5_attributes()
    update_hdf5_dataset()
    flush_hdf5()
}
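In real HDF5 calls, the loop above might look like the sketch below. This is only an illustration of the pattern I am using, not my actual code: `file_id`, `attr_id`, and `dset_id` are assumed to have been created beforehand, and `more_work()`/`do_work()` stand in for the application's own routines.

```c
#include <hdf5.h>
#include <stddef.h>

/* Application-specific routines, declared here only so the
 * sketch compiles; they stand in for the real work loop. */
extern int  more_work(void);
extern void do_work(void);

/* One checkpoint iteration per unit of work, keeping the file
 * open the whole time and flushing instead of closing. */
void checkpoint_loop(hid_t file_id, hid_t attr_id, hid_t dset_id,
                     double *state, size_t n)
{
    (void)n; /* dataspace size is fixed at dataset creation in this sketch */

    while (more_work()) {
        do_work();

        /* update the attributes describing the current state */
        H5Awrite(attr_id, H5T_NATIVE_DOUBLE, state);

        /* rewrite the dataset with the current state */
        H5Dwrite(dset_id, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL,
                 H5P_DEFAULT, state);

        /* ask HDF5 to push buffered raw data and metadata to the OS;
         * H5F_SCOPE_GLOBAL flushes the entire virtual file */
        H5Fflush(file_id, H5F_SCOPE_GLOBAL);
    }
}
```

Even with `H5F_SCOPE_GLOBAL`, a crash in `do_work()` still leaves the file corrupt, which is what prompted my question.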
However, if the program crashes during do_work(), I still get a corrupt HDF5 file, even though everything was flushed at the end of the previous iteration.
I found a short conversation regarding this, but it is already 5 years old: http://lists.hdfgroup.org/pipermail/hdf-forum_lists.hdfgroup.org/2010-February/002543.html
They mentioned that this might be a problem with the metadata cache. Is this still true? Is there a way around it?

I also have some other open HDF5 files. Might this be a problem?

Best regards,
Sebastian

--
Sebastian Rettenberger, M.Sc.
Technische Universität München
Department of Informatics
Chair of Scientific Computing
Boltzmannstrasse 3, 85748 Garching, Germany
http://www5.in.tum.de/
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5
