Hi Sebastian,

What happens in do_work()? Are you modifying the file in question? If yes, then corruption can be expected. If not, then the file should not be corrupted, and if it is, then we have a bug in the library.
If you can send a replicator for this problem, we can investigate further.

Thanks,
Mohamad

-----Original Message-----
From: Hdf-forum [mailto:[email protected]] On Behalf Of Sebastian Rettenberger
Sent: Tuesday, June 02, 2015 4:27 AM
To: HDF Users Discussion List
Subject: [Hdf-forum] File state after flush and crash

Hi,

I want to save the program state of a parallel program in an HDF5 file. For performance reasons I do not want to open/close the file each time I write the program state, but flush it instead. Thus my main loop looks basically like this:

    while more_work() {
        do_work()
        update_hdf5_attributes()
        update_hdf5_dataset()
        flush_hdf5()
    }

However, even if the program crashes during do_work(), I get a corrupt HDF5 file.

I found a short conversation regarding this, but it is already 5 years old:
http://lists.hdfgroup.org/pipermail/hdf-forum_lists.hdfgroup.org/2010-February/002543.html

They mentioned that this might be a problem with the metadata cache. Is this still true? Is there a way around it? I also have some other open HDF5 files. Might this be a problem?

Best regards,
Sebastian

--
Sebastian Rettenberger, M.Sc.
Technische Universität München
Department of Informatics
Chair of Scientific Computing
Boltzmannstrasse 3, 85748 Garching, Germany
http://www5.in.tum.de/

_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5
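[Editor's note] For readers following the thread, the checkpoint loop Sebastian describes maps onto the HDF5 C API roughly as sketched below. This is a minimal illustration, not his actual code: the filename "state.h5", the dataset name "state", and the fixed-size double array are all invented for the example, and it assumes serial (non-parallel) HDF5. The key call is H5Fflush() with H5F_SCOPE_GLOBAL, which pushes buffered raw data and cached metadata to disk, but provides no atomicity: a crash between two flushes, or mid-update, can still leave the file inconsistent, which is exactly the behavior being asked about.

```c
/* Sketch of the checkpoint loop from the question (serial HDF5).
 * Build with the HDF5 compiler wrapper, e.g.:  h5cc checkpoint.c */
#include <hdf5.h>

int main(void)
{
    hid_t file = H5Fcreate("state.h5", H5F_ACC_TRUNC,
                           H5P_DEFAULT, H5P_DEFAULT);
    hsize_t dims[1] = {4};
    hid_t space = H5Screate_simple(1, dims, NULL);
    hid_t dset  = H5Dcreate2(file, "state", H5T_NATIVE_DOUBLE, space,
                             H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    double state[4] = {0.0, 0.0, 0.0, 0.0};
    for (int step = 0; step < 10; step++) {
        /* do_work(): advance the in-memory program state */
        for (int i = 0; i < 4; i++)
            state[i] += 1.0;

        /* update_hdf5_dataset(): overwrite the checkpoint dataset */
        H5Dwrite(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL,
                 H5P_DEFAULT, state);

        /* flush_hdf5(): flush raw data and the metadata cache for
         * every open file.  NOTE: this is not atomic -- a crash at
         * the wrong moment can still leave the file unreadable. */
        H5Fflush(file, H5F_SCOPE_GLOBAL);
    }

    H5Dclose(dset);
    H5Sclose(space);
    H5Fclose(file);
    return 0;
}
```

A workaround often suggested for this situation is to make each checkpoint self-contained: either close and reopen the file at every checkpoint, or alternate between two checkpoint files so that a crash while writing one always leaves the other complete.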
