Just for everyone's information:
I've finally had a chance to play around with this a little bit, and
I've found that the only thing that makes my hdf5 files resistant to my
program being terminated (e.g. with "kill -9" while in the middle of a
time.sleep() call) is to actually close and re-open the file after each
write. The suggestion to call Leaf.close() (== Table.close()) after
appending rows doesn't fix the hdf5 integrity issues. I suspect this may
be because I'm not calling close() (or _f_close()) on the root group. I
haven't tried doing that because I'm not sure what other effects that
would have.
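
Here's roughly what my workaround looks like, as a sketch (the file and
table names are just placeholders; depending on the PyTables version the
open call is spelled tables.open_file or tables.openFile):

    import tables

    def append_and_close(filename, rows):
        # Open, append, and close around every write, so a "kill -9"
        # between writes finds the file already closed and consistent.
        fileh = tables.open_file(filename, mode="a")  # openFile on older versions
        try:
            table = fileh.root.readings   # placeholder table name
            table.append(rows)            # rows must match the table description
            table.flush()
        finally:
            fileh.close()                 # re-opened on the next write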
Cheers!
Andrew
Francesc Altet wrote:
Yes, for closing nodes there is the method Node._f_close() (or
Leaf.close() which is equivalent). Closing a node is a good way to
ensure that you force a flush on data belonging to this node (and only
that), so reducing the risk.
[...]
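
For reference, this is roughly what the suggested per-node close looks
like (a sketch with placeholder names again); it flushes that node's data,
but as I said above it didn't keep the file consistent through a "kill -9"
for me:

    import tables

    fileh = tables.open_file("data.h5", mode="a")  # placeholder file name
    table = fileh.root.readings                    # placeholder table name
    table.append([(1, 2.0)])                       # rows matching the description
    table.close()   # same as table._f_close(); flushes this node's data only
    fileh.close()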