Hello Carlos,

Why not write a program that collects data for a given amount of time, say 5
minutes, and stores it in a temporary text file? Then, at the end of the 5
minutes, write that data into HDF5, purge the temporary file, and continue
reading data. If an outage happens, you will still have the data available
in your temporary file, which can be recovered.
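A minimal sketch of that buffer-then-flush pattern, using only the standard library. The actual HDF5 append step is left as an injected callback (`flush_to_hdf5` is a hypothetical name); in practice it would open the .h5 file read-write, append the whole batch as one chunk, and close it, e.g. with h5py. The flush trigger here counts samples rather than wall-clock minutes, just to keep the sketch simple:

```python
import os

class BufferedRecorder:
    """Append incoming samples to a temporary text file; periodically flush
    the whole buffer to HDF5 in one batch, then purge the buffer."""

    def __init__(self, buffer_path, flush_to_hdf5, flush_every=60):
        self.buffer_path = buffer_path
        self.flush_to_hdf5 = flush_to_hdf5   # callable taking a list of lines
        self.flush_every = flush_every       # samples per flush in this sketch
        self.pending = 0
        # Recovery: if a previous run left data in the buffer, flush it now.
        if os.path.exists(buffer_path) and os.path.getsize(buffer_path) > 0:
            self.flush()

    def record(self, line):
        # Append one sample to the temporary file and sync it to disk,
        # so an outage loses at most the sample currently being written.
        with open(self.buffer_path, "a") as f:
            f.write(line + "\n")
            f.flush()
            os.fsync(f.fileno())
        self.pending += 1
        if self.pending >= self.flush_every:
            self.flush()

    def flush(self):
        with open(self.buffer_path) as f:
            lines = f.read().splitlines()
        if lines:
            self.flush_to_hdf5(lines)        # one large append per interval
        open(self.buffer_path, "w").close()  # purge the buffer
        self.pending = 0
```

Because each HDF5 open/write/close cycle now handles one large batch instead of one tiny record every few seconds, the compressed chunks stay large and the compression ratio should stay close to what Approach #1 gives.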

Regards,
Landon Clipp

On Oct 4, 2016 7:09 PM, "Carlos Penedo Rocha" <[email protected]> wrote:

> Hi,
>
>
>
> I have a scenario in which my compressed h5 file needs to be updated with
> new data that is coming in every, say, 5 seconds.
>
>
>
> Approach #1: keep the file opened and just write data as they come, or
> write a buffer at once.
>
> Approach #2: open the file (RDWR), write the data (or a buffer) and then
> close the file.
>
>
>
> Approach #1 is not desirable for my case because if there’s any problem
> (outage, etc.), then the h5 file will likely get corrupted. Also, if I want
> to have a look at the file, I can’t, because it’s still being written (still
> open).
>
>
>
> Approach #2 is good to address the issue above, *BUT* I noticed that if I
> open/write/close the file every 5 seconds, the file compression gets really
> bad and the file size goes up big time. Approach 1 doesn’t suffer from this
> problem.
>
>
>
> So, my question is: is there an “Approach #3” that gives me the best of
> the two worlds? Less likely to get me a corrupted h5 file and at the same
> time, a good compression rate?
>
>
>
> Thanks,
>
> Carlos R.
>
>
>
> _______________________________________________
> Hdf-forum is for HDF software users discussion.
> [email protected]
> http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
> Twitter: https://twitter.com/hdf5
>
