Hello,

I am writing an application that writes large data sets to HDF5 files in
fixed-size blocks, using the HDF5 C++ API (version 1.8.15, patch 1, built
with MSVC 2013, x64).

My application seems to quickly consume all the available memory on my
system (Win32, around 5.9 GB) and then crashes whenever the system becomes
stressed (Windows kills it because there is no memory left).

I have also tested the application on a Linux machine, where I saw similar
results.

I was under the impression that HDF5 would page the file in and out of
memory so that the library only ever uses a small working set - is this not
true?

I have experimented with HDF5 features such as flushing to disk, regularly
closing and re-opening the file, garbage collection, and tuning the chunking
and caching settings, but I haven't managed to get a stable working set.
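For context, the write pattern is roughly the following. This is a
simplified sketch with placeholder names and block sizes (the attached file
has the real code): append fixed-size blocks to a chunked, extensible
dataset, with an explicit chunk-cache setting and a periodic flush.

// Simplified sketch of the write pattern - placeholder names and sizes,
// not the exact attached code.
#include <H5Cpp.h>
#include <vector>

int main()
{
    const hsize_t blockRows = 1024;          // rows written per iteration
    const hsize_t cols      = 256;
    std::vector<double> block(blockRows * cols, 0.0);

    // Limit the raw-data chunk cache (1007 slots, 1 MiB, w0 = 0.75).
    H5::FileAccPropList fapl;
    fapl.setCache(0, 1007, 1024 * 1024, 0.75);

    H5::H5File file("hdf_test.h5", H5F_ACC_TRUNC,
                    H5::FileCreatPropList::DEFAULT, fapl);

    // Extensible dataset, chunked to match the block size.
    hsize_t dims[2]    = {0, cols};
    hsize_t maxdims[2] = {H5S_UNLIMITED, cols};
    hsize_t chunk[2]   = {blockRows, cols};
    H5::DataSpace filespace(2, dims, maxdims);
    H5::DSetCreatPropList dcpl;
    dcpl.setChunk(2, chunk);
    H5::DataSet dset = file.createDataSet("data",
                                          H5::PredType::NATIVE_DOUBLE,
                                          filespace, dcpl);

    for (hsize_t i = 0; i < 10000; ++i)
    {
        // Grow the dataset by one block and select the newly added region.
        hsize_t newDims[2] = {(i + 1) * blockRows, cols};
        dset.extend(newDims);

        H5::DataSpace fspace = dset.getSpace();
        hsize_t offset[2] = {i * blockRows, 0};
        fspace.selectHyperslab(H5S_SELECT_SET, chunk, offset);

        H5::DataSpace mspace(2, chunk);
        dset.write(block.data(), H5::PredType::NATIVE_DOUBLE, mspace, fspace);

        // Periodic flush - one of the things I've tried to keep memory down.
        if (i % 100 == 0)
            file.flush(H5F_SCOPE_GLOBAL);
    }
    return 0;
}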

I've attached a minimal example; can anyone point out my mistake?

Thanks,
- Jorj

Attachment: hdf_test.cpp
