Hi,
It turns out that the data is highly compressed: once loaded into memory, it 
consumes 100 GB or more, which exceeds the RAM capacity of a single node. 
I think that may be the reason. Thanks.
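A quick back-of-the-envelope check supports this. Assuming the shape (1617, 4320, 8640) and the 2-byte H5T_STD_I16LE element type from the earlier message, the uncompressed in-memory size works out to well over 100 GB:

```python
# Estimate the uncompressed in-memory size of the dataset.
# Shape and element type are taken from the original report:
# H5T_STD_I16LE is a 16-bit (2-byte) little-endian signed integer.
shape = (1617, 4320, 8640)
bytes_per_element = 2

n_elements = 1
for dim in shape:
    n_elements *= dim

total_bytes = n_elements * bytes_per_element
print(f"{total_bytes / 1024**3:.1f} GiB")  # about 112.4 GiB uncompressed
```

So a 24 GB compressed file can easily expand past the memory of a single node when the tool materializes the whole dataset.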


Best,
Jialin

> On Jul 14, 2016, at 2:09 PM, Jialin Liu <jaln...@lbl.gov> wrote:
> 
> Hi, 
> I’m using h5copy to convert a netCDF file to HDF5, and it returns a 
> segmentation fault. The file size is 24 GB, and the specific dataset 
> that I tried to convert is H5T_STD_I16LE with shape (1617, 4320, 8640).
> 
> Any idea about this error? Thanks, 
> 
> Best,
> Jialin

_______________________________________________
Hdf-forum is for HDF software users discussion.
Hdf-forum@lists.hdfgroup.org
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5