Hi,
It turns out that the data is highly compressed: once loaded into memory, it consumes 100 GB or more, which exceeds the RAM capacity of a single node. I think that may be the reason.
Thanks.
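The in-memory size can be sanity-checked from the dataset description in the original message below: H5T_STD_I16LE is a 2-byte integer, so the uncompressed dataset works out to roughly 112 GiB. A minimal sketch of that arithmetic (the shape and element size are taken from the message; everything else is just the calculation):

```python
# Back-of-the-envelope size check for the dataset from the original message.
# H5T_STD_I16LE = 16-bit (2-byte) little-endian signed integer.
shape = (1617, 4320, 8640)
bytes_per_element = 2

n_elements = 1
for dim in shape:
    n_elements *= dim

uncompressed_bytes = n_elements * bytes_per_element
print(f"{uncompressed_bytes / 2**30:.1f} GiB")  # ~112.4 GiB, well over single-node RAM
```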
Best,
Jialin
> On Jul 14, 2016, at 2:09 PM, Jialin Liu wrote:
Hi,
I'm using h5copy to convert a netCDF file to HDF5, and it returns a segmentation fault. The file size is 24 GB, and the specific dataset I tried to convert is H5T_STD_I16LE with dimensions (1617, 4320, 8640).
Any idea about this error? Thanks,
Best,