Try taking a look here [1]. There are several examples: "Read / Write
GZIP Compressed Dataset", "Read / Write Dataset w/ Shuffle Filter and
GZIP Compression" and "Read / Write Dataset using SZIP Compression".

Consider also applying a shuffle filter before compressing; it can give
you a better compression ratio [2].
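
If you go that route, the shuffle filter is just one extra call on the
same dataset creation property list; register it before the deflate
filter so it runs first in the pipeline (dcpl and the level 6 are taken
from the sketch above):

    H5.H5Pset_shuffle(dcpl);    // byte-shuffle the data first
    H5.H5Pset_deflate(dcpl, 6); // then GZIP-compress each chunk

By the way, if you would rather stay with the object layer you are
already using, createScalarDS() also takes a chunks array and a gzip
level (the arguments you are currently passing as null and 0),
something like

    m_file.createScalarDS(p_dataId, p_workingGroup, DT_INTEGER, l_dims, null, l_chunks, 6, l_data);

where l_chunks would be a long[] with your chunk sizes. I haven't tried
that call myself, so please double check it against the javadoc.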

Cheers,
Nicola.

[1] http://www.hdfgroup.org/HDF5/examples/api18-java.html
[2] http://www.hdfgroup.org/HDF5/doc_resource/H5Shuffle_Perf.pdf

On 24 July 2014 07:59, Philipp Kraus <[email protected]> wrote:
> Hello,
>
> I'm using HDF5 with the Java port for storing values from a Java program into
> an HDF5 file.
>
> I write the data with
> m_file.createScalarDS(p_dataId, p_workingGroup, DT_INTEGER, l_dims, null, 
> null, 0, l_data);
>
> But my datasets are very large, so I would like to create a zipped dataset;
> imho I must add a filter for SZ.
> How can I create a zipped dataset with the Java API? I can't find any
> documentation or examples.
>
> Thanks a lot
>
> Phil



-- 
Nicola Cadenelli
Phone (IT)    +39 334 6550271
Office (DE)  +49 2461 6196 475
Skype: nicolacdnll

_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5
