Hi Jarom,
This is a known problem with h5repack and VDS (see the Known Problems section at
https://www.hdfgroup.org/HDF5/release/obtain5110.html and our release
announcement to this forum). The info didn't make it into the RELEASE.txt file
because the problem was discovered just after the source for the release went
out.
As a quick fix that will allow you to repack VDS data, you may try to edit
tools/h5repack/h5repack_copy.c, line 1037. Replace
if (nelmts > 0 && space_status != H5D_SPACE_STATUS_NOT_ALLOCATED) {
with
if (nelmts > 0) {
We are working on a patch and will issue it as soon as it is ready.
The "space_status != H5D_SPACE_STATUS_NOT_ALLOCATED" condition was there to avoid
issuing an H5Dwrite call when there is no data to write. Removing the condition
does address the VDS case, since space is never allocated for a VDS, but it also
changes the previous behavior, i.e., this is really a workaround.
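For reference, the reason the VDS case trips that branch is that H5Dget_space_status() reports H5D_SPACE_STATUS_NOT_ALLOCATED for a virtual dataset, since no raw data is ever allocated in the file that contains the VDS. A minimal sketch of the check (the file and dataset names below are placeholders):

    #include "hdf5.h"
    #include <stdio.h>

    int main(void)
    {
        /* placeholder file/dataset names -- substitute your own */
        hid_t file = H5Fopen("h5g_output_parallel.global.100.10.h5",
                             H5F_ACC_RDONLY, H5P_DEFAULT);
        hid_t dset = H5Dopen2(file, "/your_vds_dataset", H5P_DEFAULT);

        H5D_space_status_t status;
        H5Dget_space_status(dset, &status);

        /* a VDS is expected to report NOT_ALLOCATED here,
           which is why h5repack skipped the H5Dwrite call */
        if (status == H5D_SPACE_STATUS_NOT_ALLOCATED)
            printf("space status: NOT_ALLOCATED\n");
        else if (status == H5D_SPACE_STATUS_PART_ALLOCATED)
            printf("space status: PART_ALLOCATED\n");
        else
            printf("space status: ALLOCATED\n");

        H5Dclose(dset);
        H5Fclose(file);
        return 0;
    }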
Elena
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
On Apr 5, 2016, at 6:26 PM, Nelson, Jarom <[email protected]> wrote:
I tried a few different tests:
1. Adding filters (SHUF, GZIP) appears to have failed silently.
>$ h5repack -f SHUF -f GZIP=1 h5g_output_parallel.global.100.10.h5 h5g_output_parallel.global.100.10.shuf.gzip1.h5
No error was reported, but the data was not accessible. The h5dump output is in the attachment.
2. I attempted to change the layout to CONTI (contiguous), but it also failed silently.
>$ h5repack -l CONTI h5g_output_parallel.global.100.10.h5 h5g_output_parallel.global.100.10.conti.h5
Same as test #1: no error was reported, but the data was not accessible. The h5dump output is in the attachment.
3. I attempted to change the layout to CHUNK=100, but the resulting file looks identical to the original. h5dump does not indicate that the layout is now chunked, and the dataset is still virtual:
>$ h5repack -l CHUNK=100 h5g_output_parallel.global.100.10.h5 h5g_output_parallel.global.100.10.chunk100.h5
No errors, but the h5dump output is identical to the original.
4. Same result with CHUNK=100 and the SHUF, GZIP filters.
>$ h5repack -l CHUNK=100 -f SHUF -f GZIP=1 h5g_output_parallel.global.100.10.h5 h5g_output_parallel.global.100.10.chunk100.shuf.gzip.h5
Same result as test #3: no errors, but the h5dump output is identical to the original.
5. I also (unwisely) attempted to change the layout to CONTI and add the SHUF and GZIP filters, but realized that you can't apply these filters to a contiguous layout. I suppose this error is expected:
HDF5-DIAG: Error detected in HDF5 (1.10.0) thread 0:
  #000: H5Pdcpl.c line 2009 in H5Pset_chunk(): chunk dimensionality must be positive
    major: Invalid arguments to routine
    minor: Out of range
h5repack error: <h5g_output_parallel.global.100.10.h5>: Could not copy data to: h5g_output_parallel.conti.shuf.gzip1.h5
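For what it's worth, my understanding is that the SHUF and GZIP filters only apply to chunked storage, so a dataset creation property list needs a chunk shape alongside the filters, which is presumably why the CONTI case falls over. A rough sketch of what I mean, with made-up file/dataset names and dimensions:

    #include "hdf5.h"

    int main(void)
    {
        /* made-up 1-D dataset of 1000 doubles, chunked in blocks of 100 */
        hsize_t dims[1]  = {1000};
        hsize_t chunk[1] = {100};

        hid_t space = H5Screate_simple(1, dims, NULL);
        hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);

        /* compression/shuffle filters only work with a chunked layout */
        H5Pset_chunk(dcpl, 1, chunk);
        H5Pset_shuffle(dcpl);      /* SHUF   */
        H5Pset_deflate(dcpl, 1);   /* GZIP=1 */

        hid_t file = H5Fcreate("filters_demo.h5", H5F_ACC_TRUNC,
                               H5P_DEFAULT, H5P_DEFAULT);
        hid_t dset = H5Dcreate2(file, "/data", H5T_NATIVE_DOUBLE, space,
                                H5P_DEFAULT, dcpl, H5P_DEFAULT);

        H5Dclose(dset);
        H5Fclose(file);
        H5Pclose(dcpl);
        H5Sclose(space);
        return 0;
    }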
The resulting files and "h5dump -p" output are in the attachment (with the exception of the CONTI/SHUF/GZIP test).
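As a sanity check independent of h5dump -p, here is roughly how I would query the layout of a repacked dataset programmatically; the dataset path below is a placeholder:

    #include "hdf5.h"
    #include <stdio.h>

    int main(void)
    {
        /* placeholder names -- substitute the repacked file and dataset path */
        hid_t file = H5Fopen("h5g_output_parallel.global.100.10.chunk100.h5",
                             H5F_ACC_RDONLY, H5P_DEFAULT);
        hid_t dset = H5Dopen2(file, "/your_dataset", H5P_DEFAULT);
        hid_t dcpl = H5Dget_create_plist(dset);

        switch (H5Pget_layout(dcpl)) {
            case H5D_COMPACT:    printf("layout: COMPACT\n");    break;
            case H5D_CONTIGUOUS: printf("layout: CONTIGUOUS\n"); break;
            case H5D_CHUNKED:    printf("layout: CHUNKED\n");    break;
            case H5D_VIRTUAL:    printf("layout: VIRTUAL\n");    break; /* still a VDS */
            default:             printf("layout: unknown\n");    break;
        }

        H5Pclose(dcpl);
        H5Dclose(dset);
        H5Fclose(file);
        return 0;
    }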
The original file was created in HDF5 1.10.0 with a single VDS whose source datasets are spread across several files (attached).
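For context, the kind of VDS construction I mean looks roughly like the sketch below; this is not my actual code, and the file names, dataset paths, extents, and number of source files are all made up:

    #include "hdf5.h"

    int main(void)
    {
        /* made-up layout: a 200-element virtual dataset mapped onto two
           100-element source datasets that live in two separate files */
        hsize_t vdims[1] = {200};
        hsize_t sdims[1] = {100};

        hid_t vspace = H5Screate_simple(1, vdims, NULL);
        hid_t sspace = H5Screate_simple(1, sdims, NULL);
        hid_t dcpl   = H5Pcreate(H5P_DATASET_CREATE);

        hsize_t start[1] = {0}, count[1] = {100};

        /* first half of the VDS comes from source file 0 */
        H5Sselect_hyperslab(vspace, H5S_SELECT_SET, start, NULL, count, NULL);
        H5Pset_virtual(dcpl, vspace, "source_0.h5", "/data", sspace);

        /* second half comes from source file 1 */
        start[0] = 100;
        H5Sselect_hyperslab(vspace, H5S_SELECT_SET, start, NULL, count, NULL);
        H5Pset_virtual(dcpl, vspace, "source_1.h5", "/data", sspace);

        hid_t file = H5Fcreate("vds_demo.h5", H5F_ACC_TRUNC,
                               H5P_DEFAULT, H5P_DEFAULT);
        hid_t dset = H5Dcreate2(file, "/vds", H5T_NATIVE_DOUBLE, vspace,
                                H5P_DEFAULT, dcpl, H5P_DEFAULT);

        H5Dclose(dset);
        H5Fclose(file);
        H5Pclose(dcpl);
        H5Sclose(sspace);
        H5Sclose(vspace);
        return 0;
    }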
>$ h5repack --version
h5repack: Version 1.10.0
>$ h5dump --version
h5dump: Version 1.10.0
My suspicion is that what I’m attempting to do is not (yet?) supported.
Jarom
From: Hdf-forum [mailto:[email protected]] On Behalf Of
Miller, Mark C.
Sent: Tuesday, April 05, 2016 3:17 PM
To: HDF Users Discussion List
Subject: Re: [Hdf-forum] h5repack on files with VDS
I honestly don't know. But if you have a small file with VDS datasets in it,
maybe give it a try and see what happens.
Mark
From: Hdf-forum <[email protected]> on behalf of "Nelson, Jarom" <[email protected]>
Reply-To: HDF Users Discussion List <[email protected]>
Date: Tuesday, April 5, 2016 3:06 PM
To: "[email protected]" <[email protected]>
Subject: [Hdf-forum] h5repack on files with VDS
Can h5repack be used to un-virtualize a VDS?
Jarom Nelson
Lawrence Livermore National Lab
<h5repack.vds.tar.gz>
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5