Hi Wolf,

It doesn't hang for me; instead I get a segfault with the following traceback.
By the way, I'm using:
- gcc 4.9.2
- openmpi 1.8.4
- szip 2.1
- hdf5 1.8.14
on an x86_64 Linux machine.

test.hangs: test.hangs.cpp:121: void writeH5(const char*, double*) [with T = float]: Assertion `status_h5 >= 0' failed.
HDF5-DIAG: Error detected in HDF5 (1.8.14) MPI-process 0:
  #000: H5F.c line 795 in H5Fclose(): decrementing file ID failed
    major: Object atom
    minor: Unable to close file
  #001: H5I.c line 1475 in H5I_dec_app_ref(): can't decrement ID ref count
    major: Object atom
    minor: Unable to decrement reference count
  #002: H5Fint.c line 1259 in H5F_close(): can't close file
    major: File accessibilty
    minor: Unable to close file
  #003: H5Fint.c line 1421 in H5F_try_close(): problems closing file
    major: File accessibilty
    minor: Unable to close file
  #004: H5Fint.c line 861 in H5F_dest(): low level truncate failed
    major: File accessibilty
    minor: Write failed
  #005: H5FD.c line 1908 in H5FD_truncate(): driver truncate request failed
    major: Virtual File Layer
    minor: Can't update object
  #006: H5FDmpio.c line 1982 in H5FD_mpio_truncate(): MPI_File_set_size failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #007: H5FDmpio.c line 1982 in H5FD_mpio_truncate(): MPI_ERR_ARG: invalid argument of some other kind
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
test.hangs: test.hangs.cpp:121: void writeH5(const char*, double*) [with T = float]: Assertion `status_h5 >= 0' failed.
--------------------------------------------------------------------------
mpiexec noticed that process rank 4 with PID 1158 on node node1446 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
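
The telling frame is #006: the collective truncate that H5Fclose performs fails
inside MPI_File_set_size with MPI_ERR_ARG. In my experience that usually means
the ranks disagree about the file's extent, which with a non-power-of-2 rank
count most often comes from a row decomposition that doesn't handle the
remainder, so the per-rank hyperslab offsets and counts don't tile the dataset.
I haven't read your attachment yet, so this is a guess, but below is a minimal
sketch of what I'd compare your writeH5() against. The function name write_rows
and the row-split logic are mine, not taken from your code; it assumes an
nx-by-ny array of doubles split by rows, with nprocs <= nx (empty ranks are
covered further down), and omits error checks for brevity:

    /* Sketch, not your code: remainder-aware row split of an nx-by-ny
       double array, written collectively through the HDF5 C API. */
    #include <hdf5.h>
    #include <mpi.h>

    static void write_rows(const char *fname, const double *local,
                           hsize_t nx, hsize_t ny)
    {
        int rank, nprocs;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        /* first (nx % nprocs) ranks get one extra row each */
        hsize_t base  = nx / nprocs, rem = nx % nprocs;
        hsize_t rows  = base + ((hsize_t)rank < rem ? 1 : 0);
        hsize_t first = (hsize_t)rank * base
                      + ((hsize_t)rank < rem ? (hsize_t)rank : rem);

        hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
        H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);
        hid_t file = H5Fcreate(fname, H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

        hsize_t dims[2] = { nx, ny };
        hid_t fspace = H5Screate_simple(2, dims, NULL);
        hid_t dset = H5Dcreate2(file, "data", H5T_NATIVE_DOUBLE, fspace,
                                H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

        /* each rank selects its own block of whole rows */
        hsize_t offset[2] = { first, 0 }, count[2] = { rows, ny };
        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, offset, NULL, count, NULL);
        hid_t mspace = H5Screate_simple(2, count, NULL);

        hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
        H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_COLLECTIVE);
        H5Dwrite(dset, H5T_NATIVE_DOUBLE, mspace, fspace, dxpl, local);

        H5Pclose(dxpl);
        H5Sclose(mspace);
        H5Sclose(fspace);
        H5Dclose(dset);
        H5Pclose(fapl);
        H5Fclose(file);   /* collective -- every rank must get here */
    }

The key property is that the (first, rows) pairs tile the nx rows exactly, so
every rank's selection is consistent with the dataset extent, and every rank
reaches the collective H5Fclose with the same idea of the file size.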

If you want, I can try to help debug it; however, I'm flat out today, so it'd
have to wait until tomorrow. In the meantime, I hope this error output helps.
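
One more thought before I sign off: if nprocs can exceed nx, some ranks end up
with zero rows. With collective transfer those ranks must still make the
H5Dwrite call, just with an empty selection; skipping the call on some ranks
mismatches the collectives and can hang or crash later, including in H5Fclose.
Continuing the hypothetical sketch above, the write step for an empty rank
would look like this:

    if (rows == 0) {
        /* participate in the collective write without contributing data */
        H5Sselect_none(fspace);
        hsize_t dummy[2] = { 1, ny };
        hid_t mspace0 = H5Screate_simple(2, dummy, NULL);
        H5Sselect_none(mspace0);
        H5Dwrite(dset, H5T_NATIVE_DOUBLE, mspace0, fspace, dxpl, local);
        H5Sclose(mspace0);
    }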

Timoth


> On Apr 7, 2015, at 10:30 AM, Wolf Dapp <[email protected]> wrote:
> 
> Dear hdf-forum members,
> 
> I have a problem I am hoping someone can help me with. I have a program
> that outputs a 2D array (contiguous, indexed linearly) using parallel
> HDF5. When I choose a number of processors that is not a power of 2
> (1, 2, 4, 8, ...), H5Fclose() hangs, inexplicably. I'm using HDF5
> v1.8.14 and OpenMPI 1.7.2, on top of GCC 4.8 on Linux.
> 
> Can someone help me pinpoint my mistake?
> 
> I have searched the forum; the first hit [searching for "h5fclose
> hangs"] turned out to be a user mistake that, to the best of my
> knowledge, I didn't make. The second thread never went beyond the
> initial problem description and didn't offer a solution.
> 
> Attached is a demonstrator program (maybe insufficiently bare-boned,
> apologies). Strangely, the hang only happens when nx >= 32. The code is
> adapted from an HDF5 example program.
> 
> The demonstrator is compiled with
> h5pcc test.hangs.cpp -DVERBOSE -lstdc++
> 
> (On my system, for some strange reason, MPI was compiled with the
> deprecated C++ bindings, so I also need to link with -lmpi_cxx; that
> shouldn't be necessary for anyone else. I hope that's not the reason
> for the hangs.)
> 
> Thanks in advance for your help!
> 
> Wolf Dapp
> 
> 
> -- 
> 
> 
> <test.hangs.cpp>

