Hi Steven,

I think this is the same issue as one that came up before; it has to do with h5perf ignoring the HDF5_PARAPREFIX environment variable. I did fix this for the 1.8.14 release coming in a few weeks, so it would be awesome if you could test the pre-release when it comes out (should be very soon).
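For context, this is roughly what the prefix handling in the parallel test harness does, so you can see what the patch restores in h5perf. A minimal sketch, not the actual patch; fix_filename is an illustrative name, not the library's API:

/* Minimal sketch of HDF5_PARAPREFIX handling, modeled loosely on what the
 * parallel test harness does; fix_filename is an illustrative name. */
#include <stdio.h>
#include <stdlib.h>

static char *fix_filename(const char *base, char *buf, size_t len)
{
    const char *prefix = getenv("HDF5_PARAPREFIX");

    if (prefix != NULL && *prefix != '\0')
        snprintf(buf, len, "%s/%s", prefix, base);  /* e.g. pvfs2:/scratch/ParaTest.h5 */
    else
        snprintf(buf, len, "%s", base);             /* bare name in the working directory */
    return buf;
}

int main(void)
{
    char name[1024];

    /* With HDF5_PARAPREFIX=pvfs2:/scratch this prints pvfs2:/scratch/ParaTest.h5 */
    puts(fix_filename("ParaTest.h5", name, sizeof(name)));
    return 0;
}

When this step is skipped, the temporary files are created relative to the working directory instead of on the parallel file system, which is consistent with the bare #pio_tmp_1.h5 names in the h5perf log below.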
But for now, I have attached the patch for you to try on 1.8.13. Please let me know if that resolves your problem.

Thanks,
Mohamad

From: Hdf-forum [mailto:[email protected]] On Behalf Of Steven Varga
Sent: Thursday, October 23, 2014 3:16 AM
To: [email protected]
Subject: [Hdf-forum] MPIO hdf5-1.8.13 fails with openmpi 1.8.3 orangefs 2.8.8

Hi,

Running the phdf5 tests on a working PVFS2 setup, MPI-IO and PHDF5 I/O fail with the printouts below. HDF5, Open MPI, and PVFS2 were compiled from source; all checks passed with the exception of parallel HDF5. Is there something obvious I am not doing right?

printouts:
1) export HDF5_PARAPREFIX=pvfs2:/scratch; mpiexec -n 3 ./testphdf5
2) mpiexec -n 3 h5perf
3) pvfs2-ping -m /scratch/

steve

Linux version 3.11.0-031100-generic (apw@gomeisa) (gcc version 4.6.3 (Ubuntu/Linaro 4.6.3-1ubuntu5) ) #201309021735 SMP Mon Sep 2 21:36:21 UTC 2013

PHDF5 TESTS START
===================================
MPI-process 1. hostname=master
MPI-process 2. hostname=master
MPI-process 0. hostname=master
For help use: ./testphdf5 -help
Linked with hdf5 version 1.8 release 13
For help use: ./testphdf5 -help
Linked with hdf5 version 1.8 release 13
For help use: ./testphdf5 -help
Linked with hdf5 version 1.8 release 13
Test filenames are:
    pvfs2:/scratch/ParaTest.h5
Testing  -- fapl_mpio duplicate (mpiodup)
Test filenames are:
    pvfs2:/scratch/ParaTest.h5
Testing  -- fapl_mpio duplicate (mpiodup)
Test filenames are:
    pvfs2:/scratch/ParaTest.h5
Testing  -- fapl_mpio duplicate (mpiodup)
Testing  -- dataset using split communicators (split)
Testing  -- dataset using split communicators (split)
Testing  -- dataset using split communicators (split)
Proc 0: *** Parallel ERROR ***
    VRFY (H5Fcreate succeeded) failed at line 75 in t_file.c
aborting MPI processes
Proc 2: *** Parallel ERROR ***
    VRFY (H5Fcreate succeeded) failed at line 75 in t_file.c
aborting MPI processes
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[master:06171] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[master:06171] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

------------------------------------------------ HDF5 PERF print out ------------------------------------------------

HDF5 Library: Version 1.8.13
rank 0: ==== Parameters ====
rank 0: IO API=posix mpiio phdf5
rank 0: Number of files=1
rank 0: Number of datasets=1
rank 0: Number of iterations=1
rank 0: Number of processes=1:3
rank 0: Number of bytes per process per dataset=256KB
rank 0: Size of dataset(s)=256KB:768KB
rank 0: File size=256KB:768KB
rank 0: Transfer buffer size=128KB:256KB
rank 0: Block size=128KB
rank 0: Block Pattern in Dataset=Contiguous
rank 0: I/O Method for MPI and HDF5=Independent
rank 0: Geometry=1D
rank 0: VFL used for HDF5 I/O=MPI-IO driver
rank 0: Data storage method in HDF5=Contiguous
rank 0: Env HDF5_PARAPREFIX=not set
rank 0: Dumping MPI Info Object(9118432) (up to 256 bytes per item):
        object is MPI_INFO_NULL
rank 0: ==== End of Parameters ====

Number of processors = 3
Transfer Buffer Size: 131072 bytes, File size: 0.75 MBs
  # of files: 1, # of datasets: 1, dataset size: 0.75 MBs

  IO API = POSIX
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 243.67 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 31.48 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 182.74 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 64.55 MB/s

  IO API = MPIO
    MPI File Open failed(#pio_tmp_1.mpio)
    MPI File Open failed(#pio_tmp_1.mpio)
    Proc 2: *** Assertion failed (do_fopen failed) at line 309 in pio_engine.c
    Proc 1: *** Assertion failed (do_fopen failed) at line 309 in pio_engine.c
    MPI File Open failed(#pio_tmp_1.mpio)
    Proc 0: *** Assertion failed (do_fopen failed) at line 309 in pio_engine.c
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 0.00 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 0.00 MB/s

  IO API = PHDF5 (w/MPI-IO driver)
    HDF5-DIAG: Error detected in HDF5 (1.8.13) MPI-process 0:
      #000: H5F.c line 1512 in H5Fcreate(): unable to create file
        major: File accessibilty
        minor: Unable to open file
      #001: H5F.c line 1290 in H5F_open(): unable to open file: time = Thu Oct 23 07:59:35 2014, name = '#pio_tmp_1.h5', tent_flags = 13
        major: File accessibilty
        minor: Unable to open file
      #002: H5FD.c line 985 in H5FD_open(): open failed
        major: Virtual File Layer
        minor: Unable to initialize object
      #003: H5FDmpio.c line 1064 in H5FD_mpio_open(): MPI_File_open failed
        major: Internal error (too specific to document in detail)
        minor: Some MPI function failed
      #004: H5FDmpio.c line 1064 in H5FD_mpio_open(): MPI_ERR_IO: input/output error
        major: Internal error (too specific to document in detail)
        minor: MPI Error String
    HDF5 File Create failed(#pio_tmp_1.h5)
    Proc 0: *** Assertion failed (do_fopen failed) at line 309 in pio_engine.c
    [MPI-processes 1 and 2 print the same H5Fcreate error stack, each followed by "HDF5 File Create failed(#pio_tmp_1.h5)" and the pio_engine.c line 309 assertion.]
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 0.00 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 0.00 MB/s

Transfer Buffer Size: 262144 bytes, File size: 0.75 MBs
  # of files: 1, # of datasets: 1, dataset size: 0.75 MBs

  IO API = POSIX
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 426.37 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 51.30 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 180.81 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 89.44 MB/s

  IO API = MPIO
    MPI File Open failed(#pio_tmp_1.mpio)
    MPI File Open failed(#pio_tmp_1.mpio)
    Proc 1: *** Assertion failed (do_fopen failed) at line 309 in pio_engine.c
    Proc 2: *** Assertion failed (do_fopen failed) at line 309 in pio_engine.c
    MPI File Open failed(#pio_tmp_1.mpio)
    Proc 0: *** Assertion failed (do_fopen failed) at line 309 in pio_engine.c
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 0.00 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 0.00 MB/s

  IO API = PHDF5 (w/MPI-IO driver)
    [MPI-processes 2, 1, and 0 each print the same H5Fcreate error stack as above (time = Thu Oct 23 07:59:37 2014), followed by "HDF5 File Create failed(#pio_tmp_1.h5)" and the pio_engine.c line 309 assertion.]
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 0.00 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 0.00 MB/s

Number of processors = 1
Transfer Buffer Size: 131072 bytes, File size: 0.25 MBs
  # of files: 1, # of datasets: 1, dataset size: 0.25 MBs

  IO API = POSIX
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 198.56 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 50.59 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 344.36 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 168.45 MB/s

  IO API = MPIO
    MPI File Open failed(#pio_tmp_1.mpio)
    Proc 0: *** Assertion failed (do_fopen failed) at line 309 in pio_engine.c
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 0.00 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 0.00 MB/s

  IO API = PHDF5 (w/MPI-IO driver)
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 0.00 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 0.00 MB/s

Transfer Buffer Size: 262144 bytes, File size: 0.25 MBs
  # of files: 1, # of datasets: 1, dataset size: 0.25 MBs

  IO API = POSIX
    [MPI-process 0 prints the same H5Fcreate error stack as above (time = Thu Oct 23 07:59:37 2014), followed by "HDF5 File Create failed(#pio_tmp_1.h5)" and the pio_engine.c line 309 assertion.]
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 330.26 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 61.98 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 572.05 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 213.30 MB/s

  IO API = MPIO
    MPI File Open failed(#pio_tmp_1.mpio)
    Proc 0: *** Assertion failed (do_fopen failed) at line 309 in pio_engine.c
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 0.00 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 0.00 MB/s

  IO API = PHDF5 (w/MPI-IO driver)
    [MPI-process 0 prints the same H5Fcreate error stack as above (time = Thu Oct 23 07:59:37 2014), followed by "HDF5 File Create failed(#pio_tmp_1.h5)" and the pio_engine.c line 309 assertion.]
    Write (1 iteration(s)):            Maximum/Average/Minimum Throughput: 0.00 MB/s
    Write Open-Close (1 iteration(s)): Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read (1 iteration(s)):             Maximum/Average/Minimum Throughput: 0.00 MB/s
    Read Open-Close (1 iteration(s)):  Maximum/Average/Minimum Throughput: 0.00 MB/s

*** The MPI_Comm_free() function was called after MPI_FINALIZE was invoked.
*** This is disallowed by the MPI standard.
*** The MPI_Comm_free() function was called after MPI_FINALIZE was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.
[(null):2494] Local abort after MPI_FINALIZE completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
-------------------------------------------------------
Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted.
-------------------------------------------------------
*** Your MPI job will now abort.
[(null):2493] Local abort after MPI_FINALIZE completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was:

  Process name: [[21269,1],2]
  Exit code:    1
--------------------------------------------------------------------------

------------------------------------------------ PVFS PING ------------------------------------------------

sgeadmin@master:/scratch$ pvfs2-ping -m /scratch/

(1) Parsing tab file...

(2) Initializing system interface...

(3) Initializing each file system found in tab file: /etc/pvfs2tab...

   PVFS2 servers: tcp://master:3334
   Storage name: pvfs2-fs
   Local mount point: /scratch
   /scratch: Ok

(4) Searching for /scratch/ in pvfstab...

   PVFS2 servers: tcp://master:3334
   Storage name: pvfs2-fs
   Local mount point: /scratch
   meta servers: tcp://master:3334
   data servers: tcp://master:3334

(5) Verifying that all servers are responding...

   meta servers: tcp://master:3334 Ok
   data servers: tcp://master:3334 Ok

(6) Verifying that fsid 957524283 is acceptable to all servers...

   Ok; all servers understand fs_id 957524283

(7) Verifying that root handle is owned by one server...

   Root handle: 1048576
   Ok; root handle is owned by exactly one server.

=============================================================
The PVFS2 filesystem at /scratch/ appears to be correctly configured.
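For what it's worth, the failing operation in both logs boils down to a collective file create through the MPI-IO driver. A minimal standalone check, illustrative only and using the pvfs2:/scratch prefix from the report, would be:

/* Minimal sketch of the failing operation: a collective H5Fcreate on the
 * PVFS2 volume through the MPI-IO VFD. Illustrative only; the path below
 * assumes the pvfs2:/scratch mount from the report. */
#include <stdio.h>
#include <mpi.h>
#include <hdf5.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Route HDF5 I/O through MPI-IO (ROMIO), as testphdf5 and h5perf do. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);

    /* The "pvfs2:" prefix asks ROMIO to use its PVFS2 driver for this path. */
    hid_t file = H5Fcreate("pvfs2:/scratch/ParaTest.h5", H5F_ACC_TRUNC,
                           H5P_DEFAULT, fapl);
    if (file < 0)
        fprintf(stderr, "H5Fcreate failed\n");   /* same failure mode as in the logs */
    else
        H5Fclose(file);

    H5Pclose(fapl);
    MPI_Finalize();
    return 0;
}

Building this with h5pcc and running it under mpiexec separates the layers: if the create fails while pvfs2-ping succeeds, the problem is in the MPI-IO (ROMIO) path to PVFS2 rather than in PVFS2 itself.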
[Attachment: prefix_patch.diff]
