Hi,
I'm new to MPI and decided to try Open MPI on the hello.cpp below, but I get
the following messages and the program hangs.
MPICH had no problem with this, so what am I doing wrong?
Thanks,
Jon
Here is hello.cpp:
#include <iostream>
#include <stdio.h>
#include <mpi.h>
using namespace std;

int main(int argc, char *argv[]) {
    int numprocs, rank, namelen;
    char processor_name[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &numprocs);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Get_processor_name(processor_name, &namelen);
    printf("Process %d on %s out of %d\n", rank, processor_name, numprocs);
    MPI_Finalize();
    return 0;
}
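In case it matters, I build and launch with the standard Open MPI wrappers from
the 1.4.2 install; the invocation is along these lines (the rank count of 4 is
just what I happened to use):

```shell
# compile with the Open MPI C++ wrapper compiler
mpic++ -o hello hello.cpp
# launch 4 ranks on the local node
mpirun -np 4 ./hello
```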
And here is the output I got:
libibverbs: Fatal: couldn't read uverbs ABI version.
libibverbs: Fatal: couldn't read uverbs ABI version.
libibverbs: Fatal: couldn't read uverbs ABI version.
libibverbs: Fatal: couldn't read uverbs ABI version.
CMA: unable to open /dev/infiniband/rdma_cm
CMA: unable to open /dev/infiniband/rdma_cm
CMA: unable to open /dev/infiniband/rdma_cm
--------------------------------------------------------------------------
[0,0,0]: OpenIB on host o14sa44 was unable to find any HCAs.
Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
libibverbs: Fatal: couldn't read uverbs ABI version.
--------------------------------------------------------------------------
[0,0,0]: OpenIB on host o14sa44 was unable to find any HCAs.
Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
libibverbs: Fatal: couldn't read uverbs ABI version.
--------------------------------------------------------------------------
[0,0,0]: OpenIB on host o14sa44 was unable to find any HCAs.
Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
[0,0,0]: OpenIB on host o14sa44 was unable to find any HCAs.
Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
libibverbs: Fatal: couldn't read uverbs ABI version.
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
libibverbs: Fatal: couldn't read uverbs ABI version.
CMA: unable to open /dev/infiniband/rdma_cm
Here is the output of ompi_info:
Package: Open MPI jbishop@o14sa44 Distribution
Open MPI: 1.4.2
Open MPI SVN revision: r23093
Open MPI release date: May 04, 2010
Open RTE: 1.4.2
Open RTE SVN revision: r23093
Open RTE release date: May 04, 2010
OPAL: 1.4.2
OPAL SVN revision: r23093
OPAL release date: May 04, 2010
Ident string: 1.4.2
Prefix: /home/jbishop/openmpi-1.4.2-install
Configured architecture: x86_64-unknown-linux-gnu
Configure host: o14sa44
Configured by: jbishop
Configured on: Mon Oct 10 14:29:45 PDT 2011
Configure host: o14sa44
Built by: jbishop
Built on: Mon Oct 10 14:41:44 PDT 2011
Built host: o14sa44
C bindings: yes
C++ bindings: yes
Fortran77 bindings: yes (all)
Fortran90 bindings: yes
Fortran90 bindings size: small
C compiler: gcc
C compiler absolute: /sierra/project/tools/linux_x86_64_2.3.4/bin/gcc
C++ compiler: g++
C++ compiler absolute: /sierra/project/tools/linux_x86_64_2.3.4/bin/g++
Fortran77 compiler: gfortran
Fortran77 compiler abs: /usr/bin/gfortran
Fortran90 compiler: gfortran
Fortran90 compiler abs: /usr/bin/gfortran
C profiling: yes
C++ profiling: yes
Fortran77 profiling: yes
Fortran90 profiling: yes
C++ exceptions: no
Thread support: posix (mpi: no, progress: no)
Sparse Groups: no
Internal debug support: no
MPI parameter check: runtime
Memory profiling support: no
Memory debugging support: no
libltdl support: yes
Heterogeneous support: no
mpirun default --prefix: no
MPI I/O support: yes
MPI_WTIME support: gettimeofday
Symbol visibility support: yes
FT Checkpoint support: no (checkpoint thread: no)
MCA backtrace: execinfo (MCA v2.0, API v2.0, Component v1.4.2)
MCA memory: ptmalloc2 (MCA v2.0, API v2.0, Component v1.4.2)
MCA paffinity: linux (MCA v2.0, API v2.0, Component v1.4.2)
MCA carto: auto_detect (MCA v2.0, API v2.0, Component v1.4.2)
MCA carto: file (MCA v2.0, API v2.0, Component v1.4.2)
MCA maffinity: first_use (MCA v2.0, API v2.0, Component v1.4.2)
MCA maffinity: libnuma (MCA v2.0, API v2.0, Component v1.4.2)
MCA timer: linux (MCA v2.0, API v2.0, Component v1.4.2)
MCA installdirs: env (MCA v2.0, API v2.0, Component v1.4.2)
MCA installdirs: config (MCA v2.0, API v2.0, Component v1.4.2)
MCA dpm: orte (MCA v2.0, API v2.0, Component v1.4.2)
MCA pubsub: orte (MCA v2.0, API v2.0, Component v1.4.2)
MCA allocator: basic (MCA v2.0, API v2.0, Component v1.4.2)
MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: basic (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: inter (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: self (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: sm (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: sync (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: tuned (MCA v2.0, API v2.0, Component v1.4.2)
MCA io: romio (MCA v2.0, API v2.0, Component v1.4.2)
MCA mpool: fake (MCA v2.0, API v2.0, Component v1.4.2)
MCA mpool: rdma (MCA v2.0, API v2.0, Component v1.4.2)
MCA mpool: sm (MCA v2.0, API v2.0, Component v1.4.2)
MCA pml: cm (MCA v2.0, API v2.0, Component v1.4.2)
MCA pml: csum (MCA v2.0, API v2.0, Component v1.4.2)
MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.4.2)
MCA pml: v (MCA v2.0, API v2.0, Component v1.4.2)
MCA bml: r2 (MCA v2.0, API v2.0, Component v1.4.2)
MCA rcache: vma (MCA v2.0, API v2.0, Component v1.4.2)
MCA btl: ofud (MCA v2.0, API v2.0, Component v1.4.2)
MCA btl: openib (MCA v2.0, API v2.0, Component v1.4.2)
MCA btl: self (MCA v2.0, API v2.0, Component v1.4.2)
MCA btl: sm (MCA v2.0, API v2.0, Component v1.4.2)
MCA btl: tcp (MCA v2.0, API v2.0, Component v1.4.2)
MCA topo: unity (MCA v2.0, API v2.0, Component v1.4.2)
MCA osc: pt2pt (MCA v2.0, API v2.0, Component v1.4.2)
MCA osc: rdma (MCA v2.0, API v2.0, Component v1.4.2)
MCA iof: hnp (MCA v2.0, API v2.0, Component v1.4.2)
MCA iof: orted (MCA v2.0, API v2.0, Component v1.4.2)
MCA iof: tool (MCA v2.0, API v2.0, Component v1.4.2)
MCA oob: tcp (MCA v2.0, API v2.0, Component v1.4.2)
MCA odls: default (MCA v2.0, API v2.0, Component v1.4.2)
MCA ras: gridengine (MCA v2.0, API v2.0, Component v1.4.2)
MCA ras: slurm (MCA v2.0, API v2.0, Component v1.4.2)
MCA rmaps: load_balance (MCA v2.0, API v2.0, Component v1.4.2)
MCA rmaps: rank_file (MCA v2.0, API v2.0, Component v1.4.2)
MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.4.2)
MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.4.2)
MCA rml: oob (MCA v2.0, API v2.0, Component v1.4.2)
MCA routed: binomial (MCA v2.0, API v2.0, Component v1.4.2)
MCA routed: direct (MCA v2.0, API v2.0, Component v1.4.2)
MCA routed: linear (MCA v2.0, API v2.0, Component v1.4.2)
MCA plm: rsh (MCA v2.0, API v2.0, Component v1.4.2)
MCA plm: slurm (MCA v2.0, API v2.0, Component v1.4.2)
MCA filem: rsh (MCA v2.0, API v2.0, Component v1.4.2)
MCA errmgr: default (MCA v2.0, API v2.0, Component v1.4.2)
MCA ess: env (MCA v2.0, API v2.0, Component v1.4.2)
MCA ess: hnp (MCA v2.0, API v2.0, Component v1.4.2)
MCA ess: singleton (MCA v2.0, API v2.0, Component v1.4.2)
MCA ess: slurm (MCA v2.0, API v2.0, Component v1.4.2)
MCA ess: tool (MCA v2.0, API v2.0, Component v1.4.2)
MCA grpcomm: bad (MCA v2.0, API v2.0, Component v1.4.2)
MCA grpcomm: basic (MCA v2.0, API v2.0, Component v1.4.2)