I am working on a computer running CentOS 5 with two quad-core CPUs (a single standalone machine, not connected to any others). Previously the Open MPI version was 1.27 and my programs worked fine. After an automatic upgrade to 1.32 (through yum), I can still compile programs, but running them produces errors:

# mpirun -np 2 mpi

libibverbs: Fatal: couldn't read uverbs ABI version.
--------------------------------------------------------------------------
[[948,1],0]: A high-performance Open MPI point-to-point messaging module
was unable to find any relevant network interfaces:

Module: OpenFabrics (openib)
  Host: supernova.localdomain

Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
libibverbs: Fatal: couldn't read uverbs ABI version.
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
libibverbs: Fatal: couldn't read uverbs ABI version.
CMA: unable to open /dev/infiniband/rdma_cm
--------------------------------------------------------------------------
WARNING: Failed to open "OpenIB-cma" [DAT_INTERNAL_ERROR:].
This may be a real error or it may be an invalid entry in the uDAPL
Registry which is contained in the dat.conf file. Contact your local
System Administrator to confirm the availability of the interfaces in
the dat.conf file.
--------------------------------------------------------------------------
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
libibverbs: Fatal: couldn't read uverbs ABI version.
CMA: unable to open /dev/infiniband/rdma_cm


Here "mpi" is just a very simple hello-world program.
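For context, a minimal sketch of such a hello-world (this is my best reconstruction of what the test program looks like, compiled with mpicc and run as above) is:

```c
#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[]) {
    int rank, size;

    MPI_Init(&argc, &argv);                /* start the MPI runtime */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's rank */
    MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of processes */

    printf("Hello from rank %d of %d\n", rank, size);

    MPI_Finalize();                        /* shut down the MPI runtime */
    return 0;
}
```

With `mpirun -np 2 mpi` this prints one greeting per process; the errors above appear before any of that output.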

What is the problem, and do I need to configure Open MPI somehow to make my programs run again?
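Since this machine has no InfiniBand hardware, I have seen the suggestion to exclude the OpenFabrics transport so that only shared memory and TCP are used; a sketch, assuming the usual MCA parameter syntax (the config file path may differ on your system):

```shell
# Exclude the openib BTL for a single run (shared memory/TCP remain available):
mpirun --mca btl ^openib -np 2 mpi

# Or make it the default for all runs via the system-wide MCA parameter
# file (path assumed; check where your Open MPI installation keeps it):
echo "btl = ^openib" >> /etc/openmpi-mca-params.conf
```

Is this the right approach, or is something else broken by the upgrade?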

Thank you very much.
