Sorry for the delay in replying; many of us were at SC10 and then most of us 
were off for the US Thanksgiving holiday last week.

This error means that your application called MPI_ABORT -- i.e., your 
application intentionally chose to quit.  You may need to look through the 
source code to find where it does that, and/or check any messages printed 
above this one, to figure out why it is choosing to abort like this.
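For reference, the "errorcode" in the output (4776233 in your case) is simply
whatever value the application passed to MPI_ABORT.  Here is a minimal sketch
in C of the kind of call that produces a message like yours (the failure
condition and variable name are hypothetical, just for illustration):

    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        MPI_Init(&argc, &argv);

        int input_ok = 0;   /* hypothetical failure condition */
        if (!input_ok) {
            fprintf(stderr, "fatal: bad input, aborting\n");
            /* Terminates all processes in MPI_COMM_WORLD; the second
               argument is the errorcode that mpirun reports. */
            MPI_Abort(MPI_COMM_WORLD, 4776233);
        }

        MPI_Finalize();
        return 0;
    }

Grepping your source for MPI_Abort (or the Fortran equivalent, mpi_abort)
should show where that errorcode is coming from.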



On Nov 21, 2010, at 8:01 PM, Tushar Andriyas wrote:

> Hi there,
> 
> I received a few suggestions in the previous threads and have been trying a 
> lot of things to coax mpirun to work, but it still fails. I am pasting the 
> error file and the ompi_info output below.
> 
> ERROR FILE:
> 
> [uinta-0027:03360] MPI_ABORT invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 4776233
> [uinta-0027:03357] [0,0,0]-[0,1,4] mca_oob_tcp_msg_recv: readv failed: Connection reset by peer (104)
> mpirun noticed that job rank 1 with PID 3361 on node uinta-0027 exited on signal 15 (Terminated).
> 
> ompi_info:
> 
>                 Open MPI: 1.2.7
>    Open MPI SVN revision: r19401
>                 Open RTE: 1.2.7
>    Open RTE SVN revision: r19401
>                     OPAL: 1.2.7
>        OPAL SVN revision: r19401
>                   Prefix: /opt/libraries/openmpi/openmpi-1.2.7-pgi
>  Configured architecture: x86_64-unknown-linux-gnu
>            Configured by: A00017402
>            Configured on: Thu Sep 18 15:00:05 MDT 2008
>           Configure host: volvox.hpc.usu.edu
>                 Built by: A00017402
>                 Built on: Thu Sep 18 15:20:06 MDT 2008
>               Built host: volvox.hpc.usu.edu
>               C bindings: yes
>             C++ bindings: yes
>       Fortran77 bindings: yes (all)
>       Fortran90 bindings: yes
>  Fortran90 bindings size: large
>               C compiler: pgcc
>      C compiler absolute: /opt/apps/pgi/linux86-64/7.2/bin/pgcc
>             C++ compiler: pgCC
>    C++ compiler absolute: /opt/apps/pgi/linux86-64/7.2/bin/pgCC
>       Fortran77 compiler: pgf77
>   Fortran77 compiler abs: /opt/apps/pgi/linux86-64/7.2/bin/pgf77
>       Fortran90 compiler: pgf90
>   Fortran90 compiler abs: /opt/apps/pgi/linux86-64/7.2/bin/pgf90
>              C profiling: yes
>            C++ profiling: yes
>      Fortran77 profiling: yes
>      Fortran90 profiling: yes
>           C++ exceptions: no
>           Thread support: posix (mpi: no, progress: no)
>   Internal debug support: no
>      MPI parameter check: runtime
> Memory profiling support: no
> Memory debugging support: no
>          libltdl support: yes
>    Heterogeneous support: yes
>  mpirun default --prefix: no
>            MCA backtrace: execinfo (MCA v1.0, API v1.0, Component v1.2.7)
>               MCA memory: ptmalloc2 (MCA v1.0, API v1.0, Component v1.2.7)
>            MCA paffinity: linux (MCA v1.0, API v1.0, Component v1.2.7)
>            MCA maffinity: first_use (MCA v1.0, API v1.0, Component v1.2.7)
>            MCA maffinity: libnuma (MCA v1.0, API v1.0, Component v1.2.7)
>                MCA timer: linux (MCA v1.0, API v1.0, Component v1.2.7)
>          MCA installdirs: env (MCA v1.0, API v1.0, Component v1.2.7)
>          MCA installdirs: config (MCA v1.0, API v1.0, Component v1.2.7)
>            MCA allocator: basic (MCA v1.0, API v1.0, Component v1.0)
>            MCA allocator: bucket (MCA v1.0, API v1.0, Component v1.0)
>                 MCA coll: basic (MCA v1.0, API v1.0, Component v1.2.7)
>                 MCA coll: self (MCA v1.0, API v1.0, Component v1.2.7)
>                 MCA coll: sm (MCA v1.0, API v1.0, Component v1.2.7)
>                 MCA coll: tuned (MCA v1.0, API v1.0, Component v1.2.7)
>                   MCA io: romio (MCA v1.0, API v1.0, Component v1.2.7)
>                MCA mpool: rdma (MCA v1.0, API v1.0, Component v1.2.7)
>                MCA mpool: sm (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA pml: cm (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA pml: ob1 (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA bml: r2 (MCA v1.0, API v1.0, Component v1.2.7)
>               MCA rcache: vma (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA btl: gm (MCA v1.0, API v1.0.1, Component v1.2.7)
>                  MCA btl: self (MCA v1.0, API v1.0.1, Component v1.2.7)
>                  MCA btl: sm (MCA v1.0, API v1.0.1, Component v1.2.7)
>                  MCA btl: tcp (MCA v1.0, API v1.0.1, Component v1.0)
>                 MCA topo: unity (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA osc: pt2pt (MCA v1.0, API v1.0, Component v1.2.7)
>               MCA errmgr: hnp (MCA v1.0, API v1.3, Component v1.2.7)
>               MCA errmgr: orted (MCA v1.0, API v1.3, Component v1.2.7)
>               MCA errmgr: proxy (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA gpr: null (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA gpr: proxy (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA gpr: replica (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA iof: proxy (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA iof: svc (MCA v1.0, API v1.0, Component v1.2.7)
>                   MCA ns: proxy (MCA v1.0, API v2.0, Component v1.2.7)
>                   MCA ns: replica (MCA v1.0, API v2.0, Component v1.2.7)
>                  MCA oob: tcp (MCA v1.0, API v1.0, Component v1.0)
>                  MCA ras: dash_host (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA ras: gridengine (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA ras: localhost (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA ras: slurm (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA ras: tm (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA rds: hostfile (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA rds: proxy (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA rds: resfile (MCA v1.0, API v1.3, Component v1.2.7)
>                MCA rmaps: round_robin (MCA v1.0, API v1.3, Component v1.2.7)
>                 MCA rmgr: proxy (MCA v1.0, API v2.0, Component v1.2.7)
>                 MCA rmgr: urm (MCA v1.0, API v2.0, Component v1.2.7)
>                  MCA rml: oob (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA pls: gridengine (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA pls: proxy (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA pls: rsh (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA pls: slurm (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA pls: tm (MCA v1.0, API v1.3, Component v1.2.7)
>                  MCA sds: env (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA sds: pipe (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA sds: seed (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA sds: singleton (MCA v1.0, API v1.0, Component v1.2.7)
>                  MCA sds: slurm (MCA v1.0, API v1.0, Component v1.2.7)
> 
> Can anyone please look at the files and tell me what to do?
> 
> Tushar
> 
> _______________________________________________
> users mailing list
> us...@open-mpi.org
> http://www.open-mpi.org/mailman/listinfo.cgi/users


-- 
Jeff Squyres
jsquy...@cisco.com
For corporate legal information go to:
http://www.cisco.com/web/about/doing_business/legal/cri/

