Hi, I am using Open MPI 1.8.1 on a Linux cluster that we recently set up. It builds fine, but when I try to run even the simplest hello_c.c example program it segfaults. Any suggestions on how to correct this?
The steps I took and the error message are below.

1. Built Open MPI 1.8.1 on the cluster. The ompi_info output is attached.
2. cd into the examples directory and mpicc hello_c.c (a sketch of what that program does is after my signature).
3. mpirun -np 2 ./a.out
4. The error text is attached.

Please let me know if you need more info.

Thank you,
Saliya

--
Saliya Ekanayake
esal...@gmail.com
Cell 812-391-4914
Home 812-961-6383
http://saliya.org
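P.S. For reference, hello_c.c is essentially the textbook MPI "hello world". This is only a minimal sketch of what it does (the stock example in the Open MPI tree also prints the library version string), not the exact file:

#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
    int rank, size;

    MPI_Init(&argc, &argv);               /* the backtrace below shows the crash inside MPI_Init */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* this process's rank in MPI_COMM_WORLD */
    MPI_Comm_size(MPI_COMM_WORLD, &size); /* total number of processes */
    printf("Hello, world, I am %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}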
Package: Open MPI sekan...@tempest.dsc.soic.indiana.edu Distribution
Open MPI: 1.8.1
Open MPI repo revision: r31483
Open MPI release date: Apr 22, 2014
Open RTE: 1.8.1
Open RTE repo revision: r31483
Open RTE release date: Apr 22, 2014
OPAL: 1.8.1
OPAL repo revision: r31483
OPAL release date: Apr 22, 2014
MPI API: 3.0
Ident string: 1.8.1
Prefix: /N/u/sekanaya/buildompi-1.8.1
Configured architecture: x86_64-unknown-linux-gnu
Configure host: tempest.dsc.soic.indiana.edu
Configured by: sekanaya
Configured on: Thu Oct 23 11:02:15 EDT 2014
Configure host: tempest.dsc.soic.indiana.edu
Built by: sekanaya
Built on: Thu Oct 23 11:20:55 EDT 2014
Built host: tempest.dsc.soic.indiana.edu
C bindings: yes
C++ bindings: yes
Fort mpif.h: yes (all)
Fort use mpi: yes (limited: overloading)
Fort use mpi size: deprecated-ompi-info-value
Fort use mpi_f08: no
Fort mpi_f08 compliance: The mpi_f08 module was not built
Fort mpi_f08 subarrays: no
Java bindings: yes
Wrapper compiler rpath: runpath
C compiler: gcc
C compiler absolute: /usr/bin/gcc
C compiler family name: GNU
C compiler version: 4.4.7
C++ compiler: g++
C++ compiler absolute: /usr/bin/g++
Fort compiler: gfortran
Fort compiler abs: /usr/bin/gfortran
Fort ignore TKR: no
Fort 08 assumed shape: no
Fort optional args: no
Fort BIND(C) (all): no
Fort ISO_C_BINDING: no
Fort SUBROUTINE BIND(C): no
Fort TYPE,BIND(C): no
Fort T,BIND(C,name="a"): no
Fort PRIVATE: no
Fort PROTECTED: no
Fort ABSTRACT: no
Fort ASYNCHRONOUS: no
Fort PROCEDURE: no
Fort f08 using wrappers: no
C profiling: yes
C++ profiling: yes
Fort mpif.h profiling: yes
Fort use mpi profiling: yes
Fort use mpi_f08 prof: no
C++ exceptions: no
Thread support: posix (MPI_THREAD_MULTIPLE: no, OPAL support: yes, OMPI progress: no, ORTE progress: yes, Event lib: yes)
Sparse Groups: no
Internal debug support: no
MPI interface warnings: yes
MPI parameter check: runtime
Memory profiling support: no
Memory debugging support: no
libltdl support: yes
Heterogeneous support: no
mpirun default --prefix: no
MPI I/O support: yes
MPI_WTIME support: gettimeofday
Symbol vis. support: yes
Host topology support: yes
MPI extensions:
FT Checkpoint support: no (checkpoint thread: no)
C/R Enabled Debugging: no
VampirTrace support: yes
MPI_MAX_PROCESSOR_NAME: 256
MPI_MAX_ERROR_STRING: 256
MPI_MAX_OBJECT_NAME: 64
MPI_MAX_INFO_KEY: 36
MPI_MAX_INFO_VAL: 256
MPI_MAX_PORT_NAME: 1024
MPI_MAX_DATAREP_STRING: 128
MCA backtrace: execinfo (MCA v2.0, API v2.0, Component v1.8.1)
MCA compress: bzip (MCA v2.0, API v2.0, Component v1.8.1)
MCA compress: gzip (MCA v2.0, API v2.0, Component v1.8.1)
MCA crs: none (MCA v2.0, API v2.0, Component v1.8.1)
MCA db: hash (MCA v2.0, API v1.0, Component v1.8.1)
MCA db: print (MCA v2.0, API v1.0, Component v1.8.1)
MCA event: libevent2021 (MCA v2.0, API v2.0, Component v1.8.1)
MCA hwloc: hwloc172 (MCA v2.0, API v2.0, Component v1.8.1)
MCA if: posix_ipv4 (MCA v2.0, API v2.0, Component v1.8.1)
MCA if: linux_ipv6 (MCA v2.0, API v2.0, Component v1.8.1)
MCA installdirs: env (MCA v2.0, API v2.0, Component v1.8.1)
MCA installdirs: config (MCA v2.0, API v2.0, Component v1.8.1)
MCA memory: linux (MCA v2.0, API v2.0, Component v1.8.1)
MCA pstat: linux (MCA v2.0, API v2.0, Component v1.8.1)
MCA sec: basic (MCA v2.0, API v1.0, Component v1.8.1)
MCA shmem: mmap (MCA v2.0, API v2.0, Component v1.8.1)
MCA shmem: posix (MCA v2.0, API v2.0, Component v1.8.1)
MCA shmem: sysv (MCA v2.0, API v2.0, Component v1.8.1)
MCA timer: linux (MCA v2.0, API v2.0, Component v1.8.1)
MCA dfs: app (MCA v2.0, API v1.0, Component v1.8.1)
MCA dfs: orted (MCA v2.0, API v1.0, Component v1.8.1)
MCA dfs: test (MCA v2.0, API v1.0, Component v1.8.1)
MCA errmgr: default_app (MCA v2.0, API v3.0, Component v1.8.1)
MCA errmgr: default_hnp (MCA v2.0, API v3.0, Component v1.8.1)
MCA errmgr: default_orted (MCA v2.0, API v3.0, Component v1.8.1)
MCA errmgr: default_tool (MCA v2.0, API v3.0, Component v1.8.1)
MCA ess: env (MCA v2.0, API v3.0, Component v1.8.1)
MCA ess: hnp (MCA v2.0, API v3.0, Component v1.8.1)
MCA ess: singleton (MCA v2.0, API v3.0, Component v1.8.1)
MCA ess: slurm (MCA v2.0, API v3.0, Component v1.8.1)
MCA ess: tool (MCA v2.0, API v3.0, Component v1.8.1)
MCA filem: raw (MCA v2.0, API v2.0, Component v1.8.1)
MCA grpcomm: bad (MCA v2.0, API v2.0, Component v1.8.1)
MCA iof: hnp (MCA v2.0, API v2.0, Component v1.8.1)
MCA iof: mr_hnp (MCA v2.0, API v2.0, Component v1.8.1)
MCA iof: mr_orted (MCA v2.0, API v2.0, Component v1.8.1)
MCA iof: orted (MCA v2.0, API v2.0, Component v1.8.1)
MCA iof: tool (MCA v2.0, API v2.0, Component v1.8.1)
MCA odls: default (MCA v2.0, API v2.0, Component v1.8.1)
MCA oob: tcp (MCA v2.0, API v2.0, Component v1.8.1)
MCA plm: isolated (MCA v2.0, API v2.0, Component v1.8.1)
MCA plm: rsh (MCA v2.0, API v2.0, Component v1.8.1)
MCA plm: slurm (MCA v2.0, API v2.0, Component v1.8.1)
MCA ras: loadleveler (MCA v2.0, API v2.0, Component v1.8.1)
MCA ras: simulator (MCA v2.0, API v2.0, Component v1.8.1)
MCA ras: slurm (MCA v2.0, API v2.0, Component v1.8.1)
MCA rmaps: lama (MCA v2.0, API v2.0, Component v1.8.1)
MCA rmaps: mindist (MCA v2.0, API v2.0, Component v1.8.1)
MCA rmaps: ppr (MCA v2.0, API v2.0, Component v1.8.1)
MCA rmaps: rank_file (MCA v2.0, API v2.0, Component v1.8.1)
MCA rmaps: resilient (MCA v2.0, API v2.0, Component v1.8.1)
MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.8.1)
MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.8.1)
MCA rmaps: staged (MCA v2.0, API v2.0, Component v1.8.1)
MCA rml: oob (MCA v2.0, API v2.0, Component v1.8.1)
MCA routed: binomial (MCA v2.0, API v2.0, Component v1.8.1)
MCA routed: debruijn (MCA v2.0, API v2.0, Component v1.8.1)
MCA routed: direct (MCA v2.0, API v2.0, Component v1.8.1)
MCA routed: radix (MCA v2.0, API v2.0, Component v1.8.1)
MCA state: app (MCA v2.0, API v1.0, Component v1.8.1)
MCA state: hnp (MCA v2.0, API v1.0, Component v1.8.1)
MCA state: novm (MCA v2.0, API v1.0, Component v1.8.1)
MCA state: orted (MCA v2.0, API v1.0, Component v1.8.1)
MCA state: staged_hnp (MCA v2.0, API v1.0, Component v1.8.1)
MCA state: staged_orted (MCA v2.0, API v1.0, Component v1.8.1)
MCA state: tool (MCA v2.0, API v1.0, Component v1.8.1)
MCA allocator: basic (MCA v2.0, API v2.0, Component v1.8.1)
MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.8.1)
MCA bcol: basesmuma (MCA v2.0, API v2.0, Component v1.8.1)
MCA bcol: ptpcoll (MCA v2.0, API v2.0, Component v1.8.1)
MCA bml: r2 (MCA v2.0, API v2.0, Component v1.8.1)
MCA btl: openib (MCA v2.0, API v2.0, Component v1.8.1)
MCA btl: self (MCA v2.0, API v2.0, Component v1.8.1)
MCA btl: sm (MCA v2.0, API v2.0, Component v1.8.1)
MCA btl: tcp (MCA v2.0, API v2.0, Component v1.8.1)
MCA btl: vader (MCA v2.0, API v2.0, Component v1.8.1)
MCA coll: basic (MCA v2.0, API v2.0, Component v1.8.1)
MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.8.1)
MCA coll: inter (MCA v2.0, API v2.0, Component v1.8.1)
MCA coll: libnbc (MCA v2.0, API v2.0, Component v1.8.1)
MCA coll: ml (MCA v2.0, API v2.0, Component v1.8.1)
MCA coll: self (MCA v2.0, API v2.0, Component v1.8.1)
MCA coll: sm (MCA v2.0, API v2.0, Component v1.8.1)
MCA coll: tuned (MCA v2.0, API v2.0, Component v1.8.1)
MCA dpm: orte (MCA v2.0, API v2.0, Component v1.8.1)
MCA fbtl: posix (MCA v2.0, API v2.0, Component v1.8.1)
MCA fcoll: dynamic (MCA v2.0, API v2.0, Component v1.8.1)
MCA fcoll: individual (MCA v2.0, API v2.0, Component v1.8.1)
MCA fcoll: static (MCA v2.0, API v2.0, Component v1.8.1)
MCA fcoll: two_phase (MCA v2.0, API v2.0, Component v1.8.1)
MCA fcoll: ylib (MCA v2.0, API v2.0, Component v1.8.1)
MCA fs: ufs (MCA v2.0, API v2.0, Component v1.8.1)
MCA io: ompio (MCA v2.0, API v2.0, Component v1.8.1)
MCA io: romio (MCA v2.0, API v2.0, Component v1.8.1)
MCA mpool: grdma (MCA v2.0, API v2.0, Component v1.8.1)
MCA mpool: sm (MCA v2.0, API v2.0, Component v1.8.1)
MCA osc: rdma (MCA v2.0, API v3.0, Component v1.8.1)
MCA osc: sm (MCA v2.0, API v3.0, Component v1.8.1)
MCA pml: v (MCA v2.0, API v2.0, Component v1.8.1)
MCA pml: bfo (MCA v2.0, API v2.0, Component v1.8.1)
MCA pml: cm (MCA v2.0, API v2.0, Component v1.8.1)
MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.8.1)
MCA pubsub: orte (MCA v2.0, API v2.0, Component v1.8.1)
MCA rcache: vma (MCA v2.0, API v2.0, Component v1.8.1)
MCA rte: orte (MCA v2.0, API v2.0, Component v1.8.1)
MCA sbgp: basesmsocket (MCA v2.0, API v2.0, Component v1.8.1)
MCA sbgp: basesmuma (MCA v2.0, API v2.0, Component v1.8.1)
MCA sbgp: p2p (MCA v2.0, API v2.0, Component v1.8.1)
MCA sharedfp: individual (MCA v2.0, API v2.0, Component v1.8.1)
MCA sharedfp: lockedfile (MCA v2.0, API v2.0, Component v1.8.1)
MCA sharedfp: sm (MCA v2.0, API v2.0, Component v1.8.1)
MCA topo: basic (MCA v2.0, API v2.1, Component v1.8.1)
MCA vprotocol: pessimist (MCA v2.0, API v2.0, Component v1.8.1)
[sekanaya@tempest examples]$ mpirun -np 1 ./a.out
[tempest:10471] *** Process received signal ***
[tempest:10471] Signal: Segmentation fault (11)
[tempest:10471] Signal code: Address not mapped (1)
[tempest:10471] Failing at address: 0x30
[tempest:10471] [ 0] /lib64/libpthread.so.0[0x38b4a0f710]
[tempest:10471] [ 1] /N/u/sekanaya/buildompi-1.8.1/lib/openmpi/mca_btl_openib.so(+0x1eb66)[0x7f0f20481b66]
[tempest:10471] [ 2] /N/u/sekanaya/buildompi-1.8.1/lib/openmpi/mca_btl_openib.so(+0x1f120)[0x7f0f20482120]
[tempest:10471] [ 3] /N/u/sekanaya/buildompi-1.8.1/lib/openmpi/mca_btl_openib.so(ompi_btl_openib_connect_base_select_for_local_port+0x12c)[0x7f0f2048081c]
[tempest:10471] [ 4] /N/u/sekanaya/buildompi-1.8.1/lib/openmpi/mca_btl_openib.so(+0x11938)[0x7f0f20474938]
[tempest:10471] [ 5] /N/u/sekanaya/buildompi-1.8.1/lib/libmpi.so.1(mca_btl_base_select+0x117)[0x7f0f2455ef77]
[tempest:10471] [ 6] /N/u/sekanaya/buildompi-1.8.1/lib/openmpi/mca_bml_r2.so(mca_bml_r2_component_init+0x12)[0x7f0f206916f2]
[tempest:10471] [ 7] /N/u/sekanaya/buildompi-1.8.1/lib/libmpi.so.1(mca_bml_base_init+0x99)[0x7f0f2455e799]
[tempest:10471] [ 8] /N/u/sekanaya/buildompi-1.8.1/lib/openmpi/mca_pml_ob1.so(+0x51e8)[0x7f0f1f1771e8]
[tempest:10471] [ 9] /N/u/sekanaya/buildompi-1.8.1/lib/libmpi.so.1(mca_pml_base_select+0x1e0)[0x7f0f245717a0]
[tempest:10471] [10] /N/u/sekanaya/buildompi-1.8.1/lib/libmpi.so.1(ompi_mpi_init+0x510)[0x7f0f24522290]
[tempest:10471] [11] /N/u/sekanaya/buildompi-1.8.1/lib/libmpi.so.1(MPI_Init+0x170)[0x7f0f245411d0]
[tempest:10471] [12] ./a.out[0x400806]
[tempest:10471] [13] /lib64/libc.so.6(__libc_start_main+0xfd)[0x38b461ed5d]
[tempest:10471] [14] ./a.out[0x400719]
[tempest:10471] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 10471 on node tempest exited on signal 11 (Segmentation fault).
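If it helps narrow things down: the backtrace seems to point into mca_btl_openib (ompi_btl_openib_connect_base_select_for_local_port) during MPI_Init, so one test I can run is the same program with the openib BTL excluded, e.g. mpirun --mca btl ^openib -np 2 ./a.out. If that runs cleanly, the problem is presumably in the InfiniBand/openib path rather than the build itself.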