On Apr 5, 2013, at 12:33 AM, Siegmar Gross <siegmar.gr...@informatik.hs-fulda.de> wrote:

> Hi
> 
> Today I tried to install openmpi-1.9r28290 and got the following errors.
> 
> Solaris 10, Sparc, Sun C 5.12, 32-bit version of openmpi
> Solaris 10, x86_64, Sun C 5.12, 32-bit version of openmpi
> Solaris 10, Sparc, Sun C 5.12, 64-bit version of openmpi
> Solaris 10, x86_64, Sun C 5.12, 64-bit version of openmpi
> ---------------------------------------------------------
> 
> ...
>  CC       topology-solaris.lo
> "../../../../../../../openmpi-1.9r28290/opal/mca/hwloc/hwloc152/hwloc/src/topolo
> gy-solaris.c", line 226: undefined symbol: binding
> "../../../../../../../openmpi-1.9r28290/opal/mca/hwloc/hwloc152/hwloc/src/topolo
> gy-solaris.c", line 227: undefined symbol: hwloc_set
> "../../../../../../../openmpi-1.9r28290/opal/mca/hwloc/hwloc152/hwloc/src/topolo
> gy-solaris.c", line 227: warning: improper pointer/integer combination: arg #1
> cc: acomp failed for 
> ../../../../../../../openmpi-1.9r28290/opal/mca/hwloc/hwloc
> 152/hwloc/src/topology-solaris.c
> make[4]: *** [topology-solaris.lo] Error 1
> ...
> 

Found a missing variable declaration in topology-solaris.c; please try r28293 or later.
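
In case it helps to see the shape of the bug, here is a minimal
sketch of the error class Sun C was reporting (the function name and
guard are invented -- this is not the actual hwloc code):

  /* Sketch only: a variable declared inside an #ifdef but used
   * outside it.  When HAVE_SOME_FEATURE (hypothetical) is not
   * defined, cc reports "undefined symbol: binding" at the use.
   * The fix is to declare the variable in the enclosing scope: */
  static int get_binding_sketch(void)
  {
      int binding = -1;        /* declared unconditionally */
  #ifdef HAVE_SOME_FEATURE     /* hypothetical feature guard */
      binding = 0;             /* feature-specific path */
  #endif
      return binding;
  }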

> 
> openSuSE Linux 12.1, x86_64, Sun C 5.12, 32-bit version of openmpi
> openSuSE Linux 12.1, x86_64, Sun C 5.12, 64-bit version of openmpi
> ------------------------------------------------------------------
> 
> ...
>  PPFC     mpi-f08-sizeof.lo
>  PPFC     mpi-f08.lo
> "../../../../../openmpi-1.9r28290/ompi/mpi/fortran/use-mpi-f08/mpi-f08.F90", 
> Lin
> e = 1, Column = 1: INTERNAL: Interrupt: Segmentation fault
> make[2]: *** [mpi-f08.lo] Error 1
> make[2]: Leaving directory 
> `/export2/src/openmpi-1.9/openmpi-1.9-Linux.x86_64.32
> _cc/ompi/mpi/fortran/use-mpi-f08'
> make[1]: *** [all-recursive] Error 1
> ...
> 

I have to defer the Fortran stuff to Jeff.
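
The segfault is an internal error in the f95 front end itself (it
dies at line 1, column 1 of mpi-f08.F90), so the fix will have to
come from the compiler side or via a workaround in our build. As a
stopgap -- assuming I remember the configure option correctly --
building with --enable-mpi-fortran=usempi should skip the mpi_f08
module, and therefore mpi-f08.F90, entirely.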


> 
> I was able to build an older version.
> 
>                 Package: Open MPI root@linpc1 Distribution
>                Open MPI: 1.9r28209
>  Open MPI repo revision: r28209
>   Open MPI release date: Mar 25, 2013 (nightly snapshot tarball)
>                Open RTE: 1.9
>  Open RTE repo revision: r28134M
>   Open RTE release date: Feb 28, 2013
>                    OPAL: 1.9
>      OPAL repo revision: r28134M
>       OPAL release date: Feb 28, 2013
>                 MPI API: 2.1
>            Ident string: 1.9r28209
>                  Prefix: /usr/local/ompi-java_64_cc
> Configured architecture: x86_64-unknown-linux-gnu
>          Configure host: linpc1
>           Configured by: root
>           Configured on: Tue Mar 26 15:54:59 CET 2013
>          Configure host: linpc1
>                Built by: root
>                Built on: Tue Mar 26 16:31:01 CET 2013
>              Built host: linpc1
>              C bindings: yes
>            C++ bindings: yes
>             Fort mpif.h: yes (all)
>            Fort use mpi: yes (full: ignore TKR)
>       Fort use mpi size: deprecated-ompi-info-value
>        Fort use mpi_f08: yes
> Fort mpi_f08 compliance: The mpi_f08 module is available, but due to 
> limitations in the f95 compiler, does not support the following: array 
> subsections, ABSTRACT INTERFACE function pointers, Fortran '08-specified 
> ASYNCHRONOUS behavior, PROCEDUREs, direct passthru (where possible) to 
> underlying Open MPI's C functionality
>  Fort mpi_f08 subarrays: no
>           Java bindings: yes
>              C compiler: cc
>     C compiler absolute: /opt/solstudio12.3/bin/cc
>  C compiler family name: SUN
>      C compiler version: 0x5120
>            C++ compiler: CC
>   C++ compiler absolute: /opt/solstudio12.3/bin/CC
>           Fort compiler: f95
>       Fort compiler abs: /opt/solstudio12.3/bin/f95
>         Fort ignore TKR: yes (!$PRAGMA IGNORE_TKR)
>   Fort 08 assumed shape: no
>      Fort optional args: yes
>            Fort BIND(C): yes
>            Fort PRIVATE: yes
>           Fort ABSTRACT: no
>       Fort ASYNCHRONOUS: no
>          Fort PROCEDURE: no
> Fort f08 using wrappers: yes
>             C profiling: yes
>           C++ profiling: yes
>   Fort mpif.h profiling: yes
>  Fort use mpi profiling: yes
>   Fort use mpi_f08 prof: yes
>          C++ exceptions: yes
>          Thread support: posix (MPI_THREAD_MULTIPLE: yes, OPAL support: yes, 
> OMPI progress: no, ORTE progress: no, Event lib: no)
>           Sparse Groups: no
>  Internal debug support: yes
>  MPI interface warnings: yes
>     MPI parameter check: runtime
> Memory profiling support: no
> Memory debugging support: no
>         libltdl support: yes
>   Heterogeneous support: yes
> mpirun default --prefix: no
>         MPI I/O support: yes
>       MPI_WTIME support: gettimeofday
>     Symbol vis. support: yes
>   Host topology support: yes
>          MPI extensions: 
>   FT Checkpoint support: no (checkpoint thread: no)
>   C/R Enabled Debugging: no
>     VampirTrace support: yes
>  MPI_MAX_PROCESSOR_NAME: 256
>    MPI_MAX_ERROR_STRING: 256
>     MPI_MAX_OBJECT_NAME: 64
>        MPI_MAX_INFO_KEY: 36
>        MPI_MAX_INFO_VAL: 256
>       MPI_MAX_PORT_NAME: 1024
>  MPI_MAX_DATAREP_STRING: 128
>           MCA backtrace: execinfo (MCA v2.0, API v2.0, Component v1.9)
>               MCA event: libevent2019 (MCA v2.0, API v2.0, Component v1.9)
>               MCA hwloc: hwloc152 (MCA v2.0, API v2.0, Component v1.9)
>                  MCA if: linux_ipv6 (MCA v2.0, API v2.0, Component v1.9)
>                  MCA if: posix_ipv4 (MCA v2.0, API v2.0, Component v1.9)
>         MCA installdirs: env (MCA v2.0, API v2.0, Component v1.9)
>         MCA installdirs: config (MCA v2.0, API v2.0, Component v1.9)
>              MCA memory: linux (MCA v2.0, API v2.0, Component v1.9)
>               MCA shmem: mmap (MCA v2.0, API v2.0, Component v1.9)
>               MCA shmem: posix (MCA v2.0, API v2.0, Component v1.9)
>               MCA shmem: sysv (MCA v2.0, API v2.0, Component v1.9)
>               MCA timer: linux (MCA v2.0, API v2.0, Component v1.9)
>                 MCA dfs: app (MCA v2.0, API v1.0, Component v1.9)
>                 MCA dfs: orted (MCA v2.0, API v1.0, Component v1.9)
>                 MCA dfs: test (MCA v2.0, API v1.0, Component v1.9)
>              MCA errmgr: default_app (MCA v2.0, API v3.0, Component v1.9)
>              MCA errmgr: default_hnp (MCA v2.0, API v3.0, Component v1.9)
>              MCA errmgr: default_orted (MCA v2.0, API v3.0, Component v1.9)
>                 MCA ess: env (MCA v2.0, API v3.0, Component v1.9)
>                 MCA ess: hnp (MCA v2.0, API v3.0, Component v1.9)
>                 MCA ess: singleton (MCA v2.0, API v3.0, Component v1.9)
>                 MCA ess: slurm (MCA v2.0, API v3.0, Component v1.9)
>                 MCA ess: tool (MCA v2.0, API v3.0, Component v1.9)
>               MCA filem: raw (MCA v2.0, API v2.0, Component v1.9)
>               MCA filem: rsh (MCA v2.0, API v2.0, Component v1.9)
>             MCA grpcomm: bad (MCA v2.0, API v2.0, Component v1.9)
>                 MCA iof: hnp (MCA v2.0, API v2.0, Component v1.9)
>                 MCA iof: mr_hnp (MCA v2.0, API v2.0, Component v1.9)
>                 MCA iof: mr_orted (MCA v2.0, API v2.0, Component v1.9)
>                 MCA iof: orted (MCA v2.0, API v2.0, Component v1.9)
>                 MCA iof: tool (MCA v2.0, API v2.0, Component v1.9)
>                MCA odls: default (MCA v2.0, API v2.0, Component v1.9)
>                 MCA oob: tcp (MCA v2.0, API v2.0, Component v1.9)
>                 MCA plm: rsh (MCA v2.0, API v2.0, Component v1.9)
>                 MCA plm: slurm (MCA v2.0, API v2.0, Component v1.9)
>                 MCA ras: loadleveler (MCA v2.0, API v2.0, Component v1.9)
>                 MCA ras: simulator (MCA v2.0, API v2.0, Component v1.9)
>                 MCA ras: slurm (MCA v2.0, API v2.0, Component v1.9)
>               MCA rmaps: lama (MCA v2.0, API v2.0, Component v1.9)
>               MCA rmaps: ppr (MCA v2.0, API v2.0, Component v1.9)
>               MCA rmaps: rank_file (MCA v2.0, API v2.0, Component v1.9)
>               MCA rmaps: resilient (MCA v2.0, API v2.0, Component v1.9)
>               MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.9)
>               MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.9)
>               MCA rmaps: staged (MCA v2.0, API v2.0, Component v1.9)
>                 MCA rml: oob (MCA v2.0, API v2.0, Component v1.9)
>              MCA routed: binomial (MCA v2.0, API v2.0, Component v1.9)
>              MCA routed: debruijn (MCA v2.0, API v2.0, Component v1.9)
>              MCA routed: direct (MCA v2.0, API v2.0, Component v1.9)
>              MCA routed: radix (MCA v2.0, API v2.0, Component v1.9)
>               MCA state: app (MCA v2.0, API v1.0, Component v1.9)
>               MCA state: hnp (MCA v2.0, API v1.0, Component v1.9)
>               MCA state: novm (MCA v2.0, API v1.0, Component v1.9)
>               MCA state: orted (MCA v2.0, API v1.0, Component v1.9)
>               MCA state: staged_hnp (MCA v2.0, API v1.0, Component v1.9)
>               MCA state: staged_orted (MCA v2.0, API v1.0, Component v1.9)
>           MCA allocator: basic (MCA v2.0, API v2.0, Component v1.9)
>           MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.9)
>                 MCA bml: r2 (MCA v2.0, API v2.0, Component v1.9)
>                 MCA btl: self (MCA v2.0, API v2.0, Component v1.9)
>                 MCA btl: sm (MCA v2.0, API v2.0, Component v1.9)
>                 MCA btl: tcp (MCA v2.0, API v2.0, Component v1.9)
>                MCA coll: basic (MCA v2.0, API v2.0, Component v1.9)
>                MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.9)
>                MCA coll: inter (MCA v2.0, API v2.0, Component v1.9)
>                MCA coll: libnbc (MCA v2.0, API v2.0, Component v1.9)
>                MCA coll: self (MCA v2.0, API v2.0, Component v1.9)
>                MCA coll: sm (MCA v2.0, API v2.0, Component v1.9)
>                MCA coll: tuned (MCA v2.0, API v2.0, Component v1.9)
>                MCA fbtl: posix (MCA v2.0, API v2.0, Component v1.9)
>               MCA fcoll: dynamic (MCA v2.0, API v2.0, Component v1.9)
>               MCA fcoll: individual (MCA v2.0, API v2.0, Component v1.9)
>               MCA fcoll: static (MCA v2.0, API v2.0, Component v1.9)
>               MCA fcoll: two_phase (MCA v2.0, API v2.0, Component v1.9)
>               MCA fcoll: ylib (MCA v2.0, API v2.0, Component v1.9)
>                  MCA fs: ufs (MCA v2.0, API v2.0, Component v1.9)
>                  MCA io: ompio (MCA v2.0, API v2.0, Component v1.9)
>                  MCA io: romio (MCA v2.0, API v2.0, Component v1.9)
>               MCA mpool: grdma (MCA v2.0, API v2.0, Component v1.9)
>               MCA mpool: sm (MCA v2.0, API v2.0, Component v1.9)
>                 MCA osc: pt2pt (MCA v2.0, API v2.0, Component v1.9)
>                 MCA osc: rdma (MCA v2.0, API v2.0, Component v1.9)
>                 MCA pml: v (MCA v2.0, API v2.0, Component v1.9)
>                 MCA pml: bfo (MCA v2.0, API v2.0, Component v1.9)
>                 MCA pml: dr (MCA v2.0, API v2.0, Component v1.9)
>                 MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.9)
>              MCA rcache: vma (MCA v2.0, API v2.0, Component v1.9)
>            MCA sharedfp: dummy (MCA v2.0, API v2.0, Component v1.9)
>                MCA topo: unity (MCA v2.0, API v2.0, Component v1.9)
> 
> 
> I would be grateful if somebody could fix these problems. Thank you
> very much in advance for any help.
> 
> 
> Kind regards
> 
> Siegmar
> 
> _______________________________________________
> users mailing list
> us...@open-mpi.org
> http://www.open-mpi.org/mailman/listinfo.cgi/users

