On Mon, 22 Oct 2007, Jeff Squyres wrote:
> There is in the openib BTL.
One of the answers on bug #1025 contains the following phrase:
"It looks like this will affect many threading issues with the
pathscale compiler -- the openib BTL is simply the first place we
tripped it."
which, along with the rest of the data (the failure's dependency on
TLS usage), led me to wonder about threading issues.
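For what it's worth, by "TLS usage" I mean constructs like the
following. This is only a generic sketch of thread-local storage, not
the actual Open MPI code; allocators in the ptmalloc2 family keep
per-thread state, so a miscompiled access to data like this could
plausibly produce the kind of crash described below.

  /* tls_example.c -- generic illustration of thread-local storage.
   * Build with e.g.: pathcc -pthread tls_example.c -o tls_example */
  #include <pthread.h>
  #include <stdio.h>

  /* Each thread sees its own copy of this variable. */
  static __thread int tls_counter = 0;

  static void *worker(void *arg)
  {
      tls_counter++;   /* touches only this thread's copy */
      printf("thread %ld: tls_counter = %d\n", (long) arg, tls_counter);
      return NULL;
  }

  int main(void)
  {
      pthread_t t1, t2;
      pthread_create(&t1, NULL, worker, (void *) 1L);
      pthread_create(&t2, NULL, worker, (void *) 2L);
      pthread_join(t1, NULL);
      pthread_join(t2, NULL);
      return 0;
  }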
> To be honest, I removed the pathscale suite from my regular
> regression testing
So, is anyone else testing PathScale 3.0 with stable versions of Open
MPI? Or with development versions?
> I just recompiled the OMPI 1.2 branch with pathscale 3.0 on RHEL4U4
> and I do not see the problems that you are seeing. :-\ Is Debian
> etch a supported pathscale platform?
Seems like it's not... And indeed the older RHEL4 is a supported
platform, which might explain the different results.
But...
I made some progress: if I configure with "--without-memory-manager"
(along with all other options that I mentioned before), then it works.
This was inspired by the fact that the segmentation fault occurred in
ptmalloc2. I have previously tried to remove the MX support without
any effect; with ptmalloc2 out of the picture I have had test runs
over MX and TCP without problems.
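For reference, the working build was configured roughly like this; a
minimal sketch showing only the memory-manager option, with the other
configure options from my earlier mail omitted (compilers and prefix
are the ones from the ompi_info output below):

  ./configure --prefix=/home/thor1/costescu/openmpi-1.2.3-ps30 \
      CC=pathcc CXX=pathCC F77=pathf90 FC=pathf90 \
      --without-memory-manager
  make all install

and the test runs were along these lines ("./mytest" is a placeholder
for the actual test program):

  # force one BTL per run; "sm" and "self" handle local traffic
  mpirun --mca btl mx,sm,self -np 4 ./mytest
  mpirun --mca btl tcp,sm,self -np 4 ./mytest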
Should I file a bug report? Is there something else that you'd like
me to try?
ompi_info output:
Open MPI: 1.2.3
Open MPI SVN revision: r15136
Open RTE: 1.2.3
Open RTE SVN revision: r15136
OPAL: 1.2.3
OPAL SVN revision: r15136
Prefix: /home/thor1/costescu/openmpi-1.2.3-ps30
Configured architecture: x86_64-unknown-linux-gnu
Configured by: costescu
Configured on: Tue Oct 23 11:09:33 CEST 2007
Configure host: helics4
Built by: costescu
Built on: Tue Oct 23 11:30:31 CEST 2007
Built host: helics4
C bindings: yes
C++ bindings: yes
Fortran77 bindings: yes (all)
Fortran90 bindings: yes
Fortran90 bindings size: small
C compiler: pathcc
C compiler absolute: /opt_local/pathscale/bin/pathcc
C++ compiler: pathCC
C++ compiler absolute: /opt_local/pathscale/bin/pathCC
Fortran77 compiler: pathf90
Fortran77 compiler abs: /opt_local/pathscale/bin/pathf90
Fortran90 compiler: pathf90
Fortran90 compiler abs: /opt_local/pathscale/bin/pathf90
C profiling: yes
C++ profiling: yes
Fortran77 profiling: yes
Fortran90 profiling: yes
C++ exceptions: no
Thread support: posix (mpi: no, progress: no)
Internal debug support: yes
MPI parameter check: runtime
Memory profiling support: no
Memory debugging support: no
libltdl support: yes
Heterogeneous support: yes
mpirun default --prefix: no
MCA backtrace: execinfo (MCA v1.0, API v1.0, Component v1.2.3)
MCA paffinity: linux (MCA v1.0, API v1.0, Component v1.2.3)
MCA maffinity: first_use (MCA v1.0, API v1.0, Component v1.2.3)
MCA timer: linux (MCA v1.0, API v1.0, Component v1.2.3)
MCA installdirs: env (MCA v1.0, API v1.0, Component v1.2.3)
MCA installdirs: config (MCA v1.0, API v1.0, Component v1.2.3)
MCA allocator: basic (MCA v1.0, API v1.0, Component v1.0)
MCA allocator: bucket (MCA v1.0, API v1.0, Component v1.0)
MCA coll: basic (MCA v1.0, API v1.0, Component v1.2.3)
MCA coll: self (MCA v1.0, API v1.0, Component v1.2.3)
MCA coll: sm (MCA v1.0, API v1.0, Component v1.2.3)
MCA coll: tuned (MCA v1.0, API v1.0, Component v1.2.3)
MCA mpool: rdma (MCA v1.0, API v1.0, Component v1.2.3)
MCA mpool: sm (MCA v1.0, API v1.0, Component v1.2.3)
MCA pml: cm (MCA v1.0, API v1.0, Component v1.2.3)
MCA pml: ob1 (MCA v1.0, API v1.0, Component v1.2.3)
MCA bml: r2 (MCA v1.0, API v1.0, Component v1.2.3)
MCA rcache: vma (MCA v1.0, API v1.0, Component v1.2.3)
MCA btl: self (MCA v1.0, API v1.0.1, Component v1.2.3)
MCA btl: sm (MCA v1.0, API v1.0.1, Component v1.2.3)
MCA btl: mx (MCA v1.0, API v1.0.1, Component v1.2.3)
MCA btl: tcp (MCA v1.0, API v1.0.1, Component v1.0)
MCA mtl: mx (MCA v1.0, API v1.0, Component v1.2.3)
MCA topo: unity (MCA v1.0, API v1.0, Component v1.2.3)
MCA osc: pt2pt (MCA v1.0, API v1.0, Component v1.2.3)
MCA errmgr: hnp (MCA v1.0, API v1.3, Component v1.2.3)
MCA errmgr: orted (MCA v1.0, API v1.3, Component v1.2.3)
MCA errmgr: proxy (MCA v1.0, API v1.3, Component v1.2.3)
MCA gpr: null (MCA v1.0, API v1.0, Component v1.2.3)
MCA gpr: proxy (MCA v1.0, API v1.0, Component v1.2.3)
MCA gpr: replica (MCA v1.0, API v1.0, Component v1.2.3)
MCA iof: proxy (MCA v1.0, API v1.0, Component v1.2.3)
MCA iof: svc (MCA v1.0, API v1.0, Component v1.2.3)
MCA ns: proxy (MCA v1.0, API v2.0, Component v1.2.3)
MCA ns: replica (MCA v1.0, API v2.0, Component v1.2.3)
MCA oob: tcp (MCA v1.0, API v1.0, Component v1.0)
MCA ras: dash_host (MCA v1.0, API v1.3, Component v1.2.3)
MCA ras: localhost (MCA v1.0, API v1.3, Component v1.2.3)
MCA ras: gridengine (MCA v1.0, API v1.3, Component v1.2.3)
MCA ras: slurm (MCA v1.0, API v1.3, Component v1.2.3)
MCA rds: hostfile (MCA v1.0, API v1.3, Component v1.2.3)
MCA rds: proxy (MCA v1.0, API v1.3, Component v1.2.3)
MCA rds: resfile (MCA v1.0, API v1.3, Component v1.2.3)
MCA rmaps: round_robin (MCA v1.0, API v1.3, Component v1.2.3)
MCA rmgr: proxy (MCA v1.0, API v2.0, Component v1.2.3)
MCA rmgr: urm (MCA v1.0, API v2.0, Component v1.2.3)
MCA rml: oob (MCA v1.0, API v1.0, Component v1.2.3)
MCA pls: proxy (MCA v1.0, API v1.3, Component v1.2.3)
MCA pls: gridengine (MCA v1.0, API v1.3, Component v1.2.3)
MCA pls: rsh (MCA v1.0, API v1.3, Component v1.2.3)
MCA pls: slurm (MCA v1.0, API v1.3, Component v1.2.3)
MCA sds: env (MCA v1.0, API v1.0, Component v1.2.3)
MCA sds: seed (MCA v1.0, API v1.0, Component v1.2.3)
MCA sds: singleton (MCA v1.0, API v1.0, Component v1.2.3)
MCA sds: pipe (MCA v1.0, API v1.0, Component v1.2.3)
MCA sds: slurm (MCA v1.0, API v1.0, Component v1.2.3)
--
Bogdan Costescu
IWR - Interdisziplinaeres Zentrum fuer Wissenschaftliches Rechnen
Universitaet Heidelberg, INF 368, D-69120 Heidelberg, GERMANY
Telephone: +49 6221 54 8869, Telefax: +49 6221 54 8868
E-mail: bogdan.coste...@iwr.uni-heidelberg.de