Hi Jeff again!
>> But setting the environment variable OPAL_PREFIX to an appropriate value
>> (assuming PATH and LD_LIBRARY_PATH are set, too) is not enough to let
>> Open MPI rock & roll from the new location.
>
> Hmm. It should be.
(Update) It works with "vanilla" Open MPI, but it does *not* work with Sun Cluster Tools 8.0 (which is also an Open MPI). So it seems to be a Sun problem and not a general problem of Open MPI. Sorry for wrongly attributing the problem.
The only trouble we have now is error messages like the following, which appear when running Open MPI from the new location after the old location has been removed (the job still runs without problems! :o):

--------------------------------------------------------------------------
Sorry!  You were supposed to get help about:
    no hca params found
from the file:
    help-mpi-btl-openib.txt
But I couldn't find any file matching that name.  Sorry!
--------------------------------------------------------------------------

If the old location is still present, there is no error, so it seems to be an attempt to access a file on the old path.
Maybe we have to explicitly pass the OPAL_PREFIX environment variable to all processes?
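If that is the cause, a minimal sketch of what we could try (the install path /new/opt/openmpi and the program name are hypothetical; mpirun's -x flag exports an environment variable to all launched processes):

```shell
# Hypothetical relocated installation prefix -- adjust to your site.
export OPAL_PREFIX=/new/opt/openmpi
export PATH="$OPAL_PREFIX/bin:$PATH"
export LD_LIBRARY_PATH="$OPAL_PREFIX/lib:$LD_LIBRARY_PATH"

# Explicitly forward OPAL_PREFIX to every MPI process:
mpirun -x OPAL_PREFIX -np 4 ./my_mpi_program
```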
>> That is because all the files containing settings for opal_wrapper,
>> which are located in share/openmpi/ and called e.g.
>> mpif77-wrapper-data.txt, also contain hard-coded paths (defined at
>> installation time by --prefix).
>
> Hmm; they should not. In my 1.2.7 install, I see the following:
>
> -----
> [11:14] svbu-mpi:/home/jsquyres/bogus/share/openmpi % cat mpif77-wrapper-data.txt
> # There can be multiple blocks of configuration data, chosen by
> # compiler flags (using the compiler_args key to chose which block
> # should be activated. This can be useful for multilib builds. See the
> # multilib page at:
> #     https://svn.open-mpi.org/trac/ompi/wiki/compilerwrapper3264
> # for more information.
>
> project=Open MPI
> project_short=OMPI
> version=1.2.7rc6r19546
> language=Fortran 77
> compiler_env=F77
> compiler_flags_env=FFLAGS
> compiler=gfortran
> extra_includes=
> preprocessor_flags=
> compiler_flags=
> linker_flags=
> libs=-lmpi_f77 -lmpi -lopen-rte -lopen-pal -ldl -Wl,--export-dynamic -lnsl -lutil -lm -ldl
> required_file=not supported
> includedir=${includedir}
> libdir=${libdir}
> [11:14] svbu-mpi:/home/jsquyres/bogus/share/openmpi %
> -----
>
> Note the "includedir" and "libdir" lines -- they're expressed in terms
> of ${foo}, which we can replace when OPAL_PREFIX (or related) is used.
>
> What version of OMPI are you using?
Note one of the configuration files contained in Sun ClusterTools 8.0 (see attached file). The paths really are hard-coded instead of being expressed through variables; this makes the package not relocatable without parsing and rewriting the configuration files.
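As a workaround sketch (not a supported procedure), the hard-coded prefixes could be rewritten in place after moving the installation tree. OLD and NEW below are hypothetical paths; adjust them to the actual old and new locations:

```shell
# Rewrite hard-coded installation prefixes in the wrapper data files
# after relocating the tree. Creates .bak backups of each file.
OLD=/opt/SUNWhpc/HPC8.0
NEW=/new/opt/SUNWhpc/HPC8.0
for f in "$NEW"/share/openmpi/*-wrapper-data.txt; do
  sed -i.bak "s|$OLD|$NEW|g" "$f"
done
```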
Did you (or anyone reading this message) have any contact with Sun developers to point out this circumstance? *Why* do they use hard-coded paths? :o)
best regards, Paul Kapinos
#
# Default word-size (used when -m flag is supplied to wrapper compiler)
#
compiler_args=
project=Open MPI
project_short=OMPI
version=r19400-ct8.0-b31c-r29
language=Fortran 90
compiler_env=FC
compiler_flags_env=FCFLAGS
compiler=f90
module_option=-M
extra_includes=
preprocessor_flags=
compiler_flags=
libs=-lmpi -lopen-rte -lopen-pal -lnsl -lrt -lm -ldl -lutil -lpthread -lmpi_f77 -lmpi_f90
linker_flags=-R/opt/mx/lib/lib64 -R/opt/SUNWhpc/HPC8.0/lib/lib64
required_file=
includedir=/opt/SUNWhpc/HPC8.0/include/64
libdir=/opt/SUNWhpc/HPC8.0/lib/lib64

#
# Alternative word-size (used when -m flag is not supplied to wrapper compiler)
#
compiler_args=-m32
project=Open MPI
project_short=OMPI
version=r19400-ct8.0-b31c-r29
language=Fortran 90
compiler_env=FC
compiler_flags_env=FCFLAGS
compiler=f90
module_option=-M
extra_includes=
preprocessor_flags=
compiler_flags=-m32
libs=-lmpi -lopen-rte -lopen-pal -lnsl -lrt -lm -ldl -lutil -lpthread -lmpi_f77 -lmpi_f90
linker_flags=-R/opt/mx/lib -R/opt/SUNWhpc/HPC8.0/lib
required_file=
includedir=/opt/SUNWhpc/HPC8.0/include
libdir=/opt/SUNWhpc/HPC8.0/lib