Ralph Castain wrote:
Did the version you are running get installed in /usr? Sounds like you are
picking up a different version when running a command - i.e., that your PATH is
finding a different installation than the one in /usr.
Right! I'm using OpenMPI with the Rocks Cluster distribution.
Sorry, please disregard my reply to this email.
:-)
--td
On 10/26/2011 10:44 AM, Ralph Castain wrote:
Did the version you are running get installed in /usr? Sounds like you are
picking up a different version when running a command - i.e., that your PATH is
finding a different installation than the one in /usr.
I am using a prefix configuration, so no, it does not exist in /usr.
--td
On 10/26/2011 10:44 AM, Ralph Castain wrote:
Did the version you are running get installed in /usr? Sounds like you are
picking up a different version when running a command - i.e., that your PATH is
finding a different installation than the one in /usr.
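(For context, a "prefix configuration" here means Open MPI was built with --prefix pointing somewhere other than /usr; a minimal sketch, with /opt/openmpi used purely as a hypothetical path:

    ./configure --prefix=/opt/openmpi
    make all install
    # mpirun and the wrapper compilers then live under the prefix, not /usr
    export PATH=/opt/openmpi/bin:$PATH
    export LD_LIBRARY_PATH=/opt/openmpi/lib:$LD_LIBRARY_PATH
)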
Did the version you are running get installed in /usr? Sounds like you are
picking up a different version when running a command - i.e., that your PATH is
finding a different installation than the one in /usr.
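(A quick way to confirm which installation the shell actually resolves, assuming a standard Open MPI install on the PATH:

    which mpirun              # first mpirun found on PATH
    mpirun --version          # version string of that binary
    ompi_info | grep Prefix   # install prefix reported by that installation
)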
On Oct 26, 2011, at 3:11 AM, Patrick Begou wrote:
> I need to change, system wide, how OpenMPI launches jobs on the nodes of my cluster.
I need to change, system wide, how OpenMPI launches jobs on the nodes of my
cluster.
Setting:
export OMPI_MCA_plm_rsh_agent=oarsh
works fine, but I would like this configuration to be the default for OpenMPI. I've
read several threads (discussions, FAQ) about this, but none of the provided
solutions worked.
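(For reference, the standard mechanism for site-wide MCA defaults is the parameter file under the install prefix; a minimal sketch, with $prefix standing for the installation prefix:

    # $prefix/etc/openmpi-mca-params.conf -- read by every user of this installation
    plm_rsh_agent = oarsh

A per-user alternative is the same line in $HOME/.openmpi/mca-params.conf; command-line and environment settings still take precedence over both files.)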