Ashley Pittman wrote:
This smacks of a firewall issue. I thought you'd said you weren't using one, but
reading back over your emails I can't see anywhere where you say that. Are you
running a firewall or any iptables rules on any of the nodes? It looks to me
like you may have some setup problem on the worker nodes.
Ash
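A quick way to answer Ashley's question on each node is sketched below; this assumes root access and the stock iptables service that Scientific Linux / RHEL 5 ships:

```shell
# List every active iptables rule with packet/byte counters.
# Empty chains with policy ACCEPT mean no packet filtering is in effect.
iptables -L -n -v

# On SL/RHEL 5 the firewall runs as an init service; this reports
# whether it is loaded at all.
service iptables status
```

Running this on the master and on each worker node would quickly confirm or rule out a firewall as the cause of the hang.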
Rolf vandeVaart wrote:
Ethan:
Can you run just "hostname" successfully? In other words, a non-MPI
program. If that does not work, then we know the problem is in the
runtime. If it does work, then there is something with the way the MPI
library is setting up its connections.
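Rolf's check can be run directly; a sketch using the same machinefile name that appears later in the thread (`testfile`):

```shell
# Launch a non-MPI program (hostname) on 4 processes. If this hangs,
# the fault is in the Open MPI runtime (remote launch / out-of-band
# TCP wire-up), not in the MPI communication layer itself.
mpirun --machinefile testfile -np 4 hostname
```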
Interesting.
Is there more than one interface on the nodes?
Prentice Bisbal wrote:
I'm assuming you already tested ssh connectivity and verified everything
is working as it should. (You did test all that, right?)
Yes. I am able to log in remotely to all nodes from the master, and to each node from each node
without a password. Each node mounts the sa
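That all-pairs connectivity check can be automated; a minimal sketch with placeholder hostnames (BatchMode makes ssh fail fast instead of prompting, so a broken key setup is visible immediately):

```shell
# Hypothetical node names; substitute the real master/worker hostnames.
for node in node01 node02 node03 node04; do
    ssh -o BatchMode=yes -o ConnectTimeout=5 "$node" true \
        && echo "$node: ok" \
        || echo "$node: FAILED"
done
```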
Prentice Bisbal wrote:
Ethan Deneault wrote:
> All,
>
> I am running Scientific Linux 5.5, with OpenMPI 1.4 installed into the
> /usr/lib/openmpi/1.4-gcc/ directory. I know this is typically
> /opt/openmpi, but Red Hat does things differently. I have my PATH and
> LD_LIBRARY_PATH set correctly; because the test program
Professor of Physics
SC 234
University of Tampa
Tampa, FL 33606
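For a non-default install prefix like this one, the two variables Ethan mentions would be set along these lines (a sketch; the exact startup file depends on the login shell, and the settings must reach non-interactive shells too, since mpirun launches remote daemons over ssh):

```shell
# Prepend the Red Hat-packaged Open MPI 1.4 tree to the search paths.
# Putting these in ~/.bashrc (not only ~/.bash_profile) ensures that
# non-interactive ssh sessions used by mpirun also pick them up.
export PATH=/usr/lib/openmpi/1.4-gcc/bin:$PATH
export LD_LIBRARY_PATH=/usr/lib/openmpi/1.4-gcc/lib:$LD_LIBRARY_PATH
```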
-----Original Message-----
From: users-boun...@open-mpi.org on behalf of David Zhang
Sent: Mon 9/20/2010 9:58 PM
To: Open MPI Users
Subject: Re: [OMPI users] Test Program works on 1, 2 or 3 nodes. Hangs on 4 or
more nodes.
I don't know if this will help, but try
mpirun --machinefile testfile -np 4 ./test.out
for running 4 processes
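For reference, the machinefile for this command is just one hostname per line, optionally with a slot count; a sketch with placeholder node names:

```shell
# Create a sample machinefile: one host per line; slots=N caps the
# number of processes mpirun may place on that host.
cat > testfile <<'EOF'
node01 slots=1
node02 slots=1
node03 slots=1
node04 slots=1
EOF
```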
On Mon, Sep 20, 2010 at 3:00 PM, Ethan Deneault wrote: