I have a problem with my MPI code: it hangs when run on multiple nodes, but
completes successfully when run on a single node. I am not sure how to
debug this. Can someone help me debug this issue?
Program Usage:
mpicc -o string string.cpp
mpirun -np 4 -npernode 2 -hostfile hosts ./string
Your code works for me on two platforms, so I guess the problem is with the
communication layer (BTL) in Open MPI. What network do you use? If Ethernet,
how many interfaces?
Thanks,
george.
On Oct 10, 2012, at 09:30 , Santhosh Kokala
wrote:
> I have a problem with my MPI code, it han
Dear Open MPI developer,
in this post:
http://www.open-mpi.org/community/lists/users/2012/10/20416.php
I already reported a case where Open MPI silently (without any word of
caution!) changed the transport from InfiniBand to IPoIB, thus losing the
performance.
Another case of 'secret' d
George,
Each host I am using has 4 interfaces, including the loopback interface. Can
you please let me know more about your environment?
eth0 Link encap:Ethernet HWaddr bc:30:5b:db:ae:6f
inet addr:xxx.xxx.xxx.134 Bcast:xxx.xxx.xxx.255 Mask:255.255.255.0
inet6 addr: fe80::
I guess the TCP BTL gets confused by your virtual interfaces (vmnet?). Try
limiting the interfaces it uses with the "--mca btl_tcp_if_include eth0"
argument. Let us know if this solves your issue.
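Putting George's suggestion together with the original launch command, the full invocation would look something like the sketch below (the program name and hostfile are taken from the usage posted earlier in the thread; the interface name `eth0` must match whatever `ifconfig` reports on your hosts):

```shell
# Restrict the TCP BTL to a single physical interface so it does not
# try to connect over loopback or virtual (vmnet) interfaces.
mpirun --mca btl_tcp_if_include eth0 \
       -np 4 -npernode 2 -hostfile hosts ./string
```

The equivalent exclusion form is `--mca btl_tcp_if_exclude lo,vmnet1` if you would rather list the interfaces to avoid; note that Open MPI uses one of `btl_tcp_if_include` or `btl_tcp_if_exclude`, not both at once.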
Thanks,
george.
On Oct 10, 2012, at 18:54 , Santhosh Kokala
wrote:
> George,
> I am using
George,
You are a life saver. This solved my issue.
From: devel-boun...@open-mpi.org [mailto:devel-boun...@open-mpi.org] On Behalf
Of George Bosilca
Sent: Wednesday, October 10, 2012 10:10 AM
To: Open MPI Developers
Subject: Re: [OMPI devel] MPI_Reduce Hangs in my Application
I guess the TCP BTL
Good to hear ;)
Btw, the parameter I was talking about accepts more complex forms of
inclusion/exclusion. Basically, all CIDR formats are supported.
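To illustrate the CIDR forms George mentions, the include/exclude parameters can take subnets instead of interface names (the addresses below are placeholders, not values from this thread):

```shell
# Select interfaces by subnet rather than by name: only interfaces with
# an address in 192.168.1.0/24 will be used by the TCP BTL.
mpirun --mca btl_tcp_if_include 192.168.1.0/24 -np 4 ./string

# Multiple subnets can be given as a comma-separated list.
mpirun --mca btl_tcp_if_include 192.168.1.0/24,10.0.0.0/8 -np 4 ./string
```

Subnet selection is handy on clusters where interface names differ between nodes (eth0 on one host, eth1 on another) but the networks themselves are consistent.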
george.
On Oct 10, 2012, at 19:20 , Santhosh Kokala
wrote:
> George,
> You are a life saver. This solved my issue.
>
> From: devel-boun
Hi there,
I had deep joy rebuilding from the SRPM of 1.6.2 as follows:
rpmbuild \
--define '_topdir /home/scifachpc01/buckleke/rpmbuild' \
--define 'install_in_opt 1' \
--define 'install_shell_scripts 1' \
--define 'install_modulefile 1' \
--define 'use_mpi_selector 1' \
-bb openm