Re: [OMPI users] Open MPI on Cray XE6 / Gemini

2012-10-10 Thread Ralph Castain
Sorry - I saw the "pirun" cmd and thought it was some kind of Cray cmd. Sent from my iPhone. On Oct 10, 2012, at 9:11 AM, Nathan Hjelm wrote: > He is using mpirun from what I can see. And in this case the orted will use > PMI but the app will use the tcp oob to talk to the orted since there is

Re: [OMPI users] Intel 13.0.0 -pthreads warning

2012-10-10 Thread Jonas Juselius
Hi! On 10/10/2012 07:25 PM, Reuti wrote: Hi, on 10.10.2012 at 19:16 Jonas Juselius wrote: we recently installed the latest compilers from Intel (version 13.0.0), and compiled Open MPI 1.6.2 for those compilers. When compiling using the mpif90, mpicc and mpicxx commands, the compiler spews

[OMPI users] Can not submit openmpi jobs with slurm on Centos 6.0

2012-10-10 Thread USA Linux UAE
Hello, I am using Open MPI (1.4.3) with Slurm (2.4.2) on CentOS 6.0. I can run my jobs with mpirun against my node list in the partition using the "-H" option. But when I use Slurm, run salloc -n 3 sh, and then submit MPI jobs with mpirun, I get the following error: salloc: Granted job
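A minimal sketch of the launch sequence being described, assuming a generic MPI binary named ./my_mpi_app (binary name and task count are illustrative, not taken from the report):

$ salloc -n 3 sh              # ask Slurm for an allocation of 3 tasks and start a shell inside it
$ mpirun -np 3 ./my_mpi_app   # launch the MPI job within that allocation
$ exit                        # release the allocation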

Re: [OMPI users] Intel 13.0.0 -pthreads warning

2012-10-10 Thread Reuti
Hi, on 10.10.2012 at 19:16 Jonas Juselius wrote: > we recently installed the latest compilers from Intel (version 13.0.0), and > compiled Open MPI 1.6.2 for those compilers. When compiling using the mpif90, > mpicc and mpicxx commands, the compiler spews out warnings that the > "-pthreads" op

[OMPI users] Intel 13.0.0 -pthreads warning

2012-10-10 Thread Jonas Juselius
Hi, we recently installed the latest compilers from Intel (version 13.0.0), and compiled Open MPI 1.6.2 for those compilers. When compiling using the mpif90, mpicc and mpicxx commands, the compiler spews out warnings that the "-pthreads" option is deprecated and that "-reentrant threads" shoul
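One way to see which flags the Open MPI wrapper compilers actually pass to the underlying Intel compilers is the wrappers' --showme options; a minimal sketch (any deprecated -pthread variant would appear in this output):

$ mpif90 --showme:compile   # flags the wrapper adds when compiling
$ mpif90 --showme:link      # flags the wrapper adds when linking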

Re: [OMPI users] Open MPI on Cray XE6 / Gemini

2012-10-10 Thread Nathan Hjelm
He is using mpirun from what I can see. And in this case the orted will use PMI but the app will use the tcp oob to talk to the orted since there is no shmem oob atm. -Nathan On Wed, Oct 10, 2012 at 08:04:20AM -0700, Ralph Castain wrote: > Hi Nathan > > The only way to get that OOB error is if

Re: [OMPI users] Open MPI on Cray XE6 / Gemini

2012-10-10 Thread Ralph Castain
Hi Nathan The only way to get that OOB error is if PMI isn't running - hence my earlier note. If PMI isn't actually running, then we fall back to the TCP OOB and try to open sockets - which won't work because the app is being direct-launched. Alternatively, he could launch using "mpirun" and then

Re: [OMPI users] Open MPI on Cray XE6 / Gemini

2012-10-10 Thread Nathan Hjelm
On Wed, Oct 10, 2012 at 02:50:59PM +0200, Christoph Niethammer wrote: > Hello, > > I just tried to use Open MPI 1.7a1r27416 on a Cray XE6 system. Unfortunately > I > get the following error when I run a simple HelloWorldMPI program: > > $ pirun HelloWorldMPI > App launch reported: 2 (out of 2)

Re: [OMPI users] Open MPI on Cray XE6 / Gemini

2012-10-10 Thread Ralph Castain
Actually, I suspect the problem is that you don't have PMI running on the machine. The processes have no reason to be opening sockets for the OOB on a Cray XE6, and if you look at that platform file, it defines the location of the PMI libraries that are required. Since it built, I expect the libra

Re: [OMPI users] Open MPI on Cray XE6 / Gemini

2012-10-10 Thread Howard Pritchard
Hello Christoph, when Open MPI is configured with the lanl/cray_xe6/optimized-nopanasa platform file, you have to use the 'aprun' launch command. Try running your HelloWorldMPI as aprun -n 1 HelloWorldMPI Also, this particular config of Open MPI will not run in the 'cluster compatibility' mode environment. Ho
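For reference, a direct ALPS launch with aprun might look like the sketch below; the process count of 32 is only illustrative, echoing the "out of 32 procs" in the quoted error output:

$ aprun -n 32 ./HelloWorldMPI   # launch directly through ALPS instead of mpirun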

Re: [OMPI users] Open MPI on Cray XE6 / Gemini

2012-10-10 Thread Reuti
Hi, on 10.10.2012 at 14:50 Christoph Niethammer wrote: > I just tried to use Open MPI 1.7a1r27416 on a Cray XE6 system. Unfortunately > I get the following error when I run a simple HelloWorldMPI program: > > $ pirun HelloWorldMPI > App launch reported: 2 (out of 2) daemons - 0 (out of 32) p

Re: [OMPI users] PAPI errors when compiling OpenMPI

2012-10-10 Thread Jeff Squyres
Check out our version philosophy: http://www.open-mpi.org/software/ompi/versions/ On Oct 10, 2012, at 10:27 AM, Tohiko Looka wrote: > Thanks a lot Jeff > 1.6.2 had similar problems but --disable-vt worked > Is there a page that tells which Open MPI versions are compatible with > each other? (I

Re: [OMPI users] internal error with mpiJava in openmpi-1.9a1r27380

2012-10-10 Thread Ralph Castain
I haven't tried heterogeneous apps on the Java code yet - could well not work. At the least, I would expect you need to compile your Java app against the corresponding OMPI install on each architecture, and ensure the right one gets run on each node. Even though it's a Java app, the classes need to

Re: [OMPI users] PAPI errors when compiling OpenMPI

2012-10-10 Thread Tohiko Looka
Thanks a lot, Jeff. 1.6.2 had similar problems, but --disable-vt worked. Is there a page that tells which Open MPI versions are compatible with each other? (In the sense that they can communicate with each other.) On Tue, Oct 9, 2012 at 6:42 PM, Jeff Squyres wrote: > On Oct 9, 2012, at 11:34 AM, Tohiko L

Re: [OMPI users] undefined reference to `__intel_sse2_strlen'

2012-10-10 Thread Thomas Evangelidis
My apologies, I haven't searched in the FAQs before posting, just in the mailing list. Indeed I needed to specify the CXX compiler. These are the steps I followed to compile it: source /home/thomas/Programs/Intel_Compilers/bin/compilervars.sh intel64 ./configure CC=icc CXX=icpc F77=ifort FC=ifort
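The truncated recipe above corresponds roughly to the following sketch; the --prefix path and the make invocation are assumptions added for completeness:

$ source /home/thomas/Programs/Intel_Compilers/bin/compilervars.sh intel64      # put the Intel compilers on PATH
$ ./configure CC=icc CXX=icpc F77=ifort FC=ifort --prefix=$HOME/openmpi-intel   # hypothetical install prefix
$ make -j4 && make install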

[OMPI users] Open MPI on Cray XE6 / Gemini

2012-10-10 Thread Christoph Niethammer
Hello, I just tried to use Open MPI 1.7a1r27416 on a Cray XE6 system. Unfortunately I get the following error when I run a simple HelloWorldMPI program: $ pirun HelloWorldMPI App launch reported: 2 (out of 2) daemons - 0 (out of 32) procs ... [unset]:_pmi_alps_get_appLayout:pmi_alps_get_apid ret

[OMPI users] internal error with mpiJava in openmpi-1.9a1r27380

2012-10-10 Thread Siegmar Gross
Hi, I have built openmpi-1.9a1r27380 with Java support and implemented a small program that sends some kind of "hello" with Send/Recv. tyr java 164 make mpijavac -d /home/fd1026/mpi_classfiles MsgSendRecvMain.java ... Everything works fine if I use Solaris 10 x86_64. tyr java 165 mpiexec -np
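For readers unfamiliar with the Java bindings, the compile-and-run cycle quoted above looks roughly like this; the process count is a placeholder:

$ mpijavac -d /home/fd1026/mpi_classfiles MsgSendRecvMain.java         # javac wrapper that adds the MPI classes to the classpath
$ mpiexec -np 2 java -cp /home/fd1026/mpi_classfiles MsgSendRecvMain   # run the compiled class under mpiexec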

[OMPI users] Datatype.Vector in mpijava in openmpi-1.9a1r27380

2012-10-10 Thread Siegmar Gross
Hi, I have built openmpi-1.9a1r27380 with Java support and try some small programs. When I try to Send/Recv the columns of a matrix, I don't get the expected results. I used "offset = 0" instead of "offset = i" in MPI.COMM_WORLD.Send for the following output, so that all processes should have rece

Re: [OMPI users] undefined reference to `__intel_sse2_strlen'

2012-10-10 Thread Matthias Jurenz
Hello Thomas, this error typically occurs when different compiler suites are used for compiling mixed C/C++ source code. Please add CXX=icpc to your configure command in order to use a single compiler suite (=Intel) for compiling Open MPI. Otherwise, CXX is set to the default compiler (=g++) which

[OMPI users] Is MPI_Accumulate compatible with an user-defined derived datatype?

2012-10-10 Thread Victor Vysotskiy
Hello, I am wondering whether the MPI_Accumulate subroutine implemented in Open MPI v1.6.2 is capable of operating on derived datatypes. I wrote a very simple test program for accumulating data from several processes on the master. The program works properly only with predefined datatypes. In th

Re: [OMPI users] Problems with cuda when installing openmpi 1.6.2

2012-10-10 Thread Hodgess, Erin
I found it... I had uninstalled CUDA but did not re-run ./configure. Thanks, Erin
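A minimal sketch of the kind of rebuild being described, assuming the source tree is still in place (the distclean step and the bracketed option list are assumptions, not the poster's exact commands):

$ cd /home/erin/openmpi-1.6.2
$ make distclean                  # drop the stale configuration that still references CUDA
$ ./configure [original options]  # re-run configure so CUDA support is no longer detected
$ make all install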

[OMPI users] Problems with cuda when installing openmpi 1.6.2

2012-10-10 Thread Hodgess, Erin
Hello! I'm trying to install Open MPI 1.6.2. However, I'm getting the following error when running "make all install" make[5]: Entering directory `/home/erin/openmpi-1.6.2/ompi/contrib/vt/vt/vtlib' CC vt_gpu.lo In file included from vt_gpu.h:97:0, from vt_gpu.c:13: vt_cuda