Re: [OMPI users] is there an equiv of iprobe for bcast?

2011-05-10 Thread Randolph Pullen
Thanks. The messages are small and frequent (they flash metadata across the cluster). The current approach works fine for small to medium clusters, but I want it to be able to go big, maybe up to several hundred or even a thousand nodes. It's these larger deployments that concern me. The cu

[OMPI users] Trouble with MPI-IO

2011-05-10 Thread Tom Rosmond
I would appreciate it if someone with experience with MPI-IO would look at the simple Fortran program gzipped and attached to this note. It is embedded in a script, so all that is necessary to run it is to do 'testio' from the command line. The program generates a small 2-D input array, sets up an MPI-IO e

Re: [OMPI users] MPI_COMM_DUP freeze with OpenMPI 1.4.1

2011-05-10 Thread George Bosilca
On May 10, 2011, at 08:10 , Tim Prince wrote: > On 5/10/2011 6:43 AM, francoise.r...@obs.ujf-grenoble.fr wrote: >> >> Hi, >> >> I compile a parallel program with OpenMPI 1.4.1 (compiled with intel >> compilers 12 from composerxe package) . This program is linked to MUMPS >> library 4.9.2, compi

Re: [OMPI users] MPI_COMM_DUP freeze with OpenMPI 1.4.1

2011-05-10 Thread Tim Prince
On 5/10/2011 6:43 AM, francoise.r...@obs.ujf-grenoble.fr wrote: Hi, I compile a parallel program with OpenMPI 1.4.1 (compiled with intel compilers 12 from composerxe package) . This program is linked to MUMPS library 4.9.2, compiled with the same compilers and link with intel MKL. The OS is lin

[OMPI users] Issue with Open MPI 1.5.3 Windows binary builds

2011-05-10 Thread Tyler W. Wilson
Good day, I am new to the Open MPI package, and so am starting at the beginning. I have little if any desire to build the binaries, so I was glad to see a Windows binary release. I started with what I think is the minimal program: #include "mpi.h" int main(int argc, char* argv[]) { MPI_Init(

Re: [OMPI users] Windows: MPI_Allreduce() crashes when using MPI_DOUBLE_PRECISION

2011-05-10 Thread Jeff Squyres
On May 10, 2011, at 2:30 AM, hi wrote: >> You didn't answer my prior questions. :-) > I am observing this crash using MPI_ALLREDUCE() in a test program, which does not have any memory corruption issue. ;) Can you send the info listed on the help page? >> I ran your test program with -np 2 a

[OMPI users] MPI_COMM_DUP freeze with OpenMPI 1.4.1

2011-05-10 Thread francoise.r...@obs.ujf-grenoble.fr
Hi, I compiled a parallel program with OpenMPI 1.4.1 (built with Intel compilers 12 from the Composer XE package). This program is linked to the MUMPS library 4.9.2, compiled with the same compilers and linked with Intel MKL. The OS is Debian Linux. No error in compiling or running the job, but the

[OMPI users] openmpi (1.2.8 or above) and Intel composer XE 2011 (aka 12.0)

2011-05-10 Thread Salvatore Podda
Dear all, we succeeded in building several versions of openmpi, from 1.2.8 to 1.4.3, with Intel Composer XE 2011 (aka 12.0). However, we found a threshold in the number of cores (depending on the application: IMB, xhpl, or user applications, and on the number of required cores) above which the a

Re: [OMPI users] Windows: MPI_Allreduce() crashes when using MPI_DOUBLE_PRECISION

2011-05-10 Thread hi
Hi Jeff, > You didn't answer my prior questions. :-) I am observing this crash using MPI_ALLREDUCE() in a test program, which does not have any memory corruption issue. ;) > I ran your test program with -np 2 and -np 4 and it seemed to work ok. Can you please let me know what environment (incl