Re: [OMPI users] Spawn_multiple with tight integration to SGE grid engine

2012-01-30 Thread Tom Bryan
On 1/29/12 5:44 PM, "Reuti" wrote: > you compiled Open MPI --with-sge I assume, as the above is working - fine. Yes, we compiled --with-sge. >> #$ -pe orte 1- > > This number should match the processes you want to start plus one for the master. > Otherwise SGE might refuse to start a process on a
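Reuti's point can be made concrete with a job script: the slot range requested via -pe must cover the dynamically spawned workers plus one slot for the master. A hypothetical SGE submission script, assuming the orte PE from the thread; the worker count and binary name are illustrative:

```sh
#!/bin/sh
#$ -S /bin/sh
#$ -N spawn_test
#$ -cwd
# 4 spawned workers + 1 master = 5 slots; a range like "1-" can let SGE
# grant fewer slots than the spawn call will later need.
#$ -pe orte 5
# Start only the master; it calls MPI_Comm_spawn_multiple for the workers.
mpirun -np 1 ./spawn_master
```

With tight integration, the spawned processes are then accounted against the granted slots instead of escaping SGE's control.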

Re: [OMPI users] OpenMPI and pgf90 -- adding flags to mpif90 compile

2012-01-30 Thread Jeff Squyres
Do you need these flags when you run pgf90 or mpif90? If you need it with mpif90, then using --with-wrapper-ldflags= should add to mpif90 every time you link something. If you need something added to mpif90 every time you compile or link, then use --with-wrapper-fcflags=. If neither of those
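The two configure options Jeff mentions can be sketched as follows; the specific flag values and library path are illustrative assumptions, not taken from the thread:

```sh
# Build Open MPI so that its wrapper compilers pass extra flags through
# to pgf90. --with-wrapper-ldflags affects link steps; --with-wrapper-fcflags
# affects every Fortran compile.
./configure FC=pgf90 \
    --with-wrapper-fcflags="-Mnomain" \
    --with-wrapper-ldflags="-L/opt/pgi/lib"
make all install

# Verify what mpif90 will actually pass to the underlying compiler/linker:
mpif90 --showme:compile
mpif90 --showme:link
```

The --showme options print the full command line the wrapper would execute, which is a quick way to confirm the flags made it in.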

Re: [OMPI users] OpenMPI and pgf90 -- adding flags to mpif90 compile

2012-01-30 Thread Reuti
On 31.01.2012 at 00:24, Cable, Sam B Civ USAF AFMC AFRL/RVBXI wrote: > I need to build OpenMPI with Portland Group Fortran. I need to add a flag to > the pgf90 linker command that is run when mpif90 is invoked. I have tried > configuring with LDFLAGS and --with-wrapper-ldflags, but nothing works

[OMPI users] OpenMPI and pgf90 -- adding flags to mpif90 compile

2012-01-30 Thread Cable, Sam B Civ USAF AFMC AFRL/RVBXI
I need to build OpenMPI with Portland Group Fortran. I need to add a flag to the pgf90 linker command that is run when mpif90 is invoked. I have tried configuring with LDFLAGS and --with-wrapper-ldflags, but nothing works. I am thinking that surely there is a way to get non-default flags put into

Re: [OMPI users] pure static "mpirun" launcher (Jeff Squyres) - now testing

2012-01-30 Thread Jeff Squyres
Try running a dynamic version of your process through valgrind or another memory-checking debugger, and see if anything shows up. On Jan 30, 2012, at 2:50 PM, Ilias Miroslav wrote: > Well, > > the simplest program, > > program main >implicit none >include 'mpif.h' >
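Jeff's suggestion can be sketched as a single command line; the binary name is taken from the thread, and %p expands to the PID so each rank writes its own log file:

```sh
# Run a dynamically linked build of the application under valgrind,
# launched via mpirun so every rank is checked.
mpirun -np 2 valgrind --leak-check=full --log-file=vg.%p.log ./dirac.x
```

Invalid reads/writes reported in the per-rank logs would point at an application bug rather than a problem in the static Open MPI build.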

Re: [OMPI users] pure static "mpirun" launcher (Jeff Squyres) - now testing

2012-01-30 Thread Ilias Miroslav
Well, the simplest program, program main implicit none include 'mpif.h' integer ierr, rank, size call MPI_INIT(ierr) call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr) call MPI_COMM_SIZE(MPI_COMM_WORLD, size, ierr) print *, "Hello, world,
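The program above is cut off in the archive; a complete minimal version along the same lines might look like the following (everything past the truncation point is a reconstruction, not the poster's exact code):

```fortran
      program main
      implicit none
      include 'mpif.h'
      integer ierr, rank, size
      call MPI_INIT(ierr)
      call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)
      call MPI_COMM_SIZE(MPI_COMM_WORLD, size, ierr)
      print *, 'Hello, world, I am rank', rank, 'of', size
      call MPI_FINALIZE(ierr)
      end program main
```

Compiled with mpif90 and run under mpirun, this isolates whether the segfault comes from the static Open MPI build itself or from the larger application.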

Re: [OMPI users] pure static "mpirun" launcher (Jeff Squyres) - now testing

2012-01-30 Thread Ilias Miroslav
Hi, what segfaulted? I am not sure... maybe it is an application bug showing up with static Open MPI. I will try to compile & run the simplest MPI example and shall let you know. In between I am attaching debugger output that should help to track this bug: Backtrace for this error: + function __restore_rt (0x25

Re: [OMPI users] pure static "mpirun" launcher (Jeff Squyres) - now testing

2012-01-30 Thread Ralph Castain
What segfaulted - mpirun or your app? On Jan 30, 2012, at 11:24 AM, Ilias Miroslav wrote: > Hi Jeff, > > thanks for the fix; > > I downloaded the Open MPI trunk and have built it up, > > the (most recent) revision 25818 is giving this error and hangs: > > /home/ilias/bin/ompi_ilp64_static/bi

Re: [OMPI users] pure static "mpirun" launcher (Jeff Squyres) - now testing

2012-01-30 Thread Ilias Miroslav
Hi Jeff, thanks for the fix; I downloaded the Open MPI trunk and built it; the (most recent) revision 25818 is giving this error and hangs: /home/ilias/bin/ompi_ilp64_static/bin/mpirun -np 2 ./dirac.x . . Program received signal 11 (SIGSEGV): Segmentation fault. Backtrace for this

Re: [OMPI users] Latest Intel Compilers (ICS, version 12.1.0.233 Build 20110811) issues ...

2012-01-30 Thread Richard Walsh
Hey Götz, I have not seen this mpirun error with the OpenMPI version I have built with Intel 12.1 and the mpicc fix: openmpi-1.5.5rc1.tar.bz2, and from the looks of things, I wonder if your problem is related. The solution in the original case was to conditionally dial down optimization when us

Re: [OMPI users] Latest Intel Compilers (ICS, version 12.1.0.233 Build 20110811) issues ...

2012-01-30 Thread Götz Waschk
Hi Richard, On Wed, Jan 4, 2012 at 4:06 PM, Richard Walsh wrote: > Moreover, this problem has been addressed in the 1.5.5 OpenMPI release > with the following workaround in opal/mca/memory/linux/malloc.c: > #ifdef __INTEL_COMPILER_BUILD_DATE > #  if __INTEL_COMPILER_BUI

Re: [OMPI users] MPI_Barrier, again

2012-01-30 Thread Evgeniy Shapiro
I have attached an example. Compiler: ifort (IFORT) 11.1 20090630 Copyright (C) 1985-2009 Intel Corporation. All rights reserved. flags: mpif90 -O0 -fp-model precise -traceback -r8 -i4 -fpp -check all -warn all -warn nounused -save-temps -g -check noarg_temp_created -o testbar ./mpibarriertes