Re: [OMPI users] Increasing OpenMPI RMA win attach region count.

2019-01-09 Thread Jeff Hammond
Why is this allocated statically? I don't understand the difficulty of a dynamically allocated and thus unrestricted implementation. Is there some performance advantage to a bounded static allocation? Or is it that you use O(n) lookups and need to keep n small to avoid exposing that to users? I ha

Re: [OMPI users] Increasing OpenMPI RMA win attach region count.

2019-01-09 Thread Udayanga Wickramasinghe
Thanks, I think that will be very useful. Best, Udayanga On Wed, Jan 9, 2019 at 1:39 PM Jeff Squyres (jsquyres) via users < users@lists.open-mpi.org> wrote: > You can set this MCA var on a site-wide basis in a file: > > https://www.open-mpi.org/faq/?category=tuning#setting-mca-params > > >

Re: [OMPI users] Suggestion to add one thing to look/check for when running OpenMPI program

2019-01-09 Thread Ewen Chan
Jeff: You're welcome. Not a problem. I was trying to email somebody more directly about this recommended change because I had just encountered this problem myself, and spent a bit of time trying to figure out why OpenFOAM on OpenMPI wasn't working when I tried to run it across the nodes.

Re: [OMPI users] Increasing OpenMPI RMA win attach region count.

2019-01-09 Thread Jeff Squyres (jsquyres) via users
You can set this MCA var on a site-wide basis in a file: https://www.open-mpi.org/faq/?category=tuning#setting-mca-params > On Jan 9, 2019, at 1:18 PM, Udayanga Wickramasinghe wrote: > > Thanks. Yes, I am aware of that however, I currently have a requirement to > increase the default. >
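The FAQ link above describes putting MCA parameters in a site-wide file. As an illustrative sketch (the path assumes a default-style install prefix, and the value 64 is only an example, not a recommendation):

```shell
# Site-wide MCA parameter file lives under the Open MPI install prefix.
# Adjust the prefix to match your installation.
echo "osc_rdma_max_attach = 64" >> /opt/openmpi/etc/openmpi-mca-params.conf

# A per-user alternative is the file ~/.openmpi/mca-params.conf
mkdir -p ~/.openmpi
echo "osc_rdma_max_attach = 64" >> ~/.openmpi/mca-params.conf
```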

Re: [OMPI users] Increasing OpenMPI RMA win attach region count.

2019-01-09 Thread Udayanga Wickramasinghe
Thanks. Yes, I am aware of that; however, I currently have a requirement to increase the default. Best, Udayanga On Wed, Jan 9, 2019 at 9:10 AM Nathan Hjelm via users < users@lists.open-mpi.org> wrote: > If you need to support more attachments you can set the value of that > variable either by se

Re: [OMPI users] Suggestion to add one thing to look/check for when running OpenMPI program

2019-01-09 Thread Jeff Squyres (jsquyres) via users
Good suggestion; thank you! > On Jan 8, 2019, at 9:44 PM, Ewen Chan wrote: > > To Whom It May Concern: > > Hello. I'm new here and I got here via OpenFOAM. > > In the FAQ regarding running OpenMPI programs, specifically where someone > might be able to run their OpenMPI program on a local no

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-09 Thread Gilles Gouaillardet
Eduardo, The first part of the configure command line is for an install in /usr, but then there is '--prefix=/opt/openmpi/4.0.0' and this is very fishy. You should also use '--with-hwloc=external'. How many nodes are you running on and which interconnect are you using ? What if you mpirun --mca pml
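Gilles' two suggestions above can be sketched as follows. This is illustrative only: the prefix and the `ob1` PML choice are assumptions, and the message itself is truncated before naming a specific PML.

```shell
# Configure with one consistent prefix (not a mix of /usr and /opt)
# and an external hwloc, as suggested above:
./configure --prefix=/opt/openmpi/4.0.0 --with-hwloc=external
make -j 8 && make install

# To narrow down the failure, force a specific PML component;
# ob1 is one common choice (adjust for your interconnect):
mpirun --mca pml ob1 -n 2 ./a.out
```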

Re: [OMPI users] Increasing OpenMPI RMA win attach region count.

2019-01-09 Thread Nathan Hjelm via users
If you need to support more attachments you can set the value of that variable either by setting: Environment: OMPI_MCA_osc_rdma_max_attach mpirun command line: --mca osc_rdma_max_attach Keep in mind that each attachment may use an underlying hardware resource that may be easy to exhaust (h
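The two mechanisms named in the message above look like this in practice. The value 64 and the program name are placeholders, not recommendations:

```shell
# 1) Via the environment (any OMPI MCA variable can be set as OMPI_MCA_<name>):
export OMPI_MCA_osc_rdma_max_attach=64
mpirun -n 4 ./my_rma_app   # my_rma_app is a hypothetical program name

# 2) Via the mpirun command line:
mpirun --mca osc_rdma_max_attach 64 -n 4 ./my_rma_app
```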

[OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-09 Thread ROTHE Eduardo - externe
Hi. I'm testing Open MPI 4.0.0 and I'm struggling with a weird behaviour in a very simple example (very frustrating). I'm having the following error returned by MPI_Send: [gafront4:25692] *** An error occurred in MPI_Send [gafront4:25692] *** reported by process [3152019457
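The poster's actual code is not shown in the preview. A minimal sketch of the kind of two-rank MPI_Send/MPI_Recv test being described (names and message size are assumptions, not the poster's code):

```c
/* Minimal point-to-point test: rank 0 sends one int to rank 1.
 * Build with mpicc and run with at least 2 ranks. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, value = 42;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0) {
        MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        printf("rank 1 received %d\n", value);
    }
    MPI_Finalize();
    return 0;
}
```

Run as, e.g., `mpicc send_test.c -o send_test && mpirun -n 2 ./send_test`. If even a sketch like this fails, the problem is likely in the build or transport configuration rather than the application code.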