Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-14 Thread Szilárd Páll
Malcolm, On Mon, May 11, 2015 at 4:23 PM, Malcolm Tobias wrote: > > Szilárd, > > On Friday 08 May 2015 21:18:12 Szilárd Páll wrote: >> >> What is your goal with using CPUSETs? Node sharing? >> > >> > Correct. While it might be possible to see the cores that have been >> > assigned to the job an

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-11 Thread Malcolm Tobias
Mark, On Friday 08 May 2015 15:15:31 Mark Abraham wrote: > > FWIW, I ran the same GROMACS run outside of the queuing system to verify > > that the CPUSETs were not causing the issue. > > > > MPI gets a chance to play with OMP_NUM_THREADS (and pinning!), too, so your > tests suggest the issue lie
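The launcher-side knobs referred to here would, with OpenMPI (an assumption; the thread does not say which MPI is in use), look something like:

   # make sure OMP_NUM_THREADS actually reaches the remote ranks
   mpirun -np 2 -x OMP_NUM_THREADS=4 mdrun_mpi -ntomp 4 -s topol.tpr
   # check whether the launcher itself is binding each rank to a single core
   mpirun -np 2 --report-bindings --bind-to-core /bin/true

A launcher that binds each rank to one core would also confine that rank's OpenMP threads to the same core, which matches the symptom of only the MPI processes doing work.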

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-11 Thread Malcolm Tobias
Szilárd, On Friday 08 May 2015 21:18:12 Szilárd Páll wrote: > >> What is your goal with using CPUSETs? Node sharing? > > > > Correct. While it might be possible to see the cores that have been > > assigned to the job and do the correct 'pin setting' it would probably be > > ugly. > > Not sure
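For what it's worth, the cores a CPUSET grants to a job can be read back from userspace, though turning that into the right -pinoffset is the admittedly ugly part; the paths below are illustrative and the cgroup layout differs between systems:

   # cores the current shell is allowed to run on
   taskset -cp $$
   grep Cpus_allowed_list /proc/self/status
   # or read the cpuset directly where it is mounted (name is hypothetical)
   cat /dev/cpuset/$JOB_CPUSET/cpuset.cpus     # e.g. 8-15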

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-08 Thread Szilárd Páll
On Fri, May 8, 2015 at 8:44 PM, Malcolm Tobias wrote: > > Szilárd, > > On Friday 08 May 2015 20:25:09 Szilárd Páll wrote: >> > I wouldn't expect the CPUSETs to be problematic, I've been using them with >> > Gromacs for over a decade now ;-) >> >> Thread affinity setting within mdrun has been empl

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-08 Thread Malcolm Tobias
Szilárd, On Friday 08 May 2015 20:25:09 Szilárd Páll wrote: > > I wouldn't expect the CPUSETs to be problematic, I've been using them with > > Gromacs for over a decade now ;-) > > Thread affinity setting within mdrun has been employed since v4.6 and > we do it on a per-thread basis and not doi
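The mdrun-side pinning controls being discussed are, as of 4.6, roughly as follows (binary name and offsets are only an illustration of sharing a 16-core node between two jobs, not taken from the thread):

   # job A: one rank, 8 threads on cores 0-7
   mdrun_mpi -ntomp 8 -pin on -pinoffset 0 -s a.tpr
   # job B: one rank, 8 threads on cores 8-15
   mdrun_mpi -ntomp 8 -pin on -pinoffset 8 -s b.tpr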

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-08 Thread Szilárd Páll
On Fri, May 8, 2015 at 4:45 PM, Malcolm Tobias wrote: > > Szilárd, > > On Friday 08 May 2015 15:56:12 Szilárd Páll wrote: >> What's being utilized vs what's being started are different things. If >> you don't believe the mdrun output - which is quite likely not wrong >> about the 2 ranks x 4 threa

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-08 Thread Mark Abraham
On Fri, May 8, 2015 at 4:28 PM Malcolm Tobias wrote: > > Mark, > > On Friday 08 May 2015 13:48:30 Mark Abraham wrote: > > > What kind of simulation are you testing with? A reaction-field water box > > will have almost nothing to do on the CPU, so no real change with > #threads. > > Check with you
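A quick way to confirm whether a test case actually exercises PME rather than reaction field (file names here are only placeholders):

   # from the run input
   gmxdump -s topol.tpr 2>/dev/null | grep -i coulombtype
   # or from the parameter file it was built from
   grep -i coulombtype grompp.mdp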

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-08 Thread Malcolm Tobias
Szilárd, On Friday 08 May 2015 15:56:12 Szilárd Páll wrote: > What's being utilized vs what's being started are different things. If > you don't believe the mdrun output - which is quite likely not wrong > about the 2 ranks x 4 threads -, use your favorite tool to check the > number of ranks and
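Szilárd's suggestion to check with an external tool could look like this (a sketch; the binary name depends on how mdrun was built and installed):

   # NLWP = number of threads per process, PSR = last CPU it ran on
   ps -o pid,nlwp,psr,args -C mdrun_mpi
   # or press 'H' in top to display individual threads
   # show the CPU affinity mask of each rank
   for p in $(pgrep mdrun_mpi); do taskset -cp $p; done

Two processes each reporting 4 threads but an affinity mask of a single core would explain seeing only two cores' worth of work.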

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-08 Thread Malcolm Tobias
Mark, On Friday 08 May 2015 13:48:30 Mark Abraham wrote: > What kind of simulation are you testing with? A reaction-field water box > will have almost nothing to do on the CPU, so no real change with #threads. > Check with your users, but a PME test case is often more appropriate. I have no ide

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-08 Thread Szilárd Páll
On Fri, May 8, 2015 at 2:50 PM, Malcolm Tobias wrote: > > Hi Mark, > > On Friday 08 May 2015 11:51:03 Mark Abraham wrote: >> > >> > I'm attempting to build GROMACS on a new cluster and following the same >> > recipes that I've used in the past, but encountering a strange behavior: >> > It claims

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-08 Thread Mark Abraham
Hi, On Fri, May 8, 2015 at 3:01 PM Malcolm Tobias wrote: > > Hi Mark, > > On Friday 08 May 2015 11:51:03 Mark Abraham wrote: > > > > > > I'm attempting to build GROMACS on a new cluster and following the same > > > recipes that I've used in the past, but encountering a strange > behavior: > > >

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-08 Thread Malcolm Tobias
Hi Mark, On Friday 08 May 2015 11:51:03 Mark Abraham wrote: > > > > I'm attempting to build GROMACS on a new cluster and following the same > > recipes that I've used in the past, but encountering a strange behavior: > > It claims to be using both MPI and OpenMP, but I can see by 'top' and the >

Re: [gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-08 Thread Mark Abraham
Hi, On Thu, May 7, 2015 at 6:16 PM Malcolm Tobias wrote: > > All, > > I'm attempting to build GROMACS on a new cluster and following the same > recipes that I've used in the past, but encountering a strange behavior: > It claims to be using both MPI and OpenMP, but I can see by 'top' and the >

[gmx-users] Gromacs 4.6.7 with MPI and OpenMP

2015-05-07 Thread Malcolm Tobias
All, I'm attempting to build GROMACS on a new cluster and following the same recipes that I've used in the past, but encountering a strange behavior: It claims to be using both MPI and OpenMP, but I can see by 'top' and the reported core/walltime that it's really only generating the MPI proce
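A minimal sketch of the kind of hybrid build and launch being described, assuming GROMACS 4.6.x with an MPI compiler wrapper and OpenMPI as the launcher (binary name, paths, and core counts are illustrative, not taken from the thread):

   # configure with both MPI and OpenMP support
   cmake .. -DGMX_MPI=ON -DGMX_OPENMP=ON -DCMAKE_INSTALL_PREFIX=$HOME/gromacs-4.6.7
   make && make install
   # 2 MPI ranks x 4 OpenMP threads each on an 8-core node
   export OMP_NUM_THREADS=4
   mpirun -np 2 mdrun_mpi -ntomp 4 -s topol.tpr

If only the two MPI processes show up in 'top' at roughly 100% CPU each, the OpenMP threads are either not being started or are being confined to one core per rank.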