Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Pavan Kumar
Hello Ankita
You have to just include the following line in your mdp file
cutoff-scheme=Verlet
And run your grompp with the modified mdp file to generate the tpr file and then
mdrun.
Hope this doesn't give you the same error
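
A minimal sketch of those two steps with GROMACS 4.6 would be (the file names
here are only placeholders for your own inputs, and the rank count depends on
your cluster):

  grompp -f md_verlet.mdp -c equilibrated.gro -p topol.top -o md_verlet.tpr
  mpirun -np 64 mdrun_mpi -s md_verlet.tpr -deffnm md_verlet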


On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
ankitanaith...@gmail.com wrote:

 Hi,

 I am trying to run a simulation of my protein (monomer ~500 residues). I
 had a few questions and errors regarding the same.
 I have previously run the simulation of the apo form of the same protein
 using Gromacs 4.5.5 which was available at the cluster facility I was using
 and also which is installed in my system. However, when I tried to run the
 holo form, I got this error:
 Fatal error:
 11 particles communicated to PME node 106 are more than 2/3 times the
 cut-off out of the domain decomposition cell of their charge group in
 dimension y.
 This usually means that your system is not well equilibrated.
 For more information and tips for troubleshooting, please check the GROMACS
 website at http://www.gromacs.org/Documentation/Errors

 This I figured out could be solved using a lower timestep as my previous
 timestep was 4fs and now I have reduced it to 3fs which should work fine
 now.
 However, after producing the tpr file for production run in my GROMACS
 4.5.5, I realised that the grant for the cluster facility is over and the
 new clusters on which I am trying to set up the same protein support only
 gromacs 4.6. I am trying to run the code on these clusters and I get the
 following error:


 ---
 Program mdrun_mpi, VERSION 4.6.3
 Source code file: /home/gromacs-4.6.3/src/kernel/runner.c, line: 824

 Fatal error:
 OpenMP threads have been requested with cut-off scheme Group, but these are
 only
  supported with cut-off scheme Verlet
 For more information and tips for troubleshooting, please check the GROMACS
 website at http://www.gromacs.org/Documentation/Errors

 -

 1. I wanted help with my mdp options to make it compatible.
 2. Since my previous calculations were based on gromacs 4.5.5, switching to
 gromacs 4.6, would that break the continuity of the run or would that bring
 about differences in the way the trajectories would be analysed?


 Below is my mdp file:
 title= production MD
 ; Run parameters
 integrator= md; leap-frog algorithm
 nsteps= ; 0.003 *  = 10 ps or 100 n
 dt= 0.003; 3 fs
 ; Output control
 nstxout= 0; save coordinates every 2 ps
 nstvout= 0; save velocities every 2 ps
 nstxtcout= 1000; xtc compressed trajectory output every 5 ps
 nstenergy= 1000; save energies every 5 ps
 nstlog= 1000; update log file every 5 ps
 energygrps  = Protein ATP
 ; Bond parameters
 constraint_algorithm = lincs; holonomic constraints
 constraints= all-bonds; all bonds (even heavy atom-H bonds)
 constrained
 lincs_iter= 1; accuracy of LINCS
 lincs_order= 4; also related to accuracy
 ; Neighborsearching
 ns_type= grid; search neighboring grid cells
 nstlist= 5; 25 fs
 rlist= 1.0; short-range neighborlist cutoff (in nm)
 rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
 rvdw= 1.0; short-range van der Waals cutoff (in nm)
 rlistlong= 1.0; long-range neighborlist cutoff (in nm)
 ; Electrostatics
 coulombtype= PME; Particle Mesh Ewald for long-range
 electrostatics
 pme_order= 4; cubic interpolation
 fourierspacing= 0.16; grid spacing for FFT
 nstcomm = 10; remove com every 10 steps
 ; Temperature coupling is on
 tcoupl= V-rescale; modified Berendsen thermostat
 tc-grps= Protein Non-Protein; two coupling groups - more
 accurate
 tau_t= 0.1 0.1; time constant, in ps
 ref_t= 318 318; reference temperature, one for each group,
 in K
 ; Pressure coupling is off
 pcoupl  = berendsen; Berendsen thermostat
 pcoupltype= isotropic; uniform scaling of box vectors
 tau_p= 1.0; time constant, in ps
 ref_p= 1.0; reference pressure, in bar
 compressibility = 4.5e-5; isothermal compressibility of water, bar^-1
 ; Periodic boundary conditions
 pbc= xyz; 3-D PBC
 ; Dispersion correction
 DispCorr= EnerPres; account for cut-off vdW scheme
 ; Velocity generation
 gen_vel= yes; Velocity generation is on
 gen_temp= 318; reference temperature, for protein in K




 Kind regards,
 Ankita Naithani

Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Ankita Naithani
Hi Pavan,
Thank you for your response. I am trying to generate the tpr file with the
following parameters:
; Neighborsearching
 ns_type= grid; search neighboring grid cells
 nstlist= 5; 25 fs
 rlist= 1.0; short-range neighborlist cutoff (in nm)
 rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
 rvdw= 1.0; short-range van der Waals cutoff (in nm)
 rlistlong= 1.0; long-range neighborlist cutoff (in nm)
cutoff-scheme = Verlet

But I get a warning: Unknown left-hand 'cutoff-scheme' in parameter
file.


On Mon, Mar 24, 2014 at 11:26 AM, Pavan Kumar kumar.pavan...@gmail.com wrote:

 Hello Ankita
 You have to just include the following line in your mdp file
 cutoff-scheme=Verlet
 And run your grompp with the modified mdp file to generate the tpr file and then
 mdrun.
 Hope this doesn't give you the same error


 On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
  ankitanaith...@gmail.com wrote:

  Hi,
 
  I am trying to run a simulation of my protein (monomer ~500 residues). I
  had a few questions and errors regarding the same.
  I have previously run the simulation of the apo form of the same protein
  using Gromacs 4.5.5 which was available at the cluster facility I was
 using
  and also which is installed in my system. However, when I tried to run
 the
  holo form, I got this error:
  Fatal error:
  11 particles communicated to PME node 106 are more than 2/3 times the
  cut-off out of the domain decomposition cell of their charge group in
  dimension y.
  This usually means that your system is not well equilibrated.
  For more information and tips for troubleshooting, please check the
 GROMACS
  website at http://www.gromacs.org/Documentation/Errors
 
  This I figured out could be solved using a lower timestep as my previous
  timestep was 4fs and now I have reduced it to 3fs which should work fine
  now.
  However, after producing the tpr file for production run in my GROMACS
  4.5.5, I realised that the grant for the cluster facility is over and the
  new clusters on which I am trying to set up the same protein support only
  gromacs 4.6. I am trying to run the code on these clusters and I get the
  following error:
 
 
  ---
  Program mdrun_mpi, VERSION 4.6.3
  Source code file: /home/gromacs-4.6.3/src/kernel/runner.c, line: 824
 
  Fatal error:
  OpenMP threads have been requested with cut-off scheme Group, but these
 are
  only
   supported with cut-off scheme Verlet
  For more information and tips for troubleshooting, please check the
 GROMACS
  website at http://www.gromacs.org/Documentation/Errors
 
 
 -
 
  1. I wanted help with my mdp options to make it compatible.
  2. Since my previous calculations were based on gromacs 4.5.5, switching
 to
  gromacs 4.6, would that break the continuity of the run or would that
 bring
  about differences in the way the trajectories would be analysed?
 
 
  Below is my mdp file:
  title= production MD
  ; Run parameters
  integrator= md; leap-frog algorithm
  nsteps= ; 0.003 *  = 10 ps or 100 n
  dt= 0.003; 3 fs
  ; Output control
  nstxout= 0; save coordinates every 2 ps
  nstvout= 0; save velocities every 2 ps
  nstxtcout= 1000; xtc compressed trajectory output every 5 ps
  nstenergy= 1000; save energies every 5 ps
  nstlog= 1000; update log file every 5 ps
  energygrps  = Protein ATP
  ; Bond parameters
  constraint_algorithm = lincs; holonomic constraints
  constraints= all-bonds; all bonds (even heavy atom-H bonds)
  constrained
  lincs_iter= 1; accuracy of LINCS
  lincs_order= 4; also related to accuracy
  ; Neighborsearching
  ns_type= grid; search neighboring grid cells
  nstlist= 5; 25 fs
  rlist= 1.0; short-range neighborlist cutoff (in nm)
  rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
  rvdw= 1.0; short-range van der Waals cutoff (in nm)
  rlistlong= 1.0; long-range neighborlist cutoff (in nm)
  ; Electrostatics
  coulombtype= PME; Particle Mesh Ewald for long-range
  electrostatics
  pme_order= 4; cubic interpolation
  fourierspacing= 0.16; grid spacing for FFT
  nstcomm = 10; remove com every 10 steps
  ; Temperature coupling is on
  tcoupl= V-rescale; modified Berendsen thermostat
  tc-grps= Protein Non-Protein; two coupling groups - more
  accurate
  tau_t= 0.1 0.1; time constant, in ps
  ref_t= 318 318; reference temperature, one for each
 group,
  in K
  ; Pressure coupling is off
  pcoupl  = berendsen; Berendsen thermostat
  

Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Pavan Kumar
It might be a typographical error somewhere.
Check the mdp file thoroughly. I think a semicolon is required for the last
line in your mdp file.


On Mon, Mar 24, 2014 at 5:18 PM, Ankita Naithani
ankitanaith...@gmail.com wrote:

 Hi Pavan,
 Thank you for your response. I am trying to generate the tpr file with the
 following parameters:
 ; Neighborsearching
  ns_type= grid; search neighboring grid cells
  nstlist= 5; 25 fs
  rlist= 1.0; short-range neighborlist cutoff (in nm)
  rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
  rvdw= 1.0; short-range van der Waals cutoff (in nm)
  rlistlong= 1.0; long-range neighborlist cutoff (in nm)
 cutoff-scheme = Verlet

 But I get a warning: Unknown left-hand 'cutoff-scheme' in parameter
 file.


 On Mon, Mar 24, 2014 at 11:26 AM, Pavan Kumar kumar.pavan...@gmail.com
 wrote:

  Hello Ankita
  You have to just include the following line in your mdp file
  cutoff-scheme=Verlet
   And run your grompp with the modified mdp file to generate the tpr file and
 then
  mdrun.
  Hope this doesn't give you the same error
 
 
  On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
   ankitanaith...@gmail.com wrote:
 
   Hi,
  
   I am trying to run a simulation of my protein (monomer ~500 residues).
 I
   had a few questions and errors regarding the same.
   I have previously run the simulation of the apo form of the same
 protein
   using Gromacs 4.5.5 which was available at the cluster facility I was
  using
   and also which is installed in my system. However, when I tried to run
  the
    holo form, I got this error:
   Fatal error:
   11 particles communicated to PME node 106 are more than 2/3 times the
   cut-off out of the domain decomposition cell of their charge group in
   dimension y.
   This usually means that your system is not well equilibrated.
   For more information and tips for troubleshooting, please check the
  GROMACS
   website at http://www.gromacs.org/Documentation/Errors
  
   This I figured out could be solved using a lower timestep as my
 previous
   timestep was 4fs and now I have reduced it to 3fs which should work
 fine
   now.
   However, after producing the tpr file for production run in my GROMACS
   4.5.5, I realised that the grant for the cluster facility is over and
 the
    new clusters on which I am trying to set up the same protein support only
    gromacs 4.6. I am trying to run the code on these clusters and I get the
   following error:
  
  
   ---
   Program mdrun_mpi, VERSION 4.6.3
    Source code file: /home/gromacs-4.6.3/src/kernel/runner.c, line: 824
  
   Fatal error:
   OpenMP threads have been requested with cut-off scheme Group, but these
  are
   only
supported with cut-off scheme Verlet
   For more information and tips for troubleshooting, please check the
  GROMACS
   website at http://www.gromacs.org/Documentation/Errors
  
  
 
 -
  
   1. I wanted help with my mdp options to make it compatible.
    2. Since my previous calculations were based on gromacs 4.5.5, switching
  to
   gromacs 4.6, would that break the continuity of the run or would that
  bring
   about differences in the way the trajectories would be analysed?
  
  
    Below is my mdp file:
   title= production MD
   ; Run parameters
   integrator= md; leap-frog algorithm
   nsteps= ; 0.003 *  = 10 ps or 100 n
   dt= 0.003; 3 fs
   ; Output control
   nstxout= 0; save coordinates every 2 ps
   nstvout= 0; save velocities every 2 ps
   nstxtcout= 1000; xtc compressed trajectory output every 5
 ps
   nstenergy= 1000; save energies every 5 ps
   nstlog= 1000; update log file every 5 ps
   energygrps  = Protein ATP
   ; Bond parameters
   constraint_algorithm = lincs; holonomic constraints
   constraints= all-bonds; all bonds (even heavy atom-H bonds)
   constrained
   lincs_iter= 1; accuracy of LINCS
   lincs_order= 4; also related to accuracy
   ; Neighborsearching
   ns_type= grid; search neighboring grid cells
   nstlist= 5; 25 fs
   rlist= 1.0; short-range neighborlist cutoff (in nm)
   rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
   rvdw= 1.0; short-range van der Waals cutoff (in nm)
   rlistlong= 1.0; long-range neighborlist cutoff (in nm)
   ; Electrostatics
   coulombtype= PME; Particle Mesh Ewald for long-range
   electrostatics
   pme_order= 4; cubic interpolation
   fourierspacing= 0.16; grid spacing for FFT
   nstcomm = 10; remove com every 10 steps
   ; Temperature coupling is on
   tcoupl= V-rescale; 

Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Mark Abraham
On Mon, Mar 24, 2014 at 12:17 PM, Ankita Naithani
ankitanaith...@gmail.com wrote:

 Hi,

 I am trying to run a simulation of my protein (monomer ~500 residues). I
 had a few questions and errors regarding the same.
 I have previously run the simulation of the apo form of the same protein
 using Gromacs 4.5.5 which was available at the cluster facility I was using
 and also which is installed in my system. However, when I tried to run the
 holo form, I got this error:
 Fatal error:
 11 particles communicated to PME node 106 are more than 2/3 times the
 cut-off out of the domain decomposition cell of their charge group in
 dimension y.
 This usually means that your system is not well equilibrated.
 For more information and tips for troubleshooting, please check the GROMACS
 website at http://www.gromacs.org/Documentation/Errors

 This I figured out could be solved using a lower timestep as my previous
 timestep was 4fs and now I have reduced it to 3fs which should work fine
 now.


You should only need to do that kind of thing for equilibration, e.g. see
http://www.gromacs.org/Documentation/How-tos/Steps_to_Perform_a_Simulation


 However, after producing the tpr file for production run in my GROMACS
 4.5.5, I realised that the grant for the cluster facility is over and the
 new clusters on which I am trying to set up the same protein support only
 gromacs 4.6. I am trying to run the code on these clusters and I get the
 following error:


 ---
 Program mdrun_mpi, VERSION 4.6.3
 Source code file: /home/gromacs-4.6.3/src/kernel/runner.c, line: 824

 Fatal error:
 OpenMP threads have been requested with cut-off scheme Group, but these are
 only
  supported with cut-off scheme Verlet
 For more information and tips for troubleshooting, please check the GROMACS
 website at http://www.gromacs.org/Documentation/Errors

 -

 1. I wanted help with my mdp options to make it compatible.


If you want to run with the group scheme, use cutoff-scheme = group,
control your job script to use an MPI rank per core, and do not attempt to
use OpenMP.
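
For example, a rough sketch of that in a job script (the binary name, core
count and file names are placeholders for whatever your cluster actually
provides):

  # one MPI rank per core, OpenMP explicitly disabled
  mpirun -np 16 mdrun_mpi -ntomp 1 -s topol.tpr -deffnm md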

2. Since my previous calculations were based on gromacs 4.5.5, switching to
 gromacs 4.6, would that break the continuity of the run or would that bring
 about differences in the way the trajectories would be analysed?


Many things are different, but if you're confident no relevant bugs were
fixed, you can use 4.6.x to do the same things as 4.5.5 could do. But since
such runs would not have
http://www.gromacs.org/Documentation/Terminology/Reproducibility, they
will not have continuity either.




 Below is my mdp file:
 title= production MD
 ; Run parameters
 integrator= md; leap-frog algorithm
 nsteps= ; 0.003 *  = 10 ps or 100 n
 dt= 0.003; 3 fs
 ; Output control
 nstxout= 0; save coordinates every 2 ps
 nstvout= 0; save velocities every 2 ps
 nstxtcout= 1000; xtc compressed trajectory output every 5 ps
 nstenergy= 1000; save energies every 5 ps
 nstlog= 1000; update log file every 5 ps
 energygrps  = Protein ATP
 ; Bond parameters
 constraint_algorithm = lincs; holonomic constraints
 constraints= all-bonds; all bonds (even heavy atom-H bonds)
 constrained
 lincs_iter= 1; accuracy of LINCS
 lincs_order= 4; also related to accuracy
 ; Neighborsearching
 ns_type= grid; search neighboring grid cells
 nstlist= 5; 25 fs
 rlist= 1.0; short-range neighborlist cutoff (in nm)
 rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
 rvdw= 1.0; short-range van der Waals cutoff (in nm)
 rlistlong= 1.0; long-range neighborlist cutoff (in nm)
 ; Electrostatics
 coulombtype= PME; Particle Mesh Ewald for long-range
 electrostatics
 pme_order= 4; cubic interpolation
 fourierspacing= 0.16; grid spacing for FFT
 nstcomm = 10; remove com every 10 steps
 ; Temperature coupling is on
 tcoupl= V-rescale; modified Berendsen thermostat
 tc-grps= Protein Non-Protein; two coupling groups - more
 accurate
 tau_t= 0.1 0.1; time constant, in ps
 ref_t= 318 318; reference temperature, one for each group,
 in K
 ; Pressure coupling is off
 pcoupl  = berendsen; Berendsen thermostat
 pcoupltype= isotropic; uniform scaling of box vectors
 tau_p= 1.0; time constant, in ps
 ref_p= 1.0; reference pressure, in bar
 compressibility = 4.5e-5; isothermal compressibility of water, bar^-1
 ; Periodic boundary conditions
 pbc= xyz; 3-D PBC
 ; Dispersion correction
 DispCorr= EnerPres; account for cut-off vdW scheme
 ; Velocity 

Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Mark Abraham
On Mon, Mar 24, 2014 at 12:48 PM, Ankita Naithani
ankitanaith...@gmail.com wrote:

 Hi Pavan,
 Thank you for your response. I am trying to generate the tpr file with the
 following parameters:
 ; Neighborsearching
  ns_type= grid; search neighboring grid cells
  nstlist= 5; 25 fs
  rlist= 1.0; short-range neighborlist cutoff (in nm)
  rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
  rvdw= 1.0; short-range van der Waals cutoff (in nm)
  rlistlong= 1.0; long-range neighborlist cutoff (in nm)
 cutoff-scheme = Verlet

 But I get a warning: Unknown left-hand 'cutoff-scheme' in parameter
 file.


If you want to prepare a run for GROMACS 4.6, use a 4.6 version of grompp!
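
One way to check which grompp you are actually picking up on the new cluster
(the module name below is only an example; your site may provide a GMXRC to
source instead):

  module load gromacs/4.6.3   # or: source /path/to/gromacs-4.6.3/bin/GMXRC
  which grompp
  grompp -version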

Mark




 On Mon, Mar 24, 2014 at 11:26 AM, Pavan Kumar kumar.pavan...@gmail.com
 wrote:

  Hello Ankita
  You have to just include the following line in your mdp file
  cutoff-scheme=Verlet
   And run your grompp with the modified mdp file to generate the tpr file and
 then
  mdrun.
  Hope this doesn't give you the same error
 
 
  On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
   ankitanaith...@gmail.com wrote:
 
   Hi,
  
   I am trying to run a simulation of my protein (monomer ~500 residues).
 I
   had a few questions and errors regarding the same.
   I have previously run the simulation of the apo form of the same
 protein
   using Gromacs 4.5.5 which was available at the cluster facility I was
  using
   and also which is installed in my system. However, when I tried to run
  the
    holo form, I got this error:
   Fatal error:
   11 particles communicated to PME node 106 are more than 2/3 times the
   cut-off out of the domain decomposition cell of their charge group in
   dimension y.
   This usually means that your system is not well equilibrated.
   For more information and tips for troubleshooting, please check the
  GROMACS
   website at http://www.gromacs.org/Documentation/Errors
  
   This I figured out could be solved using a lower timestep as my
 previous
   timestep was 4fs and now I have reduced it to 3fs which should work
 fine
   now.
   However, after producing the tpr file for production run in my GROMACS
   4.5.5, I realised that the grant for the cluster facility is over and
 the
   new clusters on which I am trying to set up the same protein support only
   gromacs 4.6. I am trying to run the code on these clusters and I get the
   following error:
  
  
   ---
   Program mdrun_mpi, VERSION 4.6.3
    Source code file: /home/gromacs-4.6.3/src/kernel/runner.c, line: 824
  
   Fatal error:
   OpenMP threads have been requested with cut-off scheme Group, but these
  are
   only
supported with cut-off scheme Verlet
   For more information and tips for troubleshooting, please check the
  GROMACS
   website at http://www.gromacs.org/Documentation/Errors
  
  
 
 -
  
   1. I wanted help with my mdp options to make it compatible.
    2. Since my previous calculations were based on gromacs 4.5.5, switching
  to
   gromacs 4.6, would that break the continuity of the run or would that
  bring
   about differences in the way the trajectories would be analysed?
  
  
    Below is my mdp file:
   title= production MD
   ; Run parameters
   integrator= md; leap-frog algorithm
   nsteps= ; 0.003 *  = 10 ps or 100 n
   dt= 0.003; 3 fs
   ; Output control
   nstxout= 0; save coordinates every 2 ps
   nstvout= 0; save velocities every 2 ps
   nstxtcout= 1000; xtc compressed trajectory output every 5
 ps
   nstenergy= 1000; save energies every 5 ps
   nstlog= 1000; update log file every 5 ps
   energygrps  = Protein ATP
   ; Bond parameters
   constraint_algorithm = lincs; holonomic constraints
   constraints= all-bonds; all bonds (even heavy atom-H bonds)
   constrained
   lincs_iter= 1; accuracy of LINCS
   lincs_order= 4; also related to accuracy
   ; Neighborsearching
   ns_type= grid; search neighboring grid cells
   nstlist= 5; 25 fs
   rlist= 1.0; short-range neighborlist cutoff (in nm)
   rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
   rvdw= 1.0; short-range van der Waals cutoff (in nm)
   rlistlong= 1.0; long-range neighborlist cutoff (in nm)
   ; Electrostatics
   coulombtype= PME; Particle Mesh Ewald for long-range
   electrostatics
   pme_order= 4; cubic interpolation
   fourierspacing= 0.16; grid spacing for FFT
   nstcomm = 10; remove com every 10 steps
   ; Temperature coupling is on
   tcoupl= V-rescale; modified Berendsen thermostat
   tc-grps= 

Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Justin Lemkul



On 3/24/14, 7:48 AM, Ankita Naithani wrote:

Hi Pavan,
Thank you for your response. I am trying to generate the tpr file with the
following parameters:
; Neighborsearching
  ns_type= grid; search neighboring grid cells
  nstlist= 5; 25 fs
  rlist= 1.0; short-range neighborlist cutoff (in nm)
  rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
  rvdw= 1.0; short-range van der Waals cutoff (in nm)
  rlistlong= 1.0; long-range neighborlist cutoff (in nm)
cutoff-scheme = Verlet

But I get a warning: Unknown left-hand 'cutoff-scheme' in parameter
file.



That option was introduced in version 4.6.  It won't work with an earlier 
version.

-Justin



On Mon, Mar 24, 2014 at 11:26 AM, Pavan Kumar kumar.pavan...@gmail.com wrote:


Hello Ankita
You have to just include the following line in your mdp file
cutoff-scheme=Verlet
And run your grompp with the modified mdp file to generate the tpr file and then
mdrun.
Hope this doesn't give you the same error


On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
ankitanaith...@gmail.com wrote:


Hi,

I am trying to run a simulation of my protein (monomer ~500 residues). I
had a few questions and errors regarding the same.
I have previously run the simulation of the apo form of the same protein
using Gromacs 4.5.5 which was available at the cluster facility I was using
and also which is installed in my system. However, when I tried to run the
holo form, I got this error:
Fatal error:
11 particles communicated to PME node 106 are more than 2/3 times the
cut-off out of the domain decomposition cell of their charge group in
dimension y.
This usually means that your system is not well equilibrated.
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors

This I figured out could be solved using a lower timestep as my previous
timestep was 4fs and now I have reduced it to 3fs which should work fine
now.
However, after producing the tpr file for production run in my GROMACS
4.5.5, I realised that the grant for the cluster facility is over and the
new clusters on which I am trying to set up the same protein support only
gromacs 4.6. I am trying to run the code on these clusters and I get the
following error:


---
Program mdrun_mpi, VERSION 4.6.3
Source code file: /home/gromacs-4.6.3/src/kernel/runner.c, line: 824

Fatal error:
OpenMP threads have been requested with cut-off scheme Group, but these are
only supported with cut-off scheme Verlet
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors



-


1. I wanted help with my mdp options to make it compatible.
2. Since my previous calculations were based on gromacs 4.5.5, switching to
gromacs 4.6, would that break the continuity of the run or would that bring
about differences in the way the trajectories would be analysed?


Below is my mdp file:
title= production MD
; Run parameters
integrator= md; leap-frog algorithm
nsteps= ; 0.003 *  = 10 ps or 100 n
dt= 0.003; 3 fs
; Output control
nstxout= 0; save coordinates every 2 ps
nstvout= 0; save velocities every 2 ps
nstxtcout= 1000; xtc compressed trajectory output every 5 ps
nstenergy= 1000; save energies every 5 ps
nstlog= 1000; update log file every 5 ps
energygrps  = Protein ATP
; Bond parameters
constraint_algorithm = lincs; holonomic constraints
constraints= all-bonds; all bonds (even heavy atom-H bonds)
constrained
lincs_iter= 1; accuracy of LINCS
lincs_order= 4; also related to accuracy
; Neighborsearching
ns_type= grid; search neighboring grid cells
nstlist= 5; 25 fs
rlist= 1.0; short-range neighborlist cutoff (in nm)
rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
rvdw= 1.0; short-range van der Waals cutoff (in nm)
rlistlong= 1.0; long-range neighborlist cutoff (in nm)
; Electrostatics
coulombtype= PME; Particle Mesh Ewald for long-range
electrostatics
pme_order= 4; cubic interpolation
fourierspacing= 0.16; grid spacing for FFT
nstcomm = 10; remove com every 10 steps
; Temperature coupling is on
tcoupl= V-rescale; modified Berendsen thermostat
tc-grps= Protein Non-Protein; two coupling groups - more
accurate
tau_t= 0.1 0.1; time constant, in ps
ref_t= 318 318; reference temperature, one for each group, in K
; Pressure coupling is off
pcoupl  = berendsen; Berendsen thermostat
pcoupltype= 

Re: [gmx-users] MPI error in gromacs 4.6, more Errors

2014-03-24 Thread Justin Lemkul



On 3/24/14, 7:57 AM, Ankita Naithani wrote:

Hi, so I modified my mdp file which now looks like the following:

title= production MD
; Run parameters
integrator= md; leap-frog algorithm
;nsteps= 2000; 0.005 * 2000 = 10 ps or 100 ns
;nsteps= 20; 0.005 * 20 = 1 ns
;dt= 0.005; 5 fs
nsteps= ; 0.003 *  = 10 ps or 100 n
dt= 0.003; 3 fs
; Output control
nstxout= 0; save coordinates every 2 ps
nstvout= 0; save velocities every 2 ps
nstxtcout= 1000; xtc compressed trajectory output every 5 ps
nstenergy= 1000; save energies every 5 ps
nstlog= 1000; update log file every 5 ps
; Bond parameters
constraint_algorithm = lincs; holonomic constraints
constraints= all-bonds; all bonds (even heavy atom-H bonds)
constrained
lincs_iter= 1; accuracy of LINCS
lincs_order= 4; also related to accuracy
; Neighborsearching
ns_type= grid; search neighboring grid cells
nstlist= 5; 25 fs
rlist= 1.0; short-range neighborlist cutoff (in nm)
rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
rvdw= 1.0; short-range van der Waals cutoff (in nm)
rlistlong= 1.0; long-range neighborlist cutoff (in nm)
cutoff-scheme   = Verlet
; Electrostatics
coulombtype= PME; Particle Mesh Ewald for long-range
electrostatics
pme_order= 4; cubic interpolation
fourierspacing= 0.16; grid spacing for FFT
nstcomm = 10; remove com every 10 steps
; Temperature coupling is on
tcoupl= V-rescale; modified Berendsen thermostat
tc-grps= Protein Non-Protein; two coupling groups - more
accurate
tau_t= 0.1 0.1; time constant, in ps
ref_t= 318 318; reference temperature, one for each group,
in K
; Pressure coupling is off
pcoupl  = berendsen; Berendsen thermostat
pcoupltype= isotropic; uniform scaling of box vectors
tau_p= 1.0; time constant, in ps
ref_p= 1.0; reference pressure, in bar
compressibility = 4.5e-5; isothermal compressibility of water, bar^-1
; Periodic boundary conditions
pbc= xyz; 3-D PBC
; Dispersion correction
DispCorr= EnerPres; account for cut-off vdW scheme
; Velocity generation
gen_vel= yes; Velocity generation is on
gen_temp= 318; reference temperature, for protein in K
--


But, when I try to generate the tpr file on the cluster itself using
gromacs 4.6.3, I get the following error:


NOTE 1 [file md3.mdp]:
   With Verlet lists the optimal nstlist is >= 10, with GPUs >= 20. Note
   that with the Verlet scheme, nstlist has no effect on the accuracy of
   your simulation.


NOTE 2 [file md3.mdp]:
   nstcomm < nstcalcenergy defeats the purpose of nstcalcenergy, setting
   nstcomm to nstcalcenergy

Generated 3403 of the 3403 non-bonded parameter combinations
Generating 1-4 interactions: fudge = 0.5
Generated 3403 of the 3403 1-4 parameter combinations
Segmentation fault

Can anyone please suggest further?



Do as the notes suggest.  They're not fatal errors, they're just cautionary. 
You should probably educate yourself a bit further on what all of these 
algorithms are by taking a look at 
http://www.gromacs.org/Documentation/Cut-off_schemes.  The Verlet scheme is not 
mandatory, but it is required by the type of parallelization you requested.
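
As an illustration only, addressing the two notes could look like this in the
.mdp; the nstcalcenergy value of 100 is an assumed default, not something
taken from your file:

  nstlist  = 10    ; NOTE 1: with Verlet lists, >= 10 is optimal
  nstcomm  = 100   ; NOTE 2: keep nstcomm >= nstcalcenergy (assumed 100)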


Reviewers may question the changes in version and cutoff methods when critiquing 
your work, so be aware of that.  Also, the instability you are seeing is 
probably a result of the large time step, unless you are using virtual sites.


-Justin

--
==

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==


Re: [gmx-users] MPI error in gromacs 4.6, more Errors

2014-03-24 Thread Mark Abraham
The segmentation fault is highly unusual, and suggests that the
installation of gromacs used a shared library that has now
migrated/changed/whatever. I suggest you discuss that with your system
admins and ask them to re-install, or re-run the GROMACS regression tests,
to check things are OK.
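
If it helps, a rough sketch of re-running the regression tests for a 4.6.x
install (the paths and tarball name are placeholders; "make check" from the
build directory is the other common route):

  source /path/to/gromacs-4.6.3/bin/GMXRC
  tar xzf regressiontests-4.6.3.tar.gz
  cd regressiontests-4.6.3
  perl gmxtest.pl simple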

Mark


On Mon, Mar 24, 2014 at 2:13 PM, Ankita Naithani
ankitanaith...@gmail.com wrote:

 Hi Justin,

 Thank you very much for your reply. I shall try to work my way around and
 see.


 Kind regards,

 Ankita


 On Mon, Mar 24, 2014 at 12:12 PM, Justin Lemkul jalem...@vt.edu wrote:

 
 
  On 3/24/14, 7:57 AM, Ankita Naithani wrote:
 
  Hi, so I modified my mdp file which now looks like the following:
 
  title= production MD
  ; Run parameters
  integrator= md; leap-frog algorithm
  ;nsteps= 2000; 0.005 * 2000 = 10 ps or 100 ns
  ;nsteps= 20; 0.005 * 20 = 1 ns
  ;dt= 0.005; 5 fs
  nsteps= ; 0.003 *  = 10 ps or 100 n
  dt= 0.003; 3 fs
  ; Output control
  nstxout= 0; save coordinates every 2 ps
  nstvout= 0; save velocities every 2 ps
  nstxtcout= 1000; xtc compressed trajectory output every 5 ps
  nstenergy= 1000; save energies every 5 ps
  nstlog= 1000; update log file every 5 ps
  ; Bond parameters
  constraint_algorithm = lincs; holonomic constraints
  constraints= all-bonds; all bonds (even heavy atom-H bonds)
  constrained
  lincs_iter= 1; accuracy of LINCS
  lincs_order= 4; also related to accuracy
  ; Neighborsearching
  ns_type= grid; search neighboring grid cells
  nstlist= 5; 25 fs
  rlist= 1.0; short-range neighborlist cutoff (in nm)
  rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
  rvdw= 1.0; short-range van der Waals cutoff (in nm)
  rlistlong= 1.0; long-range neighborlist cutoff (in nm)
  cutoff-scheme   = Verlet
  ; Electrostatics
  coulombtype= PME; Particle Mesh Ewald for long-range
  electrostatics
  pme_order= 4; cubic interpolation
  fourierspacing= 0.16; grid spacing for FFT
  nstcomm = 10; remove com every 10 steps
  ; Temperature coupling is on
  tcoupl= V-rescale; modified Berendsen thermostat
  tc-grps= Protein Non-Protein; two coupling groups - more
  accurate
  tau_t= 0.1 0.1; time constant, in ps
  ref_t= 318 318; reference temperature, one for each
 group,
  in K
  ; Pressure coupling is off
  pcoupl  = berendsen; Berendsen thermostat
  pcoupltype= isotropic; uniform scaling of box vectors
  tau_p= 1.0; time constant, in ps
  ref_p= 1.0; reference pressure, in bar
  compressibility = 4.5e-5; isothermal compressibility of water,
 bar^-1
  ; Periodic boundary conditions
  pbc= xyz; 3-D PBC
  ; Dispersion correction
  DispCorr= EnerPres; account for cut-off vdW scheme
  ; Velocity generation
  gen_vel= yes; Velocity generation is on
  gen_temp= 318; reference temperature, for protein in K
  --
 
 
  But, when I try to generate the tpr file on the cluster itself using
  gromacs 4.6.3, I get the following error:
 
 
  NOTE 1 [file md3.mdp]:
  With Verlet lists the optimal nstlist is >= 10, with GPUs >= 20. Note
 that with the Verlet scheme, nstlist has no effect on the accuracy of
 your simulation.
 
 
  NOTE 2 [file md3.mdp]:
  nstcomm < nstcalcenergy defeats the purpose of nstcalcenergy, setting
 nstcomm to nstcalcenergy
 
  Generated 3403 of the 3403 non-bonded parameter combinations
  Generating 1-4 interactions: fudge = 0.5
  Generated 3403 of the 3403 1-4 parameter combinations
  Segmentation fault
 
  Can anyone please suggest further?
 
 
  Do as the notes suggest.  They're not fatal errors, they're just
  cautionary. You should probably educate yourself a bit further on what
 all
  of these algorithms are by taking a look at
  http://www.gromacs.org/Documentation/Cut-off_schemes.  The Verlet scheme is not mandatory, but
  it is required by the type of parallelization you requested.
 
  Reviewers may question the changes in version and cutoff methods when
  critiquing your work, so be aware of that.  Also, the instability you are
  seeing is probably a result of the large time step, unless you are using
  virtual sites.
 
  -Justin
 
  --
  ==
 
  Justin A. Lemkul, Ph.D.
  Ruth L. Kirschstein NRSA Postdoctoral Fellow
 
  Department of Pharmaceutical Sciences
  School of Pharmacy
  Health Sciences Facility II, Room 601
  University of Maryland, Baltimore
  20 Penn St.
  Baltimore, MD 21201
 
  jalem...@outerbanks.umaryland.edu