[gmx-users] amber ff gromacs

2014-03-24 Thread virk
Dear People,

I am trying to simulate a glycine molecule using the AMBER force field. I have
prefixed the residue names in the pdb file with N and C for the termini:
ATOM      1  N   NGLY    1      -1.476   0.232   0.252  1.00  0.00
ATOM      2  CA  CGLY    1      -0.012   0.296   0.348  1.00  0.00
ATOM      3  C   CGLY    1       0.596  -0.652  -0.648  1.00  0.00
ATOM      4  O   GLY     1      -0.124  -1.320  -1.368  1.00  0.00
ATOM      5  OXT GLY     1       1.916  -0.760  -0.740  1.00  0.00
ATOM      6  H3  GLY     1      -1.736  -0.252  -0.592  1.00  0.00
ATOM      7  HA2 GLY     1       0.292   0.020   1.364  1.00  0.00
ATOM      8  HA1 GLY     1       0.320   1.320   0.132  1.00  0.00
ATOM      9  H   GLY     1      -1.636  -0.236   1.132  1.00  0.00
ATOM     10  H   GLY     1      -1.916   1.136   0.272  1.00  0.00
TER      11      GLY     1
END

But when I run pdb2gmx, I get this error:

Residue 1 named NGLY of a molecule in the input file was mapped to an entry
in the topology database, but the atom CA used in that entry is not found in
the input file.

The atom CA is in the input pdb file, and with the same name in the AMBER .rtp
file too. If I remove the N and C prefixes from the pdb file, it works with all
the other force fields.

I will be thankful for your help and time.

Kind Regards,
Amninder Virk
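For comparison, a minimal sketch of the same coordinates written so that every
atom of residue 1 carries one residue name, which is what pdb2gmx needs in order
to map the whole residue onto a single .rtp entry (whether that name should be
GLY, NGLY or CGLY for a free amino acid depends on what the AMBER port's .rtp
actually defines, so treat this as an assumption to check, not a tested fix):

ATOM      1  N   GLY     1      -1.476   0.232   0.252  1.00  0.00
ATOM      2  CA  GLY     1      -0.012   0.296   0.348  1.00  0.00
ATOM      3  C   GLY     1       0.596  -0.652  -0.648  1.00  0.00
...              (remaining ATOM records unchanged apart from the residue name)
TER
END

pdb2gmx -f glycine.pdb -ff amber99sb -water tip3p -ter   # file and force-field names are placeholders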
 



Re: [gmx-users] refcoord-scaling

2014-03-24 Thread Mark Abraham
On Mar 24, 2014 1:10 AM, Chetan Mahajan chetanv...@gmail.com wrote:

 Dear all:

 I am trying to get a simulation of water solvated titanium oxide running.
 When 'all' option is used for refcoord-scaling, simulation runs ok.
 However, when 'com' option is used for refcoord-scaling, simulation
crashes
 with any of the following errors. Could anyone explain to me why is this
 happening

See the description of com. You didn't tell us what you were restraining,
so it's hard to help. But I can see multiple COMs of TiO2 and PBC not
working well together, particularly if your box size is far from ideal.

 or when each of the options such as 'all', 'com' and 'no' is used?

When you really care about your starting position and need to equilibrate
in NPT.

Mark
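
For orientation, a minimal sketch of where this setting sits in an .mdp file
when position restraints are used together with pressure coupling (the define
flag and the barostat are generic placeholders, not taken from this thread):

define           = -DPOSRES           ; switch the position restraints on
refcoord-scaling = com                ; scale only the COM of the reference positions with the box;
                                      ; 'all' scales every reference coordinate, 'no' keeps them fixed
pcoupl           = Parrinello-Rahman  ; any barostat makes this choice relevant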

 Thanks a lot!
 regards
 Chetan


 Errors:

 X particles communicated to PME node Y are more than a cell length out of
 the domain decomposition cell of their charge group

 This is another way that mdrun tells you your system is blowing up
 (http://www.gromacs.org/Documentation/Terminology/Blowing_Up).
 In GROMACS version 4.0, domain decomposition was introduced to divide the
 system into regions containing nearby atoms (for more details, see the
 manual (http://www.gromacs.org/Documentation/Manual) or the GROMACS 4
 paper (http://dx.doi.org/10.1021/ct700301q)).
 If you have particles that are flying across the system, you will get this
 fatal error. The message indicates that some piece of your system is
 tearing apart (hence "out of the cell of their charge group"). Refer
 to the Blowing Up page
 (http://www.gromacs.org/Documentation/Terminology/Blowing_Up) for
 advice on how to fix this issue.


 A charge group moved too far between two domain decomposition steps.


Re: [gmx-users] Add atom to .itp file

2014-03-24 Thread Mark Abraham
I don't understand your description.

Mark
On Mar 24, 2014 4:09 AM, deep satyadeep.r...@gmail.com wrote:

 I want to create a polarizable particle system consisting of a sphere
 carrying charge q with some size r1, on which an oppositely charged sphere
 of charge -q and some other size r2 is present. Overall I want to create a
 colloidal particle. Can anyone please help me in making these spheres
 (atoms) in an itp file? I want to write the entries but I don't know
 exactly what I have to write.

 Thanks



Re: [gmx-users] Error: illegal instruction (core dumped)

2014-03-24 Thread ooker
@Mark Abraham: I received your email, but because I don't see it in the
thread I think you deleted your answer. Sorry about that; I'm still new to
the mailing list. Anyway, I have already deleted the build folder and done a
fresh install, but that doesn't solve it.

@Justin Lemkul: my machine is an Intel Pentium P6200 (2.13 GHz, 3 MB L3
cache). Any cmake command leads to the same error, so if you want a
specific command, let's use the simplest one: cmake ..

I know that I optimized it wrong, but how can I fix that?
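
In case the crash really is an unsupported instruction set, a minimal sketch of
forcing a more conservative SIMD level at configure time (GMX_CPU_ACCELERATION
is the 4.6-era CMake option; the SSE2 value and the directory are assumptions to
check against what /proc/cpuinfo reports for this CPU):

cd ~/gromacs/build                        # hypothetical, empty build directory
cmake .. -DGMX_CPU_ACCELERATION=SSE2      # or SSE4.1 if cpuinfo lists sse4_1
make
sudo make install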


On Mon, Mar 24, 2014 at 6:11 AM, Justin Lemkul [via GROMACS] 
ml-node+s5086n5015337...@n6.nabble.com wrote:



 On 3/23/14, 11:57 AM, ooker wrote:
  sorry for spamming, but I need to up this thread. I'll delete this after
  someone answer. Sorry.
 

 You'll need to provide exact details of your hardware and your exact cmake
 command, otherwise there's nothing productive to suggest.  The build
 machinery
 is good at picking the correct optimization type, so the only real issue
 is if
 you're selecting an optimization that is not supported.

 -Justin

 --
 ==

 Justin A. Lemkul, Ph.D.
 Ruth L. Kirschstein NRSA Postdoctoral Fellow

 Department of Pharmaceutical Sciences
 School of Pharmacy
 Health Sciences Facility II, Room 601
 University of Maryland, Baltimore
 20 Penn St.
 Baltimore, MD 21201

 [hidden email] | (410) 706-7441
 http://mackerell.umaryland.edu/~jalemkul

 ==





Re: [gmx-users] free energy of removal of a molecule from the bulk

2014-03-24 Thread Valentina
Thank you Justin.

This is what I have tried, but obviously gone wrong with [settles]. I'll try
again. 

V



Re: [gmx-users] Add atom to .itp file

2014-03-24 Thread Satyadeep Roat
I want to make a new itp file and add two atoms in it.

Something like this

[ atoms ]
;  nr  type  charge  size
    1   a1      q     r1
    2   a2     -q     r2
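
As a minimal sketch of what a complete two-site topology could look like (the
type names DP/DN, the masses, the 0.5 e charge and the bond holding the two
sites together are placeholders, not from this thread; the matching
[ atomtypes ] entries carrying the r1/r2 sizes would live in the force-field
files):

[ moleculetype ]
; name    nrexcl
COLL      1

[ atoms ]
;  nr  type  resnr  residue  atom  cgnr  charge   mass
    1    DP      1     COLL    P1     1   0.500  100.0
    2    DN      1     COLL    P2     1  -0.500  100.0

[ bonds ]
;  ai   aj  funct   b0 (nm)   kb (kJ mol^-1 nm^-2)
    1    2      1      0.20    1000.0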


On Mon, Mar 24, 2014 at 2:47 PM, Mark Abraham mark.j.abra...@gmail.comwrote:

 I don't understand your description.

 Mark
 On Mar 24, 2014 4:09 AM, deep satyadeep.r...@gmail.com wrote:

  I want to create a polarizable particle system consisting of a sphere
  carying
  charge q and some size r1 on which a oppositively charge -q is present
 with
  some other size r2 . Overall I want to create a colloidal particle . Can
  anyone please help me in making these sphere (atom ) in itp file ? I want
  to
  write some command and I don't know what exactly I have to write .
 
  Thanks
 




-- 
Satyadeep Roat
Undergraduate Student,
Department of Chemical Engineering, IIT Delhi
Email- satyadeep.r...@gmail.com
Mobile- +91 8527211721


Re: [gmx-users] Error: illegal instruction (core dumped)

2014-03-24 Thread Justin Lemkul



On 3/24/14, 5:32 AM, ooker wrote:

@Mark Abraham: I have received your email but because I don't see it on the
thread so I think you deleted your answer. Sorry about that. I'm still new
with the mail list. Anyway, I have already deleted the build folder and
make a fresh install but it doesn't solve.

@Justin Lemkul: my machine is Intel Pentium P6200 (2.13 GHz, 3 MB L3
cache). Any cmake command leads to the same error. So if you want a
specific command, let's use the simplist one:* cmake ..*

I know that I optimized it wrong, but how can I fix that?



It's still not clear.  Simply running cmake and nothing else relies on a lot 
of assumptions - does it work?  Does it give an error?  What level of 
optimization does cmake detect?


As for hardware, what is the output of `cat /proc/cpuinfo` and what C compiler 
are you using?


-Justin



On Mon, Mar 24, 2014 at 6:11 AM, Justin Lemkul [via GROMACS] 
ml-node+s5086n5015337...@n6.nabble.com wrote:




On 3/23/14, 11:57 AM, ooker wrote:

sorry for spamming, but I need to up this thread. I'll delete this after
someone answer. Sorry.



You'll need to provide exact details of your hardware and your exact cmake
command, otherwise there's nothing productive to suggest.  The build
machinery
is good at picking the correct optimization type, so the only real issue
is if
you're selecting an optimization that is not supported.

-Justin

--
==

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

[hidden email] | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==







--
==

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==


Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Pavan Kumar
Hello Ankita,
You just have to include the following line in your mdp file:
cutoff-scheme = Verlet
Then run grompp with the modified mdp file to generate the tpr file, and then
run mdrun.
Hope this doesn't give you the same error.


On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
ankitanaith...@gmail.comwrote:

 Hi,

 I am trying to run a simulation of my protein (monomer, ~500 residues). I
 had a few questions and errors regarding the same.
 I have previously run the simulation of the apo form of the same protein
 using Gromacs 4.5.5, which was available at the cluster facility I was using
 and which is also installed on my system. However, when I tried to run the
 holo form, I got this error:
 Fatal error:
 11 particles communicated to PME node 106 are more than 2/3 times the
 cut-off out of the domain decomposition cell of their charge group in
 dimension y.
 This usually means that your system is not well equilibrated.
 For more information and tips for troubleshooting, please check the GROMACS
 website at http://www.gromacs.org/Documentation/Errors

 This I figured out could be solved using a lower timestep, as my previous
 timestep was 4 fs; I have now reduced it to 3 fs, which should work fine
 now.
 However, after producing the tpr file for the production run in my GROMACS
 4.5.5, I realised that the grant for the cluster facility is over, and the
 new clusters on which I am trying to set up the same protein support only
 gromacs 4.6. When I try to run the code on these clusters, I get the
 following error:


 ---
 Program mdrun_mpi, VERSION 4.6.3
 Source code file: /home/gromacs-4.6.3/src/kernel/runner
 .c, line: 824

 Fatal error:
 OpenMP threads have been requested with cut-off scheme Group, but these are
 only
  supported with cut-off scheme Verlet
 For more information and tips for troubleshooting, please check the GROMACS
 website at http://www.gromacs.org/Documentation/Errors

 -

 1. I wanted help with my mdp options to make it compatible.
 2. Since my previous calculations were based on gromacs 4.5.5, would
 switching to gromacs 4.6 break the continuity of the run, or would it bring
 about differences in the way the trajectories would be analysed?


 Below, is my mdp file
 title= production MD
 ; Run parameters
 integrator= md; leap-frog algorithm
 nsteps= ; 0.003 *  = 10 ps or 100 n
 dt= 0.003; 3 fs
 ; Output control
 nstxout= 0; save coordinates every 2 ps
 nstvout= 0; save velocities every 2 ps
 nstxtcout= 1000; xtc compressed trajectory output every 5 ps
 nstenergy= 1000; save energies every 5 ps
 nstlog= 1000; update log file every 5 ps
 energygrps  = Protein ATP
 ; Bond parameters
 constraint_algorithm = lincs; holonomic constraints
 constraints= all-bonds; all bonds (even heavy atom-H bonds)
 constrained
 lincs_iter= 1; accuracy of LINCS
 lincs_order= 4; also related to accuracy
 ; Neighborsearching
 ns_type= grid; search neighboring grid cells
 nstlist= 5; 25 fs
 rlist= 1.0; short-range neighborlist cutoff (in nm)
 rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
 rvdw= 1.0; short-range van der Waals cutoff (in nm)
 rlistlong= 1.0; long-range neighborlist cutoff (in nm)
 ; Electrostatics
 coulombtype= PME; Particle Mesh Ewald for long-range
 electrostatics
 pme_order= 4; cubic interpolation
 fourierspacing= 0.16; grid spacing for FFT
 nstcomm = 10; remove com every 10 steps
 ; Temperature coupling is on
 tcoupl= V-rescale; modified Berendsen thermostat
 tc-grps= Protein Non-Protein; two coupling groups - more
 accurate
 tau_t= 0.1 0.1; time constant, in ps
 ref_t= 318 318; reference temperature, one for each group,
 in K
 ; Pressure coupling is off
 pcoupl  = berendsen; Berendsen thermostat
 pcoupltype= isotropic; uniform scaling of box vectors
 tau_p= 1.0; time constant, in ps
 ref_p= 1.0; reference pressure, in bar
 compressibility = 4.5e-5; isothermal compressibility of water, bar^-1
 ; Periodic boundary conditions
 pbc= xyz; 3-D PBC
 ; Dispersion correction
 DispCorr= EnerPres; account for cut-off vdW scheme
 ; Velocity generation
 gen_vel= yes; Velocity generation is on
 gen_temp= 318; reference temperature, for protein in K




 Kind regards,
 Ankita Naithani

Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Ankita Naithani
Hi Pavan,
Thank you for your response. I am trying to generate the tpr file with the
following parameters:
; Neighborsearching
 ns_type= grid; search neighboring grid cells
 nstlist= 5; 25 fs
 rlist= 1.0; short-range neighborlist cutoff (in nm)
 rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
 rvdw= 1.0; short-range van der Waals cutoff (in nm)
 rlistlong= 1.0; long-range neighborlist cutoff (in nm)
cutoff-scheme = Verlet

But I get a warning: Unknown left-hand 'cutoff-scheme' in parameter
file.


On Mon, Mar 24, 2014 at 11:26 AM, Pavan Kumar kumar.pavan...@gmail.comwrote:

 Hello Ankita
 You have to just include the following line in your mdp file
 cutoff-scheme=Verlet
 And run your grompp with the modfied mdp file to generate tpr file and then
 mdrun.
 Hope this doesn't give you the same error


 On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
 ankitanaith...@gmail.comwrote:

  Hi,
 
  I am trying to run a simulation of my protein (monomer ~500 residues). I
  had few questions and erors regarding the same.
  I have previously run the simulation of the apo form of the same protein
  using Gromacs 4.5.5 which was available at the cluster facility I was
 using
  and also which is installed in my system. However, when I tried to run
 the
  holo form, I got error :
  Fatal error:
  11 particles communicated to PME node 106 are more than 2/3 times the
  cut-off out of the domain decomposition cell of their charge group in
  dimension y.
  This usually means that your system is not well equilibrated.
  For more information and tips for troubleshooting, please check the
 GROMACS
  website at http://www.gromacs.org/Documentation/Errors
 
  This I figured out could be solved using a lower timestep as my previous
  timestep was 4fs and now I have reduced it to 3fs which should work fine
  now.
  However, after producing the tpr file for production run in my GROMACS
  4.5.5, I realised that the grant for the cluster facility is over and the
  new clusters which I am trying to set up the same protein for support
 only
  gromacs 4.6. I am trying to run the code in these clusters and I get he
  following error:
 
 
  ---
  Program mdrun_mpi, VERSION 4.6.3
  Source code file: /home/gromacs-4.6.3/src/kernel/runner
  .c, line: 824
 
  Fatal error:
  OpenMP threads have been requested with cut-off scheme Group, but these
 are
  only
   supported with cut-off scheme Verlet
  For more information and tips for troubleshooting, please check the
 GROMACS
  website at http://www.gromacs.org/Documentation/Errors
 
 
 -
 
  1. I wanted help with my mdp options to make it compatible.
  2. Since my pevious calculations were based on gromacs 4.5.5, switching
 to
  gromacs 4.6, would that break the continuity of the run or would that
 bring
  about differences in the way the trajectories would be analysed?
 
 
  Below, is my mdp file
  title= production MD
  ; Run parameters
  integrator= md; leap-frog algorithm
  nsteps= ; 0.003 *  = 10 ps or 100 n
  dt= 0.003; 3 fs
  ; Output control
  nstxout= 0; save coordinates every 2 ps
  nstvout= 0; save velocities every 2 ps
  nstxtcout= 1000; xtc compressed trajectory output every 5 ps
  nstenergy= 1000; save energies every 5 ps
  nstlog= 1000; update log file every 5 ps
  energygrps  = Protein ATP
  ; Bond parameters
  constraint_algorithm = lincs; holonomic constraints
  constraints= all-bonds; all bonds (even heavy atom-H bonds)
  constrained
  lincs_iter= 1; accuracy of LINCS
  lincs_order= 4; also related to accuracy
  ; Neighborsearching
  ns_type= grid; search neighboring grid cells
  nstlist= 5; 25 fs
  rlist= 1.0; short-range neighborlist cutoff (in nm)
  rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
  rvdw= 1.0; short-range van der Waals cutoff (in nm)
  rlistlong= 1.0; long-range neighborlist cutoff (in nm)
  ; Electrostatics
  coulombtype= PME; Particle Mesh Ewald for long-range
  electrostatics
  pme_order= 4; cubic interpolation
  fourierspacing= 0.16; grid spacing for FFT
  nstcomm = 10; remove com every 10 steps
  ; Temperature coupling is on
  tcoupl= V-rescale; modified Berendsen thermostat
  tc-grps= Protein Non-Protein; two coupling groups - more
  accurate
  tau_t= 0.10.1; time constant, in ps
  ref_t= 318 318; reference temperature, one for each
 group,
  in K
  ; Pressure coupling is off
  pcoupl  = berendsen; Berendsen thermostat
  

Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Pavan Kumar
It might be a typographical error.
Check the mdp file thoroughly. I think a semicolon is required for the last
line in your mdp file.


On Mon, Mar 24, 2014 at 5:18 PM, Ankita Naithani
ankitanaith...@gmail.comwrote:

 Hi Pavan,
 Thank you for your response. I am trying to generate the tpr file with the
 following parameter;
 ; Neighborsearching
  ns_type= grid; search neighboring grid cells
  nstlist= 5; 25 fs
  rlist= 1.0; short-range neighborlist cutoff (in nm)
  rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
  rvdw= 1.0; short-range van der Waals cutoff (in nm)
  rlistlong= 1.0; long-range neighborlist cutoff (in nm)
 cutoff-scheme = Verlet

 But, I get a warning of Unknown left-hand 'cutoff-scheme' in parameter
 file.


 On Mon, Mar 24, 2014 at 11:26 AM, Pavan Kumar kumar.pavan...@gmail.com
 wrote:

  Hello Ankita
  You have to just include the following line in your mdp file
  cutoff-scheme=Verlet
  And run your grompp with the modfied mdp file to generate tpr file and
 then
  mdrun.
  Hope this doesn't give you the same error
 
 
  On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
  ankitanaith...@gmail.comwrote:
 
   Hi,
  
   I am trying to run a simulation of my protein (monomer ~500 residues).
 I
   had few questions and erors regarding the same.
   I have previously run the simulation of the apo form of the same
 protein
   using Gromacs 4.5.5 which was available at the cluster facility I was
  using
   and also which is installed in my system. However, when I tried to run
  the
   holo form, I got error :
   Fatal error:
   11 particles communicated to PME node 106 are more than 2/3 times the
   cut-off out of the domain decomposition cell of their charge group in
   dimension y.
   This usually means that your system is not well equilibrated.
   For more information and tips for troubleshooting, please check the
  GROMACS
   website at http://www.gromacs.org/Documentation/Errors
  
   This I figured out could be solved using a lower timestep as my
 previous
   timestep was 4fs and now I have reduced it to 3fs which should work
 fine
   now.
   However, after producing the tpr file for production run in my GROMACS
   4.5.5, I realised that the grant for the cluster facility is over and
 the
   new clusters which I am trying to set up the same protein for support
  only
   gromacs 4.6. I am trying to run the code in these clusters and I get he
   following error:
  
  
   ---
   Program mdrun_mpi, VERSION 4.6.3
   Source code file: /home/gromacs-4.6.3/src/kernel/runner
   .c, line: 824
  
   Fatal error:
   OpenMP threads have been requested with cut-off scheme Group, but these
  are
   only
supported with cut-off scheme Verlet
   For more information and tips for troubleshooting, please check the
  GROMACS
   website at http://www.gromacs.org/Documentation/Errors
  
  
 
 -
  
   1. I wanted help with my mdp options to make it compatible.
   2. Since my pevious calculations were based on gromacs 4.5.5, switching
  to
   gromacs 4.6, would that break the continuity of the run or would that
  bring
   about differences in the way the trajectories would be analysed?
  
  
   Below, is my mdp file
   title= production MD
   ; Run parameters
   integrator= md; leap-frog algorithm
   nsteps= ; 0.003 *  = 10 ps or 100 n
   dt= 0.003; 3 fs
   ; Output control
   nstxout= 0; save coordinates every 2 ps
   nstvout= 0; save velocities every 2 ps
   nstxtcout= 1000; xtc compressed trajectory output every 5
 ps
   nstenergy= 1000; save energies every 5 ps
   nstlog= 1000; update log file every 5 ps
   energygrps  = Protein ATP
   ; Bond parameters
   constraint_algorithm = lincs; holonomic constraints
   constraints= all-bonds; all bonds (even heavy atom-H bonds)
   constrained
   lincs_iter= 1; accuracy of LINCS
   lincs_order= 4; also related to accuracy
   ; Neighborsearching
   ns_type= grid; search neighboring grid cells
   nstlist= 5; 25 fs
   rlist= 1.0; short-range neighborlist cutoff (in nm)
   rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
   rvdw= 1.0; short-range van der Waals cutoff (in nm)
   rlistlong= 1.0; long-range neighborlist cutoff (in nm)
   ; Electrostatics
   coulombtype= PME; Particle Mesh Ewald for long-range
   electrostatics
   pme_order= 4; cubic interpolation
   fourierspacing= 0.16; grid spacing for FFT
   nstcomm = 10; remove com every 10 steps
   ; Temperature coupling is on
   tcoupl= V-rescale; 

Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Mark Abraham
On Mon, Mar 24, 2014 at 12:17 PM, Ankita Naithani
ankitanaith...@gmail.comwrote:

 Hi,

 I am trying to run a simulation of my protein (monomer ~500 residues). I
 had few questions and erors regarding the same.
 I have previously run the simulation of the apo form of the same protein
 using Gromacs 4.5.5 which was available at the cluster facility I was using
 and also which is installed in my system. However, when I tried to run the
 holo form, I got error :
 Fatal error:
 11 particles communicated to PME node 106 are more than 2/3 times the
 cut-off out of the domain decomposition cell of their charge group in
 dimension y.
 This usually means that your system is not well equilibrated.
 For more information and tips for troubleshooting, please check the GROMACS
 website at http://www.gromacs.org/Documentation/Errors

 This I figured out could be solved using a lower timestep as my previous
 timestep was 4fs and now I have reduced it to 3fs which should work fine
 now.


You should only need to do that kind of thing for equilibration; e.g. see
http://www.gromacs.org/Documentation/How-tos/Steps_to_Perform_a_Simulation


 However, after producing the tpr file for production run in my GROMACS
 4.5.5, I realised that the grant for the cluster facility is over and the
 new clusters which I am trying to set up the same protein for support only
 gromacs 4.6. I am trying to run the code in these clusters and I get he
 following error:


 ---
 Program mdrun_mpi, VERSION 4.6.3
 Source code file: /home/gromacs-4.6.3/src/kernel/runner
 .c, line: 824

 Fatal error:
 OpenMP threads have been requested with cut-off scheme Group, but these are
 only
  supported with cut-off scheme Verlet
 For more information and tips for troubleshooting, please check the GROMACS
 website at http://www.gromacs.org/Documentation/Errors

 -

 1. I wanted help with my mdp options to make it compatible.


If you want to run with the group scheme, use cutoff-scheme = group,
control your job script to use an MPI rank per core, and do not attempt to
use OpenMP.
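
A minimal sketch of what that looks like when launching the job (the rank count,
binary name and file prefix are placeholders for whatever the cluster provides):

mpirun -np 64 mdrun_mpi -ntomp 1 -deffnm md3   # one MPI rank per core, no OpenMP threads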

2. Since my pevious calculations were based on gromacs 4.5.5, switching to
 gromacs 4.6, would that break the continuity of the run or would that bring
 about differences in the way the trajectories would be analysed?


Many things are different, but if you're confident no relevant bugs were
fixed, you can use 4.6.x to do the same things as 4.5.5 could do. But since
such runs would not have reproducibility
(http://www.gromacs.org/Documentation/Terminology/Reproducibility), they
will not have continuity either.




 Below, is my mdp file
 title= production MD
 ; Run parameters
 integrator= md; leap-frog algorithm
 nsteps= ; 0.003 *  = 10 ps or 100 n
 dt= 0.003; 3 fs
 ; Output control
 nstxout= 0; save coordinates every 2 ps
 nstvout= 0; save velocities every 2 ps
 nstxtcout= 1000; xtc compressed trajectory output every 5 ps
 nstenergy= 1000; save energies every 5 ps
 nstlog= 1000; update log file every 5 ps
 energygrps  = Protein ATP
 ; Bond parameters
 constraint_algorithm = lincs; holonomic constraints
 constraints= all-bonds; all bonds (even heavy atom-H bonds)
 constrained
 lincs_iter= 1; accuracy of LINCS
 lincs_order= 4; also related to accuracy
 ; Neighborsearching
 ns_type= grid; search neighboring grid cells
 nstlist= 5; 25 fs
 rlist= 1.0; short-range neighborlist cutoff (in nm)
 rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
 rvdw= 1.0; short-range van der Waals cutoff (in nm)
 rlistlong= 1.0; long-range neighborlist cutoff (in nm)
 ; Electrostatics
 coulombtype= PME; Particle Mesh Ewald for long-range
 electrostatics
 pme_order= 4; cubic interpolation
 fourierspacing= 0.16; grid spacing for FFT
 nstcomm = 10; remove com every 10 steps
 ; Temperature coupling is on
 tcoupl= V-rescale; modified Berendsen thermostat
 tc-grps= Protein Non-Protein; two coupling groups - more
 accurate
 tau_t= 0.10.1; time constant, in ps
 ref_t= 318 318; reference temperature, one for each group,
 in K
 ; Pressure coupling is off
 pcoupl  = berendsen; Berendsen thermostat
 pcoupltype= isotropic; uniform scaling of box vectors
 tau_p= 1.0; time constant, in ps
 ref_p= 1.0; reference pressure, in bar
 compressibility = 4.5e-5; isothermal compressibility of water, bar^-1
 ; Periodic boundary conditions
 pbc= xyz; 3-D PBC
 ; Dispersion correction
 DispCorr= EnerPres; account for cut-off vdW scheme
 ; Velocity 

Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Mark Abraham
On Mon, Mar 24, 2014 at 12:48 PM, Ankita Naithani
ankitanaith...@gmail.comwrote:

 Hi Pavan,
 Thank you for your response. I am trying to generate the tpr file with the
 following parameter;
 ; Neighborsearching
  ns_type= grid; search neighboring grid cells
  nstlist= 5; 25 fs
  rlist= 1.0; short-range neighborlist cutoff (in nm)
  rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
  rvdw= 1.0; short-range van der Waals cutoff (in nm)
  rlistlong= 1.0; long-range neighborlist cutoff (in nm)
 cutoff-scheme = Verlet

 But, I get a warning of Unknown left-hand 'cutoff-scheme' in parameter
 file.


If you want to prepare a run for GROMACS 4.6, use a 4.6 version of grompp!

Mark
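
A minimal sketch of regenerating the run input with the 4.6 tools on the
cluster (the module name and the coordinate/checkpoint/topology file names are
placeholders):

module load gromacs/4.6.3        # or: source /path/to/gromacs-4.6.3/bin/GMXRC
grompp -f md3.mdp -c npt.gro -t npt.cpt -p topol.top -o md3.tpr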




 On Mon, Mar 24, 2014 at 11:26 AM, Pavan Kumar kumar.pavan...@gmail.com
 wrote:

  Hello Ankita
  You have to just include the following line in your mdp file
  cutoff-scheme=Verlet
  And run your grompp with the modfied mdp file to generate tpr file and
 then
  mdrun.
  Hope this doesn't give you the same error
 
 
  On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
  ankitanaith...@gmail.comwrote:
 
   Hi,
  
   I am trying to run a simulation of my protein (monomer ~500 residues).
 I
   had few questions and erors regarding the same.
   I have previously run the simulation of the apo form of the same
 protein
   using Gromacs 4.5.5 which was available at the cluster facility I was
  using
   and also which is installed in my system. However, when I tried to run
  the
   holo form, I got error :
   Fatal error:
   11 particles communicated to PME node 106 are more than 2/3 times the
   cut-off out of the domain decomposition cell of their charge group in
   dimension y.
   This usually means that your system is not well equilibrated.
   For more information and tips for troubleshooting, please check the
  GROMACS
   website at http://www.gromacs.org/Documentation/Errors
  
   This I figured out could be solved using a lower timestep as my
 previous
   timestep was 4fs and now I have reduced it to 3fs which should work
 fine
   now.
   However, after producing the tpr file for production run in my GROMACS
   4.5.5, I realised that the grant for the cluster facility is over and
 the
   new clusters which I am trying to set up the same protein for support
  only
   gromacs 4.6. I am trying to run the code in these clusters and I get he
   following error:
  
  
   ---
   Program mdrun_mpi, VERSION 4.6.3
   Source code file: /home/gromacs-4.6.3/src/kernel/runner
   .c, line: 824
  
   Fatal error:
   OpenMP threads have been requested with cut-off scheme Group, but these
  are
   only
supported with cut-off scheme Verlet
   For more information and tips for troubleshooting, please check the
  GROMACS
   website at http://www.gromacs.org/Documentation/Errors
  
  
 
 -
  
   1. I wanted help with my mdp options to make it compatible.
   2. Since my pevious calculations were based on gromacs 4.5.5, switching
  to
   gromacs 4.6, would that break the continuity of the run or would that
  bring
   about differences in the way the trajectories would be analysed?
  
  
   Below, is my mdp file
   title= production MD
   ; Run parameters
   integrator= md; leap-frog algorithm
   nsteps= ; 0.003 *  = 10 ps or 100 n
   dt= 0.003; 3 fs
   ; Output control
   nstxout= 0; save coordinates every 2 ps
   nstvout= 0; save velocities every 2 ps
   nstxtcout= 1000; xtc compressed trajectory output every 5
 ps
   nstenergy= 1000; save energies every 5 ps
   nstlog= 1000; update log file every 5 ps
   energygrps  = Protein ATP
   ; Bond parameters
   constraint_algorithm = lincs; holonomic constraints
   constraints= all-bonds; all bonds (even heavy atom-H bonds)
   constrained
   lincs_iter= 1; accuracy of LINCS
   lincs_order= 4; also related to accuracy
   ; Neighborsearching
   ns_type= grid; search neighboring grid cells
   nstlist= 5; 25 fs
   rlist= 1.0; short-range neighborlist cutoff (in nm)
   rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
   rvdw= 1.0; short-range van der Waals cutoff (in nm)
   rlistlong= 1.0; long-range neighborlist cutoff (in nm)
   ; Electrostatics
   coulombtype= PME; Particle Mesh Ewald for long-range
   electrostatics
   pme_order= 4; cubic interpolation
   fourierspacing= 0.16; grid spacing for FFT
   nstcomm = 10; remove com every 10 steps
   ; Temperature coupling is on
   tcoupl= V-rescale; modified Berendsen thermostat
   tc-grps= 

Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Justin Lemkul



On 3/24/14, 7:48 AM, Ankita Naithani wrote:

Hi Pavan,
Thank you for your response. I am trying to generate the tpr file with the
following parameter;
; Neighborsearching
  ns_type= grid; search neighboring grid cells
  nstlist= 5; 25 fs
  rlist= 1.0; short-range neighborlist cutoff (in nm)
  rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
  rvdw= 1.0; short-range van der Waals cutoff (in nm)
  rlistlong= 1.0; long-range neighborlist cutoff (in nm)
cutoff-scheme = Verlet

But, I get a warning of Unknown left-hand 'cutoff-scheme' in parameter
file.



That option was introduced in version 4.6.  It won't work with an earlier 
version.

-Justin



On Mon, Mar 24, 2014 at 11:26 AM, Pavan Kumar kumar.pavan...@gmail.comwrote:


Hello Ankita
You have to just include the following line in your mdp file
cutoff-scheme=Verlet
And run your grompp with the modfied mdp file to generate tpr file and then
mdrun.
Hope this doesn't give you the same error


On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
ankitanaith...@gmail.comwrote:


Hi,

I am trying to run a simulation of my protein (monomer ~500 residues). I
had few questions and erors regarding the same.
I have previously run the simulation of the apo form of the same protein
using Gromacs 4.5.5 which was available at the cluster facility I was

using

and also which is installed in my system. However, when I tried to run

the

holo form, I got error :
Fatal error:
11 particles communicated to PME node 106 are more than 2/3 times the
cut-off out of the domain decomposition cell of their charge group in
dimension y.
This usually means that your system is not well equilibrated.
For more information and tips for troubleshooting, please check the

GROMACS

website at http://www.gromacs.org/Documentation/Errors

This I figured out could be solved using a lower timestep as my previous
timestep was 4fs and now I have reduced it to 3fs which should work fine
now.
However, after producing the tpr file for production run in my GROMACS
4.5.5, I realised that the grant for the cluster facility is over and the
new clusters which I am trying to set up the same protein for support

only

gromacs 4.6. I am trying to run the code in these clusters and I get he
following error:


---
Program mdrun_mpi, VERSION 4.6.3
Source code file: /home/gromacs-4.6.3/src/kernel/runner
.c, line: 824

Fatal error:
OpenMP threads have been requested with cut-off scheme Group, but these

are

only
  supported with cut-off scheme Verlet
For more information and tips for troubleshooting, please check the

GROMACS

website at http://www.gromacs.org/Documentation/Errors



-


1. I wanted help with my mdp options to make it compatible.
2. Since my pevious calculations were based on gromacs 4.5.5, switching

to

gromacs 4.6, would that break the continuity of the run or would that

bring

about differences in the way the trajectories would be analysed?


Below, is my mdp file
title= production MD
; Run parameters
integrator= md; leap-frog algorithm
nsteps= ; 0.003 *  = 10 ps or 100 n
dt= 0.003; 3 fs
; Output control
nstxout= 0; save coordinates every 2 ps
nstvout= 0; save velocities every 2 ps
nstxtcout= 1000; xtc compressed trajectory output every 5 ps
nstenergy= 1000; save energies every 5 ps
nstlog= 1000; update log file every 5 ps
energygrps  = Protein ATP
; Bond parameters
constraint_algorithm = lincs; holonomic constraints
constraints= all-bonds; all bonds (even heavy atom-H bonds)
constrained
lincs_iter= 1; accuracy of LINCS
lincs_order= 4; also related to accuracy
; Neighborsearching
ns_type= grid; search neighboring grid cells
nstlist= 5; 25 fs
rlist= 1.0; short-range neighborlist cutoff (in nm)
rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
rvdw= 1.0; short-range van der Waals cutoff (in nm)
rlistlong= 1.0; long-range neighborlist cutoff (in nm)
; Electrostatics
coulombtype= PME; Particle Mesh Ewald for long-range
electrostatics
pme_order= 4; cubic interpolation
fourierspacing= 0.16; grid spacing for FFT
nstcomm = 10; remove com every 10 steps
; Temperature coupling is on
tcoupl= V-rescale; modified Berendsen thermostat
tc-grps= Protein Non-Protein; two coupling groups - more
accurate
tau_t= 0.10.1; time constant, in ps
ref_t= 318 318; reference temperature, one for each

group,

in K
; Pressure coupling is off
pcoupl  = berendsen; Berendsen thermostat
pcoupltype= 

Re: [gmx-users] MPI error in gromacs 4.6, more Errors

2014-03-24 Thread Justin Lemkul



On 3/24/14, 7:57 AM, Ankita Naithani wrote:

Hi, so I modified my mdp file which now looks like the following:

title= production MD
; Run parameters
integrator= md; leap-frog algorithm
;nsteps= 2000; 0.005 * 2000 = 10 ps or 100 ns
;nsteps= 20; 0.005 * 20 = 1 ns
;dt= 0.005; 5 fs
nsteps= ; 0.003 *  = 10 ps or 100 n
dt= 0.003; 3 fs
; Output control
nstxout= 0; save coordinates every 2 ps
nstvout= 0; save velocities every 2 ps
nstxtcout= 1000; xtc compressed trajectory output every 5 ps
nstenergy= 1000; save energies every 5 ps
nstlog= 1000; update log file every 5 ps
; Bond parameters
constraint_algorithm = lincs; holonomic constraints
constraints= all-bonds; all bonds (even heavy atom-H bonds)
constrained
lincs_iter= 1; accuracy of LINCS
lincs_order= 4; also related to accuracy
; Neighborsearching
ns_type= grid; search neighboring grid cells
nstlist= 5; 25 fs
rlist= 1.0; short-range neighborlist cutoff (in nm)
rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
rvdw= 1.0; short-range van der Waals cutoff (in nm)
rlistlong= 1.0; long-range neighborlist cutoff (in nm)
cutoff-scheme   = Verlet
; Electrostatics
coulombtype= PME; Particle Mesh Ewald for long-range
electrostatics
pme_order= 4; cubic interpolation
fourierspacing= 0.16; grid spacing for FFT
nstcomm = 10; remove com every 10 steps
; Temperature coupling is on
tcoupl= V-rescale; modified Berendsen thermostat
tc-grps= Protein Non-Protein; two coupling groups - more
accurate
tau_t= 0.1 0.1; time constant, in ps
ref_t= 318 318; reference temperature, one for each group,
in K
; Pressure coupling is off
pcoupl  = berendsen; Berendsen thermostat
pcoupltype= isotropic; uniform scaling of box vectors
tau_p= 1.0; time constant, in ps
ref_p= 1.0; reference pressure, in bar
compressibility = 4.5e-5; isothermal compressibility of water, bar^-1
; Periodic boundary conditions
pbc= xyz; 3-D PBC
; Dispersion correction
DispCorr= EnerPres; account for cut-off vdW scheme
; Velocity generation
gen_vel= yes; Velocity generation is on
gen_temp= 318; reference temperature, for protein in K
--


But, when I try to generate the tpr file on the cluster itself using
gromacs 4.6.3, I get the following error:


NOTE 1 [file md3.mdp]:
   With Verlet lists the optimal nstlist is >= 10, with GPUs >= 20. Note
   that with the Verlet scheme, nstlist has no effect on the accuracy of
   your simulation.


NOTE 2 [file md3.mdp]:
   nstcomm < nstcalcenergy defeats the purpose of nstcalcenergy, setting
   nstcomm to nstcalcenergy

Generated 3403 of the 3403 non-bonded parameter combinations
Generating 1-4 interactions: fudge = 0.5
Generated 3403 of the 3403 1-4 parameter combinations
Segmentation fault

Can anyone please suggest further?



Do as the notes suggest.  They're not fatal errors, they're just cautionary. 
You should probably educate yourself a bit further on what all of these 
algorithms are by taking a look at 
http://www.gromacs.org/Documentation/Cut-off_schemes.  The Verlet scheme is not 
mandatory, but it is required by the type of parallelization you requested.
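
For illustration, a minimal sketch of the .mdp changes the two notes point at
(100 is assumed here as the 4.6 default of nstcalcenergy, not a value taken
from the thread):

nstlist       = 10     ; with the Verlet scheme this affects performance, not accuracy
nstcomm       = 100    ; no smaller than nstcalcenergy, so grompp stops resetting it
nstcalcenergy = 100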


Reviewers may question the changes in version and cutoff methods when critiquing 
your work, so be aware of that.  Also, the instability you are seeing is 
probably a result of the large time step, unless you are using virtual sites.


-Justin

--
==

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==


[gmx-users] Protein-Ligand MS simulation

2014-03-24 Thread MUSYOKA THOMMAS
Hello. I have a protein pdb and the best pose of a ligand obtained through
docking with AutoDock. However, when I combine the ligand and protein
using Discovery Studio, some residues such as lysine in the protein pdb
get protonated, as does the N-terminal residue. Is this unusual, and can
it affect the MD end results?


Re: [gmx-users] Protein-Ligand MS simulation

2014-03-24 Thread Justin Lemkul



On 3/24/14, 8:52 AM, MUSYOKA THOMMAS wrote:

Hello. I have a protein pdb and the best pose of a ligand obtained through
docking using auto dock. However, when I combine the ligand and protein
using the discovery studio, some residues like Lysine in the protein pdb
gets protonated as well as the N-terminal residue. Is this unusual and can
it affect the MD end results?



I can't speak to what's usual in Discovery Studio (and there's probably a 
better forum for that, anyway), but it seems like it's just assuming standard 
(pH 7) protonation states.  Both lysine and the N-terminus are predominantly 
protonated at neutral pH.  Protonation state can have a major impact on the 
outcome of a simulation, especially in binding sites, and should be chosen wisely.
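
For reference, a minimal sketch of how pdb2gmx can be told to ask about each
titratable residue instead of silently applying the pH 7 defaults (file and
force-field names are placeholders):

pdb2gmx -f complex.pdb -o processed.gro -p topol.top \
        -ff amber99sb -water tip3p \
        -lys -asp -glu -his -ter    # interactive choice of Lys/Asp/Glu/His and termini states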


-Justin

--
==

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==


Re: [gmx-users] MPI error in gromacs 4.6, more Errors

2014-03-24 Thread Mark Abraham
The segmentation fault is highly unusual, and suggests that the
installation of gromacs used a shared library that has now
migrated/changed/whatever. I suggest you discuss that with your system
admins and ask them to re-install, or re-run the GROMACS regression tests,
to check things are OK.

Mark
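
A minimal sketch of what re-running the regression tests could look like (the
install prefix and version are placeholders; the tests tarball has to match the
installed version):

source /opt/gromacs-4.6.3/bin/GMXRC       # hypothetical install location
tar xzf regressiontests-4.6.3.tar.gz
cd regressiontests-4.6.3
perl gmxtest.pl all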


On Mon, Mar 24, 2014 at 2:13 PM, Ankita Naithani
ankitanaith...@gmail.comwrote:

 Hi Justin,

 Thank you very much for your reply. I shall try to work my way around and
 see.


 Kind regards,

 Ankita


 On Mon, Mar 24, 2014 at 12:12 PM, Justin Lemkul jalem...@vt.edu wrote:

 
 
  On 3/24/14, 7:57 AM, Ankita Naithani wrote:
 
  Hi, so I modified my mdp file which now looks like the following:
 
  title= production MD
  ; Run parameters
  integrator= md; leap-frog algorithm
  ;nsteps= 2000; 0.005 * 2000 = 10 ps or 100 ns
  ;nsteps= 20; 0.005 * 20 = 1 ns
  ;dt= 0.005; 5 fs
  nsteps= ; 0.003 *  = 10 ps or 100 n
  dt= 0.003; 3 fs
  ; Output control
  nstxout= 0; save coordinates every 2 ps
  nstvout= 0; save velocities every 2 ps
  nstxtcout= 1000; xtc compressed trajectory output every 5 ps
  nstenergy= 1000; save energies every 5 ps
  nstlog= 1000; update log file every 5 ps
  ; Bond parameters
  constraint_algorithm = lincs; holonomic constraints
  constraints= all-bonds; all bonds (even heavy atom-H bonds)
  constrained
  lincs_iter= 1; accuracy of LINCS
  lincs_order= 4; also related to accuracy
  ; Neighborsearching
  ns_type= grid; search neighboring grid cells
  nstlist= 5; 25 fs
  rlist= 1.0; short-range neighborlist cutoff (in nm)
  rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
  rvdw= 1.0; short-range van der Waals cutoff (in nm)
  rlistlong= 1.0; long-range neighborlist cutoff (in nm)
  cutoff-scheme   = Verlet
  ; Electrostatics
  coulombtype= PME; Particle Mesh Ewald for long-range
  electrostatics
  pme_order= 4; cubic interpolation
  fourierspacing= 0.16; grid spacing for FFT
  nstcomm = 10; remove com every 10 steps
  ; Temperature coupling is on
  tcoupl= V-rescale; modified Berendsen thermostat
  tc-grps= Protein Non-Protein; two coupling groups - more
  accurate
  tau_t= 0.10.1; time constant, in ps
  ref_t= 318 318; reference temperature, one for each
 group,
  in K
  ; Pressure coupling is off
  pcoupl  = berendsen; Berendsen thermostat
  pcoupltype= isotropic; uniform scaling of box vectors
  tau_p= 1.0; time constant, in ps
  ref_p= 1.0; reference pressure, in bar
  compressibility = 4.5e-5; isothermal compressibility of water,
 bar^-1
  ; Periodic boundary conditions
  pbc= xyz; 3-D PBC
  ; Dispersion correction
  DispCorr= EnerPres; account for cut-off vdW scheme
  ; Velocity generation
  gen_vel= yes; Velocity generation is on
  gen_temp= 318; reference temperature, for protein in K
  --
 
 
  But, when I try to generate the tpr file on the cluster itself using
  gromacs 4.6.3, I get the following error:
 
 
  NOTE 1 [file md3.mdp]:
 With Verlet lists the optimal nstlist is = 10, with GPUs = 20. Note
 that with the Verlet scheme, nstlist has no effect on the accuracy of
 your simulation.
 
 
  NOTE 2 [file md3.mdp]:
 nstcomm  nstcalcenergy defeats the purpose of nstcalcenergy, setting
 nstcomm to nstcalcenergy
 
  Generated 3403 of the 3403 non-bonded parameter combinations
  Generating 1-4 interactions: fudge = 0.5
  Generated 3403 of the 3403 1-4 parameter combinations
  Segmentation fault
 
  Can anyone please suggest further?
 
 
  Do as the notes suggest.  They're not fatal errors, they're just
  cautionary. You should probably educate yourself a bit further on what
 all
  of these algorithms are by taking a look at http://www.gromacs.org/
  Documentation/Cut-off_schemes.  The Verlet scheme is not mandatory, but
  it is required by the type of parallelization you requested.
 
  Reviewers may question the changes in version and cutoff methods when
  critiquing your work, so be aware of that.  Also, the instability you are
  seeing is probably a result of the large time step, unless you are using
  virtual sites.
 
  -Justin
 
  --
  ==
 
  Justin A. Lemkul, Ph.D.
  Ruth L. Kirschstein NRSA Postdoctoral Fellow
 
  Department of Pharmaceutical Sciences
  School of Pharmacy
  Health Sciences Facility II, Room 601
  University of Maryland, Baltimore
  20 Penn St.
  Baltimore, MD 21201
 
  jalem...@outerbanks.umaryland.edu 

Re: [gmx-users] refcoord-scaling

2014-03-24 Thread Chetan Mahajan
Hi Mark,

I am restraining the positions of each of the atoms of the TiO2 crystal.
Could you comment again on why the simulation should crash when the 'com'
option is used but not with the 'all' option? Also, the description in the
manual does not answer my question: when is the 'all' option generally used?
When is 'com' generally used? When is 'no' generally used?

Thanks!

regards
Chetan


On Mon, Mar 24, 2014 at 4:16 AM, Mark Abraham mark.j.abra...@gmail.comwrote:

 On Mar 24, 2014 1:10 AM, Chetan Mahajan chetanv...@gmail.com wrote:
 
  Dear all:
 
  I am trying to get a simulation of water solvated titanium oxide running.
  When 'all' option is used for refcoord-scaling, simulation runs ok.
  However, when 'com' option is used for refcoord-scaling, simulation
 crashes
  with any of the following errors. Could anyone explain to me why is this
  happening

 See the description of com. You didn't tell us what you were restraining,
 so it's hard to help. But I can see multiple com of tio2 and PBC not
 working well together, particularly if you box size is far from best.

  or when each of the options such as 'all', 'com' and 'no' is used?

 When you really care about your starting position and need to equilibrate
 in NPT.

 Mark

  Thanks a lot!
  regards
  Chetan
 
 
  Errors:
 
  X particles communicated to PME node Y are more than a cell length out of
  the domain decomposition cell of their charge group
 
  This is another way that mdrun tells you your system is blowing
  uphttp://www.gromacs.org/Documentation/Terminology/Blowing_Up.
  In GROMACS version 4.0, domain decomposition was introduced to divide the
  system into regions containing nearby atoms (for more details, see the
  manual http://www.gromacs.org/Documentation/Manual or the GROMACS 4
  paperhttp://dx.doi.org/10.1021/ct700301q).
  If you have particles that are flying across the system, you will get
 this
  fatal error. The message indicates that some piece of your system is
  tearing apart (hence out of the cell of their charge group). Refer
  to the Blowing
  Up http://www.gromacs.org/Documentation/Terminology/Blowing_Up page
 for
  advice on how to fix this issue.
 
 
  A charge group moved too far between two domain decomposition steps.


[gmx-users] Umbrella pulling Drug + Ion channel

2014-03-24 Thread Andres Ortega
Dear GROMACS users, I've been trying to get the geometries of a drug
molecule permeating an ion channel. The drug molecule is at a greater Z
position than the protein, but I always get the error: Segmentation fault
(core dumped).
This is my .mdp:

title   = Umbrella pulling simulation DOX
; Run parameters
integrator  = md
dt  = 0.002
tinit   = 0
nsteps  = 250000; 500 ps at dt = 0.002
nstcomm = 10
; Output parameters
nstxout = 5000  ; every 10 ps
nstvout = 5000
nstfout = 500
nstxtcout   = 500   ; every 1 ps
nstenergy   = 500

cutoff-scheme = Verlet

; Bond parameters
constraint_algorithm= lincs
constraints = all-bonds
continuation= yes ; continuing from NPT
; Single-range cutoff scheme
nstlist = 20
ns_type = grid
rlist   = 1.2
rcoulomb= 1.2
rvdw= 1.2
; PME electrostatics parameters
coulombtype = PME
fourierspacing  = 0.12
fourier_nx  = 0
fourier_ny  = 0
fourier_nz  = 0
pme_order   = 4
ewald_rtol  = 1e-5
optimize_fft= yes
; Nose-Hoover temperature coupling in three groups
Tcoupl  = Nose-Hoover
tc_grps = Protein  POPC Water_and_ions_DOX
tau_t   = 1.01.0   1.0
ref_t   = 310310   310
; Pressure coupling is on
Pcoupl  = Parrinello-Rahman
pcoupltype  = semiisotropic
tau_p   = 1.0  
compressibility = 4.5e-5 4.5e-5
ref_p   = 1.0   1.0
refcoord_scaling = com
; Generate velocities is off
gen_vel = no
; Periodic boundary conditions are on in all directions
pbc = xyz
; Long-range dispersion correction
DispCorr= EnerPres
; Pull code
pull= umbrella
pull_geometry   = cylinder  ;
pull_vec1   = 0.0 0.0 1.0 ;
pull_dim= N N Y
pull_start  = yes   ; define initial COM distance > 0
pull_ngroups= 1
pull_group0 = Protein
pull_group1 = DOX
pull-r1 = 1.0
pull-r0 = 1.0
pull_rate1  = 0.01  ; 0.01 nm per ps = 10 nm per ns
pull_k1 = 800  ; kJ mol^-1 nm^-2
pull_pbcatom0   = 8485

According to other posts, I have to select a pull_pbcatom0 close to the
middle of my reference (an atom close to the middle of the ion channel's
pore) and a positive pull_vec1 even if the drug is above the channel. I was
wondering if you could help me with this or give me advice on this error.
Thanks in advance.
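One quick way to sanity-check the choice of pull_pbcatom0 is to take the protein
atom closest to the protein's geometric centre and use its global index. Below is
a minimal Python sketch along those lines; the file name npt.gro and the
residue-name filter are assumptions rather than values from this post, so both
need adapting to the actual system:

import math

# Rough sketch (assumed inputs): find the protein atom nearest the protein's
# geometric centre in a .gro file and print its 1-based global index as a
# candidate for pull_pbcatom0.  PBC images and atomic masses are ignored.
NON_PROTEIN = {"SOL", "POPC", "NA", "CL", "DOX"}   # assumed non-protein residue names

with open("npt.gro") as f:                         # assumed file name
    lines = f.readlines()

natoms = int(lines[1])
protein = []                                       # (global_index, x, y, z)
for i, line in enumerate(lines[2:2 + natoms], start=1):
    resname = line[5:10].strip()                   # fixed-width .gro columns
    if resname in NON_PROTEIN:
        continue
    x, y, z = float(line[20:28]), float(line[28:36]), float(line[36:44])
    protein.append((i, x, y, z))                   # i is the 1-based index grompp expects

cx = sum(a[1] for a in protein) / len(protein)     # geometric centre of the selection
cy = sum(a[2] for a in protein) / len(protein)
cz = sum(a[3] for a in protein) / len(protein)

best = min(protein, key=lambda a: math.dist((a[1], a[2], a[3]), (cx, cy, cz)))
print("candidate pull_pbcatom0 =", best[0])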

Andrés Ortega

--
View this message in context: 
http://gromacs.5086.x6.nabble.com/Umbrella-pulling-Drug-Ion-channel-tp5015363.html
Sent from the GROMACS Users Forum mailing list archive at Nabble.com.


Re: [gmx-users] refcoord-scaling

2014-03-24 Thread Chetan Mahajan
Hi Justin,

My system is a TiO2 crystal (2160 position-restrained atoms) solvated by
3656 water molecules, 1 formate anion, and 1 sodium ion. Is it true that
when the 'all' option is used, the positions of the atoms (meant to be
restrained) always change?

Thanks
Chetan


On Mon, Mar 24, 2014 at 7:37 PM, Justin Lemkul jalem...@vt.edu wrote:



 On 3/24/14, 8:29 PM, Chetan Mahajan wrote:

 Thanks, Mark. So is 'all' option okay when positions of each of the atoms
 of TiO2 crystal (2160 atoms total) are restrained in space? Apparently, it
 does not seem correct, since positions of the atoms of TiO2 crystal change
 when 'all' option is applied. However, it does not give error as 'com'
 option does.


 Is your system just a TiO2 crystal?  Is there any solvent?  If it's just a
 crystal, I see no point in restraining anything.  It is very unusual that
 the all option of refcoord-scaling is more stable than com - normally
 the opposite is true.  But that also explains why the coordinates are
 changing - the reference position of each individual atom is scaled
 according to the pressure coupling matrix, so the atoms are restrained to a
 dynamic reference, so it seems that it's not really accomplishing anything.

 -Justin

 --
 ==

 Justin A. Lemkul, Ph.D.
 Ruth L. Kirschstein NRSA Postdoctoral Fellow

 Department of Pharmaceutical Sciences
 School of Pharmacy
 Health Sciences Facility II, Room 601
 University of Maryland, Baltimore
 20 Penn St.
 Baltimore, MD 21201

 jalem...@outerbanks.umaryland.edu | (410) 706-7441
 http://mackerell.umaryland.edu/~jalemkul

 ==




Re: [gmx-users] refcoord-scaling

2014-03-24 Thread Justin Lemkul



On 3/24/14, 9:05 PM, Chetan Mahajan wrote:

Hi Justin,

My system is TiO2 crystal (2160 atoms position restrained) solvated by 3656
water molecules, 1 formate anion and 1 sodium ion.  Is it true that when
'all' option is used, positions of the atoms (meant to be restrained)
always change?



Consider what position restraints are doing.  They never guarantee that atoms 
won't move; they just apply a biasing potential to disfavor movement.  So yes, 
your atoms will probably move.  The smaller the system, the larger the effect 
will likely be.  Consider also what the reference coordinates are doing.  They 
determine how strong the restraint is (the potential depends on the distance 
between the current coordinates and the reference).  If you're scaling all atoms 
individually according to the pressure coupling, the reference moves and therefore 
the atoms can also move.  For larger systems like proteins, refcoord_scaling = 
com is normal and stable, and refcoord_scaling = all is disfavored.  Honestly, I 
don't know what to expect for a very small system like yours, but it is likely to 
be affected to a greater extent.

If the all option gives you a stable simulation that allows for proper 
equilibration, then I doubt there is any real problem.
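For readers of the archive, the point about a moving reference is easy to see
from the restraint potential itself, U = ½ k (r − r_ref)².  The small Python
sketch below uses illustrative numbers (the force constant and coordinates are
assumptions, not values from this thread): rescaling r_ref shifts the minimum of
the restraint, so an atom that has not moved at all suddenly feels a force toward
the new reference.

# Harmonic position restraint along one axis: U = 1/2 * k * (r - r_ref)^2.
# Illustrates how rescaling the reference coordinate (as refcoord_scaling = all
# does under pressure coupling) moves the restraint minimum and creates a force
# on an unmoved atom.  Numbers are illustrative only.
k = 1000.0      # kJ mol^-1 nm^-2, a typical (assumed) restraint force constant
r = 1.00        # current atom position, nm

for scale in (1.00, 0.99):          # 0.99 mimics a 1% box compression
    r_ref = 1.00 * scale            # scaled reference coordinate
    u = 0.5 * k * (r - r_ref) ** 2
    f = -k * (r - r_ref)
    print(f"scale={scale:.2f}  r_ref={r_ref:.3f} nm  U={u:.3f} kJ/mol  F={f:+.1f} kJ/(mol nm)")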


-Justin

--
==

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==


Re: [gmx-users] berger lipid parameters in amber99sb-ildn ff

2014-03-24 Thread Monoj Mon Kalita
Dear Chris

Sorry, that didn't solve my problem. Before posting here, I had already tried
that option with different authors from two publications, but it seems nobody
replied. I hope somebody on this mailing list is willing to offer advice or
share their experience to help me solve this issue. Many thanks in advance.

Message: 1
Date: Sun, 23 Mar 2014 11:32:44 +
From: Christopher Neale chris.ne...@alum.utoronto.ca
To: gmx-us...@gromacs.org gmx-us...@gromacs.org
Subject: Re: [gmx-users] berger lipid parameters in amber99sb-ildn ff

See this paper for Amber protein and berger lipids:

http://pubs.acs.org/doi/abs/10.1021/ct200491c

If you contact the authors, I am sure that they can send you the files.

Chris.


From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se 
gromacs.org_gmx-users-boun...@maillist.sys.kth.se on behalf of Monoj Mon
Kalita mon.123c...@gmail.com
Sent: 23 March 2014 00:29
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: [gmx-users] berger lipid parameters in amber99sb-ildn ff

Dear Users

I am trying to run a drug-protein simulation in GROMACS. I calculated the
ligand parameters using the generalized AMBER force field (GAFF). ESP partial
charges were calculated at the HF/6-31G* level, and RESP fitting was performed
with Antechamber. Now I am having trouble introducing the Berger lipid
parameters into the topology created with the amber99sb-ildn force field. All
I have are GROMOS-adapted force-field files with the Berger lipid parameters,
whereas I want to run my simulation in a united-atom POPC membrane using the
Berger lipid parameters together with amber99sb-ildn for the protein. Is there
a link to download an AMBER-adapted topology for united-atom POPC with the
Berger lipid parameters, which might solve my problem?

I am following the method described in this publication to set up my protocol:
doi: 10.3109/09687688.2013.773095.


--

 With Regards
Monoj Mon Kalita
Taipei, Taiwan


--

With Regards

Monoj Mon Kalita
Taipei, Taiwan


Re: [gmx-users] refcoord-scaling

2014-03-24 Thread Chetan Mahajan
Thanks, Justin. I am trying to understand what is meant by the 'com' option.
The following two sentences in the manual seem to conflict with each other:

*Scale* the center of mass of the reference coordinates with the scaling
matrix of the pressure coupling. The *vectors of each reference coordinate*
to the center of mass are *not scaled*.

Doesn't scaling the center of mass of the reference coordinates also change
something else, namely the reference coordinates themselves?

Thanks

Chetan





Re: [gmx-users] refcoord-scaling

2014-03-24 Thread Christopher Neale
Say you have a box with an x-side length of 3 nm and two atoms with x-dimension 
position restraints to: (a) x=1 nm, and (b) x=2 nm

Then let the box shrink to 99% of its previous size due to pressure coupling. 
The following is my understanding of  the locations to which the atoms will be 
restrained:

refcoord_scaling =

none:
(a) x=1 nm, and (b) x=2 nm  -- note how the restraint positions are not scaled 
(even though the positions of the atoms are scaled due to pressure coupling)

all:
(a) x=0.99 nm, and (b) x=1.98 nm   -- note how the atoms are now restrained to 
be closer together

com:
(a) x=0.985 nm, and (b) x=1.985 nm  -- note how the distance between the atoms 
is maintained


So you see how none leads to problems because the restraint reference 
positions are not scaled at all with the atomic positions, and how all leads 
to problems because, if you restrained all C-alpha atoms, you would get 
compression/expansion of your protein as the system size fluctuated in 
constant-pressure simulations.
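The three cases can be reproduced with a few lines of Python. This is a sketch
only: equal masses are assumed so the COM is just the mean of the two
references, and note that the actual mdp keyword for the first case is "no".

# Reference-coordinate scaling along one box dimension for the three
# refcoord_scaling options, reproducing the numbers in the example above.
def scaled_refs(refs, scale, mode):
    """Return the restraint reference positions after the box is scaled."""
    if mode == "no":                        # references left untouched
        return list(refs)
    if mode == "all":                       # every reference scaled individually
        return [r * scale for r in refs]
    if mode == "com":                       # only the COM of the references is scaled
        com = sum(refs) / len(refs)         # equal masses assumed
        return [com * scale + (r - com) for r in refs]
    raise ValueError(f"unknown mode: {mode}")

refs = [1.0, 2.0]                           # nm, restraint references (a) and (b)
for mode in ("no", "all", "com"):
    print(mode, [round(x, 3) for x in scaled_refs(refs, 0.99, mode)])
# no  -> [1.0, 2.0]
# all -> [0.99, 1.98]
# com -> [0.985, 1.985]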

Chris.
