Re: [gmx-users] Improving scaling - Gromacs 4.0 RC2

2008-10-02 Thread David van der Spoel

Justin A. Lemkul wrote:


Hi,

I've been playing around with the latest release candidate of version 
4.0, and I was hoping someone out there more knowledgeable than me might 
tell me how to improve a bit on the performance I'm seeing.  To clarify, 
the performance I'm seeing is a ton faster than 3.3.x, but I still seem 
to be getting bogged down with the PME/PP balance.  I'm using mostly the 
default options with the new mdrun:


mdrun_mpi -s test.tpr -np 64 -npme 32


try -dlb auto
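For reference, a minimal combined invocation using only flags already shown in this thread (the -npme value is one of Justin's tested settings, not a recommendation):

mdrun_mpi -s test.tpr -np 64 -npme 16 -dlb auto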




The system contains about 150,000 atoms - a membrane protein surrounded 
by several hundred lipids and solvent (water).  The protein parameters 
are GROMOS, lipids are Berger, and water is SPC.  My .mdp file (adapted 
from a generic 3.3.x file that I always used to use for such 
simulations) is attached at the end of this mail.  It seems that my 
system runs fastest on 64 CPU's.  Almost all tests with 128 or 256 seem 
to run slower.  The nodes are dual-core 2.3 GHz Xserve G5, connected by 
Infiniband.


Here's a summary of some of the tests I've run:

-np    -npme    -ddorder      ns/day    % performance loss from imbalance
64     16       interleave    5.760     19.6
64     32       interleave    9.600     40.9
64     32       pp_pme        5.252      3.9
64     32       cartesian     5.383      4.7

All other mdrun command line options are defaults.

I get ~10.3 ns/day with -np 256 -npme 64, but since -np 64 -npme 32 
seems to give almost that same performance there seems to be no 
compelling reason to tie up that many nodes.


Any hints on how to speed things up any more?  Is it possible?  Not that 
I'm complaining...the same system under GMX 3.3.3 gives just under 1 
ns/day :)  I'm really curious about the 40.9% performance loss I'm 
seeing with -np 64 -npme 32, even though it gives the best overall 
performance in terms of ns/day.


Thanks in advance for your attention, and any comments.

-Justin

===test.mdp=
title                = NPT simulation for a membrane protein
; Run parameters
integrator           = md
dt                   = 0.002
nsteps               = 10000     ; 20 ps
nstcomm              = 1
; Output parameters
nstxout              = 500
nstvout              = 500
nstfout              = 500
nstlog               = 500
nstenergy            = 500
; Bond parameters
constraint_algorithm = lincs
constraints          = all-bonds
continuation         = no        ; starting up
; Twin-range cutoff scheme, parameters for Gromos96
nstlist              = 5
ns_type              = grid
rlist                = 0.8
rcoulomb             = 0.8
rvdw                 = 1.4
; PME electrostatics parameters
coulombtype          = PME
fourierspacing       = 0.24
pme_order            = 4
ewald_rtol           = 1e-5
optimize_fft         = yes
; V-rescale temperature coupling is on in three groups
Tcoupl               = V-rescale
tc_grps              = Protein POPC SOL_NA+_CL-
tau_t                = 0.1 0.1 0.1
ref_t                = 310 310 310
; Pressure coupling is on
Pcoupl               = Berendsen
pcoupltype           = semiisotropic
tau_p                = 2.0
compressibility      = 4.5e-5 4.5e-5
ref_p                = 1.0 1.0
; Generate velocities is on
gen_vel              = yes
gen_temp             = 310
gen_seed             = 173529
; Periodic boundary conditions are on in all directions
pbc                  = xyz
; Long-range dispersion correction
DispCorr             = EnerPres
end test.mdp==




--
David.

David van der Spoel, PhD, Professor of Biology
Dept. of Cell and Molecular Biology, Uppsala University.
Husargatan 3, Box 596,  75124 Uppsala, Sweden
phone:  46 18 471 4205  fax: 46 18 511 755
[EMAIL PROTECTED]   [EMAIL PROTECTED]   http://folding.bmc.uu.se



Re: [gmx-users] g_hbond, residue selection and boundary conditions

2008-10-02 Thread Tsjerk Wassenaar
Hi Nicolas,

   1. Does g_hbond take into account the periodic boundary conditions,
   or should I first center the system on my molecule and then run g_hbond?

Yes (pretty much all tools do).

   2. When I run g_hbond, it prints the message No option -sel, then the
   usual welcome message, and runs normally after that. I don't use any
   -sel option in the command line. Is it because I'm using g_hbond
   from gmx 3.3.3 with a trajectory from gmx 3.3.1? I use the following
   command line:

It's an innocent bug in g_hbond; nothing to do with your command line or system.

   3. When g_hbond asks for 2 groups, is it the same thing to specify
   group1 then group2 as group2 then group1? (assuming group1 and
   group2 do not overlap)

Yes. The number of hydrogen bonds between A and B is equal to the
number between B and A.

Cheers,

Tsjerk

-- 
Tsjerk A. Wassenaar, Ph.D.
Junior UD (post-doc)
Biomolecular NMR, Bijvoet Center
Utrecht University
Padualaan 8
3584 CH Utrecht
The Netherlands
P: +31-30-2539931
F: +31-30-2537623


Re: [gmx-users] rot_correlation

2008-10-02 Thread Xavier Periole

On Wed, 1 Oct 2008 12:22:03 -0400
 rams rams [EMAIL PROTECTED] wrote:

Hi Xavier,

I started working on the g_rms program (to obtain the rotation matrices),
as suggested by you and as mentioned in a couple of papers, for the
calculation of rotational correlation times of proteins.

Meanwhile, I have an idea like the following :

The problem in using the MD-generated trajectory to obtain the rotational
correlation times is that the vectors / the system will have all three
degrees of freedom here. (Though by using the -fit rot+trans option
in trjconv we can remove the translational and rotational degrees of
freedom, the internal vibrations are still there.) Whereas for obtaining
the rotational correlation times, we would like to know only how a particular
vector or the system reorients during the course of the simulation.
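Schematically, the fitting step mentioned above would be (file names are illustrative):

trjconv -f traj.xtc -s ref.tpr -fit rot+trans -o fitted.xtc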

For this, if I superimpose my reference structure (.tpr) on the structures
generated at different times (it is just the opposite of the regular rms
fit), and if I generate a trajectory from the rotated copies of the .tpr
structure at the various time intervals, this trajectory gives me the
orientation of a vector / the system at different times. The lengths of
the vectors are also the same (hence no need to worry about the internal
motions). Can I use this trajectory to obtain the rotational correlation times?

If I understand correctly, you would like to generate a duplicate of your
trajectory, replacing each frame by a reference structure to remove the
internal motions. You would then use this duplicate trajectory to get
the time autocorrelation function of the NH vectors and extract the
overall rotational correlation time of the protein.
If this is correct, it would be equivalent to using the rotation matrix
from g_rms (or any other tool).

Note that if you do not have specific experimental data to compare to, it
is not so important which method you use, but you have to explain how you
get the number.
Note also that g_dipoles gives the molecular dipole relaxation time,
which is (I think) what you are looking for.

I hope this helps.
XAvier.

Though it's a lot of work to do things manually, I can give it a try if it
is alright to proceed in the above-mentioned way.

Thanks in advance.

ram.


-
XAvier Periole - PhD

- Molecular Dynamics Group -
NMR and Computation
University of Groningen
The Netherlands
-


[gmx-users] error NH2 in C terminal

2008-10-02 Thread shahrbanoo karbalaee
Dear Justin
Thank you for your advice.
There is an NH2 at the C-terminus of the peptide sequence, but when I run
pdb2gmx I get an error. I deleted the NH2 by hand and ran pdb2gmx -f
name.pdb -ter, choosing none for the C-terminus. When I look at the top
file, I see an NH2 added at the C-terminus. Is this correct?
Another way: I used -missing in the pdb2gmx command. I get a top file
similar to the first top (the one after deleting NH2). Please advise me:
what is the difference between the top files from the two ways, and which
one is better to choose?

thanks again
-- 
sh-karbalaee


Re: [gmx-users] Improving scaling - Gromacs 4.0 RC2

2008-10-02 Thread Carsten Kutzner

Hi Justin,

I have written a small gmx tool that tries various PME/PP balances
systematically for a given number of nodes, and afterwards gives a
suggestion of what the fastest combination is. Although I plan to extend
it with more functionality, it's already working, and I can send it to
you if you'd like to try it.

The performance loss due to load imbalance has two causes:
1. imbalance in the force calculation, which can be levelled out by
using -dlb yes or -dlb auto, as David suggested;
2. imbalance between the short-range and long-range force calculation,
which can be levelled by choosing the optimal PME/PP ratio. This is what
the script should do for you (see the sketch below).
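A rough sketch of what such a scan might look like, reusing the command form from this thread (this is not Carsten's actual tool; the -npme values and output naming are illustrative):

#!/bin/sh
# Benchmark a short run for several PME-node counts on 64 cores,
# then compare the ns/day reported at the end of each log file.
for NPME in 8 16 24 32; do
    mdrun_mpi -s test.tpr -np 64 -npme $NPME -deffnm pme_scan_$NPME
done
grep -H "Performance" pme_scan_*.log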

You might also want to check the md.log file for more detailed
information about where your imbalance is coming from. My guess is that
with 32 PME nodes and interleaved communication the PP (short-range)
nodes have to wait on the PME (long-range) nodes, while with 16 PME
nodes it is the other way around.
That you see less imbalance with pp_pme or cartesian communication
probably only means that the PME communication is slower in this case -
the 'smaller' performance loss from imbalance is a bit misleading here.

Carsten



RE: [gmx-users] Improving scaling - Gromacs 4.0 RC2

2008-10-02 Thread Berk Hess
Hi,

Looking at your 64 core results, it seems that your PP:PME load ratio is about 
1:1.
In most cases 3:1 is much better performance wise.
grompp probably also printed a note about this and also how to fix it.
I have also described this shortly in the parallelization section of the pdf 
manual.

You should probably increase your cut-offs and pme grid spacing by the same 
factor
(something around 1.2).
Hopefully mdrun should choose the proper number of pme nodes for you
when you do not use -npme.
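As an illustration only (these are just the posted .mdp values scaled by 1.2, not numbers from Berk's mail):

; original: rlist = 0.8, rcoulomb = 0.8, fourierspacing = 0.24
rlist           = 0.96
rcoulomb        = 0.96
fourierspacing  = 0.29    ; 0.24 * 1.2, rounded

Scaling both by the same factor keeps the Ewald accuracy roughly constant while shifting work from the PME mesh to the real-space (PP) part.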

Berk




[gmx-users] Error in T-coupling of 2 atoms in T-couplingGroups

2008-10-02 Thread Chitrita Duttaroy
Hi,

I am a beginner with Gromacs 3.3.3.

I did a position-restrained grompp with -f pr.mdp, -r .gro (a water box
with charge -2e), -c .gro and -p .top (produced by genbox). The output is
a .tpr file.
Then, to neutralize the solution with Na+ ions, I used genion with -s .tpr
-o .gro -pname NA+ -np 2 -g .log.
Then, in the [molecules] section of the .top file, I reduced the number of
SOL molecules by 2 and added 2 NA+, to match the coordinates in the .gro
file. I also deleted the .tpr file. Up to this step everything was OK.
In the next step, position-restrained grompp was run again using the .gro
file containing the 2 Na+ ions, but it gave a fatal error:

Fatal error:
2 atoms are not part of any of the T-coupling groups.

So the .tpr file is not generated. I do not understand what to do now.
Please help as early as possible.

Thanking you,

CHITRITA DUTTA ROY.

Re: [gmx-users] HB lifetime

2008-10-02 Thread Omer Markovitch
Please see my comments below.


 Hi,

 The HB definitions and associated lifetimes are a bit arbitrary, so there's
 always going to be some ambiguity here.  That being said, the reason the
 integral of the HB correlation function C(t) isn't an ideal definition is
 that C(t) is only roughly exponential.  The same argument goes for getting
 the lifetime from a fit to C(t), or looking for the time where C(t)=1/e, or
 similar simple approximations.


I disagree. The HB lifetime depends only slightly on the exact values of
the geometric parameters, around the usual values of R(O...O) = 3.5 Angstrom
and angle(O...O-H) = 30 degrees; please see JCP 129, 84505 (a link to the
abstract is given below).
C(t) of a HB obeys the analytical solution of reversible geminate
recombination (see a short review in JCP 129), and so its tail follows a
power law, C(t) ~ Keq*(D*t)^(-3/2), which is indicative of diffusion in
three dimensions.


 What Luzar recommends is to think about an equilibrium between bound and
 unbound molecules, so that they interact with a forward and a backward rate
 constant k and k'.  k gives the forward rate, i.e. the HB breaking rate, and
 k' gives the HB reformation rate... they are not equal due to the diffusion
 of unbound molecules away from the solvation shell.  There are a few
 advantages of going this route, not the least of which is that you tend to
 get similar lifetimes regardless of small changes in the HB definition, and
 whether you use geometric or energetic criteria, etc.


The reversible geminate recombination deals with A + B <-> C; here
A = B = H2O and C = (H2O)2, the bound water dimer.
From a single fit to C(t) one obtains the bimolecular forward and backward
rate constants, which are well defined.
The k' you suggest is an apparent unimolecular rate constant, which appears
to be more suited for short times.



 Extracting these rate constants is a bit tricky (I usually do it by hand),
 but I guess gromacs has a scheme to do it... I haven't actually looked at it
 (though I really should!).  I'd recommend some caution though, a scheme that
 works well for HB's between water molecules in bulk may need to be adjusted
 to properly model HB's between water and polar atoms.


I have to disagree again. The A + B <-> C problem has an analytical solution.
Technically, one only needs to know how to calculate an error function and
to solve a cubic equation; please see eqs. 9, 10 in JCP 129.
The geminate problem is robust in the sense that it describes C(t) of ANY 2
particles, as long as their behavior is controlled by diffusion; it
describes the water pair, but should also describe, for example, liquid
argon. For the second case, of course, different rate constants are expected.

One should NOT see JCP 129 as a proof that previous works were absolutely
wrong!
Instead, it shows that the postulate by Luzar and Chandler, that C(t) of water
is controlled by diffusion, is right, and that with the analytical solution
of the geminate problem one can understand some aspects of the water dimer.
For example: what causes the activation energies of the forward and backward
rate constants to be about similar, rather than differing by the strength
of one HB?

Hope I was clear.
Omer Markovitch.

** a link to JCP 129, 84505 (2008) http://dx.doi.org/10.1063/1.2968608
** supporting information includes a short trajectory movie

Re: [gmx-users] Hydrogen Bond Lifetime

2008-10-02 Thread Omer Markovitch
Please see also my reply under HB lifetime to Christopher Daub.
I think that JCP 129, 84505 (2008) should be read when dealing with C(t) in
general and HB lifetimes in particular.
** link: http://dx.doi.org/10.1063/1.2968608

Omer Markovitch.

Re: [gmx-users] Error in T-coupling of 2 atoms in T-couplingGroups

2008-10-02 Thread Justin A. Lemkul



Chitrita Duttaroy wrote:

Hi,
 
I am a beginner with Gromacs 3.3.3.

I did a position-restrained grompp with -f pr.mdp, -r .gro (a water box
with charge -2e), -c .gro and -p .top (produced by genbox). The output
is a .tpr file.
Then, to neutralize the solution with Na+ ions, I used genion with -s .tpr
-o .gro -pname NA+ -np 2 -g .log.
Then, in the [molecules] section of the .top file, I reduced the number of
SOL molecules by 2 and added 2 NA+, to match the coordinates in the .gro
file. I also deleted the .tpr file. Up to this step everything was OK.
In the next step, position-restrained grompp was run again using the .gro
file containing the 2 Na+ ions, but it gave a fatal error:

Fatal error:
2 atoms are not part of any of the T-coupling groups.

So the .tpr file is not generated. I do not understand what to do now.
Please help as early as possible.


I'm guessing those two atoms are the 2 NA+ (most likely).  Every element of your
system should be included in a T-coupling group.  Your .mdp file probably has
not assigned the NA+ anywhere; merge it with the solvent in a special group,
SOL_NA+ (one way to do this with make_ndx is sketched below).
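A sketch of the merge with make_ndx (file names are illustrative; make_ndx derives the merged group's name automatically):

make_ndx -f ionized.gro -o index.ndx
> "SOL" | "NA+"
> q

Then pass the index file to grompp with -n index.ndx so that tc_grps can refer to the merged group.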


-Justin

 
Thanking you,
 
CHITRITA DUTTA ROY.







--


Justin A. Lemkul
Graduate Research Assistant
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin




Re: [gmx-users] error NH2 in C terminal

2008-10-02 Thread Justin A. Lemkul



shahrbanoo karbalaee wrote:

Dear Justin
Thank you for your advice.
There is an NH2 at the C-terminus of the peptide sequence, but when I run
pdb2gmx I get an error. I deleted the NH2 by hand and ran pdb2gmx -f
name.pdb -ter, choosing none for the C-terminus. When I look at the top
file, I see an NH2 added at the C-terminus. Is this correct?


I don't understand how deleting NH2 is causing it to show up in the topology.


Another way: I used -missing in the pdb2gmx command. I get a top file
similar to the first top (the one after deleting NH2). Please advise me:
what is the difference between the top files from the two ways, and which
one is better to choose?


Never use a topology that has missing atoms.

What might be better is if you could paste the end of your .pdb file (showing
the last residue with NH2), the corresponding pdb2gmx command lines, and then
any modifications you have tried.  Right now I do not understand what you are
doing.


-Justin



thanks again


--


Justin A. Lemkul
Graduate Research Assistant
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin




[gmx-users] Re:Error in T-coupling of 2 atoms in T-couplingGroups

2008-10-02 Thread sudheer babu

Hi Chitrita,

Maybe you haven't generated an index file for the solvent and the newly
added ions; use make_ndx.

All the best,

Sudheer.


[gmx-users] Uniform neutralizing plasma for Particle Mesh Ewald (PME) instead of counterions

2008-10-02 Thread Himanshu Khandelia
Is there an implementation in gromacs for using a uniform neutralizing 
plasma with Particle Mesh Ewald (PME), to avoid use of counterions?

Thank you

-Himanshu



Re: [gmx-users] Improving scaling - Gromacs 4.0 RC2

2008-10-02 Thread Justin A. Lemkul



Carsten Kutzner wrote:

Hi Justin,

I have written a small gmx tool that tries various PME/PP balances
systematically for a given number of nodes, and afterwards gives a
suggestion of what the fastest combination is. Although I plan to extend
it with more functionality, it's already working, and I can send it to
you if you'd like to try it.



I would like to try it; I think that it would help me get an empirical feel for 
things.



The performance loss due to load imbalance has two causes:
1. imbalance in the force calculation, which can be levelled out by
using -dlb yes or -dlb auto, as David suggested


Indeed, using -dlb yes improves the % imbalance, but at the cost of output.  If 
we take my 64-core case, applying -dlb yes reduces my speed to around 5-6 
ns/day (depending on the # of PME nodes - 16, 24, 32).


It just seems peculiar to me that I can get great speed in terms of ns/day using 
64/32 PP/PME, but performance is hampered due to imbalance.


2. imbalance between the short-range and long-range force calculation,
which can be levelled by choosing the optimal PME/PP ratio. This is what
the script should do for you.

You might also want to check the md.log file for more detailed
information about where your imbalance is coming from. My guess is that
with 32 PME nodes and interleaved communication the PP (short-range)
nodes have to wait on the PME (long-range) nodes, while with 16 PME
nodes it is the other way around.
That you see less imbalance with pp_pme or cartesian communication
probably only means that the PME communication is slower in this case -
the 'smaller' performance loss from imbalance is a bit misleading here.



Ah, that makes a bit more sense.  Thanks :)

-Justin





Re: [gmx-users] Improving scaling - Gromacs 4.0 RC2

2008-10-02 Thread Justin A. Lemkul



Berk Hess wrote:

Hi,

Looking at your 64-core results, it seems that your PP:PME load ratio is
about 1:1. In most cases 3:1 gives much better performance.
grompp probably also printed a note about this, and about how to fix it.


From the .mdp file I posted before, grompp gave the following:

Calculating fourier grid dimensions for X Y Z
Using a fourier grid of 56x60x50, spacing 0.238 0.230 0.240
Estimate for the relative computational load of the PME mesh part: 0.34

Should it have advised me about anything else?  It seems that the PME load is
reasonable, given what I understand about the matter.  I suppose that estimate
(roughly a third of the load on PME) does indicate the PP/PME ratio I should
be using.


I have also described this briefly in the parallelization section of the
pdf manual.

You should probably increase your cut-offs and pme grid spacing by the
same factor (something around 1.2).


Which cut-offs, rlist/rcoulomb?  I thought these were force field-dependent. 
Please correct me if I'm wrong.



Hopefully mdrun will choose the proper number of pme nodes for you
when you do not use -npme.


I have never gotten mdrun to cooperate without specifically defining -npme; 
maybe it's just me or something that I'm doing.  For example, output from two 
runs I tried (using 64 cores):


1. With fourierspacing = 0.12 (only difference from the posted .mdp file)

---
Program mdrun_4.0_rc2_mpi, VERSION 4.0_rc2
Source code file: domdec_setup.c, line: 132

Fatal error:
Could not find an appropriate number of separate PME nodes, i.e. >=
0.563840*#nodes (34) and <= #nodes/2 (32), and reasonable performance wise
(grid_x=112, grid_y=117).
Use the -npme option of mdrun or change the number of processors or the PME grid 
dimensions, see the manual for details.

---


2. With fourierspacing = 0.24 (the posted .mdp file)

---
Program mdrun_4.0_rc2_mpi, VERSION 4.0_rc2
Source code file: domdec_setup.c, line: 132

Fatal error:
Could not find an appropriate number of separate PME nodes, i.e. >=
0.397050*#nodes (24) and <= #nodes/2 (32), and reasonable performance wise
(grid_x=56, grid_y=60).
Use the -npme option of mdrun or change the number of processors or the PME grid 
dimensions, see the manual for details.

---


Thanks.

-Justin




[gmx-users] X2top

2008-10-02 Thread Morteza Khabiri
Dear gmxusers

I want to make an itp file with the x2top command, but unfortunately it
does not work. I receive the following error:

Fatal error:
Library file ffoplsaa.n2t not found in current dir nor in default
directories.
(You can set the directories to search with the GMXLIB path variable)

I checked the root directory and also the files that this program could
not find, but I could not find the problem.

What could the problem be?

thanks




Re: [gmx-users] x2top

2008-10-02 Thread David van der Spoel

Morteza Khabiri wrote:

Dear Dr,van der Spoel

I am using version 3.3.2. It is fully installed.
I used echo $GMXDATA but it did not show anything.

Then you should run:
source /path/to/gromacs/bin/GMXRC
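For example (the install path here is illustrative):

source /usr/local/gromacs/bin/GMXRC
echo $GMXDATA    # should now print the GROMACS data directory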


Thanks for your help

Morteza




--
David van der Spoel, Ph.D., Professor of Biology
Molec. Biophys. group, Dept. of Cell & Molec. Biol., Uppsala University.
Box 596, 75124 Uppsala, Sweden. Phone:  +46184714205. Fax: +4618511755.
[EMAIL PROTECTED]   [EMAIL PROTECTED]   http://folding.bmc.uu.se


Re: [gmx-users] HB lifetime

2008-10-02 Thread David van der Spoel

Christopher Daub wrote:

Hi Omer,

We are aware of your work with Dr. Agmon, and I believe Dr. Luzar has 
spoken with him about it.  I don't understand it enough to say much, but 
I don't think we have substantive disagreements with it.  Of course, the 
questioner was asking about the implementation of the Luzar model in 
Gromacs, so I tried to explain some of the background of her ideas. 
 Perhaps they'll implement your HB model in Gromacs 5...



I would encourage anyone to contribute an implementation of this algorithm
to the current g_hbond code. Please get in touch with me off-list if
you are interested.




Cheers,
Chris.









--
David.

David van der Spoel, PhD, Professor of Biology
Dept. of Cell and Molecular Biology, Uppsala University.
Husargatan 3, Box 596,  75124 Uppsala, Sweden
phone:  46 18 471 4205  fax: 46 18 511 755
[EMAIL PROTECTED]   [EMAIL PROTECTED]   http://folding.bmc.uu.se

[gmx-users] posres problem in 4.0_rc2. Bug in grompp?

2008-10-02 Thread Jochen Hub
Hi,

I have a system in which I apply a posres on one atom. During em or md I
get an error that I never had before:

---
Program mdrun, VERSION 4.0_rc2
Source code file: mtop_util.c, line: 642

Software inconsistency error:
Position restraint coordinates are missing
---

Grompp runs without errors. When starting mdrun, I get the error.

The error is most likely in grompp: when running an older CVS grompp
(from last July), mdrun (4.0) works fine. When using the 4.0 grompp with
the 4.0 mdrun, the error occurs.

Or am I missing something?

cheers,
Jochen








-- 

Dr. Jochen Hub
Max Planck Institute for Biophysical Chemistry
Computational biomolecular dynamics group
Am Fassberg 11
D-37077 Goettingen, Germany
Email: jhub[at]gwdg.de
Tel.: +49 (0)551 201-2312



[gmx-users] RE: X2top

2008-10-02 Thread Vitaly Chaban
 I checked the root directory and also the files that this program could
 not find, but I could not find the problem.

 What could the problem be?

What does it mean that you checked root? Is this file present in the
gromacs topology folder? Are the access rights OK?

I think the problem is not in gromacs but in your particular system.



-- 
Vitaly V. Chaban
School of Chemistry
National University of Kharkiv
Svoboda sq.,4, Kharkiv 61077, Ukraine
email: [EMAIL PROTECTED]
skype: vvchaban
tel.: +38-097-8259698



Re: [gmx-users] Error in T-coupling of 2 atoms in T-couplingGroups

2008-10-02 Thread Chitrita Duttaroy
Thank you, Justin and Sudheer.
I had already worked out a solution to my problem, but didn't know whether
it was right. To confirm it, I want to discuss it with you.

After running grompp to add Na+ to the water box, I ran make_ndx with the
.gro file. Then, in the index file, I merged the index numbers of the 2
[NA+] into the [SOL] group and deleted the [NA+] group. After that I could
do the position-restrained grompp with the file containing the 2 Na+ ions.

I want to know whether this is a correct method or not.

Thanking you in advance,

with regards

CHITRITA.

[gmx-users] Heat Flux and Lambda

2008-10-02 Thread Andy Shelley
I have been reading through the mailing list and have read some discussion
that heat flux can be indicated by the value of lambda. Can I calculate the
amount of heat flux from the lambda value?

Thanks,
Andy Shelley

[gmx-users] off topic: graduate school opportunity

2008-10-02 Thread David Mobley
For those interested in graduate school doing molecular dynamics and
free energy simulations, please read on.

To the rest, sorry that this is somewhat off topic, but I thought it
might be of interest to some because of my past involvement on the
mailing list.

Anyway, I just wanted to mention that I recently started my own
research group in the Chemistry Department at the University of New
Orleans. My group will be continuing various free energy studies,
including working on methods for more accurate prediction of binding
free energies, small molecule solvation and solubility, etc.

If you are interested in a graduate program in chemistry, or you know
someone who might be, and this work sounds interesting, I urge you to
contact me at this e-mail address. I'm looking for some graduate
students to join my group here. We are accepting applications NOW for
the spring semester, and through March or so for Fall 2009.

You can get some information on the department at
http://www.chem.uno.edu and on admissions specifically at
http://www.chem.uno.edu/ChemistryDepartmentfolder/Information.html,
and if you need more info, contact me or the selections committee
([EMAIL PROTECTED]). While I do not currently have RA support for
students, I anticipate having some in the near future, and in the
meantime anyone joining my group would be supported by a TA position.

Thanks for your time.

David Mobley, Ph.D.
Assistant Professor of Chemistry
University of New Orleans
New Orleans, LA 70148
Office 504-280-6445
Fax 504-280-6860


Re: [gmx-users] charge changes in free energy calculations

2008-10-02 Thread David Mobley
Hi,

As long as you end up with the same charge in the initial and final
states,  you should be OK. It's only if your total transformation
involves a change of net charge that you need to worry. So you should
be fine if you turn off the charges on D, change the LJ interactions,
and turn back on the charges on E.

If I were you, I probably would not turn off the charges entirely --
after all, D and E share many atoms. I would only turn off charges on
the atoms that you are going to modify.

David


On Fri, Sep 26, 2008 at 2:14 AM, friendli [EMAIL PROTECTED] wrote:
 Dear all,

 I have a mutation free energy calculation from D (asp) to E (glu). The
 charge is not changed by the overall mutation. However, following Dr. David
 Mobley's suggestion, the electrostatic and vdW interactions should be
 modified separately, so in the first step we need to turn off the charge,
 going from D(-1) to D(0).

 I learned from the mailing list that it is problematic to do FE calculations
 with different charges for the initial and final states.
 So is it safe to turn off the charge for D and turn on the same charge for
 E in this case?
 If not, is it OK to perform the mutation FE calculation in one step?
 Or is there no safe way to handle this kind of mutation, i.e. mutating
 charged groups, currently without a special correction?

 Thanks a lot

 Qiang