[gmx-users] Query regarding Preferential Interaction Coefficient Calculation

2019-01-15 Thread ISHRAT JAHAN
Dear all,
I have done an MD simulation of a protein in osmolytes, and now I want to
calculate the preferential interaction coefficient of the osmolytes at the
protein surface. Will anyone please guide me through the proper steps of
the calculation?
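
For context, a commonly used two-domain estimator of the preferential
interaction coefficient, with water as component 1 and the osmolyte as
component 3, is

\[
\Gamma_{23}(r) = \left\langle\, n_3(r) - n_1(r)\,
\frac{n_3^{\mathrm{tot}} - n_3(r)}{n_1^{\mathrm{tot}} - n_1(r)} \,\right\rangle ,
\]

where n1(r) and n3(r) are the numbers of water and osmolyte molecules
within a distance r of the protein, averaged over frames. A minimal
counting sketch, assuming the osmolyte residue name is OSM and a 0.5 nm
cutoff (both are placeholders to adapt):

gmx select -s md.tpr -f md.xtc -os n_osm.xvg \
    -select 'resname OSM and same residue as within 0.5 of group "Protein"'
gmx select -s md.tpr -f md.xtc -os n_sol.xvg \
    -select 'resname SOL and same residue as within 0.5 of group "Protein"'

Here -os writes the number of selected atoms per frame, so divide by the
atoms per molecule to get molecule counts; scanning r until Gamma plateaus
is the usual consistency check.
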
Thanks and regards
-- 
Ishrat Jahan
Research Scholar
Department Of Chemistry
A.M.U Aligarh
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.


[gmx-users] query regarding state.cpt file

2019-01-15 Thread sudha bhagwati
 MD_output.tar.gz

Greetings Sir/Madam

I am running an MD simulation using the commands listed below:
___
gmx pdb2gmx -ignh -f protein.pdb
gmx editconf -f ligand.pdb -o ligand.gro

gmx editconf -f conf.gro -o newbox.gro -bt dodecahedron -d 1.0
gmx solvate -cp newbox.gro -cs spc216.gro -p topol.top -o solv.gro

gmx grompp -f em.mdp -c solv.gro -p topol.top -o ions.tpr
gmx genion -s ions.tpr -o solv_ions.gro -p topol.top -pname NA -nname CL
-neutral

gmx grompp -f em_real.mdp -c solv_ions.gro -p topol.top -o em.tpr
gmx mdrun -v -deffnm em

gmx make_ndx -f em.gro -o index.ndx

gmx grompp -f nvt.mdp -c em.gro -p topol.top -n index.ndx -o nvt.tpr

gmx mdrun -v -deffnm nvt

gmx grompp -f npt.mdp -c nvt.gro -t nvt.cpt -p topol.top -n index.ndx -o
npt.tpr

gmx mdrun -v -deffnm npt

gmx grompp -f md.mdp -c npt.gro -t npt.cpt -p topol.top -n index.ndx -o
md_0_20.tpr

gmx mdrun -v -deffnm md_0_20
_

I have attached the tar file of the output files generated from the
complete GROMACS run. I am not an expert GROMACS user, so which file can I
use as state.cpt, and what is the command to resume my stopped MD run?
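
For reference: GROMACS names the checkpoint after the run prefix, so a run
started with -deffnm md_0_20 writes md_0_20.cpt (state.cpt is only the
default name when -deffnm is not used). A minimal restart sketch under that
assumption:

gmx mdrun -v -deffnm md_0_20 -cpi md_0_20.cpt -append

With -append the continuation is written into the existing md_0_20 output
files.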

Thank you!



-- 
Thanks & regards
~
Sudha Bhagwati


Re: [gmx-users] regarding editconf

2019-01-15 Thread Omkar Singh
My advice is not to use many options in one shot. In place of -center, try
-translate together with -rotate.

Best regards

On Tue, Jan 15, 2019 at 8:52 PM Ali Khodayari <
ali.khoday...@student.kuleuven.be> wrote:

> Thank you, Justin. I will try to search more for the reason. My best, Ali
>
> -Original Message-
> From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se
>  On Behalf Of Justin
> Lemkul
> Sent: dinsdag 15 januari 2019 16:10
> To: gmx-us...@gromacs.org
> Subject: Re: [gmx-users] regarding editconf
>
>
>
> On 1/15/19 9:34 AM, Ali Khodayari wrote:
> > Thank you for your response Justin!
> > I don't really see how it can cause an error, while it might be just a
> > visualization defect in VMD.
> > Previously, I tried to perform each step separately, like doing a
> > centering by -center 0 0 0, but even performing this step leads to the
> error "step 0:
> > Water molecule starting at atom 40225 can not be settled." The only
> > difference would be a change in the coordinates, no? It is not due to
> > the fact that I am using 5.1.2 version, is it?
>
> Systems can become unstable for any number of reasons, but not simply
> because some part of a protein appears to be protruding into space.
> That's just normal PBC.
>
> 5.1.2 is very old but I don't know of any specific reason why it should
> cause a problem.
>
> -Justin
>
> --
> ==
>
> Justin A. Lemkul, Ph.D.
> Assistant Professor
> Office: 301 Fralin Hall
> Lab: 303 Engel Hall
>
> Virginia Tech Department of Biochemistry
> 340 West Campus Dr.
> Blacksburg, VA 24061
>
> jalem...@vt.edu | (540) 231-3129
> http://www.thelemkullab.com
>
> ==
>


[gmx-users] Regarding self-assembly of Peptides

2019-01-15 Thread Omkar Singh
Hi,
I have a capped peptide and I want to study its self-assembly in GROMACS.
May I know how to set up such a self-assembly simulation?

Any suggestions?
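
One common starting point (a sketch, not the only protocol) is to scatter
many copies of the peptide in a box with gmx insert-molecules and then
solvate and simulate as usual; peptide.pdb, the copy number, and the box
size here are placeholders:

gmx insert-molecules -ci peptide.pdb -nmol 27 -box 8 8 8 -o multi.gro
gmx solvate -cp multi.gro -cs spc216.gro -p topol.top -o solv.gro

Whether and how the peptides assemble then depends on concentration, force
field, and simulation length.
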
Thank you.


[gmx-users] GROMACS Infrastructure

2019-01-15 Thread Nam Pho
Hello GROMACS Users,

My name is Nam and I support a campus supercomputer for which one of the
major applications is GROMACS. I was curious whether anyone has optimized
servers for cost and has a blueprint for that: which servers and
configurations are ideal for cost and performance?

--

Nam Pho

Director for Research Computing
University of Washington


[gmx-users] Calculating distance between solute and solid surface

2019-01-15 Thread Ahmed Mohammed
Hello GMX users,

I'm simulating a polymer in water on a solid surface. I'm trying to
calculate the dynamic distance between the polymer monomers and the surface
(the top layer of the surface). My reference is the surface, and I used
this command:

gmx distance -f NVT.xtc -s NVT.tpr -selrpos atom -seltype mol_com -oav
dist.xvg -tu ns

The problem is, this does not give reasonable results in comparison with
the trajectory visualization.

Has anyone done the same thing before who could help me with this?
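
For what it's worth, gmx distance usually needs the pair of positions
spelled out explicitly with -select. A sketch, assuming index groups named
"Surface_top" and "Polymer" exist in index.ndx (hypothetical names):

gmx distance -f NVT.xtc -s NVT.tpr -n index.ndx -tu ns -oav dist.xvg \
    -select 'com of group "Surface_top" plus com of group "Polymer"'

The plus keyword merges the two center-of-mass positions into one selection
whose consecutive pairs are measured. Note also that distances are computed
under periodic boundary conditions, a common source of disagreement with
what an unwrapped trajectory looks like in VMD.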

Thank you

Ahmed


[gmx-users] Use all-atom PDB or coarse-grained PDB as the restraints for grompp a coarse-grained gro?

2019-01-15 Thread ZHANG Cheng
In Gromacs 2018, -r is used to provide the restraint file for grompp.


I have a grompp command used for a coarse-grained (CG) gro file, i.e. CG.gro:


gmx grompp -f parameter.mdp -r AllAtom.pdb/CG.pdb -c CG.gro -p system.top -o 
MD.tpr


So in the command above, should I use AllAtom.pdb or CG.pdb as the file for 
"-r"?


I tried both, and both work without errors.


But which one is more logically correct?


I think CG.pdb should definitely work, but why is AllAtom.pdb still okay?


Re: [gmx-users] core dumped

2019-01-15 Thread rabee khorram
Hello,

Thank you, Justin, for your answer.

I have mailed the fe2o3.pdb file to you in this Gmail.
In atomname2type.n2t, for the Fe-O bonds, I have used the Fe type from the
atomtype.atp of the CHARMM36 force field:

   (FE  55.84700 ; heme iron 56)

Is it correct?
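
For reference, gmx x2top reads .n2t lines of the form "element type charge
mass nbonds bonded-element bond-length ...". A hedged illustration for a
six-coordinate Fe in an oxide; the charge, coordination number, and 0.20 nm
bond length are placeholders, not validated CHARMM parameters:

; elem type charge  mass    nbonds  bonded elements and distances (nm)
Fe     FE   0.000   55.847  6       O 0.20 O 0.20 O 0.20 O 0.20 O 0.20 O 0.20

Borrowing the heme-iron type from CHARMM36 does not by itself make the
Fe2O3 parameters validated, which is the point of the question quoted below.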



On Tue, Jan 15, 2019 at 5:30 PM Justin Lemkul  wrote:

>
>
> On 1/15/19 6:47 AM, rabee khorram wrote:
> > Segmentation fault (core dumped)?
> > Hello everyone, I am running a nano Fe2O3 structure with the PEN drug
> > with gromacs5.
>
> How did you parametrize and validate these species?
>
> > But in this step:
> >
> > gmx mdrun -v -deffnm nvt
> >
> > I am getting an error: step 2600, remaining wall clock time: 60 s
> > Segmentation fault (core dumped)
> >
> > and nvt.gro is not created!
> >
> > Can you explain to me what my problem is?
> > Thank you very much.
> http://manual.gromacs.org/current/user-guide/terminology.html#blowing-up
>
> -Justin
>
> --
> ==
>
> Justin A. Lemkul, Ph.D.
> Assistant Professor
> Office: 301 Fralin Hall
> Lab: 303 Engel Hall
>
> Virginia Tech Department of Biochemistry
> 340 West Campus Dr.
> Blacksburg, VA 24061
>
> jalem...@vt.edu | (540) 231-3129
> http://www.thelemkullab.com
>
> ==
>

Re: [gmx-users] Single precision enough for MD of peptide?

2019-01-15 Thread Justin Lemkul




On 1/15/19 10:39 AM, Neena Susan Eappen wrote:

Hi Justin,


Thank you. What is the solution for this?


Solution to what?

-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



Re: [gmx-users] Single precision enough for MD of peptide?

2019-01-15 Thread Neena Susan Eappen
Hi Justin,


Thank you. What is the solution for this?


Neena


From: Neena Susan Eappen
Sent: Saturday, January 12, 2019 3:15:11 PM
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: [gmx-users] Single precision enough for MD of peptide?


Hi GMX users,


I just installed GROMACS 2018.4 on my Windows PC without GPU support. My PC
is capable of only single precision with 64-bit memory; would this affect
the trajectory of my peptide?


Many thanks,

Neena


Re: [gmx-users] regarding editconf

2019-01-15 Thread Ali Khodayari
Thank you, Justin. I will try to search more for the reason. My best, Ali

-Original Message-
From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se
 On Behalf Of Justin
Lemkul
Sent: dinsdag 15 januari 2019 16:10
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] regarding editconf



On 1/15/19 9:34 AM, Ali Khodayari wrote:
> Thank you for your response Justin!
> I don't really see how it can cause an error, while it might be just a 
> visualization defect in VMD.
> Previously, I tried to perform each step separately, like doing a 
> centering by -center 0 0 0, but even performing this step leads to the
error "step 0:
> Water molecule starting at atom 40225 can not be settled." The only 
> difference would be a change in the coordinates, no? It is not due to 
> the fact that I am using 5.1.2 version, is it?

Systems can become unstable for any number of reasons, but not simply
because some part of a protein appears to be protruding into space. 
That's just normal PBC.

5.1.2 is very old but I don't know of any specific reason why it should
cause a problem.

-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==




Re: [gmx-users] regarding editconf

2019-01-15 Thread Justin Lemkul




On 1/15/19 9:34 AM, Ali Khodayari wrote:

Thank you for your response Justin!
I don't really see how it can cause an error, while it might be just a
visualization defect in VMD.
Previously, I tried to perform each step separately, like doing a centering
by -center 0 0 0, but even performing this step leads to the error "step 0:
Water molecule starting at atom 40225 can not be settled." The only
difference would be a change in the coordinates, no? It is not due to the
fact that I am using 5.1.2 version, is it?


Systems can become unstable for any number of reasons, but not simply 
because some part of a protein appears to be protruding into space. 
That's just normal PBC.


5.1.2 is very old but I don't know of any specific reason why it should 
cause a problem.


-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



Re: [gmx-users] regarding editconf

2019-01-15 Thread Ali Khodayari
Thank you for your response Justin!
I don't really see how it can cause an error, while it might be just a
visualization defect in VMD. 
Previously, I tried to perform each step separately, like doing a centering
by -center 0 0 0, but even performing this step leads to the error "step 0:
Water molecule starting at atom 40225 can not be settled." The only
difference would be a change in the coordinates, no? It is not due to the
fact that I am using 5.1.2 version, is it?
Best regards,
Ali


-Original Message-
From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se
 On Behalf Of Justin
Lemkul
Sent: dinsdag 15 januari 2019 15:01
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] regarding editconf



On 1/15/19 7:46 AM, Ali Khodayari wrote:
> Dear users,
>
>   
>
> I am trying to reorient and re-center my solute using editconf.
> By doing so I can place the structure at the desired position, yet it
> is outside the box when I visualize it in VMD. It also leads to an error
> after mdrun, complaining about a water molecule not being settled.
>
> Here is the command I use:
>
> $ gmx editconf -f cell.gro -o cell_newbox.gro -princ -bt triclinic -c 
> -center 0 0 0 -rotate 0 350 0 -n ndx.ndx
>
> Do you see where the problem is?

First, there is no such thing as "outside" an infinite box, so that should
not be the cause of any actual problem.

Regarding editconf, you're doing too many things at once, which may be
mutually exclusive, so you may not be getting what you want. You're telling
editconf to align the longest axis along x (with -princ) but also to rotate
about y (-rotate 0 350 0). Don't do both of those at once.

-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==




Re: [gmx-users] regarding editconf

2019-01-15 Thread Justin Lemkul




On 1/15/19 7:46 AM, Ali Khodayari wrote:

Dear users,

  


I am trying to reorient and re-center my solute using editconf. By doing so
I can place the structure at the desired position, yet it is outside the
box when I visualize it in VMD. It also leads to an error after mdrun,
complaining about a water molecule not being settled.

Here is the command I use:

$ gmx editconf -f cell.gro -o cell_newbox.gro -princ -bt triclinic -c
-center 0 0 0 -rotate 0 350 0 -n ndx.ndx

Do you see where the problem is?


First, there is no such thing as "outside" an infinite box, so that 
should not be the cause of any actual problem.


Regarding editconf, you're doing too many things at once, which may be 
mutually exclusive, so you may not be getting what you want. You're 
telling editconf to align the longest axis along x (with -princ) but 
also to rotate about y (-rotate 0 350 0). Don't do both of those at once.
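
In practice that means splitting the call into separate invocations, e.g. a
two-step sketch with the same file names, so each result can be checked
before the next step:

gmx editconf -f cell.gro -o cell_rot.gro -rotate 0 350 0 -n ndx.ndx
gmx editconf -f cell_rot.gro -o cell_newbox.gro -bt triclinic -c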


-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



Re: [gmx-users] core dumped

2019-01-15 Thread Justin Lemkul




On 1/15/19 6:47 AM, rabee khorram wrote:

Segmentation fault (core dumped)?
Hello everyone, I am running a nano Fe2O3 structure with the PEN drug with
gromacs5.


How did you parametrize and validate these species?


But in this step:

gmx mdrun -v -deffnm nvt

I am getting an error: step 2600, remaining wall clock time: 60 s
Segmentation fault (core dumped)

and nvt.gro is not created!

Can you explain to me what my problem is?
Thank you very much.

http://manual.gromacs.org/current/user-guide/terminology.html#blowing-up

-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



Re: [gmx-users] Conversion CHARMM FF to Gromos54a7 FF

2019-01-15 Thread Justin Lemkul




On 1/14/19 11:28 AM, Salman Zarrini wrote:

Dear all,

I have CHARMM all-atom force field parameters for a solid with a limited
number of atom types, and I want to convert them to GROMOS54a7 form. Would
you please confirm that below are the changes I should make to do so?

1. Combination rule 2 is used in CHARMM whereas GROMOS uses rule 1, so in
the [ atomtypes ] section I should convert the \sigma and \epsilon defined
in CHARMM to the V (c6) and W (c12) used in GROMOS, where V(c6) =
4*\epsilon*(\sigma)**6 and W(c12) = 4*\epsilon*(\sigma)**12? (pages
123-124, GROMACS 2018 manual)

2. In the [ bonds ] section, the bond function should be 2 (in GROMOS) vs 1
in CHARMM. (Table 5.5, page 139, GROMACS 2018 manual)

3. In the [ angles ] section, the angle function should be 2 (in GROMOS) vs
5 in CHARMM. (Table 5.5, page 139, GROMACS 2018 manual)
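
For reference, item 1 is the standard Lennard-Jones identity (which, as the
reply below stresses, says nothing about whether the converted numbers are
physically meaningful in the other force field):

\[
V_{\mathrm{LJ}}(r) = 4\epsilon\left[\left(\frac{\sigma}{r}\right)^{12}
- \left(\frac{\sigma}{r}\right)^{6}\right]
= \frac{C_{12}}{r^{12}} - \frac{C_{6}}{r^{6}},
\qquad C_{6} = 4\epsilon\sigma^{6}, \quad C_{12} = 4\epsilon\sigma^{12} .
\]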


You cannot convert one force field to another. A force field is a 
self-consistent, balanced entity. Even if you could (mathematically 
speaking) turn one form of parameters into another, the results would be 
physically meaningless.


-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



Re: [gmx-users] How To Calculate The Energy From The Coordinates Of Pdb File?

2019-01-15 Thread Justin Lemkul




On 1/14/19 3:10 AM, Mehdi Mirzaie wrote:

Dear Gromacs User,

I would appreciate it if you could guide me on how to extract pairwise
energies (between residues) from the coordinates of a PDB file. Since the
PDB files are already minimized, there is obviously no need to run an
energy minimization step.


http://manual.gromacs.org/current/how-to/special.html#single-point-energy
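
That page boils down to a zero-step rerun; a minimal sketch, assuming a
topology topol.top that matches the PDB (all file names are placeholders):

gmx grompp -f sp.mdp -c protein.pdb -p topol.top -o sp.tpr
gmx mdrun -s sp.tpr -rerun protein.pdb -e sp.edr -g sp.log
gmx energy -f sp.edr -o sp.xvg

where sp.mdp sets nsteps = 0. Per-residue pairwise terms additionally
require energygrps in the .mdp, with index groups defined for the residues
of interest.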

-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



Re: [gmx-users] Single precision enough for MD of peptide?

2019-01-15 Thread Justin Lemkul




On 1/12/19 10:15 AM, Neena Susan Eappen wrote:

Hi GMX users,


I just installed GROMACS 2018.4 on my Windows PC without GPU support. My PC
is capable of only single precision with 64-bit memory; would this affect
the trajectory of my peptide?



Nearly all conventional MD simulations are done with mixed precision 
(there is no pure "single" precision in GROMACS).


-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



Re: [gmx-users] Energy cal from MD simulations

2019-01-15 Thread Justin Lemkul




On 1/12/19 1:10 AM, Lod King wrote:

Hi,

I have obtained a 100 ns simulation using Amber. I would like to calculate
the potential energy (VDW, COUL,ect) from the trajectory, below is my
command.

$gmx grompp -f test.mdp -p ../gromacs.top -n ../index.ndx -c ../300.pdb -o
rerun.tpr -maxwarn 1


It is always dangerous (and usually wrong) to ever use -maxwarn.


$gmx mdrun -rerun ../protein.trr -s rerun.tpr -g rerun.log -e rerun.edr

$gmx energy -f rerun.edr -o test.xvg

My question is: in this test.mdp file, should I specify parameters matching
those I used when running the MD in Amber?


If you're looking to obtain an equivalent output, yes.
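
A minimal rerun .mdp sketch; every value here is an assumption that must be
matched to the original Amber settings:

; rerun.mdp -- energy re-evaluation only
integrator     = md
nsteps         = 0
cutoff-scheme  = Verlet
coulombtype    = PME
rcoulomb       = 1.0    ; placeholder; use the Amber direct-space cutoff
rvdw           = 1.0    ; placeholder; use the Amber vdW cutoff

Mismatched cutoffs or PME settings show up directly as shifted Coul. and LJ
terms in the rerun energies.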

-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



Re: [gmx-users] gmx 2019 performance issues

2019-01-15 Thread Tamas Hegedus

Thanks for the inputs!
* I will go with the cheaper CPU
* I am looking forward to the gpu-only gromacs; impatiently


On 01/15/2019 01:55 PM, Mark Abraham wrote:

Hi,

On Tue, Jan 15, 2019 at 1:30 PM Tamas Hegedus  wrote:


Hi,

I do not really see increased performance with gmx 2019 using -bonded
gpu. I do not see what I am missing or misunderstanding.



Unfortunately that is expected in some cases, see
http://manual.gromacs.org/documentation/current/user-guide/mdrun-performance.html#gpu-accelerated-calculation-of-bonded-interactions-cuda-only.
Much of the gain is that it becomes feasible to spend less on the CPU, so
the gains are in performance per $ rather than in raw performance.

The only thing I see is that all CPU cores run at ~100% with gmx 2018,
while some of the cores run at only ~60% with gmx 2019.



QED, probably :-)



There are: 196382 Atoms
Speeds come from 500 ps runs.

  From one of the log files:
Mapping of GPU IDs to the 4 GPU tasks in the 4 ranks on this node:
PP:0,PP:0,PP:2,PME:2
PP tasks will do (non-perturbed) short-ranged and most bonded
interactions on the GPU
PME tasks will do all aspects on the GPU

--
16 cores 4 GPUs
gmx 2018 48ns/day
gmx 2019 54ns/day

gmx mdrun -nt 16 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -pme gpu
-npme 1 -gputasks 0123

gmx mdrun -nt 16 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -bonded gpu
-pme gpu -npme 1 -gputasks 0123

Since the GPUs are not utilized well (some of them are below 50%), my
objective is to run 2 jobs/node with 8 CPUs and 2 GPUs at higher usage.

--
8 cores 2 GPUs
gmx 2018 33 ns/day
gmx 2019 35 ns/day

gmx mdrun -nt 8 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -pme gpu
-npme 1 -gputasks 0033

gmx mdrun -nt 8 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -bonded gpu
-pme gpu -npme 1 -gputasks 0022

gmx mdrun -ntomp 2 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -bonded
gpu -pme gpu -npme 1 -gputasks 0022
Changing -nt to -ntomp did not help to increase performance.

And the GPUs are not utilized much better. 1080Ti runs max 60-75%



Single simulations are unlikely to get much higher utilization, except
perhaps paired with high-clock CPUs. Multi-simulations are still the way to
make optimal use of your resources, if throughput-style runs are
appropriate for the science.



--
The main question:
* I use a 16-core AMD 2950X with 4 high-end GPUs (1080Ti, 2080Ti).
* The GPUs do not run at 100%, so I would like to load more onto them and
possibly run 2 gmx jobs on the same node.

I see two options:
* cheaper: decrease the cores from 16 to 8 and push bonded calculations
to gpu using gmx 2019
* expensive: replace the 16core 2950X to 32core 2990WX

2950X 16 cores 2 GPUs
gmx 2018 43 ns/day
gmx 2019 43 ns/day

33 ns/day (8core/2GPUs) << 54 (16core/4GPUs)
43 ns/day << 54 (16core/4GPUs)

So this could be a compromise if 16/32 cores work similarly to 16/16
cores. E.g. the 2990 has slower memory access compared to the 2950; I do
not expect this to influence gmx runs too much. However, if performance
decreases by 10-15 percent then it is most likely not worth investing in
the 32-core processor.



I would suggest the cheaper CPU. We are working actively to implement a
pure GPU implementation in an upcoming version (but no promises yet!)

Mark



Thanks for your feedbacks.
Tamas

--
Tamas Hegedus, PhD
Senior Research Fellow
Department of Biophysics and Radiation Biology
Semmelweis University | phone: (36) 1-459 1500/60233
Tuzolto utca 37-47| mailto:ta...@hegelab.org
Budapest, 1094, Hungary   | http://www.hegelab.org




--
Tamas Hegedus, PhD
Senior Research Fellow
Department of Biophysics and Radiation Biology
Semmelweis University | phone: (36) 1-459 1500/60233
Tuzolto utca 37-47| mailto:ta...@hegelab.org
Budapest, 1094, Hungary   | http://www.hegelab.org


Re: [gmx-users] gmx 2019 performance issues

2019-01-15 Thread Mark Abraham
Hi,

On Tue, Jan 15, 2019 at 1:30 PM Tamas Hegedus  wrote:

> Hi,
>
> I do not really see increased performance with gmx 2019 using -bonded
> gpu. I do not see what I am missing or misunderstanding.
>

Unfortunately that is expected in some cases, see
http://manual.gromacs.org/documentation/current/user-guide/mdrun-performance.html#gpu-accelerated-calculation-of-bonded-interactions-cuda-only.
Much of the gain is that it becomes feasible to spend less on the CPU, so
the gains are in performance per $ rather than in raw performance.

The only thing I see is that all CPU cores run at ~100% with gmx 2018,
> while some of the cores run at only ~60% with gmx 2019.
>

QED, probably :-)


> There are: 196382 Atoms
> Speeds come from 500 ps runs.
>
>  From one of the log files:
> Mapping of GPU IDs to the 4 GPU tasks in the 4 ranks on this node:
>PP:0,PP:0,PP:2,PME:2
> PP tasks will do (non-perturbed) short-ranged and most bonded
> interactions on the GPU
> PME tasks will do all aspects on the GPU
>
> --
> 16 cores 4 GPUs
> gmx 2018 48ns/day
> gmx 2019 54ns/day
>
> gmx mdrun -nt 16 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -pme gpu
> -npme 1 -gputasks 0123
>
> gmx mdrun -nt 16 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -bonded gpu
> -pme gpu -npme 1 -gputasks 0123
>
> Since the GPUs are not utilized well (some of them are below 50%), my
> objective is to run 2 jobs/node with 8 CPUs and 2 GPUs at higher usage.
>
> --
> 8 cores 2 GPUs
> gmx 2018 33 ns/day
> gmx 2019 35 ns/day
>
> gmx mdrun -nt 8 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -pme gpu
> -npme 1 -gputasks 0033
>
> gmx mdrun -nt 8 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -bonded gpu
> -pme gpu -npme 1 -gputasks 0022
>
> gmx mdrun -ntomp 2 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -bonded
> gpu -pme gpu -npme 1 -gputasks 0022
> Changing -nt to -ntomp did not help to increase performance.
>
> And the GPUs are not utilized much better. 1080Ti runs max 60-75%
>

Single simulations are unlikely to get much higher utilization, except
perhaps paired with high-clock CPUs. Multi-simulations are still the way to
make optimal use of your resources, if throughput-style runs are
appropriate for the science.
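
One way to do that is mdrun's multi-simulation mode, which requires an
MPI-enabled build (gmx_mpi); a sketch, assuming two prepared run
directories sim1/ and sim2/, each containing a topol.tpr:

mpirun -np 2 gmx_mpi mdrun -multidir sim1 sim2 -ntomp 8 -nb gpu -pme gpu -pin on

Each simulation then gets one rank and a share of the GPUs on the node.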


> --
> The main question:
> * I use a 16-core AMD 2950X with 4 high-end GPUs (1080Ti, 2080Ti).
> * The GPUs do not run at 100%, so I would like to load more onto them and
> possibly run 2 gmx jobs on the same node.
>
> I see two options:
> * cheaper: decrease the cores from 16 to 8 and push bonded calculations
> to gpu using gmx 2019
> * expensive: replace the 16core 2950X to 32core 2990WX
>
> 2950X 16 cores 2 GPUs
> gmx 2018 43 ns/day
> gmx 2019 43 ns/day
>
> 33 ns/day (8core/2GPUs) << 54 (16core/4GPUs)
> 43 ns/day << 54 (16core/4GPUs)
>
> So this could be a compromise if 16/32 cores work similarly to 16/16
> cores. E.g. the 2990 has slower memory access compared to the 2950; I do
> not expect this to influence gmx runs too much. However, if performance
> decreases by 10-15 percent then it is most likely not worth investing in
> the 32-core processor.
>

I would suggest the cheaper CPU. We are working actively to implement a
pure GPU implementation in an upcoming version (but no promises yet!)

Mark


> Thanks for your feedbacks.
> Tamas
>
> --
> Tamas Hegedus, PhD
> Senior Research Fellow
> Department of Biophysics and Radiation Biology
> Semmelweis University | phone: (36) 1-459 1500/60233
> Tuzolto utca 37-47| mailto:ta...@hegelab.org
> Budapest, 1094, Hungary   | http://www.hegelab.org


[gmx-users] regarding editconf

2019-01-15 Thread Ali Khodayari
Dear users,

 

I am trying to reorient and re-center my solute using editconf. By doing so
I can place the structure at the desired position, yet it is outside the
box when I visualize it in VMD. It also leads to an error after mdrun,
complaining about a water molecule not being settled.

Here is the command I use:

$ gmx editconf -f cell.gro -o cell_newbox.gro -princ -bt triclinic -c
-center 0 0 0 -rotate 0 350 0 -n ndx.ndx

Do you see where the problem is?

Regards,

Ali

 



[gmx-users] gmx 2019 performance issues

2019-01-15 Thread Tamas Hegedus

Hi,

I do not really see increased performance with gmx 2019 using -bonded
gpu. I do not see what I am missing or misunderstanding.
The only thing I see is that all CPU cores run at ~100% with gmx 2018,
while some of the cores run at only ~60% with gmx 2019.


There are: 196382 Atoms
Speeds come from 500 ps runs.

From one of the log files:
Mapping of GPU IDs to the 4 GPU tasks in the 4 ranks on this node:
  PP:0,PP:0,PP:2,PME:2
PP tasks will do (non-perturbed) short-ranged and most bonded 
interactions on the GPU

PME tasks will do all aspects on the GPU

--
16 cores 4 GPUs
gmx 2018 48ns/day
gmx 2019 54ns/day

gmx mdrun -nt 16 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -pme gpu 
-npme 1 -gputasks 0123


gmx mdrun -nt 16 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -bonded gpu 
-pme gpu -npme 1 -gputasks 0123


Since the GPUs are not utilized well (some of them are below 50%), my
objective is to run 2 jobs/node with 8 CPUs and 2 GPUs at higher usage.


--
8 cores 2 GPUs
gmx 2018 33 ns/day
gmx 2019 35 ns/day

gmx mdrun -nt 8 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -pme gpu 
-npme 1 -gputasks 0033


gmx mdrun -nt 8 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -bonded gpu 
-pme gpu -npme 1 -gputasks 0022


gmx mdrun -ntomp 2 -ntmpi 4 -pin on -v -deffnm md_test -nb gpu -bonded 
gpu -pme gpu -npme 1 -gputasks 0022

Changing -nt to -ntomp did not help to increase performance.

And the GPUs are not utilized much better. 1080Ti runs max 60-75%

--
The main question:
* I use a 16-core AMD 2950X with 4 high-end GPUs (1080Ti, 2080Ti).
* The GPUs do not run at 100%, so I would like to load more onto them and
possibly run 2 gmx jobs on the same node.


I see two options:
* cheaper: decrease the cores from 16 to 8 and push bonded calculations 
to gpu using gmx 2019

* expensive: replace the 16core 2950X to 32core 2990WX

2950X 16 cores 2 GPUs
gmx 2018 43 ns/day
gmx 2019 43 ns/day

33 ns/day (8core/2GPUs) << 54 (16core/4GPUs)
43 ns/day << 54 (16core/4GPUs)

So this could be a compromise if 16/32 cores work similarly to 16/16
cores. E.g. the 2990 has slower memory access compared to the 2950; I do
not expect this to influence gmx runs too much. However, if performance
decreases by 10-15 percent then it is most likely not worth investing in
the 32-core processor.


Thanks for your feedbacks.
Tamas

--
Tamas Hegedus, PhD
Senior Research Fellow
Department of Biophysics and Radiation Biology
Semmelweis University | phone: (36) 1-459 1500/60233
Tuzolto utca 37-47| mailto:ta...@hegelab.org
Budapest, 1094, Hungary   | http://www.hegelab.org


Re: [gmx-users] gmx 2019 running problems

2019-01-15 Thread Tamas Hegedus

Hi,

Thanks for the feedback.
Yes, I could build gmx 2019 on a host running NVIDIA driver 4.10.

Tamas

On 01/14/2019 09:06 PM, Tamas Hegedus wrote:

Hi,

I tried to install and use gmx 2019 on a single node computer with 4 GPUs.

I think that the build was OK, but the running is not.
There is workload on only 4 cores (despite -nt 16) and
there is no workload on the GPUs at all.

gmx 2018 was deployed on the same computer with the same tools and 
libraries.


CPU 16cores + 16threads
GPU 1080Ti

cmake -j 16 -DCMAKE_C_COMPILER=gcc-6 -DCMAKE_CXX_COMPILER=g++-6 
-DCMAKE_INSTALL_PREFIX=$HOME/opt/gromacs-2019-gpu -DGMX_GPU=ON 
-DCMAKE_PREFIX_PATH=$HOME/opt/OpenBLAS-0.2.20 
-DFFTWF_LIBRARY=$HOME/opt/fftw-3.3.7/lib/libfftw3f.so 
-DFFTWF_INCLUDE_DIR=$HOME/opt/fftw-3.3.7/include ../ | tee out.cmake


-- Looking for NVIDIA GPUs present in the system
-- Number of NVIDIA GPUs detected: 4
-- Found CUDA: /usr (found suitable version "9.1", minimum required is 
"7.0")


make -j16
make -j16 install # note: a lot of building happened also in this step

**
gmx mdrun -nt 16 -ntmpi 4 -gputasks 0123 -nb gpu -bonded gpu -pme gpu 
-npme 1 -pin on -v -deffnm md_2 -s md_2_500ns.tpr -cpi md_2.1.cpt -noappend


nvidia-smi output (abridged):

  NVIDIA-SMI 390.48    Driver Version: 390.48

  GPU 0  GeForce GTX 1080 Ti   28C  P8  19W / 250W   179MiB / 11178MiB   0%
  GPU 1  GeForce GTX 1080 Ti   28C  P8   8W / 250W   179MiB / 11178MiB   0%
  GPU 2  GeForce GTX 1080 Ti   28C  P8   9W / 250W   179MiB / 11178MiB   0%
  GPU 3  GeForce GTX 1080 Ti   27C  P8   9W / 250W   237MiB / 11178MiB   0%

  Processes: gmx (PID 20243) on GPUs 0-3 (161MiB, 161MiB, 161MiB, 219MiB)



Thanks for your suggestions,
Tamas




--
Tamas Hegedus, PhD
Senior Research Fellow
Department of Biophysics and Radiation Biology
Semmelweis University | phone: (36) 1-459 1500/60233
Tuzolto utca 37-47| mailto:ta...@hegelab.org
Budapest, 1094, Hungary   | http://www.hegelab.org


[gmx-users] core dumped

2019-01-15 Thread rabee khorram
Segmentation fault (core dumped)?
Hello everyone, I am running a nano Fe2O3 structure with the PEN drug with
gromacs5.
But in this step:

gmx mdrun -v -deffnm nvt

I am getting an error: step 2600, remaining wall clock time: 60 s
Segmentation fault (core dumped)

and nvt.gro is not created!

Can you explain to me what my problem is?
Thank you very much.