[gmx-users] All atom force field v/s united atom force field.

2020-02-01 Thread Shradheya R.R. Gupta
Dear researchers, What are the differences between an all-atom force field and a united-atom force field, and what are their respective merits and demerits? Which type is more relevant today? Thank you. Shradheya R.R. Gupta, DBT-BIF Researcher, University of Rajasthan, India -- Gromacs Users mailing list * Please …

[gmx-users] Extending simulation

2020-01-03 Thread Shradheya R.R. Gupta
Hello, I have run a simulation for 50 ns. After visualizing it, I concluded that it needs a further 10 ns of simulation. To do so I used: gmx convert-tpr -s md_0_10.tpr -extend 1 -o md_0_60.tpr It showed: Reading toplogy and stuff from md_0_10.tpr Reading file md_0_10.tpr, VERSION 2019 (sin…
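One likely issue in the command above: the -extend option of gmx convert-tpr takes a time in picoseconds, so extending by 10 ns needs -extend 10000, not -extend 1. A minimal sketch, assuming the checkpoint file from the original run is named md_0_10.cpt (an assumption; the thread does not name it):

```shell
# -extend is in picoseconds: 10 ns = 10000 ps
gmx convert-tpr -s md_0_10.tpr -extend 10000 -o md_0_60.tpr

# continue the run from the previous checkpoint (checkpoint name assumed)
gmx mdrun -s md_0_60.tpr -cpi md_0_10.cpt -deffnm md_0_60
```

Without -cpi, mdrun would start the extended run from the beginning rather than continuing from 50 ns.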

[gmx-users] How to calculate binding free energy of the charged ligand and protein

2019-12-25 Thread Shradheya R.R. Gupta
Hi, I encountered a problem during a binding free energy calculation: since my ligand and protein are charged, periodic boundary conditions will introduce large artifacts. Is there a method or tutorial to correct for this? Thank you, Shradheya Gupta, DBT-BIF, University of Rajasthan

Re: [gmx-users] clustered structures

2019-11-16 Thread Shradheya R.R. Gupta
…ster size, transition … > On Fri, Nov 15, 2019 at 9:38 PM Shradheya R.R. Gupta <shradheyagu...@gmail.com> wrote: > > Hi, After simulation I used gmx cluster which generated 20 structures in a single pdb file. Now I am confused what to do next. Take aver…

Re: [gmx-users] MMGBSA

2019-11-16 Thread Shradheya R.R. Gupta
…which is similar to MM/GBSA. > http://rashmikumari.github.io/g_mmpbsa/Tutorial.html > This plug-in you can install on your system, and it can read GROMACS trajectories to compute free energy differences. > On Sun 17 Nov, 2019, 9:43 AM Shradheya R.R. Gupta <s…
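Based on the tutorial linked above, a g_mmpbsa run over a GROMACS trajectory might look like the sketch below. The flag set and file names here are assumptions and should be checked against g_mmpbsa -h and the tutorial itself before use:

```shell
# sketch only: verify each flag with g_mmpbsa -h before running
g_mmpbsa -f md_traj.xtc -s md.tpr -n index.ndx -i pbsa.mdp -pdie 2 -decomp
```

Note that g_mmpbsa implements MM/PBSA (Poisson-Boltzmann), which the reply above describes as similar to, but not identical with, MM/GBSA.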

[gmx-users] MMGBSA

2019-11-16 Thread Shradheya R.R. Gupta
Hi, Is there any tutorial on how to do MM/GBSA for a protein-ligand complex? Thanks. Shradheya Gupta, DBT-BIF, University of Rajasthan

[gmx-users] clustered structures

2019-11-15 Thread Shradheya R.R. Gupta
Hi, After the simulation I used gmx cluster, which generated 20 structures in a single pdb file. Now I am confused about what to do next: take the average of all the generated structures, or use a single cluster? If I should select the average structure, how? If a single cluster, how? Thank you Shradhey…
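Common practice here is not to average (an averaged structure is generally unphysical, since averaging atomic coordinates distorts geometry) but to take the middle, i.e. most representative, member of the largest cluster, which gmx cluster writes out directly. A sketch, with trajectory and cutoff values assumed for illustration:

```shell
# -cl writes the middle structure of each cluster; the first model in
# clusters.pdb corresponds to the largest cluster
gmx cluster -f md_traj.xtc -s md.tpr -method gromos -cutoff 0.2 \
            -g cluster.log -cl clusters.pdb
```

The log file reports cluster sizes, so the first (largest) cluster's middle structure is usually the one to carry forward.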

[gmx-users] Load Imbalance

2019-11-07 Thread Shradheya R.R. Gupta
Respected Researchers, How do I balance the load of MD simulations? Below is the report: Dynamic load balancing report: DLB was off during the run due to low measured imbalance. Average load imbalance: 1.4%. The balanceable part of the MD step is 70%, load imbalance is computed from this. Part of the t…
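For what it's worth, an average imbalance of 1.4% is already very low, which is exactly why mdrun left dynamic load balancing off: there was nothing worth rebalancing. If tuning is still wanted, mdrun exposes a few relevant options; the values below are illustrative assumptions, not recommendations:

```shell
# force dynamic load balancing on and fix the PME rank and thread counts
gmx mdrun -deffnm md -dlb yes -npme 2 -ntomp 4
```

In most cases it is better to leave -dlb at its default (auto) and let mdrun decide.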

Re: [gmx-users] Slurm for GROMACS

2019-11-06 Thread Shradheya R.R. Gupta
…<alexander.tza...@csi.cuny.edu> wrote: > No, you do not, if you are the only user. > On Nov 6, 2019 9:04 AM, "Shradheya R.R. Gupta" wrote: > Thank you, sir, for your response. For a personal cluster of 6 computers where I will be using all the nodes to its…

Re: [gmx-users] Slurm for GROMACS

2019-11-06 Thread Shradheya R.R. Gupta
GROMACS is fine. > Mark > On Tue, 5 Nov 2019 at 14:51, Shradheya R.R. Gupta <shradheyagu...@gmail.com> wrote: > > Researchers, Is Slurm required to run GROMACS on multiple computers, or is OpenMPI fine? > > Thank you…

[gmx-users] Slurm for GROMACS

2019-11-05 Thread Shradheya R.R. Gupta
Researchers, Is Slurm required to run GROMACS on multiple computers, or is OpenMPI fine? Thank you. Shradheya, DBT-BIF, University of Rajasthan
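On the question itself: Slurm is a job scheduler and resource manager, not an MPI runtime, so it is not required; an MPI library such as OpenMPI is what actually launches GROMACS across machines. A sketch, assuming an MPI-enabled build (the gmx_mpi binary) and a hostfile listing the nodes (both assumptions):

```shell
# 24 ranks spread over the nodes in hosts.txt; no scheduler involved
mpirun -np 24 -hostfile hosts.txt gmx_mpi mdrun -deffnm md
```

A scheduler like Slurm becomes useful only when multiple users or jobs compete for the same machines.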

Re: [gmx-users] How to run GROMACS in cluster

2019-11-01 Thread Shradheya R.R. Gupta
…Thank you. On Fri, 1 Nov 2019 at 18:22, Justin Lemkul wrote: > On 11/1/19 7:40 AM, Shradheya R.R. Gupta wrote: > > Respected Researchers, I am trying to run GROMACS in parallel. So I have established a connection using ssh, nfs and openmpi…

[gmx-users] How to run GROMACS in cluster

2019-11-01 Thread Shradheya R.R. Gupta
Respected Researchers, I am trying to run GROMACS in parallel, so I have established connections using ssh, nfs and OpenMPI. The connection is working fine, as I have run AutoDock Vina-MPI and other test files. I have installed GROMACS in the mounted directory, which is shared by all the comput…
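One point worth checking for a multi-node setup: GROMACS must be built with real MPI support, since the default thread-MPI build only spans a single machine. A configuration sketch, with the shared-mount install prefix assumed from the thread:

```shell
# build the MPI-enabled gmx_mpi binary; the prefix path is an assumption
cmake .. -DGMX_MPI=on -DCMAKE_INSTALL_PREFIX=/mnt/shared/gromacs
make -j 8 && make install
```

Installing under the NFS-shared mount means every node sees the same gmx_mpi binary, which mpirun requires.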

[gmx-users] How to run on cluster

2019-10-26 Thread Shradheya R.R. Gupta
Respected Researchers, I have six i7 computers, each with 4 physical cores and 2 threads per core, all connected over LAN. I therefore want to run GROMACS in parallel (as a cluster). I have successfully mounted them using ssh, nfs and OpenMPI, and afterwards successfully installed GROMACS u…
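The six machines can be described to OpenMPI with a hostfile. The hostnames below are placeholders, and slots is set to the 4 physical cores per machine; for GROMACS it is usually better not to oversubscribe MPI ranks onto hyper-threads:

```
node1 slots=4
node2 slots=4
node3 slots=4
node4 slots=4
node5 slots=4
node6 slots=4
```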
