Dear researchers,
What are the differences between all-atom and united-atom force fields, and
what are the merits and demerits of each?
Which type of force field is more relevant today?
Thank you
Shradheya R.R. Gupta
DBT-BIF Researcher
University of Rajasthan
India
Hello,
I have run a simulation for 50 ns. After visualizing the trajectory, I
concluded that it needs a further 10 ns of simulation.
To do so I used (-extend takes picoseconds, so 10 ns = 10000 ps):
gmx convert-tpr -s md_0_10.tpr -extend 10000 -o md_0_60.tpr
It showed:
Reading toplogy and stuff from md_0_10.tpr
Reading file md_0_10.tpr, VERSION 2019 (single precision)
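A minimal sketch of the usual extend-and-continue workflow, assuming the
checkpoint from the 50 ns run is md_0_10.cpt (file names follow the post
above; -extend is given in picoseconds):

# Add 10 ns (10000 ps) to the run length stored in the .tpr.
gmx convert-tpr -s md_0_10.tpr -extend 10000 -o md_0_60.tpr
# Continue from the last checkpoint, appending to the existing output files.
gmx mdrun -s md_0_60.tpr -cpi md_0_10.cpt -deffnm md_0_10 -append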
Hi,
I encountered a problem during a binding free energy calculation: my
ligand and protein are charged, so periodic boundary conditions will introduce
large artifacts. Is there a method or tutorial for correcting this?
Thank you,
Shradheya Gupta
DBT-BIF University of Rajasthan
> ... cluster size,
> transition ...
>
> On Fri, Nov 15, 2019 at 9:38 PM Shradheya R.R. Gupta <
> shradheyagu...@gmail.com> wrote:
>
> > Hi,
> > After simulation I used gmx cluster which generated 20 structures in a
> > single pdb file.Now I am confused, what to do next. Take aver
> ... MM/PBSA, which is similar to MM/GBSA.
>
> http://rashmikumari.github.io/g_mmpbsa/Tutorial.html
> You can install this plug-in on your system; it reads GROMACS
> trajectories to compute the free energy difference.
>
> On Sun 17 Nov, 2019, 9:43 AM Shradheya R.R. Gupta, <
> shradheyagu...@gmail.com> wrote:
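A minimal sketch of a typical g_mmpbsa invocation on a GROMACS trajectory,
following the tutorial linked above; all file names here are hypothetical and
the flags should be double-checked against g_mmpbsa -h:

# Single-trajectory MM/PBSA for a protein-ligand complex; the protein and
# ligand index groups are chosen interactively when prompted.
g_mmpbsa -f md.xtc -s md.tpr -n index.ndx -i mmpbsa.mdp -pdie 2 -pbsa -decomp

The repository also provides helper scripts (e.g. MmPbSaStat.py) to average
the per-frame energies into a binding free energy estimate.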
Hi,
Is there any tutorial on how to do MM/GBSA for a protein-ligand complex?
Thanks
Shradheya Gupta
DBT-BIF University of Rajasthan
Hi,
After the simulation I used gmx cluster, which generated 20 structures in a
single PDB file. Now I am confused about what to do next: take the average of
all the generated structures, or use a single cluster?
If I should use the average structure, how do I compute it? If I should use a
single cluster, how do I select it?
Thank you
Shradheya
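On the cluster question above: gmx cluster's -cl output writes one MODEL per
cluster, largest cluster first, each being that cluster's middle (most
representative) structure rather than an average. A minimal sketch, with
hypothetical file names:

# Write each cluster's representative structure to a multi-model PDB.
gmx cluster -f md_fit.xtc -s md.tpr -method gromos -cutoff 0.2 -cl clusters.pdb
# MODEL 1 is the representative of the largest cluster; extract it.
awk '/^MODEL/{m++} m==1' clusters.pdb > top_cluster.pdb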
Respected Researchers,
How can I improve load balancing in my MD simulations? Below is the report:
Dynamic load balancing report: DLB was off during the run due to low
measured imbalance. Average load imbalance: 1.4%. The balanceable part of
the MD step is 70%; load imbalance is computed from this. ...
... <alexander.tza...@csi.cuny.edu> wrote:
> No, you do not, if you are the only user.
>
> On Nov 6, 2019 9:04 AM, "Shradheya R.R. Gupta"
> wrote:
> Thank you, sir, for your response.
> For a personal cluster of 6 computers, where I will be using all the nodes to
> its
GROMACS is fine.
>
> Mark
>
> On Tue, 5 Nov 2019 at 14:51, Shradheya R.R. Gupta <
> shradheyagu...@gmail.com>
> wrote:
>
> > Researchers,
> > Is Slurm required to run GROMACS on multiple computers, or is OpenMPI
> > fine?
> >
> > Thank you
Researchers,
Is Slurm required to run GROMACS on multiple computers, or is OpenMPI fine?
Thank you
Shradheya
DBT-BIF University of Rajasthan
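For context: Slurm is only a job scheduler, so on a private cluster OpenMPI's
own launcher is sufficient, as long as GROMACS was built with MPI support (the
gmx_mpi binary). A quick sanity check that MPI can reach every machine, with a
hypothetical hostfile:

# Should print one hostname per rank, spread across the listed nodes.
mpirun -np 6 --hostfile hosts hostname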
Thank you
On Fri, 1 Nov 2019 at 18:22, Justin Lemkul wrote:
>
>
> On 11/1/19 7:40 AM, Shradheya R.R. Gupta wrote:
> > Respected Researchers,
> >
> > I am trying to run GROMACS in parallel, so I have established connections
> > using SSH, NFS and OpenMPI.
> >
Respected Researchers,
I am trying to run GROMACS in parallel, so I have established connections
using SSH, NFS and OpenMPI.
The connections are working fine, as I have run AutoDock Vina-MPI and other
test jobs.
I have installed GROMACS in the mounted directory, which is shared by
all the computers
Respected Researchers,
I have six i7 computers, each with 4 physical cores and 2 threads per core,
all connected over LAN. Therefore, I want to run GROMACS in parallel across
them (as a cluster).
I have successfully connected them using SSH, NFS mounts and OpenMPI.
Afterwards I successfully installed GROMACS
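Assuming the six nodes above (4 physical cores each) and an MPI-enabled
GROMACS on the NFS share, a minimal launch sketch; host and file names are
hypothetical:

# hosts file: one line per machine, e.g.
#   node1 slots=4
#   ...
#   node6 slots=4
# 24 MPI ranks (one per physical core), 2 OpenMP threads per rank to use
# both hardware threads of each core.
mpirun -np 24 --hostfile hosts gmx_mpi mdrun -deffnm md -ntomp 2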