Dear Sir,
Thanks for your reply.
With the change in host_config.h it worked fine, but it needed the full CUDA
folder to be copied into my home area.
Can we compile the gromacs-4.6.4 GPU version with the PGI compiler?
I used the command below:
/cmake-2.8.12.1/bin/cmake -DGMX_FFT_LIBRARY=mkl
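(For reference, a GPU-enabled 4.6.x build is usually configured along these
lines; the source directory and CUDA path below are placeholders, not taken
from the original message:

  /cmake-2.8.12.1/bin/cmake .. \
      -DGMX_FFT_LIBRARY=mkl \
      -DGMX_GPU=ON \
      -DCUDA_TOOLKIT_ROOT_DIR=/path/to/cuda

Note that nvcc accepts only certain host compilers, checked in host_config.h,
which is why that file needed editing; as far as I know, PGI was not a
supported nvcc host compiler at the time.)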
Hi Kavya,
Most (all?) gromacs tools ignore the atom indices in the PDB file anyway, so
what you intend to do is straightforward.
Kind regards,
Erik
On 13 Feb 2014, at 06:30, Kavyashree M hmkv...@gmail.com wrote:
Dear users,
I was analysing the hydrogen bonding interaction of proteins and
Hi all,
I am aware that this topic has been discussed many times. However, I
need more guidance on whether I am experiencing a PBC problem after MD or
not.
My MD box has an enzyme+ligand+coenzyme complex. As I have already written to
the user list, I am using the force field parameters from
If there's a problem, trjconv can handle it with the right index
groups, as suggested at
http://www.gromacs.org/Documentation/Terminology/Periodic_Boundary_Conditions.
But it can't keep these three things together if there's no index group
that describes these three things. You may need
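For example, a sketch of building such a group with make_ndx and then using
it (the group numbers are hypothetical; use whatever make_ndx actually lists
for your system):

  make_ndx -f topol.tpr -o index.ndx
  # at the prompt, merge protein, ligand and coenzyme into one group, e.g.:
  #   1 | 13 | 14
  #   q
  trjconv -s topol.tpr -f traj.xtc -n index.ndx -o fixed.xtc -pbc mol -center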
Hi,
I see. Can't trjconv extract the waters for you?
Kind regards,
Erik
On 13 Feb 2014, at 11:10, Kavyashree M hmkv...@gmail.com wrote:
Dear Sir,
For extracting the specific waters involved in H-bonds from the PDB file I am
using a small script wherein I have to use the only information given in
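(As a sketch of the trjconv route suggested above, with assumed file names:
g_hbond can write the hydrogen-bonding atoms to an index file, which trjconv
can then use to pull just those waters out of the trajectory:

  g_hbond -s topol.tpr -f traj.xtc -num hbnum.xvg -hbn hbond.ndx
  trjconv -s topol.tpr -f traj.xtc -n hbond.ndx -o hb_waters.pdb

The exact groups written to hbond.ndx depend on what you select in g_hbond.)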
Dear Kannan,
Thank you for your fast response. The problem is that the ligand is as far as
16 Ang. from the coenzyme, which is too far for any kind of bonding.
Here is the link for downloading my dist.jpeg file: http://we.tl/ACencjievC
Do you think that this is a real PBC problem?
Really
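(One quick way to check, as a sketch with assumed file names: g_mindist with
the -pi option reports the minimum distance between periodic images, so if
that distance ever drops below the interaction cut-off, the molecule is
interacting with its own image:

  g_mindist -s topol.tpr -f traj.xtc -pi -od mindist.xvg
)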
Hi Mousumi,
From the fact that you get lots of backup files right at the beginning,
I suspect that your mdrun is not MPI-enabled. This behavior is exactly what
one would get when launching a number of serial mdruns on the same input file.
Maybe you need to look for an mdrun_mpi executable.
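With a proper MPI build the launch would look something like this (the
executable name and rank count are assumptions; they depend on how GROMACS
was installed on your cluster):

  mpirun -np 16 mdrun_mpi -deffnm my_mol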
Hi,
I'm no expert on this stuff, but could it be that you generate about 40
of the #my_mol.log.$n# files (probably only 39)?
It could be that 'mpirun' starts 40 'mdrun' jobs and each generates
its own output.
For GROMACS 4.6.x I always used
mdrun -nt X ...
to start a parallel run (where
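As a sketch (thread count assumed), on a single node that would be:

  mdrun -nt 8 -deffnm my_mol

Note that -nt uses the built-in thread-MPI, which only works within a single
node; across nodes you need an MPI-enabled build launched through mpirun.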