Dear GROMACS users,
I have run simulations of a 20-residue peptide for 300 ns. As I would
like to explore the conformational space, I have chosen dPCA to find
flexible regions. I have gone through the dPCA tutorial from the GROMACS site
and followed it. First I generated the index
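(For readers who have not seen it, the dihedral-PCA workflow in that tutorial goes roughly as below; this is only a sketch, and the file names (dihedrals.ndx, dangle.trr, dummy.gro) are placeholders rather than anything from the original post. The dummy.gro stands for the auxiliary structure the tutorial has you build so that the number of "atoms" matches the sin/cos coordinates.)

# dump sin/cos of the selected dihedrals to a pseudo-trajectory
gmx angle -f traj.xtc -n dihedrals.ndx -or dangle.trr -type dihedral

# covariance analysis of those coordinates, without fitting or mass weighting
gmx covar -f dangle.trr -s dummy.gro -nofit -nomwa -noref -o eigenval.xvg -v eigenvec.trr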
Dear Justin and all GROMACS users,
I'm simulating a monolayer system using the surface-tension pressure coupling
option in GROMACS to set the surface tension and obtain the area per molecule.
Although my system is 2D and uniform: a slab of water covered with
surfactant molecules and the surface tension
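(For reference, surface-tension coupling is usually requested in the .mdp file along these lines; a sketch only, with all numbers placeholders and the interface normal assumed to be along z:)

pcoupl           = berendsen
pcoupltype       = surface-tension
; first ref_p value: reference surface tension (bar nm), second: reference z pressure (bar)
ref_p            = 40.0    1.0
compressibility  = 4.5e-5  4.5e-5
tau_p            = 2.0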
Hi. I want to measure the distance between the center of the membrane and a
peptide. How can I use gmx distance?
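(One way to do this with gmx distance is to select the two centres of mass explicitly; the group names "DOPC" and "Protein" below are assumptions for whatever your index file actually contains:)

gmx distance -s topol.tpr -f traj.xtc -n index.ndx \
    -select 'com of group "DOPC" plus com of group "Protein"' \
    -oav dist.xvg

(With a single pair in the selection, -oav simply gives the membrane-peptide COM distance as a function of time; -oall lists each pair separately.)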
On Wed, Jan 31, 2018 at 8:13 PM, Justin Lemkul wrote:
>
>
> On 1/30/18 7:00 PM, negar habibzadeh wrote:
>
>> Hi. In my dopc.gro file I have 128 DOPC, 5120 water (SOL)
Hi, GROMACS users,
I have a question about pulling a group of molecules at a constant speed.
Let's say I want to move a protein in vacuum at a constant speed of 0.01
nm/ps. I am aware that I can specify the pull rate in the COM pulling code. I
am using GROMACS 5.1.2. The following is my pull code:
;pull
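(Since the fragment is cut off above, here is a sketch of what constant-rate COM pulling typically looks like in the 5.1-style syntax; group names, force constant and geometry are placeholders, not the poster's actual settings:)

pull                 = yes
pull-ngroups         = 2
pull-ncoords         = 1
pull-group1-name     = reference_group   ; placeholder
pull-group2-name     = Protein           ; placeholder
pull-coord1-type     = umbrella
pull-coord1-geometry = distance
pull-coord1-groups   = 1 2
pull-coord1-rate     = 0.01              ; nm/ps, the constant speed in question
pull-coord1-k        = 1000              ; kJ mol^-1 nm^-2
pull-coord1-start    = yes

(If there is no natural reference group, e.g. a single protein in vacuum, the manual also allows group index 0 in pull-coord1-groups together with pull-coord1-origin to pull relative to a fixed point in space.)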
Hi,
I don't know what your intended use case is, but there are various
approaches to get a single PDB frame from a trajectory (e.g. trjconv -dump)
and to combine that with B-factors (e.g. editconf -bf), but frankly the
latter stage is probably easier with your own script, e.g. building on
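(Spelled out, that two-step route might look like the following; the dump time, file names and B-factor data file are placeholders, and the exact format gmx editconf expects for -bf is described in its help text:)

# extract a single frame (here at t = 5000 ps) from the trajectory
gmx trjconv -s topol.tpr -f traj.xtc -dump 5000 -o frame.pdb

# rewrite that frame with B-factors taken from a data file
gmx editconf -f frame.pdb -bf bfactors.dat -o frame_bfac.pdb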
Thank you, Mark, for pointing me to this fascinating TNG format. I don't think I can
modify it using C/C++ at the moment.
Regarding your "easier approach", do you mean I can convert an xtc frame to a PDB
with B-factors assigned, if I provide the B-factor file? How do I do this?
-- Original
Hi,
No, the XTC format is not extensible.
The TNG format offers better compression than XTC, and in principle can be
extended for this kind of purpose, but you'd have to write some C/C++ to
get that organized. An easier approach is to keep your B factors in some
file that you keep in sync with the
Dear GROMACS users,
I have residue-based B-factor values for a protein. In the past, they were
assigned to the B-factor columns of PDB files. It would take a lot of space if
I extracted all the PDB frames. Since the PDB files come from the xtc file, I
wonder whether I can modify the xtc file directly?
Thank
Hi,
I want to run a long MD of 200 ns, but I want the output of every 5 ns to be
saved separately ... Is there a way or a script to do this? Kind regards, Ahmed
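(One option is to run the 200 ns in a single job and split the trajectory afterwards with gmx trjconv -split; file names are placeholders:)

# 5000 ps = 5 ns; a new output file is started every 5 ns of trajectory
gmx trjconv -s topol.tpr -f traj.xtc -split 5000 -o chunk.xtc

(Alternatively, the run itself can be broken into pieces by restarting gmx mdrun from the checkpoint file with -cpi and -noappend, which writes each continuation to its own .part000X files.)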
Dear all users,
I am new to GROMACS and I want to simulate a box of pure water. I want to know
whether there is any tutorial for this simulation with GROMACS, or could anyone
kindly help me?
Thanks in advance, Negar
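(While you look for a tutorial, a pure water box can be built and run roughly as follows; file names, the box size and the .mdp settings are placeholders, and topol.top is assumed to already #include a force field and a matching water model:)

# fill a 3 x 3 x 3 nm box with water from the bundled spc216.gro configuration
gmx solvate -cs spc216.gro -box 3 3 3 -o water.gro -p topol.top

# preprocess and run
gmx grompp -f md.mdp -c water.gro -p topol.top -o water.tpr
gmx mdrun -deffnm water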
Hi,
Yes, that's another point of failure - the command to launch the MPI
executable needs to provide the correct context, including linking the
right libraries.
Mark
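(As an illustration of what "the correct context" can mean, a batch-script fragment along these lines is typical; the module names and process count are assumptions about the cluster, not something taken from this thread:)

# load the same MPI (and compiler) environment GROMACS was built against
module load gcc openmpi
source /path/to/gromacs/bin/GMXRC

# launch the MPI-enabled binary through the matching launcher
mpirun -np 16 gmx_mpi mdrun -deffnm md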
On Thu, Feb 1, 2018 at 2:40 PM Peter Kroon wrote:
> Hi,
>
> Digging through my old submission scripts shows
On 2/1/18 7:59 AM, Rakesh Mishra wrote:
Dear Justin,
Here I am applying the pull code for two groups with respect to two reference
groups, as follows.
; Pull code
pull             = yes
pull_ngroups     = 4
pull_ncoords     = 1
pull_group1_name = chain_A8    ; (reference
Hi,
Digging through my old submission scripts shows that when I ran GROMACS
with `srun`, MPI communication would not work, and I would end up with a
lot of files. If I ran with `mpirun`, everything worked as expected.
That was why I blamed my cluster and their OpenMPI installation. I'm
not sure
Dear Justin,
Here I am applying the pull code for two groups with respect to two reference
groups, as follows.
; Pull code
pull             = yes
pull_ngroups     = 4
pull_ncoords     = 1
pull_group1_name = chain_A8    ; (reference, also immobile)
pull_group2_name=
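(For pulling two groups each relative to its own reference group, one would normally define two pull coordinates rather than one; the following is only a sketch, with group names and values as placeholders loosely based on the fragment above:)

pull                 = yes
pull_ngroups         = 4
pull_ncoords         = 2
pull_group1_name     = chain_A8   ; reference, kept immobile
pull_group2_name     = pulled_A   ; placeholder
pull_group3_name     = chain_B8   ; placeholder reference
pull_group4_name     = pulled_B   ; placeholder
pull_coord1_geometry = distance
pull_coord1_groups   = 1 2
pull_coord1_rate     = 0.001      ; nm/ps
pull_coord1_k        = 1000       ; kJ mol^-1 nm^-2
pull_coord2_geometry = distance
pull_coord2_groups   = 3 4
pull_coord2_rate     = 0.001
pull_coord2_k        = 1000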
Hi,
The fact that the binary you ran was named gmx_mpi yet does not seem to be
compiled with MPI is suspicious - either the cmake process that was used
handled suffixes explicitly (which is unnecessary if all you want is the
standard thing) and was wrong, or we implemented the standard suffixing
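(For comparison, the "standard thing" is to let CMake add the suffix itself; a sketch of the two variants being contrasted, with the install prefix a placeholder:)

# standard: -DGMX_MPI=ON names the installed binary gmx_mpi automatically
cmake .. -DGMX_MPI=ON -DCMAKE_INSTALL_PREFIX=/opt/gromacs

# explicit (and normally unnecessary) suffix handling would look like this
cmake .. -DGMX_MPI=ON -DGMX_BINARY_SUFFIX=_mpi -DGMX_LIBS_SUFFIX=_mpi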
Hi Venkat,
I've seen similar behaviour with OpenMPI and a home-patched version of
Gromacs. I blamed OpenMPI/the cluster and contacted the admins (but I
don't remember what the result was). In the end I solved/worked around
the issue by compiling Gromacs with IntelMPI.
HTH
Peter
On 01-02-18