BS”D
Dear All,
I have now run into this issue with two very different systems (one with 762
protein and 60 DNA residues, the other with 90 protein residues). If I try and
carry over the velocities from the final equilibration step into a production
run, and try to use more than one MPI ran
>
> Here's your problem. You have pairs defined that are in excess of 12 nm,
> but they are assigned to a 1-4 interaction, so atoms that should be
> separated by three bonds. The user-defined potential shouldn't matter
> here unless you've added [pairs] to the topology.
>
> I see your point.
What c
in Hall
Lab: 303 Engel Hall
Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061
jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com
==============
--
Message: 3
Date: Sun, 1 Sep 2019 13:22:15 -040
ch pull coordinate
> >> With 1 pull groups, expect 2 columns (including the time column)
> >> Reading file umbrella71.tpr, VERSION 5.1.4 (single precision)
> >> Reading file umbrella98.tpr, VERSION 5.1.4 (single precision)
> >> Reading file umbrella111.t
On 9/1/19 5:44 AM, Avijeet Kulshrestha wrote:
Hi all,
I am running a Martini coarse-grained simulation with a 15 fs time step in
GROMACS 2018.6. I have 25859 atoms and my box size is:
12.0 14.0 18.0
where I have protein, membrane (DPPC) and ions.
I have minimized energy
Hi all,
I am running a Martini coarse-grained simulation with a 15 fs time step in
GROMACS 2018.6. I have 25859 atoms and my box size is:
12.0 14.0 18.0
where I have protein, membrane (DPPC) and ions.
I have minimized energy with 16 processors and the -rdd option set to 2.5. It worked
Thank you, sir. The problem is sorted out. Decreasing the number of processors
did the trick. Thanks again.
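For anyone landing on this thread with the same error: the generic workaround is to request fewer MPI ranks (hence fewer, larger domain decomposition cells) and make up the difference with OpenMP threads. A minimal sketch, assuming a single-node thread-MPI build; the file name and the counts are purely illustrative:
# 4 domains instead of 16, 4 OpenMP threads per rank
gmx mdrun -deffnm md -ntmpi 4 -ntomp 4
# with an external MPI build the rough equivalent is:
# mpirun -np 4 gmx_mpi mdrun -deffnm md -ntomp 4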
On Wed, 21 Aug 2019 at 22:02, Justin Lemkul wrote:
>
>
> On 8/21/19 12:30 PM, Dhrubajyoti Maji wrote:
> > Many thanks Dr. Lemkul for your kind reply. I have checked the link. I
> have
> > done th
On 8/21/19 12:30 PM, Dhrubajyoti Maji wrote:
Many thanks Dr. Lemkul for your kind reply. I have checked the link. I have
done the equilibration step successfully but the error appears at production
run. The change is only that now I am writing the output trajectory. So, if
I had any problem in t
Many thanks Dr. Lemkul for your kind reply. I have checked the link. I have
done the equilibration step successfully but the error appears at production
run. The change is only that now I am writing the output trajectory. So, if
I had any problem in topology or mdp file then I think my equilibration
On 8/21/19 1:00 AM, Dhrubajyoti Maji wrote:
Dear all,
I am simulating a system consisting urea molecules. After successfully
generating tpr file while I am trying to run mdrun, the following error is
appearing.
Fatal error:
There is no domain decomposition for 72 ranks that is compatible
Dear all,
I am simulating a system consisting urea molecules. After successfully
generating tpr file while I am trying to run mdrun, the following error is
appearing.
Fatal error:
There is no domain decomposition for 72 ranks that is compatible with the
given box and a minimum cell size of 0.59
Hi Mark,
To my knowledge, she's not using CHARMM-related FF's at all -- I think she
is using Amber03 (Alyssa, correct me if I'm wrong). Visually and RMSD-wise
the trajectory looks totally normal, but is there something specific I
should be looking for in the trajectory, either visually or quantita
Hi,
What does the trajectory look like before it crashes?
We did recently fix a bug relevant to simulations using CHARMM switching
functions on GPUs, if that could be an explanation. We will probably put
out a new 2018 version with that fix next week (or so).
Mark
On Thu., 14 Feb. 2019, 20:26 M
Hi all,
My student is trying to do a fairly straightforward MD simulation -- a
protein complex in water with ions with *no* pull coordinate. It's on an
NVidia GPU-based machine and we're running gromacs 2018.3.
About 65 ns into the simulation, it dies with:
"an atom moved too far between two do
Dear gromacs users,
I performed a MD simulation on a dimer system with pulling code during the
production run to force the two monomers to get closer. After 55 ns of
production run I got this error :
*step 30616369: Water molecule starting at atom 30591 can not be
settled. Check
>
> Dear all,
>
> I have a system of 27000 atoms, that I am simulating on both local and
> Marconi-KNL (cineca) clusters. In this system, I simulate a small molecule
> that has a graphene sheet attached to it, surrounded by water. I have
> already simulated with success this molecule in a system of
Hi,
Unfortunately, you can't attach files to the mailing list. Please use a
file sharing service and share the link.
Mark
On Wed., 12 Dec. 2018, 02:20 Tommaso D'Agostino,
wrote:
> Dear all,
>
> I have a system of 27000 atoms, that I am simulating on both local and
> Marconi-KNL (cineca) cluste
Dear all,
I have a system of 27000 atoms, that I am simulating on both local and
Marconi-KNL (cineca) clusters. In this system, I simulate a small molecule
that has a graphene sheet attached to it, surrounded by water. I have
already simulated with success this molecule in a system of 6500 atoms,
Hi,
The implicit solvent support got a bit broken between 4.5 and 4.6, and
nobody has yet worked out how to fix it, sorry. If you can run with 1 cpu,
do that. Otherwise, please use GROMACS 4.5.7.
Mark
On Mon, Jun 18, 2018 at 9:21 AM Chhaya Singh
wrote:
> I am running a simulation having protei
I am running a simulation having a protein in implicit solvent using the Amber
ff99sb force field and GBSA as solvent.
I am not able to use more than one CPU.
It always gives a domain decomposition error if I use more than one CPU.
When I tried running using one CPU, it gave me this error:
"Fatal erro
Check the trajectories before and after conversion and make sure that there
are no PBC effects; if so, fix them.
Or do the analysis with the available trajectories (maybe in VMD with Tcl
scripts).
--
Regards,
Nikhil Maroli
Thanks for the reply Mark.
On Sat, Apr 28, 2018 at 4:32 PM, Mark Abraham
wrote:
> Hi,
>
> Clearly the conversion tool did not produce a file that conforms to the
> requirements GROMACS has for specifying periodic boxes. That may not work
> well even if you'd run mdrun without domain decompositio
Hi,
Clearly the conversion tool did not produce a file that conforms to the
requirements GROMACS has for specifying periodic boxes. That may not work
well even if you'd run mdrun without domain decomposition because the
periodicity may not be understood correctly. Find out what was going on and
ho
Yes.. I used VMD for conversion...
On Sat, Apr 28, 2018 at 12:50 PM, RAHUL SURESH
wrote:
> Hi
>
> Sounds strange, to my limited knowledge. My guess is that it may be
> due to the conversion from NAMD [dcd] to the GROMACS format [trr], though I'm
> not sure.
>
> So you have converted the file format
Hi
Sounds strange, to my limited knowledge. My guess is that it may be
due to the conversion from NAMD [dcd] to the GROMACS format [trr], though I'm
not sure.
So you have converted the file format using VMD?
On Sat, Apr 28, 2018 at 12:26 PM, Sahithya S Iyer wrote:
> Hi,
>
> Thanks for the reply
Hi,
Thanks for the reply. I am only doing a rerun of a trajectory that has
already evolved without any dynamic load balancing problems.
-rerun only recalculates energies, right? I don't understand why the same
trajectory is giving a decomposition error now.
On Sat, Apr 28, 2018 at 12:11 PM, RAHUL SU
Hi.
That indicates a problem with dynamic load balancing. Try building the box
with different dimensions.
On Sat, Apr 28, 2018 at 11:57 AM, Sahithya S Iyer wrote:
> Hi,
>
> I am trying to calculate interaction between specific residues using gmx
> mdrun -rerun flag. The trajectory was in a dcd format
Hi,
I am trying to calculate interaction between specific residues using gmx
mdrun -rerun flag. The trajectory was in a dcd format, which I converted to
a trr file. I get the following error -
Domain decomposition has not been implemented for box vectors that have
non-zero components in direction
Well, I do not do anything special when preparing this system compared to
other systems that do not show this issue.
I have carefully inspected my system and I know what is wrong. I did some
manipulations to the PDB file due to a missing fragment of a
residue and accidentally put the NZ atom of a lysine like 3.5
On 4/15/18 9:29 AM, Dawid das wrote:
Dear Gromacs Users,
I run numerous MD simulations for similar systems of protein in water box
and
for only one system I encounter error:
*Fatal error: There is no domain decomposition for 4 ranks that is
compatible with the given box and a minimum cell s
Dear Gromacs Users,
I run numerous MD simulations for similar systems of protein in water box
and
for only one system I encounter error:
*Fatal error: There is no domain decomposition for 4 ranks that is
compatible with the given box and a minimum cell size of 3.54253 nm. Change
the number of ran
Hi,
You have a bonded interaction at a distance of 10 nm. I assume that's not
your intention. Perhaps you should give a configuration to grompp that has
whole molecules. IIRC less ancient versions of GROMACS do a better job of
this.
Mark
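For anyone wanting to follow that advice, one way to hand grompp a configuration with whole molecules is to let trjconv rewrap the input first; the file names below are hypothetical, only the -pbc mol idea matters:
# make every molecule whole across the periodic boundary, then regenerate the tpr
gmx trjconv -s previous.tpr -f conf.gro -pbc mol -o conf_whole.gro
gmx grompp -f md.mdp -c conf_whole.gro -p topol.top -o md.tpr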
On Thu, Feb 15, 2018 at 5:39 PM Iman Ahmadabadi
wrote:
>
Dear Gromacs Users,
In one job, I always get the following domain decomposition error (with any
number of nodes) and I don't know what I should do. Do I have to use the
-dds or -rdd setting for my problem?
Sincerely
Iman
Initializing Domain Decomposition on 56 nodes
Dynamic load balancing: auto
Wi
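For reference, both switches are ordinary mdrun options. The sketch below only illustrates the syntax; the file name and the values are hypothetical, not a recommendation for this particular system:
# -rdd: distance (nm) reserved by domain decomposition for bonded interactions
# -dds: fraction controlling the margin left for dynamic load balancing
gmx mdrun -deffnm npt -rdd 1.4 -dds 0.9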
Hi,
On Fri, Feb 9, 2018, 17:15 Kevin C Chan wrote:
> Dear Users,
>
> I have encountered the problem of "There is no domain decomposition for n
> nodes that is compatible with the given box and a minimum cell size of x
> nm" and by reading through the gromacs website and some threads I
> understa
Dear Users,
I have encountered the problem of "There is no domain decomposition for n
nodes that is compatible with the given box and a minimum cell size of x
nm" and by reading through the gromacs website and some threads I
understand that the problem might be caused by breaking the system into t
Hello
I’m trying to run a simulation with distance restraint using Gromacs
version 2016.1-dev.
The distance restraint file contains:
[ distance_restraints ]
;   ai    aj  type  index  type'   low   up1   up2   fac
  6602  2478     1      0      1  0.24  0.30  0.35   1.0
  6602  2504     1      0      1  0.24  0.30  0.35   1.0
660
Hi,
As you have learned, such boundary conditions are only available in the
group scheme, the boundary conditions restrict the number of ranks usable,
and the group scheme prevents OpenMP parallelism being useful. We hope to
relax this in future, but your current options are to run slowly, use
dif
Dear Justin,
I tried using OpenMP parallelization with the following command:
mdrun -ntmpi 1 -ntomp 1
which works fine, but if ntomp is increased, I get the error below:
*OpenMP threads have been requested with cut-off scheme group, but these
are only supported with cut-off scheme verlet*
Is
On 11/8/17 12:02 PM, Wes Barnett wrote:
On Wed, Nov 8, 2017 at 11:11 AM, Shraddha Parate
wrote:
Dear Gromacs Users,
I was able to achieve a spherical water droplet without periodic boundary
conditions (PBC) by changing a few parameters in the .mdp files as below:
However, I am facing th
On Wed, Nov 8, 2017 at 11:11 AM, Shraddha Parate
wrote:
> Dear Gromacs Users,
>
> I was able to achieve a spherical water droplet without periodic boundary
> conditions (PBC) by changing a few parameters in the .mdp files as below:
>
> However, I am facing the following error:
>
> *Fatal error:
Dear Gromacs Users,
I was able to achieve a spherical water droplet without periodic boundary
conditions (PBC) by changing a few parameters in the .mdp files as below:
*minim.mdp:*
; minim.mdp - used as input into grompp to generate em.tpr
; Parameters describing what to do, when to stop and what t
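For anyone attempting the same setup, the parameters that usually go together for a non-periodic droplet with the (old) group cut-off scheme are sketched below. This is a generic outline rather than the poster's actual file, and it implies a single-rank run, since domain decomposition does not support simple neighbour searching:
; generic no-PBC sketch: group scheme, infinite cut-offs
pbc              = no
cutoff-scheme    = group
ns_type          = simple
nstlist          = 0
rlist            = 0
coulombtype      = Cut-off
rcoulomb         = 0
vdwtype          = Cut-off
rvdw             = 0
comm-mode        = Angular   ; keep the droplet from drifting and rotating
; and run on a single MPI rank, e.g.:  gmx mdrun -deffnm droplet -ntmpi 1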
From: "Justin Lemkul"
To: "gmx-users"
Sent: Thursday, June 22, 2017 3:28:32 PM
Subject: Re: [gmx-users] Domai
On 6/22/17 9:22 AM, Sergio Manzetti wrote:
Checked the link, nothing written here on rcon and dds...
"Thus it is not possible to run a small simulation with large numbers of
processors."
Google will help you find more suggestions.
-Justin
From: "Justin Lemkul"
To: "gmx-users"
Sent: Thursday, June 22, 2017 3:21:28 PM
Subject: Re: [gmx-users] Domain decomposition
On 6/2
On 6/22/17 9:16 AM, Sergio Manzetti wrote:
Hi, I have (also) a system of one molecule in a water box of dimensions 3 x 3 x 3;
the procedure goes well all the way until the simulation starts, getting:
Will use 20 particle-particle and 4 PME only ranks
This is a guess, check the performance at the end
Hi, I have (also) a system of one molecule in a water box of dimensions 3 x 3 x 3;
the procedure goes well all the way until the simulation starts, getting:
Will use 20 particle-particle and 4 PME only ranks
This is a guess, check the performance at the end of the log file
--
On 5/18/17 5:59 AM, Kashif wrote:
I got this error every time I try to simulate one of my protein-ligand
complexes.
---
Program mdrun, VERSION 4.6.6
Source code file: /root/Documents/gromacs-4.6.6/src/mdlib/pme.c, line: 851
Fatal error:
I got this error every time I try to simulate one of my protein-ligand
complexes.
---
Program mdrun, VERSION 4.6.6
Source code file: /root/Documents/gromacs-4.6.6/src/mdlib/pme.c, line: 851
Fatal error:
1 particles communicated to PME node 5
Dear all gromacs users,
I have seen in the mail archive that this domain decomposition error can be
avoided with a smaller number of processors, but how do I find the suitable
number of processors required?
here is the log file.
https://drive.google.com/file/d/0Bzs8lO6WJxD9alRTYjFaMjBTT2c/view?usp=sharing
--
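As a rough answer to that question: the log (or the error message itself) reports the minimum cell size that domain decomposition needs, and each box dimension can hold at most floor(box length / minimum cell size) cells. With purely illustrative numbers, not taken from this log, a 6 x 6 x 9 nm box and a 1.2 nm minimum cell size give
floor(6/1.2) x floor(6/1.2) x floor(9/1.2) = 5 x 5 x 7 = 175
cells, so the particle-particle ranks requested must fit a grid no finer than that (and mdrun typically sets some ranks aside for PME). Requesting a rank count within that bound, or simply letting mdrun choose, is the usual fix.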
"Mark Abraham"
> To: gmx-us...@gromacs.org
> Sent: Tuesday, March 7, 2017 4:25:12 AM
> Subject: Re: [gmx-users] domain decomposition Error
>
> Hi,
>
> Exactly. NVT not exploding doesn't mean it's ready for NpT, particularly if
> the volume is just wro
raham"
To: gmx-us...@gromacs.org
Sent: Tuesday, March 7, 2017 4:25:12 AM
Subject: Re: [gmx-users] domain decomposition Error
Hi,
Exactly. NVT not exploding doesn't mean it's ready for NpT, particularly if
the volume is just wrong, or you try to use Parrinello-Rahman too soon.
Mark
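A common way to follow that advice is to relax the box with the Berendsen barostat during NPT equilibration and only switch to Parrinello-Rahman for production. The snippet below is a generic sketch of that staging, not a drop-in file for this system:
; NPT equilibration: gentle box relaxation
pcoupl           = Berendsen
pcoupltype       = isotropic
tau_p            = 2.0
ref_p            = 1.0
compressibility  = 4.5e-5
; production: switch to pcoupl = Parrinello-Rahman (with a longer tau_p, e.g. 5.0)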
llibration step only and not during
> the nvt equilibration step. I have successfully done 1 ns of nvt
> equilibration.
>
>
> --- -- Original Message -
> From: "Mark Abraham"
> To: gmx-us...@gromacs.org
> Sent: Tuesday, March 7, 2017 2:32:46 AM
> Subje
7, 2017 2:32:46 AM
Subject: Re: [gmx-users] domain decomposition Error
Hi,
There's good advice for this problem at the link that was suggested in
the error message: http://www.gromacs.org/Documentation/Errors. Probably
your box volume or NpT protocol need some attention.
Mark
On Tue, 7
Hi,
There's good advice for this problem at the link that was suggested in
the error message: http://www.gromacs.org/Documentation/Errors. Probably
your box volume or NpT protocol need some attention.
Mark
On Tue, 7 Mar 2017 06:23 shweta singh wrote:
> Thank you !
>
> On Tue, Mar 7, 2017 at
Thank you !
On Tue, Mar 7, 2017 at 9:47 AM, MRINAL ARANDHARA <
arandharamri...@iitkgp.ac.in> wrote:
> I am trying to run a lipid bilayer simulation but during the npt
> equilibration step I am getting the following error
> "1 particles communicated to PME rank 6 are more than 2/3 times the
> cut
I am trying to run a lipid bilayer simulation but during the npt equilibration
step I am getting the following error
"1 particles communicated to PME rank 6 are more than 2/3 times the cut-off out
of the domain decomposition cell of their charge group in dimension y"
I have successfully run the
On 1/29/17 4:33 AM, Albert wrote:
Hello,
I am trying to run MD simulation for a system:
box size: 105.166 x 105.166 x 105.166
atoms: 114K
FF: Amber99SB
I submitted the job with command line:
srun -n 1 gmx_mpi grompp -f mdp/01-em.mdp -o 60.tpr -n -c ion.pdb
srun -n 12 gmx_mpi mdrun -s 60.tp
Hello,
I am trying to run MD simulation for a system:
box size: 105.166 x 105.166 x 105.166
atoms: 114K
FF: Amber99SB
I submitted the job with command line:
srun -n 1 gmx_mpi grompp -f mdp/01-em.mdp -o 60.tpr -n -c ion.pdb
srun -n 12 gmx_mpi mdrun -s 60.tpr -v -g 60.log -c 60.gro -x 60.xtc
b
Hi Qasim,
> On 12 Jan 2017, at 20:22, qasimp...@gmail.com wrote:
>
> Hi Carsten,
>
> I think I couldn't clearly explain the protocol that I follow. Sorry for
> that. Firstly, I do the EM, nvt (100 ps), npt (100 ps) and md (100 ns) steps
> for equilibration. In all those steps I use the below
Hi Carsten,
I think I couldn't clearly explain the protocol that I follow. Sorry for that.
Firstly, I do the EM, nvt (100 ps), npt (100 ps) and md (100 ns) steps for
equilibration. In all those steps I use the below free energy parameters for the
forward state:
free-energy = yes
init-lambda
Hi Qasim,
> On 11 Jan 2017, at 20:29, Qasim Pars wrote:
>
> Dear Carsten,
>
> Thanks. The forward state simulations work properly with mdrun -ntmpi 8
> -ntomp 2 or mdrun -ntmpi 4 -ntomp 4 as you suggested.
> For the backward state GROMACS still gives a "too many LINCS warnings" error
> with those
Dear Carsten,
Thanks. The forward state simulations work properly with mdrun -ntmpi 8
-ntomp 2 or mdrun -ntmpi 4 -ntomp 4 as you suggested.
For the backward state GROMACS still gives a "too many LINCS warnings" error
with those mdrun commands in the md step, indicating the system is far from
equilibr
Dear Qasim,
those kinds of domain decomposition 'errors' can happen when you
try to distribute an MD system among too many MPI ranks. There is
a minimum cell length for each domain decomposition cell in each
dimension, which depends on the chosen cutoff radii and possibly
other inter-atomic constr
Dear users,
I am trying to simulate a protein-ligand system including ~2 atoms with
waters using GROMACS-2016.1. The protocol I tried is forward state for the
free energy calculation. The best ligand pose used in the simulations was
obtained from AutoDock. At the beginning of the simulation GROMACS s
On 7/26/16 2:21 PM, Alexander Alexander wrote:
On Tue, Jul 26, 2016 at 7:54 PM, Justin Lemkul wrote:
On 7/26/16 1:16 PM, Alexander Alexander wrote:
On Tue, Jul 26, 2016 at 6:07 PM, Justin Lemkul wrote:
On 7/26/16 11:27 AM, Alexander Alexander wrote:
Thanks.
Yes indeed it is a fre
On Tue, Jul 26, 2016 at 7:54 PM, Justin Lemkul wrote:
>
>
> On 7/26/16 1:16 PM, Alexander Alexander wrote:
>
>> On Tue, Jul 26, 2016 at 6:07 PM, Justin Lemkul wrote:
>>
>>
>>>
>>> On 7/26/16 11:27 AM, Alexander Alexander wrote:
>>>
>>> Thanks.
Yes indeed it is a free energy calculation
On 7/26/16 1:16 PM, Alexander Alexander wrote:
On Tue, Jul 26, 2016 at 6:07 PM, Justin Lemkul wrote:
On 7/26/16 11:27 AM, Alexander Alexander wrote:
Thanks.
Yes indeed it is a free energy calculation in which no problem showed up in
the first 6 windows where the harmonic restraints were
On Tue, Jul 26, 2016 at 6:07 PM, Justin Lemkul wrote:
>
>
> On 7/26/16 11:27 AM, Alexander Alexander wrote:
>
>> Thanks.
>>
>> Yes indeed it is a free energy calculation in which no problem showed up
>> in
>> the first 6 windows where the harmonic restraints were applied to my amino
>> acid but t
On 7/26/16 11:27 AM, Alexander Alexander wrote:
Thanks.
Yes indeed it is a free energy calculation in which no problem showed up in
the first 6 windows where the harmonic restraints were applied to my amino
acid, but the DD problem came up immediately in the first windows of
removing the charge
Hi,
On Tue, Jul 26, 2016 at 2:18 PM Alexander Alexander <
alexanderwie...@gmail.com> wrote:
> Hi,
>
> Thanks for your response.
> I do not know which two atoms has bonded interaction comparable with the
> cell size, however, based on this line in log file "two-body bonded
> interactions: 3.196 nm
Thanks.
Yes indeed it is a free energy calculation in which no problem showed up in
the first 6 windows where the harmonic restraints were applied to my amino
acid, but the DD problem came up immediately in the first windows of
removing the charge. Below please find the mdp file.
And If I use -ntmp
On 7/26/16 8:17 AM, Alexander Alexander wrote:
Hi,
Thanks for your response.
I do not know which two atoms have a bonded interaction comparable with the
cell size; however, based on this line in the log file "two-body bonded
interactions: 3.196 nm, LJC Pairs NB, atoms 24 28", I thought the 24 and 28
a
Hi,
Thanks for your response.
I do not know which two atoms have a bonded interaction comparable with the
cell size; however, based on this line in the log file "two-body bonded
interactions: 3.196 nm, LJC Pairs NB, atoms 24 28", I thought 24 and 28
are the pair whose coordinates are as below:
Hi,
So you know your cell dimensions, and mdrun is reporting that it can't
decompose because you have a bonded interaction that is almost the length
of one of the cell dimensions. How big should that interaction distance
be, and what might you do about it?
Probably mdrun should be smarter abo
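As a practical hint for that question: the domain decomposition setup section of the log names the longest bonded interactions and the atoms involved, exactly as quoted above ("two-body bonded interactions: 3.196 nm, LJC Pairs NB, atoms 24 28"). Pulling those lines out of a log (file name hypothetical) is a quick way to see what limits the cell size:
grep "bonded interactions" md.log
grep -i "minimum cell size" md.log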
Dear gromacs user,
It is now more than one week that I have been struggling with the fatal error
due to domain decomposition, and I have not succeeded yet. It is more
painful when I have to test different numbers of CPUs to see which one
works on a cluster with a long queuing time, which means being two or
On 6/8/16 9:41 AM, Daniele Veclani wrote:
Dear Gromacs Users
I'm trying to do a simulation in the NVE ensemble in vacuum, but
I find this
error:
"Domain Decomposition does not support simple neighbor searching, use grid
searching or run with one MPI rank"
If I use ns_type=grid I can gen
Dear Gromacs Users
I'm trying to do a simulation in the NVE ensemble in vacuum, but
I find this
error:
"Domain Decomposition does not support simple neighbor searching, use grid
searching or run with one MPI rank"
If I use ns_type=grid I can generate the .tpr file, but when I run mdrun I
fi
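The second half of that error message is also the practical fix: keep ns_type = simple and run without domain decomposition, i.e. on a single MPI rank. As noted elsewhere in these threads, the group scheme also prevents useful OpenMP parallelism, so the run will be essentially serial. A minimal sketch with a hypothetical file name:
gmx mdrun -deffnm nve_vacuum -ntmpi 1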
On Fri, Mar 18, 2016 at 7:47 AM, <
gromacs.org_gmx-users-requ...@maillist.sys.kth.se> wrote:
>
> Message: 4
> Date: Fri, 18 Mar 2016 07:46:48 -0400
> From: Justin Lemkul
> To: gmx-us...@gromacs.org
> Subject: Re: [gmx-users] Domain decomposition error tied to free
>
Hello,
I have been attempting to carry out some free energy calculations, but to
verify the sanity of my parameters, I decided to test them on a structure I
knew to be stable -- the lysozyme from Lemkul's lysozyme in water tutorial.
I chose the L75A mutation because it is out on the surface to mi
On 3/17/16 8:21 PM, Ryan Muraglia wrote:
Hello,
I have been attempting to carry out some free energy calculations, but to
verify the sanity of my parameters, I decided to test them on a structure I
knew to be stable -- the lysozyme from Lemkul's lysozyme in water tutorial.
I chose the L75A mu
I think i've just found my mistake. Thank you so much again.
Khatnaa
On Friday, 30 October 2015, 18:55, Justin Lemkul wrote:
On 10/30/15 7:09 AM, badamkhatan togoldor wrote:
> Thank you Justin.
>> The better question is why you're trying to decouple an entire protein; that
>>
On 10/30/15 7:09 AM, badamkhatan togoldor wrote:
Thank you Justin.
The better question is why you're trying to decouple an entire protein; that is
extremely impractical and unlikely to be useful.
Did I do that? Then it's my mistake, due to my lack of knowledge of that. How do I fix that?
Khatnaa
Thank you Justin.
>The better question is why you're trying to decouple an entire protein; that
>is
>extremely impractical and unlikely to be useful.
Did I do that? Then it's my mistake, due to my lack of knowledge of that. How do I fix that?
Khatnaa
On Friday, 30 October 2015, 1:14, Justin Le
On 10/29/15 4:56 AM, badamkhatan togoldor wrote:
Dear GMX Users, I am simulating the free energy of a protein chain_A in water
in parallel. Then I got a domain decomposition error in mdrun: Will use 15
particle-particle and 9 PME only ranks. This is a guess, check the performance
at the end of the lo
Dear GMX Users,
I am simulating the free energy of a protein chain_A in water in parallel. Then I
got a domain decomposition error in mdrun:
Will use 15 particle-particle and 9 PME only ranks
This is a guess, check the performance at the end of the log file
---
Hi SMA,
It says you have bonds over large distances. Check the
structure/topology/setup.
Cheers,
Tsjerk
On Oct 27, 2015 08:02, "Musharaf Ali" wrote:
> Dear users
> During energy minimization for IL-water system in a box size of 4.7x4.7x9.4
> with 432 BMIMTF2N and 3519 water molecules, the follo
Dear users
During energy minimization for IL-water system in a box size of 4.7x4.7x9.4
with 432 BMIMTF2N and 3519 water molecules, the following error is written
in the md.log file.
Initializing Domain Decomposition on 144 nodes
Dynamic load balancing: no
Will sort the charge groups at every domain
Thank you Mark for the reply.
We are not sure about it either as it worked when we started the simulation
again using the cpt file and also there
was no issue when we did the same simulation using (LINCS algorithm)
constraints.
Thanks,
Siva
On Jul 23, 2014, at 4:20 PM, Mark Abraham wrote:
>
Dear All,
I am running simulations of BMP2 protein and graphite sheet using implicit
solvent model (mdp file is pasted below). The graphite atoms are frozen in the
simulation and BMP2 is free to translate.
I got an error "Step 1786210: The domain decomposition grid has shifted too
much in the Z
On Mon, Jul 21, 2014 at 3:48 PM, Siva Dasetty wrote:
> Dear All,
>
> I am running simulations of BMP2 protein and graphite sheet using implicit
> solvent model (mdp file is pasted below). The graphite atoms are frozen in
> the simulation and BMP2 is free to translate.
> I got an error "Step 17862