On 28/03/2011 10:37 PM, André Ferreira wrote:
Dear all,
I am having some problems with the molecules that I am currently
simulating. I have a system of 100 oligomer molecules (190 atoms each)
in a cubic box with a 7 nm side (50K). Since the molecule is quite
long, I am using an rcoulomb of 2.8 nm and an rvdw of 3.4 nm.
That does not seem like a good reason to choose massive values for
cutoffs. What about the parameters used in your system makes you think
you need to treat interactions accurately out to these distances?
Proteins have physical dimensions larger than that, yet nobody uses
cutoffs greater than about 1.2 nm for all-atom models.
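For a sense of the cost, the number of nonbonded pairs each atom sees
grows with the cube of the cutoff radius. A back-of-the-envelope Python
sketch (the density of ~100 atoms per nm^3 is an assumed, roughly
water-like value, not taken from your system):

import math

density = 100.0  # atoms per nm^3; assumed, roughly water-like, illustration only

def neighbours_per_atom(rcut_nm):
    # atoms inside a sphere of radius rcut, assuming uniform density
    return (4.0 / 3.0) * math.pi * rcut_nm**3 * density

for rcut in (1.2, 2.8, 3.4):
    print(f"rcut = {rcut} nm -> ~{neighbours_per_atom(rcut):.0f} neighbours per atom")

Going from 1.2 nm to 3.4 nm multiplies the pair count by roughly
(3.4/1.2)^3, i.e. more than 20x, before you even hit the
parallelization problem below.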
When I try to run the simulation, mdrun can't find a DD grid for more
than 48 cores, which is a huge handicap. This is the first time I have
worked with longer molecules, and I have never had this problem before.
In order to obtain good parallel scaling, the DD implementation requires
that the interaction partners of every atom reside either in a cell on
the same node or in a cell on an adjacent node. Thus there is a minimum
cell size, set roughly by the largest cutoff. Your cutoffs impose a
minimum cell size that your 7 nm box cannot satisfy once it is split
over more than a few cells per dimension.
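To see the geometry, here is a crude Python sketch. It ignores the
multiple communication pulses and load-balancing margins that mdrun
actually allows for, so the exact point of failure differs from what
mdrun reports, but the trend is the point:

box_nm = 7.0
for cells_per_dim in range(1, 7):
    cell_width_nm = box_nm / cells_per_dim
    # compare against rvdw = 3.4 nm and a typical all-atom cutoff of 1.2 nm
    print(f"{cells_per_dim} cells per dimension -> cell width {cell_width_nm:.2f} nm")

With a 3.4 nm cutoff the cell width drops below the cutoff already at
three cells per dimension; with a 1.2 nm cutoff the same 7 nm box could
be split five ways per dimension before hitting the same wall.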
Fundamentally, a given parallelization algorithm is useful only for a
certain range of problem sizes. Trivially, you won't get useful speedup
for 100 atoms on 1000 processors without a seriously specialized
algorithm. The same algorithm would be terrible for 1000 atoms on 10
processors. If you had around 100 times the number of atoms, then
48-core DD would probably be fine. So you need to change a feature of
your model physics, use fewer processors, or make your system bigger.
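For scale: your system has 100 x 190 = 19,000 atoms, so a 48-core
decomposition already leaves only about 400 atoms per domain, and each
of those small domains still has to reach out 3.4 nm in every direction
for its nonbonded interactions.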
Mark