I just realized that was a very old mdp file.  Here is the mdp file from my
most recent run, along with what I think are the domain decomposition
statistics.

mdp file:
title               =  BMIM+PF6
cpp                 =  /lib/cpp
constraints         =  hbonds
integrator          =  md
dt                  =  0.002   ; ps !
nsteps              =  4000000   ; total 8ns.
nstcomm             =  1
nstxout             =  50000
nstvout             =  50000
nstfout             =  0
nstlog              =  5000
nstenergy           =  5000
nstxtcout           =  25000
nstlist             =  10
ns_type             =  grid
pbc                 =  xyz
coulombtype         =  PME
vdwtype             =  Cut-off
rlist               =  1.2
rcoulomb            =  1.2
rvdw                =  1.2
fourierspacing      =  0.12
pme_order           =  4
ewald_rtol          =  1e-5
; Berendsen temperature coupling is on in two groups
Tcoupl              =  berendsen
tc_grps             =  BMI      PF6
tau_t               =  0.2  0.2
ref_t               =  300  300
nsttcouple          =  1
; Energy monitoring
energygrps          =  BMI      PF6
; Isotropic pressure coupling is now on
Pcoupl              =  berendsen
pcoupltype          =  isotropic
;pc-grps             =  BMI      PFF
tau_p               =  1.0
ref_p               =  1.0
compressibility     =  4.5e-5

; Generate velocities at 300 K.
gen_vel             =  yes
gen_temp            =  300.0
gen_seed            =  100000
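
For reference, here is a quick sanity check of what those settings work out
to (plain arithmetic on the values in the file above; nothing here is taken
from the log):

# Sanity check of the run length and output volume implied by the mdp above.
dt_ps     = 0.002        # dt
nsteps    = 4_000_000    # nsteps
nstxtcout = 25_000       # xtc output interval
nstxout   = 50_000       # full-precision trajectory output interval
nstenergy = 5_000        # energy output interval

print(f"total time    : {dt_ps * nsteps / 1000:.1f} ns")    # 8.0 ns
print(f"xtc frames    : {nsteps // nstxtcout}")              # 160
print(f"trr frames    : {nsteps // nstxout}")                # 80
print(f"energy frames : {nsteps // nstenergy}")              # 800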

Domain decomposition statistics (from the .log file):
There are: 12800 Atoms
Max number of connections per atom is 63
Total number of connections is 286400
Max number of graph edges per atom is 6
Total number of graph edges is 24800
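
Regarding Justin's fourierspacing comment below: the old run (mdp quoted
below, in the earlier message) used fourierspacing = 0.6 with 1.0 nm cutoffs
and dt = 0.004, while this run uses 0.12, 1.2 nm cutoffs and dt = 0.002, so
at least part of the speed difference presumably comes from the settings
themselves.  Below is a rough back-of-the-envelope comparison; the 5 nm cubic
box is only an assumption for illustration (the real box vectors are in the
.gro/.log), and GROMACS rounds the FFT grid up to sizes it can factor, so
treat the numbers as order-of-magnitude estimates.

import math

box_nm = 5.0   # ASSUMED cubic box edge; not taken from the log

def pme_grid_per_dim(fourierspacing_nm):
    # GROMACS chooses the FFT grid so that its spacing is at most
    # fourierspacing, i.e. at least box / fourierspacing points per dimension.
    return math.ceil(box_nm / fourierspacing_nm)

new = {"fourierspacing": 0.12, "rcut": 1.2, "dt": 0.002}   # current mdp (above)
old = {"fourierspacing": 0.60, "rcut": 1.0, "dt": 0.004}   # old mdp (quoted below)

grid_ratio = (pme_grid_per_dim(new["fourierspacing"]) /
              pme_grid_per_dim(old["fourierspacing"])) ** 3
pair_ratio = (new["rcut"] / old["rcut"]) ** 3
step_ratio = old["dt"] / new["dt"]

print(f"PME grid points, new/old    : ~{grid_ratio:.0f}x")   # ~100x with the assumed box
print(f"pairs within cutoff, new/old: ~{pair_ratio:.2f}x")   # ~1.73x
print(f"MD steps per ns, new/old    : {step_ratio:.0f}x")    # 2x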

On Thu, Jan 27, 2011 at 4:32 PM, Justin A. Lemkul <jalem...@vt.edu> wrote:

>
>
> Denny Frost wrote:
>
>> about 12000 atoms, 8 nodes, CentOS 5.3/Linux, infiniband.  Below is a copy
>> of my mdp file.
>>
>> title               =  BMIM+PF6
>> cpp                 =  /lib/cpp
>> constraints         =  all_bonds
>> integrator          =  md
>> dt                  =  0.004   ; ps !
>> nsteps              =  20000000   ; total 80 ns.
>> nstcomm             =  1
>> nstxout             =  50000
>> nstvout             =  50000
>> nstfout             =  0
>> nstlog              =  5000
>> nstenergy           =  5000
>> nstxtcout           =  25000
>> nstlist             =  10
>> ns_type             =  grid
>> pbc                 =  xyz
>> coulombtype         =  PME
>> vdwtype             =  Shift
>> rlist               =  1.0
>> rcoulomb            =  1.0
>> rvdw                =  1.0
>> fourierspacing      =  0.6
>>
>
> This fourierspacing is 5-6 times larger than what is normally accepted as
> sufficiently accurate.  A sparse grid will make the PME algorithm faster,
> actually, but at the expense of accuracy.
>
> Can you post the domain decomposition statistics from the .log file?  They
> appear just above the energies from time 0.  What did grompp tell you about
> the relative PME:PP load?
>
> -Justin
>
>> ;pme_order           =  4
>> ewald_rtol          =  1e-5
>> ; Berendsen temperature coupling is on in two groups
>> Tcoupl              =  berendsen
>> tc_grps             =  BMI      PF6
>> tau_t               =  0.1  0.1
>> ref_t               =  300  300
>> nsttcouple          =  1
>> ; Energy monitoring
>> energygrps          =  BMI      PF6
>> ; Isotropic pressure coupling is now on
>> Pcoupl              =  berendsen
>> pcoupltype          =  isotropic
>> ;pc-grps             =  BMI      PFF
>> tau_p               =  1.0
>> ref_p               =  1.0
>> compressibility     =  4.5e-5
>>
>> ; Generate velocities at 300 K.
>> gen_vel             =  yes
>> gen_temp            =  300.0
>> gen_seed            =  100000
>>
>>
>> On Thu, Jan 27, 2011 at 4:12 PM, Dallas Warren <dallas.war...@monash.edu>
>> wrote:
>>
>>    You will need to provide more details on the system.  How many
>>    atoms, what sort of computer system is it being run on, how many
>>    nodes, copy of the mdp file etc.
>>
>>
>>    Catch ya,
>>
>>    Dr. Dallas Warren
>>
>>    Medicinal Chemistry and Drug Action
>>
>>    Monash Institute of Pharmaceutical Sciences, Monash University
>>    381 Royal Parade, Parkville VIC 3010
>>    dallas.war...@monash.edu
>>
>>
>>    +61 3 9903 9304
>>    ---------------------------------
>>    When the only tool you own is a hammer, every problem begins to
>>    resemble a nail.
>>
>>
>>    *From:* gmx-users-boun...@gromacs.org [mailto:gmx-users-boun...@gromacs.org]
>>    *On Behalf Of *Denny Frost
>>    *Sent:* Friday, 28 January 2011 9:34 AM
>>    *To:* Discussion list for GROMACS users
>>    *Subject:* [gmx-users] Slow Runs
>>
>>
>>    I am taking over a project for a graduate student who did MD using
>>    Gromacs 3.3.3.  I now run similar simulations with Gromacs 4.5.1 and
>>    find that they run only about 1/2 to 1/3 as fast as the previous
>>    runs done in Gromacs 3.3.3.  The runs have about the same number of
>>    atoms and both use OPLS force fields.  The mdp files are virtually
>>    the same (I copied them).  The only major difference is that my runs
>>    have different species and thus have different (although smaller)
>>    itp files.  The runs are stable and give reasonable thermodynamic
>>    properties - they're just slow.  Has anyone had any experience with
>>    something like this?
>>
>>
>>
> --
> ========================================
>
> Justin A. Lemkul
> Ph.D. Candidate
> ICTAS Doctoral Scholar
> MILES-IGERT Trainee
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
> ========================================