Denny Frost wrote:
gromacs 4.5.1


Ah, what I posted was from 4.0.7. I wonder why that sort of output was eliminated in 4.5; it's quite useful. Sorry for leading you astray on that. No matter, the end of the .log file will still contain statistics about what's eating up all your simulation time.
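
If you want to pull that out quickly, a rough Python sketch like this will do it (the "md.log" path is just an example, and I'm assuming the accounting section still starts with the same banner it does in my 4.0.x logs):

    # Print everything from the cycle/time accounting banner to the end
    # of the log; adjust the filename to your run.
    with open("md.log") as f:
        lines = f.readlines()

    for i, line in enumerate(lines):
        if "R E A L   C Y C L E" in line:
            print("".join(lines[i:]), end="")
            break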

-Justin

On Fri, Jan 28, 2011 at 12:40 PM, Erik Marklund <er...@xray.bmc.uu.se> wrote:

    PME is still an Ewald sum.
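
    To expand on that: the Ewald decomposition splits the Coulomb sum
    into a short-ranged real-space part, a smooth reciprocal-space part,
    and a self-term; schematically, in Gaussian units,

        E = E_{\mathrm{real}} + E_{\mathrm{recip}} + E_{\mathrm{self}}

        E_{\mathrm{real}}  = \frac{1}{2} \sum_{i \ne j} q_i q_j
                             \frac{\operatorname{erfc}(\beta r_{ij})}{r_{ij}}
        E_{\mathrm{recip}} = \frac{2\pi}{V} \sum_{\mathbf{k} \ne 0}
                             \frac{e^{-k^2/(4\beta^2)}}{k^2} |S(\mathbf{k})|^2
        E_{\mathrm{self}}  = -\frac{\beta}{\sqrt{\pi}} \sum_i q_i^2

    with S(\mathbf{k}) = \sum_i q_i e^{i \mathbf{k} \cdot \mathbf{r}_i}.
    PME just evaluates E_recip on a mesh with FFTs instead of as a direct
    k-space sum, so the two log lines are not contradictory.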

    Erik

    Denny Frost wrote 2011-01-28 20:38:
    I don't have any domain decomposition information like that in my
    log file.  That's worrisome.  The only other information I could
    find about PME and Ewald is this set of lines:

    Table routines are used for coulomb: TRUE
    Table routines are used for vdw:     FALSE
    Will do PME sum in reciprocal space.

    ++++ PLEASE READ AND CITE THE FOLLOWING REFERENCE ++++
    U. Essmann, L. Perera, M. L. Berkowitz, T. Darden, H. Lee and L. G. Pedersen
    A smooth particle mesh Ewald method
    J. Chem. Phys. 103 (1995) pp. 8577-8592
    -------- -------- --- Thank You --- -------- --------

    Will do ordinary reciprocal space Ewald sum.
    Using a Gaussian width (1/beta) of 0.384195 nm for Ewald
    Cut-off's:   NS: 1.2   Coulomb: 1.2   LJ: 1.2
    System total charge: 0.000
    Generated table with 4400 data points for Ewald.
    Tabscale = 2000 points/nm
    Generated table with 4400 data points for LJ6.
    Tabscale = 2000 points/nm
    Generated table with 4400 data points for LJ12.
    Tabscale = 2000 points/nm
    Configuring nonbonded kernels...
    Configuring standard C nonbonded kernels...
    Testing x86_64 SSE2 support... present.
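
    One thing I could verify: the Gaussian width line seems to follow
    directly from rcoulomb and ewald_rtol. A rough Python check reproduces
    the 0.384 nm, assuming the criterion is erfc(beta * rc) = ewald_rtol
    (that assumption is my guess, not something I found documented):

        from math import erfc

        rc, rtol = 1.2, 1e-5      # rcoulomb (nm) and ewald_rtol from my .mdp
        lo, hi = 0.0, 10.0        # bracket beta (1/nm) and bisect
        for _ in range(60):
            beta = 0.5 * (lo + hi)
            if erfc(beta * rc) > rtol:
                lo = beta         # interaction at the cut-off still too strong
            else:
                hi = beta
        print(1.0 / beta)         # ~0.38420, i.e. 1/beta in nm, as in the log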


    Why does it say it will do PME on one line, then ordinary Ewald later?

    On Fri, Jan 28, 2011 at 12:26 PM, Justin A. Lemkul <jalem...@vt.edu> wrote:



        Denny Frost wrote:

            I just realized that that was a very old mdp file.  Here
            is an mdp file from my most recent run as well as what I
            think are the domain decomposition statistics.

            mdp file:
            title               =  BMIM+PF6
            cpp                 =  /lib/cpp
            constraints         =  hbonds
            integrator          =  md
            dt                  =  0.002   ; ps !
            nsteps              =  4000000   ; total 8ns.
            nstcomm             =  1
            nstxout             =  50000
            nstvout             =  50000
            nstfout             =  0
            nstlog              =  5000
            nstenergy           =  5000
            nstxtcout           =  25000
            nstlist             =  10
            ns_type             =  grid
            pbc                 =  xyz
            coulombtype         =  PME
            vdwtype             =  Cut-off
            rlist               =  1.2
            rcoulomb            =  1.2
            rvdw                =  1.2
            fourierspacing      =  0.12
            pme_order           =  4
            ewald_rtol          =  1e-5
            ; Berendsen temperature coupling is on in two groups
            Tcoupl              =  berendsen
            tc_grps             =  BMI      PF6
            tau_t               =  0.2      0.2
            ref_t               =  300  300
            nsttcouple          =  1
            ; Energy monitoring
            energygrps          =  BMI      PF6
            ; Isotropic pressure coupling is now on
            Pcoupl              =  berendsen
            pcoupltype          =  isotropic
            ;pc-grps             =  BMI      PFF
            tau_p               =  1.0
            ref_p               =  1.0
            compressibility     =  4.5e-5

            ; Velocity generation is on at 300 K.
            gen_vel             =  yes
            gen_temp            =  300.0
            gen_seed            =  100000
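
            As a side note, I only set fourierspacing = 0.12 and let
            grompp pick the actual PME grid. My understanding, which may
            be off, is that it takes the smallest FFT-friendly size n per
            box axis with box/n <= fourierspacing, roughly like this
            sketch (the 5 nm box edge is made up):

                import math

                def fft_friendly(n):
                    # keep only small prime factors, as the FFT grids prefer
                    for p in (2, 3, 5, 7):
                        while n % p == 0:
                            n //= p
                    return n == 1

                box, spacing = 5.0, 0.12      # nm; box edge is hypothetical
                n = math.ceil(box / spacing)  # 42 here
                while not fft_friendly(n):
                    n += 1
                print(n)                      # PME grid points on this axis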

            domain decomposition
            There are: 12800 Atoms
            Max number of connections per atom is 63
            Total number of connections is 286400
            Max number of graph edges per atom is 6
            Total number of graph edges is 24800


        More useful information is contained at the very top of the
        .log file, after the citations.  An example from one of my own
        runs is:

        Linking all bonded interactions to atoms
        There are 2772 inter charge-group exclusions,
        will use an extra communication step for exclusion forces for PME

        The initial number of communication pulses is: X 2 Y 1
        The initial domain decomposition cell size is: X 1.05 nm Y 1.58 nm

        The maximum allowed distance for charge groups involved in
        interactions is:
                        non-bonded interactions           1.400 nm
        (the following are initial values, they could change due to
        box deformation)
                   two-body bonded interactions  (-rdd)   1.400 nm
                 multi-body bonded interactions  (-rdd)   1.054 nm
         atoms separated by up to 5 constraints  (-rcon)  1.054 nm

        When dynamic load balancing gets turned on, these settings
        will change to:
        The maximum number of communication pulses is: X 2 Y 2
        The minimum size for domain decomposition cells is 0.833 nm
        The requested allowed shrink of DD cells (option -dds) is: 0.80
        The allowed shrink of domain decomposition cells is: X 0.79 Y 0.53
        The maximum allowed distance for charge groups involved in
        interactions is:
                        non-bonded interactions           1.400 nm
                   two-body bonded interactions  (-rdd)   1.400 nm
                 multi-body bonded interactions  (-rdd)   0.833 nm
         atoms separated by up to 5 constraints  (-rcon)  0.833 nm


        Making 2D domain decomposition grid 9 x 6 x 1, home cell index
        0 0 0
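
        As far as I understand it, those pulse counts follow straight from
        the cell sizes: a dimension needs roughly ceil(cutoff / cell_size)
        pulses of neighbor communication to cover the 1.4 nm interaction
        range. A quick sanity check against the numbers above:

            import math

            cutoff = 1.4                        # nm, from the log above
            for dim, cell in (("X", 1.05), ("Y", 1.58)):
                # pulses ~ how many cells the cut-off has to reach across
                print(dim, math.ceil(cutoff / cell))
            # prints X 2 and Y 1, matching "X 2 Y 1" in the log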


        Also, the output under "DOMAIN DECOMPOSITION STATISTICS" (at
        the bottom of the file) would be useful.  Also look for any
        notes about performance lost due to imbalance, waiting for
        PME, etc.  These provide very detailed clues about how your
        system was treated.

        -Justin






    --
    -----------------------------------------------
    Erik Marklund, PhD student
    Dept. of Cell and Molecular Biology, Uppsala University.
    Husargatan 3, Box 596,    75124 Uppsala, Sweden
    phone:    +46 18 471 4537        fax: +46 18 511 755
    er...@xray.bmc.uu.se
    http://folding.bmc.uu.se/





--
========================================

Justin A. Lemkul
Ph.D. Candidate
ICTAS Doctoral Scholar
MILES-IGERT Trainee
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin

========================================
--
gmx-users mailing list    gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
Please don't post (un)subscribe requests to the list. Use the www interface or send it to gmx-users-requ...@gromacs.org.
Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
