Is trjconv with the "-pbc whole" option the best way to make the molecules whole
again?
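
(Concretely, I mean something along these lines,

    trjconv -s previous.tpr -f previous.gro -pbc whole -o whole.gro

where the file names are just placeholders for the .tpr and final coordinates
of the earlier run.)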

On Wed, Feb 9, 2011 at 2:06 PM, Justin A. Lemkul <jalem...@vt.edu> wrote:

>
>
> Denny Frost wrote:
>
>> This run is actually a combination of two 5x5x5 nm boxes, one of which was
>> previously run with DD, and the other is water.  Since the length of that bond
>> is almost 5 nm, is it possible that the periodic boundaries are not being
>> taken into account?  There is no way I have a bond that long from my previous
>> run.
>>
>>
> I'll venture a guess that there were broken molecules in the system you
> concatenated?  That would gel with a bond that stretches across a 5-nm box.
>  You have to deal with whole molecules in the input configuration.
>
> -Justin
>
>> On Wed, Feb 9, 2011 at 1:56 PM, Justin A. Lemkul <jalem...@vt.edu> wrote:
>>
>>
>>
>>    Denny Frost wrote:
>>
>>        I'm using version 4.5.3
>>
>>        Here's the output from the log file from DD initiation to the
>> error:
>>
>>        Initializing Domain Decomposition on 8 nodes
>>        Dynamic load balancing: auto
>>        Will sort the charge groups at every domain (re)decomposition
>>        Initial maximum inter charge-group distances:
>>           two-body bonded interactions: 4.893 nm, Bond, atoms 8994 8996
>>         multi-body bonded interactions: 4.893 nm, Angle, atoms 8994 8997
>>        Minimum cell size due to bonded interactions: 5.382 nm
>>
>>
>>    Bonded interactions should normally not occur over such a length.
>>     The information printed here points to the culprits.  What are
>>    these atoms, and why are they bonded if they are so far away?
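>>
>>    One quick way to identify them (just a sketch; "system.gro" here is a
>>    placeholder for the starting coordinate file): in .gro format atom N sits
>>    on line N+2, so the two atoms flagged above can be printed with
>>
>>       awk 'NR==8996 || NR==8998' system.gro
>>
>>    and then checked against the corresponding bond in the topology.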
>>
>>    -Justin
>>
>>        Using 0 separate PME nodes
>>        Scaling the initial minimum size with 1/0.8 (option -dds) = 1.25
>>        Optimizing the DD grid for 8 cells with a minimum initial size
>>        of 6.728 nm
>>        The maximum allowed number of cells is: X 0 Y 0 Z 1
>>
>>        -------------------------------------------------------
>>        Program mdrun_mpi, VERSION 4.5.3
>>        Source code file: domdec.c, line: 6428
>>
>>        Fatal error:
>>        There is no domain decomposition for 8 nodes that is compatible
>>        with the given box and a minimum cell size of 6.72787 nm
>>        Change the number of nodes or mdrun option -rdd or -dds
>>        Look in the log file for details on the domain decomposition
>>        For more information and tips for troubleshooting, please check
>>        the GROMACS
>>        website at http://www.gromacs.org/Documentation/Errors
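>>
>>        (Spelling out the arithmetic: the 5.382 nm minimum cell size is
>>        scaled by 1/0.8 = 1.25, giving 6.728 nm.  A 5x5x10 nm box then fits
>>        floor(5/6.728) = 0 cells along X and Y and floor(10/6.728) = 1 cell
>>        along Z, which is why no grid of 8 cells can be built until the
>>        spurious long bond is resolved.)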
>>
>>        And here is my mdp file
>>
>>        title               =  BMIM+PF6
>>        cpp                 =  /lib/cpp
>>        constraints         =  hbonds
>>        integrator          =  md
>>        dt                  =  0.002   ; ps !
>>        nsteps              =  75000   ; total 150 ps
>>        nstcomm             =  10
>>        nstxout             =  50000
>>        nstvout             =  50000
>>        nstfout             =  0
>>        nstlog              =  5000
>>        nstenergy           =  5000
>>        nstxtcout           =  25000
>>        nstlist             =  10
>>        ns_type             =  grid
>>        pbc                 =  xyz
>>        coulombtype         =  PME
>>        vdwtype             =  Cut-off
>>        rlist               =  1.2
>>        rcoulomb            =  1.2
>>        rvdw                =  1.2
>>        fourierspacing      =  0.12
>>        pme_order           =  4
>>        ewald_rtol          =  1e-5
>>        ; Berendsen temperature coupling is on in three groups
>>        Tcoupl              =  berendsen
>>        tc_grps             =  BMI      PF6     SOL
>>        tau_t               =  0.2  0.2  0.2
>>        ref_t               =  300  300  300
>>        nsttcouple          =  1
>>        ; Energy monitoring
>>        energygrps          =  BMI      PF6     SOL
>>        ; Isotropic pressure coupling is now on
>>        Pcoupl              =  berendsen
>>        pcoupltype          =  isotropic
>>        ;pc-grps             =  BMI      PFF
>>        tau_p               =  2.0
>>        ref_p               =  1.0
>>        compressibility     =  4.5e-5
>>
>>        ; Generate velocities at 300 K.
>>        gen_vel             =  yes
>>        gen_temp            =  300.0
>>        gen_seed            =  100000
>>
>>
>>        On Wed, Feb 9, 2011 at 1:39 PM, Justin A. Lemkul
>>        <jalem...@vt.edu> wrote:
>>
>>
>>
>>           Denny Frost wrote:
>>
>>               I am trying to start a run using domain decomposition on a
>>               5x5x10 nm box with about 26,000 atoms in it.  I've tried
>>               running on 8-16 PP nodes, but GROMACS always throws an error
>>               saying that there is no domain decomposition compatible with
>>               this box and a minimum cell size of 6.728 nm.  I've tried
>>               many values for -dds and a few dd vectors, but with no luck.
>>               Does anyone know how to get domain decomposition working on
>>               a rectangular system like this?
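>>
>>               (For reference, a DD vector is passed on the mdrun command
>>               line, e.g. something like
>>
>>                  mpirun -np 8 mdrun_mpi -s topol.tpr -dd 2 2 2
>>
>>               where the MPI launcher and the .tpr name are placeholders.)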
>>
>>
>>           Not without significantly more information.  Please post:
>>
>>           1. Your Gromacs version
>>           2. Any DD-related information that is printed to either the
>>              log file or stdout
>>           3. Your .mdp file
>>
>>           -Justin
>>
>>           --
>>           ========================================
>>
>>           Justin A. Lemkul
>>           Ph.D. Candidate
>>           ICTAS Doctoral Scholar
>>           MILES-IGERT Trainee
>>           Department of Biochemistry
>>           Virginia Tech
>>           Blacksburg, VA
>>           jalemkul[at]vt.edu | (540) 231-9080
>>
>>           http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>>
>>           ========================================
>>
>>
>>    --
>>    ========================================
>>
>>    Justin A. Lemkul
>>    Ph.D. Candidate
>>    ICTAS Doctoral Scholar
>>    MILES-IGERT Trainee
>>    Department of Biochemistry
>>    Virginia Tech
>>    Blacksburg, VA
>>    jalemkul[at]vt.edu | (540) 231-9080
>>    http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>>
>>    ========================================
>>
>>
>>
> --
> ========================================
>
> Justin A. Lemkul
> Ph.D. Candidate
> ICTAS Doctoral Scholar
> MILES-IGERT Trainee
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
> ========================================
>
-- 
gmx-users mailing list    gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
Please don't post (un)subscribe requests to the list. Use the 
www interface or send it to gmx-users-requ...@gromacs.org.
Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
