Hi,

Ah, I see. So unless your hydrated uranyl is modelled with bonded interactions between the uranyl atoms and the water atoms, the only bonds in the system are the silicate hydroxyls, the waters, and the uranyl itself. If so, then I suspect the default value of lincs-order (which is 4, to suit highly connected biomolecular use cases) is too high for the connectivity you actually have. Reducing it to 3 will relax the minimum cell diameter that the domain decomposition requires, which I feel is a more robust approach than overriding -rcon. How does that work for you?
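Concretely, the change would look something like this in your .mdp (a minimal sketch of just the constraint settings; the lincs-iter bump is optional, but compensates for the lower expansion order if you want to stay conservative):

    ; lower the LINCS expansion order, since the constraint connectivity
    ; here is shallow (uranyl hydrate, clay hydroxyls, water)
    constraint-algorithm = lincs
    lincs-order          = 3   ; default is 4; sets how far P-LINCS must
                               ; reach across chains of coupled constraints
    lincs-iter           = 2   ; default is 1; an extra iteration keeps the
                               ; constraint accuracy up at the lower order

If I recall correctly, the "Maximum distance for 5 constraints" line in the log below reflects lincs-order + 1, which is why the default order drives the 0.842 nm estimate.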
Perhaps we should automate such a check in grompp, to cater for such weakly connected use cases.

Mark

On Thu, Nov 15, 2018 at 3:25 AM Sergio Perez <sperezcon...@gmail.com> wrote:

> Actually the clay has the ClayFF force field, which has bonds only on the
> OH units; the rest of the atoms are just LJ spheres with a charge. I guess
> the conclusion is still the same?
>
> On Wed, Nov 14, 2018 at 8:47 PM Mark Abraham <mark.j.abra...@gmail.com> wrote:
>
> > Hi,
> >
> > On Wed, Nov 14, 2018 at 3:18 AM Sergio Perez <sperezcon...@gmail.com> wrote:
> >
> > > Hello,
> > > First of all, thanks for the help :)
> > > I don't necessarily need to run with 100 processors; I just want to
> > > know how much I can reduce rcon, given my knowledge of the system,
> > > without compromising the accuracy. Let me give some more details. The
> > > system is a sodium montmorillonite clay with two solid alumino-silicate
> > > layers and two aqueous interlayers between them.
> >
> > I assume the silicate network has many bonds spanning a large region -
> > those adjacent bonds are the issue, not the uranyl. (You would have the
> > same problem with a clay-only system.)
> >
> > > The system has TIP4P waters, some OH bonds within the clay, and the
> > > bonds of the uranyl hydrated ion described in my previous email as
> > > constraints. The box is orthorhombic, 4.67070 x 4.49090 x 3.77930 nm,
> > > and the system has 9046 atoms.
> > >
> > > This is the output of GROMACS:
> > >
> > > Initializing Domain Decomposition on 100 ranks
> > > Dynamic load balancing: locked
> > > Initial maximum inter charge-group distances:
> > >     two-body bonded interactions: 0.470 nm, Tab. Bonds NC, atoms 10 13
> > > Minimum cell size due to bonded interactions: 0.000 nm
> > > Maximum distance for 5 constraints, at 120 deg. angles, all-trans: 0.842 nm
> > > Estimated maximum distance required for P-LINCS: 0.842 nm
> > > This distance will limit the DD cell size, you can override this with -rcon
> > > Guess for relative PME load: 0.04
> > > Will use 90 particle-particle and 10 PME only ranks
> >
> > GROMACS has guessed to use 90 ranks for the real-space domain
> > decomposition, e.g. as an array of 6x5x3 ranks.
> >
> > > This is a guess, check the performance at the end of the log file
> > > Using 10 separate PME ranks, as guessed by mdrun
> > > Scaling the initial minimum size with 1/0.8 (option -dds) = 1.25
> > > Optimizing the DD grid for 90 cells with a minimum initial size of 1.052 nm
> > > The maximum allowed number of cells is: X 4 Y 4 Z 3
> >
> > ... but only 4x4x3 = 48 ranks are compatible with the connectivity of
> > your input, so you are simply using too many ranks for a small system.
> > You would have to relax the tolerances quite a lot to be able to use 90
> > ranks.
> > Just follow the first part of the message's advice and use fewer ranks :-)
> >
> > Mark
> >
> > > -------------------------------------------------------
> > > Program:     mdrun_mpi, version 2018.1
> > > Source file: src/gromacs/domdec/domdec.cpp (line 6571)
> > > MPI rank:    0 (out of 100)
> > >
> > > Fatal error:
> > > There is no domain decomposition for 90 ranks that is compatible with
> > > the given box and a minimum cell size of 1.05193 nm
> > > Change the number of ranks or mdrun option -rcon or -dds or your LINCS
> > > settings
> > > Look in the log file for details on the domain decomposition
> > >
> > > For more information and tips for troubleshooting, please check the
> > > GROMACS website at http://www.gromacs.org/Documentation/Errors
> > > -------------------------------------------------------
> > >
> > > Thank you for your help!
> > >
> > > On Wed, Nov 14, 2018 at 5:28 AM Mark Abraham <mark.j.abra...@gmail.com> wrote:
> > >
> > > > Hi,
> > > >
> > > > Possibly. It would be simpler to use fewer processors, such that the
> > > > domains can be larger.
> > > >
> > > > What does mdrun think it needs for -rcon?
> > > >
> > > > Mark
> > > >
> > > > On Tue, Nov 13, 2018 at 7:06 AM Sergio Perez <sperezcon...@gmail.com> wrote:
> > > >
> > > > > Dear gmx community,
> > > > >
> > > > > I have been running my system without any problems on 100
> > > > > processors, but when I decided to make some of the bonds of my main
> > > > > molecule constraints, I got a domain decomposition error. My
> > > > > molecule is not an extended chain; it is a molecular hydrated ion,
> > > > > in particular the uranyl cation with 5 water molecules forming a
> > > > > pentagonal bipyramid. I would like to reduce rcon in order to run
> > > > > with 100 processors. Since I know, from the shape of my molecule,
> > > > > that two atoms connected by several constraints will never be
> > > > > further apart than 0.6 nm, can I safely use this value for -rcon?
> > > > >
> > > > > Thank you very much!
> > > > > Best regards,
> > > > > Sergio Pérez-Conesa
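For reference, the arithmetic behind the "X 4 Y 4 Z 3" limit quoted above is simple: the 1.052 nm minimum initial cell size is the 0.842 nm P-LINCS distance scaled by 1/0.8 (the -dds margin), and dividing each box dimension by it caps the cell count:

    0.842 / 0.8     = 1.052 nm  minimum initial cell size
    4.67070 / 1.052 = 4.44  ->  at most 4 cells along X
    4.49090 / 1.052 = 4.27  ->  at most 4 cells along Y
    3.77930 / 1.052 = 3.59  ->  at most 3 cells along Z
    4 x 4 x 3       = 48 cells, fewer than the 90 PP ranks requested

So no grid shape can accommodate 90 particle-particle ranks for this box.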
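Putting the thread's advice into command form, either of these should get past the error (the binary name is taken from the error message above; -deffnm md and the exact rank counts are placeholders to adapt):

    # Option 1: request a rank count whose particle-particle grid fits in 4x4x3
    mpirun -np 48 mdrun_mpi -npme 0 -deffnm md

    # Option 2: keep 100 ranks and assert the known coupling bound; only safe
    # if no two constraint-coupled atoms can ever exceed 0.6 nm separation,
    # as estimated for the uranyl hydrate above
    mpirun -np 100 mdrun_mpi -rcon 0.6 -deffnm md

The third route, lowering lincs-order in the .mdp as suggested at the top of the thread, needs no mdrun flags at all.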