Jennifer Williams wrote:
Hi,

I am getting the following error when I try to run in parallel (I've tried with both 8 and 2 nodes and get the same error):

Not all bonded interactions have been properly assigned to the domain decomposition cells

But my simulation works when I run in serial.

I'm using GROMACS 4.0.5. I am working on a mesoporous silica, which I define as a single residue (each atom is assigned to a single charge group).

How many atoms in what size simulation cell? What are your v-sites?

I've tried changing table_ext in the .mdp file (I first increased it to 2.5 and then to 30) following advice in previous forum posts, but I still get the same error.
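For reference, the relevant fragment of my .mdp (the log below is from the 2.5 run; the option name can be written with either "-" or "_"):

```
; extension of the non-bonded interaction tables beyond the cut-off, in nm
; (I later raised this to 30 with no change in behaviour)
table-extension = 2.5
```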

Does anyone know why this is happening and how I can fix it? I could run in serial, but that would take too long.

I also get this note:

NOTE: Periodic molecules: can not easily determine the required minimum bonded cut-off, using half the non-bonded cut-off

Is this part of the same problem or a different thing altogether?

My random guess is that there's a single problem with the interaction
of parallel DD, PBC, vsites, periodic molecules and/or constraints. Berk did
fix a bug earlier this month whose git commit description is
"fixed v-site pbc bug with charge groups consisting of only multiple v-sites",
but I do not know if this is at all applicable.

Compiling the git release-4-0-patches branch and trying to run with that
may help.
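Roughly like this (the remote URL, flags and install prefix are illustrative only; 4.0-era git checkouts need the autotools bootstrap step before configure):

```shell
# Illustrative sketch -- adapt server, flags and prefix to your site
git clone git://git.gromacs.org/gromacs.git
cd gromacs
git checkout -b release-4-0-patches origin/release-4-0-patches
./bootstrap                      # regenerate configure in a git checkout
./configure --enable-mpi --prefix=$HOME/gromacs-4.0-git
make && make install
```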

See bottom of text also.

I've pasted my md.log file below

Thanks


010/AP_ready> more md.log
Log file opened on Tue Oct 27 13:31:44 2009
Host: vlxbig20.see.ed.ac.uk  pid: 6930  nodeid: 0  nnodes:  8
The Gromacs distribution was built Tue Jul 21 13:18:34 BST 2009 by


parameters of the run:
   integrator           = md
   nsteps               = 5000000
   init_step            = 0
   ns_type              = Grid
   nstlist              = 10
   ndelta               = 2
   nstcomm              = 0
   comm_mode            = None
   nstlog               = 1000
   nstxout              = 1000
   nstvout              = 1000
   nstfout              = 1000
   nstenergy            = 1000
   nstxtcout            = 1000
   init_t               = 0
   delta_t              = 0.001
   xtcprec              = 1000
   nkx                  = 39
   nky                  = 39
   nkz                  = 64
   pme_order            = 4
   ewald_rtol           = 1e-05
   ewald_geometry       = 0
   epsilon_surface      = 0
   optimize_fft         = TRUE
   ePBC                 = xyz
   bPeriodicMols        = TRUE
   bContinuation        = FALSE
   bShakeSOR            = FALSE
   etc                  = Nose-Hoover
   epc                  = No
   epctype              = Isotropic
   tau_p                = 1
   ref_p (3x3):
      ref_p[    0]={ 0.00000e+00,  0.00000e+00,  0.00000e+00}
      ref_p[    1]={ 0.00000e+00,  0.00000e+00,  0.00000e+00}
      ref_p[    2]={ 0.00000e+00,  0.00000e+00,  0.00000e+00}
   compress (3x3):
      compress[    0]={ 0.00000e+00,  0.00000e+00,  0.00000e+00}
      compress[    1]={ 0.00000e+00,  0.00000e+00,  0.00000e+00}
      compress[    2]={ 0.00000e+00,  0.00000e+00,  0.00000e+00}
   refcoord_scaling     = No
   posres_com (3):
      posres_com[0]= 0.00000e+00
      posres_com[1]= 0.00000e+00
      posres_com[2]= 0.00000e+00
   posres_comB (3):
      posres_comB[0]= 0.00000e+00
      posres_comB[1]= 0.00000e+00
      posres_comB[2]= 0.00000e+00
   andersen_seed        = 815131
   rlist                = 1.5
   rtpi                 = 0.05
   coulombtype          = PME
   rcoulomb_switch      = 0
   rcoulomb             = 1.5
   vdwtype              = Shift
   rvdw_switch          = 1.2
   rvdw                 = 1.5
   epsilon_r            = 1
   epsilon_rf           = 1
   tabext               = 2.5
   implicit_solvent     = No
   gb_algorithm         = Still
   gb_epsilon_solvent   = 80
   nstgbradii           = 1
   rgbradii             = 2
   gb_saltconc          = 0
   gb_obc_alpha         = 1
   gb_obc_beta          = 0.8
   gb_obc_gamma         = 4.85
   sa_surface_tension   = 2.092
   DispCorr             = EnerPres
   free_energy          = no
   init_lambda          = 0
   sc_alpha             = 0
   sc_power             = 0
   sc_sigma             = 0.3
   delta_lambda         = 0
   nwall                = 0
   wall_type            = 9-3
   wall_atomtype[0]     = -1
   wall_atomtype[1]     = -1
   wall_density[0]      = 0
   wall_density[1]      = 0
   wall_ewald_zfac      = 3
   pull                 = no
   disre                = No
   disre_weighting      = Conservative
   disre_mixed          = FALSE
   dr_fc                = 1000
   dr_tau               = 0
   nstdisreout          = 100
   orires_fc            = 0
   orires_tau           = 0
   nstorireout          = 100
   dihre-fc             = 1000
   em_stepsize          = 0.01
   em_tol               = 10
   niter                = 20
   fc_stepsize          = 0
   nstcgsteep           = 1000
   nbfgscorr            = 10
   ConstAlg             = Lincs
   shake_tol            = 0.0001
   lincs_order          = 4
   lincs_warnangle      = 30
   lincs_iter           = 1
   bd_fric              = 0
   ld_seed              = 1993
   cos_accel            = 0
   deform (3x3):
      deform[    0]={ 0.00000e+00,  0.00000e+00,  0.00000e+00}
      deform[    1]={ 0.00000e+00,  0.00000e+00,  0.00000e+00}
      deform[    2]={ 0.00000e+00,  0.00000e+00,  0.00000e+00}
   userint1             = 0
   userint2             = 0
   userint3             = 0
   userint4             = 0
   userreal1            = 0
   userreal2            = 0
   userreal3            = 0
   userreal4            = 0
grpopts:
   nrdf:        5392
   ref_t:         300
   tau_t:         0.1
anneal:          No
ann_npoints:           0
   acc:            0           0           0
nfreeze: Y Y Y N N N
   energygrp_flags[  0]: 0
   efield-x:
      n = 0
   efield-xt:
      n = 0
   efield-y:
      n = 0
   efield-yt:
      n = 0
   efield-z:
      n = 0
   efield-zt:
      n = 0
   bQMMM                = FALSE
   QMconstraints        = 0
   QMMMscheme           = 0
   scalefactor          = 1
qm_opts:
   ngQM                 = 0

Initializing Domain Decomposition on 8 nodes
Dynamic load balancing: auto
Will sort the charge groups at every domain (re)decomposition

NOTE: Periodic molecules: can not easily determine the required minimum bonded cut-off, using half the non-bonded cut-off

Minimum cell size due to bonded interactions: 0.750 nm
Maximum distance for 5 constraints, at 120 deg. angles, all-trans: 0.376 nm
Estimated maximum distance required for P-LINCS: 0.376 nm
Using 0 separate PME nodes
Scaling the initial minimum size with 1/0.8 (option -dds) = 1.25
Optimizing the DD grid for 8 cells with a minimum initial size of 0.938 nm
The maximum allowed number of cells is: X 4 Y 4 Z 8
Domain decomposition grid 2 x 1 x 4, separate PME nodes 0
Domain decomposition nodeid 0, coordinates 0 0 0
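[As a sanity check on the sizing numbers in this part of the log, the arithmetic can be reproduced; this is a sketch, with the box lengths inferred from the cell sizes mdrun reports further down (2 X-cells of 2.01 nm, 4 Z-cells of 1.90 nm, Y assumed equal to X from the symmetric PME grid):]

```python
# Reproduce mdrun's DD sizing from the log: with periodic molecules the
# bonded limit falls back to half the non-bonded cut-off, then the
# initial minimum is scaled by 1/(-dds).
rlist = 1.5                 # non-bonded cut-off (nm), from the .mdp
min_bonded = rlist / 2      # "Minimum cell size due to bonded interactions"
dds = 0.8                   # mdrun -dds default
min_initial = min_bonded / dds
box = (4.02, 4.02, 7.60)    # inferred box lengths (nm), see above
max_cells = [int(d / min_initial) for d in box]
print(min_bonded, round(min_initial, 3), max_cells)  # 0.75 0.938 [4, 4, 8]
```

[This matches the log: minimum cell size 0.750 nm, scaled minimum 0.938 nm, and a maximum cell count of X 4 Y 4 Z 8.]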

Table routines are used for coulomb: TRUE
Table routines are used for vdw:     TRUE
Will do PME sum in reciprocal space.

++++ PLEASE READ AND CITE THE FOLLOWING REFERENCE ++++
U. Essman, L. Perela, M. L. Berkowitz, T. Darden, H. Lee and L. G. Pedersen
A smooth particle mesh Ewald method
J. Chem. Phys. 103 (1995) pp. 8577-8592
-------- -------- --- Thank You --- -------- --------

Using a Gaussian width (1/beta) of 0.480244 nm for Ewald
Using shifted Lennard-Jones, switch between 0.9 and 1.2 nm
Cut-off's:   NS: 1.5   Coulomb: 1.5   LJ: 1.2
System total charge: 0.000
Generated table with 2000 data points for Ewald.
Tabscale = 500 points/nm
Generated table with 2000 data points for LJ6Shift.
Tabscale = 500 points/nm
Generated table with 2000 data points for LJ12Shift.
Tabscale = 500 points/nm
Generated table with 2000 data points for 1-4 COUL.
Tabscale = 500 points/nm
Generated table with 2000 data points for 1-4 LJ6.
Tabscale = 500 points/nm
Generated table with 2000 data points for 1-4 LJ12.
Tabscale = 500 points/nm
Configuring nonbonded kernels...
Testing x86_64 SSE support... present.

Initializing Parallel LINear Constraint Solver

++++ PLEASE READ AND CITE THE FOLLOWING REFERENCE ++++
B. Hess
P-LINCS: A Parallel Linear Constraint Solver for molecular simulation
J. Chem. Theory Comput. 4 (2008) pp. 116-122
-------- -------- --- Thank You --- -------- --------

The number of constraints is 800
There are inter charge-group constraints,
will communicate selected coordinates each lincs iteration

Linking all bonded interactions to atoms
There are 3236 inter charge-group exclusions,
will use an extra communication step for exclusion forces for PME

The initial number of communication pulses is: X 1 Z 1
The initial domain decomposition cell size is: X 2.01 nm Z 1.90 nm

The maximum allowed distance for charge groups involved in interactions is:
                 non-bonded interactions           1.500 nm
            two-body bonded interactions  (-rdd)   1.500 nm
          multi-body bonded interactions  (-rdd)   1.500 nm
  atoms separated by up to 5 constraints  (-rcon)  1.896 nm

When dynamic load balancing gets turned on, these settings will change to:
The maximum number of communication pulses is: X 1 Z 1
The minimum size for domain decomposition cells is 1.500 nm
The requested allowed shrink of DD cells (option -dds) is: 0.80
The allowed shrink of domain decomposition cells is: X 0.75 Z 0.79
The maximum allowed distance for charge groups involved in interactions is:
                 non-bonded interactions           1.500 nm
            two-body bonded interactions  (-rdd)   1.500 nm
          multi-body bonded interactions  (-rdd)   1.500 nm
  atoms separated by up to 5 constraints  (-rcon)  1.500 nm

Making 2D domain decomposition grid 2 x 1 x 4, home cell index 0 0 0

There are: 5244 Atoms
There are: 476 VSites
Charge group distribution at step 0: 583 565 583 565 666 684 666 684
Grid: 9 x 6 x 6 cells

Constraining the starting coordinates (step 0)

Constraining the coordinates at t0-dt (step 0)

Not all bonded interactions have been properly assigned to the domain decomposition cells

More output should follow here, namely a list of the missing bonded
interactions. It might also be in the stderr from the calculation. Is there any?
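If it only went to the terminal, redirecting stderr on the next run will capture it. A minimal sketch (the mdrun command line is illustrative; the md.err written below is a stand-in so the extraction step is runnable):

```shell
# Illustrative launch; adjust binary name and -np to your setup:
#   mpirun -np 8 mdrun_mpi -deffnm md 2> md.err
# Stand-in for md.err so the grep below has something to work on:
printf '%s\n' 'step 0' \
  'Not all bonded interactions have been properly assigned' \
  'to the domain decomposition cells' \
  'A list of missing interactions:' > md.err
grep -A 2 'bonded interactions' md.err   # print the report plus context
```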

Mark
_______________________________________________
gmx-users mailing list    gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/search before posting!
Please don't post (un)subscribe requests to the list. Use the www interface or send it to gmx-users-requ...@gromacs.org.
Can't post? Read http://www.gromacs.org/mailing_lists/users.php
