Re: [gmx-users] GROMACS 2018.2 mdrun GPU Assertion failed: Condition: cudaSuccess == cudaPeekAtLastError()

2018-08-14 Thread Jia Hong
Hi Szilárd,

Thanks to your tweaked version of the code, I eventually found out that the 
CUDA-capable GPU's id wasn't what I thought it was.

I thought that since nvidia-smi displayed the 1080 as the second card (GPU 
number 1), it had id 1, but it actually has id 0. Therefore, running it as 
CUDA_VISIBLE_DEVICES=0 gmx mdrun did the trick.
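
For anyone else hitting the same mismatch: by default the CUDA runtime 
enumerates devices fastest-first, while nvidia-smi lists them in PCI bus order, 
so the two numberings can disagree. A small sketch of how to make them agree 
(CUDA_DEVICE_ORDER is a standard CUDA environment variable, not a GROMACS 
option):

# Make the CUDA runtime enumerate GPUs in the same (PCI bus) order as
# nvidia-smi, so the index nvidia-smi shows is the index CUDA sees.
export CUDA_DEVICE_ORDER=PCI_BUS_ID
export CUDA_VISIBLE_DEVICES=1   # the 1080, as numbered by nvidia-smi
gmx mdrun -deffnm md_0_01_GPU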

Thank you to everyone involved.

Cheers,
Harry

From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se 
 on behalf of Szilárd Páll 

Sent: Monday, August 13, 2018 10:52 PM
To: Discussion list for GROMACS users
Subject: Re: [gmx-users] GROMACS 2018.2 mdrun GPU Assertion failed: Condition: 
cudaSuccess == cudaPeekAtLastError()

Hi,

I cannot reproduce such an error, even when emulating your use case.

Can you please try to build this slightly tweaked version of the code:
https://gerrit.gromacs.org/changes/8172/revisions/fb358c835c312952571a78f32971cfacf42df966/archive?format=tgz
and let me know what the error is?
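
(For reference, a typical out-of-source build of the unpacked archive looks 
like the sketch below; the tarball name, install prefix, and -j count are 
illustrative, and -DGMX_GPU=ON / -DGMX_BUILD_OWN_FFTW=ON are the usual CMake 
options for a 2018-era CUDA build.)

# Unpack the source, build out of source, and put the result on the PATH.
tar xzf archive.tgz && cd gromacs-*
mkdir build && cd build
cmake .. -DGMX_GPU=ON -DGMX_BUILD_OWN_FFTW=ON -DCMAKE_INSTALL_PREFIX=$HOME/gmx-tweaked
make -j 8 && make install
source $HOME/gmx-tweaked/bin/GMXRC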

Also, please provide the full log and standard outputs
with/without CUDA_VISIBLE_DEVICES.

Cheers,
--
Szilárd


On Sat, Aug 11, 2018 at 4:24 AM Jia Hong  wrote:

> Hi Mark,
>
> I've set CUDA_VISIBLE_DEVICES to 1, the exact command was
>
> export CUDA_VISIBLE_DEVICES=1
>
> I chose 1 because that's the GPU id for the 1080 card. Then, in the same
> shell I ran the gmx command (both with and without -gpu_id 1 argument).
>
> As for the log file, I thought it was md_0_01_GPU.log, which I sent as a
> follow-up message. Nonetheless,
> here's the link to the file
> https://www.dropbox.com/s/xc8z09wvnl0314m/md_0_01_GPU.log?dl=0.
>
>
> Cheers,
> Harry
>
> 
> From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se <
> gromacs.org_gmx-users-boun...@maillist.sys.kth.se> on behalf of Mark
> Abraham 
> Sent: Saturday, August 11, 2018 7:43 AM
> To: gmx-us...@gromacs.org
> Subject: Re: [gmx-users] GROMACS 2018.2 mdrun GPU Assertion failed:
> Condition: cudaSuccess == cudaPeekAtLastError()
>
> Hi,
>
> To what did you set CUDA_VISIBLE_DEVICES? Just saying it didn't work
> doesn't rule out that it failed because the value wasn't right :-)
>
> Also, could you please share a full log file via a sharing service and post
> a link. This fragment doesn't help us find out how the detection saw your
> system, unfortunately.
>
> Mark
>
> On Wed, Aug 8, 2018, 10:53 Jia Hong  wrote:
>
> > The log file:
> >
> >
> > Log file opened on Wed Aug  8 16:33:34 2018
> > Host: matthias-processing  pid: 28756  rank ID: 0  number of ranks:  1
> >   :-) GROMACS - gmx mdrun, 2018.2 (-:
> >
> > GROMACS is written by:
> >  Emile Apol, Rossen Apostolov, Paul Bauer, Herman J.C. Berendsen,
> >  Par Bjelkmar, Aldert van Buuren, Rudi van Drunen, Anton Feenstra,
> >  Gerrit Groenhof, Aleksei Iupinov, Christoph Junghans, Anca Hamuraru,
> >  Vincent Hindriksen, Dimitrios Karkoulis, Peter Kasson, Jiri Kraus,
> >  Carsten Kutzner, Per Larsson, Justin A. Lemkul, Viveca Lindahl,
> >  Magnus Lundborg, Pieter Meulenhoff, Erik Marklund, Teemu Murtola,
> >  Szilard Pall, Sander Pronk, Roland Schulz, Alexey Shvetsov,
> >  Michael Shirts, Alfons Sijbers, Peter Tieleman, Teemu Virolainen,
> >  Christian Wennberg, Maarten Wolf,
> > and the project leaders:
> > Mark Abraham, Berk Hess, Erik Lindahl, and David van der Spoel
> >
> > Copyright (c) 1991-2000, University of Groningen, The Netherlands.
> > Copyright (c) 2001-2017, The GROMACS development team at
> > Uppsala University, Stockholm University and
> > the Royal Institute of Technology, Sweden.
> > check out http://www.gromacs.org for more information.
> >
> > GROMACS is free software; you can redistribute it and/or modify it
> > under the terms of the GNU Lesser General Public License
> > as published by the Free Software Foundation; either version 2.1
> > of the License, or (at your option) any later version.
> >
> > GROMACS:  gmx mdrun, version 2018.2
> > Executable:   /home/matthias/Documents/gromacs-2018.2/gpu_install/bin/gmx
> > Data prefix:  /home/matthias/Documents/gromacs-2018.2/gpu_install
> > Working dir:  /home/matthias/Documents/MD_simulations/D_EY_ext
> > Command line:
> >   gmx mdrun -deffnm md_0_01_GPU
> >
> > GROMACS version:2018.2
> > Precision:  single
> > Memory model:   64 bit
> > MPI library:thread_mpi
> > OpenMP support: enabled (GMX_OPENMP_MAX_THREADS = 64)
> > GPU support:CUDA
> > SIMD instructions:  AVX_256
> > FFT library:

Re: [gmx-users] Enthalpy calculation in NPT ensemble

2018-08-14 Thread Pelin S Bulutoglu
Dear David,


Thanks a lot for your reply. I am using GROMACS version 2016.5. I performed 
isotropic pressure coupling for 4 ns followed by anisotropic pressure coupling 
for another 4 ns. I actually realized that the enthalpy and PV options are 
available when the pressure is coupled isotropically, but not when it is 
coupled anisotropically. Any input on why this may happen would be greatly 
appreciated.
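
A workaround sketch in the meantime (file names below are illustrative): since 
H = U + pV, one can extract the total energy and box volume with gmx energy and 
add the pV term by hand, taking p as the 1 bar reference pressure; 
1 bar nm^3 = 0.06022 kJ/mol.

# Pull total energy and volume from the energy file (names illustrative).
echo "Total-Energy Volume" | gmx energy -f md_aniso.edr -o eH.xvg
# Then, per frame: H = E_tot + p*V, i.e. H [kJ/mol] = E_tot + 1.0 * V * 0.06022
# (p in bar, V in nm^3).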


Regards,

Pelin Su Bulutoglu


From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se 
 on behalf of David van der 
Spoel 
Sent: Tuesday, August 14, 2018 2:11:58 PM
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] Enthalpy calculation in NPT ensemble

On 2018-08-14 at 16:07, Pelin S Bulutoglu wrote:
> Dear GROMACS users,
>
> I am working on simulating a glycine crystal using GAFF force field with CNDO 
> point charges in an NPT ensemble. I want to calculate the enthalpy and Cp of 
> the crystal. However, the list that comes up following gmx energy command 
> does not include the enthalpy or PV options, as it would normally do for NPT 
> simulations. Does this mean that the simulation does not exhibit NPT 
> behavior? I believe that this is not the case, since upon inspection of the 
> temperature and pressure values, I found that they are both constant (within 
> statistical error) throughout the simulation. What may be the cause of this? 
> The following are some of my input parameters:
>
> cutoff-scheme  = Verlet
> nstlist= 40
> ns-type= Grid
> pbc= xyz
> verlet-buffer-tolerance= 0.005
> rlist  = 1.4
> coulombtype= PME
> coulomb-modifier   = Potential-shift
> rcoulomb-switch= 0
> rcoulomb   = 1.4
> rvdw   = 1.4
> DispCorr   = EnerPres
> table-extension= 1
> fourierspacing = 0.12
> pme-order  = 4
> tcoupl = V-rescale
> nsttcouple = 10
> pcoupl = Berendsen
> pcoupltype = Anisotropic
> nstpcouple = 10
> tau-p  = 2
> compressibility (3x3):
>compressibility[0]={ 4.5e-05,  4.5e-05,  4.5e-05}
>compressibility[1]={ 4.5e-05,  4.5e-05,  4.5e-05}
>compressibility[2]={ 4.5e-05,  4.5e-05,  4.5e-05}
> ref-p (3x3):
>ref-p[0]={ 1.0e+00,  1.0e+00,  1.0e+00}
>ref-p[1]={ 1.0e+00,  1.0e+00,  1.0e+00}
>ref-p[2]={ 1.0e+00,  1.0e+00,  1.0e+00}
> refcoord-scaling   = COM
>
> Any input would be greatly appreciated. Thanks in advance.
>
> Pelin Su Bulutoglu
>
The enthalpy should be there with any modern GROMACS version if you are
using constant-pressure simulations. Please double-check the gmx energy
options. Cp is trickier because you need quantum corrections.
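
For completeness, the classical estimate follows from enthalpy fluctuations in 
the NPT ensemble (quantum corrections are then applied on top, as noted above):

C_p \approx \frac{\langle H^2 \rangle - \langle H \rangle^2}{k_B T^2}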

--
David van der Spoel, Ph.D., Professor of Biology
Head of Department, Cell & Molecular Biology, Uppsala University.
Box 596, SE-75124 Uppsala, Sweden. Phone: +46184714205.
http://www.icm.uu.se





Re: [gmx-users] Enthalpy calculation in NPT ensemble

2018-08-14 Thread David van der Spoel

On 2018-08-14 at 16:07, Pelin S Bulutoglu wrote:

Dear GROMACS users,

I am working on simulating a glycine crystal using GAFF force field with CNDO 
point charges in an NPT ensemble. I want to calculate the enthalpy and Cp of 
the crystal. However, the list that comes up following gmx energy command does 
not include the enthalpy or PV options, as it would normally do for NPT 
simulations. Does this mean that the simulation does not exhibit NPT behavior? 
I believe that this is not the case, since upon inspection of the temperature 
and pressure values, I found that they are both constant (within statistical 
error) throughout the simulation. What may be the cause of this? The following 
are some of my input parameters:

cutoff-scheme  = Verlet
nstlist= 40
ns-type= Grid
pbc= xyz
verlet-buffer-tolerance= 0.005
rlist  = 1.4
coulombtype= PME
coulomb-modifier   = Potential-shift
rcoulomb-switch= 0
rcoulomb   = 1.4
rvdw   = 1.4
DispCorr   = EnerPres
table-extension= 1
fourierspacing = 0.12
pme-order  = 4
tcoupl = V-rescale
nsttcouple = 10
pcoupl = Berendsen
pcoupltype = Anisotropic
nstpcouple = 10
tau-p  = 2
compressibility (3x3):
   compressibility[0]={ 4.5e-05,  4.5e-05,  4.5e-05}
   compressibility[1]={ 4.5e-05,  4.5e-05,  4.5e-05}
   compressibility[2]={ 4.5e-05,  4.5e-05,  4.5e-05}
ref-p (3x3):
   ref-p[0]={ 1.0e+00,  1.0e+00,  1.0e+00}
   ref-p[1]={ 1.0e+00,  1.0e+00,  1.0e+00}
   ref-p[2]={ 1.0e+00,  1.0e+00,  1.0e+00}
refcoord-scaling   = COM

Any input would be greatly appreciated. Thanks in advance.

Pelin Su Bulutoglu

The enthalpy should be there with any modern GROMACS version if you are 
using constant-pressure simulations. Please double-check the gmx energy 
options. Cp is trickier because you need quantum corrections.


--
David van der Spoel, Ph.D., Professor of Biology
Head of Department, Cell & Molecular Biology, Uppsala University.
Box 596, SE-75124 Uppsala, Sweden. Phone: +46184714205.
http://www.icm.uu.se


[gmx-users] Enthalpy calculation in NPT ensemble

2018-08-14 Thread Pelin S Bulutoglu
Dear GROMACS users,

I am working on simulating a glycine crystal using GAFF force field with CNDO 
point charges in an NPT ensemble. I want to calculate the enthalpy and Cp of 
the crystal. However, the list that comes up following gmx energy command does 
not include the enthalpy or PV options, as it would normally do for NPT 
simulations. Does this mean that the simulation does not exhibit NPT behavior? 
I believe that this is not the case, since upon inspection of the temperature 
and pressure values, I found that they are both constant (within statistical 
error) throughout the simulation. What may be the cause of this? The following 
are some of my input parameters:

   cutoff-scheme  = Verlet
   nstlist= 40
   ns-type= Grid
   pbc= xyz
   verlet-buffer-tolerance= 0.005
   rlist  = 1.4
   coulombtype= PME
   coulomb-modifier   = Potential-shift
   rcoulomb-switch= 0
   rcoulomb   = 1.4
   rvdw   = 1.4
   DispCorr   = EnerPres
   table-extension= 1
   fourierspacing = 0.12
   pme-order  = 4
   tcoupl = V-rescale
   nsttcouple = 10
   pcoupl = Berendsen
   pcoupltype = Anisotropic
   nstpcouple = 10
   tau-p  = 2
   compressibility (3x3):
  compressibility[0]={ 4.5e-05,  4.5e-05,  4.5e-05}
  compressibility[1]={ 4.5e-05,  4.5e-05,  4.5e-05}
  compressibility[2]={ 4.5e-05,  4.5e-05,  4.5e-05}
   ref-p (3x3):
  ref-p[0]={ 1.0e+00,  1.0e+00,  1.0e+00}
  ref-p[1]={ 1.0e+00,  1.0e+00,  1.0e+00}
  ref-p[2]={ 1.0e+00,  1.0e+00,  1.0e+00}
   refcoord-scaling   = COM

Any input would be greatly appreciated. Thanks in advance.

Pelin Su Bulutoglu



[gmx-users] new threadripper

2018-08-14 Thread Harry Mark Greenblatt
BS”D

  The newest and most powerful AMD Threadripper, the 2990WX with 32 cores, 
coupled with two powerful GPUs seems quite attractive, but two of the CPU dies 
are not directly connected to the system memory.  Some reviews have warned that 
this will impact performance under certain workloads, as stated in this Tom's 
Hardware article:

https://www.tomshardware.com/reviews/amd-ryzen-threadripper-2-2990wx-2950x,5725.html

"That creates an architecture capable of incredible performance in 
heavily-threaded workloads that aren’t sensitive to memory throughput, but less 
impressive results in bandwidth-hungry applications that don't scale well with 
extra cores."

So where does GROMACS 2018 (and future versions) lie on that scale?  It seems to 
me that it lands in the first category, but I would like some confirmation of 
this.
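
As a concrete example of the run layout I have in mind: one thread-MPI rank per 
die with pinned threads, so that each domain's threads at least stay on one die. 
A sketch (rank and thread counts are illustrative for a 32-core, 4-die part; 
-ntmpi, -ntomp, and -pin are standard mdrun options):

# One rank per die, 8 OpenMP threads per rank, threads pinned to cores.
gmx mdrun -ntmpi 4 -ntomp 8 -pin on -deffnm md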

Thanks

Harry





Harry M. Greenblatt
Associate Staff Scientist
Dept of Structural Biology   
harry.greenbl...@weizmann.ac.il
Weizmann Institute of SciencePhone:  972-8-934-6340
234 Herzl St.Facsimile:   972-8-934-3361
Rehovot, 7610001
Israel


[gmx-users] How to search the mailing list efficiently.

2018-08-14 Thread Sergio Perez
Dear GROMACS community!

I have some questions about GROMACS, and I wanted to search the mailing list
in case the problem had already come up. But I could not search it easily the
way you can in a Google Group (as for PLUMED), and there is no single web page
with all the subjects (as for VMD). The only thing I found useful was a Google
site search like:

remd restart site:
https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/

Is there a better way? I don't want to be asking what has already been
asked.

Thanks in advance!

Sergio Pérez-Conesa

[gmx-users] Re: Re: Simulations with SWM4-DP water model in non-PBC

2018-08-14 Thread Chi Yuen Pak
Dear Prof. van der Spoel,
I have followed your suggestion and used the following commands with the 
thread-MPI-enabled "gmx" in GROMACS 2018.2:
gmx grompp -f eq.mdp -c em.gro -p swm4-dp.top -r posre.gro -o eq.tpr
mpirun -np 1 gmx mdrun -deffnm eq -ntomp 1 -ntmpi 2
then this gives me the following error message:
---
Fatal error: Shell particles are not implemented with domain decomposition, use a 
single rank

---
It seems to me that if multiple thread-MPI ranks are requested, GROMACS 
automatically turns on domain decomposition, which is not compatible with 
simulations of shell particles (e.g. the SWM4-DP water model). In this case, is 
it possible to carry out non-PBC simulations with the SWM4-DP polarizable water 
model using multiple cores in GROMACS 2018.2?
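
(The only configuration that runs without the error here is a single rank; a 
minimal sketch, assuming the same eq.tpr:

gmx mdrun -deffnm eq -ntmpi 1 -ntomp 1

With -ntmpi 1 there is no domain decomposition, and with the group scheme 
-ntomp is limited to 1 anyway.)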
The mdp, topology, and log file are linked at the end for your reference.
Thanks again for your help.

Best,
Chi Yuen

mdp: https://www.dropbox.com/s/xq6s63x4j30hl1h/eq.mdp.pdf?dl=0
topology: https://www.dropbox.com/s/0gj8ot2m6k8b019/swm4-dp.top.pdf?dl=0
log file: https://www.dropbox.com/s/o5frxpr7jwfqita/eq-gmx-2ntmpi.pdf?dl=0



 

On Tuesday, August 14, 2018 at 4:11 AM, David van der Spoel wrote:
 

 On 2018-08-13 at 11:49, Chi Yuen Pak wrote:
> Dear Prof. van der Spoel,
> 
> 
> Thanks for your reply. I tried to use the following commands (2 OpenMP 
> threads):
> 
> 
>  
> gmx_mpi grompp -f eq.mdp -c em.gro -p swm4-dp.top -r posre.gro -o eq.tpr
> 
> mpirun -np 1 gmx_mpi mdrun -ntomp 2 -deffnm eq
> 
Did you try with thread-mpi?

mpirun -np 1 gmx_mpi mdrun -ntomp 1 -ntmpi 2


> 
>  
> but this gave me the following error message in GROMACS 2018.2:
> 
> ---
> 
> Fatal error:
> 
> OpenMP threads have been requested with cut-off scheme Group, but these are 
> only supported with cut-off scheme Verlet
> 
> 
> 
> 
>  
> I can successfully run the simulation only when I set “-ntomp 1” with “Group” 
> in non-PBC (since “Verlet” as the cut-off scheme is only supported in full PBC 
> or pbc=xy).
> 
> 
>  
> The mdp, topology, and log files are linked at the end for your reference.
> 
> 
>  
> In addition, I have also tried the -pd option with GROMACS 4.6.7, but as far as I 
> know, particle decomposition has not been supported since GROMACS 5.0.
> 
> 
>  
> There is another problem with version 4.6. It seems that the flat-bottom 
> restraint is also not supported before version 5. It was reported that an 
> error message would appear if a flat-bottom restraint was used in versions 
> before GROMACS 5.0 (Ref.: https://redmine.gromacs.org/issues/1794). 
> Therefore, may I ask how you carried out the simulations with flat-bottom 
> restraints in GROMACS 4.6 in your 2011 paper?
> 
I guess we implemented the flat-bottom restraints in 4.6.7, but it was 
only included in GROMACS from 5.0.

> 
>  
> Thanks again for your help.
> 
> 
>  
> 
>  
> Best,
> 
> Chi Yuen
> 
> 
>  
> 
>  
> mdp: https://www.dropbox.com/s/xq6s63x4j30hl1h/eq.mdp.pdf?dl=0
> 
> 
>  
> topology: https://www.dropbox.com/s/0gj8ot2m6k8b019/swm4-dp.top.pdf?dl=0
> 
> 
>  
> log file when setting -ntomp to 1: 
> https://www.dropbox.com/s/p2pi2yvoo9x4d39/eq-1ntomp.log.pdf?dl=0
> 
> 
>  
> log file when setting -ntomp to 2: 
> https://www.dropbox.com/s/zn2pqr6qnrf192f/eq-2ntomp.log.pdf?dl=0
> 
> 
> 
> 
> 
> 
>  
> 
> 
> 
> 
> On Sunday, August 12, 2018 at 2:36 PM, David van der Spoel wrote:
>  
> 
>  On 2018-08-11 at 16:33, Chi Yuen Pak wrote:
>> Dear GROMACS users,
>>
>> I would like to ask a question concerning the simulation of a water 
>> droplet with the SWM4-DP polarizable water model in nonperiodic boundary 
>> conditions. The GROMACS version I use is 2018.2. I have only been able to 
>> successfully run the simulation with “Group” as the cut-off scheme in 
>> non-PBC. Multiple OpenMP threads are not supported with “Group”. Also, shell 
>> particles are not implemented with domain decomposition, so I have not been 
>> able to run the simulation with more than a single core. The following 
>> commands are what I use (with thread-MPI):
>> gmx_mpi grompp -f eq.mdp -c em.gro -p swm4-dp.top -r posre.gro -o eq.tpr
>> mpirun -np 1 gmx_mpi mdrun -ntomp 1 -deffnm eq
> 
> Are you sure it is not using more than one thread? What does it say in
> the log file?
> 
>> In the paper “Atomistic simulation of ion solvation in water explains 
>> surface preference of halides” by Prof. David van der Spoel in 2011 (doi: 
>> 10.1073/pnas.1017903108), they also used the SWM4-DP water model in non-PBC 
>> (no cut-offs) in GROMACS. I can only guess they used multiple cores for their 
>> simulations, or did they? Does anyone know if it is possible to use multiple 
>> cores to simulate SWM4-DP in non-PBC?
> That was run with GROMACS 4.6 using the -pd option.
> You could do that as well, of course.
>>
>> Attached are 

Re: [gmx-users] No default Ryckaert-Bell. types

2018-08-14 Thread Baolin Huang
Hi Dan and all,
Thanks for your answers. I find that the OPLS-AA force field does not contain 
the dihedral force-field parameters for my Chain_C. Actually, my Chain_C is a 
hydroxyapatite (HAP, an inorganic material) model. I also notice that the bond 
types of HAP are missing from the ffbonded.itp file.
The following are some parameters from my ffbonded.itp file:
[ dihedraltypes ]
;  i   j   k   l  func  coefficients
C3  CT  NT  C3   3   1.78866  3.49154  0.53555  -5.81576  0.0  0.0 ; (From wildcard) amine all-atom
[ bondtypes ]
;  i   j  func    b0        kb
  OW  HW   1    0.09572   502080.0   ; For TIP4F Water - wlj 1/98
 
My question is: what are the meanings of i, j, k, l, func, the coefficients, 
b0, and kb? And how can I find their values for HAP? Some [bonds] and 
[dihedrals] of my Chain_C are listed below.
[ bonds ]
 O1  P27
 O10 P27
 O13 P27
 O14 P27
 O2  P28
[ dihedrals ] 
O1  P27  O10  O13
O1  P27  O10  O14
O1  P27  O13  O10
O1  P27  O13  O14
O1  P27  O14  O13
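
[For reference, a sketch of what those fields mean in the standard GROMACS 
topology format; the entries below are illustrative values from OPLS-AA, not 
real HAP parameters:]

[ bondtypes ]
; i and j are the two atom types; func 1 is a harmonic bond,
; V(r) = 1/2 * kb * (r - b0)^2, with b0 in nm and kb in kJ mol^-1 nm^-2.
;  i   j  func    b0        kb
  OW  HW   1    0.09572   502080.0

[ dihedraltypes ]
; i, j, k, l are the four atom types along the dihedral; func 3 is
; Ryckaert-Bellemans, V(psi) = sum_{n=0..5} Cn * cos^n(psi), psi = phi - 180,
; and the six coefficients are C0..C5 in kJ/mol.
;  i   j   k   l  func   C0       C1       C2        C3      C4   C5
  C3  CT  NT  C3   3   1.78866  3.49154  0.53555  -5.81576  0.0  0.0

Parameters for an inorganic material like HAP are not in OPLS-AA; they would 
have to come from a published HAP force field and be added as new entries of 
this form.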
 
Thanks for your time and any suggestions.
Kind regards,
Baolin
 
 
 
 
 
-- Original --
From:  "Dan Gil";
Date:  Tue, Aug 14, 2018 04:39 AM
To:  "gmx-users"; 

Subject:  Re: [gmx-users] No default Ryckaert-Bell. types

 
Hello,

This is telling you that the OPLS-AA force field does not have the dihedral
force-field parameters for your molecule (topol_Other_chain_C.itp) of the
Ryckaert-Bell type. Your exact problem might be:
(1) There is something wrong with the molecule topology.
(2) OPLS-AA really doesn't have the dihedral force-field parameters, and
you need to use another force-field.

If your molecule is a simple hydrocarbon, it is unlikely that OPLS-AA has
missing parameters.

Dan

On Sat, Aug 11, 2018 at 10:52 AM, Baolin Huang  wrote:

> Dear All,
>   I experienced an error as follows: (I used the OPLS-AA force field)
>   ERROR 48600 [file topol_Other_chain_C.itp, line 144934]:
>   No default Ryckaert-Bell. types
>
>
> Anyone can help?
> Kind regards,
>
>
> --
>
> Dr. Baolin Huang
> Lecturer, School of Life Sciences, Guangzhou University