[gmx-users] Bilayer thickness error

2013-10-17 Thread Archana Sonawani-Jagtap
Hi,
This is my input file for calculating bilayer thickness in the absence of the peptide:
## Input file and input file parameters
coord_file        md.gro
file_type         gro
num_frames        1
num_lipid_types   1
resname1          POPC
atomname1         P8
solvent           SOL
ions              CL-
## Define the size and shape of the grid
box_size          solvent
grid              20
conserve_ratio    yes
## Define whether there is a protein embedded in the bilayer
protein           no
precision         1.3
P_value           5.0
## Define the desired output files and format
output_prefix     output
output_format     column
thickness         yes
area              no

I got 400 lines in one column in the output file for the average thickness
(calculated without the peptide).

When I run the following in Gnuplot, I get this warning:
splot 'output.frame1.20x20.average_thickness.dat' matrix using (1+$1):(1+$2):3
Warning: empty x range [1:1], adjusting to [0.99:1.01]
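
My guess is that with output_format column the 400 thickness values are written as a
single column, while splot ... matrix expects them laid out as a 20x20 grid, one row
per line. Is reshaping like this the right way to go (a sketch only; the awk line and
the name matrix.dat are just illustrative, assuming the values are in grid order)?

awk '{printf "%s%s", $1, (NR % 20 == 0 ? "\n" : " ")}' output.frame1.20x20.average_thickness.dat > matrix.dat
splot 'matrix.dat' matrix using (1+$1):(1+$2):3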

Please help me.
-- 
Archana Sonawani-Jagtap
Senior Research Fellow,
Biomedical Informatics Centre,
NIRRH (ICMR), Parel
Mumbai, India.
9960791339


[gmx-users] Problem with reading AMBER trajectories

2013-10-17 Thread anu chandra
Dear Gromacs users,

I am trying to use GROMACS to read AMBER trajectories (mdcrd) for a few
analyses. Unfortunately, I ended up with the following error.


GROMACS will now assume it to be a trajectory and will try to open it using
the VMD plug-ins.
This will only work in case the VMD plugins are found and it is a
trajectory format supported by VMD.

Using VMD plugin: crd (AMBER Coordinates)

Format of file md.crd does not record number of atoms.

---
Program g_covar, VERSION 4.6.1
Source code file: /usr/local/gromacs-4.6.1/src/gmxlib/trxio.c, line: 1035

Fatal error:
Not supported in read_first_frame: md.crd
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
---




While browsing the GROMACS mailing list, I came to know that it might
be a problem with the DLOPEN libraries. So I recompiled GROMACS with cmake
using the following command:


CMAKE_PREFIX_PATH=/usr/include/libltdl cmake
-DCMAKE_INSTALL_PREFIX=/usr/local/gromacs -DCMAKE_C_COMPILER=gcc
-DCMAKE_CXX_COMPILER=g++ -DFFTWF_LIBRARY=/usr/lib/libfftw3f.a
-DFFTWF_INCLUDE_DIR=/usr/lib/ ../


But the same problem came up again. Can anyone help me figure out what
went wrong with my GROMACS installation?
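
In case it matters, a workaround I am considering (a sketch only, assuming AmberTools'
cpptraj is available, that the parameter file is named complex.prmtop, and that this
cpptraj build can write GROMACS .trr) is to convert the trajectory into a format whose
header does record the number of atoms:

cpptraj -p complex.prmtop <<EOF
trajin md.crd
trajout md.trr
go
EOF

g_covar should then be able to read md.trr directly.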

Many thanks in advance.

Regards
Anu


Re: [gmx-users] OPLS-AA parameters for Phospho-threonine and serine

2013-10-17 Thread Andrea Spitaleri
Hi,
Have a look here:

http://haddock.science.uu.nl/services/HADDOCK/library.html

The force field used by HADDOCK is OPLSX, derived from OPLS. Maybe you can use them 
as a starting point.

Hope it helps

And

"Martin, Erik W"  ha scritto:


I've searched the literature and the internet and can't seem to find anything. I 
need to rerun some simulations I've run previously with OPLS-AA (and eventually 
GROMOS 54A7 when I'm done with OPLS) and need to include phosphorylated 
residues.  I've parameterized these residues in Amber and Charmm, so I'm willing to do 
it here if required… but I would be shocked if nobody has done this before.  Are 
the parameters for these residues available somewhere?

Thanks for the help,
Erik





[gmx-users] OPLS-AA parameters for Phospho-threonine and serine

2013-10-17 Thread Martin, Erik W

I've searched the literature and the internet and can't seem to find anything. I 
need to rerun some simulations I've run previously with OPLS-AA (and eventually 
GROMOS 54A7 when I'm done with OPLS) and need to include phosphorylated 
residues.  I've parameterized these residues in Amber and Charmm, so I'm willing to do 
it here if required… but I would be shocked if nobody has done this before.  Are 
the parameters for these residues available somewhere?

Thanks for the help,
Erik





Re: [gmx-users] Centering the system

2013-10-17 Thread Justin Lemkul



On 10/17/13 2:09 PM, Shima Arasteh wrote:

I used -fit or boxcenter or trans or anything else I could think of to solve 
my problem, but it did not work. Would you give me a hint, please?



trjconv -center

or

trjconv -fit transxy
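
A fuller sketch of such a command line (traj.xtc, topol.tpr and centered.xtc are
placeholder names; choose e.g. Protein for centering and System for output when
prompted):

trjconv -s topol.tpr -f traj.xtc -o centered.xtc -center -pbc mol -ur compact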

-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


Re: [gmx-users] Centering the system

2013-10-17 Thread Shima Arasteh
I used -fit or boxcenter or trans or anything else I could think of to solve 
my problem, but it did not work. Would you give me a hint, please?


Thanks a lot.



Sincerely,
Shima



On Wednesday, October 16, 2013 4:05 PM, Justin Lemkul  wrote:




On 10/16/13 8:29 AM, Shima Arasteh wrote:
>
>
> Dear gmx users,
>
> I have a system consisting of a lipid bilayer and a peptide. As the initial 
> configuration, the peptide is located in the center of the x-y plane above the lipid 
> bilayer. After running MD, the peptide shows interactions with the polar 
> groups. That's OK, but the peptide is near one edge of the x-y plane of the 
> bilayer. I'd like to know if there is any way to use the properties of the 
> PBC and see the peptide in the center of the x-y plane while interacting with 
> the polar groups?
>

trjconv has a number of ways to deal with this.  Please read trjconv -h.

-Justin

-- 
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441


== 


Re: [gmx-users] KALP in DPPC tutorial reg

2013-10-17 Thread Justin Lemkul



On 10/14/13 7:47 AM, Sathya wrote:

Hi,

I'm doing the KALP-15 in DPPC tutorial (Justin's tutorial):
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin/gmx-tutorials/membrane_protein/03_solvate.html
I have a few questions. Please help me solve them.

1) After the topol.top file has been generated by pdb2gmx, is it necessary to
add "DPPC 128" in the [ molecules ] section?



If there are lipids in the coordinate file, there have to be lipids in the 
topology.  The tutorial walks you through all of this, step by step.



2) I used "DPPC 128" in the [ molecules ] section, and when adding ions with
the following command:
grompp -f ions.mdp -c system_solv.gro -p topol.top -o ions.tpr
the command shows this error:

Fatal error:
number of coordinates in coordinate file (system_solv.gro, 27162) does not
match topology (topol.top, 33451)

Please tell me what I should change to solve this problem.



If you're adding ions, that means you should have already done the insertion, 
inflation/packing, and solvation steps.  You will not have 128 lipids at that 
point (as is stated in the tutorial).  You have a significant mismatch 
somewhere, though, and the difference is nowhere close to being only a problem 
with the lipids.
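
A quick way to pin the mismatch down (a sketch only; P8 is the phosphorus atom name in
the Berger DPPC used by the tutorial, I believe, so there is one per lipid - check your
own .gro) is to count what the coordinate file actually contains and make [ molecules ]
match that, not 128, once lipids have been deleted during the packing step:

grep DPPC system_solv.gro | grep -c P8    # number of DPPC molecules actually present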



3) If I didn't use DPPC in the [ molecules ] section of the .top file, then while running
equilibration, when make_ndx asks us to enter the protein+lipids group, I
could not find the DPPC option there.



You don't make index groups from the topology; you make them from coordinate 
files.  If DPPC wasn't an option when running make_ndx, then you didn't have 
lipids present.  I believe this question was just asked the other day.


Given the magnitude of problems you're experiencing, I would suggest starting 
the tutorial over and checking your output at every step more carefully.  There 
seem to be some fundamental problems in whatever you have done.


-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


[gmx-users] KALP in DPPC tutorial reg

2013-10-17 Thread Sathya
Hi,

I'm doing the KALP-15 in DPPC tutorial (Justin's tutorial):
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin/gmx-tutorials/membrane_protein/03_solvate.html
I have a few questions. Please help me solve them.

1) After the topol.top file has been generated by pdb2gmx, is it necessary to
add "DPPC 128" in the [ molecules ] section?

2) I used "DPPC 128" in the [ molecules ] section, and when adding ions with
the following command:
grompp -f ions.mdp -c system_solv.gro -p topol.top -o ions.tpr
the command shows this error:

Fatal error:
number of coordinates in coordinate file (system_solv.gro, 27162) does not
match topology (topol.top, 33451)

Please tell me what I should change to solve this problem.

3) If I didn't use DPPC in the [ molecules ] section of the .top file, then while running
equilibration, when make_ndx asks us to enter the protein+lipids group, I
could not find the DPPC option there.

Please help me solve these problems.

Thanks in advance.







Re: [gmx-users] genion doesn't recognize SOL in top file

2013-10-17 Thread Justin Lemkul



On 10/17/13 1:01 PM, sunyeping wrote:

Dear Gromacs users,
I am trying to add ions to my system using:
genion -s ions.tpr -o solv_ions.gro -p topol.top -pname NA -nname CL -np 8

but it returns an error message saying:

Fatal error:
No line with moleculetype 'SOL' found the [ molecules ] section of file 
'topol.top'

I checked the topol.top file, but SOL is indeed under [ molecules ] section:

[ molecules ]
; Compound        #mols
Protein_chain_A 1
Protein_chain_B 1
DRG 1
SOL 86574

I know this problem has been discussed here before, but I can't fix it with 
the available information. Could you help me with it?



You probably have a non-Unix newline character somewhere.  Run the .top file 
through dos2unix and try again.
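
A minimal sequence (a sketch, assuming dos2unix is installed; the genion line is just
your original command rerun):

dos2unix topol.top
genion -s ions.tpr -o solv_ions.gro -p topol.top -pname NA -nname CL -np 8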


-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


[gmx-users] genion doesn't recognize SOL in top file

2013-10-17 Thread sunyeping
Dear Gromacs users,
I am trying to add ions to my system using:
genion -s ions.tpr -o solv_ions.gro -p topol.top -pname NA -nname CL -np 8

but it returns an error message saying:

Fatal error:
No line with moleculetype 'SOL' found the [ molecules ] section of file 
'topol.top'

I checked the topol.top file, but SOL is indeed under [ molecules ] section:

[ molecules ]
; Compound        #mols
Protein_chain_A 1
Protein_chain_B 1
DRG 1
SOL 86574

I know this problem has been discussed here before, but I can't fix it with 
the available information. Could you help me with it?

Thanks in advance.
Yeping Sun
Institute of Microbiology, Chinese Academy of Sciences



Re: [gmx-users] default -rdd with distance restraints seems too large

2013-10-17 Thread XAvier Periole

Hi Chris,

I mentioned that PS would have helped! I am sorry about the confusion. I should 
have been more clear. I guess you have not followed the particle decomposition 
threads lately :)) 

The PD option has been associated with some serious issues lately … notably I 
noticed it does not work well in combination with REMD. Some info is not 
communicated correctly and thus a few variables are wrong. I forgot which ones 
… 

Most recently Floris (a PhD student with us in Groningen) noticed that 
turning PD on would lead to severe buckling of a membrane, to the point that the 
system would crash. This happens quite fast. 

If you can use an older version of GMX (4.0.7) this would be fine, but starting 
with 4.5.5 things are ugly :))

I hope this helps.

On Oct 17, 2013, at 5:57 PM, Christopher Neale  
wrote:

> Thanks for the hint XAvier.
> 
> Unfortunately, I get crashes with particle decomposition (see below). If I 
> use either DD or PD, I can run on up to 2 threads 
> without adjusting -rdd or -dds. I can only use >2 threads with DD if I set 
> -rdd 2.8. If I try to use more than 2 threads with PD, 
> I get lincs problems and immediate crashes. If I export GMX_MAXCONSTRWARN=-1 
> with the same setup, then I get a segfault immediately. 
> Note, however, that if I use constraints=none and set my timestep to 0.5 fs, 
> I can indeed use PD with 8 threads (without exporting GMX_MAXCONSTRWARN). 
> Also note that I am using the SD integrator, but I just tested and PD also 
> gives me an error with the md integrator.
> (( I don't think that the crashes have anything to do with improper setup. 
> These runs are fine in serial or in parallel as described 
> above and only ever "explode"/crash with PD and >2 threads, for which they do 
> so immediately )).
> 
> Here is the error that I get when I 
> 
> Step 0, time 0 (ps)  LINCS WARNING
> relative constraint deviation after LINCS: 
> rms 218.843810, max 8135.581543 (between atoms 15623 and 15624)
> bonds that rotated more than 30 degrees: 
> atom 1 atom 2  angle  previous, current, constraint length
>  13908  13916   90.80.2130   0.8066  0.1538 
>  13916  13917   90.30.2402   0.7979  0. 
>  13916  13918   90.20.2403   0.8452  0.
>  13916  13919   89.31.3408   0.9956  0.1430
> ...
> ...
>  22587  22589   31.70.4648   0.1144  0.
>  22587  22590   90.20.4168   0.1273  0.
> 
> Wrote pdb files with previous and current coordinates
> starting mdrun 'Gallium Rubidium Oxygen Manganese Argon Carbon Silicon in 
> water'
> 500 steps,  1.0 ps.
> 
> WARNING: Listed nonbonded interaction between particles 13908 and 13920
> at distance 3f which is larger than the table limit 3f nm.
> 
> This is likely either a 1,4 interaction, or a listed interaction inside
> a smaller molecule you are decoupling during a free energy calculation.
> Since interactions at distances beyond the table cannot be computed,
> they are skipped until they are inside the table limit again. You will
> only see this message once, even if it occurs for several interactions.
> 
> IMPORTANT: This should not happen in a stable simulation, so there is
> probably something wrong with your system. Only change the table-extension
> distance in the mdp file if you are really sure that is the reason.
> 
> 
> 
> step 0: Water molecule starting at atom 39302 can not be settled.
> Check for bad contacts and/or reduce the timestep if appropriate.
> 
> step 0: Water molecule starting at atom 53072 can not be settled.
> Check for bad contacts and/or reduce the timestep if appropriate.
> 
> Step 0, time 0 (ps)  LINCS WARNING
> relative constraint deviation after LINCS:
> rms 2569455308.447471, max 215672291328.00 (between atoms 14054 and 14055)
> bonds that rotated more than 30 degrees:
> atom 1 atom 2  angle  previous, current, constraint length
>  13442  13444   90.00.   0.1147  0.
>  13503  13506   40.80.1538   0.2002  0.1538
>  13506  13507   45.20.   0.1541  0.
> ...
> ...
>  19020  19023   89.80.1534 66420.9531  0.1530
> ;;dispcorr = EnerPres  ;; not using for CHARMM simulations
> 
> 
> ###
> 
> mdp options:
> 
> constraints = all-bonds
> lincs-iter =  1
> lincs-order =  6
> constraint_algorithm =  lincs
> integrator = sd
> dt = 0.002
> tinit = 0
> nsteps = 500
> nstcomm = 1
> nstxout = 500
> nstvout = 500
> nstfout = 500
> nstxtcout = 500
> nstenergy = 500
> nstlist = 10
> nstlog=0 ; reduce log file size
> ns_type = grid
> vdwtype = switch
> rlist = 1.2
> rlistlong = 1.3
> rvdw = 1.2
> rvdw-switch = 0.8
> rcoulomb = 1.2
> coulombtype = PME
> ewald-rtol = 1e-5
> optimize_fft = yes
> fourierspacing = 0.12
> fourier_nx = 0
> fourier_ny = 0
> fourier_nz = 0
> pme_order = 4
> tc_grps =  System   
> tau_t   =  1.0   
> ld_seed =  -1   
> ref_t = 310
> gen_temp = 310
> 

Re: [gmx-users] g_sham

2013-10-17 Thread lloyd riggs

This is my own experience; someone may have better suggestions.  First, you can look 
on the internet for .py, .c++ or Java matrix-manipulation tools (small programs run in 
bash shells).  These let you turn the output from g_sham or other (2D or 3D) tools 
into matrices, which can then be fed into qtiplot/scidavis matrices.  You can also 
sum/n all of these for Gaussians to get your overall realistic maps.  qtiplot makes 
nice images of matrices with thermal maps.  Also, with the 1980-era .xpm files, you 
can just cut out the matrix portion, feed it into qtiplot, and do all of the same.  
There's a setting for real values (e.g. 0.335587) vs 0 and 1, but I forget the 
details; then do the same for your Gaussian and plot these in the same software.  
Mostly all of this is based on simple matrix manipulation in .py or .C++ found for 
free on the internet, but it is doable for any project.  Others might have better 
suggestions.

 

Stephan Watkins

 

Sent: Monday, 14 October 2013 at 13:54
From: "pratibha kapoor" 
To: gmx-users@gromacs.org
Subject: [gmx-users] g_sham

Dear all gromacs users

I am creating a free energy landscape using g_sham, but my axes are not
getting labelled. I have searched the archive and found that using the -xmin and
-xmax options we can label them.
I first created my 2D projection .xvg file using
g_anaeig -f *.xtc -s *.tpr -first 1 -last 2 -2d *.xvg -v *.trr
and then found the min and max values for both vectors,
say for vector 1 min: -2.25 and max: 1.83
and for vector 2 min: -1.60 and max: 2.22
and then used:
g_sham -f *.xvg -ls *.xpm -notime -xmin -2.25 -1.60 0 -xmax 1.83 2.22 0
and then converted *.xpm to *.eps using
xpm2ps -f *.xpm -o *.eps -rainbow blue
This way I got an .eps file with only one axis (the x axis) labelled, and the
following line appeared:
Auto tick spacing failed for Y-axis, guessing 1.19375

I would like to ask: is this way of labelling the axes correct? If yes, why
didn't the y axis get labelled, and how can I solve the problem?

Thanks in advance.




[gmx-users] default -rdd with distance restraints seems too large

2013-10-17 Thread Christopher Neale
Thanks for the hint XAvier.

Unfortunately, I get crashes with particle decomposition (see below). If I use 
either DD or PD, I can run on up to 2 threads 
without adjusting -rdd or -dds. I can only use >2 threads with DD if I set -rdd 
2.8. If I try to use more than 2 threads with PD, 
I get lincs problems and immediate crashes. If I export GMX_MAXCONSTRWARN=-1 
with the same setup, then I get a segfault immediately. 
Note, however, that if I use constraints=none and set my timestep to 0.5 fs, I 
can indeed use PD with 8 threads (without exporting GMX_MAXCONSTRWARN). 
Also note that I am using the SD integrator, but I just tested and PD also 
gives me an error with the md integrator.
(( I don't think that the crashes have anything to do with improper setup. 
These runs are fine in serial or in parallel as described 
above and only ever "explode"/crash with PD and >2 threads, for which they do 
so immediately )).

Here is the error that I get when I 

Step 0, time 0 (ps)  LINCS WARNING
relative constraint deviation after LINCS: 
rms 218.843810, max 8135.581543 (between atoms 15623 and 15624)
bonds that rotated more than 30 degrees: 
 atom 1 atom 2  angle  previous, current, constraint length
  13908  13916   90.80.2130   0.8066  0.1538 
  13916  13917   90.30.2402   0.7979  0. 
  13916  13918   90.20.2403   0.8452  0.
  13916  13919   89.31.3408   0.9956  0.1430
...
...
  22587  22589   31.70.4648   0.1144  0.
  22587  22590   90.20.4168   0.1273  0.

Wrote pdb files with previous and current coordinates
starting mdrun 'Gallium Rubidium Oxygen Manganese Argon Carbon Silicon in water'
500 steps,  1.0 ps.

WARNING: Listed nonbonded interaction between particles 13908 and 13920
at distance 3f which is larger than the table limit 3f nm.

This is likely either a 1,4 interaction, or a listed interaction inside
a smaller molecule you are decoupling during a free energy calculation.
Since interactions at distances beyond the table cannot be computed,
they are skipped until they are inside the table limit again. You will
only see this message once, even if it occurs for several interactions.

IMPORTANT: This should not happen in a stable simulation, so there is
probably something wrong with your system. Only change the table-extension
distance in the mdp file if you are really sure that is the reason.



step 0: Water molecule starting at atom 39302 can not be settled.
Check for bad contacts and/or reduce the timestep if appropriate.

step 0: Water molecule starting at atom 53072 can not be settled.
Check for bad contacts and/or reduce the timestep if appropriate.

Step 0, time 0 (ps)  LINCS WARNING
relative constraint deviation after LINCS:
rms 2569455308.447471, max 215672291328.00 (between atoms 14054 and 14055)
bonds that rotated more than 30 degrees:
 atom 1 atom 2  angle  previous, current, constraint length
  13442  13444   90.00.   0.1147  0.
  13503  13506   40.80.1538   0.2002  0.1538
  13506  13507   45.20.   0.1541  0.
...
...
  19020  19023   89.80.1534 66420.9531  0.1530
;;dispcorr = EnerPres  ;; not using for CHARMM simulations


###

mdp options:

constraints = all-bonds
lincs-iter =  1
lincs-order =  6
constraint_algorithm =  lincs
integrator = sd
dt = 0.002
tinit = 0
nsteps = 500
nstcomm = 1
nstxout = 500
nstvout = 500
nstfout = 500
nstxtcout = 500
nstenergy = 500
nstlist = 10
nstlog=0 ; reduce log file size
ns_type = grid
vdwtype = switch
rlist = 1.2
rlistlong = 1.3
rvdw = 1.2
rvdw-switch = 0.8
rcoulomb = 1.2
coulombtype = PME
ewald-rtol = 1e-5
optimize_fft = yes
fourierspacing = 0.12
fourier_nx = 0
fourier_ny = 0
fourier_nz = 0
pme_order = 4
tc_grps =  System   
tau_t   =  1.0   
ld_seed =  -1   
ref_t = 310
gen_temp = 310
gen_vel = yes
unconstrained_start = no
gen_seed = -1
Pcoupl = berendsen
pcoupltype = semiisotropic
tau_p = 4 4
compressibility = 4.5e-5 4.5e-5
ref_p = 1.0 1.0
disre = simple
disre-fc = 5


-- original message --

Yes it is a pity!

But particle decomposition helps :)) well helped!



Re: [gmx-users] default -rdd with distance restraints seems too large

2013-10-17 Thread XAvier Periole

Yes it is a pity! 

But particle decomposition helps :)) well helped! 

> 
> It's a shame that long distance restraints limit the parallelization so much, 
> but it is understandable. Thanks for helping me with this.
> 
> Chris.
> 
> -- original message --
> 
> Initializing Domain Decomposition on 8 nodes
> Dynamic load balancing: auto
> Will sort the charge groups at every domain (re)decomposition
> Initial maximum inter charge-group distances:
>two-body bonded interactions: 2.636 nm, Dis. Rest., atoms 1701 4425
>  multi-body bonded interactions: 0.479 nm, CMAP Dih., atoms 1062 1081
> Minimum cell size due to bonded interactions: 2.899 nm
> Maximum distance for 7 constraints, at 120 deg. angles, all-trans: 1.172 nm
> Estimated maximum distance required for P-LINCS: 1.172 nm
> Using 0 separate PME nodes, as there are too few total
> nodes for efficient splitting
> Scaling the initial minimum size with 1/0.8 (option -dds) = 1.25
> Optimizing the DD grid for 8 cells with a minimum initial size of 3.624 nm
> The maximum allowed number of cells is: X 1 Y 1 Z 2
> 



[gmx-users] default -rdd with distance restraints seems too large

2013-10-17 Thread Christopher Neale
Indeed, sorry that I didn't notice that, Mark. It looks as if the two-body bonded 
interaction gets multiplied by 1.1/0.8, so I suppose that this is working as it 
should.
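
For the record, the numbers in the log below are consistent with that:

2.636 nm * 1.1  ≈ 2.899 nm   (minimum cell size due to bonded interactions)
2.899 nm / 0.8  ≈ 3.624 nm   (minimum initial cell size after the 1/0.8 -dds scaling)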

It's a shame that long distance restraints limit the parallelization so much, 
but it is understandable. Thanks for helping me with this.

Chris.

-- original message --

Initializing Domain Decomposition on 8 nodes
Dynamic load balancing: auto
Will sort the charge groups at every domain (re)decomposition
Initial maximum inter charge-group distances:
two-body bonded interactions: 2.636 nm, Dis. Rest., atoms 1701 4425
  multi-body bonded interactions: 0.479 nm, CMAP Dih., atoms 1062 1081
Minimum cell size due to bonded interactions: 2.899 nm
Maximum distance for 7 constraints, at 120 deg. angles, all-trans: 1.172 nm
Estimated maximum distance required for P-LINCS: 1.172 nm
Using 0 separate PME nodes, as there are too few total
 nodes for efficient splitting
Scaling the initial minimum size with 1/0.8 (option -dds) = 1.25
Optimizing the DD grid for 8 cells with a minimum initial size of 3.624 nm
The maximum allowed number of cells is: X 1 Y 1 Z 2



Re: [gmx-users] parallelization

2013-10-17 Thread Carsten Kutzner
Hi,

On Oct 17, 2013, at 2:25 PM, pratibha kapoor  wrote:

> Dear gromacs users
> 
> I would like to run my simulations on all nodes (8) with full utilisation of
> all cores (2 each). I have compiled GROMACS version 4.6.3 using both thread-
> MPI and OpenMPI. I am using the following command:
> mpirun -np 8 mdrun_mpi -v -s -nt 2 -s *.tpr -c *.gro
> But I am getting the following error:
> Setting the total number of threads is only supported with thread-MPI and
> Gromacs was compiled without thread-MPI .
> Although during compilation I have used:
> cmake .. -DGMX_MPI=ON -DGMX_THREAD_MPI=ON
You can use either MPI or thread-MPI, not both. But you can combine MPI and OpenMP with
-DGMX_MPI=ON -DGMX_OPENMP=ON

> If I don't use the -nt option, I can see that all the processors (8) are
> utilised, but I am not sure whether all cores are being utilised. For
You can run with
mpirun -np 16 mdrun_mpi -v -s *.tpr -c *.gro

to use all 16 available cores.
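
A sketch of the hybrid MPI + OpenMP alternative (assuming 8 nodes with 2 cores each and
a build configured with -DGMX_MPI=ON -DGMX_OPENMP=ON; -ntomp is the GROMACS 4.6 flag
for OpenMP threads per MPI rank, and topol.tpr/confout.gro are placeholder names):

mpirun -np 8 mdrun_mpi -ntomp 2 -v -s topol.tpr -c confout.gro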

> version 4.6.3 without MPI, I know by default GROMACS uses all the threads,
> but I am not sure if the MPI version uses all threads or not.
Take a look at the md.log output file; there it is written
what GROMACS actually used.

Best,
  Carsten

> Any help is appreciated.


--
Dr. Carsten Kutzner
Max Planck Institute for Biophysical Chemistry
Theoretical and Computational Biophysics
Am Fassberg 11, 37077 Goettingen, Germany
Tel. +49-551-2012313, Fax: +49-551-2012302
http://www.mpibpc.mpg.de/grubmueller/kutzner
http://www.mpibpc.mpg.de/grubmueller/sppexa



[gmx-users] parallelization

2013-10-17 Thread pratibha kapoor
Dear gromacs users

I would like to run my simulations on all nodes (8) with full utilisation of
all cores (2 each). I have compiled GROMACS version 4.6.3 using both thread-
MPI and OpenMPI. I am using the following command:
mpirun -np 8 mdrun_mpi -v -s -nt 2 -s *.tpr -c *.gro
But I am getting the following error:
Setting the total number of threads is only supported with thread-MPI and
Gromacs was compiled without thread-MPI .
Although during compilation I have used:
cmake .. -DGMX_MPI=ON -DGMX_THREAD_MPI=ON

If I don't use the -nt option, I can see that all the processors (8) are
utilised, but I am not sure whether all cores are being utilised. For
version 4.6.3 without MPI, I know by default GROMACS uses all the threads,
but I am not sure if the MPI version uses all threads or not.
Any help is appreciated.


Re: [gmx-users] MD of lipid bilayer

2013-10-17 Thread Justin Lemkul



On 10/16/13 10:32 PM, Sathya wrote:

Dear Justin,

I am currently working on a dynamics study of a DPPC lipid bilayer and
chromium ions.
I want to know how to insert chromium ions into a lipid bilayer. Is
there any software for this?
Please suggest how to insert chromium ions into DPPC.



They're just ions.  Insert them with genion.  Whether or not any force field can 
adequately describe Cr is an entirely different matter...
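
For the mechanics only, a sketch (CR and the count of 10 are placeholders; the ion name
must correspond to an ion type that your chosen force field actually defines, which for
chromium it most likely does not out of the box):

genion -s ions.tpr -o system_ions.gro -p topol.top -pname CR -np 10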



When I use DPPC in pdb2gmx directly, I get an error like 'DPP not
found'; what should I do now?
Can I use the DPPC layer directly in pdb2gmx or not? How do I rectify
this error?



I assume what you're seeing is 
http://www.gromacs.org/Documentation/Errors#Residue_'XXX'_not_found_in_residue_topology_database.


Either you must make the force field understand what those residues are, 
e.g. http://www.gromacs.org/Documentation/How-tos/Adding_a_Residue_to_a_Force_Field, 
or use pre-built topologies.  There are generally .itp files available for most 
common lipids in most of the common force fields, making life vastly easier 
since the .top is very straightforward to write by hand at that point.


-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


Re: [gmx-users] lipid tail order

2013-10-17 Thread Justin Lemkul



On 10/17/13 4:57 AM, Archana Sonawani-Jagtap wrote:

Hi,

I want to plot:
1. -SCD (lipid tail order parameter) profiles for both chains (sn-1 and
sn-2)

2. Lateral diffusion of lipids

3. density profiles

  in the presence and absence of the peptide.

I have plotted the above parameters in the presence of the peptide; to calculate them
in the absence of the peptide, do I need to simulate the POPC bilayer separately?



I don't know how you'd infer the information otherwise, so yes, you need control 
simulations.



I have used the POPC128a bilayer from Peter Tieleman's site. I have simulations
of the POPC bilayer with the peptide inserted in it for 60 ns, so can I simulate this
bilayer to 60 ns without the peptide?



60 ns may or may not be enough to converge the quantities of interest.  Membrane 
simulations are often 100 ns in length or more.



What should the NVT and NPT parameters be? Can I simulate in the same manner
as Justin's KALP tutorial, skipping the peptide insertion part?



Run settings are not a function of system composition; they are a function of 
the force field.  You don't really need to do any sort of construction at all. 
The pre-equilibrated bilayers serve as a reasonable starting point for further 
simulation.
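
For the order-parameter part specifically, the usual route in GROMACS 4.x is g_order; a
sketch, assuming an index file sn1.ndx that you build yourself with one group per sn-1
tail carbon, in order along the chain (the file names are placeholders):

g_order -s topol.tpr -f traj.xtc -n sn1.ndx -d z -od deuter_sn1.xvg

g_density (density profiles) and g_msd -lateral z (lateral diffusion) cover the other
two quantities.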


-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


[gmx-users] (no subject)

2013-10-17 Thread Archana Sonawani-Jagtap
Hi,

Please tell me what is wrong with my input file. I am not getting the area per
lipid (APL) along with standard deviation values.

Following is the input and output for calculating APL
Input file
coord_file        md.gro
file_type         gro
num_frames        1
num_lipid_types   1
resname1          POPC
atomname1         P8
solvent           SOL
ions              CL-
## Define the size and shape of the grid
box_size          vectors
grid              200
conserve_ratio    yes
## Define whether there is a protein embedded in the bilayer
protein           yes
precision         1.3
P_value           6.0
## Define the desired output files and format
output_prefix     output
output_format     column
thickness         no
area              yes

Output:
Reading from "PC_APL"...
You defined the coordinate file as md.gro
You specified that this file contains 1 frame(s)
You defined a lipid residue name as POPC (atom(s): P8)
You defined the solvent as SOL
You defined the ions as CL-

Deconstructing lipid bilayer...
Lower X limit: 0.4365999   Upper X limit: 6.696
Lower Y limit: 0.2762   Upper Y limit: 6.476
Cross sectional area (box size) was determined from: a line in the coord file
Cross sectional area of the system: 6.25940 x 6.19980 nanometers
Lower Z limit: -0.328   Upper Z limit: 5.269
The middle (in the Z-direction) is 2.4705
In the top leaflet, the Z values range from 4.223 to 5.269
In the bottom leaflet, the Z values range from -0.328 to 1.616

Simulating periodic boundary conditions...Done
Dividing the periodic array into a top and bottom leaflet...Done

Looking for offending protein atoms...
There are 10 protein atoms within the headgroups of the top leaflet
There are 18 protein atoms within the headgroups of the bottom leaflet

Simulating periodic boundary conditions for the protein atoms...Done
Dividing the periodic array into a top and bottom leaflet...Done

Generating the grid...
Your system is bigger in the X-direction
There are 200 grid points in the X direction, spaced every 0.03145 nanometers
There are 198 grid points in the Y direction, spaced every 0.03147 nanometers
Note: the intervals may not be exactly the same in order to have a
whole number of grid points

Analyzing the bilayer...

Calculating area per lipid head group...
The lateral area of the system is 3880.70281 sq. Angstroms (per side)
When you don't account for any protein atoms:
The average area per lipid in the top leaflet is 61.59846 sq. Angstroms
The average area per lipid in the bottom leaflet is 59.70312 sq. Angstroms
When you do take the protein atoms into account:
The new area per lipid in the top leaflet is 60.66670 sq. Angstroms
The new area per lipid in the bottom leaflet is 54.70072 sq. Angstroms
The top leaflet lipid areas will be printed to output.frame1.200x198.top_areas.dat
The bottom leaflet lipid areas will be printed to output.frame1.200x198.bottom_areas.dat

-- 
Archana Sonawani-Jagtap
Senior Research Fellow,
Biomedical Informatics Centre,
NIRRH (ICMR), Parel
Mumbai, India.
9960791339


Re: [gmx-users] Is the website of Martini Force Field down for maintenance?

2013-10-17 Thread XAvier Periole

Try cgmartini.nl


On Oct 16, 2013, at 10:29 PM, 朱文鹏  wrote:

> Dear GMX users,
> 
> I am going to do some coarse-grained simulations in which the lipid
> bilayer is covered by polysaccharide. I remember the website of the
> Martini force field (http://md.chem.rug.nl/cgmartini/) provides a database
> for sugars, including .itp and .gro files of long chains of different
> polysaccharides.
> 
> But I cannot open the website now. Is it down for maintenance, or has it
> changed to another address? Do you have these .itp and .gro files of
> polysaccharides for the Martini force field?
> 
> Thank you very much for your help.
> 
> Best,
> Jason



[gmx-users] postdoctoral position in computational biophysics in Denmark

2013-10-17 Thread himanshu khandelia
A postdoctoral position in simulations of membranes and membrane proteins
is available from 1 December 2013 (starting date flexible) for one year
(with possible extension) at the University of Southern Denmark (SDU),
Odense in the group of Dr Himanshu Khandelia.


Knowledge of statistical physics is necessary and some experience in theory
and/or simulations is highly beneficial. The successful candidate(s) will
have access to the DCSC-funded "Horseshoe" super-cluster of more than 2,000
processors at SDU, and large European supercomputing installations to
perform their simulations.


The projects revolve around molecular simulations of biomembrane-associated
phenomena, including simulations of ion transport.


The successful candidate will work in a highly collaborative atmosphere at
the Center for Biomembrane Physics MEMPHYS together with theorists and
experimental scientists.


For further information please contact Associate Professor Himanshu
Khandelia, hkhan...@sdu.dk.


Application, salary etc.

First time postdocs can expect a salary ~ Euro 3200 after tax. Experienced
postdocs get more. Free health benefits etc.


Please apply at the original job posting here:


https://ssl1.peoplexs.com/Peoplexs22/CandidatesPortalNoLogin/Vacancy.cfm?PortalID=3794&VacatureID=612049&Vacancy=Postdoc%20in%20Simulations%20of%20Membranes%20and%20Membrane%20Proteins#top

-- 
-
Himanshu Khandelia, PhD
Associate Professor
MEMPHYS, Center for BioMembrane Physics: www.memphys.sdu.dk
University of Southern Denmark (SDU)
Campusvej 55, Odense M 5230, Denmark

-


[gmx-users] lipid tail order

2013-10-17 Thread Archana Sonawani-Jagtap
Hi,

I want to plot:
1. -SCD (lipid tail order parameter) profiles for both chains (sn-1 and
sn-2)

2. Lateral diffusion of lipids

3. Density profiles

 in the presence and absence of the peptide.

I have plotted the above parameters in the presence of the peptide; to calculate them
in the absence of the peptide, do I need to simulate the POPC bilayer separately?

I have used the POPC128a bilayer from Peter Tieleman's site. I have simulations
of the POPC bilayer with the peptide inserted in it for 60 ns, so can I simulate this
bilayer to 60 ns without the peptide?

What should the NVT and NPT parameters be? Can I simulate in the same manner
as Justin's KALP tutorial, skipping the peptide insertion part?

Please help me...

thanks in advance
-- 
Archana Sonawani-Jagtap
Senior Research Fellow,
Biomedical Informatics Centre,
NIRRH (ICMR), Parel
Mumbai, India.
9960791339


Re: [gmx-users] There is no domain decomposition for 16 nodes that is compatible with the given box and a minimum cell size of 0.826223 nm

2013-10-17 Thread Mark Abraham
4.5 can only handle about 500-1000 atoms per processor. Details vary.
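
With ~3,300 atoms that means roughly 4-6 ranks at most; a sketch for a thread-MPI build
on a single node (topol.tpr is a placeholder name):

mdrun -nt 4 -s topol.tpr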

Mark
On Oct 17, 2013 5:39 AM, "Nilesh Dhumal"  wrote:

> Thanks for you reply.
>
> I am doing a simulation of the ionic liquid BMIM + CL. The total number of atoms
> is 3328.
>
> Nilesh
>
> > Assuming you're using LINCS, from the manual:
> > "With domain decomposition, the cell size is limited by the distance
> > spanned by *lincs-order*+1 constraints."
> > Assuming a default lincs-order (4), 0.82nm seems a fairly sane distance
> > for
> > 5 bonds.
> >
> > Which means that you're probably using too many nodes for the size of
> your
> > system.
> >
> > Hope that helps. If it doesn't you'll need to provide some information
> > about your system.
> >
> > -Trayder
> >
> >
> >
> > On Thu, Oct 17, 2013 at 1:27 PM, Nilesh Dhumal
> > wrote:
> >
> >> Hello,
> >>
> >> I am getting the following error for simulation. I am using Gromacs
> >> VERSION 4.5.5 and running on 24 processors.
> >>
> >> Should I reduce the number of processors, or is the problem in the bonded
> >> parameters? If I use the -nt 1 option, I can run the simulation.
> >>
> >> Fatal error:
> >> There is no domain decomposition for 16 nodes that is compatible with
> >> the
> >> given box and a minimum cell size of 0.826223 nm
> >> Change the number of nodes or mdrun option -rdd or -dds
> >> Look in the log file for details on the domain decomposition
> >>
> >>
> >> Nilesh
> >>


Re: [gmx-users] default -rdd with distance restraints seems too large

2013-10-17 Thread Mark Abraham
Hi,

The log file gives a breakdown of how the minimum cell size was computed.
What does it say?

Mark
On Oct 17, 2013 5:17 AM, "Christopher Neale" 
wrote:

> I have a system that also uses a set of distance restraints
>
> The box size is:
>7.12792   7.12792  10.25212
>
> When running mdrun -nt 8, I get:
>
> Fatal error:
> There is no domain decomposition for 8 nodes that is compatible with the
> given box and a minimum cell size of 3.62419 nm
>
> However, the largest restrained distance is 2.0 nm and the largest
> displacement between restrained atoms is 2.63577 nm
>
> So why does mdrun set -rdd to 3.62419 nm ?
>
> If I run mdrun -rdd 2.8 everything works fine.
>
> Thank you,
> Chris.
>


[gmx-users] Free energy of solvation of large molecule

2013-10-17 Thread Jernej Zidar
Hi,
  I'm trying to calculate the free energy of solvation of a relatively
large polymer molecule (161 atoms). I went through the free energy
tutorial published on J. Lemkul's web page but when trying to apply
the same approach to my case, the simulations typically fail. The
files for one such case are here:
https://www.dropbox.com/s/z5z3ip767dgwloh/coul0.0.tar.gz

  I looked all over the internet to find a similar use case but most
people seem to have studied small organic molecules, which means
computing the solvation free energy for my molecule is far from
trivial.
  I understand I'll have to do simulations at many lambda points for
rather long periods of time, but how do I run them in a stable way?
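
For concreteness, this is roughly the decoupling block I would expect per lambda
window in GROMACS 4.6 (a sketch only; the moleculetype name Polymer and the lambda
vectors are placeholders; soft-core van der Waals is usually what keeps the endpoint
windows stable for a large solute):

; free-energy section (sketch only)
free_energy          = yes
init_lambda_state    = 0            ; change per window
couple-moltype       = Polymer      ; placeholder: the [ moleculetype ] being decoupled
couple-lambda0       = vdw-q
couple-lambda1       = none
couple-intramol      = no
coul_lambdas         = 0.00 0.25 0.50 0.75 1.00 1.00 1.00 1.00 1.00
vdw_lambdas          = 0.00 0.00 0.00 0.00 0.00 0.25 0.50 0.75 1.00
sc-alpha             = 0.5
sc-power             = 1
sc-sigma             = 0.3
nstdhdl              = 100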

Thanks in advance,
Jernej


Re: [gmx-users] mistake occured in Gromacs install

2013-10-17 Thread Mark Abraham
You do need a C compiler, not a Fortran one, and IIRC gcc 4.6.2 has some
known issues. Please follow the instructions in the install guide and get
the latest compiler you can.
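
Minimal sketches of both routes (gcc-4.7 is only an example of a newer compiler;
GMX_CPU_ACCELERATION is the GROMACS 4.6 CMake switch for the SIMD level):

CC=gcc-4.7 CXX=g++-4.7 cmake .. -DGMX_BUILD_OWN_FFTW=ON
# or, accepting slightly lower performance on this machine:
cmake .. -DGMX_BUILD_OWN_FFTW=ON -DGMX_CPU_ACCELERATION=SSE2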

Mark
On Oct 17, 2013 8:30 AM, "张海平" <21620101152...@stu.xmu.edu.cn> wrote:

> Dear professor:
>   When I installed the GROMACS software, a problem occurred as
> follows (my computer is 64-bit Linux; gcc is GNU Fortran (GCC) 4.6.2):
>
>
> "[ZHP@console build]$  cmake .. -DGMX_BUILD_OWN_FFTW=ON
> -- No compatible CUDA toolkit found (v3.2+), disabling native GPU
> acceleration
> CMake Warning at CMakeLists.txt:744 (message):
>   No C SSE4.1 flag found.  Consider a newer compiler, or use SSE2 for
>   slightly lower performance
>
>
> CMake Error at CMakeLists.txt:767 (message):
>   Cannot find smmintrin.h, which is required for SSE4.1 intrinsics support.
>
>
> -- Configuring incomplete, errors occurred!
> "
> I don't know how to solve it. I hope to hear from you soon.
>
> Best regards
> Haiping Zhang