Re: [gmx-users] gromacs

2014-04-01 Thread Pavan Kumar
Set the GROMACS installation path and library path in your .bashrc file and
source it.
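For example, assuming GROMACS was installed under /usr/local/gromacs (adjust
the prefix to your actual installation), adding this line to your ~/.bashrc and
sourcing it is enough:

source /usr/local/gromacs/bin/GMXRC

GMXRC sets PATH and LD_LIBRARY_PATH for you; alternatively, you can export them
by hand (the library directory may be lib64 on some systems):

export PATH=/usr/local/gromacs/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/gromacs/lib:$LD_LIBRARY_PATH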


On Tue, Apr 1, 2014 at 1:36 PM, Meenakshi Rajput ashi.rajpu...@gmail.com wrote:

 I used the pdb2gmx command but I get a command not found message. Can you
 help me?




-- 
Thanks & Regards,
Pavan Kumar
Project Engineer
CDAC-KP
Ph +91-7676367646


Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Pavan Kumar
Hello Ankita,
You just have to include the following line in your mdp file:
cutoff-scheme = Verlet
Then run grompp with the modified mdp file to generate the tpr file, and then
run mdrun.
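For example, something like this (the input file names and the MPI process
count below are only placeholders for your own setup):

grompp -f md.mdp -c npt.gro -p topol.top -o md.tpr
mpirun -np 8 mdrun_mpi -v -deffnm md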
Hope this doesn't give you the same error


On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
 ankitanaith...@gmail.com wrote:

 Hi,

 I am trying to run a simulation of my protein (monomer, ~500 residues). I
 had a few questions and errors regarding it.
 I have previously run the simulation of the apo form of the same protein
 using GROMACS 4.5.5, which was available at the cluster facility I was using
 and is also installed on my system. However, when I tried to run the
 holo form, I got this error:
 Fatal error:
 11 particles communicated to PME node 106 are more than 2/3 times the
 cut-off out of the domain decomposition cell of their charge group in
 dimension y.
 This usually means that your system is not well equilibrated.
 For more information and tips for troubleshooting, please check the GROMACS
 website at http://www.gromacs.org/Documentation/Errors

 I figured out that this could be solved by using a smaller timestep; my
 previous timestep was 4 fs and I have now reduced it to 3 fs, which should
 work fine.
 However, after producing the tpr file for the production run with my GROMACS
 4.5.5, I realised that the grant for the cluster facility is over, and the
 new clusters on which I am trying to set up the same protein support only
 GROMACS 4.6. When I try to run on these clusters I get the
 following error:


 ---
 Program mdrun_mpi, VERSION 4.6.3
 Source code file: /home/gromacs-4.6.3/src/kernel/runner
 .c, line: 824

 Fatal error:
 OpenMP threads have been requested with cut-off scheme Group, but these are
 only
  supported with cut-off scheme Verlet
 For more information and tips for troubleshooting, please check the GROMACS
 website at http://www.gromacs.org/Documentation/Errors

 -

 1. I wanted help with my mdp options to make them compatible.
 2. Since my previous calculations were based on GROMACS 4.5.5, would
 switching to GROMACS 4.6 break the continuity of the run, or would it bring
 about differences in the way the trajectories would be analysed?


 Below is my mdp file:
 title= production MD
 ; Run parameters
 integrator= md; leap-frog algorithm
 nsteps= ; 0.003 *  = 10 ps or 100 n
 dt= 0.003; 3 fs
 ; Output control
 nstxout= 0; save coordinates every 2 ps
 nstvout= 0; save velocities every 2 ps
 nstxtcout= 1000; xtc compressed trajectory output every 5 ps
 nstenergy= 1000; save energies every 5 ps
 nstlog= 1000; update log file every 5 ps
 energygrps  = Protein ATP
 ; Bond parameters
 constraint_algorithm = lincs; holonomic constraints
 constraints= all-bonds; all bonds (even heavy atom-H bonds)
 constrained
 lincs_iter= 1; accuracy of LINCS
 lincs_order= 4; also related to accuracy
 ; Neighborsearching
 ns_type= grid; search neighboring grid cells
 nstlist= 5; 25 fs
 rlist= 1.0; short-range neighborlist cutoff (in nm)
 rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
 rvdw= 1.0; short-range van der Waals cutoff (in nm)
 rlistlong= 1.0; long-range neighborlist cutoff (in nm)
 ; Electrostatics
 coulombtype= PME; Particle Mesh Ewald for long-range
 electrostatics
 pme_order= 4; cubic interpolation
 fourierspacing= 0.16; grid spacing for FFT
 nstcomm = 10; remove com every 10 steps
 ; Temperature coupling is on
 tcoupl= V-rescale; modified Berendsen thermostat
 tc-grps= Protein Non-Protein; two coupling groups - more
 accurate
 tau_t= 0.1 0.1; time constant, in ps
 ref_t= 318 318; reference temperature, one for each group,
 in K
 ; Pressure coupling is on
 pcoupl  = berendsen; Berendsen barostat
 pcoupltype= isotropic; uniform scaling of box vectors
 tau_p= 1.0; time constant, in ps
 ref_p= 1.0; reference pressure, in bar
 compressibility = 4.5e-5; isothermal compressibility of water, bar^-1
 ; Periodic boundary conditions
 pbc= xyz; 3-D PBC
 ; Dispersion correction
 DispCorr= EnerPres; account for cut-off vdW scheme
 ; Velocity generation
 gen_vel= yes; Velocity generation is on
 gen_temp= 318; reference temperature, for protein in K




 Kind regards,
 Ankita Naithani

Re: [gmx-users] MPI error in gromacs 4.6

2014-03-24 Thread Pavan Kumar
It might be a typographical error.
Check the mdp file thoroughly. I think a semicolon is required for the last
line in your mdp file.


On Mon, Mar 24, 2014 at 5:18 PM, Ankita Naithani
 ankitanaith...@gmail.com wrote:

 Hi Pavan,
 Thank you for your response. I am trying to generate the tpr file with the
 following parameters:
 ; Neighborsearching
  ns_type= grid; search neighboring grid cells
  nstlist= 5; 25 fs
  rlist= 1.0; short-range neighborlist cutoff (in nm)
  rcoulomb= 1.0; short-range electrostatic cutoff (in nm)
  rvdw= 1.0; short-range van der Waals cutoff (in nm)
  rlistlong= 1.0; long-range neighborlist cutoff (in nm)
 cutoff-scheme = Verlet

 But I get a warning: Unknown left-hand 'cutoff-scheme' in parameter
 file.


 On Mon, Mar 24, 2014 at 11:26 AM, Pavan Kumar kumar.pavan...@gmail.com
 wrote:

  [snip]

Re: [gmx-users] Adding TPO and SEP to G53a6 Forcefield

2014-03-04 Thread Pavan Kumar




-- 
Best
Pavan Kumar


[gmx-users] Out of Disk Space

2014-03-04 Thread Pavan Kumar
Hi everybody,
I am running mdrun_mpi v4.6.5 for nsteps=1000; 20 ns.
My simulation stops at 500 ps and I get the following error,
even though I have enough space on my cluster, around 17 GB.

Cannot fsync 'md1.trr'; maybe you are out of disk space?
Cannot fsync 'md1.log'; maybe you are out of disk space?


I have run it twice, and each time I get a different error.

What is the reason for this error? I am sharing my mdp file.
Any help is appreciated.
-- 
Thanks & Regards,
Pavan Kumar
Project Engineer
CDAC-KP
Ph +91-7676367646


Re: [gmx-users] Guidance on how to install Gromacs-5.0-beta2

2014-02-21 Thread Pavan Kumar
Hello Yogendra,

Remove the -DGMX_BUILD_OWN_FFTW=ON argument and add
-DCMAKE_PREFIX_PATH=/path/to/fftw instead, for example:
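(Both paths below are placeholders; substitute your actual FFTW prefix and the
directory where you want GROMACS installed.)

cmake .. -DCMAKE_PREFIX_PATH=/path/to/fftw -DCMAKE_INSTALL_PREFIX=/path/to/gromacs
make
make install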

I hope it works


On Fri, Feb 21, 2014 at 4:31 PM, Yogendra Ramtirtha 
ramtirtha.yogen...@gmail.com wrote:

 Hello,
 Actually, I am trying to install Gromacs-5.0-beta2 on my system, which has
 Ubuntu 12.04.
 Everything went fine until the make command, which gives me the following
 error:

 ( [  0%] Performing download step (verify and extract) for 'fftwBuild'
 -- verifying file...
  file='http:/www.fftw.org/fftw-3.3.3.tar.gz'
 -- verifying file... warning: did not verify file - no URL_HASH specified?
 -- extracting...


  
 src='/home/yogendra/Desktop/gromacs-5.0-beta2/build/src/contrib/fftw/fftwBuild-prefix/src/http:/
 www.fftw.org/fftw-3.3.3.tar.gz'


  
 dst='/home/yogendra/Desktop/gromacs-5.0-beta2/build/src/contrib/fftw/fftwBuild-prefix/src/fftwBuild'
 CMake Error at fftwBuild-stamp/extract-fftwBuild.cmake:11 (message):
   error: file to extract does not exist:


 '/home/yogendra/Desktop/gromacs-5.0-beta2/build/src/contrib/fftw/fftwBuild-prefix/src/http:/
 www.fftw.org/fftw-3.3.3.tar.gz'


 make[2]: ***
 [src/contrib/fftw/fftwBuild-prefix/src/fftwBuild-stamp/fftwBuild-download]
 Error 1
 make[1]: *** [src/contrib/fftw/CMakeFiles/fftwBuild.dir/all] Error 2
 make: *** [all] Error 2 )

 To deal with this error, I manually installed fftw-3.3.3 on my desktop, but
 the major hurdle now is that I don't know how to proceed with the Gromacs
 installation.
 Can anyone please guide me? I need help, as Gromacs is part of my academic
 work. (I am a graduate student of biotechnology.)




-- 
Thanks & Regards,
Pavan Kumar
Project Engineer
CDAC-KP
Ph +91-7676367646


[gmx-users] Fwd: Segmentation fault with mdrun

2014-02-20 Thread Pavan Kumar
-- Forwarded message --
From: Pavan Kumar kumar.pavan...@gmail.com
Date: Thu, Feb 20, 2014 at 3:34 PM
Subject: Segmentation fault with mdrun
To: gromacs.org_gmx-users-requ...@maillist.sys.kth.se


Hello,

I am using gromacs 4.6.5.
I am running my job on a cluster with 8 nodes and 4 processors each.
I am getting a segmentation fault while running mdrun_mpi.

My command:
mpirun -n 8 mdrunmpi.4.6.5 -v -deffnm em

More details:
Machine: GNU/Linux x86_64

gcc (GCC) 4.3.3
cmake version 2.8.3

Compilation details:

I have compiled the code using the MPICH compiler wrapper:

cmake .. -DGMX_GPU=OFF -DGMX_MPI=ON
-DCMAKE_C_COMPILER=/opt/mpich2/gnu/bin/mpicc -DGMX_BUILD_OWN_FFTW=ON
-DGMX_CPU_ACCELERATION=SSE2
-DCMAKE_INSTALL_PREFIX=/home/garuda/garuda3/gromacs465

If there is a problem with how I compiled the code itself, what is the error
and how can it be corrected?
Please suggest why I am getting the above error.


-- 
Thanks & Regards,
Pavan Kumar
Project Engineer
CDAC-KP
Ph +91-7676367646



-- 
Thanks & Regards,
Pavan Kumar
Project Engineer
CDAC-KP
Ph +91-7676367646


[gmx-users] Segmentation Fault with mdrun_mpi

2014-02-20 Thread Pavan Kumar
Hello,

I am using gromacs 4.6.5.
I am running my job on a cluster with 8 nodes and 4 processors each.
I am getting a segmentation fault while running mdrun_mpi.

My command:
mpirun -n 8 mdrunmpi.4.6.5 -v -deffnm em

More details:
Machine: GNU/Linux x86_64

gcc (GCC) 4.3.3
cmake version 2.8.3

Compilation details:

I have compiled the code using the MPICH compiler wrapper:

cmake .. -DGMX_GPU=OFF -DGMX_MPI=ON
-DCMAKE_C_COMPILER=/opt/mpich2/gnu/bin/mpicc -DGMX_BUILD_OWN_FFTW=ON
-DGMX_CPU_ACCELERATION=SSE2
-DCMAKE_INSTALL_PREFIX=/home/garuda/garuda3/gromacs465


The mdrun log file is attached below.
If there is a problem with how I compiled the code itself, what is the error
and how can it be corrected?
Please suggest why I am getting the above error.
-- 
Thanks & Regards,
Pavan Kumar
Project Engineer
CDAC-KP
Ph +91-7676367646
Log file opened on Thu Feb 20 17:52:25 2014
Host: compute-1-21.local  pid: 7187  nodeid: 0  nnodes:  8
Gromacs version:VERSION 4.6.5
Precision:  single
Memory model:   64 bit
MPI library:MPI
OpenMP support: enabled
GPU support:disabled
invsqrt routine:gmx_software_invsqrt(x)
CPU acceleration:   SSE2
FFT library:fftw-3.3.2-sse2
Large file support: enabled
RDTSCP usage:   disabled
Built on:   Tue Feb  4 14:40:23 IST 2014
Built by:   garuda3@compute-1-0.local [CMAKE]
Build OS/arch:  Linux 2.6.18-53.1.4.el5 x86_64
Build CPU vendor:   GenuineIntel
Build CPU brand:Intel(R) Xeon(R) CPU   X5460  @ 3.16GHz
Build CPU family:   6   Model: 23   Stepping: 6
Build CPU features: apic clfsh cmov cx8 cx16 lahf_lm mmx msr pdcm pse sse2 sse3 
sse4.1 ssse3
C compiler: /opt/mpich2/gnu/bin/mpicc GNU gcc (GCC) 4.1.2 20070626 (Red 
Hat 4.1.2-14)
C compiler flags:   -msse2-Wextra -Wno-missing-field-initializers 
-Wno-sign-compare -Wall -Wno-unused -Wunused-value   -fomit-frame-pointer 
-funroll-all-loops  -O3 -DNDEBUG


 :-)  G  R  O  M  A  C  S  (-:

  GROtesk MACabre and Sinister

:-)  VERSION 4.6.5  (-:

Contributions from Mark Abraham, Emile Apol, Rossen Apostolov, 
   Herman J.C. Berendsen, Aldert van Buuren, Pär Bjelkmar,  
 Rudi van Drunen, Anton Feenstra, Gerrit Groenhof, Christoph Junghans, 
Peter Kasson, Carsten Kutzner, Per Larsson, Pieter Meulenhoff, 
   Teemu Murtola, Szilard Pall, Sander Pronk, Roland Schulz, 
Michael Shirts, Alfons Sijbers, Peter Tieleman,

   Berk Hess, David van der Spoel, and Erik Lindahl.

   Copyright (c) 1991-2000, University of Groningen, The Netherlands.
 Copyright (c) 2001-2012,2013, The GROMACS development team at
Uppsala University & The Royal Institute of Technology, Sweden.
check out http://www.gromacs.org for more information.

 This program is free software; you can redistribute it and/or
   modify it under the terms of the GNU Lesser General Public License
as published by the Free Software Foundation; either version 2.1
 of the License, or (at your option) any later version.

   :-)  ./bin/mdrun_mpi  (-:


 PLEASE READ AND CITE THE FOLLOWING REFERENCE 
B. Hess and C. Kutzner and D. van der Spoel and E. Lindahl
GROMACS 4: Algorithms for highly efficient, load-balanced, and scalable
molecular simulation
J. Chem. Theory Comput. 4 (2008) pp. 435-447
  --- Thank You ---  


 PLEASE READ AND CITE THE FOLLOWING REFERENCE 
D. van der Spoel, E. Lindahl, B. Hess, G. Groenhof, A. E. Mark and H. J. C.
Berendsen
GROMACS: Fast, Flexible and Free
J. Comp. Chem. 26 (2005) pp. 1701-1719
  --- Thank You ---  


 PLEASE READ AND CITE THE FOLLOWING REFERENCE 
E. Lindahl and B. Hess and D. van der Spoel
GROMACS 3.0: A package for molecular simulation and trajectory analysis
J. Mol. Mod. 7 (2001) pp. 306-317
  --- Thank You ---  


 PLEASE READ AND CITE THE FOLLOWING REFERENCE 
H. J. C. Berendsen, D. van der Spoel and R. van Drunen
GROMACS: A message-passing parallel molecular dynamics implementation
Comp. Phys. Comm. 91 (1995) pp. 43-56
  --- Thank You ---  

Input Parameters:
   integrator   = steep
   nsteps   = 500
   init-step= 0
   cutoff-scheme= Verlet
   ns_type  = Grid
   nstlist  = 1
   ndelta   = 2
   nstcomm  = 100
   comm-mode= Linear
   nstlog   = 1000
   nstxout  = 0
   nstvout  = 0
   nstfout  = 0
   nstcalcenergy= 100
   nstenergy= 1000
   nstxtcout= 0
   init-t   = 0
   delta-t  = 0.001
   xtcprec  = 1000
   fourierspacing   = 0.12
   nkx