Re: [gmx-users] interrupt of gmx mdrun

2019-06-30 Thread Dallas Warren
Check the output file from the job scheduler, e.g. slurm-<jobnumber>.out for SLURM.

That will contain the error output, which should indicate what
has gone wrong.
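
For example, something along these lines (a minimal sketch; the job
number 123456 and the log name md.log are placeholders for your own files):

   tail -n 100 slurm-123456.out      # stderr/stdout captured by the scheduler
   tail -n 50 md.log                 # end of the GROMACS log
   grep -i -B 2 -A 5 "fatal" md.log  # any fatal error GROMACS reported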

Catch ya,

Dr. Dallas Warren
Drug Delivery, Disposition and Dynamics
Monash Institute of Pharmaceutical Sciences, Monash University
381 Royal Parade, Parkville VIC 3052
dallas.war...@monash.edu
-
When the only tool you own is a hammer, every problem begins to resemble a nail.

On Sat, 29 Jun 2019 at 20:15, Andrew Bostick  wrote:
>
> Hi gromacs users,
>
> I am doing an MD simulation of a protein (from pdb ). After the equilibration
> phases, I used the following commands:
>
> gmx_mpi grompp -f md.mdp -c npt.gro -t npt.cpt -p topol.top -o md.tpr -n
> index.ndx
>
> gmx_mpi mdrun -v -nb gpu -deffnm md >& md.job &
>
> But mdrun was interrupted at step 8237500. I repeated the last command; this
> happened again, but at step 6837500.
>
> The last lines of the md.log file are as follows:
>
>
>            Step           Time         Lambda
>         6837500        13675.0            0.0
>
>    Energies (kJ/mol)
>           Angle    Proper Dih.  Improper Dih.          LJ-14     Coulomb-14
>     5.30550e+03    6.71343e+03    2.61548e+02    2.78159e+03    3.61962e+04
>         LJ (SR)  Disper. corr.   Coulomb (SR)   Coul. recip.      Potential
>     4.03766e+04   -3.33403e+03   -4.03105e+05    2.26409e+03   -3.12540e+05
>     Kinetic En.   Total Energy    Temperature  Pres. DC (bar) Pressure (bar)
>     5.98029e+04   -2.52737e+05    2.95155e+02   -2.28999e+02    5.62166e+00
>    Constr. rmsd
>     2.50195e-05
>
> What is the reason for this interruption, and how can I resolve it?
>
> Best,
> Andrew


Re: [gmx-users] Constraints Error; In Using Tetrahedral Zinc Dummy Model

2019-06-30 Thread Mark Abraham
Hi,

The usual way to do this is with harmonic bonds (or equivalent distance
restraints) from metal to ligand. Those are different from constraints.
GROMACS is not built for general rigid-body motion, which is what a
fully-constrained tetrahedron would require.
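
For illustration only, such a harmonic metal-ligand bond would sit in the
[ bonds ] section of the topology, along these lines (a minimal sketch; the
atom indices, reference length b0, and force constant kb below are
placeholders that you would need to parameterize for your zinc site):

   [ bonds ]
   ;   ai    aj funct   b0 (nm)   kb (kJ mol^-1 nm^-2)
        1    25     1   0.210     50000.0   ; hypothetical Zn-ligand bond

With funct 1 this is an ordinary harmonic bond, so it is integrated like
any other bonded interaction rather than treated as a rigid constraint.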

Mark


On Sun, 30 Jun 2019 at 20:32, Mahdi Bagherpoor 
wrote:

> Dear Gromacs users,
>
> I am trying to simulate a zinc-finger protein in explicit water with the
> CHARMM36 FF. Unfortunately, I did not find any tetrahedral zinc force field
> in Gromacs, and therefore I converted the tetrahedral dummy zinc model used
> in the CHARMM27 FF to the Gromacs format. The minimization of the system
> goes perfectly, but unfortunately when I start the NVT simulation, I get
> this error:
> 
>
> WARNING 1 [file topol.top, line 57]:
>   The bond in molecule-type Dummy_chain_X between atoms 2 DZ1 and 3 DZ2 has
>   an estimated oscillational period of 6.1e-03 ps, which is less than 5
>   times the time step of 2.0e-03 ps.
>   Maybe you forgot to change the constraints mdp option.
> ...
> Too many warnings (1).
> If you are sure all warnings are harmless, use the -maxwarn option.
>
> ---
> Here, DZ1 and DZ2 are dummy atoms related to the tetrahedral zinc model. In
> the mdp file, I have used constraints = h-bonds with LINCS. When I use
> all-bonds in the mdp file, I get this other error instead:
> ---
>
> WARNING 1 [file topol.top, line 57]:
>   There are atoms at both ends of an angle, connected by constraints and
>   with masses that differ by more than a factor of 13. This means that
>   there are likely dynamic modes that are only very weakly coupled. To
>   ensure good equipartitioning, you need to either not use constraints on
>   all bonds (but, if possible, only on bonds involving hydrogens) or use
>   integrator = sd or decrease one or more tolerances:
>   verlet-buffer-tolerance <= 0.0001, LINCS iterations >= 2, LINCS order >=
>   4 or SHAKE tolerance <= 1e-05
> 
> Number of degrees of freedom in T-Coupling group Protein_DNA_ZNT is 6936.89
> Number of degrees of freedom in T-Coupling group Water_and_ions is
> 179547.11
> 
>
> 
> Does this error mean that I need to constrain the zinc bonds besides the
> h-bonds? If so, how should I do it? Or is something wrong in my topology? I
> would appreciate any ideas on how to fix this problem.
>
> Cheers,
> Mahdi


Re: [gmx-users] Fatal error at grompp

2019-06-30 Thread Mark Abraham
Hi,

Grompp is warning you that a periodic unit cell with a net charge of -13
doesn't model anything physical. Your earlier case was different, e.g. you
added ions to neutralise the charge.
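
If it helps, the usual neutralisation step looks something like this (a
sketch based on the command sequence in your mail; the ion names NA/CL are
assumptions that depend on your force field's naming conventions):

   gmx grompp -f ions.mdp -c model1_solv.gro -p topol.top -o ions.tpr
   # replace 13 solvent molecules with counter ions to cancel the -13 charge
   gmx genion -s ions.tpr -o model1_solv_ions.gro -p topol.top \
       -pname NA -nname CL -neutral

genion will ask which group to take the replaced molecules from; choose the
SOL (solvent) group.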

Mark

On Tue., 25 Jun. 2019, 09:05 kalpana,  wrote:

> Dear all,
> I have worked with the same commands and settings in a previous version of
> ubuntu and gromacs. Now, with the new system and upgrade, I am facing a
> problem. First kindly see the gmx information, and then the fatal error I
> am getting at grompp. Kindly find the attached ions.mdp as well, see the
> other .mdp files too, and guide me.
> Thanks & Regards
> Kalpana
>
>
> 1.
>   gmx --version
>
> GROMACS version:2019.3
> Precision:  single
> Memory model:   64 bit
> MPI library:thread_mpi
> OpenMP support: enabled (GMX_OPENMP_MAX_THREADS = 64)
> GPU support:CUDA
> SIMD instructions:  AVX_512
> FFT library:fftw-3.3.8-sse2-avx
> RDTSCP usage:   enabled
> TNG support:enabled
> Hwloc support:  hwloc-1.11.6
> Tracing support:disabled
> C compiler: /usr/bin/gcc GNU 8.3.0
> C compiler flags:   -mavx512f -mfma -g -fno-inline
> C++ compiler:   /usr/bin/c++ GNU 8.3.0
> C++ compiler flags: -mavx512f -mfma -std=c++11 -g -fno-inline
> CUDA compiler:  /usr/local/cuda/bin/nvcc nvcc: NVIDIA (R) Cuda compiler
> driver;Copyright (c) 2005-2019 NVIDIA Corporation;Built on
> Wed_Apr_24_19:10:27_PDT_2019;Cuda compilation tools, release 10.1,
> V10.1.168
> CUDA compiler
>
> flags:-gencode;arch=compute_30,code=sm_30;-gencode;arch=compute_35,code=sm_35;-gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_52,code=sm_52;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=compute_75;-use_fast_math;-D_FORCE_INLINES;;
> ;-mavx512f;-mfma;-std=c++11;-g;-fno-inline;
> CUDA driver:    10.10
> CUDA runtime:   N/A
>
>
> 2.
> gmx pdb2gmx -f 1model1A1.pdb -o model1_processed.gro -water tip3p
> no warning and notes in pdb2gmx run
>
> 3.
> gmx editconf -f model1_processed.gro -o model1_newbox.gro -c -d 1.0 -bt
> dodecahedron
> no warning and notes in editconf
>
> 4.
> gmx solvate -cp model1_newbox.gro -cs spc216.gro -o model1_solv.gro -p
> topol.top
>
> WARNING: Masses and atomic (Van der Waals) radii will be guessed
>  based on residue and atom names, since they could not be
>  definitively assigned from the information in your input
>  files. These guessed numbers might deviate from the mass
>  and radius of the atom type. Please check the output
>  files if necessary.
>
> NOTE: From version 5.0 gmx solvate uses the Van der Waals radii
> from the source below. This means the results may be different
> compared to previous GROMACS versions.
>
>  PLEASE READ AND CITE THE FOLLOWING REFERENCE 
> A. Bondi
> van der Waals Volumes and Radii
> J. Phys. Chem. 68 (1964) pp. 441-451
>   --- Thank You ---  
>
> 5.
> gmx grompp -f ions.mdp -c model1_solv.gro -p topol.top -o ions.tpr
>
> NOTE 1 [file topol.top, line 60959]:
>   System has non-zero total charge: -13.00
>   Total charge should normally be an integer. See
>   http://www.gromacs.org/Documentation/Floating_Point_Arithmetic
>   for discussion on how close it should be to an integer.
>
> WARNING 1 [file topol.top, line 60959]:
>   You are using Ewald electrostatics in a system with net charge. This can
>   lead to severe artifacts, such as ions moving into regions with low
>   dielectric, due to the uniform background charge. We suggest to
>   neutralize your system with counter ions, possibly in combination with a
>   physiological salt concentration.
>
>  PLEASE READ AND CITE THE FOLLOWING REFERENCE 
> J. S. Hub, B. L. de Groot, H. Grubmueller, G. Groenhof
> Quantifying Artifacts in Ewald Simulations of Inhomogeneous Systems with a
> Net
> Charge
> J. Chem. Theory Comput. 10 (2014) pp. 381-393
>   --- Thank You ---  
>
> Removing all charge groups because cutoff-scheme=Verlet
> Analysing residue names:
> There are:   424 Protein residues
> There are: 16060 Water residues
> Analysing Protein...
> Number of degrees of freedom in T-Coupling group rest is 115683.00
> Calculating fourier grid dimensions for X Y Z
> Using a fourier grid of 80x80x80, spacing 0.116 0.116 0.116
> Estimate for the relative computational load of the PME mesh part: 0.34
> This run will generate roughly 4 Mb of data
>
> There was 1 note
>
> There was 1 warning
>
> ---
> Program: gmx grompp, version 2019.3
> Source file: src/gromacs/gmxpreprocess/grompp.cpp (line 2315)
>
> Fatal error:
> Too many warnings (1).
> If you are sure all warnings are harmless, use the -maxwarn option.
>
> For more information and tips for troubleshooting, please check the GROMACS
> website 

Re: [gmx-users] How to define a user-defined potential

2019-06-30 Thread Mark Abraham
Hi,

In general, GROMACS supports only two-body interactions (apart from various
well-known multi-body bond-style interactions), so unfortunately there is
no way to do what you want.

Mark

On Thu., 27 Jun. 2019, 07:50 Divya Rai,  wrote:

>  I wanted to know how it is possible to model a user-defined potential and
> run an MD simulation through Gromacs. The tabulated potential has a specific
> form as defined in the manuals. What if my potential lacks it? Taking the
> example of the Muller-Brown potential, how can I use it for my MD
> simulations through Gromacs?
> Kindly help.


Re: [gmx-users] how to accelerate mdrun process

2019-06-30 Thread Mark Abraham
Hi,

As in the other thread, please do a Release build of GROMACS, not a Debug build.
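
Once you have a Release build, something like the following is a reasonable
starting point to experiment with (a sketch only; the best thread split
depends on your CPU and GPU, so compare the timing tables at the end of
md_0_1.log between runs):

   # offload nonbonded and PME work to the GPU (supported in GROMACS 2019);
   # adjust -ntomp to the number of cores you actually have
   gmx mdrun -deffnm md_0_1 -nb gpu -pme gpu -ntmpi 1 -ntomp 12 -pin on -v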

Mark

On Fri., 28 Jun. 2019, 15:46 kalpana,  wrote:

> Hi everyone,
> Kindly help me accelerate this mdrun process. As you can see, it is showing
> the following timing for the command, along with the provided gmx --version.
>
>   gmx mdrun -deffnm md_0_1 -ntmpi 5 -ntomp 6 -pin on -v
>
> starting mdrun 'protein in water'
> 500 steps,  1.0 ps.
> step 17000, will finish Wed Jul  3 22:01:39 2019   imb F  2%   imb F  5%
>
>  gmx --version
>
> GROMACS version:2019.3
> Precision:  single
> Memory model:   64 bit
> MPI library:thread_mpi
> OpenMP support: enabled (GMX_OPENMP_MAX_THREADS = 64)
> GPU support:CUDA
> SIMD instructions:  AVX_512
> FFT library:fftw-3.3.8-sse2-avx
> RDTSCP usage:   enabled
> TNG support:enabled
> Hwloc support:  hwloc-1.11.6
> Tracing support:disabled
> C compiler: /usr/bin/gcc GNU 8.3.0
> C compiler flags:   -mavx512f -mfma -g -fno-inline
> C++ compiler:   /usr/bin/c++ GNU 8.3.0
> C++ compiler flags: -mavx512f -mfma -std=c++11 -g -fno-inline
> CUDA compiler:  /usr/local/cuda/bin/nvcc nvcc: NVIDIA (R) Cuda compiler
> driver;Copyright (c) 2005-2019 NVIDIA Corporation;Built on
> Wed_Apr_24_19:10:27_PDT_2019;Cuda compilation tools, release 10.1,
> V10.1.168
> CUDA compiler
>
> flags:-gencode;arch=compute_30,code=sm_30;-gencode;arch=compute_35,code=sm_35;-gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_52,code=sm_52;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=compute_75;-use_fast_math;-D_FORCE_INLINES;;
> ;-mavx512f;-mfma;-std=c++11;-g;-fno-inline;
> CUDA driver:    10.10
> CUDA runtime:   N/A
>
>
> Thanks and regards
> Kalpana


Re: [gmx-users] errors during installation

2019-06-30 Thread Mark Abraham
Hi,

You've chosen a Debug build, so it's super slow, and the tests are probably
so slow that they are timing out. Do a Release build, as the install guide
suggests.
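
Concretely, only the build type in your configure line below needs to
change, e.g. (a sketch; sudo should not normally be needed for the build
itself):

   cmake .. -DGMX_BUILD_OWN_FFTW=OFF -DREGRESSIONTEST_DOWNLOAD=OFF \
       -DCMAKE_C_COMPILER=gcc -DGMX_SIMD=AVX_512 -DGMX_GPU=ON -DGMX_MPI=ON \
       -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda -DGMX_USE_RDTSCP=ON \
       -DGMX_FFT_LIBRARY=fftw3 -DCMAKE_BUILD_TYPE=Release \
       -DREGRESSIONTEST_PATH=/Downloads/gromacs/regressiontests-2019.3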

Mark


On Fri., 28 Jun. 2019, 15:34 kalpana,  wrote:

> Hi everyone,
> I had to reinstall gromacs, and now I am getting the following errors during
> make check with the provided settings. Kindly suggest better settings with
> which I could run mdrun faster, because my previous installation used the
> same settings but was very slow (mdrun -deffnm md_0_1 -ntmpi 2 -ntomp 6 -pin
> on -v).
>
> sudo cmake .. -DGMX_BUILD_OWN_FFTW=OFF -DREGRESSIONTEST_DOWNLOAD=OFF
> -DCMAKE_C_COMPILER=gcc -DGMX_SIMD=AVX_512 -DGMX_GPU=ON -DGMX_MPI=ON
> -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda -DGMX_USE_RDTSCP=ON
> -DGMX_FFT_LIBRARY=fftw3 -DCMAKE_BUILD_TYPE=Debug
> -DREGRESSIONTEST_PATH=/Downloads/gromacs/regressiontests-2019.3
>
>
> [image: image.png]
>
>
> Thanks & regards,
> Kalpana
> --
> Gromacs Users mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
> posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or
> send a mail to gmx-users-requ...@gromacs.org.
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.


Re: [gmx-users] How to install gromacs on cpu cluster

2019-06-30 Thread Benson Muite

Hi Yeping,

The basic steps are the same as in the documentation. The steps below 
should get GROMACS running on a Linux computer where CMake can be 
installed. You may need to change $HOME to an absolute path and give the 
exact location of the C and C++ compilers you will use to compile 
GROMACS. You need not use the same C and C++ compilers to compile CMake, 
though recent versions of CMake also need a recent compiler. You may need 
to update your PATH variable so that your installation of GCC is picked 
up as the first compiler that is tried. Typical Linux systems (but not 
all) should have most prerequisites for CMake:


wget https://github.com/Kitware/CMake/releases/download/v3.15.0-rc3/cmake-3.15.0-rc3.tar.gz
tar -xvf cmake-3.15.0-rc3.tar.gz
mkdir cmake-3.15.0-rc3-build
cd cmake-3.15.0-rc3-build/
../cmake-3.15.0-rc3/bootstrap --prefix=$HOME/InstallGromacs/cmake-3.15.0-rc3-install
gmake
make install
cd ..
wget ftp://ftp.gromacs.org/pub/gromacs/gromacs-2019.3.tar.gz
tar -xvf gromacs-2019.3.tar.gz
cd gromacs-2019.3/
mkdir build
cd build/
$HOME/InstallGromacs/cmake-3.15.0-rc3-install/bin/cmake .. \
    -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON \
    -DCMAKE_INSTALL_PREFIX=$HOME/InstallGromacs/gromacs-2019.3-install/ \
    -DCMAKE_CXX_COMPILER=/usr/bin/c++ -DCMAKE_C_COMPILER=/usr/bin/gcc
make
make check
make install
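
For the MPI part of your question below, a second pass over the build along
the lines of the manual's "Quick and dirty cluster installation" section
might look like this (a sketch only; it assumes MPI compiler wrappers
mpicc/mpicxx are on your PATH, and GMX_BUILD_MDRUN_ONLY is the 2019-era
option for an mdrun-only binary):

cd ..
mkdir build-mpi
cd build-mpi/
# MPI-enabled, mdrun-only build installed to the same prefix as above
$HOME/InstallGromacs/cmake-3.15.0-rc3-install/bin/cmake .. \
    -DGMX_BUILD_OWN_FFTW=ON \
    -DGMX_MPI=ON -DGMX_BUILD_MDRUN_ONLY=ON \
    -DCMAKE_C_COMPILER=mpicc -DCMAKE_CXX_COMPILER=mpicxx \
    -DCMAKE_INSTALL_PREFIX=$HOME/InstallGromacs/gromacs-2019.3-install/
make
make install

A PBS job could then launch the MPI binary with something like
mpirun -np 64 $HOME/InstallGromacs/gromacs-2019.3-install/bin/mdrun_mpi
(the exact binary name depends on the GMX_BINARY_SUFFIX setting; _mpi is
the usual convention for MPI builds).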

On 6/30/19 7:41 AM, Benson Muite wrote:

Hi Yeping,

You may want to use GCC 7 or GCC 8:

https://gcc.gnu.org/

Follow the instructions for getting GCC working; see

https://gcc.gnu.org/install/

Once you have a new version of GCC in your own directory, you will 
probably want to compile an MPI library (e.g. OpenMPI - 
https://www.open-mpi.org/ or MPICH - https://www.mpich.org) in your 
home directory as well; the MPI library should be compiled with the 
GCC version that you use. I have not used PBS in a while, but the MPI 
library should pick this up. For programs on a single node, PBS should 
be able to just run the compiled executable.


What version of cmake do you have on your system?

Will try to write a short script later today. You may also want to 
look at Spack, which offers an automated GROMACS installation (but it 
can be helpful to set it up yourself for good performance):

https://github.com/spack/spack

Regards,

Benson

On 6/30/19 3:36 AM, sunyeping wrote:

Hi Benson,

I can install gcc-4.9 for compiling the latest version of gromacs 
(gromacs_2019.3) in my own account directory 
(/data/home/sunyp/software/GCC). For proper submission of tasks with the 
PBS system, which options of cmake should I use?
According to the "Quick and dirty cluster installation" section of 
the gromacs installation guide, it seems that a quick and dirty 
installation should be done first, and then another installation with MPI 
should be done to the same location as the non-MPI installation. I 
am not very clear on how these should be done exactly. Could you give 
the exact commands?

Best regards
Yeping


    Subject: Re: [gmx-users] How to install gromacs on cpu cluster

    Hi Yeping,

    Minimum required compiler version for the latest release is GCC
    4.8.1 :

http://manual.gromacs.org/documentation/current/install-guide/index.html

    GROMACS 4.5 seems to indicate support for GCC 4.5
(http://www.gromacs.org/Documentation/Installation_Instructions_4.5)

    Is CMake on your cluster? If so, what version?

    Regards,

    Benson

    On 6/29/19 12:08 PM, sunyeping wrote:
    Hello Benson,

    Thank you for responding to my question. There are no GPUs on my cluster.

    Best regards,

    Yeping
--
    From: Benson Muite 
    Sent At: 2019 Jun. 29 (Sat.) 16:56
    To: gromacs ; 孙业平 
    Subject: Re: [gmx-users] How to install gromacs on cpu cluster

    Hi Yeping,

    It may be easier to install a newer version of GCC. Are there any GPUs
    on your cluster?

    Benson

    On 6/29/19 11:27 AM, sunyeping wrote:
    >
    > Dear everyone,
    >
> I would like to install gromacs on a cpu cluster of 12 nodes, with each node 
containing 32 cores. The gcc version on the cluster is 4.4.7. Which version of 
gromacs can be properly compiled with this gcc version?
    >
> The cluster supports the PBS job submission system. What are the correct 
options for cmake (or maybe configure) when compiling gromacs?
    >
    > Thank you in advance.
    >
    > Best regards.
    > Yeping


