[gmx-users] GROMACS-4.6.3 CUDA version on multiple nodes each having 2 GPUs

2013-11-13 Thread Prajapati, Jigneshkumar Dahyabhai
Hello, I am trying to run MPI, OpenMP and CUDA-enabled GROMACS 4.6.3 on nodes that each have 12 cores (2 CPUs) and 2 GPUs (Tesla M2090). The problem is that when I launch a job, GROMACS uses only the GPUs on the first node it comes across and fails to use the GPUs on the other nodes. The command I used for two gpu enab
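
For a layout like this, the usual pattern in 4.6 is one MPI rank per GPU, with OpenMP threads filling the remaining cores. A minimal sketch (the launcher flags, rank count and file names here are assumptions for illustration, not taken from the post):

    # 2 nodes x 2 GPUs: 4 MPI ranks in total, 2 per node, 6 OpenMP threads each;
    # -gpu_id maps the two ranks on each node to GPUs 0 and 1
    mpirun -np 4 -npernode 2 mdrun_mpi -ntomp 6 -gpu_id 01 -deffnm md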

Re: [gmx-users] GROMACS 4.6.4 is released

2013-11-13 Thread jkrieger
Will a simulation from 4.6.1 continue running fine if I upgrade to 4.6.4? > Hi GROMACS users, > > GROMACS 4.6.4 is officially released. It contains numerous bug fixes, and > some noteworthy simulation performance enhancements (particularly with > GPUs!). We encourage all users to upgrade their ins

[gmx-users] GROMACS 4.6.4 is released

2013-11-13 Thread Mark Abraham
Hi GROMACS users, GROMACS 4.6.4 is officially released. It contains numerous bug fixes, and some noteworthy simulation performance enhancements (particularly with GPUs!). We encourage all users to upgrade their installations from earlier 4.6-era releases. You can find the code, manual, release no

Re: [gmx-users] Gromacs-4.6 on two Titans GPUs

2013-11-05 Thread Mark Abraham
On Tue, Nov 5, 2013 at 12:55 PM, James Starlight wrote: > Dear Richard, > > > 1) mdrun -ntmpi 1 -ntomp 12 -gpu_id 0 -v -deffnm md_CaM_test > gave me performance of about 25 ns/day for the explicitly solvated system consisting > of 68k atoms (CHARMM ff, 1.0 nm cutoffs) > > gave slightly worse performance i

Re: [gmx-users] Gromacs-4.6 on two Titans GPUs

2013-11-05 Thread James Starlight
Dear Richard, 1) mdrun -ntmpi 1 -ntomp 12 -gpu_id 0 -v -deffnm md_CaM_test gave me performance of about 25 ns/day for the explicitly solvated system consisting of 68k atoms (CHARMM ff, 1.0 nm cutoffs); gave slightly worse performance in comparison to 1); finally 3) mdrun -deffnm md_CaM_test running
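
For the record, the standard way to drive both Titans from a single node in 4.6 is two thread-MPI ranks, one per GPU. A sketch, assuming the 12-thread machine from this thread:

    # two thread-MPI ranks, each bound to one GPU, 6 OpenMP threads apiece
    mdrun -ntmpi 2 -ntomp 6 -gpu_id 01 -v -deffnm md_CaM_test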

Re: [gmx-users] Gromacs-4.6 on two Titans GPUs

2013-11-05 Thread Richard Broadbent
Dear James, On 05/11/13 11:16, James Starlight wrote: My suggestions: 1) During compilation using -march=corei7-avx-i I obtained an error that something was not found (sorry, I didn't save the log), so I compiled gromacs without this flag 2) I have twice the performance using just 1 gpu by means o

Re: [gmx-users] Gromacs-4.6 on two Titans GPUs

2013-11-05 Thread James Starlight
My suggestions: 1) During compilation using -march=corei7-avx-i I obtained an error that something was not found (sorry, I didn't save the log), so I compiled gromacs without this flag 2) I have twice the performance using just 1 gpu by means of mdrun -ntmpi 1 -ntomp 12 -gpu_id 0 -v -deffnm md_CaM_

Re: [gmx-users] Gromacs-4.6 on two Titans GPUs

2013-11-04 Thread Szilárd Páll
You can use the "-march=native" flag with gcc to optimize for the CPU you are building on, or e.g. -march=corei7-avx-i for Intel Ivy Bridge CPUs. -- Szilárd Páll On Mon, Nov 4, 2013 at 12:37 PM, James Starlight wrote: > Szilárd, thanks for suggestion! > > What kind of CPU optimisation should I t
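
One way to pass such a flag through the GROMACS 4.6 build (a sketch; note that the accelerated kernels themselves are selected separately via GMX_CPU_ACCELERATION):

    # let gcc tune all compiled code for the build host
    CFLAGS="-march=native" CXXFLAGS="-march=native" cmake ..
    make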

Re: [gmx-users] Gromacs-4.6 on two Titans GPUs

2013-11-04 Thread James Starlight
Szilárd, thanks for the suggestion! What kind of CPU optimisation should I take into account, assuming that I'm using a dual-GPU Nvidia TITAN workstation with a 6-core i7 (recognized as 12 logical CPUs in Debian)? James 2013/11/4 Szilárd Páll > That should be enough. You may want to use the -march (or equiv

Re: [gmx-users] Gromacs-4.6 on two Titans GPUs

2013-11-04 Thread Szilárd Páll
That should be enough. You may want to use the -march (or equivalent) compiler flag for CPU optimization. Cheers, -- Szilárd Páll On Sun, Nov 3, 2013 at 10:01 AM, James Starlight wrote: > Dear Gromacs Users! > > I'd like to compile the latest 4.6 Gromacs with native GPU support on my i7 > cpu w

[gmx-users] Gromacs-4.6 on two Titans GPUs

2013-11-03 Thread James Starlight
Dear Gromacs Users! I'd like to compile the latest 4.6 Gromacs with native GPU support on my i7 CPU with dual GeForce Titan GPUs mounted. With this config I'd like to perform simulations using the CPU as well as both GPUs simultaneously. What flags besides cmake .. -DGMX_GPU=ON -DCUDA_TOOLKIT_ROOT_
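
A typical full configure for such a build might look like the sketch below (the CUDA and install paths are placeholders):

    cmake .. -DGMX_GPU=ON \
             -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda \
             -DGMX_BUILD_OWN_FFTW=ON \
             -DCMAKE_INSTALL_PREFIX=$HOME/gromacs-4.6
    make -j 12 && make install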

Re: [gmx-users] Gromacs 4.6 & 4.5.3 qualitative differences & 4.6 instability in polarizable force field vacuum/liquid mixture interface simulations

2013-11-02 Thread David van der Spoel
On 2013-11-02 18:38, ploetz wrote: Dear Gromacs Users, Please start a redmine.gromacs.org issue and assign it to me, but try to simplify the system as much as possible. You can cut and paste all the information to the redmine issue. I am trying to simulate a system consisting of a vacuum/co

[gmx-users] Gromacs 4.6 & 4.5.3 qualitative differences & 4.6 instability in polarizable force field vacuum/liquid mixture interface simulations

2013-11-02 Thread ploetz
Dear Gromacs Users, I am trying to simulate a system consisting of a vacuum/condensed phase interface in which a 6x6x12nm condensed phase region is flanked on both ends (in the z-dimension) by a 6x6x12nm vacuum region to form overall box dimensions of 6x6x36 nm. The system is a binary liquid mixtu

Re: [gmx-users] Gromacs tutorials for binding free energy analysis

2013-10-28 Thread Justin Lemkul
On 10/28/13 8:02 AM, Sajad Ahrari wrote: Hello dears! I would like to know which of the tutorials presented by Gromacs for binding free energy analysis (http://www.gromacs.org/Documentation/Tutorials) are based on the LIE method? I would suggest you read them and see. I suspect none of them a

[gmx-users] Gromacs tutorials for binding free energy analysis

2013-10-28 Thread Sajad Ahrari
Hello dears! I would like to know which of the tutorials presented by Gromacs for binding free energy analysis (http://www.gromacs.org/Documentation/Tutorials) are based on the LIE method? regards, Sajad

[gmx-users] Gromacs on Stampede

2013-10-13 Thread Christopher Neale
Why not put it in a slurm script and submit that script as a (probably single-node) job. It is not generally acceptable to use a large fraction of the head node of a shared resource for a substantial amount of time. If your problem is different and of a gromacs nature, you may need to describe

Re: [gmx-users] Gromacs on Stampede

2013-10-12 Thread Arun Sharma
Hello, I have a question about running gromacs utilities on Stampede and hopefully someone can point me in the right direction. I compiled gromacs using the instructions in this thread and mdrun works fine. Also, some utilities like g_energy and g_analyze (single-core utilities, I believe) seem to b

Re: [gmx-users] Gromacs on Stampede

2013-10-11 Thread Arun Sharma
Dear Chris, Thank you so much for providing the scripts and such detailed instructions. I was trying to load the gromacs module that is already available and was unable to get it to run.  Thanks to you, I now have a working gromacs installation. On Thursday, October 10, 2013 2:59 PM, Christ

[gmx-users] Gromacs on Stampede

2013-10-10 Thread Christopher Neale
Dear Arun: here is how I compile fftw and gromacs on stampede. I have also included a job script and a script to submit a chain of jobs. As Szilárd notes, this does not use the MICs, but it is still a rather fast machine. # Compilation for single precision gromacs plus mdrun_mpi # #
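
The job-script part boils down to a short SLURM file; a sketch assuming one 16-core Stampede node and TACC's ibrun launcher (queue name, time limit and file names are placeholders):

    #!/bin/bash
    #SBATCH -J gmx                # job name
    #SBATCH -N 1                  # one node
    #SBATCH -n 16                 # 16 MPI tasks
    #SBATCH -p normal             # queue
    #SBATCH -t 24:00:00           # wall-clock limit
    ibrun mdrun_mpi -deffnm md -maxh 23.9   # stop cleanly before the limit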

Re: [gmx-users] Gromacs on Stampede

2013-10-10 Thread Szilárd Páll
Hi, GROMACS does not have Xeon Phi support, so you'll be better off using only the CPUs in Stampede. Porting and optimization is in progress, but it will probably be a few months before you can test some Phi-optimized mdrun. Running (most) analyses on Phi is not really feasible. While there are a

[gmx-users] Gromacs on Stampede

2013-10-10 Thread Arun Sharma
Hello, does anyone have experience running gromacs and data analysis tools on Stampede or a similar supercomputer? Do we have a set of best practices or approaches for this situation? Any input is highly appreciated. Thanks

Re: [gmx-users] Gromacs on Intel Xeon and AMD Opteron

2013-09-11 Thread Mark Abraham
__ > From: gmx-users-boun...@gromacs.org [gmx-users-boun...@gromacs.org] on behalf > of Mark Abraham [mark.j.abra...@gmail.com] > Sent: 11 September 2013 14:13 > To: Discussion list for GROMACS users > Subject: Re: [gmx-users] Gromacs on Intel Xeon and AMD Opteron > > Hm

RE: [gmx-users] Gromacs on Intel Xeon and AMD Opteron

2013-09-11 Thread Xu, Jianqing
Subject: Re: [gmx-users] Gromacs on Intel Xeon and AMD Opteron Hmm, that looks like a problem. Which GROMACS version, and which compiler version? (See mdrun -version) On Wed, Sep 11, 2013 at 3:06 PM, Xu, Jianqing wrote: > > Thanks for the comments, Mark. > > Sorry that I did not exp

Re: [gmx-users] Gromacs on Intel Xeon and AMD Opteron

2013-09-11 Thread Mark Abraham
s-boun...@gromacs.org [gmx-users-boun...@gromacs.org] on behalf > of Mark Abraham [mark.j.abra...@gmail.com] > Sent: 11 September 2013 10:56 > To: Discussion list for GROMACS users > Subject: Re: [gmx-users] Gromacs on Intel Xeon and AMD Opteron > > It's not clear whether

RE: [gmx-users] Gromacs on Intel Xeon and AMD Opteron

2013-09-11 Thread Xu, Jianqing
...@gromacs.org] on behalf of Mark Abraham [mark.j.abra...@gmail.com] Sent: 11 September 2013 10:56 To: Discussion list for GROMACS users Subject: Re: [gmx-users] Gromacs on Intel Xeon and AMD Opteron It's not clear whether you are reporting single points or post-equilibration time averages, but eithe

Re: [gmx-users] Gromacs on Intel Xeon and AMD Opteron

2013-09-11 Thread Mark Abraham
It's not clear whether you are reporting single points or post-equilibration time averages, but either way you must expect differences. MD simulations are chaotic. However, the long-time ensemble averages should agree well - that's the point of the simulation. Mark On Wed, Sep 11, 2013 at 10:35 A

[gmx-users] Gromacs on Intel Xeon and AMD Opteron

2013-09-11 Thread Xu, Jianqing
Dear all, this is my first time here; I apologize if I am not aware of any rules for posting a new message. I was testing Gromacs on two servers: one has an AMD Opteron processor (Server 1), and the other has an Intel Xeon processor (Server 2). The simulation system that I tried is the lysozyme

Re: [gmx-users] gromacs 4.6.3 and Intel compiler 11.x

2013-09-04 Thread Guanglei Cui
I was following http://www.gromacs.org/Documentation/Installation_Instructions. The link to 4.6.3 regression test set isn't obvious. Following the pattern, I downloaded the 4.6.3 regression test tarball (which apparently unpacks to a folder named for 4.6.2). Now, GMX_CPU_ACCELERATION=None passes al

Re: [gmx-users] gromacs 4.6.3 and Intel compiler 11.x

2013-09-03 Thread Mark Abraham
Please test using the test set version that matches the code! On Sep 4, 2013 5:16 AM, "Guanglei Cui" wrote: > Hi Szilard, > > Thanks for your reply. I may try your suggestions tomorrow when I get back > to work. > > Feeling curious, I downloaded and compiled gmx 4.6.3 on my home computer > (gcc-4
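
In practice that means fetching the regression-test tarball whose version string matches the source tarball and running it against the fresh build. A sketch (file and directory names are assumptions, and as noted elsewhere in this thread the 4.6.3 tarball may unpack to a 4.6.2-named folder):

    tar xzf regressiontests-4.6.3.tar.gz
    cd regressiontests-4.6.2
    source /path/to/gromacs/bin/GMXRC   # put the freshly built tools on PATH
    ./gmxtest.pl all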

Re: [gmx-users] gromacs 4.6.3 and Intel compiler 11.x

2013-09-03 Thread Guanglei Cui
Hi Szilard, Thanks for your reply. I may try your suggestions tomorrow when I get back to work. Feeling curious, I downloaded and compiled gmx 4.6.3 on my home computer (gcc-4.6.3 and ubuntu 12.04). Even with the default (below), kernel (38 out of 142) and freeenergy (2 out of 9) tests would stil

Re: [gmx-users] gromacs 4.6.3 and Intel compiler 11.x

2013-09-03 Thread Mark Abraham
On Tue, Sep 3, 2013 at 7:47 PM, Guanglei Cui wrote: > Dear GMX users, > > I'm attempting to compile gromacs 4.6.3 with an older Intel compiler (ver > 11.x). Here is how I compiled FFTW, > > ./configure CC=icc F77=ifort CFLAGS="-O3 -gcc" > --prefix=/tmp/gromacs-4.6.3/fftw-3.3.3/build-intel-threads

Re: [gmx-users] gromacs 4.6.3 and Intel compiler 11.x

2013-09-03 Thread Szilárd Páll
On Tue, Sep 3, 2013 at 9:50 PM, Guanglei Cui wrote: > Hi Mark, > > I agree with you and Justin, but let's just say there are things that are > out of my control ;-) I just tried SSE2 and NONE. Both failed the > regression check. That's alarming, with GMX_CPU_ACCELERATION=None only the plain C ker

Re: [gmx-users] gromacs 4.6.3 and Intel compiler 11.x

2013-09-03 Thread Mark Abraham
Can't be sure - we don't know what the problem with the compiler *is*. Mark On Tue, Sep 3, 2013 at 9:50 PM, Guanglei Cui wrote: > Hi Mark, > > I agree with you and Justin, but let's just say there are things that are > out of my control ;-) I just tried SSE2 and NONE. Both failed the > regressio

Re: [gmx-users] gromacs 4.6.3 and Intel compiler 11.x

2013-09-03 Thread Guanglei Cui
Hi Mark, I agree with you and Justin, but let's just say there are things that are out of my control ;-) I just tried SSE2 and NONE. Both failed the regression check. I think I've spent enough time on this, which justifies escalating this to someone with the control, but is failing regression chec

Re: [gmx-users] gromacs 4.6.3 and Intel compiler 11.x

2013-09-03 Thread Guanglei Cui
Hi Justin, Thanks for the response. Right now, I will just have to get around this by using a slower code. How do I switch off SSE4.1 and use a different CPU optimization scheme? Regards, Guanglei On Tue, Sep 3, 2013 at 2:39 PM, Justin Lemkul wrote: > > > On 9/3/13 1:47 PM, Guanglei Cui wrote
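
The acceleration level is a cmake option in 4.6, so switching off SSE4.1 is just a reconfigure; a sketch:

    # pick a lower SIMD level explicitly ...
    cmake .. -DGMX_CPU_ACCELERATION=SSE2
    # ... or rule the accelerated kernels out entirely while debugging
    cmake .. -DGMX_CPU_ACCELERATION=None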

Re: [gmx-users] gromacs 4.6.3 and Intel compiler 11.x

2013-09-03 Thread Justin Lemkul
On 9/3/13 1:47 PM, Guanglei Cui wrote: Dear GMX users, I'm attempting to compile gromacs 4.6.3 with an older Intel compiler (ver 11.x). Here is how I compiled FFTW, ./configure CC=icc F77=ifort CFLAGS="-O3 -gcc" --prefix=/tmp/gromacs-4.6.3/fftw-3.3.3/build-intel-threads --enable-threads --ena

[gmx-users] gromacs 4.6.3 and Intel compiler 11.x

2013-09-03 Thread Guanglei Cui
Dear GMX users, I'm attempting to compile gromacs 4.6.3 with an older Intel compiler (ver 11.x). Here is how I compiled FFTW, ./configure CC=icc F77=ifort CFLAGS="-O3 -gcc" --prefix=/tmp/gromacs-4.6.3/fftw-3.3.3/build-intel-threads --enable-threads --enable-sse2 --with-combined-threads --with-our

Re: [gmx-users] Gromacs 4.6.3 Installation Issues,

2013-08-26 Thread HANNIBAL LECTER
Probably you do not have CUDA. If you are not really interested in performing simulations using a GPU, you can set -DGMX_GPU=off during cmake. On Sat, Aug 24, 2013 at 12:15 PM, No One wrote: > hi, > > i'm having difficulties installing gromacs by creating static links to the > libraries for fftw3. >

[gmx-users] Gromacs 4.6.3 Installation Issues,

2013-08-24 Thread No One
hi, i'm having difficulties installing gromacs by creating static links to the fftw3 libraries. i am currently running: cygwin 1.7.24-1, cmake 2.8.11.2-1, fftw3 3.3.3-1. this is the input that i'm attempting to use, and it flags errors (in various combinations): cmake FFTW3F_INCLUDE_DIR=C:
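
For what it's worth, pointing cmake at a static FFTW explicitly usually takes a form like the sketch below (cache-variable names as used by the 4.6 build system to the best of my knowledge; all paths are placeholders):

    cmake .. -DGMX_FFT_LIBRARY=fftw3 \
             -DFFTWF_INCLUDE_DIR=/usr/local/include \
             -DFFTWF_LIBRARY=/usr/local/lib/libfftw3f.a \
             -DGMX_PREFER_STATIC_LIBS=ON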

Re: [gmx-users] GROMACS-CYSTEINE PROTEASES

2013-08-21 Thread MUSYOKA THOMMAS
Dear Justin, thanks so much. If I get stuck I will come knocking on your door. Cheers. On Wed, Aug 21, 2013 at 10:17 PM, Justin Lemkul wrote: > > > On 8/21/13 4:00 PM, MUSYOKA THOMMAS wrote: > >> Dear users, >> I am new to GROMACS and i have just been practising with several >> tutorials. >> I a

Re: [gmx-users] GROMACS-CYSTEINE PROTEASES

2013-08-21 Thread Justin Lemkul
On 8/21/13 4:00 PM, MUSYOKA THOMMAS wrote: Dear users, I am new to GROMACS and I have just been practising with several tutorials. I am trying to do a molecular dynamics simulation of cysteine proteases, for example falcipain-2 (PDB ID 2OUL), and have several issues to pose: 1) looking at the structu

[gmx-users] GROMACS-CYSTEINE PROTEASES

2013-08-21 Thread MUSYOKA THOMMAS
Dear users, I am new to GROMACS and I have just been practising with several tutorials. I am trying to do a molecular dynamics simulation of cysteine proteases, for example falcipain-2 (PDB ID 2OUL), and have several issues to pose: 1) looking at the structure, it has water molecules - do I get rid of the

Re: [gmx-users] gromacs installation in mac osx 10.8

2013-08-19 Thread Mark Abraham
This is a known issue (e.g. http://bugzilla.gromacs.org/issues/1021). I have literally no idea if CUDA is even available for this hardware. You do have the option of cmake -DGMX_CPU_ACCELERATION=SSE4.1, which might be imperceptibly slower than AVX given the use of a GPU. There are rumours of OpenM

[gmx-users] gromacs installation in mac osx 10.8

2013-08-19 Thread Sudip Roy
Dear All, I am trying to install Gromacs 4.6.3 on my MacBook Pro Retina with 10.8. It has a GPU (NVIDIA GeForce GT 650M, 1024 MB). I was really hopeful about using the GPU and the i7 (8 threaded cores) for calculations, but I am finding it difficult to install Gromacs. Mac OS X 10.8 comes with cla

[gmx-users] gromacs-4.5.5 cpmd QM/MM versions

2013-08-13 Thread tarak karmakar
Dear All, I am planning to perform a QM/MM calculation on my protein system. Can anybody tell me whether gromacs-4.5.5 can be patched with any recent version of CPMD? If not, then please suggest some combination of gromacs-cpmd versions. I came across this tutorial, but there, both groma

[gmx-users] gromacs-4.5.5 CPMD QM/MM

2013-08-09 Thread tarak karmakar
Dear All, I am planning to perform a QM/MM calculation on my protein system. Can anybody tell me whether gromacs-4.5.5 can be patched with any recent version of CPMD? If not, then please suggest some combination of gromacs-cpmd versions. I came across this tutorial, but there, both groma

Re: [gmx-users] Gromacs: GPU detection

2013-08-07 Thread Szilárd Páll
That should never happen. If mdrun is compiled with GPU support and GPUs are detected, the detection stats should always get printed. Can you reliably reproduce the issue? -- Szilárd On Fri, Aug 2, 2013 at 9:50 AM, Jernej Zidar wrote: > Hi there. > Lately I've been running simulations using G

[gmx-users] Gromacs 4.6.3 installation Issue with Intel & CUDA

2013-08-03 Thread Jim Strong
Hello, I have a problem compiling Gromacs 4.6.3 on RHEL 6.4 x64 with permutations of the following programs: Intel 13.1.0 CUDA 5.5 OpenMPI 1.7.2 Cmake 2.8.11.2 FFTW 3.3.3 (or Intel MKL) GNU 4.8.1 Basically, once the Intel compiler is introduced into the picture, NVCC seems to fail: ---

[gmx-users] Gromacs: GPU detection

2013-08-02 Thread Jernej Zidar
Hi there. Lately I've been running simulations using GPUs on a compute node. I noticed that though the GPUs are always in use, sometimes I don't get this message in the output: Using 4 MPI threads Using 2 OpenMP threads per tMPI thread 4 GPUs detected: #0: NVIDIA Tesla C2070, compute cap.: 2.0

Re: [gmx-users] GROMACS 4.6.3 Static Linking

2013-07-31 Thread Szilárd Páll
On Thu, Jul 25, 2013 at 5:55 PM, Mark Abraham wrote: > That combo is supposed to generate a CMake warning. > > I also get a warning during linking that some shared library will have > to provide some function (getpwuid?) at run time, but the binary is > static. That warning has always popped up f

[gmx-users] Gromacs on Rescale

2013-07-25 Thread Joris Poort
Dear users, Please take a moment to check out Gromacs on Rescale. http://blog.rescale.com/run-gromacs-faster-on-rescale-with-parallelization/ We would really appreciate any feedback you might have for us. Best, Joris *Joris Poort* Rescale jo...@rescale.com

Re: [gmx-users] GROMACS 4.6.3 Static Linking

2013-07-25 Thread Mark Abraham
That combo is supposed to generate a CMake warning. I also get a warning during linking that some shared library will have to provide some function (getpwuid?) at run time, but the binary is static. Mark On Thu, Jul 25, 2013 at 4:21 PM, Andrew R Turner wrote: > Mark, > > A bit of testing has re

Re: [gmx-users] GROMACS 4.6.3 Static Linking

2013-07-25 Thread Andrew R Turner
Mark, A bit of testing has revealed that it is the "-DGMX_PREFER_STATIC_LIBS=ON" flag that makes the difference. With this flag you end up with dynamic executables that do not work (I think due to some glibc problem I have not yet tracked down) whereas if I exclude this option then I get

Re: [gmx-users] GROMACS 4.6.3 Static Linking

2013-07-25 Thread Mark Abraham
Vitaly has it upside down - it is normally required to build a static binary on Crays. cmake .. -DBUILD_SHARED_LIBS=off just works for building static binaries for me with 4.6.3 on lindgren, a Cray XE6, when using PrgEnv-gnu/4.0.46 Mark On Wed, Jul 24, 2013 at 8:58 AM, Andrew R Turner wrote: >
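
For reference, a minimal static configure on such a system, assuming the Cray compiler wrappers from PrgEnv-gnu (a sketch, not a verified recipe for every XE6):

    # cc/CC are the Cray wrappers, which link statically by default
    cmake .. -DCMAKE_C_COMPILER=cc -DCMAKE_CXX_COMPILER=CC \
             -DGMX_MPI=ON -DBUILD_SHARED_LIBS=off
    make -j 8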

Re: [gmx-users] GROMACS 4.6.3 Static Linking

2013-07-24 Thread Andrew R Turner
Hi Vitaly, Impossible just for v4.6.3? It was certainly possible to create static executables for a Cray XE using v4.6.1 (I know, because I have done it). I followed the same procedure for 4.6.3 and have only managed to get dynamic executables (which do not work) hence my question. I will

[gmx-users] GROMACS benchmarks (d.dppc, d.lzm, etc.) with V4.6.1

2013-07-23 Thread tuccillo
Hi Folks, I can't seem to get the GROMACS benchmarks to work with 4.6.1. They work fine with 4.5.5. Can anyone offer any suggestion? It dies in grompp first with the following: --- Program grompp_mpi, VERSION 4.6.1 Source code file: /lustre/tucc

Re: [gmx-users] GROMACS 4.6.3 Static Linking

2013-07-20 Thread Dr. Vitaly Chaban
Someone said here that static versions are impossible for Cray... Dr. Vitaly V. Chaban On Fri, Jul 19, 2013 at 12:55 PM, Andrew R Turner wrote: > Hi > > I am having problems creating static versions of the GROMACS binaries for > a Cray XE6 (www.hector.ac.uk). The build process I am using is d

Re: [gmx-users] GROMACS 4.6.3 Static Linking

2013-07-19 Thread Andrew R Turner
Hi Mark, What does build_*/src/buildinfo.h have to say about the compiler flags that are getting used? /** C compiler flags used to build */ #define BUILD_CFLAGS "-msse2 -Wextra -Wno-missing-field-initializers -Wno-sign-compare -Wall -Wno-unused -Wunused-value -static -O3 -

Re: [gmx-users] GROMACS 4.6.3 Static Linking

2013-07-19 Thread Mark Abraham
What does build_*/src/buildinfo.h have to say about the compiler flags that are getting used? Mark On Fri, Jul 19, 2013 at 12:55 PM, Andrew R Turner wrote: > Hi > > I am having problems creating static versions of the GROMACS binaries for a > Cray XE6 (www.hector.ac.uk). The build process I am u

[gmx-users] GROMACS 4.6.3 Static Linking

2013-07-19 Thread Andrew R Turner
Hi I am having problems creating static versions of the GROMACS binaries for a Cray XE6 (www.hector.ac.uk). The build process I am using is documented at: http://www.hector.ac.uk/support/documentation/software/gromacs/compiling_4-6-1_phase3.php and successfully produced static binaries for

Re: [gmx-users] Gromacs installation problem

2013-07-12 Thread Douglas Houston
Thanks a lot Mark, that worked (after I did "setenv LD_LIBRARY_PATH ${LD_LIBRARY_PATH}:/usr/people/douglas/programs/gromacs-4.6.3/install/lib"). Quoting Mark Abraham on Fri, 12 Jul 2013 13:22:01 +0100: On Fri, Jul 12, 2013 at 11:18 AM, Douglas Houston wrote: Hi, I am having trouble in
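
The bash equivalent of that csh line, plus the sourcing shortcut GROMACS itself installs (the install path is the one from this thread):

    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/people/douglas/programs/gromacs-4.6.3/install/lib
    # or simply source the environment script shipped with the installation:
    source /usr/people/douglas/programs/gromacs-4.6.3/install/bin/GMXRC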

Re: [gmx-users] Gromacs installation problem

2013-07-12 Thread Mark Abraham
On Fri, Jul 12, 2013 at 11:18 AM, Douglas Houston wrote: > Hi, > > I am having trouble installing Gromacs 4.6.3. > > In bash I am using the following sequence of commands: > > cd gromacs-4.6.3 > mkdir build > cd build > CC=/usr/people/douglas/programs/gcc-4.7.3/installation/bin/gcc > ~/programs/cm

[gmx-users] Gromacs installation problem

2013-07-12 Thread Douglas Houston
Hi, I am having trouble installing Gromacs 4.6.3. In bash I am using the following sequence of commands: cd gromacs-4.6.3 mkdir build cd build CC=/usr/people/douglas/programs/gcc-4.7.3/installation/bin/gcc ~/programs/cmake-2.8.7/bin/cmake .. -DGMX_BUILD_OWN_FFTW=ON make sudo make install Ev

[gmx-users] GROMACS 4.6.3 released

2013-07-05 Thread Mark Abraham
Hi GROMACS users, GROMACS 4.6.3 is officially released. It contains one major bug fix; a significant simulation performance regression with MPI-enabled builds was introduced in 4.6.2. Several less critical issues have also been addressed. We encourage all users to upgrade their installations from

Re: [gmx-users] Gromacs GPU system question

2013-07-04 Thread Szilárd Páll
On Mon, Jun 24, 2013 at 4:43 PM, Szilárd Páll wrote: > On Sat, Jun 22, 2013 at 5:55 PM, Mirco Wahab > wrote: >> On 22.06.2013 17:31, Mare Libero wrote: >>> >>> I am assembling a GPU workstation to run MD simulations, and I was >>> wondering if anyone has any recommendation regarding the GPU/CPU >

[gmx-users] gromacs

2013-07-02 Thread battis...@libero.it
Dear users and experts, I'm doing an umbrella-sampling calculation to obtain the PMF when two structures (A and B) are at various distances along x. I set up my md-umbrella.mdp file as follows: ; COM PULLING ; Pull type: no, umbrella, constraint or constant_force pull = umbrella
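
For comparison, a minimal pull block in 4.x mdp syntax; the group names, window distance and force constant below are placeholders, not values from the original post:

    pull            = umbrella
    pull_geometry   = distance
    pull_dim        = Y N N      ; restrain along x only
    pull_ngroups    = 1
    pull_group0     = A          ; reference group
    pull_group1     = B          ; pulled group
    pull_init1      = 1.0        ; window distance in nm
    pull_rate1      = 0.0        ; fixed window for umbrella sampling
    pull_k1         = 1000       ; kJ mol^-1 nm^-2
    pull_nstxout    = 1000
    pull_nstfout    = 1000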

Re: [gmx-users] Gromacs 4.5.4 and Gromacs 4.5.5 give different results

2013-06-27 Thread Sapna Sarupria
I have tried the same with Berendsen and it still crashes. I have seen this happen before with some other version combinations too (I have an old post about it on the mailing list). Thanks for the response! On Thu, Jun 27, 2013 at 3:21 PM, Justin Lemkul wrote: > > > On 6/27/13 3:16 PM, Sapna S

Re: [gmx-users] Gromacs 4.5.4 and Gromacs 4.5.5 give different results

2013-06-27 Thread Justin Lemkul
On 6/27/13 3:16 PM, Sapna Sarupria wrote: Thanks Justin for the response. I thought of that too but was not sure if that could alone be attributed to the crash. There is no obvious reason why it should happen given Gromacs has typically been quite stable. This is quite a straightforward system.

Re: [gmx-users] Gromacs 4.5.4 and Gromacs 4.5.5 give different results

2013-06-27 Thread Sapna Sarupria
Thanks Justin for the response. I thought of that too but was not sure if that could alone be attributed to the crash. There is no obvious reason why it should happen given Gromacs has typically been quite stable. This is quite a straightforward system...but perhaps it is the version. I am able to

Re: [gmx-users] Gromacs 4.5.4 and Gromacs 4.5.5 give different results

2013-06-27 Thread Justin Lemkul
On 6/27/13 2:30 PM, sarupria wrote: Hello all, I have a naphthalene + water system that I want to run an NPT simulation of. The system has been energy minimized. When I run the NPT simulation using gromacs 4.5.5 the simulation runs fine, but when I attempt to run the same simulation using 4.

[gmx-users] Gromacs 4.5.4 and Gromacs 4.5.5 give different results

2013-06-27 Thread sarupria
Hello all, I have a naphthalene + water system that I want to run an NPT simulation of. The system has been energy minimized. When I run the NPT simulation using gromacs 4.5.5 the simulation runs fine, but when I attempt to run the same simulation using 4.5.4 it crashes with a LINCS error. We ha

Re: [gmx-users] Gromacs GPU system question

2013-06-26 Thread Szilárd Páll
Thanks Mirco, good info, your numbers look quite consistent. The only complicating factor is that your CPUs are overclocked by different amounts, which changes the relative performances somewhat compared to non-overclocked parts. However, let me list some prices to show that the top-of-the line AM

Re: [gmx-users] Gromacs GPU system question

2013-06-24 Thread Szilárd Páll
On Sat, Jun 22, 2013 at 5:55 PM, Mirco Wahab wrote: > On 22.06.2013 17:31, Mare Libero wrote: >> >> I am assembling a GPU workstation to run MD simulations, and I was >> wondering if anyone has any recommendation regarding the GPU/CPU >> combination. >> From what I can see, the GTX690 could be th

Re: [gmx-users] Gromacs GPU system question

2013-06-24 Thread Szilárd Páll
I strongly suggest that you consider the single-chip GTX cards instead of a dual-chip one; from the point of view of price/performance you'll probably get the most from a 680 or 780. You could ask why, so here are the reasons: - The current parallelization scheme requires domain-decomposition to u

Re: 转发:[gmx-users] gromacs on GPU

2013-06-23 Thread Mark Abraham
On Sun, Jun 23, 2013 at 3:30 PM, wrote: > Dear gromacs users, > > Can anyone tell me how to make an mdp file for gromacs mdrun on an NVIDIA GPU > card? The following is an mdp file which runs well on the cpu, but when I add the > option "-testverlet" to mdrun in order to run it on the GPU, it returns error

转发:[gmx-users] gromacs on GPU

2013-06-23 Thread sunyeping
Dear gromacs users, Can anyone tell me how to make an mdp file for gromacs mdrun on an NVIDIA GPU card? The following is an mdp file which runs well on the cpu, but when I add the option "-testverlet" to mdrun in order to run it on the GPU, it returns the error "Fatal error: User non-bonded potentials are not
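
The error points at tabulated (user) non-bonded interactions, which the Verlet scheme behind -testverlet does not support. A sketch of mdp settings that are GPU-compatible in 4.6 (the cut-off values are placeholders and must match the force field):

    cutoff-scheme   = Verlet
    coulombtype     = PME
    vdwtype         = Cut-off    ; no "user" tables with the Verlet scheme
    rcoulomb        = 1.0
    rvdw            = 1.0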

Re: [gmx-users] Gromacs GPU system question

2013-06-22 Thread Mirco Wahab
On 22.06.2013 22:18, Mare Libero wrote: The vendor I contacted was pushing for one of the high-end i7 processors with hyper-threading. But from what I can read, most MD software makes no use of it. So, using a multi-core AMD (like your FX-8350) can be a cheaper and more advanta

Re: [gmx-users] Gromacs GPU system question

2013-06-22 Thread Mirco Wahab
On 22.06.2013 17:31, Mare Libero wrote: I am assembling a GPU workstation to run MD simulations, and I was wondering if anyone has any recommendation regarding the GPU/CPU combination. From what I can see, the GTX690 could be the best bang for my buck in terms of number of cores, memory, clock

[gmx-users] Gromacs GPU system question

2013-06-22 Thread Mare Libero
Hello, I am assembling a GPU workstation to run MD simulations, and I was wondering if anyone has any recommendation regarding the GPU/CPU combination. From what I can see, the GTX690 could be the best bang for my buck in terms of number of cores, memory, clock rate. But being a dual GPU card,

Re: [gmx-users] gromacs on GPU

2013-06-21 Thread Mark Abraham
On Fri, Jun 21, 2013 at 8:33 AM, wrote: > Can anyone tell me how to make an mdp file for gromacs mdrun on an NVIDIA GPU > card? The following is an mdp file which runs well on the cpu, but when I add the > option "-testverlet" to mdrun in order to run it on the GPU, it returns the error > "non-bonded potential

[gmx-users] gromacs on GPU

2013-06-20 Thread sunyeping
Can anyone tell me how to make an mdp file for gromacs mdrun on an NVIDIA GPU card? The following is an mdp file which runs well on the cpu, but when I add the option "-testverlet" to mdrun in order to run it on the GPU, it returns the error "non-bonded potential is not supported". Could you check the mdp fil

[gmx-users] GROMACS on GPU

2013-06-20 Thread sunyeping
Dear gromacs users, Can anyone tell me how to make an mdp file for gromacs mdrun on an NVIDIA GPU card? The following is an mdp file which runs well on the cpu, but when I add the option "-testverlet" to mdrun in order to run it on the GPU, it returns the error "non-bonded potential is not supported". Could

Re: [gmx-users] Gromacs 4.5.5

2013-06-17 Thread Emmanuel, Alaina
Thank you Mark. Sent from my Ultrafast Samsung Galaxy S4 on Three Original message From: Mark Abraham Date: 17/06/2013 08:02 (GMT+00:00) To: Discussion list for GROMACS users Subject: Re: [gmx-users] Gromacs 4.5.5 On Jun 16, 2013 7:58 PM, "Emmanuel, Alaina"

Re: [gmx-users] Gromacs 4.5.5

2013-06-17 Thread Mark Abraham
On Jun 16, 2013 7:58 PM, "Emmanuel, Alaina" wrote: > > Dear All, > > I've had to re-install my gromacs on my computer and I'm having issues getting past the "Making all in Man7" during the "make" stage. There is no error in the output you show. Grepping for "error" doesn't really help when there

[gmx-users] Gromacs 4.5.5

2013-06-16 Thread Emmanuel, Alaina
Dear All, I've had to re-install my gromacs on my computer and I'm having issues getting past the "Making all in Man7" during the "make" stage. Below (at the end) are 2 copies of different outputs I get during my installation. The first copy is from the "make.log" file. The second copy is from

[gmx-users] gromacs 4.6.2 MPI distribution location problems

2013-06-10 Thread sirishkaushik
Hi All, I installed gromacs 4.6.2 using the following cmake options: cmake -DCMAKE_INSTALL_PREFIX=/home/kaushik/gromacs_executable/gromacs-new-mpi -DGMX_MPI=on After successful installation, when I run a test with mpirun mdrun, the program breaks with the following error: ---
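
One thing worth checking: with -DGMX_MPI=on the installed binary carries the _mpi suffix by default, so the launch should name that binary. A sketch (the rank count and tpr name are examples; the install prefix is the one from this post):

    mpirun -np 8 /home/kaushik/gromacs_executable/gromacs-new-mpi/bin/mdrun_mpi -deffnm topol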

Re: [gmx-users] GROMACS 4.6.2 released

2013-05-30 Thread Mark Abraham
On Thu, May 30, 2013 at 5:54 PM, Albert wrote: > it seems that Gromacs update quite frequently these days.. We try, thanks! :-) The idea is to get these patch releases out more-or-less monthly. This time, we spent several weeks noticing various correctness problems with not-quite-mainstream use

Re: [gmx-users] GROMACS 4.6.2 released

2013-05-30 Thread rajat desikan
Excellent work. Thank you all for working so hard! On Thu, May 30, 2013 at 9:24 PM, Albert wrote: > it seems that Gromacs update quite frequently these days.. > > > > > > On 05/30/2013 05:42 PM, Mark Abraham wrote: > >> Hi GROMACS users, >> >> >> GROMACS 4.6.2 is officially released. It contain

Re: [gmx-users] GROMACS 4.6.2 released

2013-05-30 Thread Albert
it seems that Gromacs update quite frequently these days.. On 05/30/2013 05:42 PM, Mark Abraham wrote: Hi GROMACS users, GROMACS 4.6.2 is officially released. It contains numerous bug fixes, some simulation performance enhancements and some documentation updates. We encourage all users to u

[gmx-users] GROMACS 4.6.2 released

2013-05-30 Thread Mark Abraham
Hi GROMACS users, GROMACS 4.6.2 is officially released. It contains numerous bug fixes, some simulation performance enhancements and some documentation updates. We encourage all users to upgrade their installations from 4.6 and 4.6.1. You can find the code, manual, release notes, installation i

Re: [gmx-users] Gromacs for Non biological systems

2013-05-28 Thread Dr. Vitaly Chaban
Look for my papers; at least two dozen of them are about non-biophysical stuff. Gromacs can. The question is whether you can provide an adequate Hamiltonian to describe your systems involving the Al surface. Dr. Vitaly Chaban On Mon, May 27, 2013 at 10:59 PM, Jeya vimalan wrote: > Dear Collegu

[gmx-users] Gromacs for Non biological systems

2013-05-27 Thread Jeya vimalan
Dear Colleagues, I was pointed to Gromacs to make it work on non-biological systems. My aim is to understand the interaction of Hf precursors on a gamma-alumina surface, but I do not yet know if gromacs can handle this efficiently. Can someone help me find some papers where GROMACS has been us

Re: [gmx-users] Gromacs mpi error while running REMD

2013-05-01 Thread Justin Lemkul
On 5/1/13 7:50 PM, bharat gupta wrote: Dear gmx-users, I got the following error after issuing the final command for running 12 replicas :- [bme:42039] *** Process received signal *** [bme:42039] Signal: Segmentation fault (11) [bme:42039] Signal code: Invalid permissions (2) [bme:42039] Fail

[gmx-users] Gromacs mpi error while running REMD

2013-05-01 Thread bharat gupta
Dear gmx-users, I got the following error after issuing the final command for running 12 replicas :- [bme:42039] *** Process received signal *** [bme:42039] Signal: Segmentation fault (11) [bme:42039] Signal code: Invalid permissions (2) [bme:42039] Failing at address: 0x7f093b655340 [bme:42039]
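
For reference, a typical REMD launch runs one MPI rank per replica, like the sketch below (the rank count matches the 12 replicas; the exchange interval and file prefix are placeholders):

    # -multi 12 picks up remd0.tpr ... remd11.tpr; -replex is in MD steps
    mpirun -np 12 mdrun_mpi -multi 12 -replex 1000 -deffnm remd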

Re: [gmx-users] GROMACS 4.6 with GPU acceleration (double

2013-04-22 Thread Szilárd Páll
On Mon, Apr 22, 2013 at 8:49 AM, Albert wrote: > On 04/22/2013 08:40 AM, Mikhail Stukan wrote: >> >> Could you explain which hardware do you mean? As far as I know, K20X >> supports double precision, so I would assume that double precision GROMACS >> should be realizable on it. > > > Really? But m

Re: [gmx-users] GROMACS 4.6 with GPU acceleration (double presion)

2013-04-22 Thread Szilárd Páll
On Tue, Apr 9, 2013 at 6:52 PM, David van der Spoel wrote: > On 2013-04-09 18:06, Mikhail Stukan wrote: > >> Dear experts, >> >> I have the following question. I am trying to compile GROMACS 4.6.1 with >> GPU acceleration and have the following diagnostics: >> >> # cmake .. -DGMX_DOUBLE=ON -DGMX_B

Re: [gmx-users] GROMACS 4.6 with GPU acceleration (double

2013-04-21 Thread Albert
On 04/22/2013 08:40 AM, Mikhail Stukan wrote: Could you explain which hardware do you mean? As far as I know, the K20X supports double precision, so I would assume that double-precision GROMACS should be realizable on it. Really? But many people have discussed that the GPU version ONLY supports s

Re: [gmx-users] GROMACS 4.6 with GPU acceleration (double

2013-04-21 Thread Mikhail Stukan
David, Thank you very much for the reply. >The hardware does not support it yet AFAIK. Could you explain which hardware do you mean? As far as I know, K20X supports double precision, so I would assume that double precision GROMACS should be realizable on it. Thanks and regards, Mikhail >O

[gmx-users] GROMACS 4.5.7 released

2013-04-19 Thread Mark Abraham
Hi GROMACS users, GROMACS 4.5.7 is officially released. It contains some bug fixes, particularly for the md-vv integrator. You can find the code, release notes, and installation instructions at the links below. ftp://ftp.gromacs.org/pub/gromacs/gromacs-4.5.7.tar.gz http://www.gromacs.org/About_G

Re: [gmx-users] GROMACS 4.6 with GPU acceleration (double presion)

2013-04-09 Thread David van der Spoel
On 2013-04-09 18:06, Mikhail Stukan wrote: Dear experts, I have the following question. I am trying to compile GROMACS 4.6.1 with GPU acceleration and have the following diagnostics: # cmake .. -DGMX_DOUBLE=ON -DGMX_BUILD_OWN_FFTW=ON -DGMX_GPU=ON -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda -DCUDA
