[gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread reisingere
Hi everybody, I tried to run a minimization of just the hydrogens of a membrane protein. I want to do this in vacuum. But when I started the run with mpirun mdrun_mpi -deffnm protein -v -nt 2 I get the error that there is a segmentation fault. But when I only type mpirun mdrun_mpi there is no
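
A note on the command line itself, since this trips people up: in the GROMACS 4.5 series, -nt sets the thread count of a thread-MPI build (the plain mdrun binary), while an externally compiled mdrun_mpi takes its process count from the MPI launcher. The sketch below is not from the thread; binary names, file names, and core counts are placeholders.

    # Thread-MPI build: thread count is set with -nt
    mdrun -deffnm protein -v -nt 2

    # Real-MPI build: the rank count comes from mpirun -np, not from -nt
    mpirun -np 2 mdrun_mpi -deffnm protein -v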

Re: [gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread Justin A. Lemkul
On 6/12/12 5:54 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: Hi everybody, I tried to run a minimization just of the hydrogen of a membrane protein. I want to do this in vacuum. But when I started the run with mpirun mdrun_mpi -deffnm protein -v -nt 2 I get the error that there

Re: [gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread reisingere
On 6/12/12 5:54 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: Hi everybody, I tried to run a minimization just of the hydrogen of a membrane protein. I want to do this in vacuum. But when I started the run with mpirun mdrun_mpi -deffnm protein -v -nt 2 I get the error that

Re: [gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread Justin A. Lemkul
On 6/12/12 7:05 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 5:54 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: Hi everybody, I tried to run a minimization just of the hydrogen of a membrane protein. I want to do this in vacuum. But when I started the run

Re: [gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread reisingere
On 6/12/12 7:05 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 5:54 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: Hi everybody, I tried to run a minimization just of the hydrogen of a membrane protein. I want to do this in vacuum. But when I started the

Re: [gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread Justin A. Lemkul
On 6/12/12 7:34 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:05 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 5:54 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: Hi everybody, I tried to run a minimization just of the hydrogen of a

Re: [gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread reisingere
On 6/12/12 7:34 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:05 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 5:54 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: Hi everybody, I tried to run a minimization just of the hydrogen of a

Re: [gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread Justin A. Lemkul
On 6/12/12 7:46 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:34 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:05 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 5:54 AM, reising...@rostlab.informatik.tu-muenchen.de

Re: [gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread reisingere
On 6/12/12 7:46 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:34 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:05 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 5:54 AM, reising...@rostlab.informatik.tu-muenchen.de

Re: [gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread Justin A. Lemkul
On 6/12/12 8:48 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:46 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:34 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:05 AM, reising...@rostlab.informatik.tu-muenchen.de

Re: [gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread reisingere
On 6/12/12 8:48 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:46 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:34 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:05 AM, reising...@rostlab.informatik.tu-muenchen.de

Re: [gmx-users] mdrun_mpi segmentation fault for run in vacuum

2012-06-12 Thread Justin A. Lemkul
On 6/12/12 10:09 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 8:48 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:46 AM, reising...@rostlab.informatik.tu-muenchen.de wrote: On 6/12/12 7:34 AM, reising...@rostlab.informatik.tu-muenchen.de

Re: [gmx-users] mdrun_mpi issue with CHARMM36 FF

2012-05-14 Thread Mark Abraham
On 14/05/2012 3:52 PM, Anirban wrote: Hi ALL, I am trying to simulate a membrane protein system using CHARMM36 FF on GROAMCS4.5.5 on a parallel cluster running on MPI. The system consists of arounf 1,17,000 atoms. The job runs fine on 5 nodes (5X12=120 cores) using mpirun and gives proper

Re: [gmx-users] mdrun_mpi issue with CHARMM36 FF

2012-05-14 Thread Anirban
On Mon, May 14, 2012 at 11:35 AM, Mark Abraham mark.abra...@anu.edu.auwrote: On 14/05/2012 3:52 PM, Anirban wrote: Hi ALL, I am trying to simulate a membrane protein system using CHARMM36 FF on GROAMCS4.5.5 on a parallel cluster running on MPI. The system consists of arounf 1,17,000

Re: [gmx-users] mdrun_mpi issue with CHARMM36 FF

2012-05-14 Thread Mark Abraham
On 14/05/2012 4:18 PM, Anirban wrote: On Mon, May 14, 2012 at 11:35 AM, Mark Abraham mark.abra...@anu.edu.au mailto:mark.abra...@anu.edu.au wrote: On 14/05/2012 3:52 PM, Anirban wrote: Hi ALL, I am trying to simulate a membrane protein system using CHARMM36 FF on

[gmx-users] mdrun_mpi issue with CHARMM36 FF

2012-05-13 Thread Anirban
Hi ALL, I am trying to simulate a membrane protein system using CHARMM36 FF on GROMACS 4.5.5 on a parallel cluster running on MPI. The system consists of around 117,000 atoms. The job runs fine on 5 nodes (5X12=120 cores) using mpirun and gives proper output. But whenever I try to submit it on

[gmx-users] mdrun_mpi issue for CHARMM36 FF

2012-05-12 Thread Anirban
Hi ALL, I am trying to simulate a membrane protein system using CHARMM36 FF on GROMACS 4.5.5 on a parallel cluster running on MPI. The system consists of around 117,000 atoms. The job runs fine on 5 nodes (5X12=120 cores) using mpirun and gives proper output. But whenever I try to submit it on
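
When a system like this runs on a modest core count but fails on a larger one, the md.log of the failing job is usually the first place to look; GROMACS 4.5 prints the domain-decomposition setup near the start of the log. The commands below are a generic sketch (file names and core counts are placeholders, not taken from the thread).

    # See how the failing run set up its domain decomposition
    grep -A 5 "Domain decomposition" md.log

    # Retry with a core count that is known to work, optionally dedicating
    # some ranks to PME explicitly
    mpirun -np 120 mdrun_mpi -deffnm topol -npme 24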

[gmx-users] mdrun_mpi -rerun bonded interactions

2011-12-15 Thread Vasileios Tatsis
Dear Gromacs users, I am using the -rerun option of mdrun to read the coordinates of a trajectory and to compute the potential energy of a molecule during MD. When this operation is performed in parallel, using mdrun_mpi, the energy of the bonded interactions is not computed. But using one core,
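
For context, the rerun mechanism itself looks like the sketch below (file names are placeholders); the question is why the bonded terms come out differently between a serial rerun and an mdrun_mpi rerun of the same trajectory.

    # Serial rerun: recompute energies for each frame of an existing trajectory
    mdrun -s topol.tpr -rerun traj.xtc -deffnm rerun_serial

    # The same rerun in parallel
    mpirun -np 4 mdrun_mpi -s topol.tpr -rerun traj.xtc -deffnm rerun_mpi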

Re: [gmx-users] mdrun_mpi error

2011-12-10 Thread aiswarya pawar
I used the option, but I still get the error: /bin/sh ../../libtool --tag=CC --mode=compile mpCC -DHAVE_CONFIG_H -I. -I../../src -I../../include -DGMXLIBDIR=\/home/staff/sec/secdpal/soft/gromacs/share/top\ -I/home/staff/sec/secdpal/soft/include -O3 -qarch=ppc64 -qtune=pwr5 -c -o vmdio.lo

Re: [gmx-users] mdrun_mpi error

2011-12-10 Thread Mark Abraham
On 10/12/2011 7:54 PM, aiswarya pawar wrote: i used the option still i get the error as= /bin/sh ../../libtool --tag=CC --mode=compile mpCC -DHAVE_CONFIG_H -I. -I../../src -I../../include -DGMXLIBDIR=\/home/staff/sec/secdpal/soft/gromacs/share/top\ -I/home/staff/sec/secdpal/soft/include

Re: [gmx-users] mdrun_mpi error

2011-12-09 Thread aiswarya pawar
Hi, I tried giving this: ./configure --prefix=/home/soft/gromacs --host=ppc --build=ppc64 --enable-mpi --with-fft=fftw3 MPICC=mpcc CC=xlc CFLAGS=-O3 -qarch=450d -qtune=450 CXX=mpixlC_r CXXFLAGS=-O3 -qarch=450d -qtune=450 and the configure process ran well, but when I ran make mdrun, I get an
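
One detail worth noting in an invocation like this: the multi-word CFLAGS and CXXFLAGS values need quoting, otherwise the shell splits them and configure sees stray -qarch/-qtune arguments. A reconstruction of the presumably intended command, with only the quoting added (paths and flags are those quoted in the message):

    ./configure --prefix=/home/soft/gromacs --host=ppc --build=ppc64 \
                --enable-mpi --with-fft=fftw3 \
                MPICC=mpcc CC=xlc CFLAGS="-O3 -qarch=450d -qtune=450" \
                CXX=mpixlC_r CXXFLAGS="-O3 -qarch=450d -qtune=450"
    make mdrun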

Re: [gmx-users] mdrun_mpi error

2011-12-09 Thread Mark Abraham
On 10/12/2011 6:31 PM, aiswarya pawar wrote: Hi, I tried giving this- ./configure --prefix=/home/soft/gromacs --host=ppc --build=ppc64 --enable-mpi --with-fft=fftw3 MPICC=mpcc CC=xlc CFLAGS=-O3 -qarch=450d -qtune=450 CXX=mpixlC_r CXXFLAGS=-O3 -qarch=450d -qtune=450 and the configure

Re: [gmx-users] mdrun_mpi error

2011-12-08 Thread Mark Abraham
On 8/12/2011 6:35 PM, aiswarya pawar wrote: Hi users, Am running the mdrun_mpi on cluster with the md.mdp parameters as- ; VARIOUS PREPROCESSING OPTIONS title= Position Restrained Molecular Dynamics ; RUN CONTROL PARAMETERS constraints = all-bonds integrator = md dt =

[gmx-users] mdrun_mpi error

2011-12-07 Thread aiswarya pawar
Hi users, I am running mdrun_mpi on a cluster with the md.mdp parameters as: ; VARIOUS PREPROCESSING OPTIONS title= Position Restrained Molecular Dynamics ; RUN CONTROL PARAMETERS constraints = all-bonds integrator = md dt = 0.002 ; 2fs ! nsteps = 250 ; total 5000 ps.
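
The preview flattens the .mdp file; laid out one option per line, the visible part would look roughly like the sketch below. Note that the quoted nsteps value does not match its own comment (5000 ps at dt = 0.002 ps would be 2,500,000 steps), so the number was presumably truncated; it is reproduced here as quoted.

    cat > md.mdp << 'EOF'
    ; VARIOUS PREPROCESSING OPTIONS
    title       = Position Restrained Molecular Dynamics
    ; RUN CONTROL PARAMETERS
    constraints = all-bonds
    integrator  = md
    dt          = 0.002   ; 2 fs
    nsteps      = 250     ; total 5000 ps (as quoted; value likely truncated)
    EOF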

Re: [gmx-users] mdrun_mpi in HP_MPI LSF/SLURM setup

2011-04-18 Thread Larcombe, Lee
Ok. Solved it. Nothing wrong with the LSF/SLURM/MPI stuff. GMXRC.bash hadn't executed properly to set the environment up (typo) and the shared libraries weren’t being found. Seems this makes mdrun_mpi run like serial mdrun! Oops. Lee On 16/04/2011 22:51, Larcombe, Lee
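
For anyone hitting the same symptom: the fix described here amounts to sourcing GMXRC in the job environment before launching, so that PATH and LD_LIBRARY_PATH point at the MPI-enabled installation. A minimal sketch, assuming a default installation prefix (the actual path on this cluster is not given in the thread):

    # Set up the Gromacs environment in the job script before launching
    source /usr/local/gromacs/bin/GMXRC.bash
    mpirun -np 4 mdrun_mpi -deffnm topol -v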

Re: [gmx-users] mdrun_mpi in HP_MPI LSF/SLURM setup

2011-04-18 Thread Mark Abraham
On 4/18/2011 6:44 PM, Larcombe, Lee wrote: Ok. Solved it. Nothing wrong with the LSF/SLURM/MPI stuff. GMXRC.bash hadn't executed properly to set the environment up (typo) and the shared libraries weren’t being found. Seems this makes mdrun_mpi run like serial mdrun! OK, but failing to pick up

Re: [gmx-users] mdrun_mpi in HP_MPI LSF/SLURM setup

2011-04-16 Thread Mark Abraham
On 16/04/2011 12:13 AM, Larcombe, Lee wrote: Hi gmx-users We have an HPC setup running HP_MPI and LSF/SLURM. Gromacs 4.5.3 has been compiled with mpi support The compute nodes on the system contain 2 x dual core Xeons which the system sees as 4 processors An LSF script called gromacs_run.lsf

Re: [gmx-users] mdrun_mpi in HP_MPI LSF/SLURM setup

2011-04-16 Thread Larcombe, Lee
Thanks Mark, HP-MPI is configured correctly on the system - an HP XC 3000 with 800 cores. It works for all the other users (non-Gromacs), and now I've tested it: I can launch an MPI job which runs fine on the login node (two quad-core Xeons). It seems to be an issue with the number of processors passed

[gmx-users] mdrun_mpi in HP_MPI LSF/SLURM setup

2011-04-15 Thread Larcombe, Lee
Hi gmx-users, We have an HPC setup running HP_MPI and LSF/SLURM. Gromacs 4.5.3 has been compiled with MPI support. The compute nodes on the system contain 2 x dual-core Xeons, which the system sees as 4 processors. An LSF script called gromacs_run.lsf is shown below: #BSUB -N #BSUB -J
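
The script is truncated in the preview; a generic LSF skeleton for this kind of job is sketched below. The directive values and the launch line are placeholders and assumptions (in particular, how HP_MPI picks up the LSF/SLURM allocation varies between sites), not the actual gromacs_run.lsf from the thread.

    #BSUB -N                  # mail a job report when the job ends
    #BSUB -J gromacs_run      # job name
    #BSUB -n 4                # number of slots requested
    #BSUB -o gromacs_%J.out   # stdout file (%J = job ID)
    #BSUB -e gromacs_%J.err   # stderr file

    # Environment setup (e.g. sourcing GMXRC) goes here, then the launch:
    mpirun -np 4 mdrun_mpi -deffnm topol -v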

[gmx-users] mdrun_mpi!!

2011-04-13 Thread delara aghaie
Hello, In version 4.0.7, in the .ll file, I use the command line: mpiexec mdrun_mpi -v -s topol.tpr. I get an error that it does not recognize mdrun_mpi. I change it to mdrun and it works. 1) Is the bold command line OK? 2) Does version 4.0.7 not need _mpi after commands? Thanks, D. Aghaie

Re: [gmx-users] mdrun_mpi!!

2011-04-13 Thread Justin A. Lemkul
delara aghaie wrote: Hello In version 4.0.7 in .ll file, I use command line: *mpiexec mdrun_mpi -v -s topol.tpr* I get error which does noe recognize mdrun_mpi I change it to mdrun and it works. 1) is the bold command line ok? Only if you have (1) compiled with MPI support and (2) named

Re: [gmx-users] mdrun_mpi executable not found

2011-01-26 Thread Justin Kat
Thank you, I have been to that page probably a good 100 times by now. Was the 'No.' response with regards to my primary question? Or to the one within the parentheses? Suppose I remove my existing installation and reinstall, I am hoping to figure out when/where exactly should I specify

Re: [gmx-users] mdrun_mpi executable not found

2011-01-26 Thread Justin Kat
./configure --enable-mpi --program-suffix=_mpi make mdrun make install-mdrun make links Sorry for the random asterisk* symbols, they must have come through from some formatting. On Wed, Jan 26, 2011 at 12:53 PM, Justin Kat justin@mail.mcgill.ca wrote: Thank you, I have been to that page

Re: [gmx-users] mdrun_mpi executable not found

2011-01-25 Thread Justin Kat
Alright. So that means I should have instead issued: ./configure --enable-mpi --program-suffix=_mpi make mdrun make install-mdrun make links to have installed an MPI-enabled executable called mdrun_mpi apart from the existing mdrun executable? (Would I also need to append the _mpi suffix when

Re: [gmx-users] mdrun_mpi executable not found

2011-01-25 Thread Mark Abraham
On 26/01/2011 8:50 AM, Justin Kat wrote: Alright. So meaning I should have instead issued: ./configure --enable-mpi --program-suffix=_mpi make mdrun make install-mdrun make links to have installed an MPI-enabled executable called mdrun_mpi apart from the existing mdrun executable? (Would I

[gmx-users] mdrun_mpi executable not found

2011-01-24 Thread Justin Kat
Dear gmx users, I have installed the parallel version 4.0.7 of gromacs on one of the nodes of my cluster. Here are the steps I've done through root: first, the normal installation: ./configure; make; make install; make links. Then I issued the commands below for the MPI build: ./configure
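
For reference, the usual two-stage recipe for GROMACS 4.0.x (full serial install first, then an MPI-only mdrun with a distinguishing suffix) is sketched below. This is the generic procedure with a default prefix assumed, not necessarily the exact commands run on this node.

    # Stage 1: full serial installation (analysis tools + serial mdrun)
    ./configure
    make
    make install
    make links

    # Stage 2: rebuild only mdrun with MPI and install it as mdrun_mpi
    make distclean
    ./configure --enable-mpi --program-suffix=_mpi
    make mdrun
    make install-mdrun
    make links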

Re: [gmx-users] mdrun_mpi executable not found

2011-01-24 Thread Justin A. Lemkul
Justin Kat wrote: Dear gmx users, I have installed the parallel version 4.0.7 of gromacs on one of the nodes of my cluster. Here is the steps I've done through root: first, the normal installation: ./configure make make install make links then issued commands below for the mpi

Re: [gmx-users] mdrun_mpi executable not found

2011-01-24 Thread Justin Kat
Thank you for the reply! Hmm, mdrun_mpi does not appear in the list of executables in /usr/local/gromacs/bin (and therefore not in /usr/local/bin). Which set of the installation commands that I used should have compiled the mdrun_mpi executable? And how should I go about getting the mdrun_mpi

Re: [gmx-users] mdrun_mpi executable not found

2011-01-24 Thread Justin A. Lemkul
Justin Kat wrote: Thank you for the reply! hmm mdrun_mpi does not appear in the list of executables in /usr/local/gromacs/bin (and well therefore not in /usr/local/bin). Which set of installation commands that I used should have compiled the mdrun_mpi executable? And how should I go about

Re: [gmx-users] mdrun_mpi: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory

2010-07-08 Thread Carsten Kutzner
Hi, you can check with ldd mdrun_mpi whether all the needed libraries were really found. Is libimf.so in /opt/intel/fc/10.1.008/lib? The Intel compilers also come with files called iccvars.sh or ictvars.sh. If you do source /path/to/iccvars.sh, everything should be set as needed. Check the Intel
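
Concretely, the checks described here look like the sketch below; the iccvars.sh path is a guess based on the compiler directory mentioned in the thread and will differ between Intel compiler versions.

    # See which shared libraries mdrun_mpi resolves, and which are missing
    ldd mdrun_mpi | grep "not found"

    # Load the Intel compiler runtime environment (adjusts LD_LIBRARY_PATH),
    # then check again
    source /opt/intel/fc/10.1.008/bin/iccvars.sh
    ldd mdrun_mpi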

Re: [gmx-users] mdrun_mpi: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory

2010-07-08 Thread Mark Abraham
- Original Message - From: zhongjin zhongjin1...@yahoo.com.cn Date: Thursday, July 8, 2010 18:53 Subject: [gmx-users] mdrun_mpi: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory To: gmx-users@gromacs.org

Re: [gmx-users] mdrun_mpi issue.

2010-06-29 Thread Mark Abraham
- Original Message - From: quantrum75 quantru...@yahoo.com Date: Wednesday, June 30, 2010 7:12 Subject: [gmx-users] mdrun_mpi issue. To: gmx-users@gromacs.org | Hi Folks, I am trying to run a simulation under GMX 4.0.5. When

Re: [gmx-users] mdrun_mpi error Signal: Segmentation fault

2009-04-17 Thread nam kim
User reported it's a problem with the input file. 2009/4/15 annalisa bordogna annalisa.bordo...@gmail.com: Hi, I received a similar error during an equilibration by steepest descent in which I had posed constraints on water, leaving the protein free to move. I suggest to control your mdp file...

Re: [gmx-users] mdrun_mpi error Signal: Segmentation fault

2009-04-13 Thread nam kim
The process crashes around 100 steps out of the 1000 requested. On Fri, Apr 10, 2009 at 4:32 PM, Justin A. Lemkul jalem...@vt.edu wrote: nam kim wrote: I have segmentation fault error while running mdrun_mpi( gromacs 4.0.4). I have installed gromacs 4.0.4 two month ago and been working fine.

Re: [gmx-users] mdrun_mpi error Signal: Segmentation fault

2009-04-13 Thread Justin A. Lemkul
nam kim wrote: process crashes around 100 steps out of 1000 requested. Fine, but you still haven't answered my question. Do you receive any other messages? Do other systems run on the specific hardware you're using? You may just have some instability in this particular system that is

[gmx-users] mdrun_mpi error Signal: Segmentation fault

2009-04-10 Thread nam kim
I have a segmentation fault error while running mdrun_mpi (gromacs 4.0.4). I installed gromacs 4.0.4 two months ago and it has been working fine. Today, I just got segmentation errors. Rebooting does not help much. Here is the log: [rd:06790] *** Process received signal *** [d:06790] Signal: Segmentation

Re: [gmx-users] mdrun_mpi error Signal: Segmentation fault

2009-04-10 Thread Justin A. Lemkul
nam kim wrote: I have segmentation fault error while running mdrun_mpi( gromacs 4.0.4). I have installed gromacs 4.0.4 two month ago and been working fine. Today, I just got Segment errors. Rebooting does not much help. Here is log: [rd:06790] *** Process received signal *** [d:06790]

[gmx-users] mdrun_mpi stops at random

2006-05-11 Thread Jason O'Young
Hi all, I have an issue doing parallel runs where the simulation would just hang at seemingly random intervals, anywhere from an hour to a day. There are no error messages reported in the logs and nothing funny from dmesg. My setup is two dual-core Pentium Ds. I run with -np 4 to take