Re: [gmx-users] mdrun_mpi error

2011-12-10 Thread Mark Abraham
On 10/12/2011 7:54 PM, aiswarya pawar wrote: I used the option, but I still get the error: /bin/sh ../../libtool --tag=CC --mode=compile mpCC -DHAVE_CONFIG_H -I. -I../../src -I../../include -DGMXLIBDIR=\"/home/staff/sec/secdpal/soft/gromacs/share/top\" -I/home/staff/sec/secdpal/soft/include …

Re: [gmx-users] mdrun_mpi error

2011-12-10 Thread aiswarya pawar
I used the option, but I still get the error: /bin/sh ../../libtool --tag=CC --mode=compile mpCC -DHAVE_CONFIG_H -I. -I../../src -I../../include -DGMXLIBDIR=\"/home/staff/sec/secdpal/soft/gromacs/share/top\" -I/home/staff/sec/secdpal/soft/include -O3 -qarch=ppc64 -qtune=pwr5 -c -o vmdio.lo vmdio …
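Before digging into the libtool error itself, it can help to confirm that the MPI compiler wrapper works on its own with the same flags. A minimal sketch, assuming mpcc is the IBM MPI C wrapper on this machine and reusing the -qarch/-qtune values from the compile line above:

    $ cat > hello_mpi.c <<'EOF'
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank;
        MPI_Init(&argc, &argv);               /* start MPI */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* which process am I? */
        printf("hello from rank %d\n", rank);
        MPI_Finalize();
        return 0;
    }
    EOF
    $ mpcc -O3 -qarch=ppc64 -qtune=pwr5 hello_mpi.c -o hello_mpi

If this trivial program does not compile with those flags, the problem lies with the compiler/MPI setup rather than with GROMACS.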

Re: [gmx-users] mdrun_mpi error

2011-12-09 Thread Mark Abraham
On 10/12/2011 6:31 PM, aiswarya pawar wrote: Hi, I tried this: ./configure --prefix=/home/soft/gromacs --host=ppc --build=ppc64 --enable-mpi --with-fft=fftw3 MPICC="mpcc" CC="xlc" CFLAGS="-O3 -qarch=450d -qtune=450" CXX="mpixlC_r" CXXFLAGS="-O3 -qarch=450d -qtune=450", and the conf…

Re: [gmx-users] mdrun_mpi error

2011-12-09 Thread aiswarya pawar
Hi, I tried this: ./configure --prefix=/home/soft/gromacs --host=ppc --build=ppc64 --enable-mpi --with-fft=fftw3 MPICC="mpcc" CC="xlc" CFLAGS="-O3 -qarch=450d -qtune=450" CXX="mpixlC_r" CXXFLAGS="-O3 -qarch=450d -qtune=450", and the configure process ran well, but when I ran make mdrun, …
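When configure finishes cleanly but make mdrun later fails, it is worth checking which compilers and flags configure actually recorded. A small sketch, assuming a standard autoconf-generated config.log sits in the build directory (as it does for GROMACS 4.x); the exact variable names in config.log may vary:

    $ ./configure [options as above] 2>&1 | tee configure.out
    $ grep -E "^(CC|MPICC|CFLAGS|CXX|CXXFLAGS)='" config.log   # values configure recorded

Comparing those values with the compile line that later fails can show whether the flags you passed (for example -qarch=450d here versus -qarch=ppc64 -qtune=pwr5 in the failing line above) are the ones actually being used.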

Re: [gmx-users] mdrun_mpi error

2011-12-08 Thread Mark Abraham
On 8/12/2011 6:35 PM, aiswarya pawar wrote: Hi users, I am running mdrun_mpi on the cluster with these md.mdp parameters: ; VARIOUS PREPROCESSING OPTIONS title = Position Restrained Molecular Dynamics ; RUN CONTROL PARAMETERS constraints = all-bonds integrator = md dt = 0.00…

[gmx-users] mdrun_mpi error

2011-12-07 Thread aiswarya pawar
Hi users, I am running mdrun_mpi on the cluster with these md.mdp parameters: ; VARIOUS PREPROCESSING OPTIONS title = Position Restrained Molecular Dynamics ; RUN CONTROL PARAMETERS constraints = all-bonds integrator = md dt = 0.002 ; 2fs ! nsteps = 250 ; total 5000 ps. ns…
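For readability, the visible part of that md.mdp excerpt, one parameter per line exactly as quoted (the preview cuts off after nsteps, so the value shown there is incomplete):

    ; VARIOUS PREPROCESSING OPTIONS
    title        = Position Restrained Molecular Dynamics
    ; RUN CONTROL PARAMETERS
    constraints  = all-bonds
    integrator   = md
    dt           = 0.002   ; 2 fs
    nsteps       = 250     ; total 5000 ps (value truncated in the preview)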

Re: [gmx-users] mdrun_mpi: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory

2010-07-08 Thread Mark Abraham
- Original Message - From: zhongjin Date: Thursday, July 8, 2010 18:53 Subject: [gmx-users] mdrun_mpi: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory To: gmx-users@gromacs.org

Re: [gmx-users] mdrun_mpi: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory

2010-07-08 Thread Carsten Kutzner
Hi, you can check with ldd mdrun_mpi whether all the needed libraries were really found. Is libimf.so in /opt/intel/fc/10.1.008/lib? The Intel compilers also come with files called "iccvars.sh" or "ictvars.sh". If you do source /path/to/iccvars.sh, everything should be set as needed. Check the In…
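A concrete sketch of those checks, using the Intel path mentioned above (adjust the version and the location of iccvars.sh to your installation):

    $ ldd $(which mdrun_mpi) | grep "not found"   # any unresolved shared libraries?
    $ ls /opt/intel/fc/10.1.008/lib/libimf.so     # is the Intel math library where expected?
    $ source /path/to/iccvars.sh                  # or ictvars.sh; sets LD_LIBRARY_PATH etc.
    $ ldd $(which mdrun_mpi) | grep libimf        # libimf.so should now resolve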

[gmx-users] mdrun_mpi: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory

2010-07-08 Thread zhongjin
Dear users, when using GROMACS 4.0.7 on a compute node and executing the command mpiexec -n 4 mdrun_mpi -deffnm SWNT66nvt >/dev/null &, I met a problem: mdrun_mpi: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory. But I have ad…
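The message is cut off, but a frequent cause on compute nodes is that the library path set interactively on the login node is not seen by the MPI-launched processes. A hedged sketch of making the Intel runtime directory visible to non-interactive shells as well (the directory is the one named in the reply above; yours may differ):

    # in ~/.bashrc or the job script, so MPI-spawned shells on the compute nodes inherit it
    export LD_LIBRARY_PATH=/opt/intel/fc/10.1.008/lib:$LD_LIBRARY_PATH

    # then launch as before
    mpiexec -n 4 mdrun_mpi -deffnm SWNT66nvt > /dev/null &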

Re: [gmx-users] mdrun_mpi error Signal: Segmentation fault

2009-04-17 Thread nam kim
User reported it was a problem with the input file. 2009/4/15 annalisa bordogna: > Hi, > I received a similar error during an equilibration by steepest descent in > which I had placed constraints on water, leaving the protein free to move. > I suggest checking your mdp file... maybe you did the same thi…

Re: [gmx-users] mdrun_mpi error Signal: Segmentation fault

2009-04-15 Thread annalisa bordogna
Hi, I received a similar error during an equilibration by steepest descent in which I had placed constraints on water, leaving the protein free to move. I suggest checking your mdp file... maybe you did the same thing and the system collapsed or exploded (you can see that by reading the log file: if…
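A quick way to see whether the system blew up in that way, assuming the run log is md.log (GROMACS writes LINCS WARNING blocks when constraints start to fail, typically just before a crash):

    $ grep -c "LINCS WARNING" md.log   # a large count usually means the system is exploding
    $ tail -n 60 md.log                # inspect the last energy block written before the crash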

Re: [gmx-users] mdrun_mpi error Signal: Segmentation fault

2009-04-13 Thread Justin A. Lemkul
nam kim wrote: process crashes around 100 steps out of 1000 requested. Fine, but you still haven't answered my question. Do you receive any other messages? Do other systems run on the specific hardware you're using? You may just have some instability in this particular system that is c…

Re: [gmx-users] mdrun_mpi error Signal: Segmentation fault

2009-04-13 Thread nam kim
The process crashes around 100 steps out of 1000 requested. On Fri, Apr 10, 2009 at 4:32 PM, Justin A. Lemkul wrote: > nam kim wrote: >> I have a segmentation fault error while running mdrun_mpi (GROMACS 4.0.4). >> I installed GROMACS 4.0.4 two months ago and it has been working fine. >> Today, I j…

Re: [gmx-users] mdrun_mpi error Signal: Segmentation fault

2009-04-10 Thread Justin A. Lemkul
nam kim wrote: I have a segmentation fault error while running mdrun_mpi (GROMACS 4.0.4). I installed GROMACS 4.0.4 two months ago and it has been working fine. Today I just got segmentation errors; rebooting does not help much. Here is the log: [rd:06790] *** Process received signal *** [d:06790] Signal:…

[gmx-users] mdrun_mpi error Signal: Segmentation fault

2009-04-10 Thread nam kim
I have a segmentation fault error while running mdrun_mpi (GROMACS 4.0.4). I installed GROMACS 4.0.4 two months ago and it has been working fine. Today I just got segmentation errors; rebooting does not help much. Here is the log: [rd:06790] *** Process received signal *** [d:06790] Signal: Segmentation fault…