Re: [gmx-users] running gromacs in parallel, and with nice levels

2007-10-05 Thread Carsten Kutzner
compile mdrun? There is no need to recompile or to set the nice level on dedicated nodes; it will not affect the performance. Hope that helps, regards, Carsten
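
For reference, the priority can be set either through mdrun's own -nice flag or via the shell; a minimal sketch, with the binary name and node count as placeholders:

    # dedicated nodes: run at full priority (mdrun's default nice level is 0)
    mpirun -np 4 mdrun_mpi -nice 0 -deffnm run
    # shared workstation: lower the priority instead
    nice -n 19 mdrun -deffnm run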

Re: [gmx-users] how to get CVS version gmx?

2007-08-17 Thread Carsten Kutzner
Hi Linda, you can use these commands to download from CVS: cvs -z3 -d :pserver:[EMAIL PROTECTED]:/home/gmx/cvs login Then hit Enter at the password prompt cvs -z3 -d :pserver:[EMAIL PROTECTED]:/home/gmx/cvs co gmx Carsten Zhaoyang Fu wrote: > Dear gmx users & developers, > > Would you pleas
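
The same checkout laid out on separate lines (the anonymous account is redacted as [EMAIL PROTECTED] in this archive):

    cvs -z3 -d :pserver:[EMAIL PROTECTED]:/home/gmx/cvs login
    # hit Enter at the password prompt
    cvs -z3 -d :pserver:[EMAIL PROTECTED]:/home/gmx/cvs co gmx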

Re: [gmx-users] problems with parallel mdrun

2007-08-09 Thread Carsten Kutzner
Mark Abraham wrote: > Gurpreet Singh wrote: >> I get the following errors while using the parallel version of mdrun compiled >> with openmpi. >> mpirun -np 4 mdrun_d_mpi -np 4 -v -deffnm EQUI1 >> * >> Program mdrun_d_mpi, VERSION 3.3.99_development_20070720 >> Source code file: gmx_parallel_3dfft.c
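
In the 3.3.x series the node count also had to be given at preprocessing time and match the mdrun call; a minimal sketch, with the input file names as placeholders:

    grompp -np 4 -f EQUI1.mdp -c conf.gro -p topol.top -o EQUI1.tpr
    mpirun -np 4 mdrun_d_mpi -np 4 -v -deffnm EQUI1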

Re: [gmx-users] Parallel Gromacs Benchmarking with Opteron Dual-Core & Gigabit Ethernet

2007-07-24 Thread Carsten Kutzner
Kazem Jahanbakhsh wrote: > Dear Erik, > >> Remember - compared to the benchmark numbers at www.gromacs.org, your >> bandwidth is 1/4 and the latency 4 times higher, since you have four >> cores sharing a single network connection. >> > > I agree with you about the GbE sharing between 4 cores dega

Re: [gmx-users] rotational fit in XY plane only

2007-06-28 Thread Carsten Kutzner
Hi Abu, with these commands you download the latest CVS version: cvs -z3 -d :pserver:[EMAIL PROTECTED]:/home/gmx/cvs login cvs -z3 -d :pserver:[EMAIL PROTECTED]:/home/gmx/cvs co gmx Carsten Naser, Md Abu wrote: > > I implemented an xy only fitting option for trjconv in the development >
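
In later GROMACS releases an xy-only fit became a standard trjconv option; a hedged usage sketch, with file names as placeholders:

    trjconv -s topol.tpr -f traj.xtc -fit rotxy+transxy -o fitted.xtc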

Re: [gmx-users] Gromacs website broken

2007-06-26 Thread Carsten Kutzner
pim schravendijk wrote: Hi people! The Gromacs website is seriously broken. I wanted to copy-paste the line for the cvs checkout as I usually do, but following the link to the CVS page via either the download or the developer menu gives me a page saying I don't have access and need to log in. Usin

Re: [gmx-users] request for patch and test program from 'Speeding up Parallel GROMACS'

2007-04-04 Thread Carsten Kutzner
many parts of a program, the (lam-) all-to-all just turned out to be the No. 1 candidate. Carsten

Re: [gmx-users] ordered all-to-all patch

2007-04-03 Thread Carsten Kutzner
Chris Neale wrote: Thanks for the reply. Are you sure that it is there? I checked that prior to sending the first request and I checked it again now. Ordered by date, the most recent entry is "StressCPU, version 2.0 09.02.2007" and next is "g_spatial 21.12.2006". In any event, I cleared my cache

Re: [gmx-users] ordered all-to-all patch

2007-04-03 Thread Carsten Kutzner
Hi Chris, the patch can be downloaded from the gromacs website: download -> user contributions -> contributed software -> gmx_alltoall Carsten [EMAIL PROTECTED] wrote: This is a fantastic development. I had wondered why my scaling was so much better for openmpi than for lam. Has the patch be
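
Assuming the contribution is a unified diff against the 3.3 source tree (the patch file name below is hypothetical), applying it would look like:

    cd gromacs-3.3
    patch -p1 < gmx_alltoall.patch
    # rebuild only the MPI-enabled binary
    make mdrun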

Re: [gmx-users] Problem installing gromacs-3.2.1 with pme.c patch

2007-02-08 Thread Carsten Kutzner
Hi Eva, the patch is for version 3.3 and cannot be applied to other versions. Do you really need to use a pme order other than the standard one (which is 4)? Otherwise you will not need the patch at all. Hope that helps, Carsten Eva Santos wrote: Hello everyone, I have a problem when in
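
For reference, the interpolation order is an mdp-file setting; with the default value the patch is not needed. A minimal sketch:

    coulombtype = PME
    pme_order   = 4    ; default interpolation order; the patch only matters for other values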

Re: [gmx-users] Installation Problem

2006-11-15 Thread Carsten Kutzner
Karthikeyan Pasupathy wrote: I have installed gromacs 3.3 on an Intel Core 2 Duo. I did everything. Now when I try to run any of its programs I get an error like this: "./pdb2gmx: error while loading shared libraries: libXm.so.2: cannot open shared object file: No such file or directory" export L
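
The truncated reply presumably points the dynamic linker at the Motif library; a hedged sketch, where the library path is an assumption that depends on where libXm.so.2 actually lives:

    # locate the library first, e.g. with: find /usr -name 'libXm.so.2'
    export LD_LIBRARY_PATH=/usr/X11R6/lib:$LD_LIBRARY_PATH
    ./pdb2gmx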

Re: [gmx-users] Running GROMACS in parallel

2006-11-09 Thread Carsten Kutzner

Re: [gmx-users] Problems Installing on IBM Netfinity 4500R with Redhat 8.0

2006-11-02 Thread Carsten Kutzner

Re: Fwd: [gmx-users] mdrun on several clusternodes using PME gromacs 3.3.1

2006-10-27 Thread Carsten Kutzner
…will die unhappy. Thank you so much, Joern

Re: [gmx-users] GROMACS Parallel Runs

2006-10-02 Thread Carsten Kutzner

Re: [gmx-users] Failed to allocated u bytes of aligned memory (mdrun)

2006-08-01 Thread Carsten Kutzner

Re: [gmx-users] Failed to allocated u bytes of aligned memory (mdrun)

2006-08-01 Thread Carsten Kutzner
PAUL NEWMAN wrote: > Dear all: > I ran a simulation of a polyelectrolyte with some ions; it runs ok > when I use cut-off for both vdW and Coulomb. However I get the following > error when I change it to PME. I attach my mdp file. Could someone > help me find what is wrong in my file?
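
A hedged sketch of the PME-related mdp settings such a file would contain (illustrative values, not the poster's actual file):

    coulombtype    = PME
    rcoulomb       = 0.9     ; nm
    fourierspacing = 0.12    ; nm
    pme_order      = 4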

Re: [gmx-users] mdrun hangs on nodes of P655+ aix 5.2

2006-07-31 Thread Carsten Kutzner

Re: [gmx-users] MPICH or LAM/MPI

2006-06-27 Thread Carsten Kutzner

Re: [gmx-users] MPICH or LAM/MPI

2006-06-27 Thread Carsten Kutzner

Re: [gmx-users] problem with LAM

2006-06-27 Thread Carsten Kutzner
…you run something like 'mpirun -np 2 ls'? Carsten
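
Testing the LAM/MPI installation independently of GROMACS; a minimal sketch, with the hostfile name as a placeholder:

    lamboot -v hostfile    # start the LAM daemons on all listed nodes
    mpirun -np 2 ls        # any trivial program will do for the test
    lamhalt                # shut the daemons down again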

Re: [gmx-users] MPICH or LAM/MPI

2006-06-20 Thread Carsten Kutzner

Re: [gmx-users] MPICH or LAM/MPI

2006-06-19 Thread Carsten Kutzner
computer nodes. Carsten -Original Message- From: Carsten Kutzner <[EMAIL PROTECTED]> To: Discussion list for GROMACS users Sent: Mon, 19 Jun 2006 10:28:51 +0200 Subject: Re: [gmx-users] MPICH or LAM/MPI Hello Hector, since it does not take long to install lam and mpich, I would i

Re: [gmx-users] MPICH or LAM/MPI

2006-06-19 Thread Carsten Kutzner

Re: [gmx-users] Paralellization limit?

2006-04-19 Thread Carsten Kutzner
Andrea Carotti wrote: Hi Carsten, thanks for your quick reply. Could you please confirm that gromacs 3.3.1 works and compiles fine with mpich 2.x? Yes, it compiles and runs in parallel. The only difference is that I use FFTW3. Carsten Because this is the first time that I hear that from a user a
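
A hedged sketch of a matching 3.3.1 build configuration (the FFTW3 install path is a placeholder):

    export CPPFLAGS=-I$HOME/fftw3/include LDFLAGS=-L$HOME/fftw3/lib
    ./configure --enable-mpi --program-suffix=_mpi
    make mdrun && make install-mdrun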

Re: [gmx-users] Paralellization limit?

2006-04-19 Thread Carsten Kutzner
Hi Andrea, Andrea Carotti wrote: Hi all, I'm trying to simulate a system with two identical proteins (42 aa each), spc solvent (18430 molecules) and 6 NA+ ions... for a total of ~56100 atoms. Now the "problem" is that if I run the MD on 4 nodes everything works fine, but when I try to use 6 or
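
One plausible cause, offered as an assumption rather than the thread's confirmed answer: the 3.3 series decomposes the PME grid into slabs along x, so runs can fail when fourier_nx is not divisible by the node count. A grid that divides evenly on both 4 and 6 nodes:

    fourier_nx = 48    ; divisible by 4 and by 6
    fourier_ny = 48
    fourier_nz = 48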
