Hi Alan,
thanks for the hint! It seems it really was a problem with LAM/MPI. The install from Fink worked beautifully, and GROMACS with OpenMPI works perfectly.
Cheers
Steffen
Steffen, if you don't intend to use GMX double precision, why not use
Gromacs from Fink? It's all there. Use openmpi as well, instead of lammpi
(which seems to be broken in Fink too).

Then you can do something like:

# with openmpi, on a dual-core machine
grompp -f em.mdp -c Complex_b4em.pdb -p Complex.top -o em.tpr -np 2
om-mpirun -n 2 mdrun_mpi -v -deffnm em

grompp -f md.mdp -c em.gro -p Complex.top -o md.tpr -np 2 -sort -shuffle
om-mpirun -n 2 mdrun_mpi -v -deffnm md
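
For the eight-core Mac Pro described in the original message below, the same pattern should scale by raising the process count; a minimal sketch, assuming the same Fink-installed om-mpirun and mdrun_mpi wrappers as above:

# with openmpi, for eight cores (untested sketch)
grompp -f md.mdp -c em.gro -p Complex.top -o md.tpr -np 8 -sort -shuffle
om-mpirun -n 8 mdrun_mpi -v -deffnm md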

Cheers,
Alan

On Thu, Jul 31, 2008 at 4:40 PM, <[EMAIL PROTECTED]> wrote:

Hi all,
I'm having trouble with the compilation of GROMACS on a Mac Pro (OS X).
It's a machine with two quad-core Intel Xeons, so 8 cores in all. As the
compiler I used the toolchain delivered with the Xcode tools 3.0, and for
parallelisation the LAM/MPI package delivered with the Xcode tools. The
problem now is: if I try to start a job on the 8 cores, the program is
started 8 times (once on each core) instead of once, distributed over the
8 cores. The result is the same for GROMACS 3.3.1, 3.3.3 and the recent
CVS. Does anybody have a suggestion? Is the problem more related to LAM
than to GROMACS?
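
(That symptom, N independent copies instead of one parallel run, usually means mdrun was not actually built against the MPI library, or the LAM daemons were not booted. A minimal sketch of the usual checks, assuming a working LAM/MPI install and a GROMACS 3.3.x source tree; the _mpi suffix is illustrative:)

laminfo | head -n 5                              # confirm which LAM/MPI install is picked up
lamboot                                          # start the LAM daemon on this host
./configure --enable-mpi --program-suffix=_mpi   # build an MPI-enabled mdrun
make mdrun && make install-mdrun
grompp -f md.mdp -c em.gro -p Complex.top -o md.tpr -np 8
mpirun -np 8 mdrun_mpi -v -deffnm md             # one run distributed over 8 cores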
Greetings
Steffen




--
Dipl.-Chem. Steffen Wolf
Fellow of the Ruhr-University Research School
Department of Biophysics
University of Bochum
ND 04/67
44780 Bochum
Germany
Tel: +49 (0)234 32 28363
Fax: +49 (0)234 32 14626
E-Mail: [EMAIL PROTECTED]
Web: http://www.bph.rub.de