On Jun 8, 2010, at 3:06 PM, Jeff Squyres wrote:

> I know nothing about Gromacs, but you might want to ensure that your Gromacs 
> was compiled with Open MPI.  A common symptom of "mpirun -np 4 
> my_mpi_application" running 4 1-process MPI jobs (instead of 1 4-process MPI 
> job) is that you compiled my_mpi_application with one MPI implementation, but 
> then used the mpirun from a different MPI implementation.
> 
Hi,

This can be checked by looking at the Gromacs output file md.log. Its second
line should read something like

Host: <somename> pid: <somepid> nodeid: 0 nnodes: 4

Lauren, you will want to ensure that nnodes is 4 in your case, and not 1.
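
Independent of Gromacs, you can check whether the mdrun binary and the mpirun
you call belong to the same MPI installation. A quick sanity check (assuming a
Linux system and a dynamically linked binary; mdrun_mpi is the binary name from
your command line):

which mpirun                          # which launcher comes first in your PATH?
mpirun --version                      # Open MPI's mpirun reports "mpirun (Open MPI) x.y.z"
ldd `which mdrun_mpi` | grep -i mpi   # which MPI library is the binary linked against?

If the MPI library that ldd reports does not come from the same installation as
your mpirun, you have exactly the mismatch Jeff described.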

You can also easily test that without any input file by typing

mpirun -np 4 mdrun -h

and you should then see

NNODES=4, MYRANK=0, HOSTNAME=<...>
NNODES=4, MYRANK=1, HOSTNAME=<...>
NNODES=4, MYRANK=2, HOSTNAME=<...>
NNODES=4, MYRANK=3, HOSTNAME=<...>
...
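
If instead every process prints NNODES=1, MYRANK=0, then mdrun was most likely
not compiled against the Open MPI you are launching with. As a rough sketch,
rebuilding Gromacs 4.0.x with MPI support looks like this (the standard
autoconf build; the _mpi program suffix is only a convention, and CC=mpicc
assumes Open MPI's compiler wrapper is in your PATH):

./configure --enable-mpi --program-suffix=_mpi CC=mpicc
make
make install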


Carsten


> 
> On Jun 8, 2010, at 8:59 AM, lauren wrote:
> 
>> 
>> The version of Gromacs is 4.0.7.
>> This is the first time I'm using Gromacs, so excuse me if I'm talking nonsense.
>> 
>> Which part of the md.log output should I post: the part before or after
>> the input description?
>> 
>> Thanks to all,
>> and sorry.
>> 
>> From: Carsten Kutzner <ckut...@gwdg.de>
>> To: Open MPI Users <us...@open-mpi.org>
>> Sent: Sunday, June 6, 2010, 9:51:26
>> Subject: Re: [OMPI users] Gromacs run in parallel
>> 
>> Hi,
>> 
>> Which version of Gromacs is this? Could you post the first lines of 
>> the md.log output file?
>> 
>> Carsten
>> 
>> 
>> On Jun 5, 2010, at 10:23 PM, lauren wrote:
>> 
>>> Sorry for my English...
>>> 
>>> I want to know how I can run Gromacs in parallel!
>>> When I use
>>> 
>>> mdrun &
>>> mpiexec -np 4 mdrun_mpi -v -deffnm em
>>> 
>>> to run the minimization on 4 cores, all cores do the same job again!
>>> They don't run together.
>>> I want them all to work in parallel to make the job faster.
>>> 
>>> 
>>> What could be wrong?
>>> 
>>> Thanks a lot!
>>> 
>>> 
>>> 
>> 
>> 
>> 
> 
> 
> -- 
> Jeff Squyres
> jsquy...@cisco.com
> For corporate legal information go to:
> http://www.cisco.com/web/about/doing_business/legal/cri/
> 
> 


--
Dr. Carsten Kutzner
Max Planck Institute for Biophysical Chemistry
Theoretical and Computational Biophysics
Am Fassberg 11, 37077 Goettingen, Germany
Tel. +49-551-2012313, Fax: +49-551-2012302
http://www.mpibpc.mpg.de/home/grubmueller/ihp/ckutzne




