Re: [gmx-users] Re: Running Gromacs in Clusters

2012-11-07 Thread Marcelo Depolo
I thought that at first, but other software runs in parallel. If there's a
problem, it's somehow in PBS.

My guess is that my PBS doesn't allow the LAM library to see the other
nodes, but I have no clue where the problem could be.
I've tried eliminating node=X and got the error. I've tried using
node=2 (or any number higher than 1), and the job just sits in the queue
even when there are empty nodes.
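Before recompiling, it may be worth asking PBS itself why the job is held back. These are standard Torque/PBS query commands (the job ID below is hypothetical; substitute the real one from qsub's output):

```shell
pbsnodes -a     # list every node and its state (free, job-exclusive, down, offline)
qstat -f 1234   # full status of a queued job, including the scheduler's hold reason
qstat -Q        # per-queue limits; a queue may cap nodes/ppn below what was requested
```

If pbsnodes reports nodes as down or offline, or qstat -Q shows a node limit of 1, the queueing behaviour described above would be explained without any MPI problem at all.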

I'm almost ready to compile GROMACS again, but this time with OpenMPI...
-- 
Marcelo Depólo
--
gmx-users mailing list gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
* Please don't post (un)subscribe requests to the list. Use the
www interface or send it to gmx-users-requ...@gromacs.org.
* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists


Fwd: [gmx-users] Re: Running Gromacs in Clusters

2012-11-07 Thread Dr. Vitaly Chaban
On Wed, Nov 7, 2012 at 11:24 PM, Marcelo Depolo marcelodep...@gmail.com wrote:
 I thought that at first, but other software runs in parallel. If there's a
 problem, it's somehow in PBS.

 My guess is that my PBS doesn't allow the LAM library to see the other
 nodes, but I have no clue where the problem could be.

I would be very surprised if this were true. The normal sequence of
events during the submission process is the following:

1) The system looks into your submission script and finds out the
resource requirements.

2) If the requirements are met, the job gets R (running) status and the
remaining commands (those that do not start with #PBS) are executed.

3) If there is a problem with the Message Passing Interface (MPI) or the
[scientific] code, the job dies with an MPI-specific error message, a
code-specific message, or usually both.

From what I see in your report, the error message comes from PBS, i.e.
neither MPI nor GROMACS is ever launched.
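For reference, a minimal PBS submission script for a LAM/MPI GROMACS run might look like the sketch below. The job name, queue, resource counts, and input file name are assumptions that must be adapted to the cluster and simulation in question:

```shell
#!/bin/bash
#PBS -N gromacs-test       # job name (assumed)
#PBS -l nodes=2:ppn=4      # resource request: 2 nodes, 4 cores each (assumed values)
#PBS -l walltime=01:00:00  # wall-clock limit
#PBS -q batch              # queue name (assumption; check with your admin)

# PBS starts the job in $HOME by default, so change to the submission directory
cd $PBS_O_WORKDIR

# LAM/MPI needs its daemons booted on the allocated nodes before mpirun;
# $PBS_NODEFILE lists the hosts PBS assigned to this job
lamboot $PBS_NODEFILE

# launch one mdrun process per allocated core (hypothetical input name)
mpirun -np 8 mdrun_mpi -deffnm topol

# shut the LAM daemons down again
lamhalt
```

If a script like this is rejected before the first non-#PBS line ever runs, the problem is the resource request (or the queue's limits), not LAM or GROMACS.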


--
Dr. Vitaly V. Chaban
MEMPHYS - Center for Biomembrane Physics
Department of Physics, Chemistry and Pharmacy
University of Southern Denmark
Campusvej 55, 5230 Odense M, Denmark


[gmx-users] Re: Running Gromacs in Clusters

2012-11-07 Thread Dr. Vitaly Chaban
On Wed, Nov 7, 2012 at 11:48 PM, Dr. Vitaly Chaban vvcha...@gmail.com wrote:
 On Wed, Nov 7, 2012 at 11:24 PM, Marcelo Depolo marcelodep...@gmail.com 
 wrote:
 I thought that at first, but other software runs in parallel. If there's a
 problem, it's somehow in PBS.

 My guess is that my PBS doesn't allow the LAM library to see the other
 nodes, but I have no clue where the problem could be.

 I would be very surprised if this were true. The normal sequence of
 events during the submission process is the following:

 1) The system looks into your submission script and finds out the
 resource requirements.

 2) If the requirements are met, the job gets R (running) status and the
 remaining commands (those that do not start with #PBS) are executed.

 3) If there is a problem with the Message Passing Interface (MPI) or the
 [scientific] code, the job dies with an MPI-specific error message, a
 code-specific message, or usually both.

 From what I see in your report, the error message comes from PBS, i.e.
 neither MPI nor GROMACS is ever launched.



Are you saying that other programs on your cluster run successfully
on multiple nodes using the same submission script (the #PBS part), and
only the GROMACS jobs complain about a lack of resources? I find that
hard to believe...

