On 19 June 2013 23:52, Reuti <re...@staff.uni-marburg.de> wrote:
> Am 19.06.2013 um 22:14 schrieb Riccardo Murri:
>
>> On 19 June 2013 20:42, Reuti <re...@staff.uni-marburg.de> wrote:
>>> Am 19.06.2013 um 19:43 schrieb Riccardo Murri <riccardo.mu...@uzh.ch>:
>>>
>>>> On 19 June 2013 16:01, Ralph Castain <r...@open-mpi.org> wrote:
>>>>> How is OMPI picking up this hostfile? It isn't being specified on the cmd 
>>>>> line - are you running under some resource manager?
>>>>
>>>> Via the environment variable `OMPI_MCA_orte_default_hostfile`.
>>>>
>>>> We're running under SGE, but disable the OMPI/SGE integration (rather
>
> BTW: Which version of SGE?

SGE 6.2u4 running under Rocks 5.3:

    $ qstat -help | head -1
    GE 6.2u4

    $ cat /etc/rocks-release
    Rocks release 5.3 (Rolled Tacos)


>> It's enabled but (IIRC) the problem is that OpenMPI detects the
>> presence of SGE from some environment variable
>
> Correct.
>
>
>> , which, in our version
>> of SGE, simply isn't there.
>
> Do you use a custom "starter_method" in the queue definition?

No custom starter_method.
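FWIW, a quick sanity check one can run inside a submitted job to see whether the variables that Open MPI's SGE detection presumably inspects are present -- the exact set of variable names here is my assumption, not verified against the OMPI source:

```shell
#!/bin/sh
# Sketch only: print the SGE-related variables that Open MPI's
# gridengine support is (to my understanding) looking for.
# The variable names below are an assumption on my part.
for v in SGE_ROOT ARC PE_HOSTFILE JOB_ID; do
    eval "val=\${$v:-<unset>}"
    echo "$v=$val"
done
```

If any of these print `<unset>` inside the job, that would explain OMPI silently skipping the SGE integration.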


> Does a submitted script with:
>
> #!/bin/sh
> env
>
> list at least some of the SGE* environment variables - or none at all?

Quite a few SGE_* variables are in the environment:

    $ cat env.sh
    env | sort

    $ qsub -pe mpi 2 env.sh
    Your job 29590 ("env.sh") has been submitted

    $ egrep ^SGE_ env.sh.o29590
    SGE_ACCOUNT=sge
    SGE_ARCH=lx26-amd64
    ...

However, I cannot reproduce the issue now -- it's quite possible that
it originated on an older cluster (now decommissioned) and we just
kept the submission script on newer hardware without checking.
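For the record, the default-hostfile mechanism I mentioned earlier works roughly like this -- paths and host names below are purely illustrative:

```shell
#!/bin/sh
# Illustrative sketch: make Open MPI pick up a hostfile via the
# MCA environment variable, with no -hostfile on the command line.
cat > /tmp/ompi_hosts <<EOF
node01 slots=4
node02 slots=4
EOF
export OMPI_MCA_orte_default_hostfile=/tmp/ompi_hosts
# mpirun now consults this hostfile by default, e.g.:
#   mpirun -np 8 ./a.out
echo "Default hostfile: $OMPI_MCA_orte_default_hostfile"
```

Since the variable is exported by our submission script, OMPI picks the hostfile up even though nothing is specified on the mpirun command line.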

Thanks for the help,
Riccardo
