On Monday, 26 April 2021 2:12:41 PM PDT John DeSantis wrote:
> Furthermore,
> searching the mailing list suggests that the appropriate method is to use
> `salloc` first, despite version 17.11.9 not needing `salloc` for an
> "interactive" sessions.
Before 20.11, with `salloc` you needed to set a Sall
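For context, the pre-20.11 interactive pattern typically meant setting `SallocDefaultCommand` in slurm.conf so that `salloc` dropped the user into a shell on a compute node rather than on the submit host. A minimal sketch, assuming a typical configuration (the exact srun options vary by site):

```shell
# slurm.conf fragment (pre-20.11): have `salloc` launch a pseudo-terminal
# shell on the first allocated node instead of a local shell.
SallocDefaultCommand="srun -n1 -N1 --pty --preserve-env $SHELL"
```

With that set, a bare `salloc -N2` lands the user in an interactive shell inside the allocation. From 20.11 onward, `LaunchParameters=use_interactive_step` serves a similar purpose.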
Jürgen,
>> does it work with `srun --overlap ...` or if you do `export SLURM_OVERLAP=1`
>> before running your interactive job?
I tested yesterday with the `--overlap` flag, but that didn't change
anything. However, exporting the variable instead seems to have corrected the
issue:
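For the archives, the working sequence was roughly the following (a sketch; the node and task counts are illustrative, and this only runs on a SLURM cluster):

```shell
# Allow job steps to share resources with the interactive step,
# equivalent to passing --overlap to each srun invocation.
export SLURM_OVERLAP=1

# Request an interactive two-node allocation, then launch MPI inside it.
salloc -N 2 -n 4
mpirun ./mpi_hello
```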
Hi John,
does it work with `srun --overlap ...` or if you do `export SLURM_OVERLAP=1`
before running your interactive job?
Best regards
Jürgen
* John DeSantis [210428 09:41]:
> Hello all,
>
> Just an update, the following URL almost mirrors the issue we're seeing:
> https://github.com/open-mpi/ompi/issues/8378
I haven't experienced this issue here. Then again, we've been using PMIx
to launch MPI for a while now, so we may have circumvented this
particular issue.
-Paul Edmon-
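Launching through SLURM's PMIx plugin, as Paul describes, bypasses `mpirun` entirely. A sketch, assuming both OpenMPI and SLURM were built against the same PMIx (only runnable on such a cluster):

```shell
# List the MPI plugins this SLURM build supports.
srun --mpi=list

# Launch the MPI program directly as a job step via PMIx, no mpirun involved.
srun --mpi=pmix -N 2 -n 4 ./mpi_hello
```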
On 4/28/2021 9:41 AM, John DeSantis wrote:
Hello all,
Just an update, the following URL almost mirrors the issue we're seeing:
https://github.com/open-mpi/ompi/issues/8378
Hello all,
Just an update, the following URL almost mirrors the issue we're seeing:
https://github.com/open-mpi/ompi/issues/8378
However, SLURM 20.11.3 shipped with the fix; I've verified that the changes
are in the source code.
We don't want to have to downgrade SLURM to 20.02.x, but it seems
Hello all,
We've recently (don't laugh!) updated two of our SLURM installations from
16.05.10-2 to 20.11.3 and 17.11.9, respectively. Now, on the latest version
20.11.3, OpenMPI no longer seems to function in interactive mode across
multiple nodes as it did previously; using `srun` and `mpirun` o
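A minimal reproduction of the previously-working interactive workflow might look like this (the program name is hypothetical and the commands are illustrative; they require an actual SLURM cluster):

```shell
# Interactive allocation on two nodes.
salloc -N 2

# Inside the allocation: launch a trivial step across the nodes...
srun -n 2 hostname

# ...or launch OpenMPI via its SLURM support.
mpirun -np 2 ./mpi_hello
```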