Hello Gilles,

Thanks for your response. I'm testing with 20 tasks, each using 8 threads. When using a single node or only a few nodes, we do not see these errors either.

Attached are the Slurm script we used (which also reports the environment variables) and the output logs from three different runs: srun, mpirun, and mpirun --mca ...
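
For reference, the launch section of the attached script looks roughly like this. This is only a minimal sketch reconstructed from the reported environment (10 nodes, 2 tasks per node, 8 CPUs per task, SLURM_MPI_TYPE=pmix); the mpi_test_suite arguments shown for the two mpirun runs follow the command you suggested and are illustrative, the exact invocations are in the attachment:

    #!/bin/bash
    #SBATCH --nodes=10
    #SBATCH --ntasks-per-node=2
    #SBATCH --cpus-per-task=8          # 20 MPI tasks with 8 threads each
    export OMP_NUM_THREADS=8

    echo "==== SRUN"
    srun ./mpi_test_suite              # uses Slurm's PMIx plugin (SLURM_MPI_TYPE=pmix)

    echo "==== MPIRUN"
    mpirun -np $SLURM_NTASKS ./mpi_test_suite -t collective

    echo "==== MPIRUN --mca"
    mpirun --mca pml ob1 --mca btl tcp,self -np $SLURM_NTASKS ./mpi_test_suite -t collective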

It is correct that we do not see this issue when running with mpirun; the errors are only observed when running with "srun".
Moreover, I notice that fewer tests are performed when using mpirun, presumably because those runs used the suggested "-t collective" option, which restricts the suite to the collective test class, while the srun run exercised all test classes.

From that we conclude that the issue is related to the interaction between Slurm and Open MPI.
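
To narrow this down further, one check we can run (a suggestion on my side, not part of the attached script) is which PMI plugin srun actually uses to wire up the processes, since our Open MPI 4.1.x builds are configured with PMIx support:

    srun --mpi=list                    # list the PMI plugins this Slurm installation supports
    srun --mpi=pmix ./mpi_test_suite   # pin the PMIx plugin explicitly (our SLURM_MPI_TYPE=pmix default)

If the failures change with the plugin selected here, that would point even more clearly at the launcher integration rather than at Open MPI itself.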

Switching from srun to mpirun also has some negative implications w.r.t. scheduling and robustness. Therefore, we would like to start the job with srun.


Cheers,
    Alois

On 5/3/22 at 12:52, Gilles Gouaillardet via users wrote:
Alois,

Thanks for the report.

FWIW, I am not seeing any errors on my Mac with Open MPI from brew (4.1.3)

How many MPI tasks are you running?
Can you please confirm you can reproduce the error with

mpirun -np <number_of_processes> ./mpi_test_suite -d MPI_TYPE_MIX_ARRAY -c 0 -t collective


Also, can you try the same command with
mpirun --mca pml ob1 --mca btl tcp,self ...
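
i.e., combined with the command above:

mpirun --mca pml ob1 --mca btl tcp,self -np <number_of_processes> ./mpi_test_suite -d MPI_TYPE_MIX_ARRAY -c 0 -t collective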

Cheers,

Gilles

On Tue, May 3, 2022 at 7:08 PM Alois Schlögl via users <users@lists.open-mpi.org> wrote:


    Within our cluster (debian10/slurm16, debian11/slurm20) with InfiniBand, we have several instances of openmpi installed through the Lmod module system. When testing the openmpi installations with the mpi-test-suite 1.1 [1], it shows errors like these

    ...
    (Rank:0) tst_test_array[45]:Allreduce Min/Max with MPI_IN_PLACE
    (Rank:0) tst_test_array[46]:Allreduce Sum
    (Rank:0) tst_test_array[47]:Alltoall
    Number of failed tests: 130
    Summary of failed tests:
    ERROR class:P2P test:Ring Send Pack (7), comm Duplicated MPI_COMM_WORLD (4), type MPI_TYPE_MIX (27) number of values:1000
    ERROR class:P2P test:Ring Send Pack (7), comm Duplicated MPI_COMM_WORLD (4), type MPI_TYPE_MIX_ARRAY (28) number of values:1000
    ...

    when using openmpi/4.1.x (I tested with 4.1.1 and 4.1.3). The number of errors may vary, but the first errors are always about
        ERROR class:P2P test:Ring Send Pack (7), comm Duplicated MPI_COMM_WORLD

    When testing with openmpi/3.1.3, the tests run successfully and there are no failed tests.

    Typically, the openmpi/4.1.x installation is configured with
             ./configure --prefix=${PREFIX} \
                     --with-ucx=$UCX_HOME \
                     --enable-orterun-prefix-by-default  \
                     --enable-mpi-cxx \
                     --with-hwloc \
                     --with-pmi \
                     --with-pmix \
                     --with-cuda=$CUDA_HOME \
                     --with-slurm

    but I've also tried different configure options, including with and without --enable-mpi1-compatibility, with and without UCX, and using hwloc from the OS or compiled from source. I could not identify any pattern.

    Therefore, I'd like to ask you what the issue might be. Specifically, I would like to know:

    - Am I right in assuming that the mpi-test-suite [1] is suitable for testing openmpi?
    - What are possible causes for this type of error?
    - What would you recommend for debugging these issues?

    Kind regards,
       Alois


    [1] https://github.com/open-mpi/mpi-test-suite/t

Attachment: job-mpi-test3.sh
Description: application/shellscript

delta197
/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/bin/ompi_info
 running on 20*8 cores with 20 MPI-tasks and 8 threads
SHELL=/bin/bash
SLURM_JOB_USER=schloegl
SLURM_TASKS_PER_NODE=2(x10)
SLURM_JOB_UID=10103
SLURM_TASK_PID=50793
PKG_CONFIG_PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/lib/pkgconfig:/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/lib/pkgconfig:/mnt/nfs/clustersw/shared/cuda/11.2.2/pkgconfig
SLURM_LOCALID=0
SLURM_SUBMIT_DIR=/nfs/scistore16/jonasgrp/schloegl/slurm
HOSTNAME=delta197
LANGUAGE=en_US:en
SLURMD_NODENAME=delta197
_ModuleTable002_=ewpmbiA9ICIvbW50L25mcy9jbHVzdGVyc3cvRGViaWFuL2J1bGxzZXllL21vZHVsZWZpbGVzL0NvcmUvaHdsb2MvMi43LjEubHVhIiwKZnVsbE5hbWUgPSAiaHdsb2MvMi43LjEiLApsb2FkT3JkZXIgPSAzLApwcm9wVCA9IHt9LApzdGFja0RlcHRoID0gMSwKc3RhdHVzID0gImFjdGl2ZSIsCnVzZXJOYW1lID0gImh3bG9jLzIuNy4xIiwKd1YgPSAiMDAwMDAwMDAyLjAwMDAwMDAwNy4wMDAwMDAwMDEuKnpmaW5hbCIsCn0sCm9wZW5tcGkgPSB7CmZuID0gIi9tbnQvbmZzL2NsdXN0ZXJzdy9EZWJpYW4vYnVsbHNleWUvbW9kdWxlZmlsZXMvQ29yZS9vcGVubXBpLzQuMS4zZC5sdWEiLApmdWxsTmFtZSA9ICJvcGVubXBpLzQuMS4zZCIsCmxvYWRPcmRlciA9IDQsCnByb3BUID0g
MPICC=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/bin/mpicc
__LMOD_REF_COUNT_MODULEPATH=/mnt/nfs/clustersw/Debian/bullseye/modulefiles/MPI/openmpi/4.1.3d:1;/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Linux:1;/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Core:1;/mnt/nfs/clustersw/Debian/bullseye/lmod/lmod/modulefiles/Core:1
OMPI_MCA_btl=self,openib
_ModuleTable005_=c3cvRGViaWFuL2J1bGxzZXllL2xtb2QvbG1vZC9tb2R1bGVmaWxlcy9Db3JlIiwKfQo=
__LMOD_REF_COUNT_LD_RUN_PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/lib:1;/mnt/nfs/clustersw/Debian/bullseye/cuda/11.2/ucx/1.12.1/lib:1
SLURM_NODE_ALIASES=(null)
SLURM_CLUSTER_NAME=istscicomp
OMPI_MCA_btl_openib_allow_ib=1
SLURM_CPUS_ON_NODE=16
__LMOD_REF_COUNT__LMFILES_=/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Core/ucx/1.12.1.lua:1;/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Core/cuda/11.2.2.lua:1;/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Core/hwloc/2.7.1.lua:1;/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Core/openmpi/4.1.3d.lua:1
SLURM_JOB_CPUS_PER_NODE=16(x10)
MPICXX=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/bin/mpicxx
LMOD_DIR=/mnt/nfs/clustersw/Debian/bullseye/lmod/lmod/libexec
MPIFC=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/bin/mpifort
PWD=/nfs/scistore16/jonasgrp/schloegl/slurm
SLURM_GTIDS=0
LOGNAME=schloegl
UCX_NET_DEVICES=ibp59s0
XDG_SESSION_TYPE=unspecified
SLURM_JOB_PARTITION=defaultp
MODULESHOME=/mnt/nfs/clustersw/Debian/bullseye/lmod/lmod
MANPATH=/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/share/man:/mnt/nfs/clustersw/Debian/bullseye/lmod/lmod/share/man::
NUM_CORES=20*8
SLURM_JOB_NUM_NODES=10
OMPI_MCA_pml=ob1
OPENBLAS_NUM_THREADS=8
SLURM_JOBID=33082
SLURM_JOB_QOS=normal
MPI_HOME=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d
__LMOD_REF_COUNT_PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/bin:1;/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/sbin:1;/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/bin:1;/mnt/nfs/clustersw/shared/cuda/11.2.2/bin:1;/mnt/nfs/clustersw/Debian/bullseye/cuda/11.2/ucx/1.12.1/bin:1;/nfs/scistore16/jonasgrp/schloegl/bin:1;/usr/local/bin:1;/usr/bin:1;/bin:1;/usr/local/games:1;/usr/games:1
HOME=/nfs/scistore16/jonasgrp/schloegl
_ModuleTable_Sz_=5
__LMOD_REF_COUNT_LIBRARY_PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/lib:1;/mnt/nfs/clustersw/Debian/bullseye/cuda/11.2/ucx/1.12.1/lib:1
LANG=en_US.UTF-8
__LMOD_REF_COUNT_LOADEDMODULES=ucx/1.12.1:1;cuda/11.2.2:1;hwloc/2.7.1:1;openmpi/4.1.3d:1
__LMOD_REF_COUNT_PKG_CONFIG_PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/lib/pkgconfig:1;/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/lib/pkgconfig:1;/mnt/nfs/clustersw/shared/cuda/11.2.2/pkgconfig:1
SLURM_PROCID=0
CUDA_DEVICE_ORDER=PCI_BUS_ID
MAN_PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/share/man:/mnt/nfs/clustersw/shared/cuda/11.2.2/share/man
__LMOD_REF_COUNT_CPATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/include:1;/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/include:1;/mnt/nfs/clustersw/shared/cuda/11.2.2/targets/x86_64-linux/include:1;/mnt/nfs/clustersw/shared/cuda/11.2.2/include:1;/mnt/nfs/clustersw/Debian/bullseye/cuda/11.2/ucx/1.12.1/include:1
LMOD_SETTARG_FULL_SUPPORT=no
UCX_HOME=/mnt/nfs/clustersw/Debian/bullseye/cuda/11.2/ucx/1.12.1
TMPDIR=/tmp
SLURM_CPUS_PER_TASK=8
SLURM_NTASKS=20
SLURM_TOPOLOGY_ADDR=sw11.delta197
LMOD_VERSION=8.6.12
_ModuleTable003_=e30sCnN0YWNrRGVwdGggPSAwLApzdGF0dXMgPSAiYWN0aXZlIiwKdXNlck5hbWUgPSAib3Blbm1waS80LjEuM2QiLAp3ViA9ICIwMDAwMDAwMDQuMDAwMDAwMDAxLjAwMDAwMDAwMy4qZC4qemZpbmFsIiwKfSwKdWN4ID0gewpmbiA9ICIvbW50L25mcy9jbHVzdGVyc3cvRGViaWFuL2J1bGxzZXllL21vZHVsZWZpbGVzL0NvcmUvdWN4LzEuMTIuMS5sdWEiLApmdWxsTmFtZSA9ICJ1Y3gvMS4xMi4xIiwKbG9hZE9yZGVyID0gMSwKcHJvcFQgPSB7fSwKc3RhY2tEZXB0aCA9IDEsCnN0YXR1cyA9ICJhY3RpdmUiLAp1c2VyTmFtZSA9ICJ1Y3gvMS4xMi4xIiwKd1YgPSAiMDAwMDAwMDAxLjAwMDAwMDAxMi4wMDAwMDAwMDEuKnpmaW5hbCIsCn0sCn0sCm1wYXRoQSA9IHsKIi9tbnQv
MODULEPATH_ROOT=/mnt/nfs/clustersw/Debian/bullseye/modulefiles
SLURM_TOPOLOGY_ADDR_PATTERN=switch.node
XDG_SESSION_CLASS=background
LMOD_PKG=/mnt/nfs/clustersw/Debian/bullseye/lmod/lmod
SLURM_MEM_PER_NODE=2048
OMPI_MCA_opal_warn_on_missing_libcuda=0
SLURM_WORKING_CLUSTER=istscicomp:10.36.192.126:6817:9216:101
USER=schloegl
LIBRARY_PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/lib:/mnt/nfs/clustersw/Debian/bullseye/cuda/11.2/ucx/1.12.1/lib
SLURM_NODELIST=delta[197-206]
OMP_MCA_mpi_param_check=YES
ENVIRONMENT=BATCH
LOADEDMODULES=ucx/1.12.1:cuda/11.2.2:hwloc/2.7.1:openmpi/4.1.3d
SLURM_JOB_ACCOUNT=itgrp
SLURM_PRIO_PROCESS=0
SLURM_NPROCS=20
LMOD_ROOT=/mnt/nfs/clustersw/Debian/bullseye/lmod
SHLVL=2
SLURM_NNODES=10
LD_RUN_PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/lib:/mnt/nfs/clustersw/Debian/bullseye/cuda/11.2/ucx/1.12.1/lib
BASH_ENV=/mnt/nfs/clustersw/Debian/bullseye/lmod/lmod/init/bash
__LMOD_REF_COUNT_MAN_PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/share/man:1;/mnt/nfs/clustersw/shared/cuda/11.2.2/share/man:1
LMOD_sys=Linux
DISTRIB_ID=Debian
__LMOD_REF_COUNT_MANPATH=/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/share/man:1;/mnt/nfs/clustersw/Debian/bullseye/lmod/lmod/share/man:1
XDG_SESSION_ID=c36
SLURM_SUBMIT_HOST=gpu114
_ModuleTable001_=X01vZHVsZVRhYmxlXyA9IHsKTVR2ZXJzaW9uID0gMywKY19yZWJ1aWxkVGltZSA9IGZhbHNlLApjX3Nob3J0VGltZSA9IGZhbHNlLApkZXB0aFQgPSB7fSwKZmFtaWx5ID0gewpNUEkgPSAib3Blbm1waSIsCn0sCm1UID0gewpjdWRhID0gewpmbiA9ICIvbW50L25mcy9jbHVzdGVyc3cvRGViaWFuL2J1bGxzZXllL21vZHVsZWZpbGVzL0NvcmUvY3VkYS8xMS4yLjIubHVhIiwKZnVsbE5hbWUgPSAiY3VkYS8xMS4yLjIiLApsb2FkT3JkZXIgPSAyLApwcm9wVCA9IHt9LApzdGFja0RlcHRoID0gMSwKc3RhdHVzID0gImFjdGl2ZSIsCnVzZXJOYW1lID0gImN1ZGEvMTEuMi4yIiwKd1YgPSAiXjAwMDAwMDExLjAwMDAwMDAwMi4wMDAwMDAwMDIuKnpmaW5hbCIsCn0sCmh3bG9jID0g
MPIEXEC=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/bin/mpiexec
MPIRUN=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/bin/mpirun
DISTRIB_RELEASE=11
LD_LIBRARY_PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/lib:/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/lib:/mnt/nfs/clustersw/shared/cuda/11.2.2/extras/CUPTI/lib64:/mnt/nfs/clustersw/Debian/bullseye/cuda/11.2/ucx/1.12.1/lib:/mnt/nfs/clustersw/shared/cuda/11.2.2/targets/x86_64-linux/lib:/mnt/nfs/clustersw/shared/cuda/11.2.2/lib64
XDG_RUNTIME_DIR=/run/user/10103
SLURM_JOB_ID=33082
SLURM_NODEID=0
_ModuleTable004_=bmZzL2NsdXN0ZXJzdy9EZWJpYW4vYnVsbHNleWUvbW9kdWxlZmlsZXMvTVBJL29wZW5tcGkvNC4xLjNkIgosICIvbW50L25mcy9jbHVzdGVyc3cvRGViaWFuL2J1bGxzZXllL21vZHVsZWZpbGVzL0xpbnV4IgosICIvbW50L25mcy9jbHVzdGVyc3cvRGViaWFuL2J1bGxzZXllL21vZHVsZWZpbGVzL0NvcmUiLCAiL21udC9uZnMvY2x1c3RlcnN3L0RlYmlhbi9idWxsc2V5ZS9sbW9kL2xtb2QvbW9kdWxlZmlsZXMvQ29yZSIsCn0sCnN5c3RlbUJhc2VNUEFUSCA9ICIvbW50L25mcy9jbHVzdGVyc3cvRGViaWFuL2J1bGxzZXllL21vZHVsZWZpbGVzL0xpbnV4Oi9tbnQvbmZzL2NsdXN0ZXJzdy9EZWJpYW4vYnVsbHNleWUvbW9kdWxlZmlsZXMvQ29yZTovbW50L25mcy9jbHVzdGVy
LMOD_FAMILY_MPI_VERSION=4.1.3d
OMP_NUM_THREADS=8
LMOD_FAMILY_MPI=openmpi
CUDA_HOME=/mnt/nfs/clustersw/shared/cuda/11.2.2
OMPI_MCA_mpi_cuda_support=1
SLURM_MPI_TYPE=pmix
__LMOD_REF_COUNT_C_INCLUDE_PATH=/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/include:1
DISTRIB_CODENAME=bullseye
SLURM_CONF=/etc/slurm/slurm.conf
PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/bin:/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/sbin:/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/bin:/mnt/nfs/clustersw/shared/cuda/11.2.2/bin:/mnt/nfs/clustersw/Debian/bullseye/cuda/11.2/ucx/1.12.1/bin:/nfs/scistore16/jonasgrp/schloegl/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
SLURM_JOB_NAME=mpitest
MODULEPATH=/mnt/nfs/clustersw/Debian/bullseye/modulefiles/MPI/openmpi/4.1.3d:/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Linux:/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Core:/mnt/nfs/clustersw/Debian/bullseye/lmod/lmod/modulefiles/Core
SLURM_NTASKS_PER_NODE=2
_LMFILES_=/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Core/ucx/1.12.1.lua:/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Core/cuda/11.2.2.lua:/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Core/hwloc/2.7.1.lua:/mnt/nfs/clustersw/Debian/bullseye/modulefiles/Core/openmpi/4.1.3d.lua
DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/10103/bus
LMOD_CMD=/mnt/nfs/clustersw/Debian/bullseye/lmod/lmod/libexec/lmod
C_INCLUDE_PATH=/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/include
MKL_NUM_THREADS=8
MAIL=/var/mail/schloegl
__LMOD_REF_COUNT_LD_LIBRARY_PATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/lib:1;/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/lib:1;/mnt/nfs/clustersw/shared/cuda/11.2.2/extras/CUPTI/lib64:1;/mnt/nfs/clustersw/Debian/bullseye/cuda/11.2/ucx/1.12.1/lib:1;/mnt/nfs/clustersw/shared/cuda/11.2.2/targets/x86_64-linux/lib:1;/mnt/nfs/clustersw/shared/cuda/11.2.2/lib64:1
SLURM_JOB_GID=11114
SLURM_GET_USER_ENV=1
CPATH=/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/include:/mnt/nfs/clustersw/Debian/bullseye/hwloc/2.7.1/include:/mnt/nfs/clustersw/shared/cuda/11.2.2/targets/x86_64-linux/include:/mnt/nfs/clustersw/shared/cuda/11.2.2/include:/mnt/nfs/clustersw/Debian/bullseye/cuda/11.2/ucx/1.12.1/include
DISTRIB_DESCRIPTION=Debian GNU/Linux 11 (bullseye)
SLURM_JOB_NODELIST=delta[197-206]
BASH_FUNC_ml%%=() {  eval $($LMOD_DIR/ml_cmd "$@")
}
BASH_FUNC_module%%=() {  if [ -z "${LMOD_SH_DBG_ON+x}" ]; then
 case "$-" in 
 *v*x*)
 __lmod_sh_dbg='vx'
 ;;
 *v*)
 __lmod_sh_dbg='v'
 ;;
 *x*)
 __lmod_sh_dbg='x'
 ;;
 esac;
 fi;
 if [ -n "${__lmod_sh_dbg:-}" ]; then
 set +$__lmod_sh_dbg;
 echo "Shell debugging temporarily silenced: export LMOD_SH_DBG_ON=1 for Lmod's 
output" 1>&2;
 fi;
 _mlshopt="f";
 case "$-" in 
 *f*)
 unset _mlshopt
 ;;
 esac;
 if [ -n "${_mlshopt:-}" ]; then
 set -$_mlshopt;
 fi;
 eval $($LMOD_CMD bash "$@") && eval $(${LMOD_SETTARG_CMD:-:} -s sh);
 __lmod_my_status=$?;
 if [ -n "${_mlshopt:-}" ]; then
 set +$_mlshopt;
 fi;
 unset _mlshopt;
 if [ -n "${__lmod_sh_dbg:-}" ]; then
 echo "Shell debugging restarted" 1>&2;
 set -$__lmod_sh_dbg;
 unset __lmod_sh_dbg;
 fi;
 return $__lmod_my_status
}
_=/usr/bin/env
==== SRUN 
(Rank:0) tst_test_array[0]:Status
(Rank:0) tst_test_array[1]:Request_Null
(Rank:0) tst_test_array[2]:Type_dup
(Rank:0) tst_test_array[3]:Get_version
(Rank:0) tst_test_array[4]:Ring
(Rank:0) tst_test_array[5]:Ring Send Bottom
(Rank:0) tst_test_array[6]:Ring Send Pack
(Rank:0) tst_test_array[7]:Ring Isend
(Rank:0) tst_test_array[8]:Ring Ibsend
(Rank:0) tst_test_array[9]:Ring Irsend
(Rank:0) tst_test_array[10]:Ring Issend
(Rank:0) tst_test_array[11]:Ring Bsend
(Rank:0) tst_test_array[12]:Ring Rsend
(Rank:0) tst_test_array[13]:Ring Ssend
(Rank:0) tst_test_array[14]:Ring Sendrecv
(Rank:0) tst_test_array[15]:Ring same value
(Rank:0) tst_test_array[16]:Ring Persistent
(Rank:0) tst_test_array[17]:Direct Partner Intercomm
(Rank:0) tst_test_array[18]:Many-to-one
(Rank:0) tst_test_array[19]:Many-to-one with MPI_Probe (MPI_ANY_SOURCE)
(Rank:0) tst_test_array[20]:Many-to-one with MPI_Iprobe (MPI_ANY_SOURCE)
(Rank:0) tst_test_array[21]:Many-to-one with Isend and Cancellation
(Rank:0) tst_test_array[22]:Alltoall
(Rank:0) tst_test_array[23]:Alltoall Persistent
(Rank:0) tst_test_array[24]:Alltoall xIsend
(Rank:0) tst_test_array[25]:Alltoall Irsend
(Rank:0) tst_test_array[26]:Alltoall Issend
(Rank:0) tst_test_array[27]:Alltoall with MPI_Probe (MPI_ANY_SOURCE)
(Rank:0) tst_test_array[28]:Ring Send with cart comm
(Rank:0) tst_test_array[29]:Alltoall with topo comm
(Rank:0) tst_test_array[30]:Bcast
(Rank:0) tst_test_array[31]:Gather
(Rank:0) tst_test_array[32]:Allgather
(Rank:0) tst_test_array[33]:Allgather with MPI_IN_PLACE
(Rank:0) tst_test_array[34]:Scan sum
(Rank:0) tst_test_array[35]:Scatter
(Rank:0) tst_test_array[36]:Scatterv
(Rank:0) tst_test_array[37]:Scatterv with stride
(Rank:0) tst_test_array[38]:Reduce Min
(Rank:0) tst_test_array[39]:Reduce Max
(Rank:0) tst_test_array[40]:Reduce Min with MPI_IN_PLACE
(Rank:0) tst_test_array[41]:Reduce Max with MPI_IN_PLACE
(Rank:0) tst_test_array[42]:Allreduce Min
(Rank:0) tst_test_array[43]:Allreduce Max
(Rank:0) tst_test_array[44]:Allreduce Min/Max
(Rank:0) tst_test_array[45]:Allreduce Min/Max with MPI_IN_PLACE
(Rank:0) tst_test_array[46]:Allreduce Sum
(Rank:0) tst_test_array[47]:Alltoall
Number of failed tests: 106
Summary of failed tests:
ERROR class:P2P test:Ring Send Pack (7), comm Duplicated MPI_COMM_WORLD (4), type MPI_TYPE_MIX (27) number of values:1000
ERROR class:P2P test:Ring Send Pack (7), comm Duplicated MPI_COMM_WORLD (4), type MPI_TYPE_MIX_ARRAY (28) number of values:1000
ERROR class:P2P test:Ring Send Pack (7), comm Reversed MPI_COMM_WORLD (5), type MPI_TYPE_MIX (27) number of values:1000
ERROR class:P2P test:Ring Send Pack (7), comm Reversed MPI_COMM_WORLD (5), type MPI_TYPE_MIX_ARRAY (28) number of values:1000
ERROR class:P2P test:Ring Send Pack (7), comm Halved MPI_COMM_WORLD (6), type MPI_TYPE_MIX (27) number of values:1000
ERROR class:P2P test:Ring Send Pack (7), comm Halved MPI_COMM_WORLD (6), type MPI_TYPE_MIX_ARRAY (28) number of values:1000
ERROR class:P2P test:Ring Send Pack (7), comm Odd/Even split MPI_COMM_WORLD (9), type MPI_TYPE_MIX (27) number of values:1000
ERROR class:P2P test:Ring Send Pack (7), comm Odd/Even split MPI_COMM_WORLD (9), type MPI_TYPE_MIX_ARRAY (28) number of values:1000
ERROR class:P2P test:Ring Send Pack (7), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_TYPE_MIX (27) number of values:1000
ERROR class:P2P test:Ring Send Pack (7), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_TYPE_MIX_ARRAY (28) number of values:1000
ERROR class:P2P test:Ring Send Pack (7), comm MPI_COMM_TYPE_SHARED comm (13), type MPI_TYPE_MIX (27) number of values:1000
ERROR class:P2P test:Ring Send Pack (7), comm MPI_COMM_TYPE_SHARED comm (13), type MPI_TYPE_MIX_ARRAY (28) number of values:1000
ERROR class:P2P test:Alltoall (23), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall (23), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall (23), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall (23), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall (23), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall (23), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm MPI_COMM_WORLD (1), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm MPI_COMM_WORLD (1), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm MPI_COMM_WORLD (1), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm MPI_COMM_WORLD (1), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm MPI_COMM_WORLD (1), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm MPI_COMM_WORLD (1), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Duplicated MPI_COMM_WORLD (4), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Duplicated MPI_COMM_WORLD (4), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Duplicated MPI_COMM_WORLD (4), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Duplicated MPI_COMM_WORLD (4), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Duplicated MPI_COMM_WORLD (4), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Duplicated MPI_COMM_WORLD (4), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Reversed MPI_COMM_WORLD (5), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Reversed MPI_COMM_WORLD (5), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Reversed MPI_COMM_WORLD (5), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Reversed MPI_COMM_WORLD (5), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Reversed MPI_COMM_WORLD (5), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Reversed MPI_COMM_WORLD (5), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Persistent (24), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm MPI_COMM_WORLD (1), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm MPI_COMM_WORLD (1), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm MPI_COMM_WORLD (1), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm MPI_COMM_WORLD (1), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm MPI_COMM_WORLD (1), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm MPI_COMM_WORLD (1), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Duplicated MPI_COMM_WORLD (4), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Duplicated MPI_COMM_WORLD (4), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Duplicated MPI_COMM_WORLD (4), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Duplicated MPI_COMM_WORLD (4), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Duplicated MPI_COMM_WORLD (4), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Duplicated MPI_COMM_WORLD (4), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Reversed MPI_COMM_WORLD (5), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Reversed MPI_COMM_WORLD (5), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Reversed MPI_COMM_WORLD (5), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Reversed MPI_COMM_WORLD (5), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Reversed MPI_COMM_WORLD (5), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Reversed MPI_COMM_WORLD (5), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Irsend (26), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm MPI_COMM_WORLD (1), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm MPI_COMM_WORLD (1), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm MPI_COMM_WORLD (1), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm MPI_COMM_WORLD (1), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm MPI_COMM_WORLD (1), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm MPI_COMM_WORLD (1), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Duplicated MPI_COMM_WORLD (4), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Duplicated MPI_COMM_WORLD (4), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Duplicated MPI_COMM_WORLD (4), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Duplicated MPI_COMM_WORLD (4), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Duplicated MPI_COMM_WORLD (4), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Duplicated MPI_COMM_WORLD (4), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Reversed MPI_COMM_WORLD (5), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Reversed MPI_COMM_WORLD (5), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Reversed MPI_COMM_WORLD (5), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Reversed MPI_COMM_WORLD (5), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Reversed MPI_COMM_WORLD (5), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Reversed MPI_COMM_WORLD (5), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Halved Inter_communicator (11), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Halved Inter_communicator (11), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Halved Inter_communicator (11), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Halved Inter_communicator (11), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Halved Inter_communicator (11), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Halved Inter_communicator (11), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall Issend (27), comm Intracomm merged of the Halved Inter_communicato (12), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:P2P test:Alltoall with topo comm (30), comm Full-connected Topology (10), type MPI_CONTIGUOUS_INT (21) number of values:1000
ERROR class:P2P test:Alltoall with topo comm (30), comm Full-connected Topology (10), type MPI_VECTOR_INT (22) number of values:1000
ERROR class:P2P test:Alltoall with topo comm (30), comm Full-connected Topology (10), type MPI_HVECTOR_INT (23) number of values:1000
ERROR class:P2P test:Alltoall with topo comm (30), comm Full-connected Topology (10), type MPI_INDEXED_INT (24) number of values:1000
ERROR class:P2P test:Alltoall with topo comm (30), comm Full-connected Topology (10), type MPI_HINDEXED_INT (25) number of values:1000
ERROR class:P2P test:Alltoall with topo comm (30), comm Full-connected Topology (10), type MPI_STRUCT_INT (26) number of values:1000
ERROR class:Collective test:Scatter (36), comm MPI_COMM_WORLD (1), type MPI_SHORT (5) number of values:1000
ERROR class:Collective test:Scatter (36), comm MPI_COMM_WORLD (1), type MPI_UNSIGNED_SHORT (6) number of values:1000
ERROR class:Collective test:Scatter (36), comm Duplicated MPI_COMM_WORLD (4), type MPI_SHORT (5) number of values:1000
ERROR class:Collective test:Scatter (36), comm Duplicated MPI_COMM_WORLD (4), type MPI_UNSIGNED_SHORT (6) number of values:1000
delta197
/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/bin/ompi_info
 running on 20*8 cores with 20 MPI-tasks and 8 threads
[environment output identical to the srun run above, except SLURM_JOB_ID=33083, SLURM_TASK_PID=50792, XDG_SESSION_ID=c37]
==== MPIRUN 
(Rank:0) tst_test_array[0]:Bcast
(Rank:0) tst_test_array[1]:Gather
(Rank:0) tst_test_array[2]:Allgather
(Rank:0) tst_test_array[3]:Allgather with MPI_IN_PLACE
(Rank:0) tst_test_array[4]:Scan sum
(Rank:0) tst_test_array[5]:Scatter
(Rank:0) tst_test_array[6]:Scatterv
(Rank:0) tst_test_array[7]:Scatterv with stride
(Rank:0) tst_test_array[8]:Reduce Min
(Rank:0) tst_test_array[9]:Reduce Max
(Rank:0) tst_test_array[10]:Reduce Min with MPI_IN_PLACE
(Rank:0) tst_test_array[11]:Reduce Max with MPI_IN_PLACE
(Rank:0) tst_test_array[12]:Allreduce Min
(Rank:0) tst_test_array[13]:Allreduce Max
(Rank:0) tst_test_array[14]:Allreduce Min/Max
(Rank:0) tst_test_array[15]:Allreduce Min/Max with MPI_IN_PLACE
(Rank:0) tst_test_array[16]:Allreduce Sum
(Rank:0) tst_test_array[17]:Alltoall
Number of failed tests: 0
delta197
/mnt/nfs/clustersw/Debian/bullseye/openmpi/4.1.3d/bin/ompi_info
 running on 20*8 cores with 20 MPI-tasks and 8 threads
[environment output identical to the srun run above, except SLURM_JOB_ID=33084, SLURM_TASK_PID=51040, XDG_SESSION_ID=c38]
==== MPIRUN --mca 
(Rank:0) tst_test_array[0]:Bcast
(Rank:0) tst_test_array[1]:Gather
(Rank:0) tst_test_array[2]:Allgather
(Rank:0) tst_test_array[3]:Allgather with MPI_IN_PLACE
(Rank:0) tst_test_array[4]:Scan sum
(Rank:0) tst_test_array[5]:Scatter
(Rank:0) tst_test_array[6]:Scatterv
(Rank:0) tst_test_array[7]:Scatterv with stride
(Rank:0) tst_test_array[8]:Reduce Min
(Rank:0) tst_test_array[9]:Reduce Max
(Rank:0) tst_test_array[10]:Reduce Min with MPI_IN_PLACE
(Rank:0) tst_test_array[11]:Reduce Max with MPI_IN_PLACE
(Rank:0) tst_test_array[12]:Allreduce Min
(Rank:0) tst_test_array[13]:Allreduce Max
(Rank:0) tst_test_array[14]:Allreduce Min/Max
(Rank:0) tst_test_array[15]:Allreduce Min/Max with MPI_IN_PLACE
(Rank:0) tst_test_array[16]:Allreduce Sum
(Rank:0) tst_test_array[17]:Alltoall
Number of failed tests: 0
