[OMPI users] MPI I/O, ROMIO and showing io mca parameters at run-time

2022-06-10 Thread Eric Chamberland via users
Hi, I want to try ROMIO with OpenMPI 4.1.2 because I am observing a big performance difference with IntelMPI on GPFS. I want to see, at *runtime*, all the parameters (names and default values) used by MPI (at least for the "io" framework). I would like to have all the same output as "ompi_info
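
For reference, a minimal sketch of how this is usually done with Open MPI 4.x: ompi_info can dump the io framework's parameters with their defaults, and the mpi_show_mca_params MCA parameter can ask Open MPI to report the values actually in effect at MPI_Init. The component name romio321 and the test binary name are assumptions, not taken from the thread itself.

    # list the io framework parameters, including defaults, at the highest verbosity level
    ompi_info --param io all --level 9

    # select ROMIO instead of OMPIO and print the MCA parameters in effect at run time
    mpirun --mca io romio321 --mca mpi_show_mca_params all -np 4 ./my_gpfs_io_test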

Re: [OMPI users] HPL: Error occurred in MPI_Recv

2022-06-10 Thread Bart Willems via users
No errors on any of the links. This is also not isolated to 1 or 2 nodes; it happens on all cluster nodes. Bart On Thu, Jun 9, 2022 at 11:42 AM Collin Strassburger via users <users@lists.open-mpi.org> wrote: > Since it is happening on this cluster and not on others, have you checked > the

Re: [OMPI users] Segfault in ucp_dt_pack function from UCX library 1.8.0 and 1.11.2 for large sized communications using both OpenMPI 4.0.3 and 4.1.2

2022-06-10 Thread Eric Chamberland via users
Hi, to give further information about this problem... it seems it is not related to MPI or UCX at all but rather comes from ParMETIS itself... With ParMETIS installed from Spack with the "+int64" option, I have been able to use both OpenMPI 4.1.2 and IntelMPI 2021.6 successfully! With ParMETIS
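
For anyone hitting the same segfault, a sketch of the Spack spec that matches the working setup described above; only the +int64 variant comes from the message, the explicit Open MPI dependency spec is an assumption.

    # build ParMETIS with 64-bit index support, against Open MPI 4.1.2
    spack install parmetis+int64 ^openmpi@4.1.2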