You should also check your PATH for non-interactive remote logins and ensure
that you are not accidentally mixing versions of Open MPI (e.g., the new
version on your local machine, and some other version on the remote machines).
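A quick way to check what a non-interactive remote login picks up is to run
the commands through ssh directly, since "ssh host command" does not start a
full login shell (the host name below is a placeholder):

    ssh remote-host 'echo $PATH'
    ssh remote-host 'which mpirun'

This should show roughly the same environment that the remote Open MPI
daemons get launched with.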
I have no MPI installation in my environment.
If that were the case, would I even get an error, since I use the complete
path for mpirun?
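(As an aside, one thing that can help rule out version mixing here is
mpirun's --prefix option, which points the remote daemons at a matching
installation; the install path below is only a placeholder:

    /opt/openmpi/bin/mpirun --prefix /opt/openmpi -np 4 ./a.out

Open MPI also treats an absolute-path invocation of mpirun as an implicit
--prefix for the corresponding directory.)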
I finally managed to get a backtrace:
#0 0x77533f18 in _exit () from /lib64/libc.so.6
#1 0x75169d68 in rte_abort (status=-51, report=true) at ../../..
Cyril,
your first post mentions a crash in orted, but
the stack trace is that of an MPI task.
I would expect orted to generate a core file, which you can then examine
post mortem with gdb to get the stack trace.
There should be several threads, so you can run

    info threads
    bt

and you might have to switch to another thread.
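A minimal post-mortem session could look like this (the paths are
placeholders; use the orted binary and the core file from your run):

    $ gdb /path/to/orted /path/to/core
    (gdb) info threads
    (gdb) thread 2
    (gdb) bt

where "thread 2" switches to the second thread before printing its
backtrace; repeat for each thread of interest. If no core file appears at
all, check that core dumps are enabled ("ulimit -c unlimited") on the node
where orted runs.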