[slurm0:14820] mca: base: components_register: registering errmgr components
[slurm0:14820] mca: base: components_register: found loaded component default_app
[slurm0:14820] mca: base: components_register: component default_app register function successful
[slurm0:14820] mca: base: components_register: found loaded component default_hnp
[slurm0:14820] mca: base: components_register: component default_hnp register function successful
[slurm0:14820] mca: base: components_register: found loaded component default_orted
[slurm0:14820] mca: base: components_register: component default_orted register function successful
[slurm0:14820] mca: base: components_register: found loaded component default_tool
[slurm0:14820] mca: base: components_register: component default_tool register function successful
[slurm0:14820] mca: base: components_open: opening errmgr components
[slurm0:14820] mca: base: components_open: found loaded component default_app
[slurm0:14820] mca: base: components_open: component default_app open function successful
[slurm0:14820] mca: base: components_open: found loaded component default_hnp
[slurm0:14820] mca: base: components_open: component default_hnp open function successful
[slurm0:14820] mca: base: components_open: found loaded component default_orted
[slurm0:14820] mca: base: components_open: component default_orted open function successful
[slurm0:14820] mca: base: components_open: found loaded component default_tool
[slurm0:14820] mca: base: components_open: component default_tool open function successful
[slurm0:14820] mca:base:select: Auto-selecting errmgr components
[slurm0:14820] mca:base:select:(errmgr) Querying component [default_app]
[slurm0:14820] mca:base:select:(errmgr) Skipping component [default_app]. Query failed to return a module
[slurm0:14820] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm0:14820] mca:base:select:(errmgr) Query of component [default_hnp] set priority to 1000
[slurm0:14820] mca:base:select:(errmgr) Querying component [default_orted]
[slurm0:14820] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm0:14820] mca:base:select:(errmgr) Querying component [default_tool]
[slurm0:14820] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm0:14820] mca:base:select:(errmgr) Selected component [default_hnp]
[slurm0:14820] mca: base: close: component default_app closed
[slurm0:14820] mca: base: close: unloading component default_app
[slurm0:14820] mca: base: close: component default_orted closed
[slurm0:14820] mca: base: close: unloading component default_orted
[slurm0:14820] mca: base: close: component default_tool closed
[slurm0:14820] mca: base: close: unloading component default_tool
[slurm0:14820] mca: base: components_register: registering odls components
[slurm0:14820] mca: base: components_register: found loaded component default
[slurm0:14820] mca: base: components_register: component default has no register or open function
[slurm0:14820] mca: base: components_open: opening odls components
[slurm0:14820] mca: base: components_open: found loaded component default
[slurm0:14820] mca: base: components_open: component default open function successful
[slurm0:14820] mca:base:select: Auto-selecting odls components
[slurm0:14820] mca:base:select:( odls) Querying component [default]
[slurm0:14820] mca:base:select:( odls) Query of component [default] set priority to 1
[slurm0:14820] mca:base:select:( odls) Selected component [default]
[slurm1:09446] mca: base: components_register: registering errmgr components
[slurm1:09446] mca: base: components_register: found loaded component default_app
[slurm1:09446] mca: base: components_register: component default_app register function successful
[slurm1:09446] mca: base: components_register: found loaded component default_hnp
[slurm1:09446] mca: base: components_register: component default_hnp register function successful
[slurm1:09446] mca: base: components_register: found loaded component default_orted
[slurm1:09446] mca: base: components_register: component default_orted register function successful
[slurm1:09446] mca: base: components_register: found loaded component default_tool
[slurm1:09446] mca: base: components_register: component default_tool register function successful
[slurm1:09446] mca: base: components_open: opening errmgr components
[slurm1:09446] mca: base: components_open: found loaded component default_app
[slurm1:09446] mca: base: components_open: component default_app open function successful
[slurm1:09446] mca: base: components_open: found loaded component default_hnp
[slurm1:09446] mca: base: components_open: component default_hnp open function successful
[slurm1:09446] mca: base: components_open: found loaded component default_orted
[slurm1:09446] mca: base: components_open: component default_orted open function successful
[slurm1:09446] mca: base: components_open: found loaded component default_tool
[slurm1:09446] mca: base: components_open: component default_tool open function successful
[slurm1:09446] mca:base:select: Auto-selecting errmgr components
[slurm1:09446] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09446] mca:base:select:(errmgr) Skipping component [default_app]. Query failed to return a module
[slurm1:09446] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09446] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09446] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09446] mca:base:select:(errmgr) Query of component [default_orted] set priority to 1000
[slurm1:09446] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09446] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09446] mca:base:select:(errmgr) Selected component [default_orted]
[slurm1:09446] mca: base: close: component default_app closed
[slurm1:09446] mca: base: close: unloading component default_app
[slurm1:09446] mca: base: close: component default_hnp closed
[slurm1:09446] mca: base: close: unloading component default_hnp
[slurm1:09446] mca: base: close: component default_tool closed
[slurm1:09446] mca: base: close: unloading component default_tool
[slurm1:09446] mca: base: components_register: registering odls components
[slurm1:09446] mca: base: components_register: found loaded component default
[slurm1:09446] mca: base: components_register: component default has no register or open function
[slurm1:09446] mca: base: components_open: opening odls components
[slurm1:09446] mca: base: components_open: found loaded component default
[slurm1:09446] mca: base: components_open: component default open function successful
[slurm1:09446] mca:base:select: Auto-selecting odls components
[slurm1:09446] mca:base:select:( odls) Querying component [default]
[slurm1:09446] mca:base:select:( odls) Query of component [default] set priority to 1
[slurm1:09446] mca:base:select:( odls) Selected component [default]
[slurm0:14820] [[63894,0],0] odls:constructing child list
[slurm0:14820] [[63894,0],0] odls:construct_child_list unpacking data to launch job [63894,1]
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],0]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],0] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],1]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],1] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],2]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],2] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],3]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],3] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],4]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],4] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],5]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],5] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],6]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],6] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],7]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],7] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],8]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],8] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],9]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],9] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],10]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],10] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],11]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],11] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],12]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],12] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],13]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],13] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],14]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],14] on daemon 1
[slurm0:14820] [[63894,0],0] odls:constructing child list - looking for daemon for proc [[63894,1],15]
[slurm0:14820] [[63894,0],0] odls:constructing child list - checking proc [[63894,1],15] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list
[slurm1:09446] [[63894,0],1] odls:construct_child_list unpacking data to launch job [63894,1]
[slurm1:09446] [[63894,0],1] odls:construct_child_list adding new object for job [63894,1]
[slurm1:09446] [[63894,0],1] odls:construct_child_list unpacking 1 app_contexts
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],0] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],0] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],0] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],1] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],1] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],1] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],2]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],2] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],2] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],2] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],3] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],3] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],3] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],4] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],4] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],4] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],5] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],5] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],5] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],6] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],6] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],6] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],7] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],7] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],7] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],8]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],8] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],8] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],8] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],9] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],9] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],9] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],10] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],10] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],10] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],11] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],11] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],11] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],12]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],12] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],12] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],12] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],13] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],13] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],13] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],14] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],14] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],14] to my local list
[slurm1:09446] [[63894,0],1] odls:constructing child list - looking for daemon for proc [[63894,1],15]
[slurm1:09446] [[63894,0],1] odls:constructing child list - checking proc [[63894,1],15] on daemon 1
[slurm1:09446] [[63894,0],1] odls:constructing child list - found proc [[63894,1],15] for me!
[slurm1:09446] [[63894,0],1] adding proc [[63894,1],15] to my local list
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],0]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=0
 	Env[49]: OMPI_COMM_WORLD_RANK=0
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=0
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=0
 	Env[52]: OMPI_MCA_orte_ess_node_rank=0
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/0
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],1]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=1
 	Env[49]: OMPI_COMM_WORLD_RANK=1
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=1
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=1
 	Env[52]: OMPI_MCA_orte_ess_node_rank=1
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/1
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],2]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],2]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=2
 	Env[49]: OMPI_COMM_WORLD_RANK=2
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=2
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=2
 	Env[52]: OMPI_MCA_orte_ess_node_rank=2
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/2
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],3]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=3
 	Env[49]: OMPI_COMM_WORLD_RANK=3
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=3
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=3
 	Env[52]: OMPI_MCA_orte_ess_node_rank=3
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/3
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],4]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=4
 	Env[49]: OMPI_COMM_WORLD_RANK=4
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=4
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=4
 	Env[52]: OMPI_MCA_orte_ess_node_rank=4
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/4
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],5]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=5
 	Env[49]: OMPI_COMM_WORLD_RANK=5
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=5
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=5
 	Env[52]: OMPI_MCA_orte_ess_node_rank=5
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/5
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],6]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=6
 	Env[49]: OMPI_COMM_WORLD_RANK=6
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=6
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=6
 	Env[52]: OMPI_MCA_orte_ess_node_rank=6
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/6
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],7]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=7
 	Env[49]: OMPI_COMM_WORLD_RANK=7
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=7
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=7
 	Env[52]: OMPI_MCA_orte_ess_node_rank=7
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/7
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],8]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],8]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=8
 	Env[49]: OMPI_COMM_WORLD_RANK=8
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=8
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=8
 	Env[52]: OMPI_MCA_orte_ess_node_rank=8
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/8
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],9]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=9
 	Env[49]: OMPI_COMM_WORLD_RANK=9
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=9
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=9
 	Env[52]: OMPI_MCA_orte_ess_node_rank=9
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/9
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],10]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=10
 	Env[49]: OMPI_COMM_WORLD_RANK=10
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=10
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=10
 	Env[52]: OMPI_MCA_orte_ess_node_rank=10
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/10
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],11]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=11
 	Env[49]: OMPI_COMM_WORLD_RANK=11
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=11
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=11
 	Env[52]: OMPI_MCA_orte_ess_node_rank=11
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/11
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],12]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],12]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=12
 	Env[49]: OMPI_COMM_WORLD_RANK=12
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=12
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=12
 	Env[52]: OMPI_MCA_orte_ess_node_rank=12
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/12
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],13]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=13
 	Env[49]: OMPI_COMM_WORLD_RANK=13
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=13
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=13
 	Env[52]: OMPI_MCA_orte_ess_node_rank=13
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/13
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],14]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=14
 	Env[49]: OMPI_COMM_WORLD_RANK=14
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=14
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=14
 	Env[52]: OMPI_MCA_orte_ess_node_rank=14
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/14
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch working child [[63894,1],15]
[slurm1:09446] [[63894,0],1] odls:launch: spawning child [[63894,1],15]
[slurm1:09446] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 16	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51855 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4187357184
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=16
 	Env[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=16
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=16
 	Env[47]: OMPI_MCA_ess_base_jobid=4187357185
 	Env[48]: OMPI_MCA_ess_base_vpid=15
 	Env[49]: OMPI_COMM_WORLD_RANK=15
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=15
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=15
 	Env[52]: OMPI_MCA_orte_ess_node_rank=15
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/15
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:09446] [[63894,0],1] odls:launch setting waitpids
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=12
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=12
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=12
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=12
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=13
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=13
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=13
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=13
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=14
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=14
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=14
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=14
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=15
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=15
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=15
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=15
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=0
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=0
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=0
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=0
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=1
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=1
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=1
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=2
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=2
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=2
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=2
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=3
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=3
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=3
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=3
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=4
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=4
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=4
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=4
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=5
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=5
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=5
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=5
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=6
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=6
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=6
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=6
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=7
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=7
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=7
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=7
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=8
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=8
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=8
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=8
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=9
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=9
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=9
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=9
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=10
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=10
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=10
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=10
[slurm1:09446] [[63894,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:09446] [[63894,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:09446] [[63894,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:09446] [[63894,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:09446] [[63894,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=133c2a3580dbbdaa-ec11192b4b888226
[slurm1:09446] [[63894,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:09446] [[63894,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:09446] [[63894,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:09446] [[63894,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:09446] [[63894,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51855 22
[slurm1:09446] [[63894,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:09446] [[63894,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:09446] [[63894,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:09446] [[63894,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:09446] [[63894,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:09446] [[63894,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51855 192.168.122.101 22
[slurm1:09446] [[63894,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:09446] [[63894,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4187357184
[slurm1:09446] [[63894,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:09446] [[63894,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=16
[slurm1:09446] [[63894,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4187357184.0;tcp://192.168.122.100:48366
[slurm1:09446] [[63894,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:09446] [[63894,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4187357184.1;tcp://192.168.122.101,10.0.0.5:33172
[slurm1:09446] [[63894,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:09446] [[63894,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:09446] [[63894,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:09446] [[63894,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:09446] [[63894,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:09446] [[63894,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:09446] [[63894,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:09446] [[63894,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=16
[slurm1:09446] [[63894,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:09446] [[63894,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:09446] [[63894,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:09446] [[63894,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:09446] [[63894,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:09446] [[63894,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=16
[slurm1:09446] [[63894,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4187357185
[slurm1:09446] [[63894,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=11
[slurm1:09446] [[63894,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=11
[slurm1:09446] [[63894,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=11
[slurm1:09446] [[63894,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=11
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=3
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/3
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=27
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=4
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/4
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=30
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=5
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/5
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=33
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=6
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/6
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=36
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=7
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/7
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=39
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=8
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/8
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=42
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=9
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/9
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=45
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=10
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/10
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=48
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=12
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/12
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=54
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=13
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/13
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=57
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=14
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/14
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=60
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=11
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/11
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=51
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=15
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/15
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=63
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=0
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/0
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=17
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=1
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/1
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=21
[slurm1:09446] [[63894,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=2
[slurm1:09446] [[63894,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:09446] [[63894,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/2
[slurm1:09446] [[63894,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=24
[slurm1:09448] mca: base: components_register: registering errmgr components
[slurm1:09448] mca: base: components_register: found loaded component default_app
[slurm1:09448] mca: base: components_register: component default_app register function successful
[slurm1:09448] mca: base: components_register: found loaded component default_hnp
[slurm1:09448] mca: base: components_register: component default_hnp register function successful
[slurm1:09448] mca: base: components_register: found loaded component default_orted
[slurm1:09448] mca: base: components_register: component default_orted register function successful
[slurm1:09448] mca: base: components_register: found loaded component default_tool
[slurm1:09448] mca: base: components_register: component default_tool register function successful
[slurm1:09448] mca: base: components_open: opening errmgr components
[slurm1:09448] mca: base: components_open: found loaded component default_app
[slurm1:09448] mca: base: components_open: component default_app open function successful
[slurm1:09448] mca: base: components_open: found loaded component default_hnp
[slurm1:09448] mca: base: components_open: component default_hnp open function successful
[slurm1:09448] mca: base: components_open: found loaded component default_orted
[slurm1:09448] mca: base: components_open: component default_orted open function successful
[slurm1:09448] mca: base: components_open: found loaded component default_tool
[slurm1:09448] mca: base: components_open: component default_tool open function successful
[slurm1:09449] mca: base: components_register: registering errmgr components
[slurm1:09449] mca: base: components_register: found loaded component default_app
[slurm1:09449] mca: base: components_register: component default_app register function successful
[slurm1:09449] mca: base: components_register: found loaded component default_hnp
[slurm1:09449] mca: base: components_register: component default_hnp register function successful
[slurm1:09449] mca: base: components_register: found loaded component default_orted
[slurm1:09449] mca: base: components_register: component default_orted register function successful
[slurm1:09449] mca: base: components_register: found loaded component default_tool
[slurm1:09449] mca: base: components_register: component default_tool register function successful
[slurm1:09449] mca: base: components_open: opening errmgr components
[slurm1:09449] mca: base: components_open: found loaded component default_app
[slurm1:09449] mca: base: components_open: component default_app open function successful
[slurm1:09449] mca: base: components_open: found loaded component default_hnp
[slurm1:09449] mca: base: components_open: component default_hnp open function successful
[slurm1:09449] mca: base: components_open: found loaded component default_orted
[slurm1:09449] mca: base: components_open: component default_orted open function successful
[slurm1:09449] mca: base: components_open: found loaded component default_tool
[slurm1:09449] mca: base: components_open: component default_tool open function successful
[slurm1:09448] mca:base:select: Auto-selecting errmgr components
[slurm1:09448] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09448] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09448] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09448] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09448] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09448] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09448] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09448] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09448] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09448] mca: base: close: component default_hnp closed
[slurm1:09448] mca: base: close: unloading component default_hnp
[slurm1:09448] mca: base: close: component default_orted closed
[slurm1:09448] mca: base: close: unloading component default_orted
[slurm1:09448] mca: base: close: component default_tool closed
[slurm1:09448] mca: base: close: unloading component default_tool
[slurm1:09449] mca:base:select: Auto-selecting errmgr components
[slurm1:09449] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09449] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09449] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09449] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09449] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09449] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09449] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09449] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09449] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09449] mca: base: close: component default_hnp closed
[slurm1:09449] mca: base: close: unloading component default_hnp
[slurm1:09450] mca: base: components_register: registering errmgr components
[slurm1:09449] mca: base: close: component default_orted closed
[slurm1:09449] mca: base: close: unloading component default_orted
[slurm1:09449] mca: base: close: component default_tool closed
[slurm1:09449] mca: base: close: unloading component default_tool
[slurm1:09450] mca: base: components_register: found loaded component default_app
[slurm1:09450] mca: base: components_register: component default_app register function successful
[slurm1:09450] mca: base: components_register: found loaded component default_hnp
[slurm1:09450] mca: base: components_register: component default_hnp register function successful
[slurm1:09450] mca: base: components_register: found loaded component default_orted
[slurm1:09450] mca: base: components_register: component default_orted register function successful
[slurm1:09450] mca: base: components_register: found loaded component default_tool
[slurm1:09450] mca: base: components_register: component default_tool register function successful
[slurm1:09450] mca: base: components_open: opening errmgr components
[slurm1:09450] mca: base: components_open: found loaded component default_app
[slurm1:09450] mca: base: components_open: component default_app open function successful
[slurm1:09450] mca: base: components_open: found loaded component default_hnp
[slurm1:09450] mca: base: components_open: component default_hnp open function successful
[slurm1:09450] mca: base: components_open: found loaded component default_orted
[slurm1:09450] mca: base: components_open: component default_orted open function successful
[slurm1:09450] mca: base: components_open: found loaded component default_tool
[slurm1:09450] mca: base: components_open: component default_tool open function successful
[slurm1:09451] mca: base: components_register: registering errmgr components
[slurm1:09451] mca: base: components_register: found loaded component default_app
[slurm1:09451] mca: base: components_register: component default_app register function successful
[slurm1:09451] mca: base: components_register: found loaded component default_hnp
[slurm1:09451] mca: base: components_register: component default_hnp register function successful
[slurm1:09451] mca: base: components_register: found loaded component default_orted
[slurm1:09451] mca: base: components_register: component default_orted register function successful
[slurm1:09451] mca: base: components_register: found loaded component default_tool
[slurm1:09451] mca: base: components_register: component default_tool register function successful
[slurm1:09451] mca: base: components_open: opening errmgr components
[slurm1:09451] mca: base: components_open: found loaded component default_app
[slurm1:09451] mca: base: components_open: component default_app open function successful
[slurm1:09451] mca: base: components_open: found loaded component default_hnp
[slurm1:09451] mca: base: components_open: component default_hnp open function successful
[slurm1:09451] mca: base: components_open: found loaded component default_orted
[slurm1:09451] mca: base: components_open: component default_orted open function successful
[slurm1:09451] mca: base: components_open: found loaded component default_tool
[slurm1:09451] mca: base: components_open: component default_tool open function successful
[slurm1:09452] mca: base: components_register: registering errmgr components
[slurm1:09452] mca: base: components_register: found loaded component default_app
[slurm1:09452] mca: base: components_register: component default_app register function successful
[slurm1:09452] mca: base: components_register: found loaded component default_hnp
[slurm1:09452] mca: base: components_register: component default_hnp register function successful
[slurm1:09452] mca: base: components_register: found loaded component default_orted
[slurm1:09452] mca: base: components_register: component default_orted register function successful
[slurm1:09452] mca: base: components_register: found loaded component default_tool
[slurm1:09452] mca: base: components_register: component default_tool register function successful
[slurm1:09452] mca: base: components_open: opening errmgr components
[slurm1:09452] mca: base: components_open: found loaded component default_app
[slurm1:09452] mca: base: components_open: component default_app open function successful
[slurm1:09452] mca: base: components_open: found loaded component default_hnp
[slurm1:09452] mca: base: components_open: component default_hnp open function successful
[slurm1:09452] mca: base: components_open: found loaded component default_orted
[slurm1:09452] mca: base: components_open: component default_orted open function successful
[slurm1:09452] mca: base: components_open: found loaded component default_tool
[slurm1:09452] mca: base: components_open: component default_tool open function successful
[slurm1:09453] mca: base: components_register: registering errmgr components
[slurm1:09450] mca:base:select: Auto-selecting errmgr components
[slurm1:09450] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09450] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09450] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09450] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09450] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09450] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09450] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09450] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09450] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09450] mca: base: close: component default_hnp closed
[slurm1:09450] mca: base: close: unloading component default_hnp
[slurm1:09450] mca: base: close: component default_orted closed
[slurm1:09450] mca: base: close: unloading component default_orted
[slurm1:09450] mca: base: close: component default_tool closed
[slurm1:09450] mca: base: close: unloading component default_tool
[slurm1:09453] mca: base: components_register: found loaded component default_app
[slurm1:09453] mca: base: components_register: component default_app register function successful
[slurm1:09451] mca:base:select: Auto-selecting errmgr components
[slurm1:09451] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09451] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09451] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09451] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09451] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09451] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09451] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09451] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09451] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09451] mca: base: close: component default_hnp closed
[slurm1:09451] mca: base: close: unloading component default_hnp
[slurm1:09451] mca: base: close: component default_orted closed
[slurm1:09451] mca: base: close: unloading component default_orted
[slurm1:09451] mca: base: close: component default_tool closed
[slurm1:09451] mca: base: close: unloading component default_tool
[slurm1:09453] mca: base: components_register: found loaded component default_hnp
[slurm1:09453] mca: base: components_register: component default_hnp register function successful
[slurm1:09453] mca: base: components_register: found loaded component default_orted
[slurm1:09453] mca: base: components_register: component default_orted register function successful
[slurm1:09453] mca: base: components_register: found loaded component default_tool
[slurm1:09453] mca: base: components_register: component default_tool register function successful
[slurm1:09447] mca: base: components_register: registering errmgr components
[slurm1:09447] mca: base: components_register: found loaded component default_app
[slurm1:09453] mca: base: components_open: opening errmgr components
[slurm1:09453] mca: base: components_open: found loaded component default_app
[slurm1:09453] mca: base: components_open: component default_app open function successful
[slurm1:09453] mca: base: components_open: found loaded component default_hnp
[slurm1:09453] mca: base: components_open: component default_hnp open function successful
[slurm1:09447] mca: base: components_register: component default_app register function successful
[slurm1:09447] mca: base: components_register: found loaded component default_hnp
[slurm1:09447] mca: base: components_register: component default_hnp register function successful
[slurm1:09447] mca: base: components_register: found loaded component default_orted
[slurm1:09447] mca: base: components_register: component default_orted register function successful
[slurm1:09447] mca: base: components_register: found loaded component default_tool
[slurm1:09447] mca: base: components_register: component default_tool register function successful
[slurm1:09447] mca: base: components_open: opening errmgr components
[slurm1:09447] mca: base: components_open: found loaded component default_app
[slurm1:09447] mca: base: components_open: component default_app open function successful
[slurm1:09447] mca: base: components_open: found loaded component default_hnp
[slurm1:09447] mca: base: components_open: component default_hnp open function successful
[slurm1:09447] mca: base: components_open: found loaded component default_orted
[slurm1:09447] mca: base: components_open: component default_orted open function successful
[slurm1:09447] mca: base: components_open: found loaded component default_tool
[slurm1:09447] mca: base: components_open: component default_tool open function successful
[slurm1:09453] mca: base: components_open: found loaded component default_orted
[slurm1:09453] mca: base: components_open: component default_orted open function successful
[slurm1:09453] mca: base: components_open: found loaded component default_tool
[slurm1:09453] mca: base: components_open: component default_tool open function successful
[slurm1:09452] mca:base:select: Auto-selecting errmgr components
[slurm1:09452] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09452] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09452] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09452] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09452] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09452] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09452] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09452] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09452] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09452] mca: base: close: component default_hnp closed
[slurm1:09452] mca: base: close: unloading component default_hnp
[slurm1:09454] mca: base: components_register: registering errmgr components
[slurm1:09454] mca: base: components_register: found loaded component default_app
[slurm1:09454] mca: base: components_register: component default_app register function successful
[slurm1:09454] mca: base: components_register: found loaded component default_hnp
[slurm1:09454] mca: base: components_register: component default_hnp register function successful
[slurm1:09454] mca: base: components_register: found loaded component default_orted
[slurm1:09454] mca: base: components_register: component default_orted register function successful
[slurm1:09454] mca: base: components_register: found loaded component default_tool
[slurm1:09454] mca: base: components_register: component default_tool register function successful
[slurm1:09454] mca: base: components_open: opening errmgr components
[slurm1:09454] mca: base: components_open: found loaded component default_app
[slurm1:09454] mca: base: components_open: component default_app open function successful
[slurm1:09454] mca: base: components_open: found loaded component default_hnp
[slurm1:09454] mca: base: components_open: component default_hnp open function successful
[slurm1:09454] mca: base: components_open: found loaded component default_orted
[slurm1:09454] mca: base: components_open: component default_orted open function successful
[slurm1:09454] mca: base: components_open: found loaded component default_tool
[slurm1:09454] mca: base: components_open: component default_tool open function successful
[slurm1:09452] mca: base: close: component default_orted closed
[slurm1:09452] mca: base: close: unloading component default_orted
[slurm1:09452] mca: base: close: component default_tool closed
[slurm1:09452] mca: base: close: unloading component default_tool
[slurm1:09455] mca: base: components_register: registering errmgr components
[slurm1:09455] mca: base: components_register: found loaded component default_app
[slurm1:09455] mca: base: components_register: component default_app register function successful
[slurm1:09455] mca: base: components_register: found loaded component default_hnp
[slurm1:09455] mca: base: components_register: component default_hnp register function successful
[slurm1:09455] mca: base: components_register: found loaded component default_orted
[slurm1:09455] mca: base: components_register: component default_orted register function successful
[slurm1:09455] mca: base: components_register: found loaded component default_tool
[slurm1:09453] mca:base:select: Auto-selecting errmgr components
[slurm1:09453] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09453] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09453] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09453] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09453] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09453] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09453] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09453] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09453] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09453] mca: base: close: component default_hnp closed
[slurm1:09453] mca: base: close: unloading component default_hnp
[slurm1:09453] mca: base: close: component default_orted closed
[slurm1:09453] mca: base: close: unloading component default_orted
[slurm1:09453] mca: base: close: component default_tool closed
[slurm1:09453] mca: base: close: unloading component default_tool
[slurm1:09455] mca: base: components_register: component default_tool register function successful
[slurm1:09455] mca: base: components_open: opening errmgr components
[slurm1:09455] mca: base: components_open: found loaded component default_app
[slurm1:09455] mca: base: components_open: component default_app open function successful
[slurm1:09455] mca: base: components_open: found loaded component default_hnp
[slurm1:09455] mca: base: components_open: component default_hnp open function successful
[slurm1:09455] mca: base: components_open: found loaded component default_orted
[slurm1:09455] mca: base: components_open: component default_orted open function successful
[slurm1:09455] mca: base: components_open: found loaded component default_tool
[slurm1:09455] mca: base: components_open: component default_tool open function successful
[slurm1:09456] mca: base: components_register: registering errmgr components
[slurm1:09456] mca: base: components_register: found loaded component default_app
[slurm1:09456] mca: base: components_register: component default_app register function successful
[slurm1:09456] mca: base: components_register: found loaded component default_hnp
[slurm1:09456] mca: base: components_register: component default_hnp register function successful
[slurm1:09454] mca:base:select: Auto-selecting errmgr components
[slurm1:09454] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09454] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09454] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09454] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09454] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09454] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09454] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09454] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09454] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09454] mca: base: close: component default_hnp closed
[slurm1:09447] mca:base:select: Auto-selecting errmgr components
[slurm1:09447] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09447] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09454] mca: base: close: unloading component default_hnp
[slurm1:09454] mca: base: close: component default_orted closed
[slurm1:09454] mca: base: close: unloading component default_orted
[slurm1:09454] mca: base: close: component default_tool closed
[slurm1:09454] mca: base: close: unloading component default_tool
[slurm1:09456] mca: base: components_register: found loaded component default_orted
[slurm1:09456] mca: base: components_register: component default_orted register function successful
[slurm1:09456] mca: base: components_register: found loaded component default_tool
[slurm1:09456] mca: base: components_register: component default_tool register function successful
[slurm1:09456] mca: base: components_open: opening errmgr components
[slurm1:09456] mca: base: components_open: found loaded component default_app
[slurm1:09456] mca: base: components_open: component default_app open function successful
[slurm1:09456] mca: base: components_open: found loaded component default_hnp
[slurm1:09456] mca: base: components_open: component default_hnp open function successful
[slurm1:09456] mca: base: components_open: found loaded component default_orted
[slurm1:09456] mca: base: components_open: component default_orted open function successful
[slurm1:09456] mca: base: components_open: found loaded component default_tool
[slurm1:09456] mca: base: components_open: component default_tool open function successful
[slurm1:09447] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09447] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09447] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09447] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09447] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09447] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09447] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09447] mca: base: close: component default_hnp closed
[slurm1:09447] mca: base: close: unloading component default_hnp
[slurm1:09447] mca: base: close: component default_orted closed
[slurm1:09447] mca: base: close: unloading component default_orted
[slurm1:09447] mca: base: close: component default_tool closed
[slurm1:09447] mca: base: close: unloading component default_tool
[slurm1:09455] mca:base:select: Auto-selecting errmgr components
[slurm1:09455] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09455] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09455] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09455] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09455] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09455] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09455] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09455] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09455] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09455] mca: base: close: component default_hnp closed
[slurm1:09455] mca: base: close: unloading component default_hnp
[slurm1:09455] mca: base: close: component default_orted closed
[slurm1:09455] mca: base: close: unloading component default_orted
[slurm1:09455] mca: base: close: component default_tool closed
[slurm1:09455] mca: base: close: unloading component default_tool
[slurm1:09457] mca: base: components_register: registering errmgr components
[slurm1:09457] mca: base: components_register: found loaded component default_app
[slurm1:09457] mca: base: components_register: component default_app register function successful
[slurm1:09457] mca: base: components_register: found loaded component default_hnp
[slurm1:09457] mca: base: components_register: component default_hnp register function successful
[slurm1:09457] mca: base: components_register: found loaded component default_orted
[slurm1:09457] mca: base: components_register: component default_orted register function successful
[slurm1:09457] mca: base: components_register: found loaded component default_tool
[slurm1:09457] mca: base: components_register: component default_tool register function successful
[slurm1:09457] mca: base: components_open: opening errmgr components
[slurm1:09457] mca: base: components_open: found loaded component default_app
[slurm1:09457] mca: base: components_open: component default_app open function successful
[slurm1:09457] mca: base: components_open: found loaded component default_hnp
[slurm1:09457] mca: base: components_open: component default_hnp open function successful
[slurm1:09457] mca: base: components_open: found loaded component default_orted
[slurm1:09457] mca: base: components_open: component default_orted open function successful
[slurm1:09457] mca: base: components_open: found loaded component default_tool
[slurm1:09457] mca: base: components_open: component default_tool open function successful
[slurm1:09456] mca:base:select: Auto-selecting errmgr components
[slurm1:09456] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09456] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09456] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09456] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09456] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09456] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09456] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09456] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09456] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09456] mca: base: close: component default_hnp closed
[slurm1:09456] mca: base: close: unloading component default_hnp
[slurm1:09456] mca: base: close: component default_orted closed
[slurm1:09456] mca: base: close: unloading component default_orted
[slurm1:09456] mca: base: close: component default_tool closed
[slurm1:09456] mca: base: close: unloading component default_tool
[slurm1:09458] mca: base: components_register: registering errmgr components
[slurm1:09458] mca: base: components_register: found loaded component default_app
[slurm1:09458] mca: base: components_register: component default_app register function successful
[slurm1:09458] mca: base: components_register: found loaded component default_hnp
[slurm1:09458] mca: base: components_register: component default_hnp register function successful
[slurm1:09458] mca: base: components_register: found loaded component default_orted
[slurm1:09458] mca: base: components_register: component default_orted register function successful
[slurm1:09458] mca: base: components_register: found loaded component default_tool
[slurm1:09458] mca: base: components_register: component default_tool register function successful
[slurm1:09458] mca: base: components_open: opening errmgr components
[slurm1:09458] mca: base: components_open: found loaded component default_app
[slurm1:09458] mca: base: components_open: component default_app open function successful
[slurm1:09458] mca: base: components_open: found loaded component default_hnp
[slurm1:09458] mca: base: components_open: component default_hnp open function successful
[slurm1:09458] mca: base: components_open: found loaded component default_orted
[slurm1:09458] mca: base: components_open: component default_orted open function successful
[slurm1:09458] mca: base: components_open: found loaded component default_tool
[slurm1:09458] mca: base: components_open: component default_tool open function successful
[slurm1:09457] mca:base:select: Auto-selecting errmgr components
[slurm1:09457] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09457] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09457] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09457] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09457] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09457] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09457] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09457] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09457] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09457] mca: base: close: component default_hnp closed
[slurm1:09457] mca: base: close: unloading component default_hnp
[slurm1:09459] mca: base: components_register: registering errmgr components
[slurm1:09459] mca: base: components_register: found loaded component default_app
[slurm1:09459] mca: base: components_register: component default_app register function successful
[slurm1:09459] mca: base: components_register: found loaded component default_hnp
[slurm1:09459] mca: base: components_register: component default_hnp register function successful
[slurm1:09459] mca: base: components_register: found loaded component default_orted
[slurm1:09459] mca: base: components_register: component default_orted register function successful
[slurm1:09459] mca: base: components_register: found loaded component default_tool
[slurm1:09459] mca: base: components_register: component default_tool register function successful
[slurm1:09459] mca: base: components_open: opening errmgr components
[slurm1:09459] mca: base: components_open: found loaded component default_app
[slurm1:09459] mca: base: components_open: component default_app open function successful
[slurm1:09459] mca: base: components_open: found loaded component default_hnp
[slurm1:09459] mca: base: components_open: component default_hnp open function successful
[slurm1:09459] mca: base: components_open: found loaded component default_orted
[slurm1:09459] mca: base: components_open: component default_orted open function successful
[slurm1:09459] mca: base: components_open: found loaded component default_tool
[slurm1:09459] mca: base: components_open: component default_tool open function successful
[slurm1:09457] mca: base: close: component default_orted closed
[slurm1:09457] mca: base: close: unloading component default_orted
[slurm1:09457] mca: base: close: component default_tool closed
[slurm1:09457] mca: base: close: unloading component default_tool
[slurm1:09460] mca: base: components_register: registering errmgr components
[slurm1:09460] mca: base: components_register: found loaded component default_app
[slurm1:09460] mca: base: components_register: component default_app register function successful
[slurm1:09458] mca:base:select: Auto-selecting errmgr components
[slurm1:09458] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09458] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09458] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09458] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09458] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09458] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09458] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09458] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09458] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09458] mca: base: close: component default_hnp closed
[slurm1:09458] mca: base: close: unloading component default_hnp
[slurm1:09458] mca: base: close: component default_orted closed
[slurm1:09458] mca: base: close: unloading component default_orted
[slurm1:09458] mca: base: close: component default_tool closed
[slurm1:09458] mca: base: close: unloading component default_tool
[slurm1:09460] mca: base: components_register: found loaded component default_hnp
[slurm1:09460] mca: base: components_register: component default_hnp register function successful
[slurm1:09460] mca: base: components_register: found loaded component default_orted
[slurm1:09460] mca: base: components_register: component default_orted register function successful
[slurm1:09461] mca: base: components_register: registering errmgr components
[slurm1:09461] mca: base: components_register: found loaded component default_app
[slurm1:09461] mca: base: components_register: component default_app register function successful
[slurm1:09461] mca: base: components_register: found loaded component default_hnp
[slurm1:09461] mca: base: components_register: component default_hnp register function successful
[slurm1:09461] mca: base: components_register: found loaded component default_orted
[slurm1:09461] mca: base: components_register: component default_orted register function successful
[slurm1:09461] mca: base: components_register: found loaded component default_tool
[slurm1:09461] mca: base: components_register: component default_tool register function successful
[slurm1:09461] mca: base: components_open: opening errmgr components
[slurm1:09461] mca: base: components_open: found loaded component default_app
[slurm1:09461] mca: base: components_open: component default_app open function successful
[slurm1:09461] mca: base: components_open: found loaded component default_hnp
[slurm1:09461] mca: base: components_open: component default_hnp open function successful
[slurm1:09461] mca: base: components_open: found loaded component default_orted
[slurm1:09461] mca: base: components_open: component default_orted open function successful
[slurm1:09461] mca: base: components_open: found loaded component default_tool
[slurm1:09461] mca: base: components_open: component default_tool open function successful
[slurm1:09460] mca: base: components_register: found loaded component default_tool
[slurm1:09460] mca: base: components_register: component default_tool register function successful
[slurm1:09460] mca: base: components_open: opening errmgr components
[slurm1:09460] mca: base: components_open: found loaded component default_app
[slurm1:09460] mca: base: components_open: component default_app open function successful
[slurm1:09460] mca: base: components_open: found loaded component default_hnp
[slurm1:09460] mca: base: components_open: component default_hnp open function successful
[slurm1:09459] mca:base:select: Auto-selecting errmgr components
[slurm1:09459] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09459] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09459] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09459] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09459] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09459] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09459] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09459] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09459] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09459] mca: base: close: component default_hnp closed
[slurm1:09459] mca: base: close: unloading component default_hnp
[slurm1:09459] mca: base: close: component default_orted closed
[slurm1:09459] mca: base: close: unloading component default_orted
[slurm1:09459] mca: base: close: component default_tool closed
[slurm1:09459] mca: base: close: unloading component default_tool
[slurm1:09460] mca: base: components_open: found loaded component default_orted
[slurm1:09460] mca: base: components_open: component default_orted open function successful
[slurm1:09460] mca: base: components_open: found loaded component default_tool
[slurm1:09460] mca: base: components_open: component default_tool open function successful
[slurm1:09462] mca: base: components_register: registering errmgr components
[slurm1:09462] mca: base: components_register: found loaded component default_app
[slurm1:09462] mca: base: components_register: component default_app register function successful
[slurm1:09462] mca: base: components_register: found loaded component default_hnp
[slurm1:09462] mca: base: components_register: component default_hnp register function successful
[slurm1:09462] mca: base: components_register: found loaded component default_orted
[slurm1:09462] mca: base: components_register: component default_orted register function successful
[slurm1:09462] mca: base: components_register: found loaded component default_tool
[slurm1:09462] mca: base: components_register: component default_tool register function successful
[slurm1:09462] mca: base: components_open: opening errmgr components
[slurm1:09462] mca: base: components_open: found loaded component default_app
[slurm1:09462] mca: base: components_open: component default_app open function successful
[slurm1:09462] mca: base: components_open: found loaded component default_hnp
[slurm1:09462] mca: base: components_open: component default_hnp open function successful
[slurm1:09462] mca: base: components_open: found loaded component default_orted
[slurm1:09462] mca: base: components_open: component default_orted open function successful
[slurm1:09462] mca: base: components_open: found loaded component default_tool
[slurm1:09462] mca: base: components_open: component default_tool open function successful
[slurm1:09461] mca:base:select: Auto-selecting errmgr components
[slurm1:09461] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09461] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09461] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09461] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09461] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09461] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09461] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09461] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09461] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09461] mca: base: close: component default_hnp closed
[slurm1:09461] mca: base: close: unloading component default_hnp
[slurm1:09461] mca: base: close: component default_orted closed
[slurm1:09461] mca: base: close: unloading component default_orted
[slurm1:09461] mca: base: close: component default_tool closed
[slurm1:09461] mca: base: close: unloading component default_tool
[slurm1:09460] mca:base:select: Auto-selecting errmgr components
[slurm1:09460] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09460] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09460] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09460] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09460] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09460] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09460] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09460] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09460] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09460] mca: base: close: component default_hnp closed
[slurm1:09460] mca: base: close: unloading component default_hnp
[slurm1:09460] mca: base: close: component default_orted closed
[slurm1:09460] mca: base: close: unloading component default_orted
[slurm1:09462] mca:base:select: Auto-selecting errmgr components
[slurm1:09460] mca: base: close: component default_tool closed
[slurm1:09460] mca: base: close: unloading component default_tool
[slurm1:09462] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:09462] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:09462] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:09462] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:09462] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:09462] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:09462] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:09462] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:09462] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:09462] mca: base: close: component default_hnp closed
[slurm1:09462] mca: base: close: unloading component default_hnp
[slurm1:09462] mca: base: close: component default_orted closed
[slurm1:09462] mca: base: close: unloading component default_orted
[slurm1:09462] mca: base: close: component default_tool closed
[slurm1:09462] mca: base: close: unloading component default_tool
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],2]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],2]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],2]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],2] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],2] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],1] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],1] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],4] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],4] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],3] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],3] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],5] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],5] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],6] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],6] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],7] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],7] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],8]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],8]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],8]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],8] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],8] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],10] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],10] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],12]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],12]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],12]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],12] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],12] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],14] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],14] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],9] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],9] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],11] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],11] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],0] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],0] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],13] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],13] (Registering True)
[slurm1:09446] [[63894,0],1] odls: require sync on child [[63894,1],15]
[slurm1:09446] [[63894,0],1] odls: registering sync on child [[63894,1],15]
[slurm1:09446] [[63894,0],1] odls: require sync registering child [[63894,1],15]
[slurm1:09446] [[63894,0],1] odls:sync nidmap requested for job [63894,1]
[slurm1:09446] [[63894,0],1] odls: sending sync ack to child [[63894,1],15] with 15992 bytes of data
[slurm1:09446] [[63894,0],1] odls: Finished sending sync ack to child [[63894,1],15] (Registering True)
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],2]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],8]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],12]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],15]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],2]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],8]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],12]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],15]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],2]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],8]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],12]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],15]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],2]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],8]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],12]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls: sending message to tag 30 on child [[63894,1],15]
 MPITEST_INFO (         0): Starting test MPI_Errhandler_fatal     
 MPITEST_INFO (         0): This test should abort after printing the results
 MPITEST_INFO (         0): message, otherwise a f.a.i.l.u.r.e is noted
[slurm1:9449] *** An error occurred in MPI_Send
[slurm1:9449] *** reported by process [140737380745217,2]
[slurm1:9449] *** on communicator MPI COMMUNICATOR 3 DUP FROM 0
[slurm1:9449] *** MPI_ERR_RANK: invalid rank
[slurm1:9449] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[slurm1:9449] ***    and potentially your MPI job)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],15] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
MPITEST_results: MPI_Errhandler_fatal all tests PASSED (        16)
[slurm1:09449] [[63894,1],2] called abort_peers
[slurm1:09462] [[63894,1],15] called abort_peers
[slurm1:09455] [[63894,1],8] called abort_peers
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],2] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],12] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],2] pid 9449 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],2] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/2/aborted for child [[63894,1],2]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],2] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],12] pid 9459 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],12] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/12/aborted for child [[63894,1],12]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],12] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],15] pid 9462 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],15] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/15/aborted for child [[63894,1],15]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],15] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],8] pid 9455 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],8] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/8/aborted for child [[63894,1],8]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],8] died by call to abort
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],2] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],2]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 0
[slurm1:09459] [[63894,1],12] called abort_peers
[slurm1:09458] [[63894,1],11] called abort_peers
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],2] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],2] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort called on job [63894,1]
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: ordering orted termination
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],12] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],12] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],15] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],15] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm0:14820] [[63894,0],0] odls:kill_local_proc working on WILDCARD
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],8] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],8] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],2] aborted to HNP (local procs = 15)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],12] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],12]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 0
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],12] aborted to HNP (local procs = 14)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],15] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],15]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 0
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],15] aborted to HNP (local procs = 13)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],8] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],8]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 0
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],8] aborted to HNP (local procs = 12)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],8] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09461] [[63894,1],14] called abort_peers
[slurm1:09451] [[63894,1],4] called abort_peers
[slurm1:09454] [[63894,1],7] called abort_peers
[slurm1:09453] [[63894,1],6] called abort_peers
[slurm1:09450] [[63894,1],3] called abort_peers
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],4] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],6] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],11] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],1] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],14] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09457] [[63894,1],10] called abort_peers
[slurm1:09448] [[63894,1],1] called abort_peers
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],10] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],3] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],13] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] odls:kill_local_proc working on WILDCARD
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],1] pid 9448 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],1] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/1/aborted for child [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],1] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],3] pid 9450 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],3] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/3/aborted for child [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],3] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],4] pid 9451 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],4] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/4/aborted for child [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],4] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],5] pid 9452 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],5] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/5/aborted for child [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],5] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],6] pid 9453 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],6] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/6/aborted for child [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],6] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],9] pid 9456 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],9] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/9/aborted for child [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],9] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],10] pid 9457 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],10] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/10/aborted for child [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],10] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],11] pid 9458 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],11] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/11/aborted for child [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],11] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],13] pid 9460 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],13] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/13/aborted for child [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],13] died by call to abort
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],14] pid 9461 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],14] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/14/aborted for child [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],14] died by call to abort
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9447 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9447 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9447 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9447 MAY HAVE ALREADY EXITED
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9447 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9447 MAY HAVE ALREADY EXITED
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9447 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PROC 9447 IS DEAD
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],0]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9447 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9447 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9447 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],0] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],0] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls:wait_local_proc child process [[63894,1],7] pid 9454 terminated
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],7] exit code 6
[slurm1:09446] [[63894,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63894/1/7/aborted for child [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls:waitpid_fired child [[63894,1],7] died by call to abort
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9448 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9448 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9448 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9448 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],1]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9448 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9448 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9448 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],1] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],1] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],3]
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9450 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9450 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9450 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9450 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],3]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9450 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9450 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9450 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],3] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],3] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],4]
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9451 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9451 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9451 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9451 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],4]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9451 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9451 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9451 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],4] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],4] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],5]
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9452 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9452 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9452 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9452 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],5]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9452 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9452 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9452 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],5] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],5] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],6]
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9453 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9453 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9453 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9453 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],6]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9453 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9453 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9453 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],6] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],6] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],7]
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9454 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9454 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9454 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9454 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],7]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9454 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9454 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9454 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],7] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],7] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],9]
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9456 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9456 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9456 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9456 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],9]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9456 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9456 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9456 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],9] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],9] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],10]
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9457 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9457 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9457 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9457 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],10]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9457 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9457 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9457 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],10] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],10] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],11]
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9458 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9458 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9458 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9458 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],11]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9458 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9458 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9458 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],11] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],11] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],13]
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9460 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9460 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9460 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9460 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],13]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9460 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9460 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9460 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],13] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],13] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] odls:kill_local_proc checking child process [[63894,1],14]
[slurm1:09446] [[63894,0],1] SENDING SIGCONT TO [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 18 TO PID 9461 SUCCESS
[slurm1:09446] [[63894,0],1] SENDING SIGTERM TO [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 15 TO PID 9461 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9461 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9461 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] SENDING FORCE SIGKILL TO [[63894,1],14]
[slurm1:09446] [[63894,0],1] odls:default:SENT KILL 9 TO PID 9461 SUCCESS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID CHECKING PID 9461 WITH TIMEOUT 2 SECONDS
[slurm1:09446] [[63894,0],1] odls:default:WAITPID INDICATES PID 9461 NO LONGER EXISTS
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],14] killed
[slurm1:09446] [[63894,0],1] odls:kill_local_proc child [[63894,1],14] iof_complete is 1 and state is KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],1] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],1]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],1] aborted to HNP (local procs = 11)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],3] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],3]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],3] aborted to HNP (local procs = 10)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],4] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],4]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],4] aborted to HNP (local procs = 9)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],5] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],5]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],5] aborted to HNP (local procs = 8)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],6] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],6]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],6] aborted to HNP (local procs = 7)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],9] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],9]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],9] aborted to HNP (local procs = 6)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],10] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],10]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],10] aborted to HNP (local procs = 5)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],11] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],11]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],11] aborted to HNP (local procs = 4)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],13] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],13]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],13] aborted to HNP (local procs = 3)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],14] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],14]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],14] aborted to HNP (local procs = 2)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],0] error state KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state KILLED BY INTERNAL COMMAND for proc [[63894,1],0]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],0] aborted to HNP (local procs = 1)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],7] error state CALLED ABORT
[slurm1:09446] [[63894,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63894,1],7]
[slurm1:09446] [[63894,0],1] errmgr:default_orted orte_orteds_term_ordered 1
[slurm1:09446] [[63894,0],1] errmgr:default_orted reporting proc [[63894,1],7] aborted to HNP (local procs = 0)
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],1] error state KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],3] error state KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],4] error state KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],5] error state KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],6] error state KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],7] error state KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],9] error state KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],10] error state KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],11] error state KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],13] error state KILLED BY INTERNAL COMMAND
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],14] error state KILLED BY INTERNAL COMMAND
[slurm1:09460] [[63894,1],13] called abort_peers
[slurm1:09456] [[63894,1],9] called abort_peers
[slurm1:09452] [[63894,1],5] called abort_peers
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],1] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],1] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],3] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],3] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],4] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],4] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],5] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],5] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],6] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],6] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],9] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],9] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],10] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],10] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],11] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],11] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],13] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],13] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],14] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],14] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],5] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],9] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],7] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm1:09446] [[63894,0],1] errmgr:default_orted:proc_errors process [[63894,1],0] error state COMMUNICATION FAILURE
[slurm1:09446] wait it a daemon ? nope - ignore
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],0] state KILLED BY INTERNAL COMMAND
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],0] killed by cmd
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: for proc [[63894,1],7] state CALLED ABORT
[slurm0:14820] [[63894,0],0] errmgr:hnp: proc [[63894,1],7] called abort
[slurm0:14820] [[63894,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63894,1]
[slurm1:09447] [[63894,1],0] called abort_peers
[slurm0:14820] 15 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[slurm0:14820] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
