[slurm0:14139] mca: base: components_register: registering errmgr components
[slurm0:14139] mca: base: components_register: found loaded component default_app
[slurm0:14139] mca: base: components_register: component default_app register function successful
[slurm0:14139] mca: base: components_register: found loaded component default_hnp
[slurm0:14139] mca: base: components_register: component default_hnp register function successful
[slurm0:14139] mca: base: components_register: found loaded component default_orted
[slurm0:14139] mca: base: components_register: component default_orted register function successful
[slurm0:14139] mca: base: components_register: found loaded component default_tool
[slurm0:14139] mca: base: components_register: component default_tool register function successful
[slurm0:14139] mca: base: components_open: opening errmgr components
[slurm0:14139] mca: base: components_open: found loaded component default_app
[slurm0:14139] mca: base: components_open: component default_app open function successful
[slurm0:14139] mca: base: components_open: found loaded component default_hnp
[slurm0:14139] mca: base: components_open: component default_hnp open function successful
[slurm0:14139] mca: base: components_open: found loaded component default_orted
[slurm0:14139] mca: base: components_open: component default_orted open function successful
[slurm0:14139] mca: base: components_open: found loaded component default_tool
[slurm0:14139] mca: base: components_open: component default_tool open function successful
[slurm0:14139] mca:base:select: Auto-selecting errmgr components
[slurm0:14139] mca:base:select:(errmgr) Querying component [default_app]
[slurm0:14139] mca:base:select:(errmgr) Skipping component [default_app]. Query failed to return a module
[slurm0:14139] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm0:14139] mca:base:select:(errmgr) Query of component [default_hnp] set priority to 1000
[slurm0:14139] mca:base:select:(errmgr) Querying component [default_orted]
[slurm0:14139] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm0:14139] mca:base:select:(errmgr) Querying component [default_tool]
[slurm0:14139] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm0:14139] mca:base:select:(errmgr) Selected component [default_hnp]
[slurm0:14139] mca: base: close: component default_app closed
[slurm0:14139] mca: base: close: unloading component default_app
[slurm0:14139] mca: base: close: component default_orted closed
[slurm0:14139] mca: base: close: unloading component default_orted
[slurm0:14139] mca: base: close: component default_tool closed
[slurm0:14139] mca: base: close: unloading component default_tool
[slurm0:14139] mca: base: components_register: registering odls components
[slurm0:14139] mca: base: components_register: found loaded component default
[slurm0:14139] mca: base: components_register: component default has no register or open function
[slurm0:14139] mca: base: components_open: opening odls components
[slurm0:14139] mca: base: components_open: found loaded component default
[slurm0:14139] mca: base: components_open: component default open function successful
[slurm0:14139] mca:base:select: Auto-selecting odls components
[slurm0:14139] mca:base:select:( odls) Querying component [default]
[slurm0:14139] mca:base:select:( odls) Query of component [default] set priority to 1
[slurm0:14139] mca:base:select:( odls) Selected component [default]
[slurm1:07433] mca: base: components_register: registering errmgr components
[slurm1:07433] mca: base: components_register: found loaded component default_app
[slurm1:07433] mca: base: components_register: component default_app register function successful
[slurm1:07433] mca: base: components_register: found loaded component default_hnp
[slurm1:07433] mca: base: components_register: component default_hnp register function successful
[slurm1:07433] mca: base: components_register: found loaded component default_orted
[slurm1:07433] mca: base: components_register: component default_orted register function successful
[slurm1:07433] mca: base: components_register: found loaded component default_tool
[slurm1:07433] mca: base: components_register: component default_tool register function successful
[slurm1:07433] mca: base: components_open: opening errmgr components
[slurm1:07433] mca: base: components_open: found loaded component default_app
[slurm1:07433] mca: base: components_open: component default_app open function successful
[slurm1:07433] mca: base: components_open: found loaded component default_hnp
[slurm1:07433] mca: base: components_open: component default_hnp open function successful
[slurm1:07433] mca: base: components_open: found loaded component default_orted
[slurm1:07433] mca: base: components_open: component default_orted open function successful
[slurm1:07433] mca: base: components_open: found loaded component default_tool
[slurm1:07433] mca: base: components_open: component default_tool open function successful
[slurm1:07433] mca:base:select: Auto-selecting errmgr components
[slurm1:07433] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:07433] mca:base:select:(errmgr) Skipping component [default_app]. Query failed to return a module
[slurm1:07433] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:07433] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:07433] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:07433] mca:base:select:(errmgr) Query of component [default_orted] set priority to 1000
[slurm1:07433] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:07433] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:07433] mca:base:select:(errmgr) Selected component [default_orted]
[slurm1:07433] mca: base: close: component default_app closed
[slurm1:07433] mca: base: close: unloading component default_app
[slurm1:07433] mca: base: close: component default_hnp closed
[slurm1:07433] mca: base: close: unloading component default_hnp
[slurm1:07433] mca: base: close: component default_tool closed
[slurm1:07433] mca: base: close: unloading component default_tool
[slurm1:07433] mca: base: components_register: registering odls components
[slurm1:07433] mca: base: components_register: found loaded component default
[slurm1:07433] mca: base: components_register: component default has no register or open function
[slurm1:07433] mca: base: components_open: opening odls components
[slurm1:07433] mca: base: components_open: found loaded component default
[slurm1:07433] mca: base: components_open: component default open function successful
[slurm1:07433] mca:base:select: Auto-selecting odls components
[slurm1:07433] mca:base:select:( odls) Querying component [default]
[slurm1:07433] mca:base:select:( odls) Query of component [default] set priority to 1
[slurm1:07433] mca:base:select:( odls) Selected component [default]
[slurm0:14139] [[63305,0],0] odls:constructing child list
[slurm0:14139] [[63305,0],0] odls:construct_child_list unpacking data to launch job [63305,1]
[slurm0:14139] [[63305,0],0] odls:constructing child list - looking for daemon for proc [[63305,1],0]
[slurm0:14139] [[63305,0],0] odls:constructing child list - checking proc [[63305,1],0] on daemon 1
[slurm0:14139] [[63305,0],0] odls:constructing child list - looking for daemon for proc [[63305,1],1]
[slurm0:14139] [[63305,0],0] odls:constructing child list - checking proc [[63305,1],1] on daemon 1
[slurm0:14139] [[63305,0],0] odls:constructing child list - looking for daemon for proc [[63305,1],2]
[slurm0:14139] [[63305,0],0] odls:constructing child list - checking proc [[63305,1],2] on daemon 1
[slurm1:07433] [[63305,0],1] odls:constructing child list
[slurm1:07433] [[63305,0],1] odls:construct_child_list unpacking data to launch job [63305,1]
[slurm1:07433] [[63305,0],1] odls:construct_child_list adding new object for job [63305,1]
[slurm1:07433] [[63305,0],1] odls:construct_child_list unpacking 1 app_contexts
[slurm1:07433] [[63305,0],1] odls:constructing child list - looking for daemon for proc [[63305,1],0]
[slurm1:07433] [[63305,0],1] odls:constructing child list - checking proc [[63305,1],0] on daemon 1
[slurm1:07433] [[63305,0],1] odls:constructing child list - found proc [[63305,1],0] for me!
[slurm1:07433] [[63305,0],1] adding proc [[63305,1],0] to my local list
[slurm1:07433] [[63305,0],1] odls:constructing child list - looking for daemon for proc [[63305,1],1]
[slurm1:07433] [[63305,0],1] odls:constructing child list - checking proc [[63305,1],1] on daemon 1
[slurm1:07433] [[63305,0],1] odls:constructing child list - found proc [[63305,1],1] for me!
[slurm1:07433] [[63305,0],1] adding proc [[63305,1],1] to my local list
[slurm1:07433] [[63305,0],1] odls:constructing child list - looking for daemon for proc [[63305,1],2]
[slurm1:07433] [[63305,0],1] odls:constructing child list - checking proc [[63305,1],2] on daemon 1
[slurm1:07433] [[63305,0],1] odls:constructing child list - found proc [[63305,1],2] for me!
[slurm1:07433] [[63305,0],1] adding proc [[63305,1],2] to my local list
[slurm1:07433] [[63305,0],1] odls:launch working child [[63305,1],0]
[slurm1:07433] [[63305,0],1] odls:launch: spawning child [[63305,1],0]
[slurm1:07433] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 3	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=2dc7ba5a5439f2ad-d9a90a11c86a8e44
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51678 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51678 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4148756480
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=3
 	Env[27]: OMPI_MCA_orte_hnp_uri=4148756480.0;tcp://192.168.122.100:43512
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4148756480.1;tcp://192.168.122.101,10.0.0.5:46911
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=3
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=3
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=3
 	Env[47]: OMPI_MCA_ess_base_jobid=4148756481
 	Env[48]: OMPI_MCA_ess_base_vpid=0
 	Env[49]: OMPI_COMM_WORLD_RANK=0
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=0
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=0
 	Env[52]: OMPI_MCA_orte_ess_node_rank=0
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63305/1/0
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:07433] [[63305,0],1] odls:launch working child [[63305,1],1]
[slurm1:07433] [[63305,0],1] odls:launch: spawning child [[63305,1],1]
[slurm1:07433] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 3	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=2dc7ba5a5439f2ad-d9a90a11c86a8e44
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51678 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51678 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4148756480
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=3
 	Env[27]: OMPI_MCA_orte_hnp_uri=4148756480.0;tcp://192.168.122.100:43512
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4148756480.1;tcp://192.168.122.101,10.0.0.5:46911
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=3
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=3
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=3
 	Env[47]: OMPI_MCA_ess_base_jobid=4148756481
 	Env[48]: OMPI_MCA_ess_base_vpid=1
 	Env[49]: OMPI_COMM_WORLD_RANK=1
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=1
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=1
 	Env[52]: OMPI_MCA_orte_ess_node_rank=1
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63305/1/1
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:07433] [[63305,0],1] odls:launch working child [[63305,1],2]
[slurm1:07433] [[63305,0],1] odls:launch: spawning child [[63305,1],2]
[slurm1:07433] 
 Data for app_context: index 0	app: ./MPI_Errhandler_fatal_f
 	Num procs: 3	FirstRank: 0	Recovery: DEFAULT	Max Restarts: 0
 	Argv[0]: ./MPI_Errhandler_fatal_f
 	Env[0]: OMPI_MCA_btl=tcp,self
 	Env[1]: OMPI_MCA_errmgr_base_verbose=100
 	Env[2]: OMPI_MCA_odls_base_verbose=100
 	Env[3]: OMPI_MCA_rmaps_base_oversubscribe=1
 	Env[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
 	Env[5]: OMPI_MCA_orte_precondition_transports=2dc7ba5a5439f2ad-d9a90a11c86a8e44
 	Env[6]: OMPI_MCA_orte_peer_modex_id=0
 	Env[7]: OMPI_MCA_orte_peer_init_barrier_id=1
 	Env[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
 	Env[9]: SHELL=/bin/bash
 	Env[10]: SSH_CLIENT=192.168.122.100 51678 22
 	Env[11]: USER=gouaillardet
 	Env[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[13]: MAIL=/var/mail/gouaillardet
 	Env[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
 	Env[15]: PWD=/csc/home1/gouaillardet
 	Env[16]: XMODIFIERS=@im=none
 	Env[17]: LANG=en_US.UTF-8
 	Env[18]: SHLVL=1
 	Env[19]: HOME=/csc/home1/gouaillardet
 	Env[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
 	Env[21]: LOGNAME=gouaillardet
 	Env[22]: SSH_CONNECTION=192.168.122.100 51678 192.168.122.101 22
 	Env[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
 	Env[24]: OMPI_MCA_orte_ess_jobid=4148756480
 	Env[25]: OMPI_MCA_orte_ess_vpid=1
 	Env[26]: OMPI_MCA_orte_ess_num_procs=3
 	Env[27]: OMPI_MCA_orte_hnp_uri=4148756480.0;tcp://192.168.122.100:43512
 	Env[28]: OMPI_MCA_plm=rsh
 	Env[29]: OMPI_MCA_orte_local_daemon_uri=4148756480.1;tcp://192.168.122.101,10.0.0.5:46911
 	Env[30]: OMPI_MCA_mpi_yield_when_idle=1
 	Env[31]: OMPI_MCA_orte_app_num=0
 	Env[32]: OMPI_UNIVERSE_SIZE=1
 	Env[33]: OMPI_MCA_orte_num_nodes=1
 	Env[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
 	Env[35]: OMPI_MCA_orte_bound_at_launch=1
 	Env[36]: OMPI_MCA_ess=env
 	Env[37]: OMPI_COMM_WORLD_SIZE=3
 	Env[38]: OMPI_COMM_WORLD_LOCAL_SIZE=3
 	Env[39]: OMPI_MCA_orte_tmpdir_base=/tmp
 	Env[40]: OMPI_MCA_grpcomm=^pmi
 	Env[41]: OMPI_MCA_db=^pmi
 	Env[42]: OMPI_MCA_pubsub=^pmi
 	Env[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
 	Env[44]: OMPI_NUM_APP_CTX=1
 	Env[45]: OMPI_FIRST_RANKS=0
 	Env[46]: OMPI_APP_CTX_NUM_PROCS=3
 	Env[47]: OMPI_MCA_ess_base_jobid=4148756481
 	Env[48]: OMPI_MCA_ess_base_vpid=2
 	Env[49]: OMPI_COMM_WORLD_RANK=2
 	Env[50]: OMPI_COMM_WORLD_LOCAL_RANK=2
 	Env[51]: OMPI_COMM_WORLD_NODE_RANK=2
 	Env[52]: OMPI_MCA_orte_ess_node_rank=2
 	Env[53]: OMPI_MCA_orte_num_restarts=0
 	Env[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63305/1/2
 	Working dir: /csc/home1/gouaillardet (user: 0 session-dir: 0)
 	Prefix: /csc/home1/gouaillardet/local/ompi-v1.8
 	Hostfile: NULL	Add-Hostfile: NULL
 	Dash_host[0]: slurm1
 	Preload binary: FALSE	Preload files: NULL	Used on node: TRUE
[slurm1:07433] [[63305,0],1] odls:launch setting waitpids
[slurm1:07433] [[63305,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:07433] [[63305,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:07433] [[63305,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:07433] [[63305,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:07433] [[63305,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:07433] [[63305,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:07433] [[63305,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:07433] [[63305,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=2dc7ba5a5439f2ad-d9a90a11c86a8e44
[slurm1:07433] [[63305,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:07433] [[63305,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:07433] [[63305,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:07433] [[63305,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:07433] [[63305,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51678 22
[slurm1:07433] [[63305,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:07433] [[63305,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:07433] [[63305,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:07433] [[63305,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:07433] [[63305,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:07433] [[63305,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:07433] [[63305,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51678 192.168.122.101 22
[slurm1:07433] [[63305,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:07433] [[63305,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4148756480
[slurm1:07433] [[63305,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:07433] [[63305,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=3
[slurm1:07433] [[63305,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4148756480.0;tcp://192.168.122.100:43512
[slurm1:07433] [[63305,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:07433] [[63305,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4148756480.1;tcp://192.168.122.101,10.0.0.5:46911
[slurm1:07433] [[63305,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:07433] [[63305,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:07433] [[63305,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:07433] [[63305,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:07433] [[63305,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:07433] [[63305,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:07433] [[63305,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:07433] [[63305,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=3
[slurm1:07433] [[63305,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=3
[slurm1:07433] [[63305,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:07433] [[63305,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:07433] [[63305,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:07433] [[63305,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:07433] [[63305,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:07433] [[63305,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:07433] [[63305,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=3
[slurm1:07433] [[63305,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4148756481
[slurm1:07433] [[63305,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=0
[slurm1:07433] [[63305,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=0
[slurm1:07433] [[63305,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=0
[slurm1:07433] [[63305,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=0
[slurm1:07433] [[63305,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:07433] [[63305,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:07433] [[63305,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:07433] [[63305,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:07433] [[63305,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:07433] [[63305,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:07433] [[63305,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:07433] [[63305,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=2dc7ba5a5439f2ad-d9a90a11c86a8e44
[slurm1:07433] [[63305,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:07433] [[63305,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:07433] [[63305,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:07433] [[63305,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:07433] [[63305,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51678 22
[slurm1:07433] [[63305,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:07433] [[63305,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:07433] [[63305,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:07433] [[63305,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:07433] [[63305,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:07433] [[63305,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:07433] [[63305,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51678 192.168.122.101 22
[slurm1:07433] [[63305,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:07433] [[63305,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4148756480
[slurm1:07433] [[63305,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:07433] [[63305,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=3
[slurm1:07433] [[63305,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4148756480.0;tcp://192.168.122.100:43512
[slurm1:07433] [[63305,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:07433] [[63305,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4148756480.1;tcp://192.168.122.101,10.0.0.5:46911
[slurm1:07433] [[63305,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:07433] [[63305,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:07433] [[63305,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:07433] [[63305,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:07433] [[63305,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:07433] [[63305,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:07433] [[63305,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:07433] [[63305,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=3
[slurm1:07433] [[63305,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=3
[slurm1:07433] [[63305,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:07433] [[63305,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:07433] [[63305,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:07433] [[63305,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:07433] [[63305,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:07433] [[63305,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:07433] [[63305,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=3
[slurm1:07433] [[63305,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4148756481
[slurm1:07433] [[63305,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=1
[slurm1:07433] [[63305,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=1
[slurm1:07433] [[63305,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=1
[slurm1:07433] [[63305,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=1
[slurm1:07433] [[63305,0],1] STARTING ./MPI_Errhandler_fatal_f
[slurm1:07433] [[63305,0],1]	ARGV[0]: ./MPI_Errhandler_fatal_f
[slurm1:07433] [[63305,0],1]	ENVIRON[0]: OMPI_MCA_btl=tcp,self
[slurm1:07433] [[63305,0],1]	ENVIRON[1]: OMPI_MCA_errmgr_base_verbose=100
[slurm1:07433] [[63305,0],1]	ENVIRON[2]: OMPI_MCA_odls_base_verbose=100
[slurm1:07433] [[63305,0],1]	ENVIRON[3]: OMPI_MCA_rmaps_base_oversubscribe=1
[slurm1:07433] [[63305,0],1]	ENVIRON[4]: OMPI_COMMAND=MPI_Errhandler_fatal_f
[slurm1:07433] [[63305,0],1]	ENVIRON[5]: OMPI_MCA_orte_precondition_transports=2dc7ba5a5439f2ad-d9a90a11c86a8e44
[slurm1:07433] [[63305,0],1]	ENVIRON[6]: OMPI_MCA_orte_peer_modex_id=0
[slurm1:07433] [[63305,0],1]	ENVIRON[7]: OMPI_MCA_orte_peer_init_barrier_id=1
[slurm1:07433] [[63305,0],1]	ENVIRON[8]: OMPI_MCA_orte_peer_fini_barrier_id=2
[slurm1:07433] [[63305,0],1]	ENVIRON[9]: SHELL=/bin/bash
[slurm1:07433] [[63305,0],1]	ENVIRON[10]: SSH_CLIENT=192.168.122.100 51678 22
[slurm1:07433] [[63305,0],1]	ENVIRON[11]: USER=gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[12]: LD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:07433] [[63305,0],1]	ENVIRON[13]: MAIL=/var/mail/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[14]: PATH=/csc/home1/gouaillardet/local/ompi-v1.8/bin:/csc/home1/gouaillardet/local/ompi-v1.8/bin:/usr/local/bin:/bin:/usr/bin
[slurm1:07433] [[63305,0],1]	ENVIRON[15]: PWD=/csc/home1/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[16]: XMODIFIERS=@im=none
[slurm1:07433] [[63305,0],1]	ENVIRON[17]: LANG=en_US.UTF-8
[slurm1:07433] [[63305,0],1]	ENVIRON[18]: SHLVL=1
[slurm1:07433] [[63305,0],1]	ENVIRON[19]: HOME=/csc/home1/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[20]: DYLD_LIBRARY_PATH=/csc/home1/gouaillardet/local/ompi-v1.8/lib:
[slurm1:07433] [[63305,0],1]	ENVIRON[21]: LOGNAME=gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[22]: SSH_CONNECTION=192.168.122.100 51678 192.168.122.101 22
[slurm1:07433] [[63305,0],1]	ENVIRON[23]: _=/csc/home1/gouaillardet/local/ompi-v1.8/bin/orted
[slurm1:07433] [[63305,0],1]	ENVIRON[24]: OMPI_MCA_orte_ess_jobid=4148756480
[slurm1:07433] [[63305,0],1]	ENVIRON[25]: OMPI_MCA_orte_ess_vpid=1
[slurm1:07433] [[63305,0],1]	ENVIRON[26]: OMPI_MCA_orte_ess_num_procs=3
[slurm1:07433] [[63305,0],1]	ENVIRON[27]: OMPI_MCA_orte_hnp_uri=4148756480.0;tcp://192.168.122.100:43512
[slurm1:07433] [[63305,0],1]	ENVIRON[28]: OMPI_MCA_plm=rsh
[slurm1:07433] [[63305,0],1]	ENVIRON[29]: OMPI_MCA_orte_local_daemon_uri=4148756480.1;tcp://192.168.122.101,10.0.0.5:46911
[slurm1:07433] [[63305,0],1]	ENVIRON[30]: OMPI_MCA_mpi_yield_when_idle=1
[slurm1:07433] [[63305,0],1]	ENVIRON[31]: OMPI_MCA_orte_app_num=0
[slurm1:07433] [[63305,0],1]	ENVIRON[32]: OMPI_UNIVERSE_SIZE=1
[slurm1:07433] [[63305,0],1]	ENVIRON[33]: OMPI_MCA_orte_num_nodes=1
[slurm1:07433] [[63305,0],1]	ENVIRON[34]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[slurm1:07433] [[63305,0],1]	ENVIRON[35]: OMPI_MCA_orte_bound_at_launch=1
[slurm1:07433] [[63305,0],1]	ENVIRON[36]: OMPI_MCA_ess=env
[slurm1:07433] [[63305,0],1]	ENVIRON[37]: OMPI_COMM_WORLD_SIZE=3
[slurm1:07433] [[63305,0],1]	ENVIRON[38]: OMPI_COMM_WORLD_LOCAL_SIZE=3
[slurm1:07433] [[63305,0],1]	ENVIRON[39]: OMPI_MCA_orte_tmpdir_base=/tmp
[slurm1:07433] [[63305,0],1]	ENVIRON[40]: OMPI_MCA_grpcomm=^pmi
[slurm1:07433] [[63305,0],1]	ENVIRON[41]: OMPI_MCA_db=^pmi
[slurm1:07433] [[63305,0],1]	ENVIRON[42]: OMPI_MCA_pubsub=^pmi
[slurm1:07433] [[63305,0],1]	ENVIRON[43]: OMPI_MCA_initial_wdir=/csc/home1/gouaillardet
[slurm1:07433] [[63305,0],1]	ENVIRON[44]: OMPI_NUM_APP_CTX=1
[slurm1:07433] [[63305,0],1]	ENVIRON[45]: OMPI_FIRST_RANKS=0
[slurm1:07433] [[63305,0],1]	ENVIRON[46]: OMPI_APP_CTX_NUM_PROCS=3
[slurm1:07433] [[63305,0],1]	ENVIRON[47]: OMPI_MCA_ess_base_jobid=4148756481
[slurm1:07433] [[63305,0],1]	ENVIRON[48]: OMPI_MCA_ess_base_vpid=2
[slurm1:07433] [[63305,0],1]	ENVIRON[49]: OMPI_COMM_WORLD_RANK=2
[slurm1:07433] [[63305,0],1]	ENVIRON[50]: OMPI_COMM_WORLD_LOCAL_RANK=2
[slurm1:07433] [[63305,0],1]	ENVIRON[51]: OMPI_COMM_WORLD_NODE_RANK=2
[slurm1:07433] [[63305,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=0
[slurm1:07433] [[63305,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:07433] [[63305,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63305/1/0
[slurm1:07433] [[63305,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=17
[slurm1:07433] [[63305,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=1
[slurm1:07433] [[63305,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:07433] [[63305,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63305/1/1
[slurm1:07433] [[63305,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=21
[slurm1:07433] [[63305,0],1]	ENVIRON[52]: OMPI_MCA_orte_ess_node_rank=2
[slurm1:07433] [[63305,0],1]	ENVIRON[53]: OMPI_MCA_orte_num_restarts=0
[slurm1:07433] [[63305,0],1]	ENVIRON[54]: OMPI_FILE_LOCATION=/tmp/openmpi-sessions-gouaillardet@slurm1_0/63305/1/2
[slurm1:07433] [[63305,0],1]	ENVIRON[55]: OPAL_OUTPUT_STDERR_FD=24
[slurm1:07434] mca: base: components_register: registering errmgr components
[slurm1:07434] mca: base: components_register: found loaded component default_app
[slurm1:07434] mca: base: components_register: component default_app register function successful
[slurm1:07434] mca: base: components_register: found loaded component default_hnp
[slurm1:07434] mca: base: components_register: component default_hnp register function successful
[slurm1:07434] mca: base: components_register: found loaded component default_orted
[slurm1:07434] mca: base: components_register: component default_orted register function successful
[slurm1:07434] mca: base: components_register: found loaded component default_tool
[slurm1:07434] mca: base: components_register: component default_tool register function successful
[slurm1:07434] mca: base: components_open: opening errmgr components
[slurm1:07434] mca: base: components_open: found loaded component default_app
[slurm1:07434] mca: base: components_open: component default_app open function successful
[slurm1:07434] mca: base: components_open: found loaded component default_hnp
[slurm1:07434] mca: base: components_open: component default_hnp open function successful
[slurm1:07434] mca: base: components_open: found loaded component default_orted
[slurm1:07434] mca: base: components_open: component default_orted open function successful
[slurm1:07434] mca: base: components_open: found loaded component default_tool
[slurm1:07434] mca: base: components_open: component default_tool open function successful
[slurm1:07436] mca: base: components_register: registering errmgr components
[slurm1:07435] mca: base: components_register: registering errmgr components
[slurm1:07435] mca: base: components_register: found loaded component default_app
[slurm1:07435] mca: base: components_register: component default_app register function successful
[slurm1:07435] mca: base: components_register: found loaded component default_hnp
[slurm1:07435] mca: base: components_register: component default_hnp register function successful
[slurm1:07435] mca: base: components_register: found loaded component default_orted
[slurm1:07435] mca: base: components_register: component default_orted register function successful
[slurm1:07435] mca: base: components_register: found loaded component default_tool
[slurm1:07435] mca: base: components_register: component default_tool register function successful
[slurm1:07435] mca: base: components_open: opening errmgr components
[slurm1:07435] mca: base: components_open: found loaded component default_app
[slurm1:07435] mca: base: components_open: component default_app open function successful
[slurm1:07435] mca: base: components_open: found loaded component default_hnp
[slurm1:07435] mca: base: components_open: component default_hnp open function successful
[slurm1:07435] mca: base: components_open: found loaded component default_orted
[slurm1:07435] mca: base: components_open: component default_orted open function successful
[slurm1:07435] mca: base: components_open: found loaded component default_tool
[slurm1:07435] mca: base: components_open: component default_tool open function successful
[slurm1:07436] mca: base: components_register: found loaded component default_app
[slurm1:07436] mca: base: components_register: component default_app register function successful
[slurm1:07436] mca: base: components_register: found loaded component default_hnp
[slurm1:07436] mca: base: components_register: component default_hnp register function successful
[slurm1:07436] mca: base: components_register: found loaded component default_orted
[slurm1:07434] mca:base:select: Auto-selecting errmgr components
[slurm1:07434] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:07434] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:07434] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:07436] mca: base: components_register: component default_orted register function successful
[slurm1:07434] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:07434] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:07434] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:07434] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:07434] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:07434] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:07434] mca: base: close: component default_hnp closed
[slurm1:07434] mca: base: close: unloading component default_hnp
[slurm1:07434] mca: base: close: component default_orted closed
[slurm1:07434] mca: base: close: unloading component default_orted
[slurm1:07434] mca: base: close: component default_tool closed
[slurm1:07434] mca: base: close: unloading component default_tool
[slurm1:07436] mca: base: components_register: found loaded component default_tool
[slurm1:07436] mca: base: components_register: component default_tool register function successful
[slurm1:07436] mca: base: components_open: opening errmgr components
[slurm1:07436] mca: base: components_open: found loaded component default_app
[slurm1:07436] mca: base: components_open: component default_app open function successful
[slurm1:07436] mca: base: components_open: found loaded component default_hnp
[slurm1:07436] mca: base: components_open: component default_hnp open function successful
[slurm1:07436] mca: base: components_open: found loaded component default_orted
[slurm1:07436] mca: base: components_open: component default_orted open function successful
[slurm1:07436] mca: base: components_open: found loaded component default_tool
[slurm1:07435] mca:base:select: Auto-selecting errmgr components
[slurm1:07435] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:07435] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:07435] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:07435] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:07435] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:07435] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:07435] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:07435] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:07435] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:07435] mca: base: close: component default_hnp closed
[slurm1:07435] mca: base: close: unloading component default_hnp
[slurm1:07435] mca: base: close: component default_orted closed
[slurm1:07435] mca: base: close: unloading component default_orted
[slurm1:07435] mca: base: close: component default_tool closed
[slurm1:07435] mca: base: close: unloading component default_tool
[slurm1:07436] mca: base: components_open: component default_tool open function successful
[slurm1:07436] mca:base:select: Auto-selecting errmgr components
[slurm1:07436] mca:base:select:(errmgr) Querying component [default_app]
[slurm1:07436] mca:base:select:(errmgr) Query of component [default_app] set priority to 1000
[slurm1:07436] mca:base:select:(errmgr) Querying component [default_hnp]
[slurm1:07436] mca:base:select:(errmgr) Skipping component [default_hnp]. Query failed to return a module
[slurm1:07436] mca:base:select:(errmgr) Querying component [default_orted]
[slurm1:07436] mca:base:select:(errmgr) Skipping component [default_orted]. Query failed to return a module
[slurm1:07436] mca:base:select:(errmgr) Querying component [default_tool]
[slurm1:07436] mca:base:select:(errmgr) Skipping component [default_tool]. Query failed to return a module
[slurm1:07436] mca:base:select:(errmgr) Selected component [default_app]
[slurm1:07436] mca: base: close: component default_hnp closed
[slurm1:07436] mca: base: close: unloading component default_hnp
[slurm1:07436] mca: base: close: component default_orted closed
[slurm1:07436] mca: base: close: unloading component default_orted
[slurm1:07436] mca: base: close: component default_tool closed
[slurm1:07436] mca: base: close: unloading component default_tool
[slurm1:07433] [[63305,0],1] odls: require sync on child [[63305,1],0]
[slurm1:07433] [[63305,0],1] odls: registering sync on child [[63305,1],0]
[slurm1:07433] [[63305,0],1] odls: require sync registering child [[63305,1],0]
[slurm1:07433] [[63305,0],1] odls:sync nidmap requested for job [63305,1]
[slurm1:07433] [[63305,0],1] odls: sending sync ack to child [[63305,1],0] with 14829 bytes of data
[slurm1:07433] [[63305,0],1] odls: Finished sending sync ack to child [[63305,1],0] (Registering True)
[slurm1:07433] [[63305,0],1] odls: require sync on child [[63305,1],1]
[slurm1:07433] [[63305,0],1] odls: registering sync on child [[63305,1],1]
[slurm1:07433] [[63305,0],1] odls: require sync registering child [[63305,1],1]
[slurm1:07433] [[63305,0],1] odls:sync nidmap requested for job [63305,1]
[slurm1:07433] [[63305,0],1] odls: sending sync ack to child [[63305,1],1] with 14829 bytes of data
[slurm1:07433] [[63305,0],1] odls: Finished sending sync ack to child [[63305,1],1] (Registering True)
[slurm1:07433] [[63305,0],1] odls: require sync on child [[63305,1],2]
[slurm1:07433] [[63305,0],1] odls: registering sync on child [[63305,1],2]
[slurm1:07433] [[63305,0],1] odls: require sync registering child [[63305,1],2]
[slurm1:07433] [[63305,0],1] odls:sync nidmap requested for job [63305,1]
[slurm1:07433] [[63305,0],1] odls: sending sync ack to child [[63305,1],2] with 14829 bytes of data
[slurm1:07433] [[63305,0],1] odls: Finished sending sync ack to child [[63305,1],2] (Registering True)
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],0]
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],1]
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],2]
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],0]
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],1]
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],2]
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],0]
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],1]
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],2]
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],0]
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],1]
[slurm1:07433] [[63305,0],1] odls: sending message to tag 30 on child [[63305,1],2]
[slurm1:07435] [[63305,1],1] called (default_app) abort_peers
 MPITEST_INFO (         0): Starting test MPI_Errhandler_fatal     
 MPITEST_INFO (         0): This test should abort after printing the results
 MPITEST_INFO (         0): message, otherwise a f.a.i.l.u.r.e is noted
MPITEST_results: MPI_Errhandler_fatal all tests PASSED (         3)
[slurm1:07436] [[63305,1],2] called (default_app) abort_peers
[slurm1:07434] [[63305,1],0] called (default_app) abort_peers
[slurm1:7435] *** An error occurred in MPI_Send
[slurm1:7435] *** reported by process [140737342144513,1]
[slurm1:7435] *** on communicator MPI COMMUNICATOR 3 DUP FROM 0
[slurm1:7435] *** MPI_ERR_RANK: invalid rank
[slurm1:7435] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[slurm1:7435] ***    and potentially your MPI job)
[slurm1:07433] [[63305,0],1] errmgr:default_orted:proc_errors process [[63305,1],1] error state COMMUNICATION FAILURE
[slurm1:07433] wait it a daemon ? nope - ignore
[slurm1:07433] [[63305,0],1] errmgr:default_orted:proc_errors process [[63305,1],2] error state COMMUNICATION FAILURE
[slurm1:07433] wait it a daemon ? nope - ignore
[slurm1:07433] [[63305,0],1] odls:wait_local_proc child process [[63305,1],1] pid 7435 terminated
[slurm1:07433] [[63305,0],1] odls:waitpid_fired child [[63305,1],1] exit code 6
[slurm1:07433] [[63305,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63305/1/1/aborted for child [[63305,1],1]
[slurm1:07433] [[63305,0],1] odls:waitpid_fired child [[63305,1],1] died by call to abort
[slurm1:07433] [[63305,0],1] odls:wait_local_proc child process [[63305,1],2] pid 7436 terminated
[slurm1:07433] [[63305,0],1] odls:waitpid_fired child [[63305,1],2] exit code 6
[slurm1:07433] [[63305,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63305/1/2/aborted for child [[63305,1],2]
[slurm1:07433] [[63305,0],1] odls:waitpid_fired child [[63305,1],2] died by call to abort
[slurm1:07433] [[63305,0],1] errmgr:default_orted:proc_errors process [[63305,1],1] error state CALLED ABORT
[slurm1:07433] [[63305,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63305,1],1]
[slurm1:07433] [[63305,0],1] errmgr:default_orted orte_orteds_term_ordered 0
[slurm1:07433] [[63305,0],1] errmgr:default_orted reporting proc [[63305,1],1] aborted to HNP (local procs = 2)
[slurm1:07433] [[63305,0],1] errmgr:default_orted:proc_errors process [[63305,1],2] error state CALLED ABORT
[slurm1:07433] [[63305,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63305,1],2]
[slurm1:07433] [[63305,0],1] errmgr:default_orted orte_orteds_term_ordered 0
[slurm1:07433] [[63305,0],1] errmgr:default_orted reporting proc [[63305,1],2] aborted to HNP (local procs = 1)
[slurm1:07433] [[63305,0],1] errmgr:default_orted:proc_errors process [[63305,1],0] error state COMMUNICATION FAILURE
[slurm1:07433] wait it a daemon ? nope - ignore
[slurm1:07433] [[63305,0],1] odls:wait_local_proc child process [[63305,1],0] pid 7434 terminated
[slurm1:07433] [[63305,0],1] odls:waitpid_fired child [[63305,1],0] exit code 6
[slurm1:07433] [[63305,0],1] odls:waitpid_fired checking abort file /tmp/openmpi-sessions-gouaillardet@slurm1_0/63305/1/0/aborted for child [[63305,1],0]
[slurm1:07433] [[63305,0],1] odls:waitpid_fired child [[63305,1],0] died by call to abort
[slurm1:07433] [[63305,0],1] errmgr:default_orted:proc_errors process [[63305,1],0] error state CALLED ABORT
[slurm1:07433] [[63305,0],1] errmgr:default_orted got state CALLED ABORT for proc [[63305,1],0]
[slurm1:07433] [[63305,0],1] errmgr:default_orted orte_orteds_term_ordered 0
[slurm1:07433] [[63305,0],1] errmgr:default_orted reporting proc [[63305,1],0] aborted to HNP (local procs = 0)
[slurm0:14139] [[63305,0],0] errmgr:default_hnp: for proc [[63305,1],1] state CALLED ABORT
[slurm0:14139] [[63305,0],0] errmgr:hnp: proc [[63305,1],1] called abort
[slurm0:14139] [[63305,0],0] errmgr:default_hnp: abort called on job [63305,1]
[slurm0:14139] [[63305,0],0] errmgr:default_hnp: ordering orted termination
[slurm0:14139] [[63305,0],0] errmgr:default_hnp: for proc [[63305,1],2] state CALLED ABORT
[slurm0:14139] [[63305,0],0] errmgr:hnp: proc [[63305,1],2] called abort
[slurm0:14139] [[63305,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63305,1]
[slurm0:14139] [[63305,0],0] odls:kill_local_proc working on WILDCARD
[slurm0:14139] [[63305,0],0] errmgr:default_hnp: for proc [[63305,1],0] state CALLED ABORT
[slurm0:14139] [[63305,0],0] errmgr:hnp: proc [[63305,1],0] called abort
[slurm0:14139] [[63305,0],0] errmgr:default_hnp: abort in progress, ignoring abort on job [63305,1]
[slurm1:07433] [[63305,0],1] odls:kill_local_proc working on WILDCARD
[slurm1:07433] [[63305,0],1] odls:kill_local_proc working on WILDCARD
[slurm1:07433] mca: base: close: component default_orted closed
[slurm1:07433] mca: base: close: unloading component default_orted
[slurm0:14139] [[63305,0],0] errmgr:default_hnp: for proc [[63305,0],1] state COMMUNICATION FAILURE
[slurm0:14139] [[63305,0],0] Comm failure: daemons terminating - recording daemon [[63305,0],1] as gone
[slurm0:14139] [[63305,0],0] errmgr_hnp: all routes and children gone - ordering exit
[slurm0:14139] [[63305,0],0] odls:kill_local_proc working on WILDCARD
[slurm0:14139] 2 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[slurm0:14139] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[slurm0:14139] mca: base: close: component default closed
[slurm0:14139] mca: base: close: unloading component default
[slurm0:14139] mca: base: close: component default_hnp closed
[slurm0:14139] mca: base: close: unloading component default_hnp
[slurm1:07433] [[63305,0],1] odls:kill_local_proc working on WILDCARD
[slurm1:07433] mca: base: close: component default closed
[slurm1:07433] mca: base: close: unloading component default
