It really helps if you tell us what version of OMPI you are using :-)

Regardless, this looks like a classic mismatch between the OMPI version used to 
compile the app and the one being used by mpirun at run time. You might want to 
make sure everything is consistent.
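
For example, something along these lines (adjust the binary name and paths to your 
setup) will usually show whether the compile-time and run-time installations match:

  # which Open MPI does the shell pick up?
  which mpirun mpicc
  mpirun --version
  ompi_info | head -n 5

  # what was the app built with, and what does it link against at run time?
  mpicc --showme
  ldd ./your_app | grep -i mpi

If the prefixes or version numbers differ, rebuild the app against the same 
installation you launch it with, or fix PATH/LD_LIBRARY_PATH so both point at the 
same prefix.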


> On Jul 18, 2015, at 2:21 PM, Juan Liu <jl...@mst.edu> wrote:
> 
> Hi all,
> 
> I have some trouble when running the command: mpirun -np 2 foam.ups
> Here is the error report I got:
> Can anyone comment on this? Thanks.
> 
> Best,
> -------------------------------------------------------------------------------------------------
> [laptop:02500] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ess_env_module.c at line 367
> [laptop:02499] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ess_env_module.c at line 367
> [laptop:02497] tcp_peer_recv_connect_ack: invalid header type: 707067904
> [laptop:02497] tcp_peer_recv_connect_ack: invalid header type: 707067904
> [(null):2499] *** An error occurred in MPI_Abort
> [(null):2499] *** on a NULL communicator
> [(null):2499] *** Unknown error
> [(null):2499] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
> [(null):2500] *** An error occurred in MPI_Abort
> [(null):2500] *** on a NULL communicator
> [(null):2500] *** Unknown error
> [(null):2500] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
> --------------------------------------------------------------------------
> An MPI process is aborting at a time when it cannot guarantee that all
> of its peer processes in the job will be killed properly.  You should
> double check that everything has shut down cleanly.
> 
>   Reason:     Before MPI_INIT completed
>   Local host: laptop
>   PID:        2499
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> An MPI process is aborting at a time when it cannot guarantee that all
> of its peer processes in the job will be killed properly.  You should
> double check that everything has shut down cleanly.
> 
>   Reason:     Before MPI_INIT completed
>   Local host: laptop
>   PID:        2500
> --------------------------------------------------------------------------
> -------------------------------------------------------
> Primary job  terminated normally, but 1 process returned
> a non-zero exit code.. Per user-direction, the job has been aborted.
> -------------------------------------------------------
> --------------------------------------------------------------------------
> mpirun detected that one or more processes exited with non-zero status, thus causing
> the job to be terminated. The first process to do so was:
> 
>   Process name: [[10789,1],0]
>   Exit code:    1
> 
> _______________________________________________
> users mailing list
> us...@open-mpi.org
> Subscription: http://www.open-mpi.org/mailman/listinfo.cgi/users
> Link to this post: http://www.open-mpi.org/community/lists/users/2015/07/27286.php
