Hmmm… that error stack looks like MPICH, not OMPI - did you mix the two installations?
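For anyone hitting the same trace: the `gethostbyname failed, intel-corei7-64 (errno 1)` line usually means the machine cannot resolve its own hostname (the reported value 1 is most likely the resolver's HOST_NOT_FOUND, not a system errno). A quick check - this is a generic sketch for a typical Linux box, with the hostname taken from the log above:

```shell
# Print the hostname the MPI runtime will try to resolve
hostname

# Ask the resolver about it; getent exits non-zero if lookup fails
getent hosts "$(hostname)" || echo "hostname does not resolve"

# A common fix is to map the hostname to a loopback address, e.g.
# (requires root; adjust the name to match your machine):
#   echo "127.0.1.1 intel-corei7-64" >> /etc/hosts
```

After adding the `/etc/hosts` entry, `getent hosts "$(hostname)"` should print an address and `mpirun` should get past MPI_Init.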

> On Jul 5, 2015, at 8:16 PM, Victor Rodriguez <vm.ro...@gmail.com> wrote:
> 
> Hi
> 
> I am facing the following issue with my MPI build from source:
> 
> (the only change to the configuration was disabling Fortran)
> 
> Any help is more than welcome
> 
> 
> root@intel-corei7-64:~/test# mpirun -n 2 ./mpi_hello
> Fatal error in MPI_Init: Other MPI error, error stack:
> MPIR_Init_thread(498)..............:
> MPID_Init(187).....................: channel initialization failed
> MPIDI_CH3_Init(89).................:
> MPID_nem_init(320).................:
> MPID_nem_tcp_init(171).............:
> MPID_nem_tcp_get_business_card(418):
> MPID_nem_tcp_init(377).............: gethostbyname failed,
> intel-corei7-64 (errno 1)
> 
> ===================================================================================
> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> =   PID 483 RUNNING AT intel-corei7-64
> =   EXIT CODE: 1
> =   CLEANING UP REMAINING PROCESSES
> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> ===================================================================================
> 
> Source code :
> https://github.com/VictorRodriguez/parallel/blob/master/mpi/hello.c
> 
> Best Regards
> 
> Victor Rodriguez
> _______________________________________________
> devel mailing list
> de...@open-mpi.org
> Subscription: http://www.open-mpi.org/mailman/listinfo.cgi/devel
> Link to this post: 
> http://www.open-mpi.org/community/lists/devel/2015/07/17595.php
