Re: [OMPI users] OpenMPI 1.10.7 and Infiniband

2017-07-26 Thread Russell Dekema
Are you sure your InfiniBand network is up and running? What kind of output do you get if you run the command 'ibv_devinfo'?

Sincerely,
Rusty Dekema

On Wed, Jul 26, 2017 at 2:40 PM, Sajesh Singh wrote:
> OS: CentOS 7
>
> InfiniBand packages from OS repos
>
> Mellanox HCA
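For reference, on a working fabric each port listed by ibv_devinfo reports state PORT_ACTIVE and, for native IB, link_layer InfiniBand; PORT_DOWN points to a cable or link problem, and PORT_INIT usually means no subnet manager is running. The transcript below is an illustrative sketch only (device name, firmware version, and LIDs are invented, and the exact fields vary by device and driver):

    $ ibv_devinfo
    hca_id: mlx4_0
            transport:                  InfiniBand (0)
            fw_ver:                     2.42.5000
            phys_port_cnt:              2
                    port:   1
                            state:          PORT_ACTIVE (4)
                            max_mtu:        4096 (5)
                            active_mtu:     4096 (5)
                            sm_lid:         1
                            port_lid:       2
                            link_layer:     InfiniBand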

Re: [OMPI users] Q: Basic invoking of InfiniBand with OpenMPI

2017-07-17 Thread Russell Dekema
>> but have a quick question: if I run "mpiexec" with "-mca btl tcp,self", do I get the benefit of RoCE (the fastest speed)?
>>
>> I'll go over the details of all replies and post useful feedback.
>>
>> Thanks very much, all!
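Context for the flags being asked about: in the Open MPI 1.x/2.x series, "-mca btl tcp,self" restricts the job to the TCP and self (loopback) BTLs, so RoCE is not used at all; RoCE traffic goes through the verbs-based openib BTL instead. A hedged sketch of the two invocations, with ./a.out standing in for an arbitrary MPI program:

    # TCP only: no RoCE, messages travel over the kernel TCP stack
    mpiexec --mca btl tcp,self,sm -np 4 ./a.out

    # RoCE / InfiniBand via the openib (verbs) BTL; RoCE setups
    # typically also need the rdmacm connection manager:
    mpiexec --mca btl openib,self,sm \
            --mca btl_openib_cpc_include rdmacm -np 4 ./a.out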

Re: [OMPI users] Q: Basic invoking of InfiniBand with OpenMPI

2017-07-17 Thread Russell Dekema
It looks like you have two dual-port Mellanox VPI cards in this machine. These cards can be set to run InfiniBand or Ethernet on a port-by-port basis, and all four of your ports are set to Ethernet mode. Two of your ports have active 100-gigabit Ethernet links, and the other two have no link up at this time.
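For readers who want to check or change the port mode themselves: the link_layer field of ibv_devinfo shows the current setting per port, and on ConnectX-3 and newer VPI cards the mode can be changed with Mellanox's mlxconfig tool from the MFT package. The commands below are a sketch under those assumptions; the mst device path is an example, and the LINK_TYPE values are 1 = InfiniBand, 2 = Ethernet:

    # Show the current link layer of each port:
    ibv_devinfo | grep -E 'hca_id|link_layer'

    # Set port 1 of the first adapter to InfiniBand mode; run
    # 'mst start' and 'mst status' first to find your device path.
    # The change takes effect after a reboot.
    mlxconfig -d /dev/mst/mt4115_pciconf0 set LINK_TYPE_P1=1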