Re: [OMPI users] [EXTERNAL] Re: Build Failing for OpenMPI 1.7.2 and CUDA 5.5.11

2013-10-07 Thread Hammond, Simon David (-EXP)
Thanks Rolf, that seems to have made the code compile and link
successfully.

S.

-- 
Simon Hammond
Scalable Computer Architectures (CSRI/146, 01422)
Sandia National Laboratories, NM, USA






On 10/7/13 1:47 PM, "Rolf vandeVaart"  wrote:

>That might be a bug.  While I am checking, you could try configuring with
>this additional flag:
>
>--enable-mca-no-build=pml-bfo
>
>Rolf
>
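
For anyone hitting the same link failure: --enable-mca-no-build takes a
comma-separated list of framework-component pairs, so pml-bfo tells
configure not to build the bfo component of the pml framework, i.e. the
component whose code references the undefined mca_pml_bfo_*_cuda symbols.
A sketch of the workaround, reusing the paths from the original report
below (adjust for your own site):

# workaround for the 1.7.2 + CUDA link failure; drop once on 1.7.3
./configure --prefix=/home/projects/openmpi/1.7.2/gnu/4.7.2 \
  --enable-shared --enable-static --disable-vt \
  --with-cuda=/home/projects/cuda/5.5.11 \
  --enable-mca-no-build=pml-bfo \
  CC=`which gcc` CXX=`which g++` FC=`which gfortran`
make && make install
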
>>-Original Message-
>>From: users [mailto:users-boun...@open-mpi.org] On Behalf Of Hammond,
>>Simon David (-EXP)
>>Sent: Monday, October 07, 2013 3:30 PM
>>To: us...@open-mpi.org
>>Subject: [OMPI users] Build Failing for OpenMPI 1.7.2 and CUDA 5.5.11
>>
>>Hey everyone,
>>
>>I am trying to build OpenMPI 1.7.2 with CUDA enabled. OpenMPI configures
>>successfully, but I am seeing a build error that appears to be related
>>to the inclusion of the CUDA options (at least I think so). Do you guys
>>know whether this is a bug or whether something is wrong with how we are
>>configuring OpenMPI for our cluster?
>>
>>Configure Line: ./configure
>>--prefix=/home/projects/openmpi/1.7.2/gnu/4.7.2 --enable-shared --enable-
>>static --disable-vt --with-cuda=/home/projects/cuda/5.5.11
>>CC=`which gcc` CXX=`which g++` FC=`which gfortran`
>>
>>Running make V=1 gives:
>>
>>make[2]: Entering directory `/tmp/openmpi-1.7.2/ompi/tools/ompi_info'
>>/bin/sh ../../../libtool  --tag=CC   --mode=link
>>/home/projects/gcc/4.7.2/bin/gcc -std=gnu99 -
>>DOPAL_CONFIGURE_USER="\"\"" -
>>DOPAL_CONFIGURE_HOST="\"k20-0007\""
>>-DOPAL_CONFIGURE_DATE="\"Mon Oct  7 13:16:12 MDT 2013\""
>>-DOMPI_BUILD_USER="\"$USER\"" -DOMPI_BUILD_HOST="\"`hostname`\""
>>-DOMPI_BUILD_DATE="\"`date`\"" -DOMPI_BUILD_CFLAGS="\"-O3 -
>>DNDEBUG -finline-functions -fno-strict-aliasing -pthread\""
>>-DOMPI_BUILD_CPPFLAGS="\"-I../../..
>>-I/tmp/openmpi-1.7.2/opal/mca/hwloc/hwloc152/hwloc/include
>>-I/tmp/openmpi-1.7.2/opal/mca/event/libevent2019/libevent
>>-I/tmp/openmpi-1.7.2/opal/mca/event/libevent2019/libevent/include
>>-I/usr/include/infiniband -I/usr/include/infiniband
>>-I/usr/include/infiniband -
>>I/usr/include/infiniband -I/usr/include/infiniband\"" -
>>DOMPI_BUILD_CXXFLAGS="\"-O3 -DNDEBUG -finline-functions -pthread\"" -
>>DOMPI_BUILD_CXXCPPFLAGS="\"-I../../..  \""
>>-DOMPI_BUILD_FFLAGS="\"\"" -DOMPI_BUILD_FCFLAGS="\"\""
>>-DOMPI_BUILD_LDFLAGS="\"-export-dynamic  \"" -DOMPI_BUILD_LIBS="\"-
>>lrt -lnsl  -lutil -lm \"" -DOPAL_CC_ABSOLUTE="\"\""
>>-DOMPI_CXX_ABSOLUTE="\"none\"" -O3 -DNDEBUG -finline-functions
>>-fno-strict-aliasing -pthread  -export-dynamic   -o ompi_info ompi_info.o
>>param.o components.o version.o ../../../ompi/libmpi.la -lrt -lnsl
>>-lutil -lm
>>libtool: link: /home/projects/gcc/4.7.2/bin/gcc -std=gnu99 -
>>DOPAL_CONFIGURE_USER=\"\" -
>>DOPAL_CONFIGURE_HOST=\"k20-0007\"
>>"-DOPAL_CONFIGURE_DATE=\"Mon Oct  7 13:16:12 MDT 2013\""
>>-DOMPI_BUILD_USER=\"\" -DOMPI_BUILD_HOST=\"k20-0007\"
>>"-DOMPI_BUILD_DATE=\"Mon Oct  7 13:26:23 MDT 2013\""
>>"-DOMPI_BUILD_CFLAGS=\"-O3 -DNDEBUG -finline-functions -fno-strict-
>>aliasing -pthread\"" "-DOMPI_BUILD_CPPFLAGS=\"-I../../..
>>-I/tmp/openmpi-1.7.2/opal/mca/hwloc/hwloc152/hwloc/include
>>-I/tmp/openmpi-1.7.2/opal/mca/event/libevent2019/libevent
>>-I/tmp/openmpi-1.7.2/opal/mca/event/libevent2019/libevent/include
>>-I/usr/include/infiniband -I/usr/include/infiniband
>>-I/usr/include/infiniband -
>>I/usr/include/infiniband -I/usr/include/infiniband\"" "-
>>DOMPI_BUILD_CXXFLAGS=\"-O3 -DNDEBUG -finline-functions -pthread\"" "-
>>DOMPI_BUILD_CXXCPPFLAGS=\"-I../../..  \""
>>-DOMPI_BUILD_FFLAGS=\"\" -DOMPI_BUILD_FCFLAGS=\"\"
>>"-DOMPI_BUILD_LDFLAGS=\"-export-dynamic  \"" "-DOMPI_BUILD_LIBS=\"-
>>lrt -lnsl  -lutil -lm \"" -DOPAL_CC_ABSOLUTE=\"\" -
>>DOMPI_CXX_ABSOLUTE=\"none\"
>>-O3 -DNDEBUG -finline-functions -fno-strict-aliasing -pthread -o
>>.libs/ompi_info ompi_info.o param.o components.o version.o -Wl,--export-
>>dynamic  ../../../ompi/.libs/libmpi.so -L/usr/lib64 -lrdmacm -losmcomp -
>>libverbs /tmp/openmpi-1.7.2/orte/.libs/libopen-rte.so
>>/tmp/openmpi-1.7.2/opal/.libs/libopen-pal.so -lcuda -lnuma -ldl -lrt
>>-lnsl -lutil -
>>lm -pthread -Wl,-rpath -Wl,/home/projects/openmpi/1.7.2/gnu/4.7.2/lib
>>../../../ompi/.libs/libmpi.so: undefined reference to
>>`mca_pml_bfo_send_request_start_cuda'
>>../../../ompi/.libs/libmpi.so: undefined reference to
>>`mca_pml_bfo_cuda_need_buffers'
>>collect2: error: ld returned 1 exit status
>>
>>
>>
>>Thanks.
>>
>>S.
>>
>>--
>>Simon Hammond
>>Scalable Computer Architectures (CSRI/146, 01422) Sandia National
>>Laboratories, NM, USA
>>
>>
>>
>>
Re: [OMPI users] [EXTERNAL] Re: Build Failing for OpenMPI 1.7.2 and CUDA 5.5.11

2013-10-07 Thread Rolf vandeVaart
Good.  This is fixed in Open MPI 1.7.3, by the way.  I will add a note to the
FAQ on building Open MPI 1.7.2.
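
Until 1.7.3 is available, a quick way to confirm that a 1.7.2 build using the
workaround really did exclude the component is to ask the installed ompi_info
for its pml components; bfo should not be listed (install path taken from this
thread, and the exact output formatting may vary):

# list the PML components that were built into this installation
/home/projects/openmpi/1.7.2/gnu/4.7.2/bin/ompi_info | grep "MCA pml"
# expect entries such as ob1 (and possibly cm), but no bfo line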

>-Original Message-
>From: users [mailto:users-boun...@open-mpi.org] On Behalf Of Hammond,
>Simon David (-EXP)
>Sent: Monday, October 07, 2013 4:17 PM
>To: Open MPI Users
>Subject: Re: [OMPI users] [EXTERNAL] Re: Build Failing for OpenMPI 1.7.2 and
>CUDA 5.5.11
>
>Thanks Rolf, that seems to have made the code compile and link
>successfully.
>
>S.
>
>--
>Simon Hammond
>Scalable Computer Architectures (CSRI/146, 01422) Sandia National
>Laboratories, NM, USA