Good afternoon,

I'm trying to build Open MPI 4.1.5 with GCC 11.3, but the build fails with an
error I'm not sure how to correct:
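
In case it helps, these are roughly the steps I followed (the install prefix
and `-j` count shown here are just examples, not necessarily what I used):

```shell
# Unpack the release tarball and build in the source tree
tar xf openmpi-4.1.5.tar.bz2
cd openmpi-4.1.5

# Point configure at GCC 11.3 (example: compilers installed as gcc-11 etc.)
./configure CC=gcc-11 CXX=g++-11 FC=gfortran-11 \
    --prefix=$HOME/opt/openmpi-4.1.5

# Parallel build; this is where the error below appears
make -j 8 all
```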

...
  CC       pscatter.lo
  CC       piscatter.lo
  CC       pscatterv.lo
  CC       piscatterv.lo
  CC       psend.lo
  CC       psend_init.lo
  CC       psendrecv.lo
  CC       psendrecv_replace.lo
  CC       pssend_init.lo
  CC       pssend.lo
  CC       pstart.lo
  CC       pstartall.lo
  CC       pstatus_c2f.lo
  CC       pstatus_f2c.lo
  CC       pstatus_set_cancelled.lo
  CC       pstatus_set_elements.lo
  CC       pstatus_set_elements_x.lo
  CC       ptestall.lo
  CC       ptestany.lo
  CC       ptest.lo
  CC       ptest_cancelled.lo
  CC       ptestsome.lo
  CC       ptopo_test.lo
  CC       ptype_c2f.lo
  CC       ptype_commit.lo
  CC       ptype_contiguous.lo
  CC       ptype_create_darray.lo
make[3]: *** No rule to make target
'/usr/lib/gcc/x86_64-linux-gnu/9/include/float.h', needed by
'ptype_create_f90_complex.lo'.  Stop.
make[3]: Leaving directory
'/home/laytonjb/src/openmpi-4.1.5/ompi/mpi/c/profile'
make[2]: *** [Makefile:2559: all-recursive] Error 1
make[2]: Leaving directory '/home/laytonjb/src/openmpi-4.1.5/ompi/mpi/c'
make[1]: *** [Makefile:3566: all-recursive] Error 1
make[1]: Leaving directory '/home/laytonjb/src/openmpi-4.1.5/ompi'
make: *** [Makefile:1912: all-recursive] Error 1



Here is the configuration output from configure:

Open MPI configuration:
-----------------------
Version: 4.1.5
Build MPI C bindings: yes
Build MPI C++ bindings (deprecated): no
Build MPI Fortran bindings: mpif.h, use mpi, use mpi_f08
MPI Build Java bindings (experimental): no
Build Open SHMEM support: false (no spml)
Debug build: no
Platform file: (none)

Miscellaneous
-----------------------
CUDA support: no
HWLOC support: external
Libevent support: internal
Open UCC: no
PMIx support: Internal

Transports
-----------------------
Cisco usNIC: no
Cray uGNI (Gemini/Aries): no
Intel Omnipath (PSM2): no
Intel TrueScale (PSM): no
Mellanox MXM: no
Open UCX: no
OpenFabrics OFI Libfabric: no
OpenFabrics Verbs: no
Portals4: no
Shared memory/copy in+copy out: yes
Shared memory/Linux CMA: yes
Shared memory/Linux KNEM: no
Shared memory/XPMEM: no
TCP: yes

Resource Managers
-----------------------
Cray Alps: no
Grid Engine: no
LSF: no
Moab: no
Slurm: yes
ssh/rsh: yes
Torque: no

OMPIO File Systems
-----------------------
DDN Infinite Memory Engine: no
Generic Unix FS: yes
IBM Spectrum Scale/GPFS: no
Lustre: no
PVFS2/OrangeFS: no



Any suggestions? Thanks!

Jeff