There were probably quite a few differences in the output of "configure" 
between GCC 9.4 and GCC 11.3.

For example, your original post cited 
"/usr/lib/gcc/x86_64-linux-gnu/9/include/float.h", which, I assume, does not 
exist on your new GCC 11.3-based system.

Meaning: if you had run make clean and then re-run configure, it probably would 
have built ok.  But deleting the whole source tree and re-configuring + 
re-building also works.  🙂
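In command form, the two approaches would look roughly like this (a sketch, not 
tested here; the install prefix and tarball name are just placeholders):

  # Option 1: clean and re-configure in the existing tree
  cd ~/src/openmpi-4.1.5
  make clean
  ./configure --prefix=$HOME/opt/openmpi-4.1.5
  make -j 8 all

  # Option 2: blow away the tree and start from a pristine tarball
  cd ~/src
  rm -rf openmpi-4.1.5
  tar xf openmpi-4.1.5.tar.bz2
  cd openmpi-4.1.5
  ./configure --prefix=$HOME/opt/openmpi-4.1.5
  make -j 8 all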
________________________________
From: Jeffrey Layton <layto...@gmail.com>
Sent: Tuesday, July 18, 2023 11:38 AM
To: Jeff Squyres (jsquyres) <jsquy...@cisco.com>
Cc: Open MPI Users <users@lists.open-mpi.org>
Subject: Re: [OMPI users] Error build Open MPI 4.1.5 with GCC 11.3

Jeff,

Thanks for the tip - it started me thinking a bit.

I was using a 4.1.5 directory in my home directory that I had previously built 
using GCC 9.4 (Ubuntu 20.04). I rebuilt the system with Ubuntu 22.04, but first 
did a backup of /home. Then I copied the 4.1.5 directory back into /home.

I checked, and I did a "make clean" before attempting to build 4.1.5 with the 
GCC 11.3 that came with Ubuntu 22.04. In fact, I did it several times before I 
ran configure.

Even after running "make clean" I got the error I mentioned in my initial post. 
This happened several times.

This morning, I blew away my 4.1.5 directory and downloaded a fresh 4.1.5. 
Configure went fine, as did compiling it.

My theory is that some cruft from building 4.1.5 with the GCC 9.4 compilers hung 
around, even after "make clean". The "fresh" download of 4.1.5 did not include 
this cruft, so configure and make all proceeded just fine.

I don't know if this is correct, though, and I can't point to any smoking gun.
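If I wanted to hunt for a smoking gun, one check would be to grep the old tree's 
generated dependency files for the GCC 9 include path (a sketch; I'm assuming 
the Automake-generated dependency fragments live under the per-directory .deps/ 
subdirectories as *.Po/*.Plo files, which "make clean" leaves in place):

  # look in the copied-over build tree for dependency files that still
  # reference the old GCC 9 include directory
  cd ~/src/openmpi-4.1.5
  grep -rl '/usr/lib/gcc/x86_64-linux-gnu/9/include' --include='*.Po' --include='*.Plo' .

  # "make distclean" removes those generated files; "make clean" does not
  make distclean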

Thanks!

Jeff


On Mon, Jul 17, 2023 at 2:53 PM Jeff Squyres (jsquyres) 
<jsquy...@cisco.com> wrote:
That's a little odd.  The specific .h files that are listed as dependencies 
came from somewhere -- usually from the GNU Autotools dependency analysis.

I'm guessing that /usr/lib/gcc/x86_64-linux-gnu/9/include/float.h doesn't 
actually exist on your system -- but then how did it get into Open MPI's 
makefiles?

Did you run configure on one machine and make on a different machine, perchance?
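(For context, that dependency analysis typically works by having the compiler 
write a per-object dependency fragment at compile time. The sketch below follows 
the usual Automake/depcomp pattern and is illustrative only -- the exact flags 
and file names in an actual Open MPI build may differ.)

  # illustrative Automake-style compile with dependency tracking
  gcc -MT ptype_create_f90_complex.lo -MD -MP \
      -MF .deps/ptype_create_f90_complex.Tpo -c ptype_create_f90_complex.c

  # the generated .deps fragment lists every header that compile touched,
  # with lines roughly like:
  #   ptype_create_f90_complex.lo: /usr/lib/gcc/x86_64-linux-gnu/9/include/float.h ...
  # make later includes that fragment, which is how a compiler-specific
  # header path can end up as a makefile dependency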
________________________________
From: users <users-boun...@lists.open-mpi.org> on 
behalf of Jeffrey Layton via users <users@lists.open-mpi.org>
Sent: Monday, July 17, 2023 2:05 PM
To: Open MPI Users <users@lists.open-mpi.org>
Cc: Jeffrey Layton <layto...@gmail.com>
Subject: [OMPI users] Error build Open MPI 4.1.5 with GCC 11.3

Good afternoon,

I'm trying to build Open MPI 4.1.5 using GCC 11.3. However, I get an error that 
I'm not sure how to correct. The error is:

...
  CC       pscatter.lo
  CC       piscatter.lo
  CC       pscatterv.lo
  CC       piscatterv.lo
  CC       psend.lo
  CC       psend_init.lo
  CC       psendrecv.lo
  CC       psendrecv_replace.lo
  CC       pssend_init.lo
  CC       pssend.lo
  CC       pstart.lo
  CC       pstartall.lo
  CC       pstatus_c2f.lo
  CC       pstatus_f2c.lo
  CC       pstatus_set_cancelled.lo
  CC       pstatus_set_elements.lo
  CC       pstatus_set_elements_x.lo
  CC       ptestall.lo
  CC       ptestany.lo
  CC       ptest.lo
  CC       ptest_cancelled.lo
  CC       ptestsome.lo
  CC       ptopo_test.lo
  CC       ptype_c2f.lo
  CC       ptype_commit.lo
  CC       ptype_contiguous.lo
  CC       ptype_create_darray.lo
make[3]: *** No rule to make target 
'/usr/lib/gcc/x86_64-linux-gnu/9/include/float.h', needed by 
'ptype_create_f90_complex.lo'.  Stop.
make[3]: Leaving directory '/home/laytonjb/src/openmpi-4.1.5/ompi/mpi/c/profile'
make[2]: *** [Makefile:2559: all-recursive] Error 1
make[2]: Leaving directory '/home/laytonjb/src/openmpi-4.1.5/ompi/mpi/c'
make[1]: *** [Makefile:3566: all-recursive] Error 1
make[1]: Leaving directory '/home/laytonjb/src/openmpi-4.1.5/ompi'
make: *** [Makefile:1912: all-recursive] Error 1



Here is the configuration output from configure:

Open MPI configuration:
-----------------------
Version: 4.1.5
Build MPI C bindings: yes
Build MPI C++ bindings (deprecated): no
Build MPI Fortran bindings: mpif.h, use mpi, use mpi_f08
MPI Build Java bindings (experimental): no
Build Open SHMEM support: false (no spml)
Debug build: no
Platform file: (none)

Miscellaneous
-----------------------
CUDA support: no
HWLOC support: external
Libevent support: internal
Open UCC: no
PMIx support: Internal

Transports
-----------------------
Cisco usNIC: no
Cray uGNI (Gemini/Aries): no
Intel Omnipath (PSM2): no
Intel TrueScale (PSM): no
Mellanox MXM: no
Open UCX: no
OpenFabrics OFI Libfabric: no
OpenFabrics Verbs: no
Portals4: no
Shared memory/copy in+copy out: yes
Shared memory/Linux CMA: yes
Shared memory/Linux KNEM: no
Shared memory/XPMEM: no
TCP: yes

Resource Managers
-----------------------
Cray Alps: no
Grid Engine: no
LSF: no
Moab: no
Slurm: yes
ssh/rsh: yes
Torque: no

OMPIO File Systems
-----------------------
DDN Infinite Memory Engine: no
Generic Unix FS: yes
IBM Spectrum Scale/GPFS: no
Lustre: no
PVFS2/OrangeFS: no



Any suggestions? Thanks!

Jeff


