Source: mpi4py
Version: 3.0.3-7
Severity: serious
Justification: FTBFS on amd64
Tags: bullseye sid ftbfs
Usertags: ftbfs-20201226 ftbfs-bullseye
Hi,

During a rebuild of all packages in sid, your package failed to build on amd64.

Relevant part (hopefully):
> make[1]: Entering directory '/<<PKGBUILDDIR>>'
> dh_auto_build override_dh_auto_build-arch -- \
>     --build-args "--mpicc=/usr/bin/mpicc --mpicxx=/usr/bin/mpicxx"
> I: pybuild base:232: /usr/bin/python3 setup.py build --mpicc=/usr/bin/mpicc --mpicxx=/usr/bin/mpicxx
> running build
> running build_src
> running build_py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/bench.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/__main__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/run.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/__main__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/_base.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/pool.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/aplus.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/_lib.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/server.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/libmpi.pxd -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/__init__.pxd -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/MPI.pxd -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> copying src/mpi4py/include/mpi4py/mpi4py.h -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> copying src/mpi4py/include/mpi4py/mpi4py.MPI_api.h -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> copying src/mpi4py/include/mpi4py/mpi4py.MPI.h -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> copying src/mpi4py/include/mpi4py/mpi4py.i -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> copying src/mpi4py/include/mpi4py/mpi.pxi -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> running build_clib
> MPI configuration: [mpi] from 'mpi.cfg'
> MPI C compiler: /usr/bin/mpicc
> MPI C++ compiler: /usr/bin/mpicxx
> checking for library 'lmpe' ...
> /usr/bin/mpicc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -c _configtest.c -o _configtest.o
> /usr/bin/mpicc -pthread -Wl,-z,relro -g -O2 -fdebug-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 _configtest.o -llmpe -o _configtest
> /usr/bin/ld: cannot find -llmpe
> collect2: error: ld returned 1 exit status

The full build log is available from:
http://qa-logs.debian.net/2020/12/26/mpi4py_3.0.3-7_unstable.log

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as 'affects'-ing this package.
See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it with mine so that we can identify if something relevant changed in the meantime.

About the archive rebuild: The rebuild was done on EC2 VM instances from Amazon Web Services, using a clean, minimal and up-to-date chroot. Every failed build was retried once to eliminate random failures.
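For context, the quoted excerpt ends inside build_clib's optional-library probe: it compiles a trivial _configtest.c and tries to link it with -llmpe, and the `cannot find -llmpe` line simply means no such library is installed on the buildd. Below is a minimal, illustrative sketch of that style of probe; the helper name `have_library` is hypothetical, and it uses plain `cc` rather than the `mpicc` wrapper from the log (mpi4py's own configure machinery differs in detail):

```python
import os
import subprocess
import tempfile


def have_library(libname, cc="cc"):
    """Return True if a trivial program links against -l<libname>.

    Mimics the compile-and-link check seen in the build log:
    first build _configtest.c, then try linking it with -l<libname>.
    """
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "_configtest.c")
        exe = os.path.join(tmp, "_configtest")
        with open(src, "w") as f:
            f.write("int main(void) { return 0; }\n")
        # Compile and link in one step; a missing library makes ld fail,
        # so the compiler driver returns a nonzero exit status.
        result = subprocess.run(
            [cc, src, "-l" + libname, "-o", exe],
            capture_output=True,
        )
        return result.returncode == 0


if __name__ == "__main__":
    # On the failing buildd this probe would report False for 'lmpe'.
    print("lmpe found:", have_library("lmpe"))
```

A probe like this failing is normally harmless (the build just skips the optional library), so the real cause of the FTBFS may lie further down in the full log linked above.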