Your message dated Thu, 21 Jan 2021 09:41:59 +0100
with message-id <20210121084158.ga25...@ramacher.at>
and subject line Re: Bug#980710: mpi4py: FTBFS: ld: cannot find -llmpe
has caused the Debian Bug report #980710,
regarding mpi4py: FTBFS: ld: cannot find -llmpe
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case, it is now your responsibility to reopen the
Bug report if necessary, and/or to fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact ow...@bugs.debian.org
immediately.)


-- 
980710: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=980710
Debian Bug Tracking System
Contact ow...@bugs.debian.org with problems
--- Begin Message ---
Source: mpi4py
Version: 3.0.3-7
Severity: serious
Justification: FTBFS on amd64
Tags: bullseye sid ftbfs
Usertags: ftbfs-20210120 ftbfs-bullseye

Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.

Relevant part (hopefully):
> make[1]: Entering directory '/<<PKGBUILDDIR>>'
> dh_auto_build override_dh_auto_build-arch -- \
>       --build-args "--mpicc=/usr/bin/mpicc --mpicxx=/usr/bin/mpicxx"
> I: pybuild base:232: /usr/bin/python3 setup.py build --mpicc=/usr/bin/mpicc 
> --mpicxx=/usr/bin/mpicxx
> running build
> running build_src
> running build_py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/bench.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/__main__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/run.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/__main__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/_base.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/pool.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/aplus.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/_lib.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/futures/server.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/futures
> copying src/mpi4py/libmpi.pxd -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/__init__.pxd -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> copying src/mpi4py/MPI.pxd -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> copying src/mpi4py/include/mpi4py/mpi4py.h -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> copying src/mpi4py/include/mpi4py/mpi4py.MPI_api.h -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> copying src/mpi4py/include/mpi4py/mpi4py.MPI.h -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> copying src/mpi4py/include/mpi4py/mpi4py.i -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> copying src/mpi4py/include/mpi4py/mpi.pxi -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9/build/mpi4py/include/mpi4py
> running build_clib
> MPI configuration: [mpi] from 'mpi.cfg'
> MPI C compiler:    /usr/bin/mpicc
> MPI C++ compiler:  /usr/bin/mpicxx
> checking for library 'lmpe' ...
> /usr/bin/mpicc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv 
> -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g 
> -fwrapv -O2 -g -O2 -ffile-prefix-map=/<<PKGBUILDDIR>>=. 
> -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time 
> -D_FORTIFY_SOURCE=2 -fPIC -c _configtest.c -o _configtest.o
> /usr/bin/mpicc -pthread -Wl,-z,relro -g -O2 
> -ffile-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat 
> -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 _configtest.o -llmpe 
> -o _configtest
> /usr/bin/ld: cannot find -llmpe
> collect2: error: ld returned 1 exit status
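
The failing step quoted above is a compile-and-link probe: a trivial
_configtest.c is built with mpicc and then linked against -llmpe, and the
probe fails because no liblmpe is available in the build chroot. As the
reply below points out, this probe output is not the actual cause of the
failure, but it is easy to reproduce in isolation. A minimal sketch of the
same kind of check, assuming only that an mpicc wrapper is on PATH (the
helper below is illustrative, not mpi4py's actual build code):

    # Sketch of a library-existence probe like the one in the log above.
    # Assumes an MPI compiler wrapper named "mpicc" is on PATH; the helper
    # name and structure are illustrative, not mpi4py's setup code.
    import os
    import subprocess
    import tempfile

    def have_library(libname, compiler="mpicc"):
        """Return True if linking a trivial program with -l<libname> succeeds."""
        with tempfile.TemporaryDirectory() as tmp:
            src = os.path.join(tmp, "_configtest.c")
            exe = os.path.join(tmp, "_configtest")
            with open(src, "w") as f:
                f.write("int main(void) { return 0; }\n")
            result = subprocess.run(
                [compiler, src, "-l" + libname, "-o", exe],
                capture_output=True, text=True,
            )
            return result.returncode == 0

    if __name__ == "__main__":
        # In the build chroot above this prints False, which is what the
        # "/usr/bin/ld: cannot find -llmpe" line reflects.
        print("lmpe found:", have_library("lmpe"))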

The full build log is available from:
   http://qa-logs.debian.net/2021/01/20/mpi4py_3.0.3-7_unstable.log

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as affecting
this package ('affects'). See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it
against mine, so that we can identify whether something relevant changed in
the meantime.

About the archive rebuild: The rebuild was done on EC2 VM instances from
Amazon Web Services, using a clean, minimal and up-to-date chroot. Every
failed build was retried once to eliminate random failures.

--- End Message ---
--- Begin Message ---
On 2021-01-20 22:12:39, Lucas Nussbaum wrote:
> Source: mpi4py
> Version: 3.0.3-7
> Severity: serious
> Justification: FTBFS on amd64
> Tags: bullseye sid ftbfs
> Usertags: ftbfs-20210120 ftbfs-bullseye
> 
> Hi,
> 
> During a rebuild of all packages in sid, your package failed to build
> on amd64.
> 
> Relevant part (hopefully):
> > […]

The actual error is:

testDL1 (test_dl.TestDL) ... A process has executed an operation involving a 
call
to the fork() system call to create a child process.

As a result, the libfabric EFA provider is operating in
a condition that could result in memory corruption or
other system errors.

For the libfabric EFA provider to work safely when fork()
is called, you will need to set the following environment
variable:
          RDMAV_FORK_SAFE

However, setting this environment variable can result in
signficant performance impact to your application due to
increased cost of memory registration.

You may want to check with your application vendor to see
if an application-level alternative (of not using fork)
exists.

Your job will now abort.
[ip-172-31-4-144:15144] *** Process received signal ***
[ip-172-31-4-144:15144] Signal: Aborted (6)
[ip-172-31-4-144:15144] Signal code:  (-6)
[ip-172-31-4-144:15144] [ 0] 
/lib/x86_64-linux-gnu/libpthread.so.0(+0x14140)[0x7f028f10e140]
[ip-172-31-4-144:15144] [ 1] 
/lib/x86_64-linux-gnu/libc.so.6(gsignal+0x141)[0x7f028edd3ce1]
[ip-172-31-4-144:15144] [ 2] 
/lib/x86_64-linux-gnu/libc.so.6(abort+0x123)[0x7f028edbd537]
[ip-172-31-4-144:15144] [ 3] 
/usr/lib/x86_64-linux-gnu/libfabric.so.1(+0x7c14e)[0x7f028567514e]
[ip-172-31-4-144:15144] [ 4] 
/lib/x86_64-linux-gnu/libc.so.6(+0x85228)[0x7f028ee1d228]
[ip-172-31-4-144:15144] [ 5] 
/lib/x86_64-linux-gnu/libc.so.6(__libc_fork+0x20)[0x7f028ee63490]
[ip-172-31-4-144:15144] [ 6] /usr/bin/python3.9[0x65306b]
[ip-172-31-4-144:15144] [ 7] /usr/bin/python3.9[0x53f65a]
[ip-172-31-4-144:15144] [ 8] 
/usr/bin/python3.9(_PyObject_MakeTpCall+0x39b)[0x51db7b]
[ip-172-31-4-144:15144] [ 9] 
/usr/bin/python3.9(_PyEval_EvalFrameDefault+0x5b10)[0x5177f0]
[ip-172-31-4-144:15144] [10] /usr/bin/python3.9[0x511237]
[ip-172-31-4-144:15144] [11] 
/usr/bin/python3.9(_PyFunction_Vectorcall+0x361)[0x529051]
[ip-172-31-4-144:15144] [12] 
/usr/bin/python3.9(_PyEval_EvalFrameDefault+0x701)[0x5123e1]
[ip-172-31-4-144:15144] [13] /usr/bin/python3.9[0x511237]
[ip-172-31-4-144:15144] [14] 
/usr/bin/python3.9(_PyFunction_Vectorcall+0x361)[0x529051]
[ip-172-31-4-144:15144] [15] /usr/bin/python3.9[0x537a8e]
[ip-172-31-4-144:15144] [16] 
/usr/bin/python3.9(_PyObject_MakeTpCall+0x1f5)[0x51d9d5]
[ip-172-31-4-144:15144] [17] 
/usr/bin/python3.9(_PyEval_EvalFrameDefault+0x6047)[0x517d27]
[ip-172-31-4-144:15144] [18] 
/usr/bin/python3.9(_PyFunction_Vectorcall+0x1a3)[0x528e93]
[ip-172-31-4-144:15144] [19] 
/usr/bin/python3.9(_PyEval_EvalFrameDefault+0x524)[0x512204]
[ip-172-31-4-144:15144] [20] 
/usr/bin/python3.9(_PyFunction_Vectorcall+0x1a3)[0x528e93]
[ip-172-31-4-144:15144] [21] 
/usr/bin/python3.9(_PyEval_EvalFrameDefault+0x524)[0x512204]
[ip-172-31-4-144:15144] [22] 
/usr/bin/python3.9(_PyFunction_Vectorcall+0x1a3)[0x528e93]
[ip-172-31-4-144:15144] [23] /usr/bin/python3.9[0x53bfab]
[ip-172-31-4-144:15144] [24] 
/usr/bin/python3.9(_PyEval_EvalFrameDefault+0x524)[0x512204]
[ip-172-31-4-144:15144] [25] 
/usr/bin/python3.9(_PyFunction_Vectorcall+0x1a3)[0x528e93]
[ip-172-31-4-144:15144] [26] 
/usr/bin/python3.9(_PyEval_EvalFrameDefault+0x701)[0x5123e1]
[ip-172-31-4-144:15144] [27] /usr/bin/python3.9[0x51093d]
[ip-172-31-4-144:15144] [28] 
/usr/bin/python3.9(_PyFunction_Vectorcall+0x361)[0x529051]
[ip-172-31-4-144:15144] [29] /usr/bin/python3.9[0x53c061]
[ip-172-31-4-144:15144] *** End of error message ***
Aborted
make[1]: *** [debian/rules:92: override_dh_auto_test] Error 1

This is #979041 in openmpi, which was fixed in 4.1.0-7. With that version
of openmpi, the build succeeds. Hence I'm closing this bug.
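
For reference, the warning quoted above names RDMAV_FORK_SAFE as the
environment-variable workaround for fork() under the libfabric EFA
provider. Since the real fix was the openmpi 4.1.0-7 upload, the following
is only a sketch of how that variable would have to be set, namely before
MPI (and therefore libfabric) initializes; mpi4py initializes MPI when
mpi4py.MPI is imported, so the variable must be in the environment before
that import:

    # Illustrative only: apply the workaround named in the warning above.
    # The real fix for this bug was openmpi 4.1.0-7 (see #979041); this is
    # not mpi4py's or the Debian package's actual code.
    import os

    # Must be set before MPI_Init, i.e. before importing mpi4py.MPI,
    # so that libfabric sees it when it initializes.
    os.environ.setdefault("RDMAV_FORK_SAFE", "1")

    from mpi4py import MPI  # MPI_Init_thread runs on this import

    comm = MPI.COMM_WORLD
    print("rank", comm.Get_rank(), "of", comm.Get_size())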

Cheers
-- 
Sebastian Ramacher

--- End Message ---
