On 21 January 2016 at 20:05, M.-A. Lemburg <m...@egenix.com> wrote:
> On 21.01.2016 10:31, Nick Coghlan wrote:
>> On 21 January 2016 at 19:03, M.-A. Lemburg <m...@egenix.com> wrote:
>>> By using the version based approach, we'd not run into this
>>> problem and gain a lot more.
>>
>> I think it's better to start with a small core that we *know* works,
>> then expand later, rather than trying to make the first iteration too
>> wide. The "manylinux1" tag itself is versioned (hence the "1" at the
>> end), so "manylinux2" may simply have *more* libraries defined, rather
>> than newer ones.
>
> My argument is that the file based approach taken by the PEP
> is too limiting to actually make things work for a large
> set of Python packages.
>
> It will basically only work for packages that do not interface
> to other external libraries (except for the few cases listed in
> the PEP, e.g. X11, GL, which aren't always installed or
> available either).
>
> IMO, testing the versions of a set of libraries is a safer
> approach.
I still don't really understand what you mean by "testing the versions
of a set of libraries", but if you have the time available to propose a
competing PEP, that always leads to a stronger result than when we have
only one proposed approach to consider.

Regards,
Nick.

--
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
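To make the idea being debated concrete, below is a rough sketch of what a
runtime, version-based compatibility probe could look like. glibc is chosen
purely as an example, and the helper name and version policy are
illustrative, not taken from the PEP text or from either proposal:

    import ctypes

    def have_compatible_glibc(major, minimum_minor):
        # Look up glibc's version function in the running process.
        # On non-glibc Linux systems (e.g. musl) the symbol is absent.
        libc = ctypes.CDLL(None)
        try:
            gnu_get_libc_version = libc.gnu_get_libc_version
        except AttributeError:
            return False
        gnu_get_libc_version.restype = ctypes.c_char_p
        version_str = gnu_get_libc_version()
        if not isinstance(version_str, str):
            version_str = version_str.decode("ascii")
        # glibc reports versions like "2.17"
        found_major, found_minor = (int(p) for p in version_str.split(".")[:2])
        return found_major == major and found_minor >= minimum_minor

    # Example: require at least glibc 2.5 (the CentOS 5 baseline often
    # cited in the manylinux1 discussion):
    #     have_compatible_glibc(2, 5)

A version-based policy would presumably repeat this kind of check for each
library in the agreed set, rather than relying on a fixed whitelist of
library file names.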