On 21 January 2016 at 19:03, M.-A. Lemburg <m...@egenix.com> wrote:
> By using the version based approach, we'd not run into this
> problem and gain a lot more.

I think it's better to start with a small core that we *know* works,
then expand later, rather than trying to make the first iteration too
wide. The "manylinux1" tag itself is versioned (hence the "1" at the
end), so "manylinux2" may simply have *more* libraries defined, rather
than newer ones.
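
To make the tag versioning concrete, here's a minimal sketch (not pip's actual tag-handling code) of how an installer could decide whether it accepts a wheel's platform tag; the package names and the supported set below are made-up examples:

```python
# Minimal sketch, not pip's implementation: the platform tag is the last
# dash-separated field of the wheel filename, and the installer just checks
# it against the tag generations it knows about.
SUPPORTED_MANYLINUX_TAGS = {"manylinux1_x86_64", "manylinux1_i686"}

def platform_tag(wheel_filename):
    """Return the platform tag: the last dash-separated field before .whl."""
    return wheel_filename[:-len(".whl")].split("-")[-1]

def is_supported(wheel_filename):
    return platform_tag(wheel_filename) in SUPPORTED_MANYLINUX_TAGS

print(is_supported("examplepkg-1.0-cp35-cp35m-manylinux1_x86_64.whl"))  # True
print(is_supported("examplepkg-1.0-cp35-cp35m-manylinux2_x86_64.whl"))  # False until manylinux2 is defined
```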

The key is that we only have one chance to make a good first
impression with binary Linux wheel support on PyPI, and we want that
to be positive for everyone:

* for publishers, going from "no Linux wheels" to "Linux wheels if you
have few external dependencies beyond glibc" is a big step up (it's
enough for a Cython accelerator module, for example, or a cffi wrapper
around a bundled library)
* for end users, we need to be nigh-certain that wheels built this way
will *just work* (see the sketch after this list)
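
As a rough illustration of what "just works" requires, here's a hedged sketch of the kind of check a build tool could run over an extension module before it goes into a wheel; the whitelist below is an illustrative subset only, not the actual manylinux1 policy list:

```python
import subprocess

# Illustrative subset of always-available system libraries; NOT the real
# manylinux1 whitelist.
ALLOWED_SONAMES = {
    "libc.so.6", "libm.so.6", "libpthread.so.0", "libdl.so.2",
    "libstdc++.so.6", "libgcc_s.so.1",
}

def unexpected_deps(extension_path):
    """Return shared library names reported by ldd that are not whitelisted."""
    output = subprocess.run(
        ["ldd", extension_path], capture_output=True, text=True, check=True
    ).stdout
    extra = set()
    for line in output.splitlines():
        parts = line.split()
        if not parts:
            continue
        name = parts[0]  # e.g. "libssl.so.1.0.0 => /usr/lib/... (0x...)"
        if name.startswith("lib") and name not in ALLOWED_SONAMES:
            extra.add(name)
    return extra

# e.g. unexpected_deps("_speedups.cpython-35m-x86_64-linux-gnu.so")
```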

Even with a small starting list of libraries defined, we're going to
end up with cases where the installed extension module will fail to
load, and end users will have to figure out what dependencies are
missing. The "external dependency specification" at
https://github.com/pypa/interoperability-peps/pull/30 would let pip
detect that at install time (rather than the user finding out at runtime
when the module fails to load), but that will still leave the end user
to figure out how to get the external dependencies installed.
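
As a sketch of what such an install-time check might look like (the declared names below are hypothetical, and this is not the proposed metadata format), pip could probe for each declared external library before unpacking the wheel:

```python
import ctypes.util

# Hypothetical list of external libraries a wheel's metadata might declare.
declared_libs = ["ssl", "ffi"]

missing = [name for name in declared_libs
           if ctypes.util.find_library(name) is None]
if missing:
    print("Refusing to install: missing system libraries:", ", ".join(missing))
```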

If Donald can provide the list of "most downloaded wheel files" for
other platforms, that could also be a useful guide as to how many
source builds could already be avoided under the draft
"manylinux1" definition.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
_______________________________________________
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig
